Wednesday, September 23, 2015

Data Warehouse Administration Console (DAC)



Oracle Business Analytics Warehouse Architecture

The DAC provides a framework for the entire life cycle of data warehouse implementations. It enables you to create, configure, execute, and monitor modular data warehouse applications in a parallel, high-performing environment.

The DAC complements the Informatica ETL platform. It provides application-specific capabilities that are not prebuilt into ETL platforms. For example, ETL platforms are not aware of the semantics of the subject areas being populated in the data warehouse, nor of the method by which they are populated.

The DAC provides the following application capabilities at a layer of abstraction above the ETL execution platform:

  • Dynamic generation of subject areas and execution plans
  • Dynamic settings for parallelism and load balancing
  • Intelligent task queue engine based on user-defined and computed scores
  • Automatic awareness of full and incremental load modes
  • Index management for ETL and query performance
  • Embedded high performance Siebel OLTP change capture techniques
  • Ability to restart at any point of failure
  • Phase-based analysis tools for isolating ETL bottlenecks

Data Warehouse Administration Console Components

The Data Warehouse Administration Console comprises the following components:

DAC Client:
    A command and control interface for the data warehouse that allows for schema management, configuration, administration, and monitoring of data warehouse processes. It also allows you to design subject areas and build execution plans.
DAC Server:
    Executes the instructions from the DAC client. The DAC server manages data warehouse processes, including running ETL loads and scheduling execution plans. It dynamically adjusts its actions based on information in the DAC repository. Depending on your business needs, you might incrementally refresh the Oracle Business Analytics Warehouse once a day, once a week, once a month, or on another similar schedule.
DAC Repository:
    Stores the metadata (semantics of the Oracle Business Analytics Warehouse) that represents the data warehouse processes.
    The DAC repository stores application objects in a hierarchical framework that defines a data warehouse application. The DAC allows you to view the repository application objects based on the source system container you specify. The source system container holds the metadata that corresponds to the source system with which you are working.
A data warehouse application comprises the following repository objects:
    Subject area: A logical grouping of tables related to a particular subject or application context. It also includes the tasks that are associated with the tables, as well as the tasks required to load the tables. Subject areas are assigned to execution plans, which can be scheduled for full or incremental loads.
    Tables: Physical database tables defined in the database schema. Can be transactional database tables or data warehouse tables. Table types can be fact, dimension, hierarchy, aggregate, and so on, as well as flat files that can be sources or targets.
    Task: A unit of work for loading one or more tables. A task comprises the following: source and target tables, phase, execution type, truncate properties, and commands for full or incremental loads. When you assemble a subject area, the DAC automatically assigns tasks to it. Tasks that are automatically assigned to the subject area by the DAC are indicated by the Auto generated flag in the Tasks sub-tab of the Subject Areas tab.
    Task Groups: A group of tasks that you define because you want to impose a specific order of execution. A task group is considered to be a "special task."
    Execution plan: A data transformation plan defined on subject areas that need to be loaded at certain frequencies. An execution plan is defined based on business requirements for when the data warehouse needs to be loaded. An execution plan comprises the following: ordered tasks, indexes, tags, parameters, source system folders, and phases.
    Schedule: A schedule specifies when and how often an execution plan runs. An execution plan can be scheduled for different frequencies or recurrences by defining multiple schedules.
DAC Process Life Cycle

The phases of the process and the actions associated with them are as follows:


Setup
    Set up database connections
    Set up ETL processes (Informatica)
    Set up email recipients
Design
    Define application objects
    Design execution plans
Execute
    Define scheduling parameters to run execution plans
    Access runtime controls to restart or stop currently running schedules
Monitor
    Monitor runtime execution of data warehouse applications
    Monitor users, DAC repository, and application maintenance jobs


DAC Interface


Main DAC Window


The DAC Server Monitor Icons



DAC Repository Command Line Options

This section describes the DAC repository command line parameters that are exposed by the AutomationUtils.bat file, which is located in the DAC installation folder.

Import DAC Metadata by Application
    The IMPORT option imports DAC metadata into the DAC repository for specified source system containers. The import process truncates all imported tables. You cannot perform an incremental import with this command.
    Syntax : IMPORT [folderName] [contName1] [contName2] ...
      Where:
        folderName Full path to the root of the import file structure.
        contName (Optional) Name of the source system container for which you want to import DAC metadata. If no container is named, all containers that are found in the file structure will be imported.
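    For example, following the syntax above, a minimal invocation might look like this (the folder path and container name are illustrative; actual invocations may also need environment-specific arguments, as in the deployment scripts later in this post):
        REM Illustrative only: adjust the folder path and container name to your environment
        automationUtils.bat IMPORT D:\DAC\export "Oracle R12"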
Export DAC Metadata by Application
    The EXPORT option exports DAC metadata from the DAC repository for specified source system containers.
    Syntax : EXPORT [folderName] [contName1] [contName2] ...
      Where:
        folderName Full path to the root of the export file structure.
        contName (Optional) Name of the source system container for which you want to export DAC metadata. If no container is named, all containers that are found in the file structure will be exported.
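    For example, following the syntax above (the folder path is illustrative):
        REM With no container named, all containers are exported
        automationUtils.bat EXPORT D:\DAC\export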
Import DAC Metadata by Categories
    The IMPORTCATEGORY option imports DAC metadata into the DAC repository based on the Logical, Run Time, or System categories. The import process truncates all imported tables. You cannot perform an incremental import with this command.
    Syntax : IMPORTCATEGORY [folderName] [logicalFlag] [runtimeFlag] [systemFlag]
      Where:
        folderName Full path to the root of the import file structure.
        logicalFlag If the value of this parameter is true, all data categorized as logical is imported (information contained in the DAC Design view). Otherwise, this parameter is ignored.
        runtimeFlag If the value of this parameter is true, all data categorized as run time is imported (information contained in the DAC Execute view). Otherwise, this parameter is ignored.
        systemFlag If the value of this parameter is true, all data categorized as system is imported (information contained in the DAC Setup view). Otherwise, this parameter is ignored.
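    For example, to import only the logical (Design view) metadata (the folder path is illustrative):
        REM logicalFlag=true, runtimeFlag=false, systemFlag=false
        automationUtils.bat IMPORTCATEGORY D:\DAC\export true false false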
Export DAC Metadata by Categories
    The EXPORTCATEGORY option exports DAC metadata from the DAC repository based on the Logical, Run Time, or System categories.
    Syntax : EXPORTCATEGORY [folderName] [logicalFlag] [runtimeFlag] [systemFlag]
      Where:
        folderName Full path to the root of the export file structure.
        logicalFlag If the value of this parameter is true, all data categorized as logical is exported (information contained in the DAC Design view). Otherwise, this parameter is ignored.
        runtimeFlag If the value of this parameter is true, all data categorized as run time is exported (information contained in the DAC Execute view). Otherwise, this parameter is ignored.
        systemFlag If the value of this parameter is true, all data categorized as system is exported (information contained in the DAC Setup view). Otherwise, this parameter is ignored.
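    For example, to export only the run-time (Execute view) metadata (the folder path is illustrative):
        REM logicalFlag=false, runtimeFlag=true, systemFlag=false
        automationUtils.bat EXPORTCATEGORY D:\DAC\export false true false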

DAC Code Deployment using Command Line

This section describes the existing process for exporting, importing, and moving logical container-level code from one environment to another, such as development, test, UAT, and production. Three batch scripts drive the migration:
  • Import_cmd.bat
  • Export_cmd.bat
  • Move_file.bat
Import_cmd.bat
    REM Derive a date stamp from %DATE%; the substring offsets below assume the
    REM US format "Ddd MM/DD/YYYY" and must be adjusted for other locales
    set CURR_DATE=%DATE%
    set MONTH=%CURR_DATE:~4,2%
    set DAY=%CURR_DATE:~7,2%
    set YEAR=%CURR_DATE:~12,2%
    REM Build a DDYYMM version stamp
    set VERSION=%DAY%%YEAR%%MONTH%
    REM Version the copied export folder, then import its logical (Design view) metadata
    rename DAC_EXPORT DAC_EXPORT_%VERSION%
    automationUtils.bat importcategory D:\DAC_POC\DAC_EXPORT_%VERSION% "source_folder_name" logical
Export_cmd.bat
    REM Export the logical (Design view) metadata for the named source system container;
    REM the trailing hexadecimal value is environment-specific
    automationUtils.bat exportcategory D:\DAC_POC\DAC_EXPORT Source_Folder_Name logical CC113B27D9316E5E968C2EBC263A678C
Move_file.bat
    REM Copy the export folder to the target server's administrative share,
    REM including subdirectories, and overwrite without prompting
    xcopy D:\DAC_POC\DAC_EXPORT \\inf-fbi-uat2-a1\D$\DAC_POC\DAC_EXPORT /c /d /e /h /i /k /q /r /s /x /y
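Taken together, the scripts are presumably run in sequence during a migration: Export_cmd.bat on the source environment, Move_file.bat to copy the export folder to the target server, and Import_cmd.bat on the target to version the folder and load its logical metadata.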

Oracle Business Intelligence Applications (OBIA) - Financial Analytics

Overview
Accessing data from across the enterprise and delivering deep insight directly to business users enables faster and more informed decisions that help the organization optimize resources, reduce costs, and improve the effectiveness of front- and back-office activities, ranging from Financial Analytics and Procurement and Spend Analytics to custom modules such as Axiom, HR, Travel & Expense, FX, Detail Revenue (DRR), Projects, BPC, Coupa, and Concur.

Finance BI Summary

Finance BI Analytics
Financial Analytics provides organizations with better visibility into the factors that drive revenues, costs, and shareholder value. With dashboards that track key performance indicators, managers can see how staffing costs and supplier performance correlate with increased revenue and customer satisfaction. Financial Analytics also offers insight into the general ledger, customer profitability, actual performance versus budget, projects, and payables and receivables. As a result, managers are empowered to make better decisions, close the books faster, and comply with regulatory requirements.

Dashboards and alerts allow financial and business managers to monitor financial performance in near real time. Detailed financial reports generated at a greater frequency and delivered to a broader range of users allow managers to understand how their business is performing while there is still time to make adjustments. Financial Analytics enables users to more effectively manage their financial performance and improve the business by:
  • Analyzing detailed, transaction-level data to understand the factors driving revenue, cost, and profitability across business units, geographic locations, customers, accounts/ cost centers, and projects in time to take action
  • Optimizing cash flow through detailed accounts receivable and accounts payable analysis
  • Enhancing general ledger reports by providing journal-level details and account categories, and by comparing balances against journals
  • Ensuring budget compliance with effective expense controls that deliver expense line details to departmental managers in time to take corrective action

General Ledger: Manage financial performance across locations, customers, products, and territories, and receive real-time alerts on events that may impact financial condition. The General Ledger Analytics application provides information to support your enterprise's balance sheet and provides a detailed analysis on the overall health of your company. The default configuration for the General Ledger Analytics application is based on what is identified as the most-common level of detail or granularity. However, you can configure and modify the extracts to best meet your business requirements.

Payables: Assess cash management and monitor operational effectiveness of the payables department to ensure lowest transaction costs. The Oracle Payables Analytics application provides information about your enterprise's accounts payable information and identifies the cash requirements to meet your obligations.
The information found in the Oracle Payables Analytics application pertains to data found exclusively under Accounts Payable (AP) in your financial statements and chart of accounts. Analysis of your payables allows you to evaluate the efficiency of your cash outflows. This analysis is increasingly important because suppliers are becoming strategic business partners, with a focus on increased efficiency, just-in-time delivery, and quality purchasing relationships.

Receivables: Monitor DSOs and cash cycles to manage working capital, manage collections, and control receivables risk. The Receivables Analytics application provides information to support your credit and collection activities, and to monitor and identify potential receivables problems. The information found in the Receivables Analytics application pertains to data found exclusively in the Accounts Receivable (AR) account grouping of your financial statements and chart of accounts. Each day that your receivables are past the due date represents a significant opportunity cost to your company. Keeping a close eye on trends in, and the clearing of, AR is one way to assess the efficiency of your sales operations, the quality of your receivables, and the value of key customers. The default configuration for the Oracle Receivables Analytics application is based on what is identified as the most common level of detail or granularity. However, you may configure and modify the extracts to best meet your business requirements.

Procurement Spend Analytics: Procurement Spend Analytics provides procurement organizations with visibility into spend, procurement performance, supplier performance, and employee expenses. It elevates the value of the traditional procurement organization from processing purchase orders to driving strategic value: identifying savings opportunities, improving supplier performance, reducing procurement cycle times, and providing a complete view of the procure-to-pay process, including comprehensive analyses of procurement, supplier performance, supplier payables, and employee expenses. With complete, end-to-end insight into spend patterns and supplier performance, organizations can significantly reduce costs, enhance profitability, increase customer satisfaction, and gain competitive advantage.
The solution allows organizations to more effectively manage their expenditures and improve business performance by:
  • Providing timely direct and indirect spending data to all departments
  • Reducing data collection time with source-specific adapters that extract and transform data from disparate enterprise systems—both Oracle and non-Oracle-based—so managers can spend more time on higher value activities such as analysis
  • Analyzing detailed, transaction-level data to understand the factors driving supplier performance and procurement costs
  • Identifying cost savings across business units, geographic locations, products, and procurement organizations
  • Improving performance by identifying suppliers that price inconsistently or do not adhere to price schedules

This module provides visibility into direct and indirect spending across the enterprise, payments, and employee expenses. The Procurement and Spend Analytics application comprises the following subject areas:

Total Spend: This is a summary subject area that provides the ability to do comparative analysis and report on requested spend, committed spend and actual spend across suppliers, company, products, commodities and associated hierarchies for both direct and indirect spend (indirect spend being MRO and employee expenses) in detail to allow complete visibility of spending across your organization.

Purchase Orders: This is a detailed subject area that provides the ability to report on committed spend and purchase orders of the suppliers of an organization across suppliers, company, products, commodities, and associated hierarchies at the purchase order line level.

Purchase Order Costs: This is a detailed subject area that provides the ability to report on committed spend and purchase orders of the suppliers of an organization across suppliers, company, products, commodities, and associated hierarchies at the cost center (distribution line) level.

Purchase Cycle Lines: This is a summary subject area that provides the ability to report cycle time performance, such as requisition to purchase order lead time, purchase order to receipt lead time, P2P lead time of the suppliers of an organization.

Purchase Schedules: This is a detailed subject area that provides the ability to report on purchase order shipments of an organization across suppliers, company, products, commodities, and associated hierarchies at the purchase schedule line level.

Purchase Requisitions: This is a detailed subject area that provides the ability to report on requested spend and purchase requisitions of the suppliers of an organization across suppliers, company, products, commodities, and associated hierarchies at the purchase requisition line level.

Purchase Requisition Status: This is a summary subject area that provides the ability to report on requisition status along the approval cycle of purchase requisitions of the suppliers of an organization. It is populated only by the Universal adapter.

Purchase Receipts: This is a detailed subject area that provides the ability to report on actual spend and purchase receipts of the suppliers of an organization across suppliers, company, location, products, commodities, and associated hierarchies at the purchase receipt line level.

Employee Spend: This is a detailed subject area that provides the ability to report on the employee spend of an organization across employees, company, cost center, and associated hierarchies. The Expenses subject area contains targeted metrics and reports that examine travel and expense costs in relation to your organization's overall spending patterns. In contrast to analyzing direct spending patterns, where you may review purchasing, Expenses examines indirect spending: the cost of employee-related expenses. It is populated only by the Universal adapter.

Project Reporting: Project Reporting delivers insight into the financial performance of projects so all team members can seamlessly track the project lifecycle. Project Analytics provides hundreds of out-of-the-box, standards-based KPIs and reports for project profitability analysis, funding and budgets, cost, revenue, and billing. Information is personalized, relevant, and actionable to improve project performance and profitability. Project Analytics also delivers cross-functional analysis, including project-based analysis of accounts receivable and accounts payable, invoice-aging analysis, and the status of procurement transactions by project. As a result, all employees, subject to their level of security, can see a personalized, consistent version of the truth and take timely, corrective actions to achieve project objectives.
  • To improve the performance of both projects and project portfolios, Project Analytics provides role-based dashboards that let team members and executives monitor projects and control the risks that lead to budget and schedule overruns
  • Look into a particular program or project and verify how it is performing for a given time period or inception-to-date metrics
  • See past, present, and future performance—including estimated metrics at project completion
  • Drill down to detailed cost information for a specific project such as line items sorted by task, expenditure category, resource, or person

Finance Business Intelligence Applications Architecture



High Level Data Flow
  • Sources – EBS (Raw Data), RevStream, Travel & Expense, Axiom, Workday, BPC, Coupa and Concur
  • ETL – Extraction Transform and Load (Informatica – Power Center)
  • FDW – Finance Data Warehouse
  • OBIEE Metadata
  • OBIEE Content – Reports and Dashboards


Detailed Data Flow


Oracle Business Analytics Warehouse Architecture Components
The Oracle Business Analytics Warehouse architecture comprises the following components:

  • DAC client: A command and control interface for the data warehouse that allows for schema management, configuration, administration, and monitoring of data warehouse processes. It also enables you to design subject areas and build execution plans.
  • DAC server: Executes the instructions from the DAC client. The DAC server manages data warehouse processes, including running ETL loads and scheduling execution plans. It dynamically adjusts its actions based on information in the DAC repository. Depending on your business needs, you might incrementally refresh the Oracle Business Analytics Warehouse once a day, once a week, once a month, or on another similar schedule.
  • DAC repository: Stores the metadata (semantics of the Oracle Business Analytics Warehouse) that represents the data warehouse processes.
  • Informatica Server: Also referred to as PowerCenter Services. Loads and refreshes the Oracle Business Analytics Warehouse.
  • Informatica Repository Server: Also referred to as Repository Services. Manages the Informatica repository.
  • Informatica Repository: Stores the metadata related to Informatica workflows.
  • Informatica client utilities: Tools that enable you to create and manage the Informatica repository.


Sunday, June 8, 2014

ETL Informatica Repository Queries

Abstract

Below are queries to retrieve data from the Informatica metadata (repository) tables. You need access to the repository schema to query the metadata. Prefix the tables and views in the queries below with your repository schema name, as in the example below.
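For instance, if the repository schema is named INFA_REP (an illustrative name), the repository info query in section 1.1 would be written as:

-- INFA_REP is an illustrative schema name; substitute your own repository schema
SELECT DOMAIN_NAME,
       REPOSITORY_NAME
  FROM INFA_REP.OPB_REPOSIT_INFO;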

1. REPOSITORY
1.1 REPOSITORY INFO
SELECT DOMAIN_NAME,
       REPOSITORY_NAME,
       PCSF_DOMAIN AS DOMAIN,
       DB_USER
  FROM OPB_REPOSIT_INFO;
   
1.2 LIST OF OBJECTS WHICH ARE NOT VALID

SELECT REPOSITORY,
         FOLDER_NAME,
         TASK_TYPE,
         WORKLET_OR_SESSION,
         IS_VALID
    FROM (SELECT DISTINCT
                 OPB_REPOSIT_INFO.REPOSITORY_NAME AS REPOSITORY,
                 OPB_SUBJECT.SUBJ_NAME AS FOLDER_NAME,
                  OPB_TASK.TASK_NAME AS WORKFLOW_OR_WORKLET,
                 DECODE (OPB_TASK_INST.TASK_TYPE,
                         58, 'COMMAND',
                         59, 'DECISION',
                         60, 'EVENT WAIT',
                         62, 'START',
                         65, 'EMAIL',
                         66, 'TIMER',
                         67, 'ASSIGNMENT',
                         68, 'SESSION',
                         70, 'WORKLET',
                         91, 'CONTROL',
                         NULL)
                    TASK_TYPE,
                 OPB_TASK_INST.INSTANCE_NAME AS WORKLET_OR_SESSION,
                 DECODE (OPB_TASK_INST.IS_VALID, 1, 'VALID', 'NOT VALID')
                    AS IS_VALID
            FROM OPB_TASK_INST,
                 OPB_OBJECT_TYPE,
                 OPB_TASK,
                 OPB_SUBJECT,
                 OPB_REPOSIT_INFO -- typically a single-row table, so no join condition is needed
           WHERE     OPB_TASK_INST.TASK_TYPE != 62
                 AND OPB_TASK_INST.TASK_TYPE = OPB_OBJECT_TYPE.OBJECT_TYPE_ID
                 AND OPB_TASK_INST.WORKFLOW_ID = OPB_TASK.TASK_ID
                 AND OPB_TASK_INST.VERSION_NUMBER = OPB_TASK.VERSION_NUMBER
                 AND OPB_TASK.SUBJECT_ID = OPB_SUBJECT.SUBJ_ID
                 AND OPB_TASK.UTC_CHECKIN <> 0)
   WHERE IS_VALID = 'NOT VALID'
ORDER BY 2, 3;
1.3 LIST OF OBJECTS WHICH FAILED IN THE LAST 10 DAYS

 SELECT Subject_Area AS Folder,
         Session_Name,
         Last_Error AS Error_Message,
         DECODE (Run_Status_Code,  3, 'Failed',  4, 'Stopped',  5, 'Aborted')
            AS Status,
         Actual_Start AS Start_Time,
         Session_TimeStamp
    FROM rep_sess_log
   WHERE     run_status_code IN (3, 4, 5)
         AND TRUNC (Actual_Start) BETWEEN TRUNC (SYSDATE - 10) AND TRUNC (SYSDATE)
ORDER BY 1, 2;

1.4 LIST ALL PLACES WHERE A TABLE IS USED
Use this query to find where a table is referenced in the SQL overrides of Source Qualifier transformations or in Lookup transformations. Replace 'TABLE_NAME' in the LIKE clause with the table you are searching for; the output gives the subject area (folder), mapping name, attribute name, and the overriding SQL. A worked example follows the query.

SELECT DISTINCT REP_ALL_MAPPINGS.SUBJECT_AREA,
                  REP_ALL_MAPPINGS.MAPPING_NAME,
                  REP_WIDGET_ATTR.ATTR_NAME,
                  REP_WIDGET_ATTR.ATTR_VALUE
    FROM REP_WIDGET_ATTR, REP_WIDGET_INST, REP_ALL_MAPPINGS
   WHERE     REP_WIDGET_ATTR.WIDGET_ID = REP_WIDGET_INST.WIDGET_ID
         AND REP_WIDGET_INST.MAPPING_ID = REP_ALL_MAPPINGS.MAPPING_ID
         AND REP_WIDGET_ATTR.WIDGET_TYPE IN (3, 11)
         AND REP_WIDGET_ATTR.ATTR_ID = 1
         AND REP_WIDGET_ATTR.ATTR_VALUE LIKE
                '%' || REPLACE ('TABLE_NAME', '_', '/_') || '%' ESCAPE '/'
ORDER BY 1, 2, 3;
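For example, to find every override that references the warehouse table W_GL_BALANCE_F (an illustrative table name), the final predicate of the query above becomes:

-- The REPLACE/ESCAPE pair keeps the underscores in the table name
-- from acting as single-character wildcards in the LIKE pattern
AND REP_WIDGET_ATTR.ATTR_VALUE LIKE
       '%' || REPLACE ('W_GL_BALANCE_F', '_', '/_') || '%' ESCAPE '/'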

1.5 LIST ALL SOURCE AND TARGET TABLES OF A MAPPING
SELECT DISTINCT SUBJECT_AREA,
                  SOURCE_NAME,
                  TARGET_NAME,
                  MAPPING_NAME
    FROM REP_TBL_MAPPING ;
1.6 LIST COMMENTS OF ALL OBJECTS
SELECT REP_SUBJECT.SUBJECT_AREA AS FOLDER_NAME,
         REP_VERSION_PROPS.OBJECT_NAME,
         REP_VERSION_PROPS.COMMENTS,
         REP_VERSION_PROPS.VERSION_NUMBER
    FROM REP_VERSION_PROPS, REP_SUBJECT
   WHERE     REP_SUBJECT.SUBJECT_ID = REP_VERSION_PROPS.SUBJECT_ID
         AND REP_VERSION_PROPS.COMMENTS IS NOT NULL
ORDER BY 1, 2;
1.7 All Folders in the Repository

  SELECT SUBJ_NAME, SUBJ_DESC
    FROM OPB_SUBJECT
ORDER BY 1, 2;
1.8 All Shared Folders in the Repository

  SELECT SUBJ_NAME, SUBJ_DESC
    FROM OPB_SUBJECT
   WHERE IS_SHARED <> 0
ORDER BY 1, 2;

1.9 Users or Groups with Access to Folders in the Repository

SELECT subj.subj_name folder_name,
         user_group.NAME user_name,
         DECODE (obj_access.user_type,  1, 'USER',  2, 'GROUP') TYPE,
        CASE WHEN (obj_access.permissions - (obj_access.user_id + 1)) IN (8, 16)
             THEN 'READ'
             WHEN (obj_access.permissions - (obj_access.user_id + 1)) IN (10, 20)
             THEN 'READ & EXECUTE'
             WHEN (obj_access.permissions - (obj_access.user_id + 1)) IN (12, 24)
             THEN 'READ & WRITE'
             WHEN (obj_access.permissions - (obj_access.user_id + 1)) IN (14, 28)
             THEN 'READ, WRITE & EXECUTE'
             ELSE 'NO PERMISSIONS'
          END permissions
    FROM opb_object_access obj_access,
         opb_subject subj,
         opb_user_group user_group
   WHERE     obj_access.object_type = 29
         AND obj_access.object_id = subj.subj_id
         AND obj_access.user_id = user_group.ID
         AND obj_access.user_type = user_group.TYPE

ORDER BY 1, 2, 3;