Tuesday, March 2, 2010

Data, Object, and Package Security in Framework Manager

Add Data Security

You can restrict the data represented by query subjects in a project by creating a security filter. The security filter controls the data that is shown to your users when they set up their reports.

For example, your Sales team consists of a Sales Director and four Sales Managers. You create a security filter that includes a group for the Sales Director and a group for the Sales Managers, and apply the filter to the Salary query subject. When the package is available to your users and a report is generated for the Sales Managers and the Sales Director, only the Sales Director can see the salary information for the Sales Managers.


If a user has multiple roles, the security filters belonging to these roles are joined together with ORs.

If a role is based on another role, the security filters are joined together with ANDs.
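The way these two combination rules behave can be sketched in Python (the helper functions and filter strings are illustrative, not Cognos APIs):

```python
# Sketch of how security filters combine (illustrative helpers, not Cognos APIs).

def combine_roles(filters):
    """Filters from the multiple roles a user holds: joined with OR."""
    return "(" + ") OR (".join(filters) + ")"

def combine_inherited(base_filter, derived_filter):
    """A role based on another role: filters joined with AND."""
    return f"({base_filter}) AND ({derived_filter})"

# A user who is both a Sales Manager and the Sales Director:
managers = "REGION = 'EAST'"
director = "1 = 1"  # the director's group sees everything
print(combine_roles([managers, director]))
# A role derived from Sales Managers that further restricts by product line:
print(combine_inherited(managers, "PRODUCT_LINE = 'Camping'"))
```

A user holding both roles gets the wider (OR) result, while a derived role can only narrow (AND) its base role's filter.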

You can base a security filter on an existing security filter. If you choose this option, the new filter inherits the existing filter expression and all of its properties.
When you create a security filter, you can also use existing project filters, or create new filters using the expression editor. For more information, see "Create a Filter" (p. 156).

Steps

1. Click the query subject you want, and from the Actions menu, click Specify Data Security.

2. To add new users, groups, or roles, do the following:
● Click Add Groups.
● In the Select Users and Groups window, add users, groups, or roles. For information about
how to do this, see the Content Manager User Guide.
● In the Select Users and Groups window, click OK.

3. If you want to base the group on an existing group, click a group in the Based On column.
Tip: If you do not see the group you want in the list, you must add the group to the security
filter.

4. If you want to add a filter to a group, in the Filter column, click either Create/Edit Embedded Filter or Insert from Model.


Add Object Security

Steps to Add Object-Based Security

1. Click the object you want to secure, and from the Actions menu, click Specify Object Security.
Tip: You can select more than one object at a time.

2. Select the users, groups, or roles you want to change. Or, click Add to add new users, groups,
or roles.
For more information, see the Content Manager User Guide.

3. Specify security rights for each user, group, or role by doing one of the following:
● To deny access to a user, group, or role, select the Deny check box next to the name for
the user, group, or role. Deny takes precedence over Allow.
● To grant access to a user, group, or role, select the Allow check box.
Tip: To allow everyone to see all objects unless specifically denied access, select the Allow check
box for the Everyone role.

4. Click OK.
A list of new and updated object-based packages appears.
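The Deny-over-Allow rule described in step 3 can be sketched as follows (the group names and helper function are illustrative, not a Cognos API):

```python
# Sketch of the object security rule: Deny takes precedence over Allow.
def can_access(user_groups, allow, deny):
    """user_groups: the groups/roles a user belongs to.
    allow, deny: sets of groups with the Allow or Deny box checked."""
    if user_groups & deny:            # any matching Deny wins outright
        return False
    return bool(user_groups & allow)  # otherwise any matching Allow grants access

# Everyone is allowed, but members of a hypothetical Contractors group are denied:
print(can_access({"Everyone"}, allow={"Everyone"}, deny={"Contractors"}))                  # True
print(can_access({"Everyone", "Contractors"}, allow={"Everyone"}, deny={"Contractors"}))   # False
```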

Add Package Security

Steps
1. Click the package you want to edit, and from the Actions menu, click Package, Edit Package
Settings.
2. Click the Permissions tab.
3. In Cognos Connection, create, add, or remove groups or roles as required. For information
about how to do this, see the Cognos Connection User Guide.
4. After you finish modifying the security for the package, click OK to return to Framework
Manager.

MDX Overview

Multidimensional Expressions (MDX) is the industry-standard language used to
query multidimensional data sources and generate reports in BW. It is typically
used by third-party reporting tools. If you are interested in a
non-SAP reporting solution to augment Business Explorer (BEx), you need to understand
the impact of MDX. Each tool uses it differently, and MDX affects your BW resources.

SAP and Cognos Architecture Comparison

SAP BW is a very large scale data warehousing tool based on ROLAP
technology. It has its own reporting tool, BEx, and its own star
schemas and ROLAP-based cubes.
Cognos PowerPlay is an OLAP-based system that sits on top of a
star schema.
The SAP BW system has more overall functionality than the Cognos
system in terms of database schemas and extraction,
transformation, and loading.

One of the fundamental differences between SAP BW and Cognos is
in the ROLAP and OLAP technologies on which they base their cubes.

The SAP BW ROLAP layer is a relational database based on Oracle, DB2,
or Microsoft SQL Server, whereas the Cognos OLAP cubes are based on
compressed large binary objects.

This difference gives the SAP BW system the advantage
that the extract and data load time to the ROLAP level is as
fast as any good database load/insert can be, and in theory this
is the only load time that you will incur.

For the equivalent Cognos-based system, you may (if you are
running a data warehouse level) need to load your daily or weekly
data into a star schema and then subsequently load this data
into a PowerPlay cube.

The load of the data into the PowerPlay Transformer engine and
the subsequent creation of the PowerPlay cube binary object
can take a long time. This depends on the number of
rows you are loading, the parallel nature of the queries, and
the complexity of the cube design.

So, for very large multi-terabyte systems, you may encounter
difficulties in fitting all the PowerPlay cube creations into
your batch window.

However, SAP BW has a similar problem in that it normally relies on
a level above the ROLAP layer, called aggregates. These are
similar in result to the PowerPlay cube aggregations, except that
the BW aggregates are held in DB2 or Oracle while the PowerPlay ones are
BLOBs.

PowerPlay has better graphing and time series functionality; the
graphs are embedded in the PowerPlay functionality, whereas in
SAP BW the graphs tend to be stuck on top of the spreadsheet
output.

In summary, I would say that the Cognos PowerPlay system is
easier to use and deploy than SAP BW, but SAP BW is possibly
better for a complete end-to-end data warehouse solution, even
though it doesn't look quite as nice to end users as Cognos does.

Master Detail Relationship

Master detail relationships link information between two data containers (such as a list, crosstab, or chart) within a report: a master data container and a detail data container.
Using a master detail relationship, you can use a single report to display information that would normally take two reports.
A master detail relationship can be created for any data container within Report Studio.

1. Create or open a report that has one data container (for example, a crosstab).
2. From the Toolbox tab, drag a second container object (for example, a chart) from the Insertable Objects pane next to the existing container in the work area.
3. Select the second container and drag it into the first container where you want to place it.
4. Click the embedded object to select it.
5. From the Data menu, click Master Detail Relationships to create the master detail links.
6. Click New Link and select the data items from each query that you want to link in the master detail relationship.

Cognos BI Applications

IBM Cognos 8 Business Intelligence applications:

# Query Studio - a tool to create basic ad hoc reports and a lighter version of Report Studio. It allows users to create one-page reports, one at a time. It is used by Query Authors and Report Authors.

# Report Studio - used to create and manage more advanced reports, which are very often standardized, executed periodically, and distributed to people in various departments of an organization. Reports can include dashboards with gauges, charts, maps, and other graphical elements with full drill-through capabilities. Report designs created in Query Studio can be read and edited in Report Studio (but not the other way around). Report Studio is used by Report Authors.

# Analysis Studio - an application to analyze cubes and explore business data in a multidimensional manner. Large volumes of data can be analyzed through ranking, complex filtering, slicing, dicing and creating custom subsets. Mainly used by business analysts.

# Metrics Studio - used for scorecarding, helps manage performance by monitoring and analyzing metrics at all levels of an organization. The idea behind it is to put performance indicators next to the organization's key performance measures (red, orange, green status notation) and link them to reports containing related information.

# Event Studio - a process (an agent) that runs in the background, monitors the data, and reacts when something occurs that may cause a problem for the business in the future. For instance, if the quantity of product X in stock falls below a given value, the event manager can send an email to the responsible person containing a warning with a problem description and a Cognos report attached.

Event studio takes action by generating emails, running reports, updating news channel in a corporate portal or communicating with other software.

Aside from tracking actual products, it can also be used in monitoring services. For example if the number of subscribers in, say, a mobile broadband service exceeds the set capacity, event manager can email the necessary people so the problem could be looked into.

# Cognos Connection - a Web portal for Cognos 8 and a component that interacts with the Content Store. It is a frontend to publish, find, manage, organize, and view an organization's business intelligence data.

# PowerPlay Transformer and PowerPlay (Cognos 8 Business Intelligence Mobile Analysis) - The Cognos 8 BI Mobile Analysis package is a new name for the Cognos 7 PowerPlay applications. In fact, apart from the integration with Cognos 8 packages, there are no major changes since PowerPlay version 7.
Among others, the package includes the following applications known from Cognos 7 BI:
- Cognos PowerPlay Series 7 Version 4
- PowerPlay Connect
- PowerPlay for Excel
- Configuration Manager
- Cognos Scheduler

# Content Store - a database where the metadata for all the Cognos 8 components is stored and managed centrally.

The Content Store can be accessed directly with Framework Manager.

# Framework Manager is a Cognos 8 modeling tool for creating and managing business-related metadata for use in all Cognos 8 BI applications. It lets modelers model relational data dimensionally, apply hierarchies to allow drill behavior, apply member functions and query any of the supported data sources (relational database with SQL or OLAP with MDX).

The main users of Framework Manager are data warehouse developers and data modelers. Report authors rely on the metadata set up in Framework Manager when creating new reports.

# IBM Cognos TM1 (formerly Applix TM1) is a business intelligence solution which provides high-performance analytics, reporting, dashboarding and budgeting capabilities, fully integrated with Microsoft Excel. Thanks to its in-memory engine, TM1 is flexible, scalable and provides a high-performance read/write capability which makes it a great tool for budgeting, planning and controlling.

# Cognos Analysis for Microsoft Excel (CAFE) provides exploration and slice-and-dice capabilities within Excel. It offers interactivity similar to Analysis Studio but allows users to stay within Excel, which is very often a key business requirement.

Performance Tuning of Cognos Reports

These are some general guidelines we use at my work:

1. For your queries, try to set the Processing property to Database Only. It runs faster than Limited Local.

2. Try to use fixed column widths where possible. We have found that dynamic sizing sometimes runs queries twice--once to find the largest data content, then again to dynamically size the columns based on the largest data content.

3. Try to move as many calculations as possible into the model so your query doesn't have to do them for every row of data pulled.

4. Try to use as many filters as possible up front to minimize the data you must pull in and process.

5. Simple CASE statements run faster than searched CASE statements.

6. Place calculations strategically. Rather than include a calculation in a CASE statement in multiple places, do it once in a separate field and then refer to it in multiple places. This minimizes processing time.

7. Create logical and useful index fields and logical table structures to make searching and pulling the data as efficient as possible.

8. When sorting a field, sort it either at the tabular model level OR the query level but NOT both.

9. Where possible, use database functions for calculations instead of their equivalent Cognos functions. See #1.

10. When using facts directly in a report, do not create shortcuts; reference the facts directly in the query.

11. The default font in Cognos configuration is Times New Roman; best practice is to change the font to Arial, which uses less memory.



Hope that helps get you started thinking in the right direction.

LO extraction in SAP

1. Go to transaction code RSA3 and see if any data is available related to your DataSource. If data is there in RSA3 then go to transaction code LBWG (Delete Setup data) and delete the data by entering the application name.

2. Go to transaction SBIW --> Settings for Application Specific Datasource --> Logistics --> Managing extract structures --> Initialization --> Filling the Setup table --> Application specific setup of statistical data --> perform setup (relevant application)

3. In OLI*** (for example, OLI7BW for statistical setup of old documents: Orders), give the name of the run and execute. Now all the available records from R/3 will be loaded into the setup tables.

4. Go to transaction RSA3 and check the data.

5. Go to transaction LBWE and make sure the update mode for the corresponding DataSource is serialized V3 update.

6. Go to the BW system and create an InfoPackage; under the Update tab, select Initialize Delta Process, and schedule the package. Now all the data available in the setup tables is loaded into the data target.

7. Now, for the delta records, go to LBWE in R/3 and change the update mode for the corresponding DataSource to Direct/Queued delta. By doing this, records will bypass SM13 and go directly to RSA7. Go to transaction code RSA7, where you can see a green light; once new records are added, you can see them immediately in RSA7.

8. Go to the BW system and create a new InfoPackage for delta loads. Double-click the new InfoPackage; under the Update tab you can see the Delta Update radio button.

9. Now you can go to your data target and see the delta record.

Monday, March 1, 2010

Dimensional Modeling of Relational Data Sources


Dimensional modeling of relational data sources is a capability made available by Cognos 8
Framework Manager. ReportNet 1.x provided some dimensional capabilities to enable
multiple-fact querying and prevent double-counting, and Cognos 8 introduces features designed explicitly for dimensional modeling. It is now possible to model dimensions with hierarchies and levels and to have facts with multiple measures. You can then relate the dimensions to the measures by setting scope in the model. You must dimensionally model a relational data source when you want to do one or more of the following:

• use Analysis Studio
• enable drill functionality in reports
• access member functions in the report authoring tools

We recommend that you follow this workflow when you dimensionally model relational data
sources:

❑ Import the metadata.
❑ Verify the imported metadata (p. 7).
❑ Customize the metadata.
❑ Simplify the model using dimensional concepts (p. 11).
❑ Resolve ambiguous relationships (p. 12).
❑ Define dimensions (p. 17) and determinants (p. 16).
❑ Organize the model by creating business views.
❑ Create star schema groups (p. 23).
❑ Apply security.
❑ Create packages and publish metadata.

For information about the topics not covered here, see "Designing a Project", "Preparing
Relational Metadata for Use in Reports", and "Making Metadata Available to Report Authors"
in the Framework Manager User Guide.

All examples in this document use the database view of the gosales_goretailers normalized model or the import view of the go_data_warehouse model, which are included with the Cognos 8 samples.


Verifying Imported Metadata

After importing metadata, you must check the imported metadata in these areas:
• relationships and cardinality
• determinants
• the Usage property for query items
• the Regular Aggregate property for query items

Relationships and cardinality are discussed here. For information on the Usage and Regular
Aggregate properties, see "Modifying the Properties of Query Items" in the Framework Manager User Guide.

Cardinality is combined with dimensions to control how queries are generated so that you can
• prevent double-counting
• automatically resolve loop joins
• enable cross-fact querying for reporting and analysis

You can create model dimensions and data source dimensions. Model dimensions are built on a
foundation of query subjects that use determinants and relationships with cardinality. Data source dimensions contain their own SQL and use hierarchy and level information as well as
relationships with cardinality to define query granularity.

Cardinality drives query behavior by allowing rules to be applied regarding the granularity of data that is returned by an individual object and the consequence of joins between objects. The
cardinality specified in the relationship between query subjects or dimensions determines how and when Cognos 8 generates stitched queries. Stitched queries are needed for multiple-fact querying across conformed dimensions and across different levels of granularity.


Detecting Cardinality from the Data Source

When importing from a relational data source, cardinality is detected based on a set of rules that
you specify. The available options are

• use primary and foreign keys
• use matching query item names that represent uniquely indexed columns
• use matching query item names

The most common situation is to use primary and foreign keys as well as matching query items
that represent uniquely indexed columns. The information is used to set some properties of query items as well as to generate relationships.

To view the index and key information that was imported, right-click a query subject or regular
dimension and click Edit Definition. For a query subject, you can change the information in the
Determinants tab.
Note: All regular dimensions begin as query subjects. If you converted a query subject to a regular dimension, note that determinant information for the query subject is leveraged as a starting point to define the levels of a single hierarchy. We recommend that you review the levels and keys created in the hierarchy of the dimension.

Optional relationships, full outer joins, and many-to-many relationships can be imported from
your data source. Framework Manager will execute them as queries.

Reflexive relationships can be imported from your data source and appear in the model but
Framework Manager does not execute them as queries. Although you can view the metadata that defines the reflexive relationship, you cannot edit a reflexive relationship in Framework Manager.


Cardinality in Generated Queries

Framework Manager supports both minimum-maximum cardinality and mandatory-optional
cardinality.

0:1 - 0 is the minimum cardinality, 1 is the maximum cardinality
1:n - 1 is the minimum cardinality, n is the maximum cardinality

A relationship with cardinality 1-n can be specified as 1:1 to 1:n when reading the maximum
cardinalities. The minimum cardinality of 1 is set to 0 for optional data. Therefore a 1-n
relationship can also be specified as 0:1 to 0:n, 0:1 to 1:n, or 1:1 to 0:n.
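The min:max notation above can be read mechanically; this small illustrative sketch (not Cognos code) parses it, treating "n" as "many":

```python
# Parse the "min:max" cardinality notation, where max "n" means many.
def parse_cardinality(spec):
    lo, hi = spec.split(":")
    return {"min": int(lo), "max": None if hi == "n" else int(hi)}

print(parse_cardinality("0:1"))   # {'min': 0, 'max': 1}  -> optional, at most one
print(parse_cardinality("1:n"))   # {'min': 1, 'max': None} -> mandatory, many (None = many)
```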

The basic rules for how cardinality is used are the following:

• Cardinality is applied in the context of a query, meaning that only the cardinalities of items
explicitly included in the query are evaluated.
• 1-n cardinality implies fact on the n side and implies dimension on the 1 side.
• A query subject can behave as a dimension in one query and as a fact in another.
• Queries on more than one fact will result in a stitched query.
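The fact/dimension rules above can be sketched as follows. This is an illustrative simplification, not the actual Cognos query engine: relationships are (one_side, many_side) pairs, only items in the query are evaluated, and an object that is on the "n" side of every relationship it participates in behaves as a fact.

```python
# Sketch: classify query subjects as facts within the context of one query.
def classify(query_subjects, relationships):
    """Return the query subjects that behave as facts in this query.
    relationships: (one_side, many_side) pairs with 1-n cardinality."""
    # Only relationships between items explicitly included in the query count.
    rels = [(a, b) for a, b in relationships
            if a in query_subjects and b in query_subjects]
    facts = set()
    for qs in query_subjects:
        sides = ["n" if qs == b else "1" for a, b in rels if qs in (a, b)]
        if sides and all(s == "n" for s in sides):
            facts.add(qs)   # on the n side of every relationship -> fact
    return facts

rels = [("Time", "Sales"), ("Product", "Sales"),
        ("Time", "Inventory"), ("Product", "Inventory")]
print(sorted(classify({"Time", "Product", "Sales"}, rels)))    # ['Sales']
# Two facts in one query -> Cognos would generate a stitched query:
print(sorted(classify({"Time", "Sales", "Inventory"}, rels)))  # ['Inventory', 'Sales']
```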


Modeling 1-n Relationships as 1-1 Relationships

If a 1-n relationship exists in the data but is modeled as a 1-1 relationship, SQL traps cannot be
avoided because the information provided by the metadata to the query engine is insufficient.
The most common problems that arise if 1-n relationships are modeled as 1-1 are the following:

• Double-counting for multiple-grain queries is not automatically prevented.

Cognos 8 cannot detect facts and then generate a stitched query to compensate for
double-counting, which can occur when dealing with hierarchical relationships and different
levels of granularity across conformed dimensions.

• Multiple-fact queries are not automatically detected.

Cognos 8 will not have sufficient information to detect a multiple-fact query. For
multiple-fact queries, an inner join is performed and the loop join is eliminated by dropping
the last evaluated join. Dropping a join is likely to lead to incorrect or unpredictable results
depending on the dimensions and facts included in the query.

You can deal with these issues in Report Studio by creating an inner or outer join between queries or creating calculations that compensate for differences in granularity. This workaround requires the report author to have detailed knowledge of the data. There is no workaround available in Query Studio or Analysis Studio.


Detecting Determinants in the Data Source

Determinants are detected in your relational data source by using index and key information. This information is useful when performing automatic aggregation to determine the appropriate level of granularity to assign to a particular item in the query. We recommend that you review the determinants that were created on import and modify them where appropriate to align with your reporting requirements.

For normalized data sources, not all determinants may be relevant for or aligned with your
reporting requirements. We recommend that you remove any unnecessary determinants,
particularly if they conflict with relationships being created for that query subject.

For denormalized data sources, you may want to add determinants to represent non-unique key
segments that represent sets within the data. This is relevant if joins will be made on keys other
than the unique key of the query subject.

Analyzing the Model for Facts and Dimensions

To dimensionally model a relational data source for the purpose of cross-fact querying and
analysis, you must examine the underlying schema and address areas where the role of an object
as a fact or a dimension is not clear. This examination can result in creating a logical layer to
present the data as a star schema. Or you may decide to make changes to the physical data source to deliver a higher performance application. Physical changes may include consolidating tables by using extract-transform-load tools or creating materialized views for frequently-used
aggregations.

In a relational data source that was not de-normalized using star schema concepts, you may need to analyze the cardinality of the imported query subjects before creating dimensions or converting query subjects to dimensions. You must first understand the business applications of the underlying data as well as understand how Cognos 8 generates queries.

Within the context of a query, a query subject with a cardinality of n on all its 1-n relationships to other query subjects can be regarded as a fact. This implies that there are more rows of data in the fact query subject than in the related query subject on the 1 side of the relationship. Any query subject on the 1 side of a 1-n relationship to another query subject is treated as a dimension.


Informatica SAP

PowerCenter and SAP NetWeaver Integration Methods

PowerExchange for SAP NetWeaver integrates with mySAP applications in the following ways:

• Data integration using the ABAP program
• IDoc integration using ALE
• Data integration using BAPI/RFC functions
• Data migration integration
• Business content integration


Data Integration Using the ABAP Program

You can extract data from mySAP applications using the ABAP program. Create a mapping in the Designer that uses the ABAP program. Generate and install the ABAP program on the SAP server that extracts the source data. When you configure a session, you can access the source data through streaming or a staged file. The Integration Service accesses streamed data through CPI-C. It accesses staged files through FTP, SFTP, or standard file I/O, typically using network file sharing, such as NFS.

Complete the following steps to extract data from mySAP applications using the ABAP program:

1. Import an SAP R/3 source definition.
2. Create a mapping.
3. Generate and install the ABAP program.
4. Create a session and run a workflow.



IDoc Integration Using ALE

You can integrate PowerCenter with mySAP applications using Application Link Enabling (ALE) to send and receive Intermediate Documents (IDocs). IDocs are messages that exchange electronic data between SAP applications or between SAP applications and external programs.
The message-based architecture of ALE comprises three layers:

• Application layer. Provides ALE an interface to R/3 to send or receive messages from external systems.

• Distribution layer. Filters and converts messages to ensure that they are compatible between different releases of R/3 and R/2.

• Communications layer. Enables ALE to support synchronous and asynchronous communication. You use IDocs for asynchronous communication.

The architecture of ALE provides a way to send IDocs as text files without connecting to a central database. This allows applications to communicate with each other without converting between formats to accommodate hardware or platform differences.

ALE has the following components:

• Logical component. Determines how messages flow between the various applications or systems.

• Physical component. Transport layer that routes IDoc messages using the tRFC (Transactional RFC) protocol.

• Message types. Application messages that classify categories of data. For example, ORDERS and MATMAS (Material Master).

• IDoc types. Data structures associated with the message type. For example, MATMAS01 and MATMAS02 for MATMAS. IDocs contain the data belonging to the message type.

IDocs contain three record types:

• Control record. Identifies the message type.
• Data records. Contain the IDoc data in segments.
• Status records. Describe the status of the IDoc. Status record names are the same for each IDoc type.
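The three record types can be sketched as a simple data structure. This is a hypothetical illustration of the shape of an IDoc, not Informatica or SAP code; the segment and status values are placeholders:

```python
# Hypothetical sketch of an IDoc's three record types.
from dataclasses import dataclass, field

@dataclass
class IDoc:
    control: dict                                # one control record: identifies the message type
    data: list = field(default_factory=list)     # data records, organized into segments
    status: list = field(default_factory=list)   # status records: processing history

doc = IDoc(control={"message_type": "MATMAS", "idoc_type": "MATMAS01"})
doc.data.append({"segment": "E1MARAM", "fields": {"MATNR": "100-200"}})
doc.status.append({"code": "03", "text": "Data passed to port OK"})
print(doc.control["message_type"])  # MATMAS
```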


Data Integration Using BAPI/RFC Functions

Business Application Programming Interfaces (BAPI) provide a way for third-party applications to integrate data with SAP. You use BAPIs to create, change, delete, list, and detail objects in SAP.

The Business Application Programming Interfaces allow object-oriented access to the SAP system through methods for the business object types. Together with the business object types, BAPIs define and document the interface standard at the business level.

BAPIs also provide an object-oriented view of business components in SAP. You define BAPIs in the SAP Business Objects Repository. You implement and store them as Remote Function Call (RFC) enabled function modules in the Function Builder of the ABAP Workbench. You can call BAPIs as an ABAP program within SAP. You use RFCs to call BAPIs outside of SAP.

Use a BAPI/RFC transformation to create, change, or delete data in mySAP applications. When you run a session with a BAPI/RFC transformation, the Integration Service makes the RFC function calls to SAP to process SAP data.


You can use a BAPI/RFC transformation for one of the following reasons:

• Migrate data to SAP. For example, your organization uses PeopleSoft applications for enterprise purchasing management. You want to migrate to mySAP applications to manage enterprise purchasing. Use PowerExchange for PeopleSoft to extract data from PeopleSoft and a BAPI/RFC transformation to write purchase management data to mySAP applications.

• Synchronize data in SAP. For example, a mySAP application contains customer orders. You want to add line items to some of the orders. You can use a BAPI/RFC transformation to make a BAPI/RFC call to add the line items to the orders in SAP.


Data Migration Integration

You can migrate data from legacy applications, other ERP systems, or any number of other sources and prepare it for input into mySAP applications. The Integration Service extracts the data from the data source and prepares it in an SAP-format flat file that you can load into mySAP applications.

Business Content Integration

You can integrate PowerCenter with mySAP applications to provide an efficient, high-volume data warehousing solution. SAP business content is a collection of metadata objects that can be integrated with other applications and used for analysis and reporting. SAP produces the business content data, and PowerCenter consumes it. PowerCenter can consume all or changed business content data from mySAP applications and write it to a target data warehouse. You can then use the data warehouse to meet analysis and reporting needs.


PowerCenter and SAP NetWeaver BI Integration Methods

PowerExchange for SAP NetWeaver BI can perform the following tasks:

• Extracting data from SAP BI
• Loading data into SAP BI


PowerExchange for SAP NetWeaver BI interacts with InfoCubes and InfoSources. An InfoCube is a self-contained dataset created with data from one or more InfoSources. An InfoSource is a collection of data that logically belongs together, summarized into a single unit.



Communication Interfaces

TCP/IP is the native communication interface between PowerCenter and SAP NetWeaver. PowerCenter and SAP NetWeaver also use the following interfaces:

• CPI-C
• Remote Function Call (RFC)


Common Programming Interface for Communications (CPI-C)

CPI-C is the communication protocol that enables online data exchange and data conversion between SAP and PowerCenter. PowerExchange for SAP NetWeaver uses CPI-C to communicate with SAP NetWeaver only when you run ABAP stream mode sessions.
To initialize CPI-C communication with PowerCenter, SAP NetWeaver requires information, such as the host name of the application server and SAP gateway. This information is stored in a configuration file named saprfc.ini on the node where the Integration Service process runs. The Integration Service uses parameters in the saprfc.ini file to connect to SAP NetWeaver when you run ABAP stream mode sessions.
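As a rough illustration, an application-server entry in saprfc.ini might look like the following. The destination name, host, and system number are placeholders; check the Informatica and SAP RFC documentation for the parameters your environment actually requires:

```ini
; Hypothetical saprfc.ini entry for a TYPE=A (application server) connection
DEST=SAPR3_DEV
TYPE=A
ASHOST=sapapp01.example.com
SYSNR=00
GWHOST=sapapp01.example.com
GWSERV=sapgw00
```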

Remote Function Call

RFC is the remote communications protocol used by SAP NetWeaver. It is based on Remote Procedure Call (RPC). PowerCenter makes remote function calls to communicate with SAP NetWeaver. To execute remote calls from PowerCenter, SAP NetWeaver requires information, such as the connection type and the service name and gateway on the application server. This information is stored in a configuration file named saprfc.ini on the node hosting the PowerCenter Client and the node where the Integration Service and SAP BW Service processes run.


Transport System
The transport system is a set of ABAP programs installed on the SAP system. It provides a way to import SAP metadata into the PowerCenter repository. It also enables run-time functionalities, such as passing mapping variables and filters.

You use the transport system in the following situations:

• Configuring PowerExchange for SAP NetWeaver. You need to transport some customized objects that were developed by Informatica to the SAP system. These objects include tables, structures, programs, and functions. PowerCenter calls one or more of these objects each time you make a request to the SAP system.

• Deploying run-time transports and ABAP programs from development to production. If you are using ABAP to integrate with mySAP applications, you might want to deploy the run-time transports provided by Informatica and the ABAP programs installed by the Designer to extract data when you move from development to production. The SAP system administrator is responsible for completing transport tasks.

Cognos Dimensional Functions

CurrentMember: Returns the current member of a hierarchy.

Ancestor: Returns the ancestor of the specified member.

Children: Returns the set of children of a specified member.
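As a sketch using the sample [Products] hierarchy (the member and hierarchy names are assumptions), the three functions can be combined in a single expression:

```mdx
// Siblings of the member currently in context:
// take its ancestor one level up, then return that ancestor's children.
children( ancestor( currentMember( [Products].[Products] ), 1 ) )
```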

Cognos MDX Functions

Tuple – Refers to an intersection of one or more members from one or more dimensions that defines a specific measure value.

Set – A group of members or tuples. In the [Years] hierarchy, the members [Q1 2006], [Q2 2006], [Q1 2005], [Q2 2005] taken together form a set of members.

Slicer – A slicer is similar to the SQL “where” clause that limits the data returned in a query. As an example, a slicer of [2005] will limit any data returned to 2005. There are some considerations for the report writer when using a slicer:

1. Values returned will be only for the slicer condition. In the previous example, the [Revenue] measure value will be in context of the [2005] member for all calculations and values returned.
2. With a slicer for [2005], year over year comparisons will not be valid since the other years have been “sliced” from the returned data set.
3. Slicers can be sets of data. A valid slicer could be the set of the year members [2004] and [2005].
4. Unlike the SQL “where” clause, slicers do not remove members from other rows or columns. For example, slicing on [2004] will not remove [2005] if the years are displayed on the rows. The result is that [2005] is still present but does not show any measure values.
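In standard MDX, all three concepts can appear in one query; the cube and member names below are illustrative:

```mdx
SELECT
  { [Measures].[Revenue] } ON COLUMNS,           -- a set containing one measure
  { [Products].[Camping Equipment],
    [Products].[Outdoor Protection] } ON ROWS    -- a set of two members
FROM [Sales]
WHERE ( [Years].[2005] )                         -- slicer: every cell is in the context of 2005
```

Each cell returned is the value of a tuple such as ( [Camping Equipment], [Revenue], [2005] ).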

Detail Filter

A set of conditions in a query that narrow the scope of the data returned. A detail filter is applied
before aggregation is complete.

Summary Filter

A set of conditions in a query that narrow the scope of the data returned. A summary filter is
applied after aggregation is complete.
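The difference can be sketched with two Report Studio filter expressions (the data item names are illustrative):

```
Detail filter  : [Revenue] > 1000
                 (applied to individual rows, before totals are computed)

Summary filter : total([Revenue] for [Product line]) > 1000000
                 (applied to the aggregated values, after totals are computed)
```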

Burst Reports in Cognos

Bursting Reports

Bursting is the process of running a report once and then dividing the results for distribution to
recipients who each view only a subset of the data. For example, salespeople in different
regions each need a report showing the sales target for each country. You use burst reports to
send each salesperson the information they need. Burst reports can be distributed by email or
saved to a directory.

To create a burst report, you
❑ create a burst table in the source database
❑ import the database metadata into Framework Manager
❑ create the report and set burst options in Cognos Report Studio

Creating a Burst Table in the Data Source

To create a burst report, you must first create a burst table in the source database. The steps to
create a burst table depend on the database system you are using. The burst table must
contain the following information:

• a unique identifier
Tip: Some database systems do not require a unique identifier for each table.
• the list of recipients
• the data item that you want to burst on
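As a sketch, a minimal burst table keyed by country might look like this; the table and column names are illustrative, and the exact DDL syntax varies by database system:

```sql
CREATE TABLE BURST_TABLE (
    BURST_ID    INTEGER      NOT NULL PRIMARY KEY,  -- unique identifier
    RECIPIENT   VARCHAR(100) NOT NULL,              -- email address or directory entry
    COUNTRY     VARCHAR(50)  NOT NULL               -- the data item to burst on
);

INSERT INTO BURST_TABLE VALUES (1, 'tom@example.com',  'USA');
INSERT INTO BURST_TABLE VALUES (2, 'mary@example.com', 'Canada');
```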

Importing the Database Metadata into Framework Manager

After you create the burst table in the source database, import the database metadata into
Framework Manager. Then publish the resulting model to Cognos Connection to make it
available in Cognos Report Studio.

For more information, see the Framework Manager User Guide.

Create the Report and Set Burst Options in Cognos Report Studio

After the Framework Manager model is published to Cognos Connection, use it to build your
report. Ensure that the report contains the query item that represents the list of recipients, and
group it. Then set the burst options you want. In the burst options, two pieces of information, a
burst key and a burst recipient, are required. The burst key defines how the data should be
filtered and divided. The burst recipient determines who receives the subset of data.


Steps

1. Open the report that you want.

2. From the File menu, click Burst Options.

3. Select the Make report available for bursting check box.

4. Under Burst Key, in the Query box, click the query that contains the query item to be
distributed.

5. In the Level box, click the query item to be distributed in each burst report.

6. In the Label box, click the query item to be used to label each burst report.

7. Under Burst Recipient, in the Query box, click the query that contains the query item to be
used as the distribution list.

8. In the Item box, click the query item that contains the list of recipients.

9. In the Type box, select the type of recipients specified via the Item box.

• Click Email if the recipients are email addresses.
• Click Directory if the recipients are users, groups, roles, contacts, or distribution lists.
• Click Automatic to let ReportNet determine whether the recipients are email addresses
or directory entries.

10. Click OK.

After you set burst options for a report, enable the report for bursting in Cognos Connection. For
more information, see the ReportNet Administration and Security Guide.

Creating Burst Reports Using an SAP BW Data Source

You can burst a report that is based on an SAP BW data source by using burst information
stored within an SAP BW reporting object. It is assumed that you do not want to append
bursting information to existing SAP BW InfoCubes.

Note that bursting information can also be placed in any relational data source.

In the SAP Administrator Workbench, the characteristic must contain the burst information and
the values must represent a form of user identification, such as SAP BW user IDs. If you use
email addresses to burst, add an attribute that contains the user's email address to the
characteristic.

For each characteristic that is used as a burst key in Cognos Report Studio, add a presentation
hierarchy to the burst characteristic based on the burst key characteristic. The child nodes of
each of the presentation hierarchy nodes are the intended recipients of the reports that are burst by that particular burst key value.

For example, the burst information is contained in a characteristic called BurstInfo and the
values of the characteristic are

• Tom
• Fred
• Mary
• Liz

You want to burst reports by the values of the Country characteristic. Therefore, define the
presentation hierarchy BurstByCountry for the BurstInfo characteristic as follows:

• USA
  • Tom
  • Fred
• Canada
  • Mary
• Germany
  • Liz

In Framework Manager, you must include both the InfoCube that is the basis for reporting, and
the burst InfoCube, which is BurstInfo in the example above, in the model.


In Cognos Report Studio, you must create a master-detail report in which the master query
drives the report of interest and the detail query contains the burst information.

The master report must be grouped by a characteristic for which a corresponding
presentation hierarchy exists in the burst InfoCube.

The detail report is authored against the burst InfoCube and contains two columns, the query
item corresponding to the one used in the master report for bursting, and the query item that
contains the value on which bursting is performed. The value can be an email address or an
expression used to obtain address information based on the underlying authentication
mechanism used by Cognos Access Manager.

The detail report must not be visible, but must be evaluated by ReportNet when the report is
executed. Do the following:

• Place the detail report in a conditional block for which the box type is set to None.
• Link the master and detail reports with the expression [Master Burst Key] = [Detail Burst
Key].

When setting the burst options for the report, the master query provides the query items for the burst key and the detail report provides the query items for the burst recipients.


Determinants

Determinants

Determinants provide control over granularity in a way that is similar, but not identical,
to dimension information in Cognos ReportNet. A determinant can define the set of database
columns (query items) that uniquely identify a set of data, or it can identify a set of columns that
identify a non-unique set within the data.

Determinants are most closely related to the concept of keys and indexes in the data source and
are imported based on key and index information in the data source. We recommend that you
always review the determinants that are imported.

There is no concept of hierarchy in determinants. The order in which they are specified governs
the order in which they are evaluated.

Use determinants in the following cases:

• Joins exist at multiple levels of granularity for a single query subject.

An example is the Time dimension in the Go Data Warehouse sample model. There are joins
to the Time dimension on the day key and on the month key. Determinants are used for the
Time dimension when you want to prevent double-counting for multiple-fact queries.
For example, some facts join to time on month and some facts join to time on day. Specify
determinants for time to clearly capture the functional dependency between month and day as
a minimum to prevent double-counting for those facts that join at the month key.


• BLOB data types exist in the query subject.

Querying blobs requires additional key or index type information. If this information is not
present in the data source, you can add it using determinants. Override the determinants
imported from the data source that conflict with relationships created for reporting.
For example, there are determinants on two query subjects for multiple columns but the
relationship between the query subjects uses only a subset of these columns. Modify the
determinant information of the query subject if it is not appropriate to use the additional
columns in the relationship.

• A join is specified that uses fewer keys than a unique determinant that is specified for a query
subject.
If your join is built on fewer columns than what is stored in Framework Manager within the
determinants, there will be a conflict. Resolve this conflict by modifying the relationship to
fully agree with the determinant or by modifying the determinant to support the relationship.
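Sketched for the Time dimension example above (the key and attribute names are assumptions), the determinants might be specified in this order:

```
Determinant   Key        Attributes              Uniquely Identified   Group By
Year          YearKey    Year                    No                    Yes
Month         MonthKey   MonthName, YearKey      No                    Yes
Day           DayKey     Date, MonthKey          Yes                   No
```

With Month flagged as a group-by determinant, a join on the month key aggregates the time data to the month level instead of repeating it for every day, which is what prevents double-counting in multiple-fact queries.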

MDX in Cognos SAP Integration

MDX Functions

The example below shows that there is often a correlation between a version of “standard” MDX syntax and the Cognos 8 syntax. Report Studio also provides extensions and added capabilities to MDX that simplify complex calculations and aggregations.

Standard MDX (any of the following equivalent forms):

union( [Golf Equipment].Children, [Camping Equipment].Children )

{ [Golf Equipment].Children, [Camping Equipment].Children }

{ [Golf Equipment].Children } + { [Camping Equipment].Children }

Cognos MDX:

union( children( [Golf Equipment] ), children( [Camping Equipment] ) )


MDX functions can be broken down into five categories:

1. Member functions – used for retrieving specific members and their properties.

2. Set functions – working with sets of members.

3. Level and hierarchy functions – return information about dimensions and the hierarchies in them.

4. Member summary functions – total, aggregate, percentage values on a set or report.

5. Logic functions – if, contains, is null, etc.
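One illustrative Cognos function per category; the expressions are sketches using sample-model names:

```mdx
// Member  : currentMember( [Products].[Products] )
// Set     : union( children( [Golf Equipment] ), children( [Camping Equipment] ) )
// Level   : members( [Products].[Products].[Product line] )
// Summary : total( [Revenue] within set children( [Camping Equipment] ) )
// Logic   : if ( [Revenue] is null ) then ( 0 ) else ( [Revenue] )
```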


Report Studio generates the MDX query for the report writer as they drag data items into a list or crosstab. This section introduces key MDX concepts by examining how MDX is used by Report Studio through its graphical interface.


Tuple: The tuple() function is used whenever a report writer needs to explicitly retrieve a value. The syntax for tuple() from the Report Studio tooltip is:

tuple ( member { , member } )

Identifies a cell location (intersection) based on the specified members, each of which must be from a different dimension. Implicitly includes the current member from all dimensions not otherwise specified in the arguments. The current member of any dimension not specified in the evaluating context is assumed to be the default member of that dimension. The value of this cell can be obtained with the "value" function.

The tuple() function will return the value for the combination of every member included in the function.
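For example, to retrieve the revenue for camping equipment in 2005 (member names assumed from the sample model):

```mdx
tuple( [2005], [Camping Equipment], [Revenue] )
```

Because each member comes from a different dimension, the expression identifies a single cell: the 2005 camping-equipment revenue value.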


MDX Member : Report Studio uses the MDX members() function to return all [Product line] members when producing the underlying MDX query for the list. The members() function will return the set of members in the [Product line] level of the products dimension. The definition of the members function in Report Studio is:


members ( hierarchy | level )


This returns the set of members in a hierarchy or level. In the case of a hierarchy, the order of the members in the result is not guaranteed; if a predictable order is required, an explicit ordering function (such as hierarchize) must be used.


“Returns the set of members” is easy to visualize using Report Studio. Examining the “Insertable Objects” pane source tab (Figure 3), the [Product line] level of the [Products] dimension will show the members of the [Product line] level, [Camping Equipment] through [Personal Accessories].


Sets: Similar to the members() function, Report Studio uses the MDX set() function to create a set of members when specific members are dragged onto a report layout. Using a crosstab (to avoid nesting in a list), [Camping Equipment] and [Outdoor Protection] could be placed into the rows of a crosstab and [Revenue] into the measures.


The MDX set() function could be used in a query data item with the expression set([Camping Equipment], [Outdoor Protection]) instead of dragging individual members.

Sets are the key to driving many reports, including creating custom aggregations over sets of data, or choosing what data to display.