Dynamic Cube Deployment Steps


Set up routing rules & dynamic cube deployment

By Rich Babicz / Srini Motupalli

Table of Contents

Setup of JDBC connection
Setting up routing rules in a distributed Cognos environment
Designing a star-schema model
Export of Dynamic Cube Content
Import of Dynamic Cube in a new environment
Publishing Cognos Dynamic Cube
Starting Dynamic Cube


Setting Up Routing Rules & Dynamic Cube Deployment Steps

Setup of JDBC connection

Description: Dynamic cubes require a JDBC data source to load data from the database. Follow the steps below to set up the JDBC connection.

1. Install the BI Server on a Windows/UNIX environment.

2. Set up the JDBC connection by copying the OJDBC files from the Oracle install location to the Cognos install location:

a. On the computer where the Oracle client is installed, go to the ORACLE_HOME/jdbc/lib directory.

b. Locate the JDBC driver file appropriate to your Java version (ojdbc5.jar) and copy it to the c10_installation\webapps\p2pd\WEB-INF\lib directory.

3. Test the JDBC connection and make sure that it is successful.
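A quick standalone check can catch driver or connectivity problems before the cube is involved. The sketch below is a minimal example and not part of the documented procedure; the host, port, SID, and credentials are placeholders, and it must be run with ojdbc5.jar on the classpath.

```java
import java.sql.Connection;
import java.sql.DriverManager;

// Minimal JDBC connectivity smoke test. Compile and run with the Oracle
// driver on the classpath, e.g.:  java -cp .;ojdbc5.jar JdbcSmokeTest
public class JdbcSmokeTest {
    public static void main(String[] args) throws Exception {
        // ojdbc5 predates JDBC 4 driver auto-loading, so register explicitly.
        Class.forName("oracle.jdbc.OracleDriver");

        // Placeholder host, port, SID, and credentials -- substitute your own.
        String url = "jdbc:oracle:thin:@dbhost:1521:ORCL";
        try (Connection conn = DriverManager.getConnection(url, "cognos_user", "password")) {
            System.out.println("Connected: " + conn.getMetaData().getDatabaseProductVersion());
        }
    }
}
```

If this prints the database version, the same driver jar copied into the Cognos lib directory should allow the query service to connect as well.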

Setting up routing rules in a distributed Cognos environment

Description: Set up routing rules to route reporting requests to different servers based on the routing rules set on each package.

1. Ensure the DC dispatcher is up and running successfully with 32-bit processing; run a classic report to confirm.

2. Set up the DC JDBC data source and test connectivity.

3. Define server groups explicitly on each dispatcher:

a. Go to the dispatcher Tuning settings in Cognos Administration and assign the server group property.

b. Assign a value of 32brptg for the CQM dispatchers and 64brptg for the DQM cube dispatcher.

4. Publish the cube to the cube dispatcher, assign the server group, and start the cube. Refer to the Publish cube steps doc and in-memory agg loads.

5. Run a WA report on dimension members only. Expected result: requests fail occasionally (2 of 3 run attempts) when the request hits a dispatcher without the cube, and succeed when the request is routed to the cube dispatcher (see the routing sketch after this list).

6. Run a report from Cognos Connection on a CQM package. Expected result: requests always succeed regardless of which dispatcher is hit, because all dispatchers are in 32-bit execution mode, which can handle both CQM and DQM processing.

7. If any of the above tests are unsuccessful, evaluate changes before attempting an alternative test method; otherwise move to step 9. The alternative method is to bring down report services on the two CQM dispatchers and leave the active CM up. Requests will then be routed to the active cube dispatcher and will succeed on every attempt.

8. Run a test report on the cube, which should succeed on every attempt. If successful, start up the 32-bit dispatchers and move to step 9.


9. If the above tests are successful, change the cube dispatcher's report execution mode setting to 64-bit, stop the services, save, and restart the dispatcher services.

10. Run a report similar to step 6. Expected result: requests will now occasionally fail (1 of 3 attempts on average) because the 64-bit dispatcher cannot handle CQM requests.

11. Set up a new routing rule from the Cognos Administration Configuration tab, under Dispatchers and Services.

12. Click the routing rule icon and add a new rule. Type in the exact name of the package, select the 64brptg server group, and save. Go to the cube package properties and 'Set' the routing rule on the package.

13. You may have to set up a rule to act as a 'catch-all' bucket for all non-routed requests. This needs investigation, as it is newly changed behavior in 10.2.1.

14. Set up routing rules for the CQM packages in a similar manner, but select the 32brptg server group.

15. Test a cube report with all dispatchers up and running. The report should now succeed on every attempt.

16. Test a CQM report. The report should now succeed on every attempt.
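To make the expected failure ratios in steps 5 and 10 concrete, here is a purely illustrative sketch, not Cognos internals: it assumes three dispatchers receiving requests round-robin with the cube hosted on only one, which is why roughly 2 of 3 unrouted requests fail until a routing rule pins the cube package to the 64brptg server group.

```java
import java.util.Arrays;
import java.util.List;

// Illustrative only: models why unrouted cube requests fail on roughly
// 2 of 3 attempts. The dispatcher names and the round-robin assumption
// are hypothetical, not Cognos internals.
public class RoutingRuleDemo {
    public static void main(String[] args) {
        List<String> dispatchers = Arrays.asList("cqm-disp-1", "cqm-disp-2", "dqm-cube-disp");
        String cubeHost = "dqm-cube-disp"; // only this dispatcher hosts the dynamic cube

        // No routing rule: requests spread across all dispatchers.
        for (int attempt = 1; attempt <= 6; attempt++) {
            String target = dispatchers.get((attempt - 1) % dispatchers.size());
            System.out.printf("attempt %d -> %s : %s%n", attempt, target,
                    target.equals(cubeHost) ? "OK" : "FAIL (cube not on this dispatcher)");
        }

        // With a routing rule pinning the package to the cube's server group,
        // every request targets cubeHost, so every attempt succeeds.
        System.out.println("with routing rule -> " + cubeHost + " : OK");
    }
}
```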

Misc. Items

Investigate the publish setting "Add dynamic cube to default dispatcher". How do we change the default dispatcher configuration during the publish process?

Test a deployment package of the cube from sandbox to DEV to evaluate whether the in-memory agg loads are successful without the use of the Aggregate Advisor.


Designing a star-schema model

1. Design and build the dynamic cube with the required dimension and fact data (measures) from Dynamic Cube Designer.

2. Update the local Dynamic Cube Designer configuration to the cognosdevcube64 dispatcher and gateway.

3. Publish the package, start the cube, and verify the data in WA.

Note: Publish with "Add dynamic cube to default dispatcher" unchecked; otherwise the cube will be assigned to the default dispatcher.

Export of Dynamic Cube Content

Description: Once the dynamic cube is published and ready for migration to other environments, below are the detailed steps to export the cube.

1) Take a backup of the existing DEV/PROD content store and upload it to version control.

2) Take a second export of only the data sources and sign-ons (check box selected) from the DEV/PROD environments and upload it to version control.

3) Export the dynamic cube package and the respective data sources as per the steps below:

a) Go to Content Administration under the Configuration tab of Cognos Administration.

b) Provide a name for the new export.

c) Select public folders, directory, and library content.


d) Add packages by selecting them from public folders.

e) Leave all other options as default (screenshot below).

f) Select "Include data sources and connections" but do not select "Include sign-ons".


Note: If we select "Include sign-ons", the sign-on information will also be imported into the DEV/PROD environment and will later have to be changed manually. Under "Conflict resolution", select "Keep existing entries" if you already have other relational data sources in the DEV/PROD environments; otherwise, data sources from DEV will overwrite the data sources in PROD, which should not happen.

g) Under "Entry ownership", set the owner to "The user performing the import".


Note: If the reports/packages already exist in the PROD environment, then use "New Entries Only".

h) Leave the default in the "deployment archive" folder.


i) Review the summary tab:

j) Under "Select an action", select "Save and run once" and click Finish.


Import of Dynamic Cube in a new environment

Description: The dynamic cube will be imported into a different environment (other than DEV) to make the cube accessible to users. Below are the detailed steps to import it.

Copy the exported zip file from sandbox/DEV to the DEV/PROD deployment folder, then import the content from the deployment file (DynamicCubes_Export_1024). Below are the detailed steps:

a. Click the "Import" button in Configuration > Content Administration.

b. Select “Dynamic_Cubes_export_1024” and click Next

c. Enter the password given during the export from the other environment (sandbox/DEV) and click Ok.

d. Either change the name of the import or leave it as it is; in the screenshot below the name was changed from Export to Import. Click Next.

e. Select the package/folders from the list shown and click Next.


f. Leave all other options at their defaults, or change any of the options set during the export if needed, and click Next. (If the "Include sign-ons" option is unchecked, then the access account has to be set at the Trade_Cube data source level once it is imported.)

g. Leave the remaining options at their defaults and click Next.


h. Review the summary in the New Import wizard and click Next.

i. Select the "Save and Run Once" option and click Finish.


j. In "Run with options – Dynamic_Cubes_Import_1024", select Now and click the Run button.

k. In the next tab, select "View the details of this import after closing this dialog", check for any errors during the import, and click Ok.

l. After the import, observe that only the "Trade_Cube" data source timestamp changed, with no change to the other data sources (see the snapshot below; this import was conducted on October 27th).


Publishing Cognos Dynamic Cube

Description: Publish the dynamic cube for use by the business for analytics.

Once the import of the package and data source is complete, make sure that the respective objects were imported by manually checking for them in their respective folders.

1) Log in to the Cognos environment with the "cognos8Admin" account and set the Trade_Cube data source "access account" to cognos8Admin by searching in AD, then click Ok. Please find the snapshot below.

2) Go to Status > Data Stores and assign "Trade_Cube" to the proper dispatcher group (64brptg) by selecting "Add Datastore to Server Group", then click Ok.


3) Set the query service JVM heap size settings as required. This setting has to be based on the memory needed for the member cache, data cache, aggregate cache, and temp space (see the sizing sketch after this list).

4) Go to Trade_Cube > Set Properties. Also set the aggregate cache, result set cache, and data cache of the respective cubes as per the data and member volumes, and click Ok.


5) Go to the Status > System tab and start the cube by right-clicking it and selecting "Start".

If the Cognos dynamic cube cannot be started, make sure that the user's credentials are renewed. If they are not, go to My Preferences > Personal and click "Renew the credentials".


6) Once the cube is started, its status will show as "Available"; make sure that all the recommended aggregates are loaded.

7) Validate the data by running some sample reports.
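As a rough aid for step 3, the arithmetic below sketches one way to size the query service heap: sum the planned member, data, and aggregate caches and add headroom for temp space. Every figure is a placeholder assumption, not a measured or recommended value; substitute your own cube's numbers.

```java
// Back-of-envelope heap estimate for the query service JVM. All figures
// are assumed placeholders for illustration only.
public class HeapSizingSketch {
    public static void main(String[] args) {
        long memberCacheMb    = 1024; // assumed: all dimension members
        long dataCacheMb      = 2048; // assumed: configured data cache
        long aggregateCacheMb = 4096; // assumed: in-memory aggregate budget
        double headroom       = 1.25; // assumed: temp space and GC slack

        long heapMb = Math.round((memberCacheMb + dataCacheMb + aggregateCacheMb) * headroom);
        System.out.println("suggested JVM heap for the query service: " + heapMb + " MB");
    }
}
```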

Generate Aggregates using Cognos Dynamic Query Analyzer (Optional)

Once the dynamic cube is published and started, it loads all the dimensional members into the member cache, but it does not load fact data until the aggregates are built and saved against the dynamic cube in the content store.

Below are the detailed steps to generate aggregates and save them against the dynamic cube in the content store.

1) Stop the cube, then enable workload logging on the dynamic cube: right-click the cube > Set Properties.


2) Set "Maximum space for in-memory aggregates (MB)" to zero, otherwise the cube will load any pre-existing aggregates into memory, and click Ok.

3) Stop and start the cube. The workload logging file will be created in a folder (named after the cube) at the following location on the server: C:\Program Files\ibm\cognos\c102_64\logs\XQE\ROLAPCubes (see the check sketch after this list).

4) Run any prebuilt or designed reports with a sample set of data (there is no need to run reports against the full volume of data).

5) The workload_log file in the ROLAPCubes folder will be updated with metadata from the executed reports, which DQA will use when we start building aggregates.

6) Start the Aggregate Advisor from the Dynamic Query Analyzer tool, select the cube for which you want aggregate recommendations, and click Next.


7) Select one of the three options below to generate aggregate recommendations:

a) Cube structure and query workload information

b) Cube structure only (just uses the dynamic cube model structure)

c) Query workload information only (takes only the workload logging information into consideration)

d) Set the approximate maximum size of the in-memory aggregates and the in-database aggregates, set the maximum advisor run time limit, and click Next.


e) If there is a workload_log.xml with metadata to be used, you will see those reports in the Report names section of this page; select the required reports to be used for the advisor recommendations.


f) Click Finish. The advisor runs for a couple of minutes, depending on the fact data volume, and provides recommendations as shown below.


g) If it generates all the required aggregate recommendations, save them to the content store to map them to the respective dynamic cube:

File > Apply Selected In-memory aggregates


h) Click Ok to apply the recommendations against Trade_Cube in the content store.
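Before running the Aggregate Advisor, it is worth confirming that the report executions in step 4 actually produced workload metadata. The sketch below simply lists workload log files under the cube's ROLAPCubes folder; the install path and the Trade_Cube folder name follow this document's environment and should be adjusted to yours.

```java
import java.io.IOException;
import java.nio.file.DirectoryStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

// Lists workload log files for a cube so you can confirm the query service
// captured report executions before running the Aggregate Advisor.
public class WorkloadLogCheck {
    public static void main(String[] args) throws IOException {
        // Path taken from this document's environment -- adjust to your install.
        Path cubeLogDir = Paths.get(
                "C:\\Program Files\\ibm\\cognos\\c102_64\\logs\\XQE\\ROLAPCubes\\Trade_Cube");
        if (!Files.isDirectory(cubeLogDir)) {
            System.out.println("No workload log folder yet: " + cubeLogDir);
            return;
        }
        try (DirectoryStream<Path> logs = Files.newDirectoryStream(cubeLogDir, "workload_log*.xml")) {
            for (Path log : logs) {
                System.out.printf("%s (%d bytes, modified %s)%n", log.getFileName(),
                        Files.size(log), Files.getLastModifiedTime(log));
            }
        }
    }
}
```

An empty or missing folder suggests workload logging was not enabled when the sample reports ran, in which case repeat steps 1 through 4 before starting the advisor.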


Starting Dynamic Cube

Uncheck the workload logging and click Ok.

Start/restart the dynamic cube and it will load all the newly recommended aggregates.
