Monday 3 December 2012

Standalone Essbase using external authentication

I have not actually been involved with any clients that use Essbase in standalone mode since pre-version 9 days, and to be honest I don't really see the point in it nowadays; it is not really true standalone as you still need to configure the Shared Services registry. Having said that, I do notice from forum posts that it is still being used, and the question about using external authentication has cropped up a few times.

I must admit I have never deployed Essbase in standalone mode in version 11 and I didn't know how external authentication could be set up, so instead of letting it niggle at me I thought I would look into it in some more detail.

In earlier versions you could populate an xml based file with the external directory information and update the essbase.cfg with details to the file using the AuthenticationModule setting like so:
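To illustrate the old approach (the file name and location here are just placeholders, so treat this as a sketch rather than a definitive syntax reference), the essbase.cfg entry pointed at the XML file along these lines:

AuthenticationModule CSS file:///C:/Hyperion/AnalyticServices/bin/css_config.xml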


If you try this in version 11.1.2.2 (maybe the same for earlier versions of 11) then the output in the log lets you know this method is not possible anymore.

[Sun Dec 01 11:23:03 2012]Local/ESSBASE0///4604/Error(1051223)
Single Sign On function call [css_init] failed with error [CSS Error: CSS class not found: NativeCAppImpl]


So what is the answer? Well, Essbase version 11 in standalone mode using external authentication does require Shared Services, but you don't have to deploy the web application. As always, you don't want me waffling on and you want to see how it is done, so here goes.

The process I carried out is on 11.1.2.2, but I am sure it is pretty similar for all version 11 releases; I am not going to go into too much detail as I am sure you will get the idea.



When you start the installer make sure Foundation Components is selected (you don’t need Oracle HTTP Server if you have extracted the files) and of course Essbase Server is selected.

Once the installation has successfully completed start up the configurator.


You will need a relational database for Shared Services and the Registry.


The only requirement is to select "Configure Essbase Server".


Enabling “Deploy Essbase in standalone mode” is the most important configuration step, and unless there is a definite need the other settings can be left at their defaults.


As the Foundation components were installed, a username and password are required for the admin account; this will be used as the admin account for logging into Essbase even though Shared Services has not been deployed.


Once Essbase has been deployed into standalone mode and the service started you should be able to log into Essbase using Maxl with the account information provided in the configurator.
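For example, a quick Maxl sanity check (the admin username, password and host are whatever was entered in the configurator, so the values below are just placeholders):

login admin password on localhost;
display application all;
logout;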


If you have an instance of EAS then you can see that Essbase is in standalone mode as the options are available to create new users/groups.

Well, that is Essbase easily set up in standalone mode, so what about setting up external authentication?


The first step is to add “AuthenticationModule CSS” to the essbase.cfg.


Once the configuration has been updated, Essbase needs restarting to pick up the new setting; the essbase.log should show that the external authentication module is enabled.

The next step is to configure the external authentication provider details, which would be quite simple if Shared Services was deployed, but as it is not, the trick is to manipulate the Shared Services registry using the epmsys_registry utility.

The directory configuration used to be an XML file held on the file system, but from version 11 it was moved into the registry; I blogged about this a number of years ago and you can read about it in more detail here.

The epmsys_registry utility is available in

<MIDDLEWARE_HOME>\user_projects\<instancename>\bin

To extract the configuration file the following can be executed from command line.

epmsys_registry.bat view FOUNDATION_SERVICES_PRODUCT/SHARED_SERVICES_PRODUCT/@CSSConfig



This will extract the file to the same location as the utility.



The file extracted will be named Comp_1_SHARED_SERVICES_PRODUCT_CSSConfig



The file can be edited with any text editor and currently will only contain the native provider information.

The file now needs updating with the external directory information. I cheated a little and examined a file from a different EPM instance that was configured to use external authentication to get an understanding of the structure; the structure is quite similar no matter which version you are using.

Here is an example of the file once I had updated it with information to connect to a Microsoft Active Directory:


Once updated I saved the file in the same location as the original export.


The file format is not important, but I saved it as XML so I could easily view it in a structured way.

The utility is used again to import the file into the Shared Services registry and replace the existing configuration.

epmsys_registry.bat updatefile FOUNDATION_SERVICES_PRODUCT/SHARED_SERVICES_PRODUCT/@CSSConfig E:\Oracle\Middleware\user_projects\essbase\bin\CSSConfig.xml


Once successfully imported the essbase service is restarted to pick up the new configuration.

To test the configuration I first inspected the Essbase related logs to make sure there were no issues and then created a new user in Essbase matching a user that exists in the external directory.


Using EAS, a new user was created with the authentication type selected as “Use external authentication”.


The user was given "Write" access to the Sample.Basic database.


The same could also be achieved using Maxl if EAS is not available.
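As a minimal Maxl sketch (assuming a hypothetical directory user called jsmith), something along these lines should do the job:

create user jsmith type external;
grant write on database Sample.Basic to jsmith;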


To confirm that the external authentication was working I logged into EAS using the newly created user and all was as expected.

It is worth pointing out that if you have deployed EAS using the same Shared Services registry then EAS will also be using the external authentication.

So there you go; it is not something I will probably have to do very often, but I am sure it will help those baffled about what to do.

Wednesday 31 October 2012

Planning 11.1.2.2.300 Outline Load Utility Enhancements Part 3

In the previous two parts I went through the majority of the new functionality available with the Outline Load Utility. There is one remaining piece which I will quickly go through today, and that is extracting data to a flat file.

Loading of metadata and data can now be achieved from a relational source, but unfortunately the new extract functionality can only target a flat file; I am sure a relational target option will be added in future releases.

In most cases you will still probably extract data directly from Essbase as it is the optimal solution but if you want to extract text, Smart List and date data then it is now possible using the utility.

I am not going to cover any of the new Outline Load Utility properties again so if you have not done so it is probably worth having a read of the last couple of blogs.

In the last part I loaded the following data set to planning.


and the data form design:


The account member data types are:
Full Name – text member
Role – Smart List
Start Date – Date
Salary – Currency
Bonus – percentage


Now I will reverse the process and extract the data to a flat file.

There are a couple of parameters that I have not covered previously and will be required to extract data.

/ED: – specifies the output path and filename to export to
e.g. /ED:E:/PLAN_SCRIPTS/Extract/dataExport

Note: data is exported into files of the form output_file.1-1.csv through output_file.n-n.csv, where n is the number of files generated.

/EDD: - specifies the data that is to be exported and is in the format of
“Data Load member(s)”, “Driver Dimension member(s)”, “Point of View”, “Plan Type name”

So from my example form above the format would be
/EDD: EMP01,"Full Name, Role, Start Date, Salary, Bonus","FY12,Actual,Working,Local,Ignore,Begbalance",Consol

These parameters can be added to a properties file
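For reference, a sketch of what dataextract.properties could contain (the values are lifted straight from the merged command line shown in the log output below):

/A:PLANDEMO
/U:epmadmin
/S:FUSION11
/DF:DD-MM-YYYY
/ED:E:/PLAN_SCRIPTS/Extract/dataExport
/EDD:EMP01,"Full Name, Role, Start Date, Salary, Bonus","FY12,Actual,Working,Local,Ignore,Begbalance",Consol
/L:E:/PLAN_SCRIPTS/Logs/dataExp.log
/X:E:/PLAN_SCRIPTS/Logs/dataExp.err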


Now all that is left is to run the utility from command line passing in the required parameters.


E:\Oracle\Middleware\user_projects\epmsystem1\Planning\planning1\OutlineLoad.cmd -f:E:\PLAN_SCRIPTS\password.txt  /CP:E:\PLAN_SCRIPTS\dataextract.properties

In my example I only required the password file and the property file as all the parameters were set in the property file.

Executing the command line produces the following output log

Property file arguments: /DF:DD-MM-YYYY /X:E:/PLAN_SCRIPTS/Logs/dataExp.err /A:PLANDEMO /ED:E:/PLAN_SCRIPTS/Extract/dataExport /S:FUSION11 /L:E:/PLAN_SCRIPTS/Logs/dataExp.log /U:epmadmin /EDD:EMP01,"Full Name, Role, Start Date, Salary, Bonus","FY12,Actual,Working,Local,Ignore,Begbalance",Consol

Command line arguments: /CP:E:\PLAN_SCRIPTS\dataextract.properties

Submitted (merged) command line: /DF:DD-MM-YYYY /X:E:/PLAN_SCRIPTS/Logs/dataExp.err /A:PLANDEMO /ED:E:/PLAN_SCRIPTS/Extract/dataExport /S:FUSION11 /L:E:/PLAN_SCRIPTS/Logs/dataExp.log /U:epmadmin /EDD:EMP01,"Full Name, Role, Start Date, Salary, Bonus","FY12,Actual,Working,Local,Ignore,Begbalance",Consol

Successfully logged into "PLANDEMO" application, Release 11.122, Adapter Interface Version 5, Workforce supported and not enabled, CapEx not supported and not enabled, CSS Version 3

[Sun Oct 28 14:47:50 GMT 2012]A cube refresh operation will not be performed.

[Sun Oct 28 14:47:50 GMT 2012]Create security filters operation will not be performed.

[Sun Oct 28 14:47:50 GMT 2012]Examine the Essbase log files for status if Essbase data was loaded.

[Sun Oct 28 14:47:50 GMT 2012]Planning data export operation finished. 1 data export file was written: E:/PLAN_SCRIPTS/Extract/dataExport.1-1.csv, 1 data export record was written (total all files).
As you can see from the log one export file was created called dataExport.1-1.csv and one record was exported.


Just like with loading data it is possible to use planning member functions in the /EDD definition, which means you can include ILvl0Descendants(member), Children(member), Ancestors(member), etc.
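For example, to produce the export described next, the export data definition could be changed to something along these lines (a sketch only, assuming the employee members sit under a parent called Employees):

/EDD:Children(Employees),"Full Name, Role, Start Date, Salary, Bonus","Descendants(Year),Actual,Working,Local,Ignore,Begbalance",Consol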


Running the utility again should now export the child members of Employees and Descendants of Year.


Well, that finally covers all the new functionality in the 11.1.2.2.300 release of the Outline Load utility; apologies for taking so long to get it completed, but time has been very limited lately.

ODI 11.1.1.6 – Extracting Essbase attribute members bug + fix

Just another really quick update from me today. I have seen a few posts on this issue in the past and noticed there is an actual fix available now, so I thought I would go through the problem and the fix.

The issue definitely exists on 11.1.1.6 and I believe it exists in unpatched 11.1.1.5 and 10g.

Say you want to extract the hierarchy of an attribute dimension from an Essbase outline; it certainly should be possible using the “LKM Hyperion Essbase METADATA to SQL”.


In this example I will try and extract the population attribute hierarchy from everybody’s favourite Sample Basic.


If you reverse the Sample Basic database there will be a Datastore created for each attribute dimension with a number of property columns.


I created a simple interface which will extract the parent, child and alias members from the population attribute dimension and write these to a flat file.


Unfortunately when you run the interface it fails at the “Begin Essbase Metadata Extract” step with the following error:

org.apache.bsf.BSFException: exception from Jython:
Traceback (most recent call last):
  File "<string>", line 91, in <module>
    at com.hyperion.odi.essbase.ODIEssbaseMetaReader.validateSourceMetadata(Unknown Source)
    at com.hyperion.odi.essbase.ODIEssbaseMetaReader.validateExtractOptions(Unknown Source)
    at com.hyperion.odi.essbase.AbstractEssbaseReader.beginExtract(Unknown Source)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
    at java.lang.reflect.Method.invoke(Method.java:597)

So that was the end of the road on 11.1.1.6, until a recent patch was made available.


You can get more information and download patch 13093442 here.


Once downloaded, the patch can easily be applied using OPatch; just make sure none of the ODI components are running before applying it.
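If you have not used OPatch before, the process is roughly as follows (the ODI home and patch locations are just example paths and will differ per environment):

set ORACLE_HOME=E:\Oracle\Product\ODI_11.1.1.6
cd /d E:\patches\13093442
%ORACLE_HOME%\OPatch\opatch apply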

Basically the patch copies a new version of odihapp_essbase.jar to

\oracledi\client\jdev\extensions\oracle.odi.navigator\lib\
\oracledi.sdk\lib\


After applying the patch the interface can be executed again.


This time it is much more promising as the interface completes successfully.


The target flat file is populated with the full attribute hierarchy even if it is in its usual unhelpful order.

Sunday 30 September 2012

Planning 11.1.2.2.300 Outline Load Utility Enhancements Part 2

In the last blog I went through some of the new functionality with the Outline Load Utility and today I am going to continue and go through loading data from a relational source.

I am not going to cover any of the properties again so if you have not done so it is probably worth having a read of the last blog.

Loading data through the planning layer is great if you want to load text, Smart List and date data; if loading numeric data then personally I would stick to loading directly to Essbase, as it is much more efficient and has a lot of additional functionality.

In an earlier blog I went through the process of loading text data from a flat file using the Outline Load utility and to be honest the concept is nearly identical for loading from a relational source.

Anyway, no need for any more waffle, so let’s get on with the process.


In my example I have created a simple form to show employee information by entity, and instead of entering the information manually I am going to load it from a relational table.

The account member data types are:
Full Name – text member
Role – Smart List
Start Date – Date
Salary – Currency
Bonus – Percentage
 

The evaluation order has been set in planning so the correct data types are displayed in the form.


The source data is by entity, year and account; the rest of the data is static, so it can be fixed when writing the query to load to the planning application.

Now, if you have ever loaded data to planning then you have probably come across the data load settings that define the data load dimension and the driver dimension and members.


In my example the data load dimension will be the entity dimension and the driver dimension will be Account with the members “Full Name”, “Role”, “Start Date”, “Salary” and “Bonus”.


You can still use the data load settings method for loading from a relational source or use a new parameter /SDM which allows the information to be set from command line or using a property file.

So for my example instead of using the data load settings functionality in planning I can use

/SDM:Entity, “Full Name, Role,  Start Date, Salary, Bonus”, Consol

This sets the data load dimension as Entity and the driver members as “Full Name”, “Role”, “Start Date”, “Salary”, “Bonus” and the third property is the plan type to apply these to.

Besides the data load dimension and driver members the “Point-of-View” and “Data Load Cube Name” also need to be provided in the relational query.

The “Point-of-View” is a comma separated list of all the remaining dimension members in the plan type and this is required so the intersection on where to load the data to is fully defined.

The remaining dimensions in my example application are Scenario, Version, Currency, Period, Segments and the members are all fixed as displayed in the form design:


The “Data Load Cube Name” is pretty obvious and is the plan type/essbase database name to load the data to which in my case is “Consol”.

A simple SQL query can be created to return the data in the correct format though it is worth noting the column names have to match the properties required for planning.
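The query I used is shown below (it is also echoed back in the output log further down):

SELECT Entity,
       full_name as "Full Name",
       role,
       to_char(start_date, 'dd-mm-yyyy') as "Start Date",
       Salary,
       Bonus/100 as "Bonus",
       'FY' || substr(Year,3,2) || ',Actual,Working,Local,Ignore,BegBalance' as "Point-of-View",
       'Consol' as "Data Load Cube Name"
FROM PLANDATA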


The query can be added to the command line or to a properties file which I will be using.


I covered most of the properties in the last blog so hopefully they should all make sense.

As there is a date member being loaded /DF should be used to define the date format and the available values are:
/DF:MM-DD-YYYY  The date data type values on the source data load are assumed to be month-day-year. For example, 12-25-2011
/DF:DD-MM-YYYY The date data type values on the source data load are assumed to be day-month-year. For example, 25-12-2011
/DF:YYYY-MM-DD The date data type values on the source data load are assumed to be year-month-day. For example, 2011-12-25

Now all that is left is to run the utility from command line passing in the required parameters.


E:\Oracle\Middleware\user_projects\epmsystem1\Planning\planning1\OutlineLoad.cmd -f:E:\PLAN_SCRIPTS\password.txt  /CP:E:\PLAN_SCRIPTS\dataload.properties

In my example I only required the password file and the property file as all the parameters were set in the property file.

The output log generated was:
Property file arguments: /RIU:PLANSTAGE /D:Entity /RIR:jdbc:oracle:thin:@[fusion11]:1521:FUSION /DF:DD-MM-YYYY /RIP:TOzyauMwe2gtUQ9tjidf1Zq2pCA8iroN4i7HQxssFKUaogr16fi+WKmHFTD8NIIs /X:E:/PLAN_SCRIPTS/Logs/datald.err /RIQ:DATA_SQL /A:PLANDEMO /S:FUSION11 /L:E:/PLAN_SCRIPTS/Logs/datald.log /U:epmadmin /RIC:FUSION_CONN /SDM:Entity,"Full Name, Role,  Start Date, Salary, Bonus", Consol /RID:oracle.jdbc.OracleDriver /IR:E:/PLAN_SCRIPTS/metaload.properties

Command line arguments: /CP:E:\PLAN_SCRIPTS\dataload.properties

Submitted (merged) command line: /RIU:PLANSTAGE /D:Entity /RIR:jdbc:oracle:thin:@[fusion11]:1521:FUSION /DF:DD-MM-YYYY /RIP:TOzyauMwe2gtUQ9tjidf1Zq2pCA8iroN4i7HQxssFKUaogr16fi+WKmHFTD8NIIs /X:E:/PLAN_SCRIPTS/Logs/datald.err /RIQ:DATA_SQL /A:PLANDEMO /S:FUSION11 /L:E:/PLAN_SCRIPTS/Logs/datald.log /U:epmadmin /RIC:FUSION_CONN /SDM:Entity,"Full Name, Role,  Start Date, Salary, Bonus", Consol /RID:oracle.jdbc.OracleDriver /IR:E:/PLAN_SCRIPTS/metaload.properties

Successfully logged into "PLANDEMO" application, Release 11.122, Adapter Interface Version 5, Workforce supported and not enabled, CapEx not supported and not enabled, CSS Version 3

[Sun Sep 30 15:56:38 BST 2012]Setting Driver Members as specified with the /SDM switch.

A query was located in the Command Properties File "E:\PLAN_SCRIPTS\dataload.properties" that corresponded to the key passed on the Input Query switch (/RIQ) "DATA_SQL" so it's corresponding value will be executed as a query: "SELECT Entity, full_name as "Full Name",role,to_char(start_date, 'dd-mm-yyyy') as "Start Date",Salary,Bonus/100 as "Bonus", 'FY' || substr(Year,3,2) || ',Actual,Working,Local,Ignore,BegBalance' as "Point-of-View",'Consol' as "Data Load Cube Name" FROM PLANDATA"

Attempting to make input rdb connection with the following properties: catalog: FUSION_CONN, driver: oracle.jdbc.OracleDriver, url: jdbc:oracle:thin:@[fusion11]:1521:FUSION, userName: PLANSTAGE

Source RDB "FUSION_CONN" on jdbc:oracle:thin:@[fusion11]:1521:FUSION connected to successfully.

Connection to input RDB made successfully.

[Sun Sep 30 15:56:38 BST 2012]Date format pattern "DD-MM-YYYY" specified for date data type loading.

[Sun Sep 30 15:56:38 BST 2012]Header record fields: ENTITY, Full Name, ROLE, Start Date, SALARY, Bonus, Point-of-View, Data Load Cube Name

[Sun Sep 30 15:56:38 BST 2012]Located and using "Entity" dimension for loading data in "PLANDEMO" application.

[Sun Sep 30 15:56:39 BST 2012]Load dimension "Entity" has been unlocked successfully.

[Sun Sep 30 15:56:39 BST 2012]A cube refresh operation will not be performed.

[Sun Sep 30 15:56:39 BST 2012]Create security filters operation will not be performed.

[Sun Sep 30 15:56:39 BST 2012]Examine the Essbase log files for status if Essbase data was loaded.

[Sun Sep 30 15:56:39 BST 2012]Planning Outline data store load process finished. 2 data records were read, 2 data records were processed, 2 were accepted for loading (verify actual load with Essbase log files), 0 were rejected.

No errors were generated in the log and all records loaded successfully.


Running the form again confirmed that all the data has been loaded in the correct format.

There is another parameter available which I have not mentioned:

/ICB:blockSpecification

This allows you to define an intersection in the Essbase database to clear before the data is imported; this functionality has been included because the utility does not currently load #missing values, and if values already exist they will not be overwritten.

So say the same data was loaded again but this time there was no Bonus data.


The Bonus member data will not be set to #missing and will still contain 10%.

The format for the ICB parameter is similar to how the data is loaded so it requires the data load dimension member, the driver members, the point of view and the data load cube name.

/ICB:EMP01,"Full Name, Role,  Start Date, Salary, Bonus","FY12,Working,Version,Actual,Local,BegBalance,Ignore",Consol


The /ICB values that I have used will clear out all the data in the form before it is loaded.


As the data is now cleared before the import this time the Bonus member contains the correct data.


From examining the essbase application log you will be able to view the clear data command that has been executed.

It is also possible to use planning member functions in the /ICB definition, which means you can include ILvl0Descendants(member), Children(member), Ancestors(member), etc.


Well, that about covers loading data from a relational source. There is one more new piece of functionality with the Outline Load utility, and that is extracting data to a CSV file, which I will try and go over whenever I find the time.

Tuesday 18 September 2012

Planning 11.1.2.2.300 Outline Load Utility Enhancements

I noticed that the recent patch release of planning 11.1.2.2.300 includes some additional functionality for the outline load utility that is worth going through.

With each release the outline load utility seems to gain extra functionality, and it has just grown from strength to strength since its first appearance in 11.1.1.0.

It is a utility that I know has made consultants’ lives much easier; it is simple to use and is now packed with functionality.

The enhancements in this patch release are:
  • Import metadata and data from a relational data source
  • Optimize command lines by storing command line arguments in a command properties file. For example, if you use a command properties file to run the same commands on multiple applications, you need only change a parameter in the command line for each import. This allows you to bundle switches for a common application. It also makes command lines shorter and easier to manage, and enhances readability and ease of use.
  • Export data to a flat CSV file

If you are going to patch planning then make sure you go through the readme in detail, as it can be quite painful if you are on a distributed environment; it also requires patching other products first plus additional ADF patching.

Today I am going to go through using the properties file and importing metadata from a relational source, in the next blog I will cover the remaining enhancements.

In previous versions of Hyperion, before the days of the Shared Services registry, most of the configuration settings were held in properties files. A properties file (.properties) basically allows you to store key-value pairs, which are separated either with a colon (key:value) or an equals sign (key=value), and these pairs are then read by the calling application.

For example in the outline load utility world these could be /U:username or /S:servername

You are not restricted to only using pairs in the file as the other parameter switches can be used as well such as /N /O /C etc.

Before this release all of these parameters had to be included in the command line and, depending on the number of parameters, it could look messy; if you had many scripts, a lot of replication was involved.

Now there is a new parameter available, /CP:commandPropertiesFileName, which designates a properties file to use when the outline load utility executes:

OutlineLoad.cmd /CP:E:\PLAN_SCRIPTS\metaload.properties   

Using the above example, when the outline load utility starts it will look for metaload.properties in E:\PLAN_SCRIPTS.
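At this stage metaload.properties only holds the standard switches; reconstructed from the merged command line in the log further down, it looks roughly like this (the log and error file names are taken from that same log, so adjust to suit):

/A:PLANDEMO
/U:epmadmin
/S:FUSION11
/D:Account
/L:E:/PLAN_SCRIPTS/Logs/accld.log
/X:E:/PLAN_SCRIPTS/Logs/accld.err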


Instead of having to write all of this information on the command line, it is read from the easier to read and more manageable properties file.

It is also possible to override the values in the properties file by including them in the command line.

OutlineLoad.cmd /CP:E:\PLAN_SCRIPTS\metaload.properties /D:Entity

/D:Entity, which defines the Entity dimension as the one to load to, takes precedence over /D:Account in the properties file.

I did notice that it doesn’t look possible to include the -f:passwordFile parameter in the properties file, so it had to be included in the command line.

Right, on to the main event: loading metadata into a planning application from a relational source. I know this new functionality will be music to the ears of lots of consultants, because in many cases the source is relational and up to now a SQL extract and formatted file would have to be produced before using the utility.

There are quite a lot of new parameters available for the relational functionality, and I will cover the ones that you are likely to use when loading metadata.

Here is an extension to above metaload.properties file which includes the parameters to run a SQL query against a relational source to load metadata to a planning application.
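Reconstructing again from the merged command line and query shown in the log further down (the /RIP value starts off in plain text, shown here as a placeholder, and is encrypted on first run as explained shortly; the ACCOUNT_SQL key is covered below), the extended file looks roughly like this:

/A:PLANDEMO
/U:epmadmin
/S:FUSION11
/D:Account
/L:E:/PLAN_SCRIPTS/Logs/accld.log
/X:E:/PLAN_SCRIPTS/Logs/accld.err
/IR:E:/PLAN_SCRIPTS/metaload.properties
/RIQ:ACCOUNT_SQL
/RIC:FUSION_CONN
/RID:oracle.jdbc.OracleDriver
/RIR:jdbc:oracle:thin:@[fusion11]:1521:FUSION
/RIU:PLANSTAGE
/RIP:mypassword
ACCOUNT_SQL=SELECT ACCOUNT as "Account",PARENT as "Parent",alias_default as "Alias: Default",data_storage as "Data Storage" FROM PLAN_ACCOUNT ORDER BY idno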


/IR:RelationalConnectionPropertiesFileName 

Just like the /CP parameter for including a properties file, there is also one available just for the connection information to a relational database; it is possible to use the same properties file as the one for the other parameters, as I have done:

/IR:E:/PLAN_SCRIPTS/metaload.properties

This will read metaload.properties for the source relational database information.

/RIQ:inputQueryOrKey

This can either be the SQL query to be run, or it can be used to designate a key which holds the query.

/RIQ: ACCOUNT_SQL

So in my example the SQL query to be executed is held in key ACCOUNT_SQL

You may be asking why not just put the SQL directly in the /RIQ value. Well, you may have multiple SQL statements for different metadata loads, and by just updating /RIQ you call the required one; if you use /RIQ on the command line and have all the keys in the properties file then it is simple to call different queries and it looks much tidier.

KEY=SQL

The key relates to the key defined in /RIQ and SQL is the query that will be run, so in my example

The key is ACCOUNT_SQL and the SQL that will be executed is

SELECT ACCOUNT as "Account",PARENT as "Parent",alias_default as "Alias: Default",data_storage as "Data Storage" FROM PLAN_ACCOUNT ORDER BY idno

For my example this generates the following records to be loaded as metadata into the account dimension:


The query must return column header names to exactly match the properties required for planning.

/RIC: catalog

For Oracle I don't believe it matters what you specify as the value; for SQL Server it should be the database name.

/RID: driver

The parameter is the JDBC driver that will be used for the connection to the relational database.

For Oracle use:  /RID:oracle.jdbc.OracleDriver
For SQL Server:  /RID:weblogic.jdbc.sqlserver.SQLServerDriver

/RIR:url

The parameter is the JDBC URL to be used for the connection to the relational database.

For Oracle the format is: jdbc:oracle:thin:@[DB_SERVER_NAME]:DB_PORT:DB_SID

So in my example that equates to /RIR:jdbc:oracle:thin:@[fusion11]:1521:FUSION

For SQL Server the format is:
jdbc:weblogic:sqlserver://[DB_SERVER_NAME]:DB_PORT

An example being
/RIR: jdbc:weblogic:sqlserver://[fusion11]:1433

/RIU:username

Nice and simple this is the user name to connect to the relational database.

/RIP:password

The password for the database connection; this is unencrypted the first time it is used in a properties file.


Once the outline load utility has been run it will update the properties file and encrypt the password.

If you don’t want to use a properties file then all these parameters can be entered directly into the command line.

So let’s give it a go


The Account dimension currently contains no members.


The outline load utility is executed passing in the parameters to the encrypted password file and the properties file to use.


The utility reads the properties file and checks whether the database connection password is encrypted; as it is not, it updates the file with an encrypted value.

The utility reads through the rest of the properties and then merges them with the ones currently in the command line before submitting them.

The output in the log provides further detailed information.

Property file arguments: /RIU:PLANSTAGE /D:Account /RIR:jdbc:oracle:thin:@[fusion11]:1521:FUSION /RIP:TOzyauMwe2gtUQ9tjidf1Zq2pCA8iroN4i7HQxssFKUaogr16fi+WKmHFTD8NIIs /RIQ:ACCOUNT_SQL /X:E:/PLAN_SCRIPTS/Logs/accld.err /A:PLANDEMO /S:FUSION11 /L:E:/PLAN_SCRIPTS/Logs/accld.log /U:epmadmin /RIC:FUSION_CONN /RID:oracle.jdbc.OracleDriver /IR:E:/PLAN_SCRIPTS/metaload.properties

Command line arguments: /CP:E:\PLAN_SCRIPTS\metaload.properties

Submitted (merged) command line: /RIU:PLANSTAGE /D:Account /RIR:jdbc:oracle:thin:@[fusion11]:1521:FUSION /RIP:TOzyauMwe2gtUQ9tjidf1Zq2pCA8iroN4i7HQxssFKUaogr16fi+WKmHFTD8NIIs /RIQ:ACCOUNT_SQL /X:E:/PLAN_SCRIPTS/Logs/accld.err /A:PLANDEMO /S:FUSION11 /L:E:/PLAN_SCRIPTS/Logs/accld.log /U:epmadmin /RIC:FUSION_CONN /RID:oracle.jdbc.OracleDriver /IR:E:/PLAN_SCRIPTS/metaload.properties

Successfully logged into "PLANDEMO" application, Release 11.122, Adapter Interface Version 5, Workforce supported and not enabled, CapEx not supported and not enabled, CSS Version 3

A query was located in the Command Properties File "E:\PLAN_SCRIPTS\metaload.properties" that corresponded to the key passed on the Input Query switch (/RIQ) "ACCOUNT_SQL" so it's corresponding value will be executed as a query: "SELECT ACCOUNT,PARENT as "Parent",alias_default as "Alias: Default",data_storage as "Data Storage" FROM PLAN_ACCOUNT ORDER BY idno"

Attempting to make input rdb connection with the following properties: catalog: FUSION_CONN, driver: oracle.jdbc.OracleDriver, url: jdbc:oracle:thin:@[fusion11]:1521:FUSION, userName: PLANSTAGE

Source RDB "FUSION_CONN" on jdbc:oracle:thin:@[fusion11]:1521:FUSION connected to successfully.

Connection to input RDB made successfully.

[Mon Sep 17 22:24:26 BST 2012]Header record fields: ACCOUNT, Parent, Alias: Default, Data Storage

[Mon Sep 17 22:24:26 BST 2012]Located and using "Account" dimension for loading data in "PLANDEMO" application.

[Mon Sep 17 22:24:26 BST 2012]Load dimension "Account" has been unlocked successfully.

[Mon Sep 17 22:24:26 BST 2012]A cube refresh operation will not be performed.

[Mon Sep 17 22:24:26 BST 2012]Create security filters operation will not be performed.

[Mon Sep 17 22:24:26 BST 2012]Examine the Essbase log files for status if Essbase data was loaded.

[Mon Sep 17 22:24:26 BST 2012]Planning Outline data store load process finished. 317 data records were read, 317 data records were processed, 317 were accepted for loading (verify actual load with Essbase log files), 0 were rejected.

All the 317 data records were read, processed and loaded successfully to the planning application.


Back in the planning application you can see all the metadata has been loaded.

There is one more parameter that I have not mentioned, and that is /IRA; if this is used then the database connection information of the connected planning application is used.

This allows queries to be run against tables in the connected planning application database and does not require the /RIQ, /RIC, /RID, /RIR, /RIU, /RIP parameters.

Using the new functionality will also allow you to query the planning tables directly to return metadata; if this is something you are looking to do then it is definitely worth having a look at Cameron Lackpour’s blog, as he has kindly spent quite a lot of time covering this area in a number of his posts.

In the next blog I will cover the rest of the new functionality.