Sunday, 30 June 2013

FDMEE – ODI Console and Studio

If you are deploying FDMEE and have a requirement to install ODI Studio then you may be slightly confused if you read through the Installation and Configuration documentation as it currently states the following:

“If you are installing and configuring FDMEE, Oracle Data Integrator is automatically installed and configured for you. The database for Oracle Data Integrator is in same database as FDMEE and the Oracle Data Integrator agent application is deployed in same JVM as FDMEE. Optionally, you can install ODI Studio using”

I will get on to the Studio shortly, because as well as the J2EE agent being automatically deployed and configured, the ODI Console is also deployed to the same WebLogic managed server, which is still named ErpIntegrator:

The managed server is deployed on a default port of 6550 which means the console can be accessed on either

If you are not already aware, the Console is basically a cut-down web version of the Studio, and if you want to understand more about it then it is worth reading the section “Working with Oracle Data Integrator Console” in the ODI documentation.

The security for the Console/Studio is independent of EPM security though it is possible to use external authentication and you can read through the fun and games I had setting it up in ODI here, here and here.

I have not yet configured ODI (the version deployed with EPM) to use external authentication, and it is my intention to revisit the process to see if it has changed and, if so, bring it up to date in future blogs.

The console does not have to be accessed directly and once the web server has been configured it should be possible to access it through that route.

The ODI Console and agent will have been added to the web server proxy configuration, so the Console can be accessed through http://<webserver>:<port>/odiconsole

By default ODI is configured with only one user, SUPERVISOR, and the password still reflects the company that Oracle acquired ODI from in 2006: SUNOPSIS. In my opinion using a default password poses a security risk, so I recommend updating the password straight away.

In reality you need the Studio up and running to be able to change the password, so going back to the statement in the documentation: “you can install ODI Studio using

So where is this file? Well, unless I am mistaken it doesn’t seem to exist. At first I thought all the ODI installation files existed within the EPM install files, but that does not seem to be the case.

The reason I thought they existed is that if you go to the location where all the EPM installation files have been extracted, you should see a directory named odi

Within the odi directory there are installers for both Windows and Unix (check the ODI support matrix for more information on supported operating systems for Studio) type systems.

I thought great, it should be possible to install the Studio, but once the installation commences you will be hit with a number of file not found errors, and if you choose to continue through them the installation will complete successfully but there will be no Studio installed.

It looks like not all the necessary files for the Studio have been included, which doesn’t make much sense to me, so the best option is to download the ODI Studio installation files from the Oracle Software Delivery Cloud under Enterprise Performance Management System, which may be what the documentation is referring to.

Alternatively you can download the full ODI installation under Fusion Middleware 11g, which is the route I took as I already had the files downloaded; the install process is exactly the same for either download.

Once all the files have been downloaded and extracted, start the installer:

Windows:  setup.exe -jreLoc JRE_LOCATION

Linux: ./runInstaller -jreLoc JRE_LOCATION

At the Installation Type stage "ODI Studio" should already be selected.

The Oracle Home directory will be pointing to a location outside of the EPM installation, which is perfectly fine to install to, but as ODI components have already been deployed I think it makes more sense to point to the existing home directory.

The warning says upgrade, but as the versions should currently be the same I don’t believe it poses an issue, though this might need further consideration in the future if FDMEE patches have been applied.

The next screens are for the configuration of the ODI master repository, which is part of the FDMEE database, so that is where the database connection details should point.

If the connection information has been entered correctly along with the supervisor password then the FDMEE work repository should be populated.

I said that it should not be a problem installing to the existing ODI home within the EPM installation and the installation completion message verifies this:

The Studio should now be available.

The repository connection information should all be automatically populated from running the Studio installation.

The SUPERVISOR password can be changed and once this has been done the ODI setup in FDMEE will need to be updated with the new password.

The ODI Java agent and Console also have the supervisor password stored in the Application Server credential store which is used to connect to the ODI master repository.

If you change the supervisor password and start up the FDMEE application server without updating the credential store you will see the following in the logs:

The password can be updated by either using the WebLogic scripting tool or through Enterprise Manager.

WLST can be found at <MIDDLEWARE_HOME>\odi\common\bin

Start the WebLogic admin server and once running start wlst and run the following commands:

updateCred(map="oracle.odi.credmap", key="SUPERVISOR", user="SUPERVISOR", password="<new_password>", desc="Key for Supervisor")
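For reference, updateCred is run from inside a WLST session connected to the running admin server, so a rough sketch of a full session would look like the following (the admin URL, user and passwords here are placeholders, not values from this environment):

```
connect('weblogic', '<admin_password>', 't3://<adminserver>:7001')
updateCred(map="oracle.odi.credmap", key="SUPERVISOR", user="SUPERVISOR", password="<new_password>", desc="Key for Supervisor")
disconnect()
```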

Alternatively log into Enterprise Manager Fusion Middleware Control :

Go to EPMSystem > Security > Credentials

Edit the SUPERVISOR key and update the password then you should be good to go.

Now, deploying FDMEE certainly does make life much simpler than all the steps that were required to get ODI up and running with ERPi, but there is still room for improvement.

Monday, 24 June 2013

Planning Outline Load Utility enhancements

I was looking at the new features for the Planning Outline Load utility and the following caught my attention: “Administrators can now use the Outline Load utility to export metadata to a relational data source.”

Going back to the previous release, there were quite a few enhancements to the outline utility:
  • Import metadata and data from a relational data source.

  • Optimize command lines by storing command line arguments in a command properties file. For example, if you use a command properties file to run the same commands on multiple applications, you need only change a parameter in the command line for each import. This allows you to bundle switches for a common application. It also makes command lines shorter and easier to manage, and enhances readability and ease of use.

  • Export data to a flat CSV file
I blogged about the features in three parts which can be found here, here and here.

I double checked and the ability to export metadata to a relational data source was not available in the previous release, so I thought it would be worth going through this new piece of functionality. In my previous blogs I covered some areas, such as using a command properties file, which I am not going to cover again, so if you have not done so already it is probably worth having a read through the three previous blogs.

The Outline Load utility has certainly become a powerful tool over the years since its first appearance in the early days of version 11; the only feature missing with relational data sources is the ability to export data (only to a file at present), though I am sure that functionality won’t be too far away.

If you look at the new features for classic Planning applications and add in the functionality available with the Outline Load utility or ODI, then I would certainly question the need to go down the EPMA route; if you are already in EPMA mode then keep enjoying the pain, or maybe it is time to turn back from the dark side :)

Anyway, before jumping in and exporting metadata, there are a number of new parameters available which need to be understood; many are similar to the ones used when importing, except the parameter contains an E (export) instead of an I (import).

 /ER: RelationalConnectionPropertiesFileName

This parameter defines the properties file which will hold all the database connection information for the export; it is the export equivalent of the /IR parameter, which I previously covered and which is used for importing.


If you don’t specify the /ER parameter then you can use the /ERA parameter, which basically uses the same database connection information as the planning application that the export is running against; personally I would want to keep import/export type data separate from the planning application database.


I don’t believe this parameter is important if the target database is Oracle, but for SQL Server it should be the database name.


This parameter defines the JDBC driver that will be used for the connection to the relational database.

For Oracle use:  /RED:oracle.jdbc.OracleDriver
For SQL Server: 


This parameter defines the JDBC URL to be used for the connection to the relational database.

For Oracle the format is: jdbc:oracle:thin:@[DB_SERVER_NAME]:DB_PORT:DB_SID

So for example that could be: /RER:jdbc:oracle:thin:@[fusion11]:1521:FUSION

For SQL Server the format is:

An example being
/RER:jdbc:weblogic:sqlserver://[fusion11]:1433


This is the user name to connect to the relational database.


The password for the database connection. This is unencrypted the first time it is used in a properties file, and once an export has completed the file will have been updated with an encrypted version.


This can either be the SQL export query to be run or it can be used to designate a key which will hold the SQL query, I will cover this in more detail shortly.
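Pulling these together, a connection properties file referenced by /ER might look something like the sketch below. This is purely illustrative: the server, SID, table and query are invented examples based on the formats described above, and only the parameters named in this post are shown (the user-name parameter would also go in here):

```
/RED:oracle.jdbc.OracleDriver
/RER:jdbc:oracle:thin:@[dbserver]:1521:ORCL
/REP:mypassword
/REQ:INSERT INTO PLAN_ACC VALUES (Member, Parent)
```

Remember the /REP value sits in clear text until the first run, after which the utility encrypts it.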

So those are the main new parameters to use when exporting metadata and if you combine them with the existing parameters you should be ready to start putting it all together.

I am now going to go through a few examples of extracting account metadata from the sample planning application.

First I start out with a batch script which will call the outline load utility and set the properties and password file to use.

OutlineLoad -f:E:\PLAN_SCRIPTS\password.txt /CP:E:\PLAN_SCRIPTS\

The -f parameter defines the password file to use and the /CP parameter defines the property file to use, which will hold all the remaining parameters.

The properties file contains the following information:

The /ER parameter defines the database connection properties file and instead of using two property files I have pointed them all to the same file.

I am extracting to an Oracle database and the PLANSTAGE schema.

As explained earlier the password used in the /REP parameter is not encrypted at first.

Once the utility has been run the file should be updated and the password encrypted.

Now on to the most important parameter, /REQ, which defines the SQL query or key.

The export query follows a strict format so don’t be thinking you can write your own elaborate SQL.

INSERT INTO <tableName> (column1, column2, ...) VALUES (property1, property2,...)

<tableName> is the name of the table into which the exported metadata will be inserted.

(column1, column2, …) is an optional list of column names the metadata will be inserted into.

(property1, property2, …) are the member property names to be extracted.
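To make the format concrete, here is a small Python sketch of my own (not part of the utility) that assembles a /REQ-style statement from a table name and a list of properties; the table and column names are invented examples:

```python
def build_req_query(table, properties, columns=None):
    """Assemble an Outline Load utility style export query.

    columns is the optional list of target column names; properties are the
    Planning member properties to extract.
    """
    cols = "" if columns is None else " (" + ", ".join(columns) + ")"
    props = ", ".join(properties)
    return "INSERT INTO %s%s VALUES (%s)" % (table, cols, props)

# Member and Parent inserted into every column of a hypothetical PLAN_ACC table:
print(build_req_query("PLAN_ACC", ["Member", "Parent"]))
# -> INSERT INTO PLAN_ACC VALUES (Member, Parent)

# Explicit target columns, for when the table has additional columns:
print(build_req_query("PLAN_ACC", ["Member", "Parent"], ["MEMBER", "PARENT"]))
# -> INSERT INTO PLAN_ACC (MEMBER, PARENT) VALUES (Member, Parent)
```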

So let’s take a simple example and extract the member and parent information.

In this example I have used a key for the /REQ parameter but the following is perfectly valid as well:

Instead of specifying the dimension name like you would normally when using the utility you use “Member”.

The query will extract the Member and Parent values and insert them into a table called PLAN_ACC, the number of columns in the table will need to match the number of properties being extracted.

The names of the columns in this case do not matter, just as long as there are the same number of columns; the order in which they are populated is defined by the query, so in my example “Member” then “Parent”.

Before running, make sure the table and columns have been created, as the utility will not create them and the export will fail.

If you specify an incorrect table name or the table does not exist (this is based on Oracle):

Incorrect number of columns in the table:

If the table has been created correctly then the export should be successful and the table populated.
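The column-count requirement is easy to demonstrate outside of the utility. This throwaway Python sketch uses SQLite purely for illustration (the real target would be Oracle or SQL Server, and the table and member names are invented); the failing insert at the end is the same class of database error the utility surfaces:

```python
import sqlite3

# A two-column table matching the two properties being exported.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE PLAN_ACC (MEMBER TEXT, PARENT TEXT)")

# What the export effectively executes for each member row.
rows = [("Net Income", "Account"), ("Gross Profit", "Net Income")]
conn.executemany("INSERT INTO PLAN_ACC VALUES (?, ?)", rows)

# Two properties, two columns: the inserts succeed.
print(conn.execute("SELECT COUNT(*) FROM PLAN_ACC").fetchone()[0])  # -> 2

# With a mismatched column count the database rejects the statement.
try:
    conn.execute("INSERT INTO PLAN_ACC VALUES (?, ?, ?)", ("a", "b", "c"))
except sqlite3.OperationalError as e:
    print("insert failed:", e)
```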

It is worth pointing out that the table will not be cleared out before the export runs, and unfortunately it doesn’t look like there is an option yet to do so. Hopefully one will be added in the future, as all it would require is another parameter with a true/false option, but until then you will need to clear out the table with your own preferred method.

If the table has additional columns then you can define which columns to insert to in the query.

If you want to extract all the available properties then you can use the <properties> option in the query instead of specifying each property.

Once again, make sure the table has the same number of columns as properties that will be exported; if you are not sure how many columns are required then just run the script and check the output log, as it will display all the properties that are being exported.

When the correct number of columns has been defined in the table the export should be successful.

Even the formulas are exported quite cleanly, which I know could be a bit of a pain when exporting to CSV.

There is another nice piece of functionality which allows you to export member properties based on the columns in the target table.

So say I had the following columns in a table: “Member”,”Parent”,”Alias: Default”

The column names must match the properties in planning.

In the query the <columns> option can be used.

When the export runs the utility will check the columns in the target table and then match that to the properties to export.
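The idea behind <columns> can be sketched in a few lines of Python (again using SQLite only as a stand-in, with invented table and column names): read the target table’s column names and treat them as the property list to export.

```python
import sqlite3

# Target table whose column names match Planning property names,
# including one with a space and colon ("Alias: Default").
conn = sqlite3.connect(":memory:")
conn.execute(
    'CREATE TABLE PLAN_ACC ("Member" TEXT, "Parent" TEXT, "Alias: Default" TEXT)'
)

# Read the column names from the table, as the utility does for <columns>...
columns = [row[1] for row in conn.execute("PRAGMA table_info(PLAN_ACC)")]
print(columns)  # -> ['Member', 'Parent', 'Alias: Default']

# ...and use them as the member properties to extract.
query = "INSERT INTO PLAN_ACC VALUES (%s)" % ", ".join(columns)
print(query)
```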

So there we have it, yet another great addition to the Outline Load utility.