Thursday 30 March 2017

FDMEE – diving into the Essbase and Planning security mystery – Part 2

In the last part I took an in-depth look at the current situation with on-premise FDMEE and user-based security when the target is either Essbase or Planning. I pretty much concluded that a user’s security is not honoured and a global admin user must be defined, which basically means there are no security restrictions on the data being loaded.

In this part I am going to go through the same process for EPM Cloud to see how it differs; the cloud products I will be concentrating on are PBCS (including the enterprise edition) and FCCS.

Let us dive right in and look at loading data to a target Planning application. At present there are two load methods available, and we will start with “Numeric Data Only”.


This method is the equivalent of the on-premise load method “Numeric Data only – File”, but in the cloud it can only be defined at application level, whereas on-premise it can be set at load rule level, which makes more sense. I am not sure why it has not been implemented like that in the cloud, but I am sure it will change, probably without any warning.

As always I am keeping things nice and simple and will be loading two rows of data.


I will start with a user that has been provisioned with the role of “Service Administrator” which is the same as a planning application administrator.


As expected the process ran through smoothly and the process log provides details on how the data is being loaded.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: epm_default_cloud_admin
DEBUG [AIF]: Resolved essbase rule file name for loading: AIF0029
DEBUG [AIF]: Fetching rule file from essbase server for data loading: AIF0029
DEBUG [AIF]: Locked rule file: AIF0029
INFO  [AIF]: Loading data into cube using data file...
INFO  [AIF]: The data has been loaded by the rule file.
DEBUG [AIF]: Unlocked rule file: AIF0029

The process is the same as with on-premise where a data file and load rule are created then the data is loaded using the load rule.

The difference between cloud and on-premise is that the user loading the data is the default cloud admin and not the user running the export; ‘epm_default_cloud_admin’ is a system-generated account that carries out admin-type duties, and there is no way to change its behaviour.

To be sure I provisioned a user with the power user role and ran the export again.


The process log confirms that the user running the export is being overridden with the default cloud admin account.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: epm_default_cloud_admin
DEBUG [AIF]: Resolved essbase rule file name for loading: AIF0029
INFO  [AIF]: The data has been loaded by the rule file.

In the cloud there is no option to set a global user, so when loading data using the “Numeric Data Only” method the user is ignored and the admin account takes control of the process. This also means it is not possible to honour the user’s data-level security with this method.

So, in some respects cloud and on-premise are similar when using this method: it requires an admin-type role to function correctly, and there is no way to restrict security at data level.

Let us switch back to a service administrator user and set the load method to “All data types with security”.


You would usually set this method if you are planning on loading non-numeric data, which I previously wrote a post about. When this method was first introduced it was called “HPL”, and the name was subsequently changed.

The method’s name includes a reference to security, so maybe that is an indication it operates differently.

The export was successfully run again.



The process log contains important details about what is happening behind the scenes.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: John.Goodwin@mtl.com
DEBUG [AIF]: Overrode info.loadMethod for the admin user: OLU
Outline data store load process finished. 2 data records were read, 3 data records were processed, 3 were accepted for loading (verify actual load with Essbase log files), 0 were rejected.

This time the default cloud admin has not overridden the user running the export, and the data is loaded using the Outline Load Utility (OLU). As this user has the service administrator role, security restrictions on the data do not apply and the OLU can be used; in the last part I showed what happens if a non-admin user tries to load data using the OLU.

Now let’s look at what happens with a user that has the power user role assigned, which I believe is the minimum role requirement for Data Management access. Please note I have not yet assigned any access permissions for the user in the Planning application, as I just want to show what happens.

The export is run and it fails.


This time there are noticeable differences in the process log in the way the data is being loaded.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: l.howlett@mtl.com
DEBUG [AIF]: Overrode info.loadMethod for the non-admin user: REST
DEBUG [AIF]: requestUrl: http://localhost:9000/HyperionPlanning/rest/v3/applications/Vision/plantypes/Plan1/importdataslice
ERROR [AIF]: The rest service request has failed: 400 Bad Request - {"status":400,"message":"java.lang.NullPointerException","localizedMessage":"java.lang.NullPointerException"}

There is no overriding of the user performing the export, and as the user is not an admin the data is loaded using the REST API.

If you are not aware, there are REST resources available for importing, exporting and deleting data by building up a data grid, which I will go into in more detail shortly.

The REST call failed, and even though the error message is not clear, the reason is that no access permissions have been defined.

Now I am going to add access permissions but not for all dimensions and run the export again.



The export failed again, which was to be expected, and this time the error message produced should be familiar if you have worked with Planning.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: l.howlett@mtl.com 
DEBUG [AIF]: Overrode info.loadMethod for the non-admin user: REST
DEBUG [AIF]: requestUrl: http://localhost:9000/HyperionPlanning/rest/v3/applications/Vision/plantypes/Plan1/importdataslice
ERROR [AIF]: The rest service request has failed: 400 Bad Request - {"status":400,"detail":"You are trying to open the form, but cannot because all of the required dimensions are not present. Possible causes may be that you do not have access to at least one member of a required dimension, or the member selection resulted in no members present. Contact your administrator."}

The error message is the same one you receive when a user opens a form and does not have all the required access permissions to the members in the form, so it looks like the REST resource sits on top of existing functionality built into forms.

I updated the access permissions so the user has write access to all the members contained in the data load.


This time the export was successful and the process log confirms the number of rows loaded.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: l.howlett@mtl.com
DEBUG [AIF]: Overrode info.loadMethod for the non-admin user: REST
DEBUG [AIF]: requestUrl: http://localhost:9000/HyperionPlanning/rest/v3/applications/Vision/plantypes/Plan1/importdataslice
INFO  [AIF]: Number of rows loaded: 2, Number of rows rejected: 0

Opening a form with the same user shows that the data was correctly loaded.


So what is happening behind the scenes to load the data using the REST API? When an export is initiated, a file is created containing JSON that describes a grid based on the data being loaded.


The JSON in the file is then posted to the “importdataslice” REST resource with the details of the user running the export, so that their security is honoured.

The URL format for the REST resource is:
https://<cloudinstance>/HyperionPlanning/rest/v3/applications/<app>/plantypes/<cube>/importdataslice

An example of the JSON that is being generated for the two rows of data is:
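As a rough sketch in Python, building a payload with the data-grid shape the resource expects (pov, columns and rows) might look like the following. The member names are illustrative only, not the exact grid Data Management generated.

```python
# Sketch of an "importdataslice" request body. The dataGrid structure
# (pov/columns/rows) follows the shape of the REST resource; the member
# names below are illustrative only.

def build_import_payload(pov, column_members, rows):
    """Assemble an importdataslice request body from POV members,
    column member lists and (row headers, row data) pairs."""
    return {
        "aggregateEssbaseData": False,
        "dataGrid": {
            "pov": pov,
            "columns": column_members,
            "rows": [{"headers": headers, "data": data}
                     for headers, data in rows],
        },
    }

payload = build_import_payload(
    pov=["Actual", "No Version", "BaseData", "P_000", "FY17"],
    column_members=[["Mar"]],              # period across the columns
    rows=[(["1110", "110"], ["100"]),      # account/entity down the rows
          (["1120", "110"], ["200"])],
)
```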


The response that is returned is in JSON format and contains information about the number of cells that were accepted and rejected.


The JSON grid that is being generated in my example would look like the following in terms of a planning form.


To demonstrate what happens with rejected cells I updated the JSON to include an account member “4110” which the user does not have access to.


The response shows that one cell was rejected and contains the row of the grid that was rejected; it does not, however, identify the actual member that caused the rejection.
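Summarising such a response can be sketched as below; the field names (numAcceptedCells, numRejectedCells, rejectedCells) are assumptions based on the counts and rejected-cell list that surface in the process log.

```python
def summarise_import_response(resp):
    """Pull the load/reject counts and rejected rows out of an
    importdataslice JSON response. The field names here are assumptions
    based on what surfaces in the Data Management process log."""
    return {
        "loaded": resp.get("numAcceptedCells", 0),
        "rejected": resp.get("numRejectedCells", 0),
        "rejected_cells": resp.get("rejectedCells", []),
    }

summary = summarise_import_response({
    "numAcceptedCells": 2,
    "numRejectedCells": 1,
    "rejectedCells": ["[Actual, 4110, 110, No Version, BaseData, P_000, FY17, Mar]"],
})
```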


I will take another example in Data Management and load four rows of data; the user does not have access to entity member “111”, so in theory two rows of data should be rejected.


I was expecting the export process to be successful but contain warnings; however, it looks like if any invalid data is encountered the process status is shown as a failure.



Investigating the process log shows that two rows were loaded and two rejected, which is what I would expect. The list of rejected rows is written to the log and is generated from the REST response, as in my previous example.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: l.howlett@mtl.com
DEBUG [AIF]: Overrode info.loadMethod for the non-admin user: REST
DEBUG [AIF]: requestUrl: http://localhost:9000/HyperionPlanning/rest/v3/applications/Vision/plantypes/Plan1/importdataslice
INFO  [AIF]: Number of rows loaded: 2, Number of rows rejected: 2
INFO  [AIF]: List of rejected cells: ["[Actual, 1520, 111, No Version, BaseData, P_000, FY17, Mar]","[Actual, 2210, 111, No Version, BaseData, P_000, FY17, Mar]"]

I was interested to know whether valid intersections would restrict the data being loaded; as the REST functionality looks to be built on top of form data grids, the same logic should apply.

I updated the user’s access permissions so they could write to entity members “110” and “111”, and then restricted permissions to entity “111” using a valid combination.


The export failed.



The rows containing entity member “111” were rejected so valid intersections are honoured when loading data using the REST method.

INFO  [AIF]: Number of rows loaded: 2, Number of rows rejected: 2
INFO  [AIF]: List of rejected cells: ["[Actual, 2210, 111, No Version, BaseData, P_000, FY17, Mar]","[Actual, 1520, 111, No Version, BaseData, P_000, FY17, Mar]"]

My previous examples have all used PBCS, which also applies to E-PBCS, so how about FCCS? With FCCS the load type can be set at rule level.


There are only two options available which are Data and Journal.


I am not going to bother covering the journal option, as setting it will generate a journal in FCCS, which does not relate to what this post is about.

I loaded the following data set with a user that has the Service Administrator role applied.


The process log confirms that the “Data” load type in FCCS acts in the same way as “All data types with security” in PBCS: if the user has the Service Administrator role, the data is loaded using the Outline Load Utility.

INFO  [AIF]: cloudServiceType: FCCS, Resolved user name for application access: John.Goodwin@mtl.com
DEBUG [AIF]: Overrode info.loadMethod for the admin user: OLU
INFO  [AIF]: Number of rows loaded: 1, Number of rows rejected: 0

If I switch to a power user with the correct access permissions for the data that is being loaded, then the export is successful.


INFO  [AIF]: cloudServiceType: FCCS, Resolved user name for application access: l.howlett@mtl.com
DEBUG [AIF]: Overrode info.loadMethod for the non-admin user: REST
DEBUG [AIF]: requestUrl: http://localhost:9000/HyperionPlanning/rest/v3/applications/FCCS/plantypes/Consol/importdataslice
INFO  [AIF]: Number of rows loaded: 1, Number of rows rejected: 0

The data is loaded using the REST API; this is basically the same as PBCS, which means the user’s access permissions will be honoured.

The only concern I have about the REST method is the performance implication of loading a large amount of data, as it will be the equivalent of creating a huge form; I have not yet had the chance to test whether it does impact performance.
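One way to mitigate that concern (my own suggestion, not something Data Management does out of the box) would be to split a large load into several smaller grids and post each slice separately:

```python
def chunk_rows(rows, batch_size):
    """Yield successive batches of rows so each importdataslice grid
    stays a manageable size instead of one huge form-like grid."""
    for start in range(0, len(rows), batch_size):
        yield rows[start:start + batch_size]

# 10 rows posted as three grids of at most 4 rows each
batches = list(chunk_rows(list(range(10)), 4))
```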

That covers Planning as the target, so how about when an ASO Essbase cube is the target?

The load methods are the same for Essbase as for Planning and currently can only be set in the target application options. I am going to start with “Numeric Data Only”.


The following data set is loaded by a Service Administrator.


When using the numeric data method the concept is the same as with Planning: the default cloud admin overrides the user, a data file and load rule are created, and the data is loaded with the rule.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: epm_default_cloud_admin
DEBUG [AIF]: Resolved essbase rule file name for loading: AIF0031
DEBUG [AIF]: Fetching rule file from essbase server for data loading: AIF0031
DEBUG [AIF]: Locked rule file: AIF0031
INFO  [AIF]: Getting load buffer for ASO data load...
INFO  [AIF]: Initializing load buffer [1]
INFO  [AIF]: Successfully initialized the load buffer
INFO  [AIF]: Loading data into cube using data file...
INFO  [AIF]: The load buffer [1] has been closed.
INFO  [AIF]: The data has been loaded by the rule file.
DEBUG [AIF]: Unlocked rule file: AIF0031

Now to switch over to a power user and repeat the process.


Once again the process is the same as when Planning is the target: the default admin overrides the user and loads the data.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: epm_default_cloud_admin
DEBUG [AIF]: Resolved essbase rule file name for loading: AIF0031
DEBUG [AIF]: Fetching rule file from essbase server for data loading: AIF0031
DEBUG [AIF]: Locked rule file: AIF0031
INFO  [AIF]: Getting load buffer for ASO data load...
INFO  [AIF]: Initializing load buffer [1]
INFO  [AIF]: Successfully initialized the load buffer
INFO  [AIF]: Loading data into cube using data file...
INFO  [AIF]: The load buffer [1] has been closed.
INFO  [AIF]: The data has been loaded by the rule file.
DEBUG [AIF]: Unlocked rule file: AIF0031

How about setting the load method to “All data types with security”?


I will try with the Service Administrator user as there shouldn’t be any problems.


Spoke too soon; the process failed, so time to look at the logs.

INFO  [AIF]: cloudServiceType: Planning, Resolved user name for application access: epm_default_cloud_admin
ERROR [AIF]: Essbase ASO application is not created from any Planning application, please use other data load methods.

Hold on, that error does not make sense; this is PBCS, so the cube must have been created from the Planning application, there is no other way.

I tried with the power user and received the same error message, so I decided to test whether I could load data to the ASO cube using the REST API directly; it should be possible, because I can create a form and enter data against the ASO cube.


The response confirms that it is possible to use the REST resource to load data to an ASO cube.
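For reference, a direct call like the one I tested can be sketched with Python’s standard library; the instance, application, cube name and credentials below are all placeholders. The snippet only builds the request, and the commented line is what would actually submit it.

```python
import json
import urllib.request

def build_import_request(instance, app, cube, payload, auth_header):
    """Build (but do not send) a POST to the importdataslice resource;
    the names passed in here are placeholders, not real endpoints."""
    url = (f"https://{instance}/HyperionPlanning/rest/v3/"
           f"applications/{app}/plantypes/{cube}/importdataslice")
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json",
                 "Authorization": auth_header},
    )

req = build_import_request("example.oraclecloud.com", "Vision", "ASOCube",
                           {"dataGrid": {"pov": [], "columns": [], "rows": []}},
                           "Basic dXNlcjpwYXNz")
# urllib.request.urlopen(req) would submit the load
```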


At the moment I am not sure why it is possible to select the “All data types with security” method if it doesn’t work; maybe I am missing something, or it is a bug, or a feature that will be implemented at a later stage. If I find out any further information I will update this post.

That is all I am going to cover on the different security type behaviour between admin and non-admin users for on-premise, hybrid and cloud. It certainly can be confusing with all the possible scenarios but hopefully I have cleared it up over the two posts.
