Tuesday, May 31, 2011

Is there EAL for Hyperion Planning? Yes!

Wouldn't it be great if you could push Oracle Hyperion Planning data to an Essbase ASO reporting cube in real time? Of course it would! Now you can, and this article describes the simple process for making it happen. Your system will work smarter, not harder. The requirements are as follows:
  • Star Command Center
  • Off-the-shelf Oracle Essbase Report Scripts
  • Off-the-shelf Oracle Essbase Rules Files
  • Off-the-shelf Oracle Essbase MaxL script
  • Minor modifications to the Hyperion Planning ValidateData.js file
Watch this 2-minute video that demonstrates the functionality within Star Command Center (be sure to view it in full-screen mode): YouTube Link
Regardless of Hyperion Planning complexity and product version (v4, v9, or v11), this process is fully supported and quick and easy to set up (less than one hour).
Step #1 – Edit your data configuration file. Depending on your Hyperion Planning web server (e.g., WebLogic, WebSphere, Tomcat), you will need to edit the “ValidateData.js” file so that either (1) a new custom button can be placed on the Planning web form or (2) a new action can be defined, such as ‘On Save’. For more details, see the blog post:
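As a rough illustration of the kind of logic that might be added to ValidateData.js, the sketch below builds the URL used to notify the Star Command Center event listener with the form's page slicers. The function name, the listener address, and the parameter names are all assumptions for illustration, not the actual Planning or Star Command Center API:

```javascript
// Hypothetical helper for ValidateData.js: builds the URL used to notify
// the Star Command Center event listener when the user saves the form.
// The listener host/path and the slicer parameter names are assumptions.
function buildListenerUrl(baseUrl, slicers) {
  // slicers: e.g. { scenario: "Forecast", version: "Working", year: "FY12" }
  var pairs = [];
  for (var key in slicers) {
    if (slicers.hasOwnProperty(key)) {
      pairs.push(encodeURIComponent(key) + "=" + encodeURIComponent(slicers[key]));
    }
  }
  return baseUrl + "?" + pairs.join("&");
}

// A custom button or 'On Save' hook might then fire the request, e.g.:
// new Image().src = buildListenerUrl("http://starserver:8080/event", slicers);
```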

Step #2 – Create an Essbase Report Script that represents the data set that you want to extract from the source Planning Essbase cube and load into the target reporting ASO cube. The report script should:
  • Include all combinations of data within the scope of the Hyperion Planning web form or include an entire Essbase “block”.
  • Map the Hyperion Planning page slicers as parameters in the Essbase Report Script.
  • Have 'Row' definitions that define the dimensionality of the Essbase block.
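To make the three requirements above concrete, here is a minimal sketch of what such a report script might look like. The member names are illustrative, and the $(...) tokens are placeholders assumed to be substituted by Star Command Center (from the Planning page slicers) before the script runs — they are not native report script syntax:

```
{ TABDELIMIT SUPHEADING SUPCOMMAS SUPMISSINGROWS }
<PAGE ("Scenario", "Version", "Year")
"$(SCENARIO)" "$(VERSION)" "$(YEAR)"
<ROW ("Account", "Entity")
<IDESCENDANTS "Account"
<IDESCENDANTS "Entity"
<COLUMN ("Period")
<IDESCENDANTS "Period"
!
```

The <ROW definition carries the dimensionality of the Essbase block, while the page members narrow the extract to the slice the user just saved.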

Step #3 – Create an Essbase Rules File to load the exported data into the target reporting ASO cube.
  • Tip: Often the target reporting ASO cube has dimensionality that differs from the source Planning Essbase cube, allowing reporting users to see data in different ways. As such, the Essbase data load rules file should map the fields of the source text file to the target ASO cube's dimensionality, which the rules file handles easily. This in turn simplifies the Planning model while maintaining rich reporting functionality.

Step #4 – Create a simple MaxL script to run the report script against the source BSO cube and load into the target ASO cube.
  • Tip: Star Command Center Essbase tasks could also be used for this step.
login $(ESS_USER) $(ESS_USERPW) on $(ESS_SERVER2);
export database $(ESS_APP).$(ESS_DB) using report_file 'ExptData.rep' to data_file 'Delta.txt';
import database 'TotASO'.'Plan1ASO' data from data_file 'Delta.txt' using rules_file 'Delta.rul' on error abort;
logout;
exit;
Step #5 – Create a simple task sequence in Star Command Center as illustrated:
  • Tip: The above approach can actually be used with any other type of task sequence, for example loading Essbase data into relational data warehouse or running an external variable driven procedure.


  1. Hey Quinlan,

    In the report script you reference parameters. How are they being passed into the report script? Is it via the Command Center? Or can report scripts grab them right off a web form? Is that why you don't use SIS or a DATAEXPORT calc to do the data extract?


  2. Hi Sam,

    The Planning page slicers are passed directly to the Star Command Center Event Listener - so it is 100% on demand. The parameters can then be used in *any* automated process. In the use case above, we use the parameters in a report script to narrow the intersection point. As such, the report script runs really fast (sub-second), and the report script output can then be used to push the data into an ASO cube via a rules file.