Contact Dave Hogg

Tel: +44 (0)7900 013007
Email: Dave@EPMinfra.com

We were able to carve out enough time during 2018 to also assist Honda with their Hyperion upgrades to 11.1.2.4 and OBIEE + ODI to 12c.

All very successful, despite hitting issues due to the change in technology underpinning HFM. It's all now a Java API, and ODI 12c doesn't ship with a knowledge module for it.

We instead developed some basic standalone Java routines that set the required EPM classpaths and allow data load/extract/calc/consolidate to be driven from ODI via a basic shell-out.
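For anyone facing the same gap, the shape of those routines was roughly as below. This is a minimal sketch only: the class name and argument handling are hypothetical, and the actual HFM Java API calls are left as stubs, since they depend on the 11.1.2.4 jars you put on the classpath.

    // HfmTask.java - illustrative sketch only. ODI shells out to this via
    // something like: java -cp "$EPM_CLASSPATH" HfmTask load <params>
    public class HfmTask {

        public static void main(String[] args) {
            if (args.length < 1) {
                System.err.println("Usage: HfmTask load|extract|calc|consolidate [options...]");
                System.exit(1);
            }
            try {
                switch (args[0].toLowerCase()) {
                    case "load":        runLoad(args);        break;
                    case "extract":     runExtract(args);     break;
                    case "calc":        runCalc(args);        break;
                    case "consolidate": runConsolidate(args); break;
                    default: throw new IllegalArgumentException("Unknown action: " + args[0]);
                }
                System.exit(0);   // zero exit code: ODI treats the step as a success
            } catch (Exception e) {
                e.printStackTrace();
                System.exit(2);   // non-zero exit code: the failure bubbles back to ODI
            }
        }

        // Each stub is where the real HFM Java API session/connect/execute calls would sit.
        private static void runLoad(String[] a)        { System.out.println("load "        + String.join(" ", a)); }
        private static void runExtract(String[] a)     { System.out.println("extract "     + String.join(" ", a)); }
        private static void runCalc(String[] a)        { System.out.println("calc "        + String.join(" ", a)); }
        private static void runConsolidate(String[] a) { System.out.println("consolidate " + String.join(" ", a)); }
    }

On the ODI side, an OdiOSCommand step calls the wrapper and simply checks the return code, which keeps the integration simple and survivable across patches.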

There has also been some fun with Essbase patches, as Oracle now have a grace period for fix support, but nobody told their Planning developers, so they still only certify an out-of-support Essbase. Madness.

From around Oct 2017 we were helping HSBC directly, though more in a Portfolio Architecture capacity.

This included the struggle to decide when (or if) it would be appropriate to move the likes of Hyperion Planning from an on-premise version (hosted in Oracle Cloud) to PBCS, or even to consider different products.

There was never any clear roadmap for when the Cloud versions of the tools would be capable of handling some of the large/complex Planning models, or allocation cubes.

Safe harbour statements of course surround every possible future path, along with rumours of mythical new on-premise versions.

From around June 2015 till Sept 2017 we were helping Deloitte to deliver new financial systems for the HSBC Service Company.

As all banks had to, HSBC needed separation between retail banking and their back-end services operations. For that to work they needed to be able to spin up a subsidiary that had its own financial ledgers, planning, and ERP.

We were brought in to help with the Master Data management, using DRM to turn the non-globalised Chart of Accounts (like Cost Centres) into something that would work with the Oracle ERP Cloud solution (and Hyperion Planning).

I would urge anyone thinking of a new global system to first tackle getting their master data globalised, as the likes of the ERP need a Cost Centre ID, for example, to carry the same ID/description across all entities (even if the Entity gives it context during the GL posting). We managed to get data from golden sources within the Bank and used DRM to knit these together to produce a consistent (but ever-changing) COA.
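To illustrate the core rule (one Cost Centre ID, one description, everywhere), here is a toy version of the kind of cross-source validation DRM was enforcing for us. The source names and values are made up.

    import java.util.*;

    // Toy illustration only: a Cost Centre ID must carry one and only one
    // description across every golden source before it can join the global COA.
    public class CoaCheck {
        public static void main(String[] args) {
            // source system -> (cost centre ID -> description); values invented
            Map<String, Map<String, String>> sources = new LinkedHashMap<>();
            sources.put("LedgerA", Map.of("CC1000", "Group Finance"));
            sources.put("LedgerB", Map.of("CC1000", "Finance - Group")); // deliberate clash

            Map<String, String> master = new HashMap<>();
            sources.forEach((src, rows) -> rows.forEach((cc, desc) -> {
                String existing = master.putIfAbsent(cc, desc);
                if (existing != null && !existing.equals(desc)) {
                    System.out.printf("Clash on %s: '%s' vs '%s' (from %s)%n",
                                      cc, existing, desc, src);
                }
            }));
        }
    }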

For a month or so around June 2015, we were helping out the Hackett Group at their Seadrill engagement.

They needed to be able to load the exact same trial balance dataset that was being loaded to HFM into their new Account Reconciliation Manager (ARM) product.

We exploited the fact that the data was being loaded via FDMEE, using the Open Interface Adapter table so that it was retained, and used it to also automatically fire off a load of the same data into ARM.
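As a rough sketch of the ARM-side hop: once FDMEE has landed the trial balance in its open interface staging table (AIF_OPEN_INTERFACE), a small routine can spool it back out for the ARM load. The connection details are placeholders, and the COLnn-to-dimension mapping is an assumption; it depends entirely on how your import format is defined.

    import java.io.PrintWriter;
    import java.sql.*;

    // Sketch only: re-reads the retained trial balance from the FDMEE open
    // interface table and spools a delimited file for the ARM load.
    public class ArmSpool {
        public static void main(String[] args) throws Exception {
            // COL01 = account, COL02 = entity is an assumption - check your import format
            String sql = "SELECT col01, col02, amount FROM aif_open_interface WHERE batch_name = ?";
            try (Connection con = DriverManager.getConnection(
                     "jdbc:oracle:thin:@//fdmee-db:1521/FDMEE", "aif_user", "secret"); // placeholders
                 PreparedStatement ps = con.prepareStatement(sql)) {
                ps.setString(1, args[0]);    // batch name passed in on the command line
                try (ResultSet rs = ps.executeQuery();
                     PrintWriter out = new PrintWriter("arm_load.txt")) {
                    while (rs.next()) {
                        out.println(rs.getString(1) + "|" + rs.getString(2) + "|" + rs.getBigDecimal(3));
                    }
                }
            }
        }
    }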

One of my clients just came across this nasty little Essbase 'feature'...

They have a DR strategy that includes shipping the Essbase transaction logs to the DR server and applying them every hour. However, they found that the cube on the DR server had different data to their live production cube.

After much investigation it was found to be down to the execution of calc scripts that use Essbase Substitution Variables. Unlike Business Rule execution, it appears that on the DR server the calcs use the local variable settings rather than those set on the primary.

They do replicate the .sec file each evening (and hence the variable values), but if they change the typical variables (e.g. CurrYr or CurrPer) during the day, the transaction replay at the DR uses the old values.

Nasty. They have raised an enhancement request!
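In the meantime, an obvious mitigation (a sketch of a workaround, not Oracle's fix) is to force the DR variables to match production immediately before each hourly log apply. The server, application, and variable values below are placeholders; in practice you would pull the current values from the primary rather than hard-code them.

    import java.io.*;
    import java.nio.file.*;

    // Sketch: re-point the DR substitution variables, then let the log apply run.
    public class SyncSubVars {
        public static void main(String[] args) throws Exception {
            // Hard-coded here for the sketch; really these values would be read
            // from the primary (e.g. a 'display variable' spool shipped with the logs).
            String maxl = String.join(System.lineSeparator(),
                "login 'admin' 'password' on 'dr-essbase';",
                "alter database 'Plan1'.'Plan1' set variable 'CurrYr' 'FY19';",
                "alter database 'Plan1'.'Plan1' set variable 'CurrPer' 'Sep';",
                "logout;");
            Path script = Files.write(Paths.get("sync_vars.msh"), maxl.getBytes());

            // Shell out to the MaxL interpreter; fail loudly so the hourly job
            // stops rather than replaying transactions against stale variables.
            Process p = new ProcessBuilder("essmsh", script.toString()).inheritIO().start();
            if (p.waitFor() != 0) throw new IllegalStateException("MaxL sub-var sync failed");
        }
    }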
