Yes and no... It shares much of DRM's terminology and many of its features, but works in a fundamentally different way.
It is more like a cloud version of DRM that is built around use via the DRG module only, with changes flowing into a persistent 'Master Version'. Every change is made in the context of a Request, which is atomic (so it can hold several related changes that are approved and committed as one).
Gone are the concepts around Version containers... once a Request is approved, the change is committed. New/pending changes can only be viewed through the lens of a Request.
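The Request mechanics described above can be sketched roughly as follows. This is an illustrative model of ours, not the EDMCS API: class and method names are invented, but it shows the key behaviour that staged changes are invisible in the master until the whole Request is approved and committed as one unit.

```java
import java.util.ArrayList;
import java.util.HashMap;
import java.util.List;
import java.util.Map;

// Hypothetical model of a 'Master Version' hierarchy: node name -> parent.
class MasterVersion {
    final Map<String, String> nodes = new HashMap<>();
}

// Hypothetical model of a Request: changes are staged here, and only reach
// the master when the whole Request is approved (atomic commit).
class Request {
    private final List<String[]> staged = new ArrayList<>(); // {name, parent}
    private boolean approved = false;

    void addNode(String name, String parent) {
        staged.add(new String[] { name, parent }); // pending, not yet visible
    }

    // Approval commits every staged change as one unit.
    void approve(MasterVersion master) {
        for (String[] change : staged) {
            master.nodes.put(change[0], change[1]);
        }
        approved = true;
    }

    boolean isApproved() { return approved; }
}

public class RequestDemo {
    public static void main(String[] args) {
        MasterVersion master = new MasterVersion();
        Request req = new Request();
        req.addNode("CC_1000", "CostCentres");
        req.addNode("CC_1010", "CostCentres");
        System.out.println("Before approval: " + master.nodes.size()); // 0
        req.approve(master);
        System.out.println("After approval: " + master.nodes.size());  // 2
    }
}
```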
What is new/better (in our opinion) is the concept of subscriptions, where actions in one application/viewpoint can flow to others to keep related viewpoints in sync with a master, or to provide a simplified business entry layer that can grow more complex as it goes deeper towards specific application-level properties.
It helps to facilitate the idea of "manage by exception", keeping the entry layer simple (wide [scope] but shallow [complexity]) and thus allowing the business to take ownership of its own data (leaving IT teams to manage just the framework and technical elements like integrations).
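The subscription idea can be pictured with a small sketch. Again this is our own illustration rather than EDMCS code: a "hub" replays an action from the master viewpoint into every subscribed viewpoint so they stay in sync without anyone re-keying the change.

```java
import java.util.ArrayList;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

// Hypothetical viewpoint: just a named set of node names.
class Viewpoint {
    final String name;
    final Set<String> nodes = new HashSet<>();
    Viewpoint(String name) { this.name = name; }
}

// Hypothetical subscription hub: actions on the master flow to subscribers.
class SubscriptionHub {
    private final Viewpoint master;
    private final List<Viewpoint> subscribers = new ArrayList<>();
    SubscriptionHub(Viewpoint master) { this.master = master; }
    void subscribe(Viewpoint vp) { subscribers.add(vp); }

    // Add to the master, then replay the same action in every subscriber.
    void addNode(String node) {
        master.nodes.add(node);
        for (Viewpoint vp : subscribers) vp.nodes.add(node);
    }
}

public class SubscriptionDemo {
    public static void main(String[] args) {
        Viewpoint master = new Viewpoint("Master COA");
        Viewpoint planning = new Viewpoint("Planning");
        SubscriptionHub hub = new SubscriptionHub(master);
        hub.subscribe(planning);
        hub.addNode("CC_2000");
        System.out.println(planning.nodes.contains("CC_2000")); // true
    }
}
```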
While we were initially helping M&G to repatriate HFM to UK-based servers, they also decided to embark on a journey to the cloud.
Oracle's EPM Cloud, of course, and we helped them with use of the EPM Agent, REST API and EPM Automate in an Azure DevOps context to get data to and from the Cloud, plus orchestrate the overall processes. There were some interesting challenges, especially given things like member-combination bind limits and the performance of Cloud Data Management.
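A typical pipeline step along those lines is a scripted REST call into EPM Cloud. The sketch below is a minimal illustration only: the host, path and JSON payload are placeholders (not the documented API shape), and the `EPM_USER`/`EPM_PASSWORD` variables are assumed to be injected as Azure DevOps pipeline secrets.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.Base64;

public class EpmRestSketch {
    // Helper: HTTP Basic auth header from pipeline-injected credentials.
    static String basicAuth(String user, String pass) {
        return "Basic " + Base64.getEncoder()
                .encodeToString((user + ":" + pass).getBytes());
    }

    public static void main(String[] args) throws Exception {
        String user = System.getenv("EPM_USER");     // assumed pipeline secrets
        String pass = System.getenv("EPM_PASSWORD");
        if (user == null || pass == null) {
            System.out.println("Set EPM_USER and EPM_PASSWORD to make the call");
            return;
        }
        // Placeholder host/path/payload -- consult the EPM REST API docs
        // for the real endpoints and job payloads.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://example.oraclecloud.com/interop/rest/jobs"))
                .header("Authorization", basicAuth(user, pass))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(
                        "{\"jobType\":\"DATARULE\",\"jobName\":\"LoadActuals\"}"))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println("HTTP " + response.statusCode());
    }
}
```

In practice EPM Automate covers much of this from the command line; the REST route is useful when you need finer control from an orchestration step.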
Remember all the arrows you see on any nice pretty diagram of how it will all work: most of them need to be built, or at least configured... you will need a Fletcher to make your arrows!
We were able to carve out enough time during 2018 to also assist Honda with their Hyperion upgrades to 188.8.131.52 and OBIEE + ODI to 12c.
All very successful, despite hitting issues due to the change in technology underpinning HFM. It is all now a Java API, and ODI 12c does not ship with a knowledge module for it.
We instead developed some basic standalone Java routines that set the required EPM class-paths and allow data load/extract/calculate/consolidate to be invoked from ODI via a basic shell-out.
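The shape of such a routine is sketched below. ODI shells out to something like `java -cp <EPM jars> HfmShellOut <action> <application>`; the dispatcher here is our illustration only, with the actual HFM Java API calls left as TODOs, since their signatures depend on the installed EPM release.

```java
public class HfmShellOut {
    // Maps an action argument to what the routine would do; null = unknown.
    static String describe(String action, String app) {
        switch (action) {
            case "load":        return "Loading data into " + app;    // TODO: HFM load call
            case "extract":     return "Extracting data from " + app; // TODO: HFM extract call
            case "calculate":   return "Calculating " + app;          // TODO: HFM calc call
            case "consolidate": return "Consolidating " + app;        // TODO: HFM consolidate call
            default:            return null;
        }
    }

    public static void main(String[] args) {
        if (args.length < 2) {
            System.err.println("Usage: HfmShellOut <load|extract|calculate|consolidate> <application>");
            System.exit(2); // non-zero exit codes let ODI detect failures
        }
        String message = describe(args[0], args[1]);
        if (message == null) {
            System.err.println("Unknown action: " + args[0]);
            System.exit(1);
        }
        System.out.println(message);
    }
}
```

Keeping the routine argument-driven like this means one jar can serve every ODI step, with the step's OS command supplying the action and target application.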
There has also been some fun with Essbase patches: Oracle now has a grace period for fix support, but nobody told their Planning developers, so they still only certify an out-of-support Essbase. Madness.
From around Oct 2017 we were helping HSBC directly, though more in a Portfolio Architecture capacity.
This included the struggle to decide when (or if) it would be appropriate to move the likes of Hyperion Planning from an on-premise version (in Oracle cloud) to PBCS, or even to consider different products.
There was never any clear roadmap for when the cloud versions of the tools would be capable of handling some of the large/complex Planning models or allocation cubes.
Safe harbour statements, of course, surrounded every possible future path, along with rumours of mythical new on-premise versions.
From around June 2015 till Sept 2017 we were helping Deloitte to deliver new financial systems for the HSBC Service Company.
As all banks had to, HSBC needed separation between retail banking and their back-end service operations. For that to work they needed to be able to spin up a subsidiary that had its own financial ledgers, planning, and ERP.
We were brought in to help with the master data management, using DRM to turn the non-globalised Chart of Accounts segments (like Cost Centres) into something that would work with the Oracle ERP Cloud solution (and Hyperion Planning).
I would urge anyone thinking of a new global system to first tackle globalising their master data: the ERP needs a Cost Centre ID, for example, to carry the same code/description across all entities (even if the Entity gives it context during the GL posting). We managed to get data from golden sources within the Bank and use DRM to knit these together to produce a consistent (but ever-changing) COA.
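The "knitting" step above amounts to merging per-source Cost Centre lists into one global list while flagging inconsistencies. The sketch below is our illustration (not a DRM feature): each source supplies ID-to-description pairs, and any ID whose description differs between sources is flagged, since a global COA needs exactly one description per ID across all entities.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;

public class CoaMerge {
    // Merge several source maps (CostCentre ID -> description) into one
    // global map, recording IDs whose descriptions conflict between sources.
    static Map<String, String> merge(List<Map<String, String>> sources,
                                     List<String> conflicts) {
        Map<String, String> global = new TreeMap<>();
        for (Map<String, String> source : sources) {
            for (Map.Entry<String, String> cc : source.entrySet()) {
                String existing = global.putIfAbsent(cc.getKey(), cc.getValue());
                if (existing != null && !existing.equals(cc.getValue())) {
                    conflicts.add(cc.getKey()); // same ID, different description
                }
            }
        }
        return global;
    }

    public static void main(String[] args) {
        // Illustrative source data, not real HSBC cost centres.
        Map<String, String> sourceA = Map.of("CC100", "Group Finance", "CC200", "UK Ops");
        Map<String, String> sourceB = Map.of("CC100", "Group Finance", "CC300", "HK Ops");
        List<String> conflicts = new ArrayList<>();
        Map<String, String> coa = merge(List.of(sourceA, sourceB), conflicts);
        System.out.println(coa.size() + " cost centres, conflicts: " + conflicts);
    }
}
```

The conflict list is the valuable output: those are exactly the IDs the business has to resolve before the COA can be called global.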