In 2017, Andrew White of Gartner published a blog post entitled Before Implementing ERP Systems, Implement Application Data Management!  It’s a quick and interesting read about how traditional MDM lacks the rigor to keep applications in sync beyond core elements like customer and product data.  The article makes many good points and coins the acronym ADM (Application Data Management), which influenced some major players, including Oracle and SAP.

His message was that traditional MDM doesn’t address the challenges of synchronizing application-level data and settings across numerous ERP modules and other systems.  He observed that the slow degradation of data definitions and misalignment across applications rarely creates major issues quickly; it is more like death by a thousand cuts.  While his focus was on managing data across modules within an ERP, in the age of the cloud these needs quickly transcend the ERP and must extend to solutions provided by multiple vendors.

Enter Oracle’s Enterprise Data Management (EDM) Cloud, which has matured significantly in the two years since its inception.  There are major benefits that can be achieved very quickly using the solution.  For example, financial reporting structures can be easily harmonized by merely installing and configuring out-of-the-box connectors to Oracle ERP Cloud Financials, as well as to Financial Consolidation and Close and Planning in the Oracle EPM Cloud.

General Ledger reporting segments and hierarchies in ERP can be synchronized with reporting dimensions in Consolidation and Planning.  To the uninitiated, understanding the similarities and differences between these data structures can be daunting, and that was before they could be governed in a single tool!  Add in:

  • Robust security;
  • Drag and drop editing;
  • Approval workflows;
  • Automated updates across systems via a subscription model;
  • And the ability to audit metadata changes;

and the combination of insight and control that can be instituted in a matter of weeks is truly awesome and inspiring.

Applications

When applications are registered in Oracle EDM Cloud using prebuilt adapters, metadata is generated automatically for all data elements to be mastered.  In the screenshot above, the applications that have been registered in EDM Cloud are shown.  The registration definitions include all the segment types, hierarchies, and reporting dimensions being mastered for those applications, as well as the connection details for those cloud services.

Viewing Application Dimensions & Metadata

When viewing an Application, all the nodes being mastered can be seen.  In the example above, the General Ledger (GL) Account segments from the ERP Cloud Financials GL are viewed in list format.  The properties to the right include the account description, Boolean flags (Allow Budgeting, Allow Posting, etc.), effective dates, and other metadata required by the General Ledger in Oracle ERP Cloud.

Maintenance-Centric Views

While it’s useful to see all the metadata being mastered for an application, EDM Cloud really shines when domain-specific views are created.  In the example above, a custom view shows all the uses of GL Accounts across these applications.  The first tab shows the GL Account structure as a hierarchy, rather than a list.  The properties for each account remain the same: the definitions required by the Oracle Fusion GL.

The second tab in the Account Maintenance View shows the Accounts dimension in Financial Consolidation and Close.  The same Account that was highlighted on the prior tab is highlighted here, and at a glance one can quickly tell that the Financial Consolidation and Close Account dimension looks significantly different from the Fusion GL Account hierarchy.  It contains many prebuilt members, but all the leaf-level GL segments are required, and many of the parent members are reused as well!  The properties being managed are also quite different, and reflect the metadata used to manage and tune the performance of the Financial Consolidation and Close application in the EPM Cloud.

The third tab in the view displays the Planning Account dimension.  It is purpose-built to support multiple planning models and contains many prebuilt members.  It also has the most complex property metadata, since the dimension can be tuned differently for use in the Financial, Workforce, Project, CAPEX, Strategic, and Custom planning models within Oracle EPM Cloud.

Managing all these aspects centrally within EDM Cloud allows multiple teams to collaborate and effectively control the impacts of GL segment changes across the enterprise.  The use case we’ve shared can be extended well beyond GL segment changes to include classification schemes, status codes, and system settings.  It can also include data warehouse reporting dimensions, keeping the mappings used by ETL processes up-to-date.  The ability to comprehensively validate metadata enables organizations to proactively control the impact of application-level changes across the enterprise, keeping reporting landscapes aligned like never before.

Contact AST today to learn more about EDM Cloud, and how Application Data Management can benefit your organization.

The EPM Automate utility can now be used on Mac.  In this post, you will learn how to install the utility on Mac OS and get it working.  Let’s dive right in!

Download and Install JDK

The first step is to download Java Development Kit (JDK) from Oracle’s website.  If you already have JDK installed, you may skip to the next step.  

Accept the license agreement and select the .dmg file for Mac OS.

Once the download is complete, proceed with installing JDK on your Mac.  After installation, open a new terminal and type the following commands to ensure that Java has been properly installed.
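The two commands are:

which java
java -version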

The first command will give you the path to where Java has been installed (usually “/usr/bin/java”).  The second command will provide version details of the Java installed on your Mac.

Download EPM Automate for Mac OS

Log in to the EPM instance and navigate to Downloads.  

Select the “Download for Linux/Mac” option for EPM Automate.

This will download the EPMAutomate.tar file.  I then moved the file to a “Documents/EPM” folder.  

The next step is to unzip the “.tar” file, which can be done by following these commands on your Mac Terminal:
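Assuming the file was moved to ~/Documents/EPM as described above:

cd ~/Documents/EPM
tar -xf EPMAutomate.tar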

The above commands will create the epmautomate folder within the “~/Documents/EPM” folder.

Set JAVA_HOME and Path to EPM Automate

For the EPM Automate utility to work properly, we need to set up JAVA_HOME and the path to the EPM Automate utility.  Begin by opening ~/.bash_profile in a terminal editor such as vi.

Once open, add the following details to the end of the file.  (In vi, press “i” to enter insert mode.)
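As a sketch (the EPM Automate path below assumes the ~/Documents/EPM location used earlier; adjust both lines for your machine):

export JAVA_HOME=$(/usr/libexec/java_home)
export PATH=$PATH:~/Documents/EPM/epmautomate/bin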

Important Note:  Be sure to provide the correct path to the EPM Automate bin directory.

Save the file by pressing “Esc”, typing “:wq!”, and pressing “Enter/Return”.  Exit the terminal.  Open a new terminal and type the following to ensure all is working properly:
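For example:

echo $JAVA_HOME
echo $PATH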

You should see output in which the path to Java and the path to EPM Automate are correctly set.

Run EPM Automate Commands

Open a terminal and type epmautomate.sh; the utility should respond with its version and usage information.

Now, you can use the EPM Automate utility on your Mac!

As always, let us know if you have questions by leaving a comment below.  Thanks for stopping by!

In this post, we’ll cover the basics of using PBCS/EPBCS REST APIs using Groovy Scripts.

With the release of the REST APIs, you now have the ability to develop robust applications using Java, Groovy, and cURL.  Let’s see how we can use the PBCS REST APIs.

For the purposes of this post, it is assumed that you are using a Windows Operating System, though Groovy can also be installed on Mac/Linux.

Here is a summary of the topics we’ll cover in this post:

  • Install and Configure Java
  • Install and Configure Groovy
  • Groovy Console
  • The First Groovy Program

Install and Configure Java

The first step is to install Java on your system.  Head over to Oracle to get the latest version of the Java Development Kit (JDK).  Accept the license agreement and download the software.  Once the software is downloaded, install Java on your system, keeping the default options.

In my case, I downloaded the 64-bit version, as shown below.

Once Java is installed, make sure to:

  1. Add JAVA_HOME to Environment Variables and
  2. Edit your “Path” environment variable to include the path to the Java folder.

For further explanation on how to complete the above two configurations, click here. Be sure to restart your system after making changes to the environment variables.

Once logged back in, open a command prompt and type “java -version”; you should see the version displayed, as shown below.

Install and Configure Groovy

The second step in this process is to install Groovy.  Head over to the Groovy Language website, scroll down, and download the “Windows Installer”, keeping all default values.

Next, download the java-json.jar file. Head over to the Java2s website to download the file. Copy the java-json.jar file to the Groovy Library folder.

In my case, it is “C:\Program Files (x86)\Groovy\Groovy-3.0.0\lib”.

Now, you’re all set to start invoking PBCS REST APIs using Groovy!

The Groovy Console

Open the Groovy console from the Windows start menu. The console consists of 3 main areas:

  • Toolbar
  • Editor – Write the code here
  • Output Area – Check error messages and output messages here

You should also become familiar with the “Execute” and “Clear” buttons, as shown in the image below.

The First Groovy Program

Oracle has provided some excellent examples and sample code for using the PBCS REST APIs with Groovy.  Head over to Oracle Docs for details.  We’ll be working with the “List Files” function.  Before we start, though, let’s look at the anatomy of a Groovy script.

Import Section

The import section consists of any external files (libraries) that are needed to execute the program. In our example, we’ll include the following:

import org.json.JSONObject
import groovy.json.JsonSlurper

Variables

Assuming you are already familiar with variables, allow me to dive right in and explain the variables we’ll be using:

  • serverUrl:  Stores the planning URL. Be sure to include the port number at the end.
  • username:  PBCS/EPBCS login user name. The format to be followed is “domain.username”.
  • password:  PBCS/EPBCS login password.
  • apiVersion:  The REST API version to use.  The Migration API sample below uses 11.1.2.3.600.
  • appName:  PBCS/EPBCS Application name.

Below is the sample code. To find the current version of Migration APIs, click here.

serverUrl="https://planning-yoururl.oraclecloud.com:443"
username="a123456.adminuser"
password="mypassword"
apiVersion="11.1.2.3.600"
appName="Vision"

Methods or Functions

We’ll re-use the functions and methods from Oracle’s documentation.  Grab the code for each of the following (a simplified sketch of the core call appears after this list):

  • fetchResponse()
  • fetchJobStatusFromResponse()
  • executeRequest()
  • listFiles()
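To give you a feel for what these helpers boil down to, here is a minimal sketch (not Oracle’s verbatim sample) of a listFiles() that issues the List Files request directly and prints each file name, using the imports and variables defined above:

def listFiles() {
    // GET <serverUrl>/interop/rest/<apiVersion>/applicationsnapshots
    def url = new URL(serverUrl + "/interop/rest/" + apiVersion + "/applicationsnapshots")
    HttpURLConnection connection = (HttpURLConnection) url.openConnection()
    connection.requestMethod = "GET"
    // Basic authentication: base64 of "domain.username:password"
    String credentials = (username + ":" + password).bytes.encodeBase64().toString()
    connection.setRequestProperty("Authorization", "Basic " + credentials)
    // Parse the JSON response and print the name of each file
    def response = new JsonSlurper().parseText(connection.inputStream.text)
    response.items.each { println it.name }
}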

You’re all set!

Now for the fun part – seeing the results. Before we go ahead and execute the code, let’s add the following line to invoke the process:

listFiles();

Click the “Execute” button and take a moment to enjoy the first Groovy Script.

As always, don’t hesitate to leave us a comment below with any questions.  Thanks for stopping by!

*You might also want to check out an earlier post, Testing PBCS REST APIs Using SoapUI.

In this post, we’ll look at how to fetch data from Oracle ERP (SaaS) and load it into Oracle EPM (SaaS). 

*This post assumes that the reader has some knowledge of PowerShell and EPM Automate commands.  It also assumes that EPM Automate is already installed and that the import jobs are set up in the EPM instance to import the data file.

The high-level steps involved in the process are:

  • Invoke BI Report
  • Parse and Decode Base64 Response
  • Upload File to EPM
  • Invoke Import Job in EPM

Here, we take a look at using a combination of PowerShell and EPM Automate utility commands to accomplish the task at hand.

Invoking the BI Report

In this scenario, we are using Oracle BI Publisher Web Services.  More specifically, we’re using the “ExternalReportWSSService” web service and the “runReport” method.

In an effort to simplify this example, the BI report does not have any parameters.  The output is in CSV format.  

Create a SOAP payload file, as shown below, and save it as soapRequest.xml.
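A representative payload looks like this (the report path here is a placeholder for your own report):

<soap:Envelope xmlns:soap="http://www.w3.org/2003/05/soap-envelope"
               xmlns:pub="http://xmlns.oracle.com/oxp/service/PublicReportService">
  <soap:Body>
    <pub:runReport>
      <pub:reportRequest>
        <pub:attributeFormat>csv</pub:attributeFormat>
        <pub:sizeOfDataChunkDownload>-1</pub:sizeOfDataChunkDownload>
        <pub:reportAbsolutePath>/Custom/Financials/Position Report.xdo</pub:reportAbsolutePath>
      </pub:reportRequest>
    </pub:runReport>
  </soap:Body>
</soap:Envelope>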

Make sure the path to the BI report is populated correctly. Look for reportAbsolutePath in the above code snippet.  Be sure to include “.xdo” at the end of the report name.  It is assumed that the report is available in the shared folders.

Now, let’s write the PowerShell script to invoke the BI report.  Make sure to use the correct Oracle Cloud URL.  You need the following information to invoke the web service:

  • User Name to Access BI Report
  • Password of the BI User
  • WSDL

Store all of this information in a JSON file.  Let’s name it biDetails.json.
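For example (these property names are simply what our script will read; adjust them to your liking):

{
  "username": "bi.report.user",
  "password": "myPassword",
  "wsdl": "https://your-erp-instance.oraclecloud.com:443/xmlpserver/services/ExternalReportWSSService"
}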

Be sure to provide the correct username, password, and URL to the BI server.

Moving on to the PowerShell script, let’s read biDetails.json to get the details required to invoke the web service.  We will also need to pass the authorization details as part of the request header.
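A sketch of that setup, assuming the biDetails.json property names shown above:

# Read the web service details from biDetails.json
$biDetails = Get-Content -Raw -Path ".\biDetails.json" | ConvertFrom-Json

# Build the base64-encoded credential string for Basic authentication
$pair = "$($biDetails.username):$($biDetails.password)"
$encodedCredentials = [System.Convert]::ToBase64String([System.Text.Encoding]::ASCII.GetBytes($pair))

# Add the encoded string to the request header
$headers = @{ Authorization = "Basic $encodedCredentials" }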

In the above code snippet, we created the encoded credential string, which is a base64-encoded combination of the user name and password.  The encoded string is then added to the request header.

The next code snippet invokes the BI report.
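Again as a sketch, with the endpoint URL coming from biDetails.json:

# Invoke the runReport operation; the SOAP payload goes in, the raw response comes out
Invoke-WebRequest -Uri $biDetails.wsdl `
    -Headers $headers `
    -ContentType "application/soap+xml;charset=utf-8" `
    -Method "Post" `
    -InFile ".\soapRequest.xml" `
    -OutFile ".\outFile.xml"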

Make sure the following are set correctly:

  • ContentType to “application/soap+xml;charset=utf-8”
  • Method as “Post”
  • InFile as soapRequest.xml (the SOAP Payload)
  • OutFile as outFile.xml (response will be saved)

outFile.xml is the response from the web service.  The report data is base64 encoded and is available in the XML response, in the reportBytes tag.

Parse and Decode the Report Data

The response from the web service is in base64-encoded format.  We have to decode it to save the data to a CSV file.  Use the below code snippet to decode the response.
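A sketch of the decoding step (the element path follows the BI Publisher runReport response; verify it against your own outFile.xml):

# Load the SOAP response and pull out the base64-encoded reportBytes value
[xml]$soapResponse = Get-Content -Raw -Path ".\outFile.xml"
$reportBytes = $soapResponse.Envelope.Body.runReportResponse.runReportReturn.reportBytes

# Decode the report data and save it as Positions.csv in the current working directory
[System.IO.File]::WriteAllBytes("$PWD\Positions.csv", [System.Convert]::FromBase64String($reportBytes))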

In the above code snippet, we are reading the outFile.xml file and generating the data file.  The data is saved as “Positions.csv” in the current working directory.

Upload File to EPM

Once the data file is generated from the BI report, we can use EPM Automate utility commands to upload the file to the Planning instance.  The prerequisite for this is to have EPM Automate installed on your system.

Log in to the Planning instance and upload the file. To learn more about EPM Automate commands, click here.  You may have to give the full path to the file in the upload command.
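For example (the user name, password, URL, and file path are placeholders):

epmautomate login serviceAdmin myPassword https://planning-yoururl.oraclecloud.com
epmautomate uploadfile "C:\EPM\Positions.csv"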

Invoke the Import Job in EPM

The final step is to invoke the job in Planning to import the data.  The prerequisite for this step is to define the import job in EPM.  Use the below commands to invoke the import job and log out of the Planning instance.
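For example:

epmautomate importdata Import_Position_Data Positions.csv
epmautomate logout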

Import_Position_Data is the import job defined in EPM.

Conclusion

This approach allows you to run the job on demand; you are not dependent on the ERP team or any scheduled jobs in ERP to get the data file.  There are other ways of doing this, but this is the most convenient from an EPM user’s perspective.

As always, if you have any questions or comments, or if you want the complete code, please leave a note below and one of our experts will get back to you.

In future posts, we’ll take a look at the same process, but using Unix Shell Scripts.  Stay tuned!

CFO 2020 mandates: agility, speed, efficiency

Every CFO must contend with formidable forces at play today. Economic and geopolitical uncertainty. Fierce, unrelenting competition. Dynamic market conditions. Constant compliance and security concerns. There’s worry about a possible recession in 2020. And let’s not forget the CFO’s ever-growing purview of responsibility. CFOs today must serve equally well as company cost managers, enterprise-wide transformation leaders and boardroom strategic advisers.

In line with this, CFOs are extending their functional scope from directing finance-centered accounting activities to leading cross-functional teams that link sales, distribution, marketing, finance, customer service, and other critical areas. They are broadening their focus from reducing costs and ensuring governance and control to driving transformation and growing revenue. In every way, CFOs are increasing their value to the enterprise and growing, deservedly, in power and pay.

In 2020, a CFO’s success will ultimately hinge on how well he/she performs against today’s unrelenting mandates for agility, speed, and efficiency.  To accomplish everything and deliver on these mandates, CFOs – once considered technology laggards – are embracing digital transformation and racing to the cloud.

In fact, 93% of the respondents to Deloitte’s 2018 Global Outsourcing Survey of finance and other business leaders reported that their organizations have already started, or are considering, new cloud implementations.  Similarly, a poll of 3,000 finance executives conducted during a recent Deloitte webcast found that nearly half (48%) of respondents believe that cloud technology will be critical to the performance of their finance departments within the next two years.

Now let’s have a look at how today’s Corporate Performance Management (CPM) solutions help CFOs deliver on these mandates.

Agility mandate

The only constant in business is change.  Simply negotiating the typical fluctuations of business is challenge enough.  Add to this today’s blistering pace of technological advancement and digital disruption, and keeping pace with change becomes more difficult by orders of magnitude.  Moreover, the consequences of not reacting or adapting quickly enough to changing conditions can be disastrous.

At a minimum, agility for CFOs means the ability to respond quickly to opportunities, address variations to assumptions or expectations, and minimize the negative impact of threats and challenges.  For this, CFOs need unhindered access to their organization’s trove of data to uncover hidden threats as well as new sources of value – and to do it all while controlling costs.

More than anyone else in the enterprise, the CFO is responsible for curating all the data that drives and defines a business. As the influence of CFOs ascends within the C-Suite, they’re the ones taking responsibility for turning that data into insights for better, more timely decision-making.

Accordingly, there is an urgent need for dynamic planning and forecasting that provides the agility businesses need to respond quickly and drive better financial decision-making. Gone are the days of ‘point-in-time’ pictures.

The natural fit between analytics and planning, budgeting, forecasting, and consolidating processes is driving a growing number of organizations to embrace analytics and select planning, budgeting and forecasting or financial reporting and consolidation as ground zero for migrating their CPM processes to the cloud.

Speed mandate

When I talk to CFOs about process improvement, pretty much without exception they tell me that their top priority is closing the books and reporting faster and more efficiently. Why the pressing need for a speedy close? Simply put, speeding up the process of closing the books each month frees CFOs and their finance teams to focus on more strategic priorities. If CFOs can get the books closed in just five days instead of 10, that’s five extra days each month they can spend on forecasting and decision support with the CEO, planning future projects with department heads, and perhaps most importantly, analyzing big data to drive efficiency and growth throughout the organization.

For decades, CFOs had no recourse other than to slog through a sea of spreadsheets in order to report on how their organization (or any aspect thereof) was performing. Not an enviable or effective task by any measure. Tragically, spreadsheet madness and disparate point solutions still hamper the CFO’s ability to close the books, let alone gain visibility into performance across the enterprise.

When a CFO can act faster, the entire organization moves faster. When a CFO can work smarter, the entire organization operates smarter. For this to happen, CFOs and finance teams must streamline and automate core, regularly occurring functions such as closing the books and preparing reports.

Thankfully, cloud-based financial close and reporting solutions, generally under the umbrella of cloud CPM, enable you to adapt lightning-fast to changing business and compliance requirements while reducing risk, improving control, and delivering faster, more accurate insights to all stakeholders—anytime, anywhere.

Efficiency mandate

It may be a cliché, but its truth cannot be denied: working faster and smarter, not harder, is the key to sustainable efficiency gains.  Recent research reveals (in financial terms) the widening gap between top performers and bottom dwellers when it comes to the efficiency of their financial operations.

Data from APQC’s Open Standards Benchmarking database shows that top-performing companies today spend 0.56% of their revenue on the finance function; median performers, 1%; and bottom performers, 1.6%.

Companies spend less on finance as a percentage of revenue because they have optimized, efficient, and effective processes. Those processes have lower cycle times and require fewer full-time employees. When companies spend less to accomplish the same amount of work, they have opportunities to reinvest those resources in improving the bottom line.

Efficiency gains will always yield cost savings. Unsurprisingly, the top reason CFOs move CPM to the cloud is to reduce costs.  Even closer to the CFO’s heart, moving to the cloud also enables CPM users to innovate and adopt best practices such as rolling forecasts, driver-based planning, and faster reporting and close cycles.

Contact AST today and let our experts put the power of the OneStream XF platform to work for your enterprise: simplified Financial Close and Consolidation, plus unified Planning, Budgeting, and Forecasting with robust Financial Reporting.

Aside from our OneStream implementation services, AST can share deployment options and highlights of the XF platform’s key features, recognized by customers and by Gartner in its “Magic Quadrant”.

Whether your organization is highly manual in the Finance area or looking for more from your CPM software, contact us today to assess the power of OneStream XF.

In this article, we’ll show you how to add custom expense types in Oracle’s Enterprise Planning and Budgeting Cloud Service (EPBCS) Projects module.  This can be helpful if you are interfacing Project Plan/Actuals data from another system (e.g., Oracle PPM); the external system may be using different expense types that you want to capture in the EPBCS Projects module.

Let’s get started…

Predefined Expense Types

The Projects module in EPBCS comes with certain predefined expense types.  These expense types are used to capture and plan project expenses.  To view the standard expense types available in EPBCS, navigate to:

Dimensions > Account > OPF_Total Expenses

Adding Custom Expense Types

You may want to add other expense types to suit your needs or the needs of your client.  Here’s how to do that:

  1. Add a sibling to ‘OPF_Miscellaneous Expense’ and name it “OPC_Direct_Expenses”.  This will allow us to group together any custom expenses that we define.
  2. Add children to the member we added in Step 1.  For this example, we’ll add two members:  Custom Expense1 and Custom Expense2.
  3. Refresh the database.

View Custom Expense Types in Expense Entry Form

Now, we need to check the Expense Forms to make sure we can see the custom expense types we just created.

Navigate to Projects > Expense > Direct Entry.

Click on the list of values (LOV) under Expense Type.  You should see your custom expense types in this list.

Congratulations!  Moving forward, you can plan and capture project expenses using custom expense types in the EPBCS Projects module.

As always, leave us a comment below with any questions you have about this process.  Also, if there is a specific topic that you would like to see featured on our blog, please let us know.  Thanks for joining us!

Enterprise Planning and Budgeting Cloud Service (EPBCS) provides users the flexibility to select the financial planning process that best suits their needs.  The various planning options available in EPBCS Financials (FS) are:

  1. Direct Entry
  2. Trend-Based Planning
  3. Driver-Based Planning

EPBCS comes packaged with standard drivers that users can enable based on business requirements.  A few out-of-the-box drivers that can be enabled include:

  1. Compensation
  2. Marketing
  3. Sales
  4. Travel and Entertainment

These drivers can be used in planning and forecasting preparation.  Along with the drivers, you would also use pre-defined assumptions in your planning process.  These drivers would be used in the Driver-Based Planning forms.

Creating Custom Drivers in the EPBCS Financials Module

Often, the planning process may include the use of custom drivers.  Here, we will take a look at how to create custom expense drivers that can be used in the planning process.

Enabling custom drivers is a two-step process:

  1. Enable Drivers and Related Accounts for FS
  2. Create Custom Drivers for Expense Accounts

We’ll go through each of these steps in more detail, below.

1. Enable Drivers and Related Accounts for FS

When you are enabling features in the FS module, always be sure to:

  1. Enable drivers and related accounts for expenses, and
  2. Select at least one predefined driver option from the given list (in this case, we are selecting Sales).

Once this is enabled, you will see the predefined account drivers in the Account Dimension.

You will also notice that there is a new user variable available on the User Variables page.

This variable is used in the Driver-Based Forms in the FS module.

2. Add or Create Custom Drivers

Navigate to Configure: Financials > Expense Accounts

In the Expense Accounts screen, click on Actions > Add Category.

This will create a new category/parent member for custom drivers.  In this example, we’ll call it Wage Drivers.  Click OK to create the category.

Click Save, then click Actions > Add to create various driver members in the “Wage Drivers” category.

Click Save.

Navigate to Dimensions > Account.  You will notice that the new drivers are now available in the Account Dimension.  The members are created as children to “OFS_Expense Drivers for Forms”.

You can now use these drivers during your planning process.  The drivers will also be available in the out-of-the-box forms within the EPBCS FS module.

As always, if you have any questions about this process, please leave us a comment below.

In this post, we’ll show how to test Oracle Planning and Budgeting Cloud Service (PBCS) REST APIs using SoapUI.

In the example below, we will test the List Files REST API.  For more details on the List Files API, check out the Oracle Help Center:  REST API for Oracle Enterprise Performance Management Cloud.

Constructing the REST API URL

First, you need to have the REST API URL in order to test the API.  Follow these steps to construct it:

Format:  https://<SERVICE_URL>:443/interop/rest/<API_VERSION>/applicationsnapshots

  • SERVICE_URL = Let’s say your PBCS URL is:
    https://planning-test-a1234567.pbcs.us2.oraclecloud.com/workspace.
    The service URL in this case is:
    https://planning-test-a1234567.pbcs.us2.oraclecloud.com
  • API_VERSION = For testing purposes, use the value 11.1.2.3.600.  *If you want to find out the latest version, you can use the getLCMVersions() helper function for Groovy/Java.  Here is a link to the Groovy helper functions, provided by Oracle.  Leave a comment below and let us know if you are interested in learning more about invoking REST APIs using Groovy.
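Putting these pieces together, the full List Files URL for the example instance above would be:

https://planning-test-a1234567.pbcs.us2.oraclecloud.com:443/interop/rest/11.1.2.3.600/applicationsnapshots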

Testing Using SoapUI

Right-click on Projects and select “New REST Project”, as shown below.

Next, enter the REST API URI and click “OK”.

You will then see a window similar to this:

The next step is to enter the authentication information.  Let’s assume the following:

  • Domain Name: a1234567
  • Username: username1
  • Password: password123

When entering the authentication information in SoapUI, make sure to follow this convention:

Username = Domain.username

So, in our case, the username field will be a1234567.username1.

Now, click the Auth button and select Add New Authorization > Basic, then enter the details, as shown.

Make sure not to enter any value in the “Domain” field while you are on the Auth screen. Once this is complete, check the Method at the top of the panel to ensure it is set to GET.

You will also see a green Run button, which is used to send the request.  Next to it is the Stop button, which can be used to interrupt the request.

To the right of the panel, you will see options to select the data format of the response.  Select “JSON” as the data format.

Once the setup and configuration are complete, it is time to test the API.  Click the green Run button.  If there are no errors, you will see a JSON response, as shown below, in the right side of the panel.
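The response has roughly this shape (trimmed and illustrative only; your file names and values will differ):

{
  "items": [
    {
      "name": "Artifact Snapshot",
      "type": "LCM"
    }
  ],
  "status": 0
}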

If you have any questions about this process, please leave a comment below and one of our EPM experts will get back to you with a response.

Welcome to AST’s blog dedicated to all things Enterprise Performance Management (EPM)!  Our expert EPM team members will share their knowledge, lessons-learned, and best practices here.  Bookmark this page and check back regularly for exciting updates and posts.  If you have any questions about our posts, or anything EPM-related, please feel free to leave us a comment and one of our experts will get back to you.

Thanks for joining us!
