General Ledger Journal Import in Dynamics 365 With Logic Apps
Rob Dowsett
|
Jun 13 2018
|
8 min read
A common scenario in most implementations is the need to import data. In this Dynamics 365 example, we will look specifically at general ledger journal data, as almost every project imports some form of general ledger history.
There are a few ways to go about this. The Excel add-in is a great tool for small volumes of data, but won't handle large volumes (1,000+ lines) well.
For large imports, the Data Management Framework (DMF) is an excellent choice. This provides a tool that can handle very large files, large volumes of files, and automated processing.
We will be using the Ledger Journal Entity (LedgerJournalEntity), and to keep this scenario real world, we will also add a custom field to the ledger journal lines table to reference a legacy account code.
Adding a Custom Field
We need to populate a new field on the journal line with a legacy account code. To do this we first need to identify the objects we need to modify. LedgerJournalTrans is the easy one. We know we are using the General journal entity. By looking at the properties of the entity in Visual Studio, we can see the staging table used by the entity is called LedgerJournalEntityStaging. We will also need to add our custom field to this table and extend the entity to expose the new field.
And lastly, we will need to populate our new field when the journal is created. Our new field will be populated in the staging table by the mapping on the DMF project, but the new field on the journal line (LedgerJournalTrans) will need to be populated explicitly by us. To see how, look at the static method copyCustomStagingToTarget on the LedgerJournalEntity object. You can see the data is inserted into the LedgerJournalTrans table via an insert_recordset operation; in fact, all operations are recordset-based to improve performance.
We don't want to customize any standard objects, so what do we do? We can create an event handler for the post event of this method and, in our own recordset operation, populate our new field.
Create a new class, paste in the method header, and write your update command. Note the skipDataMethods call: without it, the update operation reverts to record-by-record updates. For a 10,000-line file, that is the difference between roughly two hours and two minutes.
Full method:
[PostHandlerFor(tableStr(LedgerJournalEntity), tableStaticMethodStr(LedgerJournalEntity, copyCustomStagingToTarget))]
public static void LedgerJournalEntity_Post_copyCustomStagingToTarget(XppPrePostArgs args)
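As a rough sketch (not the exact code from the original project), the complete handler could look like the following. The custom field name LegacyAccount, the join fields, and the parameter name _dmfDefinitionGroupExecution on the standard method are assumptions; check the actual signature of copyCustomStagingToTarget and your own field names before using it.

class LedgerJournalEntity_Events
{
    [PostHandlerFor(tableStr(LedgerJournalEntity), tableStaticMethodStr(LedgerJournalEntity, copyCustomStagingToTarget))]
    public static void LedgerJournalEntity_Post_copyCustomStagingToTarget(XppPrePostArgs args)
    {
        // The execution record identifies which staging rows belong to this import run.
        // The argument name is assumed from the standard method signature.
        DMFDefinitionGroupExecution execution = args.getArg('_dmfDefinitionGroupExecution');

        LedgerJournalTrans         ledgerJournalTrans;
        LedgerJournalEntityStaging staging;

        // Skip data methods so the update stays set-based instead of
        // falling back to record-by-record processing.
        ledgerJournalTrans.skipDataMethods(true);

        // LegacyAccount is our hypothetical custom field on both tables.
        update_recordset ledgerJournalTrans
            setting LegacyAccount = staging.LegacyAccount
            join staging
                where staging.JournalBatchNumber == ledgerJournalTrans.JournalNum
                   && staging.LineNumber         == ledgerJournalTrans.LineNum
                   && staging.DefinitionGroup    == execution.DefinitionGroup
                   && staging.ExecutionId        == execution.ExecutionId;
    }
}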
We only need a handful of values to create a journal: debit or credit amount, currency code, transaction date, and the account string. We also have our new legacy account field, plus a field for our journal batch number.
So our CSV format will look like this: Batch, Date, Account, Legacy account, Currency, Debit, Credit.
We will also need to format the values in our file correctly, specifically dates and numbers.
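As an illustration only (the date and decimal formats depend on how you set up the source data format in DMF, and the account strings depend on the dimension format covered next), a couple of balanced lines in the file might look like this:

Batch,Date,Account,Legacy account,Currency,Debit,Credit
UPLOAD001,30/6/2018,110110-001-022,1-1100,USD,1500.00,
UPLOAD001,30/6/2018,200100-001-022,1-2000,USD,,1500.00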
Financial Dimension Configuration for Integrating Applications
As important as the format of our source file is the format of our account string (GL account + dimensions). This is set up under General ledger > Chart of accounts > Dimensions > Financial dimension configuration for integrating applications.
A few things to remember before you begin: make sure the dimension format type is "Ledger dimension format," and include all of your dimensions. You can only have one format, but you can leave unused dimensions blank in the account string.
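For example (an assumed setup, not from the original post), with a main account plus BusinessUnit and Department dimensions and a dash delimiter, a fully qualified account string would look like 110110-001-022, and a line that does not use Department would be entered as 110110-001- with the last segment left blank.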
Create DMF Project
In D365, we need to create a new DMF project to import our data. From the Data management workspace, click on "Import."
Upload a sample source file, which will generate the mapping.
Click on “View map” and switch to the Mapping details tab (I prefer this over the visual mapping tool).
And complete the mapping setup.
Note the three additional lines for voucher, journal name, and line number. Line number is auto-generated, journal name is defaulted to a value of our choosing via the default value button (Upload in my case), and voucher is auto-defaulted as well (I used VN1). The voucher will be replaced with a real voucher number when the journal is posted, based on the "Number allocation at posting" setting on the journal name setup.
Create the Recurring Data Job
Next we create the recurring data job. This will create an endpoint we can enqueue our files to, as well as the batch job that processes the inbound messages.
Note the ID; it will form part of our URL later when we build our Logic App.
The Application ID needs to come from your Azure AD application.
Set both the processing and monitoring recurrence as necessary.
Azure Active Directory Applications
This will associate our integration with a user account in D365.
Logic App
I'm not going to cover the entire Logic App creation, as Microsoft is soon to publish an ARM template for this. Below is the basic flow of the two Logic Apps. It is pretty simple; you could do a lot more here in terms of error handling if you were so inclined.
Logic App #1 (Get-files)
Runs on a predefined schedule
Looks for files in the source folder
For each file in source, copies it to an In-Process folder and sends the file name to Logic App #2
Logic App #2
Receives the file path from #1
Gets the content of the file using the OneDrive action
Extracts the company code from the file name
Enqueues the file to D365
Returns the response code from D365 to #1
(You could do more here based on the response code…)
Based on the response code, moves the file to an error or archive folder
Deletes the original file from in-process
Below is the HTTP POST action in Logic App #1, which passes the Company, file name & file path to Logic App #2. Note the ability to use expressions in the input of the action. Take a look at the Workflow Definition Language schema for Azure Logic Apps for more detail.
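For example (a hypothetical expression, assuming file names such as USMF_GLJournal.csv where the first four characters are the company code and the loop item exposes a Name property), the Company input inside the For each could be something like:

substring(item()?['Name'], 0, 4)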
Logic App #2 (Enqueue files)
The first action is the HTTP Request. Use the HTTP POST URL to call this Logic App from the first Logic App. The parameters we need to receive are defined in the Request Body JSON Schema.
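As a sketch, a minimal Request Body JSON Schema for the three values described here might look like the following; the property names are my own assumption and simply need to match what the first Logic App sends:

{
  "type": "object",
  "properties": {
    "Company":  { "type": "string" },
    "FileName": { "type": "string" },
    "FilePath": { "type": "string" }
  }
}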
Next, using the Path parameter and the “Get file content using path” OneDrive for Business action, we get the contents of our source file.
Now that we have the file contents, we can send them to D365. I'm using the HTTP POST action.
The URI is made up of a number of different pieces of information.
Client ID and Secret come from your AAD application.
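As a sketch, the enqueue URI for a recurring data job generally follows this pattern (substitute your environment URL, the recurring data job ID noted earlier, the entity name, and the company code extracted from the file name; spaces in the entity name need to be URL-encoded):

https://<your-d365-environment>/api/connector/enqueue/<recurring-data-job-ID>?entity=General journal&company=USMF

The request itself is authorized with a bearer token obtained from AAD using the client ID and secret.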
Our response takes the status code received from D365 and the body (hopefully the message ID, otherwise an error message) and returns those to our first Logic App.
The End Result
Response from HTTP POST action to D365:
Response sent back to Logic App #1:
And in the D365 scheduled data jobs screen, we can see our messages being processed successfully.
Have any questions about how to import the General Ledger Journal data in Dynamics 365? Please contact one of our D365 experts at any time!
About the Author
Rob Dowsett
Rob has worked in the Microsoft Dynamics channel for over 15 years, with expertise in Dynamics 365 Enterprise. Throughout his career, Rob helped build a thriving Dynamics AX practice with a team of over 20 and gained multiple Microsoft President Club awards.