Last week I was a speaker at the first AXUG European Congress. This event took place on May 9 and 10 in Stuttgart. My session was about the Data Import Export Framework in Microsoft Dynamics AX 2012, with a focus on performance and troubleshooting. I decided to summarize the presentation in two blog posts. In this post I will recap the performance considerations.
When Microsoft started investing in the Data Import Export Framework (DIXF), performance was one of the main design considerations. In addition, the framework had to provide a solution for the normalized data model, where many tables are linked through foreign key relations based on record IDs.
The source is now first loaded into a staging table without any business logic, such as data validation events or rules in insert methods. From the staging table the data is then copied to the target table(s), where the business logic is applied. In an older version of Microsoft Dynamics AX, I had to import over 60,000 fixed assets, each with 4 value models covering acquisition value and accumulated depreciation. In total there were over 240,000 value models and almost 500,000 journal lines to be posted for the opening balance. The data came in one flat Excel file with a fixed template, and I used a script that read the Excel lines, created the assets, posted acquisition and depreciation values, and then corrected the value models for the number of remaining depreciation periods. That script had worked fine for loads of at most 400 assets, but according to a calculation, processing the 240,000 value models and 500,000 transactions this way would take about 6-8 days. We then created a staging table containing the Excel columns, so the business logic could be processed from within AX using the batch framework. This solved the problem, and the transactions were converted within the given timeframe. So this staging architecture is a good implementation.
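To make the staging pattern concrete, here is a minimal sketch in Python, assuming a SQL Server database reachable via pyodbc; the DSN, the AssetStaging table, and the post_opening_balance helper are hypothetical stand-ins for the real AX artifacts, not DIXF internals.

```python
import pyodbc

def post_opening_balance(asset_id, value_model, amount):
    # Placeholder for the business logic that runs in step 2 only
    # (validations, posting of acquisition and depreciation values).
    pass

conn = pyodbc.connect("DSN=AxDb")  # hypothetical data source name
cursor = conn.cursor()

# Step 1: bulk-load the raw source rows into the staging table.
# No validation or insert-method logic runs here, so this stays fast.
rows = [
    ("ASSET-0001", "VM-ACQ", 15000.00),
    ("ASSET-0001", "VM-DEP", -3000.00),
]
cursor.fast_executemany = True
cursor.executemany(
    "INSERT INTO AssetStaging (AssetId, ValueModel, Amount) VALUES (?, ?, ?)",
    rows,
)
conn.commit()

# Step 2: read the staging rows back in batches and apply the business
# logic; in AX this is where the batch framework can parallelize the work.
cursor.execute("SELECT AssetId, ValueModel, Amount FROM AssetStaging")
while True:
    batch = cursor.fetchmany(1000)
    if not batch:
        break
    for asset_id, value_model, amount in batch:
        post_opening_balance(asset_id, value_model, amount)
```

The point of the split is that the cheap, dumb load and the expensive business logic can be tuned and scaled independently, which is exactly what brought the 6-8 day estimate back within the given timeframe.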
A very cool part of the DIXF framework is the use of SQL Server Integration Services (SSIS) to get the source data into the staging tables. Microsoft did an amazing job here. The next picture shows the flow in general; it does not explain which of the DIXF binary components (DLL extensions) takes care of which part of the integration with SSIS.
The setup of processing groups in AX is the basis for the SSIS packages, which pick up the source records and put them, with or without some basic transformations, into the staging table. SSIS is very efficient and a real winner on performance for this type of task.
Although Microsoft has invested a lot in performance for the Data Import Export Framework, you might still encounter some performance issues. You also need to be aware of some standard behavior and features to get the best performance when executing import jobs. I would recommend not changing objects when there is no real need for more performance. If you make changes to your environment based on this blog post, make sure you test them thoroughly. Changing your environment is at your own risk.
There are more ways to improve performance. Some relate to SQL Server tuning, others to the hardware you are using. There are also two settings in the DIXF parameters that can cause performance problems when set to the wrong values. The first field is Data access mode, which is a setting for SSIS. When it does NOT have the fast load option, the records are committed one by one, so use the fast load setting. When a postman has 10 letters for the same address, he can insert them one by one or all at once; the latter is comparable with the fast load option. The Maximum insert commit size field tells how many records will be inserted before a commit command is executed. Suppose the mailbox slot only takes 5 letters at a time; then 2 inserts are needed to put in all 10 letters. The default value is 2147483647, which effectively means there is no limitation and all records are committed at once. When you have, for example, a limited size for the TempDB on your SQL Server, you may need to lower this setting to, say, 100000 records per commit action.
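The difference between the two settings can be sketched as follows, again assuming pyodbc against SQL Server with a hypothetical AssetStaging table; this illustrates the commit behavior, not the actual SSIS internals.

```python
import pyodbc

INSERT_SQL = "INSERT INTO AssetStaging (AssetId, Amount) VALUES (?, ?)"

def insert_without_fast_load(conn, rows):
    # Without fast load: one round trip and one commit per record,
    # like walking to the mailbox once per letter.
    cursor = conn.cursor()
    for row in rows:
        cursor.execute(INSERT_SQL, row)
        conn.commit()

def insert_with_fast_load(conn, rows, max_commit_size=100_000):
    # With fast load: records are sent in bulk and committed once per
    # max_commit_size records (the Maximum insert commit size); the
    # default of 2147483647 would commit everything in one go.
    cursor = conn.cursor()
    cursor.fast_executemany = True
    for start in range(0, len(rows), max_commit_size):
        cursor.executemany(INSERT_SQL, rows[start:start + max_commit_size])
        conn.commit()
```

A smaller commit size trades a little throughput for bounded transaction log and TempDB growth, which is why something like 100000 records per commit can be the safer choice on a constrained SQL Server.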
My next post will cover troubleshooting operational issues when using the Data Import Export Framework. So stay tuned for the next post…
That’s all for now. Till next time!