See how companies of all sizes and complexities can benefit from a new migration approach for SAP General Ledger (SAP GL). This approach seeks to reduce the activation downtime window of your SAP GL migration. Discover how to implement the migration technique over two weekends.
Key Concept
The migration to SAP General Ledger (SAP GL) can be especially challenging if companies have high transaction volumes or if their requirement is to minimize system downtime. Because the migration activities to SAP GL require users to be locked out during the data transfer process, a new technique for the migration process has been developed to minimize business disruption by using two weekends to complete the process.
The key to success for any migration project is proper planning. Migration projects have critical milestone dates that, in most cases, are not flexible. For example, you need to have several key components of the migration set up by year-end, which is a fixed time. As another example, you need to activate SAP General Ledger (SAP GL) during a time when no posting activity is allowed on the system. Often this is planned over a weekend, holiday, or some other time that is agreeable to the business community when the system is unavailable to users during the migration process. These windows of opportunity are fixed, which puts a high degree of pressure on the migration team to execute the conversion process in an efficient and timely manner.
It is essential that the planning of the migration window account for realistic processing times, contingencies, and recovery procedures in the event that a reconciliation issue arises. Establishing the correct performance benchmarks initially helps in estimating your downtime requirements and provides an early opportunity to earn the support of the business.
I will begin by focusing on the primary data processing tasks within the migration cockpit: worklist build, splitting process, and data transfer. These are responsible for the bulk of the migration processing (downtime) window. After I explain some of the major factors that influence the performance and time required to execute these migration programs, I will take you through an example of how a company overcame these challenges and successfully migrated to SAP GL using a new migration technique over a two-weekend time frame.
Note
SAP General Ledger (SAP GL) is the current technical name for what was formerly known as the new General Ledger.
Migration Project Influences
Several factors of your company's migration project influence how the migration programs perform. These are some areas that companies need to consider:
Document splitting. The most obvious factor is whether you are using document splitting in your SAP GL design. The decision to use document splitting requires you to run additional programs during the migration process. You use these programs to build the split information for the existing open items and documents that you migrate.
Transaction volume. The volume of current year (phase 1) documents that you process has the most significant performance impact on your migration project. It is here that you really need to focus on obtaining a realistic estimate of volume to accurately predict the migration processing time. Simply put, the longer your phase 1 timeline (the time between year-end and the planned final SAP GL activation), the more documents you need to process during the activation weekend.
Business Add-In (BAdI) usage and code efficiency. With document splitting active, the processing of open items and current year documents can be challenging depending on how significantly the business process changes affect the organization. For example, a change in the definition of the profit center hierarchy requires that migrated data conform to the new reporting model, which is essential for consistent reporting in the migration year. However, the existing data might look nothing like the proposed design. SAP provides several BAdIs that you can use to supplement existing transaction data for this purpose. The most commonly used BAdIs are:
- FAGL_MIGR_SUBST: Document split information for open items
- FAGL_MIG_ADJ_ACCIT: Process account assignments for documents
- FAGL_UPLOAD_CF: Upload of carry-forward balances
When implementing these routines, you should always exercise good judgment and keep your logic as straightforward as possible. Try to avoid sequential processing of large tables and optimize your code wherever possible.
Migrating open items from a prior year. FAGL_MIGR_SUBST is the only mandatory BAdI; you must implement it if you use document splitting. You must assign a split characteristic (e.g., profit center) to each open document line item comprising the year-end account balances. As part of this process, the system reads the entire document and allows you to loop through the lines to determine the appropriate profit center assignment. Note that if a profit center already exists in the open line item, you cannot change it unless it is the classic dummy profit center. Therefore, do not process items that already carry a real profit center; doing so only expends processing time.
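An actual implementation of this routine would be written in ABAP inside the BAdI, but the core filtering logic can be sketched in plain Python. Everything below — the field name `profit_center`, the dummy value, and the account-based derivation mapping — is an illustrative assumption, not SAP's actual data structures:

```python
# Conceptual sketch (plain Python, not ABAP) of the filtering logic inside a
# FAGL_MIGR_SUBST-style routine: derive a split characteristic only for items
# that lack a real profit center, and skip the rest to save processing time.

DUMMY_PROFIT_CENTER = "DUMMY"  # stand-in for the classic dummy profit center

def assign_profit_center(line_item, derive):
    """Assign a profit center only where the migration allows a change."""
    pc = line_item.get("profit_center")
    if pc and pc != DUMMY_PROFIT_CENTER:
        return line_item  # already carries a real profit center; leave untouched
    line_item["profit_center"] = derive(line_item)
    return line_item

# Hypothetical derivation: map G/L account to a profit center, with a default.
mapping = {"113100": "PC_EAST", "160000": "PC_WEST"}
derive = lambda item: mapping.get(item["account"], "PC_DEFAULT")

items = [
    {"account": "113100", "profit_center": None},       # no assignment yet
    {"account": "160000", "profit_center": "PC_KEEP"},  # real PC: keep it
    {"account": "113100", "profit_center": "DUMMY"},    # dummy PC: replace it
]
result = [assign_profit_center(i, derive) for i in items]
```

The point of the early return is exactly the recommendation above: items that already carry a real profit center cannot be changed, so touching them only burns processing time.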
Another common mistake is the use of complex logic to determine the profit center assignment for prior-year open items. In most cases, these open items have been cleared during the phase 1 period, or are cleared shortly after the activation of SAP GL, depending on how far your activation date falls after year-end. You have to weigh the benefits of the additional assignment effort against the performance impact and reconciliation effort.
Migrating current-year documents. As the fiscal year-end approaches, you should implement all the business process changes, and the internal controls that support those changes, in the production system. For example, this includes changes to the profit center hierarchy, assignments of profit centers to master data, field status controls, posting rules, and interface changes. If you begin the new fiscal year by processing documents in a manner that is aligned with the SAP GL design, the migration of those documents on activation weekend should be straightforward. In technical terms, there would be no requirement to enrich or reprocess any of the data within the migration programs. You would only run the standard splitting build and transfer based on the customizing rules you set up in the system.
Although you plan for this ideal situation, it rarely turns out this way. During phase 1, you might find errors in the split validation log that indicate the system is not posting some documents in a manner that is supported by your configured splitting rules. For example, a user might post a transaction using the incorrect document type, or a specific balance sheet account posting may be missing a profit center due to an incorrect field status setting. In these cases, the migration tools include several standard features, such as transactions and BAdIs, that assist in correcting these transactions by enriching the document line item or overriding the document splitting rules for that particular document. Regardless of the situation, these errors take time to analyze and consume processing resources to correct. If you use the optional BAdI FAGL_MIG_ADJ_ACCIT as a corrective measure, you might incur significant performance issues due to the additional code being executed.
My recommendation is to focus early on benchmarks that are determined without activating these additional tools to establish a baseline throughput. After you analyze the errors, implement the corrective tools, reset the data, and obtain comparative benchmarks using the same data set. The baseline performance benchmark gives your migration expert the opportunity to evaluate the run times against prior experiences, which indicates whether further performance improvements are possible.
Also, you cannot execute the current-year document splitting program using parallel system processes. Due to this limitation, I consider this task to be the critical path program in determining the overall migration processing window. Typically, this step has one-third the throughput of the other migration steps in the process. The testing benchmarks that you obtain for this task become a critical factor in the overall approach to executing the migration.
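To see why this single-threaded step dominates the downtime window, consider a rough back-of-the-envelope estimate. All the volume and throughput figures below are hypothetical placeholders; substitute your own test benchmarks:

```python
# Rough downtime estimate for the current-year migration steps.
# All figures are hypothetical; use your own benchmark numbers.

docs = 2_000_000                 # current-year documents to migrate
transfer_rate = 150_000          # docs/hour for the parallelizable transfer step
split_rate = transfer_rate / 3   # splitting is single-threaded: ~1/3 the throughput

split_hours = docs / split_rate          # critical-path step
transfer_hours = docs / transfer_rate    # runs with parallel processes

# Splitting cannot be parallelized, so it dominates the total window.
total_hours = split_hours + transfer_hours
print(f"split: {split_hours:.0f} h, transfer: {transfer_hours:.1f} h, "
      f"total: {total_hours:.1f} h")
```

Even with generous parallel throughput on the other steps, the splitting program's serial run time sets the floor for the downtime window, which is exactly why its test benchmarks drive the overall approach.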
Migration Process
Now I'll take you through the basic steps in the migration process.
Step 1. Build a worklist. During this step, the system collects open items as of year-end and current-year documents in tables that serve to control all migration processing programs. The open-item worklist is built for customer, vendor, and SAP GL open-item managed accounts in a manner that is similar to how a receivables aging report would select data. Because it is executed on the migration weekend, it is retroactive (effective-date driven). This means that items are selected from both open- and closed-item indexes using the posting and clearing dates as key selection criteria. Depending on the number of open items present at year-end and the size of the respective index tables, this selection process can take a long time. For this reason, it is imperative for companies to consider SAP Note 1014364, which describes how to create secondary indexes for the open item tables accessed. I recommend that companies involve their Basis resources early in the project to communicate these requirements and to plan these tasks in the migration landscape.
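The retroactive selection logic can be illustrated with a small Python sketch. The record layout below is a deliberate simplification of the actual open- and cleared-item index tables, used only to show the date-based predicate:

```python
from datetime import date

# Conceptual sketch of the retroactive (effective-date driven) open-item
# selection: an item counts as open at year-end if it was posted on or before
# the key date and was not yet cleared by then. Field names are illustrative.

def open_as_of(items, key_date):
    """Select items that were open as of key_date."""
    return [
        i for i in items
        if i["posting_date"] <= key_date
        and (i["clearing_date"] is None or i["clearing_date"] > key_date)
    ]

year_end = date(2007, 12, 31)
items = [
    {"doc": "A", "posting_date": date(2007, 11, 5), "clearing_date": None},
    {"doc": "B", "posting_date": date(2007, 12, 20), "clearing_date": date(2008, 1, 15)},
    {"doc": "C", "posting_date": date(2007, 10, 1), "clearing_date": date(2007, 12, 30)},
    {"doc": "D", "posting_date": date(2008, 1, 3), "clearing_date": None},
]
worklist = open_as_of(items, year_end)  # A and B qualify
```

Note that document B was cleared after year-end, so it must still be selected from the cleared-item index — this is why the selection touches both index tables and why the secondary indexes from SAP Note 1014364 matter.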
Step 2. Transfer documents to SAP GL. This step in the migration process involves the transfer of the current-year documents to SAP GL, which writes the SAP GL line item (FAGLFLEXA) and updates the SAP GL totals (FAGLFLEXT). This step allows you to use parallel system processes during job execution, making the throughput efficient. Make sure the database statistics are current while executing this step. The migration cockpit includes a task that alerts you to this. My experience has indicated that performance degradation starts after approximately 300,000 to 500,000 records have been transferred to SAP GL. You should monitor this activity and keep statistics current to optimize the throughput.
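The monitoring idea can be sketched as follows. The refresh interval and the callbacks are illustrative placeholders; in practice, updating database statistics is a database administration task, not an application call:

```python
# Sketch of the transfer step's statistics-monitoring idea: refresh database
# statistics after roughly every 300,000 records written, since stale
# statistics degrade throughput as the SAP GL line-item table grows.

STATS_REFRESH_INTERVAL = 300_000  # illustrative; tune to observed degradation

def transfer(documents, write_record, refresh_statistics):
    """Write each document and trigger a statistics refresh periodically."""
    written = 0
    for doc in documents:
        write_record(doc)
        written += 1
        if written % STATS_REFRESH_INTERVAL == 0:
            refresh_statistics()
    return written

# Simulate transferring 1,000,000 records: refreshes fire at 300k, 600k, 900k.
refreshes = []
count = transfer(range(1_000_000), lambda doc: None, lambda: refreshes.append(1))
```

The design choice here is simply to make the refresh cadence explicit rather than waiting for throughput to visibly degrade before intervening.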
Step 3. Set up your project plan. Now that you have run through several tests of the migration cockpit tasks and have optimized your system performance to the maximum extent possible, you need to extrapolate the run times for the expected production data volume on the production hardware. Until now, I have not addressed the fact that a company's test and production environments almost certainly differ, which consequently affects performance. My experience indicates that production run times are faster than those seen in test environments. Some companies experience 50% better actual production run times when compared to the test benchmarks. Some were almost the same. In one case, the company's test system actually performed better than production. Because there is no definitive way to tell, my only advice here is to look at other batch-type processes and compare performance numbers. This might give you a clue as to how much better your production system really is performing.
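The extrapolation itself is simple arithmetic. The volumes and the 1.5x production speedup below are assumptions for illustration; the speedup factor is precisely the unknown you estimate by comparing other batch jobs across the two systems:

```python
# Extrapolating a test benchmark to a production run-time estimate.
# All figures are hypothetical placeholders.

test_docs = 250_000      # documents processed in the test run
test_hours = 10.0        # measured test run time
prod_docs = 2_000_000    # expected production volume

test_rate = test_docs / test_hours   # 25,000 docs/hour measured in test
speedup = 1.5                        # assumed: production ~50% faster than test
prod_estimate = prod_docs / (test_rate * speedup)
```

Running the same calculation with a speedup of 1.0 (test and production equal) gives the conservative bound you should compare against your available downtime window.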
As I stated in my introduction, the key to success is proper planning. By this moment in the project, you should have a decision point defined in your plan: Based on the performance data you have accumulated in testing, how long is the migration going to take and do you have enough downtime planned to execute it? What is the course of action if the answer to that question is no? That is exactly the challenge that one company faced as its planned migration activation date drew near.
Multiple-Weekend Approach
As of early 2008, all the companies that migrated were successful in executing the complete migration process during the course of a single downtime window. In fact, this is the recommended approach and makes the most sense from a project and risk mitigation standpoint. In one case, though, a company's best estimates put the expected processing window at the very edge of its available downtime, with no room left for the recommended contingency reserve or recovery plan. The migration process needed to be split into two parts to fit within the time limits, and the company revised its migration project plan accordingly.
Executing the Migration — Weekend One
As of this point in the timeline, it is a safe bet that the prior year is completely closed. Therefore, all the data that is necessary to migrate the prior-year opening balances (e.g., open items, GL account balances, and adjustments) should be final and available to migrate. Because the data is final and the previous year's posting periods are officially closed, you should not expect any more prior-year postings to the account balances. As a result, you can execute all the tasks for the migration of prior-year balances.
Concerning the overall timeline, the prior-year tasks only comprised a small portion of the processing requirements needed to execute the complete migration. It was the creation of the current-year document splitting information that was on the critical path. We addressed this challenge by executing the current-year document worklist and performing the document splitting step on that data immediately. Over the next 20 hours, we processed the current-year documents (i.e., everything posted through this date) through the migration splitting programs and resolved all errors.
The interesting thing about the weekend one activities is that you do not need to close the system to run these programs. I recommend running them on a weekend to maximize the processing capacity available on the system, but the system can remain available to the business users. In the example situation, the company was able to process 80% of its migration plan requirements in advance of the final migration weekend.
This works because the worklist collects and processes only the data that exists at the time it runs. Because SAP GL is still inactive, transactions that post during or after these tasks run are not updated in SAP GL, and nothing in the system other than the migration programs uses the SAP GL tables being updated.
Completing the Migration — Weekend Two
Because you completed such a large portion of the processing requirements during the previous weekend, the remaining tasks are collecting the additional documents posted during the prior week, finishing the split processing, and transferring all data to the SAP GL for reconciliation and validation.
What makes this technique possible is the capability of the worklist program to perform a delta build within the migration document control table. The program adds the newly posted transactions that were not captured during the build on the prior run. The program also preserves the status of each of the original documents contained in the worklist, so it is possible to perform a complete reconciliation of activity on the past week's transactions.
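The delta build behavior can be sketched in a few lines of Python. The status values and document IDs below are illustrative, not the actual migration control table contents:

```python
# Conceptual sketch of the delta worklist build: newly posted documents are
# added, while documents already in the worklist keep the processing status
# they earned during the first weekend's run.

def delta_build(worklist, posted_docs):
    """Add new documents without disturbing existing statuses."""
    for doc_id in posted_docs:
        worklist.setdefault(doc_id, "NEW")  # only absent keys are added
    return worklist

# After weekend one, two documents are already split successfully.
worklist = {"1000": "SPLIT_OK", "1001": "SPLIT_OK"}

# Documents posted during the intervening week (1001 was already captured).
posted = ["1001", "1002", "1003"]
delta_build(worklist, posted)
```

Because the original entries and their statuses survive the delta run, the difference between the two worklist versions is exactly the past week's activity, which is what makes the full reconciliation possible.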
During this final reconciliation, it is imperative that the system be closed for transaction processing. This ensures that the final data collection captures every document posted to the classic G/L during phase 1. After you reconcile the final worklist version, you simply complete the splitting process on those newly identified documents and begin the data transfer for the current year to SAP GL. For performance reasons, you should not transfer significant volumes of data to SAP GL before document splitting completes. This is due to the program's duplicate document check, which reads the FAGLFLEXA table. By minimizing the data in FAGLFLEXA during the splitting program's execution, you improve its performance.
William Miller
William Miller is a principal manager and founder of MI6 Solutions Group. Prior to joining MI6 Solutions, he was a platinum FI/CO consultant with SAP America for more than 10 years. He most recently served as the community practice lead for the SAP General Ledger within SAP’s National Competency Center. He has led more than 11 SAP General Ledger migration projects in North America and is one of the leading subject matter experts in this area.
You may contact the author at william.miller@mi6solutions.com.