SAP NetWeaver 2004s includes automatic data uploading functionality that enables compliance with Sarbanes-Oxley requirements.
Key Concept
The data basis defines the data model for a Business Consolidation system. The data basis is created in the Consolidation Workbench and is a combination of characteristics and key figures. The relationship between the characteristics and key figures, and the construction of the data streams for the collection and storage of data, are also part of the data basis. For example, it defines the posting level of the data that you can retrieve and the components from which you can upload data that are standard in SAP NetWeaver 2004s.
The most recent improvement to the consolidations process in SAP Strategic Enterprise Management-Business Consolidation System (SEM-BCS) enhances compliance with the Sarbanes-Oxley Act. Starting with SAP NetWeaver 2004s, SEM-BCS has moved from an interfacing upload process for financial data to an integrated one. This was the final step in the following four major areas requiring IT support for compliance with Sarbanes-Oxley:
- The uploading or collection process for the data to be consolidated
- The specific business rules that are applied to the data to generate the consolidated reports
- The process that is followed for the consolidation itself
- The reporting aspect of consolidations
Since the last three items are activities that SEM-BCS has accomplished for a number of years, I’ll concentrate on the first one. An SAP system now can offer consolidation functionality that generates consistent and validated information from multiple source systems via the SAP NetWeaver Business Intelligence (SAP NetWeaver BI) platform.
I’ll take you through a demo of the new upload process, which ensures accuracy and reduces reporting cycle time to meet increasingly short deadlines. In addition, it provides a full audit trail capability and the ability to generate the reports required for internal and external auditors.
The new process uses Real-Time InfoCubes and Real-Time DataStore objects, which in the 3.5 version of BW were referred to as Transactional InfoCubes and Operational Data Store (ODS). The system manipulates the collected data using specific business rules and processes to generate consistent consolidated financial and managerial data for use in reporting to stakeholders.
Consolidation Process
One process in SEM-BCS that did not satisfy Sarbanes-Oxley requirements until SAP NetWeaver 2004s was the integration of data with the SAP NetWeaver BI system supporting the consolidation process. Through the initial series of SEM-BCS versions, users relied on interfacing methods such as manual data entry or Excel uploads.
With these types of data entry processes in SEM-BCS, you would need to track several touch points: the actual Excel spreadsheet used to upload the data, the format of that spreadsheet, who extracted the data from the source system, and who uploaded it into SEM-BCS. At any of these points, the data could be manipulated or changed. Rather than reducing the amount of manual intervention, this process increased it.
Figure 1 shows the Consolidation Workbench of a 3.5 version of SEM-BCS. Notice that it has only two methods of collection of data for consolidations: Data Entry Layout and Flexible Upload. SAP addressed this issue, and since version 4.0 of SEM-BCS, users have been able to upload data directly from another InfoProvider in the BW environment, as well as from the FI module.

Figure 1
The Consolidation Workbench in SEM version 3.5 is supported by BW 3.1B
Figure 2 shows the same area as Figure 1, but in version 6.0. You can see that additional upload and data collection methods are available, specifically Load from Data Stream. You can use this component to collect transactional or master data and hierarchies from the BI capabilities in SAP NetWeaver 2004s or other connected SAP source systems. SEM-BCS 4.0 runs on SAP NetWeaver 2004 rather than SAP NetWeaver 2004s, so with that version the upload functionality is available for FI; the upload using Load from Data Stream is therefore also available in the 4.0 version of SEM-BCS.

Figure 2
The Consolidation Workbench in SEM version 6.0 in SAP NetWeaver 2004s
You can use a number of InfoProviders — InfoCubes or DataStore objects — to connect to and obtain data from SAP NetWeaver BI for consolidations. You can load data that is stored in an InfoProvider from a local or a non-local Business Warehouse system. In addition to accessing data stored directly in an InfoProvider, you can access data that is referenced rather than stored via a RemoteCube (a virtual InfoCube). Therefore, you can access the data without having to physically upload it into the InfoProvider. Numerous parameters, rules, and processes are involved in integrating the InfoCubes with the SEM-BCS data basis. I’ll focus on the actual activity of integration.
You can integrate and automate the process of gathering data to be consolidated into the system so the number of touch points is minimal. Once you post the data into the system at the operational level, the system collects it in specific tables depending on the functionality of the mySAP ERP Central Component (ECC) environment. Then you need to create the architecture that allows you to upload this data into the SAP NetWeaver BI system (Figure 3), and develop a complete process flow to collect the data for consolidation. The data flows from R/3 or ECC into BI using standard data source and transformation rules, then into the InfoCube.

Figure 3
Upload the financial data required from the ECC client to the InfoCube Z0PCA_C01 for use during the upload process from InfoCube to InfoCube
You use the InfoCube as part of the data basis for uploading data into SEM-BCS for consolidations. First you select the specific basic InfoCube to which you will upload the FI data. Once you collect the data into it, you can use this object as a source of data for the consolidation InfoCube (totals table). The consolidation InfoCube stores the data and then makes it available for the consolidation process.
Note
Generally speaking, in the following steps the FI/Consolidations functional expert would develop the structure of the InfoCubes and identify the settings for the consolidation parameters. A BI person would handle the development documents and could create the InfoCubes, but normally an SEM-BCS-focused person creates them.
Start this process with the creation of the Data Basis in Consolidations, which Figure 4 illustrates. Fill in the InfoObject Catalog for Chars field to identify the set of additional fields that you can select on the Data Stream Fields tab and use for the Data Model. Use Source Data Basis to identify the InfoObjects that are available for a flexible upload process. Also note that you fill in Source Data Basis — in this case 71 — on this screen to upload from the appropriate data source. You also identify the InfoCube for Totals Records on this screen — here 0BCS_C11, the table that holds all of the records for consolidation purposes.

Figure 4
The initial step in creating the upload process from the BW InfoProviders
Moving to the screen shown in Figure 5, you can see the creation of the Source Data Basis 71. It is linked with the InfoProvider Z0PCA_C01, which contains the necessary data for the consolidations process. In the Field Catalog tab, you can see that the RFC Destination is empty, which confirms that the aforementioned InfoProvider is located in the underlying BW system. In addition to this tab, a series of additional informational items is available to establish parameters for your consolidation process. You can use these items, such as attributes and authorization checks, in one of the other requirements for Sarbanes-Oxley compliance — internal controls of business rules and access to specific characteristic values.

Figure 5
The creation of the Source Data Basis, linking Source Data Basis 71 to the InfoProvider Z0PCA_C01
Once you have created these structures, you develop a data collection method. Go to the consolidation workbench via Process View>Consolidation Functions>Data Collection>Load from Data Stream. Enter the source data basis and the specific InfoProvider that you assigned to this source into the appropriate fields (Figure 6).

Figure 6
Use the Source Data Basis to structure the transactional data upload
Several additional parameters on this screen provide varying selections. Update Mode has four options: Delete All (delete all data); Delete Uploaded Data (delete only the data being uploaded that has the corresponding characteristic value combinations); Overwrite (overwrite the existing data); and No Modification (all data remains unchanged). In my example, I selected Overwrite because I may have current data in the InfoCube and the data that I’m uploading is the current version of the consolidation data. Depending on the situation, you may want to choose something else — for example, if you are adding additional data to the InfoCube (delete uploaded data) or rerunning the upload (delete all).
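The four Update Mode options behave like simple set operations on the totals table. The following Python sketch is purely illustrative — the function name, the modeling of the table as a dictionary, and the keying by (consolidation unit, account) are my own assumptions for clarity, not SEM-BCS internals:

```python
from enum import Enum

class UpdateMode(Enum):
    DELETE_ALL = 1        # delete all existing data before loading
    DELETE_UPLOADED = 2   # delete only data with matching characteristic values
    OVERWRITE = 3         # replace matching records, keep everything else
    NO_MODIFICATION = 4   # all data remains unchanged

def load_data_stream(target, upload, mode):
    """Apply an upload to a totals table modeled as a dict keyed by
    (consolidation unit, account), with the posted amount as the value."""
    result = dict(target)
    if mode is UpdateMode.NO_MODIFICATION:
        return result                    # simulate only; nothing changes
    if mode is UpdateMode.DELETE_ALL:
        result.clear()                   # wipe the target entirely
    elif mode is UpdateMode.DELETE_UPLOADED:
        # remove every existing record for the units present in the upload
        units = {unit for (unit, _account) in upload}
        result = {key: value for key, value in result.items()
                  if key[0] not in units}
    result.update(upload)                # write the uploaded records
    return result
```

With Overwrite, an existing record for the same unit and account is replaced while all other records survive; Delete All leaves only the uploaded records behind; Delete Uploaded Data clears everything for the uploaded units before writing.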
Under Treatment of Dr/Cr Sign Attributes you can select where you want the system to obtain the appropriate Dr/Cr signs, the + or – signs that indicate whether the posted amount is a debit or credit posting, from the Movement Type or the Item. You can select the Input Type as Periodic or Cumulative, or choose BAdI filter if that’s what you are going to use. If you are uploading the data on a periodic basis — if you are loading January, February, and March separately, for example — you would use Periodic. I’ve selected Cumulative because I’m uploading a cumulative amount of data: all the months at once.
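The difference between Periodic and Cumulative input is simply whether each record carries a single period’s amount or the year-to-date running total. A minimal conversion sketch (my own illustration, not an SEM-BCS function):

```python
from itertools import accumulate

def periodic_to_cumulative(amounts):
    """Turn per-period amounts into year-to-date running totals."""
    return list(accumulate(amounts))

def cumulative_to_periodic(totals):
    """Recover per-period amounts from year-to-date totals."""
    return [current - previous
            for previous, current in zip([0] + totals[:-1], totals)]
```

For example, periodic January through March amounts of 100, 80, and 120 correspond to cumulative values of 100, 180, and 300, and the two conversions are inverses of each other.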
Figure 7 shows the Mapping tab, which allows you to map the source data basis fields to the target fields. You have numerous options to map these two sets of fields. Some of them are shown in the figure — Move, Reassignment, Condition — and other options allow the assignment of a series of business rules to the mapping process.

Figure 7
Map the fields from the Source Data Basis to the Target Fields of the totals table InfoProvider
For example, say you have a series of subsidiaries using transactional charts of accounts (COA) that are different from the consolidated chart of accounts. The consolidated COA would have one account number for travel and expenses, whereas the subsidiary would break out travel and expenses in more detail and have two or three accounts for this expense. You can address these differences in the mapping process and map the accounts correctly so you can homogenize the data. You can reuse the mappings that you have created in other areas and develop a set of template mappings for your consolidation process. This helps you comply with Sarbanes-Oxley regulations because it offers a configuration (system) control for mapping differences at the transactional data level.
Once you’ve completed this configuration, you need to assign this process to a method and then to a task in the consolidation monitor. Figure 8 shows the completion of the assignment of a data collection task to the monitoring process. Note the link between the Load from Data Stream (08100) and the assignment to Method (09100).

Figure 8
Assign the method Data Transfer to a task
You then assign the Method to the Task (data collection – 1000) on the Workbench: Change Task 1000 – Data Collection screen. The number 1000 is the technical name I assigned to the data collection process. The initial Method used in Data Transfer consists of other methods, which are shown on the right side of the screen in Figure 8, with the method 08100 chosen for a specific group of postings to be transferred. With the assignment of the Data Stream, you then assign the task to the Consolidations process in the Workbench screen. Scroll down on the screen shown in Figure 8, and click on Task. Then attach the Method. This allows you to execute the task from the consolidation monitor screen in Figure 9.

Figure 9
The Consolidation Monitor with the data collection process assigned
You now need to assign the tasks to the Hierarchy in Figure 9 so that you can execute these activities from the consolidation monitor. Use the consolidation Hierarchy tab and expand the task list. The Data Collection tab is available for you to execute, and it holds a number of data collection processes, as you can see in Figure 10. You navigate to Figure 10 by clicking on the circled field in Figure 9. In my example, I am uploading Profit Center Accounting data from the Profit Center Accounting InfoCube. Upon execution of this method, the system copies the uploaded data automatically from InfoProvider Z0PCA_C01 to totals InfoCube 0BCS_C11, so it is available for the additional consolidation activities.

Figure 10
The execution of the upload of data from the InfoProvider identified as the source data basis
In this case the task is data collection. Notice that the method used to collect data into the totals records is 08100 (Load from Data Stream). 08100 is simply the technical name I gave the Load from Data Stream method; you can select any technical name you want.
As you can see, the entire process is fully automated, from the upload of the data to the operational environment through to the consolidation process and activities. At no time is it necessary for anyone to intervene manually in the data acquisition process. This offers auditors for Sarbanes-Oxley compliance a transparent view of the overall steps involved and, with some additional involvement of the Audit Management area of SAP NetWeaver 2004s, you are on the road to a successful Sarbanes-Oxley compliance audit for your consolidation process.
Note
The Audit Management toolkit is new to SAP NetWeaver 2004s. It offers the auditor the ability to monitor SAP NetWeaver BI system activities, such as uploads, changes to the objects, and authorization processes. Using it in conjunction with SEM-BCS, the auditor can investigate the SEM-BCS process efficiently.

Peter Jones
Peter Jones is a business intelligence solution architect in the areas of CO, EC, SAP BW, BI, BusinessObjects, Planning and Consolidation (BPC), and SEM for MI6 Solutions. He is certified in all these areas and is a subject-material expert for SEM, CO, BI, and BPC, with over 14 years of experience in these areas working for SAP and MI6 Solutions. Currently, he is involved in projects incorporating SAP BW 7.4, HANA, BPC 10.0, and EPM 10.0. He has consulted in all those areas for the last nine years. Peter was an SAP instructor for seven years and has written several books about SAP NetWeaver BW, FI/CO, and has recently revised his book about BPC Implementation to version 10.1. He is a contributor to Best Practices for Financial Reporting in SAP, an exclusive anthology of articles that delivers unparalleled guidance on how to optimize financial reporting processes with SAP applications.
You may contact the author at peter.jones@mi6solutions.com.
If you have comments about this article or publication, or would like to submit an article idea, please contact the editor.