This step-by-step guide demonstrates the integration between SAP Data Services 4.0 and SAP NetWeaver BW 7.3. With the introduction of Data Services as a new source system in SAP NetWeaver BW, data flows from SAP BusinessObjects Data Services can now be integrated to create a complete data model for Analytics.
Key Concept
A new type of DataSource, Data Services, was introduced in SAP NetWeaver BW 7.3. (Before the release of SAP Data Services 4.0, the product was known as SAP BusinessObjects Data Services, or BODS.) The integration makes it very easy to pull data from non-SAP and SAP systems directly into SAP NetWeaver BW, either through Data Services or from the SAP NetWeaver BW interface itself. The data-pulling mechanism in earlier versions had limitations; for example, data had to be sent to SAP NetWeaver BW in a particular format. Now data from SAP and non-SAP systems can be pulled using the same method, so you do not need different options for different DataSources.
It’s well known that SAP NetWeaver BW works extremely well with SAP ERP Central Component (ECC) and other SAP products as its source systems. However, when it comes to integrating with non-SAP products, SAP NetWeaver BW’s options are limited. New functionality has been introduced that permits integration between SAP Data Services (formerly known as BODS) 4.0 and SAP NetWeaver BW 7.3. Creating SAP Data Services as a DataSource in SAP NetWeaver BW addresses these previous challenges and also provides additional extraction, transformation, and loading (ETL) features.
The prerequisites are:
- Access to SAP Data Services 4.0
- User access to at least one SAP NetWeaver BW 7.3 system
- Prior knowledge of the SAP NetWeaver BW Workbench and Data Services Management Console is helpful
Before following the steps needed for integration, users should understand how SAP NetWeaver BW and SAP Data Services complement each other to achieve business and technical requirements. Requirements such as data cleansing, data profiling, text analysis, geocode corrections, and pivoting typically call for a lot of custom coding. SAP Data Services already provides standard transforms for many of these tasks, whereas in SAP NetWeaver BW you would be required to write complex routines. Custom transformations in SAP Data Services can be written in Python and are easier to develop, implement, maintain, and reuse than custom ABAP code in SAP NetWeaver BW.
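To give a feel for the kind of cleansing logic such a custom transform carries, here is a minimal sketch in plain Python. It does not use the embedded scripting API of Data Services itself; the function name, the field names (FirstName, HomePhone, and so on), and the rules are illustrative assumptions only.

```python
# Hypothetical cleansing logic of the kind a Data Services custom transform
# might apply to employee records. Plain Python for illustration only; the
# field names and rules are assumptions, not the actual transform API.
import re


def clean_employee_record(record):
    """Return a cleansed copy of one employee record (a dict of strings)."""
    cleaned = dict(record)

    # Trim stray whitespace and normalize name capitalization.
    for field in ("FirstName", "LastName", "City"):
        if cleaned.get(field):
            cleaned[field] = cleaned[field].strip().title()

    # Keep only the digits of the phone number so downstream matching is consistent.
    if cleaned.get("HomePhone"):
        cleaned["HomePhone"] = re.sub(r"\D", "", cleaned["HomePhone"])

    # Flag records that are missing the mandatory key so they can be filtered out later.
    cleaned["IS_VALID"] = bool(cleaned.get("EmployeeID"))
    return cleaned


if __name__ == "__main__":
    sample = {"EmployeeID": "9", "FirstName": "  anne ", "LastName": "DODSWORTH",
              "City": "london", "HomePhone": "(71) 555-4444"}
    print(clean_employee_record(sample))
```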
Data Services can be used in the SAP NetWeaver BW staging process for non-SAP system data in three steps. SAP Data Services can connect to and extract data from SAP systems as well as non-SAP systems; however, standard extractors are already available for SAP systems, so we focus here on non-SAP system data. Take these three steps to define the data load (a conceptual sketch follows the list):
- Load the data from the source system into the SAP Data Services application memory engine.
- Perform complex transformation/data cleansing on the data.
- Load the data into the SAP NetWeaver BW Persistent Staging Area (PSA).
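Conceptually, the three steps correspond to the following plain-Python sketch. The function names and sample rows are placeholders for illustration only; in practice each step is configured graphically in Data Services, as the remainder of this article shows.

```python
# Conceptual three-step flow: extract -> transform in memory -> load to the PSA.
# All names and sample data are placeholders; the real work is configured in Data Services.

def extract_from_source():
    """Step 1: read rows from the non-SAP source (hard-coded sample rows here)."""
    return [
        {"EmployeeID": "1", "LastName": " davolio ", "FirstName": "Nancy"},
        {"EmployeeID": "2", "LastName": "FULLER", "FirstName": " andrew"},
    ]


def transform(rows):
    """Step 2: cleanse/transform the rows while they are held in memory."""
    return [
        {key: value.strip().title() if isinstance(value, str) and key != "EmployeeID" else value
         for key, value in row.items()}
        for row in rows
    ]


def load_to_psa(rows):
    """Step 3: hand the cleansed rows over to the SAP NetWeaver BW PSA (stubbed as a print)."""
    for row in rows:
        print("PSA load request row:", row)


load_to_psa(transform(extract_from_source()))
```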
Note
Standard extractors are available with SAP ERP Central Component (ECC). However, if the DataSource in ECC is customized, then the available extractors also need to be customized, which involves a good amount of coding. If you use SAP Data Services, you can skip the step of customizing the extractors, and the data pulling can be done without coding. This is one of the advantages of the Data Services approach.
You can implement the data upload process integration between SAP NetWeaver BW 7.3 and Data Services 4.0 in two ways:
- Using a data upload process controlled by SAP NetWeaver BW 7.3.
- Using a data upload process controlled by SAP Data Services 4.0.
In the first approach, the InfoPackage has to be created in SAP NetWeaver BW by the user. That process is a typical SAP NetWeaver BW process and thus is outside the scope of this article.
We are focusing on the second approach, in which the job is started manually from the Data Services Designer or the Data Services Management Console. The Data Services application automatically generates an InfoPackage in SAP NetWeaver BW, since SAP NetWeaver BW requires an InfoPackage to which a data load request can be attached. With the data load request, Data Services starts sending data to SAP NetWeaver BW for loading.
Modeling Integration Between SAP NetWeaver BW and Data Services
You take these six major steps to prepare the exchange of data and metadata between Data Services and SAP NetWeaver BW:
- Create the Remote Function Call (RFC) connection in Data Services.
- Create the RFC connection in SAP NetWeaver BW.
- Test the SAP NetWeaver BW RFC connection.
- Create the SAP NetWeaver BW DataSource.
- Create the SAP NetWeaver BW load job in Data Services.
- Verify the uploaded data in the PSA.
Step 1. To create the RFC connection in Data Services, follow menu path Start > All Programs > SAP BusinessObjects Data Services 4.1 > Data Services Designer. Enter the credentials and click the Log On button. The system lists all the repositories associated with the user (Figure 1).

Figure 1
Data Services user login window
Select the Repository (your local working repository) and click the OK button. In the next pop-up screen, provide the repository’s Database credentials and click the OK button (screen not shown). That takes you to the Data Services Designer screen.
As shown in Figure 2, open the Data Services Management Console by clicking the Data Services Management Console icon in the standard toolbar. This takes you to the log-in screen of the Data Services Management Console. Logging in then takes you to Figure 3.

Figure 2
SAP Data Services Designer
Log in to the Management Console and click the Administrator link at the top left, which takes you to Figure 4.

Figure 3
Data Services Management Console home page
Navigate to the RFC Server Interface under SAP Connections in the left navigation tree (Figure 4). Click the RFC Server Interface Configuration button. Then click the Add button to add a new RFC connection in the Data Services Management Console, which takes you to Figure 5. This RFC connection is used by SAP NetWeaver BW for the integration.

Figure 4
Add an RFC connection
Information available in SAP NetWeaver BW must be used to complete the RFC Server Configuration screen in Data Services. Figure 5 indicates the comparable fields in SAP NetWeaver BW and Data Services. For example, the Program ID field in SAP NetWeaver BW provides the text that must be entered into the RFC ProgramID field. The user must ensure accuracy when duplicating the numbers and text in Data Services. In this case, we are using employee data, hence the Program ID is DS_RFC_EMP_LOAD. After you enter the required details in Figure 5, click the Apply button to go to Figure 6.

Figure 5
RFC server configuration
When the RFC connection is successfully created, the RFC server interface shows a status of Started (Figure 6). If the status does not show as Started, click the Select check box next to the specific RFC connection and click the Start button.

Figure 6
RFC server interface status
Step 2. Create an RFC connection in SAP NetWeaver BW. Log in to the SAP NetWeaver BW system and use transaction code RSA1 to go to Figure 7. Select Source Systems in the Modeling navigation pane on the left. In the right pane, right-click the Data Services source system folder and select the Create option.

Figure 7
Create the BW Workbench source systems
In the Create Source System window that now opens, enter the Logical System name and the Source System name. Click the OK icon to go to Figure 8. Provide the Program ID, which is the RFC Program ID used in Data Services to establish the RFC connection with SAP NetWeaver BW. (Figure 5 illustrates the origin of the Program ID.) The rest of the settings can be left at their defaults. Click the save icon to save the RFC connection.

Figure 8
BW RFC destination details
Step 3. Test the SAP NetWeaver BW RFC connection. The saved destination appears in the list of SAP NetWeaver BW RFC connections (Figure 9). Open it and run a connection test to confirm that SAP NetWeaver BW can reach the Data Services RFC server; a successful test looks like Figure 10.

Figure 9
SAP NetWeaver BW RFC connection configurations

Figure 10
SAP NetWeaver BW - Test RFC Connection
Click the back icon on the menu bar or press F3. A pop-up window appears with the Attributes of BO Data Services System, as seen in Figure 11. The user must set a number of attributes, including the Repository, Job Server, and Source Data Store. The following sub-steps describe the attribute selection process.

Figure 11
SAP NetWeaver BW RFC connection – attributes of the Data Services system
First, select the Data Services repository to be used for the integration; the available repositories are listed as shown in Figure 12.

Figure 12
Select the Data Services Repository

Next, select the Data Services Job Server that will execute the load job (Figure 13).

Figure 13
Select the Data Services Job Server

Finally, select the source datastore (DS_SQL_DB_BLR in this example) from which the data will be read (Figure 14).

Figure 14
Select the Data Services DataStore
Once the appropriate selections are made, the Attributes window looks like the example in Figure 15. Click the OK icon.

Figure 15
The Data Services source system details
Step 4. Create the SAP NetWeaver BW DataSource. Now that the RFC connections are in place in both SAP NetWeaver BW and Data Services, the user must create a DataSource in the SAP NetWeaver BW system. This DataSource is used as the target in the Data Services job.
Log in to the SAP NetWeaver BW application and go to transaction code RSA1 (Figure 16). Click the DataSources button in the menu on the left side of the screen. Right-click the DataSources work area on the right side and select the Create Application Component... option (Figure 16). That opens the popup screen shown in Figure 17.

Figure 16
SAP NetWeaver BW - Create the application component
In the popup, enter the application component details (the technical name ZZ_AC_DS_VSONI is used in this example) and confirm your entries (Figure 17).

Figure 17
Application component details
Once the application component is created, right-click the application component ZZ_AC_DS_VSONI and select Create DataSource... (Figure 18). That opens the screen in Figure 19.

Figure 18
SAP NetWeaver BW - create a DataSource
In the DataSource field, enter the technical name for the DataSource. Select the Master Data Attributes drop-down option in the DataSource Data Type field. Click the continue icon.

Figure 19
DataSource details
The DataSource looks like the ZZ_DS_EMPLOYEE example shown in Figure 20.

Figure 20
DataSource ZZ_DS_EMPLOYEE
Click the Extraction tab to go to Figure 21. Then click the find objects (binoculars) icon to the right of the Source Object field (Figure 21). This action opens a list of tables from which you select the table whose data is to be read (Figure 22).

Figure 21
DataSource ZZ_DS_EMPLOYEE - Extraction tab

Figure 22
Available tables in source
All the tables present in the datastore DS_SQL_DB_BLR (selected in Figure 14) are listed under the Root Node. Expand the Root Node and select the dbo.Employees table as seen in Figure 22. Click the continue icon. Save and activate the object by clicking the activate icon.
Step 5. Create the SAP NetWeaver BW Load Job in Data Services
At this point, the RFC connection is ready, the application component has been created, and the DataSource has been activated. It is time to go into Data Services, import the DataSource, and create a load job. Figure 23 shows the SAP Data Services Designer; open it by following the steps described for Figure 1.
As seen in Figure 23, go to the local repository by clicking the local object library icon. Go to the Datastore tab of the Local Object Library, right-click it, and select New.

Figure 23
Open the Local Object Library
With the Create New Datastore window open, provide the correct datastore name, server name, SAP NetWeaver BW user, and password. Under Datastore type, select the SAP NetWeaver BW target.
In the Local Object Library window (Figure 24), the datastore DS_BW_TARGET now appears in the repository under the Datastore tab. Double-click the datastore DS_BW_TARGET and expand the Master Transfer Structures node. Navigate to the ZZ_DS_EMPLOYEE DataSource and expand it. Select the RFC_DS_EMP source system, right-click it, and select Import. The DataSource appears under the datastore.

Figure 24
Import metadata into the DS_BW_TARGET datastore
To create a simple upload job in the Data Services Designer, go to Project > New Project and give the project a name; P_BW_EMP_LOAD is the name used here. As seen in Figure 25, go to the Project Area, right-click, and select New Batch Job. A new job with a default name (<New_Job#>, where # is a number assigned automatically by SAP Data Services) appears. Right-click the job name, select the Rename option, and give it the name J_BW_EMP_LOAD. Figure 26 shows the standard tool palette on the right side of the work area in the SAP Data Services Designer.

Figure 25
Create a new batch job
Now select the data flow icon from the tool palette of the Data Services Designer, as seen in Figure 26, and click the work area. This creates a new dataflow under the job J_BW_EMP_LOAD. Name it DF_BW_EMP_LOAD and open it by double-clicking it. Figure 27 shows the Local Object Library area of the Data Services Designer. In the default layout, the Local Object Library is shown on the left side of the Data Services Designer window. It has several tabs at the bottom; Figure 27 corresponds to the Datastore tab.

Figure 26
The Data Services Designer tool palette
Expand the DS_SQLDB_BLR datastore from the Datastore tab of the Local Object Library (Figure 27), select the Employees table, and drag it to the DF_BW_EMP_LOAD dataflow work area that you created. Select Make Source from the context menu, as shown in Figure 28.

Figure 27
Employees source table under DataStore - DS_SQLDB_BLR

Figure 28
Employee table as the source
Go back to the Datastore tab of the Local Object Library shown in Figure 27. Under DS_BW_TARGET, expand the Master Transfer Structures node (as seen in Figure 24). Select ZZ_DS_EMPLOYEE_TEXT@RFC_DS_EMP and drag it to the work area as a target, as shown in Figure 29. The process is the same as adding the Employees table as a source. Refer to Figure 24 for more detail.

Figure 29
ZZ_DS_EMPLOYEE DataSource as a target
Add a Query transform by selecting the query transform icon from the tool palette (Figure 26) and clicking the DF_BW_EMP_LOAD dataflow work area (Figure 30). Rename it Q_Map_Fields. When a new object or transform is added, Data Services gives it a default name; you rename jobs, workflows, dataflows, and transforms to follow the standard Data Services naming convention and improve readability.
Now you need to connect the query transform Q_Map_Fields with the source table Employees and the target master transfer structure ZZ_DS_EMPLOYEE_TEXT so that you can define the mappings. First connect the transform Q_Map_Fields with ZZ_DS_EMPLOYEE_TEXT: click the square on the right edge of Q_Map_Fields and drag the cursor to the arrow on the left edge of ZZ_DS_EMPLOYEE_TEXT. Then connect the Employees table with Q_Map_Fields in the same way. Once the Q_Map_Fields transform is connected with both the source and the target, the dataflow looks like Figure 30.

Figure 30
Connect the query transform Q_Map_Fields with the source table Employees and the target ZZ_DS_EMPLOYEE_TEXT
Now you define the mapping of the source and target columns in the Q_Map_Fields transform. Double-click the transform Q_Map_Fields to open the Query Editor. To define a mapping, select the column EmployeeID from Schema In, drag it to Schema Out, and drop it on the column EMPLOYEEID. Repeat the same process for the rest of the columns. Once the mapping is completed, the Query Editor looks like Figure 31.

Figure 31
Query Transform Editor Q_Map_Fields
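For readers who think in code, the following plain-Python sketch shows what the Q_Map_Fields transform does graphically: it renames and passes columns from Schema In to Schema Out. Only the EmployeeID-to-EMPLOYEEID mapping comes from this example; the other column names are assumptions for illustration.

```python
# Plain-Python sketch of the Q_Map_Fields Query transform: each Schema In column
# is passed through to a Schema Out column of the target DataSource.
# Only EmployeeID -> EMPLOYEEID comes from the article; the rest are illustrative.
FIELD_MAP = {
    "EmployeeID": "EMPLOYEEID",
    "LastName": "LASTNAME",    # assumed target field name
    "FirstName": "FIRSTNAME",  # assumed target field name
}


def map_fields(schema_in_row):
    """Build a Schema Out record by renaming the mapped Schema In columns."""
    return {target: schema_in_row.get(source) for source, target in FIELD_MAP.items()}


print(map_fields({"EmployeeID": 9, "LastName": "Dodsworth", "FirstName": "Anne"}))
```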
In the Data Services Designer, follow menu path Project > Save All to save the job and the dataflow created up to this step. Open the dataflow DF_BW_EMP_LOAD by clicking it in the Project Area, and click the validate all icon (Figure 32).

Figure 32
Dataflow DF_BW_EMP_LOAD
The job has been set up and is ready for execution. You can execute the job either from the Data Services Designer or from the Management Console. In this case, the job is executed from the Designer. Right-click the job name in the Project Area and select the Execute option. A popup opens to set the Execution Properties (figure not shown). Click the OK button in the Execution Properties popup window to begin the execution.
You can see job execution progress in the job monitor log. The Row Count column in the monitor log shows the number of records read, processed, and loaded into SAP NetWeaver BW. As seen in Figure 33, nine records are loaded into SAP NetWeaver BW. (You can also verify successful execution of the job from the Data Services Management Console.)

Figure 33
Job execution monitor log
Step 6. Verify the Uploaded Data in the SAP NetWeaver BW PSA
The user has now created the Data Services SAP NetWeaver BW data upload job and must verify that the data has been uploaded successfully into the PSA. Go back to the SAP Launchpad, open the SAP NetWeaver BW application, and open the SAP NetWeaver BW Workbench using transaction code RSA1.
Navigate to the DataSource area (on the right side of Figure 34) and expand the DataSource ZZ_DS_EMPLOYEE. Because SAP NetWeaver BW always requires an InfoPackage to load data into the PSA from an external system, Data Services automatically creates a new InfoPackage under the DataSource to perform the data upload.

Figure 34
DI-generated InfoPackage - 3rd Party Selections
This InfoPackage is created automatically with the name DI-generated InfoPackage. Double-click it; the details required to execute the Data Services job are available under the 3rd Party Selections tab.
With the InfoPackage created, the user can schedule the InfoPackage in a process chain and control the load process through SAP NetWeaver BW instead of going back to the Data Services Management Console to execute the job. This integration is useful because the user does not need to switch between SAP NetWeaver BW and Data Services for job execution. Once the setup is done, the job can be executed either from Data Services or from a SAP NetWeaver BW process chain. Select the DataSource ZZ_DS_EMPLOYEE and click the PSA maintenance icon.
The PSA maintenance screen displays the records loaded into the PSA (Figure 35).

Figure 35
PSA Maintenance window showing records loaded into the PSA
Once the records are loaded into the PSA, the user can use the SAP NetWeaver BW data transfer process (DTP) to load the data into DataStore objects (DSOs) and InfoCubes.
Biswajit Biswas
Biswajit Biswas works at SAP GD and is a subject matter expert in SAP analytics. He has five years’ experience and is proficient in the SAP BusinessObjects suite of reporting tools and SAP Data Services. He has been associated with the development of Rapid Deployment Solutions for analytics on SAP HANA, focusing on the utilities industry.
You may contact the author at biswajit.biswas@sap.com.

Sunil Mehta
Sunil Mehta is a solution architect at SAP. He received his master’s degree in computer management from Symbiosis in Pune, India. He is a certified SAP FI/CO/BOBJ consultant working in analytics. During his career he has been associated with Accenture, IBM, Capgemini, and KPMG, and has worked in various roles, including consultant, solution architect, and project manager.
You may contact the author at Sunil.Mehta@sap.com.

Virendra K Soni
Virendra K Soni is a certified SAP Data Services consultant and is currently associated with SAP. He has 6.5 years of industry experience in data migration, data conversion, and data warehousing in the retail and self-insurance industries. Prior to SAP, he worked with Capgemini, CSC, and HCL Technologies.
You may contact the author at Virendra.kumar.soni@sap.com.