Business Process Documentation

SAP Best Practices for Data Migration
Migration: Internal Order

SAP BusinessObjects Data Services 3.2
March 2011
English

SAP AG
Dietmar-Hopp-Allee 16
69190 Walldorf
Germany

Copyright

Copyright 2011 SAP AG. All rights reserved. No part of this publication may be reproduced or transmitted in any form or for any purpose without the express permission of SAP AG. The information contained herein may be changed without prior notice. Some software products marketed by SAP AG and its distributors contain proprietary software components of other software vendors. Microsoft, Windows, Excel, Outlook, and PowerPoint are registered trademarks of Microsoft Corporation. IBM, DB2, DB2 Universal Database, OS/2, Parallel Sysplex, MVS/ESA, AIX, S/390, AS/400, OS/390, OS/400, iSeries, pSeries, xSeries, zSeries, System i, System i5, System p, System p5, System x, System z, System z9, z/OS, AFP, Intelligent Miner, WebSphere, Netfinity, Tivoli, Informix, i5/OS, POWER, POWER5, POWER5+, OpenPower and PowerPC are trademarks or registered trademarks of IBM Corporation. Adobe, the Adobe logo, Acrobat, PostScript, and Reader are either trademarks or registered trademarks of Adobe Systems Incorporated in the United States and/or other countries.Oracle is a registered trademark of Oracle Corporation. UNIX, X/Open, OSF/1, and Motif are registered trademarks of the Open Group. Citrix, ICA, Program Neighborhood, MetaFrame, WinFrame, VideoFrame, and MultiWin are trademarks or registered trademarks of Citrix Systems, Inc. HTML, XML, XHTML and W3C are trademarks or registered trademarks of W3C, World Wide Web Consortium, Massachusetts Institute of Technology. Java is a registered trademark of Sun Microsystems, Inc. JavaScript is a registered trademark of Sun Microsystems, Inc., used under license for technology invented and implemented by Netscape. SAP, R/3, xApps, xApp, SAP NetWeaver, Duet, PartnerEdge, ByDesign, SAP Business ByDesign, and other SAP products and services mentioned herein as well as their respective logos are trademarks or registered trademarks of SAP AG in Germany and in several other countries all over the world. All other product and service names mentioned are the trademarks of their respective companies. Data contained in this document serves informational purposes only. National product specifications may vary.These materials are subject to change without notice. These materials are provided by SAP AG and its affiliated companies ("SAP Group") for informational purposes only, without representation or warranty of any kind, and SAP Group shall not be liable for errors or omissions with respect to the materials. The only warranties for SAP Group products and services are those that are set forth in the express warranty statements accompanying such products and services, if any. Nothing herein should be construed as constituting an additional warranty.

Icons

The following icons are used in this document:

Caution
Example
Note
Recommendation
Syntax
External Process
Business Process Alternative/Decision Choice

Typographic Conventions

Type Style: Description

Example text: Words or characters that appear on the screen. These include field names, screen titles, pushbuttons, as well as menu names, paths, and options. Also cross-references to other documentation.

Example text: Emphasized words or phrases in body text, titles of graphics and tables.

EXAMPLE TEXT: Names of elements in the system. These include report names, program names, transaction codes, table names, and individual key words of a programming language when surrounded by body text, for example, SELECT and INCLUDE.

Example text: Screen output. This includes file and directory names and their paths, messages, source code, names of variables and parameters, as well as names of installation, upgrade, and database tools.

EXAMPLE TEXT: Keys on the keyboard, for example, function keys (such as F2) or the ENTER key.

Example text: Exact user entry. These are words or characters that you enter in the system exactly as they appear in the documentation.

<Example text>: Variable user entry. Pointed brackets indicate that you replace these words and characters with appropriate entries.

Contents

1 Purpose
2 Prerequisites
2.1 Preparations
3 Overview
3.1 IDoc Structure represented in Data Services content
3.2 Global Variable Usage and Default Values
3.2.1 Default Value Variables
3.2.2 Segment Execution Variables
3.2.3 Segment Population in the IDoc Variables
3.2.4 IDoc Control Record Information
3.2.5 Data Formatting Variables
3.3 Referenced Lookup Tables
4 Process Steps
4.1 Configure and Execute Lookup Delta Population Job (Optional)
4.2 Connect to Source (Legacy) Data
4.3 Map Legacy Data
4.4 Execute Object Processing Job
4.5 Review Invalid Output
4.6 Map Input Values in the Lookup Maintenance Application
4.7 Import Object to SAP ERP
4.8 View IDocs on SAP
4.9 Check IDoc Status
4.10 Reconciliation
5 Troubleshooting
6 Appendix
6.1 Functions
6.1.1 Functions used for Validation
6.1.2 Functions used for Enrichment

Create Internal Order

Purpose

This scenario guide provides instructions on the Internal Order migration process, beginning with the legacy data mapping through to the upload of the data to the target SAP system. The SAP BusinessObjects Data Services tool allows organizations to extract, transform, cleanse, map, and validate their legacy data for upload to the SAP system using SAP NetWeaver IDoc technology.

The SAP BusinessObjects Data Services tool provides a graphical user interface (GUI) development environment in which you define data application logic to extract, validate, transform, and load data from databases and applications into the required data migration formats. You can also use the tool to define logical paths for processing message-based queries and transactions from Web-based, front-office, and back-office applications. The tool also allows you to view the errors that occur during the migration process.

Prerequisites

Preparations

Ensure that you have completed all steps specified in the Data Migration Quick Guide to set up the SAP BusinessObjects Data Services and SAP environments for migration.

Overview

Every data migration job developed for the SAP Best Practices for Data Migration package follows the same processing logic:

1. Mapping: Maps the data from the legacy source (such as an application, flat file, or MS Excel) to the predefined mapping structure in the respective mapping Data Flow.
2. Validation: Validates the source legacy data in three ways (see the sketch after this overview):
   - Mandatory checks ensure that mandatory columns are populated with values.
   - Lookup checks ensure that the legacy values are mapped to the respective SAP values within the specific lookup table using the Migration Services application.
   - Format checks ensure that the legacy values are of the correct format (length, date format, decimal format) to load into SAP without further conversion.
3. Enrichment: Changes the valid data that has passed the validation into the required values for the SAP load. For lookup columns, the legacy value is translated to the SAP value. Decimal and date based columns are converted to the required SAP format. Enrichment columns that are left null are enriched with the respective default value.
4. Profiling: Profiles the MAP tables' lookup columns for their distinct list of values. These values are transferred to the Migration Services application, populating the dropdown dialog boxes in the value mapping screens.
5. IDoc generation: Generates the IDoc with valid, enriched data and transfers it to SAP for loading.

Steps 1 to 3 are repeated for each segment within the IDoc. Step 4 is executed after the MAP data tables are populated. Step 5 should only be executed once the valid data records have been created and enriched. The specifics for the object processing are contained in the following sections of this document.
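The following minimal Python sketch illustrates the three validation checks described in step 2 above. It is purely illustrative: the field names, the lookup map, and the rules are hypothetical, and the delivered content implements these checks as Data Services validation transforms, not as Python code.

from datetime import datetime

# Hypothetical rule definitions; the real rules live in the validation transforms.
MANDATORY = ["ORDER_TYPE", "ORDER_NAME"]            # mandatory-check columns
LOOKUPS = {"COMPANY_CODE": {"LEG01": "1000"}}       # legacy-to-SAP value maps
DATE_FIELDS = ["START_DATE"]                        # format-check columns

def validate(record, date_format="%Y%m%d"):         # $G_Date_Format = YYYYMMDD
    """Return the list of validation errors for one legacy record."""
    errors = []
    for col in MANDATORY:                           # 1. mandatory checks
        if not record.get(col):
            errors.append(col + " is mandatory but empty")
    for col, mapping in LOOKUPS.items():            # 2. lookup checks
        if record.get(col) and record[col] not in mapping:
            errors.append(col + " value '" + record[col] + "' is not mapped")
    for col in DATE_FIELDS:                         # 3. format checks
        try:
            datetime.strptime(record.get(col, ""), date_format)
        except ValueError:
            errors.append(col + " is not in the expected date format")
    return errors

# A record that passes all three checks:
print(validate({"ORDER_TYPE": "0100", "ORDER_NAME": "Trade fair",
                "COMPANY_CODE": "LEG01", "START_DATE": "20110301"}))  # []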
IDoc Structure represented in Data Services content

The Internal Order IDoc (INTERNAL_ORDER_CREATE02) target represents a hierarchical structure comprising nested schemas, which themselves contain the fields required to load the various relational table structures in the target SAP application. The complete structure for loading Internal Order data is represented in the IDoc, and as such it can be used to support the functionality required for both migration and interfacing requirements. As part of the SAP Best Practices Data Migration development, we have limited the loading of the Internal Order IDoc to the segments that are required to support an SAP Best Practices deployment.

The segments covered by the delivered SAP BusinessObjects Data Services content are shown in the table and diagram below. The table details the relationship (for example, one to many) and the mandatory or optional nature of the segments, along with their technical names and descriptions. The diagram demonstrates the relationship of the segments in a hierarchical model to assist in the understanding of the IDoc structure.

The rules applied to decide which segments are delivered as standard content are as follows:

Rule 1: If the segment contains a field that was delivered as part of the previous LSMW based content, then the whole segment is delivered, that is, all fields in the segment, not just the fields previously delivered.

Rule 2: If a lower level segment is required to be populated, then the higher level segment (its parent) is delivered as part of the standard content, even though no fields were previously covered in this segment.

These rules are applied generically across all objects unless otherwise stated.

Segment (Level 1) | Required (R) / Optional (O) | Min / Max # per IDoc | Description (Table Name in SAP)

E1INTERNAL_ORDER_CREATE | O | 1 / 1 | Internal Order Header segment (*)

E1BP2075_7 | O | 1 / 1 | Internal Order Master Data

E1BP2075_6 | O | 1 / 999999999 | Internal Order Settlement Data

(*) The Internal Order Header segment is only used to flag whether we are loading the IDoc in simulation mode, and as such it is populated with the variable $G_Simulation. An IDoc is therefore produced for each Master Data record.

An IDoc is very similar in structure to any other nested relational data structure, for example XML. It is hierarchical in nature and allows for segments to be either required or optional, as can be seen in the table above. The IDoc always contains a single root level segment; in this case a single Internal Order Header record is represented in one IDoc. In the case of Internal Orders, an IDoc is produced for each Internal Order Master Data record migrated, due to the one to one relationship between the Internal Order Header segment and the Internal Order Master Data segment (level 1). Below the root level, the IDoc normally allows for one or more occurrences of the various lower level segments, depending on the min and max values allowed for those segments. In the table above you can see that for Internal Order Settlement Data we can have between 1 and 999999999 segments below a single Internal Order Master Data segment. The diagram below shows how we can have one Internal Order Master Data segment with two (or more) Internal Order Settlement Data segments.

Internal Order IDoc structure delivered by standard SAP Best Practices content

The diagram above shows how the segments of the Internal Order IDoc are related in a hierarchical tree structure. In the case of Internal Orders, all segments of the IDoc are held at level one. With Internal Orders, the root node is the Internal Order Header, or Internal Order Master Data, segment. A single IDoc is produced for each Internal Order Master Data record, as there is a one to one relationship between this segment and the Internal Order Header segment. There are no child segments populated in the Internal Order IDoc (that is, all segments are held at level one), but the Internal Order Settlement segment can be considered the leaf node, or child segment, as it has a one to many relationship with both the Internal Order Header and Internal Order Master Data segments.
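To make the one-to-one and one-to-many relationships concrete, the following sketch represents the nesting as a Python data structure. The segment names come from the table above; the fields inside the segments are illustrative assumptions only, not the actual IDoc field catalog.

idoc = {
    "E1INTERNAL_ORDER_CREATE": {                 # header segment (1 / 1)
        "TESTRUN": "X",                          # simulation flag from $G_Simulation (assumed field name)
        "E1BP2075_7": {                          # master data segment (1 / 1)
            "ORDER_TYPE": "0100",                # illustrative fields
            "ORDER_NAME": "Trade fair",
        },
        "E1BP2075_6": [                          # settlement data (1 / 999999999)
            {"SETTL_RECEIVER": "CC-1000", "PERCENTAGE": "60"},
            {"SETTL_RECEIVER": "CC-2000", "PERCENTAGE": "40"},
        ],
    }
}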

Global Variable Usage and Default Values

The following section details the use of global variables within the SAP BusinessObjects Data Services Internal Order IDoc migration content Job, Job_AIO_InternalOrder_IDOC. You can increase the flexibility and reusability of Jobs, Work Flows, and Data Flows by using local and global variables when you design your jobs. Variables are symbolic placeholders for values. The data type of a variable can be any type supported by the software, such as an integer, decimal, date, or text string.

We use global variables within the SAP Best Practices development Jobs rather than local variables to allow the variable values to be passed in from the Migration Services application (only global variables are exposed at the Job level).

The global variables are used in a number of ways within the Best Practices Jobs, as described below:

- Default value variables: define default values for certain fields in the segments when no value is provided by the source mapping, for example Company Code or Sales Organization.
- Segment execution variables: determine whether a segment is to be executed for that run of the Job, performing the mapping, validation, and enrichment steps.
- Segment population in the IDoc variables: determine whether you wish to create the IDoc structure with the specific segment populated.
- IDoc control record information variables: provide the EDI_DC40 control record values, which change depending on the target SAP application.
- Data formatting variables: determine what format should be used to convert data from the source (legacy) data formats to that expected by the target SAP application, for example date field conversion, decimal conversion, and so on.

The following tables define the global variables for the Internal Order IDoc Job.

Default Value Variables

These variables provide a default value for a field that has either not been mapped to the legacy application or contains a NULL value.

Name | Data Type | Description | Default Value

$G_LoadDate | Datetime | Used to provide a consistent timestamp for a single execution of the Job | Current date and time at the start of the Job execution

$G_ProfileMapTables | Varchar(1) | Defines whether to execute the profile MAP table data step | Y

$G_SourceData_Path | Varchar(255) | Points to the location of the test data referenced by each mapping Data Flow | C:\Migration\Source_Files_DI_IN\Internal_Order

$G_Simulation | Varchar(1) | Simulation default | X

$G_Plant_Default | Varchar(4) | Plant default | 1000

$G_Status_Default | Varchar(1) | Status default | 1

$G_Language | Varchar(2) | Language default | EN

Segment Execution Variables

These variables specify whether a segment will be processed as part of the specific Job execution. This runs the mapping, validation, and enrichment steps for the specified segment.

Name | Data Type | Description | Default Value

$G_IOMasterData_Req | Varchar(1) | Internal Order Master Data segment | Y

$G_IOSettlementRule_Req | Varchar(1) | Internal Order Settlement Rule segment | Y

$G_GenerateIDOC_Req | Varchar(1) | Defines whether to execute the IDoc generation step | N

Segment Population in the IDoc Variables

These variables specify whether a segment will be populated in the IDoc for a specific Job execution. These variables are used in the generate IDoc Data Flow, in both the mappings (parent segments) and the where clauses (leaf segments), to determine whether a field or segment is populated respectively.

Name | Data Type | Description | Default Value

Not applicable. There are no segment population in the IDoc variables used in the Internal Order Job.

IDoc Control Record Information

These variables provide values for the IDoc control segment. The client, receiver partner number, and receiver port need to be modified depending on the target SAP application settings.

Name | Data Type | Description | Default Value

$G_Client | Varchar(6) | SAP client to be loaded by the IDoc | 220

$G_IDocType | Varchar(60) | IDoc type of the IDoc | INTERNAL_ORDER_CREATE02

$G_MessageType | Varchar(60) | Message type of the IDoc | INTERNAL_ORDER_CREATE

$G_ReceiverPartnerNumber | Varchar(20) | <SID>CLNT<client> is the default format | RDECLNT220

$G_ReceiverPort | Varchar(20) | SAP<SID> is the default format | SAPRDE

Data Formatting Variables

These variables provide a format that is used to validate and convert the legacy data of that type to the expected SAP data format.

Name | Data Type | Description | Default Value

$G_Date_Format | Varchar(20) | Specifies the default format in which dates come in from the legacy application, if coming from a file or a non-date format field in a database | YYYYMMDD

$G_Time_Format | Varchar(10) | Specifies the default format in which times are migrated from the legacy application, if coming from a file or a non-time format field in a database | HHMISS

$G_Decimal_Separator | Varchar(1) | Specifies what decimal separator is being used in the legacy application, if the decimal data is coming from a file or a non-decimal format field | .

Referenced Lookup Tables

The following table defines which lookup tables are required to be mapped between the legacy values and the SAP configuration data values using the Migration Services application. This information is also included against each individual field requiring a lookup table in the object's mapping spreadsheets.

Object | Lookup Tables | Comment

Internal Order Master Data | Business Area, Check Table For Results Analysis Key Of RA For Orders, Company Code, Controlling Area, Cost Center, Currency, Functional Area, Interest Profile, Logon Data Kernel Side Use, Order Master Data, Order Types, Plant, Pricing Procedures T683, Processing Group For Orders, Profit Center Master Data Table, Sales Doc Header Status And Admin Data, Tax Jurisdiction Code, WBS Element Master Data | Please reference the mapping spreadsheets for the correct column to lookup table relationships.

Internal Order Settlement Rule | Assignment Table, Company Code, General Ledger Account, Order Master Data, Sales Doc Header Status And Admin Data | Please reference the mapping spreadsheets for the correct column to lookup table relationships.

Process Steps

Configure and Execute Lookup Delta Population Job (Optional)

Use

A delta job, Job_AIO_Lookups_SAPToApp, provides logic to import any changes to the configuration data in the SAP application. This job is only required once the target SAP application is installed and configured with changes (either new codes or text changes) to the standard delivered SAP Best Practices Baseline configuration. The job extracts the latest configuration table contents from the SAP system and joins them back to the already populated lookup mapping tables in the lookup maintenance application and the underlying database structure.

If the DS_SAP Datastore has not been correctly configured to connect to the new SAP system, the job will fail to execute. You then have to resolve any issues with the Datastore connectivity and rerun the job.
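The following sketch shows the delta idea in simplified Python: new SAP configuration values are appended to the lookup mapping table as unmapped rows, while existing legacy-to-SAP mappings are preserved and only their descriptions refreshed. The table layouts are simplified assumptions; the actual job works against the staging database tables.

# Already-populated lookup mapping table: SAP value -> legacy mapping and text.
lookup_table = {
    "1000": {"legacy": "L-100", "text": "Plant Hamburg"},
}

# Latest configuration extract from the SAP system: SAP value -> text.
sap_config = {
    "1000": "Plant Hamburg (HH)",   # changed description
    "2000": "Plant Dresden",        # newly configured code
}

for sap_value, text in sap_config.items():
    if sap_value in lookup_table:
        lookup_table[sap_value]["text"] = text                   # refresh text, keep mapping
    else:
        lookup_table[sap_value] = {"legacy": "?", "text": text}  # new row, still unmapped

print(lookup_table)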

Procedure

1. Access the Data Services Designer by choosing the following logon options:

Database type | Database server name | Database name
Microsoft SQL Server | <server name>\SQLEXPRESS | AIO_REPO_IDOC

Windows authentication: select the checkbox.

2. From the Local Object Library, open the AIO_BPFDM_Lookups project.
3. From the Project Area, right-click Job_AIO_Lookups_SAPToApp and choose Execute. If you are prompted whether you would like to save all changes and execute, choose OK.
4. On the Execution Properties dialog box, choose the Global Variable tab and set the following:

Field name | User action and values | Comment

$G_Path | C:\Migration | Ensure that $G_Path points to C:\Migration or a location where you plan to keep your data migration files.

$G_Global_Language | E | Global language

$G_Local_Language | D | Local language

$G_Default_String | ? | Default legacy field value in lookup content

$G_GetDataFromObjectList | N | Object list indicator

$G_Runtime_ObjectListFile_Name | OBJECTS_RUNTIME.csv | Object list file name

$G_GetDataFromTXTFile | N | Lookup list indicator

$G_Runtime_TxtFile_Name | LOOKUPS_RUNTIME.csv | Lookup list file name

$G_GetDataFromVariable | Y | Variable indicator

$G_ObjectAndLookupAndChkTable | Internal_Order | Object name

$G_LoadDate captures the date and time at the beginning of the session. You do not enter any value here; the tool defaults the value to the system date and time. These values are effective only for the current run-time. To set the default values for every execution, right-click the Job in the Project Area and select Properties; the global variables are available on a tab in this window and can be set as above.

5. To find the statistics of the job, open the monitor log; to check for errors, open the error log. Ensure that there are no errors and that the job has completed successfully before you move on to the next step.

Result

The job extracted the latest configuration table contents from the SAP system and joined them back to the already populated lookup mapping tables in the lookup maintenance application and the underlying database structure.

Connect to Source (Legacy) Data

Use

In this activity, you connect all your source (legacy) data into the Data Services Designer.

Procedure

Access the Data Services Designer by choosing the following logon options:

Database type | Database server name | Database name
Microsoft SQL Server | <server name>\SQLEXPRESS | AIO_REPO_IDOC

Windows authentication: select the checkbox.

Option 1: Connecting to TXT files

1. In the Local Object Library, choose the Formats tab at the bottom-left portion of the screen, right-click Flat Files, and select New.
2. In Type, specify the file type. Select Delimited if the file uses a character sequence to separate columns. Select Fixed width if the file uses specified widths for each column.
3. Under General, in Name, enter a name that describes this file format template, for example, InternalOrderMasterData.

After you save this file format template, you cannot change the name. If you would like to change the name or reuse the file template, right-click the file in the Local Object Library and choose Replicate.

4. Under Data Files: in Root Directory, browse to the folder where your text file is stored. In File name(s), browse to the text file that you would like to use.
5. Under Delimiters: in Column, select the required delimiter for your file. In Text, select a text delimiter if one is used to define the text fields in your file.

The value in the Text field could create a problem when you map the inch (") values in the Unit of Measure lookup table.

6. Under Input/Output: in Skipped rows, enter the number of rows that you would like to skip from your source file. In Skip row header, select Yes if your file contains the column headings in the first row. In Write row header, select Yes if you are going to write the file out again using the same file format and would like to preserve the column headings in the first row.

If, as you change the above settings, you are prompted to overwrite the current schema with the schema from the file you selected, choose Yes for the tool to automatically create the required schema for your flat file.

7. Edit the field names, data types, and field sizes as required for the source file that you plan to import.
8. Choose Save & Close to save the file format template and close the File Format Editor.
9. The newly created file format now appears under Flat Files in the Local Object Library section.

To check the flat file definition, right-click the newly created flat file template and select View Data.

Option 2: Connecting to XLS files

1. In the Local Object Library, choose the Formats tab at the bottom-left portion of the screen, right-click Excel Workbooks, and select New.
2. In the Import Excel Workbook dialog box, in Format name, enter a name that describes this file format template, for example, InternalOrderMasterData.

After you save this file format template, you cannot change the name. If you would like to change the name or reuse the file template, right-click the file in the Local Object Library and choose Replicate.

In Directory, browse to the folder where your Excel file is stored. In File name, browse to the file from which you would like to extract. If your data is contained within a named range, select the Named range button and choose the named range you wish to use from the dropdown box.

If the first row contained within the named range contains the column headings, select Use first row values as column names and then choose the Import Schema button; otherwise, just choose the Import Schema button and change the field names to the desired names. If your data is contained in a worksheet, choose the Worksheet button and then select the worksheet you wish to use from the dropdown box. Select Custom range, then choose the browse button. Select the header row within the Excel sheet and close the file. The range is populated in the Custom range field. Then select the Extend range checkbox. Select the Use first row values as column names checkbox if the first row contains the column names. Choose the Import Schema button. In the schema that is populated, change the Data Type to varchar and the Field Size to 255 for all fields if you are unsure of the data type and size of the fields that you plan to import. Choose OK.

3. The newly created file format now appears under Excel Workbooks in the Local Object Library section.

Option 3: Connecting to the Database

1. In the Local Object Library, choose the Datastores tab at the bottom-left portion of the screen, right-click inside the window, and select New.
2. In the Create New Datastore dialog box, enter the following: in Datastore name, specify the name you wish to give your datastore. In Datastore type, select Database. In Database type, select one of the available databases from the list. Depending on the Database type you choose, the connection information will differ. For example, if you choose ODBC from the Database type selection, you are prompted to choose the Data source from the available ODBC data sources defined on the machine, or you can click the ODBC Admin button to create a new ODBC entry. Enter the User name and Password required for the connection to the source.

After you save the datastore, you cannot change the name. Do not include development tier information (DEV or TEST) in the name, as you will not be able to change this if you change the connection information in your datastore.

3. Choose the Advanced button if you wish to set any of the additional parameters for the datastore. Refer to the technical manuals provided with the Data Services installation for more information on these settings.
4. Choose Apply to check that the connection information is correct, and then choose OK to save the new datastore.
5. In the Local Object Library (on the bottom left), select the new datastore you have just created, right-click, and choose Open.
6. In the window (on the right), the datastore tables are displayed if the correct connection parameters have been supplied.
7. Select all the tables that you require as a source for mapping, right-click, and choose Import/Reimport to import the metadata for each of the tables into the datastore.
8. Expand the datastore (in the Local Object Library), and then expand Tables to ensure that the table metadata has been imported.

When you open a datastore, both views and tables within a database are visible and available for import.

Result

Source (legacy) data is now connected into the Data Services Designer and ready for mapping to the required segments in the IDoc structure represented in the mapping Data Flows.

Map Legacy Data

Use

In this activity, you map your legacy data to the SAP object in the Data Services Designer.

Procedure

1. Access the Data Services Designer by choosing the following logon options:

Database type | Database server name | Database name
Microsoft SQL Server | <server name>\SQLEXPRESS | AIO_REPO_IDOC

Windows authentication: select the checkbox.

2. Open your project AIO_BPFDM_IDOC_ERP and select the Job Job_AIO_InternalOrder_IDOC. Expand the Job and expand the Conditional node InternalOrderMasterData_E1BP2075_7_Required. Select the Data Flow object DF_AIO_InternalOrderMasterData_Map. Delete the source test data Excel Workbook from the Data Flow in the Design Area (on the right). To do so, right-click the icon and select Delete from the displayed dropdown menu.

It is recommended that you replicate the original Data Flow and replace the original Data Flow within the Job before making changes to the new Data Flow. This allows you to easily receive fixes and updates to the code from SAP without overwriting the objects and code you have developed yourself.

3. From the Object Library (on the bottom left), select the source table, file format template, or Excel sheet that you wish to use as your source, InternalOrderMasterData in this example. Drag and drop the InternalOrderMasterData file onto the Data Flow and make it the source. Now draw a connector between the source object and the Qry_BestPractices query object.

If you need to map additional fields that are not included in the Best Practices query transform (Qry_BestPractices), delete the Qry_BestPractices transform and map the source table or file directly to the Qry_AllFields transform. Ensure that all fields mapped to the previous Qry_BestPractices transform fields are either set to NULL or mapped to the new source table or file fields.

4. If you are using a flat file as your source, double-click the source object. In the dialog box that appears, under General, in Adaptable Schema, choose Yes for the field size to be compatible with the legacy data. This allows you to work with flat files generated from Excel.
5. Double-click Qry_BestPractices under DF_AIO_InternalOrderMasterData_Map on the left. On the right side of the screen, you will see the legacy data (input schema) on the left and the Internal Order Master Data (output schema) on the right. Map the source file (input schema) to the target table (output schema).

If you have deleted the Best Practices query transform (Qry_BestPractices), double-click the Qry_AllFields transform and map the source fields (input schema) to the target table (output schema).

6. Go back to the Data Flow screen. Click the magnifying glass icon on the source object to see the data. If the data appears truncated, you can modify any of the data types to Varchar and manually specify the size of the data to be expected on input. To do so, right-click the InternalOrderMasterData file format in the Object Library and choose Edit. Now manually change the data type and field size for the fields that you want to be correctly displayed.

To change multiple fields in one go, highlight all the fields, right-click and select Properties; this allows you to set the data type and size of the selected columns in one step.

7. If the number of rows displayed is less than the number of rows in your text file, from the application menu choose Tools → Options. In the Options dialog box, select Designer → General on the left. In View data sampling size (rows), increase the number of rows that you would like to be displayed and choose OK.
8. Repeat steps 2 to 7 for all the Sub Objects you require to migrate within the Job Job_AIO_InternalOrder_IDOC; see the table below:

Sub Object | Conditional | Data Flow

Internal Order Master Data | InternalOrderMasterData_E1BP2075_7_Required | DF_AIO_InternalOrderMasterData_Map

Internal Order Settlement Rules | InternalOrderSettlementRule_E1BP2075_6_Required | DF_AIO_InternalOrderSettlementRule_Map

Result

The legacy data is now mapped to the SAP object in the Data Services Designer.

Execute Object Processing Job

Use

In this activity, you execute the Internal Order processing job for validation and enrichment, without loading the data to the SAP system. You may need to execute this step a number of times until all data passes the required validation and enrichment steps.

Procedure

1. Access the Data Services Designer by choosing the following logon options:

Database type | Database server name | Database name
Microsoft SQL Server | <server name>\SQLEXPRESS | AIO_REPO_IDOC

Windows authentication: select the checkbox.

2. From the Project Area, right-click the job you wish to execute, for example Job_AIO_InternalOrder_IDOC, and choose Execute. If you are prompted whether you would like to save all changes and execute, choose OK.
3. On the Execution Properties dialog box, choose the Global Variable tab and perform the following:
- For the segments that you would like to execute, under Value, enter Y (that is, $G_IOMasterData_Req, $G_IOSettlementRule_Req).
- $G_LoadDate captures the date and time at the beginning of the session. Do not enter any information here; the tool defaults the value to the system date and time.
- Ensure that the global variables for the default values to be used in the enrichment rules are set to the values you require, for example $G_Plant_Default equals 1000. See section 3.2, Global Variable Usage and Default Values, for the complete list of variables. These values are effective only for the current run-time. Right-click the Job, choose Properties, and then the Global Variable tab to set the default values to be used for all executions.

The default date format is set to YYYYMMDD. If the date format in your legacy data has a different structure, you must set the $G_Date_Format variable to match the legacy date format. For example, if 31072009 is the date format of your legacy data, set $G_Date_Format to DDMMYYYY; if 07312009 is the date format of your legacy data, set $G_Date_Format to MMDDYYYY.

- Ensure that the global variable $G_ProfileMapTables is set to Y, to run the profile step of the Job. This populates the distinct list of values from the lookup columns in all segment MAP tables into the Migration Services application.
- Ensure that the global variable $G_GenerateIDOC_Req is set to N, to run the validation and enrichment steps only, without sending the data via the IDoc to the SAP target system.

Result

To find the statistics of the job, open the monitor log; to check for errors, open the error log. Ensure that there are no errors and that the job has completed successfully before you move on to the next step. For information on the custom functions applied during the validation and enrichment steps of the processing, refer to section 6.1 in the Appendix.

Review Invalid Output

1. Once the object processing Job has completed successfully, you can drill down to the DF_AIO_<Object>_Validate Data Flow. For example, let us use the Cost Center General Data object as shown below:

2. Choose the magnifying lens button on the invalid records container in the work area as shown below:

3. In the bottom area, the system displays an overview of all records that failed one or more validations. Choose the Save button to download the complete list of failed records, including a status field that indicates the reason each record failed.

4. You can either download the file with the invalid records or look at the errors directly within the Designer. There, you can scroll to the right to find the Status column, or alternatively choose the Show/Hide Columns button, select Status, and deselect all others.

5. The system displays an overview of all records, and you can see where the validation failed:

To filter the data records, select the data in a cell, then right-click and choose either Filter or the more specific "column name = value" Filter option to apply that cell value as a filter to the data set. If the data set you are processing is larger than the sample dataset value, choose the Refresh button to apply the filter to the whole dataset. Data can be ordered by clicking the column headings.

For information on reviewing invalid output using reports, see the Reporting and Visualization document (DataMigration_Reporting_Visualization_EN_US.doc) on the documentation DVD under \DMS_US\Documentation.

Map Input Values in the Lookup Maintenance Application

Use

In this activity, you map input values from the legacy system to SAP structural configuration data in the lookup maintenance application.

Procedure

1. Open the Lookup Maintenance Application at http://<server>:28080/MigrationServices.
2. Within the application, select the object as shown below and the lookup table that you wish to map values to. This displays the lookup table maintenance screen. An example is shown below. For a list of the tables requiring legacy value mapping if the fields are populated as part of the migration process, see section 3.3, Referenced Lookup Tables. This information is also referenced within the mapping spreadsheets for ease of use and reference during the mapping exercise.

If you want to filter the values that are displayed in the lookup mapping table, choose the Filter button to the right of the column headings. An additional set of filter boxes (one for each column) is displayed. You can enter a value and choose the Refresh button to filter the displayed rows. To remove the filter, choose the Filter button again.

3. Within the lookup maintenance screen, you can now specify the legacy values that map to the SAP lookup values by overwriting the ? value in the LEGACY_ column.

If you want to map multiple legacy system values to the same SAP lookup value, you can replicate a row. To replicate a value, choose the Duplicate button to the right of the row. If you want to delete a row you have created, choose the Delete button.

4. When you have mapped all the values for the specific lookup table, choose the Save Data button at the bottom of the screen.

If you want to revert to the original state of the lookup mapping table, choose the Reset button to reload the lookup maintenance table.

Result

The lookup table maintenance tool manipulates the data directly in the lookup tables in the database. There is no need to run any further Data Services jobs to upload the changes made in the application.

Import Object to SAP ERP

Use

When you have completed the previous steps and your data is validated and enriched successfully, follow the instructions in this section to import your data into SAP.

Procedure

1. Access the Data Services Designer by choosing the following logon options:

Database type | Database server name | Database name
Microsoft SQL Server | <server name>\SQLEXPRESS | AIO_REPO_IDOC

Windows authentication: select the checkbox.

2. From the Project Area, right-click the job you wish to execute, for example Job_AIO_InternalOrder_IDOC, and choose Execute. If you are prompted whether you would like to save all changes and execute, choose OK.
3. On the Execution Properties dialog box, choose the Global Variable tab and perform the following:
- Use the same variable values as for the validation run.
- Ensure that the global variable $G_GenerateIDOC_Req is set to Y, to generate the IDocs and run the Master Data upload.
- If you do not want to re-execute the mapping, validation, and enrichment processes for the object you are migrating (that is, you are happy with the content and do not need to re-extract the data from the source), set the segment execution global variables to N so that these steps are not re-executed.

Result

The legacy data is now uploaded into the SAP system via the IDoc mechanism. When you log on to the SAP system, you can track the upload via transaction WE05.

View IDocs on SAP

Use

The following instructions lead you through the steps of viewing the IDocs.

Procedure

1. On the SAP Easy Access screen, enter transaction code WE05.
2. On the IDoc List screen, enter the date for which you want to see the IDocs that were created and choose Execute.
3. On the following screen, you see the list of IDocs that were created.

Check IDoc Status

After you have completed section 4.7, you can use the IDoc status checking code delivered by SAP Best Practices for Data Migration to check the current status of the IDocs in the target SAP environment. For more information on this process, see the IDoc Status Check document (DataMigration_IDoc_Status_Check_EN_US.doc) on the documentation DVD under \DMS_US\Documentation. The functionality delivered populates the MGMT_IDOCSTATUS table in the staging area database, allowing you to build reports on top of this table for the required objects.

Reconciliation

You can now use the Reconciliation code delivered by SAP Best Practices for Data Migration to check that the data has been successfully loaded into the target SAP environment. For more information on this process, see the Reconciliation document (DataMigration_Reconciliation_EN_US.doc) on the documentation DVD under \DMS_US\Documentation.

Troubleshooting

Version incompatibility: Data Services Designer only imports the project file (.atl) if the file version is the same as or lower than the Data Services version that you are running on your machine. If the imported project file does not match the Data Services version that you are running, you get an error message and need to upgrade your Data Services installation.

When connections do not succeed: If the datastore connection to the application does not work, the datastore is not saved.

Tables not available for import: If the tables you are trying to import do not appear in the Datastore when you open it, this is probably due to permission issues on the database for the user that you are connecting with. Ensure that the user has read access to the tables and to the data dictionary where appropriate.

SQL errors when trying to generate the IDoc: If you get SQL errors stating that tables do not exist in the staging database when trying to run the Generate IDoc Data Flow, this is normally because not all sections have been executed, even if you are not supplying data for a section. Refer to the installation guide regarding the running of all segments prior to mapping and using the Jobs for migration.

Appendix

Functions

The validation and enrichment of the object-related data is performed by both prebuilt and custom functions written in Data Services. The standard prebuilt functions, such as ifthenelse and lookup_ext, are not covered within this guide; refer to the Data Services Technical Manuals. The custom functions are discussed in detail below to give an understanding of their functionality, usage, and purpose.

Functions used for Validation

Function (Input Parameters) (Return) | Description | Where Used

is_valid_date(legacy date field [input], expected date format [input] ($G_Date_Format)) (Return: 1 = valid date, 0 = invalid date)

Example of the function within the format validation transforms: is_valid_date(table.field1, $G_Date_Format) = 1

The standard function is_valid_date checks whether the varchar value being passed in matches the format specified in the second input parameter, that is, $G_Date_Format. If it does, the return is 1; otherwise the return is 0. The default value for the $G_Date_Format variable is YYYYMMDD, but this can be changed in the initialization script at the start of the Job or within the parameter management screens in the Migration Services application.

Note: If the legacy application is supplying the date from a database date field, this validation can be turned off and a straight mapping applied, as the date validation has already been performed by the source system.

Where used: for all input date fields within all levels of the Internal Order object.
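As a rough illustration of the semantics, the check could be expressed in Python as below. This is a sketch only: the real is_valid_date is a Data Services function, and only the three format strings used elsewhere in this guide are handled here.

from datetime import datetime

_FORMATS = {"YYYYMMDD": "%Y%m%d", "DDMMYYYY": "%d%m%Y", "MMDDYYYY": "%m%d%Y"}

def is_valid_date(value, date_format):
    """Return 1 if value matches date_format, otherwise 0 (mirrors the DS return values)."""
    try:
        datetime.strptime(value, _FORMATS[date_format])
        return 1
    except (ValueError, KeyError, TypeError):
        return 0

print(is_valid_date("20110301", "YYYYMMDD"))  # 1
print(is_valid_date("20111331", "YYYYMMDD"))  # 0, month 13 is not valid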

Functions used for Enrichment

Function (Input Parameters) (Return) | Description | Where Used

ENR_Decimal_Conversion(legacy input string [input], decimal separator [input]) (Return: varchar string with the correct "." separator used for the decimal values)

For example, ENR_Decimal_Conversion(Table.Field, ',') converts the decimal string '123,45' to '123.45'.

The function converts the legacy decimal field value from its varchar input format to the desired SAP format for the decimal, that is, with a "." rather than a "," decimal separator.

Where used: for all input decimal fields within all levels of the Internal Order object.
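A Python sketch of the same conversion logic is shown below; the actual custom function is written in the Data Services scripting language, so this is illustrative only.

def enr_decimal_conversion(value, separator):
    """Replace the legacy decimal separator with the '.' separator SAP expects."""
    if value is None:
        return value
    return value.replace(separator, ".")

print(enr_decimal_conversion("123,45", ","))  # -> 123.45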