
130297267 transformations


INFORMATICA ************************ Informatica is a GUI (Graphical User Interface) based tool and it is used to do ETL work. The widely used versions of Informatica are : 5, 6, 7.1, 8.6, 9. Architecture of Informatica :

Different Components of Informatica :

1) PowerCenter Administration Console 2) PowerCenter Repository Manager 3) PowerCenter Designer 4) PowerCenter Workflow Manager 5) PowerCenter Workflow Monitor

Different Roles of Informatica : In any Informatica project, there are mainly two roles : Developer and Administrator. → The Developer creates Mappings and Workflows by using the Designer and the Workflow Manager respectively. → The Administrator is responsible for all admin activities, such as creating Users and Groups, granting permissions, and running the Workflows.


* List of Transformations * Following is the list of Transformations available in the PowerCenter Designer tool : 1) Aggregator Transformation 2) Source Qualifier Transformation 3) Expression Transformation 4) Filter Transformation 5) Joiner Transformation 6) Lookup Transformation 7) Normalizer Transformation 8) Rank Transformation 9) Router Transformation 10) Sequence Generator Transformation 11) Stored Procedure Transformation 12) Sorter Transformation 13) Update Strategy Transformation 14) XML Source Qualifier Transformation 15) Union Transformation 16) Transaction Control Transformation * POWERCENTER DESIGNER * → It is a Windows-based client and it is used to create mappings. → To work with the PowerCenter Designer window, we go through the following steps :

1) Creating ODBC connection to the Source Database 2) Importing Source Metadata. 3) Creating ODBC connection to the Target Database. 4) Importing Target Metadata. 5) Creating Mapping 6) Creating Transformations.

Step – 1 : Creating ODBC connection to the Source Database : Go to Start → click on Control Panel → click on Administrative Tools → click on the Data Sources (ODBC) option; a window will open. In it, click on the System DSN tab as shown below :


Click on the "Add" button; another window will open. In it, select the data source "Oracle in OraDb10g_home1(2)" as below :

→ Click on the Finish button → a window will open. In it provide Data Source Name as : RRITEC_SOURCE, TNS Service Name as : ORCL, User ID as : SOURCE_USER


Click on the "Test Connection" button → click on Ok. A "Connection Successful" message window then appears :

Click on Ok → click on Ok. Now the ODBC connection for the Source Database has been created successfully. Step – 2 : Importing Source Metadata : Go to Start → click on RUN → type "services.msc" in the Run window as below :

Click on Ok → a window will open showing all the services → in it, select Informatica Services 8.6.0 → click on Start; if it is already started, click on the Restart link as below :


→ Go to Start → All Programs → Informatica PowerCenter 8.6.0 → Client → PowerCenter Designer → click on Ok → the Designer window will open. → Right click on our created Repository, i.e., rep_05 → click on Connect → a window will open asking for the Username and Password. Provide the Username and Password as Administrator, as shown in the window below :

→ Click on Connect. → Right click on our folder, i.e., the RRITEC folder → click on Open. → Click on the Source Analyzer icon on the toolbar, marked by the red circle.

→ Go to the Sources menu → click on the Import from Database option; a window will open → in it, provide the ODBC data source as RRITEC_SOURCE and the Username and Password as SOURCE_USER, as shown below :


→ Click on Connect → under the Select Tables sub-window, which is marked in brown, select the EMP and DEPT tables → click on Ok → go to the Repository menu → click on Save. Step – 3 : Creating ODBC connection to the Target Database : → Go to the SQL*Plus tool → log in as "/ as sysdba". → After the SQL*Plus window opens, type :
CREATE USER TARGET_USER IDENTIFIED BY TARGET_USER;
GRANT DBA TO TARGET_USER;
→ Connect to SQL*Plus using the above username and password :
CONN TARGET_USER@ORCL ----> press Enter
Password : TARGET_USER

SELECT * FROM Tab;
CREATE TABLE T_EMP AS SELECT * FROM Scott.Emp WHERE 1=2;

→ Go to Start → click on Control Panel → click on Administrative Tools → click on Data Sources (ODBC) → a window will open as below :


Click on the Add button → a window will open; select the "Oracle in OraDb10g_home2" option as shown below :

→ Click on Finish; a new window will open → in it provide Data Source Name as : RRITEC_TARGET, TNS Service Name as : ORCL, User ID as : TARGET_USER


→ Click on Test Connection → a small window will open as :

Provide Service Name : ORCL, User Name : TARGET_USER, Password : TARGET_USER → click on Ok.

Step – 4 : Importing Target Metadata : → Go to the Informatica Designer tool → click on the Target Designer icon as :

→ Go to the Targets menu → click on Import From Database → a new window will open → in it provide the ODBC data source as : RRITEC_TARGET, Username as : TARGET_USER, Password as : TARGET_USER


→ Click on Connect → a new window will open → in it expand the TARGET_USER schema, which is marked in brown, as below :

→ Expand TABLES → select the target table and click on Ok → go to the Repository menu and click on Save.


Step – 5 : Creating Mapping : → In the Designer tool → click on the Mapping Designer icon → go to the Mappings menu → click on Create → give the name as m_First_Mapping as shown below :

→ Click on Ok. → From the Repository navigator, expand the RRITEC folder → expand the Sources tab → drag and drop the EMP table into the mapping workspace → expand the Targets tab → drag and drop the T_EMP table beside the Source Qualifier table (SQ_EMP) in the workspace → map the corresponding ports from SQ_EMP to the target table T_EMP → go to the Repository menu → click on Save. * POWERCENTER WORKFLOW MANAGER * → It is used to create Tasks and Workflows. → Creating a Workflow involves the steps below :

1) Creating Workflow 2) Creating Connection 3) Creating Task 4) Linking Task and Workflow 5) Running Workflow

1) Creating Workflow : Start → All Programs → Informatica PowerCenter 8.6.0 → Client → click on PowerCenter Workflow Manager → the window will look as


→ Right click on our Repository, i.e., rep_05 → click on Connect → a new window will open; provide the Username and Password as Administrator.

→ Click on Connect → right click on the RRITEC folder → click on Open → click on the Workflow Designer icon as shown below :

→ Go to the Workflows menu → click on Create → name it as First_Workflow :


→ Click on Ok → the workflow workspace will look as below :

→ Go to the Repository menu → click on Save. 2) Creating Connection : → For SOURCE : In the Connections menu → click on the Relational option → a new window will open as :

→ Select Type as ALL → click on the New button → a new window will open; select Oracle from the list of available options as shown below :


→ Click on Ok → a new window will open → in it provide the name as SOURCE, Username as : SOURCE_USER, Password as : SOURCE_USER, Connection String as : ORCL, as shown below :

Click on the Ok button → similarly, create one more connection for the Target using the same procedure, but here provide the name as TARGET, Username as : TARGET_USER, Password as : TARGET_USER, Connection String as : ORCL, as shown below :


→ Click on Ok. 3) Creating Task : → In the Workflow Manager window → go to the Tasks menu → click on Create → a new window will open; in it select the type as Session and the name as : First_Session, as shown below :

→ Click on Create → click on Done → a new window will open as :


From the above window, select our mapping, i.e., m_First_Mapping, from the list of all mappings available in our Repository ( it is not shown here, because the mapping was not saved while preparing this document ) → click on Ok → our workflow window will then look as :

→ Double click on First_Session → a new window will open; in it click on the Mapping tab, and the window will look as :

→ From the Navigator of the above window → select SQ_EMP under the Sources tab → the window will change as :


→ Click on the down arrow mark, which is marked in brown; another window will open as :

In it we need to select our newly created SOURCE connection, because what we selected in the Navigator above is also a source → click on Ok.


→ Similarly, select our target table, i.e., T_EMP, click on the down arrow mark, select the TARGET connection, and click on Ok. 4) Linking Task and Workflow : → Go to the Tasks menu → click on the Link Task option → click on the workflow Start task, which is represented by a green arrow, and drag the cursor to First_Session → the flow will then look as below :

→ Go to the Repository menu → click on Save. 5) Running the Workflow : → Right click on the workflow, i.e., the green arrow → click on Start Workflow from Task. TRANSFORMATIONS ******************************** → In Informatica, Transformations help to transform the source data according to the requirements of the target system, and they ensure the quality of the data being loaded into the target. Transformations are of two types : Active and Passive. Active Transformation : An active transformation can change the number of rows that pass through it from source to target, i.e., it can eliminate rows that do not meet the transformation condition. Passive Transformation : A passive transformation does not change the number of rows that pass through it, i.e., it passes all rows through the transformation.


→ Transformations can be Connected or Unconnected. Connected Transformation : A connected transformation is connected to other transformations or directly to the target table in the mapping. Unconnected Transformation : An unconnected transformation is not connected to other transformations in the mapping. It is called within another transformation and returns a value to that transformation. FILTER TRANSFORMATION ************************************* → It is an Active and Connected type of transformation. → A filter condition returns TRUE or FALSE for each row that the Integration Service evaluates, depending on whether the row meets the specified condition. Each row that returns TRUE is passed through the transformation by the Integration Service. Each row that returns FALSE is dropped by the Integration Service, which writes a message to the session log. → For any task we are going to do, it is good practice to first draw a diagram to get a clear idea of the task, that is, a DFD (Data Flow Diagram) or MFD (Mapping Flow Diagram). So here also we are going to draw a sample Data Flow Diagram.
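The filter semantics described above (TRUE passes through, FALSE is dropped and logged) can be sketched in plain Python. This is only an analogy of the behaviour, not Informatica's engine or API; the field names and sample rows are illustrative.

```python
# Plain-Python sketch of Filter transformation semantics: rows whose
# condition evaluates TRUE pass through; rows evaluating FALSE are
# dropped and a message is written to the session log.

def filter_transformation(rows, condition, session_log):
    """Yield only the rows that satisfy the filter condition."""
    for row in rows:
        if condition(row):
            yield row                                   # TRUE: row passes
        else:
            session_log.append(f"dropped row: {row}")   # FALSE: row dropped

rows = [
    {"EMPNO": 7839, "ENAME": "KING", "SAL": 5000},
    {"EMPNO": 7369, "ENAME": "SMITH", "SAL": 800},
]
log = []
passed = list(filter_transformation(rows, lambda r: r["SAL"] >= 3000, log))
print([r["ENAME"] for r in passed])   # ['KING']
print(len(log))                       # 1
```

The condition `SAL >= 3000` is the same one the walkthrough below builds in the Expression Editor.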

Steps 1, 2 and 3 are the same here also, so they are not repeated; you can do them as before. Step – 4 : Creating and Importing Target Table Metadata : Keep in mind that if we want to import any table structure into the Designer window to hold the target data, that table structure must already exist in the database. If it is not there, create it first using the procedure below. Go to SQL*Plus, give the username as TARGET_USER, the password as TARGET_USER, and the host string as ORCL, then click on Ok. After the SQL*Plus window opens, type the syntax below to create the table definition :


CREATE TABLE F_EMP ( Empno Number(10), Ename Varchar2(20), Job Varchar2(50), Sal Number(10,2), Comm Number(10,2), Deptno Number(10) ) ; Now open the Informatica Designer window, right click on the Repository (here rep_05) in the Navigator window → click on Connect → a small window will open asking for the Username and Password :

→ Provide the Username and Password as Administrator → click on Connect. Our created folder, where we need to store all our work, is then opened; expand that folder to see the list of options available in it, as below :


Now we need to import the target table into the Designer, as per the steps below : → Click on the Warehouse Designer (Target Designer) icon at the top of the workspace window, circled in red in the diagram below :

Now go to the "Targets" menu → click on the "Import From Database" option; a new window will open as below :


In the window above, select the ODBC data source as : RRITEC_TARGET, Username as : TARGET_USER, Password as : TARGET_USER → click on Connect, and under the "Select Tables" sub-window, circled in red, expand that user to find the list of tables available under it. Select the required table, here the F_EMP table, and click on OK; the table metadata then comes into the Designer window under the Target Designer workspace, as shown below :


Step – 5 : Creating Mapping : Go to the Designer tool and click on the Mapping Designer icon, indicated by the red circle as shown below :

Go to the "Mappings" menu and click on the Create option; a small window will open : give the mapping any name, here "m_FILTER".

Click on Ok. In the Repository Navigator window, expand the RRITEC folder → expand the Sources folder → expand the SOURCE_USER tab; there we find our source table. Drag and drop the source table EMP into the workspace window; after dragging, the Designer window looks as below :


Whenever we drag a source table into the workspace, another table also comes in by default, named SQ_<source_table_name>, here SQ_EMP, where SQ stands for Source Qualifier. Now, to create the transformation, go to the Transformation menu → click on Create; a window will open asking for two values. Select the transformation type as Filter and the name of the transformation as : t_SAL_GREATERTHAN_3000, as shown below :

Click on Create → click on Done. Now the Designer workspace will look as :


Now we need to set the filter condition on this newly created Filter transformation. Select the required fields in the Source Qualifier and drag and drop those fields from the Source Qualifier table into the Filter transformation table as below :

→ To give the condition for the Filter transformation table, click on the header of the Filter transformation table; a new "Edit Transformations" window will open. In it, select the Properties tab, and the window will look as :


→ Click on the down arrow mark of the Filter Condition attribute, circled in red in the diagram above; the Expression Editor window will open as below :

→ In the Expression Editor window, select the Ports tab → double click on SAL → from the keyboard type >= 3000 → click on the Validate button; it shows a validation success message if the syntax is correct, as :

Click on Ok → click on Ok → again Ok.

→ Now expand the TARGET_USER tab; there we find our target table. Drag and drop the target table F_EMP into the workspace window; after dragging, the Designer window looks as below :


→ Now map the required fields from the Filter transformation table into the target table as below :

→ Go to the Repository menu → click on Save, and observe the saved message, stating whether the mapping is Valid or Invalid, under the Output window of the same Designer tool. → If we want to check whether the records were loaded into the target table after executing our workflow, go to SQL*Plus and query the table as : > SELECT * FROM F_EMP; We get only those records that meet the condition we specified in the Filter transformation. Filtering Rows with NULL values : To filter rows containing null values or spaces, use the ISNULL and IS_SPACES functions to test the value of the port. For example, if you want to filter out rows that contain a NULL value in the FIRST_NAME port, use the following condition : IIF ( ISNULL ( FIRST_NAME ) , FALSE , TRUE ) This condition states that if the FIRST_NAME port is NULL, the return value is FALSE and the row is discarded. Otherwise, the row passes through to the next transformation. Limitations of the Filter Transformation : i) Using a Filter transformation, we can load only one target table at a time. ii) If we have five different conditions, we need to create five different Filter transformations.


iii) We cannot capture the false (unsatisfied) records of the condition in a table. NOTE : 1) Use the Filter transformation early in the mapping : To maximize session performance, keep the Filter transformation as close as possible to the sources in the mapping. Rather than passing rows that you plan to discard through the mapping, you can filter out unwanted data early in the flow of data from sources to targets. 2) Use the Source Qualifier transformation to filter : The Source Qualifier transformation provides an alternate way to filter rows. Rather than filtering rows from within a mapping, the Source Qualifier transformation filters rows as they are read from a source. The main difference is that the Source Qualifier limits the row set extracted from a source, while the Filter transformation limits the row set sent to a target. Since a Source Qualifier reduces the number of rows used throughout the mapping, it provides better performance. However, the Source Qualifier transformation only lets you filter rows from relational sources, while the Filter transformation filters rows from any type of source. Also note that, since it runs in the database, you must make sure that the filter condition in the Source Qualifier transformation uses only standard SQL. The Filter transformation can define a condition using any statement or transformation function that returns either a TRUE or FALSE value. EXPRESSION TRANSFORMATION ************************************* → It is a Passive and Connected type of transformation. → Use the Expression transformation to perform non-aggregate, row-level calculations. → It is a best practice to place one Expression transformation after the Source Qualifier transformation and one Expression transformation before the target table, because in the future, if we add any new column to the target that is independent of the source table, we would otherwise need to disturb the whole mapping; with this practice there is no need to disturb the whole mapping. It is enough to change the Expression transformation just before the target table. → The Expression transformation is useful to derive new columns from the existing columns.


Mapping Flow Diagram : ( here the target table has the fields EmpNo, Ename, Job, Sal, Comm, Tax, Total_Sal ) → Steps 1, 2 and 3 are the same here also. Step – 4 : Creating the target table and importing its metadata into the Designer tool : → Open SQL*Plus and type the below to create the table ( replace p,s with the required precision and scale, e.g. Number(10,2) ) : CREATE TABLE EXP_EMP ( EmpNo Number, Ename Varchar2(30), Job Varchar2(20), Sal Number(p,s), Comm Number(p,s), Tax Number(p,s), Total_Sal Number(p,s) ) ; → Go to the Targets menu → click on the Import from Database option → a new window will open; in it provide the Target ODBC connection as : RRITEC_TARGET, Username as : TARGET_USER, Password as : TARGET_USER, as below :

Mapping flow : EMP (source) → SQ_EMP → EXPRESSION T/F ( Tax = Sal * 0.1, Total_Sal = Sal + IIF(IsNull(Comm),0,Comm) ) → EXP_EMP (target)
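The two derived ports in the mapping flow above can be sketched in plain Python; this is a stand-in for Informatica's expression language, not the real engine, with None playing the role of NULL and the sample rows illustrative.

```python
# Sketch of the Expression transformation above:
# Tax = Sal * 0.1, Total_Sal = Sal + IIF(IsNull(Comm), 0, Comm).

def derive_ports(row):
    sal = row["SAL"]
    comm = row["COMM"]
    row["TAX"] = sal * 0.1
    # IIF(IsNull(Comm), 0, Comm): substitute 0 for a NULL commission
    row["TOTAL_SAL"] = sal + (0 if comm is None else comm)
    return row

row = derive_ports({"EMPNO": 7499, "ENAME": "ALLEN", "SAL": 1600, "COMM": 300})
print(row["TAX"], row["TOTAL_SAL"])    # Tax = 10% of Sal, Total = 1900

row2 = derive_ports({"EMPNO": 7566, "ENAME": "JONES", "SAL": 2975, "COMM": None})
print(row2["TOTAL_SAL"])               # NULL Comm treated as 0 -> 2975
```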


→ Click on Connect → the same window will now look slightly different → in it expand the TARGET_USER tab and the TABLES tab, which are marked in brown → select our target table, i.e., the EXP_EMP table, and click on Ok, as below :


Step – 5 : Creating Mapping : → Click on the Mapping Designer icon on the toolbar → go to the Mappings menu → click on the Create option and give the mapping name as : m_EXP_EMP, as shown below :

→ Click on Ok. → Now drag and drop the source and target tables into the mapping workspace as below :

→ Go to the Transformations menu → click on Create → a new window appears; in it select the type as Expression and give the name as : T_Tax_And_TotSal, as below :

→ Click on Create → click on Done → drag and drop the required columns ( EmpNo, Ename, Job, Sal, Comm ) from SQ_EMP to the T_Tax_And_TotSal transformation table as shown below :


→ Double click on the header of the Expression transformation → a new window appears; in it click on the Ports tab → the window will look as :

→ Select the last port, i.e., Comm → click twice on the "Add a new port" icon, which is marked in brown, to add two new ports as below :


→ Rename the two new ports, one as Tax and the other as TotSal, set both datatypes to Decimal, and disable the Input checkbox for both ports as :


→ Click on the Expression icon, marked in brown above → a new window appears where you need to write the formula for the Tax column as :

→ Click on the Validate button; it gives a validation status message stating whether there are any syntax errors :

→ Click on Ok → click on Ok → similarly, click on the expression down arrow mark corresponding to TotSal and type the below expression :


→ Click on Validate to check whether there are any syntax errors :

→ Click on Ok → click on Ok → click on Ok → now connect the corresponding ports from the Expression transformation table to the target table as shown below :


→ The remaining process, creating the Session, creating the Workflow, and running the Workflow from the Task, is the same here also. Note : In the Session Properties window, make sure to change the Load Type from the default option, i.e., from Bulk to Normal. → After the Workflow has run successfully → go to the target table and observe whether the data is properly populated. Performance Tuning : As we know, there are three types of ports : Input, Output, and Variable. Variable → this port is used to store any temporary calculation.

1) Use operators instead of functions. 2) Minimize the usage of string functions. 3) If we use the same complex expression multiple times in the Expression transformation,

then make that expression a variable port, and use this variable for all computations.
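The third tip above can be sketched in plain Python: compute the complex expression once in a "variable port" (here just a local variable) and reuse it in every output port, instead of re-evaluating it per port. The port names and formulas are illustrative, not from the document.

```python
# Sketch of the variable-port tip: evaluate the shared expression once,
# then reuse the stored value in each output port.

def output_ports(row):
    # v_base plays the role of a variable port holding the shared result
    v_base = row["SAL"] + (row["COMM"] or 0)   # evaluated only once
    return {
        "TAX":   v_base * 0.10,   # reuses v_base
        "BONUS": v_base * 0.05,   # reuses v_base
        "NET":   v_base * 0.85,   # reuses v_base
    }

ports = output_ports({"SAL": 1000, "COMM": 200})
print(ports)
```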

RANK TRANSFORMATION ******************************* → It is an Active and Connected type of transformation. → The Rank transformation differs from the transformation functions MAX and MIN in that it lets you select a group of top or bottom values, not just one value. → You can designate only one Rank port in a Rank transformation. → The Designer creates a RANKINDEX port for each Rank transformation automatically. The Integration Service uses the Rank Index port to store the ranking position of each row in a group. For example, if you create a Rank transformation that ranks the top five salespersons for each quarter, the rank index numbers the salespersons from 1 to 5 :

RANKINDEX   SALES_PERSON   SALES
1           Sam            10,000
2           Mary            9,000
3           Alice           8,000
4           Ron             7,000
5           Alex            6,000
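The top-N-per-group behaviour above, including the generated RANKINDEX, can be sketched in plain Python. This is an analogy of what the Rank transformation computes, not PowerCenter itself; the sample rows are illustrative.

```python
# Sketch of a Rank transformation: keep the top-N rows per group on the
# rank port and emit a RANKINDEX numbering them 1..N within each group.
from itertools import groupby

def rank_top_n(rows, group_key, rank_port, n):
    ranked = []
    rows = sorted(rows, key=group_key)            # bring each group together
    for _, group in groupby(rows, key=group_key):
        top = sorted(group, key=rank_port, reverse=True)[:n]
        for index, row in enumerate(top, start=1):
            ranked.append({**row, "RANKINDEX": index})
    return ranked

sales = [
    {"QTR": "Q1", "SALES_PERSON": "Sam",   "SALES": 10000},
    {"QTR": "Q1", "SALES_PERSON": "Mary",  "SALES": 9000},
    {"QTR": "Q1", "SALES_PERSON": "Alice", "SALES": 8000},
    {"QTR": "Q1", "SALES_PERSON": "Ron",   "SALES": 7000},
]
top3 = rank_top_n(sales, lambda r: r["QTR"], lambda r: r["SALES"], 3)
print([(r["RANKINDEX"], r["SALES_PERSON"]) for r in top3])
# [(1, 'Sam'), (2, 'Mary'), (3, 'Alice')]
```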


→ The RANKINDEX is an output-only port. You can pass the rank index to another transformation in the mapping or directly to a target. → We cannot edit or delete the RANKINDEX port. Rank Cache : there are two types of caches here : the Rank Index Cache and the Rank Data Cache. Rank Index Cache : The index cache holds group information from the group-by ports. If we are grouping by DEPTNO, this cache stores the values 10, 20, 30, etc. All group-by columns are in the rank index cache, e.g., DEPTNO. Rank Data Cache : The data cache holds row data until the PowerCenter Server completes the ranking and is generally larger than the index cache. To reduce the data cache size, connect only the necessary input/output ports to subsequent transformations. Note : All variable ports (if any), the Rank port, and all ports going out of the Rank transformation are stored in the rank data cache. Mapping Flow Diagram : ( the target table RANK_EMP has the fields EmpNo, Ename, Sal, DeptNo, Rank ) → Steps 1, 2 and 3 are the same here also. Step – 4 : Create and import the target table as per the above Mapping Flow Diagram using SQL*Plus. Step – 5 : Creating Mapping : Go to the Designer tool → click on the Mapping Designer icon → go to the Mappings menu → click on Create → a new window will open; give the mapping name there as :

Click on Ok → drag and drop the source EMP table into the mapping workspace as shown below :

Mapping flow : EMP (source) → SQ_EMP → RANK T/F (top 3 ranks) → RANK_EMP (target)


→ Go to the Transformations menu → click on Create → select the type as RANK and name it T_RANK as shown below :

→ Click on Create → click on Done → from the Source Qualifier, drag and drop EmpNo, Sal, DeptNo into the Rank transformation table, i.e., T_RANK, as shown below :

→ Double click on the header of the Rank transformation table → a new window will open; in it click on the Ports tab → enable the Rank checkbox, i.e., R[], of the SAL port, marked in brown below, because we are finding the rank based on the salary :


→ Click on the Properties tab in the same window → set the Top/Bottom attribute to Top → set the Number of Ranks option to 3; the window is as shown below :


→ Click on Ok → drag and drop the target table, i.e., RANK_EMP, into the workspace → connect RANKINDEX of the Rank transformation table to the Rank field in the target table, and connect all the remaining ports from T_RANK to the corresponding ports of the target table as shown below :

→ The remaining process of creating the Session, creating the Workflow, and running the Workflow from the Task is the same here also. → After the Workflow has run successfully, go to the target table and observe whether the Rank is correctly populated. ROUTER TRANSFORMATION ************************************ → It is an Active and Connected transformation. → The Router transformation is useful to load multiple targets and handle multiple conditions. → It is similar to the Filter transformation. The only difference is that the Filter transformation drops the data that does not meet the condition, whereas the Router has an option to capture the data that does not meet the condition. It is useful to test multiple conditions. It has input, output, and default groups. → If you need to test the same input data based on multiple conditions, use a Router transformation in a mapping instead of creating multiple Filter transformations to perform the same task. → When you use a Router transformation in a mapping, the Integration Service processes the incoming data only once. When you use multiple Filter transformations in a mapping, the Integration Service processes the incoming data once for each transformation, which affects performance.


Configuring the Router Transformation : The Router transformation has input and output groups, which you need to configure. 1) Input group : The Designer copies the input port properties to create a set of output ports for each output group. 2) Output groups : The Router transformation has two types of output groups : user-defined groups and the default group. 2-i) User-defined groups : Create a user-defined group to test a condition based on the incoming data. Each user-defined group consists of output ports and a group filter condition. You can create or modify the user-defined groups on the Groups tab. Create one user-defined group for each condition you want to specify. 2-ii) Default group : The Designer creates one default group when you create your first user-defined group. You cannot edit or delete the default group. The default group does not have a group filter condition. If all the conditions evaluate to FALSE, the Integration Service passes the row to the default group. Specifying the Group Filter Condition : Specify the group filter condition on the Groups tab using the Expression Editor. You can enter any expression that returns a single value. The group filter condition returns TRUE or FALSE for each row that passes through the transformation. Mapping Flow Diagram : ( both target tables have the fields EmpNo, Ename, Job, Sal, DeptNo )

Mapping flow : EMP (source) → SQ_EMP (Source Qualifier) → ROUTER T/F → R_SAL_UPTO_2500 (Target 1) and R_SAL_AFTER_2500 (Target 2)
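The routing described above, one user-defined group (SAL_UPTO_2500, condition SAL <= 2500) plus a default group that catches everything else, can be sketched in plain Python. This is an analogy of the group semantics, not PowerCenter; the sample rows are illustrative.

```python
# Sketch of a Router transformation: each row is tested against every
# user-defined group's condition; rows matching no condition fall into
# the DEFAULT group. The input is processed only once, unlike a chain
# of Filter transformations.

def router(rows, groups):
    """groups: dict of group name -> condition function."""
    out = {name: [] for name in groups}
    out["DEFAULT"] = []
    for row in rows:
        matched = False
        for name, condition in groups.items():
            if condition(row):          # a row can match several groups
                out[name].append(row)
                matched = True
        if not matched:
            out["DEFAULT"].append(row)  # all conditions FALSE
    return out

rows = [{"ENAME": "SMITH", "SAL": 800}, {"ENAME": "KING", "SAL": 5000}]
out = router(rows, {"SAL_UPTO_2500": lambda r: r["SAL"] <= 2500})
print([r["ENAME"] for r in out["SAL_UPTO_2500"]])  # ['SMITH']
print([r["ENAME"] for r in out["DEFAULT"]])        # ['KING']
```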


→ Create the two tables R_SAL_UPTO_2500 and R_SAL_AFTER_2500 from SQL*Plus as : CREATE TABLE R_SAL_UPTO_2500 ( Eno Number(5), Ename Varchar2(10), Job Varchar2(10), Sal Number(6,2), DeptNo Number(10,2) ); CREATE TABLE R_SAL_AFTER_2500 ( Eno Number(5), Ename Varchar2(10), Job Varchar2(10), Sal Number(6,2), DeptNo Number(10,2) ); Creating Mapping : → Import the source and target tables into the workspace window. Go to the Mapping Designer workspace → go to the Mappings menu → click on Create; the window below will open, and give the mapping name as m_ROUTER as below :

Click on Ok → drag and drop the source table and the two target tables into the workspace window; the mapping window will look as below :


Now go to the Transformations menu → click on Create; a new window will open as below :

Click on Create → click on Done → copy and drag all the columns from the Source Qualifier table to the Router transformation table as per below :

Observe that in the Router transformation we find the INPUT, NEWGROUP1, and DEFAULT labels, marked by the red rectangle above. → Double click on the header of the Router transformation → a new "Edit Transformations" window will open → in it select the Groups tab, marked by the red rectangle → click on the "Add a new group to this Router" icon, also marked in red, as shown below :


After clicking on the "Add a new group to this Router" icon, the window will look as below :

Rename NEWGROUP1 to SAL_UPTO_2500 → under the Group Filter Condition field, click on the down arrow mark, circled in red → a new Expression Editor window will open as below :


Delete that default TRUE from the workspace à go to Ports Tab à double click on SAL field à type from the keyboard as <= 2500 à click on Ok à click on Ok à click on Ok. Now connect from NEWGROUP1 section fields to one target table and DEFAULT1 section fields to another Target table as shown in below :

The remaining process of creating the session task and creating the workflow is the same as before.
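The routing logic configured above can be sketched in Python. This is an illustrative simulation, not Informatica code; the sample rows and the `route` helper are assumptions made up for the example, and only the group condition SAL <= 2500 comes from the mapping.

```python
# Illustrative simulation (not Informatica code) of how the Router
# transformation evaluates each group filter condition per input row.
rows = [
    {"ENO": 7369, "ENAME": "SMITH", "SAL": 800.0},
    {"ENO": 7839, "ENAME": "KING", "SAL": 5000.0},
    {"ENO": 7782, "ENAME": "CLARK", "SAL": 2450.0},
]

# Group filter conditions, in the order defined on the Groups tab.
groups = {"SAL_UPTO_2500": lambda r: r["SAL"] <= 2500}

def route(rows, groups):
    """Send each row to every group whose condition it meets;
    rows matching no group fall into the DEFAULT group."""
    out = {name: [] for name in groups}
    out["DEFAULT"] = []
    for r in rows:
        matched = False
        for name, cond in groups.items():
            if cond(r):
                out[name].append(r)
                matched = True
        if not matched:
            out["DEFAULT"].append(r)
    return out

routed = route(rows, groups)
```

Each group here corresponds to one target table: SAL_UPTO_2500 feeds R_SAL_UPTO_2500, and DEFAULT feeds R_SAL_AFTER_2500.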


After the workflow runs and succeeds, go to the database and query the two target tables: one table should hold all employees with salaries up to 2500, and the other all employees with salaries above 2500.

Advantages of Using Router over Filter Transformation :
Use a Router transformation to test multiple conditions on the same input data. If you use more than one Filter transformation, the Integration Service must process the input once for each Filter. With a Router transformation, the Integration Service processes the input data only once, thereby improving performance.

SORTER TRANSFORMATION
**************************************
→ It is an Active and Connected type of transformation.
→ It is useful to sort data according to the specified sort key, in either ascending or descending order, based on the requirement.
→ You can specify multiple columns as sort keys and the order (ascending or descending) for each column.

Properties :-
Distinct Output Rows --- Outputs distinct records if you check this option. If enabled, the transformation may not return all of the rows it receives as input, which is why the Sorter is classed as an Active transformation.
Null Treated Low --- You can configure the way the Sorter transformation treats null values. Enable this property if you want the Integration Service to treat null values as lower than any other value when it performs the sort operation. Disable it if you want null values treated as higher than any other value.
Case Sensitive --- Determines whether the Integration Service considers case when sorting data. When you enable this property, the Integration Service sorts uppercase characters higher than lowercase characters.

Work Directory --- You must specify a work directory that the Integration Service uses to create temporary files while it sorts data. After the sort completes, it deletes the temporary files. You can specify any directory on the Integration Service machine. By default, the Integration Service uses the value of the $PMTempDir process variable.
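The sort-key options described above can be sketched in Python. This is an illustrative simulation, not Informatica code; the sample rows and the `sort_key` helper are invented for the example, and only the "Null Treated Low" behavior comes from the text.

```python
# Illustrative sketch (not Informatica code) of Sorter behavior:
# sort on DEPTNO then SAL, with Null Treated Low enabled so that
# null salaries sort lower than any other value.
rows = [
    {"DEPTNO": 20, "SAL": 3000},
    {"DEPTNO": 10, "SAL": None},
    {"DEPTNO": 10, "SAL": 2450},
]

def sort_key(row, null_low=True):
    # With Null Treated Low, nulls rank before every real value
    # within their group; disabling it would rank them after.
    sal = row["SAL"]
    null_rank = 0 if (sal is None and null_low) else 1
    return (row["DEPTNO"], null_rank, sal if sal is not None else 0)

sorted_rows = sorted(rows, key=sort_key)
```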


Sorter Cache Size :- The maximum amount of memory used for the sort. In version 8 it is set to Auto; in version 7 it is 8 MB. If the Integration Service cannot allocate enough memory, the session fails. If it needs more memory, it pages the data to the Work Directory and writes a warning in the log file.

Better Performance : Sort the data with a Sorter transformation before passing it to an Aggregator or Joiner transformation. Because the data is sorted, the Integration Service uses memory to do the aggregate and join operations and does not use cache files to process the data.

Mapping Flow Diagram : ( here, the target table has the same fields as the source table )

→ Create the target table for the mapping flow diagram with the name S_EMP as below :

CREATE TABLE S_EMP AS SELECT * FROM EMP WHERE 1=2 ;

This gives the same structure as the source table, EMP.

Creating mapping : in the Mapping Designer, go to the Mappings menu → click on Create; a small window opens as below :

Click on OK → drag and drop the source and target tables into the workspace window → go to the Transformation menu → click on Create; a small window opens as :

EMP (source) → SQ_EMP → SORTER T/F (based on DeptNo) → S_EMP (target)


Click on Create → click on Done; the Mapping Designer window will look as :

Now select and drag all the columns from the Source Qualifier to the Sorter transformation; the mapping looks like below :

→ Now, double-click the header of the Sorter transformation → click the Ports tab → enable the checkbox under the Key field for DEPTNO, marked by the red rectangle as shown below :


Click OK → click Save in the Repository menu. Creating the session and the workflow is the same as before. After running the workflow, go to the database and check the data.

Performance Tuning :

1) While using the Sorter transformation, configure the sorter cache size to be larger than the input data size.

2) At the Sorter transformation, use hash auto-keys partitioning or hash user-keys partitioning.


AGGREGATOR TRANSFORMATION
******************************************
→ It is an Active and Connected type of transformation.
→ It is used to perform calculations such as sums, averages, and counts on groups of data.
→ The Integration Service stores the group and row data in the aggregate cache. The Aggregator transformation provides more advantages than plain SQL: you can use conditional clauses to filter rows.

Components and Options of Aggregator Transformation :

Aggregate Cache : The Integration Service stores data in the aggregate cache until it completes the aggregate calculations. It stores group values in an index cache and row data in the data cache.

Aggregate Expression : Enter an expression in an output port or variable port. The expression can include non-aggregate expressions and conditional clauses.

Group By Port : This tells the Integration Service how to create groups. You can configure input, input/output, or variable ports for the group. The Integration Service performs the aggregate calculations and produces one row for each group. If you do not specify any group by ports, it returns one row for all input rows. By default, the Integration Service returns the last row received for each group, along with the result of the aggregation. Using the FIRST function, you can make it return the first row of the group instead.

Sorted Input : This option can be used to improve session performance. You can use it only when the input to the Aggregator transformation is sorted on the group by ports.

Note : By default, the Integration Service treats null values as NULL in aggregate functions. You can change this by configuring the Integration Service.

→ When you run a session that uses an Aggregator transformation, the Integration Service creates an index cache and a data cache in memory to process the transformation. If it requires more space, it stores overflow values in cache files.
Aggregator Index Cache : The index cache value can be increased up to 24 MB. The index cache holds group information from the group by ports. If we group by DEPTNO, this cache stores the values 10, 20, 30, and so on.


Data Cache : The data cache is generally larger than the aggregator index cache. Columns in the data cache :
i) variable ports, if any
ii) non-group-by input/output ports
iii) non-group-by input ports used in a non-aggregate output expression
iv) ports containing aggregate functions

→ It can also include one aggregate function nested within another aggregate function, such as :

MAX ( COUNT ( ITEM ) )

Nested Aggregate Functions : You can include multiple single-level or multiple nested functions in different output ports of an Aggregator transformation. However, you cannot mix single-level and nested functions in the same Aggregator transformation: if any output port contains a single-level function, you cannot use a nested function in any other port, and the Designer marks the mapping or mapplet invalid. If you need both, create separate Aggregator transformations.

Null Values in Aggregate Functions : When you configure the Integration Service, you can choose how it handles null values in aggregate functions: treat them as NULL or as zero. By default, the Integration Service treats null values as NULL.

Mapping Flow Diagram :

Best Practice : keep the sort keys of the Sorter transformation in the same order as the group by ports of the Aggregator transformation.

EMP (source) → SQ_EMP → SORTER T/F → AGGREGATOR T/F → AGG_EMP (target)
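The group-by behavior, and the conditional clause the Aggregator supports, can be sketched in Python. This is an illustrative simulation, not Informatica code; the sample rows and the `aggregate` helper are invented for the example.

```python
# Illustrative sketch (not Informatica code) of Aggregator behavior:
# one output row per group-by key, with an optional conditional
# clause, as in the expression SUM(SAL, SAL < 3000).
rows = [
    {"DEPTNO": 10, "SAL": 2450},
    {"DEPTNO": 10, "SAL": 5000},
    {"DEPTNO": 20, "SAL": 800},
]

def aggregate(rows, group_port, cond=lambda r: True):
    """Sum SAL per group; `cond` plays the role of the conditional
    clause, excluding rows that fail it from the sum."""
    totals = {}
    for r in rows:
        key = r[group_port]
        totals.setdefault(key, 0)  # a group exists even if no row passes
        if cond(r):
            totals[key] += r["SAL"]
    return totals

dept_wise_sal = aggregate(rows, "DEPTNO")
cond_sal = aggregate(rows, "DEPTNO", cond=lambda r: r["SAL"] < 3000)
```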


→ The Aggregator transformation supports conditional sums, e.g. SUM( sal, sal < 3000 ), which sums only the salaries below the 3000 margin.
→ Only two levels of nesting are supported in aggregate functions: MAX( COUNT( ITEM ) ) is valid, but MIN( MAX( COUNT( ITEM ) ) ) is not.

Process : Create the target table per the mapping flow diagram using SQL*Plus as below :

CREATE TABLE AGG_EMP
( DEPTNO NUMBER,
  DEPT_WISE_SAL NUMBER ) ;

Import this table into the Informatica Designer tool as a target. Go to the Mappings menu → click on Create; a small window opens as :

Click on OK → drag and drop the source and target tables into the workspace as shown below :

Go to the Transformation menu → click on Create; a small window opens as :


Click on Create → click on Done. Drag and drop DeptNo and Sal from the Source Qualifier into the Sorter transformation as shown below :

Now, double-click the header of the Sorter transformation; the Edit Transformations window opens → click the Ports tab, as shown below :


Enable the checkbox under K (Key) for the DeptNo port, marked in red → click OK. Go to the Transformation menu → click on Create; a small window opens as :

Click on Create → click on Done → copy and drag DeptNo and Sal from the Sorter transformation to the Aggregator transformation as shown below :

Double-click the Aggregator transformation → click the Ports tab → enable the checkbox under the GroupBy column for the DeptNo port, as shown → click OK.


Select the Sal column in the figure above → click the Add New Port icon (marked by the green circle) → a new column is added under Port Name; rename it DEPT_WISE_SAL → disable the Input checkbox for the new port → under its Expression field type SUM(sal), as below :


Now, go to the Properties tab of the same window → enable the checkbox for the Sorted Input field, marked in red.

Click OK. Now connect DeptNo to DeptNo and Dept_Wise_Sal to Dept_Wise_Sal from the Aggregator transformation to the target table as shown below :

Scenario-1 : Load the target table using the conditional SUM feature in the same mapping. Solution : all steps remain the same; the only change is in the Edit Transformations window of the Aggregator transformation: under the


expression field for the DEPT_WISE_SAL port, type SUM( sal, sal < 2500 ), marked by the red rectangle as shown below :

The remaining process of creating the session and the workflow is the same. Now go to the database and query the target table :

SELECT * FROM AGG_EMP;

We get a result like :

Deptno   Dept_Wise_Sal
-------- -------------
10       1250
20       3480

That is, it sums the salaries department-wise, counting only salaries less than 2500.

Performance Tuning Tips :
1) Use sorted input to decrease the use of aggregate caches : Sorted input reduces the amount of data cached during the session and improves session performance. Use this option together with a Sorter transformation to pass sorted data to the Aggregator transformation.
2) Limit connected input/output or output ports : This reduces the amount of data the Aggregator transformation stores in the data cache.


3) Filter the data before aggregating it : If you use a Filter transformation in the mapping, place it before the Aggregator transformation to avoid unnecessary aggregation.

LOOK-UP TRANSFORMATION
**************************************
→ It is a Passive and Connected / Unconnected type of transformation.
→ A connected Lookup transformation receives source data, performs a lookup, and returns data to the pipeline.
→ Cache the lookup source to improve performance. If you cache the lookup source, you can use a dynamic or static cache. By default, the lookup cache is static and does not change during the session. With a dynamic cache, the Integration Service inserts or updates rows in the cache.
→ When you cache the target table as the lookup source, you can look up values in the cache to determine whether they exist in the target. The Lookup transformation marks rows to insert into or update the target.

Connected Look-Up Transformation : The following steps describe how the Integration Service processes a connected Lookup transformation :
1) A connected Lookup transformation receives input values directly from another transformation in the pipeline.
2) For each input row, the Integration Service queries the lookup source or cache, based on the lookup ports and the condition in the transformation.
3) If the transformation is uncached or uses a static cache, the Integration Service returns values from the lookup query.
4) If the transformation uses a dynamic cache, the Integration Service inserts the row into the cache when it does not find it there. When it finds the row in the cache, it updates the row or leaves it unchanged, and flags the row as insert, update, or no change.
5) The Integration Service passes the return values from the query to the next transformation. If the transformation uses a dynamic cache, you can pass the rows to a Filter or Router transformation to route new rows to the target.
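The connected, statically cached lookup described in the steps above can be sketched in Python. This is an illustrative simulation, not Informatica code; the DEPT sample rows and the `lookup_dname` helper are invented for the example.

```python
# Illustrative sketch (not Informatica code) of a connected lookup
# with a static cache: the cache is built once from the lookup
# source, then each input row is probed on the lookup condition.
dept_source = [
    {"DEPTNO": 10, "DNAME": "ACCOUNTING"},
    {"DEPTNO": 20, "DNAME": "RESEARCH"},
]
cache = {d["DEPTNO"]: d["DNAME"] for d in dept_source}  # static cache

def lookup_dname(deptno):
    # A row with no match returns NULL (None) for the lookup port.
    return cache.get(deptno)

results = [lookup_dname(d) for d in (10, 20, 99)]
```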


Mapping Flow Diagram : ( the target table has the fields DeptNo, Dname, EmpNo, Sal )
→ Create the target table per the mapping flow diagram. So far we have created target tables from SQL*Plus; now we will see how to create the table from the Designer itself :
Go to the Designer tool → click the Target Designer icon on the toolbar.

Go to the Targets menu → click on Create; a small window opens as below :

Click on Create → click on Done. An empty table appears in the workspace as below :

EMP (source table) → SQ_EMP → Look-Up T/F → LKP_EMP (target table)


Double-click the header of the empty table; another window opens, as below :

In that window, go to the Columns tab → click the Add New Column icon (marked in red). Our target table has 4 ports, so click the icon 4 times; the window then looks as below :


Rename the columns per our target table in the mapping flow diagram and select the corresponding datatypes; the final window looks like :


Click OK; the target table appears in the workspace as below :

Now the source and target tables are ready and imported into the Designer workspace. Go to the Mappings menu → click on Create and give the name as in the window below :

Click on OK → drag and drop the source and target tables into the mapping workspace as below :

Go to the Transformation menu → click on Create → in the window that opens, select Lookup as the type and T_LKP as the name :


Click on Create → another new window opens as below :

In the above window, click the Import button; two options pop up as shown below :


Of those two options → click From Relational Table → the Import Tables window opens as shown below → provide the ODBC connection details for the source and click Connect → under the Select Tables sub-block, select the DEPT table and click OK.

Click Done on the Create Transformation window. The mapping window now looks like :


Connect DeptNo, EmpNo, Sal from SQ_EMP to the target table LKP_EMP → drag and drop DeptNo from SQ_EMP to the T_LKP table and connect Dname from T_LKP to the LKP_EMP table as shown below :

→ Double-click the header of the T_LKP table → a new window opens → click the Condition tab → click the Add a New Condition icon (marked by the red circle) → make sure the Lookup Table Column is DEPTNO and the Transformation Port is DEPTNO1 :

→ click on OK.


→ Now connect DNAME from the lookup (T_LKP) to the target (LKP_EMP) as below :

The remaining process of creating the session and the workflow is the same, except that in the session properties we must map the Lookup transformation's connection to the source ODBC connection, just as the source table is mapped to the source ODBC connection and the target table to the target ODBC connection.

Note : This is the only transformation whose transformation table must be mapped to an ODBC connection in the session properties window. In other words, only the Look-Up transformation talks to the database directly; no other transformation does.

UNCONNECTED LOOK-UP TRANSFORMATION
******************************************************
→ An unconnected Lookup transformation is not connected to a source or target. A transformation in the pipeline calls it with a :LKP expression, and the unconnected Lookup transformation returns one column to the calling transformation.
→ The following steps describe how the Integration Service processes an unconnected Lookup transformation :
1) An unconnected Lookup transformation receives input values from the result of a :LKP expression in another transformation, such as an Update Strategy transformation.
2) The Integration Service queries the lookup source or cache based on the lookup ports and condition in the transformation.


3) The Integration Service returns one value into the return port of the Lookup transformation.
4) The Lookup transformation passes the return value into the :LKP expression.

Mapping Flow Diagram : ( the target table LKP_EMP has the fields DeptNo, Dname, EmpNo, Sal )
→ The unconnected lookup is not connected in the mapping pipeline. It can be called from any transformation, and it supports expressions.
→ Using an unconnected lookup, we get only one output port.
→ Create and import the target table into the Target Designer window per the mapping flow diagram (exactly the same procedure as above) → go to the Mappings menu → click on Create → name it m_UnConnected_Lookup → click OK.

→ Drag and drop the source EMP table into the mapping workspace.
→ Go to the Transformation menu → click on Create → select Expression as the transformation type and name it T_EXP → click on Create → click on Done, as below :

EMP (source table) → SQ_EMP → Expression T/F → LKP_EMP (target table), with the unconnected Look-Up T/F called from the Expression transformation


→ From the SQ_EMP table, drag and drop DeptNo, EmpNo, Sal into the Expression transformation (T_EXP) as below :

→ Double-click the Expression transformation → click the Ports tab → select the DeptNo column → click the Add New Port icon → name it DNAME → disable its Input (I) port → click OK, as shown below :


→ Connect all the columns from the Expression transformation to the target table as below :

→ Go to the Transformation menu → click on Create → select Look-Up as the type and name it T_LookUp as below :


→ Click on Create → a new window opens as below :

→ Click the Source button → select the DEPT table from the sub-window → click OK. If there is no DEPT table here, first import it from the database → click OK → click Done.


→ Double-click the Look-Up transformation → click the Ports tab → click Add New Port → name it IN_DEPTNO with datatype Decimal → disable its Output (O) port as below :

→ In the same window, click the Condition tab → click the Add a New Condition icon → make sure the Lookup Table Column is DEPTNO and the Transformation Port is IN_DEPTNO → click OK.


→ Double-click the Expression transformation → click the DNAME expression icon as below and provide the expression : :LKP.T_LOOKUP( DeptNo )
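The :LKP call behaves like a function invoked from an expression: one input, one return port. The Python sketch below is an illustrative simulation, not Informatica code; the DEPT cache contents and the `lkp_t_lookup` function name are invented for the example.

```python
# Illustrative sketch (not Informatica code): an unconnected lookup
# is called from an expression, :LKP.T_LOOKUP(DeptNo), and returns
# exactly one value through its return port.
dept_cache = {10: "ACCOUNTING", 20: "RESEARCH", 30: "SALES"}

def lkp_t_lookup(in_deptno):
    """Stands in for :LKP.T_LOOKUP(DeptNo): one input port
    (IN_DEPTNO), one return port (DNAME)."""
    return dept_cache.get(in_deptno)

# The Expression transformation calls it once per row to fill DNAME.
dname = lkp_t_lookup(20)
```

Because it is called like a function, the same unconnected lookup can be invoked from any number of expressions, which is what makes it reusable across mapping flows.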


Click OK → click OK again → save it.

Note : If we have more than one mapping flow, we can call the same unconnected Look-Up again.

Performance Tuning Tips :
1) Add an index to the columns used in a lookup condition : If you have privileges to modify the database containing the lookup table, you can improve performance for both cached and uncached lookups. This is important for very large lookup tables. Since the Integration Service needs to query, sort, and compare values in these columns, the index should include every column used in a lookup condition.
2) Place conditions with an equality operator (=) first : If you include more than one lookup condition, place the conditions in the following order to optimize lookup performance :
Equal to (=)
Less than (<), greater than (>), less than or equal to (<=), greater than or equal to (>=)
Not equal to (!=)
3) Cache small lookup tables : Improve session performance by caching small lookup tables. The result of the lookup query and processing is the same whether or not you cache the lookup table.
4) Join tables in the database : If the lookup table is on the same database as the source table in the mapping and caching is not feasible, join the tables in the source database rather than using a Lookup transformation.
5) Use a persistent lookup cache for static lookups : If the lookup source does not change between sessions, configure the Lookup transformation to use a persistent lookup cache. The Integration Service then saves and reuses cache files from session to session, eliminating the time required to read the lookup source.


6) Call unconnected Lookup transformations with the :LKP reference qualifier : When you write an expression using the :LKP reference qualifier, you call unconnected Lookup transformations only. If you try to call a connected Lookup transformation this way, the Designer displays an error and marks the mapping invalid.
7) Configure a pipeline Lookup transformation to improve performance when processing a relational or flat-file lookup source : You can create partitions to process a relational or flat-file lookup source when you define the lookup source as a source qualifier. Configure a non-reusable pipeline Lookup transformation and create partitions in the partial pipeline that processes the lookup source.

SEQUENCE GENERATOR TRANSFORMATION
*******************************************************
Consider the following OLTP and OLAP tables :

In the first table Enum acts as the primary key, whereas in the second table the primary key is W_ID (warehouse id), also called a surrogate key.

Surrogate Key : the key that acts as a primary key in the data warehouse.

→ The Sequence Generator is useful to generate the surrogate key; it generates sequential numbers to use as surrogate key values.
→ It is a Passive and Connected type of transformation. It contains two output ports that you can connect to one or more transformations.
→ The Integration Service generates a block of sequence numbers each time a block of rows enters a connected transformation. If you connect CURRVAL, the Integration Service processes one row in each block.
→ When NEXTVAL is connected to the input port of another transformation, the Integration Service generates a sequence of numbers. When CURRVAL is connected to

OLTP Data :
Enum   Location   Year
101    Hyd        2002
101    Chennai    2006
101    Banglore   2011

OLAP Data :
W_ID   Enum   Location   Year
1      101    Hyd        2002
2      101    Chennai    2006
3      101    Banglore   2011


the input port of another transformation, the Integration Service generates the NEXTVAL value plus the Increment By value.

→ You can make a Sequence Generator reusable and use it in multiple mappings. You might reuse a Sequence Generator when you perform multiple loads to a single target.

For example, if you have a large input file that you separate into three sessions running in parallel, use a Sequence Generator to generate primary key values. If you use different Sequence Generators, the Integration Service might generate duplicate key values. Instead, use the reusable Sequence Generator for all three sessions to provide a unique value for each target row.

→ You can complete the following tasks with a Sequence Generator transformation :
1) Create keys
2) Replace missing values
3) Cycle through a sequential range of numbers

Creating keys is straightforward; we will see it in the mapping below.

Replace Missing Values : Use the Sequence Generator transformation to replace missing keys by using NEXTVAL with the IIF and ISNULL functions. For example, to replace null values in the ORDER_NO column, create a Sequence Generator transformation and drag its NEXTVAL port to an Expression transformation. In the Expression transformation, drag in the ORDER_NO port along with any other necessary ports. Then create an output port, ALL_ORDERS, with the following expression to replace null orders :

IIF ( ISNULL ( ORDER_NO ), NEXTVAL, ORDER_NO )
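The NEXTVAL port and the missing-key replacement above can be sketched in Python. This is an illustrative simulation, not Informatica code; the `make_nextval` helper and the sample ORDER_NO values are invented for the example, and the Start Value and Increment By defaults are assumptions.

```python
# Illustrative sketch (not Informatica code): NEXTVAL emits the next
# number in the sequence each time a row passes through, and the
# expression IIF(ISNULL(ORDER_NO), NEXTVAL, ORDER_NO) substitutes
# a generated value only where the key is null.
def make_nextval(start=1, increment_by=1):
    state = {"current": start - increment_by}
    def nextval():
        state["current"] += increment_by
        return state["current"]
    return nextval

nextval = make_nextval()
order_nos = [101, None, 103, None]
# IIF(ISNULL(ORDER_NO), NEXTVAL, ORDER_NO), evaluated per row:
all_orders = [o if o is not None else nextval() for o in order_nos]
```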

→ The Sequence Generator transformation has two output ports, NEXTVAL and CURRVAL. You cannot edit or delete these ports, nor add ports to the transformation.

Mapping Flow Diagram : ( fields of the target table : Row_Id, EmpNo, Ename, Sal )

EMP (source table) → Sequence Generator T/F → SEQ_EMP (target table)


→ Create the target table SEQ_EMP. Go to SQL*Plus and create it as below :

CREATE TABLE SEQ_EMP
( Row_Id NUMBER,
  EmpNo NUMBER,
  Ename VARCHAR2(20),
  Sal NUMBER )

→ Import that target table into the Designer tool → go to the Mappings menu → click on Create and name the mapping, as below :

Click on OK → drag and drop the source and target tables into the mapping workspace as below :

→ Connect EmpNo and Sal from SQ_EMP to the target SEQ_EMP table as below :


→ Go to the Transformation menu → click on Create → select Sequence Generator as the type and name it t_Seq as below :

→ Click on Create → click on Done; the mapping workspace will look as :

→ Connect NEXTVAL of the transformation to Row_Id of the target table as below :


→ Save the mapping. The remaining process of creating the session and the workflow is the same; run the workflow and observe the target table.

JOINER TRANSFORMATION
*************************************
→ It is an Active and Connected type of transformation.
→ Use the Joiner transformation to join source data from two related heterogeneous sources residing in different locations or file systems. You can also join data from the same source. The Joiner transformation joins sources with at least one matching column.

Note : The Joiner transformation does not match null values. For example, if both EMP_ID1 and EMP_ID2 contain a row with a null value, the Integration Service does not consider them a match and does not join the two rows. To join rows with null values, replace the null input with default values, and then join on the default values.

→ The Joiner transformation supports the following types of joins :

1) Normal 2) Master Outer 3) Detail Outer 4) Full Outer

Note: A normal or master outer join performs faster than a full outer or detail outer join.


Normal Join : With a normal join, the Integration Service discards all rows of data from the master and detail source that do not match, based on the condition.

Master Outer : A master outer join keeps all rows of data from the detail source and the matching rows from the master source. It discards the unmatched rows from the master source.

Detail Outer : A detail outer join keeps all rows of data from the master source and the matching rows from the detail source. It discards the unmatched rows from the detail source.

Full Outer Join : A full outer join keeps all rows of data from both the master and detail sources.
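The four join types can be sketched in Python, following the text's convention that a master outer join keeps all detail rows while a detail outer join keeps all master rows, and that null keys never match. This is an illustrative simulation, not Informatica code; the sample master/detail rows and the `join` helper are invented for the example.

```python
# Illustrative sketch (not Informatica code) of Joiner join types.
master = [{"DEPTNO": 10, "DNAME": "ACCOUNTING"},
          {"DEPTNO": 40, "DNAME": "OPS"}]
detail = [{"DEPTNO": 10, "ENAME": "CLARK"},
          {"DEPTNO": 30, "ENAME": "ALLEN"}]

def join(master, detail, join_type="normal"):
    master_keys = {m["DEPTNO"] for m in master if m["DEPTNO"] is not None}
    detail_keys = {d["DEPTNO"] for d in detail if d["DEPTNO"] is not None}
    out = []
    for d in detail:                       # matched rows (all join types)
        for m in master:
            if d["DEPTNO"] is not None and d["DEPTNO"] == m["DEPTNO"]:
                out.append({**m, **d})
    if join_type in ("master_outer", "full_outer"):  # keep unmatched detail
        out += [d for d in detail if d["DEPTNO"] not in master_keys]
    if join_type in ("detail_outer", "full_outer"):  # keep unmatched master
        out += [m for m in master if m["DEPTNO"] not in detail_keys]
    return out

counts = {t: len(join(master, detail, t))
          for t in ("normal", "master_outer", "detail_outer", "full_outer")}
```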

→ Generally, if we join two source tables, we get two Source Qualifiers, one per source, combined and pointed at the target. But in this case both sources come from the same database (Oracle), so we do not need a separate Source Qualifier for each source table: we delete one Source Qualifier, map both sources to the remaining one, and connect it to the target.

Joining Two Tables Using a Single Source Qualifier :
Mapping Flow Diagram : ( here the target table has the fields EmpNo, Dname, Job, Sal, DeptNo )

With one Source Qualifier :
EMP (Oracle database) + DEPT (Oracle database) → SQ_EMPDEPT → JOINER_EMP (target table)

With two Source Qualifiers :
EMP (Oracle database) → SQ_EMP and DEPT (Oracle database) → SQ_DEPT → JOINER_EMP (target table)


→ A Normal join in Informatica is equivalent to an equi-join at the Oracle database level.
→ A Master Outer join gives all the records of the detail table (here, EMP), not all the records of the master table (here, DEPT).

Side note on Oracle outer-join syntax :
e.deptno = d.deptno (+)  → a left outer join; it gives all the data of the employee table (e).
e.deptno (+) = d.deptno  → a right outer join; it gives all the data of the department table (d).

→ Create the target table per the mapping flow diagram. Go to SQL*Plus and create the table as below :

CREATE TABLE JOINER_EMP
( EmpNo Number,
  Dname Varchar2(20),
  Job Varchar2(20),
  Sal Number,
  DeptNo Number );

→ Go to the Mappings menu → click on Create → name it m_JOINER as below :

→ Drag and drop the source tables EMP and DEPT and the target table JOINER_EMP into the mapping workspace; the window will look like below :


→ In our case both source tables come from the same database, so we do not need two separate Source Qualifiers, as discussed earlier. Delete either one; here I delete SQ_DEPT, after which the window looks as below :

→ Double-click the header of the SQ_EMP table → a window opens → click the Rename button → a small window opens → rename it SQ_EMPDEPT for clarity, as shown below :


→ Click OK. → Drag and drop DeptNo, Dname, Loc from the DEPT table into the SQ_EMPDEPT table; the window then looks like :


→ Map the corresponding ports from the SQ_EMPDEPT table to the JOINER_EMP table as shown below :

→ Now double-click the SQ_EMPDEPT transformation → a new window opens → click the Properties tab → next to the SQL Query option, click the down-arrow, outlined in brown as shown below :


→ After clicking the down-arrow, another window opens as below :

In that provide ODBC datasource as : RRITEC_SOURCE Username as : SOURCE_USER Password as : SOURCE_U SER à then click on Generate SQL button à then automatically one query will popup into the workspace of SQL sub-window , which is marked by Green Color in below window. Then window will look like as below :

Page 84: 130297267 transformations

Click on Ok à again click on Ok. The Remain ing process to create Session , to create Workflow and running the Workflow is same here also. After ran the workflow, goto target table and preview the data whether it is populated correctly or not. Joining one Databse table and one Flat File : Silly N ote : whatever the table we drag and dropped into workspace as second attempt, then Inforamtica designer automatically treat that table as : MASTER à Open Notepad and type as : Deptno,Dname,Loc 10,Hr,Hyderabad 20,Sales,Chennai 30,Stock,Pune Click on Save with name as “ DEPT.txt “and place this notepad file at below location : F: \ Informatica \ Powercenter 8.6.0 \ Server \ Infa_Shared \ SrcFiles. à We had already created the Target table of JOINER_EMP, so you can use the sam e table in this scenario also : à Import the Source table and source Notepad file into Informatica Designer tool. à Goto Mapping Designer Window à click on Mappings menu à click on Create and name it as m_JOINER_EMP and drag and drop Source EMP, source Flat-File and target JOINER_EMP into the workspace as shown below :

→ Go to the Transformations menu → click on Create → select the transformation type as "Joiner" and give the name T_JOIN.

→ Click on Create → click on Done.

→ Drag and drop Ename, Sal and DeptNo from SQ_EMP, and Deptno and Dname from SQ_DEPT, to the T_JOIN transformation.

→ Double-click on the Joiner transformation; the Edit Transformations window will open → click on the Ports tab. Make sure that the source with the fewer records is treated as the Master, i.e. that its ports have the Master checkbox enabled (the Integration Service caches the master rows, so the smaller input should be the master).

→ Click on the Condition tab in the same window.

→ Click on the "Add a new condition" icon and make sure the condition compares the two department-number ports (DEPTNO = DEPTNO1).

→ Click on Ok → connect all the required ports from the Joiner transformation to the target table ports.

→ Click on Save.
→ Go to the PowerCenter Workflow Manager window → click on the Workflows menu → click on Create → name it w_Hetro_Join.

Click on Ok → go to the Tasks menu → click on Create and give the name "s_Hetro_Join".

→ Click on Create → a window will open in which we need to select our mapping.

→ Click on Ok → click on Done → then connect the Session and the Workflow using the Link Task.

→ Now double-click on the Session → the Edit Tasks window will open → click on the Mapping tab.

→ From the Navigator on the left side, click on SQ_EMP and map the corresponding Relational DB connection to SOURCE.

→ Similarly select SQ_DEPT from the left-side Navigator → its properties will appear on the right side → in that set
Source File Type : Direct
Source File Directory : $PMSourceFileDir\
Source File Name : DEPT.txt
→ Similarly select the target table from the left-side Navigator and map the corresponding Relational DB connection to TARGET → click on Ok → right-click on the Workflow, choose Start Workflow from Task, and then check the target table data.

Joining Two Flat Files using the Joiner Transformation :
→ Open Notepad and type the below data :
Deptno,Ename,Sal
10,Kishor,1000
20,Divya,2000
30,Venkat,3000
Save the file with the name EMP.txt.
→ Similarly, open another Notepad and type :
Deptno,Dname,Loc
10,Hr,Hyderabad
20,Sales,Chennai
30,Stock,Pune
Save the file with the name DEPT.txt.
→ Now place the two files in the below location :
F:\Informatica\Powercenter 8.6.0\Server\Infa_Shared\SrcFiles
→ Go to the Mapping Designer window → go to the Mappings menu → click on Create → give the mapping name m_Join_FlatFiles.

Click on Ok → now drag and drop the two source flat files and the target table into the mapping workspace.

→ Go to the Transformations menu → click on Create → select the transformation type as Joiner and give the name T_Join_FlatFiles.

→ Click on Create → click on Done → then drag and drop Deptno, Ename and Sal from SQ_EMP, and Deptno and Dname from SQ_DEPT, to the T_Join_FlatFiles transformation.

→ Double-click on the header of the Joiner transformation; the Edit Transformations window will open. Click on the Condition tab and make sure the condition is Deptno = Deptno1.

→ Click on Ok → then connect the corresponding ports from the Joiner transformation to the target table.

→ The remaining process (creating the Workflow, creating the Task and setting the Task properties) is the same as above → after running the workflow, go to the target table and check whether the data is populated properly.

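The logic this two-flat-file mapping implements can be sketched with the very data the tutorial types into Notepad (the dict-based inner join below is only an illustration of what the Joiner produces):

```python
import csv
import io

# The same EMP.txt / DEPT.txt contents created in Notepad above.
emp_txt = "Deptno,Ename,Sal\n10,Kishor,1000\n20,Divya,2000\n30,Venkat,3000\n"
dept_txt = "Deptno,Dname,Loc\n10,Hr,Hyderabad\n20,Sales,Chennai\n30,Stock,Pune\n"

emp = list(csv.DictReader(io.StringIO(emp_txt)))
dept = list(csv.DictReader(io.StringIO(dept_txt)))

# Joiner condition Deptno = Deptno1, normal (inner) join.
dept_by_no = {d["Deptno"]: d for d in dept}
joined = [
    {"Ename": e["Ename"], "Sal": int(e["Sal"]),
     "Deptno": int(e["Deptno"]), "Dname": dept_by_no[e["Deptno"]]["Dname"]}
    for e in emp if e["Deptno"] in dept_by_no
]
```

All three employees find a matching department, so three rows land in JOINER_EMP.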

Loading one Source Flat File into another Target Flat File :
→ In the Designer tool, click on the Target Designer icon → go to the Targets menu → click on Create → name it FlatFiles_Target and select the Database type as Flat File.

→ Click on Create → click on Done.
→ Double-click on the header of the new target file table → the Edit Tables window will open → click on the Columns tab.

→ Click on "Add a new column to this table" three times (because we need three fields in the target).

→ Name those fields Deptno, Ename and Sal and set the appropriate datatypes.

→ Click on Ok → click on the Mapping Designer icon → go to the Mappings menu → click on Create → give the mapping name "m_FlatFiles_Target".

→ Click on Ok → now drag and drop the source EMP.txt and the target FlatFiles_Target file into the mapping workspace and connect the ports.
→ Go to the Workflow Manager tool → go to the Workflows menu → click on Create → give the name "w_FlatFiles_Target" and click on Ok.
→ Create the session with the name "s_FlatFiles_Target" and click on Ok → now connect the Workflow and the Session using the Link Task → now run the Workflow.
→ After running the Workflow, go to the back-end location
F:\Informatica\Powercenter 8.6.0\Server\Infa_Shared\SrcFiles
and observe whether the target flat file has been created.

Loading a List of Flat Files into a Table Using the Indirect Method :
→ Open a Notepad and type the below data :
Deptno,Ename,Sal
10,Kishor,1000
20,Divya,2000
30,Shreya,3000
Save this file with the name EMP.txt.
→ Similarly, open one more Notepad and type :
Deptno,Ename,Sal
40,Swetha,4000
50,Sweety,5000
60,Alekhya,6000
Save this file with the name EMP1.txt.
Now place the two files in the below location :
C:\Informatica\PowerCenter8.6.0\server\infa_shared\SrcFiles

→ Similarly, open one more Notepad and type the paths of the above two files :
C:\Informatica\PowerCenter8.6.0\server\infa_shared\SrcFiles\EMP.txt
C:\Informatica\PowerCenter8.6.0\server\infa_shared\SrcFiles\EMP1.txt
Save this file with the name EMPLIST.txt and place it in the same SrcFiles location.
→ Open the Designer tool → import EMP as a source → create (using SQL*Plus) and import a target table with the name T_ListofFiles, having the fields Deptno, Ename, Sal.
→ Create a mapping with the name m_Load_ListofFiles. Drag and drop the source (EMP is enough) and the target into the workspace and connect the corresponding ports.

→ Go to the Workflow Manager → create a workflow with the name "w_T_ListofFiles" → create a task with the name "s_T_ListofFiles" → connect the Workflow and the Session → double-click on the Session → click on the Mapping tab → select SQ_EMP from the left-side Navigator; its properties will appear on the right side. In that set
Source File Type : Indirect
Source File Directory : $PMSourceFileDir\
Source File Name : EMPLIST.txt
Click on Ok → save it and run the workflow → go to the table T_ListofFiles and observe the data.

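What the Indirect source file type does can be sketched in a few lines of Python: the list file is read first, and each file named in it is then loaded in turn. The in-memory `files` dict stands in for the SrcFiles directory:

```python
import csv
import io

# Contents of the two data files from above; in the real run they live
# under ...\infa_shared\SrcFiles.
files = {
    "EMP.txt":  "Deptno,Ename,Sal\n10,Kishor,1000\n20,Divya,2000\n30,Shreya,3000\n",
    "EMP1.txt": "Deptno,Ename,Sal\n40,Swetha,4000\n50,Sweety,5000\n60,Alekhya,6000\n",
}
emplist = "EMP.txt\nEMP1.txt\n"   # the indirect file: one source file per line

rows = []
for name in emplist.splitlines():                          # walk the list file...
    rows.extend(csv.DictReader(io.StringIO(files[name])))  # ...and load each file
```

All six records end up in the one target table, which is exactly what the Indirect method buys over loading a single file.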

XML SOURCE QUALIFIER TRANSFORMATION
*****************************************************
→ It is an Active and Connected type of transformation.
→ Open Notepad and type the below data :
<EMP>
  <ROWS>
    <DEPTNO>10</DEPTNO>
    <ENAME>RamReddy</ENAME>
    <SAL>1000</SAL>
  </ROWS>
  <ROWS>
    <DEPTNO>20</DEPTNO>
    <ENAME>Venkat</ENAME>
    <SAL>2000</SAL>
  </ROWS>
  <ROWS>
    <DEPTNO>30</DEPTNO>
    <ENAME>Divya</ENAME>
    <SAL>3000</SAL>
  </ROWS>
</EMP>
→ Save the file with the name EMP.xml and place it in the below location :
C:\Informatica\PowerCenter8.6.0\server\infa_shared\SrcFiles
→ Open the Informatica Designer tool → click on the Source Analyzer icon in the toolbar → click on the Sources menu → click on "Import XML Definition" → a new window will open → navigate to our SrcFiles folder and select the newly created XML file.

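Before wiring the mapping, it can help to confirm what rows the XML above should yield. A quick Python sketch (xml.etree is only for illustration; Informatica does the parsing itself):

```python
import xml.etree.ElementTree as ET

# The same EMP.xml content created in Notepad above.
xml_text = """<EMP>
  <ROWS><DEPTNO>10</DEPTNO><ENAME>RamReddy</ENAME><SAL>1000</SAL></ROWS>
  <ROWS><DEPTNO>20</DEPTNO><ENAME>Venkat</ENAME><SAL>2000</SAL></ROWS>
  <ROWS><DEPTNO>30</DEPTNO><ENAME>Divya</ENAME><SAL>3000</SAL></ROWS>
</EMP>"""

root = ET.fromstring(xml_text)
# Each <ROWS> element becomes one relational row in XML_EMP.
rows = [
    (int(r.findtext("DEPTNO")), r.findtext("ENAME"), int(r.findtext("SAL")))
    for r in root.iter("ROWS")
]
```

Three `<ROWS>` elements, three target rows: this is what the XML Source Qualifier flattens out for the mapping.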
→ Click on Open → a warning window will appear.

→ Click on Yes → another window will appear.

→ If you have time, read through the options shown; otherwise simply click on Ok → a new window will open.

→ Click on Next → a new window will open; in that select the "Hierarchy Relationship" radio button and the "De-normalized XML views" option.

→ Click on Finish → if we now look in the Source Analyzer window, we get a table structure that is somewhat different from a normal database table structure.

→ Create a target table with the name "XML_EMP" with the columns DEPTNO, ENAME, SAL and import it into the Informatica Designer tool :
> CREATE TABLE XML_EMP
  ( Deptno Number(5),
    Ename  Varchar2(30),
    Sal    Number(10,2) );
→ Click on the Mapping Designer icon → go to the Mappings menu → click on Create → name it "m_XML".

→ Click on Ok → in the left-side Navigator, expand Sources → expand the "XML_File" tab → drag and drop "EMP" into the work area, and also drag and drop the target into the work area.

→ Connect the corresponding ports from the XML Source Qualifier to the target table.

→ Go to the Workflow Manager tool → create a workflow with the name "w_XML" and a session with the name "s_XML", and connect the two using the Link Task.
→ Double-click on the Session → click on the Mapping tab → select XMLDSQ_EMP from the left-side Navigator and provide
Source File Directory : $PMSourceFileDir\
Source File Name : EMP.xml
Source File Type : Direct

→ Select the target XML_EMP table and map the proper target connection → click on Ok and run the Workflow.

NORMALIZER TRANSFORMATION
********************************************
→ It is an Active and Connected transformation.
→ It is useful to read data from COBOL files and is associated with a COBOL file definition.
→ It is useful to convert a single input record into multiple output records.

The below table is in de-normalized form :

Year    Account_Type    Q1      Q2      Q3      Q4
------- --------------- ------- ------- ------- -------
2012    Savings         1000    2000    3000    4000

The below table is in normalized form; the field "Quarter" holds the GCID (generated column id) :

Year    Account_Type    Quarter Amount
------- --------------- ------- -------
2012    Savings         1       1000
2012    Savings         2       2000
2012    Savings         3       3000
2012    Savings         4       4000

→ Open a Notepad and type :
Year,Account_Type,Q1,Q2,Q3,Q4
2012,Savings,1000,2000,3000,4000
Save the file and place it in the below location :
C:\Informatica\PowerCenter8.6.0\server\infa_shared\SrcFiles
→ Create a target table named T_EMP_NORMALIZER with the columns Year, Account_Type, Quarter, Amount using SQL*Plus :


> CREATE TABLE T_EMP_NORMALIZER
  ( Year         Number(5),
    Account_Type Varchar2(30),
    Quarter      Number(3),
    Amount       Number(10,2) );
and import that table into the Designer tool.
→ Go to the Designer tool → click on the Mapping Designer icon → go to the Mappings menu → create a mapping with the name "m_Normalizer".

→ Click on Ok → drag and drop the source flat file and the target table into the mapping workspace.
→ Go to the Transformations menu → click on Create → select the transformation type as Normalizer and give the name "T_Normalizer".

→ Click on Create → click on Done.

→ Double-click on the Normalizer transformation → the Edit Transformations window will open.

In that window, click on the Normalizer tab and click on the "Add a new column to this table" icon three times to create three ports.

→ Change the column names and datatypes, and set the Occurs field of the quarterly-amount column to 4 (so that one input record produces four output records).

→ Click on Ok → the Normalizer transformation now shows its fields in a different manner from what we regularly see in the mapping workspace.

→ Connect the ports from SQ_EMP to T_NORMALIZER, and connect the ports from T_NORMALIZER to the target T_EMP_NORMALIZER.

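The pivot this Normalizer performs on the flat-file record can be sketched in plain Python, using the tutorial's own sample record; the 1-based occurrence number plays the role of the GCID:

```python
# One de-normalized input record, as in the source flat file.
record = {"Year": 2012, "Account_Type": "Savings",
          "Q1": 1000, "Q2": 2000, "Q3": 3000, "Q4": 4000}

# ...pivoted into four output records; the 1-based position of each
# occurrence is the generated column id (GCID), which feeds Quarter.
normalized = [
    {"Year": record["Year"], "Account_Type": record["Account_Type"],
     "Quarter": gcid, "Amount": record[q]}
    for gcid, q in enumerate(["Q1", "Q2", "Q3", "Q4"], start=1)
]
```

One input row in, four rows out — which is why the Normalizer is an Active transformation.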
UPDATE STRATEGY TRANSFORMATION
************************************************
→ It is an Active and Connected transformation.
→ The target table should contain a Primary Key.
Implementing SCD1 ( SCD Type 1 maintains only current data ) :
→ It keeps the source data and the target data the same.
Mapping Flow Diagram :

EMP → SQ_EMP → LookUp T/F → Expression T/F → Router T/F

Router (New Record group)    → Expression T/F → Update Strategy T/F (Insert flow) → SCD1 (target)
                               Sequence Generator T/F → EmpKey of SCD1 (target)
Router (Update Record group) → Expression T/F → Update Strategy T/F (Update flow) → SCD1 (target)

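The SCD1 logic the flow above implements — look up EMPNO in the target, insert unseen rows with a generated key, overwrite changed rows in place — can be sketched roughly as follows. The function name and sample rows are illustrative only:

```python
import itertools

def scd1_load(source_rows, target, next_key):
    """Type-1 load: insert unseen EMPNOs with a new EMPKEY, overwrite
    existing rows in place. No history is kept."""
    by_empno = {t["EMPNO"]: t for t in target}   # what the LookUp provides
    for s in source_rows:
        existing = by_empno.get(s["EMPNO"])
        if existing is None:                     # NEW_RECORD branch (DD_INSERT, 0)
            row = {"EMPKEY": next(next_key), **s}
            target.append(row)
            by_empno[s["EMPNO"]] = row
        else:                                    # UPDATE_RECORD branch (DD_UPDATE, 1)
            existing.update(s)

keys = itertools.count(1)                        # stands in for the Sequence Generator
target = []
scd1_load([{"EMPNO": 7839, "ENAME": "King", "JOB": "President", "SAL": 5000}],
          target, keys)
scd1_load([{"EMPNO": 7839, "ENAME": "King", "JOB": "President", "SAL": 5001},
           {"EMPNO": 102,  "ENAME": "Kishor", "JOB": "Manager", "SAL": 90000}],
          target, keys)
```

After the second run the target still holds one row per employee: 7839's salary has been overwritten, and 102 has been inserted with the next key.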

→ The target table SCD1 has the fields EmpKey, EmpNo, Ename, Job, Sal. Go to SQL*Plus and create the target table as below :

> CREATE TABLE SCD1
  ( EmpKey Number(5),
    EmpNo  Number(5),
    Ename  Varchar2(30),
    Job    Varchar2(30),
    Sal    Number(5) );

Import this table into the Informatica Designer tool.
→ Go to the Mappings menu → click on Create → give the name "m_SCD1_EMP".

→ Click on Ok → drag and drop the source EMP table and the target table into the mapping workspace.

→ Go to the Transformations menu → click on Create → select the transformation type as LookUp and name it T_LKP_SCD_UPDATE.

→ Click on Create → a window will open; in that click on the Target button → select our target table, i.e. SCD1, from the available list of tables.

→ Click on Ok → click on Done.

→ Drag and drop the EmpNo port from SQ_EMP to the LookUp transformation.

→ Double-click on the LookUp transformation → click on the Condition tab → click on the "Add a New Condition" icon → make sure the condition is EmpNo = EmpNo1.

→ Click on Ok.
Note : if we get any errors after clicking Ok in the above window, just change the datatype of EMPNO to Decimal in the LookUp transformation.
→ Go to the Transformations menu → click on Create → select the type as Expression, give the name EXP_B4_ROUTER_UPDATE, and drag and drop EmpNo, Ename, Job and Sal to the Expression transformation.

→ From the LookUp transformation, drag and drop "EmpKey" to the Expression transformation.

→ Double-click on the Expression transformation → click on the Ports tab → click on the "Add a new Port" icon twice to create two output ports :
NEW_RECORD    (Output)  expression : IIF( ISNULL(EMPKEY), 'TRUE', 'FALSE' )
UPDATE_RECORD (Output)  expression : IIF( NOT ISNULL(EMPKEY), 'TRUE', 'FALSE' )

→ Click on Ok → drop a Router transformation → drag and drop all the ports from the Expression transformation to the Router transformation.

→ Double-click on the Router transformation → click on the Groups tab → click on the "Add a New Group" icon twice to create two groups and give the conditions :
NEW_RECORD       NEW_RECORD = 'TRUE'
UPDATE_RECORD    UPDATE_RECORD = 'TRUE'

→ Click on Ok.

New Record Flow :
→ Drop three transformations of type Expression, Update Strategy and Sequence Generator into the mapping workspace → drag and drop the ports EMPNO1, ENAME1, JOB1, SAL1 from the NEW_RECORD group into the Expression transformation.

→ Drag and drop all four ports from the Expression transformation to the Update Strategy transformation.

→ Double-click on the Update Strategy transformation → click on the Properties tab → set the Update Strategy Expression to 0 (DD_INSERT).

→ Click on Ok → connect all the ports from the Update Strategy to the target table, and connect the NEXTVAL port from the Sequence Generator transformation to EMPKEY of the target table.

Update Record Flow :
→ Drop Update Strategy and Expression transformations into the mapping workspace → from the Router transformation, drag and drop the ports EMPKEY3, EMPNO3, ENAME3, JOB3, SAL3 of the UPDATE_RECORD group into the Expression transformation.

→ Drag and drop all the ports from the Expression transformation to the Update Strategy transformation.

→ Double-click on the Update Strategy transformation → click on the Properties tab → set the "Update Strategy Expression" value to 1 (DD_UPDATE).

→ Click on Ok → copy and paste the target table into the mapping workspace, for the update-record flow → and connect all the ports from the Update Strategy transformation to this second target.

→ Go to the Workflow Manager tool → create a workflow with the name "w_SCD1" → create a session with the name "s_SCD1" and map this task to our mapping "m_SCD1_EMP" → click on Ok → connect the Task and the Workflow using the Link Task.
→ Double-click on the Session → click on the Mapping tab → select SQ_EMP and set it to the SOURCE connection → select the targets and set them to the TARGET connection → select the LookUp transformation and set it to the TARGET connection → click on the Properties tab → set "Treat Source Rows As" to Data Driven → click on Ok → save the task, run the Workflow and observe the target table data.
Testing :
→ Go to SQL*Plus → log in as TARGET_USER :

> INSERT INTO SOURCE_USER.EMP ( Empno, Ename, Job, Sal )
  VALUES ( 102, 'Kishor', 'Manager', 90000 );
> COMMIT;
> UPDATE SOURCE_USER.EMP SET Sal = 5001 WHERE Empno = 7839;
> COMMIT;

→ Now run the Workflow again and observe the result.

Exercise-1 : Try to implement the above scenario using the below mapping flow diagram; the condition to be used in the Update Strategy is IIF( ISNULL(EMPKEY), 0, 1 ) :

EMP → SQ_EMP → Update Strategy T/F → Target
                      ↑
                 LookUp T/F

Exercise-2 : Implement the above SCD1 using the in-built Slowly Changing Dimension wizard.

SLOWLY CHANGING DIMENSION - 2
************************************************
→ Create a target table with the name "EMP_SCD2" with the columns EMPKEY (pk), EMPNO, ENAME, JOB, DEPTNO, SAL, VERSION and import it into the Informatica Designer tool :
> CREATE TABLE EMP_SCD2
  ( Empkey  Number(5) Primary Key,
    Empno   Number(5),
    Ename   Varchar2(20),
    Job     Varchar2(20),
    Deptno  Number(5),
    Sal     Number(10,2),
    Version Number(5) );
→ Create a mapping with the name "m_SCD2".

→ Click on Ok → drag and drop the source and target tables into the mapping workspace.

→ Drop a LookUp transformation and use the target table as the LookUp table.

→ Click on Create → a new window will open; in that click on the Target button → select EMP_SCD2.

→ Click on Ok → from the Source Qualifier transformation, drag and drop EMPNO into the LookUp transformation.

→ Double-click on the LookUp transformation → click on the Ports tab → change the datatypes from Double to Decimal (if they are initially Double) → click on the Condition tab → click on "Add a new Condition" → make sure the condition is "EmpNo = EmpNo1".

→ Click on Ok → drop an Expression transformation into the mapping window → drag and drop EMPNO, ENAME, JOB, SAL, DEPTNO from the Source Qualifier transformation into the Expression transformation.

→ Drag and drop EMPKEY, SAL and VERSION from the LookUp transformation into the Expression transformation.

→ Double-click on the Expression transformation → click on the Ports tab → select the last port → click on the "Add a new port to this transformation" icon twice to create two ports, and name them (as in SCD1, one flag for new records and one for updates).

→ Click on Ok.

→ Drop a Router transformation → drag and drop all the ports from the Expression transformation to the Router transformation.

→ Double-click on the Router transformation → click on the Groups tab → click on the "Add a new group to this transformation" icon twice to create two groups, and name them NEWRECORD and UPDATEGROUP.

→ Click on Ok.
New Record Flow :
→ Drop Expression, Sequence Generator and Update Strategy transformations into the mapping workspace → drag and drop EMPNO, ENAME, JOB, SAL, DEPTNO of the "New Record" group from the Router transformation to the Expression transformation.

→ Double-click on the Expression transformation → click on the Ports tab → select the last port → click on the "Add a new port" icon → rename the new port to "VERSION", set the datatype to Decimal, and give the value 0 as its expression.

→ Click on Ok → from the Expression transformation, drag and drop all the ports to the Update Strategy transformation → connect the corresponding ports from the Update Strategy to the target table → connect the "NEXTVAL" port from the Sequence Generator transformation to EMPKEY of the target table.

Update Record Flow :
→ Drop an Expression transformation and an Update Strategy transformation into the workspace → from the Update Record group of the Router transformation, drag and drop EMPNO3, ENAME3, JOB3, SAL3, DEPTNO3, VERSION3 into the Expression transformation.

→ Double-click on the Expression transformation → click on the Ports tab → select the last port → click on the "Add a new port" icon → name the port "VERSION", set the datatype to Decimal, and under expression type "VERSION3 + 1" → click on Ok.

→ From the Expression transformation, drag and drop EMPNO, ENAME, JOB, SAL, DEPTNO, VERSION to the Update Strategy transformation → copy and paste the target table in the mapping workspace to store the update-record data → connect the corresponding ports from the Update Strategy transformation to the target table → connect the NEXTVAL port from the Sequence Generator transformation to EMPKEY of the target table.

→ Create a workflow with the name "w_SCD2" and a task with the name "s_SCD2" → double-click on the Session → click on the Mapping tab → make sure the source and target connections are appropriate → make sure the LookUp transformation's connection points to the target, and run the Workflow.
Testing :
→ In the source EMP table, add one more record using an INSERT statement → update any existing record in the same source table → now run the above workflow and observe the output.
Exercise : Use the in-built SCD2 wizard and build the mapping.

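The Type-2 behaviour this mapping implements — never overwrite, append a new row with VERSION incremented whenever a tracked column changes — can be sketched as follows. The function name, the choice of SAL as the compared column, and the sample rows are illustrative:

```python
import itertools

def scd2_load(source_rows, target, next_key):
    """Type-2 load: a changed row gets a brand-new record with
    VERSION = old VERSION + 1, so the full history is kept."""
    latest = {}
    for t in target:               # the LookUp returns the latest version per EMPNO
        cur = latest.get(t["EMPNO"])
        if cur is None or t["VERSION"] > cur["VERSION"]:
            latest[t["EMPNO"]] = t
    for s in source_rows:
        old = latest.get(s["EMPNO"])
        if old is None:                              # new record flow, VERSION 0
            target.append({"EMPKEY": next(next_key), **s, "VERSION": 0})
        elif old["SAL"] != s["SAL"]:                 # update record flow, new version
            target.append({"EMPKEY": next(next_key), **s,
                           "VERSION": old["VERSION"] + 1})

keys = itertools.count(1)
hist = []
scd2_load([{"EMPNO": 7839, "ENAME": "King", "JOB": "President",
            "SAL": 5000, "DEPTNO": 10}], hist, keys)
scd2_load([{"EMPNO": 7839, "ENAME": "King", "JOB": "President",
            "SAL": 6000, "DEPTNO": 10}], hist, keys)
```

After the second run the table holds both versions of employee 7839, which is the point of SCD2.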

SLOWLY CHANGING DIMENSION - 3
************************************************
→ It is useful to maintain current data plus one level of previous data.
Process using the Wizard only :
→ Go to the Mappings menu → select the Wizards option → click on the "Slowly Changing Dimension" option → a window will open; provide the name "SCD3" and select the last radio button (Type-3 Dimension).

→ Click on Next → select the source table SOURCE_USER.EMP and give the new target table name "EMP_SCD3".

→ Click on Next → select the "EMPNO" column from the Target Table Fields section and click on the "Add>>" button; the column moves to the "Logical Key Fields" section.

→ Select the "SAL" column from the Target Table Fields section and click on the "Add>>" button; the column moves to the "Fields to compare for changes" section.

→ Click on Next → click on Finish → a mapping is formed automatically and placed in the mapping workspace.

→ Save the mapping → click on the Target Designer workspace icon → from the left-side Navigator, expand the Targets tab → drag and drop the EMP_SCD3 table into the Target Designer workspace → go to the Targets menu → click on the "Generate and Execute SQL" option; a new window will open.

→ In that window, click on the "Generate and Execute" option → a small window will open; provide the ODBC connection, username and password for the target database.

→ Click on Connect → click on Close → save it → create a workflow with the name "w_SCD3" and a task with the name "s_SCD3", and connect the Session and the Workflow using the Link Task.
→ Double-click on the Task → provide the database connections appropriately for both source and target → click on Ok → save the workflow and run the Workflow.
Testing :
→ Log in to the SOURCE_USER schema using SQL*Plus and give the statements :
> UPDATE EMP SET SAL = 10000 WHERE EMPNO = 7839;
> COMMIT;

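The Type-3 behaviour the wizard generates — keep the current SAL plus exactly one level of previous SAL — can be sketched as below. The PREV_SAL column name is an assumption for illustration (the wizard generates its own PM_-prefixed column names):

```python
def scd3_load(source_rows, target):
    """Type-3: keep the current SAL plus one level of history in PREV_SAL
    (column name is an assumption; the wizard names its columns itself)."""
    by_empno = {t["EMPNO"]: t for t in target}
    for s in source_rows:
        old = by_empno.get(s["EMPNO"])
        if old is None:
            row = dict(s, PREV_SAL=None)
            target.append(row)
            by_empno[s["EMPNO"]] = row
        elif old["SAL"] != s["SAL"]:      # SAL is the "field to compare for changes"
            old["PREV_SAL"] = old["SAL"]  # demote the current value to previous...
            old["SAL"] = s["SAL"]         # ...and overwrite the current value

rows = []
scd3_load([{"EMPNO": 7839, "SAL": 5000}], rows)
scd3_load([{"EMPNO": 7839, "SAL": 10000}], rows)   # the UPDATE from the test above
```

Unlike SCD2, no extra row is added: the one row for 7839 now carries both the new salary and the one it replaced. A second change would push the 10000 into PREV_SAL and lose the 5000.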

Exercise : Use a Router transformation in place of the Filter transformation and develop SCD3.

SOURCE QUALIFIER TRANSFORMATION
***************************************************
→ Create a target table with the name "EMP_DEPT" with the columns DNAME, DEPT_WISE_SALARY using SQL*Plus and import it into Informatica :
> CREATE TABLE EMP_DEPT
  ( Dname            Varchar2(30),
    Dept_Wise_Salary Number(10,2) );
→ Create a mapping with the name m_SQ.

Click on Ok.
→ Drag and drop the sources EMP and DEPT and the target EMP_DEPT into the mapping workspace.

→ Delete the two default Source Qualifiers that come with the sources.

→ Go to the Transformations menu → click on Create → select the transformation type as "Source Qualifier" and give the name "Emp_Dept".

→ Click on Create → a small window will open asking us to select the tables for which we are creating the Source Qualifier transformation. Here we are creating it for both tables, so select EMP and DEPT.

→ Click on Ok → click on Done → the newly created Source Qualifier transformation's ports are automatically mapped to the corresponding ports of both EMP and DEPT.

→ Now connect DNAME and SAL from the Source Qualifier transformation to the target table.

→ Double-click on the Source Qualifier → click on the Properties tab.

→ Click on the "SQL Query" attribute's down arrow → the SQL editor window will open → in that select
ODBC data source : RRITEC_SOURCE
Username : SOURCE_USER
Password : SOURCE_USER

Click on Generate SQL → the query is generated automatically in the workspace of the SQL window.

→ Click on Ok → click on Ok → the remaining procedure to create the Session and the Workflow is the same as before.

UNION TRANSFORMATION
*********************************
→ Connect to the source database ( SOURCE_USER ) using SQL*Plus and create two tables as below :
> CREATE TABLE emp1020 AS SELECT deptno, sal FROM scott.EMP WHERE deptno IN ( 10, 20 );
> CREATE TABLE emp2030 AS SELECT deptno, sal FROM scott.EMP WHERE deptno IN ( 20, 30 );
→ Connect to the target database ( TARGET_USER ) using SQL*Plus and create a table as :
> CREATE TABLE emp_Union102030 AS SELECT deptno, sal FROM scott.EMP WHERE 1 = 2;
→ Now import the two source tables and the one target table into the Informatica Designer.
→ Create a mapping with the name "m_UNION".

Click on Ok → drag and drop the two sources and the target table into the mapping window.

→ Go to the Transformations menu → click on Create → select the type as Union and name it "t_UNION".

→ Click on Create → click on Done → this transformation shows two tabs that other transformations do not have (Groups and Group Ports).

→ Double-click on the Union transformation → click on the Groups tab.

→ Click on the "Add a new group" icon twice → give the group names emp1020 and emp2030.

→ Click on the "Group Ports" tab.

→ Click on the "Add a new port" icon twice and name the two ports deptno and sal.

→ Click on Ok.

→ Connect all the ports from SQ_EMP1020 to the emp1020 group of the Union transformation.

→ Similarly, connect all the ports from SQ_EMP2030 to the emp2030 group of the Union transformation.

→ Now connect all the output ports from the Union transformation to the target table.

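The Union transformation behaves like SQL UNION ALL: rows from every input group pass straight through, and duplicates are not removed. Sketched in Python with illustrative deptno/sal pairs (not actual scott.EMP data):

```python
# deptno/sal pairs as the two CREATE TABLE ... AS SELECT statements would
# produce them; the salary values here are made up for illustration.
emp1020 = [(10, 2450), (20, 800), (20, 3000)]   # departments 10 and 20
emp2030 = [(20, 1100), (30, 950), (30, 1600)]   # departments 20 and 30

# UNION ALL semantics: simple concatenation, no de-duplication — dept 20
# rows coming from both inputs are all kept.
union_all = emp1020 + emp2030
```

If distinct rows were wanted, a downstream Aggregator or Sorter (distinct) would have to do it; the Union transformation itself never does.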

STORED PROCEDURE TRANSFORMATION
*************************************************
→ Using SQL*Plus, create a target table with the columns EMPNO, SAL, TAX and import it into the Informatica Designer tool :
> CREATE TABLE emp_sproc
  ( EMPNO Number(5),
    SAL   Number(10,2),
    TAX   Number(10,2) );
→ Using SQL*Plus, connect to the source database ( SOURCE_USER ) and create a stored procedure :
> CREATE OR REPLACE PROCEDURE emp_tax
  ( Sal IN number, Tax OUT number ) IS
  BEGIN
    tax := sal * 0.1;
  END;
→ Create a mapping with the name "m_SPROC".

→ Click on Ok → drag and drop the source EMP table and the target "emp_sproc" into the mapping window.

→ Go to the Transformations menu → click on the "Import Stored Procedure" option → a new window will open; provide the required details to import the stored procedure.

→ Click on Ok → from the Source Qualifier transformation, connect SAL to the stored procedure's SAL port, and connect the TAX port of the Stored Procedure to the TAX port of the target table.

→ Connect EMPNO and SAL from the Source Qualifier transformation to the corresponding ports in the target table.

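What the connected Stored Procedure transformation does per row is exactly the EMP_TAX procedure's logic: the OUT parameter is sal * 0.1. A Python sketch (the employee rows are illustrative):

```python
def emp_tax(sal):
    """Mirrors the EMP_TAX procedure: the OUT parameter tax = sal * 0.1."""
    return sal * 0.1

# The transformation fires once per input row, feeding SAL in and
# writing the returned TAX to the target alongside EMPNO and SAL.
rows = [(7839, 5000), (7844, 1500)]                      # (empno, sal) samples
result = [(empno, sal, emp_tax(sal)) for empno, sal in rows]
```

Each target row carries the pass-through EMPNO and SAL plus the computed TAX.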

à creating Task and creating Workflow and providing proper connections to Source and Target at session is same as here also. Create Mapping using Un -Connected Stored Procedure Transformation *********************************************************************** à create a Mapping with the name of “m_SP_UnConnected” as :

→ click on OK → drag and drop the source and target tables into the mapping window workspace as below :


→ go to the Transformations menu → click on Create → select the type as “Expression” and name it “t_SP_UnConnected” as :

→ click on Create → click on Done → drag EMPNO and SAL from the Source Qualifier transformation to the Expression Transformation as :

→ go to the Transformations menu → click on the “Import Stored Procedure” option → provide the required details → select our Stored Procedure → click on OK. → double click on the Expression Transformation → click on the Ports tab as :


→ click on the “Add a new port to this transformation” icon, which is marked in red in the above window, name the new port TAX, and disable its Input port as :

→ click on the TAX expression down-arrow mark, which is circled in red; the Expression Editor window will open. Type the formula as below :


Note : in the above, PROC_RESULT is a keyword. → click on Validate → click on OK → click on OK → connect all the ports from the Expression transformation to the target table as :
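The behavior of the un-connected call can be sketched outside Informatica. The sketch below is plain Python, not Informatica expression syntax, and it assumes the typical pattern for such a TAX port expression (`:SP.EMP_TAX(SAL, PROC_RESULT)`), where PROC_RESULT stands for the value the procedure writes into its OUT parameter; the exact formula in this exercise appears only in the screenshot.

```python
# Sketch (assumed pattern, not the screenshot's exact formula): an
# un-connected stored procedure is invoked from an expression port, and
# PROC_RESULT is replaced by the procedure's OUT value.

def emp_tax(sal):
    """Mimics the EMP_TAX procedure above: OUT tax := sal * 0.1."""
    return sal * 0.1

def evaluate_tax_port(row):
    # Plays the role of the TAX port expression calling :SP.EMP_TAX.
    return emp_tax(row["SAL"])

rows = [{"EMPNO": 7369, "SAL": 800.0}, {"EMPNO": 7499, "SAL": 1600.0}]
for row in rows:
    row["TAX"] = evaluate_tax_port(row)

print(rows[0]["TAX"])
```

Each input row flows through the Expression transformation once, so the procedure is evaluated once per row, exactly like the connected variant.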

→ creating the Session and creating the Workflow are the same here as well.

TRANSACTION CONTROL TRANSFORMATION
*****************************************************
→ It is an Active and Connected type of transformation.
→ A transaction is the set of rows bound by commit or roll back rows. You can define a transaction based on a varying number of input rows.
→ In PowerCenter, you define transaction control at the following levels:

Within a Mapping : Within a mapping, you use the Transaction Control transformation to define a transaction. You define transactions using an expression in a Transaction Control transformation. Based on the return value of the expression, you can choose to commit, roll back, or continue without any transaction changes.

Within a Session : When you configure a session, you configure it for user-defined commit. You can choose to commit or roll back a transaction if the Integration Service fails to transform or write any row to the target.

→ When you run the session, the Integration Service evaluates the expression for each row that enters the transformation. When it evaluates a commit row, it commits all rows in the transaction to the target or targets. When the Integration Service evaluates a roll back row, it rolls back all rows in the transaction from the target or targets.


→ in the Transaction Control Transformation, we have 5 built-in variables:

1) TC_CONTINUE_TRANSACTION : The Integration Service does not perform any transaction change for this row. This is the default value of the expression.
2) TC_COMMIT_BEFORE : The Integration Service commits the transaction, begins a new transaction, and writes the current row to the target. The current row is in the new transaction.
3) TC_COMMIT_AFTER : The Integration Service writes the current row to the target, commits the transaction, and begins a new transaction. The current row is in the committed transaction.
4) TC_ROLLBACK_BEFORE : The Integration Service rolls back the current transaction, begins a new transaction, and writes the current row to the target. The current row is in the new transaction.
5) TC_ROLLBACK_AFTER : The Integration Service writes the current row to the target, rolls back the transaction, and begins a new transaction. The current row is in the rolled back transaction.

Note : If the transaction control expression evaluates to a value other than commit, roll back, or continue, the Integration Service fails the session.

→ Create a target table named EMP_TCL with the columns EMPNO, ENAME, JOB, SAL and import it into the Informatica tool.

> CREATE TABLE EMP_TCL
  (
    EMPNO number(5),
    ENAME varchar2(20),
    JOB   varchar2(20),
    SAL   number(10,2)
  );

→ go to the Mappings menu → click on Create → give the name “m_TCL” as :
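The commit grouping these variables produce can be sketched in plain Python. This is a simulation, not Informatica code, and the condition used here (commit after each row whose DEPTNO is 10) is an illustrative assumption — the actual condition for this exercise is shown only in the screenshot below.

```python
# Sketch: how a Transaction Control transformation groups rows into
# transactions. TC_COMMIT_AFTER writes the current row, then commits the
# open transaction; TC_CONTINUE_TRANSACTION leaves the transaction open.

TC_CONTINUE_TRANSACTION = "continue"
TC_COMMIT_AFTER = "commit_after"

def control_expression(row):
    # Plays the role of the Transaction Control Condition (assumed example).
    return TC_COMMIT_AFTER if row["DEPTNO"] == 10 else TC_CONTINUE_TRANSACTION

def run(rows):
    committed, pending = [], []
    for row in rows:
        if control_expression(row) == TC_COMMIT_AFTER:
            pending.append(row)       # current row is in the committed txn
            committed.extend(pending)
            pending = []
        else:
            pending.append(row)
    return committed, pending         # pending rows were never committed

rows = [{"EMPNO": 1, "DEPTNO": 10}, {"EMPNO": 2, "DEPTNO": 20},
        {"EMPNO": 3, "DEPTNO": 10}, {"EMPNO": 4, "DEPTNO": 30}]
committed, pending = run(rows)
print(len(committed), len(pending))  # 3 1
```

Note how row 4 is left in the open transaction: without a trailing commit row it would be handled by the session-level commit behavior rather than by the transformation.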

→ click on OK → drag and drop the source EMP table and the target EMP_TCL table into the mapping window workspace as :


→ go to the Transformations menu → click on Create → select the type as “Transaction Control” and give the name “t_TC” as :

→ click on Create → click on Done → drag and drop EMPNO, ENAME, JOB and SAL from SQ_EMP to the t_TC transformation as :

→ double click on the Transaction Control Transformation → click on the Properties tab as :


→ click on the “Transaction Control Condition” value dropdown button, which is circled in red; the Expression Editor will open → in that, give the condition as below :

→ click on Validate → click on OK → click on OK → connect all the ports from the Transaction Control Transformation to the target table as :


→ Save the mapping → creating the Session and creating the Workflow are the same here as well.

Advanced Training of Informatica

Mapping Parameters and Variables
*************************************
PARAMETERS :
→ A mapping parameter represents a constant value that we can define before running a session.
→ A mapping parameter retains the same value throughout the entire session.

Initial and Default value : When we declare a mapping parameter or variable in a mapping or a mapplet, we can enter an initial value. When the Integration Service needs an initial value, and we did not declare an initial value for the parameter or variable, the Integration Service uses a default value based on the data type of the parameter or variable.

Data        Default Value
--------    -----------------
Numeric     0
String      Empty String
Datetime    1/1/1

→ create a target table with the columns EMPNO, ENAME, JOB, SAL, COMM and import it into Informatica as :
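The default-value table above can be expressed as a small lookup. The sketch below is illustrative Python, and `default_initial_value` is a hypothetical helper, not an Informatica API.

```python
# Sketch of the Integration Service's default initial values per data type,
# as listed in the table above. `default_initial_value` is hypothetical.

import datetime

def default_initial_value(data_type):
    defaults = {
        "numeric": 0,                            # Numeric  -> 0
        "string": "",                            # String   -> empty string
        "datetime": datetime.datetime(1, 1, 1),  # Datetime -> 1/1/1
    }
    return defaults[data_type.lower()]

print(default_initial_value("Numeric"))   # 0
print(default_initial_value("Datetime"))  # 0001-01-01 00:00:00
```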


> CREATE TABLE emp_par_var
  (
    EMPNO number(5),
    ENAME varchar2(20),
    JOB   varchar2(20),
    SAL   number(10,2),
    COMM  number(10,2)
  );

→ create a mapping with the name “m_MP” as :

Click on OK → drag and drop the source EMP table and the target EMP_PAR_VAR table into the mapping window workspace as :

→ go to the Mappings menu → click on the “Parameters and Variables” option → a new window will open as :


→ click on “Add a new variable to this table” → name the new field “$$DEPTNO” → set the Type to “Parameter” from the dropdown → set “IsExprVar” to FALSE as below :


→ click on OK → drop a Filter Transformation into the mapping workspace → drag and drop EMPNO, ENAME, JOB, SAL, COMM and DEPTNO (the filter condition references DEPTNO) from the source qualifier transformation to the Filter transformation as :

→ double click on the Filter transformation → click on the Properties tab

→ click on the “Filter Condition” value down-arrow mark, which is marked in red; the “Expression Editor” window will open as :


→ type “DEPTNO = ” in the formula space, click on the Variables tab → expand the Mapping parameters node → under that we will find our created parameter, i.e., $$DEPTNO.

→ double click on $$DEPTNO in the left-side navigator window; the formula window will now read as :


→ click on Validate → click on OK → click on OK → click on OK → drag and drop the corresponding columns from the Filter transformation to the target table as :

→ save the Mapping → creating the Session and creating the Workflow are the same here as well.

Creating the Parameter File :
navigate to the path F:\Informatica\PowerCenter 8.6.0\Server\Infa_Shared\BWparam → right click on free space → select New → click on Text Document → name it DEPTNO.prm → double click on that file and type :

[RRITEC.S_MP]
$$DEPTNO=20

Save the file and close it. ( RRITEC → folder name ; S_MP → session name )
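The structure of the parameter file is simple: a `[folder.session]` header scopes the `$$` parameters that follow it. The parser below is an illustrative Python sketch of that structure, not Informatica's actual reader.

```python
# Sketch: reading a PowerCenter-style parameter file like DEPTNO.prm above.
# The [folder.session] header opens a section; NAME=VALUE lines fill it.

def parse_param_file(text):
    params = {}
    section = None
    for line in text.splitlines():
        line = line.strip()
        if not line:
            continue
        if line.startswith("[") and line.endswith("]"):
            section = line[1:-1]                 # e.g. "RRITEC.S_MP"
            params[section] = {}
        elif "=" in line and section is not None:
            name, value = line.split("=", 1)
            params[section][name.strip()] = value.strip()
    return params

text = "[RRITEC.S_MP]\n$$DEPTNO=20\n"
params = parse_param_file(text)
print(params["RRITEC.S_MP"]["$$DEPTNO"])  # 20
```

Because the section name carries both folder and session, the same file can hold parameters for many sessions without ambiguity.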


→ double click on the session, i.e., S_MP → click on the Properties tab → provide the parameter file name as F:\Informatica\PowerCenter 8.6.0\Server\Infa_Shared\BWparam\DEPTNO.prm → set the Source and Target table connections accordingly → save the workflow and run the workflow.

VARIABLES :
→ Create a target table named V_EMP with the columns EMPNO, ENAME, JOB, SAL, COMM, DEPTNO and import it into Informatica :

> CREATE TABLE V_EMP
  (
    EMPNO  number(5),
    ENAME  varchar2(20),
    JOB    varchar2(20),
    SAL    number(10,2),
    COMM   number(10,2),
    DEPTNO number(5)
  );

→ create a mapping with the name “m_Variable_Incrimental” as :

→ click on OK → drag and drop the source EMP table and the target V_EMP table into the workarea as :


→ go to the Mappings menu → click on the “Parameters and Variables” option → click on the “Add a new Variable to this table” icon → name the field $$DEPTNO and set the Type to “Variable”, the Datatype to “decimal”, the Aggregation to “Max”, and the Initial Value to ‘0’, as shown below :

→ click on OK → drop a Filter Transformation into the mapping workspace → drag and drop the ports required for the target from the Source Qualifier Transformation to the Filter Transformation as :

→ double click on the “Filter Transformation” → click on the Properties tab as :


→ click on the “Filter Condition” down-arrow mark, which is marked in red in the above window; the “Expression Editor” window will open → in that, type: DEPTNO = SETVARIABLE ( $$DEPTNO, $$DEPTNO+10 )
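How SETVARIABLE with Max aggregation carries the value across session runs can be sketched as follows. This is a Python simulation of the behavior, not Informatica syntax: SETVARIABLE assigns a new value and also returns it for use in the expression, while the repository persists the maximum value seen so the next run starts where the previous one left off.

```python
# Sketch (not Informatica code): a mapping variable with Max aggregation.
# Each run computes $$DEPTNO+10, compares DEPTNO against it, and persists
# the new (higher) value, giving the incremental sequence 10, 20, 30, ...

class MappingVariable:
    def __init__(self, initial=0):
        self.persisted = initial           # value stored in the repository

    def setvariable(self, new_value):
        # Max aggregation: only a larger value replaces the persisted one.
        self.persisted = max(self.persisted, new_value)
        return new_value                   # SETVARIABLE returns the value

var = MappingVariable(initial=0)
run_values = []
for _ in range(3):                         # three session runs
    current = var.setvariable(var.persisted + 10)
    run_values.append(current)             # DEPTNO is compared against this

print(run_values)  # [10, 20, 30]
```

This is why the mapping loads department 10 on the first run, department 20 on the second, and so on, without editing the mapping between runs.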

→ click on Validate → click on OK → click on OK → click on OK → connect the corresponding ports from the Filter Transformation to the target table as :


→ save the mapping; the remaining process to create the Session and the Workflow is the same here as well.

MAPPLETS
**********************
→ A mapplet is a reusable object that we create in the Mapplet Designer.
→ It contains a set of transformations and lets us reuse that transformation logic in multiple mappings.

Mapplet Input : Mapplet input can originate from a source definition and/or from an Input transformation in the mapplet. We can create multiple pipelines in a mapplet. We use the Mapplet Input transformation to give input to a mapplet. Use of the Mapplet Input transformation is optional.

Mapplet Output : The output of a mapplet is not connected to any target table. We must use a Mapplet Output transformation to store mapplet output. A mapplet must contain at least one Output transformation with at least one connected port in the mapplet.

→ The Mapplet Input Transformation is a Passive and Connected type.

The type of mapping link shown below is invalid; that is, connecting the same port to more than one transformation directly from a Mapplet Input Transformation is disallowed. If you really want to connect one port to more than one transformation, a small workaround is needed. That is, take one Expression transformation, and first

Mapplet Input T/F >>>> T/F --- 1
                  >>>> T/F --- 2


connect the port to that Expression transformation, and from the Expression T/F connect to more than one transformation.

→ Create a target table named EMP_MAPPLET with the columns EMPNO, SAL, COMM, TAX, TOTALSAL and import it into Informatica :

> CREATE TABLE EMP_MAPPLET
  (
    EMPNO    number(5),
    SAL      number(10,2),
    COMM     number(10,2),
    TAX      number(10,2),
    TOTALSAL number(10,2)
  );

→ click on Mapplet Designer → go to the Mapplets menu → click on Create → give the name MLT_TOTALSAL_TAX as :

Click on OK → go to the Transformations menu → click on Create → drop four transformations of the following types: 1) Mapplet Input 2) Mapplet Output 3) Filter Transformation 4) Expression Transformation. The Mapplet window will then look as :

→ double click on the Mapplet Input Transformation → go to the Ports tab → click on the “Add a new port” icon three times to create three fields → name them EMPNO, SAL, COMM as


→ click on OK → drag and drop all the columns from the Mapplet Input Transformation to the Filter Transformation as :

→ double click on the Filter Transformation → click on the Ports tab → change all the datatypes to Decimal as :


→ click on the Properties tab as :

→ click on the Filter Condition down-arrow mark, which is marked in red; the Expression Editor window will open → in that, type the formula IIF ( ISNULL ( EMPNO ) OR ISNULL ( SAL ) OR ISNULL ( COMM ) , FALSE , TRUE ) as shown below :


→ click on Validate → click on OK → click on OK → drag and drop all the columns from the Filter Transformation to the Expression Transformation as below :

→ double click on the Expression Transformation → click on the Ports tab → click on the “Add a new port to this Transformation” icon twice to create two ports, and name them as below :

TOTALSAL --- decimal --- expression : SAL + COMM
TAX      --- decimal --- expression : IIF ( SAL > 3000 , SAL*0.2 , SAL*0.1 )

as shown below :
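Putting the two steps together, the mapplet's row logic (drop NULL rows, then derive TOTALSAL and TAX) can be sketched as plain Python. This is a simulation of the behavior, not Informatica code; the formulas match the filter condition and the two port expressions above.

```python
# Sketch (not Informatica code) of MLT_TOTALSAL_TAX row logic:
# Filter:     IIF(ISNULL(EMPNO) OR ISNULL(SAL) OR ISNULL(COMM), FALSE, TRUE)
# Expression: TOTALSAL = SAL + COMM ; TAX = IIF(SAL>3000, SAL*0.2, SAL*0.1)

def mapplet_totalsal_tax(rows):
    out = []
    for row in rows:
        # Filter transformation: reject rows with any NULL among the inputs.
        if row["EMPNO"] is None or row["SAL"] is None or row["COMM"] is None:
            continue
        sal, comm = row["SAL"], row["COMM"]
        out.append({
            "EMPNO": row["EMPNO"],
            "SAL": sal,
            "COMM": comm,
            "TOTALSAL": sal + comm,                         # SAL + COMM
            "TAX": sal * 0.2 if sal > 3000 else sal * 0.1,  # IIF(SAL>3000,...)
        })
    return out

rows = [
    {"EMPNO": 7499, "SAL": 1600.0, "COMM": 300.0},
    {"EMPNO": 7839, "SAL": 5000.0, "COMM": None},   # dropped: COMM is NULL
]
result = mapplet_totalsal_tax(rows)
print(result[0]["TOTALSAL"])  # 1900.0
```

Note the filter deliberately rejects rows with a NULL COMM, which is why only the first sample row survives.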


→ click on OK → drag and drop all the columns from the Expression Transformation to the Mapplet Output Transformation as below :

Using the Mapplet in a Mapping :
→ click on the Mapping Designer icon → go to the Mappings menu → click on Create → name it “m_MAPLET” as :


→ click on OK → drag and drop the source EMP table and the target EMP_MAPPLET into the mapping window workspace as :

→ from the left-side navigator window, drag our mapplet ( MLT_TOTALSAL_TAX ), which is marked in red in the above figure, into the mapping window workspace → connect the required ports from the Source Qualifier Transformation to the mapplet, and connect the required ports from the mapplet to the target table as :

→ save the Mapping → creating the Session and creating the Workflow are the same here as well.


RE-USABLE TRANSFORMATION
*****************************************
→ It is useful if we want to utilize one transformation in many mappings.
→ Except for the Source Qualifier Transformation, all other transformations can be made re-usable.
→ we can develop re-usable transformations in two ways :

1) By using the Transformation Developer tool (not a recommended method, because we have not tested whether the transformation functions properly or not)

2) Promoting an existing transformation to re-usable (it’s a recommended

method).

→ go to any mapping → double click on any transformation → we will get a “Make Reusable” checkbox, which we have to enable → observe that the Transformation Type still reads “Filter”, as marked in red → whenever we enable that checkbox, we get a confirmation window as below :

→ click on Yes to confirm → now observe that the “Transformation Type”, which is marked in the above figure, has changed to “Reusable” as shown below :


→ click on OK → observe that our re-usable transformation is placed under the Transformations node of the Navigator window as :


TARGET LOAD PLAN
****************************
→ If we have more than one Source Qualifier in a mapping, then we control the load order of the Source Qualifiers by setting the Target Load Plan in the Mapping Designer.
→ If we have only one Source Qualifier in the mapping and multiple targets to load, and the targets must be loaded based on their constraints, then it is called “Constraint Based Loading”.

Process for Constraint Based Loading :
→ Create a source table as :

> CREATE TABLE emp_dept AS
  SELECT emp.*, dept.deptno "dept_deptno", dname, loc
  FROM   emp, dept
  WHERE  emp.deptno (+) = dept.deptno;

Import the above table into Informatica as a source → create two target tables as below :

> CREATE TABLE dept1
  (
    DEPTNO number(5) CONSTRAINT pk_dept1 PRIMARY KEY,
    DNAME  varchar2(20),
    LOC    varchar2(20)
  );

> CREATE TABLE emp1
  (
    EMPNO    number(5) CONSTRAINT pk_emp1 PRIMARY KEY,
    ENAME    varchar2(20),
    JOB      varchar2(20),
    MGR      number(5),
    HIREDATE date,
    SAL      number(7,2),
    COMM     number(7,2),
    DEPTNO   number(2) CONSTRAINT fk_deptno1 REFERENCES dept1
  );

→ go to the Designer tool → create a mapping with the name “m_CBL” as :
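The reason constraint-based loading must write DEPT1 before EMP1 is the foreign key: EMP1.DEPTNO references DEPT1, so parent rows have to exist first. The sketch below (illustrative Python, not Informatica internals) shows one way to derive such a load order by topologically sorting the foreign-key graph; the table names match the DDL above.

```python
# Sketch: deriving a constraint-based load order. fk_graph maps each table
# to the tables it references (its parents); parents are visited first.

def load_order(fk_graph):
    ordered, seen = [], set()

    def visit(table):
        if table in seen:
            return
        seen.add(table)
        for parent in fk_graph.get(table, []):
            visit(parent)          # load parent tables before the child
        ordered.append(table)

    for table in fk_graph:
        visit(table)
    return ordered

fk_graph = {"EMP1": ["DEPT1"], "DEPT1": []}
print(load_order(fk_graph))  # ['DEPT1', 'EMP1']
```

With “Constraint Based Load Ordering” enabled on the session, the Integration Service applies exactly this parent-before-child discipline for targets fed by a single Source Qualifier.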


→ click on OK → drag and drop the above source and the two targets into the mapping window workspace, and connect the corresponding ports from the Source Qualifier transformation to the targets as :

→ create a workflow, create a Session, and link the two using a Link Task → double click on the Session → click on “Config Object” → enable the “Constraint Based Load Ordering” checkbox as below :


→ click on OK → run the workflow and observe the result.

***************************************************
Need to add the missing content of two classes………..
***************************************************

TASKS
***********
Tasks are of two types:
1) Re-Usable Tasks
2) Non Re-Usable Tasks

Re-Usable Tasks : there are three types of re-usable tasks:
i) Session
ii) Command
iii) Email
→ All re-usable tasks are created using the Task Developer.

SESSION : A Session is used to execute a Mapping.

Creating a Re-Usable Session :
go to the PowerCenter Workflow Manager → click on the Task Developer icon → go to the Tasks menu → click on Create → select the type as Session → give any name, say “s_reusable” as :

→ click on Create → select the required Mapping → click on OK → click on Done → save the session.
→ from the Navigator window of the current Workflow Manager → expand the Sessions folder → and observe whether our Session task has been created or not.


Email :
→ go to the PowerCenter Workflow Manager → click on the Task Developer icon → go to the Tasks menu → click on Create → select the type as “Email” → give a name, say, “Send Mail”

→ click on Create → click on Done.
→ Double click on the Email task → click on the Properties tab → provide the email username as [email protected] → give the Email subject as “Test” → give the Email text as “Text Mail”


→ click on OK → save the Task.
→ from the Navigator window of the current Workflow Manager → expand the Tasks folder → and observe whether our Email task has been created or not as :

Command : It is useful to execute UNIX (shell) commands.
→ go to the PowerCenter Workflow Manager → click on the Task Developer icon → go to the Tasks menu → click on Create → select the type as “Command” → give a name, say “Copyfile”

→ click on Create → click on Done.
→ Double click on the “Copyfile” command → click on the Commands tab


→ click on the “Add a New Command” icon, which is marked in red, as :


→ name it “Copyfile”

→ click on the down-arrow mark, which is circled in red; a new window will open. Type as below :

→ click on OK → click on OK → save the task.


→ from the Navigator window of the current Workflow Manager → expand the Tasks folder → and observe whether our Command task has been created or not as below :

Non Re-Usable Tasks : If we create a task within the Workflow Designer window of the PowerCenter Workflow Manager tool, then it is called a Non-Reusable Task.
1) Event – Wait
2) Event – Raise
3) Decision
4) Timer
5) Assignment
6) Control

1) Event – Wait :
→ Go to the PowerCenter Workflow Manager → click on “Workflow Designer” → go to the Tasks menu → click on Create → select the task type as “Event Wait” → give the name “s_event_wait” as below :

Click on Create → click on Done.

Start → Session → Event Wait → Session


→ click on OK → save the task.

2) Event Raise :
Creating a Workflow Event :
Right click on the Workflow Designer free space → click on “Edit Workflow/Worklet” → click on the Events tab → add the event “EXECUTE2SESSIONS” → click on OK.
→ Double click on the Event Raise task → click on the Properties tab → under Value, click on the dropdown button → select “EXECUTE2SESSIONS” → click on OK → click on OK.

Start → Session → Event_Wait → Session
      → Session → Event_Raise


→ Double click on the Event Wait task → click on the Events tab → select the User-Defined radio button → click on the Browse Events icon → select EXECUTE2SESSIONS → click on OK → click on OK → save the task.

3) Decision :
→ double click on the Decision task → write the condition as :
$session1.Status = SUCCEEDED AND $session2.Status = SUCCEEDED
→ click on OK → click on OK.
→ Double click on the link between the Command task and the Decision task → a new window will open; in that, type $Decision_Name.Condition = TRUE
→ Similarly, double click on the link between the Session and the Decision task → a new window will open; in that, type $Decision_Name.Condition = FALSE
→ click on OK → save the task.

4) Timer :
→ Go to the PowerCenter Workflow Manager → click on “Workflow Designer” → go to the Tasks menu → click on Create → select the task type as “Timer” and give the name “t_timer”
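The Decision task's gating behavior can be sketched in plain Python. This is a simulation, not Workflow Manager syntax, and the branch targets (Command on TRUE, Session on FALSE) follow the link conditions described above.

```python
# Sketch (not Workflow Manager syntax): a Decision task ANDs the statuses
# of two upstream sessions, and the outgoing links fire on TRUE or FALSE.

SUCCEEDED, FAILED = "SUCCEEDED", "FAILED"

def decision_condition(session1_status, session2_status):
    # $session1.Status = SUCCEEDED AND $session2.Status = SUCCEEDED
    return session1_status == SUCCEEDED and session2_status == SUCCEEDED

def next_task(session1_status, session2_status):
    if decision_condition(session1_status, session2_status):
        return "Command"      # link condition: $Decision_Name.Condition = TRUE
    return "Session"          # link condition: $Decision_Name.Condition = FALSE

print(next_task(SUCCEEDED, SUCCEEDED))  # Command
print(next_task(SUCCEEDED, FAILED))     # Session
```

Because the two link conditions are mutually exclusive, exactly one branch runs after the Decision task on any given execution.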

Start → Session → Decision → Command
                           → Session

Start → Timer → Session


→ click on Create → click on Done.
→ now double click on the Timer task → select the Timer tab → select the “Absolute Time” radio button → set a time two minutes ahead as below :

→ click on OK → save the task and run the workflow.


5) Assignment :
→ Go to the PowerCenter Workflow Manager → click on “Workflow Designer” → go to the Tasks menu → click on Create → select the task type as “Assignment” and give the name “t_Assignment” as shown below :

→ click on Create → click on Done.
→ right click on the Workflow Designer free space → click on Edit Workflow/Worklet → click on the Variables tab as :

→ Click on the “Add a new Variable to this Table” icon, which is marked in red in the above window

Start → Session → Assignment


→ Change the name to “$$no_of_times_run” → enable the Persistent checkbox → set the default value to ‘0’

→ click on OK → double click on the “Assignment” task → click on the “Add a new Expression” icon → type the expression “$$no_of_times_run + 1” as shown below :
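The effect of assigning `$$no_of_times_run + 1` to a persistent workflow variable is a run counter: because the variable is Persistent, its value survives between workflow runs. The sketch below simulates that in Python, with a plain dict standing in for the repository.

```python
# Sketch (not Informatica code): a persistent workflow variable keeps its
# value between runs, so the assignment $$no_of_times_run = $$no_of_times_run
# + 1 counts how many times the workflow has executed.

repository = {"$$no_of_times_run": 0}   # Persistent = TRUE, default value 0

def run_workflow():
    # Assignment task: $$no_of_times_run + 1
    repository["$$no_of_times_run"] += 1
    return repository["$$no_of_times_run"]

for _ in range(3):
    count = run_workflow()

print(count)  # 3
```

Without the Persistent checkbox, the variable would reset to its default value ('0') at the start of every run and the counter would never advance past 1.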


Click on OK.

WORKLET
******************
→ A reusable workflow is called a Worklet.

Creating a Worklet :
→ click on the Worklet Designer icon → go to the Worklets menu → click on Create → give the name “Wroklet” as below :


Click on OK → take any Session into the workspace of the Worklet Designer window → link the Worklet and the Session → save it.
→ from the Navigator window of the current Workflow Manager window → expand the Worklets folder → and observe whether our newly created Worklet is there or not as below :


Utilizing the Worklet :
click on the Workflow Designer icon → go to the Workflows menu → click on Create → name it “Using_Worklet” → drag and drop the Worklet “Wroklet” from the left-side Navigator window → connect the Workflow and the Worklet as :

********************************* *************************************