Ds Resumes




    PROFESSIONAL SUMMARY:

    Over 6 years of dynamic career reflecting pioneering experience and high performance in system analysis, design, development and implementation of Relational Database and Data Warehousing Systems using IBM DataStage 8.0.1/7.x/6.x/5.x (InfoSphere Information Server, WebSphere, Ascential DataStage).

    Excellent experience in designing, developing, documenting and testing ETL jobs and mappings in Server and Parallel jobs using DataStage to populate tables in Data Warehouses and Data Marts.

    Proficient in developing strategies for the Extraction, Transformation and Loading (ETL) mechanism.

    Expert in designing Parallel jobs using various stages like Join, Merge, Lookup, Remove Duplicates, Filter, Dataset, Lookup File Set, Complex Flat File, Modify, Aggregator and XML.

    Expert in designing Server jobs using various types of stages like Sequential File, ODBC, Hashed File, Aggregator, Transformer, Sort, Link Partitioner and Link Collector.

    Experienced in integration of various data sources (DB2-UDB, SQL Server, PL/SQL, Oracle, Teradata, XML and MS Access) into the data staging area.

    Expert in working with DataStage Manager, Designer, Administrator and Director.

    Experience in analyzing the data generated by the business process, defining the granularity, source-to-target mapping of the data elements, and creating indexes and aggregate tables for data warehouse design and development.

    Excellent knowledge of studying data dependencies using metadata stored in the repository and preparing batches for the existing sessions to facilitate scheduling of multiple sessions.

    Proven track record in troubleshooting DataStage jobs and addressing production issues like performance tuning and enhancement.

    Expert in working on various operating systems like UNIX AIX 5.2/5.1, Sun Solaris V8.0 and Windows 2000/NT.

    Proficient in writing, implementing and testing triggers, procedures and functions in PL/SQL and Oracle.

    Experienced in database programming for Data Warehouses (schemas); proficient in dimensional modeling (Star Schema and Snowflake modeling).

    Expertise in UNIX shell scripts using K-shell for the automation of processes and scheduling of DataStage jobs using wrappers.

    Experience in using software configuration management tools like Rational ClearCase/ClearQuest for version control.

    Experienced in data modeling as well as reverse engineering using tools such as Erwin, Oracle Designer, MS Visio, SQL Server Management Studio, SSIS, SSRS and stored procedures.

    Expert in unit testing, system integration testing, implementation and maintenance of database jobs.

    Effective in cross-functional and global environments, managing multiple tasks and assignments concurrently with effective communication skills.


    EDUCATIONAL QUALIFICATION: Bachelor's in Electronics and Communication

    TECHNICAL SKILLS:

    ETL Tools

    DataStage - IBM WebSphere DataStage and QualityStage 8.0, Ascential DataStage 7.5.2/5.1/6.0, ProfileStage 7.0, SSIS (SQL Server 2005), Data Integrator.

    Business Intelligence tools

    Business Objects, Brio, SSRS (SQL Server 2005), IBM Cognos 8 BI

    Development Tools and Languages

    SQL, C, C++, Unix Shell Scripting, Perl, PL/SQL, Oracle

    Testing Tools

    Auto Tester, Test Director, Lotus Notes

    Data Modeling Tools

    Erwin 4.0, Sybase PowerDesigner, SSIS, SSRS

    Operating Systems

    HP-UX, IBM AIX 5.3, Windows 95/98/2000/NT, Sun Solaris, Red Hat Linux, MS SQL Server 2000/2005/2008 & MS Access

    WORK EXPERIENCE:

    Confidential, CA    Nov 2010 - Present
    ETL Developer

    NetApp Inc. is a leading network appliance manufacturer as well as a data storage company, providing network appliances such as hard disks and shelves for small and large business owners. NetApp also provides an efficient data storage facility. The main aim is to provide a variety of services like data storage, data analysis, data warehouse, data mart, etc., which can adopt consistent tailored processes in order to strive to fulfill the promise of commitment and reliability to customers.

    Involved as primary on-site ETL Developer during the analysis, planning, design, development, and implementation stages of projects using IBM WebSphere software (QualityStage v8.1, Web Services, Information Analyzer, ProfileStage, WISD of IIS 8.0.1).

    Prepared Data Mapping Documents and designed the ETL jobs based on the DMD with the required tables in the Dev environment.


    Active participation in decision making and QA meetings; regularly interacted with the Business Analysts and development team to gain a better understanding of the Business Process, Requirements and Design.

    Used DataStage as an ETL tool to extract data from source systems and loaded the data into the Oracle database.

    Designed and developed DataStage jobs to extract data from heterogeneous sources, applied transform logic to the extracted data and loaded it into Data Warehouse databases.

    Created DataStage jobs using different stages like Transformer, Aggregator, Sort, Join, Merge, Lookup, Data Set, Funnel, Remove Duplicates, Copy, Modify, Filter, Change Data Capture, Change Apply, Sample, Surrogate Key, Column Generator, Row Generator, etc.

    Extensively worked with Join, Lookup (Normal and Sparse) and Merge stages.

    Extensively worked with Sequential File, Dataset, File Set and Lookup File Set stages.

    Extensively used Parallel stages like Row Generator, Column Generator, Head, and Peek for development and debugging purposes.

    Used the DataStage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions on an ad hoc or scheduled basis.

    Developed complex stored procedures using input/output parameters, cursors, views, triggers and complex queries using temp tables and joins.
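
    A minimal PL/SQL sketch of the kind of procedure described above; the table CLAIM_STG, its columns and the procedure name are hypothetical, not the project's actual objects.

        -- Hypothetical procedure: counts rejected staging rows for one load date
        -- using an IN parameter, an OUT parameter and an explicit cursor.
        CREATE OR REPLACE PROCEDURE get_reject_count (
            p_load_dt    IN  DATE,
            p_reject_cnt OUT NUMBER
        ) AS
            CURSOR c_rejects IS
                SELECT status_cd
                  FROM claim_stg
                 WHERE load_dt = p_load_dt
                   AND status_cd = 'REJECT';
        BEGIN
            p_reject_cnt := 0;
            FOR r IN c_rejects LOOP        -- loop over the cursor result set
                p_reject_cnt := p_reject_cnt + 1;
            END LOOP;
        END get_reject_count;
        /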

    Converted complex job designs to different job segments and executed them through job sequencers for better performance and easy maintenance.

    Created job sequences.

    Maintained the Data Warehouse by loading dimensions and facts as part of the project. Also worked on different enhancements in FACT tables.

    Created a shell script to run DataStage jobs from UNIX and then scheduled this script to run DataStage jobs through the scheduling tool.

    Coordinated with team members and administered all onsite and offshore work packages.

    Analyzed performance and monitored work with capacity planning.

    Performed performance tuning of the jobs by interpreting performance statistics of the jobs developed.

    Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit testing, system testing and functional testing; prepared test data for testing, error handling and analysis.

    Participated in weekly status meetings.

    Developed a Test Plan that included the scope of the release, entrance and exit criteria and overall test strategy. Created detailed Test Cases and Test Sets and executed them manually.

    Environment: IBM WebSphere DataStage 8.1 Parallel Extender, Web Services, QualityStage 8.1 (Designer, Director, Manager), Microsoft Visio, IBM AIX 4.2/4.1, IBM DB2, SQL Server, Teradata, Oracle 11g, Queryman, UNIX, Windows.

    Confidential, NJ    Jan 2010 - Oct 2010
    Lead Sr. DataStage Developer

    Project was to design and develop an enterprise data warehouse: extract data from heterogeneous source systems, transform them using business logic and load them into the data warehouse.


    Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating and loading data into staging tables.

    Extensively used ETL to load data from IBM DB2 database, XML and flat file sources to an Informix database server.

    Involved in analysis, planning, design, development, and implementation phases of projects using IBM WebSphere software (QualityStage v8.0.1, Web Services, Information Analyzer, ProfileStage, WISD of IIS 8.0.1).

    Developed complex jobs using various stages like Lookup, Join, Transformer, Dataset, Row Generator, Column Generator, Sequential File, Aggregator and Modify stages.

    Created queries using joins and CASE statements to validate data in different databases.

    Created queries to compare data between two databases to make sure the data matched (sample queries below).

    Used the DataStage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions on an ad hoc or scheduled basis.
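
    A hedged illustration of such validation and reconciliation queries; CUSTOMER_DIM and the database link SRC_DB are hypothetical names used only for the example.

        -- CASE-based validation: flag rows whose status code is outside the expected set.
        SELECT customer_id,
               CASE WHEN status_cd IN ('A', 'I') THEN 'OK' ELSE 'INVALID' END AS status_check
          FROM customer_dim;

        -- Cross-database comparison over a database link: rows in the warehouse copy
        -- that are missing or different in the source copy.
        SELECT customer_id, status_cd FROM customer_dim
        MINUS
        SELECT customer_id, status_cd FROM customer_dim@src_db;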

    Created shared container to incorporate complex business logic in job.

    Monitored the DataStage jobs on a daily basis by running the UNIX shell script and made a force start whenever a job failed.

    Created and modified batch scripts to FTP files from different servers to the DataStage server.

    Extensively used the slowly changing dimension Type 2 approach to maintain history in the database (see the sketch below).

    Created Job Sequencers to automate the jobs.

    Modified a UNIX shell script to run a Job Sequencer from the mainframe job.

    Created parameter sets to assign values to jobs at run time.

    Standardized the nomenclature used to define the same data by users from different business units.
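
    A minimal SQL sketch of a Type 2 slowly changing dimension load, assuming hypothetical CUSTOMER_STG and CUSTOMER_DIM tables and a CUSTOMER_DIM_SEQ sequence; the real jobs implemented this logic in DataStage, so this only illustrates the pattern.

        -- Step 1: close out the current dimension row for customers whose tracked
        -- attributes changed in the latest staging load.
        UPDATE customer_dim d
           SET d.current_flag = 'N',
               d.effective_end_dt = TRUNC(SYSDATE) - 1
         WHERE d.current_flag = 'Y'
           AND EXISTS (SELECT 1
                         FROM customer_stg s
                        WHERE s.customer_id = d.customer_id
                          AND s.address <> d.address);

        -- Step 2: insert a new current version for changed and brand-new customers
        -- (both now lack a row flagged 'Y').
        INSERT INTO customer_dim (customer_sk, customer_id, address,
                                  effective_start_dt, effective_end_dt, current_flag)
        SELECT customer_dim_seq.NEXTVAL, s.customer_id, s.address,
               TRUNC(SYSDATE), DATE '9999-12-31', 'Y'
          FROM customer_stg s
         WHERE NOT EXISTS (SELECT 1
                             FROM customer_dim d
                            WHERE d.customer_id = s.customer_id
                              AND d.current_flag = 'Y');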

    Created multiple-layer reports providing a comprehensive and detailed report with drill-through facility.

    Used Parallel Extender for parallel processing to improve performance when extracting the data from the sources.

    Worked with metadata definitions, and import and export of DataStage jobs using DataStage Manager.

    Provided the logical data model design, generated the database, resolved technical issues, and loaded data into multiple instances.

    Implemented PL/SQL scripts in accordance with the necessary business rules and procedures.

    Developed PL/SQL procedures and functions to support the reports by retrieving the data from the data warehousing application.

    Used PL/SQL programming to develop stored procedures/functions and database triggers.
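
    A small illustrative PL/SQL trigger of the kind referenced above; CLAIM_FACT and its columns are hypothetical.

        -- Hypothetical before-insert trigger enforcing a simple business rule
        -- and defaulting an audit column.
        CREATE OR REPLACE TRIGGER trg_claim_fact_bi
        BEFORE INSERT ON claim_fact
        FOR EACH ROW
        BEGIN
            IF :NEW.claim_amt < 0 THEN
                RAISE_APPLICATION_ERROR(-20001, 'Claim amount cannot be negative');
            END IF;
            IF :NEW.load_ts IS NULL THEN
                :NEW.load_ts := SYSDATE;   -- default the load timestamp
            END IF;
        END;
        /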

    Environment: IBM WebSphere DataStage 8.0.1 Parallel Extender, Web Services, QualityStage 8.0 (Designer, Director, Manager), Microsoft Visio, IBM AIX 4.2/4.1, IBM DB2, SQL Server 2000, Teradata, Oracle 11g, Queryman, BMQ, UNIX, Windows.


    Confidential, VA    Oct 2008 - Dec 2009
    Lead DataStage ETL Developer

    Project involved the design and development of a group insurance system, which processes claims for group insurance. It covers benefits with subsystems covering Term Life Insurance, Medical Indemnity and Managed Health Care.

    Data Modeling:

    Gathered and analyzed the requirements of the in-house business users for data warehousing from JAD sessions.

    Collected the information about different entities and attributes by studying the existing ODS and reverse engineering it into Erwin.

    Defined the primary keys and foreign keys for the entities.

    Defined the query views, index options and relationships.

    Created the logical schema using ERWIN 4.0 and also created the dimensional modeling for building the cubes.

    Designed staging and error handling tables keeping in view the overall ETL strategy.

    Assisted in creating the physical database by forward engineering.

    ETL Process:

    Extracted data from source systems, transformed it and loaded it into the Oracle database according to the required provision.

    Primary on-site technical lead during the analysis, planning, design, development, and implementation stages of data quality projects using Integrity (now known as QualityStage).

    Involved in system analysis, design, development, support and documentation.

    Created objects like tables, views, materialized views, procedures and packages using Oracle tools like PL/SQL, SQL*Plus and SQL*Loader, and handled exceptions.

    Involved in database development by creating Oracle PL/SQL functions, procedures, triggers, packages, records and collections.

    Created views for hiding actual tables and to eliminate the complexity of the large queries.

    Created various indexes on tables to improve performance by eliminating full table scans.
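
    A hedged sketch of the view and index approach described in the two items above; CLAIM, POLICY and the column names are hypothetical.

        -- A view that hides the underlying tables and the join complexity from report queries.
        CREATE OR REPLACE VIEW v_open_claims AS
        SELECT c.claim_id, c.claim_amt, p.policy_no, p.holder_name
          FROM claim c
          JOIN policy p ON p.policy_id = c.policy_id
         WHERE c.status_cd = 'OPEN';

        -- An index on the usual filter column so lookups avoid full table scans.
        CREATE INDEX idx_claim_status ON claim (status_cd);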

    Used the DataStage Designer to develop processes for extracting, cleansing, transforming, integrating and loading data into Data Marts.

    Created source table definitions in the DataStage Repository.

    Identified source systems, their connectivity, related tables and fields, and ensured data suitability for mapping.

    Generated surrogate IDs for the dimensions in the fact table for indexed and faster access of data.
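
    An illustrative way to generate such surrogate IDs in Oracle, assuming hypothetical PRODUCT_DIM and PRODUCT_STG tables; the sequence name is equally made up.

        -- Sequence that supplies the surrogate key values.
        CREATE SEQUENCE product_dim_seq START WITH 1 INCREMENT BY 1 CACHE 100;

        -- Assign a surrogate key to each incoming staging row while loading the dimension.
        INSERT INTO product_dim (product_sk, product_cd, product_name)
        SELECT product_dim_seq.NEXTVAL, s.product_cd, s.product_name
          FROM product_stg s;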

    Created hash tables with referential integrity for faster table lookup and for transforming the data representing valid information.

    Used built-in as well as complex transformations.

    Used DataStage Manager to manage the metadata repository and for import/export of jobs.


    Implemented Parallel Extender jobs for better performance using stages like Join, Merge, Sort, Lookup and Transformer with different source files such as complex flat files and XML files.

    Optimized job performance by carrying out performance tuning.

    Created stored procedures to conform to the business rules.

    Used Aggregator stages to sum the key performance indicators in decision support systems and for the granularity required in the DW.

    Tuned DataStage transformations and jobs to enhance their performance.

    Used the DataStage Director and its run-time engine to schedule running the solution, testing and debugging its components, and monitoring the resulting executable versions on an ad hoc or scheduled basis.

    Scheduled DataStage jobs using the Autosys scheduling tool.

    Prepared the documentation of the Data Acquisition and Interface System Design.

    Assigned the tasks and provided technical support to the development team.

    Monitored the development activities of the team and updated the Management.

    Created complicated reports using the reporting tool Cognos.

    Environment: IBM/Ascential DataStage EE 7.5 (Manager, Designer, Director, Parallel Extender), QualityStage 7.5, DataStage BASIC language expressions, Autosys, Erwin 4.0, Windows NT, UNIX, Oracle 9i, SQL Server, Cognos, sequential files, .csv files.

    Confidential, PA Jan 2007 -- Sep 2008

    Sr. Data Stage Developer

    As a DW developer, designed, developed, and deployed DataStage jobs and associated functionality. The warehouse employed highly complex data transformations including Slowly Changing Dimensions and a series of stored procedures, which made performance tuning and efficient mapping highly critical. Along with designing jobs from scratch, re-wrote existing code to enhance performance and troubleshoot errors in both DataStage and Oracle 10g.

    Responsibilities:

    Used IBM DataStage Designer to develop jobs for extracting, cleaning, transforming and loading data into data marts/data warehouse.

    Developed several jobs to improve performance by reducing runtime using different partitioning techniques.

    Used different stages of DataStage Designer like Lookup, Join, Merge, Funnel, Filter, Copy, Aggregator, Sort, etc.

    Read complex flat files from the mainframe machine by using the Complex Flat File stage.

    Sequential File, Aggregator, ODBC, Transformer, Hashed File, Oracle OCI, XML, Folder and FTP plug-in stages were extensively used to develop the server jobs.

    Used the EXPLAIN PLAN statement to determine the execution plan in the Oracle database (see the example below).

    Worked on complex data coming from mainframes (EBCDIC files) with knowledge of Job Control Language (JCL).

    Used COBOL copybooks to import the metadata information from the mainframes.

    Designed DataStage jobs using QualityStage stages in 7.5 for the data cleansing and data standardization process. Implemented the Survive stage and Match stage for data patterns and data definitions.
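
    Basic EXPLAIN PLAN usage as mentioned above; the query itself is a hypothetical example against made-up tables.

        -- Ask Oracle to record the execution plan for a candidate query.
        EXPLAIN PLAN FOR
        SELECT c.claim_id, p.policy_no
          FROM claim c
          JOIN policy p ON p.policy_id = c.policy_id
         WHERE c.load_dt = TRUNC(SYSDATE);

        -- Read the plan back from PLAN_TABLE.
        SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);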


    Staged the data coming from various environments in the staging area before loading it into Data Marts.

    Involved in writing Test Plans, Test Scenarios, Test Cases and Test Scripts, and performed Unit, Integration, System testing and User Acceptance Testing.

    Used stage variables for source validations and to capture rejects, and used job parameters for automation of jobs.

    Strong knowledge in creating procedures, functions, sequences and triggers. Expertise in PL/SQL and SQL.

    Performed debugging, unit testing and system integration testing of the jobs.

    Wrote UNIX shell scripts according to the business requirements.

    Wrote customized server/parallel routines according to the complexity of the business requirements.

    Designed strategies for archiving of legacy data.

    Created shell scripts to perform validations and run jobs on different instances (DEV, TEST and PROD).

    Created and deployed SSIS (SQL Server Integration Services) projects and schemas, and configured the Report Server to generate reports through SSRS on SQL Server 2005.

    Created ad-hoc reports using MS SQL Server Reporting Services for the business users.

    Used SQL Profiler to monitor server performance and debug T-SQL and slow running queries.

    Expertise in developing and debugging indexes, stored procedures, functions, triggers and cursors using T-SQL.

    Wrote mapping documents for all the ETL jobs (interfaces, Data Warehouse and Data Conversion activities).

    Environment: IBM WebSphere DataStage and QualityStage 7.5, Ascential DataStage 7.5/EE (Parallel Extender), SQL Server 2005/2008, Linux, Teradata 12, Oracle 10g, Sybase, PL/SQL, Toad, UNIX (HP-UX), Cognos 8 BI

    Confidential, NJ

    Jr. DataStage Developer    Jan 2006 - Dec 2006

    Merrill Lynch was a global financial services firm providing capital markets services, investment banking and advisory services, wealth management, asset management, insurance, banking and related financial services worldwide.

    ETL/Datastage Developer Resume

    Title: ETL/Datastage Developer

    Primary Skills


    ETL, data warehouse design

    Location: US-NJ-Jersey City (will consider relocating)

    Posted: Jun-06-10

    RESUME DETAILS

    Professional Summary:

    Over 7 years of experience in data modeling, data warehouse design, development and testing using ETL and the data migration life cycle using IBM WebSphere DataStage 8.x/7.x.

    Expertise in building Operational Data Stores (ODS), Data Marts, and Decision Support Systems (DSS) using multidimensional models (Kimball and Inmon) and Star and Snowflake schema design.

    Experience in analyzing the data generated by the business process, defining the granularity, source-to-target mapping of the data elements, and creating indexes and aggregate tables for data warehouse design and development.

    Data processing experience in designing and implementing Data Mart applications, mainly transformation processes, using the ETL tool DataStage (Ver 8.0/7), and designing and developing jobs using DataStage Designer, DataStage Manager, DataStage Director and DataStage Debugger.

    Efficient in all phases of the development lifecycle, coherent with Data Cleansing, Data Conversion, Performance Tuning and System Testing.

    Excellent in using a highly scalable parallel processing infrastructure using DataStage Parallel Extender.

    Efficient in incorporation of various data sources such as Oracle, MS SQL Server, DB2, Sybase, XML and flat files into the staging area.

    Experience in mapping Server/Parallel jobs in DataStage to populate tables in the Data Warehouse and Data Marts.

    Proven track record in addressing production issues like performance tuning and enhancement.

    Excellent knowledge in creating and managing Conceptual, Logical and Physical Data Models.

    Experience in dimensional and relational database design.

    Strong in data warehousing concepts and dimensional Star Schema and Snowflake Schema methodologies.

    Expert in unit testing, system integration testing, implementation, maintenance and performance tuning.

    Experience in different scheduling tools like AutoSys for automating and scheduling job runs.

    Excellent with PL/SQL, T-SQL, Stored Procedures, Database Triggers and SQL*Loader.

    Experience in UNIX Shell Scripting.

    Excellent knowledge of operating systems including Windows, UNIX and Macintosh, and databases including Oracle, SQL Server and DB2.

    Experience in implementing Quality Processes like ISO 9001:2000/Audits.

    Detail oriented with good problem solving, organizational and analysis skills; highly motivated and adaptive with the ability to grasp things quickly.


    Ability to work effectively and efficiently in a team and individually with excellent

    interpersonal, technical and communication skills.

    Education Qualifications:

    Masters in Electrical and Computer Engineering.

    Skill Sets: IBM Information Server V8.1 (DataStage, QualityStage, Information Analyzer), Ascential DataStage V7.5 (Designer, Director, Manager, Parallel Extender), Oracle 8i/9i/10g, MS SQL Server 2005/2008, DB2 UDB, MS Access, Sybase, SQL, PL/SQL, SQL*Plus, flat files, sequential files, TOAD 9.6, Erwin, Microsoft Visio, Oracle Developer 2000, SQL*Loader, IBM Cognos 8.0, IBM AIX UNIX, Red Hat Enterprise Linux 4, UNIX Shell Scripting, Windows NT/XP, Macintosh, C, C++, VB scripting.

    Project Summary:

    Prudential Financial, Newark, NJ    1/2009 - Present

    DataStage Developer

    Prudential Financial, Inc. is a Fortune Global 500 company that provides insurance, investment management, and other financial products and services to both retail and institutional customers throughout the United States and in over 30 other countries.

    The project objective was to collect, organize and store data from different operational data sources to provide a single source of integrated and historical data for the purpose of reporting, analysis and decision support to improve client services.

    Hardware/Software: IBM DataStage 8.0 (Designer, Director, Manager, Parallel Extender), Oracle 10g, SQL Server 2008, DB2 UDB, flat files, sequential files, Autosys, TOAD 9.6, SQL*Plus, AIX UNIX, IBM Cognos 8.0

    Responsibilities:

    Interacted with End user community to understand the business requirements and in identifying

    data sources.

    Analyzed the existing informational sources and methods to identify problem areas and make recommendations for improvement. This required a detailed understanding of the data sources and researching possible solutions.

    Implemented the dimensional model (logical and physical) in the existing architecture using Erwin.

    Studied the PL/SQL code developed to relate the source and target mappings.


    Helped in preparing the mapping document for source to target.

    Worked with DataStage Manager for importing metadata from the repository, creating new job categories and creating new data elements.

    Designed and developed ETL processes using DataStage Designer to load data from Oracle, MS SQL, flat files (fixed width) and XML files to the staging database, and from staging to the target Data Warehouse database.

    Used DataStage stages namely Hash File, Sequential File, Transformer, Aggregator, Sort, Datasets, Join, Lookup, Change Capture, Funnel, Peek and Row Generator in accomplishing the ETL coding.

    Developed job sequencers with proper job dependencies, job control stages and triggers.

    Used QualityStage to ensure consistency, removing data anomalies and spelling errors from the source information before it was delivered for further processing.

    Extensively used DS Director for monitoring job logs to resolve issues.

    Involved in performance tuning and optimization of DataStage mappings using features like pipeline and partition parallelism and data/index cache to manage very large volumes of data.

    Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit testing, system testing and functional testing; prepared test data for testing, error handling and analysis.

    Used Autosys job scheduler for automating the monthly regular run of DW cycle in both

    production and UAT environments.

    Verified the Cognos report by extracting data from the staging database using PL/SQL queries.

    Wrote Configuration files for Performance in production environment.

    Participated in weekly status meetings.

    Kaiser Permanente, Pleasanton, CA 05/2007-- 12/2008

    ETL Designer/ DataStage Developer

    Kaiser Permanente is an integrated managed care organization and the largest health care organization in the United States. The Health Plan and Hospitals operate under state and federal non-profit tax status, while the Medical Groups operate as for-profit partnerships or professional corporations in their respective regions.

    The project was to design, develop and maintain a data warehouse for their vendor's data,

    internal reference data and work with their DBA to ensure that the physical build adheres to the

    model blueprint.

    Hardware/Software:

    DataStage 7.5.1 Enterprise Edition, QualityStage, flat files, Oracle 10g, SQL Server 2005/2008, Erwin 4.2, PL/SQL, UNIX, Windows NT/XP


    Responsibilities:

    Involved in understanding of business processes and coordinated with business analysts to get specific user requirements.

    Studied the existing data sources with a view to knowing whether they support the required reporting, and generated change data capture requests.

    Used QualityStage to check the data quality of the source system prior to the ETL process.

    Worked closely with DBAs to develop the dimensional model using Erwin and created the physical model using forward engineering.

    Worked with DataStage Administrator for creating projects and defining the hierarchy of users and their access.

    Defined granularity, aggregation and partitioning required at the target database.

    Involved in creating specifications for ETL processes, finalized requirements and prepared

    specification document.

    Used DataStage as an ETL tool to extract data from source systems and loaded the data into the SQL Server database.

    Imported table/file definitions into the Datastage repository.

    Performed ETL coding using Hash File, Sequential File, Transformer, Sort, Merge and Aggregator stages; compiled, debugged and tested. Extensively used the available stages to redesign DataStage jobs for performing the required integration.

    Extensively used DataStage tools like Infosphere DataStage Designer and Infosphere DataStage Director for developing jobs and to view log files for execution errors.

    Controlled job execution using sequencers and used notification activities to send email alerts.

    Ensured that the data integration design aligns with the established information standards.

    Used Aggregator stages to sum the key performance indicators used in decision support systems.

    Scheduled job runs using DataStage director, and used DataStage director for debugging and

    testing.

    Created shared containers to simplify job design.

    Performed performance tuning of the jobs by interpreting performance statistics of the jobs developed.

    Documented ETL test plans, test cases, test scripts, and validations based on design specifications for unit testing, system testing, functional testing and regression testing; prepared test data for testing, error handling and analysis.

    Macy's, Atlanta, GA    12/2005 - 04/2007

    ETL Developer

    Macy's (NYSE: M) is a chain of mid-to-high range American department stores delivering fashion and affordable luxury to customers coast to coast. Online shopping is offered through

    macys.com. Its selection of merchandise can vary significantly from location to location,

    resulting in the exclusive availability of certain brands in only higher-end stores.

    The aim of the Project was to build a data warehouse, which would keep historical data

    according to a designed strategy. Flat files, Oracle tables were part of the source data, which

    came in on a daily, weekly, monthly basis.


    Hardware/Software:

    IBM Information Server DataStage 7.5, Oracle 10g, SQL, PL/SQL, UNIX, SQL*Loader,

    Autosys, Business Objects 6.1, Windows 2003, IBM AIX 5.2/5.1, HP Mercury Quality Center 9.0

    Responsibilities:

    Involved in understanding of business processes and coordinated with business analysts to get specific user requirements.

    Used Information Analyzer for Column Analysis, Primary Key Analysis and Foreign Key

    Analysis.

    Extensively worked on DataStage jobs for splitting bulk data into subsets and dynamically distributing it to all available processors to achieve the best job performance.

    Developed ETL jobs as per business rules using ETL design document

    Converted complex job designs to different job segments and executed them through a job sequencer for better performance and easy maintenance.

    Used DataStage maps to load data from Source to target.

    Enhanced the reusability of the jobs by making and deploying shared containers and multiple instances of the jobs.

    Imported the data residing in the host systems into the data mart developed in Oracle 10g.

    Extensively used Autosys for automation of scheduling jobs on a daily, bi-weekly, weekly and monthly basis with proper dependencies.

    Wrote complex SQL queries using joins, subqueries and correlated subqueries.
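
    A hedged example of that query style; SALES_FACT and STORE_DIM are hypothetical tables used only to show the shape of a join/subquery and a correlated subquery.

        -- Subquery with a join: stores that recorded sales today.
        SELECT s.store_id, s.store_name
          FROM store_dim s
         WHERE s.store_id IN (SELECT f.store_id
                                FROM sales_fact f
                               WHERE f.sale_dt = TRUNC(SYSDATE));

        -- Correlated subquery: fact rows above the average amount for their own store.
        SELECT f.store_id, f.sale_amt
          FROM sales_fact f
         WHERE f.sale_amt > (SELECT AVG(f2.sale_amt)
                               FROM sales_fact f2
                              WHERE f2.store_id = f.store_id);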

    Performed Unit testing and System Integration testing by developing and documenting test

    cases in Quality Center.

    Validated the reports generated using Business Objects using PL/SQL queries.

    Worked on troubleshooting, performance tuning and performance monitoring for enhancement of DataStage jobs and builds across Development, QA and PROD environments.

    Citibank Inc., New York, NY    12/2004 - 11/2005

    DataStage Developer

    Citibank, the leading global banking company, has some 200 million customer accounts and does business in more than 100 countries, providing services to consumers, corporations,

    governments and institutions.

    The project was to transform the data coming from various sources through multiple stages before loading it into the data warehouse, and to maintain the warehouse.

    Hardware/Software:

    Ascential DataStage 7.0 (Designer, Manager, Director), Oracle 9i, MS Access, SQL Server 2000/2005, SQL, PL/SQL, Toad, UNIX

    Responsibilities: Involved in understanding of business processes to learn business requirements.


    Extracted data from different systems into the Source. Mainly involved in ETL development.

    Defined and implemented approaches to load and extract data from database using DataStage.

    Worked closely with the data warehouse architect and business intelligence analyst in developing solutions.

    Used Erwin for data modeling (i.e., modifying the staging and SQL scripts on Oracle and MS Access environments).

    Involved in designing source-to-target mappings between sources and operational staging targets using DataStage Designer.

    Performed ETL coding using Hash File, Sequential File, Transformer, Sort, Merge and Aggregator stages; compiled, debugged and tested. Extensively used the available stages to redesign DataStage jobs for performing the required integration.

    Executed jobs through sequencer for better performance and easy maintenance.

    Involved in unit, performance and integration testing of Data Stage jobs.

    Used Data Stage Director to run and monitor the jobs for performance statistics. Involved in performance tuning of the jobs.

    Used T-SQL for validating the data generated at OLAP server.

    Wipro Technologies, Bangalore    08/2003 - 11/2004
    Quality Assurance Engineer

    Wipro Technologies is an Indian multinational, a leading provider of integrated business, technology and process solutions on a global delivery platform.

    Worked as a QA tester on a web-based E-billing application. The application has two modules, accounts payable (AP) and accounts receivable (AR), to keep track of transactions.

    Hardware/Software:

    Windows XP, ASP.net, Test Director 7.2, WinRunner 7.0, SQL Server 2000

    Responsibilities:

    Active participation in decision making and QA meetings; regularly interacted with the Business Analysts and development team to gain a better understanding of the Business Process, Requirements and Design.

    Analyzed business requirements with perspective to Black Box Testing, system

    architecture/design and converted them into functional requirements/test cases. Used Test Director to document the requirements and created traceability matrices for the

    requirements.

    Developed Test Plan that included the scope of the release, entrance and exit criteria and

    overall test strategy. Created detailed Test Cases and Test Sets and executed them manually.

    Performed cross-browser testing to verify whether the application provides accurate information in different browsers (IE, Netscape, Firefox, Safari).

    Extensively used Output and Checkpoint for verifying the UI properties and values using VB scripting.

    Performed back-end database verification manually and used WinRunner to automatically verify the database with the values entered during automated testing.

    Performed the back-end integration testing to ensure data consistency on the front end by writing and executing SQL queries.

    Provided management with metrics, reports, and schedules, and was responsible for entering and tracking bugs.

    Ensured that the Defect was always written with great level of detail.

    Sr. Data Stage Developer Resume

    Title: Sr. Data Stage Developer

    Primary Skills: 7 yrs of IT experience in the Analysis, Development and Implementation of Data Warehousing Systems

    Location: US-TX-Dallas (will consider relocating)

    Posted: Jun-12-12

    RESUME DETAILS

    Summary

    Over 7 years of total IT experience in the Analysis, Development and Implementation of Data Warehousing Systems (OLAP) using Datastage.

    Experience in Datastage 8.1/8.0.1/7.5/7.1 using components like Datastage Designer, Datastage Manager, Datastage Director, Datastage Administrator, Quality Stage, Information Server and Parallel Extender. Designed mapping documents, ETL architecture documents and specifications.

    Strong understanding of the principles of DW using fact tables, dimension tables, and star and snowflake schema modeling.

    Development of Logical and Physical data warehouse models to represent business needs and

    designed Dimensional models for data mart creation, allowing flexible high performance

    reporting of core data.

    High-end programming experience using PL/SQL, SQL*Loader, ODBC, SQL Navigator, Toad, Developer 2000, Unix Shell Scripting, Win NT and Sun Solaris.

    RDBMS experience with Oracle 11g/10g/9i/8i, SQL Server, DB2 including database

    development, PL/SQL programming.

    Transferred data from one computer to another over a network using FTP protocols.

    Having a good understanding of ERISA Planning and Fund Transfer Pricing in the fields of banking, finance and human resource management in various industries.


    Analyzed the Source Data and designed the source system documentation.

    Development of detailed specifications for data management scripts.

    Document the developed code for promotion to production environment.

    Used the Data stage Designer to develop processes for extracting, cleansing, transforming, integrating, and loading data into the data warehouse database.

    Set up the development, QA & Production environments. Migrated jobs from development to QA to Production environments.

    Defined production support methodologies and strategies.

    Design Technical specifications and Mapping documents with Transformation rules.

    Strong background in ETL tools, such as parallel Extender and Information Server.

    Extensive experience in dimensional modeling techniques, troubleshooting of Data Stage jobs, and performance tuning.

    Efficient in all phases of the development lifecycle, coherent with Data Cleansing, Data

    Conversion, Performance Tuning.

    Experience in Converting the Business Logic into Technical Specifications.

    Extensive experience in loading high volume data, and performance tuning.

    Written data stage routines to fulfill the business requirements.


    Strong experience in Quality Assurance using various environments like system integration testing, process scenario testing, user acceptance testing and operation readiness testing.

    Having good knowledge of ETM (Enterprise Tax management).

    Participated in discussions with the Project Manager, Business Analysts and Team Members on any technical and/or Business Requirement issues.

    Technical Skills:

    ETL Tools: Datastage 8.1/8.0.1/7.5.2/7.1 (Administrator, Manager, Designer, Director, Parallel Extender/Orchestrate, Information Server, Quality Stage), MVS

    OLAP Tools: Business Objects 6.5.1

    Operating Systems: Sun Solaris 8/7.0, IBM AIX 4.3, Windows NT 4.0/2000/2003/XP, Windows 7

    Languages: PL/SQL, C/C++, UNIX

    Database: Oracle 11g/10g/9i/8i/7.3, SQL Server 2000/2005/2008, DB2, Sybase and MQ Series

    GUI: Visual Basic 5.0/6.0

    Web Server: Apache Web Server, Java Web Server, WebLogic Server, IIS

    Other: Cyberfusion, SQL Assistant, MS Visio, Erwin, TOAD, Pattern Action Language (PAL), Autosys, Control M, TWS, ERwin, and Teradata

    Professional Experience

    Infor Global Solutions, Dallas, TX    Feb '11 to Present

    Sr. Data Stage Developer


    Responsibilities:

    Team-coordination and ensuring customer satisfaction by providing extensive support.

    Planning for project execution.

    Used Parallelism concepts for distributing load among different processors by implementing

    Pipeline and partitioning of data. Involved in Designing Parallel and server Jobs.

    Creating new batches that utilized parallelism better (Multi Instance)

    Simplifying the environment by removing unneeded objects.

    Reviewing ETL architecture and recommending changes as per Standards.

    Created and Monitored sequences using Data stage sequencer.

    Created containers to use in multiple jobs.

    Used debugger to test the data flow and fix jobs.

    Defined data base triggers & PL/SQL stored procedures for business validations.

    Extensively wrote Routines and Transforms.

    Involved in performance tuning of jobs.


    Extensively used Parallel Stages like Join, Merge, Lookup, Filter, Remove Duplicates, Funnel,

    Row Generator, slowly changing dimensions, Modify, Peek etc. for development and de-bugging

    purposes

    Responsible for metadata management, creating shared containers for reusability.

    Extensively worked with Data Stage Job Sequences to Control and Execute Data Stage Jobs

    and Job Sequences using various Activities and Trigger.

    Maintained the Data Stage Production and Development servers. Worked on various admin issues to keep the server up at all times.

    Actively worked during nights and weekends during go-live and post production.

    Environment: Windows 2008, IBM Data stage 8.0.1, Oracle 10g, SQL Server 2008R2.

    BNY Mellon Corp., Wilmington, DE    Jan '10 - Jan '11
    Sr. Data Stage Developer

    Responsibilities:

    Involved in understanding the scope of the application, the present schema and data model, and defining relationships within and between the groups of data.

    Interacted with the business users to better understand the requirements and document their expectations; handled the current process, modified and created jobs for the updated requirements, and handled the load process to the data mart and eventually the data warehouse.

    Used Ascential Data Stage 7.5 extensively to transfer and load data into the staging area and eventually into the Warehouse.

    Extensively used the Sequential File, Dataset, File Set, Modify and Lookup stages, transformations and other database plug-ins and stages provided by Data Stage to perform transformations and load the data.

    Designed mappings between sources and operational staging targets using a Star Schema; implemented logic for Slowly Changing Dimensions.

    Designed a job template that provides the environmental parameters for subsequent use in the projects.

    Extensively used Microsoft Visio for documentation and Control-M to schedule the jobs.

    Implemented debugging methodologies with breakpoint options.

    Environment: IBM WebSphere Data Stage 7.5.1, IBM AIX, DB2, Oracle 10g, UNIX, Microsoft Visio, Control-M, SQL*Plus.

    Stanley Works, New Britain, CT    Nov '08 - Dec '09
    Data Stage Developer

    Responsibilities:

    Used Parallel Extender for distributing load among different processors by implementing pipeline and partitioning of data in Parallel Extender.

    Involved in Designing Parallel Extender Jobs.

    Worked with business customers to identify the different sources of data in operational systems and developed strategies to build the data warehouse.

    Prepared technical specifications for the development of Extraction, Transformation and Loading (ETL) jobs to load data into various tables in the Data Marts.


    Developed several complex Data Stage jobs for loading Participants to the data warehouse.

    Provide technical support to team members for Design and developing Data stage Jobs.

    Used Data Stage Director and the runtime engine to schedule running the server jobs,

    monitoring scheduling and validating its components.

    Used Basic Routines to pass values from Sequencer to jobs in runtime.

    Coded parallel routines in C++ to cleanse and format data in the Transformer.

    Good Experience in working with Control M for Scheduling daily and monthly ETL's Jobs

    Worked with Business users to fix defects and implemented the best practices for conducting

    best unit testing and integration testing and documenting the test results for user acceptance.

    Environment: IBM WebSphere Data Stage 7.5.2 EE (Parallel Extender, Designer, Manager, Administrator, Director), Oracle 10g, DB2 UDB, SQL, PL/SQL, Shell Scripting, UNIX, Control-M.

    TIAA-CREF, Charlotte, NC    Dec '07 - Nov '08

    ETL Developer

    Responsibilities:

    Data profiling was done on various source systems to identify and accommodate the required data from these source systems into the EDW.

    Worked with Subject matter experts for Data Requirement Documents (DRD)

    Created a data dictionary to map the business requirements to attributes to design a logical data model implementing a star schema.

    Designed Translate values (common generalization of attributes) to maintain uniformity across

    the EDW.

    Involved in creating Technical Design documents and Unit and Integration Test Plans for development and testing activities.


    Developed a very complex Data validation and error handling jobs.

    Created Jobs to classify the data into subject areas as source jobs.

    Created Jobs to load the data into EDW from various subject areas as Target jobs.

    Automated the entire ETL flow using autosys.

    Used various standard routines and created custom routines to use in DataStage jobs.

    Extensively worked with DataStage Shared Containers for re-using the business functionality.

    Created Generic jobs using RCP to extract data from various source systems.

    Extensively wrote user-defined SQL code to override the auto-generated SQL queries in DataStage.

    Written batch Job Controls for automation of Execution of data Stage Jobs.

    Defined Stage variables for data validations and data filtering process.

    Worked on performance tuning to address very critical and challenging issues.

    Coordinated with the SIT and UAT teams to fix the Test Problem Reports.

    Environment: IBM Information Server (Data Stage and Quality Stage 8.0.1), Oracle 10g, Teradata V2R6, PL/SQL, AIX, OBIEE, Autosys, Borland StarTeam.

    Bharati Airtel Telecom, Hyderabad, India Aug '06 -- Oct '07

    ETL developer

    Responsibilities:

    Developed several transformations in the process of building Data warehouse database.

    Used server jobs to Extraction, Transformation and Load.

    Analyzed and designed source code documentation for investment Data Warehouse.

    Used DataStage Designer to develop various jobs to extract, cleanse, transform, integrate and load data into the Data Warehouse.

    Generated unique keys for composite attributes while loading the data into the Data Warehouse.

    Used COBOL programs to extract data from the mainframes and create a load ready file.

    Used DataStage Manager for managing the DataStage repository (view and edit), defining custom routines and transforms, importing and exporting items between different DataStage systems, and exchanging metadata with other data warehousing tools.

    Extensively used Parallel stages like Join, Merge, Lookup, Filter, Remove Duplicates, Funnel, Row Generator, Modify, Peek, etc. for development and debugging purposes.

    Extensively worked with Data Stage Job Sequences to control and execute Data Stage jobs and Job Sequences using various Activities and Triggers.

    Used DataStage Director and the runtime engine to schedule running the server jobs, monitoring scheduling and validating its components.

    Used Hashed files to extract and write data and also to act as intermediate files in a job. Hashed files were also used as reference tables based on a single key field.

    Environment:

    DataStage 7.5, Oracle 8i, SQL Server 2000, DB2 UDB, SQL, COBOL, JCL


    Mehar Health Care Corporation, New Delhi, India    Mar '05 - Aug '06
    Data Stage Developer

    Responsibilities:

    Involved in support for the existing ECLDW (ECL Data Warehouse). Monitored and resolved data warehouse ETL production issues.

    Worked on analyzing the issue, tracing the problem area, suggesting a solution and discussing it with the business.

    Implement the solution with extensive documentation.

    Understanding the functional requirement specifications and design documents.

    Analyzing the source and target flat forms.

    Designed Data Stage ETL jobs for extracting data from heterogeneous source systems and transforming it. Identified source systems, their connectivity, related tables and fields, and ensured data suitability for the jobs.

    Developed jobs according to ETL Specification to load test data. Involved in creating reusable components such as Shared Containers and Local Containers

    Used most of the transformations such as Transformer, Sequential File, Joiner, Lookup, Aggregator, etc.

    Created unit test cases for jobs.

    Involved in support activities after the implementation; created extensive documentation for the production support process improvements.

    Environment: Windows 2003, Ascential Data stage 7.5.1 (Parallel), Oracle 9i.

    Kapil Chhabra
    Sr. DataStage Developer at IBM DataStage

    Iselin, NJ

    Over 7.5 years of IT experience in all phases of the software development life cycle, which includes analysis, designing, implementation and testing in various domains like retail, insurance and pharmaceuticals. Extensive data warehousing experience using IBM Datastage 8.x/7.x (Datastage Designer, Manager, Director and Administrator). Expert in data modeling using the Erwin tool. Expert in working on various operating systems: IBM AIX 5.2/5.1, Sun Solaris, Windows. Good understanding of Ralph Kimball's data warehouse methodology. Designed both server and parallel jobs using various stages like ODBC, Transformer, Join, Merge, Lookup, Funnel, Aggregator, Copy and Filter. Experienced in using advanced Datastage plug-in stages like XML Input and XML Output. Experienced in integrating different data sources: DB2, Oracle, MS SQL Server, Sybase, Teradata and MS Access. Proficient in Korn shell and Perl scripting. Proficient in using Autosys for scheduling Datastage jobs. Proficient in implementation and testing of triggers, procedures and functions for data handling. Experienced in Unix shell scripting as part of file manipulation, scheduling and text processing. Proficient in unit testing, system integration testing and performance tuning of Datastage jobs. Experienced in working with ALTOVA MAPFORCE for mapping of XML files, databases and text files. Experience in configuring and installing Datastage on servers. Experience in working with XML schemas. Experienced with data modeling tools like Erwin, Oracle Designer and MS Visio. Experienced with Quality Stage for data profiling, standardization, matching and survivorship. Used Information Analyzer for column analysis, primary key analysis, foreign key analysis and cross-domain analysis. Willing to adapt to new environments and learn new technologies.

    Work Experience

    Sr. DataStage Developer

    IBM DataStage

    May 2008 to Present


    IBM DataStage 8.0/7.5/7.1/7.0/6.0, IBM Information Server 8.1, IBM Information Analyzer 8.1, Business Objects 5.x/6.x, Cognos 6.0/7.0/7.1, Crystal Reports 9.0, AIX, UNIX, Linux, Win NT/2000, MS-DOS, Visual Basic 6.0, C, JAVA, PERL, COBOL, HTML, Oracle 11g/10g/9i/8i/8.0/7.3, DB2, Teradata, SQL Server 2008/2005, Developer 2000, MS Access 2000, TOAD 6.3, Altova Mapforce, Erwin 4.0, Autosys.

    Projects Summary:

    Emblem Health, NY    5/2008 - Present

    Sr. DataStage Developer

    EmblemHealth is set on being the mark of good health in the Northeast. The not-for-profit company provides health insurance through subsidiaries Group Health Incorporated (GHI) and the Health Insurance Plan of Greater New York (HIP). Collectively, the two health insurers cover some 4 million New Yorkers, primarily state government and New York City employees. GHI and HIP joined together under the EmblemHealth banner in 2006.

    The main purpose of the project was to build a claims data mart for life insurance claims, dental claims, QCARE and PPO claims. The collected information will be used for claims approval or rejection.

    Hardware/Software: IBM Information Server 8.1, IBM Information Analyzer 8.1, IBM Data Stage 8.1 (Designer, Manager, Director), Erwin 4.1, Windows XP, Unix Shell scripting, TOAD 8.1, Oracle 10g, MS SQL Server, flat files, XML files, Autosys, Cognos.

    Responsibilities: Identified business needs, evaluated business and technical alternatives,

    recommended solutions and participated in their implementation.

    Requirement gathering and business analysis by attending requirement-gathering sessions.

    Used DataStage Designer to create the table definitions for the CSV and flat files, import the

    table definitions into the repository, import and export the projects, release and package the jobs.

    Developed ETL processes to extract the source data and load it into the enterprise-wide data

    warehouse after cleansing, transforming and integrating.

    Imported metadata from repository, created new job categories, routines and data elements.

    Designed and developed jobs using DataStage Designer as per the mapping specifications

    using appropriate stages. Used Perl for file parsing for loading into destination.

    Created JIL files for Datastage jobs entry in Autosys.

    Worked extensively with AutoSys for scheduling jobs; provided production support for AutoSys.

    Worked with HP Quality Center to resolve data-related defects.

    Used Unix scripts to copy files from the Development Server to the Production Server for testing Datastage jobs.

    Worked with the Unix admin to resolve Datastage bug-related errors.

    Developed job sequences to execute a set of jobs with restart ability, check points and

    implemented proper failure actions

    Wrote UNIX shell scripts to read parameters from files for invoking DataStage jobs.

    Created source-to-target mapping and job design documents from the staging area to the Data Warehouse.

    Used DataStage Director and its run-time engine to schedule running the solution, testing and

    debugging its components, and monitoring the resulting executable versions (on an ad hoc orscheduled basis).

    Worked on troubleshooting, performance tuning, performance monitoring and enhancement

    of DataStage jobs.

    Wrote several complex SQL queries to extensively test the ETL process.

    Provided production support and performed enhancement on existing multiple projects.

    Worked with Autosys for setting up production job cycles for daily, weekly and monthly loads with proper dependencies.

    Performed unit testing and system integration testing in dev and UAT environments.

    ETL Developer

    Wells Fargo

    -

    St. Louis, MO

    June 2007 to May 2008

    Wells Fargo & Co. is a diversified financial services company with operations around the world. Wells Fargo is the fourth largest bank in the US. The aim of the project was to migrate jobs from Datastage 7.5 to Datastage 8.0 and provide the required data to the production team for managing the enterprise sales data warehouse, which contained the information about the products, business and customers.

    Hardware/Software: Data Stage 8.0 (Parallel Extender EE and Server Edition 7.5.1), Quality Stage, Teradata, Oracle 10g, Cognos, PL/SQL, Sun Solaris 5.8, ALTOVA MAPFORCE, TOAD 9.1, Unix Shell scripting.

    Responsibilities: Involved in Analysis, Requirements gathering, function/technical specification, development, deploying and testing.

    Created snapshots for the transactional tables in the distributed databases.
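
    A hedged sketch of such a snapshot as an Oracle materialized view; the table ACCOUNT_TXN and the database link PROD_DB are hypothetical, and fast refresh assumes a materialized view log exists on the master table.

        -- Read-only snapshot of a remote transactional table, refreshed hourly.
        CREATE MATERIALIZED VIEW account_txn_snap
          REFRESH FAST
          START WITH SYSDATE
          NEXT SYSDATE + 1/24
        AS
        SELECT txn_id, account_id, txn_amt, txn_dt
          FROM account_txn@prod_db;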

    Analyzed existing legacy data source and extracted data from it using Datastage.

    Involved in critical multiple-instance DataStage jobs that send outbound files for

    different LOBs (lines of business) at the same time, and monitored the jobs accordingly.

    Scheduled the parallel jobs using DataStage Director, which is controlled by the DataStage

    engine, and used it for monitoring and gathering performance statistics of each stage.

    Used ALTOVA Mapforce Tool to do mapping between source and destination XML and

    CSV files and exported table definition using datastage director.

    Created Error Tables containing data with discrepancies to analyze and re-process the data.

    Extensively used Parallel Stages like Join, Merge, Lookup, Filter, Aggregator, Modify, Copy,

    Sort, Funnel, Change Data Capture, Remove Duplicates, Surrogate key Generator, Row

    Generator, Column Generator, and Peek for development and debugging purposes.

    Used DataStage Manager to import and export DataStage components and import table definitions from source databases.

    Used the Folder stage and RTI stage to extract data from .xml type files and load into Oracle

    database.

    Executed backend testing of the application by writing complex SQL queries.

    Involved in the design, development and testing of the PL/SQL stored procedures, packages

    and triggers for the ETL processes.

    Wrote SQL scripts to create and drop the indexes that are passed as parameters in the pre- and

    post-sessions.
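
    A minimal sketch of such a pre/post-session script, with the index, table and connection details as placeholder assumptions:

#!/bin/ksh
# Hypothetical sketch: drop an index before a bulk load (PRE) and recreate it
# afterwards (POST). Index/table/column names and the Oracle connect string
# (taken from the environment) are placeholder assumptions.
ACTION=$1        # PRE or POST
INDEX_NAME=$2    # e.g. IDX_SALES_FACT_DT
TABLE_NAME=$3    # e.g. SALES_FACT
COLUMN_LIST=$4   # e.g. LOAD_DT

if [ "$ACTION" = "PRE" ]; then
    SQL="DROP INDEX $INDEX_NAME;"
else
    SQL="CREATE INDEX $INDEX_NAME ON $TABLE_NAME ($COLUMN_LIST);"
fi

echo "$SQL" | sqlplus -s "$ORA_USER/$ORA_PASS@$ORA_SID"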

    Removed errors while migrating jobs from datastage 7.5 to 8.0.

    Developed shell scripts for job scheduling and logging.

    Involved in performance tuning of SQL queries by providing hints, obtaining the Explain Plan,

    analyzing tables and adding indexes.
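
    A minimal sketch of how such a plan might be captured from the shell, with the query, hint and connection details as placeholder assumptions:

#!/bin/ksh
# Hypothetical sketch: generate the execution plan for a candidate query so
# that hints and index changes can be evaluated. Table, hint and connect
# details are placeholder assumptions.
sqlplus -s "$ORA_USER/$ORA_PASS@$ORA_SID" <<'EOF'
EXPLAIN PLAN FOR
  SELECT /*+ INDEX(s IDX_SALES_FACT_DT) */ *
  FROM   sales_fact s
  WHERE  s.load_dt = TO_DATE('2008-01-31', 'YYYY-MM-DD');

SELECT * FROM TABLE(DBMS_XPLAN.DISPLAY);
EXIT;
EOF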

    Involved in full integration test and code reviews of all jobs within each sequence before

    migrating the jobs and sequencers from the Development environment (Dev) to the QA

    environment, and from the QA environment to the Production environment.

    ETL Developer

    Merck and Co, Inc

    -

    Rahway, NJ

    April 2006 to June 2007

    Merck and Co., Inc. is one of the largest pharmaceutical companies in the world, headquartered in

    Whitehouse Station. The main purpose of the Monitor project was to perform performance

    analysis of the marketing campaigns launched by Merck. The project assisted the Marketing department to keep track of the money invested in the campaigns and keep a check on the flow of

    money by monitoring the quality of marketing in the project. The primary responsibility

    included creating DataStage jobs to extract data from 4 different databases, namely the consumer products, prescription products, vaccines and veterinarian medicine databases, and then transforming and loading the data into data marts.

    Hardware/Software: Ascential Datastage 7.x (Designer, Administrator, Director and Manager),

    flat files, MS SQL Server, Sybase, Information Analyzer, Oracle 9i, PL/SQL, Unix Shell scripts, TOAD, SQL*Loader, UNIX AIX and MetaStage.

    Responsibilities: Created numerous project functional and scope documents for ETL

    processes.

    Performed requirement gathering, business analysis and project coordination in collaboration with the

    business analyst and database administrator.

    Created various parallel jobs that joined tables from the 4 primary databases (consumer, prescription, vaccine and

    veterinarian) into the target master sales database, building data marts in the target

    destination from the source databases. Identified and documented data sources and transformation rules required to maintain the data

    warehouse.

    Developed several parallel jobs to load data from sequential files, flat files, MS SQL Server

    and Oracle 9i.

    Used MetaStage to import and export table definitions shared by multiple users.

    Used information analyzer to perform primary key, foreign key and cross domain analysis.

    Used different stages in DataStage Designer, like Aggregator, Join, Lookup and Merge, to

    design jobs.

    Used job sequencer to design various dependent jobs.

    Performed job tuning of DataStage jobs.

    Involved in unit, system integration testing and moved jobs from development environment to

    production environment using datastage administrator.

    Used TOAD for writing SQL routines and functions.

    Used Unix shell scripting for file manipulation.

    ETL Consultant

    Ranbaxy Pharmaceutical

    February 2005 to March 2006

    Ranbaxy Pharmaceuticals Limited is India's largest pharmaceutical company and exports products to more than 125 countries. The project built the Drug Distribution Data (DDD) mart, which is used to track sales activity for every major

    distribution channel, including hospitals, clinics, mail service, major retail food stores and

    chains, mass merchandisers and independent pharmacies. The requirement was to develop an industrial-strength enterprise data warehouse that integrates and standardizes aggregate- and

    individual-level customer data. With individual-level detail, Ranbaxy could more precisely target

    mass marketing campaigns, while providing higher quality service to individual suppliers.

    Hardware/Software: Ascential DataStage 7.0, DataStage Basic, Erwin, Oracle 9i, SQL, PL/SQL, Toad, Command Center, Shell Scripts, Solaris and Windows 2000.

    Responsibilities: Involved in the requirement definition and analysis in support of Data

    Warehousing.

    Interpreted logical and physical data models for Business users to determine data definitions

    and establish referential integrity of the system.

    Installed and configured DataStage client and server components, connections with the Oracle

    database in Data warehouse using ODBC

    Developed ETL procedures to ensure conformity, compliance with standards and lack of

    redundancy, translating business rules and functionality requirements into ETL procedures.

    Involved in the implementation of this application, which involved the Extraction,

    Transformation, and Loading of data into an Oracle database.

    Extensively used DataStage Designer to develop various jobs to extract, cleanse, transform,

    integrate and load data into the DDD Data mart. Scheduled the parallel jobs using DataStage Director, which is controlled by the DataStage

    engine, and used it for monitoring and gathering performance statistics of each stage.

    Created Error Tables containing data with discrepancies to analyze and re-process the data.

    Worked with mainframes to extract some of the source files.

    Extensively used DataStage XE Parallel Extender to perform processing of massive data

    volumes.

    Developed Stored Procedures for complex jobs.

    Extensively wrote user-defined SQL to modify/override the generated SQL queries in

    DataStage.

    Worked with DataStage Manager to import/export metadata from database, jobs and routinesbetween DataStage projects.

    Used Data stage Director to schedule, monitor, cleanup resources and run the job with several

    invocation ids.

    Developed user-defined Routines and Transforms using the DataStage BASIC language.

    Responsible for daily verification that all scripts, downloads, and file copies were executed as

    planned, troubleshooting any steps that failed, and providing both immediate and long-term

    problem resolution.

    Wrote SQL and PL/SQL queries, stored procedures and triggers for implementing business rules and

    transformations.

    Developed UNIX scripts to automate the Data Load processes to the target Data warehouse.
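
    One common way to automate such loads on Oracle is a SQL*Loader wrapper; a minimal sketch, with directories, control file and connect string as placeholder assumptions rather than the project's actual scripts:

#!/bin/ksh
# Hypothetical sketch of a nightly load wrapper: pick up the day's extract
# files, load them with SQL*Loader and archive them on success.
DATA_DIR=/data/ddd/inbound
ARCH_DIR=/data/ddd/archive
CTL_FILE=/opt/etl/ctl/ddd_sales.ctl
LOG_DIR=/opt/etl/logs
STAMP=$(date +%Y%m%d)

for f in "$DATA_DIR"/ddd_sales_*.dat; do
    [ -f "$f" ] || continue
    sqlldr userid="$ORA_USER/$ORA_PASS@$ORA_SID" \
           control="$CTL_FILE" data="$f" \
           log="$LOG_DIR/$(basename "$f").$STAMP.log"
    if [ $? -eq 0 ]; then
        mv "$f" "$ARCH_DIR/"
    else
        echo "Load failed for $f" >&2
        exit 1
    fi
done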

    Insurance Claims Management

    Lantech Solutions Ltd

    June 2004 to February 2005

    ETL Developer

    This system monitors and maintains all kinds of insurance claims in different aspects. It validates and registers the claims and then processes them accordingly. It also handles customer transactions

    with the company and keeps track of the transaction history. Also developed a couple of reports for claims auditing.

    Hardware/Software: Data Stage, Oracle 8i, Unix Shell scripts, PL/SQL, SQL and Windows 2000.

    Responsibilities: Involved in the requirements gathering, analysis to define business and

    functional specifications.

    Assisted Systems Administrator in DataStage installation and maintenance.

    Maintained Error Logs / Audit Trails.

    Enforced data integrity and business rules while maintaining huge volumes of data.

    Wrote Unix Shell Scripts to schedule the job.
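
    A minimal sketch of such scheduling via cron, assuming a hypothetical wrapper script and log path:

#!/bin/ksh
# Hypothetical sketch: add a 02:30 nightly crontab entry for the claims load
# wrapper, preserving any existing entries. Paths are placeholder assumptions.
( crontab -l 2>/dev/null
  echo "30 2 * * * /opt/etl/scripts/run_claims_load.ksh >> /opt/etl/logs/claims_load.cron.log 2>&1"
) | crontab -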

    Involved in the process design documentation of the Data Warehouse Dimensional Upgrades.

    Wrote PL/SQL procedures to transform data from staging to Data Warehouse fact and dimension

    tables.

    Created various DataStage Hash files.

    Applied Performance Tuning logic to optimize session performance.

    Wrote SQL scripts to create and drop the indexes that are passed as parameters in the pre &

    post sessions.

    Performed unit and integration testing.

    Software Engineer

    ACE Tech International, India

    February 2003 to June 2004

    The project is called IMS (Inventory Management System). The inventory project is designed to meet the requirements of computerized stores in a company. The system contains five Master

    tables, two Transactions and one Report. In the Item Master, the cost price of an item cannot be greater

    than the selling price. The vendor code is a number generated automatically by the system using Data Areas. In the Purchase Order Placement Master, the quantity supplied should be less than or equal to the quantity

    ordered. The vendor code is obtained from the existing vendor item table. The required date should always be

    ahead of the order date, and the receipt date should also be entered in the stock master file as the date of last

    receipt for that item code. The quantity on hand is the receipt quantity plus the already existing

    quantity in the master, and the date of last issue for an item comes from issue transactions. The indent quantity is greater than or equal to the issue quantity.

    Hardware/Software: Oracle 7.x, Visual Basic 6.0, Windows.

    Responsibilities: Involved in system analysis, designing data dictionary, development, testing

    and implementation

    Created Tables and Stored Procedures using Oracle 7.x

    Involved in design, development and Testing phase of the project

    Acquired data specifications and client requirements.

    Designed and developed the VB interfaces.

    Arranged ODBC drivers to connect to the Oracle database.

    Designed to meet the requirements of computerized stores.

    Studied business process and understood core business issues.

    Generated invoice reports for customers using Crystal Reports 4.5.

    Performed unit code and business function testing.

    Additional Information

    Skillset:

    IBM DataStage 8.0/7.5/7.1/7.0/6.0, IBM Information Server 8.1, IBM Information Analyzer 8.1, Business Objects 5.x, 6.x, Cognos 6.0/7.0/7.1 and Crystal Reports 9.0, AIX, UNIX, Linux, Win

    NT/2000 and MS-DOS, Visual Basic 6.0, C, JAVA, PERL, COBOL and HTML, Oracle 11g/10g/9i/8i/8.0/7.3, DB2, Teradata, SQL Server 2008/2005, Developer 2000, MS-Access 2000, TOAD 6.3, Altova MapForce, Erwin 4.0, Autosys.

    Projects Summary:

    Emblem Health, NY 5/2008- Present

    Sr. DataStage Developer

    EmblemHealth is set on being the mark of good health in the Northeast. The not-for-profit

    company provides health insurance through subsidiaries Group Health Incorporated (GHI) and the Health Insurance Plan of Greater New York (HIP). Collectively, the two health insurers

    cover some 4 million New Yorkers, primarily state government and New York City employees. GHI and HIP joined together under the EmblemHealth banner in 2006. The main purpose of the project was to build a claims data mart for life insurance claims, dental

    claims, QCARE and PPO claims. The collected information will be used for claims approval or

    rejection.

    Hardware/Software:

    IBM Information Server 8.1, IBM Information Analyzer 8.1, IBM Data Stage 8.1 (Designer,

    Manager, Director), Erwin 4.1, Windows XP, Unix Shell scripting, TOAD 8.1, Oracle 10g, MS SQL Server, flat files, XML files, Autosys, Cognos.

    Responsibilities:

    Identified business needs, evaluated business and technical alternatives, recommendedsolutions and participated in their implementation.

    Performed requirement gathering and business analysis by attending requirement sessions.

    Used DataStage Designer to create the table definitions for the CSV and flat files, import the

    table definitions into the repository, import and export the projects, release and package the jobs.

    Developed ETL processes to extract the source data and load it into the enterprise-wide data

    warehouse after cleansing, transforming and integrating.

    Imported metadata from repository, created new job categories, routines and data elements.

    Designed and developed jobs using DataStage Designer as per the mapping specifications

    using appropriate stages.

    Used Perl for file parsing before loading into the destination. Created JIL files to define DataStage job entries in Autosys.

    Worked extensively with AutoSys for scheduling jobs and provided production support for

    AutoSys.
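
    A minimal sketch of the kind of production-support checks this involves, with the box and job names as placeholder assumptions; autorep and sendevent are the standard AutoSys client commands:

#!/bin/ksh
# Hypothetical sketch: report on the last run of a box, look for failed jobs
# under it, and force-start it once the underlying issue is fixed.
BOX=DS_CLAIMS_DAILY_BOX

autorep -j "$BOX" -d                       # detailed run report for the box
autorep -j "${BOX}%" | grep -i failure     # any failed jobs under the box?

sendevent -E FORCE_STARTJOB -J "$BOX"      # restart after the fix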

    Worked with HP Quality Center to resolve data-related defects.

    Used Unix Script to copy files from Development Server to Production Server for testing

    Datastage jobs.
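
    A minimal sketch of such a copy script, with host names, users and directories as placeholder assumptions:

#!/bin/ksh
# Hypothetical sketch: push test files from the development server to the
# target server and stop on the first failed transfer.
SRC_DIR=/data/dev/outbound
DEST=dsadm@prodserver01:/data/prod/inbound

for f in "$SRC_DIR"/*.csv; do
    [ -f "$f" ] || continue
    scp -p "$f" "$DEST" || { echo "Copy failed for $f" >&2; exit 1; }
    echo "Copied $(basename "$f")"
done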

    Worked with the UNIX admin to resolve DataStage bug-related errors.

    Developed job sequences to execute a set of jobs with restartability and checkpoints, and

    implemented proper failure actions.

    Wrote UNIX shell scripts to read parameters from files for invoking DataStage jobs.

    Created source to target mapping and job design documents from staging area to Data

    Warehouse. Used DataStage Director and its run-time engine to schedule running the solution, testing and

    debugging its components, and monitoring the resulting executable versions (on an ad hoc or

    scheduled basis).

    Worked on troubleshooting, performance tuning, performance monitoring and enhancement

    of DataStage jobs.

    Wrote several complex SQL queries to extensively test the ETL process.
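
    A minimal sketch of one such test, a staging-versus-fact row-count reconciliation, with table names and connect string as placeholder assumptions:

#!/bin/ksh
# Hypothetical sketch: compare row counts between a staging table and the
# warehouse fact table for the current load date.
sqlplus -s "$ORA_USER/$ORA_PASS@$ORA_SID" <<'EOF'
SELECT 'STG'  AS layer, COUNT(*) AS row_cnt FROM stg_claims  WHERE load_dt = TRUNC(SYSDATE)
UNION ALL
SELECT 'FACT' AS layer, COUNT(*) AS row_cnt FROM claims_fact WHERE load_dt = TRUNC(SYSDATE);
EXIT;
EOF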

    Provided production support and performed enhancement on existing multiple projects.

    Worked with Autosys for setting up production job cycles for daily, weekly and monthly loads

    with proper dependencies.

    Performed unit testing and system integration testing in dev and UAT environments.

    Wells Fargo, St. Louis, MO 6/2007 - 5/2008

    ETL Developer

    Wells Fargo & Co. is a diversified financial services company with operations around the world. Wells

    Fargo is the fourth largest bank in the US. The aim of the project was to migrate jobs from DataStage 7.5 to DataStage 8.0 and provide the required data to the production team for managing the enterprise sales data

    warehouse, which contained information about products, business and customers.

    Hardware/Software:

    Data Stage 8.0 (Parallel Extender EE and Server Edition 7.5.1), Quality Stage, Teradata, Oracle

    10g, Cognos, PL/SQL, Sun Solaris 5.8, ALTOVA MapForce, TOAD 9.1, Unix Shell

    scripting.

    Responsibilities:

    Involved in Analysis, Requirements gathering, function/technical specification, development,

    deploying and testing.

    Created snapshot for the transactional tables in distributed databases.

    Analyzed existing legacy data source and extracted data from it using Datastage.

    Involved in critical multiple-instance DataStage jobs that send outbound files for

    different LOBs (lines of business) at the same time, and monitored the jobs accordingly.

    Scheduled the parallel jobs using DataStage Director, which is controlled by the DataStage

    engine, and used it for monitoring and gathering performance statistics of each stage.

    Used ALTOVA Mapforce Tool to do mapping between source and destination XML and

    CSV files and exported table definition using datastage director.

    Created Error Tables containing data with discrepancies to analyze and re-process the data.

    Extensively used Parallel Stages like Join, Merge, Lookup, Filter, Aggregator, Modify, Copy,

    Sort, Funnel, Change Data Capture, Remove Duplicates, Surrogate key Generator, Row

    Generator, Column Generator, and Peek for development and debugging purposes.

    Used DataStage Manager to Import and Export DataStage components and Import table

    definitions from source databases.

    Used the Folder stage and RTI stage to extract data from .xml type files and load into Oracle

    database.

    Executed backend testing of the application by writing complex SQL queries.

    Involved in the design, development and testing of the PL/SQL stored procedures, packages

    and triggers for the ETL processes. Wrote SQL scripts to create and drop the indexes that are passed as parameters in the pre- and

    post-sessions.

    Removed errors while migrating jobs from datastage 7.5 to 8.0.

    Developed shell scripts for job scheduling and logging.

    Involved in performance tuning of SQL queries by providing hints, obtaining Explain Plan,

    Analyzing tables and adding indexes.

    Involved in full integration test and code reviews of all jobs within each sequence before

    migrating the jobs and sequencers from the Development environment (Dev) to the QA

    environment, and from the QA environment to the Production environment.

    Merck and Co, Inc, Rahway, NJ 4/2006 - 6/2007

    ETL Developer

    Merck and Co., Inc. is one of the largest pharmaceutical companies in the world, headquartered in Whitehouse Station. The main purpose of the Monitor project was to perform performance

    analysis of the marketing campaigns launched by Merck. The project assisted the Marketing

    department to keep track of the money invested in the campaigns and keep a check on the flow of money by monitoring the quality of marketing in the project. The primary responsibility

    included creating DataStage jobs to extract data from 4 different databases, namely the consumer

    products, prescription products, vaccines and veterinarian medicine databases, and then transforming and

    loading the data into data marts.

    Hardware/Software:

    Ascential Datastage 7.x (Designer, Administrator, Director and Manager), flat files, MS

    SQL Server, Sybase, Information Analyzer, Oracle 9i, PL/SQL, Unix Shell scripts, TOAD, SQL*Loader, UNIX AIX and MetaStage.

    Responsibilities:

    Created numerous project functional and scope documents for ETL processes.

    Performed requirement gathering, business analysis and project coordination in collaboration with the

    business analyst and database administrator.

    Created various parallel jobs that joined tables from the 4 primary databases (consumer, prescription, vaccine and

    veterinarian) into the target master sales database, building data marts in the target

    destination from the source databases. Identified and documented data sources and transformation rules required to maintain the data

    warehouse.

    Developed several parallel jobs to load data from sequential files, flat files, MS SQL Server

    and Oracle 9i.

    Used MetaStage to import and export table definitions shared by multiple users.

    Used information analyzer to perform primary key, foreign key and cross domain analysis.

    Used different stages in DataStage Designer, like Aggregator, Join, Lookup and Merge, to

    design jobs.

    Used job sequencer to design various dependent jobs.

    Performed job tuning of DataStage jobs.

    Involved in unit, system integration testing and moved jobs from development environment to

    production environment using datastage administrator. Used TOAD for writing SQL routines and functions.

    Used Unix shell scripting for file manipulation.

    Ranbaxy Pharmaceutical, NJ 2/2005 - 3/2006

    Analytics Enablement

    ETL Consultant

    Ranbaxy Pharmaceuticals Limited is India's largest pharmaceutical company and exports products to more than 125 countries.

    The project built the Drug Distribution Data (DDD) mart, which is used to track sales activity for every major

    distribution channel, including hospitals, clinics, mail service, major retail food stores and

    chains, mass merchandisers and independent pharmacies. The requirement was to develop an industrial-strength enterprise data warehouse that integrates and standardizes aggregate- and

    individual-level customer data. With individual-level detail, Ranbaxy could more precisely target

    mass marketing campaigns, while providing higher quality service to individual suppliers.

    Hardware/Software:

    Ascential DataStage 7.0, Datastage Basic, Erwin, Oracle 9i, SQL, PL/SQL, Toad, Command

    Center, Shell Scripts, Solaris and Windows 2000.

    Responsibilities:

    Involved in the requirement definition and analysis in support of Data Warehousing.

    Interpreted logical and physical data models for Business users to determine data definitions

    and establish referential integrity of the system.

    Installed and configured DataStage client and server components, connections with the Oracle

    database in Data warehouse using ODBC

    Developed ETL procedures to ensure conformity, compliance with standards and lack of

    redundancy, translating business rules and functionality requirements into ETL procedures.

    Involved in the implementation of this application, which involved the Extraction,

    Transformation, and Loading of data into an Oracle database.

    Extensively used DataStage Designer to develop various jobs to extract, cleanse, transform,

    integrate and load data into DDD Data mart.

    Scheduled the parallel jobs using DataStage Director, which is controlled by the DataStage

    engine, and used it for monitoring and gathering performance statistics of each stage. Created Error Tables containing data with discrepancies to analyze and re-process the data.

    Worked with mainframes to extract some of the source files.

    Extensively used DataStage XE Parallel Extender to perform processing of massive data

    volumes.

    Developed Stored Procedures for complex jobs.

    Extensively wrote user-defined SQL to modify/override the generated SQL queries in

    DataStage.

    Worked with DataStage Manager to import/export metadata from database, jobs and routines

    between DataStage projects.

    Used Data stage Director to schedule, monitor, cleanup resources and run the job with several

    invocation ids.

    Developed user-defined Routines and Transforms using the DataStage BASIC language. Responsible for daily verification that all scripts, downloads, and file copies were executed as

    planned, troubleshooting any steps that failed, and providing both immediate and long-term

    problem resolution.

    Wrote SQL and PL/SQL queries, stored procedures and triggers for implementing business rules and

    transformations.

    Developed UNIX scripts to automate the Data Load processes to the target Data warehouse.

    Lantech Solutions Ltd, India 6/2004 -2/2005

    Insurance Claims Management

    ETL Developer

    This system monitors and maintains all kinds of insurance claims in different aspects. It validates and registers the claims and then processes them accordingly. It also handles customer transactions

    with the company and keeps track of the transaction history. Also developed a couple of reports

    for claims auditing.

    Hardware/Software:

    Data Stage, Oracle 8i, Unix Shell scripts, PL/SQL, SQL and Windows 2000.

    Responsibilities:

    Involved in the requirements gathering, analysis to define business and functional

    specifications.

    Assisted Systems Administrator in DataStage installation and maintenance.

    Maintained Error Logs / Audit Trails.

    Enforced data integrity and business rules while maintaining huge volumes of data.

    Wrote Unix Shell Scripts to schedule the job.

    Involved in the process design documentation of the Data Warehouse Dimensional Upgrades.

    Wrote PL/SQL procedures to transform data from staging to Data Warehouse fact and dimension

    tables.

    Created various DataStage Hash files.

    Applied Performance Tuning logic to optimize session performance.

    Wrote SQL scripts to create and drop the indexes that are passed as parameters in the pre &

    post sessions.

    Performed unit and integration testing.

    ACE Tech International, India 2/2003 - 6/2004

    Inventory Management System

    Software Engineer

    The project is called IMS (Inventory Management System). The inventory project is designed to

    meet the requirements of computerized stores in a company. The system contains five Master

    tables, two Transactions and one Report. In the Item Master, the cost price of an item cannot be greater

    than the selling price. The vendor code is a number generated automatically by the system using Data

    Areas. In the Purchase Order Placement Master, the quantity supplied should be less than or equal to the quantity ordered. The vendor code is obtained from the existing vendor item table. The required date should always be

    ahead of the order date, and the receipt date should also be entered in the stock master file as the date of last

    receipt for that item code. The quantity on hand is the receipt quantity plus the already existing quantity in the master, and the date of last issue for an item comes from issue transactions. The indent quantity is greater than or equal to the issue quantity.

    Hardware/Software:

    Oracle 7.x, Visual Basic 6.0, Windows.

    Responsibilities:

    Involved in system analysis, designing data dictionary, development, testing and

    implementation

    Created Tables and Stored Procedures using Oracle 7.x

    Involved in design, development and Testing phase of the project

    Acquired data specifications and client requirements. Designed and developed the VB interfaces.

    Arranged ODBC drivers to connect to the Oracle database.

    Designed to meet the requirements of computerized stores.

    Studied business process and understood core business issues.

    Generated invoice reports for customers using Crystal Reports 4.5.

    Performed unit code and business function testing.

    Resume for ETL Developer (Data Stage)

    Name :

    Professional Summary :

    Overall 3.2 years of total IT experience out of which 2+ years of experience as ETL

    Developer using DataStage Parallel Extender and 1 year experience as SAS

    programmer.

    Involved in performance tuning of DataStage jobs at the stage level and buffer level.

    Applied multiple lookups using lookup file set.

    Used DS manager for nodes creation and dataset management.
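
    A minimal sketch of routine dataset housekeeping with orchadmin, assuming a hypothetical configuration-file path and dataset names (the parallel engine environment must already be sourced):

#!/bin/ksh
# Hypothetical sketch: point APT_CONFIG_FILE at the node configuration in use,
# then inspect and clean up parallel datasets. Paths are placeholder assumptions.
export APT_CONFIG_FILE=/opt/Ascential/DataStage/Configurations/default.apt

orchadmin describe /data/ds/datasets/customer_dim.ds   # schema and segment info
orchadmin ll /data/ds/datasets/customer_dim.ds         # list the data files
orchadmin rm /data/ds/datasets/old_stage_*.ds          # remove obsolete datasets cleanly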

    Good knowledge in data warehouse concepts like Star Schema, Snow Flake,

    Dimension and Fact tables

    Implemented Local containers for same job and shared containers for multiple jobs,

    which have the same business logic.

    Good domain knowledge in Finance.

    Worked with Delta data.

    Performed various activities like import and export of Data Stage Projects.

    Unit Testing & Integration testing

    Good Knowledge in Oracle9i

    Skill Set

    ETL Tools : DataStage 7.5, DataStage 7.5x2

    Operating Systems : MS-DOS, Windows-98/2000/NT/XP, Unix.

    Database : Oracle 8i, MS-Access, SQL Server 2000.

    Languages : SAS 8.0 (Base SAS, SAS Access, SAS STAT, SAS EG 3.0)

    Designing Tools : HTML, Java Script, XML

    Education

    MCA with First Class from Acharya Nagarjuna University

    Experience Summary

    Working as an ETL Developer

    Company : Satyam Computers, Secunderabad

    Duration : Feb 2006 to Till Date.

    Worked as a SAS Programmer

    Company : SLG Technologies, Bangalore

    Duration : Dec 2004 to January 2006.

    Professional Experience

    Projects

    #1 Name : IBS Oct 2007 to Till Date

    Client : RBC

    Team Size : 5

    Environment : DataStage 7.5X2, Oracle 9i, Unix and Windows 2003.

    Description:

    Royal Bank of Canada has 70,000 full- and part-time employees serving 15 million

    clients through offices in North America and 34 countries around the world. RBC offers a

    full range of financial products and services of personal and commercial banking,

    wealth management services, insurance, corporate and investment banking and

    transactions processing services on a global basis. The objective of this project is to

    design and maintain a complete data warehouse system for Royal Bank management.

    This Project is to provide different reports and support adhoc queries for making

    intelligent banking decisions based on data available in various branches across the

    country collected over a period of time. The activities of the project ranged from data

    extraction, staging and loading into the data warehouse through DataStage to development for

    end users.

    Contribution: As an ETL Developer

    Used transformations like Lookup, Filter, Aggregator and Transformer in the development of DataStage jobs.

    Developed and implemented strategies for performance tuning.

    Used Sequential File, Oracle Enterprise, Fileset, Dataset, Change Capture and Filter

    stages for designing the jobs in the DataStage Designer.

    Used DataStage Director for running, monitoring and scheduling the jobs.

    Worked with DataStage Manager for export and import of jobs.

    Involved in Unit Testing and Integration Testing.

    #2 Name : EDW for General Motors Dec 2006 to Sep 2007

    Client : General Motors

    Team Size : 6

    Environment : DataStage7.5X2, Oracle 9i,Unix and Windows 2003.

    Description:

    GM is the major manufacturer of motor vehicles. The main objective of the

    project is to outsource the Strategy, Design, Development, Coding, Testing, Deployment and Maintenance of the Enterprise Data Warehouse. This Project would cover the

    subject areas such as Vehicle Product Offering (VPO), Volume Mix Rates (VMR), Posts

    Structure Cost (PSC), and Account for Production Material (APM) and Control Vehicle

    Financials (CVF). Data from various source systems is gathered into a staging

    area, where data is cleansed and then finally moved into an Enterprise Data

    warehouse Database. Data Stage 7.5x2 is used for extracting data from various Sources

    to Staging area; Staging area to DW. Business Objects is used as the reporting tool.

    Contribution: As an ETL Developer

    Prepared Mapping Documents, ETL Job Scheduling and maintenance documents.

    Performed import and export of DataStage Jobs using DS Manager.

    Created jobs involving Aggregator and Transformer stages.

    Creating reusable comp