Module 8: Importing and Exporting Data (Course 6231B)


Module Overview
- Transferring Data To/From SQL Server
- Importing & Exporting Table Data
- Inserting Data in Bulk

Lesson 1: Transferring Data To/From SQL Server
- Overview of Data Transfer
- Available Tools for Data Transfer
- Improving the Performance of Data Transfers
- Disabling & Rebuilding Indexes
- Disabling & Enabling Constraints
- Demonstration 1A: Disabling & Enabling Constraints

Scenarios
- Copying or moving data between servers
- Exporting query data to a file
- Importing table data from a file
- Copying to or from a view
- Transforming collations

Overview of Data Transfer

ETL: Extract, Transform, Load

Provide an overview of bulk data transfer. Describe the main steps involved in any data transfer: extracting data from a given source, transforming or aggregating (or even deaggregating -> allocating) the data as needed, and then loading the data to a given destination.

Point out that these kinds of tools are called ETL tools.

Discuss the potential scenarios for utilizing data transfer techniques.

Discuss the potential requirements of each scenario.

Question: In what situations have you had to import and export large amounts of data?

Question: What other types of aggregation might need to be performed on data during the transformation phase?
Answer: Counting the rows that have been processed.

Reference:
Scenarios for Bulk Importing and Exporting Data: http://go.microsoft.com/fwlink/?LinkID=213423

Available Tools for Data Transfer
- Bulk Copy Program (bcp)
- BULK INSERT

- XML Bulk Load
- Import/Export Wizard
- OPENROWSET(BULK)

Explain the different tools that are used in data transfers.

Highlight the considerations for selecting the data transfer tool for each project. To bulk import or export data as rapidly as possible, it is important to understand the factors that affect performance and the command qualifiers that are available to manage performance. Mention that this will be discussed in the next topics.

Bulk Copy Program (bcp)
The bcp utility can be used to import large numbers of new rows into SQL Server tables or to export data out of tables into data files. Although it can be used with a query option, it is normally used to transfer data without any transformation. Mention that a format file can be generated during export to ease the import of the data later on. Also point out that bcp can be used with a special native format to speed up the transfer between SQL Server instances. The use of native format between identical tables avoids unnecessary conversion of data types to and from character format, saving time and space.

BULK INSERT
A Transact-SQL statement that imports data directly from a data file into a database table or view. It can be used for import but not export.

OPENROWSET(BULK)
Includes all connection information that is required to access remote data from an OLE DB data source. This method is an alternative to accessing tables in a linked server and is a one-time, ad hoc method of connecting and accessing remote data by using OLE DB. For more frequent references to OLE DB data sources, use linked servers instead.

XML Bulk Load
XML data can be imported as a binary stream into an existing row.
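One common way to read a whole XML file into a row is OPENROWSET with the SINGLE_BLOB option. The sketch below is illustrative only: the table, column, and file names are hypothetical.

```sql
-- Hypothetical example: load an entire XML file into an xml column.
-- SINGLE_BLOB returns the file as a single varbinary(max) column
-- named BulkColumn, which is then cast to xml.
INSERT INTO dbo.ImportedDocs (DocData)
SELECT CAST(BulkColumn AS xml)
FROM OPENROWSET(BULK 'C:\data\invoice.xml', SINGLE_BLOB) AS x;
```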

Import/Export Wizard
SQL Server 2008 R2 includes the Import/Export Wizard, which is a simple way of creating a SQL Server Integration Services package. It can be used to import and export data from different sources and to apply basic transformations. The SSIS package that is created can be run immediately or scheduled.

Question: When would you choose bcp over SSIS?
Answer: When a large number of rows needs to be imported or transferred and no transformation needs to be performed.

Reference:
bcp Utility: http://go.microsoft.com/fwlink/?LinkID=213424
Importing Bulk Data by Using BULK INSERT or OPENROWSET(BULK...): http://go.microsoft.com/fwlink/?LinkID=213425
Examples of Bulk Importing and Exporting XML Documents: http://go.microsoft.com/fwlink/?LinkID=213426
Importing and Exporting Data by Using the SQL Server Import and Export Wizard: http://go.microsoft.com/fwlink/?LinkID=213427

Improving the Performance of Data Transfers
- Disable constraints, indexes, and triggers
  - No need to check constraints as each row is loaded
  - Indexes don't need to be maintained during import
  - Important to check business requirements before disabling triggers
- Minimize locking
  - Consider the use of TABLOCK to speed up the import
- Minimize logging
  - Database must be in the BULK_LOGGED or SIMPLE recovery model
  - Additional requirements on table structure and locking
- Minimize data conversions
  - Use native format when transferring data between SQL Server instances

Discuss each point and explain the implications.

Disable triggers, constraints, and indexes:
Point out that if triggers and constraints are enabled, the data has to be checked for every single row imported into SQL Server, which slows down the import process significantly. Discuss that it might be a good idea to disable these objects, import the data, and re-enable the constraints afterwards. The data can be checked at the point where the objects are re-enabled, which typically is much faster than doing it during the import. Emphasize that only check and foreign key constraints can be disabled and that disabling of constraints will be discussed in a following topic. Point out that disabling indexes prevents them from being maintained during data import, which can significantly improve performance. The indexes will be rebuilt when they are enabled after the import, which in total is still faster than maintaining them during the import. This approach might only be slower when the data already present in the table is much larger than the imported data. Don't discuss this too deeply at this point, as it will be covered in the next topic.

Emphasize that the effect of disabling constraints, indexes and triggers must be tested based on the import scenario.

Control the locking behavior:
Point out that by default, SQL Server manages the granularity of the locks it acquires during command execution. SQL Server always tries to balance the need to lock large numbers of objects against maintaining a highly concurrent environment. SQL Server starts with row-level locking and only tries to escalate when a significant number of rows are locked within a table. Managing large numbers of locks occupies resources that could otherwise be used to minimize the execution time of the query. Because the data in tables that are the target of bulk-import operations is normally only accessed by the process that is importing the data, row-level locking offers no advantage there. Therefore it makes sense to use TABLOCK during import to lock the whole table.
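A minimal sketch of the TABLOCK hint on a bulk load; the table and file names below are hypothetical:

```sql
-- Illustrative only: dbo.SalesStaging and the file path are hypothetical.
-- TABLOCK takes a single table-level lock instead of many row locks,
-- and is also one of the prerequisites for minimal logging.
BULK INSERT dbo.SalesStaging
FROM 'C:\data\sales.dat'
WITH (TABLOCK);
```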

Use minimal logging whenever possible:
Point out that minimal logging is a special mode of operation that can speed up certain operations, including bulk imports, significantly. Besides speeding up the operations, it also writes much less data to the transaction log files, preventing issues related to log file growth.

Several requirements must be met for an operation to be minimally logged rather than fully logged:
- The table is not being replicated.
- Table locking is specified (using TABLOCK).
- If the table has no clustered index but has one or more nonclustered indexes, data pages are always minimally logged. How index pages are logged, however, depends on whether the table is empty:
  - If the table is empty, index pages are minimally logged.
  - If the table is non-empty, index pages are fully logged.
- If the table has a clustered index and is empty, both data and index pages are minimally logged. In contrast, if a table has a clustered index and is non-empty, data pages and index pages are both fully logged regardless of the recovery model.

Question: What would the main problem with the transaction log be if full logging occurs during a bulk-import operation?
Answer: The log file can fill up or grow significantly, increasing the transaction log backup size, slowing down performance, or causing errors if the file cannot be expanded and fills up.

Reference:
Prerequisites for Minimal Logging in Bulk Import: http://go.microsoft.com/fwlink/?LinkID=213428
Optimizing Bulk Import Performance: http://go.microsoft.com/fwlink/?LinkID=213429

Disabling & Rebuilding Indexes
- Disabling an index
  - Prevents user access to the index
  - Prevents access to the data if it is a clustered index
  - Keeps the index definition in metadata
  - Speeds up data import into tables, as the index is not maintained during the import
- Enabling an index
  - Rebuilds the index entirely
  - Is easy to automate as the metadata is still present
- Enabling and disabling indexes can be used as an alternative to dropping and recreating indexes

Explain the difference between dropping and disabling an index. The main difference is that dropping completely deletes the index from the database and its metadata, while disabling only disables the index and keeps the metadata intact.

Point out that, as discussed before, indexes present on a table that is involved in a bulk-import operation slow down the operation. Disabling indexes might be a good choice to speed this up. Because the metadata is preserved when an index is disabled, it is much easier to script and automate the disabling and enabling of indexes than it is to script dropping and recreating the indexes.

Emphasize that it is possible to disable a clustered index. But because a clustered index holds the data of a table, disabling it prevents access to the table.

Discuss that disabling and enabling indexes can be used as an alternative to dropping and recreating the indexes.
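The disable/rebuild pattern can be sketched as follows; the index and table names are hypothetical:

```sql
-- Hypothetical index and table names.
-- Disabling preserves the index definition in metadata but stops
-- the index from being maintained (and used) during the import.
ALTER INDEX IX_SalesStaging_OrderDate ON dbo.SalesStaging DISABLE;

-- ... perform the bulk import here ...

-- Re-enabling rebuilds the index entirely from the preserved metadata.
ALTER INDEX IX_SalesStaging_OrderDate ON dbo.SalesStaging REBUILD;
```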

Question: What is the main advantage of disabling and enabling indexes compared to dropping and recreating an index during bulk imports?
Answer: The metadata is preserved, which makes scripting easier.

Reference:
Disabling Indexes: http://go.microsoft.com/fwlink/?LinkID=213430

Disabling & Enabling Constraints
- Disabling PRIMARY KEY and UNIQUE constraints
  - Is achieved by disabling the associated index
  - Causes associated indexes to be rebuilt when enabled
  - Can cause failures during re-enabling if duplicate values exist
  - Also causes associated foreign key constraints to be disabled
- Disabling FOREIGN KEY and CHECK constraints
  - Can be performed directly on the constraint
  - Causes existing data to not be verified when re-enabled
  - When you enable a FOREIGN KEY or CHECK constraint, existing data is not verified by default

Discuss the different constraints and how they can be disabled and enabled.

Point out that PRIMARY KEY and UNIQUE constraints are enforced by indexes. Enabling and disabling these constraints is done by enabling and disabling the associated indexes, as discussed in the previous topic.

Emphasize that FOREIGN KEY constraints always refer to PRIMARY KEY or UNIQUE constraints. Because they depend on these constraints, the foreign keys referencing a PRIMARY KEY or UNIQUE constraint will be automatically disabled if the referenced constraint is disabled.

Foreign Key and Check constraints:
Point out that FOREIGN KEY and CHECK constraints can be disabled and enabled using the CHECK and NOCHECK options of the ALTER TABLE statement. Emphasize that this should not be confused with the WITH CHECK and WITH NOCHECK options, which specify whether the constraints are checked against the existing data when a CHECK or FOREIGN KEY constraint is created or enabled. Mention that WITH CHECK is the default when a constraint is created, but WITH NOCHECK is the default when a constraint is re-enabled.
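The CHECK/NOCHECK and WITH CHECK/WITH NOCHECK options can be sketched as follows; the table and constraint names are hypothetical:

```sql
-- Hypothetical table and constraint names.
-- Disable the constraint before the import:
ALTER TABLE dbo.SalesStaging NOCHECK CONSTRAINT CK_SalesStaging_Amount;

-- ... bulk import ...

-- WITH CHECK validates the existing rows on re-enable; the default on
-- re-enable is WITH NOCHECK, which would skip that validation.
ALTER TABLE dbo.SalesStaging WITH CHECK CHECK CONSTRAINT CK_SalesStaging_Amount;
```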

Question: Why do referencing foreign key constraints get disabled when the referenced PRIMARY KEY or UNIQUE constraints get disabled?
Answer: Because they depend on these constraints and could no longer be checked efficiently.

Reference:Guidelines for Disabling Indexes and Constraints: http://go.microsoft.com/fwlink/?LinkID=213431

Demonstration 1A: Disabling & Enabling Constraints
In this demonstration, you will see how to disable and re-enable a CHECK constraint.

High-level Steps

1. Revert the virtual machines as per the instructions in D:\6231B_Labs\Revert.txt.
2. In the virtual machine, click Start, click All Programs, click Microsoft SQL Server 2008 R2, and then click SQL Server Management Studio. In the Connect to Server window, type Proseware and click Connect.
3. From the File menu, click Open, click Project/Solution, navigate to D:\6231B_Labs\6231B_08_PRJ\6231B_08_PRJ.ssmssln and click Open.
4. Open and execute the 00 Setup.sql script file from within Solution Explorer.
5. Open the 11 Demonstration 1A.sql script file.
6. Follow the instructions contained within the comments of the script file.

Notes for emphasis on specific steps:
Step 4: Point out that the insert will work now.
Step 5: Note that NOCHECK is the default.
Step 6: Note that it is not working, since the existing data is checked now.
Step 7: Note that it is now working.

Lesson 2: Importing & Exporting Table Data
- Overview of SQL Server Integration Services
- Demonstration 2A: Working with SSIS
- SQL Server Import/Export Wizard
- Demonstration 2B: Using the Import/Export Wizard

Overview of SQL Server Integration Services
- SSIS is a rich framework to develop ETL solutions
- SSIS packages contain
  - Data sources and destinations
  - Control and data flows
  - Transformations to be performed
- SSIS packages can be run using
  - The dtexec command line utility
  - SQL Server Agent jobs
- SSIS packages are developed using
  - SQL Server Business Intelligence Development Studio (BIDS)
  - The Import/Export Wizard

Define what SSIS is. Also describe some of the basic tools used when using SSIS and defining some basic terminology.

SSIS packages can perform many complex calculations and can also integrate with many other applications, but the main purpose of SSIS is to create re-usable and easily deployable packages that perform data transfers.

Explain the core components of Integration Services, control flow and data flow and how they are used to design an SSIS project.Control flow is populated with tasks that provide functionality in packages and precedence constraints that connect containers and tasks.

SSIS contains a data-flow engine to transfer and transform data to and from varied data sources. The goal is to perform all data transformation steps of the ETL process in a single operation without staging data. It is therefore important that your integration solution can connect seamlessly to a wide range of data sources to make the most of the performance and reliability benefits brought by a comprehensive data access platform.

Question: When will it be useful to use SSIS instead of other data transfer options?
Answer: When complex transformations or flows need to be developed.

Reference:
Typical Uses of Integration Services: http://go.microsoft.com/fwlink/?LinkID=213432
Tools and Utilities Overview (Integration Services): http://go.microsoft.com/fwlink/?LinkID=213433

Demonstration 2A: Working with SSIS
In this demonstration you will see how to use SQL Server Integration Services to export data from a table to a flat file.

High-level Steps

If Demonstration 1A was not performed:
1. Revert the virtual machines as per the instructions in D:\6231B_Labs\Revert.txt.
2. In the virtual machine, click Start, click All Programs, click Microsoft SQL Server 2008 R2, and then click SQL Server Management Studio. In the Connect to Server window, type Proseware and click Connect.
3. From the File menu, click Open, click Project/Solution, navigate to D:\6231B_Labs\6231B_08_PRJ\6231B_08_PRJ.ssmssln and click Open.
4. Open and execute the 00 Setup.sql script file from within Solution Explorer.
5. Open the 21 Demonstration 2A.sql script file.
6. Follow the instructions contained within the comments of the script file.

Notes for emphasis on specific steps:
Step 4: Mention the list of available templates.
Step 5: Quickly show the available tasks in the Toolbox. Keep in mind that this is not an SSIS course.
Step 7: Quickly show the three sections in the Toolbox and the available controls.
Step 24: Mention the displayed row count.

SQL Server Import/Export Wizard
An easy-to-use wizard for creating an SSIS package that performs simple data transfers.

Explain that the SQL Server Import and Export Wizard can copy data to and from any data source for which a managed .NET Framework data provider or a native OLE DB provider is available. Point out that the wizard has some limitations, but can be used with SQL Server, flat files, Microsoft Office Access, and Microsoft Office Excel.

Although it leverages SQL Server Integration Services, the SQL Server Import and Export Wizard provides minimal transformation capabilities. Except for setting the name, the data type, and the data type properties of columns in new destination tables and files, the SQL Server Import and Export Wizard supports no column-level transformations.

Question: If additional transformations are needed beyond what is provided by the Import/Export Wizard, how could these be created?
Answer: Create a package using the Import/Export Wizard and edit it using BIDS.

Reference:Importing and Exporting Data by Using the SQL Server Import and Export Wizard: http://go.microsoft.com/fwlink/?LinkId=213608

Demonstration 2B: Using the Import/Export Wizard
In this demonstration you will see how to use the Import/Export Wizard to export data to a CSV file.

High-level Steps

If Demonstration 1A was not performed:
1. Revert the virtual machines as per the instructions in D:\6231B_Labs\Revert.txt.
2. In the virtual machine, click Start, click All Programs, click Microsoft SQL Server 2008 R2, and then click SQL Server Management Studio. In the Connect to Server window, type Proseware and click Connect.
3. From the File menu, click Open, click Project/Solution, navigate to D:\6231B_Labs\6231B_08_PRJ\6231B_08_PRJ.ssmssln and click Open.
4. Open and execute the 00 Setup.sql script file from within Solution Explorer.
5. Open the 22 Demonstration 2B.sql script file.
6. Follow the instructions contained within the comments of the script file.

Notes for emphasis on specific steps:
Step 3: Show the available list of data sources.
Step 5: Mention that instead of selecting a table or view, you could write a query to retrieve data from one or more tables.

Lesson 3: Inserting Data in Bulk
- bcp Utility
- Demonstration 3A: Working with bcp
- BULK INSERT Statement
- Demonstration 3B: Working with BULK INSERT
- OPENROWSET Function
- Demonstration 3C: Working with OPENROWSET

bcp Utility
- Is a command line tool to import and export data
- Uses a format file when transferring between SQL Server instances

Creating a format file:

bcp Adv.Sales.Currency format nul -T -c -x -f Cur.xml

Exporting data into a file:

bcp Adv.Sales.Currency out Cur.dat -T -c

Importing data using a format file:

bcp tempdb.Sales.Currency2 in Cur.dat -T -f Cur.xml

bcp
The bcp utility bulk copies data between an instance of Microsoft SQL Server 2008 and a data file in a user-specified format. The bcp utility can be used to import large numbers of new rows into SQL Server tables or to export data out of tables into data files. Except when used with the queryout option, the utility requires no knowledge of Transact-SQL. To import data into a table, you must either use a format file created for that table or understand the structure of the table and the types of data that are valid for its columns.

Explain how a format file can ease the transfer between two SQL Server instances. Go through the examples above and explain the main parameters:

- -T: use integrated security to connect to the server
- -c: use character data type for the export (explain that -n would use the SQL Server native format)
- -f: defines the format file
- -x: specifies that the format file should be created as an XML file

Emphasize that in these examples bcp would connect to the default instance on the local server. Explain that -S server_name\instance_name must be used if the connection should go to a different server.
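As a hypothetical variation of the import example above, a remote named instance could be targeted by appending -S; the server and instance names here are illustrative only:

```
bcp tempdb.Sales.Currency2 in Cur.dat -T -f Cur.xml -S MyServer\SQL1
```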

Question: How could you improve the import speed of a bcp operation?
Answer: By using minimal logging, which will be used if all the requirements discussed before are met.

Reference:
bcp Utility: http://go.microsoft.com/fwlink/?LinkID=213424

Demonstration 3A: Working with bcp
In this demonstration, you will see how to import a file using the bcp utility.

High-level Steps

If Demonstration 1A was not performed:
1. Revert the virtual machines as per the instructions in D:\6231B_Labs\Revert.txt.
2. In the virtual machine, click Start, click All Programs, click Microsoft SQL Server 2008 R2, and then click SQL Server Management Studio. In the Connect to Server window, type Proseware and click Connect.
3. From the File menu, click Open, click Project/Solution, navigate to D:\6231B_Labs\6231B_08_PRJ\6231B_08_PRJ.ssmssln and click Open.
4. Open and execute the 00 Setup.sql script file from within Solution Explorer.
5. Open the 31 Demonstration 3A.sql script file.
6. Follow the instructions contained within the comments of the script file.

Notes for emphasis on specific steps:

Step 4: Note the differences in the two types of format file.
Step 6: Note that a large number of rows has been added.

BULK INSERT Statement
- Provides options similar to bcp
- Runs in the SQL Server process
- Has CHECK_CONSTRAINTS and FIRE_TRIGGERS options
- Can be executed within a user-defined transaction

BULK INSERT AdventureWorks.Sales.OrderDetail
FROM 'f:\orders\neworders.txt'
WITH (
    FIELDTERMINATOR = ' |',
    ROWTERMINATOR = ' |\n'
);
GO

BULK INSERT loads data from a data file into a table. This functionality is similar to that provided by the in option of the bcp command; however, the data file is read by the SQL Server process. The BULK INSERT statement and OPENROWSET(BULK) functions execute in-process with SQL Server, sharing the same memory address space. Because the data files are opened by a SQL Server process, data is not copied between the client process and SQL Server processes.

bcp runs in a separate process, which produces a higher load when run on the same system. On the other hand, bcp can be run on a separate system, which can be used to offload work from the SQL Server machine.

Explain that the CHECK_CONSTRAINTS and FIRE_TRIGGERS options can be used to tell SQL Server to check constraints and fire triggers. Emphasize that if these options are not specified, SQL Server doesn't check CHECK and FOREIGN KEY constraints and doesn't fire insert triggers during the import.

Point out that unlike bcp, BULK INSERT can be run in a user-defined transaction, which gives the ability to group BULK INSERT with other operations in a single transaction.
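The two points above can be sketched together; all object and file names in this example are hypothetical:

```sql
-- Illustrative sketch: table, file, and column names are hypothetical.
BEGIN TRANSACTION;

BULK INSERT dbo.OrderStaging
FROM 'C:\data\neworders.txt'
WITH (
    CHECK_CONSTRAINTS,   -- validate CHECK / FOREIGN KEY constraints per row
    FIRE_TRIGGERS        -- fire INSERT triggers during the load
);

-- Other statements can share the same user-defined transaction.
UPDATE dbo.LoadLog SET LastLoad = SYSDATETIME();

COMMIT TRANSACTION;
```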

Question: How does the BULK INSERT statement differ from bcp?
Answer: BULK INSERT runs in the SQL Server process, can omit constraint checking and trigger firing, and can be part of a user-defined transaction.

Reference:
BULK INSERT (Transact-SQL): http://go.microsoft.com/fwlink/?LinkID=213434

Demonstration 3B: Working with BULK INSERT
In this demonstration, you will see how to import a file using the BULK INSERT command.

High-level Steps

If Demonstration 1A was not performed:
1. Revert the virtual machines as per the instructions in D:\6231B_Labs\Revert.txt.
2. In the virtual machine, click Start, click All Programs, click Microsoft SQL Server 2008 R2, and then click SQL Server Management Studio. In the Connect to Server window, type Proseware and click Connect.
3. From the File menu, click Open, click Project/Solution, navigate to D:\6231B_Labs\6231B_08_PRJ\6231B_08_PRJ.ssmssln and click Open.
4. Open and execute the 00 Setup.sql script file from within Solution Explorer.
5. Open the 32 Demonstration 3B.sql script file.
6. Follow the instructions contained within the comments of the script file.

Notes for emphasis on specific steps:

Step 4: Note the number of messages on the Messages tab.

Question: Ask the students why the first message says 199 and messages after that say 200.
Answer: The first batch contained the header row, which was skipped.

OPENROWSET Function
- Allows access to remote data by connecting to a remote data source using an OLE DB provider
- Offers a built-in BULK provider for bulk imports from files
- Can be used in a FROM clause with full functionality
- Can be used with INSERT .. SELECT to insert data
- Offers unique table hints to control the operation

SELECT *
FROM OPENROWSET(
    BULK 'c:\mssql\export.csv',
    FORMATFILE = 'c:\mssql\format.fmt',
    FIRSTROW = 2) AS a;
GO

OPENROWSET can be used to access remote data from OLE DB data sources only when the DisallowAdhocAccess registry option is explicitly set to 0 for the specified provider, and the Ad Hoc Distributed Queries advanced configuration option is enabled. When these options are not set, the default behavior does not allow ad hoc access.

In addition, discuss format files. The format file can be used to provide all the format information that is required to bulk export data from and bulk import data to an instance of SQL Server, expanding the flexibility of bcp and BULK INSERT.

Explain that OPENROWSET is a system table-valued function that can be used to access remote data through an OLE DB provider.

In addition, OPENROWSET has a built-in BULK provider to perform bulk imports into SQL Server.

Explain that, in comparison to bcp and BULK INSERT, OPENROWSET can be used within a simple SELECT statement that provides full query functionality. With this feature it is possible to filter or aggregate the data before inserting it into a table.
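A hypothetical INSERT .. SELECT sketch of filtering rows during the load, reusing the file names from the slide example; the target table and column names are illustrative:

```sql
-- Hypothetical target table and columns; file names match the slide example.
-- The WHERE clause filters rows before they are inserted,
-- which bcp and BULK INSERT cannot do.
INSERT INTO dbo.EuroRates (CurrencyCode, Rate)
SELECT CurrencyCode, Rate
FROM OPENROWSET(
        BULK 'c:\mssql\export.csv',
        FORMATFILE = 'c:\mssql\format.fmt',
        FIRSTROW = 2) AS src
WHERE src.CurrencyCode = 'EUR';
```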

Point out that OPENROWSET can also be used to import files into LOB data, which is out of scope of this course.

In the example on the slide, the BULK provider for OPENROWSET is being used to import a comma-delimited file (csv file). The format of that file is defined in the format file format.fmt and the first row is being skipped. (It is likely that the first row contains header information).

Another example is included in the SM and described there. Make sure you have read it and understand how it works.

Point out that OPENROWSET will be shown in the next demonstration.

Question: When will it make sense to use OPENROWSET instead of bcp or BULK INSERT?
Answer: When the data must be filtered before being inserted into a table.

Reference:
OPENROWSET (Transact-SQL): http://go.microsoft.com/fwlink/?LinkID=213435

Demonstration 3C: Working with OPENROWSET
In this demonstration, you will see how to import a file using OPENROWSET.

High-level Steps

If Demonstration 1A was not performed:
1. Revert the virtual machines as per the instructions in D:\6231B_Labs\Revert.txt.
2. In the virtual machine, click Start, click All Programs, click Microsoft SQL Server 2008 R2, and then click SQL Server Management Studio. In the Connect to Server window, type Proseware and click Connect.
3. From the File menu, click Open, click Project/Solution, navigate to D:\6231B_Labs\6231B_08_PRJ\6231B_08_PRJ.ssmssln and click Open.
4. Open and execute the 00 Setup.sql script file from within Solution Explorer.
5. Open the 33 Demonstration 3C.sql script file.
6. Follow the instructions contained within the comments of the script file.

Notes for emphasis on specific steps:
Step 1: Review the existing format.fmt file in Solution Explorer. Note that it is under the Miscellaneous category.
Step 2: Point out that we are importing a copy of the data that we exported in Demonstration 2B.

Lab 8: Importing and Exporting Data
- Exercise 1: Import the Excel spreadsheet
- Exercise 2: Import the CSV file
- Exercise 3: Create and test an extraction package
- Challenge Exercise 4: Compare loading performance (only if time permits)

Logon information
Estimated time: 45 minutes
Virtual machine: 623XB-MIA-SQL
User name: AdventureWorks\Administrator
Password: Pa$$w0rd

Exercise 1
You need to load a file of currency codes and names from an Excel spreadsheet. In this exercise, you will use the Import Wizard to perform the data load.

Exercise 2
You have also been provided with a comma-delimited file of exchange rates. You need to import these exchange rates into the existing DirectMarketing.ExchangeRates table. The table should be truncated before the data is loaded.

Exercise 3
Periodically the Marketing team requires a list of prospects that have not been contacted within the last month. You need to create and test a package that will extract this information to a file for them.

Challenge Exercise 4 (only if time permits)
You are concerned about the import performance for the exchange rate file and you are considering disabling constraints and indexes on the exchange rate table during the import process. If you have time, you will test the difference in import performance.

Lab Scenario
Proseware regularly receives updates of currencies and exchange rates from an external provider. One of these files is provided as an Excel spreadsheet; the other file is provided as a comma-delimited text file. You need to import both of these files into tables that will be used by the Direct Marketing team within Proseware. Periodically the Marketing team requires a list of prospects that have not been contacted within the last month. You need to create and test a package that will extract this information to a file for them. You are concerned about the import performance for the exchange rate file and you are considering disabling constraints and indexes on the exchange rate table during the import process. If you have time, you will test the difference in import performance.

Lab Review
Question: What kind of information needs to be present in a format file?
Answer: The format file includes information to map columns and provide column information such as data type. The format file needs to contain correct column or record information or the command will fail.

Question: Which tool or command should you use to read an entire XML document into a column in a SQL Server table?
Answer: The BULK option in the OPENROWSET command.

Module Review and Takeaways
- Review Questions
- Best Practices

Review Questions

Question: When would you use SSIS instead of other data transfer utilities?
Answer: When complex transformations are needed.

Question: Why are minimally logged operations faster than fully logged operations?
Answer: Because less data has to be written to the transaction log files.

Best Practices
- Choose the right tool for bulk imports:
  - Use SSIS for complex transformations.
  - Use bcp or BULK INSERT for fast imports and exports.
  - Use OPENROWSET when data needs to be filtered before it gets inserted.
- Try to achieve minimal logging to speed up data import.
