
SQL Portfolio

Name
Email


Acme Traders Research and Development Department

Piggy Bank – Developing an Archive Plan
Securing SQL Server
Using Bulk Insert
Choosing the Right Replication Type
Library Maintenance Plan
Triggers – Library Database


Case Study – Web application using XML

The Acme Traders Research and Development database has, up until now, only been available in house. Because engineers need to review this data while abroad, it will now be made available via a web application. This application will run on a server with Windows Server 2003 and, of course, SQL Server 2005. Research data (many different types exist, but use this one example for the case study) entered through the web application is stored as the XML data type using the following defined type:

<Product>
  <ProductID>1</ProductID>
  <ProductName>Widget</ProductName>
  <ProductText>
    <Intro>Introduction</Intro>
    <SubIntro Title="Web Application">Clever web application</SubIntro>
    <SubIntro Title="Unit of data">Used as a theoretical unit of data</SubIntro>
  </ProductText>
</Product>

The XML data type column is named ProductText and is in the FutureProjects table, which has the columns ProductID, MasterID, TeamID, ProductText, ProductCategoryID, and Date. The web server should be available to engineers 24/7, and it is important that this data is kept secure and backed up regularly.

1. You are creating a procedure for the web application to return each of the product names from the reference data in the FutureProjects table. You want only this data returned, with no XML tags. Which of the following should you use?
a) SELECT ProductText.query('/Product/ProductName') FROM FutureProjects
b) SELECT ProductText.nodes('/Product/ProductName') FROM FutureProjects
c) SELECT ProductText.value('/Product/ProductName[1]', 'varchar') FROM FutureProjects
d) SELECT ProductText.exist('/Product/@ProductName', 'varchar') FROM FutureProjects
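For reference, a minimal sketch of the intended result using the value() method against the FutureProjects table above (the varchar length and the column alias are assumptions, not part of the case study):

-- Return only the product name text, with no XML tags.
SELECT ProductText.value('(/Product/ProductName)[1]', 'varchar(50)') AS ProductName
FROM FutureProjects;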

2. You have been asked to create a new feature for the web application that will list the SubIntro titles for a given product stored in the database. Which of the following should you use?
a) nodes() method
b) OPENXML
c) value() method
d) query() method
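A minimal sketch of shredding the titles with nodes() plus value(); the element and attribute names follow the sample XML above, and the filter value is illustrative:

-- List each SubIntro title for one product.
SELECT p.ProductID,
       s.node.value('@Title', 'nvarchar(100)') AS SubIntroTitle
FROM FutureProjects AS p
CROSS APPLY p.ProductText.nodes('/Product/ProductText/SubIntro') AS s(node)
WHERE p.ProductID = 1;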

3. Since research engineers are concurrently reading, updating, and inserting data (often on the same projects), how should you handle the concurrency requirements?
a) Set the transaction isolation level to SERIALIZABLE
b) Set the transaction isolation level to READ_UNCOMMITTED
c) Set the transaction isolation level to REPEATABLE_READ
d) Set the transaction isolation level to READ_COMMITTED_SNAPSHOT
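Read-committed snapshot is enabled at the database level rather than per session; a minimal sketch, with the database name assumed:

-- Readers see a versioned snapshot instead of blocking on writers.
ALTER DATABASE ResearchDB SET READ_COMMITTED_SNAPSHOT ON;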


4. The ProductGuides table holds the document listing of all available product research materials. Engineers often use it in the initial stages of research and design, so it is queried very heavily and returns some large result sets; the content, however, is relatively static. For this extremely large table you need to create an efficient index that minimizes locking for the most common queries, which are based on ProductCategoryID. Which of the following should you use?
a) ALLOW_ROW_LOCKS = OFF, ALLOW_PAGE_LOCKS = OFF
b) ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON
c) ALLOW_ROW_LOCKS = OFF, ALLOW_PAGE_LOCKS = ON
d) ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = OFF
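To show where these options go, a sketch of a nonclustered index on ProductCategoryID using option (a)'s settings; the index name is an assumption:

-- With row and page locks disabled, only table-level locks are taken on this index.
CREATE NONCLUSTERED INDEX IX_ProductGuides_ProductCategoryID
ON ProductGuides (ProductCategoryID)
WITH (ALLOW_ROW_LOCKS = OFF, ALLOW_PAGE_LOCKS = OFF);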

5. This database was originally designed for SQL Server 2000, and you need to make sure that all aspects are compliant with SQL Server 2005 and later versions. Several columns use the image data type. Which of the following would you use to replace this data type?
a) nvarchar(max)
b) varbinary(max)
c) varbinary
d) nvarchar
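A sketch of converting one such column in place; the table and column names here are hypothetical:

-- image can be converted directly to varbinary(max) with ALTER COLUMN.
ALTER TABLE dbo.ProductDocuments ALTER COLUMN DocumentBlob varbinary(max);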

6. Several Windows Vista notebook clients need full functionality with your SQL Server 2005 application. Which of the following client libraries should they be using?
a) OLE DB
b) SQLCMD
c) SQLNCLI
d) ODBC

7. Users are complaining that various queries are taking too long to process. You would like to capture these queries and analyze the data with a minimum of impact on the current server. Which of the following represents the best answer?
a) Create a SQL Server Profiler replay trace and save the data to a file on the SQL Server.
b) Create a SQL Server Profiler replay trace and save the data to a file on another server.
c) Create a SQL Server Profiler replay trace and save the data to a table in tempdb.
d) Monitor the queries in System Monitor using the SQL Server: Memory Manager counters.

8. When resolving a problem with a specific query, you would like to determine whether a different index would make the query perform more efficiently. Which of the following SET options could you use to determine this?
a) SET SHOWPLAN_TEXT ON
b) SET STATISTICS XML ON
c) SET SHOWPLAN_XML ON
d) SET FORCEPLAN ON
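For example, a sketch of capturing the estimated plan as XML without running the query (the query itself is illustrative):

SET SHOWPLAN_XML ON;
GO
-- The plan is returned as XML; the statement is not executed.
SELECT ProductID FROM FutureProjects WHERE ProductCategoryID = 3;
GO
SET SHOWPLAN_XML OFF;
GO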

9. What solution should be used to provide secure access between the web server on the perimeter network and the research and development SQL Server database on the internal network?
a) Install IIS on the SQL Server 2005 server and access the data using XML queries.
b) Create an HTTP endpoint and access the server data using stored procedures.
c) Create an HTTP endpoint and access the server data using ad hoc queries.
d) Stop and restart the perimeter web server and access the server data using ad hoc queries.
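As an illustration only, a sketch of a SOAP HTTP endpoint exposing a stored procedure; every name here (the endpoint, path, site, database, and the GetProductNames procedure) is an assumption:

CREATE ENDPOINT ResearchData
    STATE = STARTED
AS HTTP (
    PATH = '/research',
    AUTHENTICATION = (INTEGRATED),
    PORTS = (CLEAR),
    SITE = 'webserver01'
)
FOR SOAP (
    WEBMETHOD 'GetProductNames' (NAME = 'ResearchDB.dbo.GetProductNames'),
    BATCHES = DISABLED,
    WSDL = DEFAULT,
    DATABASE = 'ResearchDB',
    NAMESPACE = 'http://AcmeTraders/Research'
);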

10. You need to create indexes to improve the performance of queries against the FutureProjects table. The most common queries return the ProductID and ProductName. Which indexes should you create? Choose all that apply.
a) Create a clustered index on TeamID
b) Create a clustered index on ProductText
c) Create a clustered index on ProductID
d) Create a primary XML index on ProductText
e) Create a property index for ProductText
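For reference, a sketch of the XML index statements involved; the index names are assumptions, and the primary XML index requires that the table already has a clustered primary key:

-- Primary XML index over the ProductText column.
CREATE PRIMARY XML INDEX PXML_FutureProjects_ProductText
ON FutureProjects (ProductText);

-- Secondary (property) XML index built on top of the primary one.
CREATE XML INDEX SXML_FutureProjects_ProductText_Property
ON FutureProjects (ProductText)
USING XML INDEX PXML_FutureProjects_ProductText
FOR PROPERTY;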

Piggy Bank – Developing an Archive Plan

PiggyBank has survived the subprime mortgage fallout and is now in a relative position of strength among its competitors. It now serves 2.4 million customers over a very expansive area in the U.S. and Canada. The company is headquartered in Parsippany, NJ, where 543 employees now work. There are 3 regional offices, which for convenience we will call North, West, and South. PiggyBank now has a 6 TB OLTP database that tracks more than 4 billion transactions each year. This main database is stored in Parsippany; the regional offices only process deposit and withdrawal information and update the Parsippany office daily.

The departmental servers in Parsippany have been experiencing some problems. Server capacity is often overloaded, resulting in subpar performance and several frustrating delays. Company growth has been hurt by the general economy, but within 2 years the company should return to 3% annual growth. The database is growing by 8% each year and will eclipse hard-disk capacity within the next 2 years.

Most of this database is historical information. Government regulations require that 7 years of records are kept and that this data must be available within 24 hours. You must design a data-archiving plan for PiggyBank's ATM transactions. Only the current month will now be kept in the online database. Keep in mind that all of the transactions, once they are made, cannot be modified; any changes would be reflected by an additional transaction at a later date. This makes all of the historical information read-only.

1. Fill out this table (modify as you like) to show the online and archived data-accessibility requirements. Classify the data based on the time divisions, and indicate the storage format for each.

Data Source                    Accessibility Requirements      Storage Format
Online (current month)         High availability, immediate    OLTP database server
Archived (last 7 years)        Accessible within 24 hours      Disk
Offline (older than 7 years)   (not specified)                 Offsite tapes


2. What is your proposed data movement schedule?

Data Movement             Frequency
From Online to Archive    Monthly
From Archive to Tape      Yearly

3. Which of the following should be considered when designing the archival strategy? (Choose all that apply)
a) Cost
b) Government/industry regulations
c) Accessibility requirements
d) Granularity

4. Which data structure would you use if you wished to maintain the historical context of the archival data, but you cannot archive all the related data together?
a) Partitioned data
b) Normalized tables
c) Denormalized tables
d) Summary tables

5. If the requirements for the case study were to (1) maintain 24 months of data online for immediate access for queries and updates, and (2) maintain a total of 7 years for accounting and reporting requirements, which of the following would be the most appropriate storage format?
a) Place the current 24 months of data on an OLTP database server and 5 years of data on an archive server.
b) Place all the data on the OLTP server, and use partitioning to separate the current 24 months from the remaining 60 months.
c) Place the current 24 months of data on an OLTP server and the remainder on tape.
d) Use summary tables to reduce the load on the OLTP server, and store all detailed data on an archive server.

6. The data-movement strategy should contain which of the following steps? (Choose all that apply)
a) Verification that data has been copied to the destination storage format
b) A means to ensure the security of data during movement
c) Specification of the frequency of data movement
d) Scheduling of data movement to minimize impact on the production server

7. Which of the following roles can a single server have in a replication topology?
a) Distributor
b) Publisher
c) Subscriber
d) All of the above
e) A and B only


8. Which of the following statements regarding replication topologies in SQL Server 2005 are true? (Choose all that apply)
a) Wizards are available in SSMS to simplify the setup once you've designed it.
b) Specific tables of a database can be replicated; it is not necessary to replicate the entire database.
c) Schema changes can be automatically sent to subscribers without using any special stored procedures.
d) All of the above.

9. You are a database administrator for LoveMyLube, a small chain of auto service shops that provide oil changes and similar services. The requirements are that 48 months of data must be stored online in the Sales database and that older data must be sent to an archival database. Which of the following is the best way to structure the SalesTransactions table?
a) Partitioned view
b) Table partitioning
c) Denormalization
d) Summary tables

10. Refer to the previous question. Which archival frequency would you use?
a) Daily
b) Monthly
c) Quarterly
d) Annually
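For questions 9 and 10, a minimal sketch of monthly table partitioning for the SalesTransactions table; the column list, boundary dates, and filegroup layout are assumptions:

-- One partition per month; new boundaries would be added as months roll over.
CREATE PARTITION FUNCTION pfSalesByMonth (datetime)
AS RANGE RIGHT FOR VALUES ('2008-01-01', '2008-02-01', '2008-03-01');

CREATE PARTITION SCHEME psSalesByMonth
AS PARTITION pfSalesByMonth ALL TO ([PRIMARY]);

CREATE TABLE SalesTransactions (
    TransactionID   int IDENTITY(1,1) NOT NULL,
    TransactionDate datetime NOT NULL,
    Amount          money NOT NULL
) ON psSalesByMonth (TransactionDate);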

Securing SQL Server

“Ensuring your solution is safe from code-injection attacks and minimizing the attack surface area”

1. How would you mitigate code-injection attacks? (Please give at least 4 examples.)

Whenever you use string concatenation to build SQL code dynamically and accept user input as part of the concatenated string, treat your application as insecure. There are too many different techniques to exploit this vulnerability, and new techniques evolve all the time. You can mitigate the problem in several ways:

- Use the minimal-privilege approach.
- Disable all unnecessary services and features, such as extended procedures, to minimize the attack surface area.
- Do not return SQL Server error messages to the client application directly, because they can tell an attacker that your application is using string concatenation.
- Validate all user input, testing the size and type of the input, and validate XML input against XML schemas.
- Check for and reject special characters that can be used to modify the intended execution of your SQL string, such as semicolons (command delimiter), apostrophes (string delimiter), and double hyphens (inline comments).
- Do not accept strings that an attacker can use to construct file names, such as AUX, CON, and so on.
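The usual alternative to concatenating user input is parameterized dynamic SQL; a minimal sketch with sp_executesql, where the table, column, and parameter names are illustrative:

-- The user-supplied value is passed as a typed parameter, never spliced into the string.
DECLARE @sql nvarchar(200);
SET @sql = N'SELECT * FROM dbo.Customers WHERE LastName = @LastName';
EXEC sp_executesql @sql, N'@LastName nvarchar(50)', @LastName = N'Smith';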

2. How can you minimize the attack surface area for your SQL Server services and components quickly? (What tool would you use?)

Stop or disable the unneeded services and components using the SQL Server Surface Area Configuration tool.

3. How can you secure the sa login? (Please give at least 3 examples.)

Use Windows authentication, encrypt communications for the logon process, and implement password aging.

4. How would you implement the principle of least privilege for Notification Services service accounts? (What accounts should you not use?)

Configure the engine to use Windows authentication for database access, and run the engine under a low-privileged domain or local account. Do not use the Local System, Local Service, or Network Service account, or any account in the Administrators group. However, a delivery protocol may require additional privileges for the account that the service runs under.

When you deploy an instance of Notification Services, make sure that each engine has only the necessary permissions. For single-server deployments, the engine runs all of the instance's hosted event providers, generators, and distributors, so the account used by the engine should obtain the required database permissions through membership in the NSRunService database role. For scale-out deployments, restrict the permissions of individual engines.

5. Your application uses the xp_cmdshell extended stored procedure. After you upgrade your database to SQL Server 2005, your application no longer runs. What went wrong, and what can you do to mitigate the problem?

xp_cmdshell is disabled by default on new installs. It can be re-enabled using the Surface Area Configuration tool or by running the sp_configure system stored procedure; once xp_cmdshell is enabled again, the application should work.
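A minimal sketch of the sp_configure route (xp_cmdshell is an advanced option, so advanced options must be shown first):

EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;
-- Re-enable xp_cmdshell for the instance.
EXEC sp_configure 'xp_cmdshell', 1;
RECONFIGURE;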

Using Bulk Insert

First, export the data from AdventureWorks.Sales.CreditCard to a text file using the bcp.exe command-prompt utility, either from the command line or from within SSMS.
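A sketch of the export step, assuming a local default instance with Windows authentication; the output path is illustrative:

rem Export the table in native format (-n) using a trusted connection (-T).
bcp AdventureWorks.Sales.CreditCard out C:\temp\CreditCard.dat -S localhost -T -n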

Create 2 new, clean databases (the names Test and Test2 are fine). Set the recovery model for both to BULK_LOGGED. Add the CreditCard table to both of these databases via the CREATE TABLE script included below:

USE **[Whatever you’ve named your 2 databases – i.e. Test and Test2]**
GO
SET ANSI_NULLS ON
SET QUOTED_IDENTIFIER ON
GO
CREATE TABLE [CreditCard](
    [CreditCardID] [int] IDENTITY(1,1) NOT NULL,
    [CardType] [nvarchar](50) NOT NULL,
    [CardNumber] [nvarchar](25) NOT NULL,
    [ExpMonth] [tinyint] NOT NULL,
    [ExpYear] [smallint] NOT NULL,
    [ModifiedDate] [datetime] NOT NULL CONSTRAINT [DF_CreditCard_ModifiedDate] DEFAULT (getdate()),
    CONSTRAINT [PK_CreditCard_CreditCardID] PRIMARY KEY CLUSTERED ([CreditCardID] ASC)
        WITH (PAD_INDEX = OFF, STATISTICS_NORECOMPUTE = OFF, IGNORE_DUP_KEY = OFF,
              ALLOW_ROW_LOCKS = ON, ALLOW_PAGE_LOCKS = ON) ON [PRIMARY]
) ON [PRIMARY]
GO

Populate the first table (in the first database) by using INSERT ... SELECT:

-- IDENTITY_INSERT is needed because CreditCardID is an identity column.
SET IDENTITY_INSERT dbo.CreditCard ON;
INSERT INTO dbo.CreditCard (CreditCardID, CardType, CardNumber, ExpMonth, ExpYear, ModifiedDate)
SELECT CreditCardID, CardType, CardNumber, ExpMonth, ExpYear, ModifiedDate
FROM AdventureWorks.Sales.CreditCard;
SET IDENTITY_INSERT dbo.CreditCard OFF;
GO

Check the size of the transaction log:

DBCC SQLPERF('LOGSPACE');
GO

Using the second database and its CreditCard table, use bcp.exe to complete the same import task. Then use the DBCC SQLPERF command again to check the size of the transaction log, and summarize the results for me (include the sizes for both transaction logs).
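A sketch of the bcp import into the second database, matching the earlier export; the server name, file path, and database name are assumptions:

rem The TABLOCK hint allows the load to be minimally logged under the BULK_LOGGED model.
bcp Test2.dbo.CreditCard in C:\temp\CreditCard.dat -S localhost -T -n -h "TABLOCK"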


Result: The size of both transaction logs was the same, 9.929688 MB. There was a difference in the Log Space Used (%), but it wasn't much of one: Test had 50.96873% and Test2 had slightly less at 50.95889%. There really isn't a meaningful difference between the two inserts, so it comes down to whichever method you prefer, either T-SQL or bcp.

Choosing the Right Replication Type

Acme Traders has several locations throughout the world. Users in these various locations need access to data stored in SQL Server, and they need it as fast as possible. One of the servers, located in Los Angeles, CA, contains a Sales database that needs to be replicated to 3 primary satellite offices in San Francisco, Portland, and Seattle, which are connected via a T1 connection that runs at 85% capacity.

1. The sales associates make changes through the day, but the users in the satellite offices do not need to see the changes to the data immediately. What type of replication should be used?
a) Merge
b) Transactional with queued updates
c) Snapshot
d) Transactional with subscribers that immediately update
e) Snapshot with subscribers that periodically update

2. The accounting departments at each of the primary satellite locations need a copy of the data from the main accounting database in L.A. that they can change locally. They need this data to be as current as possible. Which type of replication suits them best?
a) Merge
b) Transactional with queued updates
c) Snapshot
d) Transactional with subscribers that immediately update
e) Snapshot with subscribers that periodically update

3. Several additional smaller sales offices are located in the West. The L.A. office needs an up-to-date copy of the sales offices' databases. When the L.A. office sends new inventory to these sales offices, it wants to update the database in L.A. and have the new data replicated to the proper office. Which replication type should be used?
a) Merge
b) Transactional with queued updates
c) Snapshot
d) Transactional with subscribers that immediately update
e) Snapshot with subscribers that periodically update

4. The retail division of Acme manages shops in various cities, and each of these shops maintains its own inventory database. The retail manager in Oakland, CA wants his shop to be able to share inventory with his other stores in the East Bay. Employees will be able to update their local copy of the inventory database, subtract appropriately from the other store's inventory, and then go pick up the part; the part will certainly be there, as it has been taken out of inventory. Which replication type should you use to accomplish this?
a) Merge
b) Transactional with queued updates
c) Snapshot
d) Transactional with subscribers that immediately update
e) Snapshot with subscribers that periodically update

Library Maintenance Plan

Please provide a detailed backup and maintenance plan for the library database. Please either include screenshots of the setup or the scripts needed to create the objects (jobs, schedules, etc.).

- The library experiences medium-heavy traffic on weekends and evenings and fairly light traffic during the days and mornings. The library closes at 9pm each weeknight and opens at 7am each morning, Sunday through Saturday. On the weekends it closes at 5pm.
- A fairly general recommendation is fine, but please address any potential concerns in the plan (be creative). A TWO paragraph minimum of rationale for your backup/maintenance plan is required if you are going the screenshot route; otherwise, you will probably need a page or two to outline each option.
- Use of wizards is perfectly acceptable for this exercise.

I used the Maintenance Plan Wizard to create a backup and maintenance plan for the Library database. A FULL backup should be done on a weekly basis, Sunday night at 11:30pm. A differential backup will be done at 1am every night of the week except Sunday. A transaction log backup will be done on an hourly basis from 8am until 8pm daily. By using this backup schedule, all data will be preserved in the event of a failure, and because of the light workload during the daytime hours, an hourly backup of the transaction log should be sufficient. I also set up a job to check data integrity nightly at 10pm, followed by a reorganization of the indexes, so that the data is all set for backup.

Due to the way scheduling works in the wizard, the transaction log backup job could not be altered for the weekend hours. This will create a problem in the event of a failure, because you would be restoring empty backups of the transaction log; the schedule needs to be adjusted so that the transaction log backup stops at 5pm on the weekend. I also included a cleanup maintenance job to remove backups that are older than 4 weeks. That data is not needed for anything, and this way the disks are wiped clean and ready to reuse for backups.
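For reference, the plain T-SQL behind the wizard-generated jobs would look roughly like the sketch below; the backup file paths are assumptions:

-- Weekly full backup (Sunday 11:30pm).
BACKUP DATABASE Library TO DISK = 'E:\Backups\Library_Full.bak' WITH INIT;
-- Nightly differential backup (1am, Monday-Saturday).
BACKUP DATABASE Library TO DISK = 'E:\Backups\Library_Diff.bak' WITH DIFFERENTIAL;
-- Hourly transaction log backup (8am-8pm).
BACKUP LOG Library TO DISK = 'E:\Backups\Library_Log.trn';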


The SetFocus Library Human Resource application uses stored procedures for all access to tables in the human resource database. However, in addition to permissions to execute the procedures, end users need permissions on the base tables, because the owners of the procedures are often different from the owners of the base tables. In addition, some personnel use Excel to create ad hoc reports from the base tables. You notice that end users frequently change the data in the base tables directly, in an uncontrolled and risky manner, instead of using the stored procedures and the application.

1. How can you force the end users to access the tables through the programmable objects?

Remove their rights to access the base tables, which forces them to use the stored procedures. The stored procedures have the rights to update the tables, and the users have the rights to execute the procedures.

2. If you revoke the end users' permissions on base tables, how can you enable the users to still generate ad hoc reports in Excel?

Create snapshots of the tables and give the users read permissions on the snapshots; they can run their ad hoc reports against the snapshots instead of the actual tables. Alternatively, you can create views and grant the users read permissions on them for their ad hoc reports.

3. How can you force the end users to access the tables through the programmable objects?

You can force end users to access tables through the stored procedures by eliminating the broken ownership chains in the human resources database. You can do this by changing the owner of the objects to a single owner, or by altering the procedures to use a different execution context and, for example, impersonate a single fictitious user who has permissions to access the base tables. Then you can revoke all permissions on base tables from end users.
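A minimal sketch of the second approach; every object, user, and role name here is hypothetical:

-- End users lose direct table access...
REVOKE SELECT, INSERT, UPDATE, DELETE ON dbo.Employee FROM HRUsers;
GO
-- ...and the procedure runs as a proxy user that does have table permissions.
CREATE PROCEDURE dbo.UpdateEmployeePhone
    @EmployeeID int,
    @Phone nvarchar(25)
WITH EXECUTE AS 'HRProxyUser'
AS
    UPDATE dbo.Employee SET Phone = @Phone WHERE EmployeeID = @EmployeeID;
GO
GRANT EXECUTE ON dbo.UpdateEmployeePhone TO HRUsers;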

4. If you revoke the end users' permissions on base tables, how can you enable the users to still generate ad hoc reports in Excel?

If you revoke all permissions on base tables from end users, they will no longer be able to create ad hoc reports directly. You can create views that have the same owner as the base tables and then grant SELECT permission on the views to your end users. However, when you create a view, you cannot specify a different execution context; therefore, you can use views only if there is a single owner of all base tables. Otherwise, you would encounter the same broken ownership chain problem as soon as your view joins data from two base tables with different owners. In such a case, you could use stored procedures and multi-statement table-valued functions instead of views as the intermediate data access layer.
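A sketch of the view-based option, again with hypothetical names; the view must be owned by the same principal that owns the base tables:

CREATE VIEW dbo.vEmployeeDirectory
AS
    SELECT e.EmployeeID, e.FirstName, e.LastName, d.DepartmentName
    FROM dbo.Employee AS e
    JOIN dbo.Department AS d ON d.DepartmentID = e.DepartmentID;
GO
-- Read-only access for ad hoc reporting from Excel.
GRANT SELECT ON dbo.vEmployeeDirectory TO HRUsers;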

Triggers – Library Database

After reviewing the Library Database and considering the importance of various data contained within it, please develop appropriate DDL and DML trigger(s) to prevent unwanted changes to important data in the library.

Trigger 1

CREATE TRIGGER table_safety
ON DATABASE
FOR DROP_TABLE, ALTER_TABLE
AS
    PRINT 'You must disable trigger "table_safety" to drop or alter tables!'
    ROLLBACK;

Trigger 2

CREATE TRIGGER proc_safety
ON DATABASE
FOR DROP_PROCEDURE, ALTER_PROCEDURE
AS
    PRINT 'You must disable trigger "proc_safety" to drop or alter procedures!'
    ROLLBACK;

USE [library]
GO
SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO


Trigger 3

CREATE TRIGGER [dbo].[dmldeletemember]
ON [dbo].[member]
INSTEAD OF DELETE
AS
    RAISERROR ('You can not delete a member from the database', 16, 10)

Trigger 4

CREATE TRIGGER [dbo].[reminder1]
ON [dbo].[member]
AFTER INSERT
AS
    RAISERROR ('Be Sure to give Member a card', 16, 10)


There is not a need for many DML triggers in this database, because rows cannot be deleted from tables that are referenced by foreign key constraints, and most of these tables have foreign key constraints on them.

Are there performance considerations that you should take into account when implementing these triggers? Please give a thorough description of any potential drawbacks of implementing these triggers.

With triggers, as long as you stay away from using cursors and nested triggers, the overhead is relatively low. If you nest triggers within triggers, this will increase the overhead and could impact system performance, depending on how deep the nesting goes. Cursors always carry significant overhead, so they should be used only when absolutely necessary.
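Nesting can also be capped at the server level; a minimal sketch with sp_configure, assuming that disabling nested triggers is acceptable for this server:

-- Check the current setting.
EXEC sp_configure 'nested triggers';
-- Turn off nested trigger execution instance-wide.
EXEC sp_configure 'nested triggers', 0;
RECONFIGURE;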

Your supervisor is also interested in hearing what additional security features you might suggest with regard to auditing in general. From your experience with the library (or libraries in general), please give your supervisor a brief description of the auditing capabilities of SQL Server 2005 and what types of auditing you might suggest (2 sentence minimum).

An audit trail could be set up for changes made to the books table, recording when books are checked in or out and when new ones arrive. This should be set up on a different server if at all possible, so that the overhead it generates won't impact the current server; it could be implemented as a mirror on another server where any changes to the table data are tracked.


Please develop a log table to store DDL event data in the library, and a trigger that inserts such data after it occurs.
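The trigger below inserts into a dbo.DatabaseLog table whose definition is not included in the script; a minimal sketch of that table, with column types assumed from the trigger's INSERT list (modeled on the AdventureWorks DatabaseLog table):

CREATE TABLE [dbo].[DatabaseLog] (
    [DatabaseLogID] int IDENTITY(1,1) NOT NULL PRIMARY KEY,
    [PostTime]      datetime      NOT NULL,
    [DatabaseUser]  sysname       NOT NULL,
    [Event]         sysname       NULL,
    [Schema]        sysname       NULL,
    [Object]        sysname       NULL,
    [TSQL]          nvarchar(max) NOT NULL,
    [XmlEvent]      xml           NOT NULL
);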

CREATE TRIGGER [ddlDatabaseTriggerLog]
ON DATABASE
FOR DDL_DATABASE_LEVEL_EVENTS
AS
BEGIN
    SET NOCOUNT ON;

    DECLARE @data XML;
    DECLARE @schema sysname;
    DECLARE @object sysname;
    DECLARE @eventType sysname;

    SET @data = EVENTDATA();
    SET @eventType = @data.value('(/EVENT_INSTANCE/EventType)[1]', 'sysname');
    SET @schema = @data.value('(/EVENT_INSTANCE/SchemaName)[1]', 'sysname');
    SET @object = @data.value('(/EVENT_INSTANCE/ObjectName)[1]', 'sysname');

    IF @object IS NOT NULL
        PRINT '  ' + @eventType + ' - ' + @schema + '.' + @object;
    ELSE
        PRINT '  ' + @eventType + ' - ' + @schema;

    IF @eventType IS NULL
        PRINT CONVERT(nvarchar(max), @data);

    INSERT [dbo].[DatabaseLog]
        ([PostTime], [DatabaseUser], [Event], [Schema], [Object], [TSQL], [XmlEvent])
    VALUES
        (GETDATE(),
         CONVERT(sysname, CURRENT_USER),
         @eventType,
         CONVERT(sysname, @schema),
         CONVERT(sysname, @object),
         @data.value('(/EVENT_INSTANCE/TSQLCommand)[1]', 'nvarchar(max)'),
         @data);
END;


GO
SET ANSI_NULLS OFF
GO
SET QUOTED_IDENTIFIER OFF
GO
DISABLE TRIGGER [ddlDatabaseTriggerLog] ON DATABASE
GO
EXEC sys.sp_addextendedproperty
    @name = N'MS_Description',
    @value = N'Database trigger to audit all of the DDL changes made to the library database.',
    @level0type = N'TRIGGER',
    @level0name = N'ddlDatabaseTriggerLog'