dotNET, SQL SERVER and more.doc


  • 7/29/2019 dotNET, SQL SERVER and more.doc


    .NET and SQL SERVER

DataSet vs. DataReader

I like to do as little work as I can when I code, so I used to like the DataSet. It can be filled and ready to go in just three lines of code, and then iterated using a nice, simple foreach loop (it's even easier if you use typed DataSets!). It's a nice collection to work with. But often, performance is required at the expense of elegance -- especially on a performance-critical Web application.

The DataSet actually uses a DataReader to populate itself. A DataReader is a lean, mean access method that returns results as soon as they're available, rather than waiting for the whole of the query to be populated into a DataSet. This can boost your application performance quite dramatically and, once you get used to the methodology, can be quite elegant in itself.

    The Advantages of DataReader in Action

To highlight the advantages of using a DataReader over the DataSet, here's an example of using a DataSet. The following fills a DataSet with the results from a table, and outputs the first field in each row:

SqlConnection conn = new SqlConnection(connectionString);
SqlDataAdapter a = new SqlDataAdapter("select * from mytable;", conn);
DataSet s = new DataSet();
a.Fill(s);
foreach (DataRow dr in s.Tables[0].Rows)
{
    Console.WriteLine(dr[0].ToString());
}

As you can see, we don't start the actual inspection of data (the foreach loop) until the whole DataSet has been filled. There may be occasions where we don't use all our results, or we might execute other code while inspecting (updating a progress bar is a trivial example). Using a DataSet, this can only take place after the complete results are fetched and passed into the various collections within the DataSet.

In contrast, here's code that achieves the same results using a DataReader in place of a DataSet:

SqlConnection conn = new SqlConnection(connectionString);
SqlCommand comm = new SqlCommand("select * from mytable", conn);
comm.Connection.Open();
SqlDataReader r = comm.ExecuteReader(CommandBehavior.CloseConnection);
while (r.Read())
{
    Console.WriteLine(r.GetString(0));
}
r.Close();
conn.Close();

Here, the inspection begins as soon as data is available, by employing the while loop, where r.Read() returns false when no more results are found. Not only can we therefore inspect as we go, but the DataReader only stores one result at a time on the client. This results in a significant reduction in memory usage and system resources compared to the DataSet, where the whole query result is stored.

When Only a DataSet Will Suffice

Now, there are times when only a DataSet will suffice. Often, you'll need to serialize your results, or pass the query results on to the next tier of your application. In these cases, a collection is required, and the DataSet provides a well-supported mechanism for doing this. For example, you can quickly serialize a DataSet to XML by calling the WriteXml method, or pass a DataSet in a SOAP method. While you can create your own collections to store your results, with all this built-in, optimized functionality at hand, the DataSet is still a powerful type to keep in mind.

However, for the majority of queries employed by Web applications, where data is found, displayed, and then forgotten, the DataReader will increase your performance drastically, with only a little extra work.

When to use SqlDataAdapter, and when to use SqlDataReader?

    If you use a SqlDataAdapter to generate a DataSet or DataTable, note the following:

You do not need to explicitly open or close the database connection. The SqlDataAdapter Fill method opens the database connection and then closes the connection before it returns. If the connection is already open, Fill leaves the connection open.

If you require the connection for other purposes, consider opening it prior to calling the Fill method. You can thus avoid unnecessary open/close operations and gain a performance benefit.

Although you can repeatedly use the same SqlCommand object to execute the same command multiple times, do not reuse the same SqlCommand object to execute different commands.
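The open-before-Fill point can be sketched as follows. This is a minimal sketch, not from the original text: the table names, query strings, and ds variable are illustrative, and a valid connectionString is assumed.

```
// Open the connection once so each Fill call reuses it
// instead of performing its own open/close round trip.
SqlConnection conn = new SqlConnection(connectionString);
SqlDataAdapter products = new SqlDataAdapter("select * from Products", conn);
SqlDataAdapter orders = new SqlDataAdapter("select * from Orders", conn);
DataSet ds = new DataSet();

conn.Open();                   // open explicitly, up front
products.Fill(ds, "Products"); // Fill finds an open connection and leaves it open
orders.Fill(ds, "Orders");     // no extra open/close here
conn.Close();                  // close explicitly when done
```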

    Use a SqlDataReader obtained by calling the ExecuteReader method of the SqlCommand

    object when:


You are dealing with large volumes of data: too much to maintain in a single cache.

    You want to reduce the memory footprint of your application.

    You want to avoid the object creation overhead associated with the DataSet.

You want to perform data binding with a control that supports a data source that implements IEnumerable.

    You wish to streamline and optimize your data access.

You are reading rows containing binary large object (BLOB) columns. You can use the SqlDataReader to pull BLOB data in manageable chunks from the database, instead of pulling all of it at once.
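The chunked-BLOB case can be sketched with SqlDataReader.GetBytes and CommandBehavior.SequentialAccess. This is a sketch, not from the original: the Documents table, Content column, documentId, and outputStream are illustrative names, and an open SqlConnection conn is assumed.

```
// SequentialAccess streams column data instead of buffering entire rows.
SqlCommand cmd = new SqlCommand(
    "select Content from Documents where ID = @ID", conn);
cmd.Parameters.AddWithValue("@ID", documentId);

using (SqlDataReader r = cmd.ExecuteReader(CommandBehavior.SequentialAccess))
{
    byte[] buffer = new byte[8192];
    while (r.Read())
    {
        long offset = 0;
        long read;
        // GetBytes copies up to buffer.Length bytes per call;
        // a return value of 0 means the end of the BLOB.
        while ((read = r.GetBytes(0, offset, buffer, 0, buffer.Length)) > 0)
        {
            outputStream.Write(buffer, 0, (int)read);
            offset += read;
        }
    }
}
```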

Database Updating: Optimistic and Pessimistic Concurrency

    The Problem

Most ASP.NET applications incorporate a database as a repository for data. At some point in the life of your application, a user of that application will need to modify a record in that database. As soon as the person selects a record from the database to be modified, the data selected is old and suspect. It is old and suspect because the acquired data is now disconnected from the database and no longer accurately represents the original data in the database. The original database record may have been modified before the user is able to modify and save his/her changes.

So now we have the age-old problem: how confident do we feel as developers that the data selected by one user will not be changed by another user before he/she has a chance to modify and save it back to the database? If we decide poorly, one user ends up overwriting another user's changes, or we cause excessive delays in the performance of a multi-user ASP.NET application due to record locking.

Pessimistic and Optimistic Concurrency

In a multiuser environment, there are two models for updating data in a database: optimistic concurrency and pessimistic concurrency.

Pessimistic concurrency involves locking the data at the database when you read it. You essentially lock the database record and don't allow anyone to touch it until you are done modifying and saving it back to the database. Here you have 100% assurance that nobody will modify the record while you have it checked out. Another person will have to wait until you have made your changes.

Optimistic concurrency means you read the database record, but don't lock it. Anyone can read and modify the record at any time, and you take your chances that the record is not modified by someone else before you have a chance to modify and save it. As a developer, the burden is on you to check for changes in the original data (collisions) and act accordingly based on any errors that may occur during the update.

Depending on the application and the number of users, pessimistic concurrency can cause delays in your application. Other users may have to wait while another user has locked a database record. Worst case, you get an actual deadlock, because there is a cyclical dependency on database resources and nobody can complete his/her intended task. A timeout occurs in the application and nobody is happy, including the guy responsible for cutting you a check.

With optimistic concurrency, the application has to check for changes to the original record to avoid overwriting changes. There is no guarantee that the original record has not been changed, because no lock has been placed on that data at its source, the database. Hence, there is the real possibility of losing changes made by another person.

    Simplest Thing Possible, But No Simpler

    The key is to know your application and to pick a solution that best meets its needs.

If your application is a portal for your INETA .NET Developer Group Website, I dare say that there is absolutely no chance that two people will be updating content at any time. ;) You are lucky if you get one person to update the website. Hence, I would say this is very much a single-user environment, and the heck with concurrency issues: they won't happen. The chance of overwriting someone else's data is slim to none, and there are very few repercussions if it happens.

If your application is a banking application and the repercussions of overwriting data are severe, concurrency issues will be a big concern and need to be judged accordingly. Although the chance of two users or processes modifying the same data may be slim to none in the application, the risk of it happening even once may be important enough to use pessimistic concurrency, and the heck with performance, on the off chance that two processes may want to access the same data.

    Optimistic Concurrency Strategies

If you are in a performance state of mind, chances are you will go with optimistic concurrency. Optimistic concurrency frees up database resources as quickly as possible so that other users and processes can act upon that data as soon as possible.


To the best of my knowledge, there are four popular strategies for dealing with optimistic concurrency:

1. Do nothing.
2. Check for changes to all fields during update.
3. Check for changes to modified fields during update.
4. Check for changes to a timestamp (rowversion) during update.

All of these strategies deal with the shaping of the UPDATE T-SQL command sent to the database during the updating of the data. The examples below are not very detailed on purpose and assume a basic understanding of ADO.NET. They show the strategies from a viewpoint 30,000 ft high.

Optimistic Concurrency on Update Strategy #1 - Do Nothing

The simplest strategy for dealing with concurrency issues during the updating of data is to do nothing.

The update command will not check for any changes in the data; it will only specify the primary key of the record to be changed. If someone else changed the data, those changes will more than likely be overwritten:

Update Product
Set Name = @Name
Where ID = @ID

One would hope that this means either 1) the application is a single-user application, or 2) the chance of multi-user update collisions is very unlikely and the repercussions of overwriting data are negligible.

Optimistic Concurrency on Update Strategy #2 - Check All Fields

With this strategy, the update command will check that all fields in the row (usually minus BLOB fields) are equal to their original values when performing the update, to assure no changes have been made to the original record. A check of the return value of the ExecuteNonQuery command will tell you if the update actually took place. The return value of the ExecuteNonQuery command is typically the number of rows affected by the query.


Update Product
Set Name = @Name
Where ID = @ID
AND Name = @OriginalName
AND Price = @OriginalPrice

This is essentially what the CommandBuilder creates when using DataSets, and it is a strategy that doesn't want to see any changes to the data.
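The ExecuteNonQuery return-value check described above can be sketched like this. It is a sketch, not from the original: the variable names and parameter values are illustrative, and an open SqlConnection conn is assumed.

```
// Update only if every checked field still holds its original value.
SqlCommand cmd = new SqlCommand(
    "Update Product Set Name = @Name " +
    "Where ID = @ID AND Name = @OriginalName AND Price = @OriginalPrice", conn);
cmd.Parameters.AddWithValue("@Name", newName);
cmd.Parameters.AddWithValue("@ID", productId);
cmd.Parameters.AddWithValue("@OriginalName", originalName);
cmd.Parameters.AddWithValue("@OriginalPrice", originalPrice);

int rowsAffected = cmd.ExecuteNonQuery();
if (rowsAffected == 0)
{
    // Zero rows affected means the original values no longer match:
    // someone else changed (or deleted) the record first.
    throw new DBConcurrencyException("The record was modified by another user.");
}
```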

Optimistic Concurrency on Update Strategy #3 - Check Only Changed Fields

Rather than checking all fields in the row to make sure they match their original values, this strategy checks only those fields that are being updated in the command.

Update Product
Set Name = @Name
Where ID = @ID
AND Name = @OriginalName

This strategy only cares that it is not overwriting any data, and doesn't care that other fields in the record may have been changed. This could create an interesting combination of data in the row.

Optimistic Concurrency on Update Strategy #4 - Implement Timestamp

SQL Server has a timestamp (alias rowversion) field that is modified every time a change is made to a record that contains such a field. Therefore, if you add such a field to a table, you only have to verify that the timestamp field contains the same original value to be assured that none of the fields in the record have been changed.

Update Product
Set Name = @Name
Where ID = @ID
AND TimestampID = @TimestampID

    This is the same as Strategy #2 above without the need for checking all fields.
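The rowversion round trip can be sketched as follows. This is a sketch, not from the original: the column name TimestampID follows the example above, the other names are illustrative, and an open SqlConnection conn is assumed.

```
// 1) Read the row and keep its current rowversion value.
SqlCommand select = new SqlCommand(
    "Select Name, TimestampID From Product Where ID = @ID", conn);
select.Parameters.AddWithValue("@ID", productId);
byte[] originalTimestamp;
using (SqlDataReader r = select.ExecuteReader())
{
    r.Read();
    originalTimestamp = (byte[])r["TimestampID"]; // rowversion is 8 bytes
}

// 2) Update only if the rowversion is unchanged since the read.
SqlCommand update = new SqlCommand(
    "Update Product Set Name = @Name " +
    "Where ID = @ID AND TimestampID = @TimestampID", conn);
update.Parameters.AddWithValue("@Name", newName);
update.Parameters.AddWithValue("@ID", productId);
update.Parameters.AddWithValue("@TimestampID", originalTimestamp);

if (update.ExecuteNonQuery() == 0)
{
    // The rowversion changed underneath us: a concurrency collision.
}
```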

    Conclusion

Optimistic concurrency has a performance component to it that suggests a higher-performing ASP.NET website, so I included it in my series of posts called High Performance ASP.NET Websites Made Easy. Yeah, I could be pushing that statement a bit ;)

There are other methods of achieving optimistic concurrency, but I think the ones above are the most popular. A developer needs to look at the application itself to determine which strategy makes sense. The DataSet, CommandBuilder, and DataAdapter typically handle this stuff for you using Strategy #2. However, if you work with objects instead of DataSets, you need to handle concurrency issues yourself. If you use an O/R Mapper, make sure you know how your O/R Mapper handles the possibility of collisions during updates.

http://davidhayden.com/blog/dave/category/24.aspx?Show=All

Master/Detail Using a Selectable Master GridView with a Details DetailsView

Introduction

In the previous tutorial we saw how to create a master/detail report using two web pages: a "master" web page, from which we displayed the list of suppliers, and a "details" web page that listed those products provided by the selected supplier. This two-page report format can be condensed into one page. This tutorial will have a GridView whose rows include the name and price of each product along with a Select button. Clicking the Select button for a particular product will cause its full details to be displayed in a DetailsView control on the same page.

    Figure 1. Clicking the Select Button Displays the Product's Details


    Step 1: Creating a Selectable GridView

Recall that in the two-page master/detail report, each master record included a hyperlink that, when clicked, sent the user to the details page, passing the clicked row's SupplierID value in the querystring. Such a hyperlink was added to each GridView row using a HyperLinkField. For the single-page master/details report, we will need a Button for each GridView row that, when clicked, shows the details. The GridView control can be configured to include a Select button for each row that causes a postback and marks that row as the GridView's SelectedRow.

    Start by adding a GridView control to the DetailsBySelecting.aspx page in the

    Filtering folder, setting its ID property to ProductsGrid. Next, add a new

    ObjectDataSource named AllProductsDataSource that invokes the ProductsBLL class's

    GetProducts() method.

Figure 2. Create an ObjectDataSource Named AllProductsDataSource

http://msdn2.microsoft.com/en-us/library/system.web.ui.webcontrols.gridview.selectedrow.aspx

    Figure 3. Use the ProductsBLL Class


    Figure 4. Configure the ObjectDataSource to Invoke the GetProducts() Method

Edit the GridView's fields, removing all but the ProductName and UnitPrice BoundFields. Also, feel free to customize these BoundFields as needed, such as formatting the UnitPrice BoundField as a currency and changing the HeaderText properties of the BoundFields. These steps can be accomplished graphically, by clicking the Edit Columns link from the GridView's smart tag, or by manually configuring the declarative syntax.


    Figure 5. Remove All But the ProductName and UnitPrice BoundFields

    The final markup for the GridView is:
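The markup itself did not survive in this copy of the document. Based on the steps described above, a representative declaration might look like the following; the DataFormatString and HeaderText values are illustrative assumptions:

```
<asp:GridView ID="ProductsGrid" runat="server" AutoGenerateColumns="False"
    DataKeyNames="ProductID" DataSourceID="AllProductsDataSource">
    <Columns>
        <asp:BoundField DataField="ProductName" HeaderText="Product" />
        <asp:BoundField DataField="UnitPrice" HeaderText="Price"
            DataFormatString="{0:c}" HtmlEncode="False" />
    </Columns>
</asp:GridView>
```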

Next, we need to mark the GridView as selectable, which will add a Select button to each row. To accomplish this, simply check the Enable Selection checkbox in the GridView's smart tag.


    Figure 6. Make the GridView's Rows Selectable

Checking the Enable Selection option adds a CommandField to the ProductsGrid GridView with its ShowSelectButton property set to True. This results in a Select button for each row of the GridView, as Figure 6 illustrates. By default, the Select buttons are rendered as LinkButtons, but you can use Buttons or ImageButtons instead through the CommandField's ButtonType property.


When a GridView row's Select button is clicked, a postback ensues and the GridView's SelectedRow property is updated. In addition to the SelectedRow property, the GridView provides the SelectedIndex, SelectedValue, and SelectedDataKey properties. The SelectedIndex property returns the index of the selected row, whereas the SelectedValue and SelectedDataKey properties return values based upon the GridView's DataKeyNames property.

The DataKeyNames property is used to associate one or more data field values with each row and is commonly used to attribute uniquely identifying information from the underlying data with each GridView row. The SelectedValue property returns the value of the first DataKeyNames data field for the selected row, whereas the SelectedDataKey property returns the selected row's DataKey object, which contains all of the values for the specified data key fields for that row.

The DataKeyNames property is automatically set to the uniquely-identifying data field(s) when you bind a data source to a GridView, DetailsView, or FormView through the Designer. While this property has been set for us automatically in the preceding tutorials, the examples would have worked without the DataKeyNames property specified. However, for the selectable GridView in this tutorial, as well as for future tutorials in which we'll be examining inserting, updating, and deleting, the DataKeyNames property must be set properly. Take a moment to ensure that your GridView's DataKeyNames property is set to ProductID.

Let's view our progress thus far through a browser. Note that the GridView lists the name and price for all of the products along with a Select LinkButton. Clicking the Select button causes a postback. In Step 2 we'll see how to have a DetailsView respond to this postback by displaying the details for the selected product.

http://msdn2.microsoft.com/en-us/library/system.web.ui.webcontrols.gridview.selectedindex(VS.80).aspx
http://msdn2.microsoft.com/en-us/library/system.web.ui.webcontrols.gridview.selectedvalue(VS.80).aspx
http://msdn2.microsoft.com/en-us/library/system.web.ui.webcontrols.gridview.selecteddatakey(VS.80).aspx
http://msdn2.microsoft.com/en-us/library/system.web.ui.webcontrols.gridview.datakeynames(VS.80).aspx

    Figure 7. Each Product Row Contains a Select LinkButton

    Highlighting the Selected Row

The ProductsGrid GridView has a SelectedRowStyle property that can be used to dictate the visual style for the selected row. Used properly, this can improve the user's experience by more clearly showing which row of the GridView is currently selected. For this tutorial, let's have the selected row be highlighted with a yellow background.

    As with our earlier tutorials, let's strive to keep the aesthetic-related settings defined as CSS

    classes. Therefore, create a new CSS class in Styles.css named SelectedRowStyle.

.SelectedRowStyle
{
    background-color: Yellow;
}

To apply this CSS class to the SelectedRowStyle property of all GridViews in our tutorial series, edit the GridView.skin Skin in the DataWebControls Theme to include the SelectedRowStyle settings as shown below:
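The Skin contents were lost in this copy. Assuming the CSS class defined above, the addition might look like the following sketch; the other style elements shown are illustrative placeholders for whatever the existing Skin already contains:

```
<asp:GridView runat="server" CssClass="DataWebControlStyle">
    <HeaderStyle CssClass="HeaderStyle" />
    <AlternatingRowStyle CssClass="AlternatingRowStyle" />
    <SelectedRowStyle CssClass="SelectedRowStyle" />
</asp:GridView>
```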


With this addition, the selected GridView row is now highlighted with a yellow background color.

    Figure 8. Customize the Selected Row's Appearance Using the GridView's

    SelectedRowStyle Property

    Step 2: Displaying the Selected Product's Details in a

    DetailsView

With the ProductsGrid GridView complete, all that remains is to add a DetailsView that displays information about the particular product selected. Add a DetailsView control above the GridView and create a new ObjectDataSource named ProductDetailsDataSource. Since we want this DetailsView to display particular information about the selected product, configure the ProductDetailsDataSource to use the ProductsBLL class's GetProductByProductID(productID) method.


    Figure 9. Invoke the ProductsBLL Class's GetProductByProductID(productID)

    Method

Have the productID parameter's value obtained from the GridView control's SelectedValue property. As we discussed earlier, the GridView's SelectedValue property returns the first data key value for the selected row. Therefore, it's imperative that the GridView's DataKeyNames property is set to ProductID, so that the selected row's ProductID value is returned by SelectedValue.


Figure 10. Set the productID Parameter to the GridView's SelectedValue Property

Once the ProductDetailsDataSource ObjectDataSource has been configured correctly and bound to the DetailsView, this tutorial is complete! When the page is first visited, no row is selected, so the GridView's SelectedValue property returns Nothing. Since there are no products with a NULL ProductID value, no records are returned by the GetProductByProductID(productID) method, meaning that the DetailsView isn't displayed (see Figure 11). Upon clicking a GridView row's Select button, a postback ensues and the DetailsView is refreshed. This time the GridView's SelectedValue property returns the ProductID of the selected row, the GetProductByProductID(productID) method returns a ProductsDataTable with information about that particular product, and the DetailsView shows these details (see Figure 12).


    Figure 11. When First Visited, Only the GridView is Displayed


    Figure 12. Upon Selecting a Row, the Product's Details are Displayed

EnableViewStateMac in an ASPX Page (ASP.NET)

    Setting EnableViewStateMac=true is a security measure that allows ASP.NET to

    ensure that the viewstate for a page has not been tampered with. If on

    Postback, the ASP.NET framework detects that there has been a change in the

    value of viewstate that was sent to the browser, it raises an error - Validation of

    viewstate MAC failed.

Use <%@ Page EnableViewStateMac="true" %> to set it to true (the default value, if this attribute is not specified, is also true) in an aspx page.

ASP.NET ViewState is tied to the particular server it came from by default -- even though the documentation says it isn't. So when ViewState generated on server A is POST-ed back to server B, you'll get this exception. Somewhere in the pipeline, the viewstate is salted with a unique, autogenerated machine key from the originating server's machine.config file.

http://24x7aspnet.blogspot.com/2009/10/enableviewstatemac-in-aspx-page-aspnet.html

    This is done to prevent users from somehow tampering with the ViewState. Any

    change to the ViewState data on the client will be detected. But this has a side

    effect: it also prevents multiple servers from processing the same ViewState.

One solution is to force every server in your farm to use the same key: generate a hex-encoded 64-bit or 128-bit key and put it in each server's machine.config:
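The config snippet was lost in this copy. A representative machineKey element follows; the key values here are truncated placeholders, not real keys, and you would substitute your own generated hex strings:

```
<machineKey
    validationKey="6FB146...your-own-generated-hex-key...D871AF"
    decryptionKey="2A7BE3...your-own-generated-hex-key...9C0E44"
    validation="SHA1" />
```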

    Or you can disable the keying of viewstate to a particular server using a simple

    page directive at the top of your .aspx pages:
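The directive itself is missing from this copy; it is the EnableViewStateMac attribute on the Page directive (the Language attribute here is an illustrative assumption):

```
<%@ Page Language="C#" EnableViewStateMac="false" %>
```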

    Alternately, you can modify the pages element in Web.config:

    ...
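The web.config fragment is also missing from this copy; from context, it sets the same attribute on the pages element (a sketch):

```
<system.web>
    <pages enableViewStateMac="false" />
</system.web>
```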

Basically, we knew that some problems might be overlooked while moving an application from classic mode to integrated mode, most importantly the fact that modules in system.web/httpModules would no longer be used and only those in system.webServer/modules would. If any of those modules in system.web was used as a security feature or added any interesting behavior, it could be that developers didn't realize it was no longer running, potentially introducing issues in the application.

That is the reason this flag was introduced: by default, any of these potential issues are flagged as errors (500.23, if I remember correctly). But once developers look into their settings and verify that even though modules are listed in the system.web settings, that is the right configuration (probably duplicating the modules in system.webServer), they can turn off that warning and allow the application to run in both modes.

That flag will not alter the behavior of the application in any way; it will only tell IIS not to flag an error when one of the three typical configuration issues is identified:

    ASP.NET 2.0 applications on IIS 7.0 are hosted using the ASP.NET Integrated

    mode by default. This new mode enables a myriad of exciting scenarios

    including using super-valuable ASP.NET features like Forms Authentication for

    your entire Web site, and developing new ASP.NET modules to do things like

    URL rewriting, authorization, logging, and more at the IIS level. As you know,

    with great power comes great responsibility. Similarly, with making ASP.NET

    applications more powerful in IIS 7.0 comes the responsibility of making sure

    that existing ASP.NET applications continue to work. This has been a major

    challenge for us as we re-architected the entire core engine of ASP.NET, and in

    the end we were highly successful in meeting it. As a result, most ASP.NET

    applications should work without change. This post lists the changes in

    behavior that you may encounter when deploying your ASP.NET applications on

    IIS 7.0 on Windows Vista SP1 and Windows Server 2008. Unless noted, these

    breaking changes occur only when using the default ASP.NET Integrated mode.

Using Classic ASP.NET mode

IIS 7.0 also offers the ability to run ASP.NET applications using the legacy Classic ASP.NET integration mode, which works the same way as ASP.NET has worked on previous versions of IIS. However, we strongly recommend that you use a workaround where available to change your application to work in Integrated mode instead. Moving to Classic mode will make your application unable to take advantage of ASP.NET improvements made possible in Integrated mode, or to leverage future features from both Microsoft and third parties that may require the Integrated mode. Use Classic mode as a last resort if you cannot apply the specified workaround.

Migration errors

These errors occur due to changes in how some ASP.NET configuration is applied in Integrated mode. IIS will automatically detect this configuration and issue an error asking you to migrate your application, or move it to classic mode if migration is not acceptable (see breaking change #3 below).


1) ASP.NET applications require migration when specifying configuration in <httpModules> or <httpHandlers>.

You will receive a 500 - Internal Server Error. This can include HTTP Error 500.22 and HTTP Error 500.23: An ASP.NET setting has been detected that does not apply in Integrated managed pipeline mode.

It occurs because ASP.NET modules and handlers should be specified in the IIS <modules> and <handlers> configuration sections in Integrated mode.

    Workaround:

    1) You must migrate the application configuration to work properly in

    Integrated mode. You can migrate the application configuration with AppCmd:

> %windir%\system32\inetsrv\Appcmd migrate config ""

2) You can migrate manually by moving the custom entries in the system.web/httpModules and system.web/httpHandlers configuration to the system.webServer/modules and system.webServer/handlers configuration sections, and either removing the httpModules and httpHandlers configuration OR adding the following to your application's web.config:
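The web.config snippet did not survive extraction; from context, it is the validation element that disables this configuration check (a sketch):

```
<system.webServer>
    <validation validateIntegratedModeConfiguration="false" />
</system.webServer>
```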

2) ASP.NET applications produce a warning when the application enables request impersonation by specifying <identity impersonate="true" /> in configuration.

You will receive a 500 - Internal Server Error. This is HTTP Error 500.24: An ASP.NET setting has been detected that does not apply in Integrated managed pipeline mode.

It occurs because ASP.NET Integrated mode is unable to impersonate the request identity in the BeginRequest and AuthenticateRequest pipeline stages.

    Workaround:

1) If your application does not rely on impersonating the requesting user in the BeginRequest and AuthenticateRequest stages (the only stages where impersonation is not possible in Integrated mode), ignore this error by adding the following to your application's web.config:
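The snippet is missing from this copy; from context, it is the validation element that suppresses this check (a sketch):

```
<system.webServer>
    <validation validateIntegratedModeConfiguration="false" />
</system.webServer>
```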

    2) If your application does rely on impersonation in BeginRequest and

    AuthenticateRequest, or you are not sure, move to classic mode.

3) You receive a configuration error when your application configuration includes an encrypted section.

You will receive a 500 - Internal Server Error. This is HTTP Error 500.19: The requested page cannot be accessed because the related configuration data for the page is invalid. The detailed error information indicates that configuration section encryption is not supported.

It occurs because IIS attempts to validate the section and fails to read section-level encryption.

    Workaround:

    1) If your application does not have the problem with request impersonation

    per breaking change #2, migrate your application configuration by usingAppCmd as described in breaking change #1:

> %windir%\system32\inetsrv\Appcmd migrate config "<Application Path>"

This will ensure that the rest of the application configuration is migrated, and

automatically add the following to your application's web.config to ignore the

encrypted section:

    2) If your application does have the problem with request impersonation, move

    to classic mode.


Authentication, Authorization, and Impersonation

In Integrated mode, both IIS and ASP.NET authentication stages have been

    unified. Because of this, the results of IIS authentication are not available until

    the PostAuthenticateRequest stage, when both ASP.NET and IIS authentication

    methods have completed. This causes the following changes:

    4) Applications cannot simultaneously use FormsAuthentication and

    WindowsAuthentication

    Unlike Classic mode, it is not possible to use Forms Authentication in ASP.NET

    and still require users to authenticate with an IIS authentication method

    including Windows Authentication, Basic Authentication, etc. If Forms

    Authentication is enabled, all other IIS authentication methods except for

    Anonymous Authentication should be disabled.

    In addition, when using Forms Authentication, the following changes are in

    effect:

    - The LOGON_USER server variable will be set to the name of the Forms

    Authentication user.

- It will not be possible to impersonate the authenticated client. To impersonate the

authenticated client, you must use an authentication method that produces a

    Workaround:

1) Change your application to use the pattern explained in "Implementing a

two-level authentication scheme using Forms Authentication and

another IIS authentication method in IIS 7.0".

    5) Windows Authentication is performed in the kernel by default. This may cause HTTP

    clients that send credentials on the initial request to fail.

Kernel-mode authentication is enabled by default in IIS 7.0. This

improves the performance of Windows Authentication, and simplifies the

deployment of the Kerberos authentication protocol. However, it may cause some

clients that send the Windows credentials on the initial request to fail due to a

    design limitation in kernel-mode authentication. Normal browser clients are

    not affected because they always send the initial request anonymously.

    NOTE: This breaking change applies to both Classic and Integrated modes.

http://mvolo.com/blogs/serverside/archive/2008/02/11/IIS-7.0-Two_2D00_Level-Authentication-with-Forms-Authentication-and-Windows-Authentication.aspx

    Workaround:

1) Disable kernel-mode authentication by setting the useKernelMode attribute to false

    in the system.webServer/security/authentication/windowsAuthentication

    section. You can also do it by AppCmd as follows:

    > %windir%\system32\inetsrv\appcmd set config

    /section:windowsAuthentication /useKernelMode:false
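Equivalently, the same setting can be expressed as a configuration fragment (a sketch of the section the AppCmd command above modifies; note that the windowsAuthentication section is typically locked at the server level and may need to be unlocked before it can be set in web.config):

```xml
<system.webServer>
  <security>
    <authentication>
      <!-- Disable kernel-mode Windows Authentication for this application -->
      <windowsAuthentication enabled="true" useKernelMode="false" />
    </authentication>
  </security>
</system.webServer>
```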

    6) Passport authentication is not supported.

    You will receive an ASP.NET 500 Server Error: The PassportManager object

    could not be initialized. Please ensure that Microsoft Passport is correctly

    installed on the server.

Passport authentication is no longer supported on Windows Vista and Windows

Server 2008. NOTE: This breaking change applies to both Classic and

    Integrated modes.

    7) HttpRequest.LogonUserIdentity throws an InvalidOperationException when accessed in a

    module before PostAuthenticateRequest

    You will receive an ASP.NET 500 Server Error: This method can only be called

    after the authentication event.

    HttpRequest.LogonUserIdentity throws an InvalidOperationException when

    accessed before PostAuthenticateRequest, because the value of this property is

    unknown until after the client has been authenticated.

    Workaround:

    1) Change the code to not access HttpRequest.LogonUserIdentity until at least

    PostAuthenticateRequest

    8) Client impersonation is not applied in a module in the BeginRequest and

    AuthenticateRequest stages.

    The authenticated user is not known until the PostAuthenticateRequest stage.

    Therefore, ASP.NET does not impersonate the authenticated user for ASP.NET

    modules that run in BeginRequest and AuthenticateRequest stages. This can


affect your application if you have custom modules that rely on

impersonating the client for validating access to or accessing resources in

these stages.

    Workaround:

    1) Change your application to not require client impersonation in BeginRequest

    and AuthenticateRequest stages.

9) Defining a DefaultAuthentication_OnAuthenticate method in global.asax throws

    PlatformNotSupportedException

    You will receive an ASP.NET 500 Server Error: The

    DefaultAuthentication.Authenticate method is not supported by IIS integrated

    pipeline mode.

In Integrated mode, the DefaultAuthenticationModule.Authenticate event is not

implemented and hence is no longer raised. In Classic mode, this event is raised

    when no authentication has occurred.

    Workaround:

    1) Change application to not rely on the DefaultAuthentication_OnAuthenticate

    event. Instead, write an IHttpModule that inspects whether HttpContext.User is

    null to determine whether an authenticated user is present.
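A minimal sketch of such a module follows; the class and method names are illustrative, not from the original document, and the placeholder comment marks where your application-specific logic would go:

```csharp
using System;
using System.Web;

// Illustrative replacement for DefaultAuthentication_OnAuthenticate:
// runs in PostAuthenticateRequest and checks whether a user was authenticated.
public class AuthenticationCheckModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.PostAuthenticateRequest += new EventHandler(OnPostAuthenticateRequest);
    }

    private void OnPostAuthenticateRequest(object sender, EventArgs e)
    {
        HttpApplication app = (HttpApplication)sender;
        if (app.Context.User == null || !app.Context.User.Identity.IsAuthenticated)
        {
            // No authenticated user is present; run the logic that previously
            // lived in DefaultAuthentication_OnAuthenticate.
        }
    }

    public void Dispose() { }
}
```

The module would then be registered in the system.webServer/modules configuration section so that it runs in Integrated mode.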

    10)Applications that implement WindowsAuthentication_OnAuthenticate in global.asax will

not be notified when the request is anonymous

    If you define the WindowsAuthentication_OnAuthenticate method in

    global.asax, it will not be invoked for anonymous requests. This is because

    anonymous authentication occurs after the WindowsAuthentication module can

    raise the OnAuthenticate event.

    Workaround:

    1) Change your application to not use the

    WindowsAuthentication_OnAuthenticate method. Instead, implement an

    IHttpModule that runs in PostAuthenticateRequest, and inspects

    HttpContext.User.


Request limits and URL processing

The following changes result from additional restrictions on how IIS processes

    incoming requests and their URLs.

11)Request URLs containing unencoded + characters in the path (not querystring) are

rejected by default

    You will receive HTTP Error 404.11 Not Found: The request filtering module is

    configured to deny a request that contains a double escape sequence.

    This error occurs because IIS is by default configured to reject attempts to

    doubly-encode a URL, which commonly represent an attempt to execute a

    canonicalization attack.

    Workaround:

    1) Applications that require the use of the + character in the URL path can

    disable this validation by setting the allowDoubleEscaping attribute in the

    system.webServer/security/requestFiltering configuration section in the

application's web.config. However, this may make your application more

    vulnerable to malicious URLs:
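The configuration snippet this workaround refers to was lost in extraction; a sketch of the relevant setting is:

```xml
<system.webServer>
  <security>
    <!-- Allows + and other double-escape sequences in the URL path;
         weighs convenience against canonicalization-attack exposure -->
    <requestFiltering allowDoubleEscaping="true" />
  </security>
</system.webServer>
```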

12)Requests with querystrings larger than 2048 bytes will be rejected by default

    You will receive an HTTP Error 404.15 Not Found: The request filtering

    module is configured to deny a request where the query string is too long.

    IIS by default is configured to reject querystrings longer than 2048 bytes. This

    may affect your application if it uses large querystrings or uses cookieless

    ASP.NET features like Forms Authentication and others that cumulatively

    exceed the configured limit on the querystring size.

    NOTE: This breaking change applies to both Classic and Integrated modes.

    Workaround:


    1) Increase the maximum querystring size by setting the maxQueryString

    attribute on the requestLimits element in the

    system.webServer/security/requestFiltering configuration section in your

application's web.config:
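The snippet referenced here did not survive extraction; a sketch of the setting is shown below (the 4096-byte value is an example, not a recommendation — size it to your application's needs):

```xml
<system.webServer>
  <security>
    <requestFiltering>
      <!-- maxQueryString is specified in bytes; the default is 2048 -->
      <requestLimits maxQueryString="4096" />
    </requestFiltering>
  </security>
</system.webServer>
```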

Changes in response header processing

These changes affect how response headers are generated by the application.

    13)IIS always rejects new lines in response headers (even if ASP.NET enableHeaderChecking

    is set to false)

If your application writes headers with line breaks (any combination of \r or \n),

    you will receive an ASP.NET 500 Server Error: Value does not fall within the

    expected range.

    IIS will always reject any attempt to produce response headers with line breaks,

even if ASP.NET's enableHeaderChecking behavior is disabled. This is done to

    prevent header splitting attacks.

    NOTE: This breaking change applies to both Classic and Integrated modes.

    14)When the response is empty, the Content-Type header is not suppressed

    If the application sets a Content-Type header, it will remain present even if the

    response is cleared. Requests to ASP.NET content types will typically have the

Content-Type: text/html header present on responses unless overridden by the application.

    Workaround:

    1) While this should not typically have a breaking effect, you can remove the

    Content-Type header by explicitly setting the HttpResponse.ContentType

    property to null when clearing the response.

  • 7/29/2019 dotNET, SQL SERVER and more.doc

    30/34

    15)When the response headers are cleared with HttpResponse.ClearHeaders, default

ASP.NET headers are not generated. This may result in the lack of the Cache-Control:

    private header that prevents the caching of the response on the client

    HttpResponse.ClearHeaders does not re-generate default ASP.NET response

    headers, including Content-Type: text/html and Cache-Control: private, as

    it does in Classic mode. This is because ASP.NET modules may call this API for

    requests to any resource type, and therefore generating ASP.NET-specific

    headers is not appropriate. The lack of the Cache-Control header may cause

    some downstream network devices to cache the response.

    Workaround:

1) Change the application to manually generate the Cache-Control: private header

    when clearing the response, if it is desired to prevent caching in downstream

    network devices.
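A minimal sketch of this workaround, assuming the clearing happens in code with access to the current response (the text/html Content-Type is an assumption that the response is an ASP.NET page):

```csharp
using System.Web;

HttpResponse response = HttpContext.Current.Response;
response.ClearHeaders();
// Re-generate the headers ASP.NET would otherwise have produced in Classic mode.
response.ContentType = "text/html";
// Re-establish Cache-Control: private so downstream devices do not cache the response.
response.Cache.SetCacheability(HttpCacheability.Private);
```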

Changes in application and module event processing

These changes affect how application and module event processing takes

    place.

    16)It is not possible to access the request through the HttpContext.Current property in

    Application_Start in global.asax

    If your application accesses the current request context in the Application_Start

    method in global.asax as part of application initialization, you will receive an

    ASP.NET 500 Server Error: Request is not available in this context.

    This error occurs because ASP.NET application initialization has been decoupled

    from the request that triggers it. In Classic mode, it was possible to indirectly

    access the request context by accessing the HttpContext.Current property. In

    Integrated mode, this context no longer represents the actual request and

    therefore attempts to access the Request and Response objects will generate

    an exception.

    Workaround:

1) See "Request is not available in this context exception in

Application_Start" for a detailed description of this problem and available

    workarounds.

http://mvolo.com/blogs/serverside/archive/2007/11/10/Integrated-mode-Request-is-not-available-in-this-context-in-Application_5F00_Start.aspx

17)The order in which module event handlers execute may be different than in Classic mode

    The following differences exist:

- For each event, event handlers for each module are executed in the order in which

modules are configured in the <system.webServer>/<modules> configuration section. Global.asax event handlers are executed last.

- Modules that register for the PreSendRequestHeaders and PreSendRequestContent events are notified in the reverse of the order in which they appear in the <modules> configuration section

    - For each event, synchronous event handlers for each module are executed before

    asynchronous handlers. Otherwise, event handlers are executed in the order in

which they are registered.

Applications that have multiple modules configured to run in either of these

events may be affected by these changes if they share a dependency on event

    ordering. This is not likely for most applications. The order in which modules

    execute can be obtained from a Failed Request Tracing log.

    Workaround:

    1) Change the order of the modules experiencing an ordering problem in the

    system.webServer/modules configuration section.

    18)ASP.NET modules in early request processing stages will see requests that previously may

    have been rejected by IIS prior to entering ASP.NET. This includes modules running in

    BeginRequest seeing anonymous requests for resources that require authentication.

    ASP.NET modules can run in any pipeline stages that are available to native IIS

modules. Because of this, requests that previously may have been rejected in the authentication stage (such as anonymous requests for resources that

    require authentication) or other stages prior to entering ASP.NET may run

    ASP.NET modules.

    This behavior is by design in order to enable ASP.NET modules to extend IIS in

    all request processing stages.

    Workaround:

1) Change application code to avoid any application-specific problems that arise from seeing requests that may be rejected later on during request

    processing. This may involve changing modules to subscribe to pipeline events

    that are raised later during request processing.


Other application changes

Other changes in the behavior of ASP.NET applications and APIs.

    19)DefaultHttpHandler is not supported. Applications relying on sub-classes of

DefaultHttpHandler will not be able to serve requests.

    If your application uses DefaultHttpHandler or handlers that derive from

    DefaultHttpHandler, it will not function correctly. In Integrated mode, handlers

    derived from DefaultHttpHandler will not be able to pass the request back to IIS

    for processing, and instead serve the requested resource as a static file.

    Integrated mode allows ASP.NET modules to run for all requests without

    requiring the use of DefaultHttpHandler.

    Workaround:

    1) Change your application to use modules to perform request processing for

    all requests, instead of using wildcard mapping to map ASP.NET to all requests

    and then using DefaultHttpHandler derived handlers to pass the request back

    to IIS.

20)It is possible to write to the response after an exception has occurred.

In Integrated mode, it is possible to write to and display an additional response

    written after an exception has occurred, typically in modules that subscribe to

    the LogRequest and EndRequest events. This does not occur in Classic mode. If

    an error occurs during the request, and the application writes to the response

    in EndRequest after the exception has occurred, the response information

    written in EndRequest will be shown. This only affects requests that include

    unhandled exceptions. To avoid writing to the response after an exception, an

    application should check HttpContext.Error or HttpResponse.StatusCode before

    writing to the response.

    21)It is not possible to use the ClearError API to prevent an exception from being written to

    the response if the exception has occurred in a prior pipeline stage

    Calling Server.ClearError during the EndRequest event does not clear the

    exception if it occurred during an earlier event within the pipeline. This is


    because the exception is formatted to the response at the end of each event

    that raises an exception.

    Workaround:

    1) Change your application to call Server.ClearError from the

    Application_OnError event handler, which is raised whenever an exception is

    thrown.

22)HttpResponse.AppendToLog does not automatically prepend the querystring to the URL.

    When using HttpResponse.AppendToLog to append a custom string to the URL

logged in the request log file, you will need to manually prepend the

querystring to the string you pass to this API. This may result in existing code losing the querystring from the logged URL when this API is used.

    Workaround:

1) Change your application to manually prepend

HttpRequest.QueryString.ToString() to the string passed to

HttpResponse.AppendToLog.
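For example (a sketch; the custom field name is illustrative):

```csharp
using System.Web;

HttpRequest request = HttpContext.Current.Request;
HttpResponse response = HttpContext.Current.Response;

// In Integrated mode, prepend the querystring yourself before appending
// any custom data to the logged URL.
string custom = "myCustomField=value"; // hypothetical application-specific data
response.AppendToLog(request.QueryString.ToString() + "&" + custom);
```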

Other changes

    23)ASP.NET threading settings are not used to control the request concurrency in Integrated

    mode

    The minFreeThreads, minLocalRequestFreeThreads settings in the

    system.web/httpRuntime configuration section and the maxWorkerThreads

setting in the processModel configuration section no longer control the

    threading mechanism used by ASP.NET. Instead, ASP.NET relies on the IIS

thread pool and allows you to control the maximum number of concurrently executing requests by setting the MaxConcurrentRequestsPerCPU DWORD

    value (default is 12) located in the

    HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\ASP.NET\2.0.50727.0 key. This

    setting is global and cannot be changed for individual application pools or

    applications.


    Workaround:

    1) To control the concurrency of your application, set the

    MaxConcurrentRequestsPerCPU setting.
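This registry value can be set with a .reg file or reg.exe; a sketch follows (the value 0x18, 24 decimal, is only an example):

```
Windows Registry Editor Version 5.00

; Sets the global limit on concurrently executing ASP.NET requests per CPU
[HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\ASP.NET\2.0.50727.0]
"MaxConcurrentRequestsPerCPU"=dword:00000018
```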

    24)ASP.NET application queues are not used in Integrated mode. Therefore, the ASP.NET

    Applications\Requests in Application Queue performance counter will always have a

    value of 0

    ASP.NET does not use application queues in Integrated mode.

25)IIS 7.0 always restarts ASP.NET applications when changes are made to the application's

root web.config file. Because of this, the waitChangeNotification and

maxWaitChangeNotification attributes have no effect.

IIS 7.0 monitors changes to the web.config files as well, and causes the ASP.NET

    application corresponding to this file to be restarted without regard to the

    ASP.NET change notification settings including the waitChangeNotification and

    maxWaitChangeNotification attributes in the system.web/httpRuntime

    configuration sections.

    26)ASP.NET deadlock detection is not supported in Integrated mode.

ASP.NET 2.0 Classic mode supports a deadlock detection mechanism which monitors ASP.NET responses and considers the process deadlocked if no

responses have been received in more than the amount of time specified in the

responseDeadlockInterval setting in the processModel configuration section, and there are more than a

    certain number of requests outstanding. In this case, the process is marked

    unhealthy and a new process is eventually started. In Integrated mode, this

feature is not supported. If your application is prone to deadlocks, this may lead

to deadlock conditions going undetected; however, it should have no effect on

    healthy applications that do not exhibit deadlocks.