LightSwitch 2011 with Azure


LightSwitch 2011 with Azure
Andrew Butenko
http://andrewbutenko.wordpress.com
@andrew_butenko
Intermediate level

Objectives and Takeaways

Session Objectives:
- You will understand the existing and upcoming capabilities of the Microsoft Azure cloud and LightSwitch 2011
- Know how to choose your architecture building blocks
- Restrictions
- The benefits of combining LightSwitch with Azure

Goals:
- Understand, by the end, the existing and upcoming deployment aspects of LightSwitch and Azure
- Learn how to optimize your design
- Keep it as simple as possible

The key takeaway is that LightSwitch and Windows Azure together provide a comprehensive and easy-to-use offering that allows you to build business applications in a rapid manner, and, quite possibly, to distribute them to customers all over the world.

Architecture Decision Matrix

To provide organizations with mission-critical performance and availability, you have to select a model with an acceptable and balanced level of control and responsibility. There is a variety of hybrid solutions you can choose from:
- Infrastructure as a Service (IaaS): Amazon EC2
- Platform as a Service (PaaS): Azure
- Software as a Service (SaaS): Gmail, Office 365, Hotmail

Infrastructure as a Service

This is a common solution in current IT: bring us your VM and we will deploy it for you. Datacentres provide the infrastructure and host your virtual machines. Currently, IT departments tend to virtualize 50%-80% of their servers and move some of them to internet providers' datacentres. Amazon EC2 is a typical IaaS.

Platform as a Service

- Front end: e.g. load-balanced stateless web servers
- Middle worker tier: e.g. order processing, encoding
- Backend storage: e.g. SQL tables or files

SQL Server 2012 is Microsoft's latest cloud-ready information platform. Organizations can use SQL Server 2012 to efficiently protect, unlock, and scale the power of their data across the desktop, mobile devices, the datacenter, and either a private or public cloud. Building on the success of the SQL Server 2008 R2 release, SQL Server 2012 has made a strong impact on organizations worldwide with its significant capabilities.

Platform as a Service Security Model

Most security domains are handled by the provider. Perhaps LightSwitch applications' most exciting deployment scenario is that of running in the cloud, using the Microsoft Windows Azure Platform as a Service (PaaS) cloud environment and its SQL Azure Database as a Service environment. PaaS is acceptable for many Independent Service Providers who do not need infrastructure, only the application and the data.

Hybrid Solutions

It's pretty rare that an organization will take the entire set of applications it runs on-premises today and just move all of it into the cloud. It's more realistic to build hybrid solutions, meaning that certain parts of the computing are done locally, and other parts are done by a cloud provider.

Creating Line of Business applications using SQL Azure and LightSwitch 2011, based on rapid development and a full application life cycle, is a key point. The excitement stems not just from the power of running as a cloud-deployed application, but also because LightSwitch allows developers to take advantage of the cloud with fewer barriers and greater simplicity than other Microsoft application development environments.

Private or Public?

For example, an organization can develop and deploy applications and database solutions on traditional nonvirtualized environments, on appliances, and in on-premises private clouds or off-premises public clouds. Moreover, these solutions can easily integrate with one another, offering a fully integrated hybrid solution.

Windows Azure Service Model

A Windows Azure application is called a service. It has:
- Definition information
- Configuration information
- At least one role

Roles are like DLLs in the service process: a collection of code with an entry point that runs in its own virtual machine. The Windows Azure compute SLA requires two instances of each role: 99.95% availability for connectivity to two instances, achieved with update and fault domains.

The developer specifies how many instances of the Web role should run, and the Windows Azure fabric controller creates this number of VMs. The fabric controller also monitors these instances, making sure that the requested number is always available. For data storage, the application uses Windows Azure storage tables, which provide scale-out storage capable of handling very large amounts of data.
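The monitoring behaviour described above can be pictured as a reconciliation loop: compare the requested instance count against the healthy instances and repair the difference. The sketch below is a hypothetical toy model, not the real fabric controller API; the class and field names are made up for illustration.

```python
# Toy model of the fabric controller idea: keep the number of healthy
# role instances equal to the requested count, replacing failed VMs.
class FabricController:
    def __init__(self, requested_instances):
        self.requested = requested_instances
        self.running = []          # list of simulated VM records
        self._next_id = 0

    def monitor(self):
        """One reconciliation pass: drop failed instances, then
        create new ones until the requested count is reached."""
        self.running = [vm for vm in self.running if vm["healthy"]]
        while len(self.running) < self.requested:
            self.running.append({"id": self._next_id, "healthy": True})
            self._next_id += 1
        return len(self.running)

controller = FabricController(requested_instances=3)
controller.monitor()                        # creates 3 VMs
controller.running[0]["healthy"] = False    # simulate a node failure
count = controller.monitor()                # failed VM removed, replaced
assert count == 3
```

Real instances run on separate fault domains, so a single hardware failure never takes out all replicas at once.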

Within one service you can have two environments: Staging and Production.

Virtualization allows complete isolation between different customers' code (the multitenant model).

- Web Role: front-end service
- Worker Role: background process/service
- VM: a stateless module (if you lose a VM it is only a capacity issue; your image is used to recreate the VM)

Data centres are continentally distributed. The physical address of each datacentre is a closely guarded secret.

Multi-Instance Approach

- Automated, consistent application updates
- Automated, consistent configuration changes
- Multi-instance management
- Scale out
- High availability
- Automated, consistent OS security
- Scalability: 2 instances = 99.95% availability

Multi vs single instance:
- 1 instance by default (no cloud benefit)
- 2 instances = 99.95% availability (to achieve the same in Amazon EC2 you need 2 VMs in different domains/data centers)
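The 99.95% figure above is a contractual SLA rather than a derived number, but the general effect of redundancy can be illustrated with basic probability: if a single instance is up a fraction a of the time and failures are independent, n instances give availability 1 - (1 - a)^n. The 98% single-instance figure below is a made-up example value.

```python
# Illustration of why a second instance raises availability so sharply,
# assuming independent failures (which fault domains try to approximate).
def combined_availability(single: float, n: int) -> float:
    """Probability that at least one of n independent instances is up."""
    return 1 - (1 - single) ** n

print(combined_availability(0.98, 1))   # 0.98
print(combined_availability(0.98, 2))   # 0.9996
```

Two moderately reliable instances already push combined availability past "three nines", which is why the SLA requires a minimum of two.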

However, the availability of datacentres is not always 100%. On 29 February 2012 all servers were unavailable, and CodePlex TFS was unavailable many times in March 2012.


Demo: creating a new service

Adding instances

Database interface

Reporting interface

Data sync is based on three points: the Azure database, the hub, and the on-premises database.

Application Security
- Input Validation
- Authentication
- Configuration Management
- Sensitive Data
- Session Management
- Cryptography
- Parameter Manipulation
- Exception Management
- Auditing and Logging

Life is different in the cloud, and developers will need to adapt to those differences.

Applications running in the cloud should be implemented according to application security best practices. Use the loginname@servername format for the login, since certain tools implement TDS differently. Outbound and inbound refer to the direction of traffic with reference to the router. Microsoft's cloud infrastructure has obtained the Federal Information Security Management Act of 2002 (FISMA) Authorization to Operate (ATO), which authorizes US government organizations to store their data in Windows Azure.

Each Windows Azure subscription can create one or more storage accounts, and each storage account has a single secret key, called the Storage Account Key (SAK). Any application that wants to have access to data in a storage account needs to have the appropriate SAK.

As an example of the login format: if your server name is mv2abek9r7.database.windows.net and your administrator login is testsa, use testsa@mv2abek9r7 as the login.

Identity in the Cloud

Windows Azure supports role-based (Windows AD and ASP.NET Forms authentication) and claims-based identity management. Role-based authorization grants access based on user membership (user -> role -> permission). In claims-based identity, the claims system issues a token; the application opens the token, extracts the claims, and uses them to execute the business process. Custom identity (SQL Azure) federation enables a Single Sign-On (SSO) experience.
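The two authorization styles above can be contrasted in a few lines. This is a deliberately minimal sketch: the role table, claim names, and the semicolon-delimited "token" are invented for illustration; real systems use signed SAML/SWT tokens issued by a security token service.

```python
# Role-based: user -> role -> permission lookup owned by the application.
ROLE_PERMISSIONS = {"manager": {"approve_order"}, "clerk": {"create_order"}}

def authorize_by_role(user_roles, permission):
    return any(permission in ROLE_PERMISSIONS.get(r, set())
               for r in user_roles)

# Claims-based: the application opens the token, extracts the claims,
# and acts on them directly; the issuer decided what the user may do.
def authorize_by_claims(token, permission):
    claims = dict(pair.split("=") for pair in token.split(";"))
    return claims.get("permission") == permission

print(authorize_by_role(["manager"], "approve_order"))   # True
print(authorize_by_claims("issuer=acs;permission=approve_order",
                          "approve_order"))              # True
```

Note where the knowledge lives: the role table belongs to the application, while in the claims model the mapping is externalized to the token issuer, which is what makes adding new partner organisations cheap.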

- A SharePoint 2010 farm HAS a Windows-to-claims-to-Windows convertor
- SQL Azure does NOT currently support Windows Integrated authentication
- On-premises SQL Server supports Windows and SQL authentication but does NOT support claims-based authentication

Federated Security

Access Control Services allows you to federate security. It doesn't just work with Active Directory: you also have the ability to use Google authentication or Facebook, and you could allow people to come in from Yahoo or Live.com. The interesting thing is that the credentials are not sent to Azure; they are verified against the identity provider's system, say Facebook. If I try to hit an Azure asset with Access Control Services layered in, it will tell me to go log in with Facebook, and Facebook will then send the claim of who I am.

Federation:
- One application provides services to customers from different organisations.
- Role-based: the organisation requires the application to manage entities from many organisations in a single location.
- Claims-based authorization provides a much simpler solution by using federation.
- Adding new partners requires only the definition of new claim-mapping rules.

Transport Security

- Inbound ACL (outgoing traffic): permit tcp any gt 1023 host 1433
- Outbound ACL (return traffic): permit tcp host 1433 any gt 1023 established

Remember to encrypt your connection in SSMS, to avoid man-in-the-middle attacks when you establish a new connection to SQL Azure via SSMS.

RS security is based on a SQL Azure username/password, while the application is based on forms authentication; how do you join them transparently? URL parameters passed by GET need to be encrypted or binary encoded to avoid localisation and security issues:
http://my123.cloudapp.net/my.aspx?id=1
http://my123.cloudapp.net/my.aspx?id=XD01FF112
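One way to realise the "encode the GET parameter" advice above is to replace a plain ?id=1 with an opaque hex token that carries the id plus an HMAC, so users can neither enumerate nor tamper with record ids. This is an illustrative sketch, not the encoding the presented application actually used; the secret key and the 8-byte truncated MAC are assumptions.

```python
# Opaque, tamper-evident id parameter: hex(id_bytes + HMAC-SHA256 tag).
import binascii
import hashlib
import hmac

SECRET = b"server-side-secret"   # assumption: never shipped to the client

def encode_id(record_id: int) -> str:
    raw = str(record_id).encode()
    tag = hmac.new(SECRET, raw, hashlib.sha256).digest()[:8]
    return binascii.hexlify(raw + tag).decode().upper()

def decode_id(token: str) -> int:
    data = binascii.unhexlify(token.lower())
    raw, tag = data[:-8], data[-8:]          # fixed-length 8-byte tag
    expected = hmac.new(SECRET, raw, hashlib.sha256).digest()[:8]
    if not hmac.compare_digest(tag, expected):
        raise ValueError("tampered id parameter")
    return int(raw)

token = encode_id(1)     # something like "31" + 16 hex digits of MAC
assert decode_id(token) == 1
```

This hides the sequential structure of the ids and rejects forged URLs; for confidentiality of the id itself you would additionally encrypt rather than merely sign it.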

Windows Azure Code Access Security

Keep sensitive data on premises (credit cards, encryption/decryption keys). Test access under different user accounts. Encrypt selectively (e.g. two doctors, one claims table: how to protect the patient yet provide confidential data within the clinic). SQL Azure does not currently support the Transparent Data Encryption (TDE) feature available in Microsoft SQL Server.

AppFabric Access Control Service

Provides rules-driven, claims-based authorization for:
- Web applications
- REST web services
- SOAP web services

Key features:
- Broad identity provider support: AD Federation Services v2.0; Live ID, Facebook, Google, Yahoo; private identity store
- WS-Trust and WS-Federation protocol support
- Full integration with Windows Identity Foundation (WIF)
- Configurable through the new management web portal

AppFabric Access Control Services (ACS) 2.0 may accept claims from different providers and convert them.

Access to Data

Transfer:
- BCP
- SSIS
- SQL Azure Migration Wizard
- SQL Server Migration Assistant

Connect:
- Windows Azure Connect (*)
- Windows Azure AppFabric Access Control Service (ACS)

You can transfer data to SQL Azure with a BCP script or by using SQL Server Integration Services. The SQL Azure Migration Wizard is an open source tool built and supported by the community: http://sqlazuremw.codeplex.com/. SQL Server Migration Assistant has been updated to support migration from MySQL to SQL Azure. SSMA for Access v4.2 allows you to upsize your data from Access to the full relational database of SQL Azure.

Both provide a means to allow users of a corporate or disparate network to be able to access services offered through the Windows Azure platform.

Data Sync Agent works in conjunction with the SQL Azure Data Sync service. It enables on-premises SQL Server data to be synchronized with SQL Azure databases.

Windows Azure Connect

Windows Azure Connect represents a VPN-like environment that allows the corporate network to become aware of the Windows Azure roles. Windows Azure AppFabric ACS is more a model for identity federation. Think of it as having an ID to enter your corporation's premises, but with the added functionality of access to a partner's network. It provides an opportunity to extend authentication services to integrate with each other, so that users who consume services offered through the Windows Azure Platform are not required to remember different sets of credentials.

On-Premises Synchronisation

Windows Azure Connect enables us to configure connections between on-premises local computers and the roles running on the Windows Azure platform. Once the connection is configured, role instances in Windows Azure will use an IP addressing scheme similar to what you use with other networked components.

If the intention is to have your Windows Azure role belong to a domain, you will need to consider the following:
1. Review a diagram of the different components needed to work with Windows Azure Connect.
2. Install the endpoint software on the on-premises systems.
3. Open port 443 (TCP) outbound on the local computers.
4. Collect the settings needed to join the Windows Azure role to the domain, including the activation token.
5. Activate your Windows Azure roles for Windows Azure Connect.
6. Create and configure a group of local endpoints.

The policy becomes available when you add the local machine to a Connect endpoint group on the Azure management portal. So go to the portal, select "Virtual Network" and create a group; add your desired Azure role and your local endpoint to the group, then refresh the policy by right-clicking the Connect agent icon on your local machine. To install Windows Azure Connect (set up a local endpoint) on your local computer, use the online setup (a URL with an activation key). Follow the instructions at http://msdn.microsoft.com/en-us/library/windowsazure/gg433071.aspx to activate Windows Azure roles for Windows Azure Connect.


SQL Types Restrictions

SQL Azure does NOT support UDTs, hierarchyid, XML features, NOT FOR REPLICATION, or many WITH (...) options.

Geometry and geography support was recently added with Bing Maps integration.

Components

Suppose a web role and a worker role are processing data and storing pictures. Perhaps you don't put the metadata within a table; perhaps the partitioning scheme that you choose doesn't quite fit within a table storage environment. You could use the relational database management system (RDBMS) features within SQL Azure to store metadata for very fast retrieval, search, and so on. As you can see, each of these components has a place and fits well within a certain environment, so you pick them as needed. That's what makes a hybrid environment.

Main Parts

- Compute: runs applications in the cloud. Those applications largely see a Windows Server environment, although the Windows Azure programming model isn't exactly the same as the on-premises Windows Server model.
- Storage: stores binary and structured data in the cloud.
- Fabric Controller: deploys, manages, and monitors applications. The fabric controller also handles updates to system software throughout the platform.
- Content Delivery Network (CDN): speeds up global access to binary data in Windows Azure storage by maintaining cached copies of that data around the world.
- Connect: allows creating IP-level connections between on-premises computers and Windows Azure applications.

Web Application Model

This is the typical LightSwitch-in-Azure model. Maybe the application will have a short or unpredictable lifetime, and so allocating a server in the enterprise's data center isn't worthwhile. Or maybe the application needs to be up and running as quickly as possible, making the wait for internal IT to provision a server unacceptable. Or perhaps the organization believes that running the application on Windows Azure will be cheaper and simpler. Whatever the reason, this application probably doesn't need the massive scale that Windows Azure tables allow. Instead, its creators might prefer to use the relational approach they already know, complete with familiar reporting tools. In a case like this, the application might use Windows Azure together with SQL Azure. Because Windows Azure and SQL Azure provide cloud analogs to their on-premises counterparts, it can be straightforward to move the code and data for this kind of application between the two worlds.

Windows Azure Storage

In Windows Azure there are three major types of data storage: Windows Azure Storage, SQL Azure, and Azure AppFabric Cache. Each storage type has its own properties: its own pros and cons and its own typical usage scenario. Yet all three storage types are designed to protect the data they store.

- BLOBs: provide a simple interface for storing named files along with metadata for the file
- Tables (not SQL Server tables): provide lightly structured storage with a set of entities that contain a set of properties
- Queues: provide reliable storage and delivery of messages (compare with SQL Server Service Broker)

Protection Against Data Loss

Three replicated copies of the data are kept. SQL Azure does not back up your data, but Windows Azure keeps a backup copy of all data in another data center in the same part of the world. If the data center holding the main copy is unavailable or destroyed, this backup remains accessible. To protect against data loss caused by user mistakes, you have to organize a custom data extraction process: an SSIS snapshot or a sync with an on-premises database (supported on Windows 7 and higher).

BLOB Model

Customers may want some blobs to be made public. One common scenario is static data (such as pictures) that needs to be accessible directly from a browser. To achieve this, the blob container can be marked as public. Sometimes, however, you want to be able to access your blobs from a browser without making them publicly accessible to everyone. In this case, a Shared Access Signature is needed. With the key, you can create a special URL that will enable access to the blobs even without the storage access keys. This special URL should be distributed only to customers to whom you want to give access. Shared Access Signatures can only be created by the SAK owner, because the key is required to create an HMAC signature. Figure 2 describes the structure of the Shared Access Signature URL.

Blobs come in two forms:
- Block blobs, each of which can contain up to 200 gigabytes of data. To make transferring them more efficient, a block blob is subdivided into blocks. If a failure occurs, retransmission can resume with the most recent block rather than sending the entire blob again. Once all of a blob's blocks have been uploaded, the entire blob can be committed at once.
- Page blobs, which can be as large as one terabyte each. A page blob is divided into 512-byte pages, and an application is free to read and write individual pages at random in the blob.
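The Shared Access Signature mechanism described above can be sketched as follows. This is a simplified illustration of the idea only: the real Azure SAS string-to-sign has more fields (permissions, version, resource type), and the account name, key, and URL here are invented.

```python
# Sketch: the SAK owner signs "path + expiry" with HMAC and embeds the
# signature in the URL; the key itself is never revealed to the client.
import base64
import hashlib
import hmac

STORAGE_ACCOUNT_KEY = base64.b64encode(b"demo-key")   # assumption

def make_sas_url(base_url: str, blob_path: str, expiry: str) -> str:
    string_to_sign = f"{blob_path}\n{expiry}"         # simplified format
    sig = hmac.new(base64.b64decode(STORAGE_ACCOUNT_KEY),
                   string_to_sign.encode(), hashlib.sha256).digest()
    token = base64.urlsafe_b64encode(sig).decode()
    return f"{base_url}{blob_path}?se={expiry}&sig={token}"

url = make_sas_url("https://myaccount.blob.core.windows.net",
                   "/pictures/photo1.jpg", "2012-06-30T00:00:00Z")
print(url)
```

The storage service recomputes the same HMAC from the request and the key it holds; a matching signature proves the URL was minted by someone who knows the SAK, and the expiry limits how long that grant lasts.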

Scalable Web applications are useful, but they're not the only situation where Windows Azure makes sense. Think about an organization that occasionally needs lots of computing power for a parallel processing application. There are plenty of examples of this: financial modelling at a bank, rendering at a film special effects house, new drug development at a pharmaceutical company, and more. While it's possible to maintain a large cluster of machines to meet this occasional need, it's also expensive. Windows Azure can instead provide these resources as needed, offering an on-demand compute cluster. A developer can use Worker roles to create this kind of application. And while it's not the only choice, parallel applications commonly use large datasets, which might be stored in Windows Azure blobs.

Windows Azure Queues
- Provide reliable message delivery
- Simple, asynchronous work dispatch
- Programming semantics ensure that a message can be processed at least once
- Queues are highly available, durable, and performance efficient
- Maximum message size is 64 KB
- FIFO in general, but not guaranteed (compare with SQL Server Service Broker and MSMQ)
- Pulling an item from the queue doesn't delete it: it becomes invisible for a visibility timeout
- The item must be deleted before the timeout expires, or else it becomes visible again
- Queues are designed for failure

Separating Web role instances from Worker role instances makes sense. It frees the user from waiting for a long task to be processed, and it also makes scalability simpler: just add more instances of either. But why make instances explicitly delete messages? The answer is that it allows handling failures. If the Worker role instance that retrieves a message handles it successfully, it will delete the message while that message is still invisible, i.e., within its 30-second window. If a Worker role instance dequeues a message, however, then crashes before it completes the work that message specifies, it won't delete the message from the queue. When its visibility timeout expires, the message will reappear on the queue, then be read by another Worker role instance. The goal is to make sure that each message gets processed at least once.

Windows Azure storage queues don't have the same semantics as queues in Microsoft Message Queuing (MSMQ) or other more familiar technologies. For example, a conventional queuing system might offer first-in, first-out semantics, delivering each message exactly once. Windows Azure storage queues make no such promises. A message might be delivered multiple times, and there's no guarantee that messages are delivered in any particular order.

In a typical scenario, multiple Web role instances are running, each accepting work from users (step 1). To pass that work on to Worker role instances, a Web role instance writes a message into a queue (step 2). This message, which can be up to eight kilobytes, might contain a URI pointing to a blob or to an entity in a table, or something else; it's up to the application. Worker role instances read messages from this queue (step 3), then do the work the message requests (step 4). It's important to note, however, that reading a message from a queue doesn't actually delete the message. Instead, it makes the message invisible to other readers for a set period of time (which by default is 30 seconds). When the Worker role instance has completed the work this message requested, it must explicitly delete the message from the queue (step 5).
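The dequeue/visibility-timeout/delete cycle in steps 3-5 can be modelled in a few lines. This is a toy in-memory model, not the real queue service; it exists only to show why a message from a crashed worker reappears.

```python
# Toy visibility-timeout queue: get() hides a message instead of
# removing it; only an explicit delete() removes it for good.
class VisibilityQueue:
    def __init__(self, visibility_timeout=30.0):
        self.timeout = visibility_timeout
        self.messages = {}      # id -> (body, invisible_until)
        self._next_id = 0

    def put(self, body):
        self.messages[self._next_id] = (body, 0.0)
        self._next_id += 1

    def get(self, now):
        """Return (id, body) of a visible message and hide it, or None."""
        for mid, (body, until) in self.messages.items():
            if until <= now:
                self.messages[mid] = (body, now + self.timeout)
                return mid, body
        return None

    def delete(self, mid):
        self.messages.pop(mid, None)

q = VisibilityQueue(visibility_timeout=30.0)
q.put("encode video 42")
mid, body = q.get(now=0.0)       # worker A reads; message goes invisible
assert q.get(now=10.0) is None   # worker B sees nothing inside the window
# worker A crashes without deleting; after the timeout it reappears
mid2, _ = q.get(now=40.0)
q.delete(mid2)                   # successful worker deletes it (step 5)
assert q.get(now=50.0) is None
```

Because a crash simply lets the timeout lapse, every message is processed at least once; the worker's job must therefore be idempotent, since a slow worker and a retry can both process the same message.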

Deployment
- VS publishing
- Deploy Database wizard
- PowerShell

There are two main components: a package file and a configuration file. The package file contains the application components, and the configuration file contains the information needed by the deployment process in order to provision the service.

Once the application has been developed and tested locally, the developer can upload the code and its configuration information, then run it. A developer first uploads the application to the platform's staging area, then requests via the Windows Azure portal that it be put into production. This switch between staging and production can be done with no downtime, which lets a running application be upgraded to a new version without disturbing its users.

PowerShell Tasks
- Deploy new hosted services
- Upgrade services; swap the VIP between staging and production
- Remove hosted services; automatically stop services and stop the billing cycle
- Manage storage accounts; retrieve or recreate storage keys
- Manage certificates; deploy certificates
- Configure diagnostics; configure event sources to monitor (event logs, tracing, IIS logs, and performance counters)
- Transfer diagnostic information; schedule diagnostics transfers or have them execute on demand

New-Deployment -serviceName -subscriptionId -certificate -slot staging -package -configuration -label | Get-OperationStatus -WaitToComplete

Through the use of PowerShell, you are able to create new deployments, stop and start deployed services, and transition services from staging to production environments (via the Windows Azure PowerShell Cmdlets).

As long as you have your Azure subscription ID and storage account information handy and have met a few other setup prerequisites, deploying your LightSwitch application is as simple as running a wizard. This effectively creates a feedback mechanism where LightSwitch amplifies the value of the cloud and Azure amplifies the value of LightSwitch.

Windows Azure Changes, Feb 2012

Pay-As-You-Go rates:

Windows Azure Compute:
- Extra small instance: $0.02 per hour
- Small instance (default): $0.12 per hour
- Medium instance: $0.24 per hour
- Large instance: $0.48 per hour
- Extra large instance: $0.96 per hour

Storage:
- $0.125 per GB stored per month
- $0.01 per 10,000 storage transactions

Content Delivery Network (CDN):
- $0.12 per GB for data transfers under Zone 1
- $0.19 per GB for data transfers under Zone 2
- $0.01 per 10,000 transactions

Virtual Network:
- Windows Azure Connect: no charge during CTP

Access Control:
- $1.99 per 100,000 transactions

Service Bus:
- $0.10 per 100 relay hours
- $0.01 per 10,000 messages

Caching:
- 128 MB cache for $45.00
- 256 MB cache for $55.00
- 512 MB cache for $75.00
- 1 GB cache for $110.00
- 2 GB cache for $180.00
- 4 GB cache for $325.00

Pricing policy varies over time; through the end of 2012 Microsoft does charge for some resources. (*Previous prices for 50 GB and larger reflect the price cap of $499.95 announced December 12, 2011 by Steven Martin, General Manager, Windows Azure Business Planning.)

Azure Pricing
- What are we paying for?
- How to minimize cost?
- How to avoid exceeding limits?
- Hidden costs
- Extension prices
- What is on special?
- Current price policy

You are paying for the platform. The subscription is a fundamental billing concept.

Elastic model: pay as you go, pay as you grow.

SSL certificate for HTTPS: buy or self-generate? Watch for hidden costs (expiry dates); GoDaddy specials are only for hosted customers. With green bar or not? Use a VeriSign, GoDaddy, or Comodo certificate for public services and B2B; use a self-generated certificate for private use.

Price calculator https://www.windowsazure.com/en-us/pricing/calculator/

Windows Azure supports two purchase options:

Pay-As-You-Go
This pricing option is extremely flexible. It involves no up-front costs and no long-term commitment; you pay only for the resources that you use. In the Pay-As-You-Go option, Web, Worker, and VM compute resources are paid for on a per-hour usage basis. Storage, Database, Bandwidth, Caching, and CDN features are charged on a per-GB/month usage basis, with per-transaction costs for some resources.

6-Month Plans (save up to 33%)
Buy a 6-month plan and take advantage of significant price discounts off the Pay-As-You-Go rates: a 20% discount for Compute and Database, and up to 33% off for Storage. The storage discount depends on the number of terabytes that you purchase.

SQL and Transfer Pricing

SQL Azure (price per database per month):
- 0 to 100 MB: flat $4.995
- Greater than 100 MB to 1 GB: flat $9.99
- Greater than 1 GB to 10 GB: $9.99 for the first GB, $3.996 for each additional GB
- Greater than 10 GB to 50 GB: $45.954 for the first 10 GB, $1.998 for each additional GB
- Greater than 50 GB to 150 GB: $125.874 for the first 50 GB, $0.999 for each additional GB

Data transfers:
- Zone 1: $0.12 per GB out
- Zone 2: $0.19 per GB out
- All inbound data transfers are at no charge.

Compute hours are calculated based on the number of hours that your application is deployed.

Run this query in SSMS to evaluate your price:

SELECT @@SERVERNAME AS ServerName,
       DB_NAME(database_id) AS DatabaseName,
       SUM((size * 8.0) / 1048576) AS SizeGB,
       SQLAzurePrice = CASE
           WHEN SUM((size * 8.0) / 1048576) > 150 THEN 999999.000
           WHEN SUM((size * 8.0) / 1048576) > 50 THEN (125.874 + ((SUM((size * 8.0) / 1048576) - 50) * 0.999))
           WHEN SUM((size * 8.0) / 1048576) > 10 THEN (45.954 + ((SUM((size * 8.0) / 1048576) - 10) * 1.998))
           WHEN SUM((size * 8.0) / 1048576) > 1 THEN (9.990 + ((SUM((size * 8.0) / 1048576) - 1) * 3.996))
           WHEN SUM((size * 8.0) / 1024) > 100 THEN 9.990
           ELSE 4.995
       END
FROM sys.master_files
WHERE type_desc <> 'LOG'
GROUP BY DB_NAME(database_id)
ORDER BY DB_NAME(database_id)
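The same tiered pricing logic from the T-SQL query above can be expressed as a small function, which makes it easy to sanity-check a database's monthly cost (at these 2012 rates) for a given size in GB:

```python
# SQL Azure monthly price per database at the Feb 2012 rates shown above.
def sql_azure_price(size_gb: float) -> float:
    if size_gb > 150:
        raise ValueError("larger than the 150 GB maximum database size")
    if size_gb > 50:
        return 125.874 + (size_gb - 50) * 0.999
    if size_gb > 10:
        return 45.954 + (size_gb - 10) * 1.998
    if size_gb > 1:
        return 9.990 + (size_gb - 1) * 3.996
    if size_gb > 100 / 1024:      # more than 100 MB
        return 9.990
    return 4.995

print(sql_azure_price(0.05))   # 4.995  (under 100 MB)
print(sql_azure_price(5))      # 25.974 (9.99 + 4 * 3.996)
```

Note how the marginal per-GB rate drops at each tier boundary ($3.996, then $1.998, then $0.999), so large databases are progressively cheaper per gigabyte.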

Billing
Pay as you go. Pay as you grow.

The most expensive resource is CPU consumption. Compute hours are the measure of consumption; be careful with computing resources.

Billing Statistics

Monitoring

Adding a Co-Admin opens up the possibility of having multiple points of contact and distributes the task of management to other individuals within the IT Services team.

System Center 2012

Not only can we do this for on-premises applications, but we can also support applications that are either Windows Azure based or a cross between on-premises and Windows Azure. In this environment, we have a deployed application that is cross-platform: a piece of it is deployed on Windows Azure, and a piece of it is deployed locally. When we view that within System Center 2012, we can see the worker role there on the right and whatever is deployed within my datacenter there on the left. We have both the on-premises pieces of the application, for example the database, and the Windows Azure pieces, for example the front end. We can see both of those within the same view by creating a distributed application that attaches to all these different pieces and elements.

Technical Support

Understand the topology and architecture of the application service in question. An application deployed in the cloud computing model is called a service. This necessitates a service model that accurately binds the application's architecture to the underlying resources where it will be hosted.

If you raise a query, the helpdesk will assist you by email or phone. They are pretty persistent: they send three emails before closing an issue, and may try to call you directly.

Ready for the Cloud?
- Check restrictions
- Run scripts directly
- Deploy from VS 2010/11
- Compare schemas in the VS 11 SQL Database project

Checklist:
- Rename fields with data (protect against data loss)
- Deploy the application into staging. Production is down while staging is deployed. Why?
- Check the staging result: 1 instance per deployment; a new GUID URL is generated for every deployment
- Extend the number of instances in staging
- Swap production and staging.


LightSwitch 2011?

So what is LightSwitch all about? It is an easy-to-use tool for building LOB applications that brings multi-tier, cloud-hosted development to a wide group of developers. There are lots of good things here, such as the visual database designer, the Publish Application wizard, and the whole model-driven approach. Screens are automatically generated from pre-built templates. If extensions don't offer all of the functionality required, code-behind that enables access to the full .NET Framework (4.0) can be introduced to provide complex business data processing (e.g. email integration, screen data management) and to implement complex validation requirements.

The applications generated by LightSwitch are very user and IT friendly. LightSwitch allows involving business analysts as a driving force of the application life cycle.

An ASP.NET application might use a membership provider to store its own user IDs and passwords, for example, or it might use some other method, such as the Microsoft Windows Live ID service.

Restrictions and Limitations

Life is simpler in the LightSwitch world, and as a result you have more restrictions and trade-offs. Unlike a professional developer tool like Visual Studio, LightSwitch is more constrained: a sandboxed Silverlight run-time environment, and a cut-down implementation of the Entity Framework for data access.

It takes many of the decisions out of the hands of the programmer, and there are many obstacles on your way.

Keep in Mind
- No *nix/Android/iOS support. Moonlight plans to support Silverlight 4.0, but not 5.0 yet.
- Migrating a LightSwitch project from VS 2010 to VS 11 is not straightforward. Windows XP is not compatible with VS 11.
- LS extensions in VS 2010/11 and Desktop/Web are different.
- Excessive memory usage when you debug in VS (memory leaks).
- Disable the Export to Excel feature to improve performance.
- Table relationships in VS 2010 are not allowed in an external data source (fixed in VS 11). Views are not editable (without a PK).
- LINQ does not always pick up an external data source; use iterators.
- For security purposes, you can't buy a certificate mapping to yourapp.cloudapp.net. Only Microsoft can issue certificates for cloudapp.net, though you can create your own self-signed certificate for development purposes.
- Computed fields are not directly available in queries and relationships.
- No deny, only allow permissions: User.HasPermission().
- The schema comparer in VS 2010 does not support SQL Azure, so it cannot compare a new schema with an existing one. You need a local copy of the cloud database to generate a script, or use VS 11. If you rename a table or column, LS will not be able to resolve this automatically.

For some reason LINQ does not work with an external data source link, and it is hard to read forms-authentication content from the API.

Development aspects
- LightSwitch project architecture (logical vs physical)
- Customisation and extension (grab-and-go)
- Logical/physical structure of the project (Client, Common, Server)
- Relationships (1-1|0; 1|0-*; *-*)
- Adding pages with parameters (binary encoding)
- Localisation: display names, money symbols, languages
- Accessing security data (from the API or an external data source)
- Desktop (works well) -> Web (partially) -> Azure (does not work)
- Validation (check input data, prevent big pictures, etc.)
- Audit, exception handling
- Permissions as the application's security building blocks, User.HasPermission()
- Screen navigation

One obvious difference between the cloud and on-premises worlds is that Windows Azure applications don't run locally. This difference has the potential to make development more challenging. To help mitigate this, Microsoft provides the development fabric, a version of the Windows Azure environment that runs on a developer's machine. http://connect.microsoft.com/VisualStudio/feedback/details/731315/lightswitch-fails-in-visual-studio-11

A LightSwitch project has two views: logical (data, screens) and physical (multiple projects: Client, ClientGenerated, Common, Server, ServerGenerated). Data sources are presented as Database, SharePoint, RIA Service (Domain Service), or OData Service (WCF, etc.).

LightSwitch has an extensions model that is attractive both to advanced developers and to Independent Software Vendors (ISVs). Advanced developers can extend the product's basic functionality using their .NET development skills. ISVs can offer their extensions to a market of business application developers eager to integrate advanced functionality into their LightSwitch applications, but who may lack the skills or time to implement such functionality on their own.

LightSwitch offers a spectrum of customization to the developer on the coding side as well. Entire LightSwitch applications can be created that:
- have no code at all
- have code in their data models but not in their screens
- have substantial code within all parts of the application

In most application development scenarios, the best way to interface with another application's data is in a loosely-coupled manner using services. In other words, the developers responsible for the application hosting the data can create services to expose it. LightSwitch supports such a scenario, provided the developers of the source application build a WCF RIA Services interface to their data. If they do, LightSwitch can connect right in, yet the owners of the application retain control of their data rather than ceding it. You must have at least one method in the DomainService-derived class with the Query(IsDefault = true) attribute applied for the service to be compatible with LightSwitch.

Demo 1
- LS application anatomy
- Physical and logical structure of the project
- Data sources
- Table-screen approach
- Access control
- Desktop / Web model
- NuGet Manager
- Change theme

What to use?
- Source code systems
- Continuous integration systems
- IDE
- Extensions
- Data sources (ApplicationData/External, RIA Services, SharePoint, OData)
- Testing models
http://en.wikipedia.org/wiki/Comparison_of_open_source_operating_systems
http://en.wikipedia.org/wiki/Comparison_of_Continuous_Integration_Software

Government web services: ga.gov.au, business.gov.au, data.gov.au, SalesForce web services, spatial data, etc.

Customisation and extension
- grab-and-go
- auto update

Screen and data-type templates can be extended or replaced as needed. Extensions can be used to improve the user interface, to implement complex business types, to create new screen templates and custom controls, and more.

You can write your own LS extension. The Extension Manager (NuGet packaging) lets you manage extension upgrades online.

There are also plenty of free and paid extensions. Leaders:
- DevExpress
- Infragistics
- ComponentOne

SQL Server Reporting Services can be used against a LightSwitch application's database, but authoring and rendering of SSRS reports is external to LightSwitch. (DevExpress offers its XtraReports solution for integrated reporting in LightSwitch.)

Demo 2
- Extensions
- Audit

LightSwitch lets you build a simple yet comprehensive audit system based on events.
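One way to sketch such an event-based audit in server code-behind (hypothetical: the AuditEntries table and its fields are assumptions you would define yourself):

```csharp
// Sketch: hook the save pipeline's Updating event to write an audit row.
// LightSwitch raises _Inserting/_Updating/_Deleting events per entity.
partial void Customers_Updating(Customer entity)
{
    // AuditEntries is an assumed audit table in the application database.
    var entry = this.DataWorkspace.ApplicationData.AuditEntries.AddNew();
    entry.TableName = "Customers";
    entry.ChangedBy = this.Application.User.Name;
    entry.ChangedAt = DateTime.UtcNow;
}
```

The same pattern repeated for Inserting and Deleting gives a full change trail without touching the screens.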

Extension prices: http://visualstudiogallery.msdn.microsoft.com/

But be aware of some negative effects. Extensions are different in VS 2010 and VS 11: some of them are compatible, and some can break your IDE.

Windows Azure applications are also free to use Windows Identity Foundation (WIF) to implement claims-based identity. The choice is entirely up to the application's creator. There is an Identity and Access Tool extension for Visual Studio 11 Beta: http://visualstudiogallery.msdn.microsoft.com/e21bf653-dfe1-4d81-b3d3-795cb104066e It has some issues on the x64 platform which can leave the VS 11 Beta project template broken; in that case you have to rebuild your VS 11 installation.

The LightSwitch Filter Control has some issues with saving filters as a report.

Demo
A social network, knocked together on the fly

Debugging and testing
- Windows Azure Emulator
- Debugging logs
- Silverlight unit testing framework*
- Test project

They say LightSwitch wasn't originally released as a "developer tool", so the intended target market wouldn't necessarily even have known what TDD or unit testing was. Making it easy therefore wasn't a top priority for the LS team.

To allow monitoring and debugging of Windows Azure applications, each instance can call a logging API that writes information to a common application-wide log. A developer can also configure the system to collect performance counters for an application, measure its CPU usage, store crash dumps if it fails, and more. This information is kept in Windows Azure storage, and a developer is free to write code to examine it. For example, if a Worker role instance crashes three times within an hour, custom code might send an email to the application's administrator.
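That crash-notification idea can be sketched as follows (hypothetical: the Mailer helper and threshold logic are assumptions; in practice the collected logs are read back from Windows Azure storage):

```csharp
// Sketch: standard .NET tracing flows into the Windows Azure log once the
// role's diagnostic trace listener is configured.
using System.Diagnostics;

public static class CrashWatcher
{
    // Invoked by custom code that periodically scans the collected logs.
    public static void OnCrashDetected(string instanceId, int crashesThisHour)
    {
        Trace.TraceError("Instance {0} crashed; {1} crashes in the last hour",
                         instanceId, crashesThisHour);

        if (crashesThisHour >= 3)
        {
            // Mailer is an assumed helper, e.g. an SMTP call to the admin.
            Mailer.NotifyAdministrator(instanceId, crashesThisHour);
        }
    }
}
```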

The development fabric runs on a single desktop or server machine. It emulates the functionality of Windows Azure in the cloud, complete with Web roles, Worker roles, VM roles, and all three Windows Azure storage options. A developer can build a Windows Azure application, deploy it to the development fabric, and run it in much the same way as the real thing. He can determine how many instances of each role should run, for example, use queues to communicate between these instances, and do almost everything else that's possible using Windows Azure itself. The emulation is quite similar to what you have in production; however, as your application comes to depend more on the middleware services Azure provides - Storage, Queues, Cache, etc. - you will find the emulation less and less compatible.

* When you install the Silverlight Toolkit for Silverlight 4 or 5, you can add a test project to your solution and unit test your code. There are many recommendations, but not all of them work perfectly from version to version.
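A minimal sketch of such a test, assuming the Silverlight Unit Test Framework that ships with the Toolkit (PictureValidator is a hypothetical helper from your own code-behind):

```csharp
// Sketch: a Silverlight Unit Test Framework test class.
// SilverlightTest and the [TestClass]/[TestMethod] attributes come from
// Microsoft.Silverlight.Testing and its bundled unit-testing metadata.
using Microsoft.Silverlight.Testing;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[TestClass]
public class ValidationTests : SilverlightTest
{
    [TestMethod]
    public void RejectsOversizedPicture()
    {
        // PictureValidator is an assumed helper from the application's
        // own validation code; 10 MB stands in for a "big picture".
        Assert.IsFalse(PictureValidator.IsAcceptable(10 * 1024 * 1024));
    }
}
```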

Another change from the original version of Windows Azure is that the platform now supports access via the Remote Desktop Protocol (RDP). This is useful in debugging, for example, letting a developer directly access a specific instance. Don't expect to use this for virtual desktop infrastructure (VDI), however; Windows Azure (today, at least) isn't designed to support this kind of scenario.

Publishing process
Staging and Production

Adding a Staging environment

Let's swap staging and production.

Upgrade Process

A key point to understand for service uptime is the use of fault domains and upgrade domains: what they represent and their impact on the overall Windows Azure platform. Fault domains represent a physical unit of failure, closely related to the physical infrastructure of the datacenters; while a physical blade or rack can be considered a fault domain, there is no direct one-to-one mapping between the two. Upgrade domains represent a logical unit that helps determine how a role within an instance will be updated. When an upgrade occurs, Windows Azure updates these domains one by one. Upgrade domains protect your application from issues that lead to potential downtime. When an upgrade request takes place, the roles get a notification of the change. Imagine having 50 role instances, all notified at once, and all reacting to the upgrade notification at the same time: that would lead to unavailable roles and resource contention, i.e. downtime. The Windows Azure platform protects against flooding a given service with requests to upgrade to a new version. There is no control over how Windows Azure divides your instances and roles into fault/upgrade domains; it is a completely automated procedure that takes place in the background.

Swapping environments
Start production

Finally we have two application environments (web roles) but one database.

- The staging DNS name is always regenerated.
- The production DNS name is not changed.
- The VS publishing wizard deploys one instance only; the number of instances must be defined in the project's service configuration.
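A minimal sketch of where that instance count lives, the cloud service's ServiceConfiguration.cscfg (the service and role names here are placeholders):

```xml
<?xml version="1.0" encoding="utf-8"?>
<ServiceConfiguration serviceName="MyLightSwitchApp"
    xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration">
  <Role name="LightSwitchWebRole">
    <!-- two load-balanced instances instead of the wizard's default of one -->
    <Instances count="2" />
    <ConfigurationSettings />
  </Role>
</ServiceConfiguration>
```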

The staged application has a DNS name of the form <GUID>.cloudapp.net, where <GUID> represents a globally unique identifier assigned by Windows Azure. For production, the developer chooses a DNS name in the same domain, such as myazureservice.cloudapp.net. Users may prefer a custom domain rather than the Microsoft cloudapp.net domain; in that case the application's owner can create a DNS alias using a standard CNAME record. Once the application is accessible from the outside world, its users are likely to need some way to identify themselves. To do this, Windows Azure lets developers use any HTTP-based authentication mechanism they like.

Switch for business
- By 2013, 33% of BI functionality will be consumed via handheld devices.
- By 2014, 30% of analytic applications will use in-memory functions to add scale and computational speed.
- By 2014, 30% of analytic applications will use proactive, predictive and forecasting capabilities.
- By 2014, 40% of spending on business analytics will go to system integrators, not software vendors.
- By 2013, 15% of BI deployments will combine BI, collaboration and social software into decision-making environments.

As part of its Predicts 2011 series, Gartner has made four predictions for the near future of business intelligence and analytics. Also, Forrester analyst James Kobielus has written a lengthy article outlining his own predictions for business analytics in Information Week. Both see analytics being embedded in more business processes, particularly in collaborative and social software. http://www.gartner.com/it/page.jsp?id=1513714

By 2014, 40% of spending on business analytics will go to system integrators, not software vendors. 40% of applications will be written by business people in 2014! A business analyst tool, then?

Looking ahead
Azure plans to add more:
- Windows Azure Platform Appliance
- CDN dynamic caching, VM role snapshot
- Java support

Today, cloud platforms are still a slightly exotic option for most organizations. As all of us build experience with Windows Azure and other cloud platforms, however, this new approach will begin to feel less strange. Over time, we should expect cloud-based applications, and the cloud platforms they run on, to play an increasingly important role in the software world.

LightSwitch developers also want:
- data visualization
- web connectivity
- advanced data entry facilities
- professional reporting functionality
- screens, themes, and shells
- workflow
- content management
- credit card validation and other eCommerce capabilities
- Silverlight-to-HTML5 rendering

Screen templates, business types (corporate and public), and data source extensions would work extremely well. RSSBus (to connect to data from Google, Salesforce, QuickBooks, Facebook, Twitter, and even Windows PowerShell scripts).

Visual Studio LightSwitch is a project that needs to exist, and I hope Microsoft perseveres with it. There's a genuine market need for a product like this, and Microsoft's ambitions, application development by non-developers, are the right ones to have. The regimented approach to database and user interface design is probably the right one too, though it may need a little refinement. But to really fill that role, I think Microsoft needs to reconsider the way it's branding and selling the product, and needs to take another look at the benefits that existing tools used for this kind of development have to offer.

I suspect, though, that confused marketing, the Silverlight dependency, and the initial strangeness of the whole package will combine to make it a hard sell for Microsoft. I would like to be wrong, though, as a LightSwitch version 2 that generates HTML5 instead of Silverlight could be really interesting.

New Learning
https://www.microsoftvirtualacademy.com/
http://www.microsoft.com/learning/en/us/certification/cert-sql-server.aspx

MS Virtual Academy is a long-running education program with self-assessment. As a student of the program, I strongly recommend taking this content on board for self-testing and raising awareness. There are plenty of resources: the LS site and Channel 9 have many how-to videos and tutorials.

MS jokes

New Certifications
- Microsoft Certified Solutions Associate
- Microsoft Certified Solutions Expert
- Microsoft Certified Solutions Master

In 2000, we worked on our MCSE and MCDBA certifications. In 2005, Microsoft changed things up with the MCITP and MCM programs. Things are about to change again: we've got the new MCSA, MCSE, and MCSM certifications. No more certification versions! Get ready for a big one: these new certifications don't all have SQL Server 2012 in their names. That's on purpose; the certifications are not versioned. Instead, certifications may cover multiple versions of the product.

Microsoft has reinvented its certification program to support the industry shift to cloud computing by building cloud-related, solution-based skills validation into its highly recognized and respected certification program. To support the growing number of job openings in the cloud computing industry and the need for qualified workers, Microsoft is introducing new cloud-related certifications that focus on using multiple technologies to create business solutions, building cloud experts who can lead companies into the cloud. The new certifications reflect the skills required to build and manage technology solutions on-premises or in the cloud. Microsoft certifications now require that you recertify every 2-3 years. Sets of certification exams are available for purchase today. Start your training and certification journey by visiting http://microsoft.com/learning/mcse

I also recommend learning about the master program transformations at http://www.sqlskills.com, founded by Paul Randal and Kimberly Tripp; the MCM program was partially delegated to 3rd-party providers in 2011. Brent Ozar (MCM) at http://www.brentozar.com has some valuable resources about the SQL certification changes and Azure experience.

References
http://blogs.msdn.com/b/windowsazure/
http://blogs.msdn.com/b/sqlazure/
http://social.msdn.microsoft.com/Forums/en-US/category/windowsazureplatform

http://www.microsoft.com/visualstudio/en-us/lightswitch
https://twitter.com/#!/vslightswitch
http://blogs.msdn.com/b/bethmassi/
http://social.msdn.microsoft.com/Forums/en-US/category/vslightswitch

Mark Russinovich, an MS superstar, is now on the Azure team. MS evangelists Beth Massi and Justin Anderson are key people in the LS community. Michael Washington and Yann Duran are well-known MVPs. Jeremy Huppatz is an independent consultant implementing an LS solution for Adelaide Uni right now. All of them provide extraordinary support.

SQL User Groups
Adelaide SQL Server User Group
- Meets the 3rd Wednesday of every month at EDGE Church
- Drop Rob Farley a line if you want to be on the mailing list ([email protected])

PASS Virtual Chapters
Many topics: Database Administration, Business Intelligence, Data Architecture, Application Development, Master Data/Data Quality, SQL Azure and PowerShell, to name a few.

Keep in touch with your local group.

Questions?
Please complete an evaluation form for this session, and thanks again to our awesome sponsors!
