Search Results

Search found 44703 results on 1789 pages for 'azure web roles'.


  • How to deploy App_Data files with Azure cloud service (web role)

    - by user2977157
    I have a read-only data file (for IP geolocation) that my web role needs to read. It is currently in the App_Data folder, which is NOT included in the deployment package for the cloud service. Unlike "web deploy", there is no checkbox for an Azure cloud service deployment to include/exclude App_Data. Is there a reasonable way to get the deployment package to include the App_Data folder/files? Or is using Azure storage for this sort of thing the better way to go (cost- and performance-wise)? I am using Visual Studio 2013 and the Azure SDK 2.2.
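
    For what it's worth, one commonly suggested fix is to set the data file's Build Action to "Content" (and "Copy to Output Directory" as needed) so the packaging step picks it up; the hedged C# sketch below then reads the file from the deployed site. The file name GeoLiteCity.dat and the helper class are just examples, not part of any SDK.

        // Hypothetical example: reading a geolocation data file deployed with the web role.
        // Assumes the file's Build Action is set to "Content" (with Copy to Output Directory
        // enabled if required), which is what gets App_Data files into the .cspkg.
        using System.IO;
        using System.Web;

        public static class GeoDataLoader
        {
            public static byte[] LoadGeoDatabase()
            {
                // Server.MapPath resolves relative to the site root inside the role instance.
                string path = HttpContext.Current.Server.MapPath("~/App_Data/GeoLiteCity.dat");

                if (!File.Exists(path))
                    throw new FileNotFoundException(
                        "Geo data file missing - check its Build Action/Copy settings.", path);

                return File.ReadAllBytes(path);
            }
        }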

    Read the article

  • SQLAuthority News – Download SQL Azure Labs Codename “Data Explorer” Client

    - by pinaldave
    Microsoft SQL Azure Labs has recently released the Data Explorer client. I had been looking forward to a visualization tool for quite a while and I am delighted to see this tool. I will be trying it out in the coming week and will post my experience here. I have listed a few of the resources related to Data Explorer at the end. Please let me know if I have missed any and I will add them. With “Data Explorer” you can: Identify the data you care about from the sources you work with (e.g. Excel spreadsheets, files, SQL Server databases). Discover relevant data and services via automatic recommendations from the Windows Azure Marketplace. Enrich your data by combining it and visualizing the results. Collaborate with your colleagues to refine the data. Publish the results to share them with others or power solutions. The Data Explorer Client package contains the Data Explorer workspace as well as an Office plugin that integrates Data Explorer into Excel. Resources: Download Data Explorer Desktop Client, Data Explorer Blog, Video of Contoso Bikes and Frozen Yogurt (Data Explorer). Please note that this is not the final release of the product. Please do not attempt this on a production server. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Azure, SQL Documentation, SQL Download, SQL Query, SQL Server, SQL Tips and Tricks, SQLAuthority News, T SQL, Technology

    Read the article

  • Applications are now open for the Microsoft Accelerator for Windows Azure - 2013

    - by ScottGu
    In October, I introduced the finalists for the Microsoft Accelerator for Windows Azure, powered by TechStars. Over the past couple of months, these startups have been mentored by business and technology leaders, met with investors, learned from each other, and, most importantly, been building great products. You can learn more about the startups in the first class and how they’re using Windows Azure here. As the first class approaches Demo Day on January 17th, I’m happy to announce that today we are opening applications for the second class of the Microsoft Accelerator for Windows Azure. The second class will begin on April 1, 2013 and conclude with Demo Day on June 26, 2013. If you are currently working at a startup or considering founding your own company, I encourage you to apply. We’re accepting applications through February 1st, 2013. You can find more information about the Accelerator and the application process here. It’s been truly inspiring to work with the current class of startups. This inaugural class has brought with them incredible energy and innovation and I look forward to reviewing the applications for this next class. Hope this helps, Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu

    Read the article

  • Debugging Node.js applications for Windows Azure

    - by cibrax
    In case you are developing a new web application with Node.js for Windows Azure, you might notice there is no easy way to debug the application unless you are developing in an integrated IDE like Cloud9. For those who develop applications locally using a text editor (or WebMatrix) and Windows Azure PowerShell for Node.js, it requires some steps that are not documented anywhere at the moment. I spent a few hours on this the other day and got practically nowhere until I received some help from Tomek and the rest of the team. The IISNode version that currently ships with the Windows Azure SDK for Node.js does not support debugging by default, so you need to install the full IISNode version available in the GitHub repository. Once you have installed the full version, you need to enable debugging for the web application by modifying the web.config file: <iisnode debuggingEnabled="true" loggingEnabled="true" devErrorsEnabled="true" /> The XML above needs to be inserted within the existing “<system.webServer/>” section. The last step is to open a WebKit browser (e.g. Chrome) and navigate to the URL where your application is hosted, adding the segment “/debug” at the end. The full URL to the Node.js application must be used, for example, http://localhost:81/myserver.js/debug That should open a new instance of Node Inspector in the browser, so you can debug the application from there. Enjoy!!

    Read the article

  • Weird 302 Redirects in Windows Azure

    - by Your DisplayName here!
    In IdentityServer I don’t use Forms Authentication but the session facility from WIF. That also means that I implemented my own redirect logic to a login page when needed. To achieve that I turned off the built-in authentication (authenticationMode="none") and added an Application_EndRequest handler that checks for 401s and does the redirect to my sign-in route. The redirect only happens for web pages and not for web services. This all works fine in local IIS – but in the Azure Compute Emulator and Windows Azure many of my tests are failing and I suddenly see 302 status codes where I expected 401s (the web service calls). After some debugging kung-fu and enabling FREB I found out that the Forms Authentication module is still in effect, turning 401s into 302s. My EndRequest handler never sees a 401 (despite turning forms auth off in config)! Not sure what’s going on (I suspect some inherited configuration that gets in my way here). Even if it shouldn’t be necessary, an explicit removal of the forms auth module from the module list fixed it, and I now have the same behavior in local IIS and Windows Azure. Strange. <modules>   <remove name="FormsAuthentication" /> </modules> HTH Update: Brock ran into the same issue, and found the real reason. Read here.
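
    For readers who want to see the kind of handler the post describes, here is a minimal sketch of an Application_EndRequest that turns 401s into redirects for pages but not for services. The sign-in route and the "/api" service path prefix are assumptions for illustration; IdentityServer's actual implementation may differ.

        // Sketch of the Global.asax.cs handler the post describes: turn 401s on page
        // requests into redirects to the sign-in route, but leave web service calls alone.
        // The sign-in URL and the service path prefix are assumptions, not IdentityServer's code.
        using System;
        using System.Web;

        public class Global : HttpApplication
        {
            protected void Application_EndRequest(object sender, EventArgs e)
            {
                var context = HttpContext.Current;

                bool isUnauthorized = context.Response.StatusCode == 401;
                bool isServiceCall = context.Request.Path.StartsWith("/api",
                    StringComparison.OrdinalIgnoreCase);

                // Redirect browsers to the login page; web services should see the raw 401.
                if (isUnauthorized && !isServiceCall)
                {
                    context.Response.Redirect("~/signin?returnUrl=" +
                        HttpUtility.UrlEncode(context.Request.RawUrl));
                }
            }
        }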

    Read the article

  • Server 2008R2 Server Manager Roles and Features won't refresh or allow addition of new roles or features

    - by MattChorba
    I have a standalone DC in an isolated lab. I have installed the System Update Readiness (SUR) tool and found no errors. I ran SFC and found no errors. I have attempted to install the Windows Backup feature using PowerShell, but received the same error about the computer needing to be restarted. PowerShell cmdlets will list all of the installed roles and features. The rest of Server Manager works without problems. What can I do to get Server Manager Roles and Features working properly again?

    Picture of Error: (screenshot omitted)

    CheckSUR.log:

        =================================
        Checking System Update Readiness.
        Binary Version 6.1.7601.21645
        Package Version 13.0
        2011-11-28 13:20

        Checking Windows Servicing Packages
        Checking Package Manifests and Catalogs
        Checking Package Watchlist
        Checking Component Watchlist
        Checking Packages
        Checking Component Store

        Summary:
        Seconds executed: 413
        No errors detected
        (w) Unable to get system disk properties 0x0000045D IOCTL_STORAGE_QUERY_PROPERTY Disk Cache

    CheckSUR.persist.log: (identical content to CheckSUR.log above)

    Read the article

  • ASP.NET roles and Projects

    - by Zyphrax
    EDIT - Rewrote my original question to give a bit more information.

    Background info: At my work I'm working on an ASP.NET web application for our customers. In our implementation we use technologies like Forms authentication with MembershipProviders and RoleProviders. All went well until I ran into some difficulties with configuring the roles, because the roles aren't system-wide, but related to the customer accounts and projects. I can't name our exact setup/formula, because I think our company wouldn't approve of that...

    What's a customer / project? Our company provides management information for our customers on a yearly (or other interval) basis. In our systems a customer/contract consists of:
    - one Account: information about the Company
    - per Account, one or more Products: the bundle of management information we'll provide
    - per Product, one or more Measurements: a period of time in which we gather and report the data

    Extranet site setup: Eventually we want all customers to be able to access their management information with our online system. The extranet consists of two sites:
    - Company site: provides an overview of Account information and the Products
    - Measurement site: after selecting a Measurement, detailed information on that period of time
    The measurement site is the most interesting part of the extranet. We will create submodules for new overviews, reports, and for managing and maintaining resources that are important for the research. Our Visual Studio solution consists of a number of projects: one web application named Portal for the basis; the sites and modules are virtual directories within that application (which makes it easier to share MasterPages among other things).

    What kind of roles? The following users (read: roles) will be using the system:
    - Admins: development users :) (not customer related, full access)
    - Employees: employees of our company (not customer related, full access)
    - Customer SuperUser: top-level managers (full access to their account/measurement)
    - Customer ContactPerson: primary contact (full access to their measurement(s))
    - Customer Manager: a department manager (limited access, specific data of a measurement)

    What about ASP.NET users? The system will have many ASP.NET users; let's focus on the customer users:
    - Users are not shared between Accounts
    - SuperUser X automatically has access to all (and new) measurements
    - User Y could be primary contact for Measurement 1, but have no role for Measurement 2
    - User Y could be primary contact for Measurement 1, but have a Manager role for Measurement 2
    - The department managers are many individual users (per Measurement); if Manager Z had a login for Measurement 1, we would like to use that login again if he participates in Measurement 2

    URL structure: These are typical URLs in our application:
    - http://host/login - the login screen
    - http://host/project - the account/product overview screen (measurement selection)
    - http://host/project/1000 - measurement (id:1000) details
    - http://host/project/1000/planning - planning overview (for primary contact/superuser)
    - http://host/project/1000/reports - report downloads (the manager of department X can only access report X)
    We will also create a document URL, where you can request a specific document by its GUID. The system will have to check if the user has rights to the document. The document is related to a Measurement, and specific users or roles have specific rights to the document.

    What's the problem? (finally ;)) Roles aren't enough to determine what a user is allowed to see/access/download. It's not enough to say that a certain navigation item is accessible to Managers. When the user requests Measurement 1000, we have to check that the user not only has a Manager role, but a Manager role for Measurement 1000. Summarized:
    - How can we limit users to their accounts/measurements? (remember: superusers see all measurements, some managers only specific measurements)
    - How can we apply roles at a product/measurement level? (user X could be primary contact for measurement 1, but just a manager for measurement 2)
    - How can we limit manager access to the reports screen, and only to their department's reports?
    All with the magic of ASP.NET classes, perhaps with a custom RoleProvider implementation. Similar Stack Overflow question/problem: http://stackoverflow.com/questions/1367483/asp-net-how-to-manage-users-with-different-types-of-roles
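
    One possible shape for the measurement-scoped checks the question describes, sketched under assumptions: role assignments live as (user, measurement, role) rows behind a repository, and a SuperUser assignment grants access to every measurement. All names below are hypothetical, not the asker's actual schema.

        // A minimal sketch of measurement-scoped authorization that could back a
        // custom RoleProvider. RoleAssignment, MeasurementRole, and the null
        // MeasurementId convention are assumptions made for this example.
        using System.Collections.Generic;
        using System.Linq;

        public enum MeasurementRole { SuperUser, PrimaryContact, Manager }

        public class RoleAssignment
        {
            public string UserName { get; set; }
            public int? MeasurementId { get; set; }   // null = account-wide (SuperUser)
            public MeasurementRole Role { get; set; }
        }

        public class MeasurementRoleService
        {
            private readonly List<RoleAssignment> _assignments;

            public MeasurementRoleService(IEnumerable<RoleAssignment> assignments)
            {
                _assignments = assignments.ToList();
            }

            // SuperUsers implicitly see every measurement of their account; everyone
            // else needs an explicit assignment for the requested measurement.
            public bool HasRole(string userName, int measurementId, MeasurementRole required)
            {
                return _assignments.Any(a => a.UserName == userName &&
                    (a.Role == MeasurementRole.SuperUser ||
                     (a.MeasurementId == measurementId && a.Role == required)));
            }
        }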

    Read the article

  • Using design patterns to transform web-service model classes into local model classes and vice versa

    - by Daniil Petrov
    There is a web application built with Play Framework 1.2.7. It contains fewer than 10 model classes. The main purpose of the application is to provide lightweight access to a complex remote application (more than 50 model classes). The remote application has its own SOAP API and we use it for synchronization of data. There is a scheduled job in the web app which makes requests to the remote app. It gets batches of objects from the remote model and populates the corresponding objects of the local model. Currently, there are two groups of classes - the local model and the remote model (generated from the WSDL schema). It is not allowed to make any modifications to the remote model. Transformations are being made in the scheduled job class: when it gets objects from the remote app, it creates local objects. Recently, it was decided to add the possibility to modify the remote objects. This requires more transformations on our side. We need to transform from the remote model to the local model when reading objects, and from the local model to the remote model when changing objects. I wonder if it would be possible to use some design patterns to reduce the number of transformations?
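
    One common answer to this kind of question is a dedicated two-way mapper (a variation of the Data Mapper / Assembler patterns), so each transformation lives in exactly one class instead of being scattered through the scheduled job. The sketch below is in C# for brevity (the pattern translates directly to the Java/Play code described above); RemoteCustomer and LocalCustomer are hypothetical stand-ins for the generated WSDL classes and the local model.

        // A minimal two-way mapper sketch. One IMapper implementation per entity pair
        // keeps both transformation directions next to each other and easy to test.
        public class RemoteCustomer { public string Id; public string FullName; }
        public class LocalCustomer  { public string ExternalId; public string Name; }

        public interface IMapper<TRemote, TLocal>
        {
            TLocal ToLocal(TRemote remote);
            TRemote ToRemote(TLocal local);
        }

        public class CustomerMapper : IMapper<RemoteCustomer, LocalCustomer>
        {
            public LocalCustomer ToLocal(RemoteCustomer r)
            {
                // Reading path: remote -> local (used by the scheduled sync job).
                return new LocalCustomer { ExternalId = r.Id, Name = r.FullName };
            }

            public RemoteCustomer ToRemote(LocalCustomer l)
            {
                // Writing path: local -> remote (used when pushing modifications).
                return new RemoteCustomer { Id = l.ExternalId, FullName = l.Name };
            }
        }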

    Read the article

  • A standard style guide or best-practice guide for web application development

    - by gutch
    I run a very small team of developers on a web application, just three people (and not even full time). We're all capable developers, but we write our code in very different ways: we name similar things in different ways, and we use different HTML and CSS to achieve similar outcomes. We can manage this OK because we're small, but can't help feeling it would be better to get some standards in place. Are there any good style guides or best-practice guides for web application development that we can use to keep our code under control? Sure, we could write them ourselves. But the reality is that with lots to do and very few staff, we're not going to bother. We need something off the shelf that we can tinker with rather than start from scratch. What we're not looking for here is basic code formatting rules like "whether to use tabs or spaces" or "where to put line breaks" — we can control this by standardising our IDEs. What we are looking for are rules for code and markup. For example:
    - What HTML markup should be used for headers, tables, sidebars, buttons, etc.
    - When to add new CSS styles, and what to name them
    - When IDs should be allocated to HTML elements, and what to name them
    - How JavaScript functions should be declared and called
    - How to pick an appropriate URL for a given page or AJAX call
    - When to use each HTTP method, i.e. POST vs. GET vs. PUT, etc.
    - How to name server-side methods (Java, in our case)
    - How to throw and handle errors and exceptions in a consistent way
    - etc., etc.

    Read the article

  • Java web app, with plugin framework and ability to connect to source for updates

    - by lessthancommon
    I've searched all around for some good sources, but either have been searching for the wrong keywords, or I'm just missing something. I'm looking to redevelop a web app I've been using for some time now. Many parts are out of date, and we're constantly throwing in little hacks to attempt to give it new life. So what I'd like to do is re-engineer it from the ground up, built on some sort of plug-in framework. Before I continue, I'm more or less an intermediate Java programmer. In some ways, I'm hoping to use this project as a big learning experience. I've read a lot about OSGi, and it seems that's the most complete framework. Ideally, I would like an end result web app which I can run one instance as my hosting environment, and other instances can connect to it to grab new and updated plug-ins. Eventually I'll want to lock down these plug-ins based on some undecided criteria of who can get them (basically some will simply be updates, others will provide new functionality and should be "purchased" through an external system). But that will probably be handled in a later phase. There should be an administration view for managing bundles in a hot environment (looking to avoid having to restart the server for an update). I know all these things are possible, I'm just trying to find some good resources for reference. All the OSGi tutorials I'm finding seem to be too simplistic. If anyone here can guide me in the right direction on any or all of the items I'm looking for, it would be much appreciated. Also, this is my first post, so I'll take any comments/criticisms about the content of my post. Thanks!

    Read the article

  • Best option for PDF viewer embedded in web app

    - by RationalGeek
    I have a web app that needs to be able to display a PDF. It needs to allow the user to page through the PDF, and my application needs to know which page is currently being viewed, because other aspects of the web app will change based on the current page. Ideally it would not be dependent on the client having Adobe Reader, but I could probably support that dependency. What are my best options for this? My application stack consists of ASP.NET 4 and, optionally, Silverlight 5. I could also use something client-side based on JavaScript / HTML, if such a thing exists. I found ComponentOne's offering for this and it seems like the leading candidate at this point, but I want to know if there are other options I should consider. Edit: Per Fosco's comment, converting the PDF to another format (such as HTML) might be an option, as long as I could tie parts of the converted document back to the original PDF page numbers. Another note: this has to run entirely on our servers. It would not be acceptable to use a third-party service to view the PDFs.

    Read the article

  • Multitenancy in SQL Azure

    - by cibrax
    If you are building a SaaS application in Windows Azure that relies on SQL Azure, it's probable that you will need to support multiple tenants at the database level. This is a short overview of the different approaches you can use to support that scenario.

    A different database per tenant: A new database is created and assigned when a tenant is provisioned.
    Pros:
    - Complete isolation between tenants. All the data for a tenant lives in a database only he can access.
    Cons:
    - It's not cost effective. SQL Azure databases are not cheap, and the minimum size for a database is 1 GB. You might be paying for storage that you don't really use.
    - A different connection pool is required per database.
    - Updates must be replicated across all the databases.
    - You need multiple backup strategies across all the databases.

    Multiple schemas in a database shared by all the tenants: A single database is shared among all the tenants, but every tenant is assigned a different schema and database user.
    Pros:
    - You only pay for a single database.
    - Data is isolated at the database level. If the credentials for one tenant are compromised, the data for the other tenants is not.
    Cons:
    - You need to replicate all the database objects in every schema, so the number of objects can increase indefinitely.
    - Updates must be replicated across all the schemas.
    - The connection pool for the database must maintain a different connection per tenant (or set of credentials).
    - A different user is required per tenant, which is stored at the server level. You have to back up that user independently.

    Centralizing the database access with stored procedures in a database shared by all the tenants: A single database is shared among all the tenants, but nobody can read the data directly from the tables. All the data operations are performed through stored procedures that centralize the access to the tenant data. The stored procedures contain some logic to map the database user to a specific tenant.
    Pros:
    - You only pay for a single database.
    - You only have one set of objects to maintain and back up.
    Cons:
    - There is no real isolation. All the data for the different tenants is shared in the same tables.
    - You cannot use a traditional ORM like EF Code First for consuming the data.
    - A different user is required per tenant, which is stored at the server level. You have to back up that user independently.

    SQL Federations: A single database is shared among all the tenants, but a different federation is used per tenant. A federation, in a few words, is a mechanism for horizontal scaling in SQL Azure, which basically uses the idea of logical partitions to distribute data based on certain criteria.
    Pros:
    - You only have a single database with multiple federations.
    - You can use filtering in the connections to pick the right federation, so any ORM could be used to consume the data.
    Cons:
    - There is no real isolation at the database level. The isolation is enforced programmatically with federations.
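
    As a hedged illustration of the "filtering connection" idea from the federations approach above, the sketch below opens a connection scoped to one tenant. The federation name (TenantFederation) and distribution key (TenantId) are invented for the example; USE FEDERATION is the actual SQL Azure Federations statement.

        // Sketch of a per-tenant "filtering connection" using SQL Azure Federations.
        // FILTERING = ON scopes every subsequent query on this connection to the
        // given tenant's rows without changes to the queries themselves.
        using System.Data.SqlClient;

        public static class TenantConnectionFactory
        {
            public static SqlConnection OpenForTenant(string connectionString, long tenantId)
            {
                var connection = new SqlConnection(connectionString);
                connection.Open();

                using (var cmd = connection.CreateCommand())
                {
                    // The federation key value is inlined as a literal; tenantId is a
                    // long, so formatting it into the statement is injection-safe here.
                    cmd.CommandText = string.Format(
                        "USE FEDERATION TenantFederation (TenantId = {0}) " +
                        "WITH RESET, FILTERING = ON", tenantId);
                    cmd.ExecuteNonQuery();
                }

                return connection;
            }
        }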

    Read the article

  • Part 2 – Load Testing In The Cloud

    - by Tarun Arora
    Welcome to Part 2. In Part 1 we discussed the advantages of creating a Test Rig in the cloud, the Azure edge, and the Test Rig topology we want to get to. In Part 2, let's start by understanding the components of Azure we'll be making use of, followed by manually putting them together to create the test rig. So... let's get down and dirty and start setting up the Test Rig.

    What components of Azure will I be using for building the Test Rig in the cloud? To run the Test Agents we'll make use of Windows Azure Compute, and to enable communication between the Test Controller and the Test Agents we'll make use of Windows Azure Connect.

    Azure Connect: The Test Controller is on premise and the Test Agents are in the cloud (how will they talk?). To enable communication between the two, we'll make use of Windows Azure Connect. With Windows Azure Connect, you can use a simple user interface to configure IPsec-protected connections between computers or virtual machines (VMs) in your organization's network and roles running in Windows Azure. With this you can now join Windows Azure role instances to your domain, so that you can use your existing methods for domain authentication, name resolution, or other domain-wide maintenance actions. For more details refer to the overview of Windows Azure Connect, and to a very useful video explaining everything you wanted to know about Windows Azure Connect.

    Azure Compute: Windows Azure Compute provides developers a platform to host and manage applications in Microsoft's data centres across the globe. A Windows Azure application is built from one or more components called 'roles.' Roles come in three different types: Web role, Worker role, and Virtual Machine (VM) role; we'll be using the Worker role to set up the Test Agents (a very nice blog post discusses the difference between the three role types). Developers are free to use the .NET framework or other software that runs on Windows with the Worker role or Web role. Developers can also create applications using languages such as PHP and Java. More on Windows Azure Compute. Each Windows Azure compute instance represents a virtual server:

        Virtual Machine Size | CPU Cores | Memory   | Cost Per Hour
        Extra Small          | Shared    | 768 MB   | $0.04
        Small                | 1         | 1.75 GB  | $0.12
        Medium               | 2         | 3.50 GB  | $0.24
        Large                | 4         | 7.00 GB  | $0.48
        Extra Large          | 8         | 14.00 GB | $0.96

    You might want to review the Windows Azure Pricing FAQ.

    Let's get started building the Test Rig...

        Machine | Role                               | Comments
        VM – 1  | Domain Controller for Playpit.com  | On Premise
        VM – 2  | TFS, Test Controller               | On Premise
        VM – 3  | Test Agent                         | Cloud

    In this blog post I assume that you have the domain, Team Foundation Server, and Test Controller installed and set up already. If not, please refer to the TFS 2010 Installation Guide and this walkthrough on MSDN to set up your Test Controller. You can also download a preconfigured TFS 2010 VM from Brian Keller's blog; Brian also has some great hands-on labs on TFS 2010 that you may want to explore.

    I. Let's start building VM – 3: The Test Agent

    Download the Windows Azure SDK and Tools. Open Visual Studio and create a new Windows Azure Project using the Cloud template. Choose the Worker Role, for reasons explained in the earlier post. The WorkerRole.cs implements the Run() and OnStart() methods; no code changes are required. You should be able to compile the project and run it in the compute emulator (the compute emulator should have been installed as part of the Windows Azure Toolkit) on your local machine.

    We will only be making changes to the Windows Azure project. Open ServiceDefinition.csdef and ensure that the vmsize is Small (remember the cost chart above). Import the "Connect" module; I am importing the Connect module because I need to join the Worker role VM to the Playpit domain.

        <?xml version="1.0" encoding="utf-8"?>
        <ServiceDefinition name="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
          <WorkerRole name="WorkerRole1" vmsize="Small">
            <Imports>
              <Import moduleName="Diagnostics" />
              <Import moduleName="Connect" />
            </Imports>
          </WorkerRole>
        </ServiceDefinition>

    Go to ServiceConfiguration.Cloud.cscfg and note that settings with the key 'Microsoft.WindowsAzure.Plugins.Connect.%%%%' have been added to the configuration file. This is because you decided to import the Connect module. See the config below.

        <?xml version="1.0" encoding="utf-8"?>
        <ServiceConfiguration serviceName="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*">
          <Role name="WorkerRole1">
            <Instances count="1" />
            <ConfigurationSettings>
              <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.ActivationToken" value="" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Refresh" value="" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.WaitForConnectivity" value="" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Upgrade" value="" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.EnableDomainJoin" value="" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainFQDN" value="" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainControllerFQDN" value="" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainAccountName" value="" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainPassword" value="" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainOU" value="" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Administrators" value="" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainSiteName" value="" />
            </ConfigurationSettings>
          </Role>
        </ServiceConfiguration>

    Let's go step by step and understand all the highlighted parameters and where you can find the values for them.

    osFamily – By default this is set to 1 (Windows Server 2008 SP2). Change this to 2 if you want the Windows Server 2008 R2 operating system. The advantage of using osFamily="2" is that you get PowerShell 2.0 rather than PowerShell 1.0. In PowerShell 2.0 you can simply use "powershell -ExecutionPolicy Unrestricted ./myscript.ps1" and it will work, while in PowerShell 1.0 you will have to change the registry key by including the following in your command file before you can execute any PowerShell: "reg add HKLM\Software\Microsoft\PowerShell\1\ShellIds\Microsoft.PowerShell /v ExecutionPolicy /d Unrestricted /f". The other reason you might want to move to osFamily 2 is if you want IIS 7.5.

    ActivationToken – To enable communication between the on-premise machine and the Windows Azure Worker role VM, both need to have the same token. Log on to the Windows Azure Management Portal, click on Connect, then click on Get Activation Token; this should give you the activation token. Copy the activation token to the clipboard and paste it into the configuration file.
    Note – later in the blog I'll be showing you how to install Connect on the on-premise machine.

    EnableDomainJoin – Set the value to true; of course we want to join the Windows Azure worker role VM to the domain.

    DomainFQDN, DomainControllerFQDN, DomainAccountName, DomainPassword, DomainOU, Administrators – This information is specific to your domain. I have extracted this information from the 'service manager' and 'Active Directory Users and Computers'. Also, I created a new domain OU, namely 'CloudInstances', so that all my cloud instances joined to the domain show up there; this is optional. You can encrypt the DomainPassword – refer to the instructions here. Or hold fire, I'll be covering that when I come to certificates and encryption in the coming section.

    Now once you have filled all this information in, the configuration file should look something like the below:

        <?xml version="1.0" encoding="utf-8"?>
        <ServiceConfiguration serviceName="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="2" osVersion="*">
          <Role name="WorkerRole1">
            <Instances count="1" />
            <ConfigurationSettings>
              <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.ActivationToken" value="45f55fea-f194-4fbc-b36e-25604faac784" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Refresh" value="" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.WaitForConnectivity" value="" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Upgrade" value="" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.EnableDomainJoin" value="true" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainFQDN" value="play.pit.com" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainControllerFQDN" value="WIN-KUDQMQFGQOL.play.pit.com" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainAccountName" value="playpit\Administrator" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainPassword" value="************************" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainOU" value="OU=CloudInstances, DC=Play, DC=Pit, DC=com" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Administrators" value="Playpit\Administrator" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainSiteName" value="" />
            </ConfigurationSettings>
          </Role>
        </ServiceConfiguration>

    Next we will be enabling the Remote Desktop module in the ServiceDefinition.csdef. We could make the changes manually or allow a beautiful wizard to help us make them; I prefer the second option. So right click on the Windows Azure project and choose Publish. Once you get the publish wizard (if you haven't already, you will be asked to import your Windows Azure subscription; this is simply the MSDN subscription activation key XML), click Next to go to the Settings page and check 'Enable Remote Desktop for all roles'. As soon as you do that you get another pop-up asking you the details for the user that you will be logging in with (make sure you enter a reasonable expiry date; you do not want the user account to expire today). Notice the 'more information' tag at the bottom; click it to get access to the certificate section.
    From the drop-down select the option to create a new certificate. In the pop-up window enter the friendly name for your certificate (in my case I entered 'WAC – Test Rig') and click OK. This will create a new certificate for you. Click on the View button to see the certificate details. Do you see the Thumbprint? This is the value that will go in the config file (very important). Now click on the Copy to File button to export the certificate; we will need to import the certificate into the Windows Azure Management Portal later, so make sure you save it in a safe location. Click Finish and enter the details of the user you would like to create with permissions for remote desktop access. Once you have entered the details on the 'Remote Desktop Configuration' screen, click OK. From the Publish Windows Azure wizard screen press Cancel (Cancel because we don't want to publish the role just yet) and then Yes (because we want to save all the changes in the config file).

    Now if you go to the ServiceDefinition.csdef file you will see that the RemoteAccess and RemoteForwarder modules have been imported for you.

        <?xml version="1.0" encoding="utf-8"?>
        <ServiceDefinition name="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition">
          <WorkerRole name="WorkerRole1" vmsize="Small">
            <Imports>
              <Import moduleName="Diagnostics" />
              <Import moduleName="Connect" />
              <Import moduleName="RemoteAccess" />
              <Import moduleName="RemoteForwarder" />
            </Imports>
          </WorkerRole>
        </ServiceDefinition>

    Now go to the ServiceConfiguration.Cloud.cscfg file and you will see a whole bunch of 'Microsoft.WindowsAzure.Plugins.RemoteAccess.%%%' settings added for you.

        <?xml version="1.0" encoding="utf-8"?>
        <ServiceConfiguration serviceName="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="2" osVersion="*">
          <Role name="WorkerRole1">
            <Instances count="1" />
            <ConfigurationSettings>
              <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.ActivationToken" value="45f55fea-f194-4fbc-b36e-25604faac784" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Refresh" value="" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.WaitForConnectivity" value="" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Upgrade" value="" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.EnableDomainJoin" value="true" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainFQDN" value="play.pit.com" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainControllerFQDN" value="WIN-KUDQMQFGQOL.play.pit.com" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainAccountName" value="playpit\Administrator" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainPassword" value="************************" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainOU" value="OU=CloudInstances, DC=Play, DC=Pit, DC=com" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Administrators" value="Playpit\Administrator" />
              <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainSiteName" value="" />
              <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.Enabled" value="true" />
              <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountUsername" value="Administrator" />
              <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountEncryptedPassword" value="MIIBnQYJKoZIhvcNAQcDoIIBjjCCAYoCAQAxggFOMIIBSgIBADAyMB4xHDAaBgNVBAMME1dpbmRvd3MgQXp1cmUgVG9vbHMCEGa+B46voeO5T305N7TSG9QwDQYJKoZIhvcNAQEBBQAEggEABg4ol5Xol66Ip6QKLbAPWdmD4aeADZ7aKj6fg4D+ATr0DXBllZHG5Umwf+84Sj2nsPeCyrg3ZDQuxrfhSbdnJwuChKV6ukXdGjX0hlowJu/4dfH4jTJC7sBWSAKaEFU7CxvqYEAL1Hf9VPL5fW6HZVmq1z+qmm4ecGKSTOJ20Fptb463wcXgR8CWGa+1w9xqJ7UmmfGeGeCHQ4QGW0IDSBU6ccgvzF2ug8/FY60K1vrWaCYOhKkxD3YBs8U9X/kOB0yQm2Git0d5tFlIPCBT2AC57bgsAYncXfHvPesI0qs7VZyghk8LVa9g5IqaMCp6cQ7rmY/dLsKBMkDcdBHuCTAzBgkqhkiG9w0BBwEwFAYIKoZIhvcNAwcECDRVifSXbA43gBApNrp40L1VTVZ1iGag+3O1" />
              <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountExpiration" value="2012-11-27T23:59:59.0000000+00:00" />
              <Setting name="Microsoft.WindowsAzure.Plugins.RemoteForwarder.Enabled" value="true" />
            </ConfigurationSettings>
            <Certificates>
              <Certificate name="Microsoft.WindowsAzure.Plugins.RemoteAccess.PasswordEncryption" thumbprint="AA23016CF0BDFC344400B5B82706B608B92E4217" thumbprintAlgorithm="sha1" />
            </Certificates>
          </Role>
        </ServiceConfiguration>

    Okay, let's look at them one at a time.

    Enabled – Yes, we would like to enable Remote Access.

    AccountUsername – This is the user name you entered while you were on the publish Windows Azure role screen, as detailed above.

    AccountEncryptedPassword – Try and decode that: the certificate is used to encrypt the password you specified for the user account. Remember earlier I said either use the instructions or wait and I'll be showing you encryption. Since the user account I am using for RDP has the same password as my domain password, I can simply copy the value of AccountEncryptedPassword to DomainPassword as well. (A code sketch of this encryption appears at the end of this post.)

    AccountExpiration – This is the expiration you specified in the wizard earlier; make sure your account does not expire today.

    RemoteForwarder – Check out the documentation; below is how I understand it. One role in an application that implements a remote desktop connection must import the RemoteForwarder module; the two modules work together to enable the remote desktop connections to role instances. If you have multiple roles defined in the service model, it does not matter which role you add the RemoteForwarder module to, but you must add it to only one of the role definitions.

    Certificate – Remember the certificate thumbprint from the wizard: the on-premise machine and the Windows Azure role machine that need to speak to each other must have the same thumbprint. More on that when we install the Windows Azure Connect endpoints on the on-premise machine.

    As I said earlier, in this blog post I'll be showing you the manual process, so I won't be scripting any startup tasks to install the test agent or register the test agent with the TFS Server. I'll be showing you all that cool stuff in the next blog post; that's because it's important to understand the manual side of it, as it becomes easier for you to troubleshoot in case something fails. Having said that, the changes we have made are sufficient to spin up the Windows Azure Worker Role aka Test Agent VM, have it connected to the play.pit.com domain, and have remote access enabled on it. Before we deploy the Test Agent VM we need to set up Windows Azure Connect on the TFS Server.

    II. Windows Azure Connect: Setting up Connect on VM – 2, i.e. TFS & Test Controller

    Glad you made it this far. Now, to enable communication between the on-premise TFS/Test Controller and the Azure-ed Test Agent, we need to set up the Connect endpoints.
    We have set up the Azure Connect module in the Test Agent configuration; now the Connect endpoints need to be enabled on the on-premise machines. Let's have a look at how we can do this.

    Log on to VM – 2 running the TFS Server and Test Controller. Log on to the Windows Azure Management Portal and click on Virtual Network. If you already have a subscription you should see the screen shot below; if not, you will be asked to complete the subscription first. Click on Install Local Endpoints from the top left of the panel and you get a URL with a token id appended to it. Remember the token I showed you earlier? In theory the token you get here should match the token you added to the Test Agent config file. Copy the URL to the clipboard and paste it into Internet Explorer (important: the installation at present only works from IE, and you need to have cookies enabled in order to complete the installation). As stated in the pop-up, you can NOT download and run the software later; you need to run it as is, since it contains a token. Once the installation completes you should see the Windows Azure Connect icon in the system tray.

    Right click the Azure Connect icon, choose Diagnostics, and refer to this link for diagnostic detail terminology.

    NOTE – Unfortunately I could not see the Windows Azure Connect icon in the system tray. A bit of binging with Google revealed that the Azure Connect icon is only shown when the 'Windows Azure Connect Endpoint' service is started. So go to services.msc and make sure that the service is started; if not, start it. Unfortunately again, the service did not start for me on a manual start, and I realised that one of the dependent services was disabled; you can look at the service dependencies, start them, and then start Windows Azure Connect. Bottom line: you need to start the Windows Azure Connect service before you can proceed. Please refer to MSDN for more on troubleshooting Windows Azure Connect. (Follow the next step as well.)

    Now go back to the Windows Azure Management Portal and from Groups and Roles create a new group; let's call it 'Test Rig'. Make sure you add VM – 2 (the TFS Server VM where you just installed the endpoint). Now if you go back to the Azure Connect icon in the system tray and click 'Refresh Policy', you will notice that the disconnected status of the icon changes to ready for connection.

    III. Importing the Certificate in to the Windows Azure Management Portal

    Before deploying, you need to import the certificate you created in Step I in to the Windows Azure Management Portal. Log on to the Windows Azure Management Portal and click on 'Hosted Services, Storage Accounts & CDN', then 'Management Certificates', followed by Add Certificates. Browse to the location where you saved the certificate earlier (refer to Step I in case you forgot). Now you should be able to see the imported certificate here; make sure the thumbprint of the certificate matches the one you inserted in the config files.

    IV. Publish the Windows Azure Worker Role aka Test Agent

    Having completed I, II and III, you are ready to publish the Test Agent VM – 3 to the cloud. Go to Visual Studio, right click the Windows Azure project, and select Publish. Verify the information in the wizard; from the advanced settings tab you can also enable capture of IntelliTrace or profiling information. Click Next and click Publish!
    From the View menu select the Windows Azure Activity Log window. Now you should be able to see the deployment progress in real time. In the Windows Azure Management Portal, you should also be able to see the progress of the creation of a new Worker Role.

    Once the deployment is complete you should be able to RDP (go to the run prompt, type mstsc, and in the pop-up enter the machine name) in to the Test Agent Worker Role VM from the Playpit network using the domain admin user account. In case you are unable to log in to the Test Agent using the domain admin user account, it means the process of joining the Test Agent to the domain has failed! But the good news is, because you imported the Connect module, you can connect to the Test Agent machine using the Windows Azure Management Portal and troubleshoot the reason for failure. You will be able to log in with the user name and password you specified in the config file for the keys 'RemoteAccess.AccountUsername, RemoteAccess.AccountEncryptedPassword' (just enter the password unencrypted), then fix it or manually join the machine to the domain. Once you have managed to join the Test Agent VM to the domain, move to the next step.

    So, log in to the Test Agent Worker Role VM with the Playpit domain administrator and verify that you can log in, that the machine is connected to the domain, and that the Connect service is successfully running. If yes, give yourself a pat on the back: you are 80% mission accomplished!

    Go to the Windows Azure Management Portal and click on Virtual Network, click on Groups and Roles, click on Test Rig, and click Edit Group to edit the Test Rig group you created earlier. In the 'Connect to' section, click on Add to select the worker role you have just deployed. Also, check 'Allow connections between endpoints in the group'; with this you enable communication between the test controller and the test agents, and between test agents. Click Save.

    Now you are ready to deploy the Test Agent software on the Worker Role Test Agent VM and configure it to work with the Test Controller.

    V. Configuring VM – 3: Installing the Test Agent and Associating the Test Agent to the Controller

    Log in to the Worker Role Test Agent VM that you have just successfully deployed; make sure you log in with the domain administrator account. Download the All Agents software from MSDN ('en_visual_studio_agents_2010_x86_x64_dvd_509679.iso'), extract the iso, and navigate to where you have extracted it. In my case, I have extracted the iso to "C:\Resources\Temp\VsAgentSetup". Open the Test Agent folder and double click on setup.exe. Once you have installed the Test Agent you should reach the configuration window. If you face any issues installing the TFS Test Agent on the VM, refer to the walkthrough on MSDN.

    Once you have successfully installed the Test Agent software you will need to configure it. Right click the test agent configuration tool and run it as a different user, i.e. an Administrator. This is really to run the configuration wizard with elevated privileges (you might have UAC block something otherwise).

    In the run options, you can select 'service'; you do not need to run the agent as interactive unless you are running coded UI tests. I have specified the domain administrator to connect to the TFS Test Controller. In real life, I would never do that; I would create a separate test user service account for this purpose.
    But for the blog post, we are using the most powerful user so that no policies or restrictions block you.

    Click the Apply Settings button and you should be all green! If not, the summary usually gives helpful error messages that you can resolve before proceeding. In my experience, you may run in to either a permission issue or a firewall blocking communication.

    And now the moment of truth! Go to VM – 2, open up Visual Studio, and from the Test menu select Manage Test Controller. Mission accomplished! You should be able to see the Test Agent that you have just configured here.

    VI. Creating and Running Load Tests on your brand new Azure-ed Test Rig

    I have various blog posts on performance testing with Visual Studio Ultimate; you can follow the links and videos below.

    Blog posts:
    - Part 1 – Performance Testing using Visual Studio 2010 Ultimate
    - Part 2 – Performance Testing using Visual Studio 2010 Ultimate
    - Part 3 – Performance Testing using Visual Studio 2010 Ultimate

    Videos:
    - Test Tools Configuration & Settings in Visual Studio
    - Why & How to Record Web Performance Tests in Visual Studio Ultimate
    - Goal Driven Load Testing using Visual Studio Ultimate

    Now that you have created your load tests, there is one last change you need to make before you can run the tests on your Azure Test Rig: create a new test settings file, change the test execution method to 'Remote Execution', and select the test controller you have configured the Worker Role Test Agent against (in our case VM – 2). So, go on, fire off a test run and see the results of the test being executed on the Azure-ed Test Rig.

    Review and what's next? A quick recap of the benefits of running the Test Rig in the cloud, and what I will be covering in the next blog post. And I would love to hear your feedback!

    Advantages:
    - Utilizing the power of Azure compute to run a heavy virtual user load.
    - Benefiting from the Azure flexibility: destroy Test Agents when not in use; it takes < 25 minutes to spin up a new Test Agent.
    - Most important, test network latency (network latency and speed of connection are two different things; usually network latency is very hard to test). By placing the Test Agents in Microsoft data centres around the globe, one can actually test the lag in transferring the bytes caused not by a slow connection but by the page having been requested from the other side of the globe.

    Next steps: The process of spinning up the Test Agents in Windows Azure is not 100% automated. I am working on the worker process and PowerShell scripts to automate the role deployment, the unattended install of the test agent software, and the registration of the test agent to the test controller. In the next blog post I will show you how to make the complete process unattended and automated. Remember to subscribe to http://feeds.feedburner.com/TarunArora. Hope you enjoyed this post; I would love to hear your feedback! If you have any recommendations on things that I should consider, or any questions or feedback, feel free to leave a comment. See you in Part III.
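
    As an aside on the AccountEncryptedPassword setting used in section I: the value is the plain-text password wrapped as PKCS#7/CMS enveloped data using the certificate whose thumbprint appears in the <Certificates> section (the csencrypt tool in the Azure SDK does this for you). The sketch below is one manual route in C#; the class and method names are mine, not part of any SDK.

        // A sketch of producing the AccountEncryptedPassword value in code: the
        // password is wrapped as CMS enveloped data with the RDP/password-encryption
        // certificate. EnvelopedCms lives in System.Security.Cryptography.Pkcs
        // (the System.Security assembly in .NET 4).
        using System;
        using System.Security.Cryptography.Pkcs;
        using System.Security.Cryptography.X509Certificates;
        using System.Text;

        public static class RdpPasswordEncryption
        {
            public static string Encrypt(string password, X509Certificate2 certificate)
            {
                byte[] plainBytes = Encoding.UTF8.GetBytes(password);

                var content = new ContentInfo(plainBytes);
                var envelope = new EnvelopedCms(content);
                envelope.Encrypt(new CmsRecipient(certificate));

                // The base64 of the CMS blob is what goes into the .cscfg setting.
                return Convert.ToBase64String(envelope.Encode());
            }
        }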

    Read the article

  • Microsoft Declares the Future of ASP.NET is Web API

    - by sbwalker
    Sitting on a plane on my way home from Tech Ed 2012 in Orlando, I thought it would be a good time to jot down some key takeaways from this year’s conference. Some of these items I have known since the Microsoft MVP Summit which occurred in Redmond in late February ( but due to NDA restrictions I could not share them with the developer community at large ) and some of them are a result of insightful conversations with a wide variety of industry insiders and Microsoft employees at the conference. First, let’s travel back in time 4 years to the Microsoft MVP Summit in 2008. Microsoft was facing some heat from market newcomer Ruby on Rails and responded with a new web development framework of its own, ASP.NET MVC. At the Summit they estimated that MVC would only be applicable for ~10% of all new web development projects. Based on that prediction I questioned why they were investing such considerable resources in such a relative edge case, but my guess is that they felt it was an important edge case at the time, as some of the more vocal .NET evangelists as well as some very high profile start-ups ( i.e. Twitter ) had publicly announced their intent to use Rails. Microsoft made a lot of noise about MVC. In fact, they focused so much of their messaging and marketing hype around MVC that it appeared that WebForms was essentially dead. Yes, it may have been true that Microsoft continued to invest in WebForms, but from an outside perspective it really appeared that MVC was the only framework getting any real attention. As a result, MVC started to gain market share. An inside source at Microsoft told me that MVC usage has grown at a rate of about 5% per year and now sits at ~30%. Essentially, by focusing so much marketing effort on MVC, Microsoft actually created a larger market demand for it. This is because in the Microsoft ecosystem there is somewhat of a bandwagon mentality amongst developers. If Microsoft spends a lot of time talking about a specific technology, developers get the perception that it must be really important. So rather than choosing the right tool for the job, they often choose the tool with the most marketing hype and then try to sell it to the customer. In 2010, I blogged about the fact that MVC did not make any business sense for the DotNetNuke platform. This was because our ecosystem relied on third party extensions which were dependent on the WebForms model. If we migrated the core to MVC it would mean that all of the third party extensions would no longer be compatible, which would be an irresponsible business decision for us to make at the expense of our users and customers. However, this did not stop the debate from continuing to occur in our ecosystem. Clearly some developers had drunk Microsoft’s Kool-Aid about MVC and were of the mindset, to paraphrase an old Scottish saying, “If it’s not MVC, it’s crap”. Now, this is a rather ignorant position to take, as most of the benefits of MVC can be achieved in WebForms with solid architecture and responsible coding practices. Clean separation of concerns, unit testing, and direct control over page output are all possible in the WebForms model – it just requires diligence and discipline. So over the past few years some horror stories have begun to bubble to the surface about software development projects focused on ground-up rewrites of web applications for the sole purpose of migrating from WebForms to MVC.
    These large scale rewrites were typically initiated by engineering teams with only a single argument driving the business decision: that Microsoft was promoting MVC as “the future”. These ill-fated rewrites offered no benefit to end users or customers and in fact resulted in less stable, less scalable, and more complicated systems – basically taking one step forward and two full steps back. A case in point is the announcement earlier this week that a popular open source .NET CMS provider has decided to pull the plug on their new MVC product, which has been under active development for more than 18 months, and revert back to WebForms. The availability of multiple server-side development models has deeply fragmented the Microsoft developer community. Some folks like to compare it to the age-old VB vs. C# language debate. However, the VB vs. C# language debate was ultimately more of a religious war, because at least the two dominant programming languages were compatible with one another and could be used interchangeably. The issue with WebForms vs. MVC is much more challenging. This is because the messaging from Microsoft has positioned the two solutions as being incompatible with one another, and as a result web developers feel like they are forced to choose one path or another. Yes, it is true that it has always been technically possible to use WebForms and MVC in the same project, but the tooling support has always made this feel “dirty”. The fragmentation has also made it difficult to attract newcomers, as the perceived barrier to entry for learning ASP.NET has become higher. As a result many new software developers entering the market are gravitating to environments where the development model seems more simple and intuitive ( i.e. PHP or Ruby ). At the same time that the Web Platform team was busy promoting ASP.NET MVC, the Microsoft Office team has been promoting SharePoint as a platform for building internal enterprise web applications. SharePoint has great penetration in the enterprise and over time has been enhanced with improved extensibility capabilities for software developers. But, like many other mature enterprise ASP.NET web applications, it is built on the WebForms development model. Similar to DotNetNuke, SharePoint leverages a rich third party ecosystem for both generic web controls and more specialized WebParts – both of which rely on WebForms. So basically this resulted in a situation where the Web Platform group had headed off in one direction and the Office team had gone in another direction, and the end customer was stuck in the middle trying to figure out what to do with their existing investments in Microsoft technology. It really emphasized the perception that the left hand was not speaking to the right hand, as strategically speaking there did not seem to be any high level plan from Microsoft to ensure consistency and continuity across the different product lines. With the introduction of ASP.NET MVC, it also made some of the third party control vendors scratch their heads and wonder what the heck Microsoft was thinking. The original value proposition of ASP.NET over Classic ASP was the ability for web developers to emulate the highly productive desktop development model by using abstract components for creating rich, interactive web interfaces. Web control vendors like Telerik, Infragistics, DevExpress, and ComponentArt had all built sizable businesses offering powerful user interface components to WebForms developers.
    And even after MVC was introduced these vendors continued to improve their products, offering greater productivity and a superior user experience via AJAX compared to what was possible in MVC. And since many developers were comfortable and satisfied with these third party solutions, the demand remained strong and the third party web control market continued to prosper despite the availability of MVC. While all of this was going on in the Microsoft ecosystem, there has also been a fundamental shift in the general software development industry. Driven by the explosion of Internet-enabled devices, the focus has now centered on service-oriented architecture (SOA). Service-oriented architecture is all about defining a public API for your product that any client can consume, whether it’s a native application running on a smart phone or tablet, a web browser taking advantage of HTML5 and JavaScript, or a rich desktop application running on a PC. REST-based services, which utilize the less verbose characteristics of JSON as a transport mechanism, have become the preferred approach over older, more bloated SOAP-based techniques. SOA also has the benefit of producing a cross-platform API, as every major technology stack is able to interact with standard REST-based web services. And for web applications, more and more developers are turning to robust JavaScript libraries like jQuery and Knockout for browser-based client-side development techniques for calling web services and rendering content to end users. In fact, traditional server-side page rendering has largely fallen out of favor, resulting in decreased demand for server-side frameworks like Ruby on Rails, WebForms, and (gasp) MVC. In response to these new industry trends, Microsoft did what it always does – it immediately poured some resources into developing a solution which will ensure they remain relevant and competitive in the web space. This work culminated in a new framework which was branded as Web API. It is convention-based and designed to embrace native HTTP standards without copious layers of abstraction. This framework is designed to be the ultimate replacement for both the REST aspects of WCF and ASP.NET MVC Web Services. And since it was developed out of band with a dependency only on ASP.NET 4.0, it means that it can be used immediately in a variety of production scenarios. So at Tech Ed 2012 it was made abundantly clear in numerous sessions that Microsoft views Web API as the “Future of ASP.NET”. In fact, one Microsoft PM even went as far as to say that if we look 3-4 years into the future, all ASP.NET web applications will be developed using the Web API approach. This is a fairly bold prediction and clearly telegraphs where Microsoft plans to allocate its resources going forward. Currently Web API is being delivered as part of the MVC4 package, but this is only temporary for the sake of convenience. It also sounds like there are still internal discussions going on in terms of how to brand the various aspects of ASP.NET going forward – perhaps the moniker of “ASP.NET Web Stack”, coined a couple years ago by Scott Hanselman and utilized as part of the open source release of ASP.NET bits on Codeplex a few months back, will eventually stick. Web API is being positioned as the unification of ASP.NET – the glue that is able to pull this fragmented mess back together again. The “One ASP.NET” strategy will promote the use of all frameworks - WebForms, MVC, and Web API, even within the same web project.
Basically, the message is to utilize the appropriate aspects of each framework to solve your business problems. Instead of steering developers to a fork in the road, the plan is to educate them that "hybrid" applications are a great strategy for delivering solutions to customers. In addition, the service-oriented approach coupled with the client-side development promoted by Web API can be used effectively in both WebForms and MVC applications. This means it is also relevant to application platforms like DotNetNuke and SharePoint, and that it starts to create a unified development strategy across all ASP.NET product lines once again.

And so what about MVC? There have actually been rumors that MVC has reached a stage of maturity where, similar to WebForms, it will be treated more as a maintenance product line going forward (MVC4 may in fact be the last significant iteration of this framework). This may sound alarming to some folks who have recently adopted MVC, but it really shouldn't, as both WebForms and MVC will continue to play a vital role in delivering solutions to customers. They will just not be the primary area where Microsoft spends the majority of its R&D resources; that distinction will obviously go to Web API. And when the question comes up of why not enhance MVC to make it work with Web API, you must take a step back and look at this from a higher level to see that it really makes no sense. MVC is a server-side page compositing framework, whereas Web API promotes client-side page compositing with a heavy focus on web services. Making MVC work well with Web API would require a complete rewrite of MVC, and at the end of the day there would be no upgrade path for existing MVC applications. So it really does not make much business sense.

So what does this have to do with DotNetNuke? Well, around 8-12 months ago we recognized the software industry's trend towards web services and client-side development. We decided to utilize a "hybrid" model which would provide compatibility for existing modules while at the same time providing a bridge for developers who wanted to utilize more modern web techniques. Customers who like the productivity and familiarity of WebForms can continue to build custom modules using the traditional approach. However, in DotNetNuke 6.2 we also introduced a new Services Framework, which is actually built on top of MVC2 (we chose to leverage MVC because it had the most intuitive, lightweight REST implementation in the .NET stack). The Services Framework allowed us to build some rich interactive features in DotNetNuke 6.2, including the Messaging and Notification Center and the Activity Feed.

But based on where we know Microsoft is heading, it makes sense for the next major version of DotNetNuke (which is expected to be released in Q4 2012) to migrate from MVC2 to Web API. This will likely result in some breaking changes in the Services Framework, but we feel it is the best approach for ensuring the platform remains highly modern and relevant. The fact that our development strategy is perfectly aligned with the "One ASP.NET" strategy from Microsoft means that our customers and developer community can be confident in their current and future investments in the DotNetNuke platform.
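To give a sense of what a migration from an MVC-based service layer to Web API might involve, here is an illustrative before-and-after sketch. This is not actual DotNetNuke code; the controller names and the greeting action are invented for the example:

using System.Web.Http;
using System.Web.Mvc;

// MVC-style service action: the result is wrapped in an ActionResult
// and JSON serialization is requested explicitly.
public class GreetingMvcController : Controller
{
    public ActionResult Hello(string name)
    {
        return Json(new { Message = "Hello, " + name },
                    JsonRequestBehavior.AllowGet);
    }
}

// Web API equivalent: return the model directly and let content
// negotiation serialize it, so the same action can serve JSON or XML.
public class GreetingApiController : ApiController
{
    public object GetHello(string name)
    {
        return new { Message = "Hello, " + name };
    }
}

The shift from explicitly serialized ActionResults to plain return values and HTTP conventions is part of why swapping the underlying framework implies breaking changes for existing service modules.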

    Read the article

  • Debugging a deployed Azure app

    - by gonzohunter
    Is it possible to attach to a deployed Azure app? I would like to be able to step through the code so that I can see what values are being set in a request to one of my web role actions. I have looked around, and the only examples I can find are of debugging when the Azure app is running on the local machine.

    Read the article

  • Web Designer and/or Developer

    - by chimps
    We've outsourced our app development, and the devs have created a DB hosted on Amazon EC2. We're in talks with a web designer for the website, but the designer does not do any backend integration, i.e., connecting the website with the DB created by the app developers. Do you recommend getting designs from the designer and hiring a freelancer to do the front-to-back-end integration? I mean, would there be issues or complications? Or should we go with a designer who provides the complete package?

    Read the article

  • The C++ web stack, is there one?

    - by NimChimpsky
    Java would be JSPs and servlets (or a framework such as Spring) running on the JVM and Tomcat (or GlassFish, etc.). C# would be ASP.NET and C# running on the .NET Framework and IIS? (I have no experience with this; please correct and improve my terminology.) Is there an equivalent for C++? I could happily call some C++ from a Java servlet/controller, but was wondering if there are existing frameworks and libraries out there specifically for creating business logic in C++ with a web front end.

    Read the article

  • Windows Azure Tutorial: Web Tracker, by Microsoft's Azure team

    Contoso has a mobile application and a website that report usage information. A probe (in JS from the HTML pages, in Objective-C on iOS, in Java on Android, in C# on Windows Phone, and in C# or JS on Windows 8+) sends this usage information as JSON via HTTP POST to http(s)://webtracker.contoso.com/t/. The goal of this tutorial is to show how the reception of this probe data can be implemented so that it can scale...
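    As a rough sketch of how such a receiving endpoint could look with ASP.NET Web API (the tutorial itself may take a different approach; the UsageEvent type, its fields, and the controller name are assumptions made for this illustration):

    using System.Net;
    using System.Net.Http;
    using System.Web.Http;

    // Hypothetical usage-event payload; the field names are illustrative,
    // not taken from the tutorial.
    public class UsageEvent
    {
        public string Device { get; set; }
        public string Action { get; set; }
    }

    // Handles the JSON probe data POSTed to the tracking endpoint.
    // Web API binds the JSON request body to the UsageEvent parameter.
    public class TrackerController : ApiController
    {
        [HttpPost]
        public HttpResponseMessage Post(UsageEvent evt)
        {
            if (evt == null)
            {
                return Request.CreateResponse(HttpStatusCode.BadRequest);
            }
            // To scale, the event would typically be queued (e.g. to an
            // Azure queue) for asynchronous processing rather than
            // handled inline.
            return Request.CreateResponse(HttpStatusCode.Accepted);
        }
    }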

    Read the article

  • SQL Azure Pricing

    - by kaleidoscope
    Microsoft's pricing for SQL Server in the cloud, SQL Azure, has been announced: $9.99 per month for 0-1GB, and $99.99 per month for up to 10GB. There's currently a 10GB maximum size cap for SQL Azure. For larger data storage needs, you'll need to break the databases into smaller sizes.

    Scaling SQL Azure Applications
    If you think you're going to need 100GB in the near term, it probably makes sense to break your application up into multiple separate databases from the get-go (10 x $9.99 = $99.90, essentially the same as $99.99) and just make really sure none of the individual databases exceeds 10GB.

    Beep Beep, Back That Database Up
    The bandwidth costs for SQL Azure are $.15 per GB of outbound bandwidth. Assuming that you don't compress the data before you pull it out of the cloud, daily backups of a 1GB database will add another $4.50 per month, and a 10GB database will add another $45 per month. Daily backups will cost about half of what your monthly service charges cost.

    It's not completely clear from the press release, but if Microsoft follows Amazon's pricing model, bandwidth between the Microsoft cloud services will not incur a cost. That would mean it might make sense to spin up a Windows Azure computing application for $.12 per hour, use that application to compress your SQL Azure database, and then send the compressed data off to Azure storage for backup. That would eliminate the data in/out costs and minimize the Azure storage costs ($.15/GB). Database administrators would back up their SQL Azure data to Azure Storage, keep a history of backups there, and restore it to SQL Azure faster when needed.

    Of course, there's no native backup support in SQL Azure, and it's not clear whether Windows Azure will include tools like SQL Server Integration Services.

    More details can be found at http://www.brentozar.com/archive/2009/07/sql-azure-pricing-10-for-1gb-100-for-10gb/

    Anish, S
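    As a quick back-of-the-envelope check of the backup math above (a standalone sketch with the prices hard-coded from the figures quoted in the post):

    using System;

    class BackupCostEstimate
    {
        // $0.15 per GB of outbound bandwidth, one full backup per day.
        const decimal OutboundDollarsPerGb = 0.15m;
        const int BackupDaysPerMonth = 30;

        static decimal MonthlyBackupCost(decimal databaseSizeGb)
        {
            return databaseSizeGb * OutboundDollarsPerGb * BackupDaysPerMonth;
        }

        static void Main()
        {
            Console.WriteLine(MonthlyBackupCost(1m));   // 4.50
            Console.WriteLine(MonthlyBackupCost(10m));  // 45.00
        }
    }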

    Read the article

  • Does VSTO run on Windows Azure?

    - by Joni
    I have a web application which will be deployed to Windows Azure, and I'm looking for alternatives to generate Excel spreadsheets. Can I use VSTO to programmatically generate an Excel spreadsheet in a Web Role running on Windows Azure? If yes, how should I deploy the application to Windows Azure? Which assemblies should I include? Thanks!

    Read the article

  • How to hire a web programmer: for non-programmers

    - by 0Complex
    I am a non-programmer who has used the services of Freelancer, oDesk, etc. I've tried asking for what I need, but I can't find anyone who can show me any example similar to what I request in the specs for the web programming. They have front ends and back ends, but they don't fulfill true "live" website requirements; "live" as in ready to support traffic, keys in hand, and able to be updated constantly by me. How do I figure out how to evaluate a programmer? How do I bid the appropriate price for the services?

    Read the article

< Previous Page | 9 10 11 12 13 14 15 16 17 18 19 20  | Next Page >