Search Results

Search found 42941 results on 1718 pages for 'web clipping'.


  • Windows Azure: Import/Export Hard Drives, VM ACLs, Web Sockets, Remote Debugging, Continuous Delivery, New Relic, Billing Alerts and More

    - by ScottGu
    Two weeks ago we released a giant set of improvements to Windows Azure, as well as a significant update of the Windows Azure SDK. This morning we released another massive set of enhancements to Windows Azure. Today’s new capabilities include:

    • Storage: Import/Export Hard Disk Drives to your Storage Accounts
    • HDInsight: General Availability of our Hadoop Service in the cloud
    • Virtual Machines: New VM Gallery, ACL support for VIPs
    • Web Sites: WebSocket and Remote Debugging Support
    • Notification Hubs: Segmented customer push notification support with tag expressions
    • TFS & Git: Continuous Delivery Support for Web Sites + Cloud Services
    • Developer Analytics: New Relic support for Web Sites + Mobile Services
    • Service Bus: Support for partitioned queues and topics
    • Billing: New Billing Alert Service that sends email notifications when your bill hits a threshold you define

    All of these improvements are now available to use immediately (note that some features are still in preview). Below are more details about them.

    Storage: Import/Export Hard Disk Drives to Windows Azure

    I am excited to announce the preview of our new Windows Azure Import/Export Service! The Windows Azure Import/Export Service enables you to move large amounts of on-premises data into and out of your Windows Azure Storage accounts. It does this by enabling you to securely ship hard disk drives directly to our Windows Azure data centers. Once we receive the drives we’ll automatically transfer the data to or from your Windows Azure Storage account. This enables you to import or export massive amounts of data more quickly and cost effectively (and not be constrained by available network bandwidth).

    Encrypted Transport

    Our Import/Export service provides built-in support for BitLocker disk encryption – which enables you to securely encrypt data on the hard drives before you send it, and not have to worry about it being compromised even if the disk is lost/stolen in transit (since the content on the transported hard drives is completely encrypted and you are the only one who has the key to it). The drive preparation tool we are shipping today makes setting up BitLocker encryption on these hard drives easy.

    How to Import/Export your first Hard Drive of Data

    You can read our Getting Started Guide to learn more about how to begin using the import/export service. You can create import and export jobs via the Windows Azure Management Portal as well as programmatically using our Server Management APIs.

    It is really easy to create a new import or export job using the Windows Azure Management Portal. Simply navigate to a Windows Azure storage account, and then click the new Import/Export tab now available within it (note: if you don’t have this tab make sure to sign-up for the Import/Export preview). Then click the “Create Import Job” or “Create Export Job” commands at the bottom of it. This will launch a wizard that easily walks you through the steps required.

    For more comprehensive information about Import/Export, refer to the Windows Azure Storage team blog. You can also send questions and comments to the [email protected] email address.

    We think you’ll find this new service makes it much easier to move data into and out of Windows Azure, and it will dramatically cut down the network bandwidth required when working on large data migration projects. We hope you like it.

    HDInsight: 100% Compatible Hadoop Service in the Cloud

    Last week we announced the general availability release of Windows Azure HDInsight.
    HDInsight is a 100% compatible Hadoop service that allows you to easily provision and manage Hadoop clusters for big data processing in Windows Azure. This release is now live in production, backed by an enterprise SLA, supported 24x7 by Microsoft Support, and is ready to use for production scenarios.

    HDInsight allows you to use Apache Hadoop tools, such as Pig and Hive, to process large amounts of data in Windows Azure Blob Storage. Because data is stored in Windows Azure Blob Storage, you can choose to dynamically create Hadoop clusters only when you need them, and then shut them down when they are no longer required (since you pay only for the time the Hadoop cluster instances are running, this provides a super cost effective way to use them). You can create Hadoop clusters using either the Windows Azure Management Portal or using our PowerShell and Cross Platform Command line tools.

    The import/export hard drive support that came out today is a perfect companion service to use with HDInsight – the combination allows you to easily ingest, process and optionally export a limitless amount of data. We’ve also integrated HDInsight with our Business Intelligence tools, so users can leverage familiar tools like Excel in order to analyze the output of jobs. You can find out more about how to get started with HDInsight here.

    Virtual Machines: VM Gallery Enhancements

    Today’s update of Windows Azure brings with it a new Virtual Machine gallery that you can use to create new VMs in the cloud. You can launch the gallery by doing New->Compute->Virtual Machine->From Gallery within the Windows Azure Management Portal.

    The new Virtual Machine Gallery includes some nice enhancements that make it even easier to use:

    • Search: You can now easily search and filter images using the search box in the top-right of the dialog. For example, simply type “SQL” and we’ll filter to show those images in the gallery that contain that substring.
    • Category Tree-view: Each month we add more built-in VM images to the gallery. You can continue to browse these using the “All” view within the VM Gallery – or now quickly filter them using the category tree-view on the left-hand side of the dialog. For example, by selecting “Oracle” in the tree-view you can now quickly filter to see the official Oracle supplied images.
    • MSDN and Supported checkboxes: With today’s update we are also introducing filters that make it easy to filter out types of images that you may not be interested in. The first checkbox is MSDN: using this filter you can exclude any image that is not part of the Windows Azure benefits for MSDN subscribers (which have highly discounted pricing - you can learn more about the MSDN pricing here). The second checkbox is Supported: this filter will exclude any image that contains prerelease software, so you can feel confident that the software you choose to deploy is fully supported by Windows Azure and our partners.
    • Sort options: We sort gallery images by what we think customers are most interested in, but sometimes you might want to sort using different views. So we’re providing some additional sort options, like “Newest,” to customize the image list for what suits you best.
    • Pricing information: We now provide additional pricing information about images and options on how to cost effectively run them directly within the VM Gallery.

    The above improvements make it even easier to use the VM Gallery and quickly create, launch and run Virtual Machines in the cloud.
    Virtual Machines: ACL Support for VIPs

    A few months ago we exposed the ability to configure Access Control Lists (ACLs) for Virtual Machines using Windows PowerShell cmdlets and our Service Management API. With today’s release, you can now configure VM ACLs using the Windows Azure Management Portal as well. You can now do this by clicking the new Manage ACL command in the Endpoints tab of a virtual machine instance.

    This will enable you to configure an ordered list of permit and deny rules to scope the traffic that can access your VM’s network endpoints. For example, if you were on a virtual network, you could limit RDP access to a Windows Azure virtual machine to only a few computers attached to your enterprise. Or if you weren’t on a virtual network you could alternatively limit traffic from public IPs that can access your workloads.

    Here are the default behaviors for ACLs in Windows Azure:

    • By default (i.e. no rules specified), all traffic is permitted.
    • When using only Permit rules, all other traffic is denied.
    • When using only Deny rules, all other traffic is permitted.
    • When there is a combination of Permit and Deny rules, all other traffic is denied.

    Lastly, remember that configuring endpoints does not automatically configure them within the VM if it also has firewall rules enabled at the OS level. So if you create an endpoint using the Windows Azure Management Portal, Windows PowerShell, or REST API, be sure to also configure your guest VM firewall appropriately as well.

    Web Sites: Web Sockets Support

    With today’s release you can now use Web Sockets with Windows Azure Web Sites. This feature enables you to easily integrate real-time communication scenarios within your web based applications, and is available at no extra charge (it even works with the free tier). Higher level programming libraries like SignalR and socket.io are also now supported with it.

    You can enable Web Sockets support on a web site by navigating to the Configure tab of a Web Site, and by toggling Web Sockets support to “on”. Once Web Sockets is enabled you can start to integrate some really cool scenarios into your web applications. Check out the new SignalR documentation hub on www.asp.net to learn more about some of the awesome scenarios you can do with it.
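    To give a feel for what this enables, here is a minimal SignalR hub (a sketch added for illustration, not code from the original post; the hub and method names are arbitrary) that broadcasts chat messages to all connected clients. SignalR negotiates the best available transport automatically, and on a Web Site with the feature enabled it will use Web Sockets:

        using Microsoft.AspNet.SignalR;

        // A minimal SignalR 2.x hub. Clients call Send; the server then
        // invokes the "addMessage" callback on every connected client.
        public class ChatHub : Hub
        {
            public void Send(string name, string message)
            {
                Clients.All.addMessage(name, message);
            }
        }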
    Web Sites: Remote Debugging Support

    The Windows Azure SDK 2.2 we released two weeks ago introduced remote debugging support for Windows Azure Cloud Services. With today’s Windows Azure release we are extending this remote debugging support to also work with Windows Azure Web Sites.

    With live, remote debugging support inside of Visual Studio, you are able to have more visibility than ever before into how your code is operating live in Windows Azure. It is now super easy to attach the debugger and quickly see what is going on with your application in the cloud.

    Remote Debugging of a Windows Azure Web Site using VS 2013

    Enabling the remote debugging of a Windows Azure Web Site using VS 2013 is really easy. Start by opening up your web application’s project within Visual Studio. Then navigate to the “Server Explorer” tab within Visual Studio, and click on the deployed web-site you want to debug that is running within Windows Azure using the Windows Azure->Web Sites node in the Server Explorer. Then right-click and choose the “Attach Debugger” option on it.

    When you do this Visual Studio will remotely attach the debugger to the Web Site running within Windows Azure. The debugger will then stop the web site’s execution when it hits any break points that you have set within your web application’s project inside Visual Studio. For example, I set a breakpoint on the “ViewBag.Message” assignment statement within the HomeController of the standard ASP.NET MVC project template. When I hit refresh on the “About” page of the web site within the browser, the breakpoint was triggered and I am now able to debug the app remotely using Visual Studio.

    Note that we can debug variables (including autos/watchlist/etc), as well as use the Immediate and Command Windows. In the debug session I used the Immediate Window to explore some of the request object state, as well as to dynamically change the ViewBag.Message property. When we click the “Continue” button (or press F5) the app will continue execution and the Web Site will render the content back to the browser. This makes it super easy to debug web apps remotely.

    Tips for Better Debugging

    To get the best experience while debugging, we recommend publishing your site using the Debug configuration within Visual Studio’s Web Publish dialog. This will ensure that debug symbol information is uploaded to the Web Site which will enable a richer debug experience within Visual Studio. You can find this option on the Web Publish dialog on the Settings tab.

    When you ultimately deploy/run the application in production we recommend using the “Release” configuration setting – the release configuration is memory optimized and will provide the best production performance. To learn more about diagnosing and debugging Windows Azure Web Sites read our new Troubleshooting Windows Azure Web Sites in Visual Studio guide.

    Notification Hubs: Segmented Push Notification support with tag expressions

    In August we announced the General Availability of Windows Azure Notification Hubs - a powerful Mobile Push Notifications service that makes it easy to send high volume push notifications with low latency from any mobile app back-end. Notification hubs can be used with any mobile app back-end (including ones built using our Mobile Services capability) and can also be used with back-ends that run in the cloud as well as on-premises.

    Beginning with the initial release, Notification Hubs allowed developers to send personalized push notifications to both individual users as well as groups of users by interest, by associating their devices with tags representing the logical target of the notification. For example, by registering all devices of customers interested in a favorite MLB team with a corresponding tag, it is possible to broadcast one message to millions of Boston Red Sox fans and another message to millions of St. Louis Cardinals fans with a single API call respectively.

    New support for using tag expressions to enable advanced customer segmentation

    With today’s release we are adding support for even more advanced customer targeting. You can now identify customers that you want to send push notifications to by defining rich tag expressions. With tag expressions, you can now not only broadcast notifications to Boston Red Sox fans, but take that segmenting a step farther and reach more granular segments. This opens up a variety of scenarios, for example:
    • Offers based on multiple preferences—e.g. send a game day vegetarian special to users tagged as both a Boston Red Sox fan AND a vegetarian
    • Push content to multiple segments in a single message—e.g. rain delay information only to users who are tagged as either a Boston Red Sox fan OR a St. Louis Cardinals fan
    • Avoid presenting subsets of a segment with irrelevant content—e.g. season ticket availability reminder to users who are tagged as a Boston Red Sox fan but NOT also a season ticket holder

    To illustrate with code, consider a restaurant chain app that sends an offer related to a Red Sox vs Cardinals game for users in Boston. Devices can be tagged by your app with location tags (e.g. “Loc:Boston”) and interest tags (e.g. “Follows:RedSox”, “Follows:Cardinals”), and then a notification can be sent by your back-end to “(Follows:RedSox || Follows:Cardinals) && Loc:Boston” in order to deliver an offer to all devices in Boston that follow either the RedSox or the Cardinals. This can be done directly in your server backend send logic using the code below:

        var notification = new WindowsNotification(messagePayload);
        hub.SendNotificationAsync(notification, "(Follows:RedSox || Follows:Cardinals) && Loc:Boston");

    In your expressions you can use all Boolean operators: AND (&&), OR (||), and NOT (!). Some other cool use cases for tag expressions that are now supported include:

    • Social: To “all my group except me” - group:id && !user:id
    • Events: Touchdown event is sent to everybody following either team or any of the players involved in the action: Followteam:A || Followteam:B || followplayer:1 || followplayer:2 …
    • Hours: Send notifications at specific times. E.g. tag devices with time zone and when it is 12pm in Seattle send to: GMT8 && follows:thaifood
    • Versions and platforms: Send a reminder to people still using your first version for Android - version:1.0 && platform:Android

    For help on getting started with Notification Hubs, visit the Notification Hub documentation center. Then download the latest NuGet package (or use the Notification Hubs REST APIs directly) to start sending push notifications using tag expressions. They are really powerful and enable a bunch of great new scenarios.
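    For completeness, here is a rough sketch of the registration side, which is where the tags that expressions match against get attached. This is not code from the original post; it assumes the Notification Hubs .NET SDK, and the connection string, hub name, and channel URI are placeholders for your own values:

        using System.Threading.Tasks;
        using Microsoft.ServiceBus.Notifications;

        public static async Task RegisterDeviceAsync(string channelUri)
        {
            // Placeholders: substitute your hub's connection string and name.
            var hub = NotificationHubClient.CreateClientFromConnectionString(
                "<DefaultFullSharedAccessSignature connection string>", "<hub name>");

            // Tag the device's registration with a location and a team interest;
            // these are the tags that expressions like the one above match against.
            await hub.CreateWindowsNativeRegistrationAsync(
                channelUri, new[] { "Loc:Boston", "Follows:RedSox" });
        }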
    TFS & Git: Continuous Delivery Support for Web Sites + Cloud Services

    With today’s Windows Azure release we are making it really easy to enable continuous delivery support with Windows Azure and Team Foundation Services. Team Foundation Services is a cloud based offering from Microsoft that provides integrated source control (with both TFS and Git support), build server, test execution, collaboration tools, and agile planning support. It makes it really easy to setup a team project (complete with automated builds and test runners) in the cloud, and it has really rich integration with Visual Studio.

    With today’s Windows Azure release it is now really easy to enable continuous delivery support with both TFS and Git based repositories hosted using Team Foundation Services. This enables a workflow where when code is checked in, built successfully on an automated build server, and all tests pass on it – I can automatically have the app deployed on Windows Azure with zero manual intervention or work required.

    The steps below demonstrate how to quickly setup a continuous delivery workflow to Windows Azure with a Git-based ASP.NET MVC project hosted using Team Foundation Services.

    Enabling Continuous Delivery to Windows Azure with Team Foundation Services

    The project I’m going to enable continuous delivery with is a simple ASP.NET MVC project whose source code I’m hosting using Team Foundation Services. I did this by creating a “SimpleContinuousDeploymentTest” repository there using Git – and then used the new built-in Git tooling support within Visual Studio 2013 to push the source code to it.

    I can access the repository within Visual Studio 2013 and easily make commits with it (as well as branch, merge and do other tasks). Using VS 2013 I can also setup automated builds to take place in the cloud using Team Foundation Services every time someone checks in code to the repository. The cool thing about this is that I don’t have to buy or rent my own build server – Team Foundation Services automatically maintains its own build server farm and can automatically queue up a build for me (for free) every time someone checks in code using the above settings. This build server (and automated testing) support now works with both TFS and Git based source control repositories.

    Connecting a Team Foundation Services project to Windows Azure

    Once I have a source repository hosted in Team Foundation Services with Automated Builds and Testing set up, I can then go even further and set it up so that it will be automatically deployed to Windows Azure when a source code commit is made to the repository (assuming the Build + Tests pass). Enabling this is now really easy.

    To set this up with a Windows Azure Web Site simply use the New->Compute->Web Site->Custom Create command inside the Windows Azure Management Portal. I gave the web site a name and then made sure the “Publish from source control” checkbox was selected. When we click next we’ll be prompted for the location of the source repository. We’ll select “Team Foundation Services”. Once we do this we’ll be prompted for our Team Foundation Services account that our source repository is hosted under (in this case my TFS account is “scottguthrie”).

    When we click the “Authorize Now” button we’ll be prompted to give Windows Azure permissions to connect to the Team Foundation Services account. Once we do this we’ll be prompted to pick the source repository we want to connect to. Starting with today’s Windows Azure release you can now connect to both TFS and Git based source repositories. This new support allows me to connect to the “SimpleContinuousDeploymentTest” repository we created earlier.

    Clicking the finish button will then create the Web Site with the continuous delivery hooks setup with Team Foundation Services. Now every time someone pushes source control to the repository in Team Foundation Services, it will kick off an automated build, run all of the unit tests in the solution, and if they pass the app will be automatically deployed to our Web Site in Windows Azure. You can monitor the history and status of these automated deployments using the Deployments tab within the Web Site. This enables a really slick continuous delivery workflow, and enables you to build and deploy apps in a really nice way.

    Developer Analytics: New Relic support for Web Sites + Mobile Services

    With today’s Windows Azure release we are making it really easy to enable Developer Analytics and Monitoring support with both Windows Azure Web Sites and Windows Azure Mobile Services. We are partnering with New Relic, who provide a great dev analytics and app performance monitoring offering, to enable this - and we have updated the Windows Azure Management Portal to make it really easy to configure.
    Enabling New Relic with a Windows Azure Web Site

    Enabling New Relic support with a Windows Azure Web Site is now really easy. Simply navigate to the Configure tab of a Web Site and scroll down to the “developer analytics” section that is now within it. Clicking the “add-on” button will display some additional UI. If you don’t already have a New Relic subscription, you can click the “view windows azure store” button to obtain a subscription (note: New Relic has a perpetually free tier so you can enable it even without paying anything).

    Clicking the “view windows azure store” button will launch the integrated Windows Azure Store experience we have within the Windows Azure Management Portal. You can use this to browse from a variety of great add-on services – including New Relic. Select “New Relic” within the dialog, then click the next button, and you’ll be able to choose which type of New Relic subscription you wish to purchase. For this demo we’ll simply select the “Free Standard Version” – which does not cost anything and can be used forever.

    Once we’ve signed-up for our New Relic subscription and added it to our Windows Azure account, we can go back to the Web Site’s configuration tab and choose to use the New Relic add-on with our Windows Azure Web Site. We can do this by simply selecting it from the “add-on” dropdown (it is automatically populated within it once we have a New Relic subscription in our account). Clicking the “Save” button will then cause the Windows Azure Management Portal to automatically populate all of the needed New Relic configuration settings to our Web Site.

    Deploying the New Relic Agent as part of a Web Site

    The final step to enable developer analytics using New Relic is to add the New Relic runtime agent to our web app. We can do this within Visual Studio by right-clicking on our web project and selecting the “Manage NuGet Packages” context menu. This will bring up the NuGet package manager. You can search for “New Relic” within it to find the New Relic agent. Note that there is both a 32-bit and 64-bit edition of it – make sure to install the version that matches how your Web Site is running within Windows Azure (note: you can configure your Web Site to run in either 32-bit or 64-bit mode using the Web Site’s “Configuration” tab within the Windows Azure Management Portal).

    Once we install the NuGet package we are all set to go. We’ll simply re-publish the web site again to Windows Azure and New Relic will now automatically start monitoring the application.

    Monitoring a Web Site using New Relic

    Now that the application has developer analytics support with New Relic enabled, we can launch the New Relic monitoring portal to start monitoring the health of it. We can do this by clicking on the “Add Ons” tab in the left-hand side of the Windows Azure Management Portal, and then selecting the New Relic add-on we signed-up for within it. The Windows Azure Management Portal will provide some default information about the add-on when we do this. Clicking the “Manage” button in the tray at the bottom will launch a new browser tab and single-sign us into the New Relic monitoring portal associated with our account. We can now see insights into how our app is performing – without having to have written a single line of monitoring code.
    The New Relic service provides a ton of great built-in monitoring features allowing us to quickly see:

    • Performance times (including browser rendering speed) for the overall site and individual pages. You can optionally set alert thresholds to trigger if the speed does not meet a threshold you specify.
    • Information about where in the world your customers are hitting the site from (and how performance varies by region)
    • Details on the latency performance of external services your web apps are using (for example: SQL, Storage, Twitter, etc)
    • Error information including call stack details for exceptions that have occurred at runtime
    • SQL Server profiling information – including which queries executed against your database and what their performance was
    • And a whole bunch more…

    The cool thing about New Relic is that you don’t need to write monitoring code within your application to get all of the above reports (plus a lot more). The New Relic agent automatically enables the CLR profiler within applications and automatically captures the information necessary to identify these. This makes it super easy to get started and immediately have a rich developer analytics view for your solutions with very little effort.

    If you haven’t tried New Relic out yet with Windows Azure I recommend you do so – I think you’ll find it helps you build even better cloud applications. Following the above steps will help you get started and deliver you a really good application monitoring solution in only minutes.

    Service Bus: Support for partitioned queues and topics

    With today’s release, we are enabling support within Service Bus for partitioned queues and topics. Enabling partitioning allows you to achieve higher message throughput and better availability from your queues and topics. Higher message throughput is achieved by implementing multiple message brokers for each partitioned queue and topic. The multiple messaging stores will also provide higher availability.

    You can create a partitioned queue or topic by simply checking the Enable Partitioning option in the custom create wizard for a Queue or Topic. Read this article to learn more about partitioned queues and topics and how to take advantage of them today.
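    For those creating entities from code rather than the portal, the same option is exposed on QueueDescription (and TopicDescription) in the Service Bus .NET SDK. A minimal sketch, added here for illustration; the connection string and queue name are placeholders, and it assumes the Service Bus SDK version released alongside this update or later:

        using Microsoft.ServiceBus;
        using Microsoft.ServiceBus.Messaging;

        // Placeholder connection string for your Service Bus namespace.
        var namespaceManager = NamespaceManager.CreateFromConnectionString(
            "<Service Bus connection string>");

        // EnablePartitioning spreads the queue across multiple message
        // brokers and messaging stores for throughput and availability.
        var description = new QueueDescription("ordersqueue")
        {
            EnablePartitioning = true
        };
        namespaceManager.CreateQueue(description);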
    Billing: New Billing Alert Service

    Today’s Windows Azure update introduces a new Billing Alert Service Preview that enables you to get proactive email notifications when your Windows Azure bill goes above a certain monetary threshold that you configure. This makes it easier to manage your bill and avoid potential surprises at the end of the month.

    With the Billing Alert Service Preview, you can now create email alerts to monitor and manage your monetary credits or your current bill total. To set up an alert first sign-up for the free Billing Alert Service Preview. Then visit the account management page, click on a subscription you have setup, and then navigate to the new Alerts tab that is available. The alerts tab allows you to setup email alerts that will be sent automatically once a certain threshold is hit. For example, by clicking the “add alert” button I can setup a rule to send myself email anytime my Windows Azure bill goes above $100 for the month.

    The Billing Alert Service will evolve to support additional aspects of your bill as well as support multiple forms of alerts such as SMS. Try out the new Billing Alert Service Preview today and give us feedback.

    Summary

    Today’s Windows Azure release enables a ton of great new scenarios, and makes building applications hosted in the cloud even easier. If you don’t already have a Windows Azure account, you can sign-up for a free trial and start using all of the above features today. Then visit the Windows Azure Developer Center to learn more about how to build apps with it.

    Hope this helps,

    Scott

    P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


  • How can I get my Web API app to run again after upgrading to MVC 5 and Web API 2?

    - by Clay Shannon
    I upgraded my Web API app to the funkelnagelneu versions using this guidance: http://www.asp.net/mvc/tutorials/mvc-5/how-to-upgrade-an-aspnet-mvc-4-and-web-api-project-to-aspnet-mvc-5-and-web-api-2

    However, after going through the steps (it seems all this should be automated, anyway), I tried to run it and got, "A project with an Output Type of Class Library cannot be started directly."

    What in Sam Hills Brothers Coffee is going on here? Who said this was a class library? So I opened Project Properties, and changed it (it was marked as "Class Library" for some reason - it either wasn't yesterday, or was and worked fine) to an Output Type of "Windows Application" ("Console Application" and "Class Library" being the only other options). Now it won't compile, complaining: "Program 'c:\Platypus_Server_WebAPI\PlatypusServerWebAPI\PlatypusServerWebAPI\obj\Debug\PlatypusServerWebAPI.exe' does not contain a static 'Main' method suitable for an entry point..."

    How can I get my Web API app back up and running in view of this quandary?

    UPDATE

    Looking in packages.config, two entries seem chin-scratch-worthy:

        <package id="Microsoft.AspNet.Providers" version="1.2" targetFramework="net40" />
        <package id="Microsoft.Web.Infrastructure" version="1.0.0.0" targetFramework="net40" />

    All the others target net451. Could this be the problem? Should I remove these packages?

    UPDATE 2

    I tried to uninstall the Microsoft.Web.Infrastructure package (its description leads me to believe I don't need it; also, it has no dependencies) via the NuGet package manager, but it tells me, "NuGet failed to install or uninstall the selected package in the following project(s). [mine]"

    UPDATE 3

    I went through the steps again, and found that I had missed one step. I had to change this entry in the application web.config file (from "4.0.0.0" to "5.0.0.0"):

        <dependentAssembly>
          <assemblyIdentity name="System.Web.Mvc" publicKeyToken="31bf3856ad364e35" />
          <bindingRedirect oldVersion="1.0.0.0-5.0.0.0" newVersion="5.0.0.0" />
        </dependentAssembly>

    ...but I still get the same result - it rebuilds/compiles, but won't run, with "A project with an Output Type of Class Library cannot be started directly."

    UPDATE 4

    Thinking about the error message, that it can't directly open a class library, I thought, "Sure you can't/won't -- this is a web app, not a project." So I followed a hunch, closed the project, and reopened it as a website (instead of reopening a project). That has gotten me further, at least; now I see a YSOD:

        Could not load file or assembly 'System.Web.WebPages.Razor, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.

    UPDATE 5

    Note: the project is now (after being opened as a web site) named "localhost_48614". And...there is no longer a "References" folder...?!?!?

    As to that YSOD I'm getting, the official instructions (http://www.asp.net/mvc/tutorials/mvc-5/how-to-upgrade-an-aspnet-mvc-4-and-web-api-project-to-aspnet-mvc-5-and-web-api-2) said to do this, and I quote: "Update all elements that contain “System.Web.WebPages.Razor” from version “2.0.0.0” to version “3.0.0.0”."

    UPDATE 6

    When I select Tools > Library Package Manager > Manage NuGet Packages for Solution now, I get, "Operation failed. Unable to locate the solution directory. Please ensure that the solution has been saved."
    So I save it, and it saves it with this funky new name (C:\Users\clay\Documents\Visual Studio 2013\Projects\localhost_48614\localhost_48614.sln). I get the Yellow Strip of Enlightenment across the top of the NuGet Package Manager telling me, "Some NuGet packages are missing from this solution. Click to restore from your online package sources." I do (click the "Restore" button, that is), and it downloads the missing packages ... I end up with the 30 packages.

    I try to run the app/site again, and ... the erstwhile YSOD becomes a compilation error:

        The pre-application start initialization method Start on type System.Web.Mvc.PreApplicationStartCode threw an exception with the following error message: Could not load file or assembly 'System.Web.WebPages.Razor, Version=3.0.0.0, Culture=neutral, PublicKeyToken=31bf3856ad364e35' or one of its dependencies. The system cannot find the file specified.

    Argghhhh!!! (and it's not even talk-like-a-pirate day).


  • Authentication settings in IIS Manager versus web.config versus system.serviceModel

    - by Joe
    I'm new to ASP.NET :) I have a WCF web service, and I want to use Basic authentication. I am getting lost in the authentication options:

    • In IIS 6 Manager, I can go in to the properties of the web site and set authentication options.
    • In the web site's web.config file, under system.web, there is an <authentication mode="Windows"/> tag.
    • In the web site's web.config file, under system.serviceModel, I can configure:

        <wsHttpBinding>
          <binding name="MyBinding">
            <security mode="Transport">
              <transport clientCredentialType="Basic"/>
            </security>
          </binding>
        </wsHttpBinding>

    What is the difference between these three? How should each be configured? Some context: I have a simple web site project that contains a single .svc web service, and I want it to use Basic authentication over SSL. (Also, I want it to not use Windows accounts, but maybe that is another question.)
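    For reference, the system.serviceModel snippet above maps to this programmatic equivalent in WCF (a sketch for comparison only; it configures the binding WCF expects, while the IIS Manager and system.web settings govern how IIS itself authenticates the request):

        using System.ServiceModel;

        // Build a wsHttpBinding with HTTPS transport security and
        // HTTP Basic authentication at the transport (IIS) level.
        var binding = new WSHttpBinding();
        binding.Security.Mode = SecurityMode.Transport;
        binding.Security.Transport.ClientCredentialType = HttpClientCredentialType.Basic;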


  • Team build of Web Projects generates App_web_xxxx.dll files and TFSBuild.Proj Script

    - by Steve Johnson
    Hi all, I have a web application that has some non-web projects as well. When using Web Deployment, a single assembly is generated for all the aspx.vb files. When using Team Build (TS 2008), a large number of App_Web_xxx.dll files are generated instead of a single assembly. How can I solve this problem and change the TFSBuild.proj file so that it can generate a single web assembly instead of a large number of assemblies? Please help. Thanks

    Edit: I guess that's because the MERGE operation is not occurring like it used to for the Web Deployment Project in my solution. How can I enable a MERGE of the App_Web_*.dll files into a single Web.dll assembly file and delete the satellite assemblies?

    Here is my code from the TFSBuild.proj file (my web project is in Release|.NET Config and all other projects within the solution are in Release|Any CPU):

        true .\Debug true true Web true
        false .\Release true true Web true

    Please tell me what corrections I need to make.


  • Could an ontology suitably replace an RDBMS for a web app?

    - by Thomas
    I'm considering storing the content of a web app in an RDF or OWL ontology instead of an RDBMS. However, when I research ontologies they seem to always exist in the context of publicly accessible data stores serving as the backbone of the semantic web. I've never heard of them being used as the content engine behind a web app. Is it reasonable to use an ontology instead of an RDBMS for such an application? (Again, this is just for content. User data, commerce and stuff like that will stay in a database as I see no need to reinvent the wheel there.)


  • Web service reference location?

    - by Damien Dennehy
    I have a Visual Studio 2008 solution that currently consists of three projects:

    • A DataFactory project for Business Logic/Data Access.
    • A Web project consisting of the actual user interface, pages, controls, etc.
    • A Web.Core project consisting of utility classes, etc.

    The application requires consuming a web service. Normally I'd add the service reference to the Web project, but I'm not sure if this is best practice or not. The following options are open to me:

    1. Add the reference to the Web project.
    2. Add the reference to the Web.Core project, and create a wrapper method that Web will call to consume the web service.
    3. Add a new project called Web.Services, and copy step 2.

    This project is expected to increase in size so I'm open to any suggestions.


  • What is the difference between remote procedure call and web service

    - by xiao
    Is there any clear definition of RPC and Web Service? A quick Wikipedia search shows:

    RPC: Remote procedure call (RPC) is an inter-process communication technology that allows a computer program to cause a subroutine or procedure to execute in another address space (commonly on another computer on a shared network) without the programmer explicitly coding the details for this remote interaction.

    Web Service: Web services are typically application programming interfaces (API) or web APIs that are accessed via Hypertext Transfer Protocol and executed on a remote system hosting the requested services. Web services tend to fall into one of two camps: Big Web Services[1] and RESTful Web Services.

    I am not quite clear on the real difference between the two. It seems that something could belong to RPC and be a kind of web service at the same time. Is a web service a higher level representation of RPC?


  • An Honest look at SharePoint Web Services

    - by juanlarios
    INTRODUCTION

    If you are a SharePoint developer you know that there are two basic ways to develop against SharePoint: 1) the object model, and 2) web services.

    The SharePoint object model has the advantage of being quite rich. Anything you can do through the SharePoint UI as an administrator or end user, you can do through the object model. In fact everything that is done through the UI is done through the object model behind the scenes. The major disadvantage to getting at SharePoint this way is that the code needs to run on the server. This means that all web parts, event receivers, features, etc… all of this is code that is deployed to the server.

    The second way to get to SharePoint is through the built-in web services. There are many articles on how to manipulate web services, how to authenticate to them and interact with them. The basic idea is that a remote application or process can contact SharePoint through a web service. Lots has been written about how great these web services are. This article is written to document the limitations and some of the issues and frustrations with working with SharePoint's built-in web services. Ultimately, for the tasks I was given, the built-in web services did not suffice. My evaluation of them was made against creating my own WCF services to do what I needed.

    The current project I'm working on involved several "integration points". A remote application, installed on a separate server, was to contact SharePoint and perform a task or operation. So I started up Visual Studio and built a DLL with basically two layers of logic: an integration layer and a data layer. A good friend of mine pointed me to SOLID principles and referred me to some videos and tutorials about them. I decided to implement the methodology (although a lot of the principles are common sense and I had already incorporated them in my coding practices). I was to deliver this DLL to the application team and they would simply call the methods exposed by it and voila! It would do some task or operation in SharePoint.

    SOLUTION

    My integration layer implemented an interface that defined some of the basic integration tasks that I was to put together. My data layer was about the same: it implemented an interface with some of the tasks that I was going to develop. This gave me the opportunity to develop different data layers, ultimately different ways to get at SharePoint if I needed to. This is a classic SOLID principle. In this case it proved to be quite helpful because I wrote one data layer completely implementing SharePoint's built-in web services and another implementing my own WCF service that I wrote.

    I should mention there is another layer underneath the data layer. In referencing SharePoint or WCF services in my Visual Studio project I created a class for every web service call. So for example, if I used Lists.asmx, I created a class called "DocumentRetrieval"; this class would do the grunt work to connect to the correct URL, perform the basic operation of contacting the service and so on. If I used Views.asmx, I implemented a class called "ViewRetrieval" with the same idea as the last class, but it would now interact with all the operations in Views.asmx. This gave my data layer the ability to perform multiple calls without really worrying about some of the grunt work each class performs. This again is a classic SOLID principle. So, in order to compare them side by side we can look at both data layers and what is involved in each.
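    As a rough illustration of the layering just described (this interface is my own sketch, not the post's actual code; it borrows the ExceptionContract and ServiceResult types that appear in the code further down):

        // Hypothetical data-layer contract: the integration layer depends only
        // on this interface. One implementation talks to the built-in SharePoint
        // web services, another to a custom WCF service.
        public interface IProjectDataLayer
        {
            bool DocLibExists(string listName, ref ExceptionContract exContract);
            ServiceResult CreateProject(string name, string templateName, string projectId);
        }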
    Let's take a look at the "Create Project" task or operation. The integration point is described as: "the dll is to provide a way to create a project in SharePoint". Projects, in this case, are basically document libraries. I am to implement a way in which a remote application can create a document library in SharePoint. Easy enough, right? Use the Lists.asmx web service in SharePoint. So here we go! Let's take a look at the code.

    I added the Lists.asmx web service reference to my project and this is the class that contacts it:

        class DocumentRetrieval
        {
            private ListsSoapClient _service;
            private bool _impersonation;

            public DocumentRetrieval(bool impersonation, string endpt)
            {
                _service = new ListsSoapClient();
                this.SetEndPoint(string.Format("{0}/{1}", endpt, ConfigurationManager.AppSettings["List"]));
                _impersonation = impersonation;
                if (_impersonation)
                {
                    _service.ClientCredentials.Windows.ClientCredential.Password = ConfigurationManager.AppSettings["password"];
                    _service.ClientCredentials.Windows.ClientCredential.UserName = ConfigurationManager.AppSettings["username"];
                    _service.ClientCredentials.Windows.AllowedImpersonationLevel =
                        System.Security.Principal.TokenImpersonationLevel.Impersonation;
                }
            }

            private void SetEndPoint(string p)
            {
                _service.Endpoint.Address = new EndpointAddress(p);
            }

            /// <summary>
            /// Creates a document library with a specific name and template ID
            /// </summary>
            /// <param name="listName">New list name</param>
            /// <param name="templateID">Template ID</param>
            /// <returns></returns>
            public XmlElement CreateLibrary(string listName, int templateID, ref ExceptionContract exContract)
            {
                XmlDocument sample = new XmlDocument();
                XmlElement viewCol = sample.CreateElement("Empty");
                try
                {
                    _service.Open();
                    viewCol = _service.AddList(listName, "", templateID);
                }
                catch (Exception ex)
                {
                    exContract = new ExceptionContract("DocumentRetrieval/CreateLibrary", ex.GetType(), "Connection Error", ex.StackTrace, ExceptionContract.ExceptionCode.error);
                }
                finally
                {
                    _service.Close();
                }
                return viewCol;
            }
        }

    There was a lot more in this class (that I am not including) because I was reusing the grunt work and making other operations with Lists.asmx – for example, updating content types, or changing and configuring lists and document libraries.

    One of the first things I noticed about working with the built-in services is that you are really at the mercy of what is available to you. Before creating a document library (project) I wanted to expose an IsProjectExisting method. This way the integration or data layer could recognize if a library already exists. Well, there is no service call or method available to do that check.
    So this is what I wrote:

        public bool DocLibExists(string listName, ref ExceptionContract exContract)
        {
            try
            {
                var allLists = _service.GetListCollection();
                return allLists.ChildNodes.OfType<XmlElement>().ToList().Exists(x => x.Attributes["Title"].Value == listName);
            }
            catch (Exception ex)
            {
                exContract = new ExceptionContract("DocumentRetrieval/GetList/GetListWSCall", ex.GetType(), "Unable to Retrieve List Collection", ex.StackTrace, ExceptionContract.ExceptionCode.error);
            }
            return false;
        }

    This really just gets an XmlElement with all the lists. It was then up to me to sift through the clutter and noise and see if the document library already existed. This took a little bit of getting used to: now instead of working with code, you are working with the XmlElement response format from the web service. I wrote a LINQ query to go through and find whether the attribute "Title" existed with a value of the list name; if so it returns true, if not false. I didn't particularly like working this way, dealing with XmlElement responses and then having to manipulate them to get at the exact data I was looking for. Once the DocLibExists check was done, I would either create the document library or send back an error indicating the document library already existed.

    Now let's examine the code that actually creates the document library. It does what you are really after: it creates a document library. Notice how the template ID is really an integer. Every document library template in SharePoint has an ID associated with it. Document libraries, Image Library, Custom List, Project Tasks, etc… they all have a unique integer associated with them. Well, that's great, but the client came back to me and gave me some specifics that each "project", or document library, should have. They specified they had three types of projects. Each project would have unique views, about 10 views for each project. Each project specified unique configurations (auditing, versioning, content types, etc…). So what started out as a simple implementation of creating a document library as a repository for a project turned out to be quite involved.

    The first thing I thought of was to create a template for the document library. There are other ways you can do this too. Using the web service call, you could configure views, versioning, even content types, etc… the only catch is, you have to be working quite extensively with CAML. I am not fond of CAML. I can do it and work with it, I just don't like doing it. It is quite touchy and at times it is quite tough to understand where errors were made with CAML statements. Working with web services and CAML proved to be quite annoying. The service call would return a generic error message that did not particularly point me to a CAML statement syntax error, or even a CAML error at all. I was not sure if it was a security, performance or code based issue. It was quite tough to work with. At times it was difficult because of the way SharePoint handles metadata: there are "Name", "Display Name", and "StaticName" fields, and it was quite tough to understand at times which one to use. So it took a lot of trial and error. There are tools that can help with CAML generation. There is also now IntelliSense for CAML statements in Visual Studio that might help, but ultimately I'm not fond of CAML with web services. So I decided on the template.
    So my plan was to create a document library, configure it accordingly and then use the Template Builder that comes with the SharePoint SDK. This tool allows you to create site templates, list templates etc… It is quite interesting because it does not generate an STP file, it actually generates an xml definition and a feature you can activate to make that template available on a site or site collection.

    The first issue I experienced with this is that one of the specifications for this template was that the "All Documents" view was to have two web parts on it. Well, it turns out that the template builder did not include the web parts as part of the list template definition it generated. It backed up the settings, the views, the content types, but not the custom web parts. I still decided to try this even without the web parts on the page. This new template defined a new document library definition with a unique ID. The problem was that the service call accepts an int but it only has access to the built-in library int definitions. Any new ones added or created will not be available to create. So this made it impossible for me to approach the problem this way.

    I should also mention that one of the nice features of SharePoint is the ability to create list templates, back them up and then create lists based on that template. It can all be done by end user administrators. These templates are quite unique because they are saved as an STP file and not an xml definition. I also went this route and tried to see if there was another service call where I could create a document library based on a given template name. Nope! None.

    After some thinking I decided to implement a WCF service to do this creation for me. I was quite certain that the object model would allow me to create document libraries based on a template in which an ID was required, and also on templates saved as STP files. Now, I won't bother posting the code to contact the WCF service because it's self explanatory, but I will post the code that I used to create a list with a custom template:

        public ServiceResult CreateProject(string name, string templateName, string projectId)
        {
            string siteurl = SPContext.Current.Site.Url;
            Guid webguid = SPContext.Current.Web.ID;
            using (SPSite site = new SPSite(siteurl))
            {
                using (SPWeb rootweb = site.RootWeb)
                {
                    SPListTemplateCollection temps = site.GetCustomListTemplates(rootweb);
                    ProcessWeb(siteurl, webguid, web => Act_CreateProject(web, name, templateName, projectId, temps));
                } // SPWeb
            } // SPSite
            return _globalResult;
        }

        private void Act_CreateProject(SPWeb targetsite, string name, string templateName, string projectId, SPListTemplateCollection temps)
        {
            var temp = temps.Cast<SPListTemplate>().FirstOrDefault(x => x.Name.Equals(templateName));
            if (temp != null)
            {
                try
                {
                    Guid listGuid = targetsite.Lists.Add(name, "", temp);
                    SPList newList = targetsite.Lists[listGuid];
                    _globalResult = new ServiceResult(true, "Success", "Success");
                }
                catch (Exception ex)
                {
                    _globalResult = new ServiceResult(false, (string.IsNullOrEmpty(ex.Message) ? "None" : ex.Message + " " + templateName), ex.StackTrace.ToString());
                }
            }
        }

        private void ProcessWeb(string siteurl, Guid webguid, Action<SPWeb> action)
        {
            using (SPSite sitecollection = new SPSite(siteurl))
            {
                using (SPWeb web = sitecollection.AllWebs[webguid])
                {
                    action(web);
                }
            }
        }

    This is some of the code I implemented for the service; there was a lot more I did on project creation which I will cover in my next blog post. I implemented an Action method to process the web. This allowed me to properly dispose the SPWeb and SPSite objects and not rewrite this code over and over again. So I implemented a WCF service to create projects for me. This allowed me to do a lot more than just create a document library with a template; it now gave me the flexibility to do just about anything the client wanted at project creation.

    Once this was implemented, the client came back to me and said, "We reference all our projects with IDs in our application. We want SharePoint to do the same." This has been something I have been doing for a little while now, but I do hope that SharePoint 2010 can have more of an answer to this and address it properly. I have been adding metadata to SPWebs through the property bag. I believe I have blogged about it before. This time it required metadata added to a document library. No problem!!!

    I also mentioned the web parts that were to go on the "All Documents" view. I took the opportunity to configure them to the appropriate settings. There were two settings that needed to be set on these web parts. One of them was a Project ID configured in the web part properties.
    The following code enhances and replaces the "Act_CreateProject" method above:

        private void Act_CreateProject(SPWeb targetsite, string name, string templateName, string projectId, SPListTemplateCollection temps)
        {
            var temp = temps.Cast<SPListTemplate>().FirstOrDefault(x => x.Name.Equals(templateName));
            if (temp != null)
            {
                SPLimitedWebPartManager wpmgr = null;
                try
                {
                    Guid listGuid = targetsite.Lists.Add(name, "", temp);
                    SPList newList = targetsite.Lists[listGuid];
                    SPFolder rootFolder = newList.RootFolder;
                    rootFolder.Properties.Add(KEY, projectId);
                    rootFolder.Update();
                    if (rootFolder.ParentWeb != targetsite)
                        rootFolder.ParentWeb.Dispose();
                    if (!templateName.Contains("Natural"))
                    {
                        SPView alldocumentsview = newList.Views.Cast<SPView>().FirstOrDefault(x => x.Title.Equals(ALLDOCUMENTS));
                        SPFile alldocfile = targetsite.GetFile(alldocumentsview.ServerRelativeUrl);
                        wpmgr = alldocfile.GetLimitedWebPartManager(PersonalizationScope.Shared);
                        ConfigureWebPart(wpmgr, projectId, CUSTOMWPNAME);
                        alldocfile.Update();
                    }
                    if (newList.ParentWeb != targetsite)
                        newList.ParentWeb.Dispose();
                    _globalResult = new ServiceResult(true, "Success", "Success");
                }
                catch (Exception ex)
                {
                    _globalResult = new ServiceResult(false, (string.IsNullOrEmpty(ex.Message) ? "None" : ex.Message + " " + templateName), ex.StackTrace.ToString());
                }
                finally
                {
                    if (wpmgr != null)
                    {
                        wpmgr.Web.Dispose();
                        wpmgr.Dispose();
                    }
                }
            }
        }

        private void ConfigureWebPart(SPLimitedWebPartManager mgr, string prjId, string webpartname)
        {
            var wp = mgr.WebParts.Cast<System.Web.UI.WebControls.WebParts.WebPart>().FirstOrDefault(x => x.DisplayTitle.Equals(webpartname));
            if (wp != null)
            {
                (wp as ListRelationshipWebPart.ListRelationshipWebPart).ProjectID = prjId;
                mgr.SaveChanges(wp);
            }
        }

    This shows how I was able to set metadata on the document library. It has to be added to the RootFolder of the document library; unfortunately, SPList does not have a property bag that I can add a key/value pair to, so it has to be done on the root folder. Now everything in the integration will reference projects by ID and will not care about names. My "DocLibExists" method will now need to be changed, because a web service is not set up to look at property bags; I had to write another method on the service to do the equivalent but with IDs instead of names.

    The second thing you will notice about the code is the use of the SPLimitedWebPartManager. I have seen several examples online, and have also read a lot about memory leaks. The above code does not produce memory leaks: the web part manager creates an SPWeb, so just dispose of it like I did.
CONCLUSION This is a long, long post, so I will stop here for now; I will continue with more comparisons and limitations in my next post. My conclusion for this example is that web services will do the trick if you can suffer through CAML and you are doing simple operations. For everything else, there's WCF! I apologize for the disorganization of this post; I wrote it on a bus during a 12-hour trip to Iowa, half asleep and half awake. Hopefully it makes enough sense to someone.

    Read the article

  • Web Deploy 3.0 Installation Fails

    - by jkarpilo
    I am having difficulty installing Microsoft Web Deploy 3.0 to a Windows Server 2008 R2 box. I have tried installing with both the Web Platform Installer and the MSI package, but installation fails while trying to execute the MSI custom action ExecuteRegisterUIModuleCA. This server is a VM and a member of a farm, but shared config is disabled while I'm installing. Here's the point at which it fails in the MSI log (starting at line 1875):

    MSI (s) (80:FC) [15:29:01:358]: Executing op: ActionStart(Name=IISBeginTransactionCA,,)
    MSI (s) (80:FC) [15:29:01:374]: Executing op: CustomActionSchedule(Action=IISBeginTransactionCA,ActionType=3073,Source=BinaryData,Target=IISBeginTransactionCA,)
    MSI (s) (80:A8) [15:29:01:374]: Invoking remote custom action. DLL: C:\Windows\Installer\MSI6C6A.tmp, Entrypoint: IISBeginTransactionCA
    MSI (s) (80:FC) [15:29:01:436]: Executing op: ActionStart(Name=IISRollbackTransactionCA,,)
    MSI (s) (80:FC) [15:29:01:436]: Executing op: CustomActionSchedule(Action=IISRollbackTransactionCA,ActionType=3329,Source=BinaryData,Target=IISRollbackTransactionCA,)
    MSI (s) (80:FC) [15:29:01:436]: Executing op: ActionStart(Name=IISCommitTransactionCA,,)
    MSI (s) (80:FC) [15:29:01:436]: Executing op: CustomActionSchedule(Action=IISCommitTransactionCA,ActionType=3585,Source=BinaryData,Target=IISCommitTransactionCA,)
    MSI (s) (80:FC) [15:29:01:436]: Executing op: ActionStart(Name=IISExecuteCA,,)
    MSI (s) (80:FC) [15:29:01:452]: Executing op: CustomActionSchedule(Action=IISExecuteCA,ActionType=3073,Source=BinaryData,Target=IISExecuteCA,CustomActionData=1^3^21^WebDeployment_Current^154^Microsoft.Web.Deployment.UI.PackagingModuleProvider, Microsoft.Web.Deployment.UI.Server, Version=9.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35^1^1^0^^1^3^28^DelegationManagement_Current^171^Microsoft.Web.Management.Delegation.DelegationModuleProvider, Microsoft.Web.Management.Delegation.Server, Version=9.0.0.0, Culture=neutral, PublicKeyToken=31BF3856AD364E35^1^1^0^^1^7^38^system.webServer/management/delegation^4^Deny^16^MachineToWebRoot^0^^3^yes^1^7^31^system.webServer/wdeploy/backup^4^Deny^20^MachineToApplication^0^^2^no^)
    MSI (s) (80:84) [15:29:01:452]: Invoking remote custom action. DLL: C:\Windows\Installer\MSI6CB9.tmp, Entrypoint: IISExecuteCA
    1: IISCA IISExecuteCA : Begin CA Setup
    1: IISCA IISExecuteCA : CA 'ExecuteRegisterUIModuleCA' completed with return code hr=0x8007000d
    1: IISCA IISExecuteCA : CA 'IISExecuteCA' completed with return code hr=0x8007000d
    1: IISCA IISExecuteCA : End CA Setup
    CustomAction IISExecuteCA returned actual error code 1603 (note this may not be 100% accurate if translation happened inside sandbox)
    Action ended 15:29:05: InstallFinalize. Return value 3.

    I can't seem to find any information regarding this particular issue; can someone help point me in the right direction?

    Read the article

  • What is the difference between WCF service and a simple Web service in developing using .NET Framework?

    - by Steve Johnson
    My questions are: What is the difference between a WCF service and a simple Web service in the .NET Framework? What can a WCF service do that a .NET Web service can't? In other words, what are the limitations of .NET Web services that were overcome in WCF? I understand that WCF is REST-based and .NET web services are SOAP-based, but I need to know more than that. How should a developer make the design decision between developing a Web service or a WCF service?
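    To make the comparison concrete, here is a minimal sketch of the same operation in both models (the names are invented for illustration). The ASMX service is tied to IIS and HTTP/SOAP by its nature; the WCF contract says nothing about transport, which is supplied by a binding in configuration:

    using System.ServiceModel;
    using System.Web.Services;

    // Classic .NET ("ASMX") web service: hosted in IIS, spoken over HTTP/SOAP.
    public class PriceService : WebService
    {
        [WebMethod]
        public decimal GetPrice(string sku) { return 9.99m; }
    }

    // WCF: the contract is transport-agnostic; HTTP, TCP, named pipes, or MSMQ
    // are chosen by the binding in config, not in code.
    [ServiceContract]
    public interface IPriceService
    {
        [OperationContract]
        decimal GetPrice(string sku);
    }

    The rule of thumb most answers converge on: ASMX for simple HTTP-only SOAP endpoints, WCF when you need other transports, richer security modes, or transactions.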

    Read the article

  • Why is web app development path in Java this much confusing? [closed]

    - by Farshid
    I'm currently a .NET web developer and I would really like to switch to Java. I used JSP about 7 years ago to develop and deploy a small web application on a JRun app server. But now that I want to return to Java after 7 years, I can't find the clue. There are many web development frameworks in the Java world, and each has fans who recommend it. There are also languages that sit on top of the JVM and are used for web development (like JRuby, I think). I am confused, and I do not know where to start on the path of learning Java web development. I do not want to focus on custom, tailor-made approaches; I want to stay on the basic path of developing with standard tools and methods and deploying to standard app servers. (For example, some say not to use EJBs; others say to focus on MVC facilities like JSF. I'm confused and don't know which path to take.)

    Read the article

  • Am I getting paid a reasonable wage for web engineering?

    - by sailtheworld
    I've been doing some research, and it looks like most people in my line of work - WEB ENGINEERING/WEB APPLICATION DEVELOPMENT - who get paid hourly make anywhere from $30-80 an hour. With that said, I have SEVEN years of experience with web development, including OOP PHP, MySQL, jQuery, OO JavaScript, interface design, Ajax, database architecture, etc. I am also very strong with visual design and workflow, and I've built some really high-quality interactive interfaces. I also have a lot of experience with Zend Framework, Symfony, WordPress, Drupal, etc., and a really strong portfolio to show for it. Here's the catch: I'm 20 years old and haven't graduated from college yet (I'm doing part-time school and ~30 hours a week of web development). But I've literally been doing web apps since I was 13 years old. So my question is: is $14 an hour a reasonable starting wage for working at a company part time?

    Read the article

  • What Do I Need To Know About Servers In a Web Development Role?

    - by john
    I know that may sound a little vague, so I'll try to explain a little further... After being a self-employed developer for many years, I'm now in search of a commercial web developer role. My only experience with servers and hosting is uploading through FTP and playing around with cPanel/WHM a little. The roles I'm going for are web development roles of the PHP, MySQL, HTML, CSS type, but in recent interviews I've been asked questions about setting things up on the server and had no idea what was being said... which wasn't ideal! Without knowing more than I do, it's hard to explain exactly what I'm looking to learn, but it's basically the server-side elements I should know as a web developer. If you're a web developer, do you have any dealings with the server apart from uploading files, and if so, what? Are things like Subversion (SVN) and version control systems often set up by the web development team? Could that be what they were talking about?

    Read the article

  • Java - Problem in deploying Web Application

    - by Yatendra Goel
    I have built a Java web application, packed it into a .war file, and tested it on my local Tomcat server, where it runs fine. But when I deployed it on my client's server, it shows an error. According to the remote server (my client's server), it cannot find a TLD file packed in a JAR file which I had placed in the WEB-INF/lib directory. But when I checked the WEB-INF/lib directory for the JAR file, I found it was there. The contents of META-INF/MANIFEST.MF are as follows:

    Manifest-Version: 1.0
    Class-Path:

    I think there is no need to explicitly mention the classpath of the WEB-INF/lib directory, as it is on the classpath of any web application by default. Then why can't the server find the JAR file in the lib directory when I deploy to the remote server, and why does it work when I deploy the same application on my local server? I posted a question about this at http://stackoverflow.com/questions/2441254/struts-1-struts-taglib-jar-is-not-being-found-by-my-web-application but found that the problem is unusual, as nobody could answer it. So my questions are as follows: Q1. Does WEB-INF/lib remain on the classpath if I leave the Class-Path entry blank, as shown above in the MANIFEST.MF file? Should I delete the Class-Path entry completely from the file, or should I explicitly enter Class-Path: /WEB-INF/lib? Q2. I have JSP pages, servlets, and some helper classes in the web application. JSP pages are located at the root; servlets and helper classes are located in the WEB-INF/classes folder. Is there any problem with my helper classes being located in the WEB-INF/classes folder? Note: Please note that this question is not the same as my previous question; it is a follow-up. Both servers (local and remote) are Tomcat servers.

    Read the article

  • System.InvalidOperationException compiling ASP.NET app on Mono

    - by Radu094
    This is the error I get when I start my ASP.NET application in Mono:

    System.InvalidOperationException: The process must exit before getting the requested information.
    at System.Diagnostics.Process.get_ExitCode () [0x00044] in /usr/src/mono-2.6.3/mcs/class/System/System.Diagnostics/Process.cs:149
    at (wrapper remoting-invoke-with-check) System.Diagnostics.Process:get_ExitCode ()
    at Mono.CSharp.CSharpCodeCompiler.CompileFromFileBatch (System.CodeDom.Compiler.CompilerParameters options, System.String[] fileNames) [0x001ee] in /usr/src/mono-2.6.3/mcs/class/System/Microsoft.CSharp/CSharpCodeCompiler.cs:267
    at Mono.CSharp.CSharpCodeCompiler.CompileAssemblyFromFileBatch (System.CodeDom.Compiler.CompilerParameters options, System.String[] fileNames) [0x00011] in /usr/src/mono-2.6.3/mcs/class/System/Microsoft.CSharp/CSharpCodeCompiler.cs:156
    at System.CodeDom.Compiler.CodeDomProvider.CompileAssemblyFromFile (System.CodeDom.Compiler.CompilerParameters options, System.String[] fileNames) [0x00014] in /usr/src/mono-2.6.3/mcs/class/System/System.CodeDom.Compiler/CodeDomProvider.cs:119
    at System.Web.Compilation.AssemblyBuilder.BuildAssembly (System.Web.VirtualPath virtualPath, System.CodeDom.Compiler.CompilerParameters options) [0x0022f] in /usr/src/mono-2.6.3/mcs/class/System.Web/System.Web.Compilation/AssemblyBuilder.cs:804
    at System.Web.Compilation.AssemblyBuilder.BuildAssembly (System.Web.VirtualPath virtualPath) [0x00000] in /usr/src/mono-2.6.3/mcs/class/System.Web/System.Web.Compilation/AssemblyBuilder.cs:730
    at System.Web.Compilation.BuildManager.GenerateAssembly (System.Web.Compilation.AssemblyBuilder abuilder, System.Web.Compilation.BuildProviderGroup group, System.Web.VirtualPath vp, Boolean debug) [0x00254] in /usr/src/mono-2.6.3/mcs/class/System.Web/System.Web.Compilation/BuildManager.cs:624
    at System.Web.Compilation.BuildManager.BuildInner (System.Web.VirtualPath vp, Boolean debug) [0x0011c] in /usr/src/mono-2.6.3/mcs/class/System.Web/System.Web.Compilation/BuildManager.cs:411
    at System.Web.Compilation.BuildManager.Build (System.Web.VirtualPath vp) [0x00050] in /usr/src/mono-2.6.3/mcs/class/System.Web/System.Web.Compilation/BuildManager.cs:356
    at System.Web.Compilation.BuildManager.GetCompiledType (System.Web.VirtualPath virtualPath) [0x0003a] in /usr/src/mono-2.6.3/mcs/class/System.Web/System.Web.Compilation/BuildManager.cs:803
    at System.Web.Compilation.BuildManager.CreateInstanceFromVirtualPath (System.Web.VirtualPath virtualPath, System.Type requiredBaseType) [0x0000c] in /usr/src/mono-2.6.3/mcs/class/System.Web/System.Web.Compilation/BuildManager.cs:500
    at System.Web.UI.PageParser.GetCompiledPageInstance (System.String virtualPath, System.String inputFile, System.Web.HttpContext context) [0x0001c] in /usr/src/mono-2.6.3/mcs/class/System.Web/System.Web.UI/PageParser.cs:161
    at System.Web.UI.PageHandlerFactory.GetHandler (System.Web.HttpContext context, System.String requestType, System.String url, System.String path) [0x00000] in /usr/src/mono-2.6.3/mcs/class/System.Web/System.Web.UI/PageHandlerFactory.cs:45
    at System.Web.HttpApplication.GetHandler (System.Web.HttpContext context, System.String url, Boolean ignoreContextHandler) [0x00055] in /usr/src/mono-2.6.3/mcs/class/System.Web/System.Web/HttpApplication.cs:1643
    at System.Web.HttpApplication.GetHandler (System.Web.HttpContext context, System.String url) [0x00000] in /usr/src/mono-2.6.3/mcs/class/System.Web/System.Web/HttpApplication.cs:1624
    at System.Web.HttpApplication+<Pipeline>c__Iterator2.MoveNext () [0x0075f] in /usr/src/mono-2.6.3/mcs/class/System.Web/System.Web/HttpApplication.cs:1259

    I checked the source code indicated by the stack trace, namely CSharpCodeCompiler.cs:267:

    mcs.WaitForExit();
    result.NativeCompilerReturnValue = mcs.ExitCode; // this throws the exception

    I have no idea if this is a bug in Mono or if my app is doing something it shouldn't. A simple "Hello World" application indicates that Mono is properly installed and working; it is just my app that causes this exception. Hoping some enlightened minds have more insight on the issue.
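    For reference, here is a minimal sketch of the documented pattern for reading an exit code. Note that the Mono source quoted above already follows this pattern, which suggests a runtime bug rather than a usage error in the app; the snippet is illustrative only:

    using System.Diagnostics;

    class ExitCodeDemo
    {
        // ExitCode may only be read after the process has fully exited;
        // a parameterless WaitForExit() blocks until that is true.
        static int RunAndGetExitCode(string fileName, string arguments)
        {
            var psi = new ProcessStartInfo(fileName, arguments)
            {
                UseShellExecute = false
            };
            using (var p = Process.Start(psi))
            {
                p.WaitForExit();   // guarantees HasExited == true
                return p.ExitCode; // safe to read here
            }
        }

        static void Main()
        {
            System.Console.WriteLine(RunAndGetExitCode("mono", "--version"));
        }
    }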

    Read the article

  • ASP.NET Web Service Throws 401 (unauthorized) Error

    - by user268611
    Hi Experts, I have this .NET application to be run in an intranet environment. It is configured to require Windows Authentication before you can access the website (anonymous access is disabled). The website calls a web service (with anonymous access enabled), and the web service calls the DB. We have token-based authentication between the web application and the web service to secure the communication between them. The issue I'm facing is that when I deploy this to production, the communication between the web application and the web service intermittently fails with a 401. This works fine in our QA environment. Is this an issue with Active Directory? Or could it be an issue with the FQDN, as mentioned here: http://support.microsoft.com/default.aspx?scid=kb;en-us;896861? The weirdest thing is that this happens intermittently when tested both on the server itself and from a remote workstation in my client's environment, yet it works perfectly in my environment. OS: Windows Server SP1, IIS 6, .NET 3.5 Framework. Any idea about the 401 (Unauthorized) issue? Thx for the help... This is from the log:

    Event code: 3005
    Event message: An unhandled exception has occurred.
    Event time: 4/5/2010 10:44:57 AM
    Event time (UTC): 4/5/2010 2:44:57 AM
    Event ID: 6c8ea2607b8d4e29a7f0b1c392b1cb21
    Event sequence: 155112
    Event occurrence: 2
    Event detail code: 0
    Application information: Application domain: xxx, Trust level: Full, Application Virtual Path: xxx, Application Path: xxx, Machine name: xxx
    Process information: Process ID: 4424, Process name: w3wp.exe, Account name: NT AUTHORITY\NETWORK SERVICE
    Exception information: Exception type: WebException, Exception message: The request failed with HTTP status 401: Unauthorized.
    Request information: Request URL: http://[ip]/[app_path], Request path: xxx, User host address: [ip], User: xxx, Is authenticated: True, Authentication Type: Negotiate, Thread account name: xxx
    Thread information: Thread ID: 6, Thread account name: xxx, Is impersonating: False
    Stack trace:
    at System.Web.Services.Protocols.SoapHttpClientProtocol.ReadResponse(SoapClientMessage message, WebResponse response, Stream responseStream, Boolean asyncCall)
    at System.Web.Services.Protocols.SoapHttpClientProtocol.Invoke(String methodName, Object[] parameters)
    at wsVulnerabilityAdvisory.Service.test()
    at test.Page_Load(Object sender, EventArgs e)
    at System.Web.Util.CalliHelper.EventArgFunctionCaller(IntPtr fp, Object o, Object t, EventArgs e)
    at System.Web.Util.CalliEventHandlerDelegateProxy.Callback(Object sender, EventArgs e)
    at System.Web.UI.Control.OnLoad(EventArgs e)
    at System.Web.UI.Control.LoadRecursive()
    at System.Web.UI.Page.ProcessRequestMain(Boolean includeStagesBeforeAsyncPoint, Boolean includeStagesAfterAsyncPoint)
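    One thing worth ruling out (a sketch of a common cause, not a confirmed diagnosis): a SoapHttpClientProtocol proxy sends no credentials by default, so if the service's virtual directory ever falls back to Windows authentication, calls fail with exactly this kind of intermittent 401. Explicitly flowing credentials on the generated proxy - the proxy type below is the one from the stack trace - costs nothing to try:

    using System.Net;

    // Sketch: attach the calling identity (NETWORK SERVICE in this log),
    // or explicit credentials, to the generated ASMX proxy before calling it.
    var proxy = new wsVulnerabilityAdvisory.Service();
    proxy.Credentials = CredentialCache.DefaultCredentials;
    proxy.test();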

    Read the article

  • Is there a way to identify which web framework is used in a site?

    - by ragu.pattabi
    Whenever I come across a cute website, I am always curious to know which web framework its developers used. Being a novice in web development, a look at the page source doesn't give me any clue. Is there any way to get this information? If possible, maybe with a bit of Ruby magic, I could figure out things like which is the most/least used framework for my favorite sites, audio/video-heavy sites, etc.
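    As a starting point (no guarantee - well-configured sites strip these headers), response headers and cookie names often leak the stack. A sketch in C# to match the rest of this page, though the same idea is a few lines in Ruby:

    using System;
    using System.Net;

    class FrameworkSniffer
    {
        static void Main()
        {
            // Headers such as X-Powered-By, X-AspNet-Version, or Server often
            // reveal the framework; cookie names (JSESSIONID, ASP.NET_SessionId)
            // are another giveaway.
            var request = (HttpWebRequest)WebRequest.Create("http://example.com/");
            using (var response = (HttpWebResponse)request.GetResponse())
            {
                foreach (string key in response.Headers.AllKeys)
                    Console.WriteLine("{0}: {1}", key, response.Headers[key]);
            }
        }
    }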

    Read the article

  • Why would one use REST instead of Web services?

    - by AngryHacker
    I attended an interesting demo on REST today; however, I couldn't think of a single reason (nor was one presented) why REST is in any way better or simpler to use and implement than a Web Services stack. What are some of the reasons why anyone in the "real world" would use REST instead of Web Services?
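    For concreteness, part of the usual "real world" answer is visible in how little client code a REST call needs - it is just an HTTP request, with no WSDL, proxy generation, or SOAP envelope. A sketch against a hypothetical endpoint:

    using System;
    using System.Net;

    class RestCall
    {
        static void Main()
        {
            using (var client = new WebClient())
            {
                // Hypothetical REST endpoint; the URL itself names the resource.
                string json = client.DownloadString("http://example.com/api/orders/42");
                Console.WriteLine(json);
            }
        }
    }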

    Read the article

  • How do I manage the technical debate over WCF vs. Web API?

    - by Saeed Neamati
    I'm managing a team of about 15 developers now, and we are stuck choosing a technology: the team has split into two completely opposed camps debating the use of WCF vs. Web API. Team A, which supports using Web API, puts forward these reasons:
    - Web API is just the modern way of writing services (Wikipedia)
    - WCF is overhead for HTTP; it's a solution for TCP, named pipes, and other protocols
    - WCF models are not POCO, because of the [DataContract] and [DataMember] attributes
    - SOAP is not as readable and handy as JSON
    - SOAP is network overhead compared to JSON (transported over HTTP)
    - No method overloading
    Team B, which supports using WCF, says:
    - WCF supports multiple protocols (via configuration)
    - WCF supports distributed transactions
    - Many good examples and success stories exist for WCF (while Web API is still young)
    - Duplex is excellent for two-way communication
    This debate is continuing, and I don't know what to do now. Personally, I think we should use a tool only in its right place; in other words, we'd do better to use Web API to expose a service over HTTP, but use WCF when it comes to TCP and duplex. Searching the Internet doesn't yield a solid result: many posts support WCF, but we also find people complaining about it. I know the nature of this question might sound arguable, but we need some good hints to decide. We're stuck at a point where choosing a technology by chance might make us regret it later; we want to choose with open eyes. Our usage would be mostly for the web, and we would expose our services over HTTP. In some cases (say 5 to 10 percent) we might need distributed transactions, though. What should I do now? How do I manage this debate in a constructive way?
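    For readers weighing the same choice, a minimal hypothetical sketch of the two programming models side by side - Web API treats HTTP itself as the model, while the WCF contract is transport-neutral and bound in configuration:

    using System.ServiceModel;
    using System.Web.Http;

    // ASP.NET Web API: HTTP verbs map to methods by convention
    // (GET /api/orders/42 routes to Get(42)).
    public class OrdersController : ApiController
    {
        public string Get(int id) { return "order " + id; }
    }

    // WCF: one contract, many transports (HTTP, TCP, named pipes),
    // plus features like distributed transactions, all chosen in config.
    [ServiceContract]
    public interface IOrderService
    {
        [OperationContract]
        string GetOrder(int id);
    }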

    Read the article

  • How do web servers enforce the same-origin policy?

    - by BBnyc
    I'm diving deeper into developing RESTful APIs and have so far worked with a few different frameworks to achieve this. Of course I've run into the same-origin policy, and now I'm wondering how web servers (rather than web browsers) enforce it. From what I understand, some enforcement seems to happen on the browser's end (e.g., honoring an Access-Control-Allow-Origin header received from a server). But what about the server? For example, let's say a web server hosts a JavaScript web app that accesses an API, also hosted on that server. I assume that server would enforce the same-origin policy - so that only the JavaScript hosted on that server would be allowed to access the API. This would prevent someone else from writing a JavaScript client for that API and hosting it on another site, right? So how would a web server be able to stop a malicious client that makes AJAX requests to its API endpoints while claiming to run JavaScript that originated from that same web server? How do the most popular servers (Apache, nginx) protect against this kind of attack? Or is my understanding somehow off the mark? Or is the cross-origin policy only enforced on the client end?
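    For context on the header mentioned above, a short sketch of the server's actual role (shown in ASP.NET to match the rest of this page; the origin below is hypothetical). The server only declares a policy; the browser enforces it, and a non-browser client is free to ignore the header entirely:

    using System;
    using System.Web;

    // Global.asax code-behind sketch: the server merely *declares* which
    // origins browsers should allow; enforcement happens in the browser.
    public class Global : HttpApplication
    {
        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            Response.AddHeader("Access-Control-Allow-Origin",
                               "https://trusted.example.com"); // hypothetical origin
        }
    }

    This is why same-origin is a browser-side protection for users, not an authentication mechanism for APIs: a server that needs to reject foreign clients must authenticate requests, not rely on origin checks.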

    Read the article

  • Question about the evolution of interaction paradigm between web server program and content provider program?

    - by smwikipedia
    Hi experts, In my opinion, a web server is responsible for delivering content to the client. If it is static content, like pictures and static HTML documents, the web server just delivers it directly as a byte stream. If it is dynamic content generated while processing the client's request, the web server does not generate the content itself but calls some external program to generate it. AFAIK, these kinds of dynamic content generation technologies include the following: CGI, ISAPI, ... And from here, I noticed that "...In IIS 7, modules replace ISAPI filters..." Are there any others? Could anyone help me complete the above list and elaborate on, or show some links about, their evolution? I think it would be very helpful for understanding applications such as IIS, Tomcat, and Apache. I once wrote a small CGI program, and though it serves as a content generator, it is still nothing but a normal standalone program. I call it normal because the CGI program has a main() entry point. But with recent technologies like ASP.NET, I am not writing a complete program, only a class library. Why did such a radical change happen? Many thanks.
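    To make the contrast with ASP.NET concrete, here is a toy CGI-style program (a sketch; a real server sets many more environment variables and streams the request body on stdin):

    using System;

    // A CGI program is an ordinary executable: the web server launches it
    // per request, passes the request via environment variables and stdin,
    // and relays whatever it writes to stdout back to the client.
    class HelloCgi
    {
        static void Main()
        {
            string method = Environment.GetEnvironmentVariable("REQUEST_METHOD") ?? "unknown";
            Console.Write("Content-Type: text/html\r\n\r\n");
            Console.Write("<html><body>Hello from CGI via " + method + "</body></html>");
        }
    }

    The per-request process launch is exactly why later models (ISAPI, servlet containers, ASP.NET) moved to class libraries loaded into a long-lived server process: the framework owns main() and calls your code instead.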

    Read the article

< Previous Page | 29 30 31 32 33 34 35 36 37 38 39 40  | Next Page >