Search Results

Search found 26165 results on 1047 pages for 'iis7 failed request'.


  • Windows 8 / IIS 8 Concurrent Requests Limit

    - by OWScott
    IIS 8 on Windows Server 2012 doesn't have any fixed concurrent request limit, apart from whatever limit is reached when resources are maxed out. However, the client version of IIS 8, which ships with Windows 8, does have a concurrent connection request limit, to discourage high-traffic production use on a client edition of Windows. Starting with IIS 7 (Windows Vista), the behavior changed from previous versions. In earlier client versions of IIS, excess requests would throw a 403.9 error message (Access Forbidden: Too many users are connected.). Instead, Windows Vista, 7 and 8 queue excess requests so that they are handled gracefully, although there is a maximum number of requests that will be processed simultaneously. Thomas Deml provided a concurrent request chart for Windows Vista many years ago, but I have been unable to find an equivalent chart for Windows 8, so I asked Wade Hilmo from the IIS team what the limits are. Since this is controlled not by the IIS team itself but by the Windows licensing team, he asked around and found the authoritative answer, which I provide below.

    Windows 8 – IIS 8:
      Windows 8: 3
      Windows 8 Professional: 10
      Windows RT: N/A (IIS does not run on Windows RT)

    Windows 7 – IIS 7.5:
      Windows 7 Home Starter: 1
      Windows 7 Basic: 1
      Windows 7 Premium: 3
      Windows 7 Ultimate, Professional, Enterprise: 10

    Windows Vista – IIS 7:
      Windows Vista Home Basic (IIS process activation and HTTP processing only): 3
      Windows Vista Home Premium: 3
      Windows Vista Ultimate, Professional: 10

    Windows Server 2003, Windows Server 2008, Windows Server 2008 R2 and Windows Server 2012 allow an unlimited number of simultaneous requests.
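
    To see the queuing behavior for yourself, here is a quick client-side probe; a minimal sketch, assuming .NET 4.5's HttpClient, with a placeholder URL and request count. Firing more simultaneous requests at a client-edition box than its limit allows should show responses completing in waves as the queue drains (use a test page that sleeps for a few seconds so the waves are visible):

        using System;
        using System.Diagnostics;
        using System.Net.Http;
        using System.Threading.Tasks;

        class ConcurrencyProbe
        {
            static void Main()
            {
                // Placeholder URL; point at a slow page on the client-edition IIS box.
                const string url = "http://win8-box/slow-page.aspx";
                var client = new HttpClient();
                var clock = Stopwatch.StartNew();

                // 20 simultaneous requests against a limit of 3 or 10 should
                // complete in batches rather than all at once.
                var tasks = new Task[20];
                for (int i = 0; i < tasks.Length; i++)
                {
                    int n = i;
                    tasks[n] = client.GetAsync(url).ContinueWith(
                        t => Console.WriteLine("request {0} done at {1} ms",
                                               n, clock.ElapsedMilliseconds));
                }
                Task.WaitAll(tasks);
            }
        }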

    Read the article

  • Failed to install GNOME3 with 404 error

    - by Neon
    I followed the instructions here; however, I got a 404 Not Found error when checking for updates:

      W: Failed to fetch http://ppa.launchpad.net/ubuntugnometeam/gnome3/ubuntu/dists/natty/main/source/Sources  404 Not Found
      W: Failed to fetch http://ppa.launchpad.net/ubuntugnometeam/gnome3/ubuntu/dists/natty/main/binary-i386/Packages  404 Not Found
      E: Some index files failed to download. They have been ignored, or old ones used instead.

    Thus I couldn't even start the installation. Does anybody know how to perform a full installation?

    Read the article

  • Using Performance Monitor To Get IIS7 Response Turnaround Time

    - by alphadogg
    I have an MVC2 web application on W2K8R2/IIS7 that I'd like to benchmark/monitor. Some XHR requests made by a browser-based client are suddenly taking 8-10 seconds when they used to take much less time (per the Chrome Developer Tools timings). The underlying SQL Server query, run with the same parameters, takes 1.4s according to the total execution time in the client statistics from SSMS. I'm assuming there are various counters that can dissect how much time is taken, spent waiting, and spent processing between IIS7 itself and the web application? For example, can I check how long it takes to get a response from the IIS7 app and from the DB, and how long IIS7 itself takes to serve the response?
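
    As a starting point (a sketch, not a definitive counter list), the ASP.NET performance counters can be read programmatically as well as in perfmon, assuming the standard "ASP.NET" counter category is available on the box. "Request Execution Time" and "Requests Queued" give a rough split between time the app spends processing and time requests spend waiting in IIS:

        using System;
        using System.Diagnostics;

        class AspNetCounters
        {
            static void Main()
            {
                // Milliseconds the most recently completed request took to execute.
                var execTime = new PerformanceCounter("ASP.NET", "Request Execution Time");
                // Requests waiting in the queue before ASP.NET picks them up.
                var queued = new PerformanceCounter("ASP.NET", "Requests Queued");

                Console.WriteLine("Last request execution time: {0} ms", execTime.NextValue());
                Console.WriteLine("Requests queued: {0}", queued.NextValue());
            }
        }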

    Read the article

  • The Primary Cause of Failed IT Projects

    - by Paul Nielsen
    During my career I've been a part of dozens of projects. Some I was on from the start; most I came in to help bail out. Some went smoothly and were a pleasure to build and maintain, and some projects failed (failed being broadly defined as projects that were not completed, or were completed but were a horrid mess: very complex, impossible to maintain or refactor, and a royal pain to keep running). While there are a number of factors that can contribute to a failed project, in my career it seems the primary...(read more)

    Read the article

  • Differences when Running with OutputCache managed module under ASP.NET IIS7.x with Cache-control header

    - by Shawn Cicoria
    This post reports some differences observed when using MVC or IHttpHandlers if you're attempting to set the Cache-Control: max-age or s-maxage value under IIS7.x using the HttpResponse.Cache methods.

    [UPDATE 2011-3-14]: The missing piece was calling Response.Cache.SetSlidingExpiration(true), as follows:

        context.Response.Cache.SetCacheability(HttpCacheability.Public);
        context.Response.Cache.SetMaxAge(TimeSpan.FromMinutes(5));
        context.Response.ContentType = "image/jpeg";
        context.Response.Cache.SetSlidingExpiration(true);

    Under IIS7.x, if you use one of the following two methods, you will only get a cacheability of "public".

    Method 1 – an MVC action:

        public ActionResult Image2()
        {
            MemoryStream oStream = new MemoryStream();
            using (Bitmap obmp = ImageUtil.RenderImage("Respone.Cache.Setxx calls", 5, DateTime.Now))
            {
                obmp.Save(oStream, ImageFormat.Jpeg);
                oStream.Position = 0;
                Response.Cache.SetCacheability(HttpCacheability.Public);
                Response.Cache.SetMaxAge(TimeSpan.FromMinutes(5));
                return new FileStreamResult(oStream, "image/jpeg");
            }
        }

    Method 2 – a plain old HttpHandler, which really isn't MVC3, but it gives the same result under the same MVC ASP.NET application:

        public class image : IHttpHandler
        {
            public void ProcessRequest(HttpContext context)
            {
                using (var image = ImageUtil.RenderImage("called from IHttpHandler direct", 5, DateTime.Now))
                {
                    context.Response.Cache.SetCacheability(HttpCacheability.Public);
                    context.Response.Cache.SetMaxAge(TimeSpan.FromMinutes(5));
                    context.Response.ContentType = "image/jpeg";
                    image.Save(context.Response.OutputStream, ImageFormat.Jpeg);
                }
            }
        }

    Using the following under MVC3 (I haven't tried earlier versions) will work, by applying the OutputCacheAttribute to your action:

        [OutputCache(Location = OutputCacheLocation.Any, Duration = 300)]
        public ActionResult Image1()
        {
            MemoryStream oStream = new MemoryStream();
            using (Bitmap obmp = ImageUtil.RenderImage("called with OutputCacheAttribute", 5, DateTime.Now))
            {
                obmp.Save(oStream, ImageFormat.Jpeg);
                oStream.Position = 0;
                return new FileStreamResult(oStream, "image/jpeg");
            }
        }

    To remove the "OutputCache" module, use the following in your web.config:

        <system.webServer>
            <validation validateIntegratedModeConfiguration="false"/>
            <modules runAllManagedModulesForAllRequests="true">
                <!--<remove name="OutputCache"/>-->
            </modules>
        </system.webServer>

    Read the article

  • HP Smart Array; Is it possible to convert RAID1 to RAID0 by dropping a failed drive?

    - by Erik Heppler
    I have a server that was running two 60GB drives as a logical RAID1. At some point the second drive was physically removed and the logical drive has been in "Interim Recovery Mode" for several months now. There's no need for the redundancy of RAID1 on this machine, and I have no intention of replacing the missing drive. If possible I would like to convert the current RAID1 to a single-drive RAID0 by simply dropping the failed drive from the current configuration. I'm only interested in doing this if it can be done in-place. Otherwise I'm perfectly content to leave it in "Interim Recovery Mode" indefinitely.

    Read the article

  • Cannot update 12.04 due to "Failed to fetch" warning

    - by harsha
    I can't update my Ubuntu 12.04. I tried from the terminal, from Update Manager, and from the Synaptic package manager. The error is:

      W: Failed to fetch http://in.archive.ubuntu.com/ubuntu/dists/precise-backports/universe/i18n/Translation-en_IN  Unable to connect to 172.20.0.100:8080:
      W: Failed to fetch http://in.archive.ubuntu.com/ubuntu/dists/precise-backports/universe/i18n/Translation-en  Unable to connect to 172.20.0.100:8080:
      E: Some index files failed to download. They have been ignored, or old ones used instead.

    Read the article

  • Failed to download repository information

    - by Bob Van Elst
    This error message came up when I clicked Check, but it does not come up when Update Manager starts automatically; it only appears when I open Update Manager myself. Any ideas on how to fix it? Details:

      W: GPG error: http://ppa.launchpad.net precise Release: The following signatures couldn't be verified because the public key is not available: NO_PUBKEY FC8CA6FE7B1FEC7C
      W: Failed to fetch http://ppa.launchpad.net/jonabeck/ppa/ubuntu/dists/precise/main/binary-amd64/Packages  404 Not Found
      W: Failed to fetch http://ppa.launchpad.net/jonabeck/ppa/ubuntu/dists/precise/main/binary-i386/Packages  404 Not Found
      W: Failed to fetch http://ppa.launchpad.net/jonabeck/ppa/ubuntu/dists/precise/main/source/Sources  404 Not Found
      E: Some index files failed to download. They have been ignored, or old ones used instead.
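
    The usual fix for the NO_PUBKEY warning (a hedged suggestion, not from the original post) is to import the missing key from the Ubuntu keyserver; the 404s suggest the jonabeck PPA no longer publishes packages for this release, so that source may simply need to be disabled in Software Sources:

        sudo apt-key adv --keyserver keyserver.ubuntu.com --recv-keys FC8CA6FE7B1FEC7C
        sudo apt-get update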

    Read the article

  • Failed to upgrade to Ubuntu 11.10

    - by Gigili
    This error prevents the system from upgrading to a newer version of Ubuntu. What is causing it?

      W: Failed to fetch http://extras.ubuntu.com/ubuntu/dists/natty/main/source/Sources  404 Not Found
      W: Failed to fetch http://extras.ubuntu.com/ubuntu/dists/natty/main/binary-amd64/Packages  404 Not Found
      E: Some index files failed to download. They have been ignored, or old ones used instead.

    Since I got the warning message that this release is not supported anymore, should I download and install Ubuntu 12.10 directly from Ubuntu's site instead?

    Read the article

  • The Mysterious ARR Server Farm to URL Rewrite link

    - by OWScott
    Application Request Routing (ARR) is a reverse proxy plug-in for IIS7+ that does many things, including functioning as a load balancer. For this post, I'm assuming that you already have an understanding of ARR. Today I wanted to find out how the mysterious link between ARR and URL Rewrite is maintained. Let me explain…

    ARR is unique in that it doesn't work by itself. It sits on top of IIS7 and uses URL Rewrite; as a result, ARR depends on URL Rewrite to 'catch' the traffic and redirect it to an ARR server farm. As the last step of creating a new server farm, ARR will prompt you to have it create the corresponding URL Rewrite rule. If you accept the prompt, it creates the rule for you; if you say 'No', you're on your own to create the URL Rewrite rule. When you say 'Yes', the server farm's "Use URL Rewrite to inspect incoming requests" checkbox will be checked.

    However, I'm not a fan of this auto-rule. The problem is that if I make any changes to the URL Rewrite rule, which I always do, and then make the wrong change in ARR, it will blow away my settings. So I prefer to create my own rule and manage it myself.

    Since I had some old rules that were managed by ARR, I wanted to update them so that they were no longer managed that way. I took a look at the config in applicationHost.config to try to find out what property binds the two together. I assumed that there would be a property on the server farm called something like urlRewriteRuleName that would serve as the link between ARR and URL Rewrite. I found no such property. After a bit of testing, I found that the name of the URL Rewrite rule is the only link between ARR and URL Rewrite. I wouldn't have guessed. The URL Rewrite rule needs to be named exactly ARR_{ServerFarm Name}_loadBalance, although it's not case sensitive.

    As soon as I renamed the auto-created rule to anything else, for example "site.com ARR Binding", the link between ARR and URL Rewrite was broken. To be certain of the relationship, I renamed it back again, and sure enough, the relationship was reestablished.

    Why is this important? It's only important if you want to decouple ARR from the URL Rewrite rule, but if you want to do so, the best way is to rename the URL Rewrite rule. If you uncheck the "Use URL Rewrite to inspect incoming requests" checkbox, it will delete your rule without prompting.

    Conclusion: the mysterious link between ARR and URL Rewrite exists only through the rule name. If you want to break the link, simply rename the URL Rewrite rule. It's completely safe to do so, and, in my opinion, this is a rule you should manage yourself anyway.
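
    For reference, the auto-created rule in applicationHost.config looks roughly like the following; a sketch, assuming a server farm named myFarm, and the exact attributes may differ by ARR version. The rule name is the entire link:

        <rewrite>
            <globalRules>
                <rule name="ARR_myFarm_loadBalance" patternSyntax="Wildcard" stopProcessing="true">
                    <match url="*" />
                    <action type="Rewrite" url="http://myFarm/{R:0}" />
                </rule>
            </globalRules>
        </rewrite>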

    Read the article

  • How do I get meaningful error messages in IIS7?

    - by Petras
    I have a classic ASP website that is crashing in IIS7. It is crashing because IIS doesn't allow file uploads greater than a certain size; I know this because files below about 200k work fine. I removed the Status Code 500 error page in IIS, but I still don't get the file name and the line where my code failed, as I do when running locally. Instead I get: "The page cannot be displayed because an internal server error has occurred. If you are the system administrator please click here to find out more about this error." How do I get the file name and the line where my code failed? Here are my IIS settings: [screenshot of IIS settings]
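
    For what it's worth, classic ASP's default request-entity limit is 200,000 bytes, which lines up with the ~200k threshold above. A hedged sketch of the server-level (applicationHost.config) settings that govern the upload limit and detailed error reporting; the values shown are illustrative, not a confirmed fix for this site:

        <system.webServer>
            <!-- Send real ASP error details (file and line) to the browser. -->
            <httpErrors errorMode="Detailed" />
            <asp scriptErrorSentToBrowser="true">
                <!-- Default is 200000 bytes (~200k); raise it to allow bigger uploads. -->
                <limits maxRequestEntityAllowed="10485760" />
            </asp>
        </system.webServer>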

    Read the article

  • Creating a HTTP handler for IIS that transparently forwards request to different port?

    - by Lasse V. Karlsen
    I have a public web server with the following software installed: IIS7 on port 80, Subversion over Apache on port 81, and TeamCity over Apache on port 82. Unfortunately, both Subversion and TeamCity come with their own web server installations, and they work flawlessly, so I don't really want to try to move them all to run under IIS, if that is even possible. However, I was looking at IIS and I noticed the HTTP redirect part, and I was wondering... Would it be possible for me to create an HTTP handler and install it on a sub-domain under IIS7, so that all requests to, say, http://svn.vkarlsen.no/anything/here are passed to my HTTP handler, which then subsequently creates a request to http://localhost:81/anything/here, retrieves the data, and passes it on to the original requester? In other words, I would like IIS to handle transparent forwards to ports 81 and 82, without using the redirection features. For instance, Subversion doesn't like an HTTP redirect and just says that the repository has been moved and I need to relocate my working copy. That's not what I want. If anyone thinks this can be done, does anyone have any links to topics I need to read up on? I think I can manage the actual request parts, even with authentication, but I have no idea how to create an HTTP handler. Also bear in mind that I need to handle sub-paths and documents beneath the top-level domain, so http://svn.vkarlsen.no/whatever/here needs to be handled by a single handler; I cannot create copies of the handler for all sub-directories, since paths are created from time to time.
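
    For illustration, a minimal pass-through IHttpHandler along these lines; a sketch, assuming a GET-only backend on localhost:81, and ignoring request bodies, most headers, and authentication, all of which a real reverse proxy would also have to forward:

        using System;
        using System.IO;
        using System.Net;
        using System.Web;

        public class ForwardingHandler : IHttpHandler
        {
            public bool IsReusable { get { return true; } }

            public void ProcessRequest(HttpContext context)
            {
                // Rebuild the URL against the backend port, keeping path and query.
                string target = "http://localhost:81" + context.Request.RawUrl;
                var forward = (HttpWebRequest)WebRequest.Create(target);
                forward.Method = context.Request.HttpMethod;

                // Note: GetResponse() throws WebException on 4xx/5xx responses;
                // a real proxy must catch that and relay the error to the client.
                using (var backend = (HttpWebResponse)forward.GetResponse())
                using (Stream body = backend.GetResponseStream())
                {
                    context.Response.StatusCode = (int)backend.StatusCode;
                    context.Response.ContentType = backend.ContentType;
                    body.CopyTo(context.Response.OutputStream); // .NET 4+
                }
            }
        }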

    Read the article

  • ASP.NET Performance tip- Combine multiple script file into one request with script manager

    - by Jalpesh P. Vadgama
    We all need JavaScript for our web applications, and we store our JavaScript code in .js files. If we have more than one .js file, the browser will create a new request for each one, which is a little overhead in terms of performance; if you have a very big enterprise application, you will have that much more overhead. The ASP.NET ScriptManager provides a feature to combine multiple JavaScript files into one request, but remember that this feature is only available with .NET Framework 3.5 SP1 or higher.

    Let's take a simple example. I have two JavaScript files, JScript1.js and JScript2.js, each containing its own function.

        // JScript1.js
        function Task1() {
            alert('task1');
        }

    Here is the other file:

        // JScript2.js
        function Task2() {
            alert('task2');
        }

    Now I add script references to the ScriptManager and call these functions in my code like this:

        <form id="form1" runat="server">
            <asp:ScriptManager ID="myScriptManager" runat="server">
                <Scripts>
                    <asp:ScriptReference Path="~/JScript1.js" />
                    <asp:ScriptReference Path="~/JScript2.js" />
                </Scripts>
            </asp:ScriptManager>
            <script language="javascript" type="text/javascript">
                Task1();
                Task2();
            </script>
        </form>

    Now let's test in Firefox with the Lori plug-in, which shows how many requests are made for the page: there are 5 requests. Now let's do the same thing with the ScriptManager's combined-script feature, like the following:

        <form id="form1" runat="server">
            <asp:ScriptManager ID="myScriptManager" runat="server">
                <CompositeScript>
                    <Scripts>
                        <asp:ScriptReference Path="~/JScript1.js" />
                        <asp:ScriptReference Path="~/JScript2.js" />
                    </Scripts>
                </CompositeScript>
            </asp:ScriptManager>
            <script language="javascript" type="text/javascript">
                Task1();
                Task2();
            </script>
        </form>

    Now let's run it and see how many requests there are: only 4 requests, compared to 5 earlier, because the ScriptManager combined the two script files into one request. So if you have lots of JavaScript files, you can cut load time by combining multiple script files into one request. Hope you liked it. Stay tuned for more!!!.. Happy programming..

    Technorati Tags: ASP.NET, ScriptManager, Microsoft Ajax

    Read the article

  • Is it possible to tell IIS 7 to process the request queue in parallel?

    - by Uwe Keim
    Currently we are developing an ASMX, ASP.NET 2.0, IIS 7 web service that does some calculations (and returns a dynamically generated document) and takes approx. 60 seconds to run. Since we have a big machine with multiple cores and lots of RAM, I expected IIS to try its best to route the requests that arrive in its request queue to all available threads of the app pool's thread pool. But we experience quite the opposite: when we issue requests to the ASMX web service URL from multiple different clients, IIS seems to process these requests serially. I.e., request 1 arrives and is processed, then request 2 is processed, then request 3, etc. Question: Is it possible (without changing the C# code of the web service) to configure IIS to process requests in parallel, if enough threads are available? If yes, how should I do it? If no, any workarounds/tips? Thanks, Uwe
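
    One classic serializer of ASP.NET requests, offered here as a hedged guess rather than a diagnosis, is read/write session state: ASP.NET takes an exclusive lock on the session for each request, so requests sharing a session run one at a time. If the service methods have session enabled, a sketch of what to check:

        // Hypothetical ASMX method. EnableSession = true would force ASP.NET to
        // serialize all requests that share a session cookie, because the session
        // is locked exclusively for the duration of each request.
        [WebMethod(EnableSession = false)]
        public string Calculate()
        {
            // ... roughly 60 seconds of work ...
            return "done";
        }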

    Read the article

  • SSL setup with GoDaddy subdomains and EC2 servers

    - by Kevin
    We have two EC2 instances that are used to host various scripts. Our main page, companyname.com, is hosted with GoDaddy but is unrelated to those EC2 instances. I need to set up SSL connections for the two EC2 micro instances, one running the Linux AMI and the other running Windows Server. I purchased two single-domain Comodo certificates and am at the part where I generate CSRs on the instances. I'm not sure what to put as "Server Name" on EC2. I would like each server to be accessible through a subdomain, which I have forwarded on GoDaddy to the elastic IPs on EC2. For the server name, do I use the elastic IP, the EC2 public DNS, or the subdomain that I want? And which of these do I then place in my VirtualHosts file on Apache? The Windows instance is running IIS7, but the Apache box is the priority.
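
    For the Linux/Apache box, a hedged sketch of the CSR step with OpenSSL: the convention is that the Common Name (the "Server Name" the CSR prompt asks for) is the exact hostname visitors will use, i.e. the forwarded subdomain rather than the elastic IP or the EC2 public DNS name. The hostname below is hypothetical:

        # Generate a private key and a CSR; at the Common Name prompt, enter
        # the public hostname the certificate must match, e.g. sub.companyname.com.
        openssl req -new -newkey rsa:2048 -nodes \
            -keyout sub.companyname.com.key \
            -out sub.companyname.com.csr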

    Read the article

  • Please explain some of the features of URL Rewrite module for a newbie

    - by kunjaan
    I am learning to use the IIS URL Rewrite module, and some of the "features" listed on its page are confusing me. It would be great if somebody could explain them and give a first-hand account of when you would use each one. Thanks a lot!

      - Rewriting within the content of specific HTML tags
      - Access to server variables and HTTP headers
      - Rewriting of server variables and HTTP request headers (what are the "server variables", and when would you define or redefine them?)
      - Rewriting of HTTP response headers
      - HtmlEncode function (why would you use HtmlEncode on the server?)
      - Reverse proxy rule template
      - Support for IIS kernel-mode and user-mode output caching
      - Failed Request Tracing support
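
    As one illustration of the server-variables feature (a sketch, not taken from the question): a rule can set a server variable such as HTTP_X_ORIGINAL_HOST before rewriting, so a back-end application can still see the originally requested host. Any variable set this way must also be whitelisted under allowedServerVariables:

        <rewrite>
            <allowedServerVariables>
                <add name="HTTP_X_ORIGINAL_HOST" />
            </allowedServerVariables>
            <rules>
                <rule name="Preserve original host">
                    <match url=".*" />
                    <serverVariables>
                        <set name="HTTP_X_ORIGINAL_HOST" value="{HTTP_HOST}" />
                    </serverVariables>
                    <action type="Rewrite" url="http://backend/{R:0}" />
                </rule>
            </rules>
        </rewrite>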

    Read the article

  • Active Directory auto login to website for domain users

    - by Darkcat Studios
    I am putting together an intranet for a company. I have set up authentication to get into the intranet from a login box linked to AD via LDAP. However, the client wants (if possible) to have users automatically authenticated into the intranet if they are logged into the domain. AD and IIS 7.5 are on separate servers (in the same network). I believe that I need to use Windows Authentication to do this, but will that work, given that the web server is not part of the domain? Do I need to tell IIS where the AD server is? The next part could be more complex: once the user has authenticated, I need to pull details about the user from AD, I guess with LDAP; however, I will need to know the user's username in order to do this, won't I? As the user hasn't had to type it in, how do I get it? The intranet site is in ASP.NET 4 VB.
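
    A hedged sketch of the second part, assuming Windows Authentication is working so the request arrives with a domain identity: User.Identity.Name supplies the username, and System.DirectoryServices can look the account up in AD. The LDAP path is a placeholder, and since the site is VB this C# would need translating:

        using System;
        using System.DirectoryServices;
        using System.Web;

        public static class AdLookup
        {
            // Look up the authenticated user's display name in AD.
            public static void PrintUserDetails(HttpContext context)
            {
                // With Windows Authentication, Identity.Name is "DOMAIN\user".
                string logon = context.User.Identity.Name;
                string account = logon.Substring(logon.IndexOf('\\') + 1);

                // Placeholder LDAP path; point this at your domain controller.
                using (var root = new DirectoryEntry("LDAP://dc.example.local"))
                using (var searcher = new DirectorySearcher(root))
                {
                    searcher.Filter = "(sAMAccountName=" + account + ")";
                    SearchResult result = searcher.FindOne();
                    if (result != null && result.Properties["displayName"].Count > 0)
                    {
                        Console.WriteLine(result.Properties["displayName"][0]);
                    }
                }
            }
        }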

    Read the article

  • Links in my site have been hacked

    - by Funky
    In my site I prefix the images and links with the domain of the site for better SEO, using the code below:

        public static string GetHTTPHost()
        {
            string host = "";
            if (HttpContext.Current.Request["HTTP_HOST"] != null)
                host = HttpContext.Current.Request["HTTP_HOST"];
            if (host == "site.co.uk" || host == "site.com")
            {
                return "http://www." + host;
            }
            return "http://" + host;
        }

    This works great, but for some reason lots of links have now changed to http://www.baidu.com/... There is no sign of this anywhere in the code or the project, and the files on the server still have the change date from when I last published at 11 yesterday, so all the files on there look fine. I am using ASP.NET and Umbraco 4.7.2. Does anyone have any ideas? Thanks
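
    Worth noting as an observation, not something from the original post: Request["HTTP_HOST"] echoes the client-supplied Host header into every generated link, so a request with a forged Host header (especially one whose output gets cached) can rewrite links sitewide. A hedged sketch that whitelists the host instead of trusting the header:

        public static string GetHTTPHost()
        {
            // Treat the Host header as untrusted input: only ever emit
            // one of the known-good domains, never the raw header value.
            string host = HttpContext.Current.Request.Headers["Host"] ?? "";
            if (host.Equals("site.co.uk", StringComparison.OrdinalIgnoreCase) ||
                host.Equals("site.com", StringComparison.OrdinalIgnoreCase))
            {
                return "http://www." + host;
            }
            return "http://www.site.com"; // safe default instead of echoing input
        }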

    Read the article

  • IIS7 Integrated Pipeline - Response.End not ending the request.

    - by MikeGurtzweiler
    I have the following bit of code that worked as expected before we upgraded to the Integrated Pipeline in IIS7:

        public void RedirectPermanently(string url, bool clearCookies)
        {
            Response.ClearContent();
            Response.StatusCode = 301;
            Response.AppendHeader("Location", url);
            if (clearCookies)
            {
                Response.Cookies.Clear();
                Response.Flush();
                Response.End();
            }
        }

    Previously, when this method was executed with clearCookies true, the response would be sent to the client and request processing would end. Now, under the Integrated Pipeline, Response.End() does not seem to end processing; the page continues running as if the method was never called. The big question is: why, and what changed? Thanks.
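
    A commonly suggested alternative, hedged since the post doesn't confirm it: skip Response.End's thread-abort behavior and ask the pipeline to jump straight to EndRequest instead. Unlike Response.End, code after this call still runs, so return immediately afterwards:

        // Flush what has been written, then tell ASP.NET to skip the remaining
        // pipeline events for this request (no ThreadAbortException is thrown).
        Response.Flush();
        HttpContext.Current.ApplicationInstance.CompleteRequest();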

    Read the article

  • Debug .NET app with IIS7 and Delphi.NET - Use aspnet_regiis.exe to configure the local IIS web server

    - by webnoob
    Hi All, I have been trying to set up local debugging for my ASP.NET app in Delphi and am getting the error above. I have used the aspnet_regiis.exe tool with the following:

        Aspnet_regiis.exe -s W3SVC/1/ROOT/DevTest

    This added the site to IIS as an application, but I am still getting the error:

        The project cannot be debugged because virtual directory "DevTest" is not
        configured with ASP.NET version 2.0 or 3.0. Use aspnet_regiis.exe to
        configure the local IIS web server.

    I am not really sure where to go from here, so I need some help please. Machine specs: Windows 7 64-bit, IIS7, using RAD Studio 2007.
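
    One thing worth checking, an assumption based on the 64-bit OS rather than anything in the post: on x64 Windows there are separate 32-bit and 64-bit copies of aspnet_regiis.exe, and the registration must be run from the framework bitness the application pool actually uses. For example:

        REM 64-bit .NET 2.0 registration (adjust the version folder to your framework)
        %windir%\Microsoft.NET\Framework64\v2.0.50727\aspnet_regiis.exe -s W3SVC/1/ROOT/DevTest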

    Read the article

  • Anybody have any success getting around IIS7 issues with WiX 3.5?

    - by Will
    WiX 3.5 still has issues creating websites in IIS7. I can get around most of them, but I'm getting hosed by the inability to configure the website's authentication mode. Traditionally, you would just use WebDirProperties to, for instance, turn on Windows authentication:

        <iis:WebDirProperties Id="OMFG3.5BUGSUX" WindowsAuthentication="yes" />

    Well, this doesn't work. So now, once my nice lovely installer exits, you get a big fat screw-you-unauthorized-jerk message. Not exactly professional looking. Does anybody have any suggestions/tips on working around these shortcomings in WiX?
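
    For context, a minimal sketch of how the properties element is normally attached, assuming the WiX 3.x IIS extension; the Ids and names here are hypothetical. The complaint above is that the WindowsAuthentication setting has no effect even when referenced this way:

        <iis:WebSite Id="MyWebSite" Description="My Site"
                     Directory="INSTALLDIR" DirProperties="OMFG3.5BUGSUX">
            <iis:WebAddress Id="AllUnassigned" Port="80" />
        </iis:WebSite>
        <iis:WebDirProperties Id="OMFG3.5BUGSUX" WindowsAuthentication="yes"
                              AnonymousAccess="no" />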

    Read the article
