Search Results

Search found 57458 results on 2299 pages for 'http response codes'.

  • XAML Namespace http://schemas.microsoft.com/winfx/2006/xaml is not resolved

    - by Justin Poliey
    I'm using Visual Studio 2010 Express, working on a Silverlight 4 project in C#. This started happening all of a sudden in my project: I get the error that this XAML namespace is not resolved:

        XAML Namespace http://schemas.microsoft.com/winfx/2006/xaml is not resolved

    If it helps, here is the section of the XAML file in which the error is being raised:

        <ResourceDictionary
            xmlns:my="clr-namespace:System.Windows.Controls;assembly=System.Windows.Controls.Toolkit"
            xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
            xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
            xmlns:GetGlue="clr-namespace:GetGluePlugin;assembly=GetGluePlugin"
            xmlns:System="clr-namespace:System;assembly=mscorlib"
            xmlns:utils="clr-namespace:Seesmic.Sdp.Utils;assembly=Seesmic.Sdp.Utils">

    What could the problem be?

    Read the article

  • Problem adding a Web Reference (http://.../service.asmx)

    - by flopdix
    I am trying to add a Web Reference to an http://.../service.asmx file in my project. First it gives me an error: "... Unable to download following files from: http://.../service.asmx?wsdl Do you want to skip these files and continue ..." When I click 'Yes', the proxy gets created with .disco and other reference files, but does not add a .wsdl file. The other option I tried was using http://.../service.asmx?wsdl to add the web reference. In that case I don't get any error, and the proxy gets added with .wsdl and reference files, but it does not add a .disco file. Can someone help me understand why this is happening? I thought adding a web reference to .asmx should add everything under the proxy.

    Read the article

  • How to Create a VBA Add-In with Shared Code for All Excel Workbooks?

    - by StanFish
    I'm writing VBA code for multiple Excel spreadsheets, which will be shared with others from time to time. At some point I found there was a lot of duplication in my work, so I want a way to share code through some sort of Excel add-in, like an .xla file. But when I tried to save the Excel file containing the shared code as an .xla file, I ran into some problems: the file cannot be edited anymore after I save it in the default add-in folder, and if I move the .xls file to a folder other than the add-in folder and open it directly, I cannot use its classes, which creates problems for sharing the code. Any ideas on how to create add-ins in a flexible and powerful way? Thanks a lot for the help.

    Read the article

  • HTTP request stream not readable outside of request handler

    - by Jason Young
    I'm writing a fairly complicated multi-node proxy, and at one point I need to handle an HTTP request, but read from that request outside of the "http.Server" callback (I need to read from the request data and line it up with a different response at a different time). The problem is, the stream is no longer readable. Below is some simple code to reproduce the issue. Is this normal, or a bug?

        var http = require('http');

        function startServer() {
          http.Server(function (req, res) {
            req.pause();
            checkRequestReadable(req);
            setTimeout(function() { checkRequestReadable(req); }, 1000);
            setTimeout(function() { res.end(); }, 1100);
          }).listen(1337);
          console.log('Server running on port 1337');
        }

        function checkRequestReadable(req) {
          // The request is not readable here!
          console.log('Request readable? ' + req.readable);
        }

        startServer();
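
    A common workaround, sketched here on the assumption of a Node version that has Buffer.concat (0.8 and later): consume the stream inside the handler and buffer the chunks, so it is the data rather than the live stream that gets lined up with the later response:

        var http = require('http');

        http.Server(function (req, res) {
          var chunks = [];
          // Read while the stream is still readable, inside the handler.
          req.on('data', function (chunk) { chunks.push(chunk); });
          req.on('end', function () {
            var body = Buffer.concat(chunks);
            // 'body' stays available for matching with a response later on.
            setTimeout(function () {
              res.end('Received ' + body.length + ' bytes');
            }, 1000);
          });
        }).listen(1337);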

    Read the article

  • Parse raw HTTP Headers

    - by Cev
    I have a string of raw HTTP and I would like to represent the fields in an object. Is there any way to parse the individual headers from an HTTP string?

        'GET /search?sourceid=chrome&ie=UTF-8&q=ergterst HTTP/1.1\r\nHost: www.google.com\r\nConnection: keep-alive\r\nAccept: application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5\r\nUser-Agent: Mozilla/5.0 (Macintosh; U; Intel Mac OS X 10_6_6; en-US) AppleWebKit/534.13 (KHTML, like Gecko) Chrome/9.0.597.45 Safari/534.13\r\nAccept-Encoding: gzip,deflate,sdch\r\nAvail-Dictionary: GeNLY2f-\r\nAccept-Language: en-US,en;q=0.8\r\n [...]'
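
    For illustration, a minimal sketch of the splitting involved (JavaScript here, since the question names no language): the request line comes first, then one 'Name: value' pair per CRLF-separated line:

        function parseRawRequest(raw) {
          var lines = raw.split('\r\n');
          var requestLine = lines[0];   // e.g. 'GET /search?... HTTP/1.1'
          var headers = {};
          for (var i = 1; i < lines.length; i++) {
            var idx = lines[i].indexOf(':');
            if (idx === -1) continue;   // skips the blank line before any body
            headers[lines[i].slice(0, idx)] = lines[i].slice(idx + 1).trim();
          }
          return { requestLine: requestLine, headers: headers };
        }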

    Read the article

  • jQuery ajax multiline "script" response

    - by Rendrik
    I'm designing a template creation tool, which uses a jQuery Ajax request that posts parameters to a PHP file. The PHP does the actual generation of the template's HTML.

        // Send for processing. Expect JS back to execute.
        function generate() {
          $.ajax({
            type: "POST",
            url: "generate.php",
            data: $('#genform :input').serialize(),
            dataType: "script",
            beforeSend: function() {
              $("#loading").html("<img src='images/loadbar.gif' />");
              $("#loading").dialog({ height: 80, width: 256, autoOpen: true, modal: true });
            },
            success: function(data) {
              $("#loading").dialog('close');
            }
          });
        }

    My trouble is that I have the ajax dataType: set to "script". Using this, the PHP file generates some jQuery dialogs for any errors, which works nicely. However, after I generate the HTML, I'm having trouble passing it back. I have probably 100 lines of generated HTML and JavaScript which I'd like to work with. In the PHP file, I've tried:

        echo('$("#result").html("'.$html.'");');

    This does actually work if there are NO line breaks in $html. As soon as there are any line breaks, the Chrome debugger reports "gen.html:1 Uncaught SyntaxError: Unexpected token ILLEGAL". It's obvious that it's trying to eval the returned response, but is stopping at any line break. So, to be clear, when I pass $html back, if the contents are this:

        $html = "<div>hi there</div>";

    it works fine (all of my error message dialogs are one line). But if it's:

        $html = "<div>
        hi there
        </div>";

    it blows up. I'm really not sure how to get around this, or if there's a better way to go about it. It's important to me to keep the formatting so people can copy the HTML template. I may just break down and display the template file on the PHP page if I can't solve this, but I was really hoping to keep everything within the confines of the HTML page.
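
    One way around it (a sketch, not from the original post): return the generated markup as JSON rather than as a script, since json_encode escapes line breaks, and inject it in the success handler:

        // In generate.php, instead of echoing a script, return JSON, e.g.:
        //   echo json_encode(array('html' => $html));
        function generate() {
          $.ajax({
            type: "POST",
            url: "generate.php",
            data: $('#genform :input').serialize(),
            dataType: "json",                  // was "script"
            success: function(data) {
              $("#result").html(data.html);    // line breaks arrive intact
              $("#loading").dialog('close');
            }
          });
        }

    Error dialogs can still be driven from fields in the same JSON payload instead of from returned jQuery code.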

    Read the article

  • Handling multiple async HTTP requests in Silverlight serially

    - by Jeb
    Due to the async nature of HTTP access via WebClient or HttpWebRequest in Silverlight 4, when I want to do multiple HTTP gets/posts serially, I find myself writing code that looks like this:

        doFirstGet(someParams, () => {
          doSecondGet(someParams, () => {
            doThirdGet(...
          }
        });

    Or something similar. I'll end up nesting subsequent calls within callbacks, usually implemented using lambdas of some sort. Even if I break things out into Actions or separate methods, it still ends up being hard to read. Does anyone have a clean solution to executing multiple HTTP requests in SL 4 serially? I don't need things to be synchronous, I just want serial execution.

    Read the article

  • Deny http access to a directory, allow access from WordPress plugin

    - by luke
    Hey. I need to prevent direct access to http://www.site.com/wp-content/uploads/folder/something.pdf through the browser. However, the Download Monitor plugin I am using, which allows logged-in users to download the file, needs to keep working. Trying

        Order Allow,Deny
        Deny from all
        Allow from all

    the download links no longer work... even though (I think) they are links produced by the script, e.g. http://www.site.com/wp-content/plugins/download-monitor/download.php?id=something.pdf. Enter that in the address bar and you correctly get a WordPress message, 'You must be logged in to download this file.' However, if someone knows the URL where the file was uploaded, http://www.site.com/wp-content/uploads/folder/something.pdf, they can still access it directly. I don't know how (guesswork?) they would find the direct URL anyway, but the client wants it stopped! Thanks for any help.
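
    A sketch of the usual pattern, hedged: it assumes the plugin streams the file itself through PHP (e.g. via readfile()) rather than redirecting to the raw URL. Apache's access rules only govern HTTP requests, not PHP reading from disk, so blocking the uploads folder outright still lets download.php serve the file:

        # .htaccess inside wp-content/uploads/folder/
        # Blocks all direct HTTP requests; PHP can still read the file from disk.
        Order Allow,Deny
        Deny from all

    If the links still break with this in place, the plugin is likely redirecting to the direct URL instead of streaming the file.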

    Read the article

  • There was no endpoint listening at http://localhost:49481/Marketing.svc that could accept the message

    - by duckmike
    I have a Marketing service running locally through port 49481 and through IIS on port 65000. Either way, when I try to debug into that service, I get an error message:

        {"There was no endpoint listening at http://localhost:49481/Marketing.svc that could accept the message. This is often caused by an incorrect address or SOAP action. See InnerException, if present, for more details."}

    My inner exception is:

        {"The remote server returned an error: (404) Not Found."}

    I can open that service address, http://localhost:49481/Marketing.svc, through a browser and I get a message that it's running OK. My config file is set up like:

        <endpoint address="http://localhost:49481/Marketing.svc"
                  binding="basicHttpBinding"
                  bindingConfiguration="BasicHttpBinding_IMarketing"
                  contract="SunriseMarketing.IMarketing"
                  name="BasicHttpBinding_IMarketing" />

    What am I missing?

    Read the article

  • What's the difference between /123 and /?123?

    - by BoltClock
    I've noticed that some sites (including http://jobs.stackoverflow.com) have query strings that look like this:

        http://somewebapp.example/?123

    as compared to:

        http://somewebapp.example/123
        http://somewebapp.example/id/123

    What are the reasons that developers choose to implement their web apps' URLs using the first example instead of the second and third examples? And as a bonus, how would one implement the first example in PHP, given that 123 is the primary key of some row in a database table? (I just need to know how to retrieve 123 from the URL; I already know how to query the database for a primary key of 123.)
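
    Since the poster asks about PHP specifically, a minimal sketch: for a URL of the form /?123 the raw text after the ? is available untouched in $_SERVER['QUERY_STRING']:

        <?php
        // For http://somewebapp.example/?123 the query string is the bare "123".
        $id = $_SERVER['QUERY_STRING'];
        if (ctype_digit($id)) {
            $id = (int) $id;   // safe to use as the primary-key lookup value
        }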

    Read the article

  • How to post an XML document over HTTP with VB.Net

    - by Joshua McGinnis
    I'm looking for help with posting my XML document to a URL in VB.NET. Here's what I have so far ...

        Public Shared xml As New System.Xml.XmlDocument()

        Public Shared Sub Main()
            Dim root As XmlElement
            root = xml.CreateElement("root")
            xml.AppendChild(root)

            Dim username As XmlElement
            username = xml.CreateElement("username")
            username.InnerText = _username
            root.AppendChild(username)

            xml.Save(Console.Out)

            Dim url = "https://mydomain.com"
            Dim req As WebRequest = WebRequest.Create(url)
            req.Method = "POST"
            req.ContentType = "application/xml"
            req.Headers.Add("Custom: API_Method")
            Console.WriteLine(req.Headers.ToString())

    This is where things go awry: I want to post the XML, and then print the results to the console.

            Dim newStream As Stream = req.GetRequestStream()
            xml.Save(newStream)

            Dim response As WebResponse = req.GetResponse()
            Console.WriteLine(response.ToString())
        End Sub

    Read the article

  • Managed HttpListener vs C++ Network Lib - Requires admin rights?

    - by Max
    So, I have noticed that starting an HttpListener is considered impolite according to Win 7: I cannot do so without administrative rights, unless I add myself to some URL reservation list. In theory this is alright, but I'd like to make my program as non-invasive as possible. My main alternative is something like the C++ Network Library, which utilizes Boost. This is probably not as simple as HttpListener, though. Will it circumvent the admin-rights requirement for listening on some HTTP URL? How does Windows handle HTTP listening? Right now I'm just listening to http://+:xxxx/url; I guess it's fully possible to just create a Socket listening at port xxxx and provide my own/third-party HTTP implementation?
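
    For reference, the reservation the poster mentions is a one-time elevated netsh step on Vista/Win 7 (the URL and account below are placeholders matching the question's http://+:xxxx/url):

        netsh http add urlacl url=http://+:xxxx/url/ user=DOMAIN\username

    URL ACLs are an HTTP.SYS concept; HttpListener sits on top of HTTP.SYS, which is why it needs the reservation, while a plain socket (or a library built on sockets, like the C++ Network Library) listens on the port directly and skips the requirement.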

    Read the article

  • Can't parse XML from AJAX response.

    - by Pavel
    Hi everyone. I'm having some problems with parsing the XML response from my ajax script. The XML looks like this:

        <IMAGE>
          <a href="address">
            <img width="300" height="300" src="image.png" class="image" alt="" title="LINKING"/>
          </a>
        </IMAGE>
        <LINK>
          www.address.com
        </LINK>
        <TITLE>
          This <i>is title</i>
        </TITLE>
        <EXCERPT>
          <p> And some excerpt </p>
        </EXCERPT>

    The code for the JS looks like this:

        function loadTab(id) {
          if (window.XMLHttpRequest) { // code for IE7+, Firefox, Chrome, Opera, Safari
            xmlhttp = new XMLHttpRequest();
          } else { // code for IE6, IE5
            xmlhttp = new ActiveXObject("Microsoft.XMLHTTP");
          }
          xmlhttp.onreadystatechange = function() {
            if (xmlhttp.readyState == 4 && xmlhttp.status == 200) {
              xmlDoc = xmlhttp.responseXML;
              var title = "";
              var image = "";
              x = xmlDoc.getElementsByTagName("TITLE");
              for (i = 0; i < 1; i++) {
                title = title + x[i].childNodes[0].nodeValue;
              }
              document.getElementById("ntt").innerHTML = title;
              x1 = xmlDoc.getElementsByTagName("IMAGE");
              for (j = 0; j < 1; j++) {
                image = image + x1[j].childNodes[0].nodeValue;
              }
              document.getElementById("nttI").innerHTML = image;
            }
          }
          var url = 'http://www.factmag.com/staging/page/?id=' + id;
          xmlhttp.open("GET", url, true);
          xmlhttp.send();
        }

    When I parse it, it pulls out the title but not the IMAGE tag contents. What am I doing wrong? Can someone please tell me? Thanks in advance!
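
    One thing worth checking (a hedged sketch based on the XML above): the first child of IMAGE is whitespace or the <a> element, not a text node, so childNodes[0].nodeValue comes back null; serializing the element gets its markup instead:

        var imageEl = xmlDoc.getElementsByTagName("IMAGE")[0];
        var anchor = imageEl.getElementsByTagName("a")[0];   // the element child
        // Element nodes have a null nodeValue; serialize the node to get markup.
        // (XMLSerializer works in modern browsers; old IE exposes node.xml instead.)
        var markup = new XMLSerializer().serializeToString(anchor);
        document.getElementById("nttI").innerHTML = markup;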

    Read the article

  • Does a servlet-based stack have significant overheads?

    - by John
    I don't know if it's simply because page loads take a little time, or the way servlets have an abstraction framework above the 'bare metal' of HTTP, or just because of the "Enterprise" in Java EE, but in my head I have the notion that a servlet-based app inherently adds overhead compared to a Java app which simply deals with sockets directly. Forget web pages; imagine instead a Java server app where you send it a question over an HTTP request and it looks up an answer from memory and returns the answer in the response. You can easily write a Java socket-based app which does this; you can also take a servlet approach and get away from the "bare metal" of sockets. Is there any measurable performance impact to be expected implementing the same approach using servlets rather than a custom socket-based HTTP-listening app? And yes, I am hazy on the exact data sent in HTTP requests, and I know it's a vague question. It's really about whether servlet implementations have lots of layers of indirection or anything else that would add up to a significant overhead per call, where by significant I mean maybe an additional 0.1 s or more.

    Read the article

  • What are the alternatives to HTTP_REFERER in PHP?

    - by user251336
    Hi all, I want to know if there is any alternative/workaround for $_SERVER['HTTP_REFERER'], because 'HTTP_REFERER' cannot be trusted. What other way is there to know which URL a request came from? Here is the situation: http://abc.com/one.html will have an iframe with src=http://xyz.com/giv.php?param=1. How will giv.php on xyz.com know that the request is coming from http://abc.com/one.html?
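
    One common substitute, sketched with illustrative parameter names and a made-up shared secret: have abc.com sign the iframe URL with an HMAC that giv.php can verify, so trust comes from the signature rather than the spoofable referer:

        <?php
        // On abc.com, when emitting the iframe:
        $secret = 'shared-secret-known-to-both-sites';
        $query  = 'param=1&origin=abc.com&ts=' . time();
        $sig    = hash_hmac('sha256', $query, $secret);
        // iframe src becomes: http://xyz.com/giv.php?param=1&origin=abc.com&ts=...&sig=...

        // In giv.php on xyz.com, recompute and compare:
        $query    = 'param=' . $_GET['param'] . '&origin=' . $_GET['origin'] . '&ts=' . $_GET['ts'];
        $expected = hash_hmac('sha256', $query, $secret);
        $trusted  = ($expected === $_GET['sig']);   // a constant-time compare is better where available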

    Read the article

  • How can I detect if request and response cookies are different?

    - by Malcolm Frexner
    I need to detect if a request cookie value is different from a response cookie value. It's not as easy as:

        if (cookiesArePresent) {
            bool isDifferent = HttpContext.Current.Request.Cookies[".ASPXANONYMOUS"].Value
                != HttpContext.Current.Response.Cookies[".ASPXANONYMOUS"].Value;
        }

    because I read that changing Response.Cookies changes Request.Cookies. That would mean the two are always the same once HttpContext.Current.Response.Cookies[".ASPXANONYMOUS"] has been changed. Is there an easy way around this? http://chance.lindseydev.com/2009/04/aspnet-httprequestcookies-and.html

    Read the article

  • How can I send a jQuery ajax HTTP request with PHP?

    - by testkhan
    I have the following form:

        <form action="http://mydomain.com/get.php" method="post">
          <input type="hidden" value="thisisvalue" name="hdnvalue">
          <input type="text" name="cost" id="cost"><br><br>
          <textarea id="msg" name="message"></textarea><br><br>
          <input type="submit" value="Send" id="send">
        </form>

    The get.php file will respond with 1 or 0, i.e. 1 means received and 0 means not received. Now I want to send the form to http://mydomain.com/get.php via an HTTP request with jQuery ajax. How can I do that and get the returned value?
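
    A minimal sketch of one way to wire this up with jQuery's $.post (the selectors come from the form above; it assumes the page is served from mydomain.com, since cross-domain ajax needs extra work):

        $('#send').click(function (e) {
          e.preventDefault();                      // keep the normal form submit from firing
          $.post('http://mydomain.com/get.php',
                 $('form').serialize(),            // sends hdnvalue, cost and message
                 function (result) {
                   if (result == '1') {
                     alert('Received');
                   } else {
                     alert('Not received');
                   }
                 });
        });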

    Read the article

  • JAAS: on Callback (Interesting based on HTTP headers)

    - by VJS
    I am using NameCallback and PasswordCallback for the username and password. A popup appears in the browser, and when I enter the username and password, JAAS authenticates my request. In Wireshark, I have seen that a 401 Unauthorized message (WWW-Authenticate header) comes back, and when I enter the username/password an HTTP request with credentials (an Authorization header) is generated and goes to the server.

    My requirement: I don't want the popup to appear. My application on the other server has the username/password, so once it receives the 401 it should, based on some logic, generate the HTTP request with the Authorization header/credentials and send it back.

    Flow: User -> Other Server -> My Tomcat 5.5

    Here, on the Other Server, nobody is available to enter the username/password manually. The application is deployed and it should just generate the HTTP request with credentials and send it back to Tomcat. Is there any other callback which behaves like this? I need your help; please give feedback on the approach as well.

    Read the article

  • Default http/admin port in dropwizard project

    - by mithrandir
    I have a dropwizard project and I have maintained a config.yml file at the ROOT of the project (basically at the same level as pom.xml). Here I have specified the HTTP ports to be used, as follows:

        http:
          port: 9090
          adminPort: 9091

    I have the following code in my TestService.java file:

        public class TestService extends Service<TestConfiguration> {

            @Override
            public void initialize(Bootstrap<TestConfiguration> bootstrap) {
                bootstrap.setName("test");
            }

            @Override
            public void run(TestConfiguration config, Environment env) throws Exception {
                // initialize some resources here..
            }

            public static void main(String[] args) throws Exception {
                new TestService().run(new String[] { "server" });
            }
        }

    I expect the config.yml file to be used to determine the HTTP port. However, the app always seems to start with the default ports 8080 and 8081. Also note that I am running this from Eclipse. Any insights as to what I am doing wrong here?
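
    A likely cause, offered as a hedged sketch rather than a confirmed fix: Dropwizard's server command takes the configuration file path as its argument, and without one the built-in defaults (8080/8081) apply. Passing the path through main would look like:

        public static void main(String[] args) throws Exception {
            // The second argument tells the "server" command which YAML file to load.
            new TestService().run(new String[] { "server", "config.yml" });
        }

    When run from Eclipse, the relative path resolves against the launch's working directory, which defaults to the project root, where the poster's config.yml already sits.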

    Read the article

  • DNS lookup failures while accessing my website (proxy errors)

    - by Bond
    Here is the situation: until this morning, everything had been working perfectly. For the past 6 months many of my domains were accessible as:

        http://site1.myserver.com
        http://site2.myserver.com
        http://site3.myserver.com
        http://site4.myserver.com

    All of these were reverse proxy configurations, and I have some applications on each of them. Then this morning some people reported that http://site1.myserver.com/app1 is not working (though http://site1.myserver.com itself is accessible), http://site2.myserver.com is accessible, http://site3.myserver.com is accessible, but http://site4.myserver.com is not accessible. In the past 6 months I have not changed any of these Apache configurations (things were working perfectly as they were). The error shown in the browser while accessing http://site1.myserver.com/app1:

        Proxy Error
        The proxy server received an invalid response from an upstream server.
        The proxy server could not handle the request GET /app1.
        Reason: DNS lookup failure for: myserver.com

    and the same error appears for http://site4.myserver.com. So what should I check? I have checked all the Apache logs to the extent that I could see:

        192.168.1.25 - - [10/Jan/2011:14:50:48 +0530] "GET /app1 HTTP/1.1" 502 531 "-" "Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.2.3) Gecko/20100401 Firefox/3.6.3"
        [Mon Jan 10 14:27:42 2011] [error] (113)No route to host: proxy: HTTP: attempt to connect to 192.168.1.3:80 (192.168.1.3) failed
        [Mon Jan 10 14:27:42 2011] [error] ap_proxy_connect_backend disabling worker for (192.168.1.3)
        [Mon Jan 10 14:27:44 2011] [error] proxy: HTTP: disabled connection for (192.168.1.3)
        ... (the "disabled connection" line repeats a dozen times through 14:27:48) ...
        [Mon Jan 10 14:35:29 2011] [error] [client 192.168.1.25] proxy: DNS lookup failure for: myserver.com returned by /app1
        [Mon Jan 10 14:35:30 2011] [error] [client 192.168.1.25] proxy: DNS lookup failure for: myserver.com returned by /app1
        [Mon Jan 10 14:50:30 2011] [error] [client 192.168.1.25] proxy: DNS lookup failure for: myserver.com returned by /app1
        [Mon Jan 10 14:50:48 2011] [error] [client 192.168.1.25] proxy: DNS lookup failure for: myserver.com returned by /app1

    and for site4.myserver.com I get:

        [Mon Jan 10 14:57:40 2011] [error] [client 192.168.1.25] proxy: DNS lookup failure for: site4.myserver.com returned by /favicon.ico
        [Mon Jan 10 15:02:38 2011] [error] [client <some external IP>] proxy: DNS lookup failure for: site4.myserver.com returned by /
        [Mon Jan 10 15:03:04 2011] [error] [client <some external IP>] proxy: DNS lookup failure for: site4.myserver.com returned by /, referer: http://site4.myserver.com/
        ... (similar lines repeat for / and /favicon.ico, from both the external IP and 192.168.1.25, through 15:06:31) ...
        [Mon Jan 10 15:26:03 2011] [error] [client 192.168.1.25] proxy: DNS lookup failure for: site4.myserver.com returned by /
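
    A generic first check, offered as a hedged sketch (mod_proxy resolves the ProxyPass backend hostname from the proxy host itself, so a resolver change on that box breaks every rule that uses the name):

        # On the proxy server itself:
        host myserver.com          # does the backend name still resolve?
        cat /etc/resolv.conf       # did the nameserver configuration change?
        grep myserver /etc/hosts   # was a static entry removed?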

    Read the article

  • cURL hangs trying to upload file from stdin

    - by SidneySM
    I'm trying to PUT a file with cURL. This hangs:

        curl -vvv --digest -u user -T - https://example.com/file.txt < file

    This does not:

        curl -vvv --digest -u user -T file https://example.com/file.txt

    What's going on?

        * About to connect() to example.com port 443 (#0)
        *   Trying 0.0.0.0... connected
        * Connected to example.com (0.0.0.0) port 443 (#0)
        * SSLv3, TLS handshake, Client hello (1):
        * SSLv3, TLS handshake, Server hello (2):
        * SSLv3, TLS handshake, CERT (11):
        * SSLv3, TLS handshake, Server key exchange (12):
        * SSLv3, TLS handshake, Server finished (14):
        * SSLv3, TLS handshake, Client key exchange (16):
        * SSLv3, TLS change cipher, Client hello (1):
        * SSLv3, TLS handshake, Finished (20):
        * SSLv3, TLS change cipher, Client hello (1):
        * SSLv3, TLS handshake, Finished (20):
        * SSL connection using DHE-RSA-AES256-SHA
        * Server certificate:
        *   subject: serialNumber=jJakwdOewDicmqzIorLkKSiwuqfnzxF/, C=US, O=*.example.com, OU=GT01234567, OU=See www.example.com/resources/cps (c)10, OU=Domain Control Validated - ExampleSSL(R), CN=*.example.com
        *   start date: 2010-01-26 07:06:33 GMT
        *   expire date: 2011-01-28 11:22:07 GMT
        *   common name: *.example.com (matched)
        *   issuer: C=US, O=Equifax, OU=Equifax Secure Certificate Authority
        * SSL certificate verify ok.
        * Server auth using Digest with user 'user'
        > PUT /file.txt HTTP/1.1
        > User-Agent: curl/7.19.4 (universal-apple-darwin10.0) libcurl/7.19.4 OpenSSL/0.9.8l zlib/1.2.3
        > Host: example.com
        > Accept: */*
        > Transfer-Encoding: chunked
        > Expect: 100-continue
        >
        < HTTP/1.1 100 Continue
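
    A plausible explanation, hedged rather than confirmed: with --digest, curl expects to send the request twice (the first attempt collects the 401 challenge), and a body read from a pipe cannot be rewound for the retry, whereas -T file gives curl a seekable body. Staging stdin in a temporary file sidesteps this:

        # Workaround sketch: buffer the pipe into a temp file so curl can rewind the body
        tmp=$(mktemp)
        cat > "$tmp" < file        # or: some_producer > "$tmp"
        curl -vvv --digest -u user -T "$tmp" https://example.com/file.txt
        rm -f "$tmp"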

    Read the article

  • How is a subdomain passed to the webserver?

    - by Joshua Frank
    I know that DNS resolves an address like example.com to an IP address like 11.22.33.44, but I'm a little confused about how subdomains are resolved, so that when you type http://subdomain.example.com, what actually gets passed to the server at 11.22.33.44? In other words, example.com = 11.22.33.44, but subdomain.example.com/path = ??? Are "subdomain" and "path" passed as HTTP headers, or mapped in the URL in some way, or what? Thanks in advance.

    Edit: If I'm understanding correctly, BloodPhilia says that subdomain.example.com actually is a different domain that in principle could resolve to a totally different IP. But if that's so, then what about hosts that have huge numbers of (what look like) subdomains, but which actually map to some path on the site? For instance, blogspot hosts millions of blogs, and they all look like this:

        aaa.blogspot.com
        bbb.blogspot.com
        ...millions more...
        yyy.blogspot.com
        zzz.blogspot.com

    Those are clearly not subdomains with their own IPs, but rather some mapping like aaa.blogspot.com -> www.blogspot.com/aaa, but how is this accomplished? What actually gets passed to the web server at blogspot.com?
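
    For reference, this is what actually crosses the wire for http://subdomain.example.com/path (a sketch of the request's first lines): the path travels in the request line and the full hostname in the Host header, so a single IP can serve any number of subdomains by inspecting that header:

        GET /path HTTP/1.1
        Host: subdomain.example.com

    Wildcard DNS (*.blogspot.com pointing at one IP) plus Host-header routing is the usual way sites like Blogspot map millions of apparent subdomains onto one server farm.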

    Read the article

  • Enable gzip on Nginx

    - by Rob Wilkerson
    Yes, I know that there are a lot of other questions that seem exactly like this out there. I think I must've looked at all of them. Twice. In desperation, I'm adding another in case my specific configuration is the issue. Bear with me. First, the question: what do I need to do to get gzip compression to work? I have an Ubuntu 12.04 server running nginx 1.1.19. Nginx was installed with the following packages: nginx, nginx-common, nginx-full. The http block of my nginx.conf looks like this:

        http {
            include /etc/nginx/mime.types;
            access_log /var/log/nginx/access.log;

            sendfile on;
            keepalive_timeout 65;
            tcp_nodelay on;

            gzip on;
            gzip_disable "msie6";

            include /etc/nginx/conf.d/*.conf;
            include /etc/nginx/sites-enabled/*;
        }

    Both PageSpeed and YSlow are reporting that I need to enable compression. I can see that the request headers indicate Accept-Encoding: gzip,deflate,sdch, but the response headers do not have the corresponding Content-Encoding header. I've tried various other config values (gzip_vary on, gzip_http_version 1.0, etc.), but no joy. As far as I know, I can only assume that nginx was compiled with compression support, but if there's any way to verify that, I'd love to know. If anyone sees anything I'm missing or can suggest further debugging, please let me know. I'm no sysadmin and I'm new to nginx, so I've exhausted everything I can think of or have read. Thanks.
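
    A hedged sketch of the usual fix: out of the box nginx compresses only text/html, so CSS/JS assets (the things PageSpeed and YSlow typically complain about) go out uncompressed until the types are listed explicitly. Compile-time support can be checked with nginx -V; gzip is a core module unless nginx was built with --without-http_gzip_module.

        gzip on;
        gzip_disable "msie6";
        gzip_vary on;
        gzip_min_length 256;    # tiny responses aren't worth compressing
        gzip_types text/plain text/css application/json application/x-javascript
                   text/xml application/xml application/xml+rss text/javascript;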

    Read the article

  • add_header directives in location overwriting add_header directives in server

    - by user64204
    Using nginx 1.2.1, I am able to add multiple headers using add_header as follows:

        server {
            listen 80;
            server_name localhost;
            root /var/www;
            add_header Name1 Value1;    <=== HERE
            add_header Name2 Value2;    <=== HERE
            location / {
                echo "Nginx localhost site";
            }
        }

        GET / HTTP/1.1 200 OK
        Name1: Value1
        Name2: Value2

    However, as soon as I use the add_header directive inside location, the other add_header directives under server are ignored:

        server {
            listen 80;
            server_name localhost;
            root /var/www;
            add_header Name1 Value1;    <=== HERE
            add_header Name2 Value2;    <=== HERE
            location / {
                add_header Name3 Value3;    <=== HERE
                add_header Name4 Value4;    <=== HERE
                echo "Nginx localhost site";
            }
        }

        GET / HTTP/1.1 200 OK
        Name3: Value3
        Name4: Value4

    The documentation says that both server and location are valid contexts, and doesn't state that using add_header in one prevents using it in the other. Q1: Do you know if this is a bug or the intended behaviour, and why? Q2: Do you see other options to get this fixed than using the HttpHeadersMoreModule module?
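
    This matches nginx's documented inheritance rule for add_header: a level that declares any add_header directives of its own inherits none from the levels above, so it is intended behaviour rather than a bug. The usual workaround short of HttpHeadersMoreModule is to repeat the server-wide headers inside the location (a sketch based on the config above):

        location / {
            # Repeated because declaring any add_header here discards
            # the ones inherited from the server level.
            add_header Name1 Value1;
            add_header Name2 Value2;
            add_header Name3 Value3;
            add_header Name4 Value4;
            echo "Nginx localhost site";
        }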

    Read the article
