Search Results

Search found 327 results on 14 pages for 'deflate'.

Page 3/14 | < Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • Getting 500 Error when trying to access Rails application through Apache2

    - by cojones
    Hey, I'm using Apache2 as proxy and mongrel_cluster as server for my Rails applications. When I try to access it by typing in the url I get a 500 "Internal Server Error" but when try to locally access the website with "lynx http://localhost:8200" it works. This is my config: <Proxy balancer://sportfreundewitold_cluster> BalancerMember http://127.0.0.1:8200 BalancerMember http://127.0.0.1:8201 </Proxy> # httpd [example.org] dmn entry BEGIN. <VirtualHost x.x.x.x:80> <IfModule suexec_module> SuexecUserGroup vu2025 vu2025 </IfModule> ServerAdmin [email protected] DocumentRoot /var/www/virtual/example.org/htdocs/current/public ServerName example.org ServerAlias www.example.org example.org *.example.org vu2025.admin.roughneck-media.de Alias /errors /var/www/virtual/example.org/errors/ RedirectMatch permanent ^/ftp[\/]?$ http://admin.roughneck-media.de/ftp/ RedirectMatch permanent ^/pma[\/]?$ http://admin.roughneck-media.de/pma/ RedirectMatch permanent ^/webmail[\/]?$ http://admin.roughneck-media.de/webmail/ RedirectMatch permanent ^/ispcp[\/]?$ http://admin.roughneck-media.de/ ErrorDocument 401 /errors/401.html ErrorDocument 403 /errors/403.html ErrorDocument 404 /errors/404.html ErrorDocument 500 /errors/500.html ErrorDocument 503 /errors/503.html <IfModule mod_cband.c> CBandUser example.org </IfModule> # httpd awstats support BEGIN. # httpd awstats support END. # httpd dmn entry cgi support BEGIN. ScriptAlias /cgi-bin/ /var/www/virtual/example.org/cgi-bin/ <Directory /var/www/virtual/example.org/cgi-bin> AllowOverride AuthConfig #Options ExecCGI Order allow,deny Allow from all </Directory> # httpd dmn entry cgi support END. <Directory /var/www/virtual/example.org/htdocs/current/public> # httpd dmn entry PHP support BEGIN. # httpd dmn entry PHP support END. Options -Indexes Includes FollowSymLinks MultiViews AllowOverride All Order allow,deny Allow from all </Directory> # httpd dmn entry PHP2 support BEGIN. <IfModule mod_php5.c> php_admin_value open_basedir "/var/www/virtual/example.org/:/var/www/virtual/example.org/phptmp/:/usr/share/php/" php_admin_value upload_tmp_dir "/var/www/virtual/example.org/phptmp/" php_admin_value session.save_path "/var/www/virtual/example.org/phptmp/" php_admin_value sendmail_path '/usr/sbin/sendmail -f vu2025 -t -i' </IfModule> <IfModule mod_fastcgi.c> ScriptAlias /php5/ /var/www/fcgi/example.org/ <Directory "/var/www/fcgi/example.org"> AllowOverride None Options +ExecCGI -MultiViews -Indexes Order allow,deny Allow from all </Directory> </IfModule> <IfModule mod_fcgid.c> Include /etc/apache2/mods-available/fcgid_ispcp.conf <Directory /var/www/virtual/example.org/htdocs> FCGIWrapper /var/www/fcgi/example.org/php5-fcgi-starter .php Options +ExecCGI </Directory> <Directory "/var/www/fcgi/example.org"> AllowOverride None Options +ExecCGI MultiViews -Indexes Order allow,deny Allow from all </Directory> </IfModule> # httpd dmn entry PHP2 support END. Include /etc/apache2/ispcp/example.org.conf RewriteEngine On # Make sure people go to www.myapp.com, not myapp.com RewriteCond %{HTTP_HOST} ^myapp\.com$ [NC] RewriteRule ^(.*)$ http://www.myapp.com$1 [R=301,L] # Yes, I've read no-www.com, but my site already has much Google-Fu on # www.blah.com. Feel free to comment this out. 
# Uncomment for rewrite debugging #RewriteLog logs/myapp_rewrite_log #RewriteLogLevel 9 # Check for maintenance file and redirect all requests RewriteCond %{DOCUMENT_ROOT}/system/maintenance.html -f RewriteCond %{SCRIPT_FILENAME} !maintenance.html RewriteRule ^.*$ /system/maintenance.html [L] # Rewrite index to check for static RewriteRule ^/$ /index.html [QSA] # Rewrite to check for Rails cached page RewriteRule ^([^.]+)$ $1.html [QSA] # Redirect all non-static requests to cluster RewriteCond %{DOCUMENT_ROOT}/%{REQUEST_FILENAME} !-f RewriteRule ^/(.*)$ balancer://mongrel_cluster%{REQUEST_URI} [P,QSA,L] # Deflate AddOutputFilterByType DEFLATE text/html text/plain text/xml application/xml application/xhtml+xml text/javascript text/css BrowserMatch ^Mozilla/4 gzip-only-text/html BrowserMatch ^Mozilla/4\.0[678] no-gzip BrowserMatch \\bMSIE !no-gzip !gzip-only-text/html # Uncomment for deflate debugging #DeflateFilterNote Input input_info #DeflateFilterNote Output output_info #DeflateFilterNote Ratio ratio_info #LogFormat '"%r" %{output_info}n/%{input_info}n (%{ratio_info}n%%)' deflate #CustomLog logs/myapp_deflate_log deflate </VirtualHost> # httpd [example.org] dmn entry END. Does anyone know what could be wrong with it?
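
    One detail worth flagging in the config above: the balancer is declared as balancer://sportfreundewitold_cluster, but the RewriteRule proxies to balancer://mongrel_cluster. mod_proxy_balancer only knows a balancer by the exact name it was declared with, so a mismatch like this typically makes the proxied request fail with a 500. A minimal sketch of the two pieces agreeing (myapp_cluster is a placeholder name, not from the original config):

        <Proxy balancer://myapp_cluster>
            # one BalancerMember per mongrel instance
            BalancerMember http://127.0.0.1:8200
            BalancerMember http://127.0.0.1:8201
        </Proxy>

        # ... later, inside the <VirtualHost> ...
        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-f
        # the balancer name here must match the <Proxy> declaration above
        RewriteRule ^/(.*)$ balancer://myapp_cluster%{REQUEST_URI} [P,QSA,L]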

    Read the article

  • Fast ruby http library for large XML downloads

    - by Vlad Zloteanu
    I am consuming various XML-over-HTTP web services returning large XML files (> 2MB). What would be the fastest Ruby HTTP library to reduce the download time? Required features: both GET and POST requests, and gzip/deflate downloads (Accept-Encoding: deflate, gzip) - very important. I am deciding between open-uri, Net::HTTP and curb, but other suggestions are welcome. P.S. To parse the response, I am using a pull parser from Nokogiri, so I don't need an integrated solution like rest-client or hpricot.
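
    For reference, a minimal Net::HTTP sketch that asks for a compressed body and inflates it by hand (the URL is a placeholder; newer Ruby versions can also negotiate gzip transparently):

        require 'net/http'
        require 'uri'
        require 'zlib'
        require 'stringio'

        uri = URI.parse('http://example.com/big.xml')   # placeholder endpoint
        req = Net::HTTP::Get.new(uri.request_uri)
        req['Accept-Encoding'] = 'gzip, deflate'

        res = Net::HTTP.start(uri.host, uri.port) { |http| http.request(req) }

        body = case res['Content-Encoding'].to_s
               when /gzip/    then Zlib::GzipReader.new(StringIO.new(res.body)).read
               when /deflate/ then Zlib::Inflate.inflate(res.body)  # zlib-wrapped deflate
               else res.body
               end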

    Read the article

  • Is WebClient with gzip compression faster?

    - by Yozer
    I am writing an application which uses the WebClient class, adding something like this: ExC.Headers.Add("Accept-Encoding: gzip, deflate"); where ExC is: class ExWebClient1 : WebClient { protected override WebRequest GetWebRequest(Uri address) { HttpWebRequest request = (HttpWebRequest)base.GetWebRequest(address); request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate; return request; } } Will there be a difference in speed when I use the encoded response?
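
    A rough usage sketch of the subclass above; note that setting AutomaticDecompression also adds the Accept-Encoding header for you, so adding it by hand as well is redundant (the URL is a placeholder):

        using System;
        using System.Net;

        class ExWebClient1 : WebClient
        {
            protected override WebRequest GetWebRequest(Uri address)
            {
                var request = (HttpWebRequest)base.GetWebRequest(address);
                // the runtime transparently decompresses gzip/deflate responses
                request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
                return request;
            }
        }

        class Program
        {
            static void Main()
            {
                using (var client = new ExWebClient1())
                {
                    string page = client.DownloadString("http://example.com/");
                    Console.WriteLine(page.Length);
                }
            }
        }

    As for speed: compressible text over a slow link usually downloads noticeably faster when encoded, at the cost of some CPU on both ends, so the difference depends on the payload and the connection.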

    Read the article

  • Why does nginx rewrite a POST request from /login to //login?

    - by jiangchengwu
    There is an if statement which rewrites the URL when the client is Android. Everything works, but something strange happens: nginx rewrites the POST request /login to //login even if the body of the if block is blank, so I get a 404 page, as the Jetty server only accepts /login requests. Server conf: location / { proxy_pass http://localhost:8785/; proxy_set_header Host $http_host; proxy_set_header Remote-Addr $http_remote_addr; proxy_set_header X-Real-IP $remote_addr; if ( $http_user_agent ~ Android ){ # rewrite something, been commented } } Debug info, full log at https://gist.github.com/3799021 ... 2012/09/28 16:29:49 [debug] 26416#0: *1 http script regex: "Android" 2012/09/28 16:29:49 [notice] 26416#0: *1 "Android" matches "Android/1.0", client: 106.187.97.22, server: ireedr.com, request: "POST /login HTTP/1.1", host: "ireedr.com" ... 2012/09/28 16:29:49 [debug] 26416#0: *1 http proxy header: "POST //login HTTP/1.0 Host: ireedr.com X-Real-IP: 106.187.97.22 Connection: close Accept-Encoding: identity, deflate, compress, gzip Accept: */* User-Agent: Android/1.0 " ... 2012/09/28 16:29:49 [debug] 26416#0: *1 HTTP/1.1 404 Not Found Server: nginx/1.2.1 Date: Fri, 28 Sep 2012 08:29:49 GMT Content-Type: text/html;charset=ISO-8859-1 Transfer-Encoding: chunked Connection: keep-alive Cache-Control: must-revalidate,no-cache,no-store Content-Encoding: gzip ... Only when I comment out the block in the configuration file: location / { proxy_pass http://localhost:8785/; proxy_set_header Host $http_host; proxy_set_header Remote-Addr $http_remote_addr; proxy_set_header X-Real-IP $remote_addr; #if ( $http_user_agent ~ Android ){ # #} } does the client get a 200 response. Debug info, full log at https://gist.github.com/3799023 ... "POST /login HTTP/1.0 Host: ireedr.com X-Real-IP: 106.187.97.22 Connection: close Accept-Encoding: identity, deflate, compress, gzip Accept: */* User-Agent: Android/1.0 " ... 2012/09/28 16:27:19 [debug] 26319#0: *1 HTTP/1.1 200 OK Server: nginx/1.2.1 Date: Fri, 28 Sep 2012 08:27:19 GMT Content-Type: application/json;charset=UTF-8 Content-Length: 17 Connection: keep-alive ...
As the log: 2012/09/28 16:29:49 [notice] 26416#0: *1 "Android" matches "Android/1.0", client: 106.187.97.22, server: ireedr.com, request: "POST /login HTTP/1.1", host: "ireedr.com" 2012/09/28 16:29:49 [debug] 26416#0: *1 http script if 2012/09/28 16:29:49 [debug] 26416#0: *1 post rewrite phase: 4 2012/09/28 16:29:49 [debug] 26416#0: *1 generic phase: 5 2012/09/28 16:29:49 [debug] 26416#0: *1 generic phase: 6 2012/09/28 16:29:49 [debug] 26416#0: *1 generic phase: 7 2012/09/28 16:29:49 [debug] 26416#0: *1 access phase: 8 2012/09/28 16:29:49 [debug] 26416#0: *1 access phase: 9 2012/09/28 16:29:49 [debug] 26416#0: *1 access phase: 10 2012/09/28 16:29:49 [debug] 26416#0: *1 post access phase: 11 2012/09/28 16:29:49 [debug] 26416#0: *1 try files phase: 12 2012/09/28 16:29:49 [debug] 26416#0: *1 posix_memalign: 0000000001E798F0:4096 @16 2012/09/28 16:29:49 [debug] 26416#0: *1 http init upstream, client timer: 0 2012/09/28 16:29:49 [debug] 26416#0: *1 epoll add event: fd:13 op:3 ev:80000005 2012/09/28 16:29:49 [debug] 26416#0: *1 http script copy: "Host: " 2012/09/28 16:29:49 [debug] 26416#0: *1 http script var: "ireedr.com" 2012/09/28 16:29:49 [debug] 26416#0: *1 http script copy: " " 2012/09/28 16:29:49 [debug] 26416#0: *1 http script copy: "" 2012/09/28 16:29:49 [debug] 26416#0: *1 http script copy: "" 2012/09/28 16:29:49 [debug] 26416#0: *1 http script copy: "X-Real-IP: " 2012/09/28 16:29:49 [debug] 26416#0: *1 http script var: "106.187.97.22" 2012/09/28 16:29:49 [debug] 26416#0: *1 http script copy: " " 2012/09/28 16:29:49 [debug] 26416#0: *1 http script copy: "Connection: close " 2012/09/28 16:29:49 [debug] 26416#0: *1 http proxy header: "Accept-Encoding: identity, deflate, compress, gzip" 2012/09/28 16:29:49 [debug] 26416#0: *1 http proxy header: "Accept: */*" 2012/09/28 16:29:49 [debug] 26416#0: *1 http proxy header: "User-Agent: Android/1.0" 2012/09/28 16:29:49 [debug] 26416#0: *1 http proxy header: "POST //login HTTP/1.0 Host: ireedr.com X-Real-IP: 106.187.97.22 Connection: close Accept-Encoding: identity, deflate, compress, gzip Accept: */* User-Agent: Android/1.0 " ... Maybe post rewrite phase had rewrite the request. Anybody can help me to solve this problem or know why nginx do that ? Much appreciated.
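
    For what it's worth, a commonly cited culprit for the doubled slash is the URI part (the trailing slash) on proxy_pass combined with an if/rewrite phase; a sketch of that workaround, assuming the upstream should simply receive the original URI:

        location / {
            # no URI part on proxy_pass: nginx forwards the request URI as-is,
            # so /login reaches Jetty as /login even after the if block has run
            proxy_pass http://localhost:8785;
            proxy_set_header Host $http_host;
            proxy_set_header X-Real-IP $remote_addr;

            if ($http_user_agent ~ Android) {
                # Android-specific rewrites go here
            }
        }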

    Read the article

  • How does mod_cache work with "must-revalidate" and "max-age"?

    - by Dmitriy Sosunov
    A quick question before I explain my flow: can mod_cache revalidate with If-None-Match only when max-age has expired, in the case where it is configured in reverse proxy mode? My goal is to reduce the number of revalidation requests to our origin server. For instance: the first request goes to the origin server and mod_cache then saves the response into the cache according to the Cache-Control: max-age header, and only when max-age has expired does mod_cache revalidate with If-None-Match. Currently, mod_cache revalidates on every request, regardless of whether max-age is defined. My configuration is for Apache 2.4.3 (Windows); on Linux I see the same behavior that I show below. ServerName proxy.lo ProxyRequests Off ProxyPreserveHost Off Header set Vary "Accept, Content-Type, Content-Encoding, Accept-Language" RequestHeader set X-Forwarded-Proto "http" # modify header for user agent's Header set Cache-Control "private, no-cache, no-store, no-transform" CacheQuickHandler off CacheDefaultExpire 300 # the origin server do not provide last-modified CacheIgnoreNoLastMod On CacheIgnoreCacheControl On # the origin server define cache-control: private, no-store only for user agents # Therefore, I would like ignore those headers on the proxy server. CacheStorePrivate On CacheStoreNoStore On CacheEnable disk / CacheRoot "C:/Apache.Cache" CacheDirLevels 5 CacheDirLength 4 CacheMinExpire 15 CacheDetailHeader on CacheHeader on KeepAlive Off ProxyPass / http://origin.lo/ ProxyPassReverse / http://origin.lo/ Also, I have turned the log level up to debug to see how mod_cache handles content for caching. I provide this to show that mod_proxy always decides that the content isn't fresh. Why? max-age was provided (see below). [Sun Nov 04 11:58:42.899890 2012] [cache:debug] [pid 6492:tid 1400] cache_storage.c(624): [client 192.168.1.100:63741] AH00698: cache: Key for entity /testpage?(null) is http://proxy.lo/testpage? [Sun Nov 04 11:58:42.899890 2012] [cache_disk:debug] [pid 6492:tid 1400] mod_cache_disk.c(569): [client 192.168.1.100:63741] AH00709: Recalled cached URL info header http://proxy.lo/testpage? [Sun Nov 04 11:58:42.899890 2012] [cache_disk:debug] [pid 6492:tid 1400] mod_cache_disk.c(865): [client 192.168.1.100:63741] AH00720: Recalled headers for URL http://proxy.lo/testpage? [Sun Nov 04 11:58:42.899890 2012] [cache:debug] [pid 6492:tid 1400] cache_storage.c(320): [client 192.168.1.100:63741] AH00695: Cached response for /testpage isn't fresh. Adding/replacing conditional request headers.
[Sun Nov 04 11:58:42.899890 2012] [cache:debug] [pid 6492:tid 1400] mod_cache.c(414): [client 192.168.1.100:63741] AH00757: Adding CACHE_SAVE filter for /testpage [Sun Nov 04 11:58:42.899890 2012] [cache:debug] [pid 6492:tid 1400] mod_cache.c(448): [client 192.168.1.100:63741] AH00759: Adding CACHE_REMOVE_URL filter for /testpage [Sun Nov 04 11:58:42.899890 2012] [proxy:debug] [pid 6492:tid 1400] mod_proxy.c(1068): [client 192.168.1.100:63741] AH01143: Running scheme http handler (attempt 0) [Sun Nov 04 11:58:42.899890 2012] [proxy:debug] [pid 6492:tid 1400] proxy_util.c(1976): AH00942: HTTP: has acquired connection for (origin.lo) [Sun Nov 04 11:58:42.899890 2012] [proxy:debug] [pid 6492:tid 1400] proxy_util.c(2029): [client 192.168.1.100:63741] AH00944: connecting http://origin.lo/testpage to origin.lo:80 [Sun Nov 04 11:58:42.901890 2012] [proxy:debug] [pid 6492:tid 1400] proxy_util.c(2151): [client 192.168.1.100:63741] AH00947: connected /testpage to origin.lo:80 [Sun Nov 04 11:58:42.901890 2012] [proxy:debug] [pid 6492:tid 1400] proxy_util.c(2554): AH00962: HTTP: connection complete to 192.168.1.100:80 (origin.lo) [Sun Nov 04 11:58:42.903890 2012] [proxy:debug] [pid 6492:tid 1400] proxy_util.c(1991): AH00943: http: has released connection for (origin.lo) [Sun Nov 04 11:58:42.903890 2012] [headers:debug] [pid 6492:tid 1400] mod_headers.c(800): AH01502: headers: ap_headers_output_filter() [Sun Nov 04 11:58:42.903890 2012] [cache:debug] [pid 6492:tid 1400] mod_cache.c(1190): [client 192.168.1.100:63741] AH00769: cache: Caching url: /testpage [Sun Nov 04 11:58:42.903890 2012] [cache:debug] [pid 6492:tid 1400] mod_cache.c(1196): [client 192.168.1.100:63741] AH00770: cache: Removing CACHE_REMOVE_URL filter. [Sun Nov 04 11:58:42.904890 2012] [cache_disk:debug] [pid 6492:tid 1400] mod_cache_disk.c(1318): [client 192.168.1.100:63741] AH00737: commit_entity: Headers and body for URL http://proxy.lo/testpage? cached. The first request to the origin server without mod_proxy to http://origin.lo/ GET http://origin.lo/testpage HTTP/1.1 Host: origin.lo Connection: keep-alive User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.4 (KHTML, like Gecko) Chrome/22.0.1229.94 Safari/537.4 Accept: application/json Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8 Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3 The first response from the origin without mod_proxy HTTP/1.1 200 OK Cache-Control: must-revalidate, proxy-revalidate, max-age=30 Content-Type: application/json; charset=utf-8 ETag: "7cf651e2-176f-4ac1-808e-0e0c17cfd0a2" Server: Microsoft-IIS/7.5 X-AspNet-Version: 4.0.30319 X-Powered-By: ASP.NET Date: Sun, 04 Nov 2012 10:11:01 GMT Content-Length: 1877 So, I assumed that revalidation must be occur only in 30 seconds after the success response. Is't right? Let's check it:) Within 30 sec, the Google Chrome didn't perform any requests to the origin server to revalidate a request and has return the response from local cache. 
When max-age is expired, the Google Chrome perform a request to revalidate: GET http://origin.lo/testpage HTTP/1.1 Host: origin.lo Connection: keep-alive Cache-Control: max-age=0 User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.4 (KHTML, like Gecko) Chrome/22.0.1229.94 Safari/537.4 Accept: application/xml If-None-Match: "7cf651e2-176f-4ac1-808e-0e0c17cfd0a2" Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8 Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3 and response: HTTP/1.1 304 Not Modified Cache-Control: must-revalidate, proxy-revalidate, max-age=30 ETag: "7cf651e2-176f-4ac1-808e-0e0c17cfd0a2" Server: Microsoft-IIS/7.5 X-AspNet-Version: 4.0.30319 X-Powered-By: ASP.NET Date: Sun, 04 Nov 2012 10:16:20 GMT As you can see, all works as expected. User agent revalidates request only when max-age is expired. Let's now try perform the folling flow though mod_proxy (see configuration above). The first request: GET http://proxy.lo/testpage HTTP/1.1 Host: proxy.lo Connection: keep-alive User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.4 (KHTML, like Gecko) Chrome/22.0.1229.94 Safari/537.4 Accept: application/json Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8 Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3 and the response was: HTTP/1.1 200 OK Date: Sun, 04 Nov 2012 10:23:36 GMT Server: Apache Cache-Control: private, no-cache, no-store, no-transform Content-Type: application/json; charset=utf-8 ETag: "7cf651e2-176f-4ac1-808e-0e0c17cfd0a2" Content-Length: 1932 Vary: Accept,Content-Type,Content-Encoding,Accept-Language X-Cache: MISS from proxy.lo X-Cache-Detail: "cache miss: attempting entity save" from proxy.lo Connection: close Ok, let's see to the disk cache and try to see how request and response was stored. (I cut binary data) http://proxy.lo/testpage? Cache-Control: private, no-cache, no-store, no-transform Content-Type: application/json; charset=utf-8 ETag: "7cf651e2-176f-4ac1-808e-0e0c17cfd0a2" Date: Sun, 04 Nov 2012 10:27:15 GMT Content-Length: 1932 Vary: Accept, Content-Type, Content-Encoding, Accept-Language Host: proxy.lo User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.4 (KHTML, like Gecko) Chrome/22.0.1229.94 Safari/537.4 Accept: application/json Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8 Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3 X-Forwarded-Proto: http Cache-Control: max-age=300, must-revalidate X-Forwarded-For: 192.168.1.100 X-Forwarded-Host: proxy.lo X-Forwarded-Server: origin.lo Ok, what we see? 
We see that the first request was performed with max-age=300 & must-revalidate Ok, looks good, as for me, lets perform the next call: GET http://proxy.lo/testpage HTTP/1.1 Host: proxy.lo Connection: keep-alive User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.4 (KHTML, like Gecko) Chrome/22.0.1229.94 Safari/537.4 Accept: application/json Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8 Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3 and the second response from mod_proxy: HTTP/1.1 200 OK Date: Sun, 04 Nov 2012 10:31:58 GMT Server: Apache Cache-Control: private, no-cache, no-store, no-transform ETag: "7cf651e2-176f-4ac1-808e-0e0c17cfd0a2" Content-Length: 1932 Vary: Accept,Content-Type,Content-Encoding,Accept-Language X-Cache: REVALIDATE from proxy.lo X-Cache-Detail: "conditional cache hit: entity refreshed" from proxy.lo Connection: close Content-Type: application/json; charset=utf-8 SO, MY QUESTION IS: WHY mod_proxy perform revalidation on each request regardless that max-age is defined? N.B. Apache 2.4.3 Thanks, I would be grateful for any help.

    Read the article

  • .NET: Is it possible to get HttpWebRequest to automatically decompress gzip'd responses?

    - by Cheeso
    In this answer, I described how I resorted to wrapping a GZipStream around the response stream in an HttpWebResponse, in order to decompress it. The relevant code looks like this: HttpWebRequest hwr = (HttpWebRequest) WebRequest.Create(url); hwr.CookieContainer = PersistentCookies.GetCookieContainerForUrl(url); hwr.Accept = "text/xml, */*"; hwr.Headers.Add(HttpRequestHeader.AcceptEncoding, "gzip, deflate"); hwr.Headers.Add(HttpRequestHeader.AcceptLanguage, "en-us"); hwr.UserAgent = "My special app"; hwr.KeepAlive = true; var resp = (HttpWebResponse) hwr.GetResponse(); using(Stream s = resp.GetResponseStream()) { Stream s2 = s; if (resp.ContentEncoding.ToLower().Contains("gzip")) s2 = new GZipStream(s2, CompressionMode.Decompress); else if (resp.ContentEncoding.ToLower().Contains("deflate")) s2 = new DeflateStream(s2, CompressionMode.Decompress); ... use s2 ... } Is there a way to get HttpWebResponse to provide a decompressing stream automatically? In other words, a way for me to eliminate the following from the above code: Stream s2 = s; if (resp.ContentEncoding.ToLower().Contains("gzip")) s2 = new GZipStream(s2, CompressionMode.Decompress); else if (resp.ContentEncoding.ToLower().Contains("deflate")) s2 = new DeflateStream(s2, CompressionMode.Decompress); Thanks.
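
    The property that usually gets pointed to here is HttpWebRequest.AutomaticDecompression; a sketch, keeping the rest of the original setup (url comes from the code above):

        var hwr = (HttpWebRequest)WebRequest.Create(url);
        // adds "Accept-Encoding: gzip, deflate" for you and hands back an
        // already-decompressed response stream, so the manual
        // GZipStream/DeflateStream wrapping can be dropped
        hwr.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;

        using (var resp = (HttpWebResponse)hwr.GetResponse())
        using (var s = resp.GetResponseStream())
        {
            // s already yields plain, decompressed bytes here
        }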

    Read the article

  • HTTP Compression problems on IIS7

    - by Jonathan Wood
    I've spent quite a bit of time on this but seem to be going nowhere. I have a large page that I really want to speed up. The obvious place to start seems to be HTTP compression, but I just can't seem to get it to work for me. After considerable searching, I've tried several variations of the code below. It kind of works, but after refreshing the browser, the results seem to fall apart. They were turning to garbage when the page used caching. If I turn off caching, then the page seems right but I lose my CSS formatting (stored in a separate file) and get an error that an included JS file contains invalid characters. Most of the resources I've found on the Web were either very old or focused on accessing IIS directly. My page is running on a shared hosting account and I do not have direct access to IIS7, which it's running on. protected void Application_BeginRequest(object sender, EventArgs e) { // Implement HTTP compression if (Request["HTTP_X_MICROSOFTAJAX"] == null) // Avoid compressing AJAX calls { // Retrieve accepted encodings string encodings = Request.Headers.Get("Accept-Encoding"); if (encodings != null) { // Verify support for or gzip (deflate takes preference) encodings = encodings.ToLower(); if (encodings.Contains("gzip") || encodings == "*") { Response.Filter = new GZipStream(Response.Filter, CompressionMode.Compress); Response.AppendHeader("Content-Encoding", "gzip"); Response.Cache.VaryByHeaders["Accept-encoding"] = true; } else if (encodings.Contains("deflate")) { Response.Filter = new DeflateStream(Response.Filter, CompressionMode.Compress); Response.AppendHeader("Content-Encoding", "deflate"); Response.Cache.VaryByHeaders["Accept-encoding"] = true; } } } } Is anyone having better success with this?
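
    Since this is shared hosting on IIS7, one alternative to compressing inside Application_BeginRequest is to let IIS do it from web.config, which sidesteps the cached-output and double-encoding problems described above; a sketch, assuming the host delegates these settings:

        <!-- web.config -->
        <configuration>
          <system.webServer>
            <!-- static covers .css/.js files, dynamic covers page output -->
            <urlCompression doStaticCompression="true" doDynamicCompression="true" />
          </system.webServer>
        </configuration>

    If the hand-rolled filter is kept instead, the usual advice is to make sure it is never applied a second time to a response replayed from the output cache already compressed; double compression is the classic source of the garbage-after-refresh symptom.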

    Read the article

  • Rewriting Live TCP/IP (Layer 4) Streams

    - by user213060
    I want to rewrite TCP/IP streams. Ettercap's etterfilter command lets you perform simple live replacements of TCP/IP data based on fixed strings or regexes. Example: if (ip.proto == TCP && tcp.dst == 80) { if (search(DATA.data, "gzip")) { replace("gzip", " "); msg("whited out gzip\n"); } } if (ip.proto == TCP && tcp.dst == 80) { if (search(DATA.data, "deflate")) { replace("deflate", " "); msg("whited out deflate\n"); } } http://ettercap.sourceforge.net/forum/viewtopic.php?t=2833 I would like to rewrite streams based on my own filter program instead of just simple string replacements. Anyone have an idea of how to do this? Is there anything other than Ettercap that can do live replacement like this, maybe as a plugin to a VPN software or something? The rewriting should occur at the transport layer (Layer 4) as it does in this example, instead of a lower layer packet-based approach. Thanks!

    Read the article

  • How to disable Apache http compression (mod_deflate) when SSL stream is compressed

    - by Mohammad Ali
    I found that Google Chrome supports SSL compression and Firefox should support it soon. I'm trying to configure Apache to disable HTTP compression when SSL compression is used, to avoid the CPU overhead, with the configuration option: SetEnvIf SSL_COMPRESS_METHOD DEFLATE no-gzip While the custom log (using %{SSL_COMPRESS_METHOD}x) shows that the SSL-layer compression method is DEFLATE, the above option did not work and the HTTP response content is still being compressed by Apache. I had to use the option: BrowserMatchNoCase ".Chrome." no-gzip I would prefer a more general method, in case other browsers support SSL compression or someone has a version of Chrome that does not.
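
    One possible explanation for the SetEnvIf line not working is that SetEnvIf only examines request attributes and headers, while SSL_COMPRESS_METHOD is a mod_ssl variable that is not visible at that point. An untested sketch of a more general alternative, using mod_rewrite's %{SSL:...} lookup (assuming mod_rewrite and mod_ssl are both loaded):

        RewriteEngine On
        # if the TLS layer already compresses, ask mod_deflate to stand down
        RewriteCond %{SSL:SSL_COMPRESS_METHOD} =DEFLATE
        RewriteRule .* - [E=no-gzip:1]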

    Read the article

  • Serving Compressed Files: Amazon vs Lighttpd

    - by tike
    We are currently using Amazon CloudFront to serve CSS, and according to Amazon itself, CloudFront can serve both compressed and uncompressed files from an origin server. But when I check compression, everything looks fine on the origin server, while the same check against the CloudFront link reports it as not compressed. E.g. http://www.port80software.com/tools/compresscheck.asp?url=http%3A%2F%2Fimgsrv.mydomain.com%2Fen-UK%2Fsomething.css results in Compression status: (gzip), while with CloudFront http://www.port80software.com/tools/compresscheck.asp?url=http%3A%2F%2hereisit.cloudfront.net%2F%2Fsomething.css gives Compression status: Uncompressed. The origin server is running lighttpd with mod_deflate; the allowed config is: deflate.allowed_encodings = ("bzip2", "gzip", "deflate") [I would think an extra allowed encoding won't matter here.] I am clueless here - what is the real issue?
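
    One way to narrow this down is to ask the origin the way a CDN typically does and compare; CDNs of that era often spoke HTTP/1.0 to the origin, and some servers skip compression for HTTP/1.0 clients. A sketch with curl (hostname taken from the question):

        # as a normal browser would ask
        curl -s -o /dev/null -D - -H "Accept-Encoding: gzip,deflate" http://imgsrv.mydomain.com/en-UK/something.css

        # over HTTP/1.0, gzip only
        curl -s -o /dev/null -D - --http1.0 -H "Accept-Encoding: gzip" http://imgsrv.mydomain.com/en-UK/something.css

    If the Content-Encoding header disappears in the second case, the origin, not CloudFront, is the place to look.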

    Read the article

  • YSlow says certain CSS files are not gzipped

    - by rhand
    YSlow keeps on telling me files like http://www.example.com/wp-content/plugins/q-and-a/css/q-a-plus.css?ver=1.0.6.2 are not gzipped while the gzip test tool at Feed the Bot mentions I am all good: Compressed? Yes Compression type gzip Page size (Bytes) 32,493 Compressed size (Bytes) -1 Saving (Bytes) 32,494 Compression % 100% I added this to my .htaccess: # Gzip <ifModule mod_gzip.c> mod_gzip_on Yes mod_gzip_dechunk Yes mod_gzip_item_include file .(html?|txt|css|js|php|pl)$ mod_gzip_item_include handler ^cgi-script$ mod_gzip_item_include mime ^text/.* mod_gzip_item_include mime ^application/x-javascript.* mod_gzip_item_exclude mime ^image/.* mod_gzip_item_exclude rspheader ^Content-Encoding:.*gzip.* </ifModule> #Deflate <ifmodule mod_deflate.c> AddOutputFilterByType DEFLATE text/text text/html text/plain text/xml text/css application/x-javascript application/javascript </ifmodule> The header for the file mentioned states: CF-Cache-Status MISS CF-RAY 13945df90a9a0c1d-AMS Cache-Control public, max-age=2592000 Connection keep-alive Content-Encoding gzip Content-Type application/javascript Date Thu, 12 Jun 2014 07:34:38 GMT Expires Sat, 12 Jul 2014 07:34:38 GMT Last-Modified Thu, 21 Feb 2013 01:29:18 GMT Server cloudflare-nginx Transfer-Encoding chunked Vary Accept-Encoding Any ideas what I am missing here?

    Read the article

  • Apache LocationMatch throws 500 and AddOutputFilterByType does nothing

    - by tackleberry
    I need to add the directives below to Apache, but I get a 500 when I add these lines. <LocationMatch "^/assets/.*$"> Header unset ETag FileETag None # RFC says only cache for 1 year ExpiresActive On ExpiresDefault "access plus 1 year" </LocationMatch> Additionally, the response is not gzipped when I add: AddOutputFilterByType DEFLATE text/html text/css application/javascript application/x-javascript The Apache version is: Server version: Apache/2.2.22 (Unix) App: Rails 3.2 app. When I checked the request and response for the gzip problem, I saw that the browser requested gzip: Accept-Encoding gzip, deflate but the response was not gzipped.
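
    If these lines are going into an .htaccess file, that alone explains the 500: <LocationMatch> is not allowed in the .htaccess context, and Apache refuses the request when it hits the directive. A per-directory sketch expressing the same intent, keyed on file extension instead of the /assets/ URL prefix (assumes mod_headers, mod_expires and mod_deflate are loaded and AllowOverride permits them):

        <FilesMatch "\.(css|js|png|jpe?g|gif)$">
            Header unset ETag
            FileETag None
            ExpiresActive On
            ExpiresDefault "access plus 1 year"
        </FilesMatch>

        AddOutputFilterByType DEFLATE text/html text/css application/javascript application/x-javascript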

    Read the article

  • View a PDF with quick web view through an Apache proxy

    - by Musa
    I have a site(IIS) that is accessed via a proxy in apache(on an IBM i). This site serves PDFs which has quick web view and if I access a pdf directly from the IIS server the PDFs starts to display immediately but if I go through the proxy I have to wait until the entire pdf downloads before I can view it. In the apache config file I use ProxyPass /path/ http://xxx.xxx.xxx.xxx/ <LocationMatch "/path/"> Header set Cache-Control "no-cache" </LocationMatch> I tried adding SetEnv proxy-sendcl to LocationMatch directive this had no effect. The PDFs that view quickly makes a lot of partial requests This is the initial request and response headers GET http://xxx.xxx.xxx.xxx/xxx.PDF HTTP/1.1 Host: xxx.xxx.xxx.xxx Proxy-Connection: keep-alive Cache-Control: no-cache Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8 Pragma: no-cache User-Agent: Mozilla/5.0 (Windows NT 6.2; rv:9.0.1) Gecko/20100101 Firefox/9.0.1 Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8 Cookie: chocolatechip HTTP/1.1 200 OK Via: 1.1 xxxxxxxx Connection: Keep-Alive Proxy-Connection: Keep-Alive Content-Length: 15330238 Date: Mon, 25 Aug 2014 12:48:31 GMT Content-Type: application/pdf ETag: "b6262940bbecf1:0" Server: Microsoft-IIS/7.5 Last-Modified: Fri, 22 Aug 2014 13:16:14 GMT Accept-Ranges: bytes X-Powered-By: ASP.NET This is a partial request and response GET http://xxx.xxx.xxx.xxx/xxx.PDF HTTP/1.1 Host: xxx.xxx.xxx.xxx Proxy-Connection: keep-alive Cache-Control: no-cache Pragma: no-cache User-Agent: Mozilla/5.0 (Windows NT 6.2; rv:9.0.1) Gecko/20100101 Firefox/9.0.1 Accept: */* Referer: http://xxx.xxx.xxx.xxx/xxxx.PDF Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8 Cookie: chocolatechip Range: bytes=0-32767 HTTP/1.1 206 Partial Content Via: 1.1 xxxxxxxx Connection: Keep-Alive Proxy-Connection: Keep-Alive Content-Length: 32768 Date: Mon, 25 Aug 2014 12:48:31 GMT Content-Range: bytes 0-32767/15330238 Content-Type: application/pdf ETag: "b6262940bbecf1:0" Server: Microsoft-IIS/7.5 Last-Modified: Fri, 22 Aug 2014 13:16:14 GMT Accept-Ranges: bytes X-Powered-By: ASP.NET These are the headers I get if I go through he proxy GET /path/xxx.PDF HTTP/1.1 Host: domain:xxxx Connection: keep-alive Cache-Control: no-cache Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8 Pragma: no-cache User-Agent: Mozilla/5.0 (Windows NT 6.2; rv:9.0.1) Gecko/20100101 Firefox/9.0.1 Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8 HTTP/1.1 200 OK Date: Mon, 25 Aug 2014 13:28:42 GMT Server: Microsoft-IIS/7.5 Content-Type: application/pdf Last-Modified: Fri, 22 Aug 2014 13:16:14 GMT Accept-Ranges: bytes ETag: "b6262940bbecf1:0"-gzip X-Powered-By: ASP.NET Cache-Control: no-cache Expires: Thu, 24 Aug 2017 13:28:42 GMT Vary: Accept-Encoding Content-Encoding: gzip Keep-Alive: timeout=300, max=100 Connection: Keep-Alive Transfer-Encoding: chunked I'm guessing its because the proxy uses Transfer-Encoding: chunked but I'm not sure and wasn't able to turn it off to check. Browser Chrome 36.0.1985.143 m Using the native PDF viewer Any help to get the pdf quick web view through the proxy working would be appreciated.
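
    Quick web view relies on the viewer being able to issue the same Range requests seen in the direct-to-IIS traces; once the proxy gzips the PDF and switches to Transfer-Encoding: chunked, byte ranges are effectively lost, which matches the wait-for-the-whole-file behaviour. A sketch of excluding PDFs from compression on the Apache side (assumes mod_setenvif and mod_deflate are doing the compressing here, which the "-gzip" ETag suffix suggests):

        # don't compress PDFs, so Range/206 responses survive the proxy
        SetEnvIfNoCase Request_URI "\.pdf$" no-gzip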

    Read the article

  • Websocket handshake response not forwarded from TCP to client

    - by Saharsh
    I am trying to create a WebSocket server. I can see the WebSocket client's opening handshake. My response to it is received by the client laptop (I can see this in Wireshark), so the TCP connection has been established. But the client (a Chrome WebSocket client extension) does not receive the handshake packet. What could be a possible reason for TCP not forwarding the handshake to the client, or for the client not being able to read the TCP message? Client handshake: GET HTTP/1.1 Upgrade: websocket Connection:Upgrade Cache-Control:no-cache Host:192.168.0.101 Origin:http://www.websocket.org Pragma:no-cache Sec-WebSocket-Extensions:permessage-deflate; client_max_window_bits, x-webkit-deflate-frame Sec-WebSocket-Key: qrmw/m+BoZije6h9HYKmVw== Sec-WebSocket-Version:13 Upgrade:websocket Server Response: HTTP/1.1 101 Switching Protocols Upgrade: websocket Connection: Upgrade Sec-WebSocket-Accept: jj1g5Io57m9ks8cme3jkbyo2asc= Access-Control-Allow-Origin: http://www.websocket.org Server: xyz Sec-WebSocket-Extensions: Thanks!
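
    Two routine things to rule out on the server side are the accept-key derivation and the terminating blank line after the status/headers block. A sketch of the RFC 6455 computation (the key is the Sec-WebSocket-Key from the handshake above):

        import base64
        import hashlib

        GUID = "258EAFA5-E914-47DA-95CA-C5AB0DC85B11"   # fixed by RFC 6455
        client_key = "qrmw/m+BoZije6h9HYKmVw=="          # Sec-WebSocket-Key above

        accept = base64.b64encode(
            hashlib.sha1((client_key + GUID).encode("ascii")).digest()
        ).decode("ascii")
        print(accept)  # must match the Sec-WebSocket-Accept sent back, byte for byte

    The empty Sec-WebSocket-Extensions: header in the response above is also worth dropping while debugging, since clients can be strict about extension negotiation.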

    Read the article

  • Session Cookies and IE 8

    - by Matt Luongo
    I recently built a simple web-app deployed over Tomcat. The app uses pretty standard session based security where a user who has logged in is given a session. Sessions work fine in Firefox and Chrome, but require the use of jsessionid in the URL for IE (tested 7 & 8), set to medium privacy. In IE 8, I tried to override cookie handling, setting "Allow all 3rd party cookies" and "Allow all session cookies"- no dice. However, when I run Tomcat on my local machine, IE accepts the cookie, and sessions work just fine. And now, for the HTTP headers. From Chrome, a logged in user gets a session GET http://devl:8080/testing/ HTTP/1.1 Host: devl:8080 Connection: keep-alive User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.5 (KHTML, like Gecko) Chrome/4.1.249.1036 Safari/532.5 Accept: application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5 Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8 Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3 HTTP/1.1 200 OK Server: Apache-Coyote/1.1 P3P: CP="NON CURa ADMa DEVa TAIa OUR BUS IND UNI COM NAV INT STA" Set-Cookie: JSESSIONID=9280023BCE2046F32B13C89130CBC397; Path=/testing Content-Type: text/html;charset=UTF-8 Content-Language: en-US Content-Length: 2450 Date: Fri, 26 Mar 2010 14:14:40 GMT GET http://devl:8080/testing/logout HTTP/1.1 Host: devl:8080 Connection: keep-alive User-Agent: Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US) AppleWebKit/532.5 (KHTML, like Gecko) Chrome/4.1.249.1036 Safari/532.5 Referer: http://devl:8080/testing/ Accept: application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5 Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8 Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3 Cookie: JSESSIONID=9280023BCE2046F32B13C89130CBC397 ... From IE 8, with standard medium level security and privacy- GET http://devl:8080/testing/ HTTP/1.1 Accept: application/x-ms-application, image/jpeg, application/xaml+xml, image/gif, image/pjpeg, application/x-ms-xbap, */* Accept-Language: en-US User-Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Win64; x64; Trident/4.0; .NET CLR 2.0.50727; SLCC2; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; MDDC; Tablet PC 2.0) UA-CPU: AMD64 Accept-Encoding: gzip, deflate Host: devl:8080 Connection: Keep-Alive HTTP/1.1 200 OK Server: Apache-Coyote/1.1 P3P: CP="NON CURa ADMa DEVa TAIa OUR BUS IND UNI COM NAV INT STA" Set-Cookie: JSESSIONID=192999F922D6E9C868314452726764BA; Path=/testing Content-Type: text/html;charset=UTF-8 Content-Language: en-US Content-Length: 2450 Date: Fri, 26 Mar 2010 14:32:34 GMT GET http://devl:8080/testing/logout HTTP/1.1 Accept: application/x-ms-application, image/jpeg, application/xaml+xml, image/gif, image/pjpeg, application/x-ms-xbap, */* Referer: http://devl:8080/testing/;jsessionid=6371A83EFE39A46997544F9146AA5CEA Accept-Language: en-US User-Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 6.1; Win64; x64; Trident/4.0; .NET CLR 2.0.50727; SLCC2; .NET CLR 3.5.30729; .NET CLR 3.0.30729; Media Center PC 6.0; MDDC; Tablet PC 2.0) UA-CPU: AMD64 Accept-Encoding: gzip, deflate Connection: Keep-Alive Host: devl:8080 ... I thought it might be P3P, but on adding a compact policy, nothing changes. This is the standard Tomcat session, so I'm really surprised I haven't been able to find other people with the same problem so far. Anyone have any ideas?

    Read the article

  • Server removes all custom HTTP header fields

    - by MartinMoizard
    Hello, I've been trying to receive HTTP requests with custom fields in the headers, but it seems like my server removes them... I printed the headers of the request when I arrive on my page.php. I see this: body uri http://url.com/oauth.php/request_token parameters headers Array ....*/* ....gzip, deflate ....en-us ....keep-alive ....s320650601.onlinehome.fr ....DearStranger/1.0 CFNetwork/485.12.7 Darwin/10.6.0 method POST when I should be seeing this (it works on a local version): body uri http://localhost:8888/oauth.php/request_token parameters headers Array ....*/* ....gzip, deflate ....en-us ....OAuth realm="", oauth_consumer_key="582d95bd45d455fa2e5819f88fc0c5a104d2c7ff3", oauth_signature_method="HMAC-SHA1", oauth_signature="7mKKzEw0Clv237nBHFzcTcA3SCE%3D", oauth_timestamp="1295267612", oauth_nonce="C546644E-8918-4FA3-A2A0-DAADCF7D1E5A", oauth_version="1.0" ....keep-alive ....0 ....localhost:8888 ....DearStranger/1.0 CFNetwork/485.12.7 Darwin/10.6.0 method POST I am using PHP 5.2.17 on the server. Do you have any idea how to help me fix this issue? Thanks!
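
    The header that goes missing in the first dump is the OAuth Authorization header, and Apache commonly strips Authorization before PHP sees it when PHP runs as CGI/FastCGI, which is typical on shared hosting. A frequently used .htaccess workaround, assuming mod_rewrite is available:

        RewriteEngine On
        # re-expose the Authorization header to PHP; depending on the setup it
        # shows up as HTTP_AUTHORIZATION or REDIRECT_HTTP_AUTHORIZATION in $_SERVER
        RewriteRule .* - [E=HTTP_AUTHORIZATION:%{HTTP:Authorization}]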

    Read the article

  • Saving gzipped string into Sqlite3 via Rails throws unrecognized token error

    - by user141146
    Hi, I'm using rails 2.3 and I'm trying to compress (gzip) text that I'd like to save using ActiveRecord into a sqlite database. However, the compressed text isn't being saved b/c I get this type of error: SQLite3::SQLException: unrecognized token: "'x##U?#7 Any thoughts on what I can do to avoid this error? should I compress the text in some other fashion? should I save the data using some other method(s)? Relevant code is below. # compress my text require 'zlib' defl = Zlib::Deflate test_string = "<h3>some text</h3>some additional text<p>here's some more text</p>" compressed_string = defl.deflate(test_string) => "x\234\263\3110\266+\316\317MU(I\255(\261\321\207\361\022SR2K2\363\363\022s \022\005v\031\251E\251\352\305\n`\331\334\374\"\230\206\002;\000\0225\027\222" ModelClass.new(:attribute1 => compressed_string).save ActiveRecord::StatementInvalid: SQLite3::SQLException: unrecognized token: "'x#####+##MU(I#(#?####2K2#### v#E####
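
    The "unrecognized token" is what raw zlib output tends to look like when it ends up interpolated into SQL as text; two common ways around it are declaring the column as :binary or making the payload 7-bit safe before saving. A sketch of the second approach (ModelClass and :attribute1 are the placeholders from the question):

        require 'zlib'
        require 'base64'

        test_string = "<h3>some text</h3>some additional text<p>here's some more text</p>"
        compressed  = Zlib::Deflate.deflate(test_string)

        # store a text-safe form
        ModelClass.new(:attribute1 => Base64.encode64(compressed)).save

        # and to read it back
        record   = ModelClass.first
        original = Zlib::Inflate.inflate(Base64.decode64(record.attribute1))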

    Read the article

  • Webserver parsing Chrome input from POST request

    - by ravenspoint
    I am developing a small embedded web server. I want to add parsing of post requests, but I am having a problem with input password fields from Chrome. Firefox and IE work perfectly. The HTML: <form action=start.webem method=post> <input value="START" type=submit /><!--#webem start --> <p>Password: <input TYPE=PASSWORD name=yourname AUTOCOMPLETE=OFF /><br> </form> From Firefox I get POST /stop.webem HTTP/1.1 Host: 127.0.0.1:8080 User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US; rv:1.9.1.9) Gecko/20100315 Firefox/3.5.9 (.NET CLR 3.5.30729) Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8 Accept-Language: en-us,en;q=0.5 Accept-Encoding: gzip,deflate Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.7 Keep-Alive: 300 Connection: keep-alive Referer: http://127.0.0.1:8080/ Content-Type: application/x-www-form-urlencoded Content-Length: 13 yourname=test However from Chrome, about 90% of the time, the yourname=test is missing POST /start.webem HTTP/1.1 Host: 127.0.0.1:8080 Connection: keep-alive User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.5 (KHTML, like Gecko) Chrome/4.1.249.1045 Safari/532.5 Referer: http://127.0.0.1:8080/ Content-Length: 13 Cache-Control: max-age=0 Origin: http://127.0.0.1:8080 Content-Type: application/x-www-form-urlencoded Accept: application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5 Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8 Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3 Though, occasionally it does work!!! POST /start.webem HTTP/1.1 Host: 127.0.0.1:8080 Connection: keep-alive User-Agent: Mozilla/5.0 (Windows; U; Windows NT 5.1; en-US) AppleWebKit/532.5 (KHTML, like Gecko) Chrome/4.1.249.1045 Safari/532.5 Referer: http://127.0.0.1:8080/start.webem Content-Length: 13 Cache-Control: max-age=0 Origin: http://127.0.0.1:8080 Content-Type: application/x-www-form-urlencoded Accept: application/xml,application/xhtml+xml,text/html;q=0.9,text/plain;q=0.8,image/png,*/*;q=0.5 Accept-Encoding: gzip,deflate,sdch Accept-Language: en-US,en;q=0.8 Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3 yourname=test I cannot find what causes it to work sometimes.
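
    A classic cause of this exact symptom in hand-rolled servers is reading the request with a single recv(): Chrome often ships the headers and the POST body in separate TCP segments, so the body has to be read in a loop until Content-Length bytes have arrived. A sketch of that loop (shown in Python for brevity; the idea carries over to whatever language the server is written in):

        def read_request(sock):
            # read until the end of the headers
            data = b""
            while b"\r\n\r\n" not in data:
                chunk = sock.recv(4096)
                if not chunk:
                    return data, b""
                data += chunk
            header_blob, _, body = data.partition(b"\r\n\r\n")

            # keep reading until the announced body has fully arrived
            length = 0
            for line in header_blob.split(b"\r\n"):
                if line.lower().startswith(b"content-length:"):
                    length = int(line.split(b":", 1)[1])
            while len(body) < length:
                chunk = sock.recv(4096)
                if not chunk:
                    break
                body += chunk
            return header_blob, body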

    Read the article

  • Curl Wrapper Class does not return any data even though it worked previously?

    - by Scott Faisal
    We changed servers and installed all necessary software and just cannot seem to pin point what is going on. A simple CURL request does not return anything. Command Line CURL commands work just fine. We are using a wrapper for CURL utilizing streams. Do PHP streams require any out of the ordinary configuration? We are using the latest Lamp stack. This is the var_dump: object(cURL_Response)#180 (14) { ["cURL:private"]= resource(288) of type (curl) ["data_stream:private"]= object(elTempStream)#178 (1) { ["fp"]= resource(290) of type (stream) } ["request_header:private"]= NULL ["response_header:private"]= object(cURL_Headers)#179 (1) { ["headers:private"]= string(0) "" } ["response_headers:private"]= array(1) { [0]= object(cURL_Headers)#179 (1) { ["headers:private"]= string(0) "" } } ["error:private"]= string(0) "" ["errno:private"]= int(0) ["info:private"]= array(21) { ["url"]= string(21) "http://www.yahoo.com/" ["content_type"]= string(23) "text/html;charset=utf-8" ["http_code"]= int(200) ["header_size"]= int(1195) ["request_size"]= int(1153) ["filetime"]= int(-1) ["ssl_verify_result"]= int(0) ["redirect_count"]= int(1) ["total_time"]= float(0.486924) ["namelookup_time"]= float(0.003692) ["connect_time"]= float(0.005709) ["pretransfer_time"]= float(0.005714) ["size_upload"]= float(0) ["size_download"]= float(28509) ["speed_download"]= float(58549) ["speed_upload"]= float(0) ["download_content_length"]= float(211) ["upload_content_length"]= float(0) ["starttransfer_time"]= float(0.149365) ["redirect_time"]= float(0.312743) ["request_header"]= string(973) "GET / HTTP/1.0 User-Agent: cURL_ClientBase (PHP v/5.2.6-1+lenny4) Host: www.yahoo.com Accept: / Accept-Encoding: gzip, deflate, compress Referer: http://yahoo.com Cookie: B=e5iber15t7u05&b=3&s=ie; fpc_s=d=GGX6WCTIR29HWsjgLxFejKc_YJWxRqm3jYdEd6lu7W5ophpuAHBm6JGtNvhv97anG4VtaIMHQBPg3JAMOZGq59Lz_tRn_TFXgUT8T_at5HdCktVJLycy&v=2; fpt=d=nt1OT7HPe9wVIkHbMkpzQOgbP3.mQ3o1SPX7k5ztrFrWeeSWK5IgQooRY.8KtTeRMiaSEZ0kv3sO1MWtEsAzjVlRCDAZBoxqOs17v6PaZbPRqmDc92ivoMia.CqjufRs4_guOO4AyhRZ7_ml8rzxFrYeexpR2jLN0oPMyEWT0nbEf6Sdf._Bkh0HMfmI7KBnEx5uZBEEmV.wTfGRLG7zSd9sA4itOFv.r6AjP39CnogSn7NTJnqg_kEcKoiCM.lR5w_MqMc8IgWMBgSAZZgGEZpfmvxlQGnUzPwNh2pSpTe2wxFS3v1zPopDgoo2VsO3uzeyA3A_j7Hlk1P8T08DHbfr6ApDMUcr7d0QIt4pGYIxVV45XzfgpT7mgUdMei6VZrD9ozVQF0oqxrs1Ufri.XzPdB3NdQ--&v=1; fpc=d=sRPCfUfBTW96.RGiQn4hSkfi3p7WnPCAqYl5YoHecI7zjg7gH7PolscoPcq1Esm8dR.Rg1.AbQCpo2WBPXn1St96PpcjeCC.pj2.Upb3mKSRQkYPIVP1vQcL9nL7J8s9Z0VIXjiBFgSUcxyzDeUdP4us2YbVO3PbaVIwaIEfFsX3WI7YgiTbkrTGtwnFgoSYq6l8tnw-&v=2" } ["info_flagged:private"]= array(20) { [1048577]= string(21) "http://www.yahoo.com/" [2097154]= int(200) [2097166]= int(-1) [3145731]= float(0.486924) [3145732]= float(0.003692) [3145733]= float(0.005709) [3145734]= float(0.005714) [3145745]= float(0.149365) [3145747]= float(0.312743) [3145735]= float(0) [3145736]= float(28509) [3145737]= float(58549) [3145738]= float(0) [2097163]= int(1195) [2]= string(973) "GET / HTTP/1.0 User-Agent: cURL_ClientBase (PHP v/5.2.6-1+lenny4) Host: www.yahoo.com Accept: / Accept-Encoding: gzip, deflate, compress Referer: http://yahoo.com Cookie: B=e5iber15t7u05&b=3&s=ie; fpc_s=d=GGX6WCTIR29HWsjgLxFejKc_YJWxRqm3jYdEd6lu7W5ophpuAHBm6JGtNvhv97anG4VtaIMHQBPg3JAMOZGq59Lz_tRn_TFXgUT8T_at5HdCktVJLycy&v=2; 
fpt=d=nt1OT7HPe9wVIkHbMkpzQOgbP3.mQ3o1SPX7k5ztrFrWeeSWK5IgQooRY.8KtTeRMiaSEZ0kv3sO1MWtEsAzjVlRCDAZBoxqOs17v6PaZbPRqmDc92ivoMia.CqjufRs4_guOO4AyhRZ7_ml8rzxFrYeexpR2jLN0oPMyEWT0nbEf6Sdf._Bkh0HMfmI7KBnEx5uZBEEmV.wTfGRLG7zSd9sA4itOFv.r6AjP39CnogSn7NTJnqg_kEcKoiCM.lR5w_MqMc8IgWMBgSAZZgGEZpfmvxlQGnUzPwNh2pSpTe2wxFS3v1zPopDgoo2VsO3uzeyA3A_j7Hlk1P8T08DHbfr6ApDMUcr7d0QIt4pGYIxVV45XzfgpT7mgUdMei6VZrD9ozVQF0oqxrs1Ufri.XzPdB3NdQ--&v=1; fpc=d=sRPCfUfBTW96.RGiQn4hSkfi3p7WnPCAqYl5YoHecI7zjg7gH7PolscoPcq1Esm8dR.Rg1.AbQCpo2WBPXn1St96PpcjeCC.pj2.Upb3mKSRQkYPIVP1vQcL9nL7J8s9Z0VIXjiBFgSUcxyzDeUdP4us2YbVO3PbaVIwaIEfFsX3WI7YgiTbkrTGtwnFgoSYq6l8tnw-&v=2" [2097164]= int(1153) [2097165]= int(0) [3145743]= float(211) [3145744]= float(0) [1048594]= string(23) "text/html;charset=utf-8" } ["request_url:private"]= string(16) "http://yahoo.com" ["response_url:private"]= string(21) "http://www.yahoo.com/" ["status_code:private"]= int(200) ["cookies:private"]= array(0) { } ["request_headers"]= string(973) "GET / HTTP/1.0 User-Agent: cURL_ClientBase (PHP v/5.2.6-1+lenny4) Host: www.yahoo.com Accept: / Accept-Encoding: gzip, deflate, compress Referer: http://yahoo.com Cookie: B=e5iber15t7u05&b=3&s=ie; fpc_s=d=GGX6WCTIR29HWsjgLxFejKc_YJWxRqm3jYdEd6lu7W5ophpuAHBm6JGtNvhv97anG4VtaIMHQBPg3JAMOZGq59Lz_tRn_TFXgUT8T_at5HdCktVJLycy&v=2; fpt=d=nt1OT7HPe9wVIkHbMkpzQOgbP3.mQ3o1SPX7k5ztrFrWeeSWK5IgQooRY.8KtTeRMiaSEZ0kv3sO1MWtEsAzjVlRCDAZBoxqOs17v6PaZbPRqmDc92ivoMia.CqjufRs4_guOO4AyhRZ7_ml8rzxFrYeexpR2jLN0oPMyEWT0nbEf6Sdf._Bkh0HMfmI7KBnEx5uZBEEmV.wTfGRLG7zSd9sA4itOFv.r6AjP39CnogSn7NTJnqg_kEcKoiCM.lR5w_MqMc8IgWMBgSAZZgGEZpfmvxlQGnUzPwNh2pSpTe2wxFS3v1zPopDgoo2VsO3uzeyA3A_j7Hlk1P8T08DHbfr6ApDMUcr7d0QIt4pGYIxVV45XzfgpT7mgUdMei6VZrD9ozVQF0oqxrs1Ufri.XzPdB3NdQ--&v=1; fpc=d=sRPCfUfBTW96.RGiQn4hSkfi3p7WnPCAqYl5YoHecI7zjg7gH7PolscoPcq1Esm8dR.Rg1.AbQCpo2WBPXn1St96PpcjeCC.pj2.Upb3mKSRQkYPIVP1vQcL9nL7J8s9Z0VIXjiBFgSUcxyzDeUdP4us2YbVO3PbaVIwaIEfFsX3WI7YgiTbkrTGtwnFgoSYq6l8tnw-&v=2" }
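
    Given that the info block above reports http_code 200 and 28509 bytes downloaded, curl itself is clearly receiving data, so the loss is more likely in the stream wrapper than in the transfer. A bare-bones baseline to confirm that, using curl's built-in gzip/deflate handling (the URL mirrors the dump above):

        <?php
        $ch = curl_init('http://yahoo.com');
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);   // hand the body back as a string
        curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);   // the dump shows one redirect
        curl_setopt($ch, CURLOPT_ENCODING, '');           // advertise and decode gzip/deflate
        $body = curl_exec($ch);
        var_dump(strlen($body), curl_errno($ch), curl_error($ch));
        curl_close($ch);

    If this prints a sensible length, the next place to look is how elTempStream exposes the bytes it was handed (rewinding the stream before reading, for instance).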

    Read the article

  • GZipping CSS and JS files

    - by Ryan Giglio
    I'm using YSlow to improve the speed of my site, and I'm having trouble with the "compress components with gzip" grade. I have this in my .htaccess file: SetOutputFilter DEFLATE AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css application/x-javascript But YSlow is saying There are 4 plain text components that should be sent compressed * http://crewinyourcode.com/css/reset.css * http://crewinyourcode.com/css/inner-pages/index.css * http://crewinyourcode.com/script/css/jquery-ui-1.8.custom.css * http://crewinyourcode.com/js/inner-pages/index.js How can I gzip the css and js files? Also...I don't have access to the httpd.conf file.
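
    The SetOutputFilter DEFLATE line alone already compresses everything in that directory when mod_deflate is active, so CSS and JS still going out uncompressed usually means the module is not loaded on the shared host or the override is not permitted. A defensive sketch that also names the other JavaScript MIME types servers commonly use:

        <IfModule mod_deflate.c>
            AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css text/javascript application/javascript application/x-javascript
        </IfModule>

    If the files are still flagged after that, it is worth asking the host whether mod_deflate is enabled at all, since nothing in .htaccess can switch on a module that is not loaded.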

    Read the article

  • Rewriting Live TCP/IP (Layer 4) (i.e. Socket Layer) Streams

    - by user213060
    I have a simple problem which I'm sure someone here has done before... I want to rewrite Layer 4 TCP/IP streams (Not lower layer individual packets or frames.) Ettercap's etterfilter command lets you perform simple live replacements of Layer 4 TCP/IP streams based on fixed strings or regexes. Example ettercap scripting code: if (ip.proto == TCP && tcp.dst == 80) { if (search(DATA.data, "gzip")) { replace("gzip", " "); msg("whited out gzip\n"); } } if (ip.proto == TCP && tcp.dst == 80) { if (search(DATA.data, "deflate")) { replace("deflate", " "); msg("whited out deflate\n"); } } http://ettercap.sourceforge.net/forum/viewtopic.php?t=2833 I would like to rewrite streams based on my own filter program instead of just simple string replacements. Anyone have an idea of how to do this? Is there anything other than Ettercap that can do live replacement like this, maybe as a plugin to a VPN software or something? I would like to have a configuration similar to ettercap's silent bridged sniffing configuration between two Ethernet interfaces. This way I can silently filter traffic coming from either direction with no NATing problems. Note that my filter is an application that acts as a pipe filter, similar to the design of unix command-line filters: >[eth0] <----------> [my filter] <----------> [eth1]< What I am already aware of, but are not suitable: Tun/Tap - Works at the lower packet layer, I need to work with the higher layer streams. Ettercap - I can't find any way to do replacements other than the restricted capabilities in the example above. Hooking into some VPN software? - I just can't figure out which or exactly how. libnetfilter_queue - Works with lower layer packets, not TCP/IP streams. Again, the rewriting should occur at the transport layer (Layer 4) as it does in this example, instead of a lower layer packet-based approach. Exact code will help immensely! Thanks!

    Read the article

  • How do HTTP proxy caches decide between serving identity- vs. gzip-encoded resources?

    - by mrclay
    An HTTP server uses content negotiation to serve a single URL identity- or gzip-encoded based on the client's Accept-Encoding header. Now say we have a proxy cache like squid between clients and the httpd. If the proxy has cached both encodings of a URL, how does it determine which to serve? The non-gzip instance (not originally served with Vary) can be served to any client, but the encoded instances (having Vary: Accept-Encoding) can only be sent to clients with the same Accept-Encoding header value that was used in the original request. E.g. Opera sends "deflate, gzip, x-gzip, identity, *;q=0" but IE8 sends "gzip, deflate". According to the spec, then, caches shouldn't share content-encoded cache entries between the two browsers. Is this true?

    Read the article

  • Enable mod_deflate

    - by Scarface
    Hey guys, quick question: I am kind of a noob at administering my server right now, and I was just wondering if anyone could let me know what I am doing wrong in trying to enable mod_deflate. I have Apache 2.0+ and tried this code in my .htaccess file: AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css <FilesMatch "\\.(js|css|html|htm|php|xml)$"> SetOutputFilter DEFLATE </FilesMatch> It did not compress any of my files when I tested my site in Firebug. If anyone knows what I am doing wrong, I would really appreciate any pointers.
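
    mod_deflate silently does nothing when the module is not loaded, so before touching .htaccess it is worth confirming the module is actually enabled. A sketch, assuming a Debian/Ubuntu-style layout (on other layouts this is a LoadModule deflate_module line in httpd.conf):

        # enable the module and reload (Debian/Ubuntu layout)
        a2enmod deflate
        apache2ctl graceful

        # check what is loaded
        apache2ctl -M | grep deflate   # shared modules, Apache 2.2+
        httpd -l                       # modules compiled in statically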

    Read the article

  • All client browsers repeatedly asking for NTLM authentication when running through local proxy server

    - by Marko
    All client browsers repeatedly asking for NTLM authentication when running through local proxy server. When pointing browsers through the local proxy to the internet, some but not all clients are being repeatedley prompted to authenticate to the proxy server. I have inspected the headers using firefox live headers as well as fiddler, and in all cases the authentication prompts happen when requesting SSL resources. an example of this would be as follows: GET http://gmail.google.com/mail/ HTTP/1.1 Accept: image/gif, image/jpeg, image/pjpeg, image/pjpeg, application/x-shockwave- flash, application/x-ms-application, application/x-ms-xbap, application/vnd.ms- xpsdocument, application/xaml+xml, */* Accept-Language: en-gb User-Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729) Accept-Encoding: gzip, deflate Proxy-Connection: Keep-Alive Host: gmail.google.com GET http://gmail.google.com/mail/ HTTP/1.1 Accept: image/gif, image/jpeg, image/pjpeg, image/pjpeg, application/x-shockwave- flash, application/x-ms-application, application/x-ms-xbap, application/vnd.ms- xpsdocument, application/xaml+xml, */* Accept-Language: en-gb User-Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729) Accept-Encoding: gzip, deflate Proxy-Connection: Keep-Alive Host: gmail.google.com Proxy-Authorization: NTLM TlRMTVNTUAABAAAAB7IIogkACQAvAAAABwAHACgAAAAFASgKAAAAD1dJTlhQMUdGTEFHU0hJUDc= GET http://gmail.google.com/mail/ HTTP/1.1 Accept: image/gif, image/jpeg, image/pjpeg, image/pjpeg, application/x-shockwave- flash, application/x-ms-application, application/x-ms-xbap, application/vnd.ms- xpsdocument, application/xaml+xml, */* Accept-Language: en-gb User-Agent: Mozilla/4.0 (compatible; MSIE 8.0; Windows NT 5.1; Trident/4.0; .NET CLR 1.1.4322; .NET CLR 2.0.50727; .NET CLR 3.0.4506.2152; .NET CLR 3.5.30729) Accept-Encoding: gzip, deflate Proxy-Connection: Keep-Alive Proxy-Authorization: NTLM TlRMTVNTUAADA (more stuff goes here I cut it short) Host: gmail.google.com At this point the username and password prompt has appeared in the browser, it does not matter what is typed into this box, correct credentials, random nonsense the browser does not accept anything in this box it will continue to popup. If I press cancel, I sometimes get a http 407 error, but on other occasions I click cancel the website proceeds to download and show normally. This is repeatable with some clients running through my proxy server, but in other cases it does not happen at all. In the cases where a client computer works normally, the only difference I can see is that the 3rd request for SSL resource comes back with a 200 response, see below: CONNECT gmail.google.com:443 HTTP/1.0 User-Agent: Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0; MALC) Proxy-Connection: Keep-Alive Content-Length: 0 Host: gmail.google.com Pragma: no-cache Proxy-Authorization: NTLM TlRMTVNTUAADAAAAGAAYAIAAAA A SSLv3-compatible ClientHello handshake was found. I have tried resetting user accounts as well as computer accounts in Active Directory. User accounts and passwords that are being used are correct and the passwords have been reset so they are not out of sync. I have removed the clients and even the proxy server from the domain, and rejoined them. 
I have also installed a completely separate proxy server and get exactly the same problem when I point clients at that proxy on a different IP address.

    Read the article

  • Which browsers handle `Content-Encoding: gzip`, and which of them have any special requirements on encoding quality?

    - by user1049847
    I am creating a "hand-made" HTTP/1.0 and HTTP/1.1 server. I recently integrated a zip library, so now I can stream gzip-encoded data in and out. I wonder: which major browsers (living ones - IE6-IE10, Chrome, FF, etc.) send Accept-Encoding: deflate, gzip, ... and so can handle Content-Encoding: gzip today? Which of them send any quality expectations? And which of them can send gzip-encoded POST requests and multipart/form-data to my server?

    Read the article
