Search Results

Search found 3111 results on 125 pages for 'mod gzip'.


  • tar gzip slowing down server

    - by Josir
    I have a backup script that: compresses some files, generates an MD5, and copies the compressed file to another server; the other server then compares the MD5s (to find copy errors). Here is the core of the script:

        nice -n 15 tar -czvf $BKP $PATH_BKP/*.* \
          | xargs -I '{}' sh -c "test -f '{}' && md5sum '{}'" \
          | tee $MD5
        scp -l 80000 $BKP $SCP_BKP
        scp $MD5 $SCP_BKP

    This routine pushes the CPU to 90% during the gzip step, slowing down the production server. I tried adding nice -n 15, but the server still hangs. I've already read 1 but the conversation didn't help me. What is the best approach to solve this issue? I am open to new architectures/solutions :)
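
    A sketch of one way to keep the compression step from starving the box (not a drop-in replacement for the script above; it only covers the tar/gzip part and assumes GNU tar, gzip and util-linux ionice are available): run the pipeline in the idle I/O class and use a lighter gzip level so less CPU is spent per byte. A parallel compressor such as pigz is another option if spare cores, rather than total CPU, are the constraint.

        # Compression step only; MD5 generation and scp stay as in the original script.
        nice -n 19 ionice -c3 tar -cf - $PATH_BKP/*.* \
          | nice -n 19 gzip -1 > $BKP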

    Read the article

  • htaccess mod rewrite changes http://www to http:/www

    - by Nir
    I want to replace calls like this:

        www.mysite.com/sub/file.php?param1=x&param2=http://www.someurl.com

    with:

        www.mysite.com/sub/param1/param2

    Param 1 is an integer; Param 2 is a URL. I wrote this rewrite rule in .htaccess:

        RewriteCond %{REQUEST_URI} \/sub\/
        RewriteRule sub\/([0-9]+)\/(.*)$ sub\/file.php?param1=$2&param2=$1 [L]

    Unfortunately param2 (the URL) starts with http:/www.someurl.com instead of http://www.someurl.com (note the single slash). Any idea what causes it? When I call the same file with the same parameters in the format www.mysite.com/sub/file.php?param1=x&param2=http://www.someurl.com, param2 does appear OK, so it must be something with the rewrite rule.
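
    A common explanation for the single slash is that Apache collapses repeated slashes in the URL path before mod_rewrite sees it. A hedged, untested sketch of one workaround is to capture the URL from %{THE_REQUEST}, which still holds the raw request line with both slashes intact:

        RewriteCond %{THE_REQUEST} \s/sub/([0-9]+)/(\S+)\s
        RewriteRule ^sub/ sub/file.php?param1=%1&param2=%2 [L]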

    Read the article

  • mod rewrite escape chars

    - by Eugene
    Hi, simple question: I want www.site.com/page/3 to be www.site.com/page--3, so that I can still read the page name and the page number. This is what I've got, but I don't know how to use the '--' instead of the slash:

        RewriteRule ^/?([a-zA-Z0-9-]+)/([a-zA-Z0-9-]+)$ index.php?page=$1&subpage=$2

    Any help appreciated.
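
    A hedged sketch (untested) of a rule that matches the "--" separator directly; with the trailing "--<number>" anchored at the end of the URL, the page name may still contain single hyphens:

        RewriteRule ^/?([a-zA-Z0-9-]+)--([0-9]+)/?$ index.php?page=$1&subpage=$2 [L,QSA]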

    Read the article

  • gzip specific files

    - by byTheDrop
    For some reason these files are not being gzipped on my Apache server; Chrome's Network tab shows this. Is there a specific directive I can add to .htaccess to compress these files? Compressing the following resources with gzip could reduce their transfer size by about two thirds (~680.45KB): adae8bc4c3cb52cbe22358aaced87a72.css could save ~607B css_f91fa8d73b5e7661d6dcf9e58395e533.css could save ~59.54KB jquery.min.js could save ~37.27KB drupal.js could save ~6.15KB auto_image_handling.js could save ~6.72KB lightbox.js could save ~29.38KB superfish.js could save ~2.42KB jquery.bgiframe.min.js could save ~1011B jquery.hoverIntent.minified.js could save ~1.05KB nice_menus.js could save ~581B panels.js could save ~531B jquery.pngFix.js could save ~2.98KB jquery.cycle.all.min.js could save ~20.20KB views_slideshow.js could save ~8.76KB views_slideshow.js could save ~9.02KB wanderlust_custom_videos.js could save ~598B wl_helper.js could save ~777B extlink.js could save ~2.88KB cufon-yui.js could save ~11.89KB googleanalytics.js could save ~1.48KB swfobject.js could save ~6.65KB jquery.jcarousel.min.js could save ~10.19KB jcarousel.js could save ~6.01KB Akzidenz_Grotesk_BE_Super_800.font.js could save ~14.27KB Akzidenz_Grotesk_BE_Bold_700.font.js could save ~12.96KB Akzidenz_Grotesk_BE_Cn_400.font.js could save ~13.39KB SuperCondensed_500.font.js could save ~24.40KB FuturaBold_700.font.js could save ~26.19KB Futura_500.font.js could save ~57.70KB SuperGroteskB_500.font.js could save ~23.86KB jquery.cookie.js could save ~1.25KB wanderlust.js could save ~1.69KB sliderbottom.js could save ~442B jcarousellite_1.0.1.min.js could save ~4.60KB jcarousellite_control.js could save ~224B sitesdropdown.js could save ~1.09KB widgets.js could save ~50.13KB cufon-drupal.js could save ~599B swfobject_api.js could save ~348B ga.js could save ~24.02KB all.js could save ~124.67KB tweet_button.1347008535.html could save ~38.43KB xd_arbiter.php could save ~16.80KB xd_arbiter.php could save ~16.80KB
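
    One approach often suggested for this (a sketch, assuming mod_deflate is available on the server and allowed in .htaccess) is to compress the CSS and JavaScript on the fly:

        <IfModule mod_deflate.c>
            <FilesMatch "\.(css|js)$">
                SetOutputFilter DEFLATE
            </FilesMatch>
        </IfModule>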

    Read the article

  • WCF GZip Compression Request/Response Processing

    - by IanT8
    How do I get a WCF client to process server responses which have been GZipped or Deflated by IIS? On IIS, I've followed the instructions here on how to make IIS 6 gzip all responses (where the request contained "Accept-Encoding: gzip, deflate") emitted by .svc WCF services. On the client, I've followed the instructions here and here on how to inject this header into the web request: "Accept-Encoding: gzip, deflate". Fiddler2 shows the response is binary and not plain old XML. The client crashes with an exception which basically says there's no XML header, which of course is true. In my IClientMessageInspector, the app crashes before AfterReceiveReply is called. Some further notes: (1) I can't change the WCF service or client as they are supplied by a 3rd party. I can however attach behaviors and/or message inspectors via configuration if this is the right direction to take. (2) I don't want to compress/uncompress just the SOAP body, but the entire message. Any ideas/solutions?

    * SOLVED * It was not possible to write a WCF extension to achieve these goals. Instead I followed this CodeProject article which advocates a helper class:

        public class CompressibleHttpRequestCreator : IWebRequestCreate
        {
            public CompressibleHttpRequestCreator() { }

            WebRequest IWebRequestCreate.Create(Uri uri)
            {
                HttpWebRequest httpWebRequest = Activator.CreateInstance(typeof(HttpWebRequest),
                    BindingFlags.CreateInstance | BindingFlags.Public |
                    BindingFlags.NonPublic | BindingFlags.Instance,
                    null, new object[] { uri, null }, null) as HttpWebRequest;

                if (httpWebRequest == null)
                {
                    return null;
                }

                httpWebRequest.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
                return httpWebRequest;
            }
        }

    and also an addition to the application configuration file:

        <configuration>
          <system.net>
            <webRequestModules>
              <remove prefix="http:"/>
              <add prefix="http:" type="Pajocomo.Net.CompressibleHttpRequestCreator, Pajocomo" />
            </webRequestModules>
          </system.net>
        </configuration>

    What seems to be happening is that WCF eventually asks some factory deep down in System.Net to provide an HttpWebRequest instance, and we provide the helper that will be asked to create the required instance. In the WCF client configuration file, a simple basicHttpBinding is all that is required, without the need for any custom extensions. When the application runs, the client HTTP request contains the header "Accept-Encoding: gzip, deflate", the server returns a gzipped web response, and the client transparently decompresses the HTTP response before handing it over to WCF.

    When I tried to apply this technique to Web Services I found that it did NOT work. Although the helper class was executed in the same way as when used by the WCF client, the HTTP request did not contain the "Accept-Encoding: ..." header. To make this work for Web Services, I had to edit the Web Proxy class and add this method:

        protected override System.Net.WebRequest GetWebRequest(Uri uri)
        {
            System.Net.HttpWebRequest rq = (System.Net.HttpWebRequest)base.GetWebRequest(uri);
            rq.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
            return rq;
        }

    Note that it did not matter whether the CompressibleHttpRequestCreator and the <webRequestModules> block from the application config file were present or not. For Web Services, only overriding GetWebRequest in the Web Service proxy worked.

    Read the article

  • Package management fails in update-manager with gzip problems and compilation errors. U12.04LTS

    - by HarveyP
    Similar to but not the same as Package management system corrupted. Cannot install or remove packages. U12.04LTS (an earlier problem) with package management system. Followed all of L. D. James suggestions in his answer to no avail. This time as well as the gzip error I am also getting compilation errors. The difference may be due to a lack of compilation in my earlier problem so it may be the same error. The packages concerned are enumerated in the output from update-manager below. Also included below that is the output from apt-get -f install apt-get autoremove gives same output. Tried update without SSL updates - 9 to install and got "Unhandled Error in aptdaemon". Output number 3 below. One at a time - output 4 - is for firefox, first in the list of packages. Falls over at libssl1.0.0 despite deselection of it from update ... Tried apt-get install --reinstall dpkg which succeeded, apt-get install --reinstall tar and apt-get install --reinstall gzip both of which failed at libssl1.0.0 as ever. (as suggested by Subv3rsion elsewhere in this forum) Now cannot apt-get update with complete success even after changing server and apt-get clean - output number 5 below ... 1). Output from update-manager The following packages will be upgraded:<> firefox firefox-globalmenu firefox-locale-en libavcodec-extra-53 libavformat53 libavutil-extra-51 libjson0 libpostproc52 libssl1.0.0 libswscale2 openssl 11 to upgrade, 0 to newly install, 0 to remove and 0 not to upgrade.<br> Need to get 0 B/46.5 MB of archives. After this operation, 1,416 kB of additional disk space will be used.<br> Do you want to continue [Y/n]? y debconf: Perl may be unconfigured (Bareword "gensym" not allowed while "strict subs" in use at /usr/lib/perl/5.14/IO/Handle.pm line 67. BEGIN not safe after errors--compilation aborted at /usr/lib/perl/5.14/IO/Handle.pm line 366. Compilation failed in require at /usr/lib/perl/5.14/IO/Seekable.pm line 9. BEGIN failed--compilation aborted at /usr/lib/perl/5.14/IO/Seekable.pm line 9. Compilation failed in require at /usr/lib/perl/5.14/IO/File.pm line 11. BEGIN failed--compilation aborted at /usr/lib/perl/5.14/IO/File.pm line 11. Compilation failed in require at /usr/share/perl/5.14/FileHandle.pm line 9. Compilation failed in require at (eval 1) line 3. BEGIN failed--compilation aborted at (eval 1) line 3. ) -- aborting (Reading database ... 160575 files and directories currently installed.) Preparing to replace libssl1.0.0 1.0.1-4ubuntu5.14 (using .../libssl1.0.0_1.0.1-4ubuntu5.15_i386.deb) ... Unpacking replacement libssl1.0.0 ... dpkg-deb (subprocess): data: internal gzip read error: '<fd:4>: data error' dpkg-deb: error: subprocess <decompress> returned error exit status 2 dpkg: error processing /var/cache/apt/archives/libssl1.0.0_1.0.1-4ubuntu5.15_i386.deb (--unpack):<br> subprocess dpkg-deb --fsys-tarfile returned error exit status 2 No apport report written because MaxReports has already been reached Bareword "gensym" not allowed while "strict subs" in use at /usr/lib/perl/5.14/IO/Handle.pm line 67. BEGIN not safe after errors--compilation aborted at /usr/lib/perl/5.14/IO/Handle.pm line 366. Compilation failed in require at /usr/lib/perl/5.14/IO/Seekable.pm line 9. BEGIN failed--compilation aborted at /usr/lib/perl/5.14/IO/Seekable.pm line 9. Compilation failed in require at /usr/lib/perl/5.14/IO/File.pm line 11. BEGIN failed--compilation aborted at /usr/lib/perl/5.14/IO/File.pm line 11. Compilation failed in require at /usr/share/perl/5.14/FileHandle.pm line 9. 
Compilation failed in require at /usr/share/perl5/Debconf/Template.pm line 8. BEGIN failed--compilation aborted at /usr/share/perl5/Debconf/Template.pm line 8. Compilation failed in require at /usr/share/perl5/Debconf/Question.pm line 8. BEGIN failed--compilation aborted at /usr/share/perl5/Debconf/Question.pm line 8. Compilation failed in require at /usr/share/perl5/Debconf/Config.pm line 7. BEGIN failed--compilation aborted at /usr/share/perl5/Debconf/Config.pm line 7. Compilation failed in require at /usr/share/perl5/Debconf/Log.pm line 10. Compilation failed in require at /usr/share/perl5/Debconf/Db.pm line 7. BEGIN failed--compilation aborted at /usr/share/perl5/Debconf/Db.pm line 7. Compilation failed in require at /usr/share/debconf/frontend line 6. BEGIN failed--compilation aborted at /usr/share/debconf/frontend line 6. dpkg: error whale cleanang up: subprgcess installed post-installation script returned error exit status 2 Errors were encountered while processing: /var/cache/apt/archives/libssl1.0.0_1.0.1-4ubuntu5.15_i386.deb E: Sub-process /usr/bin/dpkg returned an error code (1) 2). Output from install -f harveyp@harveyp:~$ sudo apt-get -f install [sudo] password for harveyp: Reading package lists... Done Building dependency tree Reading state information... Done 0 to upgrade, 0 to newly install, 0 to remove and 11 not to upgrade. 1 not fully installed or removed.<br> After this operation, 0 B of additional disk space will be used. E: Internal Error, No file name for libssl1.0.0 3). Unhandled error from aptdaemon Traceback (most recent call last): File "/usr/lib/python2.7/dist-packages/aptdaemon/worker.py", line 1045, in _simulate trans.unauthenticated = self.__simulate(trans) File "/usr/lib/python2.7/dist-packages/aptdaemon/worker.py", line 1160, in __simulate unauthenticated = self._get_unauthenticated() File "/usr/lib/python2.7/dist-packages/aptdaemon/worker.py", line 347, in _get_unauthenticated for pkg in self._iterate_packages(): File "/usr/lib/python2.7/dist-packages/aptdaemon/worker.py", line 1356, in _iterate_packages for enum, pkg in enumerate(self._cache): File "/usr/lib/python2.7/dist-packages/apt/cache.py", line 216, in __iter__ yield self[pkgname] File "/usr/lib/python2.7/dist-packages/apt/cache.py", line 201, in __getitem__ pkg = self._weakref[key] = Package(self, self._cache[key]) KeyError: 'librqrcode-rubq-doc 4). output from update of firefox installArchives() failed: Error in function: < Setting up libssl1.0.0 (1.0.1-4ubuntu5.14) ... Bareword "gensym" not allowed while "strict subs" in use at /usr/lib/perl/5.14/IO/Handle.pm line 67. BEGIN not safe after errors--compilation aborted at /usr/lib/perl/5.14/IO/Handle.pm line 366. Compilation failed in require at /usr/lib/perl/5.14/IO/Seekable.pm line 9. BEGIN failed--compilation aborted at /usr/lib/perl/5.14/IO/Seekable.pm line 9. Compilation failed in require at /usr/lib/perl/5.14/IO/File.pm line 11. BEGIN failed--compilation aborted at /usr/lib/perl/5.14/IO/File.pm line 11. Compilation failed in require at /usr/share/perl/5.14/FileHandle.pm line 9. Compilation failed in require at /usr/share/perl5/Debconf/Template.pm line 8. BEGIN failed--compilation aborted at /usr/share/perl5/Debconf/Template.pm line 8. Compilation failed in require at /usr/share/perl5/Debconf/Question.pm line 8. BEGIN failed--compilation aborted at /usr/share/perl5/Debconf/Question.pm line 8. Compilation failed in require at /usr/share/perl5/Debconf/Config.pm line 7. 
BEGIN failed--compilation aborted at /usr/share/perl5/Debconf/Config.pm line 7. Compilation failed in require at /usr/share/perl5/Debconf/Log.pm line 10. 5. output from apt-get update ...snip ... Hit http://ubuntu-archive.mirrors.free.org precise-security/multiverse Translation-en Hit http://ubuntu-archive.mirrors.free.org precise-security/restricted Translation-en Hit http://ubuntu-archive.mirrors.free.org precise-security/universe Translation-en Fetched 368 kB in 6s (59.5 kB/s) W: Failed to fetch gzip:/var/lib/apt/lists/partial/ubuntu-archive.mirrors.free.org_ubuntu_dists_precise_universe_source_Sources Hash Sum mismatch E: Some index files failed to download. They have been ignored, or old ones used instead.
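
    A sketch of one commonly suggested recovery path, given that the gzip read errors and Hash Sum mismatches above point at corrupted downloads (the package name matches the failing .deb in the output; run at your own risk):

        sudo apt-get clean                # discard cached .debs, including the bad libssl one
        sudo rm -rf /var/lib/apt/lists/*  # discard the index files behind the Hash Sum mismatch
        sudo apt-get update               # re-fetch the package lists (ideally from another mirror)
        sudo apt-get install --reinstall libssl1.0.0

    If freshly downloaded files keep failing the same way, it is worth testing RAM and disk (e.g. memtest86+), since repeated corruption is usually hardware rather than apt.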

    Read the article

  • Dreamweaver and GZIP files

    - by Vian Esterhuizen
    Hi, I've recently tried to optimize my site for speed and bandwidth. Amongst many other techniques, I've used GZIP on my .css and .js files. Using PuTTY I compressed the files on my site and then put the following in my .htaccess file so that they get served properly, because all my links are without the ".gz":

        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteCond %{HTTP:Accept-encoding} gzip
        RewriteCond %{HTTP_USER_AGENT} !Konqueror
        RewriteCond %{REQUEST_FILENAME}.gz -f
        RewriteRule ^(.*)\.css$ $1.css.gz [QSA,L]
        RewriteRule ^(.*)\.js$ $1.js.gz [QSA,L]
        <FilesMatch \.css\.gz$>
        ForceType text/css
        </FilesMatch>
        <FilesMatch \.js\.gz$>
        ForceType text/javascript
        </FilesMatch>
        </IfModule>
        <IfModule mod_mime.c>
        AddEncoding gzip .gz
        </IfModule>

    My problem is that I can't work on the GZIP files in Dreamweaver. Is there a plugin or extension of some sort that allows Dreamweaver to temporarily uncompress these files so it can read them? Or is there a way I can work on my local copies as regular files, and have them automatically compressed server-side when they are uploaded? Or is there a different code editor I should be using that would get around this completely? Or just a different technique for doing this? I hope this question makes sense. Thanks
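
    One way to keep the workflow simple (a sketch, assuming shell access to the server; the path is a placeholder) is to treat the plain .css/.js files as the editable sources and regenerate the .gz copies on the server after each upload, so the editor never has to open a compressed file:

        find /path/to/site -type f \( -name '*.css' -o -name '*.js' \) \
          -exec sh -c 'gzip -9 -c "$1" > "$1.gz"' _ {} \;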

    Read the article

  • Tomcat gzip while chunked issue

    - by hoodoos
    I'm experiencing a problem with one of my data source services. According to the HTTP response headers it's running on Apache-Coyote/1.1. The server gives responses with Transfer-Encoding: chunked; here is a sample response:

        HTTP/1.1 200 OK
        Server: Apache-Coyote/1.1
        Content-Type: text/xml;charset=utf-8
        Transfer-Encoding: chunked
        Content-Encoding: gzip
        Date: Tue, 30 Mar 2010 06:13:52 GMT

    The problem is that when I ask the server for a gzipped response, it often does not send the full response. I receive the response, see that the last chunk was received, but after ungzipping I can see the response is partial. I have never seen this behavior with gzip turned off in the request headers. So my question is: is this a common Tomcat issue? Maybe one of its compression mods? Or could it be some kind of proxy issue? I can't tell which version of Tomcat or which gzip mod they use, but feel free to ask and I'll try asking my service provider. Thanks.
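
    A sketch for reproducing this outside the application (curl and gzip assumed available; the URL is a placeholder): fetch the gzipped response to a file and test whether the stream is actually complete.

        curl -s -H 'Accept-Encoding: gzip' -D resp.hdr -o resp.gz http://example.com/service
        gunzip -c < resp.gz > /dev/null && echo "gzip stream complete" || echo "gzip stream truncated"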

    Read the article

  • Any way to improve my gzip PHP method?

    - by Joe
    I currently gzip my pages like so:

        <?php
        ob_start("ob_gzhandler");
        //my page content
        ob_flush();
        ?>

    However, I read a comment somewhere that this method uses a lot of memory, and I know that my website has been using a lot of memory on my virtual private server, so it would be nice to know a way to reduce memory usage. I tested my site with an online gzip tester which says my pages are being sent gzipped, so my gzip method works, but the main concern, obviously, is memory: I'm looking for a less memory-intensive option, if there is one. I appreciate all suggestions. :) Oh, and merry Christmas ;P
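
    One commonly suggested lower-memory alternative (a sketch, assuming you can edit php.ini or a per-directory INI file) is to let PHP's zlib output handler compress the output in chunks as it is sent, instead of holding the whole page in an ob_gzhandler buffer:

        ; php.ini (or .user.ini)
        zlib.output_compression = On
        zlib.output_compression_level = 4

    Note that ob_gzhandler and zlib.output_compression should not be enabled at the same time.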

    Read the article

  • Built-in GZip/Deflate Compression on IIS 7.x

    - by Rick Strahl
    IIS 7 improves internal compression functionality dramatically, making it much easier than previous versions to take advantage of compression that's built into the Web server. IIS 7 also supports dynamic compression, which allows automatic compression of content created in your own applications (ASP.NET or otherwise!). The scheme is based on content-type sniffing and so it works with any kind of Web application framework. While static compression on IIS 7 is super easy to set up and turned on by default for most text content (text/*, which includes HTML and CSS, as well as for JavaScript, Atom, XAML, XML), setting up dynamic compression is a bit more involved, mostly because the various default compression settings are set in multiple places down the IIS –> ASP.NET hierarchy. Let's take a look at each of the two approaches available:

    Static Compression: Compresses static content from the hard disk. IIS can cache this content by compressing the file once, storing the compressed file on disk, and serving the compressed alias whenever static content is requested and it hasn't changed. The overhead for this is minimal and it should be aggressively enabled.

    Dynamic Compression: Works against application-generated output from applications like your ASP.NET apps. Unlike static content, dynamic content must be compressed every time a page that requests it regenerates its content. As such, dynamic compression has a much bigger impact than static caching.

    How Compression is configured

    Compression in IIS 7.x is configured with two .config file elements in the <system.webServer> space. The elements can be set anywhere in the IIS/ASP.NET configuration pipeline, all the way from ApplicationHost.config down to the local web.config file. The following is from the default setting in ApplicationHost.config (in the %windir%\System32\inetsrv\config folder) on IIS 7.5, with a couple of small adjustments (added json output and enabled dynamic compression):

        <?xml version="1.0" encoding="UTF-8"?>
        <configuration>
          <system.webServer>
            <httpCompression directory="%SystemDrive%\inetpub\temp\IIS Temporary Compressed Files">
              <scheme name="gzip" dll="%Windir%\system32\inetsrv\gzip.dll" staticCompressionLevel="9" />
              <dynamicTypes>
                <add mimeType="text/*" enabled="true" />
                <add mimeType="message/*" enabled="true" />
                <add mimeType="application/x-javascript" enabled="true" />
                <add mimeType="application/json" enabled="true" />
                <add mimeType="*/*" enabled="false" />
              </dynamicTypes>
              <staticTypes>
                <add mimeType="text/*" enabled="true" />
                <add mimeType="message/*" enabled="true" />
                <add mimeType="application/x-javascript" enabled="true" />
                <add mimeType="application/atom+xml" enabled="true" />
                <add mimeType="application/xaml+xml" enabled="true" />
                <add mimeType="*/*" enabled="false" />
              </staticTypes>
            </httpCompression>
            <urlCompression doStaticCompression="true" doDynamicCompression="true" />
          </system.webServer>
        </configuration>

    You can find documentation on the httpCompression and urlCompression keys here respectively:

        http://msdn.microsoft.com/en-us/library/ms690689%28v=vs.90%29.aspx
        http://msdn.microsoft.com/en-us/library/aa347437%28v=vs.90%29.aspx

    The httpCompression Element – What and How to compress

    Basically httpCompression configures what types to compress and how to compress them. It specifies the DLL that handles gzip encoding and the types of documents that are to be compressed. Types are set up based on mime types, which looks at returned Content-Type headers in HTTP responses.
    For example, I added the application/json mime type to my dynamic compression types above to allow that content to be compressed as well, since I have quite a bit of AJAX content that gets sent to the client.

    The urlCompression Element – Enables and Disables Compression

    The urlCompression element is a quick way to turn compression on and off. By default static compression is enabled server-wide, and dynamic compression is disabled server-wide. This might be a bit confusing because the httpCompression element also has a doDynamicCompression attribute which is set to true by default, but the urlCompression attribute by the same name actually overrides it. The urlCompression element only has three attributes: doStaticCompression, doDynamicCompression and dynamicCompressionBeforeCache. The doCompression attributes are the final determining factor in whether compression is enabled, so it's a good idea to be explicit! The default for doDynamicCompression is "false", but doStaticCompression defaults to "true"!

    Static Compression is enabled by Default, Dynamic Compression is not

    Because static compression is very efficient in IIS 7, it's enabled by default server-wide and there probably is no reason to ever change that setting. Dynamic compression however, since it's more resource intensive, is turned off by default. If you want to enable dynamic compression there are a few quirks you have to deal with, namely that enabling it in ApplicationHost.config doesn't work. Setting:

        <urlCompression doDynamicCompression="true" />

    in ApplicationHost.config appears to have no effect, and I had to move this element into my local web.config to make dynamic compression work. This is actually a smart choice because you're not likely to want dynamic compression in every application on a server. Rather, dynamic compression should be applied selectively where it makes sense. However, nowhere is it documented that the setting in ApplicationHost.config doesn't work (or more likely is overridden somewhere and disabled lower in the configuration hierarchy). So: remember to set doDynamicCompression="true" in web.config!!!

    How Static Compression works

    Static compression works against static content loaded from files on disk. Because this content is static and not bound to change frequently – such as .js, .css and static HTML content – it's fairly easy for IIS to compress and then cache the compressed content. The way this works is that IIS compresses the files into a special folder on the server's hard disk and then reads the content from this location if already-compressed content is requested and the underlying file resource has not changed. The semantics of serving an already compressed file are very efficient – IIS still checks for file changes, but otherwise just serves the already compressed file from the compression folder. The compression folder is located at:

        %windir%\inetpub\temp\IIS Temporary Compressed Files\ApplicationPool\

    If you look into the subfolders you'll find compressed files; these files are pre-compressed and IIS serves them directly to the client until the underlying files are changed. As I mentioned before, static compression is on by default and there's very little reason to turn that functionality off, as it is efficient and just works out of the box. The one tweak you might want to do is to set the compression level to maximum. Since IIS only compresses content very infrequently it would make sense to apply maximum compression.
    You can do this with the staticCompressionLevel setting on the scheme element:

        <scheme name="gzip" dll="%Windir%\system32\inetsrv\gzip.dll" staticCompressionLevel="9" />

    Other than that the default settings are probably just fine.

    Dynamic Compression – not so fast!

    By default dynamic compression is disabled, and that's actually quite sensible – you should use dynamic compression very carefully and think about what content you want to compress. In most applications it wouldn't make sense to compress *all* generated content, as it would generate a significant amount of overhead. Scott Forsyth has a great post that details some of the performance numbers and how much impact dynamic compression has. Depending on how busy your server is you can play around with compression and see what impact it has on your server's performance. There are also a few settings you can tweak to minimize the overhead of dynamic compression. Specifically the httpCompression key has a couple of CPU-related keys that can help minimize the impact of dynamic compression on a busy server:

        dynamicCompressionDisableCpuUsage
        dynamicCompressionEnableCpuUsage

    By default these are set to 90 and 50, which means that when the CPU hits 90% compression will be disabled until CPU utilization drops back down to 50%. Again this is actually quite sensible, as it uses spare CPU power for compression when available and backs off when the threshold has been hit. It's a good way to use some of that extra CPU power on your big servers when utilization is low. Again, these settings are something you likely have to play with. I would probably set the upper limit a little lower than 90%, maybe around 70%, to make this a feature that kicks in only if there's lots of power to spare. I'm not really sure how accurate the CPU readings that IIS uses are, as CPU usage on Web servers can spike drastically even during low loads. Don't trust settings – do some load testing or monitor your server in a live environment to see what values make sense for your environment. Finally, for dynamic compression I tend to add one mime type for JSON data, since a lot of my applications send large chunks of JSON data over the wire. You can do that with the application/json content type:

        <add mimeType="application/json" enabled="true" />

    What about Deflate Compression?

    The default compression is GZip. The documentation hints that you can use a different compression scheme and mentions Deflate compression. And sure enough you can change the compression settings to:

        <scheme name="deflate" dll="%Windir%\system32\inetsrv\gzip.dll" staticCompressionLevel="9" />

    to get deflate-style compression. The deflate algorithm produces slightly more compact output, so I tend to prefer it over GZip, but more HTTP clients (other than browsers) support GZip than Deflate, so be careful with this option if you build Web APIs. I also had some issues with the above value actually being applied right away. Changing the scheme in applicationhost.config didn't show up on the site right away; it required a full IISReset before I saw the change over to deflate-compressed content. Content was slightly more compressed with deflate – not sure if it's worth the slightly less common compression type, but the option at least is available.

    IIS 7 finally makes GZip Easy

    In summary, IIS 7 finally makes GZip easy, even if the configuration settings are a bit obtuse and the documentation is seriously lacking.
    But once you know the basic settings I've described here, and the fact that you can override all of this in your local web.config, it's pretty straightforward to configure GZip support and tweak it exactly to your needs. Static compression is a total no-brainer, as it adds very little overhead compared to direct static file serving and provides solid compression. Dynamic compression is a little more tricky, as it does add some overhead to servers, so it probably will require some tweaking to get the right balance of CPU load vs. compression ratios. Looking at large sites like Amazon, Yahoo, NewEgg etc. – they all use gzip compression.

    © Rick Strahl, West Wind Technologies, 2005-2011

    Read the article

  • How do I enable mod_deflate for PHP files?

    - by DM.
    I have a Liquid Web VPS account, and I've made sure that mod_deflate is installed and running/active. I used to gzip my css and js files via PHP, as well as my PHP files themselves... However, I'm now trying to do this via mod_deflate, and it seems to work fine for all files except PHP files. (Txt files work fine, as do css, js and static HTML files; just nothing that is generated via a PHP file.) How do I fix this? (I used the "Compress all content" option under "Optimize Website" in cPanel, which creates an .htaccess file in the home directory (not public_html, one level higher than that) with exactly the same text as the "compress everything except images" example on http://httpd.apache.org/docs/2.0/mod/mod_deflate.html.) The .htaccess file:

        <IfModule mod_deflate.c>
        SetOutputFilter DEFLATE
        <IfModule mod_setenvif.c>
        # Netscape 4.x has some problems...
        BrowserMatch ^Mozilla/4 gzip-only-text/html
        # Netscape 4.06-4.08 have some more problems
        BrowserMatch ^Mozilla/4\.0[678] no-gzip
        # MSIE masquerades as Netscape, but it is fine
        # BrowserMatch \bMSIE !no-gzip !gzip-only-text/html
        # NOTE: Due to a bug in mod_setenvif up to Apache 2.0.48
        # the above regex won't work. You can use the following
        # workaround to get the desired effect:
        BrowserMatch \bMSI[E] !no-gzip !gzip-only-text/html
        # Don't compress images
        SetEnvIfNoCase Request_URI .(?:gif|jpe?g|png)$ no-gzip dont-vary
        </IfModule>
        <IfModule mod_headers.c>
        # Make sure proxies don't deliver the wrong content
        Header append Vary User-Agent env=!dont-vary
        </IfModule>
        </IfModule>
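
    One thing sometimes tried in this situation (a sketch, assuming mod_deflate is loaded and that nothing else, such as PHP's zlib.output_compression, is already compressing the output) is a MIME-type based rule, since PHP-generated pages are selected by their text/html response type rather than by a file extension on disk:

        <IfModule mod_deflate.c>
            AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript
        </IfModule>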

    Read the article

  • Does Apache Commons HttpClient support GZIP?

    - by Benju
    Does the Apache Commons HttpClient library support gzip? We want to enable gzip compression on our Apache server to speed up client/server communication (we have a PHP page that allows our Android application to sync files with the server).
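
    Apache Commons HttpClient 3.x has no built-in gzip handling; a hedged sketch of the usual manual approach is to request gzip explicitly and unwrap the body yourself with the JDK's GZIPInputStream (the URL is a placeholder; the newer Apache HttpClient 4.x line has built-in content-encoding support):

        import java.io.InputStream;
        import java.util.zip.GZIPInputStream;
        import org.apache.commons.httpclient.Header;
        import org.apache.commons.httpclient.HttpClient;
        import org.apache.commons.httpclient.methods.GetMethod;

        public class GzipGet {
            public static InputStream fetch(String url) throws Exception {
                HttpClient client = new HttpClient();
                GetMethod get = new GetMethod(url);
                get.addRequestHeader("Accept-Encoding", "gzip");
                client.executeMethod(get);
                InputStream body = get.getResponseBodyAsStream();
                Header enc = get.getResponseHeader("Content-Encoding");
                if (enc != null && enc.getValue().toLowerCase().contains("gzip")) {
                    body = new GZIPInputStream(body); // transparently inflate the response
                }
                return body; // caller should eventually call get.releaseConnection()
            }
        }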

    Read the article

  • Gzip web service extension with IIS

    - by agiko
    After I set up gzip compression with IIS and restart the server, the problem appears: the web service extension that points to "c:\windows\system32\inetsrv\gzip.dll" has been automatically deleted! Why? I googled it but found nothing about it. Has anyone else run into this? Please help me, thanks a lot!

    Read the article

  • WebClient and Gzip compression is faster?

    - by Yozer
    I'm writing an application which uses the WebClient class, adding something like this:

        ExC.Headers.Add("Accept-Encoding: gzip, deflate");

    where ExC is:

        class ExWebClient1 : WebClient
        {
            protected override WebRequest GetWebRequest(Uri address)
            {
                HttpWebRequest request = (HttpWebRequest)base.GetWebRequest(address);
                request.AutomaticDecompression = DecompressionMethods.GZip | DecompressionMethods.Deflate;
                return request;
            }
        }

    Will there be a difference in speed when I use the encoded (compressed) response?

    Read the article

  • Gzip In-Memory Compression

    - by feal87
    Quick and simple question: are there examples online of achieving in-memory gzip compression with zlib (C++) WITHOUT external libraries (like Boost or such)? I just need to compress and decompress a block of data without many options. (It must be gzip, as it's the same format used by another of my C# programs; the data is to be shared.) I tried to search, to no avail... Thanks!
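
    A minimal sketch of single-shot, in-memory gzip with plain zlib (no Boost): the key detail is deflateInit2 with windowBits = 15 + 16, which tells zlib to write a gzip wrapper instead of the default zlib one (use inflateInit2 with the same 15 + 16 value on the decompression side).

        #include <zlib.h>
        #include <cstring>
        #include <vector>

        std::vector<unsigned char> gzip_compress(const unsigned char* data, size_t len) {
            z_stream strm;
            std::memset(&strm, 0, sizeof(strm));
            if (deflateInit2(&strm, Z_DEFAULT_COMPRESSION, Z_DEFLATED,
                             15 + 16, 8, Z_DEFAULT_STRATEGY) != Z_OK)
                return std::vector<unsigned char>();

            // deflateBound gives an upper bound on the output size; pad a little for the gzip header.
            std::vector<unsigned char> out(deflateBound(&strm, len) + 32);
            strm.next_in   = const_cast<unsigned char*>(data);
            strm.avail_in  = static_cast<uInt>(len);
            strm.next_out  = &out[0];
            strm.avail_out = static_cast<uInt>(out.size());

            int rc = deflate(&strm, Z_FINISH);  // single shot: the output buffer is big enough
            deflateEnd(&strm);
            if (rc != Z_STREAM_END)
                return std::vector<unsigned char>();
            out.resize(strm.total_out);
            return out;
        }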

    Read the article

  • Is there a command-line tool that could tell me if Gzip is really on beyond the Gzip 1 header param?

    - by lucidquiet
    Is there a command-line tool that could tell me if Gzip is on? What I'm looking for is something that can say the stream coming from the server is really gzipped even if the header params say Gzip:1 (which it could be falsely placing in the headers). I don't see a switch in curl, or wget, or tcpdump, or anything, but maybe I'm just missing something, or perhaps there is something else that could provide me this bit of information? Any help would be appreciated.
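
    One quick check from a shell (a sketch; curl and file(1) assumed installed, and the URL is a placeholder) is to look at the body's actual magic bytes instead of trusting the response headers:

        curl -s -H 'Accept-Encoding: gzip' http://example.com/ | file -
        # prints something like "gzip compressed data ..." if the stream really is gzipped,
        # and e.g. "HTML document text" if the server sent it uncompressed despite the headers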

    Read the article

  • Recognizing when to use the mod operator

    - by Will
    I have a quick question about the mod operator. I know what it does; it calculates the remainder of a division. My question is, how can I identify a situation where I would need to use the mod operator? I know I can use the mod operator to see whether a number is even or odd and prime or composite, but that's about it. I don't often think in terms of remainders. I'm sure the mod operator is useful and I would like to learn to take advantage of it. I just have problems identifying where the mod operator is applicable. In various programming situations, it is difficult for me to see a problem and realize "hey! the remainder of division would work here!" Any tips or strategies? Thanks
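
    A few typical situations where the remainder is the natural tool, sketched in C: wrapping an index around a fixed-size table, clock-style arithmetic, and doing something only every Nth iteration.

        #include <stdio.h>

        int main(void) {
            /* Wrap-around indexing: cycle through 3 buffers forever. */
            const char *buffers[3] = {"A", "B", "C"};
            for (int i = 0; i < 7; i++)
                printf("tick %d uses buffer %s\n", i, buffers[i % 3]);

            /* Clock arithmetic: 9 o'clock plus 5 hours on a 12-hour clock. */
            printf("9 + 5 hours = %d o'clock\n", (9 + 5) % 12);   /* prints 2 */

            /* Periodic work: report progress every 25 items. */
            for (int item = 1; item <= 100; item++)
                if (item % 25 == 0)
                    printf("processed %d items\n", item);
            return 0;
        }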

    Read the article

  • Apache gzip with chunked encoding

    - by hoodoos
    I'm experiencing a problem with one of my data source services. According to the HTTP response headers it's running on Apache-Coyote/1.1. The server gives responses with Transfer-Encoding: chunked; here is a sample response:

        HTTP/1.1 200 OK
        Server: Apache-Coyote/1.1
        Content-Type: text/xml;charset=utf-8
        Transfer-Encoding: chunked
        Date: Tue, 30 Mar 2010 06:13:52 GMT

    The problem is that when I ask the server for a gzipped response, it often does not send the full response. I receive the response, see that the last chunk was received, but after ungzipping I can see the response is partial. So my question is: is this a common Apache issue? Maybe one of its mod_deflate-style plugins or something? Ask questions if you need more info. Thanks.

    Read the article

  • Gzip http compression problem on iis7

    - by wpfwannabe
    My web hosting provider is running IIS7 and I am having loads of trouble getting gzip compression to work properly. The host admins say compression is installed. I can confirm compression using some online checking services, but not with others. The PageSpeed Firefox add-on also says the site is uncompressed. I am personally sitting behind a Squid proxy, but the web.config settings should take care of the proxy issue. Below is the relevant web.config snippet; most of it is borrowed from various sites. Any thoughts?

        <urlCompression doDynamicCompression="true" dynamicCompressionBeforeCache="true" doStaticCompression="true" />
        <httpCompression cacheControlHeader="max-age=86400" noCompressionForHttp10="False"
                         noCompressionForProxies="False" sendCacheHeaders="True"
                         dynamicCompressionEnableCpuUsage="89" dynamicCompressionDisableCpuUsage="90"
                         minFileSizeForComp="1" noCompressionForRange="False">
          <scheme name="gzip" dll="%Windir%\system32\inetsrv\gzip.dll" />
          <dynamicTypes>
            <add mimeType="text/*" enabled="true" />
            <add mimeType="message/*" enabled="true" />
            <add mimeType="application/javascript" enabled="true" />
            <add mimeType="*/*" enabled="false" />
          </dynamicTypes>
          <staticTypes>
            <add mimeType="text/*" enabled="true" />
            <add mimeType="message/*" enabled="true" />
            <add mimeType="application/javascript" enabled="true" />
            <add mimeType="*/*" enabled="false" />
          </staticTypes>
        </httpCompression>

    Read the article

  • Setup mod-rewrite

    - by Publiccert
    I'm trying to set up mod_rewrite for a few servers. The code lives in /home/jeff/www/upload/application/. However, this is what's happening. It appears to be a problem with mod_rewrite, since it's prepending code.py to the directory path:

        The requested URL /code.py/home/jeff/www/upload/application/ was not found on this server.

    Here are the rules. Which one is the culprit?

        WSGIScriptAlias / /home/jeff/www/upload/application
        Alias /static /home/jeff/www/upload/public_html
        <Directory /home/jeff/www/upload/application>
            SetHandler wsgi-script
            Options ExecCGI FollowSymLinks
        </Directory>
        AddType text/html .py
        <Location />
            RewriteEngine on
            RewriteBase /
            RewriteCond %{REQUEST_URI} !^/static
            RewriteCond %{REQUEST_URI} !^(/.*)+code.py/
            RewriteRule ^(.*)$ code.py/$1 [PT]
        </Location>
        </VirtualHost>

    Read the article

  • Tomcat Compression Does Not Add a Content-Encoding: gzip in the Header

    - by Julien Chastang
    I am using Tomcat to compress my HTML content like this:

        <Connector port="8080" maxHttpHeaderSize="8192"
                   maxProcessors="150" maxThreads="150" minSpareThreads="25" maxSpareThreads="75"
                   enableLookups="false" redirectPort="8443" acceptCount="150"
                   connectionTimeout="20000" disableUploadTimeout="true"
                   compression="on" compressionMinSize="128"
                   noCompressionUserAgents="gozilla, traviata"
                   compressableMimeType="text/html" URIEncoding="UTF-8" />

    In the HTTP headers (as observed via YSlow), however, I am not seeing Content-Encoding: gzip, resulting in a poor YSlow score. All I see in the response headers is:

        Server: Apache-Coyote/1.1
        Content-Type: text/html;charset=ISO-8859-1
        Content-Language: en-US
        Content-Length: 5251
        Date: Sat, 14 Feb 2009 23:33:51 GMT

    I am running an Apache mod_jk Tomcat configuration. How do I compress HTML content with Tomcat, and also have it add "Content-Encoding: gzip" in the header?
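
    One thing worth knowing here (hedged, since the full setup isn't shown): Tomcat's compression="on" applies to the HTTP connector it is set on, so when clients reach the application through Apache and mod_jk (i.e. over the AJP connector), that setting typically has no effect on what the browser sees. In that layout compression is usually done at the Apache layer instead, for example with mod_deflate:

        <IfModule mod_deflate.c>
            AddOutputFilterByType DEFLATE text/html text/css application/javascript
        </IfModule>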

    Read the article
