Search Results

Search found 3111 results on 125 pages for 'mod gzip'.

Page 4 of 125

  • How to enable gzip compression using PHP Simple HTML DOM Parser

    - by brant
    I have tried a few things to enable gzip compression using PHP Simple HTML DOM Parser, but nothing has worked so far. Using ini_set I've managed to change the user agent, so I figured it might also be possible to enable gzip compression?

        include("simpdom/simple_html_dom.php");
        ini_set('zlib.output_compression', 'On');
        $url = 'http://www.whatsmyip.org/http_compression/';
        $html = file_get_html($url);
        print $html;

    The website above tests for compression. Please let me know if I am going about this completely the wrong way.
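    A minimal sketch of one possible approach (not from the original post): zlib.output_compression only compresses the script's own output, so to fetch the page with gzip negotiated you can let cURL send Accept-Encoding: gzip and inflate the response, then hand the markup to Simple HTML DOM's str_get_html(). The URL is the one from the question; everything else is illustrative.

        include("simpdom/simple_html_dom.php");

        // CURLOPT_ENCODING makes cURL send Accept-Encoding: gzip and
        // transparently decompress the body it receives.
        $ch = curl_init('http://www.whatsmyip.org/http_compression/');
        curl_setopt_array($ch, array(
            CURLOPT_RETURNTRANSFER => true,
            CURLOPT_ENCODING       => 'gzip',
        ));
        $body = curl_exec($ch);
        curl_close($ch);

        // Parse the already-decompressed markup.
        $html = str_get_html($body);
        print $html;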

    Read the article

  • php gzip xml file (53MB) causes Out of memory error

    - by ntan
    Hi, I have a 53 MB XML file that I want to gzip. The code below gzips it:

        $gzFile = "my.gz";
        $data = IMPLODE("", FILE($filename));
        $gzdata = GZENCODE($data, 9);
        //open gz -- 'w9' is highest compression
        $fp = gzopen ($gzFile, 'w9');
        //loop through array and write each line into the compressed file
        gzwrite ($fp, $gzdata);
        //close the file
        gzclose ($fp);

    This causes PHP Fatal error: Out of memory (allocated 70516736) (tried to allocate 24 bytes). Does anyone have any suggestions? I have already increased the memory limit in php.ini.
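    A hedged sketch of a lower-memory alternative (not the original author's code): the snippet above loads the whole file into a string and compresses it twice, once with gzencode() and again through the gzopen('w9') stream, so the full 53 MB plus the compressed copies sit in memory at once. Streaming the source file into the gzip stream in small chunks keeps memory use roughly constant; the file names are taken from the question.

        $gzFile = "my.gz";

        $in  = fopen($filename, 'rb');    // read the XML in small chunks
        $out = gzopen($gzFile, 'w9');     // 'w9' is the highest compression level

        while (!feof($in)) {
            // 512 KB per iteration, compressed and written straight to disk
            gzwrite($out, fread($in, 512 * 1024));
        }

        fclose($in);
        gzclose($out);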

    Read the article

  • mod_disk_cache: permanently caching images and disabling recurring header updates

    - by user135532
    I am trying to get mod_disk_cache to permanently cache images retrieved from an image server on the webserver using ProxyPass. The image is retrieved correctly from the server and is served from the cache on further requests, but the webserver still calls the image server and updates the cached headers. Because of load concerns, I need it to never call the image server again for a specific URL after it has been cached once, or to extend the refresh time for as long as possible. The webserver is IHS 7.0. The modules are mod_disk_cache.so, mod_cache.so and mod_proxy.so, version 2.2.8.0. The following is from my httpd.conf:

        ProxyPass /webserver/media/images/ http://imageserver.com/ws/media/images/

        # Caching pictures
        <IfModule mod_cache.c>
            <IfModule mod_disk_cache.c>
                CacheDefaultExpire 2628000
                #CacheDisable
                CacheEnable disk /webserver/media/images/
                CacheIgnoreCacheControl On
                CacheIgnoreHeaders Cookie Referer User-Agent X-Forwarded-For X-Forwarded-Host X-Forwarded-Server Accept-Language Accept Host
                CacheIgnoreNoLastMod On
                CacheIgnoreQueryString Off
                #CacheIgnoreURLSessionIdentifiers
                CacheLastModifiedFactor 10000000.1
                #CacheLock on
                #CacheLockMaxAge 5
                #CacheLockPath
                CacheMaxExpire 1576800
                CacheStoreNoStore On
                CacheStorePrivate On
                CacheDirLength 2
                CacheDirLevels 3
                CacheMaxFileSize 1000000
                CacheMinFileSize 1
                CacheRoot c:/cacheroot2
            </IfModule>
        </IfModule>
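    A hedged observation based only on the configuration above: CacheMaxExpire (1576800 s, about 18 days) is the ceiling on how long mod_cache serves an entry before revalidating against the backend, and here it is lower than CacheDefaultExpire (2628000 s), so the cap wins. Raising both to the longest acceptable lifetime pushes the revalidation requests out accordingly; the values below are purely illustrative.

        # Illustrative values: roughly one year before the proxy revalidates an image.
        CacheDefaultExpire 31536000
        CacheMaxExpire     31536000
        CacheIgnoreNoLastMod On
        CacheIgnoreCacheControl On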

    Read the article

  • Using both chunked transfer encoding and gzip

    - by RadiantHeart
    I recently started using gzip on my site and it worked like a charm in all browsers except Opera, which gives an error saying it could not decompress the content due to damaged data. From what I can gather from testing and googling, it might be a problem with using both gzip and chunked transfer encoding. The fact that there is no error when requesting small files like CSS files also points in that direction. Is this a known issue, or is there something else that I haven't thought about? Someone also mentioned that it could have something to do with sending a Content-Length header. Here is a simplified version of the most relevant part of my code:

        $contents = ob_get_contents();
        ob_end_clean();
        header('Content-Encoding: '.$encoding);
        print("\x1f\x8b\x08\x00\x00\x00\x00\x00");
        $size = strlen($contents);
        $contents = gzcompress($contents, 9);
        $contents = substr($contents, 0, $size);
        print($contents);
        exit();
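    A hedged sketch of a simpler variant, not the original author's fix: gzcompress() produces a zlib stream rather than a gzip body, so hand-writing the gzip header and truncating with substr() is fragile; gzencode() emits a complete, valid gzip stream, and sending Content-Length lets the server skip chunked transfer encoding for this response.

        $contents = ob_get_contents();
        ob_end_clean();

        // gzencode() produces a complete gzip stream (header, deflate data, CRC).
        $compressed = gzencode($contents, 9);

        header('Content-Encoding: gzip');
        header('Content-Length: ' . strlen($compressed));  // known length, so no chunking needed

        print($compressed);
        exit();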

    Read the article

  • I can't update my system properly, "no package header" error

    - by joel
    Every time I try to run sudo apt-get update, or run updates from the GUI, I run into the following problem or something similar:

        Reading package lists... Error!
        E: Encountered a section with no Package: header
        E: Problem with MergeList /var/lib/apt/lists/archive.ubuntu.com_ubuntu_dists_precise_restricted_binary-i386_Packages
        E: The package lists or status file could not be parsed or opened.

    I've tried purging with sudo rm -rf <filename>, where <filename> is the file listed above, and then running sudo apt-get update to fix it (as suggested elsewhere on this forum), with no luck; I just keep getting this message. I'm running Ubuntu 12.04 and this is getting really frustrating... I just want a system that runs smoothly and doesn't need its hand held when it comes to updates. I tried the solutions posted below and am still receiving the same errors; sample output:

        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise_main_binary-amd64_Packages Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise_main_binary-i386_Packages Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise_restricted_binary-i386_Packages Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise_universe_binary-i386_Packages Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise_multiverse_binary-i386_Packages Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise-updates_universe_source_Sources Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise-updates_restricted_binary-i386_Packages Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise-updates_universe_binary-i386_Packages Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise-updates_multiverse_binary-i386_Packages Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise-backports_universe_binary-i386_Packages Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise-security_main_source_Sources Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise-security_universe_binary-amd64_Packages Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise-security_main_binary-i386_Packages Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise-security_universe_binary-i386_Packages Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise_main_i18n_Translation-en%5fCA Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise-updates_main_i18n_Translation-en%5fCA Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise-updates_main_i18n_Translation-en Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise-updates_multiverse_i18n_Translation-en%5fCA Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise-updates_universe_i18n_Translation-en%5fCA Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise-backports_main_i18n_Translation-en%5fCA Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise-backports_multiverse_i18n_Translation-en%5fCA Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise-backports_universe_i18n_Translation-en%5fCA Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise-security_main_i18n_Translation-en%5fCA Encountered a section with no Package: header
        W: Failed to fetch gzip:/var/lib/apt/lists/partial/archive.ubuntu.com_ubuntu_dists_precise-security_multiverse_i18n_Translation-en%5fCA Encountered a section with no Package: header
        E: Some index files failed to download. They have been ignored, or old ones used instead.
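    A commonly suggested sequence for this particular MergeList / "no Package: header" failure (not from the original thread; the paths are the standard apt ones): remove all of the cached list files rather than a single one, then let apt rebuild them.

        # Remove every cached package list (they are regenerated on the next update).
        sudo rm -vf /var/lib/apt/lists/*
        # Clear the local package cache as well, then rebuild the lists.
        sudo apt-get clean
        sudo apt-get update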

    Read the article

  • Android landscape mode game

    - by davidv
    I am a beginner in Android game development. I want my game to run only in landscape fullscreen mode (currently I have an Optimus 2X with a resolution of 800x480 in landscape), and I don't know how to set it. I found the fullscreen mode settings and tried a landscape mode setting (orientation:landscape in the AndroidManifest), but the game now crashes and is very unstable (e.g. when I change the phone's orientation). Is there any way to do that? Thank you for your help.
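    A hedged AndroidManifest sketch (the activity name is a placeholder): android:screenOrientation pins the activity to landscape, the fullscreen theme removes the title and status bars, and android:configChanges stops Android from destroying and recreating the activity on rotation, which is a frequent cause of the instability described above. The screenSize value requires targeting API 13 or later.

        <activity
            android:name=".GameActivity"
            android:screenOrientation="landscape"
            android:theme="@android:style/Theme.NoTitleBar.Fullscreen"
            android:configChanges="orientation|keyboardHidden|screenSize" />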

    Read the article

  • First Minecraft mod not working: make a new sword

    - by yamikoWebs
    I am making my first mod and cannot see what is wrong with it. I am using MCP and ModLoader. For my first mod I wanted to make swords. I started by adding new EnumToolMaterials entries:

        WOOD(0, 59, 2.0F, 0, 15),
        STONE(1, 131, 4.0F, 1, 5),
        IRON(2, 250, 6.0F, 2, 14),
        LAPIS(3, 750, 7.0F, 2, 14),
        OBSIDIAN(3, 1000, 7.5F, 3, 12),
        EMERALD(3, 1561, 8.0F, 3, 10),//diamond
        GREEN(3, 2000, 9.0F, 4, 10),//emerald
        GOLD(0, 200, 12.0F, 0, 22);

    Then here is the mod class:

        public class _Mod_Yamiko extends BaseMod{

            /* mod items */
            public static final Item swordLapis = (new ItemSword(600, EnumToolMaterial.LAPIS)).setItemName("swordLapis");
            public static final Item swordObsidian = (new ItemSword(601, EnumToolMaterial.OBSIDIAN)).setItemName("swordObsidian");
            public static final Item swordGreen = (new ItemSword(602, EnumToolMaterial.GREEN)).setItemName("swordGreen");

            public void load(){
                //set images
                swordLapis.iconIndex = ModLoader.addOverride("/gui/items.png","/gui/swordLapis.png");
                ModLoader.addName(swordLapis, "Lapis Sword");
                //craft
                ModLoader.addRecipe(new ItemStack(_Mod_Yamiko.swordLapis, 1), new Object[]{
                    " X ", " X ", " Y ", 'X', Block.dirt, 'Y', Item.stick
                });
            }

            public String getVersion(){
                return "0.1";
            }
        }

    Then I made a 16×16 .png image. I was not sure where to save it, so I recompiled and reobfuscated, took the mod files, put them in my local Minecraft install, and added the image where it should be. There are no problems when playing, but I cannot craft the new sword.

    Read the article

  • Store GZIPed text in MySQL?

    - by Industrial
    Hi! Is it common for bigger applications and databases to GZIP text data before inserting it into the database? I'm guessing that any full-text search on the text field will not work without unzipping it again? Thanks
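    A minimal sketch of what compressed storage looks like in MySQL (the table and column names are made up for the example): COMPRESS()/UNCOMPRESS() store zlib-compressed bytes in a BLOB, and, as the question suspects, LIKE and FULLTEXT matching cannot see inside the compressed column without decompressing it first.

        -- Assumed schema, for illustration only.
        CREATE TABLE articles (
            id   INT PRIMARY KEY,
            body BLOB
        );

        -- Store the text compressed...
        INSERT INTO articles (id, body) VALUES (1, COMPRESS('some long article text ...'));

        -- ...and decompress it on the way out.
        SELECT UNCOMPRESS(body) AS body FROM articles WHERE id = 1;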

    Read the article

  • Gzip for Classic ASP Pages

    - by ctroy
    I don't have access to the IIS server, as I am on shared hosting; I have no remote IIS Manager access and no command prompt on the server. My question is: is it possible to gzip my ASP pages? I am hosted on an IIS 7 server.
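    A hedged web.config sketch: on IIS 7, compression can often be switched on from a web.config in the site root, provided the host has installed the static and dynamic compression modules and has not locked the section; no IIS Manager or server access is needed for this.

        <configuration>
          <system.webServer>
            <!-- Dynamic compression covers Classic ASP output; static covers .js/.css and the like. -->
            <urlCompression doStaticCompression="true" doDynamicCompression="true" />
          </system.webServer>
        </configuration>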

    Read the article

  • gzip compression

    - by SyAu
    I'm using WebLogic application server and Apache web server in my J2EE environment and am planning to implement gzip compression of the responses. I'm not sure whether to implement the compression on the Apache server or on WebLogic.
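    A hedged sketch of the Apache-side option, which keeps the compression work off the application server; it assumes mod_deflate is loaded on the Apache front end, and the MIME types are illustrative.

        <IfModule mod_deflate.c>
            # Compress the usual text responses proxied back from WebLogic.
            AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css application/javascript application/json
        </IfModule>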

    Read the article

  • gzip several files and pipe them into one input

    - by Daniel
    I have a program that takes one argument, the source file, and then parses it. I have several gzipped files that I would like to parse, but since the program only takes one input, I'm wondering if there is a way to create one huge file using gzip and then pipe it into that single input.
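    Two hedged shell sketches (the file and program names are placeholders): gzip streams can simply be concatenated and still decompress as one stream, or zcat can join them on the fly and feed the parser through a pipe.

        # Option 1 (if the parser reads a .gz itself): concatenate the archives into one.
        cat part1.gz part2.gz part3.gz > all.gz
        ./parser all.gz

        # Option 2 (if the parser expects plain text): decompress them together on the fly.
        zcat part1.gz part2.gz part3.gz | ./parser /dev/stdin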

    Read the article

  • Cannot turn on gzip compression in JBoss 5

    - by Vladimir Bezugliy
    I added the following to deployers\jbossweb.deployer\server.xml:

        <Connector compression="force" compressionMinSize="512"
                   noCompressionUserAgents="gozilla, traviata"
                   compressableMimeType="text/html,text/xml,image/png,text/css,text/javascript">
        </Connector>

    But Fiddler shows that JBoss does not compress responses. How can I make sure that gzip compression in JBoss is turned on? Is it possible to check it in the jmx-console?
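    A hedged sketch: rather than adding a separate, portless <Connector>, the compression attributes usually go on the existing HTTP connector in deployers/jbossweb.deployer/server.xml; the port and address values below are the stock JBoss 5 defaults and may differ in your configuration. An easy check is looking for the Content-Encoding: gzip response header in Fiddler.

        <Connector protocol="HTTP/1.1" port="8080" address="${jboss.bind.address}"
                   connectionTimeout="20000" redirectPort="8443"
                   compression="on" compressionMinSize="512"
                   noCompressionUserAgents="gozilla, traviata"
                   compressableMimeType="text/html,text/xml,text/css,text/javascript" />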

    Read the article

  • MVC4 bundling GZIP and headers

    - by plurby
    I'm testing my site with Google PageSpeed and YSlow, and the bundles I've created with MVC4 bundling aren't getting gzipped ("Compressing resources with gzip or deflate can reduce the number of bytes sent over the network") and there is no Vary: Accept-Encoding header ("Instructs proxy servers to cache two versions of the resource: one compressed, and one uncompressed. This helps avoid issues with public proxies that do not detect the presence of a Content-Encoding header properly."). Also, how can I add an encoding header for the whole Scripts folder in IIS? I know there is HTTP Response Headers, then Add Custom HTTP Response Header, but will that apply to the whole Scripts folder and its subfolders, and what should go in the Name and Value fields? How can this be solved? Regards.
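    A hedged web.config sketch (IIS 7.5-era settings): enabling static and dynamic compression site-wide covers the bundle responses and the Scripts folder with its subfolders in one place, which is usually simpler than per-folder custom response headers. IIS normally emits Vary: Accept-Encoding for compressed responses itself, although that behaviour can depend on the IIS version and installed hotfixes.

        <configuration>
          <system.webServer>
            <!-- Applies to everything the site serves, including /Scripts and the MVC bundles. -->
            <urlCompression doStaticCompression="true" doDynamicCompression="true" />
          </system.webServer>
        </configuration>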

    Read the article

  • Compress components with gzip - J2EE

    - by Venkata Sirish
    I am looking to improve the front-end performance of my application, so I used the YSlow tool in Firefox. When I ran it against my app, the YSlow grade tab showed the issue "Grade F on Compress components with gzip". It seems we need to compress the files (JS, CSS) as they are sent from the server to the client to improve response time. My app is a Struts Java application. Can anyone tell me how to compress and send the front-end UI files (JS, CSS) from the server so that the response time improves and my pages load faster? What do I need to do to compress these files in Java on the server?
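    A hedged sketch of the container-level route, which needs no Java code changes: if the Struts app runs on Tomcat (or a Tomcat-based container), the HTTP connector in server.xml can gzip text responses itself; the port and MIME types below are illustrative. A servlet filter that wraps the response in a GZIPOutputStream is the usual alternative when the container configuration cannot be touched.

        <Connector port="8080" protocol="HTTP/1.1"
                   compression="on"
                   compressionMinSize="1024"
                   compressableMimeType="text/html,text/css,text/javascript,application/javascript" />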

    Read the article

  • Decompressing file with gzip produces file with no read-permissions on Windows 7

    - by Abiel
    I am attempting to decompress a .gz file using the GnuWin32 gzip program on Windows 7. I have full permissions on the compressed file, and my user account is an administrator. However, I end up not having read permissions on the decompressed file. To get read permissions I have to manually change the permissions on it by right-clicking and selecting Properties > Security. I am able to do the exact same thing with no permission problems on Windows XP, which leads me to believe that Windows 7's User Account Control system is causing problems. Does anyone know what I can do to make things work as I would expect (read permission on the decompressed file) in Windows 7? Thanks.
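    Two hedged workarounds (file names are placeholders): decompressing to standard output lets the shell create the output file, so it picks up the folder's normal inherited ACLs instead of whatever gzip applies; alternatively, icacls can grant the missing read permission after the fact.

        rem Let the redirection create the file with the folder's inherited permissions.
        gzip -d -c data.gz > data.xml

        rem Or repair the ACL on an already-decompressed file.
        icacls data.xml /grant %USERNAME%:R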

    Read the article

  • Compressing WCF requests (SOAP and REST) as GZip

    - by Joannes Vermorel
    I have a .NET 3.5 web app hosted on Windows Azure that exposes several WCF endpoints (both SOAP and REST). The endpoints typically receive 100x more data than they serve (a lot of data is uploaded, much less is downloaded). Hence, I would like to take advantage of HTTP GZip compression, not from the server viewpoint but from the client viewpoint, by sending compressed requests (returning compressed responses would be fine, but won't bring much gain anyway). Here is the little C# snippet used on the client side to set up WCF:

        var binding = new BasicHttpBinding();
        var address = new EndpointAddress(endPoint);
        _factory = new ChannelFactory<IMyApi>(binding, address);
        _channel = _factory.CreateChannel();

    Any idea how to adjust this so that compressed HTTP requests can be made?

    Read the article

  • ASP.NET JavaScript File Embedded In DLL With GZIP

    - by Lee Hesselden
    We have several fairly large JavaScript files embedded into a single script resources DLL. This is then consumed by multiple projects by way of a reference and page includes via the ASP.NET ScriptManager. This keeps things nice and neat within our ASP.NET pages and requires very little work to integrate into new projects. The problem is that some of these script files are quite large (approx. 100 KB) and take time to download. Running minify on them before embedding reduces this a lot (to around 70 KB), but not enough. What we would like to do is GZIP the files before they are embedded. However, just gzipping the files causes syntax errors, as the content is never unzipped. There is a content type "text/javascript" applied in AssemblyInfo when the resource is embedded, but we can't find a way to specify content-encoding. Is there any way to make this work without having to write an HttpModule/handler (which would mean changing the config for all consuming projects)?
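    A hedged possibility, since the scripts are served through the ScriptManager: the ASP.NET AJAX script resource handler has its own compression and caching switches in web.config, which (when the scripts flow through ScriptResource.axd) gzips them without a custom HttpModule; whether it covers every include in this particular setup is not guaranteed.

        <system.web.extensions>
          <scripting>
            <!-- Ask the ScriptResource handler to gzip and cache embedded script resources. -->
            <scriptResourceHandler enableCompression="true" enableCaching="true" />
          </scripting>
        </system.web.extensions>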

    Read the article

  • How to GZIP my JS and CSS files

    - by Fincha
    Hello everyone, I have a problem: I have to gzip a Prototype library, but I have no idea how to do this, where to start, or how it works :) I found some tutorials, but they weren't helpful... I have a folder with my JS files, /compressed/js/:

        1.js
        2.js
        3.js

    I am calling these files, for a test, in the file /compresses/index.php:

        <link rel="javascript" type="text/js" href="js/tabs.js" />
        <link rel="javascript" type="text/js" href="js/fb.js" />

    So what do I have to do? :)
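    A minimal .htaccess sketch (assuming Apache with mod_deflate available): dropping this into the site lets the server gzip .js and .css responses on the fly, so the files themselves don't need to be changed.

        <IfModule mod_deflate.c>
            # Compress JavaScript and CSS as they are served.
            AddOutputFilterByType DEFLATE text/css text/javascript application/javascript
        </IfModule>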

    Read the article

  • Great guide for JavaScript GZIP compression in IIS?

    - by Django Reinhardt
    Hi there, we're looking to compress our gargantuan JavaScript files with GZip to speed up the page loads of our site. I know this can be done through IIS, but I can't seem to find a simple step-by-step guide on how to implement it. If someone could point me towards such a guide, I'd really appreciate it. I've never done this before, so it would need to be quite basic. We're running IIS7.5 on Windows Server 2008 R2. Your time is much appreciated.
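    A hedged web.config sketch for IIS 7.5: static compression handles .js (and .css) files and can be enabled per site without touching applicationHost.config, provided the Static Content Compression feature is installed on the server; this is a starting point rather than a full step-by-step guide.

        <configuration>
          <system.webServer>
            <!-- Gzip static files such as JavaScript and CSS. -->
            <urlCompression doStaticCompression="true" />
          </system.webServer>
        </configuration>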

    Read the article

  • GZIP .htaccess and php session problem

    - by Suresh
    Hi, I am trying to implement GZIP compression for my website. I copied the code below into my .htaccess file:

        ExpiresActive On
        ExpiresDefault A604800
        Header append Cache-Control "public"
        <IfModule mod_deflate.c>
            <FilesMatch "\.(js|css)$">
                SetOutputFilter DEFLATE
            </FilesMatch>
        </IfModule>

    What happens is that when I type my username and password, the page reloads but the login form is still displayed, even though the session is set. When I refresh the page with Ctrl+R, the login form goes away and the username is displayed. What could the problem be? Waiting for your reply.
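    A hedged rearrangement, based only on the snippet above: ExpiresDefault A604800 and the public Cache-Control header apply to every response, including the PHP pages, so the browser can keep showing a cached copy of the page after login; scoping the caching directives to static asset types (and leaving the DEFLATE part as it is) avoids that while keeping the compression. The types listed are illustrative.

        <IfModule mod_expires.c>
            ExpiresActive On
            # Long client-side caching for static assets only, not for PHP pages.
            ExpiresByType text/css "access plus 1 week"
            ExpiresByType application/javascript "access plus 1 week"
            ExpiresByType image/png "access plus 1 week"
            ExpiresByType image/jpeg "access plus 1 week"
        </IfModule>

        <IfModule mod_deflate.c>
            <FilesMatch "\.(js|css)$">
                SetOutputFilter DEFLATE
            </FilesMatch>
        </IfModule>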

    Read the article

  • Any way to chunk gzip with Apache and PHP

    - by donatJ
    I have a web application on a site that takes a while (~10 seconds) to complete a portion of the page near the bottom. It has been optimized as much as it can be, and caching is not an option. We have compression enabled on the server via an .htaccess directive:

        SetOutputFilter DEFLATE

    The problem is that this causes the whole page to be held until completion before it starts being sent to the user, which is not optimal: the user sees nothing until the page completes. I have also tried the PHP ob_start("ob_gzhandler"); method. Currently I have a <FilesMatch> in my .htaccess excluding this specific script from compression. Basically my question is this: is there a way to chunk the gzip or deflate output so that the user gets it in pieces and can see that the page has begun loading?
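    A hedged PHP sketch (the render_* functions are placeholders for the fast and slow parts of the page): with the script excluded from Apache's DEFLATE filter, ob_gzhandler compresses inside PHP, and an explicit ob_flush()/flush() after the fast part pushes a first gzipped chunk to the browser before the slow section runs. Whether intermediate buffering (proxies, other output filters) lets that chunk through still has to be verified in the actual environment.

        <?php
        ob_start('ob_gzhandler');          // gzip the output inside PHP

        echo render_top_of_page();         // placeholder: the part that is ready quickly
        ob_flush();                        // hand the buffered, compressed bytes to PHP
        flush();                           // ...and push them out to the client now

        echo render_slow_section();        // placeholder: the ~10 second portion
        ob_end_flush();                    // send the rest and finish the gzip stream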

    Read the article
