Search Results

Search found 3111 results on 125 pages for 'mod gzip'.


  • HTTP Compression problems on IIS7

    - by Jonathan Wood
    I've spent quite a bit of time on this but seem to be going nowhere. I have a large page that I really want to speed up. The obvious place to start seems to be HTTP compression, but I just can't seem to get it to work for me. After considerable searching, I've tried several variations of the code below. It kind of works, but after refreshing the browser, the results seem to fall apart. They were turning to garbage when the page used caching. If I turn off caching, then the page seems right, but I lose my CSS formatting (stored in a separate file) and get an error that an included JS file contains invalid characters. Most of the resources I've found on the Web were either very old or focused on accessing IIS directly. My page is running on a shared hosting account, and I do not have direct access to the IIS7 instance it runs on.

        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            // Implement HTTP compression
            if (Request["HTTP_X_MICROSOFTAJAX"] == null) // Avoid compressing AJAX calls
            {
                // Retrieve accepted encodings
                string encodings = Request.Headers.Get("Accept-Encoding");
                if (encodings != null)
                {
                    // Verify support for gzip (deflate takes preference)
                    encodings = encodings.ToLower();
                    if (encodings.Contains("gzip") || encodings == "*")
                    {
                        Response.Filter = new GZipStream(Response.Filter, CompressionMode.Compress);
                        Response.AppendHeader("Content-Encoding", "gzip");
                        Response.Cache.VaryByHeaders["Accept-encoding"] = true;
                    }
                    else if (encodings.Contains("deflate"))
                    {
                        Response.Filter = new DeflateStream(Response.Filter, CompressionMode.Compress);
                        Response.AppendHeader("Content-Encoding", "deflate");
                        Response.Cache.VaryByHeaders["Accept-encoding"] = true;
                    }
                }
            }
        }

    Is anyone having better success with this?
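
    A hedged refinement rather than a verified fix: in the IIS7 integrated pipeline, Application_BeginRequest can also fire for static CSS/JS requests, so one common suggestion is to restrict the filter to page requests and let IIS compress static files itself; the .aspx extension check below is an assumption about this site's URLs.

        // sketch only; requires using System.IO.Compression;
        protected void Application_BeginRequest(object sender, EventArgs e)
        {
            // Skip everything except .aspx pages so static CSS/JS served by IIS
            // is never wrapped in a second compression filter.
            if (!Request.Path.EndsWith(".aspx", StringComparison.OrdinalIgnoreCase))
                return;

            string encodings = Request.Headers.Get("Accept-Encoding");
            if (encodings == null) return;
            encodings = encodings.ToLowerInvariant();

            if (encodings.Contains("gzip"))
            {
                Response.Filter = new GZipStream(Response.Filter, CompressionMode.Compress);
                Response.AppendHeader("Content-Encoding", "gzip");
            }
            else if (encodings.Contains("deflate"))
            {
                Response.Filter = new DeflateStream(Response.Filter, CompressionMode.Compress);
                Response.AppendHeader("Content-Encoding", "deflate");
            }

            // Store separate cache entries per encoding, otherwise a cached gzip
            // body can be replayed to a client that never asked for gzip
            // (one plausible cause of the "garbage after refresh" symptom).
            Response.Cache.VaryByHeaders["Accept-Encoding"] = true;
        }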

    Read the article

  • htaccess mod rewrite files to go through php first?

    - by jiexi
    I have a directory full of files. Originally people were allowed to direct-link to these files. Now I would like to run all downloads through a PHP file first. Could someone help me with the .htaccess needed to do that? The PHP file used to handle the downloads will be called download.php and it will take a GET variable called ref. So I need noob.com/games.zip to go to noob.com/download.php?ref=games.zip BUT still retain the URL of noob.com/games.zip. Thanks!
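
    A minimal sketch of one way to do this with mod_rewrite, assuming the .zip files and download.php live in the directory the .htaccess covers and that download.php reads $_GET['ref']; because this is an internal rewrite (no R flag), the visible URL stays noob.com/games.zip:

        RewriteEngine On
        # Hand requests for existing .zip files to download.php, passing the name as "ref"
        RewriteCond %{REQUEST_FILENAME} -f
        RewriteRule ^(.+\.zip)$ download.php?ref=$1 [L,QSA]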

    Read the article

  • JavaScript doesn't parse when mod-rewrited through a PHP file?

    - by Newbtophp
    If I do the following (this is the actual/direct path to the JavaScript file):

        <script href="http://localhost/tpl/blue/js/functions.js" type="text/javascript"></script>

    it works fine, and the JavaScript parses as it's meant to. However, I want to shorten the path to the JavaScript file (as well as do some caching), which is why I'm rewriting all JavaScript files via .htaccess to cache.php (which handles the caching). The .htaccess contains the following:

        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteRule ^js/(.+?\.js)$ cache.php?file=$1 [NC]
        </IfModule>

    cache.php contains the following PHP code:

        <?php
        if (extension_loaded('zlib')) {
            ob_start('ob_gzhandler');
        }
        $file = basename($_GET['file']);
        if (file_exists("tpl/blue/js/".$file)) {
            header("Content-Type: application/javascript");
            header('Cache-Control: must-revalidate');
            header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 3600) . ' GMT');
            echo file_get_contents("tpl/blue/js/".$file);
        }
        ?>

    and I'm calling the JavaScript file like so:

        <script href="http://localhost/js/functions.js" type="text/javascript"></script>

    But doing that, the JavaScript doesn't parse (if I call the functions defined in functions.js later on in the page, they don't work), so there's a problem either with cache.php or the rewrite rule, because the file by itself works fine. If I access the rewritten file http://localhost/js/functions.js directly, it prints the JavaScript code, as any JavaScript file would, so I'm confused as to what I'm doing wrong... All help is appreciated! :)
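
    A hedged cleanup of cache.php, assuming the directory layout shown above: it resolves the file relative to the script rather than the working directory and avoids double compression when zlib.output_compression is already on. (Separately, <script> normally takes a src attribute rather than href, which may also be worth checking, even though the question says the direct version works.)

        <?php
        // cache.php - serve a template JS file, gzipped when the zlib extension allows it
        if (extension_loaded('zlib') && !ini_get('zlib.output_compression')) {
            ob_start('ob_gzhandler');
        }

        $file = basename($_GET['file']);            // strip any path components
        $path = __DIR__ . '/tpl/blue/js/' . $file;  // resolve relative to this script

        if (is_file($path)) {
            header('Content-Type: application/javascript');
            header('Cache-Control: must-revalidate');
            header('Expires: ' . gmdate('D, d M Y H:i:s', time() + 3600) . ' GMT');
            readfile($path);
        } else {
            header('HTTP/1.0 404 Not Found');
        }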

    Read the article

  • Converting Mod-rewrite rules which use %(QUERY_STRING) to NGINX rewrite format?

    - by HipHop-opatamus
    I've been stuck the last few days trying to convert the following Apache/mod_rewrite rule to NGINX format. Anyone know where I am going wrong?

    mod_rewrite:

        RewriteCond %{QUERY_STRING} topic=([0-9]+)
        RewriteRule /forum/index\.php /forum/vbseo301.php?action=thread&oldid=%1 [L]

    NGINX:

        location /forum/index.php {
            if ($args ~ "topic=([0-9]+)"){
                rewrite ^/forum/index\.php?topic=(.+)$ /forum/vbseo301.php?action=thread&oldid=$1 last;
            }
        }
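
    A sketch of the usual nginx pattern for this, offered as a starting point rather than a definitive answer: an nginx rewrite regex matches only the URI and never the query string, so the topic id has to be captured from $args, and the trailing ? stops nginx appending the original query string to the result:

        location = /forum/index.php {
            # capture ?topic=NNN from the query string, then rewrite internally
            if ($args ~ topic=([0-9]+)) {
                set $topicid $1;
                rewrite ^ /forum/vbseo301.php?action=thread&oldid=$topicid? last;
            }
        }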

    Read the article

  • Piping stream into tar on FreeBSD

    - by Casey Jordan
    I am trying to pipe a tar/gzip archive into tar to decompress it. The script I have is part of a self-extracting installer, where my archive is appended to the script. This works fine on Linux, and the script looks like this:

        export TMPDIR=`mktemp -d /tmp/selfextract.XXXXXX`
        echo "TEMP: $TMPDIR"
        ARCHIVE=`awk '/^__ARCHIVE_BELOW__/ {print NR + 1; exit 0; }' $0`
        tail -n+$ARCHIVE $0 | tar xz -C $TMPDIR
        exit 0
        __ARCHIVE_BELOW__

    The tar archive as a string follows __ARCHIVE_BELOW__, but I omitted it here since it's huge. However, when I do this on FreeBSD I get the following error:

        tar: Failed to open '/dev/sa0'

    I read that this is because FreeBSD's tar expects to read from that device by default, and you can tell it to read from stdin by passing -f -, like so:

        tail -n+$ARCHIVE $0 | tar zxf - -C $TMPDIR

    However, when I do this I just get the error:

        tar: Damaged tar archive
        tar: Retrying...

    Can anyone point out what I am doing wrong here? I need to do it this way (via piping) for efficiency reasons. Thanks
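
    One hedged thing to try (a sketch, not a confirmed fix): take gzip handling out of tar entirely, so the same pipeline works with GNU tar and the BSD tar variants:

        # decompress explicitly, then let tar read a plain tar stream from stdin
        tail -n +$ARCHIVE "$0" | gunzip -c | tar xf - -C "$TMPDIR"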

    Read the article

  • Why is IIS7 not compressing my static files?

    - by Peter Evjan
    I am trying to get IIS to compress jquery.js (and all other static files, but using jQuery as the example here) on my localhost, but something goes wrong. The funny part is that when I look in my %SystemDrive%\inetpub\temp\IIS Temporary Compressed Files\MySiteName folder, I see the jquery.js file there, and its size is 24 KB. But in the browser, according to the Net tab in Firebug, the size is 69 KB. I've tried the following:

    - Checked that my browser accepts compression. I found "Accept-Encoding: gzip, deflate" in the request header via Firebug.
    - Enabled Failed Request Tracing. Nothing turns up in the %SystemDrive%\inetpub\logs\FailedReqLogFiles folder after I make my request, though.
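
    A hedged possibility, since the compressed copy already exists in the temp folder: IIS only serves a statically compressed response once the file counts as "frequently hit", so early requests can still come back uncompressed. A web.config sketch that lowers that threshold follows; note that on some setups the serverRuntime section is locked at the server level, in which case it has to go in applicationHost.config instead:

        <system.webServer>
          <!-- sketch: compress static responses and treat a single hit as "frequent" -->
          <urlCompression doStaticCompression="true" />
          <serverRuntime frequentHitThreshold="1" frequentHitTimePeriod="00:00:10" />
        </system.webServer>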

    Read the article

  • How are GZipped contents transferred on the web?

    - by PJ
    I heard that static content like CSS and JavaScript can be better delivered in gzip format, and Content Delivery Networks (CDNs) always do so. However, I don't understand how the format works. First, I tried making a gzipped file via the command line. The file extension is .gz. This is different from .css and .js. How do browsers recognize which file is gzipped or not? Second, how do browsers "decompress" files? I dragged my index.html.gz onto my browsers, but none of them worked. How do such gzipped files work in the real world? What do I need to do if I want to serve CSS/JavaScript in gzipped form?
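
    For illustration, a sketch of the negotiation involved (the host name is just a placeholder): the file on the server keeps its normal .css name, the browser advertises what it can decode, and the server marks the compressed response with a header. The browser decompresses transparently because of the Content-Encoding header, not the file extension, which is also why dragging a bare .gz file into a browser does nothing useful.

        GET /style.css HTTP/1.1
        Host: www.example.com
        Accept-Encoding: gzip, deflate

        HTTP/1.1 200 OK
        Content-Type: text/css
        Content-Encoding: gzip

        <gzip-compressed bytes of style.css>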

    Read the article

  • How can I evaluate the best choice of archive format for compressing files?

    - by Mehrdad
    In general, I've observed the following:

    - Linux-y files or tools use bzip2 or gzip for distributing archives
    - Windows-y files or tools use ZIP for distributing archives
    - Many people use 7-Zip for creating and distributing their own archives

    Questions:

    - What are the advantages and disadvantages of these formats, all of which appear to be open formats?
    - When/why should I choose one (say, 7-Zip) over another (say, ZIP)?
    - Why does the trend above appear to hold, even though all of these are portable formats?
    - Are there any particular advantages to using a particular archive format on a particular platform?

    Read the article

  • Creating a separate static content site for IIS7 and MVC

    - by JK01
    With reference to this Server Fault blog post, A Few Speed Improvements, which talks about how static content for Stack Exchange is served from a separate cookieless domain: how would someone go about doing this on IIS 7.5 for an ASP.NET MVC site? The plan so far:

    - Register a domain, e.g. static.com, and create a new website in IIS
    - Manually copy the js / css / images folders from MVC as-is so that they have the same paths on the new server
    - Enable IIS gzip settings (js/css = high compression, images = none)
    - Set caching with far-future expiry dates, <clientCache cacheControlCustom="public" /> in the web.config
    - Never set any cookies on the static.com site
    - Combine and minimize js / css
    - Auto-deploy changes in static content with WebDeploy

    Is this plan correct? And how can you use WebDeploy to deploy the whole web app to one server and then only the static items to another (see the sketch below)? I can see there is a similar question, but for Apache (Creating a cookie-free domain to serve static content), so it doesn't apply.
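
    As a hedged sketch of the "static items only" part: Web Deploy's contentPath provider can sync a single folder, so one option is a normal full deployment for the app plus per-folder syncs to the static site. The paths and site names below are placeholders, not anything from the original post.

        rem sync only the Content folder of the MVC app to the static site (sketch)
        msdeploy.exe -verb:sync ^
          -source:contentPath="C:\inetpub\MyMvcApp\Content" ^
          -dest:contentPath="static.com/Content"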

    Read the article

  • Mod rewrite with 3 parameters?

    - by Axel
    Hello, I have tried tons of methods to figure out how to write this mod_rewrite rule, but I was completely unsuccessful. I want a .htaccess rule that rewrites in the following way:

        http://www.mydomain.com/apple/upcoming/2  ->  http://www.mydomain.com/handler.php?topic=apple&orderby=upcoming&page=2

    This is easy to do, but the problem is that the parameters are not all required, so the link has a different number of parameters each time, like this:

        http://www.mydomain.com/apple/popular/2   ->  topic=apple&orderby=popular&page=2
        http://www.mydomain.com/apple/2           ->  topic=apple&orderby=&page=2
        http://www.mydomain.com/all/popular/2     ->  topic=all&orderby=popular&page=2
        http://www.mydomain.com/apple/upcoming/   ->  topic=apple&orderby=upcoming&page=

    So briefly, the URL has 3 optional parameters in one fixed order: (topic) (orderby) (page). Note: the orderby parameter can be "popular", "upcoming", or nothing. Thanks
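
    A hedged .htaccess sketch along those lines, assuming handler.php tolerates empty orderby/page values and that the orderby segment is always literally "popular" or "upcoming" (which is what lets the rules tell it apart from a page number). Real files and directories should be excluded with RewriteCond %{REQUEST_FILENAME} !-f / !-d before each rule if the site also serves static files from the same place.

        RewriteEngine On
        # topic / orderby / page
        RewriteRule ^([^/]+)/(popular|upcoming)/([0-9]+)/?$ handler.php?topic=$1&orderby=$2&page=$3 [L,QSA]
        # topic / orderby (no page)
        RewriteRule ^([^/]+)/(popular|upcoming)/?$ handler.php?topic=$1&orderby=$2&page= [L,QSA]
        # topic / page (no orderby)
        RewriteRule ^([^/]+)/([0-9]+)/?$ handler.php?topic=$1&orderby=&page=$2 [L,QSA]
        # topic only
        RewriteRule ^([^/]+)/?$ handler.php?topic=$1&orderby=&page= [L,QSA]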

    Read the article

  • Math Looping Between Min and Max Using Mod?

    - by TheDarkIn1978
    I'm attempting to build a tiny (or perhaps not so tiny) formula that will keep numbers between a set min and max, but also loop these numbers so they are not clipped if they fall outside of the range. So far, this is what I have:

        min1 = 10
        max1 = 90
        val1 = 92   // will become 12, which is what I want, since it loops

        formula: min(max(min1, min(val1, max1)), mod(val1, max1) + min1)

    However, I'd like it to loop in the other direction also, so that if val1 is 5, which is -5 outside of min1, it will become 85. Another problem I'm running into is that max1 % max1 != max1 as I want it to, since the max is part of the range. Trying to be clear, here are some examples of desired output based on a range with looping:

        min1 = 10
        max1 = 90
        ----------------------------------------------
        val1 = 30   // within range: stays as 30
        val1 = 90   // within range: stays as 90
        val1 = -6   // below range: loops to become 84
        val1 = 98   // above range: loops to become 18

    I'd like not to resort to using a series of if/else statements, but one would be fine if it's absolutely required. Is that even possible?
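
    A hedged worked example of the standard wrap-around form, not a definitive answer to the exact endpoint behaviour asked for. With period p = max1 - min1 and a mod that is forced non-negative:

        wrapped = ((val1 - min1) mod p + p) mod p + min1      // p = max1 - min1 = 80

        val1 = 92:  ((92 - 10) mod 80 + 80) mod 80 + 10 = 2  + 10 = 12
        val1 = 98:  ((98 - 10) mod 80 + 80) mod 80 + 10 = 8  + 10 = 18
        val1 = 5:   (( 5 - 10) mod 80 + 80) mod 80 + 10 = 75 + 10 = 85
        val1 = 30:  ((30 - 10) mod 80 + 80) mod 80 + 10 = 20 + 10 = 30

    With this convention the interval wraps with period 80, so val1 = 90 maps back to 10 rather than staying 90. Making both endpoints map to themselves requires a period of max1 - min1 + 1 (treating 10..90 as fully inclusive), but then 92 maps to 11 instead of 12, so the listed examples force a decision about which endpoint is "inside" the loop.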

    Read the article

  • Need help with some IIS7 web.config compression settings.

    - by Pure.Krome
    Hi folks, I'm trying to configure my IIS7 compression settings in my web.config file. I'm trying to enable HTTP 1.0 requests to be gzipped. MSDN has all the info about it here. Is it possible to have this config info in my own website's web.config file? Or do I need to set it at an application level? Currently, I have this code in my web.config:

        <system.webServer>
          <urlCompression doDynamicCompression="true" dynamicCompressionBeforeCache="true" />
          <httpCompression cacheControlHeader="max-age=86400"
                           noCompressionForHttp10="False"
                           noCompressionForProxies="False"
                           sendCacheHeaders="true" />
          ... other stuff snipped ...
        </system.webServer>

    It's not working :( HTTP 1.1 requests are getting compressed, just not 1.0. That MSDN page above says that it can be used in:

    - Machine.config
    - ApplicationHost.config
    - Root application Web.config
    - Application Web.config
    - Directory Web.config

    So, can we set these settings on a per-website basis, programmatically in a web.config file? (This is an Application Web.config file...) What have I done wrong? Cheers :)

    EDIT: I was asked how I know HTTP 1.0 is not getting compressed. I'm using the Failed Request Tracing rules, which report back:

        DYNAMIC_COMPRESSION_START
        DYNAMIC_COMPRESSION_NOT_SUCESS
            Reason: 3
            Reason: NO_COMPRESSION_10
        DYNAMIC_COMPRESSION_END
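
    A hedged observation rather than a confirmed diagnosis: on a default IIS7 install the system.webServer/httpCompression section is locked to applicationHost.config (urlCompression is not), so the httpCompression element in a site-level web.config may simply be ignored. If you have server access, one sketch of setting it globally with appcmd:

        %windir%\system32\inetsrv\appcmd.exe set config ^
          -section:system.webServer/httpCompression ^
          /noCompressionForHttp10:"False" /noCompressionForProxies:"False"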

    Read the article

  • Compressing and copying large files on Windows Server?

    - by Aaron
    I've been having a hard time copying large database backups from the database server to a test box at another site. I'm open to any ideas that would help me get this database moved without having to resort to a USB hard drive and the mail. The database server is running Windows Server 2003 R2 Enterprise, 16 GB of RAM and two quad-core 3.0 GHz Xeon X5450s. Files are SQL Server 2005 backup files between 100 GB and 250 GB. The pipe is not the fastest, and SQL Server backup files typically compress down to 10-40% of the original, so it made sense to me to compress the files first. I've tried a number of methods, including:

    - gzip 1.2.4 (UnxUtils) and 1.3.12 (GnuWin)
    - bzip2 1.0.1 (UnxUtils) and 1.0.5 (Cygwin)
    - WinRAR 3.90
    - 7-Zip 4.65 (7za.exe)

    I've attempted to use WinRAR and 7-Zip options for splitting into multiple segments. 7za.exe has worked well for me for database backups on another server, which has ~50 GB backups. I've also tried splitting the .BAK file first with various utilities and compressing the resulting segments. No joy with that approach either - no matter the tool I've tried, it ends up butting against the size of the file. Especially frustrating is that I've transferred files of similar size on Unix boxes without problems using rsync+ssh. Installing an SSH server is not an option for the situation I'm in, unfortunately. For example, this is how 7-Zip dies:

        H:\dbatmp>7za.exe a -t7z -v250m -mx3 h:\dbatmp\zip\db-20100419_1228.7z h:\dbatmp\db-20100419_1228.bak

        7-Zip (A) 4.65  Copyright (c) 1999-2009 Igor Pavlov  2009-02-03
        Scanning

        Creating archive h:\dbatmp\zip\db-20100419_1228.7z

        Compressing  db-20100419_1228.bak

        System error:
        Unspecified error

    Read the article

  • How do I uncompress vmlinuz to vmlinux?

    - by Lord Loh.
    I have already tried uncompress, gzip, and all the other solutions that come up as Google results, and these have not worked for me. To get just the image, search for the gzip signature - 1f 8b 08 00:

        > od -A d -t x1 vmlinuz | grep '1f 8b 08 00'
        0024576 24 26 27 00 ae 21 16 00 1f 8b 08 00 7f 2f 6b 45

    so the image begins at 24576 + 8 => 24584. Then just copy the image from that point and decompress it:

        > dd if=vmlinuz bs=1 skip=24584 | zcat > vmlinux
        1450414+0 records in
        1450414+0 records out
        1450414 bytes (1.5 MB) copied, 6.78127 s, 214 kB/s

    I got these instructions verbatim from a forum online: http://www.codeguru.com/forum/showthread.php?t=415186 This process does not work for me and ends up giving errors stating "file not found" for 0024576 and all subsequent numbers. How do I proceed with extracting vmlinux from vmlinuz? Thank you.

    EDITED: This is a reverse engineering question. I have no access to the distro to install any RPM or recompile. I start with nothing but vmlinuz.
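
    A hedged alternative sketch, assuming GNU grep and a bash-style shell: searching the raw bytes directly avoids the od arithmetic (and the case where the signature bytes straddle an od output line), which is roughly what the kernel's extract-vmlinux helper does.

        # find the byte offset of the gzip magic (1f 8b 08) and unpack from there
        off=$(grep -abo $'\x1f\x8b\x08' vmlinuz | head -n1 | cut -d: -f1)
        dd if=vmlinuz bs=1 skip="$off" | zcat > vmlinux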

    Read the article

  • What's wrong with my htaccess? (500 Error)

    - by Dany Khalife
    I've written a small htaccess file to redirect Internet Explorer users to a specific page. Here are the contents:

        # MS Internet Explorer - Mozilla v4
        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} ^Mozilla/4(.*)MSIE
        RewriteRule ^index\.php$ /sorry.php [L]

        # All other browsers
        #RewriteRule ^index\.html$ /index.32.html [L]

    Any clue why this would give a 500 Internal Server Error? I have used mod rewrite before, so I have the module loaded there...

    Read the article

  • How to decompress/inflate an XML response from ASP

    - by krisg
    Can anyone provide some insight into how I'd go about decompressing an XML response in classic ASP. We've been handed some code and asked to get it working:

        Set oXMLHttp = Server.CreateObject("MSXML2.ServerXMLHTTP")
        URL = HttpServer + re_domain + ".do;jsessionid=" + ue_session + "?" + data
        oXMLHttp.setTimeouts 5000, 60000, 1200000, 1200000
        oXMLHttp.open "GET", URL, false
        oXMLHttp.setRequestHeader "Accept-Encoding", "gzip"
        oXMLHttp.send()

        if oXMLHttp.status = 200 Then
            if oXMLHttp.responseText = "" then
                htmlrequest_get = "Empty Response from Server"
            else
                htmlrequest_get = oXMLHttp.responseText
            end if
        else
        ...

    Apparently, now that the response is compressed using gzip, we have to uncompress the XML response before we can start to work with the data. How should I go about this?

    Read the article

  • How do I get Java to use my multi-core processor?

    - by Rudiger
    I'm using a GZIPInputStream in my program, and I know that the performance would be helped if I could get Java running my program in parallel. In general, is there a command-line option for the standard VM to run on many cores? It's running on just one as it is. Thanks!

    Edit: I'm running plain ol' Java SE 6 update 17 on Windows XP. Would putting the GZIPInputStream on a separate thread explicitly help? No! Do not put the GZIPInputStream on a separate thread! Do NOT multithread I/O!

    Edit 2: I suppose I/O is the bottleneck, as I'm reading and writing to the same disk... In general, though, is there a way to make GZIPInputStream faster? Or a replacement for GZIPInputStream that runs in parallel?

    Edit 3: Code snippet I used:

        GZIPInputStream gzip = new GZIPInputStream(new FileInputStream(INPUT_FILENAME));
        DataInputStream in = new DataInputStream(new BufferedInputStream(gzip));
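
    A hedged sketch of a low-risk tweak: it does not add parallelism, it just cuts per-call overhead, since both GZIPInputStream and BufferedInputStream accept an explicit buffer size and the defaults are small.

        // assumes the same INPUT_FILENAME and imports as the snippet above
        GZIPInputStream gzip = new GZIPInputStream(
                new BufferedInputStream(new FileInputStream(INPUT_FILENAME), 1 << 16),
                1 << 16);                                   // 64 KB internal buffers
        DataInputStream in = new DataInputStream(new BufferedInputStream(gzip, 1 << 16));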

    Read the article

  • GZipping CSS and JS files

    - by Ryan Giglio
    I'm using YSlow to improve the speed of my site, and I'm having trouble with the "compress components with gzip" grade. I have this in my .htaccess file:

        SetOutputFilter DEFLATE
        AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css application/x-javascript

    But YSlow is saying:

        There are 4 plain text components that should be sent compressed
          * http://crewinyourcode.com/css/reset.css
          * http://crewinyourcode.com/css/inner-pages/index.css
          * http://crewinyourcode.com/script/css/jquery-ui-1.8.custom.css
          * http://crewinyourcode.com/js/inner-pages/index.js

    How can I gzip the CSS and JS files? Also... I don't have access to the httpd.conf file.
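
    A hedged guess plus sketch: with SetOutputFilter DEFLATE in place those files would normally be compressed, so common culprits are that the CSS/JS is served with a JavaScript MIME type other than application/x-javascript, or that mod_deflate is not actually loaded on the host (which an IfModule guard would hide). A broader .htaccess sketch:

        <IfModule mod_deflate.c>
          # compress the common text types, including the other JavaScript MIME names
          AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css
          AddOutputFilterByType DEFLATE text/javascript application/javascript application/x-javascript
        </IfModule>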

    Read the article

  • Why does Google Page-Speed say that elements need compressing, when they already are compressed?

    - by Peter Snow
    My page is compressed using the following in .htaccess:

        <ifModule mod_gzip.c>
          mod_gzip_on Yes
          mod_gzip_dechunk Yes
          mod_gzip_item_include file \.(html?|txt|css|js|php|pl)$
          mod_gzip_item_include handler ^cgi-script$
          mod_gzip_item_include mime ^text/.*
          mod_gzip_item_include mime ^application/x-javascript.*
          mod_gzip_item_exclude mime ^image/.*
          mod_gzip_item_exclude rspheader ^Content-Encoding:.*gzip.*
        </ifModule>

    YSlow says that the page, and specifically the elements which Page-Speed is complaining about, are compressed, and it gives the page an overall score of 90/100. Why, then, does Page-Speed say that "Compressing the following resources with gzip could reduce their transfer size by 118.8KiB (70% reduction)" and give the page an overall score of 33/100?

    Read the article

  • Google CDN not gzipping jquery

    - by thermal7
    If I navigate here: http://ajax.googleapis.com/ajax/libs/jquery/1.4.2/jquery.min.js I download 70k using Firefox 3.6.3, and I can confirm it is sending Accept-Encoding: gzip. If I use the Microsoft one, http://ajax.microsoft.com/ajax/jquery/jquery-1.4.2.min.js, I download 30k (and it comes through as Content-Encoding: gzip). I am also experiencing this when using jQuery 1.4.2 on regular sites, e.g. jquery.com. Funnily enough, Stack Overflow, which references jQuery 1.3.2 on the Google CDN, is coming through gzipped. Why is this happening? Is it some kind of issue with Google, or am I missing something? I live in Melbourne, Australia.

    Read the article

  • What do I need to mod an Unreal Engine 3 game?

    - by RoadSideWarrior
    What I am looking for is some advice on making a mod for a certain game and how I would go about making it. The game I am talking about is Blacklight: Retribution, and what I want to know is: is it possible? And if so, what programs will I need? It is an online-only game, so I was unsure how plausible a mod would be for it. Plus, I have never made a video game before, but I do like the game and I wanted to do some things with it. Additionally, this will be my first time making anything video game related, so I would appreciate any advice. To expand a bit, I plan to add something simple at first: a mod that would let you spectate another player in first person. Then I plan to do something a bit more complex, where I want to make it so the game optionally always records you playing (in short intervals, most likely, or you would quickly run out of memory). After all that is done, I would add items, armor, weapons, and maybe make a map (or not, I am not sure), but that in a nutshell is what I hope to do. I don't know much about these things, but I am reading anything I can get my hands on. So if this is overly ambitious, or just plain not a possibility, any advice on what I should look to instead will be welcomed warmly. Thank you.

    Read the article

  • php zencart mod - having problems with attributes array

    - by user80151
    I inherited a Zen Cart mod and can't figure out what's wrong. The customer selects a product and an attribute (model #). This is then sent to another form that they complete. When they submit the form, the product and the attribute should be included in the email sent. At this time, only the product is coming through; the attribute just says "array". The interesting part is, when I delete the line that prints the attribute, the products_options_name will print out. So I know that both the product and the products_options_name are working. The attribute is the only thing that is not working right. Here's what I believe to be the significant code. This is the page that has the form, so the attribute should already be passed to the form:

        //Begin Adding of New features
        //$productsimage = $product['productsImage'];
        $productsname = $product['productsName'];
        $attributes = $product['attributes'];
        $products_options_name = $value['products_options_name'];

        $arr_product_list[] = "<strong>Product Name:</strong> $productsname <br />";
        $arr_product_list[] .= "<strong>Attributes:</strong> $attributes <br />";
        $arr_product_list[] .= "<strong>Products Options Name:</strong> $products_options_name <br />";
        $arr_product_list[] .= "---------------------------------------------------------------";
        //End Adding of New features

        } // end foreach ($productArray as $product)
        ?>

    Above this, there is another section that has attributes:

        <?php
        echo $product['attributeHiddenField'];
        if (isset($product['attributes']) && is_array($product['attributes'])) {
            echo '<div class="cartAttribsList">';
            echo '<ul>';
            reset($product['attributes']);
            foreach ($product['attributes'] as $option => $value) {
        ?>

    Can anyone help me figure out what is wrong? I'm not sure if the problem is on this page or if the attribute isn't being passed to this page. TIA
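
    A hedged sketch of one likely fix, since $product['attributes'] is an array and interpolating an array into a string just prints "Array": flatten it into text first. The products_options_values_name key is an assumption based on Zen Cart's usual attribute structure and is not visible in the snippets above.

        // sketch: turn the attributes array into readable "Option: Value" text
        $attr_text = '';
        if (isset($product['attributes']) && is_array($product['attributes'])) {
            foreach ($product['attributes'] as $option => $value) {
                $attr_text .= $value['products_options_name'] . ': '
                            . $value['products_options_values_name'] . '; ';
            }
        }
        $arr_product_list[] .= "<strong>Attributes:</strong> $attr_text <br />";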

    Read the article

  • strange bundler error: tar_input.rb:49:in `initialize': not in gzip format (Zlib::GzipFile::Error) o

    - by z3cko
    I am getting a strange bundler error when running bundle pack with bundler 0.9.12. Any ideas? (See the pastie for better formatting: http://pastie.org/881328)

        /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/site_ruby/1.8/rubygems/package/tar_input.rb:49:in `initialize': not in gzip format (Zlib::GzipFile::Error)
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/site_ruby/1.8/rubygems/package/tar_input.rb:49:in `new'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/site_ruby/1.8/rubygems/package/tar_input.rb:49:in `initialize'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/site_ruby/1.8/rubygems/package/tar_reader.rb:63:in `each'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/site_ruby/1.8/rubygems/package/tar_reader.rb:54:in `loop'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/site_ruby/1.8/rubygems/package/tar_reader.rb:54:in `each'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/site_ruby/1.8/rubygems/package/tar_input.rb:32:in `initialize'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/site_ruby/1.8/rubygems/package/tar_input.rb:17:in `new'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/site_ruby/1.8/rubygems/package/tar_input.rb:17:in `open'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/site_ruby/1.8/rubygems/package.rb:55:in `open'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/site_ruby/1.8/rubygems/format.rb:63:in `from_io'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/site_ruby/1.8/rubygems/format.rb:51:in `from_file_by_path'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/site_ruby/1.8/rubygems/format.rb:50:in `open'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/site_ruby/1.8/rubygems/format.rb:50:in `from_file_by_path'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/source.rb:115:in `specs'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/source.rb:114:in `each'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/source.rb:114:in `specs'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/index.rb:32:in `from_cached_specs'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/index.rb:23:in `application_cached_gems'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/index.rb:15:in `cached_gems'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/index.rb:5:in `build'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/index.rb:14:in `cached_gems'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/environment.rb:15:in `index'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/index.rb:5:in `build'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/environment.rb:13:in `index'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/runtime.rb:86:in `specs'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/runtime.rb:130:in `details'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/runtime.rb:119:in `write_yml_lock'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/runtime.rb:65:in `lock'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/cli.rb:89:in `lock'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/cli.rb:131:in `package'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/vendor/thor/task.rb:33:in `send'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/vendor/thor/task.rb:33:in `run'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/vendor/thor/invocation.rb:109
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/vendor/thor/invocation.rb:116:in `call'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/vendor/thor/invocation.rb:116:in `invoke'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/vendor/thor.rb:137:in `start'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/vendor/thor/base.rb:378:in `start'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/lib/bundler/vendor/thor.rb:124:in `start'
            from /opt/ruby-enterprise-1.8.7-2010.01/lib/ruby/gems/1.8/gems/bundler-0.9.12/bin/bundle:11
            from /opt/REE/bin/bundle:19:in `load'
            from /opt/REE/bin/bundle:19

    Read the article
