Search Results

Search found 1285 results on 52 pages for 'lossless compression'.

Page 5/52 | < Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • What is the easiest way to add compression to WCF in Silverlight?

    - by caryden
    I have a Silverlight 2 beta 2 application that accesses a WCF web service. Because of this, it currently can only use basicHttp binding. The web service will return fairly large amounts of XML data. This seems fairly wasteful from a bandwidth usage standpoint, as the response, if zipped, would be smaller by a factor of 5 (I actually pasted the response into a txt file and zipped it). The request does have "Accept-Encoding: gzip, deflate" - is there any way to have the WCF service gzip (or otherwise compress) the response? I did find this link, but it sure seems a bit complex for functionality that should be handled out-of-the-box, IMHO. OK - at first I marked the solution using System.IO.Compression as the answer, as I could never "seem" to get the IIS7 dynamic compression to work. Well, as it turns out: dynamic compression on IIS7 was working all along. It is just that Nikhil's Web Developer Helper plugin for IE did not show it working. My guess is that since Silverlight hands the web service call off to the browser, the browser handles it "under the covers" and Nikhil's tool never sees the compressed response. I was able to confirm this by using Fiddler, which monitors traffic external to the browser application. In Fiddler, the response was, in fact, gzip compressed! The other problem with the System.IO.Compression solution is that System.IO.Compression does not exist in the Silverlight CLR. So from my perspective, the EASIEST way to enable WCF compression in Silverlight is to enable dynamic compression in IIS7 and write no code at all.
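
    For reference, a minimal sketch of what enabling IIS7 dynamic compression can look like in web.config. The mimeType entries are assumptions chosen to match a text/xml WCF response, and depending on delegation settings the httpCompression section may only be editable in applicationHost.config:

        <configuration>
          <system.webServer>
            <!-- Compress dynamic responses, such as WCF output. -->
            <urlCompression doDynamicCompression="true" />
            <!-- May need to live in applicationHost.config if this section is locked. -->
            <httpCompression>
              <dynamicTypes>
                <add mimeType="text/xml" enabled="true" />
                <add mimeType="application/soap+xml" enabled="true" />
              </dynamicTypes>
            </httpCompression>
          </system.webServer>
        </configuration>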

    Read the article

  • Weird unexpected image compression on a web server running Apache on Ubuntu?

    - by Billy Bob Thornton
    I have a weird problem on my production web server running Apache on Ubuntu: it compresses my images, thereby dramatically lowering their quality! Actually I have two virtual hosts running, each located in a different folder. Whether I display .gif images by navigating the two sites, or access them directly by their URL, their size and quality are invariably degraded. I tried with three different browsers: same problem. Using them on other sites on the Web: no problem. Of course I disabled mod_deflate on the server (which should not compress images anyway), but the phenomenon remains. On my local development server, running the same configuration, everything is OK. Now I'm completely lost! For the record, my configuration: Ubuntu 10.04, Apache 2, PHP 5.

    Read the article

  • Is there a quality, file-size, or other benefit to JPEG sizes being multiples of 8px or 16px?

    - by davebug
    The JPEG compression encoding process splits a given image into blocks of 8x8 pixels, working with these blocks in future lossy and lossless compressions. [source] It is also mentioned that if the image dimensions are a multiple of 1 MCU block (defined as a Minimum Coded Unit, 'usually 16 pixels in both directions'), lossless alterations to a JPEG can be performed. [source] I am working with product images and would like to know both if, and how much, benefit can be derived from using multiples of 16 in my final image size (say, an image sized 480px by 360px) vs. a non-multiple of 16 (such as 484x362). In this example I am not interested in further alterations, editing, or recompression of the final image. To try to get closer to a specific answer, where I know there must be largely generalities: given a 480x360 image that is 64k and saved at maximum quality in Photoshop [example]: Can I expect any quality loss from an image that is 484x362? What amount of file size addition can I expect (for this example, the additional space would be white pixels)? Are there any other disadvantages to growing larger than the 8px grid? I know it's arbitrary to use that specific example, but it would still be helpful (for me and potentially any others pondering an image size) to understand what level of compromise I'd be dealing with in breaking the 8px grid. The key issue here is a debate I've had about whether images with dimensions divisible by 8 pixels are higher quality than images that are not.

    Read the article

  • 7-Zip - A Free alternative to other compression utilities

    - by TATWORTH
    At http://www.7-zip.org/download.html, there is a free alternative to other compression utilities. It handles a wide variety of formats, including RAR! Here is the description from its home page:

    License: 7-Zip is open source software. Most of the source code is under the GNU LGPL license. The unRAR code is under a mixed license: GNU LGPL + unRAR restrictions. Check license information here: 7-Zip license. You can use 7-Zip on any computer, including a computer in a commercial organization. You don't need to register or pay for 7-Zip.

    The main features of 7-Zip:
    - High compression ratio in 7z format with LZMA and LZMA2 compression
    - Supported formats: packing/unpacking: 7z, XZ, BZIP2, GZIP, TAR, ZIP and WIM; unpacking only: ARJ, CAB, CHM, CPIO, CramFS, DEB, DMG, FAT, HFS, ISO, LZH, LZMA, MBR, MSI, NSIS, NTFS, RAR, RPM, SquashFS, UDF, VHD, WIM, XAR and Z
    - For ZIP and GZIP formats, 7-Zip provides a compression ratio that is 2-10% better than the ratio provided by PKZip and WinZip
    - Strong AES-256 encryption in 7z and ZIP formats
    - Self-extracting capability for 7z format
    - Integration with Windows Shell
    - Powerful File Manager
    - Powerful command line version
    - Plugin for FAR Manager
    - Localizations for 79 languages
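
    As a quick illustration of the command line version, two typical invocations (the archive and folder names are placeholders, and 7z.exe is assumed to be on the PATH):

        7z a -t7z -mx=9 backup.7z C:\Data\
        7z x backup.7z -oC:\Restore

    Here a adds files to an archive, with -t7z selecting the 7z format and -mx=9 requesting maximum compression; x extracts with full paths into the folder given by -o.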

    Read the article

  • "Unsupported compression method 98" error when unzipping a file. What tools support it, other than W

    - by Chris W. Rea
    I'm getting an error message: "unsupported compression method 98" when unzipping a file somebody sent to me. I've tried both an older version of WinZip, and 7-Zip 4.65. I've already asked the person to avoid using a non-standard compression method and re-send the file. I know WinZip (of which they are using a newer version) has compatibility options. But, I'm wondering: What archiving utilities, other than WinZip, support this "compression method 98"? In particular, is there a free and/or open source tool that supports that method? If not, why not? Is the method strictly proprietary to WinZip?

    Read the article

  • Creating transparent PNG with exact RGBA values

    - by rrowland
    I'm color-coding a transparent image to be read programmatically. However, the image seems to be getting compressed, and my code is reading color values other than the ones I mean to pass it.

    Concept: This is the output I get, exporting as PNG-24. I programmatically check each pixel for one of the six colors I use in creating the image: 0x00000F, 0x0000F0, 0x000F00, 0x00F000, 0x0F0000, 0xF00000. Each color represents a different texture to apply. Top right (0x00000F) will pull texture from the tile to its top right and blend it at a ratio equal to the opacity of the pixel. The end goal is to create a hex-tiled grid with differing textures that blend smoothly.

    What's happening: It seems that when converting to PNG, Photoshop will change the RGBA values to make the image smoother, or just to help compression size. Parts that should be 250 red range anywhere from 150 to 255.

    Question: Whether using PNG or another web-compatible format, I need to be able to save these pixel values, essentially instructions, losslessly, and still maintain transparency. Is this possible in any format, or will I need to re-think my approach?
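
    For reference, a minimal sketch of the kind of per-pixel check described above, in Java with ImageIO; the file name and the action taken per marker color are placeholders:

        import java.awt.image.BufferedImage;
        import java.io.File;
        import java.io.IOException;
        import javax.imageio.ImageIO;

        public class PixelCheck {
            public static void main(String[] args) throws IOException {
                // getRGB returns the pixel as packed ARGB, one byte per channel.
                BufferedImage img = ImageIO.read(new File("tiles.png")); // placeholder file name
                for (int y = 0; y < img.getHeight(); y++) {
                    for (int x = 0; x < img.getWidth(); x++) {
                        int argb  = img.getRGB(x, y);
                        int alpha = (argb >>> 24) & 0xFF; // blend ratio, 0-255
                        int rgb   = argb & 0xFFFFFF;      // should be one of the six marker colors
                        if (rgb == 0x00000F) {
                            // e.g. blend the top-right neighbour's texture at alpha / 255.0
                        }
                    }
                }
            }
        }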

    Read the article

  • mod_deflate Supported Encodings for Compression

    - by sparc
    It seems to me that mod_deflate in Apache 2.2 will always return Content-Encoding: gzip and never Content-Encoding: deflate. It was explained to me that although there is a deflate algorithm, mod_deflate is named after a file format, in which the algorithm could be any of gzip, bzip, or pkzip. Of those three, mod_deflate provides gzip. It seems as though gzip is the most popular and widely-supported algorithm in web browsers, but I know some web servers and proxies do return Content-Encoding: deflate. Aside from the confusion around the module's name, is it true that mod_deflate will only return Content-Encoding: gzip? Thank you.
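
    For context, a typical Apache 2.2 mod_deflate setup is just the sketch below; regardless of what types are listed, the response body it produces is wrapped in the gzip container:

        LoadModule deflate_module modules/mod_deflate.so
        AddOutputFilterByType DEFLATE text/html text/plain text/css application/javascript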

    Read the article

  • Disabling compression for IE pre SP2 with Apache mod_rewrite

    - by Ra y Mon
    I am trying to replicate this fix (http://sebduggan.com/posts/ie6-gzip-bug-solved-using-isapi-rewrite) with Apache mod_rewrite, but with no success... Can somebody help me translate those ISAPI rules to Apache mod_rewrite? My objective is to avoid sending compressed CSS and JS when the user has an XP version prior to SP2, since there is a bug that prevents IE6 & 7 under SP1 from reading the gzipped CSS of my website BuscoUnViaje.com. The rules I am trying to 'translate' to Apache mod_rewrite:

        RewriteCond %{HTTP:User-Agent} MSIE\ [56]
        RewriteCond %{HTTP:User-Agent} !SV1
        RewriteCond %{REQUEST_URI} \.(css|js)$
        RewriteHeader Accept-Encoding: .* $1

    Thanks in advance...
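
    A hedged, untested attempt at the translation: Apache has no direct equivalent of RewriteHeader, but mod_deflate honors a no-gzip environment variable, which mod_rewrite can set for the matching requests:

        RewriteEngine On
        RewriteCond %{HTTP_USER_AGENT} "MSIE [56]"
        RewriteCond %{HTTP_USER_AGENT} !SV1
        RewriteRule \.(css|js)$ - [E=no-gzip:1]

    The effect is the same as stripping Accept-Encoding: the affected IE5/6 requests for .css and .js files get uncompressed responses, while everything else is compressed as usual.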

    Read the article

  • An XEvent a Day (28 of 31) – Tracking Page Compression Operations

    - by Jonathan Kehayias
    The Database Compression feature in SQL Server 2008 Enterprise Edition can provide significant reductions in storage requirements for SQL Server databases and, in the right implementations and scenarios, performance improvements as well.  There isn't really a whole lot of information about the operation of database compression documented as being available in the DMVs or SQL Trace.  Paul Randal pointed out on Twitter today that sys.dm_db_index_operational_stats() provides...(read more)
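
    The excerpt cuts off there, but for flavor, a query against that DMV might look like this sketch (column names per SQL Server 2008's sys.dm_db_index_operational_stats):

        SELECT OBJECT_NAME(s.[object_id]) AS table_name,
               s.index_id,
               s.partition_number,
               s.page_compression_attempt_count,
               s.page_compression_success_count
        FROM sys.dm_db_index_operational_stats(DB_ID(), NULL, NULL, NULL) AS s;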

    Read the article

  • Compression File

    - by Pedro Magalhaes
    Hi, I have a file of X bytes, and I want to compress it in blocks of 32 KB, for example. Is there any lib with which I can do this? I used Zlib for Delphi, but with it I can only compress the full file into a new compressed file. Thanks a lot, Pedro
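
    In case it helps to see the shape of it, a sketch of block-wise compression using java.util.zip (the asker is on Delphi/Zlib, so this is only the idea, with placeholder file names): deflate each 32 KB chunk with a fresh compressor and length-prefix it, so blocks can be located and inflated independently.

        import java.io.DataOutputStream;
        import java.io.FileInputStream;
        import java.io.FileOutputStream;
        import java.io.IOException;
        import java.util.zip.Deflater;

        public class BlockCompress {
            public static void main(String[] args) throws IOException {
                byte[] block = new byte[32 * 1024]; // 32 KB per block
                try (FileInputStream in = new FileInputStream("input.bin");
                     DataOutputStream out = new DataOutputStream(new FileOutputStream("output.z"))) {
                    int n;
                    while ((n = in.read(block)) > 0) {
                        // A fresh Deflater per block keeps every block independently decompressible.
                        Deflater deflater = new Deflater();
                        deflater.setInput(block, 0, n);
                        deflater.finish();
                        byte[] buf = new byte[n + n / 100 + 64]; // room for incompressible blocks
                        int len = 0;
                        while (!deflater.finished()) {
                            len += deflater.deflate(buf, len, buf.length - len);
                        }
                        deflater.end();
                        out.writeInt(len);      // length prefix lets a reader seek block by block
                        out.write(buf, 0, len);
                    }
                }
            }
        }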

    Read the article

  • Oracle Advanced Compression Webcast Replay Available

    - by [email protected]
    Did you miss our webcast "Save BIG on Storage - with Oracle Database 11g and Advanced Compression"? Don't worry, you can still register and view the recording including the full Q&A session with Tim Shetler and Bill Hodak. Click here to learn how Oracle Advanced Compression can reduce your disk space requirements for all types of data, improve query and storage performance and lower storage costs throughout the datacenter.

    Read the article

  • Compression Array of Bytes

    - by Pedro Magalhaes
    Hi, my problem is: I want to store an array of bytes in a compressed file, and then read it back with good performance. So I create an array of bytes, pass it to a ZLIB algorithm and store it in the file. To my surprise, the algorithm doesn't compress well, probably because the array is a random sample. Using this approach, it would be very easy to read: just copy the stream to memory, decompress it and copy it into an array of bytes. But I need to compress the file. Do I have to use an algorithm, like RLE, to compress the byte array? I think I could store the byte array like a string and then compress it, but I think I would get poor performance when reading the data. Sorry for my poor English. Thanks

    Read the article

  • Windows command line compression/extraction tool?

    - by Will Marcouiller
    I need to write a batch file to unzip files to their current folder from a given root folder.

        Folder 0
        |----- Folder 1
        |      |----- File1.zip
        |      |----- File2.zip
        |      |----- File3.zip
        |      |----- Folder 2
        |             |----- File4.zip
        |
        |----- Folder 3
               |----- File5.zip
               |----- FileN.zip

    So, I wish my batch file to be launched like so: ocd.bat /d="Folder 0". Then, make it iterate from within the batch file through all of the subfolders to unzip the files exactly where the .zip files are located. So here's my question: does Windows (from XP at least) have a command line interface for its embedded zip tool? Otherwise, shall I stick to another third-party util?
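
    Windows XP's built-in zip support has no documented command-line interface, so a third-party CLI is the usual route. A sketch using 7-Zip's 7z.exe (the install path is an assumption, and the argument handling is simplified to a plain first parameter), extracting every .zip next to itself:

        @echo off
        rem ocd.bat - walk a root folder and unzip archives in place (sketch).
        set ROOT=%~1
        for /r "%ROOT%" %%f in (*.zip) do (
            "C:\Program Files\7-Zip\7z.exe" x "%%f" -o"%%~dpf" -y
        )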

    Read the article

  • String Compression using Lempel-Ziv

    - by roybot
    I'm looking for a way of compressing a given string using the Lempel-Ziv algorithm. Preferably there would only be a set of two functions, encoder and decoder. The encoder takes the string and returns an integer. The decoder takes the integer and returns the original string. Time complexity is not important. How would you implement this?
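
    One common choice from the LZ family is LZW, which is compact enough to sketch in full. Note one change from the spec above: the encoder here returns a list of integer codes rather than a single integer, one code per dictionary phrase, and 8-bit input characters are assumed:

        import java.util.ArrayList;
        import java.util.HashMap;
        import java.util.List;
        import java.util.Map;

        public class Lzw {
            // Encode: emit one integer code per longest phrase already in the dictionary.
            public static List<Integer> encode(String s) {
                Map<String, Integer> dict = new HashMap<>();
                for (int i = 0; i < 256; i++) dict.put(String.valueOf((char) i), i);
                List<Integer> out = new ArrayList<>();
                String w = "";
                for (char c : s.toCharArray()) {
                    String wc = w + c;
                    if (dict.containsKey(wc)) {
                        w = wc;
                    } else {
                        out.add(dict.get(w));
                        dict.put(wc, dict.size()); // new phrase gets the next free code
                        w = String.valueOf(c);
                    }
                }
                if (!w.isEmpty()) out.add(dict.get(w));
                return out;
            }

            // Decode: rebuild the same dictionary from the code stream.
            public static String decode(List<Integer> codes) {
                if (codes.isEmpty()) return "";
                Map<Integer, String> dict = new HashMap<>();
                for (int i = 0; i < 256; i++) dict.put(i, String.valueOf((char) i));
                String w = dict.get(codes.get(0));
                StringBuilder sb = new StringBuilder(w);
                for (int i = 1; i < codes.size(); i++) {
                    int k = codes.get(i);
                    // k can be the one code not yet defined (the "cScSc" corner case).
                    String entry = dict.containsKey(k) ? dict.get(k) : w + w.charAt(0);
                    sb.append(entry);
                    dict.put(dict.size(), w + entry.charAt(0));
                    w = entry;
                }
                return sb.toString();
            }
        }

    Round-tripping decode(encode("TOBEORNOTTOBEORTOBEORNOT")) returns the original string.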

    Read the article

  • Maximum compression of a MSI install using WIX

    - by MX4399
    I used to build installs for an app with NSIS, and the final self-extractor was 1.2 MB. Now I need to use WiX due to operational needs, and the same install comes out at 4.2 MB. I set the compressed flags on the package node as the docs and specs indicated. Using 7z to zip the MSI results in a 2.4 MB zip file. Question: how can I apply maximum compression to the MSI, or create a smaller MSI (e.g. remove unneeded resources, etc.)? Note: size is uber important and I have to use MSI/WiX now - this is a show stopper!
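
    One thing worth checking, sketched here with element and attribute names per WiX v3 (the "..." stands for the other required attributes): WiX exposes the cabinet compression level on the Media element, and "high" is not necessarily the default.

        <Package ... Compressed="yes" />
        <Media Id="1" Cabinet="product.cab" EmbedCab="yes" CompressionLevel="high" />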

    Read the article

  • Java code compression and decompression of a string

    - by user1822710
    I am having a problem figuring out how to scan a string for consecutive occurrences of the same character, count each run, print the character followed by its count, and then move on to the next, different character in the string. The program is case sensitive. So the input could be: aaaaAAAbbbddccc. How would I compress this string to a4A3b3d2c3, and then decompress it?
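
    A minimal sketch of that run-length scheme in Java, assuming the input never contains digit characters (otherwise the encoding would be ambiguous):

        public class Rle {
            // compress("aaaaAAAbbbddccc") returns "a4A3b3d2c3"; comparison is case sensitive.
            public static String compress(String s) {
                StringBuilder sb = new StringBuilder();
                int i = 0;
                while (i < s.length()) {
                    int j = i;
                    while (j < s.length() && s.charAt(j) == s.charAt(i)) j++; // find end of run
                    sb.append(s.charAt(i)).append(j - i);
                    i = j;
                }
                return sb.toString();
            }

            // decompress("a4A3b3d2c3") returns "aaaaAAAbbbddccc".
            public static String decompress(String s) {
                StringBuilder sb = new StringBuilder();
                int i = 0;
                while (i < s.length()) {
                    char c = s.charAt(i++);
                    int count = 0;
                    while (i < s.length() && Character.isDigit(s.charAt(i))) {
                        count = count * 10 + (s.charAt(i++) - '0'); // supports runs longer than 9
                    }
                    for (int k = 0; k < count; k++) sb.append(c);
                }
                return sb.toString();
            }
        }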

    Read the article

  • Can a JPEG compressed image be rotated without a loss in quality?

    - by Mat
    JPEG is a lossy compression scheme, so a decompress-manipulate-recompress cycle normally reduces the image quality further with each step. Is it possible to rotate a JPEG image without incurring further loss? From what little I know of the JPEG algorithm, it naively seems possible to avoid further loss with a bit of effort. Which common image manipulation programs (e.g. GIMP, Paint Shop Pro, Windows Photo Gallery) and graphics libraries cause quality loss when performing a rotation, and which don't?
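
    As one concrete illustration, the jpegtran tool from the libjpeg distribution transforms the DCT coefficients directly rather than re-encoding the pixels; the file names here are placeholders:

        jpegtran -rotate 90 -perfect in.jpg > out.jpg

    The -perfect switch makes the tool fail rather than silently trim edge blocks when the image dimensions are not a multiple of the MCU size.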

    Read the article

  • Does ZFS cache Compressed or Uncompressed data in a ZFS file-system with compression turned on?

    - by George Bailey
    ZFS supports file-system compression, and it also caches frequently or recently accessed data. If a system has lots of CPU but the underlying data storage is slow, it is possible that ZFS would perform better with compression turned on. This can easily be tested when writing files by measuring CPU and disk usage and throughput (of course latency may exist, but this would not be an issue for large files). But what about the cache? If data has to be decompressed every time it is read, then this is probably less of a good idea. Is the cached data compressed? Does anybody have some information on this?

    Read the article

  • Trouble installing Lagarith Lossless Codec (Codec Removal?)

    - by Xeoncross
    I had been using the Lagarith Lossless Codec with CamStudio on Windows XP Pro SP3 for a little while before I switched to Ubuntu. Now I'm back trying to do something on Windows, and the codec is missing. I tried to install it via the .exe and even manually with the .inf, but it's not listed by CamStudio as installed. I'm wondering if one of the other codecs I installed might have messed with it, and yet some of the files are still fine, so whatever method the installer uses to check for existing files before install is returning positive and not re-installing it. How can I get Lagarith to work again? Maybe I need to know how to uninstall codecs.

    Read the article

  • MPI: is there mpi libraries capable of message compression?

    - by osgx
    Sometimes MPI is used to send low-entropy data in messages, so it can be useful to try to compress messages before sending them. I know that MPI can work on very fast networks (10 Gbit/s and more), but many MPI programs are used with cheap networks like 0.1 or 1 Gbit/s Ethernet and with cheap (slow, low-bisection) network switches. There is a very fast compression algorithm, Snappy (wikipedia), whose compression speed is about 250 MB/s and decompression speed about 500 MB/s, so on compressible data and a slow network it would give some speedup. Is there any MPI library which can compress MPI messages (at the MPI layer, not compression of IP packets as in PPP)? MPI messages are also structured, so there could be some special method, like compressing the exponent parts in an array of doubles.

    Read the article

  • Increase the compression performance of VPN

    - by Martin
    I am currently switching from a system of HPN-SSH tunnels with compression enabled to something VPN-based. I have tried tinc and n2n so far; hamachi requires a library I do not have. In my primitive benchmarks I am not satisfied with the achievable bandwidth compared to the SSH tunnels. In tinc, the low LZO setting performed best, but compression is only available in UDP mode. Ideally I would like to have a TCP-based VPN with multi-threaded compression. Can you suggest some ideas for increasing the performance? Would it be possible to somehow put a compression filter in front of the tun interface? Or are there any VPN implementations that might be better suited to my needs (fast compression, TCP-based, switch mode, does not have to be super-secure)? I would consider tunnelling Ethernet over SSH, but according to some articles it is not advisable.

    Read the article

  • How does a web server/the http protocol handle version control and compression?

    - by Sune Rasmussen
    When a client browser requests a file from the web server, I know that some kind of check is performed, because the files needed to serve the web page may already be cached by the web browser. So, if a file exists in the cache, no file is sent. But if the file on the server has changed since it was cached in the browser, the file is sent and updated anyhow. Then, if you have compression like gzipping enabled on the server, the files that are to be provided to the client must be gzipped on the way, requiring some amount of server-side processing. But how is this managed? The logical approach seems to me that the web server should have a cache as well, containing the newest version of all files that have been requested within a certain time span, along with a compressed version of these files, so that compression would not have to be done each time a file is requested. And also, how are files actually requested? Does the browser ask for files each time it encounters one in the HTML code and the specific file is not stored in the local cache, or does it add up all the files that are needed and ask for the whole bunch at the same time? But that's only guessing from a programming point of view, and I don't really know. If the answers are very different among web server systems, I'm primarily interested in Apache, but other answers are appreciated, too.
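
    The freshness check alluded to above is HTTP's conditional request mechanism; a sketch of the exchange, with made-up header values:

        GET /styles/site.css HTTP/1.1
        Host: example.com
        Accept-Encoding: gzip, deflate
        If-None-Match: "abc123"

        HTTP/1.1 304 Not Modified
        ETag: "abc123"

    If the file has changed on the server, the response instead carries the new body, compressed per the client's Accept-Encoding:

        HTTP/1.1 200 OK
        ETag: "def456"
        Content-Encoding: gzip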

    Read the article
