Search Results

Search found 1285 results on 52 pages for 'lossless compression'.

Page 8/52 | < Previous Page | 4 5 6 7 8 9 10 11 12 13 14 15  | Next Page >

  • Disable SSL / TLS compression in Apache 2.2.x

    - by DevGav
    Is there a way to disable SSL/TLS Compression in Apache 2.2.x when using mod_ssl? If not, what are people doing to mitigate the effects of CRIME/BEAST in older browsers? Related Links: https://issues.apache.org/bugzilla/show_bug.cgi?id=53219 https://threatpost.com/en_us/blogs/new-attack-uses-ssltls-information-leak-hijack-https-sessions-090512 http://security.stackexchange.com/questions/19911/crime-how-to-beat-the-beast-successor
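    For reference, a minimal hedged sketch: mod_ssl gained an SSLCompression directive in 2.4.3, and it was backported to 2.2.24, so on a new enough 2.2.x build disabling it is a one-liner (older 2.2.x builds generally need OpenSSL built without zlib support instead):

      # httpd.conf / ssl.conf; assumes mod_ssl from Apache 2.2.24 or later
      SSLCompression off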

    Read the article

  • RAR Command Line Maximum Compression

    - by Steve Robathan
    I am writing a batch file to compress a folder using various archiving applications. Currently I also use WinRAR (x64), but I set up the parameters manually, and I would like to add rar to my batch. The folder concerned has many subfolders and I need to take this into account. What is the command line for the following, keeping the folder structure: solid archive, archive format RAR5, compression method Best, dictionary size 1024 MB? Many thanks
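    A hedged sketch of how those settings map onto the rar 5.x command line (archive name and path are placeholders; it is worth confirming the switches against rar /? for your build): a adds to an archive, -ma5 selects the RAR5 format, -m5 is the best compression method, -md1024m sets a 1024 MB dictionary, -s makes the archive solid and -r recurses into subfolders.

      rar a -ma5 -m5 -md1024m -s -r archive.rar "C:\path\to\folder"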

    Read the article

  • VLC Dynamic Range compression multiple songs

    - by Sion
    In my collection of music I have some songs which seem to be compressed nicely. But in addition to those I have songs which are overly quiet compared to the louder, compressed songs. So maybe the problem isn't compression but average volume. Would the Dynamic Range Compressor in VLC work for this type of problem, or would I have better luck using external speakers and running it through a guitar compressor?

    Read the article

  • How to highly compress files

    - by Alfred James
    I want to learn how we can highly compress a file. I have downloaded files whose compressed size was 4 GB and which expanded to a 10 GB file. Which software do I need to use for such high compression? Will I lose quality of the data by compressing it? If you direct me to WinRAR or 7-Zip, please tell me how I can highly compress a file, because I have tried compression with them but the size was reduced by just a few MBs. I am compressing games, i.e. .exe files. Thanks
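    For what it's worth, a hedged sketch of "maximum" settings on the 7-Zip command line: -mx=9 is the ultra preset, -md sets a larger dictionary and -ms=on makes the archive solid (archive and folder names are placeholders). Note that game installers and .exe files are usually already compressed, so no setting will shrink them much further:

      7z a -t7z -mx=9 -md=256m -ms=on archive.7z "C:\Games\SomeFolder"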

    Read the article

  • Why does a zip file appear larger than the source file especially when it is text?

    - by PeanutsMonkey
    I have a text file that is 19 bytes in size, and having compressed the file using zip and 7zip, it appears to be larger. I had a read of the questions "Why is a 7zipped file larger than the raw file?" as well as "Why doesn't ZIP Compression compress anything?", but considering the file is not already compressed I would have expected at least some compression. Attached is a screenshot. EDIT: I took the example further by creating a file containing random data, as follows: dd if=/dev/urandom of=sample.log bs=1G count=1. I then attempted to compress the file using both zip and 7zip, however there were no compression gains. Why is that?
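    A small hedged illustration of both effects (file names are arbitrary): the zip container adds a fixed per-entry overhead (local file header, central directory entry and end-of-archive record) which dwarfs a 19-byte payload, while /dev/urandom output has essentially no redundancy for DEFLATE to remove:

      printf 'nineteen bytes....\n' > tiny.txt         # a 19-byte text file
      zip tiny.zip tiny.txt && stat -c '%n %s bytes' tiny.txt tiny.zip
      dd if=/dev/urandom of=rand.bin bs=1M count=16    # incompressible input
      gzip -9 < rand.bin > rand.bin.gz && stat -c '%n %s bytes' rand.bin rand.bin.gz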

    Read the article

  • Convert old videos to have smaller sizes

    - by Tim
    I have some videos from a few years ago in various formats, such as avi, mpg, wmv, rm, rmvb, .... Their sizes are huge (more than 500 MB, and sometimes 1 GB). Given the likely advances in data compression since then, I would like to know which file formats and compression methods are recommended these days, with the goal of a big size reduction without losing obvious quality. How can I perform the file format conversion and data compression in Ubuntu 12.04? Command-line and batch approaches would be the most convenient, although GUI ways are also appreciated. Thanks and regards!
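    A hedged batch sketch of the usual approach: re-encode to H.264 in an MP4 container with ffmpeg (on Ubuntu 12.04 the packaged binary may be avconv instead, the AAC encoder there may need -strict experimental, and the CRF/preset values are only starting points for trading size against quality):

      for f in *.avi *.mpg *.wmv *.rm *.rmvb; do
        [ -e "$f" ] || continue                      # skip patterns that match nothing
        ffmpeg -i "$f" -c:v libx264 -crf 23 -preset slow \
               -c:a aac -b:a 128k "${f%.*}.mp4"
      done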

    Read the article

  • Oracle Database 11g R2 now supported under SAP as well

    - by Lajos Sárecz
    Since Easter, Oracle Database 11g R2 can also be used under SAP. It is well known that SAP only certifies Release 2, so this is genuinely welcome news for SAP users, who can now apply the following 11g R2 features in SAP environments: • Advanced Compression option (for tables, RMAN backups, expdp, Data Guard network traffic) • Real Application Testing • Oracle Database 11g Release 2 Database Vault • Oracle Database 11g Release 2 RAC • Advanced Encryption for tablespaces, RMAN backups, expdp, Data Guard network traffic • Direct NFS • Deferred Segments • Online Patching. This means, for example, that the SAP database and the backups made from it can now be compressed; experience so far shows a compression ratio of 2-4x, depending on the database. The risk of a database upgrade, and of any other change affecting the database infrastructure, can be reduced significantly with Real Application Testing. Administrator roles can be separated with Database Vault. The new features of Real Application Clusters 11g R2 also become available. With Transparent Data Encryption, tablespaces and backups can be encrypted transparently to the application, while the data cannot be decrypted by accessing the media directly. The Direct NFS client is now supported, which significantly improves NFS access speed. With Deferred Segments, table segments are allocated only when data is actually inserted into the table; this is useful because application installs typically create every table even though many of them never receive data, so both installation time and database size can be reduced. Online Patching makes it possible to install patches without downtime. These are attractive capabilities, so it is worth planning the upgrade of the database under SAP systems for the near future, especially since Premier Support for the 10g release expires this summer. For the upgrade I definitely recommend Real Application Testing, which lets you test the upgrade in a test environment under production workload. The Sun Oracle Database Machine and Exadata are unfortunately not yet supported under SAP, since the ASM certification has not been completed; according to reports this is expected in early 2011.

    Read the article

  • 'Binary XML' for game data?

    - by bluescrn
    I'm working on a level editing tool that saves its data as XML. This is ideal during development, as it's painless to make small changes to the data format, and it works nicely with tree-like data. The downside, though, is that the XML files are rather bloated, mostly due to duplication of tag and attribute names, and to numeric data taking significantly more space than native datatypes would. A small level could easily end up as 1 MB+. I want to get these sizes down significantly, especially if the system is to be used for a game on the iPhone or other devices with relatively limited memory. The optimal solution, for memory and performance, would be to convert the XML to a binary level format. But I don't want to do this. I want to keep the format fairly flexible. XML makes it very easy to add new attributes to objects, and give them a default value if an old version of the data is loaded. So I want to keep the hierarchy of nodes, with attributes as name-value pairs. But I need to store this in a more compact format - to remove the massive duplication of tag/attribute names. Maybe also to give attributes native types, so that, for example, floating-point data is stored as 4 bytes per float, not as a text string. Google/Wikipedia reveal that 'binary XML' is hardly a new problem - it's been solved a number of times already. Has anyone here got experience with any of the existing systems/standards? Are any ideal for game use, with a free, lightweight and cross-platform parser/loader library (C/C++) available? Or should I reinvent this wheel myself? Or am I better off forgetting the ideal, just compressing my raw .xml data (it should pack well with zip-like compression), and taking the memory/performance hit on load?

    Read the article

  • How does Google Web Starter Kit serve adaptive images for mobile?

    - by 5argon
    My website weirdly (in a good way) serves smaller images when viewed on mobile. I wanted to know what causes this? As far as I know this is not the default behaviour, so I think it must be Google Web Starter Kit's doing. Here is the debug information when debugging on a device: all images become 231 B in size no matter how large they actually are. (When debugging on desktop the size varies.) I tried using Google Web Starter Kit (https://github.com/google/web-starter-kit) recently. The tools in it are built on Ruby, Node.js, SASS and Gulp to help you 'build' the website. Before building you get automatic reload because the Gulp script watches all files for you. When building, it runs various tools to minify HTML and CSS and to compress images. According to this page https://developers.google.com/web/fundamentals/tools/build/build_site gulp-imagemin is used. So I guess imagemin is doing the mobile optimization for me? What kind of compression can serve automatically resized images on mobile? And why is the size 231 B? Is this related to my screen size?

    Read the article

  • gzip compression

    - by SyAu
    I'm using the WebLogic application server and the Apache web server in my J2EE environment and am planning to implement gzip compression of responses. I'm not sure whether to implement compression on the Apache server or on WebLogic.
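    Either tier can do it; a common setup is to let the front-end Apache compress responses with mod_deflate so WebLogic is left untouched. A minimal hedged sketch (assumes mod_deflate is loaded; the MIME list is only an example):

      AddOutputFilterByType DEFLATE text/html text/plain text/css
      AddOutputFilterByType DEFLATE text/javascript application/javascript application/json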

    Read the article

  • Cannot turn on gzip compression in JBoss 5

    - by Vladimir Bezugliy
    I added the following to the file deployers\jbossweb.deployer\server.xml:

      <Connector compression="force"
                 compressionMinSize="512"
                 noCompressionUserAgents="gozilla, traviata"
                 compressableMimeType="text/html,text/xml,image/png,text/css,text/javascript">
      </Connector>

    But Fiddler shows that JBoss does not compress responses. How do I ensure that gzip compression in JBoss is turned on? Is it possible to check it in the jmx-console?
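    For what it's worth, in JBoss AS 5 the connector that actually serves traffic is usually the one defined in deploy/jbossweb.sar/server.xml, and the compression attributes need to go on that existing HTTP connector rather than on a separate bare <Connector> element. A hedged sketch, assuming the stock connector on port 8080 (the other attributes are just the shipped defaults):

      <Connector protocol="HTTP/1.1" port="8080" address="${jboss.bind.address}"
                 connectionTimeout="20000" redirectPort="8443"
                 compression="on" compressionMinSize="512"
                 noCompressionUserAgents="gozilla, traviata"
                 compressableMimeType="text/html,text/xml,text/css,text/javascript" />

    It is also worth making sure the test client sends an Accept-Encoding: gzip header; without it JBoss will never compress, whatever the connector says.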

    Read the article

  • MinGW tar compression problem

    - by Shiftbit
    I am unable to get the MinGW tar to work with compressed files. It does not filter the archive through the proper compression utility. However, tar will work if I manually uncompress the file first. I have tried this in both the MSYS shell and the Windows cmd. Has anyone had this problem, or is it a MinGW bug? For example, this does not work:

      C:\Users\home\Desktop>tar -tzf wdiff-0.5.tar.gz
      tar: Cannot use compressed or remote archives
      tar: Error is not recoverable: exiting now

      C:\Users\home\Desktop>tar -t -Zgzip -f wdiff-0.5.tar.gz
      tar: Cannot use compressed or remote archives
      tar: Error is not recoverable: exiting now

      C:\Users\home\Desktop>tar -tf wdiff-0.5.tar.gz
      tar: Hmm, this doesn't look like a tar archive
      tar: Skipping to next file header
      tar: Only read 6732 bytes from archive wdiff-0.5.tar.gz
      tar: Error is not recoverable: exiting now

    However, this works:

      gzip -d wdiff-0.5.tar.gz
      tar -tf wdiff-0.5.tar
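    Until the tar build itself is fixed, a workaround from the MSYS shell is to decompress on a pipe, so tar never has to invoke a compression filter itself and the original .tar.gz stays intact (unlike with gzip -d):

      gzip -dc wdiff-0.5.tar.gz | tar -tf -
      gzip -dc wdiff-0.5.tar.gz | tar -xf -

    The first command lists the archive, the second extracts it.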

    Read the article

  • What is the best nginx compression gzip level?

    - by Chamnap
    I'm using the nginx reverse proxy cache with gzip enabled. However, I'm having some problems with Android applications' HTTP requests to my Rails JSON web service. It seems that when I turn off the reverse proxy cache it works OK, because the response then comes back without gzip, so I think the problem is caused by gzip. What is the most appropriate gzip compression level?

      gzip on;
      gzip_http_version 1.0;
      gzip_vary on;
      gzip_comp_level 6;
      gzip_proxied any;
      gzip_types text/plain text/css text/javascript application/javascript application/json
                 application/x-javascript text/xml application/xml application/xml+rss;
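    In practice, compression levels above 5 or 6 give rapidly diminishing size savings for noticeably more CPU, so 4-6 is the usual range; the Android failures may have more to do with how the client handles a cached gzipped response than with the level itself, so checking the headers is worthwhile too. A small hedged sketch of commonly tuned directives (the values are reasonable defaults, not taken from the question):

      gzip_comp_level 5;
      gzip_min_length 256;    # skip tiny responses where gzip mostly adds overhead
      gzip_disable "msie6";   # skip clients known to mishandle gzip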

    Read the article

  • Comparing zip/compression utilities

    - by Grant Palin
    I've used WinZip, WinRar, and 7zip for packaging and compression. I know the first two are payware, and the last is open source. Despite that, they all seem to serve the same overall purpose. Are there any other distinguishing characteristics that make any of the options stand out? I'm not really looking for a "best" package, but would like to know of noteworthy differences between the common tools. For what it's worth, I do seem to like WinRar. Not sure why, but there it is. If it matters, I'm using Windows 7.

    Read the article

  • How can I replicate Google Page Speed's lossless image compression as part of my workflow?

    - by Keefer
    I love that Google's Page Speed is able to losslessly compress a lot of my images, but I'd love to make it part of my workflow, prior to uploading a site and making it live. Is there anything I can run locally to give me the same lossless compression? I currently export images with Export for Web from Photoshop, and use a little application called PNGCrusher to reduce the file size of PNGs. I'd love to find a faster way, though, than saving out and replacing the individual images from Page Speed's results.
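    A hedged sketch of a local equivalent, assuming a Unix-like shell (or Git Bash on Windows) with optipng and jpegtran installed; these perform the same kind of lossless passes Page Speed does (recompressing and stripping metadata), and the folder name is a placeholder:

      # PNGs: lossless recompression in place
      find ./export -name '*.png' -exec optipng -o5 {} \;
      # JPEGs: re-optimize the entropy coding and strip metadata, via a temp file
      for f in ./export/*.jpg; do
        jpegtran -optimize -progressive -copy none -outfile "$f.tmp" "$f" && mv "$f.tmp" "$f"
      done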

    Read the article

  • How to avoid compressing compressed files

    - by Gzorg
    Most compression programs compress all files by default. But when archiving a folder containing already-compressed files, such as archives, packed setup programs, jpg, movies, mp3, ..., there is no need to compress them a second time. Are there any compression programs that allow an arbitrary list of file types to be stored uncompressed while the others are still compressed? It looked like WinRAR can't. I expect this would be doable with tar+gz/bzip2 and some scripting in various ways. Edit: WinRAR can.
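    Since the edit says WinRAR can do it: on the rar command line this is, as far as I recall, the -ms switch, which takes a semicolon-separated list of extensions to store without compression while everything else is compressed normally. A hedged sketch (archive name, folder and extension list are just examples):

      rar a -m5 -r -msjpg;mp3;avi;mkv;zip;rar;7z backup.rar "D:\DataFolder"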

    Read the article

  • How to compress a huge number of PNG images?

    - by T Pops
    At work, on certain projects, I have to manage a lot of images. Most of the time PNG files work the best for what I'm doing. With such a huge number of images, I've tried using PNG compression with PNG Gauntlet, but sometimes the file doesn't really change, and sometimes PNG Gauntlet reports it would've made the file size bigger! Am I just maxing out the compression, or is there something more I can do?
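    If PNG Gauntlet's optimizers are already reporting no gains, the files are probably close to the practical DEFLATE floor, and the remaining options are a slower-but-stronger lossless encoder or accepting a lossy palette reduction. A hedged example of each (both are separate command-line tools, file names are placeholders, and pngquant's output is no longer lossless):

      zopflipng -m input.png output.png
      pngquant --quality 65-90 input.png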

    Read the article

  • SQL Server 2012 Backups Not Compressing

    - by Chris
    We recently upgraded from SQL Server 2008 R2 to SQL Server 2012 Enterprise. After doing this I noticed that our DB backups were barely compressing at all. Our 22 GB DB was compressing down to about 4 GB before upgrading to 2012, and after the upgrade the same 22 GB DB only compresses down to 19 GB. I've checked and double-checked the compression setting on the server and on all of my backup jobs, and compression is turned on. Any ideas on what may be going on?
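    As a hedged sanity check (the database name and path below are placeholders), comparing the instance-level default with a backup where compression is requested explicitly can show whether the setting is being applied at all:

      sqlcmd -E -Q "EXEC sp_configure 'backup compression default';"
      sqlcmd -E -Q "BACKUP DATABASE [MyDb] TO DISK = N'D:\Backups\MyDb.bak' WITH COMPRESSION, INIT, STATS = 10;"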

    Read the article

  • Gzip HTTP compression problem on IIS7

    - by wpfwannabe
    My web hosting provider is running IIS7 and I am having loads of trouble getting gzip compression to work properly. The host admins say compression is installed. I can confirm compression using some online checking services, but not with others. The PageSpeed Firefox add-on also says the site is uncompressed. I am personally sitting behind a Squid proxy, but the web.config settings should take care of the proxy issue. Below is the relevant web.config snippet. Most of it is borrowed from various sites. Any thoughts?

      <urlCompression doDynamicCompression="true" dynamicCompressionBeforeCache="true" doStaticCompression="true" />
      <httpCompression cacheControlHeader="max-age=86400" noCompressionForHttp10="False" noCompressionForProxies="False"
                       sendCacheHeaders="True" dynamicCompressionEnableCpuUsage="89" dynamicCompressionDisableCpuUsage="90"
                       minFileSizeForComp="1" noCompressionForRange="False">
        <scheme name="gzip" dll="%Windir%\system32\inetsrv\gzip.dll" />
        <dynamicTypes>
          <add mimeType="text/*" enabled="true" />
          <add mimeType="message/*" enabled="true" />
          <add mimeType="application/javascript" enabled="true" />
          <add mimeType="*/*" enabled="false" />
        </dynamicTypes>
        <staticTypes>
          <add mimeType="text/*" enabled="true" />
          <add mimeType="message/*" enabled="true" />
          <add mimeType="application/javascript" enabled="true" />
          <add mimeType="*/*" enabled="false" />
        </staticTypes>
      </httpCompression>
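    Two things may be worth ruling out, hedged since the server itself isn't visible here: the IIS Dynamic Content Compression module may simply not be installed, and the httpCompression section is normally locked to applicationHost.config, in which case parts of the snippet above are silently ignored at the site level. If the host can run appcmd, something along these lines enables dynamic compression at the server level:

      %windir%\system32\inetsrv\appcmd.exe set config -section:system.webServer/urlCompression /doDynamicCompression:"True" /commit:apphost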

    Read the article

  • Separation of multipage TIFF with "CCITT T.6" compression very slow

    - by Alex
    I need to separate multiframe TIFF files, and I use the following method:

      public static Image[] GetFrames(Image sourceImage)
      {
          // Walk the page dimension of the multiframe TIFF
          Guid objGuid = sourceImage.FrameDimensionsList[0];
          FrameDimension objDimension = new FrameDimension(objGuid);
          int frameCount = sourceImage.GetFrameCount(objDimension);
          Image[] images = new Image[frameCount];
          for (int i = 0; i < frameCount; i++)
          {
              // Re-encode each frame into its own in-memory TIFF
              MemoryStream ms = new MemoryStream();
              sourceImage.SelectActiveFrame(objDimension, i);
              sourceImage.Save(ms, ImageFormat.Tiff);
              images[i] = Image.FromStream(ms);
          }
          return images;
      }

    It works fine, but if the source image was encoded using CCITT T.6 compression, separating a 20-frame file takes up to 15 seconds on my 2.5 GHz CPU (edit: one core is at 100% during the process). When the images are saved afterwards to a single file using standard compression (LZW), separating the LZW file takes under 1 second. Saving with CCITT compression also takes very long. Is there a way to speed up the process?

    Read the article

  • What is a good lossless video codec for recording gameplay?

    - by Don Salva
    I'm an avid gamer and I like to record my gameplay. Usually I've been using Fraps to do it, but I'm thinking of switching to Dxtory as it allows writing to multiple HDDs at once. Say I have 3 HDDs with the following write speeds: HDD1 with 50 MB/s, HDD2 with 22 MB/s and HDD3 with 45 MB/s; the combined write speed would be 117 MB/s. Dxtory allows you to utilize all 3 HDDs at once while recording your gameplay. Using these formulas:

      RGB24 / YUV24: Width x Height x 3     x fps = bitrate (bytes/sec)
      YUV420:        Width x Height x 3 / 2 x fps = bitrate (bytes/sec)
      YUV410:        Width x Height x 9 / 8 x fps = bitrate (bytes/sec)

    recording in the YUV420 colorspace at 1920x1080 with 30 fps I'd need about 95 MB/s of write speed. Dxtory is good because it allows me to play at a constant 60 fps while recording at 30 fps. Fraps does not (even though they say it does); once you start recording with Fraps, the game's fps drops. So I'm looking for a codec that doesn't need a very high write speed (bitrate) yet records in good (lossless) quality. Dxtory comes with its own codec, the Dxtory codec, which allows me some experimentation. Fraps has its own codec, which I can use in Dxtory to experiment with. I also came across http://lags.leetcode.net/codec.html . Are there more lossless codecs out there (besides Fraps' and Dxtory's) which are good for what I want to do? Edit: To clarify, yes, I'm aware a lossless codec always has "good" quality. But that's not what I'm looking for. Let me take the Fraps codec and the Dxtory codec to clarify what I'm looking for. When I record with the Dxtory codec in the RGB colorspace at 1920x1080 with a target of 30 fps, I can play the game at 60 fps, BUT I'm recording at 10-15 fps; that's because RGB with Dxtory needs much, much more write speed than my HDDs can handle. When recording with the Dxtory codec in the YUV410 colorspace at 1920x1080 with a target of 30 fps, I can play at 60 fps and record at 30 fps; again, that's because YUV410 in Dxtory's codec takes much, much less write speed than RGB. When recording with the Fraps codec in ??? (I don't know the colorspace Fraps records in, I guess YUV420), I can play at 60 fps and record at 30 fps. What I'm looking for is a lossless codec that can record in YUV420 (or even RGB??) which does not exceed a write speed (or bitrate if you will) of 100 MB/s at 1920x1080, or in other words, which will allow me to record at a constant 30 fps. Obviously the best solution would be to buy an SSD, but that's not what I'm after.

    Read the article
