Search Results

Search found 1285 results on 52 pages for 'lossless compression'.

Page 6 of 52

  • btrfs and missing free space

    - by easteregg
    I converted my ext4 partition to btrfs and deleted the saved subvolume after doing so. Then I enabled compression (lzo) for the filesystem in fstab, and everything was correct so far. Next I forced compression of all existing files by running the defragmentation command with the -c parameter, so that the new compression would be applied to them. While doing so, I noticed that my SSD got completely filled up - before, I had 6 GB of free space; now I have nothing left.

      easteregg@x201s:~$ btrfs fi df /
      Data: total=50.00GB, used=49.17GB
      System: total=32.00MB, used=4.00KB
      Metadata: total=24.50GB, used=9.86GB

    and

      easteregg@x201s:~$ df -ha
      Filesystem      Size  Used Avail Use% Mounted on
      /dev/sda1        75G   60G  852M  99% /

    So now: how can I regain my free space? I expected to gain more space because of the lzo compression. The fs is correctly mounted:

      easteregg@x201s:~$ mount
      /dev/sda1 on / type btrfs (rw,noatime,ssd,compress=lzo)

    Any ideas how to fix this issue?
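
    One common first step (a sketch only, not from the original thread, and assuming a btrfs-progs version new enough to support balance filters): the btrfs fi df output above shows 24.50GB of metadata chunks allocated but only 9.86GB used, and a balance can return over-allocated chunks to the free-space pool:

      # Rewrite metadata chunks so mostly-empty ones are freed
      sudo btrfs balance start -m /
      # Rewrite data chunks that are less than 10% full
      sudo btrfs balance start -dusage=10 /
      # Re-check the allocation afterwards
      sudo btrfs fi df /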

    Read the article

  • Question about Byte-Pairing for data compression [closed]

    - by user1669533
    Question about byte-pairing for data compression. If byte-pairing converts two byte values into a single byte value, halving the file on each pass, then taking a 1 GB file and recursing four times (a factor of 2^4 = 16) shrinks it to 62,500,000 bytes. My question is: is byte-pairing really efficient? Is the creation of a loop of, conservatively, 5,000,000 iterations efficient? I would like some feedback and some incisive opinions, please. Best regards.
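
    A quick sanity check of that arithmetic in the shell (each pairing pass halves the file, so four passes divide the size by 2^4 = 16):

      $ echo $((1000000000 / 2**4))
      62500000

    Worth noting, though (a general information-theory caveat, not from the post): no scheme can keep halving arbitrary data, since there are 65,536 possible byte pairs but only 256 single-byte values to map them to; real byte-pair encoding only substitutes pairs for byte values that are unused in the data.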

    Read the article

  • How can I maximally compress .gz files in Nautilus?

    - by Takkat
    When selecting Compress... from the right-click context menu in Nautilus I can quickly compress files to .gz format. However, by default Nautilus does not use maximum compression. Can I make Nautilus use maximum compression, like gzip -9? Using gconftool or gconf-editor to set the compression_level for File Roller to maximum seems right, but unfortunately it does not have the desired effect and does not lead to maximally compressed files. As this is the expected way to set compression levels, a bug report has been filed upstream. Any ideas for a workaround are welcome.
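
    One possible workaround (a sketch, not from the post, and assuming a GNOME 2-era Nautilus whose scripts directory is ~/.gnome2/nautilus-scripts): bypass File Roller entirely with a Nautilus script that calls gzip -9 itself:

      #!/bin/bash
      # Save as ~/.gnome2/nautilus-scripts/Compress-max and mark it executable.
      # Compresses each selected file at gzip's maximum level, keeping the original.
      while IFS= read -r f; do
          [ -f "$f" ] && gzip -9 -c -- "$f" > "$f.gz"
      done <<< "$NAUTILUS_SCRIPT_SELECTED_FILE_PATHS"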

    Read the article

  • Need help with some IIS7 web.config compression settings.

    - by Pure.Krome
    Hi folks, I'm trying to configure my IIS7 compression settings in my web.config file. I'm trying to enable HTTP 1.0 requests to be gzipped. MSDN has all the info about it here. Is it possible to have this config info in my own website's web.config file, or do I need to set it at an application level? Currently, I have this code in my web.config:

      <system.webServer>
        <urlCompression doDynamicCompression="true" dynamicCompressionBeforeCache="true" />
        <httpCompression cacheControlHeader="max-age=86400" noCompressionForHttp10="False" noCompressionForProxies="False" sendCacheHeaders="true" />
        ... other stuff snipped ...
      </system.webServer>

    It's not working :( HTTP 1.1 requests are getting compressed, just not 1.0. That MSDN page above says that it can be used in: Machine.config, ApplicationHost.config, root application Web.config, application Web.config, and directory Web.config. So, can we set these settings on a per-website basis, programmatically in a web.config file? (This is an application Web.config file...) What have I done wrong? cheers :) EDIT: I was asked how I know HTTP 1.0 is not getting compressed. I'm using the Failed Request Tracing rules, which report back:

      DYNAMIC_COMPRESSION_START
      DYNAMIC_COMPRESSION_NOT_SUCCESS
        Reason: 3
        Reason: NO_COMPRESSION_10
      DYNAMIC_COMPRESSION_END
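
    A likely explanation (general IIS 7 behavior, not something confirmed in the post): <urlCompression> is delegated to site-level web.config, but the <httpCompression> section is declared in IIS 7's schema with allowDefinition="AppHostOnly", so noCompressionForHttp10 set in a site web.config is silently ignored. A sketch of setting it server-wide with appcmd instead, assuming administrative access to the box:

      %windir%\system32\inetsrv\appcmd.exe set config -section:system.webServer/httpCompression /noCompressionForHttp10:"False" /commit:apphost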

    Read the article

  • Does Compressed Sensing bring anything new to data Compression?

    - by anon
    Compressed sensing is great for situations where capturing data is expensive (either in energy or time), so that fewer samples can be taken. However, in situations like image compression, where the data is already on the computer, does compressed sensing offer anything? For example, would it offer better data compression? Would it result in better image search? (Note: if you don't know what the field of compressed sensing is, please do not respond.)

    Read the article

  • Sharp HealthCare Reduces Storage Requirements by 50% with Oracle Advanced Compression

    - by [email protected]
    Sharp HealthCare is an award-winning integrated regional health care delivery system based in San Diego, California, with 2,600 physicians and more than 14,000 employees. Sharp HealthCare's data warehouse forms a vital part of the information system's infrastructure and is used to separate business intelligence reporting from time-critical health care transactional systems. Faced with tremendous data growth, Sharp HealthCare decided to replace their existing Microsoft products with a solution based on Oracle Database 11g and to implement Oracle Advanced Compression. Join us to hear directly from the primary DBA for the Data Warehouse Application Team, Kim Nguyen, about how the new environment significantly reduced Sharp HealthCare's storage requirements and improved query performance.

    Read the article

  • Oracle saves with Oracle Database 11g and Advanced Compression

    - by jenny.gelhausen
    Oracle Corporation runs a centralized eBusiness Suite system on Oracle Database 11g for all its employees around the globe. This clustered Global Single Instance (GSI) has scaled seamlessly through many acquisitions over the years, doubling the number of employees since 2001 and supporting around 100,000 employees today, 24 hours a day, 7 days a week, around the world. In this podcast, you'll hear from Raji Mani, IT Director for Oracle's PDIT Group, on how Oracle Database 11g and Advanced Compression are helping to save big on storage costs.

    Read the article

  • How do I get the compression on a specific dynamic body

    - by Mike JM
    Sorry, I could not find any tag that would suit my question. Let me first show you the image and then write what I want to do: I'm using Box2D. As you can see, there are three dynamic bodies connected to each other (think of it as a table seen from the front). LEG1 and LEG2 are connected to the static body (it's the ground body). Another dynamic body is falling onto the table. I need to get the compression in LEG1 and LEG2 separately. Joints have a GetReactionForce() function which returns a b2Vec2, which in turn has Length() and LengthSquared() functions. This will give the total sum of the forces in any given joint. But what I need is the forces in the individual bodies that are connected with joints. Once you connect several bodies with a single joint, it again will show the sum of forces, which is not useful. Here's the case I'm talking about:

    Read the article

  • Hybrid Columnar Compression

    - by user12620172
    You've heard me talk in the past about the HCC feature for Oracle databases. Hybrid Columnar Compression is a fantastic, built-in, free feature of Oracle Database 11gR2. One used to need an Exadata to make use of it. However, last October, Oracle opened it up and now allows it to work on ANY Oracle DB server running 11gR2, as long as the storage behind it is a ZFSSA over dNFS, or an Axiom over FC. If you're not sure why this is so cool or what HCC can do for your Oracle database, please check out this presentation. In it, Art will explain HCC, show you what it does, and give you a great idea why it's such a game-changer for those holding lots of historical DB data. Did I mention it's free? Click here: http://hcc.zanghosting.com/hcc-demo-swf.html

    Read the article

  • How to enable gzip HTTP compression on Windows Azure dynamic content

    - by Steven
    Hi all, I've been trying unsuccessfully to enable gzip HTTP compression on my Windows Azure hosted WCF RESTful service, which returns JSON only from GET and POST requests. I have tried so many things that I would have a hard time listing all of them, and I now realise I have been working with conflicting information (regarding old versions of Azure, etc.), so I think it best to start with a clean slate! I am working with Visual Studio 2008, using the February 2010 tools for Visual Studio. So, according to the following link, HTTP compression has now been enabled: http://msdn.microsoft.com/en-us/library/ff436045.aspx ... and I've used the advice at the following page (the urlCompression advice only), but I get no compression: http://blog.smarx.com/posts/iis-compression-in-windows-azure

      <urlCompression doStaticCompression="true" doDynamicCompression="false" dynamicCompressionBeforeCache="true" />

    It doesn't help that I don't know what the difference is between urlCompression and httpCompression. I've tried to find out, but to no avail! Could the fact that the tools for Visual Studio were released before the version of Azure which supports compression be a problem? I read somewhere that with the latest tools you can choose which version of the Azure OS you want to use when you publish... but I don't know if that's true, and if it is, I can't find where to choose. Could I be using a version from before HTTP compression was enabled? I've also tried the Blowery HTTP compression module, but no results. Does anyone have any up-to-date advice on how to achieve this? i.e. advice that relates to the current version of the Azure OS. Cheers! Steven
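
    One detail worth flagging (an observation on the quoted config, not something resolved in the post): the element above explicitly sets doDynamicCompression="false", and a WCF service returning JSON is dynamic content, so that setting by itself rules out compressing these responses. The first thing to try is flipping it on:

      <urlCompression doStaticCompression="true" doDynamicCompression="true" dynamicCompressionBeforeCache="true" />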

    Read the article

  • Lossless cutting of MPEG TS files in Windows

    - by Sebastian P.R. Gingter
    I have several HD video files in transport stream (.ts) format, recorded with my satellite receiver. I want to cut them - simply remove a few minutes from the beginning, the end, and sometimes a few minutes in the middle (early starts of recordings, late ends and, for some files, the ads). What is a good, ideally but not necessarily free, piece of software with a GUI to do this? Best would be something where you can select points on a timeline and simply cut the elements out. As a resulting file, the same .ts format would be great, but I could also live with putting the video contents into another container, as long as the video is NOT re-encoded / transcoded. The files have additional audio streams and subtitles, and these should be retained in the process. My OS is Windows.
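
    Not a GUI, but as a point of reference (a sketch of mine, not from the post, assuming a reasonably recent ffmpeg): ffmpeg can cut a .ts losslessly with stream copy, keeping all audio and subtitle streams; cuts land on keyframes, so they are only frame-approximate. The timestamps are placeholders:

      ffmpeg -i recording.ts -ss 00:02:00 -to 01:28:00 -map 0 -c copy cut.ts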

    Read the article

  • Dirt compression from vehicle tires

    - by Mungoid
    So I kind of have this working, but it's not correct because it just averages, so I wanted to know if anyone here has any ideas. I'm trying to simulate loose dirt compression under the tires of a vehicle, to reduce the potential bumpiness of 'chunky' terrain. Currently, how I do this is that I have a bounding-box shape around my tires, set a little lower so they intersect with the terrain. Each frame, I (currently) average the heights of all the terrain points that are within the box bounds of that tire, and then set them all to that average. Clearly this won't work in most cases because, for example, if I'm on a hill, the terrain will deform way too much. One way I thought of was to have a maximum and minimum amount the points could raise and lower, but that still doesn't seem to work properly and sometimes looks more like steps than smooth dirt. I want to say that there is probably a bit more to this than what I'm currently doing, but I am not sure where to look. Could anyone here shed some light on this subject? Would I benefit by looking up some smoothing algorithm or something similar?

    Read the article

  • Ghost Records, Backups, and Database Compression…With a Pinch of Security Considerations

    - by Argenis
    Today Jeffrey Langdon (@jlangdon) posed the following questions on #SQLHelp, so I set out to answer them, and I said to myself: “Hey, I haven't blogged in a while, how about I blog about this particular topic?”. Thus, this post was born. (If you have never heard of Ghost Records and/or the Ghost Cleanup Task, go see this blog post by Paul Randal.)

    1) Do ghost records get copied over in a backup?

    If you guessed yes, you guessed right. The backup process in SQL Server takes all data as it is on disk - it doesn't crack the pages open to selectively pick which slots have actual data and which ones do not. The whole page is backed up, regardless of its contents. Even if ghost cleanup has run and processed the ghost records, the slots are not overwritten immediately, but rather remain until another DML operation comes along and uses them. As a matter of fact, all of the allocated space for a database will be included in a full backup.

    So this poses a bit of a security/compliance problem for some of you DBA folk: if you want to take a full backup of a database after you've purged sensitive data, you should rebuild all of your indexes (with FILLFACTOR set to 100%). But the empty space on your data file(s) might still contain sensitive data! A SHRINKFILE might help get rid of that (not so) empty space, but that might not be the end of your troubles: you might STILL have (not so) empty space in your files! One approach that you can follow is to export all of the data in your database to another SQL Server instance that does NOT have Instant File Initialization enabled. This can be a tedious and time-consuming process, though, so you have to weigh your options and see what makes sense for you. Snapshot Replication is another idea that comes to mind.

    2) Does compression get rid of ghost records (2008)?

    The answer to this is no. The Ghost Records/Ghost Cleanup Task mechanism is alive and well on compressed tables and indexes. You can prove this by running a simple script:

      CREATE DATABASE GhostRecordsTest
      GO
      USE GhostRecordsTest
      GO
      CREATE TABLE myTable (myPrimaryKey int IDENTITY(1,1) PRIMARY KEY CLUSTERED,
                            myWideColumn varchar(1000) NOT NULL DEFAULT 'Default string value')
      ALTER TABLE myTable REBUILD PARTITION = ALL WITH (DATA_COMPRESSION = PAGE)
      GO
      INSERT INTO myTable DEFAULT VALUES
      GO 10
      DELETE myTable WHERE myPrimaryKey % 2 = 0
      DBCC TRACEON(2514)
      DBCC CHECKTABLE(myTable)

    Trace flag 2514 will make DBCC CHECKTABLE give you an extra tidbit of information in its output. For the above script: “Ghost Record count = 5”.

    Until next time,

    -Argenis

    Read the article

  • Weird unexpected image compression on a web server running Apache on Ubuntu?

    - by Billy Bob Thornton
    I have a weird problem on my production web server running Apache on Ubuntu: it compresses my images, thereby dramatically lowering their quality! Actually, I have two virtual hosts running, each located in a different folder. Whether I display .gif images by navigating the two sites or access them directly by their URL, their size and quality are invariably degraded. I tried with three different browsers: same problem. Using them on other sites on the Web: no problem. Of course I disabled mod_deflate on the server (which should not compress images anyway), but the phenomenon remains. On my local development server, running the same configuration, everything is OK. Now I'm completely lost! For the record, my configuration: Ubuntu 10.04, Apache 2, PHP 5.
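
    Two diagnostic steps that suggest themselves here (my additions, not from the post): check whether the bytes Apache serves actually differ from the file on disk, and list any content-filtering modules that might be rewriting images. The URLs and paths are placeholders:

      # cmp exits non-zero only if Apache altered the image in transit
      curl -s http://yoursite.example/img/logo.gif | cmp - /var/www/yoursite/img/logo.gif
      # Look for content-manipulating modules (e.g. mod_pagespeed)
      apachectl -M 2>/dev/null | grep -i -e pagespeed -e substitute -e filter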

    Read the article

  • Backup tape compression

    - by pufferfish
    What things should I check to confirm that compression is actually happening on our tape backup system? Although the tapes are marked as 200G/520G (native/compressed) capacity, they seem to fill up before the 200G mark (some at less than 100G). I'm using:

    - Sony AIT-4 tape autochanger
    - Sony SDX4-200C (AIT-4) tapes
    - Ubuntu Lucid
    - Bacula

    I've tried checking hardware compression with tapeinfo -f /dev/nst0, which gives:

      Product Type: Tape Drive
      Vendor ID: 'SONY    '
      Product ID: 'SDX-900V        '
      Revision: '0102'
      Attached Changer API: No
      SerialNumber: '0001000036'
      MinBlock: 2
      MaxBlock: 8388608
      SCSI ID: 1
      SCSI LUN: 0
      Ready: yes
      BufferedMode: yes
      Medium Type: Not Loaded
      Density Code: 0x33
      BlockSize: 0
      DataCompEnabled: yes
      DataCompCapable: yes
      DataDeCompEnabled: yes
      CompType: 0x3
      DeCompType: 0x3
      BOP: yes
      Block Position: 0
      Partition 0 Remaining Kbytes: 201778000
      Partition 0 Size in Kbytes: 201779000
      ActivePartition: 0
      EarlyWarningSize: 0
      NumPartitions: 0
      MaxPartitions: 0

    ... so I presume it's on. Note: the Bacula documentation says hardware compression needs to be enabled with "system tools such as mt".
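
    Two checks worth adding (a sketch on my part, assuming the mt-st package is installed, not something from the post): toggle the drive's compression explicitly, and keep in mind that hardware compression cannot shrink data that is already compressed - if Bacula's software compression (e.g. GZIP in the FileSet) is enabled, the drive sees incompressible data and the tapes will fill at or below native capacity:

      # Explicitly ask the drive to enable hardware compression (mt-st)
      sudo mt -f /dev/nst0 compression 1
      # Re-read the drive's compression status afterwards
      sudo tapeinfo -f /dev/nst0 | grep -i comp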

    Read the article

  • What web-oriented language would work best with binary data?

    - by Qqwy
    I want to create a service where people can upload files. However, since file storage costs money, I want to compress the files so they take less space. I would like to write my own compression algorithm; however, PHP doesn't have good ways to handle binary data (which is needed for many compression algorithms). So I wondered: what would be a better language to create such a website in? I have knowledge of PHP (and JavaScript, HTML and CSS) but no experience with other things like Ruby, Perl, Python, and other web development languages.

    Read the article

  • http compression shared hosting apache/php

    - by gansodesoya
    Hi, I was sniffing the response headers of one of my sites, and apparently it is not using HTTP compression to deliver responses, because I'm not seeing Content-Encoding: gzip in the response headers. But the weird thing is that phpinfo() shows me HTTP_ACCEPT_ENCODING: gzip,deflate,sdch. I'm using a Rackspace cloud site (shared hosting, can't access the httpd config), and I really want to activate HTTP compression, but the support guys over there tell me that if phpinfo() says it, it's already on. Thanks!
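
    One note and a possible workaround (my additions, not from the thread): HTTP_ACCEPT_ENCODING is the browser's request header saying what it can accept, so seeing it in phpinfo() says nothing about whether the server actually compresses responses. On shared hosting, compression can often be enabled per-site; a sketch, assuming the host honors .htaccess and has mod_deflate or PHP's zlib available:

      # .htaccess - try mod_deflate first:
      AddOutputFilterByType DEFLATE text/html text/css application/javascript
      # ...or fall back to PHP-level compression if mod_deflate is unavailable:
      php_flag zlib.output_compression On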

    Read the article

  • 7-Zip Command Line Maximum Compression

    - by Steve Robathan
    I am writing a batch file to compress a folder using various archiving applications. I currently use 7-Zip as well, but I set its parameters up manually, and I would like to add 7-Zip to my batch. The folder concerned has many subfolders, and I need to take this into account. What is the command line for the following settings, keeping the folder structure (see the sketch below)?

    - Archive Format = 7z
    - Compression Level = Ultra
    - Compression Method = LZMA
    - Dictionary Size = 512 MB
    - Word Size = 273
    - Solid Archive

    Many thanks.
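
    A sketch of a matching command line (the switches mirror the GUI settings above; the archive and folder names are placeholders, and note that a 512 MB LZMA dictionary needs several gigabytes of RAM to compress):

      rem a = add, -t7z = 7z format, -mx=9 = Ultra, -m0=lzma = LZMA method,
      rem -md=512m = dictionary size, -mfb=273 = word size, -ms=on = solid archive.
      rem Subfolders of C:\MyFolder are included recursively, structure preserved.
      7z a -t7z -mx=9 -m0=lzma -md=512m -mfb=273 -ms=on "archive.7z" "C:\MyFolder\"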

    Read the article

  • Finding out if a FLAC or WAVPACK audio file is NOT originally encoded from a lossy source

    - by cornel
    Is there a way of checking that a so-called FLAC or WavPack audio file was originally encoded from a lossless source (WAV, CDA, APE, etc.) instead of a lossy source (MP3, AAC, ATRAC, etc.)? Say I have a lossy MP3 audio file (5.17 MB, 87% compressed from its original, source unknown). I then encode it to another, lossless format, say FLAC or WavPack. The size increases (23.14 MB, 39% compressed from its original, source MP3)! The ID tags, etc., remain the same, and there's no way of checking the integrity of its origin. How do I go about doing that?
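
    One widely used heuristic (my suggestion, not from the post): lossy encoders discard high frequencies, so audio transcoded from MP3 usually shows a hard spectral cutoff around 16-20 kHz, while a true lossless rip extends up toward the Nyquist limit. Assuming SoX built with PNG support, you can eyeball this with a spectrogram:

      sox suspect.flac -n spectrogram -o suspect.png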

    Read the article
