Search Results

Search found 1246 results on 50 pages for 'compression'.

Page 34 of 50

  • Data Archiving vs not

    - by Recursion
    For the sake of data integrity, is it wiser to archive your files or just leave them unarchived? No compression is being used. My thinking is that if you leave your files unarchived and there is some form of corruption, it will only hurt a small number of files. But if you archive, let's say, all of your documents, even the slightest corruption can make the entire archive unrecoverable. So what's the best way to keep a clean file system but not be subject to data corruption? (One way to detect corruption early, whichever you choose, is sketched below.)

    Read the article
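
    A minimal sketch of per-file integrity checking, assuming a POSIX shell and GNU coreutils' sha256sum are available; the manifest path and directory name are illustrative, not taken from the post. It sidesteps the all-or-nothing risk of a single archive by recording and later verifying a checksum for every file individually:

        # Record a checksum for every file under ~/documents (illustrative path)
        find ~/documents -type f -print0 | xargs -0 sha256sum > ~/documents.sha256

        # Later, verify them; only files that changed or were corrupted are reported
        sha256sum --quiet -c ~/documents.sha256

    Tools such as par2 can go a step further and add recovery data, but even a plain checksum manifest answers the "did anything silently change?" question without putting every file into one fragile archive.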

  • Debugging logrotate postrotate script

    - by robert
    Following is my logrotate conf:

        /mnt/je/logs/apache/jesites/web/*.log {
            missingok
            rotate 0
            size 5M
            copytruncate
            notifempty
            sharedscripts
            postrotate
                /home/bitnami/.conf/compress-and-upload.sh /mnt/je/logs/apache/jesites/web/ web
            endscript
        }

    And the compress-and-upload.sh script:

        #!/bin/sh
        # Perform Rotated Log File Compression
        tar -czPf $1/log.gz $1/*.1
        # Fetch the instance id from the instance
        EC2_INSTANCE_ID="`wget -q -O - http://169.254.169.254/latest/meta-data/instance-id`"
        if [ -z $EC2_INSTANCE_ID ]; then
            echo "Error: Couldn't fetch Instance ID .. Exiting .."
            exit;
        else
            /usr/local/bin/s3cmd put $1/log.gz s3://xxxx/logs/$(date +%Y)/$(date +%m)/$(date +%d)/$2/$EC2_INSTANCE_ID-$(date +%H:%M:%S)-$2.gz
        fi
        # Removing Rotated Compressed Log File
        rm -f $1/log.gz

    The files are rotated, but the shell script is not executed. I don't know how to debug the postrotate script. Is there a log file I can check to see whether there are any permission issues? If I execute the script directly from the command line, the file upload works. Thanks. (A debugging sketch follows below.)

    Read the article
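
    A hedged sketch of how the postrotate hook above could be debugged, assuming GNU logrotate and a Bourne-compatible shell; the config path and trace log path are illustrative, not taken from the post:

        # Dry run: shows which files match and which scripts logrotate *would* run
        # (replace /etc/logrotate.d/jesites with the actual config file)
        logrotate -d /etc/logrotate.d/jesites

        # Force one real rotation with verbose output to exercise the postrotate script
        logrotate -vf /etc/logrotate.d/jesites

        # Inside compress-and-upload.sh, capture all output and trace execution
        # by adding these two lines near the top of the script:
        #   exec >> /tmp/compress-and-upload.log 2>&1
        #   set -x

    One common culprit: logrotate usually runs from cron with a minimal environment, so s3cmd may not find its configuration or even be on PATH; the trace output should make that visible.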

  • ntbackup workalike for ad hoc full backups in Windows 7 that's free and preferably open source

    - by Justin Dearing
    On Windows 2000 and XP machines I used to be able to do the following:

        ntbackup backup systemstate c: /f e:\backups\machineName\machineName-full+systemstate_200101206.bkf

    This gave me a full backup of the system that I could use to do a system restore after doing a barebones OS install. Windows 7 has a great utility for regular backups with alerting and all that stuff, but it does not seem to have command-line support. I'd like a backup solution for my Windows 7 systems that has the following features:
    - Is free
    - Is open source (preferably)
    - Works while the system is booted and leaves the system functional (Clonezilla is great for offline backups, and I use that too)
    - Gives me a backup that is suited for a full or partial system restore (ruling out most imaging software, even if it could work while the system is booted via some sort of shadow-copy voodoo)
    - Can work via the command line
    - Compression would be nice; the ability to pipe output would be better.

    Read the article

  • How can one tell that a FLAC or WavPack audio file was NOT originally encoded from a lossy source?

    - by cornel
    Hi everyone, forgive me for my ignorance, firstly. Problem: say I have a lossy MP3 audio file (5.17 MB, i.e. 87% compressed from its original source, which is unknown), and I then encode it to a lossless format, say FLAC or WavPack. The size increases (23.14 MB, i.e. 39% compressed from its original source, the MP3)! The ID tags etc. remain the same, and there's no way of checking the integrity of its origin. Question: is there a way of checking that the so-called FLAC or WavPack audio file was originally encoded from a lossless source (WAV, CDA, APE, etc.) instead of a lossy source (MP3, AAC, ATRAC, etc.)? Thank you. Best regards, L-I-C (Lost In Compression). (One heuristic is sketched below.)

    Read the article
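
    Nothing inside a FLAC or WavPack file records its ancestry, but a common heuristic is to inspect a spectrogram: most lossy encoders cut off high frequencies (often around 16-20 kHz), and that shelf survives a transcode to a lossless container. A minimal sketch assuming SoX is installed; file names are illustrative:

        # Render a spectrogram of the suspect file to a PNG for visual inspection
        sox suspect.flac -n spectrogram -o suspect-spectrogram.png

        # A genuine lossless rip usually shows energy up to ~22 kHz;
        # a file transcoded from MP3 typically shows a hard cutoff around 16-20 kHz.

    It is only a heuristic - a carefully produced file can fool it, and some legitimate material is band-limited - but it catches the common MP3-to-FLAC transcode case.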

  • Virtual Server 2005 R2 kungfu

    - by AngryHacker
    Does Virtual Server 2005 R2 have a command-line interface that's versatile enough? Here is the situation. I run a Win2k VM on an old, memory-constrained machine. I allocate it 378 MB of RAM and the VM runs just fine. Once a month, inside the VM, I back up the (very large) database, compress it using 7-Zip and FTP it to the backup site (all in a script). Unfortunately the compression part takes a massive amount of RAM (far exceeding the 378 MB); it goes for the paging file, brings absolutely everything to a crawl and literally takes 2-3 days if left unattended. To fix this, I have to shut down the VM, temporarily give it 768 MB of RAM, and then the whole thing finishes in 20 minutes. So, is there a way to do the following automatically from the host machine in a script?
    - Shut down the guest OS (I think I've got this part)
    - Change the RAM allocation from 378 to 768 MB
    - Start the guest OS again
    - Then, 1 hour later, do everything in reverse.

    Read the article

  • Incremental backup and sync software

    - by martjno
    I need free software for Windows (with a GUI or command line) that does incremental backups: copying all files, and storing changed or deleted files in a directory named after the last change date (or a progressive number). To be more precise: D:\ is my data drive and E:\ is my backup drive. If I want to back up all my data from D:\:
    - E:\d_lastbackup\ will contain a plain copy of all the files and folder content of D (no compression or archiving, same file attributes)
    - E:\d_20090822\ will contain all files (with their full path) that were changed or deleted since the previous version
    - E:\d_20090820\ will contain all files (with their full path) that were changed or deleted since the previous version
    - and so on...
    I had software that worked perfectly with an old Maxtor USB hard disk, but it works only on that device. Any suggestions? (A possible approach is sketched below.)

    Read the article
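
    One way to get exactly this layout is rsync's --backup/--backup-dir options; a minimal sketch, assuming an rsync build is available on Windows (for example via Cygwin or cwRsync - an assumption, not something the poster mentioned) and that the drives are reachable at the paths shown:

        # Mirror D:\ into E:\d_lastbackup, and move anything that changed or was
        # deleted since the previous run into a dated side directory instead of
        # discarding it (paths in Cygwin notation; adjust for your setup)
        STAMP=$(date +%Y%m%d)
        rsync -a --delete \
              --backup --backup-dir=/cygdrive/e/d_${STAMP} \
              /cygdrive/d/ /cygdrive/e/d_lastbackup/

    Wrapped in a small script or scheduled task, this produces the "current mirror plus dated folders of superseded files" layout described above.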

  • Software that will burn DVDs with a SFV or Pararchive file for each disc?

    - by Matt
    I'd like to burn several thousand RAW files (.DNG) of around 10-30 MB each to DVD to back up my photo archive. I'm looking for software that can do this and include an SFV-type file on each disc burnt. These are my requirements:
    - Compression is optional, and probably not desirable due to the extra time involved.
    - Files should be spread amongst the discs as self-contained units, i.e. I don't want to have to load files from more than one disc to be able to read the files on that disc, so that excludes WinRAR's spanning options.
    - I don't want to spend time writing ISO images first, as this will be a task I'll need to repeat often as I add new images to my archive - the software should write to the DVDs for me as simply as possible.
    - I'd like the SFV/Pararchive/recovery record to be stored with the files on each disc, so it only references the files on that particular disc.
    Thanks in advance! (A command-line sketch is included below.)

    Read the article
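
    A sketch of the general workflow using command-line tools - sha1sum from coreutils and growisofs from dvd+rw-tools, which are assumptions about available tooling and are more at home on Linux or Cygwin than native Windows; directory and volume names are illustrative:

        # For one disc's worth of files collected under ./disc_01:
        cd disc_01

        # Write a checksum manifest that lives on the same disc it describes
        find . -type f ! -name checksums.sha1 -exec sha1sum {} + > checksums.sha1

        # Burn the directory straight to DVD without authoring an ISO file first
        growisofs -Z /dev/dvd -R -J -V PHOTO_ARCHIVE_01 .

    The manifest only references files on its own disc, which matches the self-contained requirement; swapping sha1sum for par2 would additionally give recovery data rather than just corruption detection.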

  • When to use delaycompress option in logrotate?

    - by Anand Chitipothu
    The man page of logrotate says: "It can be used when some program cannot be told to close its logfile and thus might continue writing to the previous log file for some time." I'm confused by this. If a program cannot be told to close its logfile, it will continue to write forever, not just for some time. If the compression is postponed to the next rotation cycle, the program continues to write to that file even after the next rotation cycle. How does postponing solve the problem? My understanding is that copytruncate should be used when a program cannot be told to close its logfile. I'm aware that some data written to the logfile gets lost while the copy is in progress. I was looking at the logrotate file for CouchDB, and it had both the copytruncate and delaycompress options:

        /usr/local/couchdb-1.0.1/var/log/couchdb/*.log {
            weekly
            rotate 10
            copytruncate
            delaycompress
            compress
            notifempty
            missingok
        }

    It looks like there is no point using delaycompress when copytruncate is already there. What am I missing?

    Read the article

  • Nginx + PHP-FPM on CentOS 6.5 gives me 502 Bad Gateway (fpm error: unable to read what child say: Bad file descriptor)

    - by Latheesan Kanes
    I am setting up a standard LEMP stack. My current setup is giving me the following error: 502 Bad Gateway. This is what is currently installed on my server: Here are the configurations I've created/updated so far; can someone take a look at the following and see where the error might be? I've already checked my logs - there's nothing in there (http://i.imgur.com/iRq3ksb.png) - and I saw the following in the /var/log/php-fpm/error.log file. Side note: both nginx and php-fpm have been configured to run under a local account called www-data, and the following folders exist on the server.

    nginx.conf (global nginx configuration):

        user www-data;
        worker_processes 6;
        worker_rlimit_nofile 100000;
        error_log /var/log/nginx/error.log crit;
        pid /var/run/nginx.pid;

        events {
            worker_connections 2048;
            use epoll;
            multi_accept on;
        }

        http {
            include /etc/nginx/mime.types;
            default_type application/octet-stream;

            # cache information about FDs, frequently accessed files can boost performance
            open_file_cache max=200000 inactive=20s;
            open_file_cache_valid 30s;
            open_file_cache_min_uses 2;
            open_file_cache_errors on;

            # to boost IO on HDD we can disable access logs
            access_log off;

            # copies data between one FD and other from within the kernel, faster than read() + write()
            sendfile on;

            # send headers in one piece, it's better than sending them one by one
            tcp_nopush on;

            # don't buffer data sent, good for small data bursts in real time
            tcp_nodelay on;

            # server will close connection after this time
            keepalive_timeout 60;

            # number of requests client can make over keep-alive -- for testing
            keepalive_requests 100000;

            # allow the server to close connection on non responding client, this will free up memory
            reset_timedout_connection on;

            # request timed out -- default 60
            client_body_timeout 60;

            # if client stops responding, free up memory -- default 60
            send_timeout 60;

            # reduce the data that needs to be sent over network
            gzip on;
            gzip_min_length 10240;
            gzip_proxied expired no-cache no-store private auth;
            gzip_types text/plain text/css text/xml text/javascript application/x-javascript application/xml;
            gzip_disable "MSIE [1-6]\.";

            # Load vHosts
            include /etc/nginx/conf.d/*.conf;
        }

    conf.d/www.domain.com.conf (my vhost entry):

        ## Nginx php-fpm Upstream
        upstream wwwdomaincom {
            server unix:/var/run/php-fcgi-www-data.sock;
        }

        ## Global Config
        client_max_body_size 10M;
        server_names_hash_bucket_size 64;

        ## Web Server Config
        server {
            ## Server Info
            listen 80;
            server_name domain.com *.domain.com;
            root /home/www-data/public_html;
            index index.html index.php;

            ## Error log
            error_log /home/www-data/logs/nginx-errors.log;

            ## DocumentRoot setup
            location / { try_files $uri $uri/ @handler; expires 30d; }

            ## These locations would be hidden by .htaccess normally
            #location /app/ { deny all; }

            ## Disable .htaccess and other hidden files
            location /. { return 404; }

            ## Magento uses a common front handler
            location @handler { rewrite / /index.php; }

            ## Forward paths like /js/index.php/x.js to relevant handler
            location ~ .php/ { rewrite ^(.*.php)/ $1 last; }

            ## Execute PHP scripts
            location ~ \.php$ {
                try_files $uri =404;
                expires off;
                fastcgi_read_timeout 900;
                fastcgi_pass wwwdomaincom;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                include fastcgi_params;
            }

            ## GZip Compression
            gzip on;
            gzip_comp_level 8;
            gzip_min_length 1000;
            gzip_proxied any;
            gzip_types text/plain application/xml text/css text/js application/x-javascript;
        }

    /etc/php-fpm.d/www-data.conf (my php-fpm pool config):

        ## Nginx php-fpm Upstream
        upstream wwwdomaincom {
            server unix:/var/run/php-fcgi-www-data.sock;
        }

        ## Global Config
        client_max_body_size 10M;
        server_names_hash_bucket_size 64;

        ## Web Server Config
        server {
            ## Server Info
            listen 80;
            server_name domain.com *.domain.com;
            root /home/www-data/public_html;
            index index.html index.php;

            ## Error log
            error_log /home/www-data/logs/nginx-errors.log;

            ## DocumentRoot setup
            location / { try_files $uri $uri/ @handler; expires 30d; }

            ## These locations would be hidden by .htaccess normally
            #location /app/ { deny all; }

            ## Disable .htaccess and other hidden files
            location /. { return 404; }

            ## Magento uses a common front handler
            location @handler { rewrite / /index.php; }

            ## Forward paths like /js/index.php/x.js to relevant handler
            location ~ .php/ { rewrite ^(.*.php)/ $1 last; }

            ## Execute PHP scripts
            location ~ \.php$ {
                try_files $uri =404;
                expires off;
                fastcgi_read_timeout 900;
                fastcgi_pass wwwdomaincom;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                include fastcgi_params;
            }

            ## GZip Compression
            gzip on;
            gzip_comp_level 8;
            gzip_min_length 1000;
            gzip_proxied any;
            gzip_types text/plain application/xml text/css text/js application/x-javascript;
        }

    I've got a file at /home/www-data/public_html/index.php containing <?php phpinfo(); ?> (uploaded as user www-data). (A few debugging commands are sketched below.)

    Read the article
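
    A hedged sketch of how the 502 / "unable to read what child say" symptom could be narrowed down, assuming the socket path from the vhost above; cgi-fcgi and its package name are assumptions about available tooling:

        # Is php-fpm running, and is the socket nginx expects actually there?
        ps aux | grep php-fpm
        ls -l /var/run/php-fcgi-www-data.sock

        # Permission problems on the socket are a common cause of 502s:
        # can the www-data account use it?
        sudo -u www-data test -w /var/run/php-fcgi-www-data.sock \
            && echo "socket writable by www-data" \
            || echo "socket NOT writable by www-data"

        # Talk FastCGI to the pool directly, bypassing nginx entirely
        # (cgi-fcgi ships with the fcgi package -- an assumption)
        SCRIPT_NAME=/index.php \
        SCRIPT_FILENAME=/home/www-data/public_html/index.php \
        REQUEST_METHOD=GET \
        cgi-fcgi -bind -connect /var/run/php-fcgi-www-data.sock

    If the direct FastCGI request works but nginx still returns 502, the problem most likely sits between nginx and the socket (user, permissions, or a mismatched listen path in the pool config); if it fails the same way, the pool itself is the place to look.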

  • How can I diagnose cache misses when using Apache as a reverse proxy?

    - by johnstok
    I have set up Apache 2.2 as a reverse proxy with the following configuration:

        # jBoss proxying
        ProxyRequests Off
        <Proxy *>
            Order deny,allow
            Allow from all
        </Proxy>
        ProxyPass /foo http://localhost:9080/foo
        ProxyPassReverse /foo http://localhost:9080/foo
        ProxyPassReverseCookiePath /foo /foo

        # Reverse proxy caching
        CacheEnable disk /foo

        # Compression
        SetOutputFilter DEFLATE
        BrowserMatch ^Mozilla/4 gzip-only-text/html
        BrowserMatch ^Mozilla/4\.0[678] no-gzip
        BrowserMatch \bMSIE\s(7|8) !no-gzip !gzip-only-text/html
        DeflateCompressionLevel 9
        Header append Vary User-Agent env=!dont-vary

    However, in a number of cases where I expect a cached response to be returned, the request is sent through to the origin server at localhost:9080. Responses have an HTTP Vary header of 'Accept-Encoding,User-Agent', which is to be expected given the mod_deflate configuration. How can I determine why Apache is unable to serve a response from the cache? (A diagnostic sketch follows below.)

    Read the article
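
    A sketch of two ways the cache behaviour could be observed from the outside, assuming curl is available and the site is reachable on localhost; the error-log path is an assumption. With mod_cache, a response served from the disk cache normally carries an Age header, and raising the log level makes mod_cache explain its decisions in the error log:

        # Compare two identical requests: a response served from the cache usually
        # carries an Age header, and Date stops advancing between requests
        curl -sI -H 'Accept-Encoding: gzip' -A 'Mozilla/5.0' http://localhost/foo/ | grep -iE 'age|date|vary|content-encoding'
        curl -sI -H 'Accept-Encoding: gzip' -A 'Mozilla/5.0' http://localhost/foo/ | grep -iE 'age|date|vary|content-encoding'

        # With LogLevel debug set in httpd.conf, mod_cache logs why entities
        # are or are not cached / served from the cache
        tail -f /var/log/apache2/error.log | grep -i cache

    Note that with Vary: Accept-Encoding,User-Agent the cache keeps a separate entry per User-Agent and encoding combination, so two requests only share an entry if they send identical values for both headers - mismatched test clients will always look like misses.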

  • Modifying the install environment for RH-like installations

    - by javanix
    I am trying to modify the basic installation environment (i.e., what Anaconda runs in) for a customized CentOS distribution. For the first try, I would just like to modify a few of the splash images. My initial attempt entailed:
    1) Mount images/install.img to a directory ~/img/
    2) Copy all files from ~/img/ to ~/tmpimg/
    3) Modify the splash images
    4) mkisofs -o ~/final/install.img
    5) cp ~/final/install.img back to my ~/cdroot/ folder and remake the ISO
    However, the image generated in step 4 doesn't even come close to matching the file size of the original install.img (meaning that install.img must be created in some other fashion, using compression), and it fails when I boot my ISO. What settings should I be using to make the install.img file? Is there some other technique for modifying CentOS install environments? (A sketch of one likely approach follows below.)

    Read the article
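
    install.img on CentOS/RHEL media of that era is typically a squashfs image rather than an ISO9660 image, which would explain both the size mismatch and the boot failure; a hedged sketch, assuming squashfs-tools is installed and that file(1) confirms the format first:

        # Confirm what the original image actually is before rebuilding it
        file images/install.img      # expect something like "Squashfs filesystem"

        # Unpack, modify, and repack with squashfs-tools instead of mkisofs
        unsquashfs -d ~/tmpimg images/install.img
        # ... edit the splash images under ~/tmpimg ...
        mksquashfs ~/tmpimg ~/final/install.img -all-root

    Depending on the release, Anaconda may only accept gzip-compressed squashfs, so passing -comp gzip to mksquashfs (where the tools support it) is worth trying if the rebuilt image still fails to boot.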

  • My file download speed is 180-200kbps but speedtest shows 1.7mbps and I am subscribed to a 2mbps pack?

    - by edward
    I'm from Malaysia. I am subscribed to a 2 Mbps pack from my country's ISP, but all I get is about 200 kbps when downloading files from the internet, while a test on speedtest.com shows 1.7 Mbps. I am pretty confused; the ISP should give users what they stated, i.e. 2 Mbps. But I only receive 200 kbps while the speed test shows 1.7 Mbps, which confuses me even further. I started googling and found out it may be related to something called VPN compression? Can anyone explain how these things work, and why I am getting a much lower speed than I subscribed to? (I am using a D-Link DSL-2750U modem.) Thanks. (The unit arithmetic below may explain most of the gap.)

    Read the article
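
    A likely explanation, assuming the download figure the poster sees is the usual browser/download-manager number in kilobytes per second (KB/s) while the plan and speedtest figures are in megabits per second (Mbps) - an assumption, since the post writes both as "kbps":

        1.7 Mbps = 1,700,000 bits/s / 8 bits per byte ~ 212 KB/s
        2.0 Mbps = 2,000,000 bits/s / 8 bits per byte = 250 KB/s

    So a download running at 180-200 KB/s is roughly what a 1.7 Mbps line delivers once protocol overhead is accounted for; the two numbers describe the same speed in different units rather than an eight-fold shortfall.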

  • Slow upload to Server 2008 DC, Downloads work as expected

    - by Anthony
    I have a Windows Server 2008 Domain Controller that I run as a do-it-all server. It has a GbE connection to the network and to every machine on the network. Downloads from the server file shares work as expected, between 70MB/s and 80MB/s to all the machines. However, when I try to copy files back up to the server, speeds fall to 7MB/s-10MB/s. I've disabled flow control and large send offload properties on all the NICs. I had this problem before and managed to fix it through some properties changes, but like an idiot, I never documented my fix and have since moved to a new server. Any ideas what I need to do to get the speeds to be more symmetric? EDIT: Remote differential compression is also disabled.

    Read the article

  • Video codec that can be read on clean installs of Windows, OS X and Ubuntu

    - by fmercille
    I have to make a video that will need to be watched on different operating systems. Is there a "universal" video codec that can be played on Windows, OS X and Linux without requiring additional plugins or players other than those that come on a default clean install of each of those systems? Compression is not an issue; I'm merely looking for compatibility (e.g. for audio, I would use WAV as a universal codec). Note: I must assume that the video will be distributed in countries where software patents are enforced, and therefore can't rely on the user to install non-free codecs on Linux. Thanks.

    Read the article

  • Creating a separate static content site for IIS7 and MVC

    - by JK01
    With reference to this Server Fault blog post: A Few Speed Improvements, where it talks about how static content for Stack Exchange is served from a separate cookieless domain... How would someone go about doing this on IIS 7.5 for an ASP.NET MVC site? The plan so far:
    - Register a domain, e.g. static.com, and create a new website in IIS
    - Manually copy the js / css / images folders from MVC as-is so that they have the same paths on the new server
    - Enable IIS gzip settings (js/css = high compression, images = none)
    - Set caching with far-future expiry dates: <clientCache cacheControlCustom="public" /> in the web.config
    - Never set any cookies on the static.com site
    - Combine and minimize js / css
    - Auto-deploy changes in static content with WebDeploy
    Is this plan correct? And how can you use WebDeploy to deploy the whole web app to one server and then only the static items to another? I can see there is a similar question, but for Apache: Creating a cookie-free domain to serve static content, so it doesn't apply.

    Read the article

  • Video/Image processing on Apple iPhone4 [closed]

    - by goldenmean
    Hello, I know the Apple iPhone 4 supports H.264 and MPEG-4 as video codecs, and JPEG and M-JPEG as image codecs. I was looking to get some information on the iPhone 4's video/image codec and processing chips/SoC parts. 1] Does anyone know which vendor provides the SoCs that enable these image/video compression and processing solutions in the iPhone 4? 2] Are the video/image codec solutions 'software codecs' running on a specialized DSP core/processor, or hard-wired as in an FPGA/ASIC solution? Any pointers would be useful. Thank you. -AD.

    Read the article

  • Is Protune for video only or may be used for photo too?

    - by Green
    I have a Hero3+ Black Edition. I can't understand whether Protune is for video only or can be applied to photos as well. The manual says it is for both video and photo (page 35):

        High-Quality Image Capture
        Protune's high data rate captures images with less compression, giving content creators higher quality for professional productions.

        Film/TV Rate Standard
        While shooting in Protune, you have the option of recording video in cinema quality 24 fps to easily intercut GoPro content with other source media without the need to perform fps conversion.

    But at the same time their site says that Protune is for video only: "To record Protune footage, you'll need to turn Protune ON in your camera's settings menu." What is Protune for? Photo? Video? Or both?

    Read the article

  • Data CD for audiobooks?

    - by Marco7757
    I'm trying to burn my .m4b audiobook files to a CD. I was impressed by the compression rate (10 hours of audiobook within 150 MB?!). The problem is that I cannot burn it as an audio CD, as those allow only about 80 minutes of audio (the audiobook is about 10+ hours). I burned them as a data CD instead. It works but, of course, the downside of a data CD is that not every player (e.g. car, stereo) can play data CDs. What can I do? I don't want to waste 100 CDs on such a simple problem... is there any way to burn an audio CD? I mean, just looking at the file size this shouldn't be a problem, should it? Why is an audio CD only able to play up to 80 minutes? (The arithmetic below explains the 80-minute limit.)

    Read the article
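
    The 80-minute ceiling comes from the audio CD (CD-DA) format itself, not from file sizes: audio tracks are stored as uncompressed 16-bit, 44.1 kHz stereo PCM, so the disc's capacity is fixed in minutes regardless of how small the source files are. For reference:

        44,100 samples/s x 2 bytes x 2 channels = 176,400 bytes of PCM per second
        176,400 B/s x 80 min x 60 s             ~ 847 MB of raw PCM

    That ~847 MB is the physical capacity of an 80-minute disc's audio area (audio sectors carry 2,352 bytes versus 2,048 for data sectors, which is why the same disc holds only ~700 MB of files). A 10-hour audiobook therefore needs either many audio CDs or a player that reads compressed files (MP3/AAC) from a data disc.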

  • MP4 video - edit audio track

    - by Maccaius
    I have recorded some nice sport videos with my GoPro HD action camera. I would like to edit the audio track. I don't want to get rid of the whole audio track - just erase small parts (e.g. compression artifacts or me saying some swear words). Once the original audio track is cleaned up, I'd add another music layer in FCE afterwards. I'd really like to edit the audio in something like WaveLab. Any ideas? (A possible round-trip workflow is sketched below.)

    Read the article
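
    A hedged sketch of one round trip, assuming ffmpeg is available and that the camera's MP4 has one video and one audio stream; file names are illustrative:

        # 1. Pull the audio out as an uncompressed WAV that any audio editor can open
        ffmpeg -i gopro_clip.mp4 -vn -acodec pcm_s16le clip_audio.wav

        # 2. Edit clip_audio.wav in the wave editor (cut the artifacts / swear words)

        # 3. Put the edited audio back, copying the video stream untouched
        ffmpeg -i gopro_clip.mp4 -i clip_audio_edited.wav \
               -map 0:v -map 1:a -c:v copy -c:a aac clip_fixed.mp4

    The intermediate WAV is lossless, so nothing is degraded until the final re-encode of the audio track.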

  • RSync over SSH hangs and fails with timeout

    - by tx2
    Client: Gentoo, GCC 4.3.4, rsync 3.0.9. Server: Ubuntu 10.04.4 LTS, rsync 3.0.7. Client and server are connected over the Internet at about 2 Mbps; ping is OK. rsync, called on any files in either direction, hangs on a random file and then, after a timeout, fails with:

        [sender] io timeout after 30 seconds -- exiting
        rsync error: timeout in data send/receive (code 30) at io.c(140) [sender=3.0.9]
        [sender] _exit_cleanup(code=30, file=io.c, line=140): about to call exit(30)

    About 1 try in 10 passes correctly. I've tried adding the SSH options TcpRcvBufPoll=yes and KeepAlive=yes, and disabling and enabling rsync compression - no change. How can I make rsync work properly? (Some options to try are sketched below.)

    Read the article
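
    A sketch of settings commonly tried for flaky WAN links, assuming OpenSSH as the transport; the interval and timeout values and paths are illustrative:

        # Keep the SSH session alive and tolerate stalls instead of aborting at 30 s
        rsync -av --partial --progress --timeout=300 \
              -e "ssh -o ServerAliveInterval=15 -o ServerAliveCountMax=8" \
              /local/path/ user@server:/remote/path/

        # If a transfer still dies, --partial lets the next run resume the
        # interrupted file rather than starting it from scratch.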

  • IIS7 is gzipping files but not serving the gzipped version.

    - by ptrin
    By following a number of helpful blog posts I have configured IIS to gzip my static files. I have even enabled Failed Request Tracing and filtered to the 200 status code, and I can see the successful compression events taking place as well as the finished headers, which look like this:

        Headers="Content-Type: text/css
        Content-Encoding: gzip
        Last-Modified: Mon, 04 Oct 2010 17:35:08 GMT
        Accept-Ranges: bytes
        ETag: "02ef37cea63cb1:0"
        Vary: Accept-Encoding
        Server: Microsoft-IIS/7.5
        X-Powered-By: ASP.NET
        "

    However, when I test in Fiddler and Firefox, the Content-Encoding header is missing and the file is not gzipped. This is a similar issue to this question, which was never resolved. IIS is generating the gzipped files, which I can see in C:\inetpub\temp\IIS Temporary Compressed Files. Does anyone know how I can troubleshoot this? (A quick client-side check is sketched below.)

    Read the article
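
    A hedged client-side check, assuming curl is available (it removes browser and Fiddler variables); the URL is illustrative. IIS only starts serving the compressed copy of a static file once its "frequent hit" threshold is met, and it can also skip compression for HTTP/1.0 or proxied requests, so those settings are worth ruling out:

        # Request the file twice in quick succession with an explicit Accept-Encoding header
        curl -s -I -H "Accept-Encoding: gzip" http://yoursite/styles/site.css
        curl -s -I -H "Accept-Encoding: gzip" http://yoursite/styles/site.css

        # Look for "Content-Encoding: gzip" in the second response; if it never appears,
        # check serverRuntime/frequentHitThreshold and the httpCompression attributes
        # noCompressionForHttp10 / noCompressionForProxies in applicationHost.config
        # (attribute names per IIS 7.x configuration -- verify against your version).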

  • "The requested operation could not be completed due to a file system limitation" 3202

    - by user46529
    I back up a SQL Server database and it fails:

        BACKUP DATABASE dd TO DISK = '\\backupServer\backups\dd.bak'
        WITH COMPRESSION, CHECKSUM, NOFORMAT, INIT,
             BlockSize = 65536, BufferCount = 2200, MaxTransferSize = 4194304

    The backup size is 3 TB and I have 6 TB of free space on the backup server. I am using backup parameters per the SQLCAT whitepaper. Everything works OK when I back up to a local HDD, but it always fails when I back up to the network share, after about 6 hours. I can't find out why. Thank you. Update: yes, the backup over the network is fastest and saves me 3 TB of local disk space :) Thanks for pointing to the memory issue - I left 4 GB to the OS and it worked! (The buffer-memory arithmetic below shows the scale involved.)

    Read the article
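
    For reference, the memory pressure the poster's update points to follows directly from the chosen parameters: the backup buffers alone need roughly BufferCount x MaxTransferSize of memory:

        2200 buffers x 4,194,304 bytes (4 MB) ~ 9.2 GB of buffer space

    If the server cannot spare that on top of the OS and SQL Server's own working set, the backup can fail partway through, as the follow-up suggests happened here; reducing BufferCount/MaxTransferSize (or leaving more memory to the OS, as the poster did) relieves it.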

  • Why does sub_filter seem to not work when used in conjunction with proxy_pass?

    - by kylehayes
    Given the following configuration of nginx:

        server {
            listen 80;
            server_name apilocal;

            sub_filter "apiupstream/api" "apilocal";
            sub_filter_once off;

            location /people/ {
                proxy_pass http://apiupstream/api/people/;
                proxy_set_header Accept-Encoding "";
            }
        }

    sub_filter does not properly replace parts of the response. Once I remove proxy_pass from the configuration, it works properly. A lot of folks with this problem end up having gzip compression from the upstream server. I've verified that my upstream server does not have gzip encoding turned on for its responses, but just in case, I've also used the proxy_set_header above to not accept gzip. Is there potentially something else I'm missing? (A quick check of the upstream response is sketched below.)

    Read the article
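
    A hedged sketch for confirming what the upstream actually sends, assuming curl can reach it directly. Besides gzip, sub_filter only rewrites responses whose Content-Type matches sub_filter_types (text/html by default), so a JSON API would need that directive as well - an educated guess, not something verified against this setup:

        # Inspect the raw upstream response headers: Content-Encoding and Content-Type
        # both affect whether sub_filter will touch the body
        curl -sI http://apiupstream/api/people/ | grep -iE 'content-encoding|content-type|transfer-encoding'

        # If the API returns e.g. application/json, the server block would also need:
        #   sub_filter_types application/json;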

  • Windows Explorer - How can a large file have a zero "Size on disk" value? What does it mean?

    - by Jaans
    I would expect some discrepancy between "Size" and "Size on disk" in Windows Explorer due to file system allocation, etc. Below is a screenshot of an example file on a Windows 2012 R2 file server that has an 81.4 MB "Size", but whose "Size on disk" is 0 bytes. What gives? I have other files doing the same, yet another set of files and folders behaves as expected, showing a size on disk relatively close to the actual file size. The volume is a basic disk, formatted with NTFS and the default 4K allocation units. No compression is set for any file or folder on the volume. (For the more paranoid: I did a malware scan, and also confirmed there are no alternate data streams (ADS) associated with the file in question.) The user account running Windows Explorer is the domain administrator, and the file owner is also the domain administrator. Thanks for reading!

    Read the article

  • Is Clonezilla a good option for a daily batch-file-based backup of a Windows XP PC?

    - by rossmcm
    Having just been through the process of rebuilding a Windows XP desktop machine when the disk died, I'm anxious to make it a lot less painful. I didn't lose any data, but reinstalling everything took ages. Clonezilla seems to be a highly mentioned free backup tool. How easy would it be to implement the following?
    - A nightly unattended backup of the desktop's disk image to another network machine (or a second drive in the machine), hopefully with compression.
    - Restore from that image using USB boot media.
    So that if I come in to work and find the hard drive has tanked, it is just a matter of replacing the dead drive with a new one, booting from the USB stick, choosing the image to restore, and then finding something else to do for an hour or two. When it is finished I would hopefully be back to where I was.

    Read the article
