Search Results

Search found 23404 results on 937 pages for 'script compression'.

  • Too big a difference between loud and quiet when watching DVDs... constantly have to adjust volume.

    - by Dan
    It is hard for me to find the right volume for watching DVDs on my computer: most reasonable settings become overwhelming at the loudest parts of a movie, yet the dialog is hard to make out at the quietest parts, so I'm constantly adjusting the volume during the course of a movie. Are there ways to make the difference between the loud and quiet passages less extreme? Both computer-related and non-computer-related solutions are welcome. For example, would moving my speakers across the room and increasing the volume help, or the opposite? Would the extremes be less noticeable if I used headphones? Are there movie players that have more sophisticated sound adjustment features? A software solution for Linux would be great too. Thanks, Dan
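
    What's being asked for here is dynamic range compression of the audio. As one illustration only (not from the original question), a software-only sketch for Linux, assuming MPlayer and FFmpeg are installed; the disc title and file names are placeholders:

        # Play the DVD with MPlayer's volume normalisation filter, which
        # evens out the gap between loud and quiet passages on the fly.
        mplayer -af volnorm=2 dvd://1

        # Or pre-process an already-ripped copy with FFmpeg's dynamic
        # audio normaliser and watch the levelled file instead.
        ffmpeg -i movie.mkv -af dynaudnorm -c:v copy movie-levelled.mkv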

    Read the article

  • Encoding Uncompressed video

    - by Sakamoto Kazuma
    I'd like to encode uncompressed video into compressed AVI or MPEG-4 and was wondering what program I should look at for the task. The videos are between 3 and 20 minutes long and range anywhere from 1.3 to 10 GB in uncompressed .avi form (Fraps captures).
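
    As a hedged illustration (not part of the original question), FFmpeg handles this from the command line; a minimal sketch assuming a build with libx264 and AAC support, with placeholder file names and a quality setting to taste:

        # Re-encode a Fraps capture to H.264/AAC in an MP4 container.
        # Lower -crf means higher quality and a larger file; 18-23 is typical.
        ffmpeg -i capture.avi -c:v libx264 -crf 20 -preset slow -c:a aac -b:a 192k capture.mp4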

    Read the article

  • tar: How to create a tar file with arbitrary leading directories w/o 'cd'ing to parent dir

    - by Yan
    Say I have a directory of files at /home/user1/dir1 and I want to create a tar with only "dir1" as the leading directory:

        /dir1/file1
        /dir1/file2

    I know I can first cd to the parent directory:

        cd /home/user1/
        tar czvf dir1.tar.gz dir1

    But when writing scripts, jumping from directory to directory isn't always desirable. Is there a way to do it with absolute paths without changing the current directory? I know I can always create a tar file with absolute paths inside and use --strip-components when extracting, but sometimes the extra path components are private information that you don't want to distribute with your tar files. Thanks!
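
    One standard way to do this is GNU tar's -C switch, which changes directory internally before archiving; a minimal sketch reusing the paths from the example above:

        # tar switches to /home/user1 before reading, so the archive
        # contains dir1/... entries without the /home/user1 prefix.
        tar -C /home/user1 -czvf /home/user1/dir1.tar.gz dir1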

    Read the article

  • Run a script on user connection on the VM host

    - by Scott Chamberlain
    I have a server running a Virtual Desktop Managed Pool. What I would like to do is, when a user logs in, run a script that checks the number of available VMs and, if it is below a threshold, adds additional VMs to the pool. The script to check the load and add to the pool is not the problem; I have that already figured out:

        $collectionName = "Test1";
        $rdvh = "vmHost.example.com";
        $minAvailableVMs = 2;
        Import-Module RemoteDesktop;
        $pool = Get-VirtualDesktopCollection -CollectionName $collectionName;
        $availableVMs = $pool.Size - ($pool.Size * $pool.PercentInUse / 100);
        $status = Get-VirtualDesktopCollectionJobStatus $collectionName
        # Only add new servers if we are below the threshold and in the JOB_COMPLETED state
        if($availableVMs -lt $minAvailableVMs -and $status.Status -eq [Microsoft.RemoteDesktopServices.Management.VirtualDesktopCollectionJobStatus]::JOB_COMPLETED)
        {
            Add-RDVirtualDesktopToCollection -CollectionName $collectionName -VirtualDesktopAllocation @{"$rdvh" = 1}
        }

    The problem I am having is: how do I run the above script on the Virtualization Host/Connection Broker/some other server when a user connects? I don't think it would be appropriate to run this as a logon script inside the VM. I think there is a way to do this on the management side, but I don't know the new scripting interface in Server 2012 R2 well enough to know which cmdlets I should look at to schedule this. EDIT: I know System Center is perfect for this, but I do not have a license and was denied when I asked for it to be added to the budget.

    Read the article

  • Compressing and copying large files on Windows Server?

    - by Aaron
    I've been having a hard time copying large database backups from the database server to a test box at another site. I'm open to any ideas that would help me get this database moved without having to resort to a USB hard drive and the mail. The database server is running Windows Server 2003 R2 Enterprise with 16 GB of RAM and two quad-core 3.0 GHz Xeon X5450s. The files are SQL Server 2005 backup files between 100 GB and 250 GB. The pipe is not the fastest, and SQL Server backup files typically compress down to 10-40% of the original, so it made sense to me to compress the files first. I've tried a number of methods, including:

        gzip 1.2.4 (UnxUtils) and 1.3.12 (GnuWin)
        bzip2 1.0.1 (UnxUtils) and 1.0.5 (Cygwin)
        WinRAR 3.90
        7-Zip 4.65 (7za.exe)

    I've attempted to use the WinRAR and 7-Zip options for splitting into multiple segments. 7za.exe has worked well for me for database backups on another server, which has ~50 GB backups. I've also tried splitting the .BAK file first with various utilities and compressing the resulting segments. No joy with that approach either; no matter the tool I've tried, it ends up choking on the size of the file. Especially frustrating is that I've transferred files of similar size on Unix boxes without problems using rsync+ssh. Installing an SSH server is not an option for the situation I'm in, unfortunately. For example, this is how 7-Zip dies:

        H:\dbatmp>7za.exe a -t7z -v250m -mx3 h:\dbatmp\zip\db-20100419_1228.7z h:\dbatmp\db-20100419_1228.bak

        7-Zip (A) 4.65  Copyright (c) 1999-2009 Igor Pavlov  2009-02-03
        Scanning

        Creating archive h:\dbatmp\zip\db-20100419_1228.7z

        Compressing  db-20100419_1228.bak

        System error:
        Unspecified error

    Read the article

  • Compressed disk image on Linux

    - by Aaron Digulla
    I just got my new computer with a much bigger hard disk. I think I copied all the important files over, but just to be sure, I'd like to keep a disk image of my old disk. To save space, I'd like to compress it, but I didn't find an option to mount a compressed image. My goals:

        Result must be easy to access
        No need to decompress the whole thing before I can access anything
        Files should be quick to locate - no TAR/CPIO archive
        Necessary space should be less than just copying the files over

    So ideally, I'm looking for a read-only, compressed file system which I can create in a file and which grows automatically.
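
    One filesystem that fits most of these goals is SquashFS (read-only, compressed, built into a single file, with individual files directly accessible after mounting), although it is created in one pass rather than growing automatically. A minimal sketch, assuming squashfs-tools with xz support is installed and the old disk's contents are reachable at /mnt/olddisk (paths are illustrative):

        # Pack the old disk's contents into a compressed, read-only image.
        mksquashfs /mnt/olddisk old-disk.squashfs -comp xz

        # Mount the image later to browse or copy out individual files.
        sudo mount -o loop,ro old-disk.squashfs /mnt/old-image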

    Read the article

  • How to split file on Windows 2003 using MS supported tool

    - by Rune
    Hi. Is it possible to split a large file into smaller files on Windows 2003 using a tool provided/supported/sanctioned by Microsoft? I see that there are a lot of freeware tools (various zip tools) for this task, but I need to move files off a production server, so I would like to avoid tools I don't know whether I can trust. I would much prefer a tool included in the Windows Server 2003 Resource Kit Tools or something along those lines. Does such a tool exist? Thank you.

    Read the article

  • How can I evaluate the best choice of archive format for compressing files?

    - by Mehrdad
    In general, I've observed the following:

        Linux-y files or tools use bzip2 or gzip for distributing archives
        Windows-y files or tools use ZIP for distributing archives
        Many people use 7-Zip for creating and distributing their own archives

    Questions:

        What are the advantages and disadvantages of these formats, all of which appear to be open formats?
        When/why should I choose one (say, 7-Zip) over another (say, ZIP)?
        Why does the trend above appear to hold, even though all of these are portable formats?
        Are there any particular advantages to using a particular archive format on a particular platform?

    Read the article

  • Windows 8 Internet Explorer 11 proxy automation script

    - by Stefan Bollmann
    Similar to this post, I'd like to change my proxy settings using a script. However, it fails: when I am behind the proxy, IE does not connect to the internet. Here I try the first solution from craig:

        function FindProxyForURL(url, host) {
            if (isInNet(myIpAddress(), "myactualip", "myactualsubnetip"))
                return "PROXY proxyasshowninpicture:portihavetouseforthisproxy_see_picture";
            else
                return "DIRECT";
        }

    This script is saved as proxy.pac in c:\windows, and my configuration in the LAN settings is: automatic detection of settings off, "Use automatic configuration script" enabled with the address file://c:/windows/proxy.pac, and no proxy server. So, what am I doing wrong?

    Update: when I instead set up a static proxy in the LAN configuration (IE -> Internet Options -> Connections -> LAN Settings, check "Use a proxy server for your LAN", Address: <a pingable proxy>, Port: <portnr>), everything is fine in this environment. Now I try a simpler script like:

        function FindProxyForURL(url, host) {
            return "PROXY <pingable proxy>:<portnr>; DIRECT";
        }

    With the automatic configuration script set up as described above, I am not able to get through the proxy.

    Read the article

  • Multithreaded TIFF to PNG conversion

    - by mtone
    I'm working with images of scanned books, so hundreds of high-resolution pictures. I'm doing the conversion work with Photoshop Elements: I can quickly save them to uncompressed TIFF, but converting to compressed PNG on a single thread takes ages. Do you know of a piece of software, ideally simple and free, that would batch-convert those TIFFs to PNG in a multi-threaded manner (4 to 8 simultaneous files) to take advantage of the available cores and cut conversion times? I'm not too worried about slight variations in final size.
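
    As a hedged illustration (not something mentioned in the question), this can be scripted with free tools; a minimal sketch assuming ImageMagick and GNU parallel are installed, with the job count and file pattern as placeholders:

        # Convert every TIFF in the current folder to PNG, 8 files at a time.
        # {.} is the input name with its extension removed.
        find . -name '*.tif' | parallel -j 8 convert {} {.}.png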

    Read the article

  • CRC error when extracting to SSD from 2nd HDD

    - by gbn
    Hello. I have a large RAR file (split into parts) containing an ISO on my 2nd HDD. When I extract it:

        to the same HDD, it's OK
        to the system/OS SSD, I get CRC errors

    I've checked memory, run memtests, checked wires, etc. I have no other issues, only with this one RAR file. Any ideas, please?

    Read the article

  • Server 2008 R2 PowerShell script runs manually, but not as a scheduled task

    - by Aeisor
    I have a PowerShell script that runs fine manually in the PowerShell ISE; however, when run as a scheduled task using an administrator's credentials, the task does not produce the expected results. The script:

        $request = New-Object System.Net.WebClient
        $request.DownloadFile("...url...", "C:\path\to\file.csv")

    The administrator user has Full Control of both the script and the folder it is writing to. The URL exists and responds in a reasonable time (<1s). If I run the task manually, the status is 0x41301 ("Currently Running") until I eventually end it. I have set the task up using both of these methods:

        1. Start a program: C:\path\to\PS.PS1
        2. Start a program: C:\windows\system32\WindowsPowerShell\v1.0\powershell.exe with additional options -noninteractive -command "C:\path\to\PS.PS1"

    Using option 1, the task history shows it has opened an instance of notepad.exe but never terminates it. Using option 2, it completes the task but doesn't download/create the file. I have used Set-ExecutionPolicy Unrestricted as this is not a signed script. Any ideas?

    Read the article

  • Minimize HTML, CSS and JS files

    - by karmic
    How do you automatically pack/minify the HTML, CSS and JS files served on a webpage? More specifically, I wish to have this for a WordPress website. Should it be done at the web server level (lighttpd), at the application level (WordPress), at the PHP level, or somewhere else?
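
    As one hedged illustration of the "somewhere else" option, assets can be minified ahead of time in a build step; a minimal sketch assuming the common npm command-line minifiers are installed globally (file names are placeholders):

        # npm install -g uglify-js clean-css-cli html-minifier
        uglifyjs script.js --compress --mangle -o script.min.js
        cleancss -o style.min.css style.css
        html-minifier --collapse-whitespace index.html -o index.min.html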

    Read the article

  • Three formats? Why?

    - by Yar
    I needed to download the Ruby source recently from here, and it says it is "available in three formats": .tar.bz2, .tar.gz and .zip. Is there any reason that we need all three formats? At least on Linux and OS X I can handle any of the three easily. On Windows, only ZIP is built in, I think. Is there anything behind these preferences, or is this just a religious battle?

    Read the article

  • How could I compress a folder into split archives (individual ZIPs)?

    - by Shiki
    I have to compress folders into ZIP packages, but the size is limited: only ~10-15 MB is allowed per package. Every major application comes with a "Split archive to..." option, which does what I want... except that I can't decompress the parts one by one. (You need them all, and then use the .7z, .rar or .zip file to decompress.) Here is an example: FolderX is 35 MB, which makes 4 packages, i.e. 4 zip files. The normal split function would give me:

        folderx.zip, folderx.zip.001, folderx.zip.002, folderx.zip.003

    What I would really need is:

        folderx_1.zip, folderx_2.zip, folderx_3.zip, folderx_4.zip

    That is, individually decompressable files/packs. I could code this into an app myself, but that's a waste of time if such a utility already exists.
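
    One existing utility that does this is Info-ZIP's zipsplit, which cuts a zip into standalone zip archives rather than raw segments; a minimal sketch assuming the zip and zipsplit command-line tools are available (the 15 MB limit is illustrative):

        # Build one ordinary archive of the folder first...
        zip -r folderx.zip FolderX

        # ...then split it into independent zips of at most ~15 MB each;
        # every output archive can be extracted on its own.
        zipsplit -n 15000000 folderx.zip

    Note that zipsplit cannot split an individual file that is itself larger than the size limit.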

    Read the article

  • Is there a tool for verifying the contents of a Zip archive against the source directory's contents?

    - by Basil
    Here's the scenario: I create a ZIP archive using some GUI package like WinZip, 7-Zip or whatever, by right-clicking on a directory "somename" and selecting "Compress to archive 'somename.zip'". When the archive is completed, I open it and discover that some files don't exist in the archive (for reasons yet unknown). I want to find all the files that are missing from the archive without having to extract it to another directory, do a directory diff, and so on. So: is there a tool (GUI or command-line, standalone or built into a compressor, for Windows or Linux, I don't care) that can walk through an archive and compare its contents against a directory on the filesystem?
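
    On the command-line side, a hedged sketch of one way to do the comparison without extracting anything, assuming Info-ZIP's zipinfo is available and reusing the "somename" example above; run it from the directory that contains "somename":

        # Lines starting with "<" are files on disk that are missing from the archive;
        # lines starting with ">" exist only in the archive.
        diff <(find somename -type f | sort) \
             <(zipinfo -1 somename.zip | grep -v '/$' | sort)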

    Read the article

  • What's faster, cp -R or unpacking tar.gz files?

    - by Buttle Butkus
    I have some tar.gz files that total many gigabytes on a CentOS system. Most of the tar.gz files are actually pretty small, but the ones with images are large. One is 7.7G, another is about 4G, and a couple around 1G. I have unpacked the files once already and now I want a second copy of all those files. I assumed that copying the unpacked files would be faster than re-unpacking them. But I started running cp -R about 10 minutes ago and so far less than 500M is copied. I feel certain that the unpacking process was faster. Am I right? And if so, why? It doesn't seem to make sense that unpacking would be faster than simply duplicating existing structures.
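
    A hedged way to settle the speed question empirically is simply to time both operations on the same data; the file and directory names below are placeholders:

        mkdir -p copy-test untar-test

        # Duplicate the already-unpacked tree with cp...
        time cp -R unpacked/ copy-test/

        # ...and unpack the same content again straight from the archive.
        time tar -xzf images.tar.gz -C untar-test/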

    Read the article

  • Plesk command working in manual script, not in cronjob

    - by dsaunier
    Hi. In order to install a hosting plan, I use Plesk's commands over SSH as specified in their official guide. When typed directly in SSH (PuTTY), it works perfectly. The line is as follows (with the values obviously hard-coded when run in the CLI):

        /usr/local/psa/bin/domain --create '.$url.' -owner mynamehere -ip '.IP_SERVER_PLESK.' -status enabled -hosting true -hst_type phys -login '.$ftp_user.' -passwd '.$ftp_pw.' -www false -php true -php_safe_mode false -hard_quota 100M

    I then put that request in a PHP script that does other things after the hosting is installed. Now for the weird part: when calling that script from the CLI it also works fine; I run ./myscript.php and it installs the hosting, then sends emails, etc. However, after I create a cronjob to have that same script called regularly, the Plesk command fails. The cronjob is set up in Plesk as:

        */15 * * * * /usr/bin/php /home/scripts/myscript.php

    It works fine for everything BUT the Plesk hosting install, which returns "Unable to read Control Panel configuration file" and therefore does not install the domain hosting. Yet this is the same script that I call manually! On that server, is the PHP binary used by the cronjob different from the one used in the CLI? What am I missing? Help greatly appreciated. Regards.

    Read the article

  • Best archive format & tool for large amounts of data (50gb+)

    - by marcusstarnes
    I only realised this afternoon that the ZIP format has a limit of what appears to be around 20 GB. I am trying to automate an archive process (using Automate) to zip/rar/whatever a collection of folders/files on one of my disks. It always appeared to bomb out with an incomplete archive at about 20 GB. So I tried using WinRAR and doing it manually as a ZIP file, but it told me about the limit. So I was wondering: what is a recommended archive format (and tool for accomplishing the task) for archiving a large amount of data (around 50 GB)? Thanks
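
    As a hedged pointer rather than a definitive answer, the 7z format handles archives of this size comfortably from the 7-Zip command line; a minimal sketch with illustrative paths, optionally splitting the output into 4 GB volumes:

        # Create a .7z archive of the folder; -v4g writes it as 4 GB volumes,
        # which keeps each piece well clear of any per-file size limits.
        7z a -t7z -v4g archive.7z D:\data\to-archive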

    Read the article

  • Script apparently changing file permissions on Mac OS to 000

    - by half_bit
    I wrote a little shell script that helps with installing a web application. The script itself just downloads a zip archive, extracts it and changes the permissions of the extracted files to the ones needed to run the web app. The problem now is that some users have reported that after running my script, the permissions of every file in their home directory, or even on their whole computer, changed to 000 (except the actual unzipped files, which do have the correct permissions). The only lines in my script actually doing I/O are these:

        URL="http://foo.com/"
        FILENAME="some.zip"
        curl --silent "$URL$FILENAME" -o $FILENAME > /dev/null
        echo "Unzipping...\c"
        if unzip -oqq $FILENAME > /dev/null
        then
            chmod -R 777 app/tmp app/webroot app/Config/database* app/configuration*
            chown -R www:www *
            rm $FILENAME
            echo "\t\t\tOK"
            exit 0
        else
            echo "\t\t\tERROR"
            exit 1
        fi

    I seriously can't explain this to myself. How can this even be possible? It is entirely possible that the users accidentally ran the script in their home directory, but that still wouldn't explain why the permissions were set to 000 rather than www/777.

    Read the article
