Search Results

Search found 46908 results on 1877 pages for 'managing files and folder'.


  • Data Archiving vs not

    - by Recursion
    For the sake of data integrity, is it wiser to archive your files or just leave them unarchived? No compression is being used. My thinking is that if you leave your files unarchived and some form of corruption occurs, it will only hurt a small number of files. But if you archive, let's say, all of your documents, even the slightest corruption can render the entire archive unrecoverable. So what's the best way to keep a clean file system while not being subject to data corruption?
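
    A minimal Python sketch of one alternative: leave the files unarchived but record a per-file checksum manifest, so corruption stays confined to individual files and is at least detectable. The folder name is a placeholder.

        import hashlib
        import os

        root = "Documents"  # hypothetical folder to protect
        for dirpath, _, filenames in os.walk(root):
            for name in filenames:
                path = os.path.join(dirpath, name)
                digest = hashlib.sha256()
                with open(path, "rb") as f:
                    for chunk in iter(lambda: f.read(1 << 20), b""):
                        digest.update(chunk)
                # one manifest line per file: hash + path
                print(digest.hexdigest(), path)

    Re-running the script later and diffing the output against the saved manifest shows exactly which files changed or were corrupted.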

    Read the article

  • htaccess: multiple redirections depending on domain name

    - by Marcin Kmiec
    I have a server, a few domains and two webpages. I can't figure out how to do the following:

        A.com     -> root\
        www.A.com -> root\
        B.com     -> root\
        www.B.com -> root\
        C.com     -> root\folder1\
        www.C.com -> root\folder1\

    By the way, what is the 'and' logical operator used in .htaccess? I found that 'or' is [OR], but [AND] doesn't seem to work. And what language is .htaccess written in? :)
    UPDATE: I made a mistake in the question. Here's what I really want to do. DNS is set for the domain A.com to point to the root folder of the server. Now I would like to set the following redirections: any domain other than C.com and other than D.com redirects (301) to www.A.com. A.com points to the root folder of the server anyway, and that is set in DNS. Domain www.C.com points to the folder 'folder1' on the server. Can that be set in .htaccess? And domains C.com, www.D.com and D.com redirect to www.C.com.

    Read the article

  • Sharepoint 2007 - Transaction log full

    - by Kenny Bones
    So I have this SharePoint 2007 site that is basically trash. I'm supposed to just toss it, but I need to copy all of the data from certain projects out in the form of traditional files and folders. And since the transaction log is full, it's so damn slow. Even opening SharePoint takes up to 15 minutes, or it won't open at all. Copying files is extremely slow. So I'm in need of a quick fix here, just to be able to copy out some files and folders. I don't need to fix the problem per se. What can I do to fix it temporarily so I can copy out the data?

    Read the article

  • Break a hard link of a file in use

    - by Stebi
    I used hard links to merge duplicated files on my SSD (space is still precious) and now have a weird problem. Common files like msvcr110.dll got hard linked. Now I want to delete a program which has this file in its installation directory, but I cannot, because the file (in another location) is used by a currently running application (I don't know which), and Windows doesn't allow me to delete a file that's in use. I can rename the file, but it still points to the same data, so it's not possible to delete it. Is there any way to break the hard link of a file which is currently in use? For now I use a trash folder that I move those files to, so I can delete the directory structure of the program being removed. But I'd like to get rid of this leftover (although it doesn't take much space, as it's a hard link).
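
    A minimal Python sketch for inspecting the situation, assuming the example path below (not from the original question): it prints how many directory entries still point at the same file data. Removing one hard link only removes that one name; the other names, and the data itself, are untouched.

        import os

        # hypothetical location of one of the hard-linked copies
        path = r"C:\Program Files\SomeApp\msvcr110.dll"
        print("hard links to this file's data:", os.stat(path).st_nlink)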

    Read the article

  • How do I reset the $PATH variable on Mac OS X?

    - by Neil
    I've messed up my PATH variable, and now some apps that I run raise errors saying Command Not Found (error 127) for commands like 'date' and 'sleep'. These commands work fine when executed directly in the shell. I'm guessing this has something to do with a malformed $PATH variable, and I need to know how to reset it. I've deleted the files ~/.bashrc, ~/.bash_profile, /etc/bash.bashrc and ~/.profile. What other files could hold my $PATH? Is there some simpler way to reset the PATH than digging into the myriad files which could hold it? Note that this path problem only affects my user: I made a test user on my system, and its path was fine, back to normal.
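
    A minimal Python sketch, assuming the stock Mac OS X setup where /etc/paths lists the default directories that path_helper assembles into $PATH: it reconstructs that default value so you can compare it with, or export it over, your current broken one.

        # rebuild the default PATH from /etc/paths (one directory per line)
        with open("/etc/paths") as f:
            default_path = ":".join(line.strip() for line in f if line.strip())
        print('export PATH="{}"'.format(default_path))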

    Read the article

  • Move some iTunes library items to different drive?

    - by Sören Kuklau
    My internal hard drive is somewhat small, and I only regularly listen to a fraction of my iTunes library anyway, so I'd like to keep large portions of it on an external drive for archival purposes. Since dealing with multiple iTunes libraries is somewhat painful, the solution I'm looking for is to move individual items of the library to a different location without compromising the "Keep organized" and "Copy files" settings. I found an AppleScript that I assume is supposed to do this, Move Files To Folder…, but it instead copies the files and doesn't update the library accordingly. I can do this manually by moving a file, then accessing it in iTunes — it'll prompt me for the new location. I just don't intend to do this one by one for thousands of files.

    Read the article

  • How to mod an INF: replacing 32-bit DLLs with 64-bit ones

    - by Nime Cloud
    I've got a driver setup for 32-bit: an INF file and an x86 folder with two 32-bit DLLs. I need to replace these 32-bit DLL files with 64-bit ones. I tried simply overwriting the 32-bit files, but no luck. How can I make a 64-bit version of the driver? Update: I tried the original setup files on 32-bit Windows XP; the setup asks for WdfCoinstaller01009.dll, and I simply browse to and point it at the file from somewhere on XP.

        ;-------------- WDF Coinstaller installation
        [DestinationDirs]
        CoInstaller_CopyFiles = 11

        [silabser.Dev.NT.CoInstallers]
        AddReg=CoInstaller_AddReg
        CopyFiles=CoInstaller_CopyFiles

        [CoInstaller_CopyFiles]
        WdfCoinstaller01009.dll

        [SourceDisksFiles]
        WdfCoinstaller01009.dll=1

        [CoInstaller_AddReg]
        HKR,,CoInstallers32,0x00010000, "WdfCoinstaller01009.dll,WdfCoInstaller"

        [silabser.Dev.NT.Wdf]
        KmdfService = silabser, silabser_wdfsect

        [silabser_wdfsect]
        KmdfLibraryVersion = 1.9

    Read the article

  • Apache htaccess with mod_expires Not Working for certain directories

    - by keyboarddrummer
    I have a Joomla site on which I am trying to enable caching using mod_expires. I have the .htaccess in the root of the site and have added the options found on the page http://www.pactsoftware.nl/tools/joomla-optimization.html. Using the PageSpeed extension in Chrome, prior to adding this to my .htaccess, my site scored a 55 (caching was at the top of the list, naming a lot of images, CSS and JS files). After these directives, it scores 70, with caching in the yellow, but it still lists some image files (some are two directories deep and the rest are four). I checked for other .htaccess files between those folders and the Joomla root, but found none. It is almost as if the .htaccess only works in that one directory, not in the subfolders. I have tried putting a .htaccess in each affected subdirectory, but it does not work. Does anyone have any ideas?

    Read the article

  • Cookieless Domain redirect in WHM/cPanel

    - by Patrick Lanfranco
    I am currently trying to get my head around setting up a "cookieless" domain using WHM/cPanel, unfortunately without any success so far. I have a Magento store and I would like to use cookieless domains for my media, skin (template) and JS files. Magento has a nice feature to define URLs for those folders. My current setup is as follows:

        www.mydomain.com   <- main store
        media.mydomain.com <- subdomain to the media folder (mydomain.com/media/)
        skin.mydomain.com  <- subdomain to the skin folder (mydomain.com/skin/)
        js.mydomain.com    <- subdomain to the js folder (mydomain.com/js/)

    I think it's pointless to use these as cookieless domains, since my Magento installation uses .mydomain.com as the cookie domain, so what I would like to achieve is to register a new additional domain and have it point via WHM/cPanel to those specific locations. I have tried to change the A and CNAME records, although without any success, as they were just redirecting from one page to another in the browser (newdomain.com jumps to old.com). What kind of records do I have to set to have this working properly? Some advice would be highly appreciated.

    Read the article

  • FTP script download from linux to windows

    - by user53864
    I'm using the following FTP script on Windows XP to download zip files from Ubuntu cloud servers. A zip file is created every day on the Ubuntu servers and I download it to Windows via this FTP script. I currently run the script manually every day, because I have to edit its last line (mget /usr/backup_02-11-2010.zip) to match today's date. I want to change the script so that, when scheduled, it downloads only today's zip file at the scheduled time without needing to be edited every day. The date appended to the zip file names is in the format dd-mm-yyyy. Need help...

        open server-ip-here
        username-here
        user-password-here
        lcd C:\Backup\files
        bin
        hash
        prompt
        mget /usr/backup_02-11-2010.zip
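
    A minimal Python sketch of one way to avoid the daily edit, assuming Python is available on the Windows box: it builds today's dd-mm-yyyy filename and downloads just that file. Host and credentials are the same placeholders as in the script above.

        from datetime import date
        from ftplib import FTP

        # today's backup, e.g. /usr/backup_02-11-2010.zip
        remote = "/usr/backup_{}.zip".format(date.today().strftime("%d-%m-%Y"))
        local = r"C:\Backup\files\{}".format(remote.rsplit("/", 1)[-1])

        ftp = FTP("server-ip-here")
        ftp.login("username-here", "user-password-here")
        with open(local, "wb") as f:
            ftp.retrbinary("RETR " + remote, f.write)
        ftp.quit()

    Run from Task Scheduler, this fetches the current day's file with no manual changes.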

    Read the article

  • Experience with MQ File Transfer Edition?

    - by mfinni
    We've got several processes that move files across servers - SFTP, FTP, SCP; Windows, Linux, AIX; there is a workflow component (usually requiring a control file with filenames and hash values to move a batch of related files). The action is often initiated on our servers to get the files, so we need to make sure they're done being written. We have some homegrown scripts to do this, but they don't always work properly, and troubleshooting, maintenance and log review are not easy this way. There are a lot of servers, and our scripts don't have central logging or a dashboard/console/etc. We're looking into commercial products to do this. Has anyone used MQ File Transfer Edition? Another team in our company is using Aspera; does anyone have any thoughts on that, or other favored products? I have no idea what our budget is for this yet. Just trying to get a handle on the product space from the perspective of other admins.

    Read the article

  • ffmpeg volume parameter format

    - by tanon
    ffmpeg's -vol parameter is confusing me:

        256 => normal (I guess meaning the same as the input volume, no change)
        512 => double the volume (read this somewhere)

    So what do I use for 3 times the volume? 1.5 times the volume? Basically, let's say I have the maximum sound amplitudes (Audacity levels) in 3 files as:

        0.8 0.6 0.9

    I want to amplify the first two files so that max = 0.9 in all files. What -vol values would I use?
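
    A minimal Python sketch of the arithmetic this implies, taking the question's premise that -vol 256 means unity gain and that the scale is linear: the value for a given gain factor is round(256 * factor).

        target = 0.9
        for peak in (0.8, 0.6, 0.9):
            factor = target / peak            # e.g. 0.9 / 0.6 = 1.5x
            print("peak {:.1f}: gain x{:.3f} -> -vol {}".format(peak, factor, round(256 * factor)))

    With these numbers that works out to roughly -vol 288 for the 0.8 file and -vol 384 for the 0.6 file, while the 0.9 file stays at 256.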

    Read the article

  • Is Windows Media Player able to play DTS audio?

    - by rolgae
    I'm trying to play DTS audio with Windows Media Player 12 on Windows 7. For an MPEG-TS file with video and DTS audio, only the video is played. A file containing only a DTS audio stream is rejected. But: WMP is able to play the DTS audio stream of a DVD. So, is Windows Media Player able to play DTS audio or not? And if so, how do I make it play my DTS files? I did not find any good resources on the supported codecs, just things like "WMP can play .mpg files, ...". VLC is able to play all of the above files. I do not want to install third-party codec packs, that's not the question!

    Read the article

  • What does it mean for MalwareBytes to find malicious registry keys but nothing else?

    - by EndangeringSpecies
    I have a machine that is obviously infected, and when I ran MalwareBytes it told me that it found some "malicious" registry keys (surprisingly enough, these contained file paths to currently non-existent JavaScript files). But that's it. A full scan did not uncover any malicious files or malicious hidden processes in memory. Like, maybe the (hidden?) process that for whatever reason periodically injects keystrokes (hotkeys?) into whatever window is currently open. Then, on another machine that is not obviously infected, it found a "malware.trace" registry key, but again no files, processes, etc. How does this jibe with people's experience with MalwareBytes? Does it usually find registry-key symptoms of an infection but nothing else? Or is it common to have no infection but some malicious registry keys in place anyway?

    Read the article

  • Windows 7 Slow Searching

    - by Guy Thomas
    I have a new Windows 7 machine with twice as much RAM and a faster processor than my old Windows Server 2008 R2 machine. I am disappointed that searching among my 10,000 image files takes twice as long on the new Windows 7 machine. Both machines have their own copy of these same files. In other respects, e.g. opening my huge Outlook files, the new machine is faster. The Windows Search service is running, and I set indexing on the image folder about three days ago. Any ideas why I get this poor index/search experience? Other than adding or removing folders, is there anything I can do to tweak indexing?

    Read the article

  • RewriteRule applying pattern even though one of the RewriteConds failed

    - by BHare
        #www. domain . tld
        RewriteCond %{HTTP_HOST} (?:.*\.)?([^.]+)\.(?:[^.]+)$
        RewriteCond /home/%1/ -d
        RewriteRule ^(.+) %{HTTP_HOST}$1
        RewriteRule (?:.*\.)?([^.]+)\.(?:[^.]+)/media/(.*)$ /home/$1/client/media/$2 [L]
        RewriteRule (?:.*\.)?([^.]+)\.(?:[^.]+)/(.*)$ /home/$1/www/$2 [L]

    Here is the rewritelog output:

        #(4) RewriteCond: input='tfnoo.mydomain.org' pattern='(?:.*\.)?([^.]+)\.(?:[^.]+)$' [NC] => matched
        #(4) RewriteCond: input='/home/mydomain/' pattern='-d' => not-matched
        #(3) applying pattern '(?:.*\.)?([^.]+)\.(?:[^.]+)/media/(.*)$' to uri 'http://www.mydomain.org/files/images/logo.png'
        #(3) applying pattern '(?:.*\.)?([^.]+)\.(?:[^.]+)/(.*)$' to uri 'http://www.mydomain.org/files/images/logo.png'
        #(2) rewrite 'http://www.mydomain.org/files/images/logo.png' -> '/home/mydomain/www/logo.png'

    Note that in the second (4) line the -d (directory exists) condition failed, which is correct: there is no /home/mydomain/ directory. Therefore it should never rewrite, at least according to my understanding that all RewriteRules are subject to the RewriteConds as logical ANDs.

    Read the article

  • How to determine main movie DVD track before ripping via mencoder

    - by Ampp3
    Maybe there's a simple answer for this, but when looking at the files on a DVD (IFOs, VOBs, etc.), is there a way to easily determine the longest/main track? I'm trying to automate the process of finding the main movie track on a DVD and am running into issues. I thought this could be done by finding the biggest track: look through the VTS_XX_N.VOB files, where XX is the track number, and find the track with the largest total file size (summing the sizes of the VOB files for that track). But apparently that isn't correct. One DVD had track 7 as the largest track (by my method), yet mencoder didn't produce the correct output with that track, but worked with track 9 instead. Am I missing something? EDIT: I've heard of the utility 'lsdvd' for getting track information, but I was hoping to avoid compiling it and use a basic method instead (i.e. what I tried above). Does anyone have any idea why my approach didn't work?

    Read the article

  • Renaming debian package

    - by Tabiko
    I'm trying to build a customized version of the nginx package for Debian/Ubuntu which has a different set of modules than the default version. What would be the fastest way to modify the debian/ structure (and which files) if I want to rename the package from 'nginx' to 'my-nginx', for example? I've got the source deb package unpacked; which files would I need to modify in the nginx-1.4.5/debian/ directory (holding the control, rules, etc. files) to have buildpackage generate a my-nginx-1.4.5.deb package instead of nginx-1.4.6.deb? I appreciate your help!

    Read the article

  • MacOS X 10.6 Portable Home Directory sync fails due to FileSync agent crashing

    - by tegbains
    On one of our cleanly installed Mac Pro machines running Mac OS X 10.6.6, connected to our Mac OS X 10.6.6 Server, syncing data using Portable Home Directories fails. It seems to be due to the FileSync agent crashing during the home sync. We get -41 and -8062 errors, which we suspect indicate that there is too much data or that the FileSync agent can't read the files. The user is the owner of the files and can read/write all of them.

        < Logout 0:: [11/02/04 13:10:42.751] Error -41 copying /Volumes/RCAUsers/earlpeng/Library/Mail/Mailboxes/email from old imac./Attachments/12081/2.2. (source = NO)
        < Logout 0:: [11/02/04 13:10:42.758] Error -8062 copying /Volumes/RCAUsers/earlpeng/Library/Mail/Mailboxes/email from old imac./Attachments/12081/2.2/[email protected]. (source = NO)
        < Logout 1:: [11/02/04 13:10:42.758] -[DeepCopyContext deepCopyError:sourceError:sourceRef:]: error = -8062, wasSource = NO: return shouldContinue = NO

    Read the article

  • How to set default permissions for automounted FAT drives in Ubuntu

    - by piman
    I've got many FAT32 drives that I'd like to mount in Ubuntu such that directories get permission mode 700 and all other files get 600. By default they get 755 for all files, which is not particularly useful, since almost no non-directories should be executable, and it screws up version control repos hosted on the drives. "Back in the day" I would have listed the drives in /etc/fstab with the umask/dmask I want, and there was no such thing as a default. These days, drives automount under their volume names, which is great, except now I have no idea how to set the default. I have tried changing the /system/storage/default_options/vfat/mount_options gconf key with no apparent effect. It was 077 initially, but the mounted drive reflected a default of 022; changing it and re-inserting the drives still resulted in files with permission bits of 755.

    Read the article

  • QuickTime won't install on Windows 7 Ultimate 64-bit

    - by Martin
    Hi! I am trying to install QuickTime (actually I just need iTunes, but iTunes needs QuickTime), but it fails. There seems to be a problem with the folder C:\Program Files (x86)\QuickTime\ - QuickTime wants to write the file QTTask.exe, but complains that it does not have permission to do so. The same thing happens with \PropertyPanels\PanelHelperBase.qpa. I have tried deleting all Apple programs (in the order suggested in the support forum) and also tried to delete the temp folder. That did not work. I have tried to manually adjust the permissions of the QuickTime folder - no effect. I have run the installation file with admin rights and with different compatibility modes, to no effect. I consider myself an experienced user, able to solve most problems, but now I am stuck and need some input / fresh ideas on how to tackle the problem. This is very annoying, as I cannot sync my iPhone/iPad/iPod while iTunes is not running - due to the stupid (sorry) idea of only having your device linked to one library. Please help. Thanks!

    Read the article

  • Upload a directory recursively to an FTP server

    - by Nicolas Raoul
    I am writing a Linux shell script to copy a local directory to a remote server (removing any existing files). Local server: the ftp and lftp commands are available, but no ncftp or any graphical tools. Remote server: only accessible via FTP; no rsync, SSH or FXP. I am thinking about listing the local and remote files to generate an lftp script and then running it. Is there a better way? Note: uploading only modified files would be a plus, but is not required.
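
    A minimal Python sketch of the mirroring step, assuming a Python interpreter is also available on the local server (the question only mentions ftp and lftp): it walks the local tree, recreates the directory structure remotely and uploads every file, overwriting what is already there. It does not delete remote files that no longer exist locally; host, credentials and paths are placeholders.

        import os
        from ftplib import FTP

        local_root = "/path/to/local/dir"    # hypothetical paths
        remote_root = "/path/on/server"

        ftp = FTP("ftp.example.com")
        ftp.login("user", "password")

        for dirpath, dirnames, filenames in os.walk(local_root):
            rel = os.path.relpath(dirpath, local_root)
            remote_dir = remote_root if rel == "." else remote_root + "/" + rel.replace(os.sep, "/")
            try:
                ftp.mkd(remote_dir)          # may already exist
            except Exception:
                pass
            for name in filenames:
                with open(os.path.join(dirpath, name), "rb") as f:
                    ftp.storbinary("STOR " + remote_dir + "/" + name, f)

        ftp.quit()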

    Read the article

  • Linux: Alternative to rsync? (ie, scp with resume)

    - by Joernsn
    I've been using rsync to automatically send files from one box to another, which is great compared to scp, since it supports resuming. However, when resuming a very large file (10 GB), rsync has to read both files and compare them, which is very slow. I don't need fancy error handling, just "scp with resume", so here's my question: is there an alternative to rsync/scp that supports resuming without having to read both the source and destination files? I've read the manuals without finding anything I can use; please let me know if I've missed something. This is the rsync line I've been using:

        rsync -av --partial --progress --inplace SRC DST

    Read the article

  • How to download a url as a file?

    - by Michelle
    A website has "hidden" some mp3 files by embedding them as Shockwave files, as follows:

        <span class="caption"><!-- Odeo player -->
        <embed src="http://odeo.com/flash/audio_player_tiny_gray.swf" quality="high"
               name="audio_player_tiny_gray" align="middle" allowScriptAccess="always"
               wmode="transparent" type="application/x-shockwave-flash"
               flashvars="valid_sample_rate=true external_url=http://podcast.cbc.ca/mp3/sundayeditionstream_20081125_9524.mp3"
               pluginspage="http://www.macromedia.com/go/getflashplayer"></embed></span>

    How can I download the files for off-line listening? I've found two methods:

    1. The StackOverflow method: create a new local HTML file containing just the links, e.g.
       <a href="http://podcast.cbc.ca/mp3/sundayeditionstream_20081125_9524.mp3">Sunday Edition 25Nov2008</a>,
       open the file in the browser, right-click the link and use File > Save Link As.
    2. The SuperUser method: install the Firefox add-on Iget (be sure to use the right version for your Firefox version), then go to Tools > Downloads and enter the URL in the field.

    Are there any other ways?
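
    A minimal Python sketch of a third way, using the mp3 URL that is visible in the flashvars above (the output filename is arbitrary):

        import urllib.request

        url = "http://podcast.cbc.ca/mp3/sundayeditionstream_20081125_9524.mp3"
        urllib.request.urlretrieve(url, "sundayedition_20081125.mp3")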

    Read the article

  • How to get the filename of a job in CUPS?

    - by Grook
    I have printed a couple of files and lpstat shows that they have completed, but the output looks something like this:

        # lpstat -W completed -l
        Canon-1       root     1086464   Sat May 21 22:47:03 2011
                Alerts: job-canceled-by-user
                queued for Canon
        Canon-2       root      337920   Mon May 23 20:18:02 2011
                Alerts: job-canceled-by-user
                queued for Canon
        CanonWin-3    root       17408   Mon May 23 20:29:40 2011
                Alerts: job-completed-successfully
                queued for CanonWin

    How can I get the names of the files that have been printed? P.S. Is there any bash script that would give me the names of all files that have been printed?

    Read the article
