Search Results

Search found 39784 results on 1592 pages for 'ignore files'.


  • Copy files with original folder structure, but to 8.3 format

    - by kokbira
    I have a folder with a lot of files and folders inside it. I would like to copy it to another location so that the result is a folder with the same file and folder structure, but with all files in 8.3 format. How do I do it? PS: Some files have extensions with more than 3 characters (e.g. home.sh3d, windows.theme, etc.), so when I say transforming all filenames to 8.3, I really mean transforming them to an 8.X format (i.e., without changing the extensions).
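
    A minimal sketch of one approach in bash, assuming hypothetical source and destination paths and that every file has an extension; it recreates the directory tree, truncates each base name to 8 characters, and adds a numeric suffix on collisions:

      src=/path/to/source; dst=/path/to/dest
      # recreate the folder structure as-is
      (cd "$src" && find . -type d) | while IFS= read -r d; do mkdir -p "$dst/$d"; done
      # copy each file under a shortened base name, keeping the extension
      (cd "$src" && find . -type f) | while IFS= read -r f; do
          dir=$(dirname "$f"); name=$(basename "$f")
          base=${name%.*}; ext=${name##*.}
          target="$dst/$dir/${base:0:8}.$ext"
          n=1
          while [ -e "$target" ]; do target="$dst/$dir/${base:0:6}~$n.$ext"; n=$((n+1)); done
          cp -p "$src/$f" "$target"
      done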

    Read the article

  • How can I easily confirm in Linux that two separate directories have the exact same contents?

    - by Mike B
    CentOS 5.x. My question seemed similar to this one, but I wasn't sure... I have two servers (completely isolated from each other), each with a directory and sub-directories that should have exactly the same contents. For example, the directory layout could be something like: SERVER A - /opt/foo/foob/1092380298309128301283/123.txt /opt/foo/foob/5094380298309128301283/456.txt /opt/foo/foob/5092380298309128301283/789.txt /opt/foo/foob/1592380298309128301283/abc.txt SERVER B - /opt/foo/foob/1092380298309128301283/123.txt /opt/foo/foob/5094380298309128301283/456.txt /opt/foo/foob/5092380298309128301283/789.txt /opt/foo/foob/1592380298309128301283/abc.txt Ideally I'd like a way to do a recursive check and have something confirm that everything matches. I also want to avoid using any third-party tools. Any ideas?
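
    A minimal sketch using only stock tools, with hypothetical list file names: build a sorted checksum list on each server, copy one list across, and diff them (an empty diff means the trees match in both names and contents):

      # run on each server
      cd /opt/foo && find foob -type f -exec md5sum {} + | sort -k 2 > /tmp/foob.md5
      # after copying one list to the other server (scp, email, ...):
      diff /tmp/foob.md5.serverA /tmp/foob.md5.serverB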

    Read the article

  • How to block access to files in the current directory with .htaccess

    - by kfir
    I have a few private files in a public folder and I want to block access to them. For example, let's say I have the following file tree: DictA (which contains FileA), FileA, FileB, FileC. I want to block access to FileB and FileA in the current directory and allow access to the FileA inside DictA. The first thing that came to mind was to use the FilesMatch directive as follows: <FilesMatch "^(?:FileA)|(?:FileB)$"> Deny from all </FilesMatch> The problem here is that FileA inside DictA will also be blocked, which is not what I wanted. I could override that by adding another .htaccess file to DictA, but I would like to know if there is a solution which won't involve that. P.S.: I can't move the private files to a separate folder.
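
    One possible alternative, assuming mod_rewrite is available: per-directory rewrite rules in a .htaccess are matched against paths relative to the directory holding that .htaccess, so DictA/FileA would not match the pattern below while FileA and FileB in the current directory would get a 403:

      RewriteEngine On
      RewriteRule ^(FileA|FileB)$ - [F]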

    Read the article

  • How to add recently set cookies to nginx's access log

    - by etoleb
    I'd like to include cookie data in an nginx access log like so: (simplified example) log_format foo '$remote_addr "$request" $cookie_bar'; access_log /var/log/nginx/access.log foo; This works great on requests that already have a cookie "bar", but for the first request to my server nginx will report "-" as the value of "bar". It seems like my problem is that nginx is looking at the request headers for the cookie value. Is there a way to check for a Set-Cookie header in the response and use that as a fallback?
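
    A possible sketch rather than a confirmed fix: nginx exposes response headers as $sent_http_<name>, so logging $sent_http_set_cookie next to $cookie_bar shows the freshly set value on the very first request (note it logs the whole Set-Cookie header, attributes included):

      log_format foo '$remote_addr "$request" $cookie_bar $sent_http_set_cookie';
      access_log /var/log/nginx/access.log foo;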

    Read the article

  • How can I upload a large number of files to Rackspace Cloud Files quickly?

    - by andy kim
    I have about a million image files in a single directory that I want to upload to Rackspace Cloud Files in the fastest and most efficient way possible. The python-cloudfiles script I'm using uploads them one at a time over a single connection, which is very slow, so I'd like to know about different approaches or Python script code. I thought tarring the directory, uploading the archive and uncompressing it there would be a better way, but Cloud Files does not support that. Does anyone know any other way?
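
    The usual remedy is to run several uploads in parallel instead of one sequential connection. A minimal sketch with hypothetical names: upload_batch.py stands for a wrapper around python-cloudfiles that uploads every path passed on its command line, and xargs fans the file list out over 8 concurrent workers:

      find /path/to/images -type f -print0 | xargs -0 -n 100 -P 8 ./upload_batch.py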

    Read the article

  • Getting SEC to only monitor latest version of a log file?

    - by user439407
    I have been tasked with running SEC to help correlate PHP logs. The basic setup is pretty straightforward; the problem I'm having is that we want to monitor a log file whose name contains the date (php-2012-10-01.log, for instance). How can I tell SEC to only monitor the latest version of the file (and, of course, switch to the newest log file every day at midnight)? I could create a symlink that always points to the latest version of the file and run a cron job at midnight to update the link, but I am looking for a more elegant solution.
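
    A minimal sketch of the symlink approach mentioned above, with hypothetical paths: SEC watches the stable name, and cron refreshes the link just after midnight (the % has to be escaped inside crontab):

      ln -sfn /var/log/php/php-$(date +%F).log /var/log/php/php-current.log
      # crontab entry:
      # 0 0 * * * ln -sfn /var/log/php/php-$(date +\%F).log /var/log/php/php-current.log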

    Read the article

  • Finding .desktop files based on their titles?

    - by stwissel
    That's part 2 of a question asked earlier (split so I can give credit to the answers individually). When I type into the Dash, applications show up with their title (also when hovering over the launcher); how can I find the associated .desktop file? When I look into the usual suspect locations (/usr/share/applications and ~/.local/share/applications) with Nautilus I see the titles, but not the file names (not even in Properties, which sucks). When I look from the command line I see the file names but not the titles (a switch would be nice). How can I get a listing (a custom column?) that shows them next to each other?
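
    A minimal command-line sketch: the title shown in the Dash normally comes from the Name= entry inside each .desktop file, so grep can print file names and titles side by side, or look up the file behind one specific title:

      grep -H '^Name=' /usr/share/applications/*.desktop ~/.local/share/applications/*.desktop
      grep -il '^Name=Firefox' /usr/share/applications/*.desktop ~/.local/share/applications/*.desktop   # e.g. one specific title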

    Read the article

  • cannot find java even though it is there (ubuntu 12.04)

    - by Jeff Storey
    I'm trying to just execute the java command and it's saying it cannot be found, even though it is there. Here's what my output looks like: root@oneiric:/usr/lib/jvm/default-java/bin# ls -al java -rwxrwxrwx 1 uucp 143 5750 2012-09-20 11:14 java root@oneiric:/usr/lib/jvm/default-java/bin# ./java -su: ./java: No such file or directory So ls shows it's there, but it doesn't seem to execute. Can someone explain why this is?
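
    One common cause of "No such file or directory" on a binary that clearly exists is a missing ELF interpreter (for example a 32-bit java on a 64-bit install without the 32-bit loader). A quick diagnostic sketch:

      file ./java                          # 32-bit or 64-bit ELF? or not a real binary at all?
      readelf -l ./java | grep -i interp   # which loader does it request?
      ls -l /lib/ld-linux.so.2 /lib64/ld-linux-x86-64.so.2   # is that loader actually installed?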

    Read the article

  • Something keeps deleting my downloaded files

    - by corroded
    I have been using uTorrent for years now and recently I was surprised that I had 24GB free. I thought that was because I deleted some unused apps, but after a while I noticed my Torrents folder was gone (I put finished torrents in my Downloads/Torrents folder). I thought I had accidentally deleted it (I use rm -r to delete huge files), so I shrugged and tried to download those 24GB back (after banging my head for the sheer stupidity). This morning, I noticed that again my Torrent folder was gone! This made me think that something MUST be deleting my torrent files. I am not sure, but my hunch is uTorrent (so I just upgraded it) or something else entirely. This is getting frustrating, so I hope someone can help me on this. My only guess is when I do CMD + w (I'm on a Mac, OSX Lion), it closes the window and somehow deletes the torrents? I am downloading the files again now and will try to document what I do tomorrow so I can add more input here.

    Read the article

  • rsync synchronizing files only without creating folders on destination

    - by Vincent
    Is it possible with rsync to not create directories on the destination? Imagine I have this source: a/ a/x.txt b/ b/y.txt and this destination: a/ a/z.txt The wanted result of rsync source destination: a/ a/x.txt a/z.txt Of course my real situation involves a structure of thousands of files and folders, and I don't want solutions involving an explicit list of synced folders, which I could do. I'm looking for a clean way just to prevent any folder creation on the destination, by exclude or filtering... It could even be something outside rsync, like a hack with permissions, if rsync can't do this. For information, it is really easy to end up in this kind of situation; in my case I have a server with 2 disks, let's say A & B, and a local drive C. I usually use rsync to sync (and merge) remote A & B into local C. Then sometimes I just want to sync back some C files into A and B (just new files... not non-existing folders on the destination).
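
    A minimal sketch with hypothetical paths: walk the source directories, skip any that do not already exist on the destination, and sync only the plain files directly inside each one (the '*/' exclude keeps rsync from creating or descending into subdirectories):

      src=/path/to/source; dst=/path/to/dest
      (cd "$src" && find . -type d) | while IFS= read -r d; do
          [ -d "$dst/$d" ] || continue
          rsync -a --exclude='*/' "$src/$d/" "$dst/$d/"
      done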

    Read the article

  • Setting differing ACLs on directories and files

    - by durandal
    Quick ACL question: I want to set up default permissions for a file share so that everyone can rwx all of the directories and so that all newly created files are rw. Everyone who is accessing this share is in the same group, so this isn't a concern. I have looked at doing this via ACLs without changing all of the users' umasks and such. Here are my current invocations: setfacl -Rdm g:mygroup:rwx share_name setfacl -Rm g:mygroup:rwx share_name My problem is that while I want all of the newly created sub-directories to be rwx, I only want newly created files to be rw. Does anyone have a better method to achieve my desired end-result? Is there some way to set ACLs on directories separately from files, in a similar vein to "chmod +x" vs. "chmod +X"? Thanks
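
    A sketch of one way to express this with the stock setfacl/getfacl tools: use a capital X for the existing tree (execute only where it already makes sense), and rely on the fact that newly created files are masked by the 0666 mode applications request, so they end up rw even under an rwx default ACL:

      setfacl -R -m g:mygroup:rwX share_name      # existing objects: rwx on dirs, rw on non-executable files
      setfacl -R -d -m g:mygroup:rwx share_name   # default ACL inherited by anything created later
      # verify the effective permissions
      touch share_name/testfile && mkdir share_name/testdir
      getfacl share_name/testfile share_name/testdir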

    Read the article

  • How can I delete Time Machine files using the commandline

    - by Tim
    I want to delete some files/directories from my Time Machine Partition using rm, but am unable to do so. I'm pretty sure the problem is related to some sort of access control extended attributes on files in the backup, but do not know how to override/disable them in order to get rm to work. An example of the error I'm getting is: % sudo rm -rf Backups.backupdb/MacBook/Latest/MacBook/somedir rm: Backups.backupdb/MacBook/Latest/MacBook/somedir: Directory not empty rm: Backups.backupdb/MacBook/Latest/MacBook/somedir/somefile: Operation not permitted There are a number of reasons I do not want to use either the Time Machine GUI or Finder for this. If possible, I'd like to be able to maintain the extended protection for all other files (I'd like not to disable them globally, unless I can re-enable once I've done my work).
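
    On OS X 10.7 and later, Apple's own tmutil can remove items from a backup and handles the Time Machine ACLs itself, so a sketch (path hypothetical) would be:

      sudo tmutil delete /Volumes/TimeMachine/Backups.backupdb/MacBook/Latest/MacBook/somedir
      # alternative: strip the ACLs from just that subtree, leaving the rest protected, then remove it
      # sudo chmod -RN Backups.backupdb/MacBook/Latest/MacBook/somedir && sudo rm -rf Backups.backupdb/MacBook/Latest/MacBook/somedir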

    Read the article

  • How to change the logrotate extension?

    - by Jayakrishnan T
    Hi all, currently my logrotate configuration adds a single number after the rotated log file: mylogfile.log is rotated to mylogfile.log.1. I would like to change the extension to the current date instead (mylogfile.log.<date>). Does anyone know a way to do this? My logrotate configuration is: /usr/local/jboss/jboss-3.2.7-ND1/server/default/log/consolelog.log { copytruncate rotate 1 missingok notifempty } Currently I am renaming the rotated file with a script. Is there any option in the logrotate configuration to change the extension of the rotated file? Please help me.
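
    Assuming a logrotate version that supports them, the dateext and dateformat directives do this natively; a sketch of the same stanza, which would rotate to something like consolelog.log-2012-10-01 instead of consolelog.log.1:

      /usr/local/jboss/jboss-3.2.7-ND1/server/default/log/consolelog.log {
          copytruncate
          rotate 1
          missingok
          notifempty
          dateext
          dateformat -%Y-%m-%d
      }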

    Read the article

  • delete multiple files on linux with spaces in file names

    - by raido
    I have a directory on my Linux box with over 10000 files that I have to delete. Running... sudo rm -rf /var/tmp/* gives the error message... sudo: unable to execute /bin/rm: Argument list too long The solution to this is to run sudo find /var/tmp | xargs sudo rm This only works for files with no spaces in the file name. However, some of the files have names with spaces in them and they are not deleted. For example, if a file is named 'A File With Spaces in the Name.dat', running the command gives me errors like this... rm: cannot remove `/var/tmp/A': No such file or directory rm: cannot remove `File': No such file or directory rm: cannot remove `With': No such file or directory rm: cannot remove `Spaces': No such file or directory rm: cannot remove `in': No such file or directory rm: cannot remove `the': No such file or directory rm: cannot remove `Name.dat': No such file or directory How do I pass the complete file paths to xargs sudo rm without breaking up the file names?
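
    A minimal sketch: have find emit NUL-terminated names and let xargs split on NUL so whitespace in file names survives; with GNU find you can also skip xargs entirely:

      sudo find /var/tmp -mindepth 1 -print0 | xargs -0 sudo rm -rf
      # or (GNU find):
      sudo find /var/tmp -mindepth 1 -delete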

    Read the article

  • How to show users the reason for a message being bounced or rejected by Postfix?

    - by Ross Bearman
    A user would like to be able to view a web page showing any emails that a Postfix server has either been unable to send or unable to receive. For example, if the user was supposed to receive an email from a third party but it hasn't arrived, they'd be able to check the web page and see a list of emails rejected by Postfix, along with a clear reason why. I've been unable to find an existing application that offers this functionality. Does anyone know of any, or is the best way forward to write a script that parses the log and displays the results?
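
    If it comes down to a script, the Postfix log already records the reasons; a rough sketch of the extraction step (the log path varies by distribution, e.g. /var/log/maillog on Red Hat-style systems):

      grep -E 'NOQUEUE: reject:|status=(bounced|deferred)' /var/log/mail.log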

    Read the article

  • Apache log lines contain "..."

    - by mtah
    We have a custom log line format for Apache logs which are analyzed. CustomLog "|/usr/sbin/rotatelogs -l /mnt/var/log/apache2/access-%Y%m%d%H%M%S.log 900" "%a %{%s}t \"%r\"" However, some log lines are mysteriously shortened with "..." for some reason, but how can this be? The shortest length line discovered where this occurs is 317 chars while the longest line is way over 2000 chars. "GET /exposure?sg=&ap=0x0&fv=WIN%2010,0,22,87&si=IH95VDUAVLJ0&pt=Lage%20hjemmelaget%20sengegavl%20-%20Forum%20-%20Diskusjon.no&iv=0&sd=1024x600&ct=680&tz=-120&eu=http%3A//www.diskusjon.no/index.php%3Fshowtopic%3D1011139&l...AS3&an=NO%20-%20180x500%20Pretail%20CPC&wd=1024x483&rf=http%3A//www.google.no/search%3Fhl%3Dno%26source%3Dhp%26q%3Dsengegavl+lage%26meta%3D%26aq%3D2%26aqi%3Dg10%26aql%3D%26oq%3Dsengega%26gs_rfai%3D&ui=3INYF5QAZL10&ws=0x417&ad=180x500&sa= HTTP/1.1"

    Read the article

  • Daemon for moving files between partitions?

    - by RATHI
    I have a system with Ubuntu installed on 20 GB and Windows on 100 GB, plus two NTFS data partitions of 100 GB each. While using DC++ (downloading multiple big files) I used to get a message that the system was running out of space. Is there any way to make a daemon which keeps checking the Ubuntu partition, so that if its used space goes above a certain amount (let's say 18 GB) it automatically starts moving files from this drive to another drive (let's assume it picks files from the movie folder, or the largest media file on this drive)? Or prompts the user to ask which file to move? Is there any program which can do this for me? If not, can you suggest something to read so that I could make it?
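
    A minimal cron-driven sketch rather than a true daemon, with hypothetical paths and thresholds: when the root partition passes 18 GB used, move the largest file from the movie folder onto an NTFS data partition:

      #!/bin/bash
      LIMIT_KB=$((18 * 1024 * 1024))
      USED_KB=$(df -k / | awk 'NR==2 {print $3}')
      if [ "$USED_KB" -gt "$LIMIT_KB" ]; then
          biggest=$(find "$HOME/Videos" -type f -printf '%s\t%p\n' | sort -nr | head -n1 | cut -f2-)
          [ -n "$biggest" ] && mv -- "$biggest" /media/data/
      fi
      # run every 10 minutes from cron:  */10 * * * * /usr/local/bin/spacewatch.sh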

    Read the article

  • Missing files when Windows 7 returns from hibernate w/ dual boot

    - by Arthur N
    I have a dual-boot setup with Ubuntu (lucid) and Windows 7. I have the Windows file system shared on Ubuntu through Samba. Occasionally, I am working on Windows and my machine will go into hibernate (i.e. when the battery level is critical). By default, my GRUB settings boot me into Ubuntu. So when I get back to my PC, sometimes I just hop into Ubuntu instead of going back to Windows. However, if I write any files to the Windows file system during that Ubuntu session, the next time I do go back to Windows (which resumes from hibernate), those files are missing. Obviously, the state of the actual file system and the hibernate snapshot become out of sync, and Windows chooses the hibernate snapshot, overriding any changes I may have made thru Ubuntu. For now, I've disabled the hibernate option in the Windows power settings, but is there any utility I can use to get back some of those missing files?

    Read the article

  • PHP session files have permissions of 000 - They're unusable

    - by vanced
    I kept having issues with a Document Management System I'm trying to install as, at the first step of the installation process, it would error with: Warning: Unknown: open(/tmp/sess_d39cac7f80834b2ee069d0c867ac169c, O_RDWR) failed: Permission denied (13) in Unknown on line 0 Warning: Unknown: Failed to write session data (files). Please verify that the current setting of session.save_path is correct (/tmp) in Unknown on line 0 I looked in /tmp and saw the sess_* files have the following permissions ---------- 1 vanced vanced 1240 Jan 20 08:48 sess_d39cac7f80834b2ee069d0c867ac169c All the session files look like this. So obviously, they're unusable by PHP and it's causing me lots of problems. How can I get PHP to set the correct permissions? I've tried changing the directory which php.ini uses to /tmp/phpsessions and the same thing occurs. The directories are a+rwx.
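
    Some diagnostics that may narrow this down, plus a workaround that is only an assumption (a dedicated session directory owned by the web server user); the root cause is often the filesystem or mount options behind /tmp rather than PHP itself:

      php -i | grep -i session.save_path
      mount | grep ' /tmp '                       # unusual filesystem or mount options?
      sudo mkdir -p /var/lib/php/sessions-dms     # hypothetical dedicated directory
      sudo chown www-data:www-data /var/lib/php/sessions-dms   # assuming Apache runs as www-data
      sudo chmod 700 /var/lib/php/sessions-dms
      # then set session.save_path = "/var/lib/php/sessions-dms" in php.ini and restart the web server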

    Read the article

  • PDF files are opening in Firefox, undesirably

    - by root
    PDF files have suddenly started to open within the browser windows of Firefox 17. The PDF files are being displayed with the Adobe Acrobat plugin, which is odd, since I have explicitly disabled the Adobe Acrobat plugin in Firefox. I would like for Firefox to show the download prompt when opening a PDF file, instead. I have disabled the Adobe Acrobat plugin and I have made sure that PDF files are set to "Always Ask" in the Options dialog. For good measure, I've also tried disabling all plugins and extensions, and associating all file types to "Always Ask", but to no avail. So why is Firefox 17 suddenly ignoring these settings?

    Read the article

  • can't delete files on Windows 7 (permission)

    - by lajos
    I updated my laptop from XP to Windows 7. There are some leftover files from XP on the computer now, when I try deleting them I get an error: You need permission to perform this action. You require permission from S-1-.... to make changes to this folder. What's weird is that I am logged in with the only user account on this machine and I have administrator privileges. I tried turning UAC off, but still can't delete the files. How can I force removal of these files?
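
    One commonly suggested route, run from an elevated Command Prompt with the path adjusted to the leftover XP folder: take ownership of the tree, grant the Administrators group full control, then remove it:

      takeown /F "C:\old-xp-leftovers" /R /D Y
      icacls "C:\old-xp-leftovers" /grant Administrators:F /T
      rd /S /Q "C:\old-xp-leftovers"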

    Read the article

  • Exchange 2010 allows outside access to network files

    - by user2891127
    One of our users discovered by accident he could access our network files from his smartphone while at home. No VPN needed. He was sent an email with an internal link to a network share on his android. When he opened the email and clicked on the link, he could browse our files while at home. Looking at the access logs, the connection to the share and files he accessed came from our mail server (Exchange 2010). We have no sharepoint servers running at all, and certainly not on the Exchange server. What is this function/feature called, and is it possible to turn this function/feature off? Should I turn this off?

    Read the article

  • How to find malformed / corrupted / DOS / BOM-byte files in Linux

    - by Syquus
    I've several problems maintaining large production servers where some developers drop files from Windows environments, sometimes with BOM bytes (we use UTF-8 and have no need for them), causing lots of trouble. Other times, I get "no end of line" and "[DOS]" labels when editing files directly on the server with vim. I recently discovered how to find the BOM byte and how to delete it in a batch script. What about illegal bytes and bad EOLs? Is it safe to use DOS text files in a Linux environment? Are there any drawbacks if I convert them with the dos2unix command? Regards
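
    A minimal detection sketch: report files that start with a UTF-8 BOM or that the file utility flags as having CRLF (DOS) line endings; these are the usual candidates for dos2unix:

      find /path/to/tree -type f | while IFS= read -r f; do
          head -c 3 "$f" | od -An -tx1 | grep -qi 'ef bb bf' && echo "BOM:  $f"
          file "$f" | grep -q 'CRLF' && echo "CRLF: $f"
      done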

    Read the article
