Search Results

Search found 39339 results on 1574 pages for 'batch files'.

  • How restore back up email files in qmail

    - by Maysam
    I have a problem restoring some old backup mail files on a mail server that uses qmail. When I copy a backed-up email file into the /cur directory, the message count shown next to the inbox increases, but when I click on the inbox I don't see the newly copied email; I can only see the old emails. I also deleted the maildirsize and courierimapuiddb files and they were automatically recreated, but that didn't help and I still cannot see the email in my inbox. Is there something I am missing? How can I restore the backed-up email files? Note that when I copy email files into the /.sent-mail/cur directory, they all show up in my sent box, but that doesn't happen for inbox files in /cur.
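
    A minimal sketch of one thing to try, assuming a standard Maildir layout (the paths and ownership below are placeholders): drop the restored messages into the Maildir's new/ directory rather than cur/, and let the IMAP server move them into cur/ and assign fresh UIDs. Files placed directly in cur/ normally need the ":2," flag suffix in their names, which restored copies often lack.

        # hypothetical paths - adjust to the real Maildir and mail user
        cp /backup/Maildir/cur/* /home/vpopmail/domains/example.com/user/Maildir/new/
        chown vpopmail:vchkpw /home/vpopmail/domains/example.com/user/Maildir/new/*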

    Read the article

  • File access with hostname or ip only - no domain?

    - by Jonathon
    It seems likely that this is an obvious question, but I'm having trouble tracking down any useful information. Normally, when accessing files in a particular directory on a server, I'm able to create a virtual host, assign a domain, set the root directory location, etc. However, I am in a situation where I have server space and need to access files with only a hostname. Is this possible? For example, let's say the hostname is 123hostname.com, and the file I want to access is in /home/sub-directory/filename.php. How do I get at it via a browser? I've tried http://123hostname.com/home/sub-directory/filename.php and some other variations on that theme (which I can't post because new users are restricted to one link per message), but I'm generally stuck. Any help - even if it's just to let me know that this isn't possible without some additional configuration - would be great. Thank you!

    Read the article

  • How to share files between cPanel accounts?

    - by Darren
    I am setting up a multi-site/multi-store Magento installation, and I want each site to have its own cPanel account so I can set up the SSL certificate and dedicated IP properly. I have created a Linux group called 'magento', changed the group of the files I need to share to it, and added the users to that group, but when my scripts on those accounts try to access the files, they behave as if the files don't exist. I first tried a symbolic link, which didn't work, and then tried including the files from their real location, but that didn't work either. Am I missing a step in controlling which users can access which files? As I said, the users are in the magento group and the shared files belong to that group, but it's still not working. Thanks, Darren
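
    A rough sketch of the permission setup that is usually needed, assuming the shared code lives in /home/shared/magento (a placeholder path). Note that new group membership only takes effect for new login sessions, so restart the shells and PHP processes after adding users to the group, and every directory on the path to the shared files must be traversable by the group as well.

        sudo chgrp -R magento /home/shared/magento
        sudo chmod -R g+rX /home/shared/magento
        sudo find /home/shared/magento -type d -exec chmod g+s {} +   # new files inherit the group
        sudo chgrp magento /home/shared && sudo chmod g+x /home/shared # let the group traverse the parent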

    Read the article

  • View hidden contents of usb device

    - by Srikanth Suresh
    I have a USB drive with the following contents according to ls -lah:

        total 8.0K
        drwx------ 1 srikanth srikanth 4.0K May 27 22:54 .
        drwxr-xr-x 4 root     root     4.0K May 28 19:37 ..
        -rw------- 2 srikanth srikanth    0 May 27 22:52 Files.az3w

    Viewing the properties of the folder shows 90.4 MB used and 16.1 GB free. There is sensitive data on this pen drive that I am currently unable to view. After reading about ways of hiding contents on a USB drive, I think there is a hidden partition here that I can't access. How should I proceed to view the contents without damaging the files already present?
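
    A minimal sketch of how to check for additional partitions before touching anything. The device name /dev/sdb is a placeholder - confirm the real one with dmesg or lsblk first.

        sudo lsblk -f /dev/sdb           # list partitions and their filesystems
        sudo fdisk -l /dev/sdb           # show the partition table
        sudo mount -o ro /dev/sdb2 /mnt  # if a second partition exists, mount it read-only to inspect it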

    Read the article

  • Windows 7, files reappear after deletion.

    - by HeavyWave
    I'm trying to delete some files from a folder. I've taken ownership of the files and the folder. When I delete these files, Windows doesn't report any errors and deletes them. BUT, after I press F5 the files reappear. There are no messages whatsoever; they are just undeletable. I know logging off will help, but how do I fix it without going through the pain of closing everything down? P.S. The files do disappear from the folder after approx. 5 minutes. Update: it turns out my version of Windows did not properly upgrade from a test build, so it had some odd disk drive issues.

    Read the article

  • Massive number of subfolders with very long paths. How can I delete all of them?

    - by Carlos
    Good day. We have a little problem here. We have a share containing the backups of all the office servers; it's a really big share with more than 8,000,000 files. Our users usually give long names to the folders they create, and then make subfolders (long too) and more subfolders... and more subfolders... We have a new share with more capacity, and with a simple robocopy batch file we copied all the files and folders (some gave problems, but we copied those manually). The problem now is deleting the old ones: the del command doesn't cope well with such long paths, and neither does rmdir. I've tried a few file managers, but no luck. Can you recommend a tool that can delete recursively, or that can handle paths longer than 255 characters? Edited: the OS behind the share is NetApp's, but I can access it from Windows Server 2000 and 2003. Thanks.

    Read the article

  • Large .tmp files in C:\Windows\Temp

    - by Brian
    I have a Windows 2008 server which functions as a web service server, and every other week or so the C:\Windows\Temp folder fills up with large (between 40-300 MB) files with names like ~DF5D73.tmp, ~DFA95.tmp, etc. My web services on the server don't create any files with that type of name, and the contents don't help identify them. Process Explorer indicates that w3wp.exe is currently using several of the files, but I can't work out specifically where they're coming from. As an example, the top of one of the files starts with the byte sequence "ÐÏࡱá", followed by long runs of 0xFF filler and fragments such as "R o o t E n t r y" and "BM" further in. The owner of the file is only identified as localname\Administrators, so that's no help. Has anyone else experienced a similar issue, or does anyone have suggestions for tracking down where the files are coming from?

    Read the article

  • Getting error code -41 when copying files to external drive

    - by diego
    I'm having trouble copying some files from my Mac to an external hard drive: I keep getting the nondescript "error code -41". I noticed that some of the files with an additional "@" permission bit had the "com.apple.quarantine" attribute set. I used the "xattr" command from the article "What should I do about com.apple.quarantine?" to clear the quarantine attribute (these files were copied over from another Mac on my network, so I guess OS X flagged them as quarantined). That took care of the problem for those files, but I still have some that I can't copy over to the external drive. The only other thing I've noticed is that some of these files have an extra permission bit, "drwxr-xr-x+", which I haven't had any luck googling. Aside from that I don't see anything else, and Disk Utility says everything's fine. Any help would be greatly appreciated.
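
    A short sketch of how to inspect that trailing "+": it means the file carries an ACL on top of the normal permission bits. The path below is a placeholder, and stripping the ACL is only appropriate if it turns out to be what is blocking the copy.

        ls -le /path/to/stubborn/file         # -e prints the ACL entries on OS X
        xattr -l /path/to/stubborn/file       # list any remaining extended attributes
        sudo chmod -N /path/to/stubborn/file  # remove the ACL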

    Read the article

  • Getting iTunes to play third party AAC files

    - by Redmastif
    I have a library filled with some old MP3 files and I'm in the process of replacing them all with AAC for the better sound quality. Obviously I can't just create AAC versions of the files I already have, because re-encoding one lossy format into another would sound worse, so I'm going back to their source, downloading them in a lossless form, and using a third-party encoder to convert them to AAC. Apparently iTunes will not handle AAC files that weren't made with iTunes. Is there a way around this? I've looked at third-party programs and would be willing to use them, but since they all require the iTunes/iPod/iEverything driver, I don't know whether they would still reject my files. Also, before you jump on my back about pirating: these files are from old CDs that I lost years ago. I paid for them. Thanks.

    Read the article

  • Organizing files relationally in Windows 7?

    - by Cayetano Gonçalves
    I just took a new job as a policy analyst, and after even one week, keeping track of hundreds of files - lawsuits, legislation, letters, etc. - in Windows 7 is proving difficult. In my last job I was a database architect and helped build Linux-based servers to track files across an entire department, but there is no way for me to do that in this job right now. Is there any way to track files/indices/locations/tags/themes and store them in some kind of RDBMS, instead of storing the files in folders that only allow flat, fixed storage? For example, if I have a file that deals with: ELID, organization, Appeals court, John Smith - it really is inconvenient to have to decide which one of these tags to turn into a folder and place the file into, when it falls under all the categories. Even being able to tag files the way you can tag questions on Stack Exchange would save a lot of heartache.

    Read the article

  • Merging and sorting multiple files with "sort"

    - by NewbiZ
    Hello, I have a bunch of text log files in the following line format: an ID (17 characters), then a timestamp (14 characters, YYYYmmddHHMMSS, e.g. "20060210100040" -> 2006/02/10 10:00:40), then random data up to the end of the line. The files are already sorted by timestamp. I need to produce one log file containing the records from all of the log files, sorted by timestamp. Note that the log files are really huge, around 3-4 GB each (and there are dozens of them). I tried the following command:

        sort -s -m -t '|' -k1n,1n +17 -o data_sort.txt *.TXT

    Here is how I ended up with it: -s (stable, don't reorder ties), -m (merge the already-sorted files), -t '|' (there is no | in my logs, so the whole line should be field 1), -k1n,1n (sort on the first field as a numeric value), +17 (the timestamp starts at index 17), -o (output file). Actually... it fails miserably. The output file data_sort.txt is just the concatenation of all the files, not sorted at all :( I would greatly appreciate any help with this problem! Thanks
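
    A sketch of an alternative that keys the merge directly on the timestamp characters instead of the obsolete +17 offset notation: with -t '|' the whole line remains field 1, so GNU sort's field.character form can address the 14-digit timestamp, which occupies characters 18 through 31 after the 17-character ID.

        # merge pre-sorted files on the timestamp that follows the 17-character ID
        sort -m -s -t '|' -k1.18,1.31 -o data_sort.txt *.TXT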

    Read the article

  • Remove Sync Center icon

    - by Edward Brey
    I accidentally marked a shared folder as "Available Offline" in Windows Explorer on a Windows 8.1 computer. This seems to have "woken up" Sync Center and caused the Sync Center icon to be displayed in the system notification area. Even though I've since undone that by marking the folder as not available offline, and furthermore have reset CSC and disabled Offline Files, the Sync Center icon still appears in the overflow section of the notification area. How do I remove the Sync Center icon and, preferably, disable the process that is displaying it? Debugging info: the registry still shows the handler instances as connected and active, even though the Sync Center and Offline Files dialogs don't indicate that anything is active:

        HKEY_CURRENT_USER\Software\Classes\Local Settings\Software\Microsoft\Windows\CurrentVersion\SyncMgr\HandlerInstances\{750FDF10-2A26-11D1-A3EA-080036587F03}
            SyncTime                   REG_BINARY   F6DDC46CBB76CF01
            Connected                  REG_DWORD    0x1
            Enabled                    REG_DWORD    0x0
            Active                     REG_DWORD    0x1
            NotifiedOnFirstActivation  REG_DWORD    0x0
        HKEY_CURRENT_USER\Software\Classes\Local Settings\Software\Microsoft\Windows\CurrentVersion\SyncMgr\HandlerInstances\{750FDF10-2A26-11D1-A3EA-080036587F03}\SyncItems
        HKEY_CURRENT_USER\Software\Classes\Local Settings\Software\Microsoft\Windows\CurrentVersion\SyncMgr\HandlerInstances\{750FDF10-2A26-11D1-A3EA-080036587F03}\SyncItems\{CBA95344-4284-48CB-8083-3BDE1FDB29A7}
            SyncTime                   REG_BINARY   F6DDC46CBB76CF01
            Connected                  REG_DWORD    0x1
            Enabled                    REG_DWORD    0x1

    Read the article

  • linux: accessing thousands of files in hash of directories

    - by 130490868091234
    I would like to know the most efficient way of concurrently accessing thousands of files of similar size on a modern Linux compute cluster. I am carrying out an indexing operation on each of these files, so 4 index files, about 5-10x smaller than the data file, are produced next to the file being indexed. Right now I am using a hierarchy of directories from ./00/00/00 to ./99/99/99, with one file at the end of each leaf directory, like ./00/00/00/file000000.ext through ./99/99/99/file999999.ext. This seems to work better than having thousands of files in the same directory, but I would like to know whether there is a better way of laying out the files to improve access.
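
    One layout worth considering (a sketch, not necessarily better for every workload): derive the directory levels from a hash of the file name instead of a running counter, so files spread evenly across the leaves and a file's location can be computed without consulting an index. The snippet below assumes GNU coreutils and uses made-up names.

        f="file000123.ext"
        h=$(printf '%s' "$f" | md5sum | cut -c1-6)   # first 6 hex chars of the name's MD5
        dir="${h:0:2}/${h:2:2}/${h:4:2}"             # e.g. a3/f0/1c
        mkdir -p "$dir" && mv "$f" "$dir/"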

    Read the article

  • how to have files created by CMS have the same ownership as SSH user

    - by Cam
    I am having difficulty on our Ubuntu server. I have an SSH user, and when I create files as this user the ownership is web_user:www-data. The problem arises when a file is uploaded or created through a content management system like Joomla: when files are uploaded through Joomla - such as components or modules - the ownership is set to www-data:www-data. This means I then need to chown all new files to web_user:www-data so we can edit them. Is there a way to specify, for a directory and its sub-directories, that all newly created files get the ownership web_user:www-data? Do I need to use something like setuid or setgid? Any help would be greatly appreciated.
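
    A sketch of the usual workaround, assuming the site lives in /var/www/site (a placeholder). Files created by the web server will still be owned by www-data, but a setgid directory plus default ACLs keeps new files group-readable and group-writable, so web_user can edit them without any chowning:

        sudo chgrp -R www-data /var/www/site
        sudo chmod -R g+rwX /var/www/site
        sudo find /var/www/site -type d -exec chmod g+s {} +   # new files inherit the www-data group
        sudo setfacl -R -d -m g:www-data:rwX /var/www/site     # default ACL applied to future files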

    Read the article

  • Can Robocopy be made to skip open files?

    - by domspurling
    We are using Robocopy to redistribute files which arrive via FTP in a drop folder. Ideally we want Robocopy to leave files alone if they are still being FTPd. Having tried various switches, Robocopy still copies the open files. It doesn't delete them, so the FTP continues unaffected. However, we end up with truncated files being distributed to their destination. Can Robocopy be made to skip open files? Perhaps there is something more suitable than Robocopy for this task?

    Read the article

  • Force Windows 8 to search indexed files

    - by Hrvoje
    When using file search in Windows 8 (Win+F) I don't get the expected results. For example, I installed VLC; it's in the Program Files (x86) folder, and that folder is selected for indexing. Searching for files (Win+F) gives 0 results. If I pin that exe to Start, then it's found - but I don't want to do that; that's not the point. Where does it actually search for files? Is there any way to specify the search locations? It doesn't seem to use the Indexing Options settings. Also, searching from an Explorer window is quite slow - I tried entering VLC.EXE in the search box (at the C:\ root), and it takes a while to return correct results. It works, but it appears to scan all files and folders rather than use the index, which is slow.

    Read the article

  • Nginx, logrotate and empty files

    - by tzulberti
    I have a problem with nginx and logrotate. nginx logs access to two files (main and data). I have the following crontab entry:

        0 * * * * /usr/sbin/logrotate -f /home/orwell/orwell-setup/bin/logrotate-nginx

    And the file "logrotate-nginx" has the following content:

        /tmp/data.log {
            rotate 90
            daily
            missingok
            notifempty
            size 1
            sharedscripts
            postrotate
                [ ! -f /tmp/nginx.pid ] || kill -USR1 `cat /tmp/nginx.pid`
                MORE THINGS
            endscript
        }

        /tmp/main.log {
            rotate 90
            daily
            missingok
            notifempty
            size 1
            sharedscripts
            postrotate
                [ ! -f /tmp/nginx.pid ] || kill -USR1 `cat /tmp/nginx.pid`
                MORE THINGS
            endscript
        }

    The rotation is performed on both files, but afterwards nginx stops logging to them: both files are recreated, but they stay empty. Any ideas why nginx stops logging to both files?

    Read the article

  • How do I set the TEMP environment variable for the "Network Service" user?

    - by Chris Phillips
    We have a system that uses Path.GetTempFileName and Path.GetTempPath calls to work with temporary files fairly frequently. This system also runs as the "Network Service" user. We're finding that we're running out of room on the C drive (for unrelated reasons; our temp files are cleaned up correctly) and would like to move the temp directory to a different drive. The easiest solution seems to be to change the TMP or TEMP environment variables for the Network Service user, but I only seem to be able to set my own user variables or the "system" variables, which are overridden by the Network Service user profile. How do I set these variables for the Network Service user?

    Read the article

  • Linux program unable to access files in group

    - by user1064665
    I'm having trouble configuring things on Linux so that a program can access certain files. Let's call it pgm A. It has uid uA and gid gA. In addition, uid uA is listed in /etc/group as a member of group gX. The problem is that pgm A cannot access files whose uid is root and gid is gX, but only when pgm A is called from another program, pgm B, which also runs as user uA. If I su to user uA and run pgm A from bash, it has no problem accessing files in group gX. But if another program, pgm B, which also runs as user uA, forks and execs pgm A, then pgm A cannot access the files. I've verified that pgm A is indeed running as user uA, group gA, when launched from pgm B. So, if uA is a member of group gX, why can't the program access files which are readable by group gX? It's as if the operating system is ignoring the fact that user uA is also in group gX.
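
    One detail worth checking (a common cause of exactly this symptom, though not certain from the description): supplementary group membership is read from /etc/group only when a login session starts, and is then simply inherited across fork/exec. If pgm B was started before uA was added to gX, every child it execs, including pgm A, still runs with the old group set. A quick way to compare what the account has now versus what the running process actually has (the process name and path below are placeholders):

        id uA                                   # groups according to /etc/group right now
        ps -o pid,user,group,supgrp -C pgmB     # supplementary groups of the running process
        # restarting pgm B from a fresh login session picks up the new membership:
        su - uA -c /path/to/pgmB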

    Read the article

  • Get Zipped Logs from a Remote Server

    - by Jonathan
    I am tasked with finding a way to download zipped logs from a remote server. There are quite a few of these logs and they are constantly being created. I do have limited SSH access to the remote server and can scp or rsync the files. However, due to the sheer size of these log files, I do not want to rsync all of them: the logs could reach terabytes, and having rsync compare them all may take a long time. I only want to fetch files that were created or last updated within the last hour. I am also worried about rsyncing logs that are still in the process of being written, so I was thinking of only fetching files that were last modified more than 3-5 minutes ago. Would anyone be so kind as to help me with such a process? Thank you in advance.
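
    A sketch of one way to do this: build the candidate list on the remote side with find (files older than 5 minutes but newer than an hour), then let rsync transfer only those paths. The host, user and directories below are placeholders.

        ssh user@remote "cd /var/log/myapp && find . -type f -name '*.gz' -mmin +5 -mmin -60" > recent.list
        rsync -av --files-from=recent.list user@remote:/var/log/myapp/ ./logs/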

    Read the article

  • How to make scp copy hidden files?

    - by rascher
    I often use scp to copy files around - particularly web-related files. The problem is that whenever I do this, I can't get my command to copy hidden files (e.g. .htaccess). I typically invoke this:

        scp -rp src/ user@server:dest/

    This doesn't copy hidden files. I don't want to have to run a second command (something like scp -rp src/.* ..., which has strange . and .. implications anyway). I didn't see anything in the scp man page about an "include hidden files" option. How can I accomplish this?
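
    Two commonly suggested alternatives, sketched with placeholder paths: name the directory contents themselves rather than a shell glob (globs are what usually skip the dotfiles), or switch to rsync, which copies hidden files by default and only transfers changes.

        scp -rp src/. user@server:dest/       # "src/." copies the directory contents, dotfiles included
        rsync -avz src/ user@server:dest/     # rsync includes hidden files by default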

    Read the article

  • Hadoop Rolling Small files

    - by Arenstar
    I am running Hadoop on a project and need a suggestion. By default Hadoop has a "block size" of around 64 MB, and the usual advice is to avoid many small files. I currently have very, very small files being put into HDFS due to the application design of Flume. The problem is that Hadoop <= 0.20 cannot append to files, so I end up with too many files for my map-reduce jobs to function efficiently. There must be a proper way to simply roll/merge roughly 100 files into one, so that Hadoop effectively reads one large file instead of many small ones. Any suggestions?
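
    A sketch of one common workaround: periodically concatenate a directory of small files into a single large file and remove the originals. The paths and naming scheme below are placeholders, and the commands assume the classic hadoop fs shell.

        hadoop fs -cat /flume/in/part-* | hadoop fs -put - /flume/merged/$(date +%Y%m%d%H).log
        hadoop fs -rm /flume/in/part-*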

    Read the article

  • Automatically copy files out of directory

    - by wizard
    I had a user's laptop stolen recently during shipping, and it was set up with Windows Live Sync. The thief (or the buyer's kids) took some photos of themselves, and they were synced to the user's My Documents. I had just finished moving the user's files out of the synced My Documents folder when I noticed this. Later they took some more photos and a video. I wrote a batch script to copy files out of the synced directory every 5 minutes into a dated directory, and ended up with a lot of copies of the same few files. Ignoring what Windows Live Sync itself offers (at the time there was no way to undelete files, and I've since moved on to Dropbox, so this isn't really an issue for me any more): what's the best way to preserve changes and files from a directory? I'm interested in Windows solutions, but if you know of a good way on *nix, please go ahead and share.
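
    Since *nix answers are welcome: a sketch of snapshot-style preservation with rsync, where each run creates a dated directory but unchanged files are hard-linked to the previous snapshot instead of copied again, so only new or modified files consume space. The paths are examples.

        #!/usr/bin/env bash
        SRC=/path/to/synced/dir
        DST=/path/to/archive
        NOW=$(date +%Y%m%d-%H%M)
        LAST=$(ls -1d "$DST"/*/ 2>/dev/null | tail -n 1)            # most recent snapshot, if any
        rsync -a ${LAST:+--link-dest="$LAST"} "$SRC"/ "$DST/$NOW"/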

    Read the article

  • rename multiple files with unique name

    - by psaima
    I have a tab-delimited list of hundreds of names in the following format:

        old_name    new_name
        apple       orange
        yellow      blue

    All of my files have unique names, end with the .txt extension, and are in the same directory. I want to write a script that renames the files by reading my list, so apple.txt should be renamed to orange.txt. I have searched around but couldn't find a quick way to do this. I can change one file at a time with 'rename' or with perl ("perl -p -i -e 's///g' *.txt"), and a few files with sed, but I don't know how to use my list as input and write a shell script that makes the changes for all files in the directory. I don't want to write hundreds of rename commands in a shell script. Any suggestions will be most welcome!
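
    A minimal sketch of a loop-based approach, assuming the mapping lives in a file called names.tsv (a made-up name) with one tab-separated old/new pair per line and names given without the .txt extension:

        #!/usr/bin/env bash
        while IFS=$'\t' read -r old new; do
            # the header line is skipped because old_name.txt does not exist
            [ -e "${old}.txt" ] && mv -- "${old}.txt" "${new}.txt"
        done < names.tsv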

    Read the article
