Search Results

Search found 91859 results on 3675 pages for 'help files'.


  • how to have files created by CMS have the same ownership as SSH user

    - by Cam
    I am having difficulty on our Ubuntu server. I have an SSH user, and when I create files as this user the ownership is web_user:www-data. The problem arises when a file is uploaded or created through a content management system like Joomla. When files are uploaded through Joomla - such as components or modules - the ownership is set to www-data:www-data. This means I then need to chown all new files to web_user:www-data before we can edit them. Is there a way to specify, for a directory and its sub-directories, that all new files are created with the ownership web_user:www-data? Do I need to use something like setuid or setgid? Any help would be greatly appreciated.
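
    A sketch of the usual approach (not from the original question; /var/www is a placeholder for the actual web root): the setgid bit forces the group of new files, but Unix has no way to force the owner, so the practical fix is group-based access plus, where the filesystem supports ACLs, a default ACL for web_user:

        # force new files/dirs under the web root to inherit the www-data group
        sudo chown -R web_user:www-data /var/www
        sudo find /var/www -type d -exec chmod g+s {} +
        sudo chmod -R g+rw /var/www

        # optional, assuming the filesystem is mounted with ACL support:
        # grant web_user access to anything created later
        sudo setfacl -R -d -m u:web_user:rwX /var/www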

    Read the article

  • Can Robocopy be made to skip open files?

    - by domspurling
    We are using Robocopy to redistribute files which arrive via FTP into a drop folder. Ideally we want Robocopy to leave files alone while they are still being FTP'd. We have tried various switches, but Robocopy still copies the open files. It doesn't delete them, so the FTP continues unaffected; however, we end up with truncated files being distributed to their destination. Can Robocopy be made to skip open files? Perhaps there is something more suitable than Robocopy for this task?
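
    Robocopy has no switch for skipping files another process holds open, so a common workaround (a sketch, not from the original post; paths are placeholders, and -File needs PowerShell 3.0+) is a small PowerShell pre-filter that copies only files it can open exclusively:

        $src = 'C:\ftp\drop'
        $dst = 'D:\distrib'
        Get-ChildItem $src -File | ForEach-Object {
            try {
                # an exclusive open fails while the FTP upload still holds the file
                $fs = [System.IO.File]::Open($_.FullName, 'Open', 'Read', 'None')
                $fs.Close()
                Copy-Item $_.FullName -Destination $dst
            } catch {
                # locked or unreadable - leave it for the next run
            }
        }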

    Read the article

  • Force Windows 8 to search indexed files

    - by Hrvoje
    When searching for files in Windows 8 (Win+F) I don't get the expected results. For example, I installed VLC; it's in the Program Files (x86) folder, and that folder is selected for indexing. A file search (Win+F) gives 0 results. If I pin that exe to Start, then it's found - but I don't want to do that; that's not the point. Where does it search for files? Is there any way to specify the search locations? It doesn't appear to use the Indexing Options settings, at least it seems so. Also, searching from an Explorer window is rather slow - I tried entering VLC.EXE in the search box (at the C:\ root), and it takes some time to return correct results. It works, but it looks like it scans all files and folders instead of using the index, which is slow.

    Read the article

  • Nginx, logrotate and empty files

    - by tzulberti
    I have a problem with nginx/logrotate. Nginx logs access to two files (main and data). I have the following crontab entry:

        0 * * * * /usr/sbin/logrotate -f /home/orwell/orwell-setup/bin/logrotate-nginx

    And the file "logrotate-nginx" has the following content:

        /tmp/data.log {
            rotate 90
            daily
            missingok
            notifempty
            size 1
            sharedscripts
            postrotate
                [ ! -f /tmp/nginx.pid ] || kill -USR1 `cat /tmp/nginx.pid`
                MORE THINGS
            endscript
        }

        /tmp/main.log {
            rotate 90
            daily
            missingok
            notifempty
            size 1
            sharedscripts
            postrotate
                [ ! -f /tmp/nginx.pid ] || kill -USR1 `cat /tmp/nginx.pid`
                MORE THINGS
            endscript
        }

    The rotation itself works on both files, but then nginx stops logging into them. Both files are created, but they are empty. Any ideas why nginx stops logging to both files?
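
    A hedged guess at the usual culprit (not from the original thread): if logrotate recreates the logs with an owner the nginx workers cannot write as, or the USR1 reopen signal never reaches the master, the new files stay empty. One sketch is to rotate both logs in a single stanza and recreate them explicitly as nginx's own user:

        /tmp/data.log /tmp/main.log {
            rotate 90
            daily
            missingok
            notifempty
            sharedscripts
            # assumed nginx user/group - match the "user" directive in nginx.conf
            create 0640 www-data adm
            postrotate
                [ ! -f /tmp/nginx.pid ] || kill -USR1 `cat /tmp/nginx.pid`
            endscript
        }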

    Read the article

  • Linux program unable to access files in group

    - by user1064665
    I'm having trouble configuring things on Linux so that a program can access certain files. Let's call it pgm A. It has uid uA and gid gA. In addition, uid uA is listed in /etc/group as a member of group gX. The problem is that pgm A cannot access files whose owner is root and whose group is gX - but only when pgm A is launched by another program, pgm B, which also runs as user uA. If I su to user uA and run pgm A from bash, it has no problem accessing files in group gX. But if pgm B, which also runs as user uA, forks and execs pgm A, then pgm A cannot access the files. I've verified that pgm A is indeed running as user uA, group gA, when launched from pgm B. So, if uA is a member of group gX, why can't the program access files which are readable by group gX? It's as if the operating system is ignoring the fact that user uA is also in group gX.
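
    A sketch of the likely explanation (an assumption, not from the original post): supplementary groups are attached to a process at login via initgroups(3), are inherited across fork/exec, and are never re-read from /etc/group. If pgm B was started before uA was added to gX, or if it switches to uA with setuid() alone, its children never acquire gX. Comparing the two views makes this visible:

        # what /etc/group says uA should have
        id uA

        # what the running pgm A actually has (12345 is a hypothetical PID)
        grep '^Groups:' /proc/12345/status

    If gX is missing from the second listing, restart pgm B from a fresh login, or have it call initgroups() before dropping privileges.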

    Read the article

  • How do I set the TEMP environment variable for the "Network Service" user?

    - by Chris Phillips
    We have a system that uses Path.GetTempFileName and Path.GetTempPath calls to work with temporary files fairly frequently. This system also runs as the "Network Service" user. We're finding that we're running out of room on the C: drive (for other reasons - our temp files are cleaned up correctly) and would like to move the temp directory to a different drive. The easiest solution seems to be to change the TMP or TEMP environment variables for the Network Service user, but I only seem to be able to set my own user variables, or the "system" variables, which are overridden by the Network Service user profile. How do I set these variables for the Network Service user?
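
    A sketch of one known approach (verify on a test machine first): Network Service has the well-known SID S-1-5-20, and its per-user environment variables live in the registry under that SID, so they can be set directly:

        :: D:\Temp is a placeholder; the directory must exist and be writable
        :: by Network Service, and the service must be restarted afterwards
        reg add "HKU\S-1-5-20\Environment" /v TMP /t REG_EXPAND_SZ /d "D:\Temp" /f
        reg add "HKU\S-1-5-20\Environment" /v TEMP /t REG_EXPAND_SZ /d "D:\Temp" /f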

    Read the article

  • Get Zipped Logs from a Remote Server

    - by Jonathan
    I am tasked with finding a way to download zipped logs from a remote server. There are quite a few of these logs, and they are constantly being created. I have limited ssh access to the remote server and can scp or rsync the files. However, due to the sheer size of the logs, I do not want to rsync all of them; they could reach terabytes, and comparing them all could take rsync a long time. I only want to fetch files that were created or last updated within the last hour. I am also worried about rsyncing logs that are still being written, so I was thinking of only transferring files last modified more than 3-5 minutes ago. Would anyone be so kind as to help me with such a process? Thank you in advance.
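
    A minimal sketch (host and paths are placeholders): let find on the remote side select files modified between roughly 5 and 60 minutes ago, then feed that list to rsync:

        # -mmin -60: modified less than an hour ago
        # -mmin +5:  but not in the last 5 minutes (likely still being written)
        ssh user@loghost 'cd /var/log/archive && find . -type f -name "*.gz" -mmin -60 -mmin +5' |
            rsync -av --files-from=- user@loghost:/var/log/archive/ /local/logs/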

    Read the article

  • How to rename multiple files in multiple folders with 1 command

    - by Charles
    We want to rename our *.html files to *.php but (sadly enough) do not have enough knowledge to do it with a DOS batch file and/or a cmd prompt command. The problem is that each file is in a separate folder - and yes, we are talking about 1500+ different folder names. I know '*' is the wildcard for files, but using a wildcard for folders as well is unknown to me. We probably need the (MS-DOS) 'FOR' command, but that is where I am stuck. The folder structure we use is: parent-folder/child-folder/grandchild-folder/file.html. Sample: games/A/game_name/file.html, games/B/game_name/file.html, games/C/game_name/file.html and so on. The parent folder is the same for all files; the child and grandchild folders differ for most files. After renaming these files to .php, I assume the following in .htaccess will make a permanent redirect:

        RedirectMatch 301 (.*)\.html$ http://oursite.com$1.php

    Looking forward to suggestions/answers, thnx in advance.
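
    A sketch of the recursive rename (test on a copy first): FOR /R walks every subfolder, and %~nF expands to a file name without its extension:

        @echo off
        rem C:\site\games is a placeholder for the actual parent folder
        rem (in a batch file use %%F; directly at the prompt use %F)
        for /r "C:\site\games" %%F in (*.html) do ren "%%F" "%%~nF.php"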

    Read the article

  • How to make scp copy hidden files?

    - by rascher
    I often use scp to copy files around - particularly web-related files. The problem is that whenever I do this, I can't get the command to copy hidden files (e.g., .htaccess). I typically invoke this: scp -rp src/ user@server:dest/ This doesn't copy hidden files. I don't want to have to invoke it a second time with something like scp -rp src/.* - and that has strange . and .. implications anyway. I didn't see anything in the scp man page about including hidden files. How can I accomplish this?
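
    Two common workarounds (a sketch, not from the original post):

        # copy from inside the directory, so '.' itself - dotfiles included - recurses
        cd src && scp -rp . user@server:dest/

        # or use rsync, which includes hidden files by default
        rsync -av src/ user@server:dest/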

    Read the article

  • Hadoop Rolling Small files

    - by Arenstar
    I am running Hadoop on a project and need a suggestion. By default Hadoop has a "block size" of around 64 MB, and the usual advice is to avoid many small files. Due to the application design of Flume, I currently have very, very small files being written into HDFS. The problem is that Hadoop <= 0.20 cannot append to files, and I have too many files for my map-reduce jobs to run efficiently. There must be a correct way to simply roll/merge roughly 100 files into one, so that Hadoop effectively reads 1 large file instead of 100. Any suggestions?
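
    Two stock tools that may fit (a sketch; the HDFS paths are placeholders): hadoop archive packs many small files into a single HAR file that map-reduce can still read through, and getmerge concatenates a directory into one file that can be re-uploaded:

        # pack the small files into a Hadoop archive
        hadoop archive -archiveName flume.har -p /flume/incoming /flume/archived

        # or concatenate them via the local filesystem and push the result back
        hadoop fs -getmerge /flume/incoming /tmp/merged.log
        hadoop fs -put /tmp/merged.log /flume/merged/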

    Read the article

  • Automatically copy files out of directory

    - by wizard
    I had a user's laptop stolen recently during shipping; it was set up with Windows Live Sync. The thief (or the buyer's kids) took some photos of themselves, and those synced into the user's My Documents. I had just finished moving the user's files out of the synced My Documents folder when I noticed this. Later they took some more photos and a video. I wrote a batch script to copy files out of the synced directory every 5 minutes into a dated directory. In the end I had a lot of copies of the same few files. Ignoring what Windows Live Sync offers (at the time there was no way to undelete files - I've moved on to Dropbox, so this isn't really an issue for me), what's the best way to preserve changes and files from a directory? I'm interested in Windows solutions, but if you know of a good way on *nix please go ahead and share.
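
    On the *nix side, one sketch is rsync's --backup mode, which files away every version that would otherwise be overwritten or deleted instead of discarding it (paths are placeholders):

        # each run stores displaced versions under a per-run timestamped dir
        rsync -av --backup --backup-dir="../versions/$(date +%F_%H%M)" \
            /watched/dir/ /archive/current/

    Only files that changed since the previous run land in the new versions directory, so identical files are not duplicated across runs.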

    Read the article

  • Converting Visio (.vsd) files to pdf automatically

    - by Aseques
    I am trying to create a scheduled task to convert all my .vsd files to PDF so all of our devices can read them (Linux, Mac, smartphones, etc.), and I would prefer not to pay for something that can be done with Visio + PDFCreator. The OpenOffice approach doesn't work with .vsd files, since it is not a supported format (Method/tools for batch-converting Microsoft Word files into PDF?). What I currently have is this:

        'C:\Program Files\Microsoft Office\Visio11\VISIO.EXE' /pt "Z:\Archive\Files.vsd",-PPDFCREATORPRINTER /nologo

    That opens the document automatically and prepares it for printing; the only missing part is that it still makes me confirm the print dialog. There's some information here: http://support.microsoft.com/kb/314392 but it doesn't explain non-interactive printing.

    Read the article

  • rename multiple files with unique name

    - by psaima
    I have a tab-delimited list of hundreds of names in the following format:

        old_name    new_name
        apple       orange
        yellow      blue

    All of my files have unique names, end with the .txt extension, and are in the same directory. I want to write a script that renames the files by reading my list, so apple.txt should be renamed to orange.txt. I have searched around but couldn't find a quick way to do this. I can change one file at a time with 'rename' or with perl ("perl -p -i -e 's///g' *.txt"), and a few files with sed, but I don't know how to use my list as input in a shell script that makes the changes for all files in the directory. I don't want to write hundreds of rename commands. Any suggestions will be most welcome!
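
    A minimal sketch (map.tsv is a placeholder name for the list, with the header line removed; prefix mv with echo for a dry run):

        #!/bin/bash
        # rename old.txt -> new.txt for every tab-separated pair
        while IFS=$'\t' read -r old new; do
            mv -- "$old.txt" "$new.txt"
        done < map.tsv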

    Read the article

  • excel date range help please

    - by Mark
    I need help with either a formula or a macro to automate the dates on a grade sheet. We have class every Monday and Wednesday only. I would like to look up each quarter's date range from an input table (for example, Sept. 10 - Oct. 24) and have the sheet automatically insert the date of every Monday and Wednesday in a row at the top of my grade sheet. Every year I use the same Excel workbook, which I built to average and rate the grades, with no problem; however, I can't seem to get this one right. Currently I have to enter each date by hand. Any help would be greatly appreciated. Thanks again.
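
    A sketch of one formula-only approach (assuming the quarter's start date sits in B1 and falls on a Monday; cell positions are placeholders): put =B1 in the first date cell, then fill this to the right. WEEKDAY returns 2 for a Monday, so the formula steps +2 days from a Monday to the Wednesday, and +5 days from a Wednesday to the next Monday:

        =IF(WEEKDAY(B1)=2, B1+2, B1+5)

    Entered in C1 and filled right, each cell yields the next class date; stop filling once the dates pass the quarter's end.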

    Read the article

  • OS X - Automatically Set Execute Permissions for New Files?

    - by i help X u
    I'm using OS X 10.6.4 and am trying to get a folder to automatically enable the execute permission on new script files copied into or created in it. I have used Sandbox 2 to enable every permission for the folder, with sticky bits and the inherit flag set, but I still have to set the execute flag manually with chmod for every new file. I've done: chmod -R a+rwxs ~/scripts I've done: chmod 7777 ~/scripts And the permissions for the folder show as drwsrwsrwt+. But if I add a new script file it's set to -rw-r--r--+ (the default). I looked at setting "umask 000" in the .profile file, but the default mode for new files is 666 with a umask of 022, so that's not relevant, since I would need a default of 777 for files. I have figured out how to automate this with chmod in an AppleScript triggered by a folder action, but I'm wondering if there is a simple ACL or chmod setting I'm missing. So, is there a way to automatically set the execute permission for new files (without a folder action and AppleScript)?
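
    One hedged possibility (untested; "yourname" is a placeholder for the account's short name): OS X ACLs support inheritance flags, so an allow-execute entry with file_inherit on the folder should stamp new files with an inherited execute ACE even though their POSIX mode stays 644:

        # grant an inherited execute ACE to everything created in ~/scripts
        chmod +a "user:yourname allow execute,file_inherit,directory_inherit" ~/scripts

    Note this only affects files created after the entry is added.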

    Read the article

  • Access denied to EFS encrypted files after PC joins domain

    - by mjmarsh
    I'm experiencing strange behavior with the Windows Encrypting File System. I have a machine that is in workgroup mode (not joined to a domain). I encrypt an entire directory structure on the machine (basically a folder and subfolders with data files for my application). My application writes and reads files from the encrypted file hierarchy as a local Windows user (let's call the account 'SecureUser'). This works fine. I then join the PC to a domain (let's call it 'TEST'). Afterwards, processes running as the local 'SecureUser' account can't read the files it wrote originally when it was off the domain. (What is also strange is that the files are now listed as "read only", and I cannot clear this flag via Windows Explorer or the command line, even though the operation appears to succeed.) I then un-join the PC from the domain and everything works again. Is there something about changing a PC's domain membership that changes the behavior of EFS so that previously encrypted files cannot be read, even by the originating user? Thanks in advance
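
    One hedged diagnostic (not from the original post; the file path is a placeholder): cipher can show which certificate thumbprint a file was encrypted under, which reveals whether the domain join switched 'SecureUser' to a different profile and hence a different EFS key:

        :: list the certificate thumbprint(s) that can decrypt the file
        cipher /c "C:\data\somefile.dat"

    If the thumbprint differs from the one in SecureUser's current Personal certificate store (certmgr.msc), the account is using a new profile and the old key needs to be exported and re-imported.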

    Read the article

  • Access to certain files but not others

    - by ADW
    Hoping someone can help me, as I have thus far been unable to solve this issue. I am running a media center on Ubuntu 12.04. I was initially able to access media files on the desktop running Ubuntu from my Windows 7 laptop and a Roku device. Then I started backing up a new batch of DVDs I had (into MKV files, like everything else in my media folders) and noticed I cannot access the new files from either the Roku or the laptop. I have not changed any settings on the media folder, and I have verified the share permissions. The parent folder (Media) is shared (with permission flow-down) while the subfolders (Movies, TV Shows, Music) are not. When the access problem arose, I changed the permissions on these to shared as well, but with no success. I can only access the original files, not the newly added ones. Any suggestions? Thanks in advance for any and all help.
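
    A hedged first check (assuming the share is served by Samba and the new rips were written by a different user or a stricter umask; paths are placeholders): compare the Unix permissions of an old, working file against a new one, and open up read access if they differ:

        ls -l /path/to/Media/Movies/old_movie.mkv /path/to/Media/Movies/new_movie.mkv

        # a+rX adds read everywhere, and execute only on directories
        # and files that are already executable
        chmod -R a+rX /path/to/Media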

    Read the article

  • TrueCrypt Corrupted Files

    - by B. Knight
    Several months ago, I needed to reorganize my data across multiple external hard drives, with my laptop's primary hard drive as the go-between. My external hard drives are all encrypted with TrueCrypt. It appears that somehow, during the transfer between the encrypted external drive and the unencrypted internal drive, the files were transferred "as-is" (in their encrypted state). The files range from very small to very large, and it appears this may have happened during one consecutive transfer session. Has anyone ever experienced this problem, and if so, were you able to fix it? Is there a way to recreate the encrypted partition, transfer the files in, and then decrypt them back to their usable state? Or can the files somehow be decrypted by other means? UPDATE: I am running Windows 7 (x64) Home Premium now, but may have been running Enterprise then. Toshiba laptop, 650 GB HDD, 4 GB RAM. Latest version of TrueCrypt.

    Read the article

  • Move files contained in a certain dir to the previous one (centOS)

    - by Alex
    I will try to explain my problem (sorry for my bad English). I have an image gallery with a directory structure like this:

        images/dir1/subdir1/IMG/files.jpg
        images/dir1/subdir2/IMG/files.jpg
        images/dir1/subdir3/IMG/files.jpg
        images/dir2/subdir1/IMG/files.jpg
        .......
        images/dir109/subdir1/IMG/files.jpg

    The directory named images contains 109 dirs (dir1, dir2, ... dir109). Those 109 dirs have 1200 subdirs inside in total, and every subdir contains a dir named IMG with images in it (file1.jpg, file2.jpg, etc.). I would like to move all the images contained in every dir named IMG into its parent subdir, to get something like this:

        images/dir1/subdir1/file1.jpg
        images/dir1/subdir1/file2.jpg
        images/dir1/subdir2/file1.jpg
        ........
        images/dir109/subdir1/file.jpg
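
    A minimal sketch of the move (run from the directory that contains images; put echo in front of mv to preview first):

        # for every IMG directory three levels down, move its files up one level
        for d in images/*/*/IMG; do
            mv "$d"/* "${d%/IMG}"/ && rmdir "$d"
        done

    ${d%/IMG} strips the trailing /IMG, so the files land in the enclosing subdir; rmdir then removes each emptied IMG directory.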

    Read the article

  • Cannot apply unity --reset after modifying files

    - by Alex Cline
    So I have an idea of what I did wrong, I am just not sure how to fix it. I used the Unity Glass mod: http://www.omgubuntu.co.uk/2012/07/unity-glass-offers-refined-new-look-for-the-unity-launcher After removing it, I cannot reset Unity, and it does not work. Even after purging Unity and reinstalling it, I cannot seem to replace the missing files.

        $ unity --reset
        WARNING: Unity currently default profile, so switching to metacity while resetting the values
        unity-panel-service: no process found
        Checking if settings need to be migrated ...no
        Checking if internal files need to be migrated ...no
        Backend : gconf
        Integration : true
        Profile : unity
        Adding plugins
        Initializing core options...done
        compiz (core) - Warn: failed to receive ConfigureNotify event on 0x1c00027
        Initializing composite options...done
        Initializing opengl options...done
        Initializing decor options...done
        Initializing vpswitch options...done
        Initializing snap options...done
        Initializing mousepoll options...done
        Initializing resize options...done
        Initializing place options...done
        Initializing move options...done
        Initializing wall options...done
        Initializing grid options...done
        I/O warning : failed to load external entity "/home/arcline/.compiz/session/10b624e5c8f98c5325134625607758338300000051770001"
        Initializing session options...done
        Initializing gnomecompat options...done
        Initializing animation options...done
        Initializing fade options...done
        Initializing unitymtgrabhandles options...done
        Initializing workarounds options...done
        Initializing scale options...done
        compiz (expo) - Warn: failed to bind image to texture
        Initializing expo options...done
        Initializing ezoom options...done
        (compiz:7038): Gtk-WARNING **: Theme parsing error: gnome-panel.css:28:11: Not using units is deprecated. Assuming 'px'.
        (compiz:7038): GConf-CRITICAL **: gconf_client_add_dir: assertion `gconf_valid_key (dirname, NULL)' failed
        Segmentation fault (core dumped)
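
    A hedged suggestion for this vintage of Ubuntu (12.04, gconf-backed Compiz; the commands are standard for that release, but this fix is not guaranteed): purging packages removes files under /usr, not the per-user configuration in $HOME, so the mod's leftover settings survive a reinstall and have to be cleared by hand:

        # clear compiz/unity settings stored in gconf, plus local compiz state
        gconftool-2 --recursive-unset /apps/compiz-1
        rm -rf ~/.compiz ~/.cache/compizconfig-1 ~/.config/compiz-1
        setsid unity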

    Read the article

  • Uploading.to Uploads Files to Multiple File Hosts Simultaneously

    - by Jason Fitzpatrick
    If you're looking to quickly share a file across a variety of file hosting services, Uploading.to makes it a cinch to share up to 10 files across 14 hosts. The upload process is simple. Visit Uploading.to, select your files, check the hosts you want to share the file across (by default all 14 are checked), add a description to the collection, and hit the Upload button. Uploading.to will upload your file to the various hosts; during the process you'll see which hosts are confirmed and which have failed. We had 2 failures among the 14 hosts, which still left the file mirrored across a sizable 12-host spread - not bad at all. When you're ready to share the file, hit the Copy Link button at the bottom of the screen and share it with your friends. They'll be directed to Uploading.to and will be able to select any of the hosts the file was successfully mirrored across. Uploading.to is a free service and requires no registration. Uploading.to [via Addictive Tips]

    Read the article

  • How should bug tracking and help tickets integrate?

    - by Max Schmeling
    I have a little experience with bug tracking systems such as FogBugz, where help tickets are (or can be) bugs, and I have some experience using a bug tracking system internally that is completely separate from the help center system. My question is: in a company with an existing (home-grown) help center system, where replacing it is not an option, how should a bug tracking system (probably Mantis) be integrated into the process? Right now help tickets get filed for issues, questions, etc., and they get assigned to the appropriate person (a PC tech, help desk staff, or, if it's an application issue the help desk can't solve, a developer). A user can put a request for small modifications or fixes to an application in a help ticket, and the developer it gets assigned to will make the change at some point, log their time against that ticket, and then close the ticket when the change goes to production. We don't currently have a bug tracking system, so I'm looking into the best way to integrate one. Should we just copy a help ticket into the bug tracking system if it's a bug (or issue or feature request) and then close the ticket if it's not an emergency fix? We probably don't want to expose the bug tracking system to anyone else, as they wouldn't know what to put in the help center system versus the bug tracker... right? Any thoughts? Suggestions? Tips? Advice? To-dos? Not to-dos? etc...

    Read the article

  • HTML Parsing for multiple input files using java code [closed]

    - by mkp
        FileReader f0 = new FileReader("123.html");
        StringBuilder sb = new StringBuilder();
        BufferedReader br = new BufferedReader(f0);
        while ((temp1 = br.readLine()) != null) {
            sb.append(temp1);
        }
        String para = sb.toString().replaceAll("<br>", "\n");
        String textonly = Jsoup.parse(para).text();
        System.out.println(textonly);
        FileWriter f1 = new FileWriter("123.txt");
        char buf1[] = new char[textonly.length()];
        textonly.getChars(0, textonly.length(), buf1, 0);
        for (i = 0; i < buf1.length; i++) {
            if (buf1[i] == '\n')
                f1.write("\r\n");
            f1.write(buf1[i]);
        }

    I have this code, but it handles only one file at a time. I want to process multiple files: I have 2000 files, named by number from 1 to 2000 (e.g. "1.html"), so I want a loop like for (i = 1; i <= 2000; i++) that generates a separate .txt file for each input.
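
    A sketch of the looped version (assuming the files really are named 1.html through 2000.html in the working directory, Jsoup is on the classpath, and Java 7+ for Files.readAllBytes; it mirrors the original's <br> handling):

        import java.io.*;
        import java.nio.file.*;
        import org.jsoup.Jsoup;

        public class BatchParse {
            public static void main(String[] args) throws IOException {
                for (int i = 1; i <= 2000; i++) {
                    // read the whole HTML file
                    String html = new String(Files.readAllBytes(Paths.get(i + ".html")));
                    // replace <br> with newlines as the original did, then strip tags
                    String text = Jsoup.parse(html.replaceAll("<br>", "\n")).text();
                    // write one .txt per input, with Windows line endings
                    try (Writer out = new FileWriter(i + ".txt")) {
                        out.write(text.replace("\n", "\r\n"));
                    }
                }
            }
        }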

    Read the article
