Search Results

Search found 40229 results on 1610 pages for 'deleted files'.

Page 598/1610 | < Previous Page | 594 595 596 597 598 599 600 601 602 603 604 605  | Next Page >

  • How do I set up a permissions structure for multiple users editing multiple sites in /var/www on Ubuntu 9

    - by Michael T. Smith
    I'm setting up an Ubuntu server that will have 3 or 4 VirtualHosts that I want users to be able to work in (add new files, edit old files, etc.). I currently plan on storing the sites in /var/www but wouldn't be opposed to moving it. I know how to add new users, I know how to add new groups. I'm unsure of the best way to handle users being only able to edit some sites. I read over the answers here in this question, so I was thinking I could setup a group and add users to that group, but then they'd all have essentially the same permissions. Am I just going to have to assign each user specific permissions? Or is there a better way of handling this? Added: I should also note, that I'll have each user login in via SSH/sFTP. The users would never need to do anything else on the server.
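
    A common way to handle this is one Unix group per site, with the setgid bit on the site's directories - a sketch only, not the linked answers, and the group, user and site names below are made up:

      # One group per site; add only the users who should edit that site
      sudo groupadd site1-editors
      sudo usermod -a -G site1-editors alice
      sudo usermod -a -G site1-editors bob

      # Give that group write access to the site
      sudo chgrp -R site1-editors /var/www/site1
      sudo chmod -R g+rwX /var/www/site1

      # setgid on directories so new files inherit the site group
      sudo find /var/www/site1 -type d -exec chmod g+s {} \;

    Repeat per site, so each user's group memberships decide exactly which sites they can touch.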

    Read the article

  • rsync over SSH with cron in osx-environment

    - by Martin
    I want to automatically download files and folders from a Linux server to which I have an SSH (and FTP) account. The files shall be downloaded on a regular basis (I suppose a cron job is the right tool to do so) onto an OS X machine. I tried the following rsync command, which works fine:

      rsync -avzbe ssh [email protected]:/www/htdocs/something/somefolder /Users/me/folder/foo/

    However, I have to enter the account's password every time (the SSH account on the server machine). The server is a managed one and I'm afraid I can't change the password. Here are my questions: How do I bypass entering the password by storing it somewhere? And how do I then automate this correctly?
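
    A sketch of the usual route, assuming key-based SSH authentication is acceptable on the server (the user@host placeholder below is the same one as in the rsync command above):

      # Generate a key pair with an empty passphrase, then install the public key on the server
      ssh-keygen -t rsa
      ssh-copy-id [email protected]      # or append ~/.ssh/id_rsa.pub to ~/.ssh/authorized_keys on the server

      # Schedule the job with cron (crontab -e), e.g. every night at 02:30
      30 2 * * * rsync -avzbe ssh [email protected]:/www/htdocs/something/somefolder /Users/me/folder/foo/

    On OS X, launchd is the native alternative to cron, but a plain crontab entry like the one above also works.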

    Read the article

  • Export a single layer as an image in Photoshop

    - by wrburgess
    I have a lot of designers send me layered PSDs of their designs and I need to break out the pieces of the designs to place on web pages. I can do a decent number of things in Photoshop, but I'm hardly efficient with it. My old way of just copying the image that's in a layer and pasting into a new image seems to take forever as I screw around with cropping and such. I've got Photoshop CS5, so I don't need external software to do anything, but I just need to figure out how to take a single layer, that may hold something small like an icon, and export it as a PNG or JPG. I am aware of the script called "Export Layers to Files" but it took about an hour and exported ALL of my layers to a huge number of files. I wasn't looking for a solution that broad. Is there an easy way to do this?

    Read the article

  • logrotate by size outside the daily schedule

    - by Josh Smeaton
    We have a couple of applications that generate huge log files. It's not enough to rotate those logs daily, so I created the following logrotate conf:

      /var/log/ourapp/*log {
          compress
          copytruncate
          missingok
          size 200M
          rotate 10
      }

    The idea is that we can keep 2GB of logs for this one application, no matter how quickly those files are filling up. The problem, though, is that logrotate only runs once daily. AFAIK, when logrotate kicks off at 4am, it will check to see that the size is at least 200M and rotate it if so. Ideally logrotate would run every minute, check the size, and rotate if the size is greater. Is there a standard way for rotating based on size outside of the daily cron schedule?
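
    One hedged approach, not the article's answer: drive this one config from its own cron entry with a separate state file, so it runs more often than the daily job. The config path below is an assumption; keeping it outside /etc/logrotate.d/ stops the daily run from also picking it up:

      # /etc/cron.d/ourapp-logrotate
      # Every 5 minutes; logrotate only rotates when the "size 200M" condition is met.
      */5 * * * * root /usr/sbin/logrotate -s /var/lib/logrotate/ourapp.state /etc/logrotate-ourapp.conf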

    Read the article

  • Install problems with XSendFile on Ubuntu

    - by Dan
    I installed the Apache dev headers:

      sudo apt-get install apache2-prefork-dev

    Downloaded and compiled the module as outlined here: http://tn123.ath.cx/mod_xsendfile/

    Added the following line to /etc/apache2/mods-available/xsendfile.load:

      LoadModule xsendfile_module /usr/lib/apache2/modules/mod_xsendfile.so

    Added this to my VirtualHost:

      <VirtualHost *:80>
          XSendFile on
          XSendFilePath /path/to/protected/files/

    Enabled the module by doing:

      sudo a2enmod xsendfile

    Then I restarted Apache. But this code still just provides me with an empty file of 0 bytes:

      file_path = '/path/to/protected/files/some_file.zip'
      file_name = 'some_file.zip'
      response = HttpResponse('', mimetype='application/zip')
      response['Content-Disposition'] = 'attachment; filename=%s' % smart_str(file_name)
      response['X-Sendfile'] = smart_str(file_path)
      return response

    And there is nothing in the Apache error log that pertains to XSendFile. What am I doing wrong?
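
    Not an answer from the article, just a hedged checklist for this kind of silent failure - confirm the module is actually loaded and watch Apache's debug output while requesting the file:

      # Is mod_xsendfile in the loaded module list?
      apache2ctl -M | grep -i xsendfile

      # Temporarily raise the log level in the VirtualHost (LogLevel debug),
      # restart Apache, then watch the error log while downloading:
      sudo tail -f /var/log/apache2/error.log

    If the module is loaded and nothing shows up even at debug level, the X-Sendfile header may be getting stripped before it reaches Apache (for example by an intermediate proxy or a response middleware).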

    Read the article

  • Need some help with Apache .htaccess

    - by Legend
    I am trying to set up an application that was built using the Zend Framework. Let's say my subdomain is http://subdomain.domain.com and that it points to http://www.domain.com/projectdir/. The structure of the project dir is the following:

      application/
      ...
      ...
      library/
      ...
      ...
      public/
      ...
      ...
      .htaccess

    The contents of the .htaccess are:

      SetEnv APPLICATION_ENV production
      RewriteEngine On
      # skip existing files and folders
      RewriteCond %{REQUEST_FILENAME} -s [OR]
      RewriteCond %{REQUEST_FILENAME} -l [OR]
      RewriteCond %{REQUEST_FILENAME} -d
      RewriteRule ^.*$ - [NC,L]
      # send everything to index
      RewriteRule ^.*$ index.php [NC,L]

    While this works, the child objects on the page are being directed to the domain, i.e. the image URLs (and the CSS files etc.) are broken because they are being redirected to something like http://www.domain.com/images/image.png. Can someone please tell me how to fix this?
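
    Not the article's answer - a sketch of one common fix: serve the subdomain with its DocumentRoot pointing straight at the project's public/ directory, so that root-relative URLs such as /images/image.png resolve inside the project. The file system path below is an assumption:

      <VirtualHost *:80>
          ServerName subdomain.domain.com
          DocumentRoot /var/www/projectdir/public
          <Directory /var/www/projectdir/public>
              AllowOverride All
          </Directory>
      </VirtualHost>

    After enabling that vhost and reloading Apache, the existing .htaccess in public/ keeps working unchanged.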

    Read the article

  • Converting a Word document to LaTeX format

    - by Mehper C. Palavuzlar
    I'm preparing a book to be published and keeping everything in .docx files. Other than text, the files include graphs (JPEG) and lots of equations typed in MathType. Since MS Word is not well suited to balancing text and shapes for a book layout, some pages end up with blank space at the bottom after the text, and the shape then lands on the next page. I know that LaTeX is very good at formatting, so is it possible to convert MS Word documents (or PDF documents, since I can easily convert them to PDF) into LaTeX format so that I can handle my work in LaTeX from now on?
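
    One hedged starting point (not from the article) is pandoc, which can read .docx and write LaTeX; MathType equations and fine-grained layout will almost certainly still need manual cleanup. Filenames are placeholders:

      # Standalone .tex with a preamble, one chapter at a time
      pandoc -s chapter01.docx -o chapter01.tex

      # Also pull the embedded images out next to the .tex file
      pandoc -s chapter01.docx --extract-media=media -o chapter01.tex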

    Read the article

  • Can you set CIFS permissions from the EMC command line?

    - by TJ.
    I am in the process of migrating file shares from my EMC NS-20 to my new VNXe 3100. I am using a RoboCopy script to move the files but am getting errors on some files and folders. I have Domain Admin privileges but when I go to view the security permissions on the folders it says I don't have permissions. I have tried taking ownership to get around the permissions issue but that fails too. So as a last resort can I set permissions on this folder from the EMC console or Web management console?

    Read the article

  • Mountable online storage (no syncing)

    - by Sam
    I have a Linux VPS that I would like to turn into a media server. Like most cheap VPSes, it has a fairly small storage capacity. What I would like to do is attach the box to an online backup system such as SpiderOak or Dropbox, where the files would reside and be directly accessible to either a webserver or media server software. Since the VPS HDD is small, I do not want the files to be synced to it. I would like a storage system that is online only, ideally mountable like a network drive. Are there any services that suit my needs, or workarounds for services such as SpiderOak that do not require syncing?
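
    A hedged sketch of one direction, not from the article: pick a provider that exposes SSH or WebDAV (SpiderOak and Dropbox generally don't, so the provider choice matters) and mount it like a network drive, so nothing has to sync to the small VPS disk. Host and paths are placeholders:

      # SSH-accessible storage, mounted with sshfs
      sudo apt-get install sshfs
      mkdir -p /mnt/media
      sshfs user@storagehost:/data /mnt/media -o reconnect,allow_other

      # WebDAV-accessible storage, mounted with davfs2
      sudo apt-get install davfs2
      sudo mount -t davfs https://storagehost/dav /mnt/media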

    Read the article

  • How do I correct a directory incorrectly copied into itself?

    - by Peter Boughton
    Given the following situation...

      <path>/mydir1/mydir2

    ...where mydir2 should have overwritten mydir1, but was instead placed inside, and both directories actually have the same filename. How is that fixed? Attempting to do

      mv <path>/mydir/mydir/* <path>/mydir/

    or

      mv <path>/mydir <path>/

    results in:

      mv: cannot move `<path>/mydir/mydir` to a subdirectory of itself, `<path>/mydir`

    This seems stupidly simple, but it's late here and I can't figure it out. There are seventeen such directories to fix (path differs for each, but same mydir name). To confirm, the error message can be caused with this:

      # cd /path/to/directory
      # mv mydir/mydir ./
      mv: cannot move `mydir/mydir' to a subdirectory of itself, `./mydir'

    Also tried:

      # mv mydir/mydir/* mydir/
      mv: cannot move `mydir/mydir/otherdir1' to a subdirectory of itself, `mydir/otherdir1'
      mv: cannot move `mydir/mydir/otherdir2' to a subdirectory of itself, `mydir/otherdir2'

    and...

      # mv /path/to/directory/mydir/mydir/otherdir1 /path/to/directory/mydir/
      mv: cannot move `/path/to/directory/mydir/mydir/otherdir1' to a subdirectory of itself, `/path/to/directory/mydir/otherdir1'

    and using a temporary directory:

      # mv mydir/mydir ./mydir-temp
      # mv mydir-temp/* mydir/
      mv: cannot move `mydir-temp/otherdir1' to a subdirectory of itself, `mydir/otherdir1'
      mv: cannot move `mydir-temp/otherdir2' to a subdirectory of itself, `mydir/otherdir2'

    I found a similar question, "How to recursively move all files (including hidden) in a subfolder into a parent folder in *nix?", which suggested that mv bar/{,.}* . would do this. But this also gives the same errors, as well as confusingly picking up . and .. from somewhere.

      # cd mydir
      # mv mydir/{,.}* .
      mv: cannot move `mydir/otherdir1' to a subdirectory of itself, `./otherdir1'
      mv: cannot move `mydir/otherdir2' to a subdirectory of itself, `./otherdir2'
      mv: cannot move `mydir/.' to `./.': Device or resource busy
      mv: cannot move `mydir/..' to `./..': Device or resource busy
      mv: overwrite `./.file'? y

    Another similar question, "linux mv command weirdness", suggests that mv doesn't overwrite and a copy is required.

      # cd mydir
      # cp -rf ./mydir/* ./
      cp: overwrite `./otherdir1/file1'? y
      cp: overwrite `./otherdir1/file2'? y
      cp: overwrite `./otherdir1/file3'?

    This appears to be working... except there's a lot of files (and dirs) - I don't want to confirm every one! Isn't the f there supposed to prevent this? Ok, so cp was aliased to cp -i (which I found out with type cp), and bypassed by using \cp -rf ./mydir/* ./ which seems to have worked. Although I've solved the problem of getting dirs/files from one place to another, I'm still curious as to what's going on with the mv stuff - is this really a deliberate feature as suggested by Warner?
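
    For what it's worth, a hedged sketch of the usual way around the "subdirectory of itself" error (not quoted from the article): rename the outer directory out of the way first, so the inner one no longer collides with its own name, then merge or delete whatever was left in the old parent:

      cd /path/to/directory
      mv mydir mydir-old           # outer directory out of the way
      mv mydir-old/mydir ./mydir   # inner directory takes its place
      # check mydir-old for anything that only existed in the outer copy,
      # merge it back if needed, then:
      rm -r mydir-old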

    Read the article

  • Best Amazon S3 File Manager Utility?

    - by mmacaulay
    Ever since I started using Amazon's S3 service I've been struggling to find a good solution for simple file management, without having to write my own app to browse my buckets, upload and delete files, etc. The best I've found so far to do this is S3Fox. But it's far from perfect: it has problems deleting files and folders. Comments on the Firefox plugin page indicate I'm not the only person with this problem, and the developer does not respond to emails. I've looked around briefly, but couldn't find anything that looked any better than S3Fox. Please tell me there's a better way! Edit (07/26/2009): The Firefox S3Fox extension seems to be getting love from its developer again; the problems I was having before have gone away, and I'm using it on a regular basis now with no problems!
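
    For anyone who ends up preferring the command line over a GUI, a hedged alternative (not an endorsement from the article) is s3cmd; bucket and file names below are placeholders:

      # One-time credential setup
      s3cmd --configure

      # Day-to-day file management
      s3cmd ls s3://my-bucket/
      s3cmd put report.pdf s3://my-bucket/docs/
      s3cmd get s3://my-bucket/docs/report.pdf
      s3cmd del s3://my-bucket/docs/report.pdf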

    Read the article

  • Opening password protected Excel 2007 documents by double clicking from My documents does not work u

    - by erik-van-gorp
    When all of the following conditions are true, Excel will open (most of the time) but will not open the document itself. No error is displayed. This only occurs with Excel files; Word and PowerPoint documents open perfectly. Conditions:

      - OS is "Windows 7 Professional 64-bit"
      - Office is "Office 2007 Ultimate"
      - the Excel file is in .xls (2003) format
      - the Excel file is password protected
      - the Excel file is in "My Documents" (or a subfolder of it)
      - the file is double-clicked from Explorer under Windows 7

    The following do open the Excel file as it should:

      - right-clicking and selecting the (bold) Open action
      - single-clicking the file and pressing Enter
      - moving the file to the desktop and double-clicking it
      - non-password-protected files do open from the same directory

    Actions taken that did not resolve the problem: reboot, repairing the Office installation, System Restore (does not work because of the antivirus application installed - message from System Restore, using "Symantec Internet Security 2010"). Anyone have any idea?

    Read the article

  • Syslog permissions

    - by Niels Kristian
    I'm using the $InputFile facility in rsyslog to monitor various log files scattered around my Ubuntu 12.04 server, e.g. nginx, unicorn, rails, postgres, cron etc. Now my problem is that some of these log files are created with -rw-r----- permissions, so rsyslog doesn't have read access. I install most of the programs using apt-get and therefore didn't change anything from the defaults. So, in other words, I would like not to modify every single log file / daemon to have the right permissions, but instead give syslog read access to all of them at once. The question is - can I do that, and is it the "right thing to do"?
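
    A hedged sketch of the usual Ubuntu route, not quoted from the article: many packaged daemons write their logs with group adm (or a group of their own), so adding the syslog user to those groups avoids touching every file. The group names below are assumptions - check with ls -l first:

      # See which group actually owns a given log
      ls -l /var/log/nginx/access.log

      # Let the syslog user read group-adm logs, then restart rsyslog
      sudo usermod -a -G adm syslog
      sudo service rsyslog restart

      # Repeat for daemons that use their own group, e.g.
      sudo usermod -a -G postgres syslog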

    Read the article

  • How to make the start menu find a program based on a custom keyword?

    - by Pierre-Alain Vigeant
    I am searching for a way to type a keyword in the start menu "Search programs and files" field so that it will return the application that matches the keyword. An example will explain this better: Suppose that I want to start PowerShell. Currently what I can type in the search field is power, and the first item that appears is the 64-bit PowerShell shortcut. Now suppose that I'd like ps to return PowerShell as the first item of the search list. Currently, typing ps returns all files with the .ps extension, along with a Control Panel option about recording steps, but not the PowerShell executable itself. How can I do that?

    Read the article

  • External HDD is always in use when trying to safely remove

    - by Mario De Schaepmeester
    I have a WD 1TB Elements external hard drive and every time I use the Windows 7 "safely remove" feature, it gives me a dialog telling me that a process is using the disk. Using Sysinternals Process Explorer and the answer on this question (find everything with the drive letter), the result I get points at the $Extend folder. What is the $Extend folder and why is it in use? How can I disable it? I cannot remove it using the command line (access denied). Edit: I've followed the instructions over here, and under the registry key HKLM\SYSTEM\CurrentControlSet\Control\BackupRestore\FilesNotToBackup I have a Multi-String Value named IgnoreNTFS with data \$Extend* /s. But this does not make any difference. Also, this question is not about a server. Additionally, I can tell that I use a program called mkv2vob to convert video files with a Matroska container into something my PS3 will play. I convert the source files straight from my external HDD, but even if this program did not release its lock on the HDD, surely the drive cannot still be locked when the process isn't running?

    Read the article

  • CentOS / Redhat: Give file permission for apache and vsftp

    - by paskster
    I use CentOS 5.5 and the Apache webserver on my dedicated server. My folder "/var/www/myWebApp" is owned by apache, so that apache can read, write logs, etc. But now I would like to use very secure FTP (vsftp) to upload my new files. I used to give every user rwx access to "/var/www/myWebApp", but I guess this is way too insecure. On CentOS I created another user, "ftpuser", for uploading files, which has "/var/www/myWebApp" as its home directory. How can I give him permission to write into "/var/www/myWebApp" without giving every user the same rights?
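
    A hedged sketch of the usual group-based setup (not taken from the article; it assumes Apache on CentOS runs as the apache user/group):

      # Put the FTP user in the apache group and give the group write access
      sudo usermod -a -G apache ftpuser
      sudo chgrp -R apache /var/www/myWebApp
      sudo chmod -R g+rwX /var/www/myWebApp

      # setgid on directories so uploaded files keep the apache group
      sudo find /var/www/myWebApp -type d -exec chmod g+s {} \;

      # Drop access for everyone else
      sudo chmod -R o-rwx /var/www/myWebApp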

    Read the article

  • Using the right folder for the right job. Article link, please?

    - by Droogans
    There are specific folders designed for specific tasks. /var/www holds your web sites, /usr/bin contains files to run your applications... yet I still find myself putting nearly all of my work in ~. Is it possible to overuse my home directory? Will it come back to haunt me? Anyone have a good link to an article on best practices for organizing your files so that they are placed in their "correct" place? Is there even such a thing in Linux? I am referring specifically to user-generated content. I do not compile applications from source; I use apt-get for those tasks. This article has a great introduction to what I'm looking for. Table 3-2, "Subdirectories of the root directory", is the sort of thing I'm looking for, but with more details/examples.

    Read the article

  • Get the number of tiffs in a multi-tiff with command line ImageMagick?

    - by Anders
    Is there a way to get the number of tiffs in a multi-tiff with a command line utility in ImageMagick? What I want to do is to extract (split) the multi-tiff into single files. However, if the tiff only contains one image (is not a multi-tiff), I would like to do nothing at all... Also, if I use the "%d" option to name the output files, I get the "%d" in the file name when there is only one. ...or is there another way to use ImageMagick to avoid strange filenames when converting?
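
    A hedged sketch, not the article's answer: identify can report how many frames a TIFF holds, and the split can be skipped when there is only one. Filenames are placeholders:

      # %n is the number of images in the sequence; some versions print it once
      # per frame, so keep only the first line
      count=$(identify -format "%n\n" input.tif | head -n 1)

      if [ "$count" -gt 1 ]; then
          convert input.tif page_%d.tif   # %d becomes the frame index only when splitting
      else
          echo "input.tif is a single-page TIFF, nothing to do"
      fi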

    Read the article

  • IE8 Refuses to run Javascript from Local Hard Drive

    - by Josh Stodola
    I have a problem that just started at work recently and the network manager is certain he did not change anything with the group policy. Anyway, here is a detailed description of the problem. My machine is Windows XP SP3, and I use IE8 to browse. We have McAfee anti-virus software that I am unable to configure. I use the following file to test...

      <!DOCTYPE html>
      <html>
      <head>
          <title>Javascript Test</title>
      </head>
      <body>
          <script type="text/javascript">
              document.write("<h1>PASS</h1>");
          </script>
          <noscript>
              <h1>FAIL</h1>
          </noscript>
      </body>
      </html>

    When I open this file from the C: drive, it fails every time. If I execute it anywhere else (local/remote web server or on a mapped network drive), it works just fine. When I am simply browsing the Internet, Javascript on web sites works just fine. It is only failing on files run from my C: drive. Additionally, I have had a couple of other programmers in the department try this file on their C: drive, and it works fine for them. So I don't believe it is a group policy thing. I need to fix this because I do extensive testing from my C: drive, and I am accustomed to doing so. I don't want to get into the habit of moving files to a different drive just to test. Things I have tried:

      - Enabled "Allow Active Content to Run Files on My Computer" in Options | Advanced | Security
      - Enabled "Allow Active Scripting" in Options | Security | Custom Level
      - Verified that "Script" was not checked as disabled in the Developer Toolbar
      - Added localhost to Trusted Sites in Options
      - Disabled McAfee completely (momentarily, with help from the network admin)
      - Used an older DOCTYPE in my test HTML page
      - Re-installed IE8 completely
      - Ran regsvr32 on JScript.dll
      - Slammed keyboard

    I am sure that there is a setting somewhere that will fix this problem, possibly in the registry. I would not be surprised if it was related to the Developer Toolbar. At this point I do not know where else to look. Can anyone help me resolve this problem? EDIT: Regardless of the bounty, this issue is still ongoing.

    Read the article

  • Getting rid of a trojan. SVCHOST question

    - by MasterPeter
    My antivirus keeps notifying me of a trojan. svchost.exe keeps creating some 'drivers' (.sys files in the drivers directory under system32 of my Windows XP installation) each of which is marked as Bubnix.AB trojan. The antivirus fails to remove many of the files as they are immediately used by svchost (I presume). How do I find out which service is the culprit? Why can't the antivirus effectively rid me of this plague? Also, how many svchost processes is it normal to have running at any one time? I am using Win XP SP3, and ESET NOD32 antivirus.
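
    As a hedged starting point (not from the article): several svchost.exe processes are normal on XP - typically half a dozen or so - and Windows can show exactly which services each one hosts, which helps narrow down where the dropped .sys files come from. The service name below is a placeholder:

      REM Map each svchost.exe process to the services it hosts
      tasklist /svc /fi "imagename eq svchost.exe"

      REM Then inspect a suspicious service's configuration by name
      sc qc SuspiciousServiceName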

    Read the article

  • ReclaiMe-like recovery software

    - by Bou
    I need recovery software that has the features of ReclaiMe File Recovery: it should be able to read image files, keep the folder structure, and be as efficient, but be free. I can't afford ReclaiMe, and all the free software that I know of either supports folder structure but cannot read an image of the array, or the opposite. Can somebody suggest some software? PS: I used ReclaiMe to create an image of my broken RAID0 array, and with ReclaiMe File Recovery I can see all my files intact, but I cannot recover them without purchasing.

    Read the article

  • Proper Imaging Procedures to Restore and Deploy Image with Separate System Reserved Partition

    - by alharaka
    UPDATE: As per my experience here, no one responded. If I do not hear back from TechNet forum members about it, I will post a bounty here, if it makes a difference. I have banged my head against a wall for what seems like all week. I am going to explain my simple procedure, and how none of it, absolutely none, seems to work afterward, despite the few alternatives and everyone on the internet assuming this is how to do it.

    Diskpart Commands to Create FS Structure:

      REM Select the disk targeted for deployment.
      REM
      REM NOTE: Usually disk 0, but drive failure can make it external USB
      REM media. This will erase the drive regardless!
      select disk 0
      REM Remove previous formatting.
      clean
      REM Create System Reserved partition bootloader and files.
      create partition primary size=100
      REM Format the volume
      format fs=ntfs label="System Reserved" quick override noerr
      REM Assign the System Reserved partition the D: mount for now
      assign letter=C
      REM The main system partition, size not specified to occupy whole drive.
      create partition primary
      REM Format the volume
      format fs=ntfs quick override noerr
      REM Assign the OS partition the D: mount for now
      assign letter=D
      REM Make this the active/bootable partition.
      sel disk 0
      sel partition 1
      active
      REM Close out the diskpart session.
      exit

    Now, I thought this was madness, but it turns out the System Reserved partition and the standard "System Partition" (C:, commonly both the boot and system volumes where you find the Windows directory AND the bootmgr/ntldr hardware files - this is where Windows 7 diverges), as mounted in the Windows PE session where I run these commands, do not matter. See reference here. Since this needs to be BitLocker-ready, enter this crappy System Reserved partition, that separate 100MB of awesome that goes before the regular boot volume. I do this, then I proceed to the next step.

    Deploy System Reserved and Normal System Images:

      REM C is still the "System Reserved Partition", and the image is just like it sounds.
      imagex /apply G:\images\systemreserved.wim 1 C:
      REM D is now what will be the C: system partition on reboot, supposedly.
      imagex /apply G:\images\testimage.wim 1 D:

    Reboot the system. Now, the images I just captured should look good. This is not even sysprepped, but reapplying the same fscking image I prepared on the same reference workstation hours before. The problem is I get "0xc000000e could not detect the accessible boot device \Windows\system32\winload.exe" or different kinds of nonsense revolving around being able to find the boot volume with all the right files. I try different variations of things; now none of them work. I tried repairs with bcdboot, with a fresh System Reserved partition or not, bootrec, and manually editing the damn BCD store with bcdedit. I tried finalizing the above process with and without bootsect /nt60 C: /force. I need to wrap up and automate this procedure. What am I doing wrong that does not make the image happy, but really just miserable?
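
    Not an answer from the article - just a hedged sketch of the canonical form of the BCD repair step, run from the WinPE session after applying the images and using the same drive letters as above (C: = System Reserved, D: = the Windows volume):

      REM Build the boot files and BCD store on the System Reserved partition,
      REM pointing them at the Windows installation on D:
      bcdboot D:\Windows /s C:

      REM Write a Windows 7 (BOOTMGR) boot sector and MBR to the active partition
      bootsect /nt60 C: /force /mbr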

    Read the article

  • How to remove this malware

    - by muratto12
    Some files on my site contain some extra lines. After I've deleted them manually, I find them corrupted again some time later. It is all coming from some JS files on http://*.changeip.name/. How can I remove them?

      <!--pizda--><script type='text/javascript' src='http://m2.changeip.name/validate.js?ftpid=15035'></script><!--/pizda-->
      <iframe src=http://pizda.changeip.name/?f=1065433 frameborder=0 marginheight=0 marginwidth=0 scrolling=0 width=5 height=5 border=0>
      <iframe src=http://kuku.changeip.name/?f=1065433 frameborder=0 marginheight=0 marginwidth=0 scrolling=0 width=5 height=5 border=0>
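
    A hedged sketch of a first-pass cleanup, not the article's answer: the markup usually keeps coming back because the attacker still has access (often stolen FTP credentials), so changing passwords and scanning the PCs that hold them matters as much as stripping the lines. Paths are placeholders:

      # Find every file carrying the injected markers
      grep -rl 'changeip\.name' /path/to/site

      # Remove the lines containing them, keeping .bak copies of each touched file
      grep -rl 'changeip\.name' /path/to/site | xargs sed -i.bak '/changeip\.name/d'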

    Read the article

  • Running PHP scripts as the owner of the PHP file: security issues

    - by thomasrutter
    I'm using suexec to ensure that PHP scripts (and other CGI/FastCGI apps) are run as the account holder associated with the relevant virtual host. This allows for securing each user's scripts from reading/writing by other users. However, it occurs to me that this opens up a different security hole. Previously, the web server ran as an unprivileged user, with read-only access to users' files (unless the user changed the file permissions for some reason). Now, the web user can also write to users' files. So while I've prevented different users taking advantage of each other's scripts, I've made it so that in the event that some application has a remote code injection vulnerability, it now has not only read access but also write access to all that user's scripts and website. How can I deal with this? One idea I've had is to create a second user account for each user account in the system, so that each user has their own user account, and all their scripts are run under another user account. But that seems cumbersome.
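
    A hedged sketch of the two-account idea the question ends on (all names below are made up). One caveat: suexec itself insists that the executing user own the script, so a split like this pairs better with something like per-site PHP-FPM pools than with suexec. The code is owned by a deploy account, PHP runs as a separate account, and only the directories that genuinely need runtime writes stay writable:

      # One-time setup per site
      sudo useradd --system site1-run      # account the PHP scripts execute as
      sudo useradd site1-deploy            # account that owns and edits the code

      # Code readable, but not writable, by the runtime account
      sudo chown -R site1-deploy:site1-run /home/site1/www
      sudo chmod -R u=rwX,g=rX,o= /home/site1/www

      # Only upload/cache directories writable at runtime
      sudo chown -R site1-run /home/site1/www/uploads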

    Read the article
