Search Results

Search found 3168 results on 127 pages for 'directories'.

  • How to setup a user account for a web application

    - by ximus
    Hi, what are the main guidelines for setting up a user account on a Linux machine for a web app? In my case it is a Rails application that does file management. The first thing I can think of is to limit access rights to only the directories it needs, but how exactly should I go about this: should I set up rights through a user group, or through the user's ownership of those directories? I have very little experience in user rights management. What else do I need to consider? I've heard of ACLs and SELinux; do I need to look into either of these to guarantee decent security for my simple web app? Any advice about this, and about anything I haven't mentioned, is welcome. Thanks, Max. I will be using Ubuntu.
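
    A minimal sketch of the ownership-based approach (the user, group and path names below are hypothetical, not from the question):

        # create a dedicated, non-login system user for the app
        sudo adduser --system --group --home /srv/railsapp railsapp

        # give that user ownership of only the directories the app must write to
        sudo chown -R railsapp:railsapp /srv/railsapp/uploads
        sudo chmod -R u=rwX,g=rX,o= /srv/railsapp/uploads

    Everything else stays owned by a separate deploy user, so a compromised app process can only touch its own upload tree.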

  • On a Mac, how can I find all files on an NTFS partition that have the same case-*in*sensitive name?

    - by SCdF
    Here's the deal: I have a huge mess of files on an external drive that is formatted as NTFS, and I wish to copy all of these files onto my MacBook Pro. NTFS, like sane filesystems, is case-sensitive; HFS is not. Somewhere in the mess of tens of thousands of files and directories there are one or more 'duplicates' in the eyes of HFS, and these are preventing me from copying the entire directory of data onto my Mac. (Mac OS X rather unhelpfully throws a general error explaining the problem, but not the exact file. It also doesn't give you an option to skip.) What is the best approach to solving this? Does anyone know a tool that can find files and directories that have the same case-insensitive name?
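
    One low-tech way to surface the collisions from the Mac side before copying (a sketch; run it from the root of the mounted NTFS volume):

        find . | tr '[:upper:]' '[:lower:]' | sort | uniq -d

    Every path it prints occurs more than once when case is folded; grep the real listing case-insensitively for each one to see the variants.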

  • IIS6 Permissions

    - by Gordon Carpenter-Thompson
    We have a set of IIS6 Jakarta/ASP.NET applications (implemented as virtual directories) on a machine without a domain. The directories all exist under the default website. We need to set up the permissions so that certain users can access only specific applications, while other users can access several of the applications. The way it's been set up previously has been to explicitly deny access to the users for every application except the ones that they are allowed to see. The problem is that the list of applications changes fairly often (for demos etc.), and it's been known for the developers to forget to deny the old users access to the new applications, which leads to security problems. This is all quite unmaintainable. Does anybody have any advice on this? Surely I can't be the only person to find this all a bit of a mess? Thanks
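
    One way to flip this from deny-by-default bookkeeping to allow-by-default is a local group per application, with the NTFS ACL granting only that group (the group, user and path names here are hypothetical):

        rem create a group for one application and add the permitted users
        net localgroup Demo1Users /add
        net localgroup Demo1Users alice bob /add

        rem grant the group read access to that virtual directory's folder
        cacls "C:\Inetpub\wwwroot\demo1" /T /E /G Demo1Users:R

    A new application then starts with no users at all until someone is deliberately added to its group.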

  • Tag-structured filesystems

    - by A.Rashad
    I hope this is the correct site; I lose my way between the 4 sister sites :) Let me ask the question this way: all file systems I have seen before are hierarchical, meaning a root directory with some branched directories, and so on until we have files residing in these directories. The exception is the AS/400 file structure, which has the concept of a Library that serves somewhat like a directory, but only one level deep. Why not have directory-less filesystems, where files are placed in a single location and the file identifiers are referenced by a database of tag/file relationships? That way there would be no need for symbolic links, and one file could have relations to multiple subjects, rather than a single parent directory containing it. I hope the idea is clear.
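
    As a sketch of the idea, the tag store could be a single relation; the sqlite3 shell is used here purely for illustration:

        sqlite3 tags.db "CREATE TABLE tagged (tag TEXT, file_id TEXT);"
        sqlite3 tags.db "INSERT INTO tagged VALUES ('reports','f9a3');"
        sqlite3 tags.db "INSERT INTO tagged VALUES ('2010','f9a3');"
        sqlite3 tags.db "SELECT file_id FROM tagged WHERE tag='reports';"

    A file appears under as many tags as you like, which covers the symbolic-link use case with plain rows instead of extra filesystem objects.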

  • Unable to commit file through svn, server sent truncated HTTP response body

    - by Rocket3G
    I have my own VPS, on which I want to run a simple SVN + ChiliProject setup. I have re-installed SVN, ChiliProject and the OS several times, and it always works for a couple of hours/days and then just stops working. Well, everything works, except that I can't upload any files: committing directories seems to work just fine, but when I try to commit a file it breaks. My error log file shows the following when I try to commit something:

        x.x.x.x - - [19/Oct/2013:00:01:46 +0200] "OPTIONS /project HTTP/1.1" 200 149
        x.x.x.x - - [19/Oct/2013:00:01:46 +0200] "PROPFIND /project HTTP/1.1" 207 346
        x.x.x.x - - [19/Oct/2013:00:01:46 +0200] "MKACTIVITY /project/!svn/act/c11d45ac-86b6-184a-ac5a-9a1105d64563 HTTP/1.1" 401 345
        x.x.x.x - admin [19/Oct/2013:00:01:46 +0200] "MKACTIVITY /project/!svn/act/c11d45ac-86b6-184a-ac5a-9a1105d64563 HTTP/1.1" 201 262
        x.x.x.x - - [19/Oct/2013:00:01:46 +0200] "PROPFIND /project HTTP/1.1" 207 236
        x.x.x.x - admin [19/Oct/2013:00:01:46 +0200] "CHECKOUT /project/!svn/vcc/default HTTP/1.1" 201 271
        x.x.x.x - admin [19/Oct/2013:00:01:46 +0200] "PROPPATCH /project/!svn/wbl/c11d45ac-86b6-184a-ac5a-9a1105d64563/1 HTTP/1.1" 207 267
        x.x.x.x - admin [19/Oct/2013:00:01:46 +0200] "CHECKOUT /project/!svn/ver/1 HTTP/1.1" 201 271
        x.x.x.x - - [19/Oct/2013:00:01:46 +0200] "HEAD /project/index.html HTTP/1.1" 404 -
        x.x.x.x - admin [19/Oct/2013:00:01:46 +0200] "PUT /project/!svn/wrk/c11d45ac-86b6-184a-ac5a-9a1105d64563/index.html HTTP/1.1" 201 269
        x.x.x.x - admin [19/Oct/2013:00:02:04 +0200] "DELETE /project/!svn/act/c11d45ac-86b6-184a-ac5a-9a1105d64563 HTTP/1.1" 204 -

    So it seems that it PUTs the file correctly, and somehow, somewhere, something goes wrong. File permissions are alright: when I purposely set them wrong, I got errors about the permissions being incorrect, which is expected. The odd thing is that files won't get added, but directories are fine. I also have enough storage left on my machine. I should note, perhaps, that I use Ubuntu 12.04.3 with Ruby 1.9.3 and MySQL 14.14, and that ChiliProject handles the authentication and authorization for the project. That part works, because I can commit directories and read everything correctly; I just can't upload files. Help would really be appreciated, as I don't know what on earth is going on with this 'truncated HTTP response body'. I tried to read the responses with Wireshark, but it basically gave me the same information. With regards.

    PS. I have no clue why there is an 18-second gap between the PUT and the DELETE; the file is a mere 500 bytes, so it should upload in about a second.

    PPS. I copied this question from Stack Overflow to this site, as I didn't know this site existed; another user suggested that I'd get more answers here, as it's basically a server fault.
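
    Two hedged diagnostics that may narrow this down, since the access log above only shows the request lines (the repository path is hypothetical):

        # on the VPS: check the repository itself for corruption
        svnadmin verify /var/svn/project

        # and watch Apache's error log while a commit is attempted
        tail -f /var/log/apache2/error.log

    If the verify pass is clean, the truncation is more likely in the Apache/mod_dav_svn layer than in the repository data.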

  • Best and Proper Permissions Settings for Directory

    - by Dr. DOT
    I am interested in knowing the proper, yet security-conscious, settings for a directory. Here's my scenario: I have a username for FTP access to my server called "user". For the purpose of the scenario, PHP runs as "nobody" on my server. I have a directory off the document root called "sample". The "sample" directory is chmod'd at 0755 (drwxr-xr-x), is owned by "user", and its group is set to "user". The above is all very straightforward and standard. Now, I want a script to be able to create (mkdir) and delete (rmdir) directories under "sample". Yet I obviously don't want to overly expose my server by opening up the permissions (I could easily chmod "sample" to 0777 and make it world-writable). What is the best combination of permissions, owner settings and/or group settings to allow my script to create and delete directories under "sample", while retaining the ability for "user" to continue to FTP into the directory? Thanks.
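
    A sketch of the usual group-based middle ground (this assumes you can modify the "nobody" account's groups on the server):

        # let PHP write via group permission instead of world write
        usermod -a -G user nobody
        chgrp user /path/to/sample
        chmod 2775 /path/to/sample

    The setgid bit (the leading 2) makes directories created under "sample" inherit the group, and "user" can still FTP in as before; 0777 never enters the picture.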

  • What files should be excluded from a complete Windows backup?

    - by tro
    I'm starting to use CrashPlan to back up my Win 7 PC. I've got it writing to my external HD (for quick local restores) and to CrashPlan Central (for offsite storage). I'd like to back up my entire C:\ drive (the only partition) in a way that preserves all of my installed software and configuration, but avoids backing up log files and other ephemeral/temporary files that are regenerated during normal operation of the OS. Which files and/or directories should I be excluding from backups? I'd like to make this a community wiki, so that we can all contribute towards a definitive list. Here are the regular expressions for the directories and files that CrashPlan excludes on Windows by default, as listed at http://support.crashplan.com/doku.php/articles/admin_excludes:

        .*/(?:42|\d{8,})/(?:cp|~).*
        (?i).*/CrashPlan.*/(?:cache|log|conf|manifest|upgrade)/.*
        .*\.part
        .*/iPhoto Library/iPod Photo Cache/.*
        .*\.cprestoretmp.*
        *\.rbf
        :/Config\\.Msi.*
        .*/Google/Chrome/.*cache.*
        .*/Mozilla/Firefox/.*cache.*
        .*\$RECYCLE\.BIN/.*
        .*/System Volume Information/.*
        .*/RECYCLER/.*
        .*/I386.*
        .*/pagefile.sys
        .*/MSOCache.*
        .*UsrClass\.dat\.LOG
        .*UsrClass\.dat
        .*/Temporary Internet Files/.*
        (?i).*/ntuser.dat.*
        .*/Local Settings/Temp.*
        .*/AppData/Local/Temp.*
        .*/AppData/Temp.*
        .*/Windows/Temp.*
        (?i).*/Microsoft.*/Windows/.*\.log
        .*/Microsoft.*/Windows/Cookies.*
        .*/Microsoft.*/RecoveryStore.*
        (?i).:/Config\\.Msi.*
        (?i).*\\.rbf
        .*/Windows/Installer.*

    Other excludes:

        .*\.(class|obj)
        .*/hiberfil.sys
        (?i).*\.tmp
        (?i).*/temp/
        (?i).*/tmp/
        .*Thumbs\.db
        .*/Local Settings/History/
        .*/NetHood/
        .*/PrintHood/
        .*/Cookies/
        .*/Recent/
        .*/SendTo/

  • What software supports full multisession CD burning on windows?

    - by dma_k
    I know that beating Nero in this field is very difficult. I have looked through the best software listed in the wiki and also checked the related posts here and here: most of the software allows you to create an ISO image and/or burn an existing image. However, I am interested in the following flow:

    1. Import the previous session
    2. Add/rename/remove files and directories
    3. Burn a new session

    UltraISO is very good at (1) and (2), but it cannot append a new session to a disc. InfraRecorder/CDBurnerXP can do (1), (2) and (3), but: InfraRecorder crashes when trying to import some Nero sessions, and CDBurnerXP does not allow moving files from a previous session to other directories. Is anybody here happy with any other free software?

  • How can I get vim to set an ACL on its swap files?

    - by thsutton
    I use vim on an OS X Snow Leopard Server machine. A number of the directories I work in have inherited ACLs (so that various groups of users can access them over AFP). For some reason, when I'm working in one of these directories, vim cannot read its own swap files. It can create them fine, but can't read them, which for some reason makes it display the "swap file already exists" message (and no, the swap file does not already exist). vim -r lists the newly created swap file as "[cannot be read]". The owner and group are correct and the permissions are 0600, and the ACLs on the swap file and the file I'm editing are identical (as disclosed by ls -le and compared with diff). groups returns the same thing whether invoked from my login shell or via :! in vim. Has anyone encountered (and hopefully resolved) a problem like this before?
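
    One workaround worth trying: keep vim's swap files out of the ACL'd directories entirely (the trailing double slash tells vim to build swap names from each file's full path, avoiding collisions):

        mkdir -p ~/.vim/swap
        echo 'set directory=~/.vim/swap//' >> ~/.vimrc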

  • User account shows two Downloads folders

    - by Chris Lieb
    I have my user account on my D drive, junctioned to the C:\users folder. I accidentally moved my profile Downloads folder (C:\users\me\Downloads) and then moved it back to its path on the D drive (D:\me\Downloads). After doing this, the directory tree for my user profile lists two Downloads directories: one located at C:\users\me and one at D:\me. I tried deleting the directory from the D drive, then restoring it from the Recycle Bin to the proper location on the C drive (actually the D drive, accessed through the junction), but that gave me the two Downloads directories again. Is there some way to fix this so that the only listing is for the C:\users\me\Downloads directory, as it was to begin with?
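
    Two things worth inspecting before moving anything again (the GUID below is the well-known Downloads known-folder ID, not something taken from this particular machine):

        dir /aL C:\Users\me
        reg query "HKCU\Software\Microsoft\Windows\CurrentVersion\Explorer\User Shell Folders" /v "{374DE290-123F-4565-9164-39C4925E467B}"

    The first shows any stray junctions in the profile; the second shows where the profile currently thinks the Downloads folder lives.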

  • How to generate an ASCII representation of a Unix file hierarchy?

    - by Jenn D.
    Hi, all. I'm looking for a quick and dirty way to generate some diagrams of some directories that have almost, but not exactly, the same hierarchy, so I can show them around at a meeting and we can decide which flavor we like best. I'm not interested in the "leaf" nodes, just the directories. The catch: I don't want to mess with X. This is a server system I deal with entirely through SSH. So I'm looking for something that will do ASCII layout, maybe with simple pipes-and-hyphens for lines or something. Does anyone know of such a utility? I'm sure I could write something myself, but it's such a fiddly little sort of project, with handling spacing and layout and such; I'd really like to discover that someone's done it for me. Alas, Google doesn't seem to know of such a thing...or if it does, it's hidden beneath heaps of excellent visual explications of the standard general Unix file hierarchy. Thanks!

  • Copying a Windows 8 Users folder with extremely long paths

    - by bilal.haider
    I was trying to move my "Users" folder in Windows 8 as described here and here. But when I try to copy the folder using "xcopy" in Windows Installation Disk Repair Mode, after some files are copied I get "insufficient memory". The error occurs on files with paths like C:\Users\Bilal\Application Data\Application Data\Application Data.........Application Data\Application Data..... What is the point of such directories within directories? I also tried copying them using Mini Windows XP, but the problem was there too, and likewise with a Parted Magic Live CD. So now, how can I move them? Another question: is moving such system files using Linux a good idea? Does it do anything to permissions?
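
    For what it's worth, those endlessly nested Application Data entries are usually NTFS junctions that point back up the tree, so a naive recursive copy chases them forever. If robocopy is available in your repair environment, it can skip junction points outright (a sketch; drive letters as seen from the repair prompt):

        robocopy C:\Users D:\Users /E /COPYALL /XJ

    /XJ is the important flag: it excludes junctions, something xcopy has no clean switch for.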

  • How can I keep the configs in my /etc dir under git? (sudo has different keys...)

    - by Dean Rather
    I'd like to keep some of the folders in my /etc dir under git, because I'm quite new to server administration and am constantly messing around in my /etc/nginx/ and /etc/bind/ directories. I've heard of people versioning their entire /etc/ directories, but that seems a bit like overkill, as at this point I'm only messing with those 2 subdirectories. The problem I'm having is that if I sudo my git operations, I don't have the right pubkeys to push to my remote repo (Bitbucket), but if I don't sudo, I need to mess around with all the permissions (again, not very pro at this). Does anyone know best practices for managing their configs, or how I should solve this problem? Thanks, Dean. PS. It's Ubuntu 12.04, Git, nginx, bind9, Amazon AWS, Bitbucket...
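
    One common pattern: keep the repo and worktree root-owned, but lend root your ssh-agent just for the push, so no key material has to live in /root (a sketch; it assumes an agent is running and your key is registered with Bitbucket):

        cd /etc/nginx
        sudo git add -A && sudo git commit -m "describe the change"
        sudo env SSH_AUTH_SOCK="$SSH_AUTH_SOCK" git push origin master

    etckeeper is also worth a look; it automates exactly this kind of versioning for /etc.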

  • Home-folder-only SFTP connection has limited access

    - by Tomasz Durka
    I have configured SFTP access for a user using this guide: Linux shell to restrict sftp users to their home directories? I have a problem though. I have taken all the steps: I chown'ed the home folder to root:user and set its permissions to 755. I can log in normally using SFTP; however, I can NOT transfer files and can NOT mkdir directories. If I change the permissions to 777, everything is editable, but that is exactly what I don't want. Additionally, after exiting sftp and reconnecting, the connection is reset by peer (due to the 777 setting). Has anyone had a similar problem? What am I doing wrong?
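
    For reference, this is how OpenSSH's ChrootDirectory is designed to behave: the jail directory itself must be root-owned and not writable by group or others (which is also why the 777 experiment gets the connection reset), so the usual layout is a read-only jail containing a writable subdirectory:

        chown root:user /home/user
        chmod 755 /home/user
        mkdir -p /home/user/files
        chown user:user /home/user/files

    The user then uploads and creates directories inside /home/user/files rather than in the jail root itself.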

  • Run a script when shutting down Ubuntu, before the logged-in user is logged out

    - by Travis
    I'm writing a script to back up some local directories on a Unix machine (Ubuntu) to a Samba drive. The script works fine, and I've got it running at shutdown and restart using the method described at http://en.kioskea.net/faq/3348-ubuntu-executing-a-script-at-startup-and-shutdown, which works by placing the backup script into the /etc/rc6.d and /etc/rc0.d directories. However, there is a problem: the script's logfile shows that it runs after the user is logged out. We are using LDAP authentication, and once the user has logged out, the system cannot back up to their Samba share. Does anyone know of any way to run the script before the user is logged out?
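
    Since the scripts in /etc/rc0.d and /etc/rc6.d run in link-name order, one thing to try is giving the backup an early sequence number so it fires before the rest of the teardown (a sketch; the script name is hypothetical):

        sudo ln -s /etc/init.d/backup.sh /etc/rc0.d/K01backup
        sudo ln -s /etc/init.d/backup.sh /etc/rc6.d/K01backup

    If the desktop session is ended by the display manager before the runlevel change even begins, though, the hook has to live in the logout path instead (e.g. the display manager's logout script).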

  • mount dev, proc, sys in a chroot environment?

    - by Patrick
    I'm trying to create a Linux image with custom-picked packages. I followed the guide here: http://www.olpcnews.com/forum/index.php?topic=4766.0 However, when I tried to install some packages, configuration failed because the proc, sys and dev directories were missing. I learned from other places that I need to "mount" the host's proc (and related) directories into my chroot environment, but I have seen two syntaxes and am not sure which one to use. On the host machine:

        mount --bind /proc <chroot dir>/proc

    and another syntax (in the chroot environment):

        mount -t proc none /proc

    Which one should I use, and what is the difference? Edit: what I'm trying to do is hand-craft the packages I'm going to use on an XO laptop, because compiling packages takes a really long time on the real XO hardware; if I can build all the packages I need and just flash the image to the XO, I can save time and space.
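
    The two forms do different things: --bind re-exposes the host's already-mounted /proc inside the jail, while mount -t proc none /proc (run inside the chroot) mounts a fresh instance of the proc filesystem. For proc and sys either works; for /dev, which holds device nodes rather than a virtual filesystem, the bind mount is the simple route. A typical sequence from the host looks like this (the chroot path is hypothetical):

        sudo mount --bind /dev  /srv/chroot/dev
        sudo mount --bind /proc /srv/chroot/proc
        sudo mount --bind /sys  /srv/chroot/sys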

  • batch file infinite loop when parsing file

    - by Bart
    Okay, this should be a really simple task, but it's proving to be more complicated than I think it should be. I'm clearly doing something wrong and would like someone else's input. What I would like to do is parse through a file containing paths to directories and set permissions on those directories. An example line of the input file (there are several lines, all formatted the same way, each with a different path):

        E:\stuff\Things\something else (X)\

    (The file in question is generated under Cygwin, using find to list all directories with "(X)" in the name. The file is then passed through unix2win to make it Windows-compatible. I've also tried manually creating the input file from within Windows, to rule out the file's creation method as the problem.) Here's where I'm stuck: I wrote the following quick-and-dirty batch file in Windows XP and it worked without any issues at all, but it will not work in Server 2k8. The batch file code to run through the file and set permissions:

        FOR /F "tokens=*" %%A IN (dirlist.txt) DO echo y| cacls "%%A" /T /C /G "Domain Admins":f "Some Group":f "some-security-group":f

    What this is SUPPOSED to do (and does in XP) is loop through the specified file (dirlist.txt) and run cacls.exe on each directory it pulls from the file. The "echo y|" is in there to automagically confirm when cacls helpfully asks "are you sure?" for every directory in the list. Unfortunately, however, what it DOES is fall into an infinite loop. I've tried surrounding everything after "DO" with quotes, which prevents the endless loop but confuses cacls so it throws an error. Interestingly, I've tried running the code after "DO" manually at a command prompt (obviously replacing the variable with the full path, copied straight from the file), and it runs as expected. I don't think it's the file or the loop, as adding quotes to the command prevents the loop from continuing past where it's supposed to... I really have no idea at this point. Any help would be appreciated. I have a feeling it's going to be something incredibly stupid... but I'm pulling my hair out, so I thought I'd ask.
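
    One hedged suggestion for the Server 2008 side: icacls, which supersedes cacls there, never prompts for confirmation, so the echo y| pipe (a known source of exactly this endless "Are you sure?" loop on newer Windows versions) can be dropped entirely:

        FOR /F "usebackq tokens=*" %%A IN ("dirlist.txt") DO icacls "%%A" /grant "Domain Admins":F "Some Group":F "some-security-group":F /T /C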

  • Directory comparison in Meld but ignoring changes that only involve file timestamp?

    - by creamcheese
    I'm using Meld to compare two directories of source code on Ubuntu. However, because all of the files in one of the directories have been 'touched' so that all of their timestamps were updated, Meld is showing them as different, even though the contents of the files have not changed. But I'm only trying to find files that have different content. I don't see an option to get Meld just to look at changed contents. Any ideas for how to do this in Meld or is there a better GUI directory comparison tool for Ubuntu?
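
    If Meld won't do it, plain diff compares by content only and never looks at timestamps, which gives a quick cross-check from a terminal:

        diff -rq /path/to/dir1 /path/to/dir2

    -r recurses into subdirectories, and -q reports only which files differ rather than the changes themselves.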

  • Save a single web page (with background images) with Wget

    - by mikael
    I want to use Wget to save single web pages (not recursively, not whole sites) for reference, much like Firefox's "Web Page, complete". My first problem: I can't get Wget to save background images specified in the CSS. Even if it did save the background image files, I don't think --convert-links would convert the background-image URLs in the CSS file to point to the locally saved images. Firefox has the same problem. My second problem: if there are images on the page I want to save that are hosted on another server (like ads), these won't be included. --span-hosts doesn't seem to solve that problem with the line below. I'm using:

        wget --no-parent --timestamping --convert-links --page-requisites --no-directories --no-host-directories -erobots=off http://domain.tld/webpage.html
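
    For comparison, the classic "one complete page" recipe combines -p with -H, -E and -K; note also that only newer wget releases (1.12 and later) parse CSS at all, so background images referenced from stylesheets are simply invisible to older builds:

        wget -E -H -k -K -p --no-directories http://domain.tld/webpage.html

    -H is what permits off-site requisites such as ad images to come along, and -p keeps the host-spanning limited to page requisites rather than turning into a crawl.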

  • tar - exclude certain files

    - by Alan
    I wish to tar all files in a directory and its subdirectories that do NOT end in .jpg, .bmp, .gif, or .png. So, given the following folders and files:

        foo/file.txt
        foo/file.gif
        foo/bar/file
        foo/bar/image.jpg

    I want to tar only the files file.txt and file; file.gif and image.jpg should be ignored. I would also like to maintain the folder structure. My first thought was to pipe the results of the find command, in conjunction with grep -v ".jpg|.gif|.bmp|.png", to a text file, and then use tar's include argument to feed it that list of files. However, the results of the grepped find command also contain directories (in the example above, "foo" and "foo/bar"), and when a directory is fed to tar it includes all files in that directory, so I would end up with a tar file containing all of the files, which is not what I want. Is there any way to prevent find from outputting directories? Or is there a far easier way to approach this?
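
    -type f is the switch that stops find from emitting directories, and NUL-separated output can feed GNU tar directly, skipping the grep step and surviving spaces in names (a sketch):

        find foo -type f ! \( -iname '*.jpg' -o -iname '*.gif' -o -iname '*.bmp' -o -iname '*.png' \) -print0 \
            | tar --null -T - -cf archive.tar

    The stored paths still include their directories, so the folder structure is preserved.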

  • How to set a default page for Apache that doesn't live in the directory and only applies if there is no index

    - by Kyle
    I know. This sounds complicated =D I made a PHP file browser as an alternative to the Apache one. I needed it for logic purposes; it does extra things for me, etc. So instead of dropping this file in all of my directories, how could I get it to "show up" in every directory that doesn't have an index (which would use the Apache directory listing by default)? Thanks for the help! Edit: I wonder if this could be done using Alias and DirectoryIndex? Is it possible to alias to a file?
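
    It is: DirectoryIndex accepts a local URL as well as a plain filename, so one copy of the script can serve as the fallback index everywhere (a sketch; the paths are hypothetical):

        Alias /filebrowser.php /var/www/tools/filebrowser.php
        DirectoryIndex index.html index.php /filebrowser.php

    Apache tries the entries in order, so real index files still win and the browser script only appears where no index exists.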

  • securing unpatched websites

    - by neuron
    I have a client with a lot (read: several thousand) of websites built on several old CMS solutions that are no longer maintained. Moving all of them to a maintained solution isn't really an option at this point, so I'm thinking about ways to secure the solutions without patching them. The solutions are mostly Joomla 1.0/1.5 and WordPress. What I'm thinking is something like this:

    - mod_suexec to lock everyone into their own home directory
    - AppArmor to deny any and all file writes by default (exclude by default, include things like "images" directories)
    - .htaccess to prevent anything in writable directories from being executed, i.e. disable the PHP engine for the images/ directory (see the sketch below)
    - MySQL triggers to check the "users" tables to prevent adding new admins/superadmins

    Does this make sense? Is it viable? Am I missing something obvious?
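
    For the "nothing executes in writable directories" piece, the per-directory part is small; a sketch of an images/.htaccess under mod_php (whether these directives are honoured depends on your AllowOverride settings):

        php_flag engine off
        RemoveHandler .php .phtml
        AddType text/plain .php .phtml

    Uploaded .php files in that tree are then served as inert text instead of being run.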

  • User accounts in FTP

    - by Brad
    I have an FTP server (proftpd on Debian) that I'm going to allow a couple of friends access to, and I want some safety nets in place, just in case. These are some of the things I'd like to do: jail the accounts to their home directories and impose a cap on the amount of data they can upload; allow them access to a shared folder (via symlink or something) where they have full access (also with a storage cap, but a larger one); allow my own account full access to the system (using groups, I guess); and not allow anonymous access, or allow it only with its own folder, separate from the shared user folder. Currently I've got the accounts set up and jailed, but it seems the symlink I put in is not letting them visit the shared folder. I suppose this has to do with them not having read permissions anywhere but their own home directories, or maybe it's something else; I'll continue to look into it and provide any information that is requested. Is what I'm trying to do possible? Any tips or resources you can share are appreciated. Thanks.
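
    On the shared folder: a symlink cannot point outside a chroot jail, since the target path is resolved inside the jail, which would explain exactly what you're seeing. The usual substitute is a bind mount (the paths here are hypothetical):

        mount --bind /srv/shared /home/friend/shared
        # persist across reboots
        echo '/srv/shared /home/friend/shared none bind 0 0' >> /etc/fstab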

  • On Windows Server 2003, permissions are not propagating to a group that is a member of a group

    - by Joshua K
    Windows Server 2003 on i386. The FTP server is running as the SYSTEM user/group. Some files we want served (read and write) are owned by the group 'ftp'; ftp has full read/write permissions on those files and directories, but SYSTEM can't read or write them. So I added SYSTEM to the 'ftp' group. Windows happily complied, but even after restarting FileZilla, it still could not read/write those files. Is there any way to do what we want without re-permissioning all those files? Running the FTP server as 'ftp' isn't really an option, because it also serves files that are owned by SYSTEM (and not ftp). Sigh... :) Any insights?
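
    A likely explanation: group memberships are baked into a process's access token when it logs on, and SYSTEM's token is built at boot, so the new membership won't take effect until the machine restarts. If a reboot doesn't help (or isn't an option), granting SYSTEM directly is a single additive cacls pass rather than a change of ownership (the path is hypothetical):

        cacls "D:\ftproot" /T /E /G SYSTEM:F

    /E edits the existing ACLs instead of replacing them, so the ftp group's permissions stay intact.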
