Search Results

Search found 3168 results on 127 pages for 'directories'.

Page 73 of 127

  • How do I turn off "auto-echo" in bash when I 'cd'?

    - by Avery Chan
    I don't know when this started happening, but now every time I cd to a directory it echoes the path right before changing directories. This happens when I log into a server but not on my local machine. The server is running Linux; my local machine is running Mac OS X. I searched Google and looked at the bash man page but couldn't find anything. My .bashrc/.bash_profile doesn't have anything related to 'cd' (that I know of). How do I turn this "feature" off?
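
    A common culprit (an assumption, since the server's shell setup isn't shown): when bash resolves a cd through a non-empty CDPATH, the builtin prints the resulting directory; a wrapped cd (alias or function) does the same. A quick check:

        type cd          # "cd is a shell builtin" is the default; anything else is a wrapper
        echo "$CDPATH"   # if non-empty, cd echoes paths it resolves through it
        unset CDPATH     # disable it for this session to confirm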

    Read the article

  • Possible causes for Domain server being unavailable?

    - by serversurfer
    One of our servers was compromised after a user with administrative privileges accidentally loaded a virus from a USB drive on a desktop connected to the domain. The two most obvious symptoms are:

        1. The server no longer responds to login attempts.
        2. The root directory of the drive containing user data has been filled with randomly named empty folders. (Initially there were around a million; I've been slowly deleting them.)

    I've run several virus scans from different vendors and am fairly confident the virus has been removed, but the damage is done. I'm hoping the two symptoms are related and that once the directories are gone the server will start responding again. The drive is very slow to respond; I'm deleting about 20k folders at a time. Any more than that and Windows Explorer becomes unresponsive. In the event that I finish cleaning up the drive and things don't return to normal, what other things can I check?
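
    Explorer is the slowest possible tool for this; a command-line sketch that bulk-deletes far faster (paths are hypothetical, and this removes everything in the target not present in the source, so move legitimate folders out of that root first):

        :: mirror an empty folder over the infested root; /MIR deletes
        :: anything not present in the source, without loading the shell
        mkdir C:\empty
        robocopy C:\empty D:\UserData /MIR /R:0 /W:0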

    Read the article

  • NGINX - Two different rails apps under same domain

    - by Murkin
    I have two different Rails (Passenger) apps that I want to host on one server:

        somehost.com/       <-- App #1
        somehost.com/admin  <-- App #2

    Tried playing with the 'location' directive, but failed to get both to work. Can someone suggest the correct approach? (I would prefer both to share the same environment, just launched from different directories.) EDIT: Sample (desired) config; I am trying to do something like:

        server {
            listen 80;
            server_name myhost.com;
            rails_env production;
            passenger_enabled on;
            location / {
                root /opt/main_site/public/;
            }
            location /dev {
                root /opt/admin_site/public/;
            }
        }
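
    A hedged sketch of the approach Passenger documents for sub-URI apps (untested here; it assumes both apps run under the same Ruby environment): symlink the admin app's public directory into the main app's public directory, then declare the sub-URI with passenger_base_uri instead of a second root:

        # ln -s /opt/admin_site/public /opt/main_site/public/admin
        server {
            listen 80;
            server_name somehost.com;
            rails_env production;
            passenger_enabled on;
            root /opt/main_site/public;   # app #1 owns the site root
            passenger_base_uri /admin;    # app #2, reached through the symlink above
        }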

    Read the article

  • Apache: set aside specific number of servers for a single "high priority" script

    - by Disco
    I have a bunch of scripts, but some are higher priority than others:

        /var/www/normal-priority/script1.pl
        /var/www/normal-priority/script2.pl
        /var/www/normal-priority/script3.pl

    and

        /var/www/high-priority/script1.pl
        /var/www/high-priority/script2.pl
        /var/www/high-priority/script3.pl

    All run under mod_perl, and they reside in separate directories. From time to time the normal-priority scripts get very busy, and the httpd servers "swamp out" the less frequently called high-priority ones. Is it possible to set aside n httpd servers so that they only listen for the "high priority" scripts?
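
    A single Apache instance cannot reserve part of its worker pool for one directory, so one common approach (a sketch, untested, with hypothetical paths) is a second httpd instance with its own small pool, dedicated to the high-priority scripts and started with httpd -f /etc/httpd/conf/httpd-priority.conf:

        # httpd-priority.conf: a dedicated instance on its own port
        Listen 8081
        PidFile /var/run/httpd-priority.pid
        DocumentRoot "/var/www/high-priority"
        # a small prefork pool that only ever serves high-priority work
        StartServers     5
        MinSpareServers  2
        MaxSpareServers  5
        MaxClients       10

    Callers of the high-priority scripts then target port 8081 directly, or a front-end proxy routes /high-priority/ there.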

    Read the article

  • rsync synchronizing files only without creating folders on destination

    - by Vincent
    Is it possible with rsync to not create directories on the destination? Imagine I have this source:

        a/
        a/x.txt
        b/
        b/y.txt

    and this destination:

        a/
        a/z.txt

    The wanted result of rsync source destination:

        a/
        a/x.txt
        a/z.txt

    Of course my real situation involves a structure of thousands of files and folders, and I don't want solutions involving an explicit list of synced folders, which I could do. I'm looking for a clean way just to prevent any folder creation on the destination, by excluding or filtering... It could even be something outside rsync, like a hack with permissions, if rsync can't do this. For information, it is really easy to get into this kind of situation. In my case I have a server with two disks, say A & B, and a local drive C. I usually use rsync to sync (and merge) remote A & B into local C. Then sometimes I just want to sync back some C files into A and B: just new files, not folders that don't exist on the destination.
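
    rsync has no single option for this, but a filter file can express it: include every directory that already exists on the destination, exclude all other directories, and let files pass through by default. A sketch (untested; it assumes a local destination, and a remote one would need the find run over ssh):

        # build one "+ /dir/" rule per directory that already exists in dest/
        (cd dest/ && find . -mindepth 1 -type d) | sed 's|^\.|+ |; s|$|/|' > rules.txt
        echo '- */' >> rules.txt          # refuse to create any other directory
        rsync -av --include-from=rules.txt source/ dest/

    Because excluded directories are never traversed, b/y.txt is skipped entirely, while a/x.txt lands in the existing a/.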

    Read the article

  • Keep ASP.NET site and content separate

    - by Nelson
    I have an ASP.NET site in folder x. Currently lots of other static content gets added to folder x and mixed in, making it one big mess. I would like to keep the ASP.NET site and the content separate somehow. I know you can create virtual directories in IIS, but there are LOTS of them, and even some content in the root. The content people are not technical and really need an easy way to add it. I would stick everything in a subfolder (they don't touch anything outside it, I don't touch their folder), but that would change their URLs (www.example.com/something to www.example.com/content/something). I almost need a way to "merge" two folders and have them act as one. I'm guessing that is impossible since there could be file conflicts, etc. Any other ways I can achieve this?
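
    One way to get the "merge" effect without changing the content team's URLs is the IIS URL Rewrite module (an assumption that it can be installed): keep the ASP.NET site in the root and rewrite any request that matches no physical file or folder into the content subfolder. A web.config sketch, with /content as a hypothetical folder name:

        <system.webServer>
          <rewrite>
            <rules>
              <rule name="FallBackToContent" stopProcessing="true">
                <match url="(.*)" />
                <conditions>
                  <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />
                  <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />
                </conditions>
                <action type="Rewrite" url="/content/{R:1}" />
              </rule>
            </rules>
          </rewrite>
        </system.webServer>

    File conflicts resolve in the ASP.NET site's favour, since the rewrite only fires when nothing physical matches.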

    Read the article

  • Windows Server 2008 scheduled tasks cannot create files

    - by Nick Cartwright
    We have a series of tasks which, when run interactively over the command line, run fine, creating temporary files and (importantly) logs and backups. When we schedule the task with Administrator privileges to run at the highest priority, however, no logs or temporary files are created! All the directories have read/write privileges for Administrator. Has anyone else experienced this? We are running Windows Server 2008 and the job is configured for 'Windows Vista or Windows Server 2008'. Any help would be much appreciated! Update: we installed Z-Cron and it works perfectly... Still a really strange error from the Windows 2008 Task Scheduler, but a solution is perhaps not quite so urgent now that Z-Cron is working.
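
    A frequent cause of exactly this symptom (an assumption, since the tasks aren't shown): the scheduler starts tasks in %WINDIR%\System32, so any relative log or temp path resolves somewhere unexpected or gets denied. Filling in the task's "Start in" field, or making the script switch to its own folder first, usually fixes it:

        :: at the top of the scheduled batch file: resolve relative
        :: log/temp paths against the script's own directory
        cd /d "%~dp0"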

    Read the article

  • Write hashed password to LDAP when creating a new user

    - by alibaba
    I am working on a project with a central user database system. One of the requirements is that there should be only one set of users for all applications. FreeRADIUS and Samba are two of my applications that both use LDAP as their backend. Since users must be the same for the entire system, which contains many other applications, I have to read the list of users from the central database and recreate them in the LDAP directories for Samba and FreeRADIUS. The problem is that users are sent to me from another entity, and I can only save them in the database with their hashed passwords; I don't have access to their cleartext passwords. I am wondering if I can directly enter a hashed password for a new user in LDAP, using my preferred hash mechanism. If not, can anyone tell me what strategy I should use? I am running my server on Ubuntu 12.04 and all other applications are the latest versions. My database system is PostgreSQL 9.2. Thank you
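
    OpenLDAP's userPassword attribute does accept pre-hashed values carrying a scheme prefix such as {SSHA} or {CRYPT}, so importing existing hashes works as long as the hash format matches a scheme slapd supports. Samba is the catch: it authenticates against sambaNTPassword, which must be the NT hash of the cleartext, so other hash formats cannot be converted into it. A hedged LDIF sketch (the DN and the truncated hash value are hypothetical):

        # apply with: ldapmodify -x -D cn=admin,dc=example,dc=com -W -f set-password.ldif
        dn: uid=jdoe,ou=people,dc=example,dc=com
        changetype: modify
        replace: userPassword
        userPassword: {SSHA}2mTmoIUMNEf1...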

    Read the article

  • Windows DIR listing switch to exclude files in hidden folders

    - by Jason
    I'm trying to get a list of files from a directory, excluding files in hidden folders. With the following command, hidden folders are traversed even though I've set /A:-H to exclude hidden directories:

        dir "C:\SVN" /A:-H /w /b /s

    Is there a different switch to stop them from being traversed too? Alternatively, for this use case I know the name of the hidden folders I want to exclude, so if there is a way to exclude the folders by name ("\.svn\") that might have to suffice. Thanks!
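
    dir's /A filter applies to what gets listed, not to which directories /S descends into, so a name-based filter is the usual workaround. A sketch using findstr (in findstr's default regex syntax, \. is a literal dot, so this drops every path containing ".svn"):

        dir "C:\SVN" /a:-h /b /s | findstr /v /i "\.svn"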

    Read the article

  • Does nginx auth_basic work over HTTPS?

    - by monde_
    I've been trying to set up a password-protected directory in an SSL website as follows (/etc/nginx/sites-available/default):

        server {
            listen 443;
            ssl on;
            ssl_certificate /usr/certs/server.crt;
            ssl_certificate_key /usr/certs/server.key;
            server_name server1.example.com;
            root /var/www/example.com/htdocs/;
            index index.html;
            location /secure/ {
                auth_basic "Restricted";
                auth_basic_user_file /var/www/example.com/.htpasswd;
            }
        }

    The problem is that when I try to access the URL https://server1.example.com/secure/, I get a "404: Not Found" error page. My error.log shows the following:

        2011/11/26 03:09:06 [error] 10913#0: *1 no user/password was provided for basic authentication, client: 192.168.0.24, server: server1.example.com, request: "GET /secure/ HTTP/1.1", host: "server1.example.com"

    However, I was able to set up password-protected directories for a normal HTTP virtual host without any problems. Is it a problem with the config or something else?
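
    auth_basic behaves the same over HTTPS as over HTTP, and the "no user/password was provided" log line is normal for the first, unauthenticated request. A 404 rather than a 401 suggests the request never found content under the inherited root, so it is worth checking the index file and the password file independently; a sketch using the paths from the question (htpasswd comes from apache2-utils):

        ls /var/www/example.com/htdocs/secure/index.html     # must exist for the index to serve
        sudo htpasswd -c /var/www/example.com/.htpasswd someuser
        curl -k -u someuser https://server1.example.com/secure/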

    Read the article

  • Only allow root to change filesystem

    - by Uejji
    The VPS I manage uses a simple hard-link rsync archive daily backup system, saved to a loop file. This is great, because each backup only takes up as much space as what has changed each day, and all user/group permissions are kept. I would like to give users direct access to their home directories in each backup, but I'm worried about intentional or accidental destruction of backup data; as it stands now, users can actually change, destroy or add to backed-up data they originally owned. I've been looking for a way to mount this filesystem similar to an ro mount option, but something that would still allow rw access to root; I've had absolutely no luck. In other words, I want users to be able to view and copy their backed-up data without actually being able to change it, and have that data keep its original permissions. I've got no real preference as far as the filesystem goes, as long as it's a standard Unix filesystem that can preserve permissions, support hard links and deny write access to users without actually stripping the w permission from everything.
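
    One simple scheme that satisfies most of this: keep the backup image mounted read-only for everyone, root included, and have the backup job itself remount read-write only while it runs. A sketch with hypothetical paths:

        # day-to-day: users can browse and copy, nobody can write
        mount -o loop,ro /srv/backup.img /mnt/backups

        # inside the nightly backup script (root only):
        mount -o remount,rw /mnt/backups
        # ... run the existing hard-link rsync job here ...
        mount -o remount,ro /mnt/backups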

    Read the article

  • How can I delete Time Machine files using the command line

    - by Tim
    I want to delete some files/directories from my Time Machine partition using rm, but am unable to do so. I'm pretty sure the problem is related to some sort of access-control extended attributes on files in the backup, but I do not know how to override/disable them in order to get rm to work. An example of the error I'm getting is:

        % sudo rm -rf Backups.backupdb/MacBook/Latest/MacBook/somedir
        rm: Backups.backupdb/MacBook/Latest/MacBook/somedir: Directory not empty
        rm: Backups.backupdb/MacBook/Latest/MacBook/somedir/somefile: Operation not permitted

    There are a number of reasons I do not want to use either the Time Machine GUI or Finder for this. If possible, I'd like to be able to maintain the extended protection for all other files (I'd like not to disable it globally, unless I can re-enable it once I've done my work).
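
    On OS X 10.7 and later, tmutil is the supported way to remove items from a backup without disabling the safety-net protections globally; a sketch, with the volume path below being illustrative:

        sudo tmutil delete "/Volumes/TimeMachine/Backups.backupdb/MacBook/Latest/MacBook/somedir"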

    Read the article

  • some issues with removing www and redirecting index.html

    - by MariaKeys
    Hello Fellas, I am having trouble doing what I want to do with the following setup. I would like to remove all WWW prefixes and also forward index.html to the root of its directory. I would like this to apply to all domains, so I am doing it inside an httpd.conf <Directory> directive. I have tried many variations with no success; the latest version is below (domains live in separate directories inside /var/www/html). Desired behaviour:

        http://www.example.com/index.html           -> http://example.com
        http://www.example.com/someother/index.html -> http://example.com/someother/

    Thanks, Maria

        <Directory "/var/www/html/*/">
            RewriteEngine on
            RewriteBase /
            RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
            RewriteRule ^(.*)$ http://%1/$1 [R=301,L]
            #RewriteCond %{REQUEST_URI} /^index\.html/
            RewriteRule ^(.*)index\.html$ / [R=301,L]
            Options ExecCGI Includes FollowSymLinks
            AllowOverride AuthConfig
            AllowOverride All
            Order allow,deny
            Allow from all
        </Directory>
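
    A hedged guess at the intended rules (untested): the posted index.html rule sends every match to the site root, so capturing the directory part and redirecting to /$1 preserves it:

        <Directory "/var/www/html/*/">
            RewriteEngine On
            RewriteBase /
            RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
            RewriteRule ^(.*)$ http://%1/$1 [R=301,L]
            RewriteRule ^(.*/)?index\.html$ /$1 [R=301,L]
        </Directory>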

    Read the article

  • cd (change directory) to my home directory on Windows [closed]

    - by deostroll
    Possible Duplicate: Is there a short cut command in Windows command prompt to get to the current users home directory like there is in Linux? Is there a short way to cd to the user-specific directories at the command prompt? For example, in a Linux shell (Debian-based) we do cd ~ and it instantly takes us to the current logged-in user's directory /home/<username>. Is there anything to this effect on Windows? PS: currently trying to do this on XP machines; if it differs for other versions, please mention that too.
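
    The usual equivalent on Windows (XP included) is the USERPROFILE variable; the /d switch also changes drive if the prompt is currently elsewhere:

        cd /d "%USERPROFILE%"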

    Read the article

  • LDAP Samba user access issue

    - by ancillary
    I have a Samba share that is on the LAN, authenticated via LDAP. Users access the file system via AD Windows shares. There are shortcuts in directories that point to directories on Samba. Typically a user will click the shortcut to the SMB directory and be met with a permission-denied error. Upon closing Explorer and reopening, it will work. DNS is handled by the domain controller, and that is the only DNS server any of the machines use. Nothing in Event Viewer. I only see successful auth entries in the Samba log. Any ideas?

    Read the article

  • How to combine wildcards and spaces (quotes) in a Windows command?

    - by Jan Fabry
    I want to remove directories of the following format:

        C:\Program Files\FogBugz\Plugins\cache\[email protected]_NN

    NN is a number, so I want to use a wildcard (this is part of a post-build step in Visual Studio). The problem is that I need to combine quotes around the path name (for the space in Program Files) with a wildcard to match the end of the path. I already found out that rd is the remove command that accepts wildcards, but where do I put the quotes? I have tried no ending quote (which works for dir), ...example.com*", ...example.com"*, ...example.com_??", ...cache\"[email protected]*, and ...cache"\[email protected]*, but none of them work. (How many commands to remove a file/directory are there in Windows anyway? And why do they all differ in capabilities?)
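
    For what it's worth, plain rd does not expand wildcards at all in cmd.exe (dir and del do), so the usual workaround is a for /d loop, which expands directory wildcards and lets the quotes wrap the whole pattern. A sketch, with cache@example.com standing in for the real folder-name prefix:

        for /d %D in ("C:\Program Files\FogBugz\Plugins\cache\cache@example.com_*") do rd /s /q "%D"

    Inside a batch file, including a Visual Studio post-build step, double the percent sign: %%D.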

    Read the article

  • How to set up Apache 2 to serve only subdirectories

    - by Lynden Shields
    I have 3 sites which need to be hosted on a web server (apache2 from the repo, running on Ubuntu 12.04). They are each in their own subdirectory within /var/www/. I would like Apache to serve files from the relevant directories only if the directory name is given in the URL, but not serve the /var/www/ directory itself. E.g., http://1.2.3.4/site1/ should work and serve the index from /var/www/site1/index.html, but http://1.2.3.4/ should not serve anything. Currently I can't get the URL to point to the directory: either http://1.2.3.4/ serves everything within /var/www/ (including /var/www/site2/secretstuff/), or the root http://1.2.3.4/ serves one of the subdirectories (/var/www/site1/). This is unacceptable: site1 needs Indexes enabled but the others must not have them. I just want site1's config to respond only to requests of the form http://1.2.3.4/site1/* and not handle requests of the form http://1.2.3.4/. I do not have a domain name set up, so I can't use subdomains.
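
    One way to do this (a sketch in Apache 2.2 syntax, which Ubuntu 12.04 ships; /var/www/empty is a hypothetical empty directory): give the virtual host a root that contains nothing, then mount each site with its own Alias and per-directory options:

        DocumentRoot /var/www/empty
        Alias /site1/ /var/www/site1/
        <Directory /var/www/site1/>
            Options +Indexes
            Order allow,deny
            Allow from all
        </Directory>
        Alias /site2/ /var/www/site2/
        <Directory /var/www/site2/>
            Options -Indexes
            Order allow,deny
            Allow from all
        </Directory>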

    Read the article

  • Black screen with cursor after BIOS screen

    - by Radio
    Here is a weird one. Got a computer with Windows XP that is getting stuck on a black screen with a blinking cursor. What did I do: boot from the installation CD (recovery option - command line) and run:

        chkdsk C: /R
        copy D:\i386\ntdetect.com c:\
        copy D:\i386\ntldr c:\
        fixmbr
        fixboot

    Chkdsk showed 0 bad sectors and no problems during the scan. dir on C:\ shows all directories and files in place (Windows, Program Files, Documents and Settings). BIOS shows the correct boot drive. It still does not boot, and I am not sure what to think of it. Please help. UPDATE: Just performed these steps:

        1. Backed up the current disk C: (without the MBR) to an external hard drive using True Image.
        2. Ran a clean Windows XP installation, deleting all partitions and creating a new one. The hard drive booted fine into the Windows GUI installation!
        3. Interrupted the installation, booted from the True Image recovery CD and restored the archive of disk C to the new partition. Same issue with the black screen.
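
    A hedged suggestion for one more Recovery Console step before giving up on the restored image: a stale boot configuration can also leave XP at a black screen, and rebuilding it is cheap to try:

        bootcfg /rebuild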

    Read the article

  • PHP session files have permissions of 000 - they're unusable

    - by vanced
    I kept having issues with a Document Management System I'm trying to install: at the first step of the installation process it errors with:

        Warning: Unknown: open(/tmp/sess_d39cac7f80834b2ee069d0c867ac169c, O_RDWR) failed: Permission denied (13) in Unknown on line 0
        Warning: Unknown: Failed to write session data (files). Please verify that the current setting of session.save_path is correct (/tmp) in Unknown on line 0

    I looked in /tmp and saw that the sess_* files have the following permissions:

        ---------- 1 vanced vanced 1240 Jan 20 08:48 sess_d39cac7f80834b2ee069d0c867ac169c

    All the session files look like this. So obviously they're unusable by PHP, and it's causing me lots of problems. How can I get PHP to set the correct permissions? I've tried changing the directory php.ini uses to /tmp/phpsessions and the same thing occurs. The directories are a+rwx.
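
    Whatever zeroed the permissions, giving the application its own session directory, owned by the web server user and closed to everyone else, sidesteps the shared /tmp entirely. A sketch with a hypothetical path and www-data assumed as the web server account; afterwards, point session.save_path in php.ini at the new directory:

        sudo mkdir -p /var/lib/php/app_sessions
        sudo chown www-data:www-data /var/lib/php/app_sessions
        sudo chmod 700 /var/lib/php/app_sessions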

    Read the article

  • Apache2: How do I restrict access to a directory, but allow access to one file within it?

    - by Nick
    I've inherited a poorly designed web app, which has a certain file that needs to be publicly accessible, but that file is inside a directory which should not be. In other words, I need a way to block all files and sub-directories within a directory, but override that for a single file. I'm trying this:

        # No one needs to access this directly
        <Directory /var/www/DangerousDirectory/>
            Order Deny,allow
            Deny from all
            # But this file is OK:
            <Files /var/www/DangerousDirectory/SafeFile.html>
                Allow from all
            </Files>
        </Directory>

    But it's not working: it just blocks everything, including the file I want to allow. Any suggestions?
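
    <Files> takes a filename or wildcard, not a full path, which is likely why the override never matches anything. A sketch of the same structure with just the basename (with Order deny,allow, a matching Allow overrides the blanket Deny):

        <Directory /var/www/DangerousDirectory/>
            Order deny,allow
            Deny from all
            <Files "SafeFile.html">
                Allow from all
            </Files>
        </Directory>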

    Read the article

  • Allow PHP to write file without 777

    - by camerongray
    I am setting up a simple website on webspace provided by my university. I do not have database access so I am storing all the data in a flat file. The issue I am experiencing is related to file permissions. I need PHP to be able to read and write the data file but I don't really want to set the file to 777 as anybody else on the system could modify it, they already have read access to everyone's web directories. Does anyone have any ideas on how to accomplish this? Thanks in advance
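
    The right fix depends on which user PHP runs as on that host. A sketch of both cases (data.txt and the www-data group are hypothetical names):

        # if PHP runs as your own account (suPHP/CGI, common on shared hosting),
        # the file needs no permissions for anyone else at all:
        chmod 600 data.txt

        # if PHP runs as the web server user (mod_php), group write avoids 777,
        # assuming the host lets you assign the web server's group:
        chgrp www-data data.txt
        chmod 660 data.txt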

    Read the article

  • Find command: exclude files whose paths match a certain pattern

    - by user40570
    I have a find command that looks for files that were modified recently and lists them with their dates:

        find /path/on/server -mtime -1 -name '*.js' -exec ls -l {} \;

    I would like it to exclude any deeply nested folder that matches a certain pattern; e.g., a number of folders have a "statistics" directory and ".svn" directories. So I'd like to be able to say: if the file that was modified yesterday is in a folder named statistics, ignore it. Or perhaps not search for files in those folders at all.
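
    GNU find can do this either by filtering matches out or, more efficiently, by pruning so the unwanted directories are never descended into; a sketch:

        # filter variant: still walks .svn/statistics, then discards matches
        find /path/on/server -mtime -1 -name '*.js' \
            -not -path '*/.svn/*' -not -path '*/statistics/*' -exec ls -l {} \;

        # prune variant: never enters those directories at all
        find /path/on/server -type d \( -name .svn -o -name statistics \) -prune \
            -o -mtime -1 -name '*.js' -exec ls -l {} \;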

    Read the article

  • Facing difficulty with migrating from WordPress to Drupal

    - by rakibtg
    One of my blogs was built on WordPress, but now I want to use Drupal as its CMS. To do so I deleted all the WordPress files from my server, along with the database and MySQL user associated with the WordPress blog, and uploaded the Drupal files into the server directory where the WordPress files were. But when I open the blog, it still shows the WordPress blog! It has been deleted, and the Drupal installation interface should be there instead. So I re-checked my server directories and database: there are no WordPress files and the WP database is gone; there are only the Drupal files. But when I go to the blog to install Drupal, the WordPress blog is still there. I have checked the blog in many web browsers, so it is not a browser cache problem. My hosting server is Linux-based. I can't understand what to do. Any ideas? Thanks

    Read the article

  • Server 2008 R2 file access permissions

    - by Napster100
    I'm finding it awkward to sort out permissions for file sharing and access on my LAN. I've created an account on the server node (as a normal user) and shared a drive that has two folders at the root: one for personal file storage and the other for shared files. If I connect to the shared area from a workstation running Windows 7 and log in using the account I created on the server, I can look through directories but can't look inside some of them (which I wanted, as I changed the permissions for that to happen). My problem is that although the permissions are set for this user account to have full control of a specific folder, I can't create a folder in that area or upload files to it. Could someone explain why this is? Thanks in advance
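
    A classic cause: share-level permissions stuck at Read while NTFS grants Full Control, since the effective right is the more restrictive of the two. It is worth checking the share's own permissions tab; when (re)creating a share from the command line on Server 2008 R2, the grant can be set explicitly (share and account names below are hypothetical):

        net share Users=D:\Users /GRANT:EXAMPLE\jsmith,CHANGE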

    Read the article

  • DFS Replication, Users HOME folder - seems not to catch all files... any hints?

    - by TomTom
    I am moving stuff off a file server, using DFS for it; the folders are in a DFS tree anyway, so I can set up a replication temporarily, then drop the old folder. This works nicely, EXCEPT for the folder containing the users' home drives, which, incidentally, is also the one where I cannot see all files due to my permissions. Small setup: we have 159 MB in the user directories, 1280 files, 133 folders in the original. The copy has only 157 MB, 1269 files, 133 folders. Does anyone know of a way to find out which files are missing? Is this even a problem (it could be some cache files that get regenerated)? Users are all offline (weekend) ;) This is pretty much the last share; all the others had exactly ZERO issues.
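
    One documented DFSR behaviour worth checking: files carrying the Temporary attribute are never replicated, which can quietly account for small file-count gaps like this. A PowerShell sketch (path hypothetical) to list such files:

        Get-ChildItem D:\Users -Recurse -Force |
            Where-Object { $_.Attributes -band [IO.FileAttributes]::Temporary }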

    Read the article
