Search Results

Search found 11262 results on 451 pages for 'important directories'.


  • Broken characters in filenames only in some directories

    - by Kaivosukeltaja
    We have a web server running CentOS 5.8 that uses SVN for version control. When trying to switch to the latest revision, we got an error about the filenames of files in an upload directory:

        svn: Error converting entry in directory 'adm/emails/upload' to UTF-8
        svn: Valid UTF-8 data (hex: 54 79) followed by invalid UTF-8 sequence (hex: f6 6b 69 72)

    Upon investigating, we noticed there were some files that had broken filenames:

        $ ls ~/public_html/adm/emails/upload/
        Ty?el?m?trendit.csv  Ty?kirja1.csv

    To get the update completed quickly, we simply moved the files into our home directory. Surprisingly, their filenames looked fine in their new location:

        $ ls ~/
        Työelämätrendit.csv  Työkirja1.csv

    After the update we moved them back to where they were and their filenames were broken again. What could cause this and how can we fix it? The system's locale is set to LANG=en_US.UTF-8.
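    The invalid byte 0xf6 is "ö" in ISO-8859-1, so one plausible explanation is that those two filenames are stored as Latin-1 byte sequences (created under a different locale, or on a separately mounted filesystem whose mount charset differs), which an en_US.UTF-8 terminal and SVN cannot decode. If that turns out to be the cause, the convmv tool can re-encode the names; a hedged sketch (convmv only previews until --notest is given):

        # dry run first: show what would be renamed
        convmv -f iso-8859-1 -t utf-8 ~/public_html/adm/emails/upload/*

        # apply the rename once the preview looks right
        convmv -f iso-8859-1 -t utf-8 --notest ~/public_html/adm/emails/upload/*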

    Read the article

  • Using wget to recursively download whole FTP directories

    - by user9406
    I want to copy all of the files and folders from one host to another. The files on the old host sit at /var/www/html, I only have FTP access to that server, and I can't TAR all the files. A regular FTP connection to the old host drops me into the /home/admin folder. I tried running the following command from my new server:

        wget -r ftp://username:[email protected]

    But all I get is a generated index.html file. What is the right syntax for using wget recursively over FTP?
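    With wget, the path in an FTP URL is taken relative to the login directory (here /home/admin), so with no path you only get a listing rendered as index.html. A hedged sketch, with USER, PASS and HOST as placeholders (a leading double slash is commonly used to force an absolute path; %2F is the percent-encoded equivalent if your build dislikes it, and --cut-dirs may need adjusting):

        # mirror /var/www/html; -m = -r -N -l inf, -nH drops the hostname directory,
        # --cut-dirs=3 strips var/www/html from the local paths
        wget -m -nH --cut-dirs=3 ftp://USER:PASS@HOST//var/www/html/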

    Read the article

  • Trouble with nginx and serving from multiple directories under the same domain

    - by Phase
    I have nginx setup to serve from /usr/share/nginx/html, and it does this fine. I also want to add it to serve from /home/user/public_html/map on the same domain. So:

        my.domain.com     would get you the files in /usr/share/nginx/html
        my.domain.com/map would get you the files in /home/user/public_html/map

    With the below configuration (/etc/nginx/nginx.conf) it appears to be going to my.domain.com/map/map, as noticed by this:

        2011/03/12 09:50:26 [error] 2626#0: *254 "/home/user/public_html/map/map/index.html" is forbidden
        (13: Permission denied), client: <edited ip address>, server: _, request: "GET /map/ HTTP/1.1", host: "<edited>"

    I've tried a few things but I'm still not able to get it to cooperate, so any help would be greatly appreciated.

        #######################################################################
        # This is the main Nginx configuration file.
        #######################################################################

        #----------------------------------------------------------------------
        # Main Module - directives that cover basic functionality
        #----------------------------------------------------------------------
        user              nginx;
        worker_processes  1;
        error_log         /var/log/nginx/error.log;
        pid               /var/run/nginx.pid;

        #----------------------------------------------------------------------
        # Events Module
        #----------------------------------------------------------------------
        events {
            worker_connections  1024;
        }

        #----------------------------------------------------------------------
        # HTTP Core Module
        #----------------------------------------------------------------------
        http {
            include       /etc/nginx/mime.types;
            default_type  application/octet-stream;

            log_format  main  '$remote_addr - $remote_user [$time_local] "$request" '
                              '$status $body_bytes_sent "$http_referer" '
                              '"$http_user_agent" "$http_x_forwarded_for"';

            access_log  /var/log/nginx/access.log  main;

            sendfile           on;
            keepalive_timeout  65;

            server {
                listen       80;
                server_name  _;

                #access_log  logs/host.access.log  main;

                location / {
                    root   /usr/share/nginx/html;
                    index  index.html index.htm;
                }

                location /map {
                    root   /home/user/public_html/map;
                    index  index.html index.htm;
                }

                error_page  404  /404.html;
                location = /404.html {
                    root  /usr/share/nginx/html;
                }

                error_page  500 502 503 504  /50x.html;
                location = /50x.html {
                    root  /usr/share/nginx/html;
                }
            }

            include /etc/nginx/conf.d/*.conf;
        }
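    The doubled /map/map comes from how nginx's root directive works: the full request URI is appended to root, so "location /map { root /home/user/public_html/map; }" maps /map/index.html to /home/user/public_html/map/map/index.html. Two common fixes, shown as a hedged sketch (either point root at the parent directory, or use alias, which replaces the matched prefix); note the "(13: Permission denied)" also suggests the nginx worker user may need execute permission on /home/user and below:

        # Option 1: root is the parent; nginx appends /map itself
        location /map {
            root   /home/user/public_html;
            index  index.html index.htm;
        }

        # Option 2: alias replaces the matched /map/ prefix outright
        location /map/ {
            alias  /home/user/public_html/map/;
            index  index.html index.htm;
        }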

    Read the article

  • icacls batch file multiple directories with wildcards help needed

    - by user153521
    I have written the following batch file that does a great job combing through all folders beginning with the number 3 and applying folder permissions to any 2010 subfolder. An example of the batch file is below:

        for /D %%f in (D:\Data\3*) do icacls "%%f\2010" /inheritance:r /grant:r "Domain Admins":(OI)(CI)F

    Question: How can I improve this script to allow me to apply the permissions to a specific folder below ANY folder within the folders beginning with 3? Here is an example of my failed attempt:

        for /D %%f in (D:\Data\3*) do icacls "%%f*\specificfolder" /inheritance:r /grant:r "Domain Admins":(OI)(CI)F
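    A wildcard in the middle of a path isn't expanded by icacls, so the failed attempt never matches anything. One hedged approach is a second, nested for /D that walks the subfolders of each 3* folder and applies the ACL only where the target exists ("specificfolder" stays as the placeholder name from the question):

        for /D %%f in (D:\Data\3*) do for /D %%g in ("%%f\*") do if exist "%%g\specificfolder" (
            icacls "%%g\specificfolder" /inheritance:r /grant:r "Domain Admins":(OI)(CI)F
        )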

    Read the article

  • Defining Virtual and Real User Directories with Dovecot & Postfix

    - by blankabout
    Following a wobble described in this question we now have virtual and real users authenticating with Dovecot. The problem now is that the real users (who have been on the system for years) can no longer access their mail. I'm guessing that it is because Dovecot is configured to point to the virtual mailboxes but not the real mailboxes. These are snippets from the config files:

        /etc/dovecot/dovecot.conf
        !include conf.d/*.conf

        /etc/dovecot/conf.d/10-auth.conf
        passdb {
            driver = passwd-file
            # Path for passwd-file. Also set the default password scheme.
            args = scheme=cram-md5 /etc/cram-md5.pwd
        }
        userdb {
            driver = static
            #args = mail_uid=dovecot mail_gid=dovecot /etc/dovecot/userdb
            args = uid=vmail gid=vmail home=/var/spool/vhosts/%d/%n /etc/dovecot/userdb
        }

        [email protected]:::::/var/spool/vhosts/virtualdomain.com/:/bin/false::

    We think the problem is that the Dovecot file 10-auth.conf does not contain a method of accessing the mailboxes for the real users. We have looked around on this site, dovecot.org and done the usual googling but cannot find anywhere that describes how to set up virtual users alongside legacy real users. Any help would be appreciated, especially by our real users who would like the contents of their inboxes to be available! If any further config is required, please let me know.
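    Dovecot 2.x allows several passdb and userdb blocks in 10-auth.conf and consults them in order, so one approach is to add a pair for the real system accounts alongside the existing passwd-file/static pair. A minimal sketch only, not a drop-in config: ordering matters, a static userdb matches every user (so it usually has to come last or be replaced with something more selective), and CRAM-MD5 cannot be verified through PAM, so the allowed auth mechanisms need checking too:

        # real system users: authenticate via PAM, look up home/uid/gid from /etc/passwd
        passdb {
            driver = pam
        }
        userdb {
            driver = passwd
        }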

    Read the article

  • Automatically sync directories via FTP during off hours?

    - by jason
    I would like COMPUTER_A to sync anything found in a specific directory with my FTP server, COMPUTER_B, but only during off hours. I would like it to automatically resume if the computer is rebooted. I use FileZilla now, but it will not automatically continue transferring when the computer is restarted, and I do not think you can set the times for the transfers with FileZilla.
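    One hedged approach, assuming COMPUTER_A is Unix-like (or has lftp available): run an lftp reverse mirror from cron only during the off-hours window. lftp only uploads newer files and can continue interrupted transfers, so a rerun after a reboot simply picks up where it left off. Paths, user, password, host and the schedule below are all placeholders:

        # /etc/cron.d/ftp-sync - run hourly between 01:00 and 05:59
        0 1-5 * * *  youruser  lftp -u USER,PASS -e "mirror -R --only-newer --continue /path/to/dir /remote/dir; quit" ftp://COMPUTER_B

    If COMPUTER_A is a Windows box, a comparable setup would be a WinSCP script using its synchronize command, started by Task Scheduler during the same window.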

    Read the article

  • LDAP + NFS + automount home directories permissions issue

    - by noobishguy
    When an LDAP user logs into the system they have incorrect permissions to their home directory. LDAP and NFS services exist on the same server. The directory shows the correct ownership / permissions:

        drwx------. 4 ldaptest ldaptest 4096 Jun 9 2014 ldaptest

    however the UID / GID do not match those on the server.

    Client:

        bash-4.1$ id
        uid=10001(ldaptest) gid=10001(ldaptest) groups=10001(ldaptest) context=unconfined_u:unconfined_r:unconfined_t:s0-s0:c0.c1023

    Server:

        [root@ldap1 log]# id ldaptest
        uid=502(ldaptest) gid=502(ldaptest) groups=502(ldaptest)

    How do I resolve this?
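    This looks like the classic case of the server resolving ldaptest from a local /etc/passwd entry (uid 502) while the client takes the account from LDAP (uid 10001); with sec=sys NFS compares numeric IDs, so the two must agree. A hedged way to confirm which source wins on each box (the base DN is a placeholder):

        # what does the directory itself say?
        ldapsearch -x -b "dc=example,dc=com" "(uid=ldaptest)" uidNumber gidNumber

        # which sources does each machine consult, and in what order?
        grep ^passwd /etc/nsswitch.conf

        # what does each machine ultimately resolve?
        getent passwd ldaptest

    Once confirmed, either remove the duplicate local account on the server or make the uidNumber/gidNumber consistent everywhere, then chown the exported home directory to the agreed IDs.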

    Read the article

  • Windows script to create directories of 3,000 files

    - by uhpl1
    We have some email archiving that is dumping all the emails into a directory. For performance reasons on the server, I want to set up an automated task that will run a script once a day and, if there are more than 3,000 (or whatever number) files in the main directory, create a new directory named with the date and move all the main directory's files into it. I'm sure someone has already written something similar, so if anyone could point me at it that would be great. A batch file or PowerShell would both be fine.
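    A minimal PowerShell sketch of the idea, to be scheduled daily with Task Scheduler; the archive path and the threshold are placeholders, and the -File switch needs PowerShell 3.0 or later:

        $src       = 'D:\MailArchive'   # placeholder: the directory the archiver dumps into
        $threshold = 3000

        $files = Get-ChildItem -Path $src -File
        if ($files.Count -gt $threshold) {
            # create a dated subdirectory and move everything into it
            $dest = Join-Path $src (Get-Date -Format 'yyyy-MM-dd')
            New-Item -ItemType Directory -Path $dest -Force | Out-Null
            $files | Move-Item -Destination $dest
        }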

    Read the article

  • moving files and directories between two machines, via a third, preserving permissions and usernames

    - by Jarmund
    The situation is as follows:

        Machine A has a file repository accessible via rsync
        Machine B needs the above mentioned files with all permissions and ownerships intact (including groups etc)
        Machine C has access to both A and B, but has a completely different set of users

    Normally I would just rsync everything over directly between A and B, but due to severely limited bandwidth at the moment I need something different, as rsync times out after building the list of the 430 files (49 MB uncompressed... can be compressed down to ~7 MB). What I've tried so far: rsync everything over from A to C, tar it, copy the tarball over, and then untar it; however, this messes up the ownership and/or the permissions. To rsync it from A to C, I run this command:

        rsync --numeric-ids --password-file=/root/rsync_pwd_file -oaPvu rsync://[email protected]/portal_2/ ./portal_2/

    ...and from the looks of things they do end up on C with the correct ownerships/permissions/flags/everything (not 100% sure, though... are there any more switches I can throw in there? Did I miss something?). Copying the tarball over is simple enough (slow as a one-legged turtle due to the bandwidth, but it checksums out alright). What I'm unsure of is the flags and switches for creating and extracting the tarball, so could someone please provide the full commands for creating a tarball from /root/portal_2 on machine C (with everything intact) and extracting the tarball into /var/ex/portal_2 on machine B? Also, are there any other approaches worth mentioning that could allow me to perform this? I have root access to A and C, whereas I only have rsync access to B.

    PS: I'm running rsync v2.6.9 on machine B, and unfortunately I do not have the opportunity to upgrade to v3.
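    Assuming GNU tar on both ends, the combination that preserves the most is -p (permissions) together with --numeric-owner on both create and extract, with the extraction done as root so ownership can actually be restored. A hedged sketch:

        # on machine C (as root): archive /root/portal_2, keeping numeric uid/gid
        tar --numeric-owner -czpf portal_2.tar.gz -C /root portal_2

        # on machine B (as root): unpack into /var/ex, restoring owners by number
        mkdir -p /var/ex
        tar --numeric-owner -xzpf portal_2.tar.gz -C /var/ex

    The catch is the stated access: without root on B, no tar flags can restore foreign ownership. In that case it may be simpler to make the direct A-to-B rsync survive the slow link instead, e.g. with --bwlimit and a larger --timeout.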

    Read the article

  • IIS, SSL, and Virtual Directories

    - by yodie
    I'm running a webserver on WS 2k3, IIS 6.0. Some of the content is on that server, but most is in a virtual directory linked to another server. Everything works (almost) fine when no SSL is used. However, when using SSL, I cannot access the files in the virtual directory. Instead I get a generic error 500. Any advice?

    Read the article

  • Restrict Apache to only allow access using SSL for some directories

    - by DrStalker
    I have an Apache 2.2 server with an SSL certificate hosting several services that should only be accessed using SSL. That is:

        https://myserver.com/topsecret/ should be allowed
        http://myserver.com/topsecret/  should be either denied or, ideally, redirected to https

    http://myserver.com/public should not have this restriction, and should work using either http or https. The decision to allow/deny http is made at the top-level directory and affects all content underneath it. Is there a directive that can be placed in the Apache config to restrict access in this manner?
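    mod_ssl's SSLRequireSSL directive denies plain-HTTP access per directory, and a mod_rewrite rule in the port-80 virtual host can turn the denial into a redirect instead. A hedged sketch, assuming /topsecret maps to a real directory under the DocumentRoot (the path below is a placeholder):

        # hard refusal of plain HTTP for one tree
        <Directory "/var/www/html/topsecret">
            SSLRequireSSL
        </Directory>

        # or, in the port-80 vhost: redirect /topsecret to https instead of denying it
        RewriteEngine On
        RewriteCond %{HTTPS} !=on
        RewriteRule ^/topsecret(.*)$ https://myserver.com/topsecret$1 [R=301,L]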

    Read the article

  • Sharing directories in Windows 7

    - by CoryR
    I've created a directory named "Shared" in my "My Documents" directory (C:\Users\Cory.MYDOMAIN\Documents\Shared). I right-clicked on the Shared folder and publicly shared this one directory. At least, that's what I wanted to do. In reality, Windows 7 shared the entire C:\Users directory. How can I share this one subdirectory without granting access to the rest of the C:\Users tree?
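    The "Share with" wizard on a folder inside a user profile typically shares the parent Users folder and relies on NTFS permissions to limit what's visible, which is how the whole tree ends up exposed. Creating the share explicitly avoids that; a hedged sketch from an elevated command prompt (share name and permissions below are just examples):

        net share SharedDocs="C:\Users\Cory.MYDOMAIN\Documents\Shared" /GRANT:Everyone,READ

        rem list existing shares; if an unintended "Users" share was created and
        rem nothing else relies on it, it can be removed
        net share
        net share Users /DELETE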

    Read the article

  • Untar multiple files in different directories

    - by wildeep
    Hi, I am trying to perform an update on multiple sites that use an open source CMS by untarring a patch file in each site's httpdocs directory. My plan was to perform a find for the patch file and then untar it using the following command:

        find . -name "patchfile.tar.gz" -exec tar -xzvf {} \; -print

    but it doesn't seem to work successfully. Does anyone have any ideas as to why not? Many thanks.
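    The likely culprit: -exec runs tar from the directory where find itself was started, so every archive gets unpacked into that one working directory rather than next to the tarball. GNU find's -execdir runs the command from the matched file's own directory, which is probably what was intended; a hedged sketch:

        # unpack each patchfile.tar.gz in the directory it was found in
        find . -name "patchfile.tar.gz" -execdir tar -xzvf {} \; -print

        # portable alternative if -execdir is unavailable
        find . -name "patchfile.tar.gz" -exec sh -c 'tar -xzvf "$1" -C "$(dirname "$1")"' _ {} \;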

    Read the article

  • Unable to synchronize local and remote directories ("set times: Operation not permitted")

    - by Tom Auger
    I'm running into FTP errors using software like NetBeans or WinSCP: whenever I attempt to perform a synchronization or update of files from local to server, I get errors on the client saying "set times: Operation not permitted". This is clearly an issue with the way I've configured my Fedora installation. The user that I'm logging in with cannot touch -t any of these files, though he IS part of a group that has r/w access on the files. I do have root / sudo access to this server. What I would like to know is:

        a) is it likely that this problem would be solved by allowing my FTP user to "touch -t" these files?
        b) how do I enable a certain user to set timestamps on files without giving them ownership of the files? (Certain of these files need to be owned by Apache, for instance, so I don't want to chown them.)

    Thanks in advance.

    Read the article

  • find directories in the current directory, older than 5 days and archive them

    - by user197284
    This is a basic question. I need to find folders in the current working directory (not recursively) and, if they are older than 5 days, archive them. Zip or tar.gz is fine. I can find the folders with the following command:

        find ./ -maxdepth 1 -type d -mtime +5

    And I know I can pass the output of find along using xargs, but I do not know how to archive each folder with its name intact. That is, the directory test1 should be archived to test1.zip and the directory "test2" should be archived to "test2.zip". Any inputs are welcome. Regards
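    One hedged approach that keeps the name (and copes with spaces) is to let find spawn a small shell per directory instead of going through xargs; note that -mtime +5 tests the directory's own mtime, not that of its contents:

        # one tar.gz per directory, named after the directory (test1 -> test1.tar.gz)
        find . -mindepth 1 -maxdepth 1 -type d -mtime +5 \
            -exec sh -c 'd=${1#./}; tar -czf "$d.tar.gz" "$d"' _ {} \;

        # zip variant, if zip is installed
        find . -mindepth 1 -maxdepth 1 -type d -mtime +5 \
            -exec sh -c 'd=${1#./}; zip -r "$d.zip" "$d"' _ {} \;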

    Read the article

  • rsync directories

    - by Buzzzz
    Hello, I'm trying to sync my music collection between my Mac and my Linux workstation, but it fails to mirror from my server share to Linux.

        rsync -avz --progress Music/ /volumes/myserver/music

    works fine, but the reverse on my Linux workstation doesn't:

        rsync -avz --progress /path/to/samba/share/music/ ~/Music

    does nothing. Any clue what I have gotten wrong with this? Best Regards, Anders Olme
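    A hedged first step is to ask rsync what it thinks needs transferring: a dry run with itemized changes usually shows whether the source path is wrong or empty (e.g. /path/to/samba/share/music is not actually where the share is mounted on the Linux box), or whether everything is being skipped as already up to date because the SMB mount reports unreliable timestamps:

        # show what would be copied, and why, without copying anything
        rsync -avzn --itemize-changes /path/to/samba/share/music/ ~/Music/

        # if timestamps on the SMB mount are unreliable, compare by size only
        rsync -avz --size-only --progress /path/to/samba/share/music/ ~/Music/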

    Read the article

  • Rewrite 2 different directories with htaccess?

    - by jason
    I have a tricky problem (for me at least). I'm trying to rewrite / to the folder /webroot/www. I have some simple code and it works:

        RewriteRule ^$ /webroot/www/ [L]

    However, at the same time, if the URL starts with components followed by anything else (e.g. foo, as in /components/foo), and foo is an actual directory that exists inside components, I should rewrite to /components/foo/www instead. How can I achieve that? I can't seem to figure it out. I'm using Apache with .htaccess.
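    In a .htaccess context a RewriteCond can test the prospective target with the -d flag, using a back-reference to the pattern of the RewriteRule that follows it. A hedged sketch, assuming the .htaccess sits in the DocumentRoot and requests should end up under each component's www subfolder (the second condition guards against rewriting already-rewritten URLs):

        RewriteEngine On

        # / -> /webroot/www/
        RewriteRule ^$ webroot/www/ [L]

        # /components/foo/... -> /components/foo/www/... when "foo" is a real directory
        RewriteCond %{DOCUMENT_ROOT}/components/$1 -d
        RewriteCond $2 !^www(/|$)
        RewriteRule ^components/([^/]+)/?(.*)$ components/$1/www/$2 [L]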

    Read the article

  • Apache Detects files as directories

    - by Legit
    I have a file 'result.txt' in my document root. Now, when I access:

        http://localhost/result/first

    it's actually accessing this instead:

        http://localhost/result.txt/first

    What could I have misconfigured in my Apache config?

    EDIT: My rewrite rules are as follows:

        <IfModule mod_rewrite.c>
            Options +FollowSymLinks
            RewriteEngine On
        </IfModule>

        <IfModule mod_rewrite.c>
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteCond %{REQUEST_FILENAME} !-d
            RewriteRule ^(.*)$ index.php/$1 [L]
        </IfModule>
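    Mapping /result/first onto result.txt/first is what mod_negotiation's MultiViews does: it matches /result against any result.* file before your rewrite rules ever see the request. A hedged fix is to switch MultiViews off for that directory or vhost (in .htaccess this needs AllowOverride Options):

        <IfModule mod_negotiation.c>
            Options -MultiViews
        </IfModule>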

    Read the article

  • How to synchronize between differently structured directories using rsync (or other program)

    - by doetoe
    Do any of you know how to perform the following task? Suppose you have two directory trees, which I will call source and target. They may have a very different structure, but could contain many duplicate files. An example would be a structured collection of photographs on one hand (the target), and just a tmp directory in which you unload everything from your camera on the other (the source). Maybe some of these files are already in the structured directory tree. I would like to rsync from the source to the target, such that only the files from the source that are not already present anywhere in the target are copied.
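    rsync compares paths, not content, so it can't do this on its own across a restructured tree. One hedged approach is a small shell script that fingerprints everything already in the target and copies only those source files whose checksum is unseen; paths below are placeholders, MD5 is used purely as a content fingerprint, and name collisions in the drop directory are not handled:

        #!/bin/sh
        SRC=/path/to/camera/tmp          # placeholder: unstructured source
        DEST=/path/to/photos             # placeholder: structured target
        NEW="$DEST/incoming"             # where genuinely new files get dropped

        mkdir -p "$NEW"

        # fingerprint everything already in the target
        find "$DEST" -type f -exec md5sum {} + | awk '{print $1}' | sort -u > /tmp/dest.md5

        # copy only source files whose content is not yet present anywhere in the target
        find "$SRC" -type f | while IFS= read -r f; do
            h=$(md5sum "$f" | awk '{print $1}')
            grep -qx "$h" /tmp/dest.md5 || cp -p "$f" "$NEW/"
        done

    Existing duplicate finders such as fdupes can serve a similar purpose if you prefer not to script it.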

    Read the article

  • How to specify different Debug/Release output directories in QMake .pro file

    - by esavard
    I have a Qt project and I would like to output compilation files outside the source tree. I currently have the following directory structure:

        /
        |_ /build
        |_ /mylib
        |_ /include
        |_ /src
        |_ /resources

    Depending on the configuration (debug/release), I would like to output the resulting files inside the build directory, under the build/debug or build/release directories. How can I do that using a .pro file?
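    qmake's CONFIG(debug, debug|release) scope selects per-configuration settings, and DESTDIR (plus the *_DIR variables for intermediates) controls where output goes. A hedged sketch, assuming the .pro file lives one level below the project root so that ../build points at the build directory:

        CONFIG(debug, debug|release) {
            DESTDIR = ../build/debug
        } else {
            DESTDIR = ../build/release
        }

        # keep intermediate files out of the source tree as well
        OBJECTS_DIR = $$DESTDIR/.obj
        MOC_DIR     = $$DESTDIR/.moc
        RCC_DIR     = $$DESTDIR/.rcc
        UI_DIR      = $$DESTDIR/.ui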

    Read the article

  • is there any way to clean up old svn directories and files from an old source code tree

    - by oo
    I have been sent a directory tree of source code that I want to import into my Subversion repository. The issue is that at some point this code was in a different Subversion repository. There are a huge number of directories and subdirectories, and I basically want to clean out all of the Subversion .svn folders before I attempt to import into the new repository, as I don't want svn to get confused. Is there any way to clean out a directory structure to remove all svn references?
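    A hedged one-liner with find does this (run from the top of the tree, and preview first); alternatively, svn export of the old working copy produces a clean tree without any .svn folders:

        # preview what would be removed
        find . -type d -name .svn -prune -print

        # delete every .svn directory in the tree
        find . -type d -name .svn -prune -exec rm -rf {} +

        # alternative: produce a clean copy without .svn metadata
        svn export /path/to/old/working/copy /path/to/clean/copy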

    Read the article

  • SVN directories not showing up in localhost when using WAMP

    - by JsusSalv
    Hi: I recently installed WAMP for actual local use. I've worked on live development servers but now am working on localhost. I've managed to get multiple virtual hosts set up on my WAMP/Vista 64-bit box, but am having difficulty with directories pulled from SVN. I have four vhosts set up. Two work well and they are not tied to any SVN just yet. I'm also using TortoiseSVN, in case it makes any difference. However, the other projects are coming from SVN repositories. When I view these two projects I get the following error:

        Internal Server Error
        The server encountered an internal error or misconfiguration and was unable to complete your request.
        Please contact the server administrator, admin@localhost and inform them of the time the error occurred,
        and anything you might have done that may have caused the error.
        More information about this error may be available in the server error log.

    The way I set up the vhosts is as follows.

    httpd.conf:

        # Multiple Virtual Hosts
        <VirtualHost 127.0.0.1>
            ServerName localhost
            DocumentRoot "C:/wamp/www/"
        </VirtualHost>
        <VirtualHost 127.0.1.0>
            ServerName testone.local
            DocumentRoot "C:/wamp/www/root/projectone/"
        </VirtualHost>
        <VirtualHost 127.0.2.0>
            ServerName testtwo.local
            DocumentRoot "C:/wamp/www/root/projecttwo/"
        </VirtualHost>
        <VirtualHost 127.0.3.0>
            ServerName testthree.local
            DocumentRoot "C:/wamp/www/root/projectthree/"
        </VirtualHost>
        <VirtualHost 127.0.3.1>
            ServerName testfour.local
            DocumentRoot "C:/wamp/www/root/projectfour/"
        </VirtualHost>

    And here's the 'hosts' file:

        # Localhost
        127.0.0.1    localhost
        ::1          localhost

        # Project One
        127.0.1.0    testone.local

        # Project Two
        127.0.2.0    testtwo.local

        # Project Three
        127.0.3.0    testthree.local

        # Project Four
        127.0.3.1    testfour.local

    Everything works just fine. So if you want to tell me I'm doing something wrong then by all means point out a few things. But as it stands, it works and I'm content using different IPs and/or name-based vhosts. The problem comes in not being able to see the directories and files in the projects that are tied to an SVN. Whenever I visit http://testxxxx.local I get the error message at the top of this post. Please provide some suggestions. Thank you!
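    An "Internal Server Error" on a freshly checked-out project is most often caused by a directive in the project's own .htaccess that Apache is not allowed to honour (AllowOverride) or that needs a module which isn't enabled in WAMP (mod_rewrite is a frequent one); the Apache error log (typically C:\wamp\logs\apache_error.log) names the exact directive. A hedged sketch of a per-vhost Directory block in Apache 2.2 syntax that permits overrides while you track it down (repeat for each project vhost):

        <VirtualHost 127.0.1.0>
            ServerName testone.local
            DocumentRoot "C:/wamp/www/root/projectone/"
            <Directory "C:/wamp/www/root/projectone/">
                Options Indexes FollowSymLinks
                AllowOverride All
                Order allow,deny
                Allow from all
            </Directory>
        </VirtualHost>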

    Read the article
