Search Results

Search found 17278 results on 692 pages for 'directory conventions'.


  • Authentication in Apache2 with mod_dav_svn

    - by Poita_
    I'm having some trouble setting up authentication in Apache2 for a SVN repository that's being served using mod_dav_svn. Here is my Apache config for the directory: <Location /svn> DAV svn SVNParentPath /var/svn/repos AuthType Basic AuthName "Subversion Repository" AuthUserFile /etc/apache2/dev.passwd Require valid-user </Location> I can use svn with the projects under /var/svn/repos, so I know that the DAV is working, but when I do svn updates or commits (or anything), Apache doesn't ask for any authentication... It does the exact same thing whether the Auth directives are there or not. The permissions on the repository directory (and all subdirectories/files) only give permission to www-data (the Apache2 user/group). I have also ensured that all relevant modules are enabled (in particular mod_auth is enabled, as are all mod_dav* modules). Any ideas why svn commands aren't authenticating? Thanks in advance.
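
    A few hedged checks that often resolve this on a stock Debian/Ubuntu-style Apache 2.2 layout (the htpasswd user name below is a placeholder):

        # confirm the auth and dav modules are really loaded
        sudo a2enmod auth_basic authn_file dav dav_svn
        sudo apache2ctl -M | grep -E 'auth_basic|authn_file|dav_svn'
        # make sure the password file exists and holds at least one user
        sudo htpasswd -c /etc/apache2/dev.passwd someuser
        # look for a second <Location /svn> block (dav_svn.conf is a common one)
        # that may override this one without the Auth* directives, then reload
        grep -rn 'Location /svn' /etc/apache2/
        sudo apache2ctl configtest && sudo apache2ctl graceful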

    Read the article

  • Compiling Apache 2.2.11 on AIX 6.1, .so files not generated

    - by user176514
    I am compiling Apache 2 (2.2.11, yes, it's old, but it's a requirement) on AIX 6.1 with GCC 4.2.0. I am using the configure options: ./configure \ --enable-module=rewrite\ --enable-module=log_referer\ --with-included-apr \ --enable-proxy \ --enable-ssl=shared \ --with-ssl=/usr \ --prefix=/PATH/apache \ --enable-so \ --enable-mods-shared="proxy proxy_http proxy_connect headers mod_proxy mod_ssl" The configure and the make/make install steps all run without error of any kind. However, when I look in the /PATH/modules directory there are no .so files. Sadly, because of the nature of what I am doing and the business I am in, I am locked into the software versions described.
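
    Note that --enable-module=... is Apache 1.3 syntax, and in 2.2 --enable-mods-shared expects module names without the mod_ prefix, so some of those options may simply be ignored. A hedged re-run of configure (paths kept from the question) might look like:

        ./configure \
          --prefix=/PATH/apache \
          --with-included-apr \
          --enable-so \
          --enable-ssl=shared --with-ssl=/usr \
          --enable-rewrite=shared \
          --enable-headers=shared \
          --enable-proxy=shared --enable-proxy-http=shared --enable-proxy-connect=shared
        make && make install
        ls -l /PATH/apache/modules/*.so   # shared modules should appear here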

    Read the article

  • How can I upload a large number of files to Rackspace Cloud Files quickly?

    - by andy kim
    I have about a million image files in a single directory that I want to upload to Rackspace Cloud Files in the fastest and most efficient way. The python-cloudfiles script I am using uploads them over one connection at a time, which is very slow, so I would like to know about different approaches or Python script code. I thought uploading a single tar file and uncompressing it into the directory would be a better way, but Cloud Files does not support that. Does anyone know another way?
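
    Cloud Files has no server-side untar, so the usual workaround is simply to run many uploads in parallel. A rough sketch that drives an existing uploader with several processes at once (upload_batch.py is a placeholder for whatever python-cloudfiles script already works, adapted to take filenames as arguments):

        # feed filenames to 8 parallel uploader processes, 50 files per invocation
        find /path/to/images -maxdepth 1 -type f -print0 \
          | xargs -0 -n 50 -P 8 ./upload_batch.py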

    Read the article

  • rename multiple files with unique name

    - by psaima
    I have a tab-delimited list of hundreds of names in the format old_name<TAB>new_name, for example: apple orange, yellow blue. All of my files have unique names, end with the *.txt extension, and are in the same directory. I want to write a script that renames the files by reading my list, so apple.txt should be renamed to orange.txt. I have searched around but I couldn't find a quick way to do this. I can change one file at a time with 'rename' or with perl ("perl -p -i -e 's///g' *.txt"), and a few files with sed, but I don't know how to use my list as input and write a shell script that makes the changes for all files in a directory. I don't want to write hundreds of rename commands in a shell script. Any suggestions will be most welcome!
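
    A minimal sketch, assuming the mapping (including its old_name/new_name header row) is saved as rename_map.txt in the same directory as the *.txt files:

        #!/bin/bash
        # skip the header row, then rename old.txt -> new.txt for each mapping line
        tail -n +2 rename_map.txt | while IFS=$'\t' read -r old new; do
            [ -e "${old}.txt" ] && mv -v -- "${old}.txt" "${new}.txt"
        done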

    Read the article

  • How can I password-protect a Mac shared folder on a Windows workgroup?

    - by Phillip Oldham
    We have a Mac mini running 10.5.8 which already acts as a fileserver for our simple Windows (mixed XP/Vista) workgroup. The Mac mini is on the same workgroup and the files are shared via SMB, FTP, and AFP. Basic file-sharing is working, and has been for some time. We'd now like to add an additional directory/share which can be secured by a password so that only a small number of people on the network have access. Is this possible? I've already tried creating the additional folder on the Mac, adding it to the shared folders, and limiting it to a specific "shared user", however it's not possible to log in from an XP machine. Adding a sub-directory to the currently working share and limiting its access to the shared user doesn't work either.

    Read the article

  • Amazon EC2 Sign In

    - by Barry
    When I change the home directory of my Amazon EC2 instance from /home/ubuntu to /home/ubuntu/folder in the /etc/passwd file, I am no longer able to access the instance using my existing keypair. Once I switch it back to the original directory I have no problems and can log into my instance as normal. I have checked the permissions on the new folder and they are drwxr-xr-x, which is the same as the /home/ubuntu folder. I have a number of instances running at the minute, and because of this change I have no way of logging back into them to rectify the situation. Does anyone have an idea what is going on? Thanks in advance
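
    sshd looks for the key in .ssh/authorized_keys under whatever home directory /etc/passwd names, so moving the home without moving that file locks the keypair out; instances that are already unreachable generally need their root volume attached to another instance so the passwd change can be undone. For instances that are still reachable, a sketch of the missing step:

        # assuming the new home is /home/ubuntu/folder
        sudo mkdir -p /home/ubuntu/folder/.ssh
        sudo cp /home/ubuntu/.ssh/authorized_keys /home/ubuntu/folder/.ssh/
        sudo chown -R ubuntu:ubuntu /home/ubuntu/folder/.ssh
        sudo chmod 700 /home/ubuntu/folder/.ssh
        sudo chmod 600 /home/ubuntu/folder/.ssh/authorized_keys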

    Read the article

  • Laptop using 14.04 won't get past login GUI

    - by Dave M G
    My laptop was working perfectly yesterday, and now today I can't log in. At first, I was only getting a black screen. However, after following instructions in some questions here on AskUbuntu, I first reinstalled lightdm, and then I had to change the ownership of the file ~/.Xauthority to be my user name. Now, I get the log in GUI screen. However, once I enter my username, it flickers and then comes back to the login GUI. It does not matter if I use Gnome, Unity, or Gnome-Flashback. I don't know why lightdm needed to be reconfigured, or how ~/.Xauthority got changed, but in any case, what is still standing in my way, preventing me from logging in? Update: I have tried deleting the .Xauthority file, and the .profile files in my home directory. It has not changed anything. Logging in as guest also fails to work. The following commands did not work: mv ~/.config ~/.config.BAK mv ~/.cache ~/.cache.BAK Inside .xsession-errors in my home directory, it says: Gdk-CRITICAL: gdk_x11_display_get_xdisplay: assertion 'GDK_IS_DISPLAY (display)' failed
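
    A few common checks for this kind of lightdm bounce-back, run from a text console (Ctrl+Alt+F1); a sketch rather than a guaranteed fix:

        df -h / /home                    # a full filesystem is a classic cause of the login loop
        ls -l ~/.Xauthority              # must be owned by your user, not root
        sudo chown "$USER":"$USER" ~/.Xauthority
        tail -n 40 ~/.xsession-errors    # the Gdk-CRITICAL line quoted above came from here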

    Read the article

  • Is there a chroot build script somewhere?

    - by Nils
    I am about to develop a little script to gather information for a chroot jail. In my case this looks (at first glance) pretty simple: the application has a clean rpm install and puts almost all of its files into a sub-directory of /opt. My idea is: do a find of all binaries, check their library dependencies, record the results in a list, and rsync that list into the chroot target directory before the application starts. Now I wonder: is there any script around that already does such a job (perl/bash/python)? So far I have found only specialized solutions for single applications (like sftp-chroot). Update: I see three close-votes for the reason "off topic". This question arose because I have to install that ancient piece of software on a server at work. So if you still feel this is off-topic, leave a comment...
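
    The usual pattern is a short ad-hoc script along exactly those lines. A rough, untested sketch (APPDIR and CHROOT are placeholders, and the ldd parsing is the quick-and-dirty kind):

        #!/bin/bash
        APPDIR=/opt/APP            # where the rpm installed the application
        CHROOT=/srv/chroot/APP     # chroot target directory
        LIST=$(mktemp)

        # 1. executables, 2. the shared libraries they link against
        find "$APPDIR" -type f -perm -u+x > "$LIST"
        while read -r bin; do
            ldd "$bin" 2>/dev/null | awk '{ for (i = 1; i <= NF; i++) if ($i ~ /^\//) print $i }'
        done < "$LIST" | sort -u > "${LIST}.libs"

        # 3. copy everything, preserving paths, into the chroot
        sort -u "$LIST" "${LIST}.libs" > "${LIST}.all"
        rsync -a --files-from="${LIST}.all" / "$CHROOT"/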

    Read the article

  • Sharepoint 2007: Edit vs Read Only Mode

    - by user29116
    Sorry about the title, I don't really know what it should be. If I open a doc in read-only mode I'm able to press Save; a Save As box then opens with the default directory set to the directory on the SharePoint server, and if you press Save you save the document to the server. This makes the whole process not really "read only", since I could actually update the document. Is there a way to prevent this, so that if someone chooses read only there is no way to upload any changes back to the SharePoint site? Also, it has been suggested as a solution to get rid of the edit/read-only option so that people have to check out the document. Is there a way to remove the edit/read-only option on documents?

    Read the article

  • What do I do about dependencies when installing wine1.7 on 14.04?

    - by user285207
    user@chrubuntu:~$ sudo apt-get install wine1.7 [sudo] password for user: user Sorry, try again. [sudo] password for user: Reading package lists... Done Building dependency tree Reading state information... Done Some packages could not be installed. This may mean that you have requested an impossible situation or if you are using the unstable distribution that some required packages have not yet been created or been moved out of Incoming. The following information may help to resolve the situation: The following packages have unmet dependencies: wine1.7 : Depends: wine1.7-i386 (= 1:1.7.19-0ubuntu2~trusty2) but it is not installable Recommends: gnome-exe-thumbnailer but it is not going to be installed or kde-runtime but it is not going to be installed Recommends: ttf-mscorefonts-installer but it is not going to be installed Recommends: fonts-horai-umefont but it is not going to be installed Recommends: fonts-unfonts-core but it is not going to be installed Recommends: ttf-wqy-microhei Recommends: winetricks but it is not going to be installed E: Unable to correct problems, you have held broken packages. user@chrubuntu:~$ I am trying to install wine1.7 on Ubuntu 14.04 64 bit, and I'm not sure what this means; help is greatly appreciated. I already ran sudo apt-get update and get this: Reading package lists... Done W: Duplicate sources.list entry http://dl.google.com/linux/chrome/deb/ stable/main amd64 Packages (/var/lib/apt/lists/dl.google.com_linux_chrome_deb_dists_stable_main_binary-amd64_Packages) W: You may want to run apt-get update to correct these problems So I run apt-get update and: E: Could not open lock file /var/lib/apt/lists/lock - open (13: Permission denied) E: Unable to lock directory /var/lib/apt/lists/ E: Could not open lock file /var/lib/dpkg/lock - open (13: Permission denied) E: Unable to lock the administration directory (/var/lib/dpkg/), are you root? This is all very stressful because I have been trying to get Wine for the past week and had to reinstall and IT STILL WON'T WORK.
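
    The "wine1.7-i386 ... not installable" line usually means the i386 foreign architecture is not enabled on a 64-bit install, and the lock/permission errors at the end just come from running apt-get update without sudo. A hedged sequence to try:

        sudo dpkg --add-architecture i386
        sudo apt-get update
        sudo apt-get install wine1.7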

    Read the article

  • ConfigurationErrorsException when serving images via UNC on IIS6

    - by Mark Richman
    I have a virtual directory in my web app which connects to a Samba share via UNC. I can browse the files via Windows Explorer without issue, but my web app throws a yellow screen with the following message: Description: An error occurred during the processing of a configuration file required to service this request. Please review the specific error details below and modify your configuration file appropriately. Parser Error Message: An error occurred loading a configuration file: Could not find file '\\cluster\cms\qa-images\120400\web.config'. What makes no sense to me is why it's looking for a web.config in that location. I know it's not an authentication issue because the virtual directory can serve images from its root (i.e. \\cluster\cms\qa-images\test.jpg serves as http://myserver/upload/test.jpg just fine).

    Read the article

  • Upgrading Ubuntu (32-bit) 10.10 -> 11.04 fails and causes a kernel panic on boot

    - by Ubuntu Upgrade
    On an Ubuntu 10.10 machine I upgraded to Ubuntu 11.04 using the Update Manager. The upgrade fails and leaves the system in an unstable state; when I reboot I get a kernel panic on boot. The error points to /opt/abc/runtime/lib/libc.so.6. Researching this, I found that a piece of third-party software, abc, causes the problem. It has its own runtime (libc) library, and in the /lib directory there is a symlink /lib/ld-abc.so.2 -> /opt/abc/runtime/lib/ld-linux.so.2. If we rename this file to /lib/abc.so.2 or remove it, the upgrade succeeds. Here is the part of the upgrade log where it crashes (apt-term.log): ===== Services restarted successfully. Processing triggers for libc-bin ... ldconfig deferred processing now taking place /usr/bin/dpkg: /opt/abc/runtime/lib/libc.so.6: version `GLIBC_2.11' not found (required by /usr/bin/dpkg) /usr/bin/dpkg: /opt/abc/runtime/lib/libc.so.6: version `GLIBC_2.8' not found (required by /lib/libselinux.so.1) ===== Could you please tell me what the problem is with having a runtime loader link in the /lib directory? Does the Ubuntu upgrade check third-party runtimes as well?
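
    If the workaround already found (taking the abc loader link out of /lib) is acceptable, a hedged sequence for the next attempt might be:

        # park the third-party loader link outside /lib for the duration of the upgrade
        sudo mv /lib/ld-abc.so.2 /root/ld-abc.so.2.saved
        sudo do-release-upgrade            # or Update Manager's upgrade button
        # after the upgraded system boots cleanly:
        sudo mv /root/ld-abc.so.2.saved /lib/ld-abc.so.2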

    Read the article

  • How to configure installed Ruby and gems?

    - by NARKOZ
    My current gem env returns: RubyGems Environment: - RUBYGEMS VERSION: 1.3.6 - RUBY VERSION: 1.8.7 (2008-08-11 patchlevel 72) [x86_64-linux] - INSTALLATION DIRECTORY: /home/USERNAME/.gems - RUBYGEMS PREFIX: /home/narkoz - RUBY EXECUTABLE: /usr/bin/ruby1.8 - EXECUTABLE DIRECTORY: /home/USERNAME/.gems/bin - RUBYGEMS PLATFORMS: - ruby - x86_64-linux - GEM PATHS: - /home/USERNAME/.gems - /usr/lib/ruby/gems/1.8 - GEM CONFIGURATION: - :update_sources => true - :verbose => true - :benchmark => false - :backtrace => false - :bulk_threshold => 1000 - "gempath" => ["/home/USERNAME/.gems", "/usr/lib/ruby/gems/1.8"] - "gemhome" => "/home/USERNAME/.gems" - REMOTE SOURCES: - http://rubygems.org/ How can I change path /home/USERNAME/ to my own without uninstalling? OS: Debian Linux
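
    The gem directories can be re-pointed without uninstalling anything by overriding GEM_HOME and GEM_PATH (or the matching gemhome/gempath keys in ~/.gemrc). A sketch, using the /home/narkoz prefix shown in the output above:

        # in ~/.bashrc or equivalent
        export GEM_HOME=/home/narkoz/.gems
        export GEM_PATH=/home/narkoz/.gems:/usr/lib/ruby/gems/1.8
        export PATH="$GEM_HOME/bin:$PATH"
        # `gem env` should then report the new INSTALLATION DIRECTORY and GEM PATHS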

    Read the article

  • How to compare files/directories on 2 separate Solaris boxes?

    - by chz
    Hi friends, I have 2 Solaris boxes and I need to check certain directories (on the local filesystem and mounted NFS) to make sure that they match up on both boxes, and to delete or move any mismatches elsewhere on the local filesystem. I looked into Unix commands like rsync and tree, but it appears that these are not supported on my Solaris boxes. What is the least painful approach to this problem: use rsync or tree and then diff the outputs, or use find? I have trouble limiting the find command to certain directories, as there are mounted folders containing too many XML files that I don't care much about. How do I search multiple directory paths with a single find command? Thanks. Sincerely
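
    With only stock Solaris tools, one workable pattern is to build a checksum manifest on each box and diff the two. A sketch (directory names are illustrative); note that find accepts several start directories in one invocation, which covers the last question:

        # on box A (repeat on box B, writing manifest.boxB)
        find /etc/app /opt/app /export/data -type f -exec cksum {} \; | sort -k 3 > /tmp/manifest.boxA
        # copy the other box's manifest across (scp/ftp), then on either box:
        diff /tmp/manifest.boxA /tmp/manifest.boxB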

    Read the article

  • Apache deny access to images folder, but still able to display via <img> on site

    - by jeffery_the_wind
    I have an images folder on my site, let's call it /images/, where I keep a lot of images. I don't want anyone to have direct access to the images via the web, so I put a new block in my Apache config that achieves this: <Directory "/var/www/images/"> Options Includes AllowOverride All Order allow,deny Deny from All </Directory> This is working, but it is blocking out ALL ACCESS, and I can't show the images anymore through my web pages. I guess this makes sense. So how do I selectively control access to these images? Basically I only want to display certain images through certain webpages and to certain users. What is the best way to do this? Do I need to save the images to the database? Tim
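
    Deny from All blocks the <img> requests as well, because those are ordinary HTTP requests from the visitor's browser. If the goal is only to stop other sites and casual direct linking, a Referer check is the usual (weak, but simple) compromise; the sketch below writes one into the images directory, with example.com standing in for the real hostname. Per-user control beyond that generally means serving the images through a script that checks the session:

        # create a Referer-based allow rule in the images directory (AllowOverride All is already set)
        {
          echo 'SetEnvIfNoCase Referer "^https?://(www\.)?example\.com/" local_ref=1'
          echo 'Order Allow,Deny'
          echo 'Allow from env=local_ref'
        } | sudo tee /var/www/images/.htaccess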

    Read the article

  • Apache update to 2.2.23 on Ubuntu 12.04 LTS

    - by user1802200
    We ran a PCI scan on one of our servers running Ubuntu 12.04 LTS with Apache 2.2.22. The scan reported a vulnerability in Apache 2.2.22 (Apache HTTP Server Zero-Length Directory Name in LD_LIBRARY_PATH Vulnerability). The report says to upgrade to the latest stable release, either 2.2.23 or 2.4.2. How do I upgrade to 2.2.23 to fix the vulnerability, or is there a patch available that fixes this, and if so, how can it be applied? Also, is the latest apache2 2.2.22 package available for Ubuntu 12.04 LTS already patched with the fix for the "Apache HTTP Server Zero-Length Directory Name in LD_LIBRARY_PATH Vulnerability"? Regards, Salil Phatak India
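
    Ubuntu normally backports security fixes into the packaged version rather than moving to a new upstream release, so the first thing worth checking is whether the 12.04 apache2 package already carries the fix (this scan finding is generally tracked as CVE-2012-0883). A sketch:

        apt-cache policy apache2
        apt-get changelog apache2 | grep -i -m 5 -e CVE-2012-0883 -e LD_LIBRARY_PATH
        # if nothing shows up, make sure the latest packaged build is installed:
        sudo apt-get update && sudo apt-get install --only-upgrade apache2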

    Read the article

  • CentOS Can't connect to FTP

    - by Steven
    I'm having troubles connecting to my ftp server. Here's what it says, Status: Connected Status: Retrieving directory listing... Command: PWD Response: 257 "/home/sxxxn" Command: TYPE I Response: 200 Switching to Binary mode. Command: PASV Error: Connection timed out Error: Failed to retrieve directory listing My vsftpd.conf file: local_enable=YES write_enable=YES local_umask=022 dirmessage_enable=YES xferlog_enable=YES connect_from_port_20=YES ftpd_banner=Welcome to xxxx.com xferlog_std_format=NO chroot_local_user=NO chroot_list_enable=NO chroot_list_file=/etc/vsftpd/chroot_list listen=YES pasv_enable=YES pasv_min_port=3000 pasv_max_port=3050 pasv_address=64.xx.xx.xxx pam_service_name=vsftpd userlist_enable=YES userlist_deny=NO userlist_file=/etc/vsftpd/vsftpd.userlist And I've got these 2 in my iptables -A INPUT -p tcp -m tcp --dport 21 -j ACCEPT -A INPUT -p tcp -m tcp --dport 3000:3050 -j ACCEPT I've also disabled selinux.
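
    When the control connection works but PASV times out, the usual suspects are the firewall on the passive port range and the public address, rather than vsftpd itself. A few hedged checks:

        # make sure the rules are actually loaded in the running ruleset, not just in the file
        sudo iptables -L -n | grep -E '3000:3050|dpt:21'
        # FTP connection tracking also helps if the port-range rule is ever missing
        sudo modprobe ip_conntrack_ftp    # (nf_conntrack_ftp on newer kernels)
        # and pasv_address must be the public IP that clients actually connect to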

    Read the article

  • Open Terminal Here, as Root (OS X)

    - by cwd
    There is a pretty awesome applescript called "Open Terminal Here" ( http://www.entropy.ch/software/applescript/ ) which you can add to your finder's toolbar and click when you want to launch a terminal console which is set to that directory. Sometimes I need to be root, and so I end up starting terminal, doing something like sudo -i and then I have to change back to the previous directory because the sudo command is landing me in /var/root. I'm using sudo -i because I like it to load things like aliases / the bash profile. The script is applescript, and here's the important part of how it works: ... set cmd to "cd " & quoted form of the_path & " && echo $'\\ec'" ... tell application "Terminal" activate do script with command cmd How do I get this to load as root?
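
    One detail that may help: sudo -i deliberately changes to root's home directory, while sudo -s starts the root shell in the current directory (at the cost of not reading root's login profile the same way). So the command string the script builds could end in something like the line below, where the path stands for whatever the_path resolves to:

        cd "/the/clicked/path" && sudo -s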

    Read the article

  • I tried installing Ubuntu 10.04 and I got this message - any ideas on what to do?

    - by vette982
    No root file system defined. Please correct this from the partition menu. This message shows up when I first boot into Ubuntu after the installation. I installed it by mounting the ISO with Daemon Tools, and I just did the default Wubi installation. I keep reading everywhere that I need to choose my installation directory, but I don't get any option to do that. These are all the options I get for installation directory. I have a C and D partition on my drive, and I tried installing it on both and no luck either way. Any ideas?

    Read the article

  • PHP pages are not parsed by Apache on CentOS

    - by infotoknowledge
    I have installed CentOS 5.x, Apache 2.2, PHP 5.3 and MySQL 5.5. I also installed phpMyAdmin and can access it through the browser without any issues. However, when I create a simple index.php with a phpinfo() call in the default directory, that page is served without PHP parsing. phpMyAdmin is itself a PHP application and works fine from the same server, yet the simple PHP page in the document root does not. I even tried moving the page into the phpMyAdmin folder and accessing it there, with no success. Please note that I updated the httpd.conf file with the appropriate directives based on the PHP installation guide. docroot - /var/www/html, phpMyAdmin folder - /var/www/html/phpMyAdmin. Any help is appreciated.
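
    If phpMyAdmin really runs under this same Apache instance, mod_php is loaded and the problem is more likely the handler mapping or a per-directory setting (or the test page living in a different DocumentRoot than the one being served). A few hedged checks, assuming the stock CentOS layout; if PHP was built from source, the equivalent lines live in httpd.conf:

        httpd -M 2>/dev/null | grep -i php        # expect something like php5_module
        grep -rin 'AddHandler.*php\|LoadModule.*php\|php_admin_flag' /etc/httpd/conf /etc/httpd/conf.d
        apachectl -S                              # confirm which vhost/DocumentRoot serves the request
        sudo service httpd restart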

    Read the article

  • How to Disable Hovering Selection Block in File Browser?

    - by BGM
    I am changing from Windows XP to Windows 7. One thing I cannot stand about Windows 7 is that whenever you mouse over files in Explorer (or other file-browser), it highlights the files with a semi-transparent block. This is nice, but I want to be able to double-click on the white area background of the directory, and I can't do this with the highlighted selection always there. The hovering-block is always in the way of the background - especially if there are a lot of files in the directory. (I don't even know what that hovering-block is called; if someone enlightens me, I'll re-title my post) Is there any way to get the file selector to work like XP?

    Read the article

  • Moving a file using PuTTY

    - by Paul Trotter
    I am a newbie struggling to move a file on a Linux VPS using PuTTY. I can log in with a user in PuTTY, and at this point I can navigate to the file I wish to move (~/servers/apache-solr-3.6.2/example/webapps/solr.war). By using cd .. a couple of times from the directory I start in when I first log in, I can then navigate to the location I wish to move the file to: usr/local/jakarta/apache-tomcat-5.5.36/webapps/ I know that I need to use cp to copy the file and have tried variations on: cp ~/servers/apache-solr-3.6.2/example/webapps/solr.war usr/local/jakarta/apache-tomcat-5.5.36/webapps However, each time I get 'No such file or directory'. I have tried excluding the ~/ at the start and I have tried specifying solr.war at the end of the command. Please excuse the newbie question, but I would really appreciate some advice on what I am doing wrong here.
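
    One likely culprit in the command shown: the destination starts with usr/... rather than /usr/..., so the shell treats it as a path relative to the current directory. A hedged version with the leading slash added:

        cp ~/servers/apache-solr-3.6.2/example/webapps/solr.war \
           /usr/local/jakarta/apache-tomcat-5.5.36/webapps/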

    Read the article

  • Sorting Files into Subfolders based on EXIF Date

    - by honestor
    I have a huge directory from an HDD recovery that contains 70000+ JPEG files. I tried playing around with some AppleScripts that I found, but had no luck. I have already installed ExifTool, which might be useful for this task. The current directory structure is as follows: dir001 - file0001.jpg ... - file9999.jpg dir002 - file0001.jpg ... - file9999.jpg ... dir070 - file0001.jpg - ... - file9999.jpg The files mostly have EXIF data, but some files have no metadata. Now I hope to sort and rename these files into folders based on the date: 1999 - 1999 01 31 - 1999_01_31_-_22_59_59.jpg 2000 - 2000 05 20 - 2000_05_20_-_21_59_59.jpg - 2000_05_20_-_22_59_59.jpg I figured AppleScript/Automator might come in handy for this, but any other solution would be welcome, too!
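
    Since ExifTool is already installed, it can do the whole sort and rename on its own, without AppleScript. An untested sketch that moves each image into year/date folders built from DateTimeOriginal (files with no usable date are skipped and reported, so they can be handled separately, e.g. by file modification time):

        exiftool -r -progress \
          '-FileName<DateTimeOriginal' \
          -d '%Y/%Y %m %d/%Y_%m_%d_-_%H_%M_%S%%-c.%%e' \
          /path/to/recovered/dirs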

    Read the article

  • How do I prevent my swf files from being hotlinked, downloaded, etc.?

    - by undefined
    I have swf files that are embedded in a PHP page using SWFObject. These swf files are in the same directory as my PHP files; for example, www.myurl.com/index.php embeds www.myurl.com/flashfile.swf, and index.php and flashfile.swf are in the same directory. However, I want to prevent people from being able to type in www.myurl.com/flashfile.swf and view the swf directly. I want the server to deny access to this file unless it has been embedded by the PHP page. Should I move my swfs to another folder and protect that folder somehow, perhaps with an .htaccess file? I am running Apache on a Linux machine. While my main concern is the swf files, I would like to protect the graphics used on the site too. All help appreciated, thanks.
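
    Since any swf that SWFObject embeds still has to be fetched by the visitor's browser, it can't be hidden completely; the realistic options are a Referer check (weak but simple) or serving the file through a PHP script that checks the session and reads it from outside the document root. A rough sketch of the Referer variant using mod_rewrite (the directory path and example.com are placeholders, and mod_rewrite plus AllowOverride must be enabled):

        # block .swf and image requests whose Referer is some other site
        {
          echo 'RewriteEngine On'
          echo 'RewriteCond %{HTTP_REFERER} !^$'
          echo 'RewriteCond %{HTTP_REFERER} !^https?://(www\.)?example\.com/ [NC]'
          echo 'RewriteRule \.(swf|jpe?g|gif|png)$ - [F,NC]'
        } | sudo tee /var/www/site/.htaccess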

    Read the article
