Search Results

Search found 26555 results on 1063 pages for 'active directory explorer'.


  • Upgraded to Mountain Lion, now 127.0.0.1 is not resolving

    - by Shanimal
    I used to be able to type 127.0.0.1 (or my network IP 10.10.53.32) and it would resolve to my "default" virtual host. 127.0.0.1/~Shanimal and shanimal.dev both resolve to their appropriate folders, but localhost and 127.0.0.1 give me a 404: "Not Found. The requested URL / was not found on this server." Basically, my "It works!" screen no longer works.

    /private/etc/apache2/Shanimal.conf:

        <Directory "/Users/Shanimal/Sites/_www">
            Options Indexes Multiviews
            AllowOverride AuthConfig Limit
            Order allow,deny
            Allow from all
        </Directory>

    hosts:

        127.0.0.1 localhost
        127.0.0.1 shanimal.dev

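    A few checks worth running, as a sketch: the Mountain Lion upgrade is known to replace /etc/apache2/httpd.conf (it typically leaves a backup copy next to it; the exact backup name below is an assumption), which silently drops custom Include lines and the default DocumentRoot setup.

        # compare the live config against the pre-upgrade backup (name may differ)
        ls /private/etc/apache2/httpd.conf*
        diff /private/etc/apache2/httpd.conf /private/etc/apache2/httpd.conf.pre-update

        # confirm the per-user configs are still included, then reload
        grep -n "Include /private/etc/apache2/users" /private/etc/apache2/httpd.conf
        sudo apachectl configtest && sudo apachectl restart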

  • Cannot connect to domain despite successful pings

    - by egtann
    Pings to my domain name work, but I can't connect via HTTP. I've been trying various methods for a week now, but haven't come up with anything that worked. Any idea what's causing this?

    /etc/apache2/httpd.conf:

        ServerName machinename.local
        <VirtualHost *:80>
            ServerName chipperapp.com
            DocumentRoot "/Users/myusername/appname/public"
            <Directory "/Users/myusername/appname/public">
                AllowOverride all
                Options -MultiViews
            </Directory>
        </VirtualHost>

    /etc/hosts:

        127.0.0.1 chipperapp.com

    I can access the app from my local machine, but not from any other. I've set up dynamic DNS. Thanks!

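    Two things a successful ping does not prove: that anything is listening on port 80, and that the port is reachable from outside. Note also that the 127.0.0.1 entry in /etc/hosts affects only the local machine. A quick hedged sketch (the public IP below is a placeholder):

        # is Apache listening on all interfaces, or only loopback?
        sudo lsof -nP -i :80 | grep LISTEN

        # from another machine, test the port directly
        curl -v --header 'Host: chipperapp.com' http://203.0.113.10/

    If the curl hangs, look at the router's port forwarding and any firewall in front of the machine rather than at Apache.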

  • Local WordPress installation not accessible from the outside world

    - by hello
    I have a working installation of WordPress located in /var/www/html/wordpress. It is accessible in my local network at [local-machine-ip]/wordpress/. There is also a test page located in /var/www/html/test.html, which is accessible in my local network at [local-machine-ip]. I would like the WordPress website to be accessible from the outside world. I know that my ISP blocks incoming requests on port 80, so I set my router to redirect requests from port 8080 to 80. This feature appears to be working correctly, since I can access the test.html page using my public IP address as follows: [public-ip]:8080. However, I cannot access [public-ip]:8080/wordpress. Here is my Apache config:

        <VirtualHost *:80>
            ServerAdmin webmaster@localhost
            DocumentRoot /var/www/html
            ServerName [my.domain.com]
            <Directory /var/www/html/>
                Options FollowSymLinks Indexes MultiViews
                AllowOverride All
                Order allow,deny
                allow from all
            </Directory>
            ErrorLog ${APACHE_LOG_DIR}/error.log
            CustomLog ${APACHE_LOG_DIR}/access.log combined
        </VirtualHost>

    Thanks!

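    A common culprit here: WordPress redirects every request to the site URL stored in its database, which was saved without the :8080 port, so the static test page works but /wordpress bounces to an unreachable address. A minimal sketch of the usual fix (the hostname is a placeholder; WP_HOME and WP_SITEURL are standard wp-config.php constants):

        /* wp-config.php */
        define('WP_HOME',    'http://example.dyndns.org:8080/wordpress');
        define('WP_SITEURL', 'http://example.dyndns.org:8080/wordpress');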

  • How can I exclude a file in a folder from basic auth (regex help)?

    - by simon180
    Hi. I have a folder on my site which contains admin files, and I've added basic auth following a little unwanted attention. This works fine; however, a couple of the admin functions won't work through basic auth as they handle file uploads, so I want to exclude these files from the auth. It shouldn't have any security implications, as a rogue user wouldn't be able to access the pages that could create a session to use these functions. I am using the following basic code to exclude a file:

        <FilesMatch "(index.php\/myadminfolder\/myurl\/myaction/someotherstuff?)$">
            Satisfy Any
            Order allow,deny
            Allow from all
            Deny from none
        </FilesMatch>

    The URL exclusion is not working. The URL to exclude is in the form:

        index.php/directory/subdirectory/action/uniqueid/blah

    What is the correct URL string to add to FilesMatch to exclude any files that start with the pattern index.php/directory/subdirectory/action, regardless of what comes after action? Thanks, Simon

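    One reason the pattern can never match: FilesMatch compares against the file name only (here always index.php), never against the path info that follows it. Matching the request URI instead is one way around that; a hedged sketch for Apache 2.2 (the paths are placeholders):

        # flag requests whose URI starts with the action to exempt
        SetEnvIf Request_URI "^/index\.php/directory/subdirectory/action" allow_noauth

        <Directory /path/to/docroot>
            AuthType Basic
            AuthName "Admin"
            AuthUserFile /path/to/.htpasswd
            Require valid-user
            Order allow,deny
            Allow from env=allow_noauth
            Satisfy Any
        </Directory>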

  • How to avoid the loop device limitation in an OpenVZ container?

    - by mat.viguier
    On an OpenVZ container running Debian 7, I need to cap the maximum size of a folder which is used for uploads on a PHP-based web server. The directory is synced, so I have to enforce a maximum size, and that limit should be upgradable later by adding a physical disk. My plan was to use a file as a block device for a filesystem, so I did:

        dd if=/dev/zero of=/disk2/filesystem.dat bs=1M count=100

    Then I made a filesystem on it:

        mkfs.ext4 filesystem.dat

    Then I tried to mount it:

        mkdir /opt/filesystem
        mount /disk2/filesystem.dat /opt/filesystem

    My OpenVZ container (it is on a VPS) has no loop module in the kernel, so I got the usual "Could not find any loop device" error under OpenVZ. I think I have to use FUSE, but I really do not know how. Any idea on capping the size of a directory under OpenVZ?

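    Loop devices inside an OpenVZ container only exist if the hardware node exposes them, so this is usually something to request from the VPS provider rather than something fixable from inside. A hedged sketch of what the host-side admin would run (CTID 101 is a placeholder, and extra capabilities may also be required depending on the host's policy):

        # on the hardware node, not inside the container
        vzctl set 101 --devnodes loop0:rw --save

        # then, inside the container, the original plan works
        mount -o loop /disk2/filesystem.dat /opt/filesystem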

  • Ubuntu 12.10 Quantal Quetzal and AMD 12.11 Beta Driver

    - by White
    I'm using a Quantal AMD x64 install and an XFX Radeon HD5850 video card. I first enabled restricted drivers through Additional Drivers, but it broke Unity and Compiz (I can only see my wallpaper and shortcuts, but the terminal still works, and Nautilus too, though without Close/Maximize/Minimize buttons and slower). Then I uninstalled it and everything went back to normal. Then I installed it via terminal (the 12.10 version), and the result was the same. Then I downloaded it from ATI's web site (12.11 beta) and installed the .run file using the terminal, but the result was yet again the same. Then I went to the terminal and entered these commands:

        sudo apt-get remove --purge fglrx fglrx_* fglrx-amdcccle* fglrx-dev*   (it said it had nothing to uninstall)
        sudo rm /ect/x11/xorg.conf                                             (no such directory)
        sudo dpkg-reconfigure xserver-xorg
        sudo startx
        sudo cp /ect/x11/xorg.conf.orig /ect/x11/xorg.conf                     (also no such directory)
        sudo aticonfig --initial
        sudo reboot

    Then I was presented with the login screen, but when I tried to log in (with my account), it flashed a black screen and then threw me back. The guest account still works (without Unity and Compiz, though) and I can still use TTY. I also got the "AMD Testing Only" watermark. Then I figured that I should stop messing with the terminal and get help before I unleashed Apocalypse XD. Side notes: my Ubuntu is installed on an ext4 partition with 60GB, and I dual boot with Windows 7 (at least for now). My internet is a 50kbps 3G-ish connection, so downloading even small files is a pain, let alone a video driver. I would rather not reinstall the OS; it was a herculean task to download everything I had in there, and I have very little free disk space for backups. I'm still new to Ubuntu (I know some basic commands), and I don't know how to debug, so please be patient XD. And under Windows, my internet is even slower (is that possible?), so it kind of leaves a torture aftertaste xD. So, if you guys could answer quickly, it would be greatly appreciated. Thanks in advance. If you need any info, just ask (and explain how to get it XD).


  • Download web server structure with empty files

    - by golimar
    I want to make a mirror of a web server, but downloading the actual files would take too long. So I thought of fetching just the directory and file structure; when I need the actual contents of a file, I can download that file alone. I have tried wget --spider URL, and in a short time it created the directory structure (with no files) on my local disk. But I've checked all of wget's and curl's switches, and there is nothing like what I need. Can this be done with wget, curl, or any other tool?

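    One possible approach, as a sketch: run the spider crawl, scrape the URLs wget reports, and touch an empty file for each one. Here example.com stands in for the real host, and wget's log format is assumed to print one "--timestamp--  URL" line per probed URL:

        #!/bin/sh
        host=example.com
        wget --spider -r -np "http://$host/" 2>&1 \
          | grep '^--' | awk '{ print $3 }' | sort -u \
          | sed "s|^http://$host/||" \
          | while read -r path; do
                case "$path" in */|"") continue ;; esac   # skip directory URLs
                mkdir -p "$host/$(dirname "$path")"
                touch "$host/$path"
            done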

  • Solaris NFS: user permissions

    - by cjavapro
    I am very new to NFS and would like to make sure I am clear. If the NFS server shares a directory rw, and all the files in the directory have permissions 700 with user/group root/root, then on the client you would have to log in as root to see them. Is this correct? I am aware that a non-root user on the client could make a direct connection to override this (as in: don't use the mount, just use an NFS client hack). It really seems like anyone who has access to the client machine should have access to the files, and that the client machine should be ignoring permissions; only the server should handle permissions. Am I correct in my understanding? Is it normal to have this type of layout? Is there a way to ignore the permissions on the client side?

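    For what it's worth: with plain AUTH_SYS NFS the server does enforce the mode bits, but it trusts whatever UID the client asserts, which is exactly why the "NFS client hack" above works and why there is no supported switch to make a client ignore modes. Root is the special case: it is normally squashed to an anonymous user unless the export grants it. A hedged Solaris-style example (the share path and hostname are placeholders):

        # /etc/dfs/dfstab - let root on one trusted client act as root
        share -F nfs -o rw,root=trustedclient /export/data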

  • Trouble with mod_rewrite and PHP Extensions - Help Making the Correct .htaccess File

    - by nicorellius
    I'm looking for a set of simple rules and redirects for my site. I've tried so many combinations that I'm starting to get confused. I'm not sure how to set this up. Generally, without mod_rewrite, I would use relative paths to link to files in the same directory:

        <a href="link.php">Link</a>

    Now I'd like to use this instead:

        <a href="link">Link</a>

    So these URLs should serve the corresponding .php files, at any directory depth:

        localhost/mysite/link                 ->  localhost/mysite/link.php
        localhost/mysite/group/link2          ->  localhost/mysite/group/link2.php
        localhost/mysite/group/section/link3  ->  localhost/mysite/group/section/link3.php

    But then in all these cases, if someone were to type localhost/mysite/group/section/link3.php into the URL bar, it should show localhost/mysite/group/section/link3. Thanks

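    A sketch of a standard pattern for this pair of rewrites, assuming an .htaccess in the mysite directory with mod_rewrite enabled (/mysite/ is the URL prefix from the question):

        RewriteEngine On
        RewriteBase /mysite/

        # externally redirect direct *.php requests to the clean URL
        RewriteCond %{THE_REQUEST} \s/mysite/([^\s?]+)\.php[\s?]
        RewriteRule ^ /mysite/%1 [R=301,L]

        # internally add .php back whenever the target file exists
        RewriteCond %{REQUEST_FILENAME}.php -f
        RewriteRule ^(.+?)/?$ $1.php [L]

    Matching on THE_REQUEST (the raw request line) rather than REQUEST_URI avoids a redirect loop, since the internal rewrite changes REQUEST_URI but not THE_REQUEST.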

  • Properly setting up shared folders for users

    - by user221486
    First I would like to say thanks for helping. I have a huge problem with setting up proper permissions for shared folders. I have a Windows 7 x64 Enterprise machine (name: backupfb), joined to the domain, with a shared folder on drive E: (e:\backup), and 50 clients/laptops with Tivoli Storage Manager FastBack for Workstations that save files to the shared folder. I need to configure permissions so that only the owner of a folder can access it. The folder structure is:

        e:\backup                 <- shared as "backup" (\\backupfb\backup\)
        e:\backup\BackupAdmin     <- used by the TSM FastBack for Workstations client to
                                     download revisions and configurations; nodes require
                                     read-only access to these directories
        e:\backup\RealTimeBackup  <- client accounts create directories here that are only
                                     accessible by the account that created them, so the
                                     directory holding a node's data is not created until
                                     that node connects to the server

    So the permissions should look like this (taken from the instructions), with inheritable permissions from the object's parent disabled:

    \\backupfb\backup\BackupAdmin:

        Allow  Users           Read, Execute - this folder, subfolders, and files
               (Traverse Folder / Execute, List Folder / Read Data, Read Attributes,
               Read Extended Attributes, Delete Subfolders and Files, Delete,
               Read Permissions)
        Allow  Administrators  Full Control - this folder, subfolders, and files

    Both folders have the option "apply these permissions to objects and/or containers within this container only" enabled. Here everything works fine.

    \\backupfb\backup\RealTimeBackup:

        Allow  Administrators  Full Control - this folder, subfolders, and files
        Allow  CREATOR OWNER   Full Control - this folder, subfolders, and files (from domain)
        Allow  Users           Special - this folder only
               (Traverse Folder / Execute, List Folder / Read Data, Read Attributes,
               Read Extended Attributes, Create Files / Write Data,
               Create Folders / Append Data, Delete Subfolders and Files,
               Read Permissions)
        Allow  OWNER RIGHTS    Full Control - this folder, subfolders, and files

    Here I have a huge problem with CREATOR OWNER: I'm able to set Full Control, but I can only apply it to "Subfolders and files only". When I change the properties to "This folder, subfolders and files" and save, it changes back to "Subfolders and files only". So I tried using icacls to set up the permissions:

        @echo off
        takeown /F E:\backup\ /R /A
        for /D %%i IN (E:\backup\RealTimeBackup\*) DO icacls E:\backup\RealTimeBackup\%%~nxi /grant:r cloud\%%~nxi:F /T /C
        pause

    After that, users are able to create just one folder in \\backupfb\backup\RealTimeBackup\userfolder, but the problem remains with subfolders. In the log I have:

        FBW5022E Unable to access the specified file
        Explanation: The file specified is unable to be accessed. Possibly spelled
        incorrectly, or bad path, or permissions.
        User response: Ensure the user has the proper permissions for the file and
        directories involved and that the file and directory exist.

    Any idea? Please help ;-) Thanks

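    One relevant detail, offered as a pointer rather than a full answer: the CREATOR OWNER behavior is by design. It is an inherit-only placeholder that stamps the creating account onto new children, so Windows always stores it as "subfolders and files only". In icacls terms, the equivalent grant looks roughly like this sketch:

        icacls E:\backup\RealTimeBackup /grant "CREATOR OWNER:(OI)(CI)(IO)F"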

  • Is it safe to sync a Firefox instance in both directions?

    - by java.is.for.desktop
    Hello, everyone! I want to use some tool (not decided yet which one) to sync a Firefox instance (more exactly: the user directory) between two machines. (EDIT: I really want to sync everything in the user directory.) Would you say this is safe? What would happen if some files are newer on one machine and other files are newer on the other machine? Could this lead to inconsistencies when, let's say, there are some inter-file references? In general, I don't have good experience with syncing applications in both directions. Most applications seem not to be suited for this.

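    The inter-file worry is well founded: a Firefox profile contains several SQLite databases (places.sqlite and friends) plus their journal files, and syncing those out of step corrupts them. Two rules of thumb: only sync with Firefox closed on both ends, and prefer a tool that detects conflicts instead of silently picking a winner. A hedged sketch with unison (the paths and hostname are examples):

        # run with Firefox closed on both machines
        unison ~/.mozilla/firefox ssh://otherhost//home/me/.mozilla/firefox -auto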

  • Tricking Linux apps about current time with environment variables

    - by geek
    Sometimes it is possible to trick a Linux app by calling it like this:

        HOME=/tmp/foo myapp

    This makes myapp think /tmp/foo is the home directory; it won't try to look up the user's real home directory via getpwent(). This is useful when myapp must be forced to dump some of its config files into a non-standard location other than ~. A similar trick can be done like this:

        LANG=foo LC_ALL=bar myapp

    This is useful when myapp needs to be called once with a different locale, without making the change persistent via the export bash built-in or by modifying /etc/profile. Is it possible to pull the same trick with time and date? The goal is to make an app use a time other than the system one - ultimately, to make timestamps that appear in logs/commit messages independent of the system time.

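    Plain environment variables can't do this, since the clock comes from a syscall rather than from the environment, but an LD_PRELOAD shim that intercepts the time calls can. libfaketime is the usual tool; a sketch, noting that the library path varies by distro:

        # wrapper form, if the faketime package is installed
        faketime '2008-12-24 08:15:42' date

        # the same by hand: offset the clock 15 days into the past
        LD_PRELOAD=/usr/lib/faketime/libfaketime.so.1 FAKETIME='-15d' date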

  • Linux Permissions

    - by Tres
    I am running Fedora 12, and I've set up a partition separate from my root partition to keep shared files and home directories. Now, I've been having permission issues where it says the user cannot chdir into their home directory (/files/home/*). I fixed this originally by chmodding / to 0755 and the home directories also to 0755. And yes, each user is the owner:group of their own home directory. Now get this: I didn't change a thing, rebooted, and everything still worked. Great, right? I boot the server up a day later, and now it's the same old issue. This is a home server that wasn't on at all at any point between the working state and the non-working state. Also, nothing else was modified. Any ideas? Thanks!

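    A guess worth ruling out first on Fedora: SELinux. Home directories living outside /home get the wrong security context, which produces exactly this kind of "permissions look right but access is denied" behavior. A sketch (the username is a placeholder):

        # check the context; home dirs should be labeled user_home_dir_t
        ls -Zd /files/home/someuser

        # tell SELinux to treat /files/home like /home, then relabel
        semanage fcontext -a -e /home /files/home
        restorecon -R -v /files/home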

  • Creating a virtual host with XAMPP

    - by Will
    I'm using XAMPP and trying to set up a virtual host. Here's my httpd-vhosts.conf file:

        <VirtualHost *:80>
            DocumentRoot "C:\Users\wbaizer\PhpstormProjects\Test"
            ServerName test1.localhost
        </VirtualHost>

    My hosts file has the following:

        127.0.0.1 test1.localhost

    When I try to access test1.localhost in Chrome, it gives me this message: "You don't have permission to access the requested directory. There is either no index document or the directory is read-protected." What am I doing wrong, what do I need to do, and how do I do it?

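    That message usually means Apache has no access grant for the new DocumentRoot (and, as it says, possibly no index file either). A hedged sketch for the Apache 2.4 shipped with current XAMPP; on 2.2, replace Require all granted with Order allow,deny plus Allow from all:

        <VirtualHost *:80>
            DocumentRoot "C:\Users\wbaizer\PhpstormProjects\Test"
            ServerName test1.localhost
            <Directory "C:\Users\wbaizer\PhpstormProjects\Test">
                Options Indexes FollowSymLinks
                AllowOverride All
                Require all granted
            </Directory>
        </VirtualHost>

    Then make sure an index.php or index.html exists in the Test directory, and restart Apache from the XAMPP control panel.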

  • FreeNAS plugins not able to access storage

    - by dave
    I've just set up a FreeNAS box and have a couple of plugins (Sick Beard and SABnzbd) installed. Both of these have you select a directory where downloads should go. My storage is on /mnt/MediaVolume/; however, when I browse to /mnt from the plugins, it's an empty directory. When I SSH into the box, though, I can see it just fine. I'm thinking it may have something to do with permissions, but I'm not sure. Any suggestions on how to allow these plugins to view/have access? Thank you!

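    Likely not permissions: FreeNAS plugins run inside a jail with its own filesystem tree, so the jail's /mnt is not the host's /mnt. The storage has to be mapped into the jail - in the UI that's the "Add Storage" action on the plugin jail, which performs a nullfs mount under the hood. A hedged shell equivalent (the jail root path is an assumption and varies by setup):

        mount -t nullfs /mnt/MediaVolume /mnt/MediaVolume/jails/pluginjail/media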

  • htpasswd and htaccess in Plesk 9.3

    - by J White
    Greetings. Here is my situation: I currently have one website on my dedicated server. As of now, I have protected the directory /exclusive using the Plesk control panel. I am having a billing company install their password management script on the server, but they need the absolute location of the .htpasswd file, and I can't find it or the .htaccess file. Would it be easier to unprotect the /exclusive directory and create the .htaccess file in Notepad? If so, where should I place the .htpasswd file? -jw

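    If Plesk's generated files can't be located (their path varies by Plesk version and platform), rolling the pair by hand is a reasonable fallback, assuming the domain is served by Apache. A sketch with placeholder paths; the one hard rule is to keep .htpasswd outside the web root:

        # create the password file (run once with -c, then without)
        htpasswd -c /path/outside/webroot/.htpasswd someuser

    And in /exclusive/.htaccess:

        AuthType Basic
        AuthName "Exclusive"
        AuthUserFile /path/outside/webroot/.htpasswd
        Require valid-user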

  • How to set up Apache with Parallels Plesk?

    - by Ran Gualberto
    I'm working with Windows Server 2008 (a GoDaddy Windows dedicated server). My problem is that .htaccess is not working on the server, and I just figured out that Apache is not installed. I would like to know how to run Apache with Plesk (keeping the existing PHP setup), and how to point Apache at the current site directory, C:\inetpub\vhosts. My goal is to make .htaccess work on the server with Plesk and with the directory C:\inetpub\vhosts.


  • Debian SMB share having permission issues in windows 7

    - by xxpor
    Hi everyone, I set up a Debian Squeeze server with Samba, then shared my /media directory with the following configuration:

        [media]
            comment = Hard Drives
            read only = no
            locking = no
            path = /media
            guest ok = yes
            browsable = yes
            directory mask = 0777

    When the drives are mapped in Windows 7, the user can write to all of the subdirectories of /media (sdb1, sdc1, etc.), but cannot write to any folders that they create themselves inside those subdirectories. For example, if the user maps /media/sdb1 to Z: and then creates a folder Z:\test, the folder is created successfully, but no files can be written to Z:\test. If the user SSHes into the server, they have no problems writing to these directories. I have included screenshots, in order, of what happens on Windows. This Samba share is mounted with ntfs-3g, if it makes a difference.

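    Two things stand out, offered as guesses: the share sets only directory mask, so newly created files fall back to the default create mask, and since the volumes are ntfs-3g, the mount options decide the effective permissions as much as smb.conf does. A sketch of additions to try in the [media] section:

        [media]
            ; ...existing settings...
            create mask = 0666            ; files, not only directories
            inherit permissions = yes     ; carry parent modes into new dirs

    If that doesn't change anything, check how the NTFS volumes are mounted (e.g. the umask/fmask/dmask options on the ntfs-3g entries in /etc/fstab).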

  • Why does FIND on Windows 7 give an "Access Denied" error?

    - by rann
    Hi all. I have an administrator account on a Windows 7 x64 machine - not THE Administrator account; the account is simply a member of the Administrators group. The install is default. When the user opens a command prompt, it lands in the user's %HOMEPATH% directory, where you'll find various directories like the Documents folder. If the user runs the following (Windows) FIND command, an "Access denied" error occurs:

        FIND /I "My String" C:\Users\Rann\Documents
        Access denied - C:\USERS\RANN\DOCUMENTS

    Using runas or right-clicking on the command prompt to run it as an administrator does not change this behaviour; an administrator-level cmd.exe still gives the same error. Changing the path to any other directory gives the same error. My question is thus: how is one supposed to use the FIND (and possibly other) commands? What rights are needed?

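    This isn't a rights problem: FIND searches inside files, so when it is handed a directory name it tries to open the directory itself as a file, which Windows reports as "Access denied" regardless of privileges. Point it at files instead, or use FINDSTR for recursive searches:

        REM search the files in that directory (FIND wants files, not a folder)
        FIND /I "My String" "C:\Users\Rann\Documents\*.*"

        REM or recurse through subdirectories with FINDSTR
        FINDSTR /S /I /C:"My String" "C:\Users\Rann\Documents\*"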

  • Filename case issue with Samba over WebDAV

    - by user98365
    We are accessing a Samba shared directory from a Windows client with the WebDAV client WebDrive, but we have an issue where it shows the same contents in both directories (data/ and Data/) even though they are entirely different. I know this is because the Windows filesystem is case-insensitive while Linux is case-sensitive. Is there any solution for this? We had the same issue when viewing through the Samba mount itself, but we solved that by editing smb.conf as described in "Does Samba work well with Windows when case-sensitive names are enabled?". Please help us solve this when the share is accessed via WebDAV.


  • How to tell Mercurial to never create hard links

    - by scrapdog
    I am planning to use Mercurial in the near future on some projects. These projects will normally reside in a directory on my Windows machine, but I will be sharing these directories using VirtualBox so I can work on them directly from within Linux. I understand that Mercurial will sometimes create hard links when cloning repositories. I'm not sure how a VirtualBox shared directory handles these hard links (or if it even can), so I'd rather just tell Mercurial to never attempt to make hard links and always make a copy. My question: how do I globally disable Mercurial from hard linking? (Although if someone has gotten Mercurial and VirtualBox shared folders to work nicely with hard linking, I'd like to hear about it!)

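    Mercurial only hard-links for local, same-filesystem clones, and hg clone --pull forces a plain copy instead. There is no dedicated "never hardlink" switch that I know of, but the [defaults] section of the per-user config can make --pull implicit (note that upstream discourages [defaults], so a one-off --pull per clone is the conservative alternative):

        # ~/.hgrc on Linux, Mercurial.ini on Windows
        [defaults]
        clone = --pull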

  • How to avoid specifying full path in sudoers file?

    - by s g
    I am trying to add a NOPASSWD entry for sudotest.sh (or any script/binary that requires sudo) in my /etc/sudoers file (on Ubuntu 12.04 LTS server), but in order to make it work, I must specify the full path. The following entry works just fine:

        %jenkins ALL=(ALL)NOPASSWD:/home/vts_share/test/sudotest.sh

    The problem is that the script might move to a different directory. This seems like a great chance to use the * wildcard in the path (i.e. /*/sudotest.sh) so that my script could be in any directory, but the manual states that wildcards will not match the / character when used in a path, and I've confirmed that it doesn't work. I know that I can use the word ALL in place of my script, but this means there is no password prompt for any commands, which seems unsafe. How do I solve this?

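    One workaround, sketched under the assumption that a fixed location is acceptable: point sudoers at a stable path and let a symlink absorb the moves. The sudoers line then never changes; only the link does. Note the security caveat in the comments.

        # one stable, root-owned location; repoint this when the script moves
        sudo ln -s /home/vts_share/test/sudotest.sh /usr/local/bin/sudotest

        # /etc/sudoers (edit with visudo):
        %jenkins ALL=(ALL) NOPASSWD: /usr/local/bin/sudotest

        # caveat: anyone who can change the link or its target effectively
        # gets the NOPASSWD grant, so keep both root-owned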

  • How can I disallow a user's scripts from accessing anything above their user folder?

    - by Jaxo
    This is probably an extremely simple question to answer for anybody who knows what they're doing, but I can't find any answers myself. I'm trying to set up a subdirectory where a good friend can test his PHP scripts on my (Apache) hosting plan. For obvious reasons, I don't want to let him access anything else on my server. His FTP login already lands him in the proper directory and does not allow navigating any higher than its root (mydomain.com/friend/). I would like the same behavior to apply to any scripts, so he cannot simply <?php print_r(glob("../*")); ?> and view all my files. I'm thinking this can be done with an .htaccess file setting the DocumentRoot somewhere, but I can't have the file available for modification inside the user directory. Is this possible without majorly rewiring the web server? I've tried Googling all sorts of things to describe my problem, but without the proper terminology, all I get is "shared hosting" websites and people trying to sell me security packages.

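    The terminology being searched for is PHP's open_basedir. Set with php_admin_value in the main server config, it cannot be overridden from .htaccess or ini_set(), and it confines every PHP file call to the listed prefixes, so the glob("../*") above comes back empty. A sketch with an assumed docroot:

        # in the vhost/server config, assuming mod_php
        <Directory /var/www/friend>
            php_admin_value open_basedir "/var/www/friend/:/tmp/"
        </Directory>

    /tmp is included because PHP needs somewhere for uploads and sessions; point those at a friend-specific directory for stricter isolation.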

  • Forbidden - 403 error Apache

    - by philippe
    I was setting up my local Apache server to run Python CGI scripts when I hit the following error:

        Forbidden
        You don't have permission to access /hw10/main.cgi on this server.

    What I changed in my Apache config file was:

        ScriptAlias /cgi-bin/ "/var/www/cgi-bin/"
        <Directory "/var/www/cgi-bin">
            AllowOverride None
            Options FollowSymLinks +ExecCGI
            Order allow,deny
            Allow from all
            Require all granted
        </Directory>
        AddHandler cgi-script .cgi .py

    Could someone please help me with this?

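    Two hedged guesses, since the config mixes generations of Apache syntax: on Apache 2.4, combining the 2.2 directives Order/Allow with Require all granted is documented to give unexpected results, so keep only the 2.4 form; and a 403 on a CGI script is also what a missing execute bit produces. A sketch (the script's location on disk is an assumption taken from the ScriptAlias):

        <Directory "/var/www/cgi-bin">
            AllowOverride None
            Options +ExecCGI
            AddHandler cgi-script .cgi .py
            Require all granted
        </Directory>

    And on the filesystem:

        chmod 755 /var/www/cgi-bin/hw10/main.cgi
        head -1 /var/www/cgi-bin/hw10/main.cgi   # shebang, e.g. #!/usr/bin/env python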
