Search Results

Search found 3168 results on 127 pages for 'directories'.

Page 65 of 127

  • How to add exceptions to apache reverse proxy rules

    - by Tania
    I am trying to set up an Apache reverse proxy so that requests get proxied to another application running on port 8080. However, I want some directories to be served directly rather than forwarded to the proxy. What I want is:

        http://localhost/              -> http://localhost:8080/myapp
        http://localhost/images        -> /var/www/html/images
        http://localhost/anything-else -> http://localhost:8080/myapp/anything-else

    My current httpd.conf is:

        ProxyRequests Off
        ProxyTimeout 600
        ProxyPreserveHost On
        ProxyPass / http://localhost:8080/
        ProxyPassReverse / http://localhost:8080/
        RewriteEngine On
        RewriteRule ^/(.*) http://localhost:8080/VirtualHostBase/http/%{SERVER_NAME}:80/myapp/VirtualHostRoot/$1 [L,P]

    What configuration do I need to make the local path exception work? Thank you, Tania
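
    A configuration along these lines is the usual approach (a sketch; the paths are taken from the question, and the Alias line assumes mod_alias is loaded). The key point is that ProxyPass exclusions must appear before the more general ProxyPass lines:

        Alias /images /var/www/html/images
        ProxyPass /images !
        ProxyPass / http://localhost:8080/myapp/
        ProxyPassReverse / http://localhost:8080/myapp/
        # if the mod_rewrite rule is kept instead, an equivalent exclusion is:
        # RewriteRule ^/images - [L]

    ProxyPass /images ! is standard mod_proxy syntax for "do not proxy this path".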

    Read the article

  • Warning: DocumentRoot * does not exist on Centos 6

    - by user1213807
    My virtual host lines:

        <VirtualHost *:80>
            ServerAdmin [email protected]
            DocumentRoot /www/docs/example.com
            ServerName example.com
            ErrorLog logs/example.com-error_log
            CustomLog logs/example.com.com-access_log common
        </VirtualHost>

    When I restart Apache (sudo apachectl -k stop), I get this error: Warning: DocumentRoot [/www/docs/example.com] does not exist. I've checked a few things: all file and directory permissions are OK, everything is 755. I thought this error might be about SELinux and disabled it, but that didn't work; still the same error. How can I fix this problem?
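
    If the directory really is missing, the usual fix (a sketch; the path is from the question) is to create it, and on CentOS with SELinux enforcing, to restore the expected file context rather than disabling SELinux entirely:

        sudo mkdir -p /www/docs/example.com
        # label the tree so httpd is allowed to serve it
        sudo semanage fcontext -a -t httpd_sys_content_t "/www/docs(/.*)?"
        sudo restorecon -Rv /www/docs
        # quick check: is SELinux actually enforcing?
        getenforce

    semanage comes from the policycoreutils-python package on CentOS 6; if it is not installed, chcon -R -t httpd_sys_content_t /www/docs is a non-persistent alternative.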

    Read the article

  • SQL Server 2008 data directories on SSD

    - by Kuroro
    I am going to install a new SQL Server 2008 instance on my development/testing machine. My machine has one 7200rpm 500GB SATA disk (C:, OS) and one Intel X25-G2 80GB SSD (D:). Detailed machine config: CPU i7 860, RAM 8GB. Microsoft gives me the option to place the following directories on different disks, so I plan to place the user databases and tempdb on the SSD and the rest on the traditional disk. Is this a good choice for gaining a performance boost from the fast SSD?

        Data root directory:     C:\Program Files\Microsoft SQL Server
        User database directory: D:\Data
        User log directory:      C:\Logs
        Temp DB directory:       D:\TempDB
        Temp log directory:      C:\TempDB
        Backup directory:        C:\Backups

    Read the article

  • Symlink - Permission Denied

    - by John Smith
    I'm facing an interesting problem with plenty of "Permission denied" output when using symlinks. Linux: Slackware 13.1.

    Directory with symlink:

        root@Tower:/var/lib# ls -lah
        drwxr-xr-x  8 root root    0 2012-12-02 20:09 ./
        drwxr-xr-x 15 root root    0 2012-12-01 21:06 ../
        lrwxrwxrwx  1 ntop ntop   21 2012-12-02 20:09 ntop -> /mnt/user/media/ntop6/

    Symlinked directory:

        root@Tower:/mnt/user/media# ls -lah
        drwxrwx---  1 nobody users 1.4K 2012-12-02 19:28 ./
        drwxrwx---  1 nobody users  128 2012-11-18 16:06 ../
        drwxrwxrwx  1 ntop  ntop   320 2012-12-02 20:22 ntop6/

    What I have done: I used chown -h ntop:ntop on the ntop symlink in /var/lib, and just to be sure, I ran chmod 777 on both directories. Permission denied actions:

        root@Tower:/var/lib# sudo -u ntop mkdir /var/lib/ntop/test
        mkdir: cannot create directory `/var/lib/ntop/test': Permission denied

    Any ideas?
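
    One thing worth checking (an observation from the listings above, not a certain diagnosis): resolving the symlink requires execute (traverse) permission on every parent directory, and /mnt/user/media is drwxrwx--- nobody:users, so the ntop user can only traverse it if it belongs to the users group. A quick way to see where the lookup fails:

        # namei walks each path component and shows its permissions
        namei -l /mnt/user/media/ntop6
        # if /mnt/user/media is the blocker, adding ntop to the owning
        # group (name taken from the listing above) is one fix:
        usermod -a -G users ntop

    namei is part of util-linux, so it should be available on Slackware.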

    Read the article

  • Raspberry Pi: move home & var folders to USB HDD?

    - by doni49
    I have a Raspberry Pi running Raspbian. I've installed Apache2, PHP & MySQL. Apache & MySQL are both configured to use subdirectories of /var. I'd like to use a directory on my USB HDD instead of my SD card, and I'd also like to move the /home directory to the USB HDD. I'd like to avoid re-partitioning my HDD if possible. I thought maybe I could use a symlink to tell Raspbian that /var is really at /media/USBHDD1/var and /home is really at /media/USBHDD1/home. I tried it last night but couldn't get it to work.
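
    Bind mounts are a common alternative to symlinks here (a sketch; it assumes the data has already been copied to /media/USBHDD1 and that the drive itself is mounted before these lines take effect):

        # /etc/fstab
        /media/USBHDD1/var   /var   none  bind  0  0
        /media/USBHDD1/home  /home  none  bind  0  0

    Copy with the services stopped (e.g. sudo service apache2 stop; sudo service mysql stop), then cp -a /var/* /media/USBHDD1/var/ before enabling the bind mounts, so nothing is mid-write during the move.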

    Read the article

  • DVD Drive can read a DVD title and file contents (very slowly) but not play DVDs

    - by Daisetsu
    I was able to watch DVDs on this computer in the past, when it was running XP. I recently upgraded to Windows 7 and haven't watched a DVD since. I put a DVD in and it took quite a long time, but Windows finally found the disc title, and I was able to explore the directories (I saw folders like VIDEO_TS). The problem is that when I try to play the DVD in a video player, I just get an error in VLC, which wouldn't tell me the exact error (it said to check the log, and I don't know where that is). Do I have to install some special drivers/certs in Windows 7 to play video?

    Read the article

  • Server 2008 blue screen after login - no desktop

    - by Jake
    I came into work today to find that our Windows Server 2008 machine, used as a terminal server for the thin clients, was not loading the users' desktops. What I have found is that when I log in via Remote Desktop, after applying user settings it gets to the 'Preparing your desktop' stage, but then the screen just goes blue. I have tried rebooting several times with no luck, and logging in locally at the server with no luck. Does anyone have any suggestions? I do not think it could be a virus attack, as we run antivirus on the server with OpenDNS filtering. Yesterday a software restriction policy was set up to only allow programs in the following directories to run, if that could be the problem, although it was working fine all day yesterday:

        C:\program files
        C:\programdata
        C:\windows

    Thank you. Jake

    Read the article

  • 403 in response to OPTIONS when updating a working copy despite full access

    - by user23419
    There is an SVN repository (a single repository), http://example.net/svn, which contains several projects (directories):

        http://example.net/svn/Project1
        http://example.net/svn/Project2

    The user has full access to the Project1 directory and no access to either the root or Project2. Everything works fine for a while: the user checks out http://example.net/svn/Project1, and commits and updates it successfully. But sometimes trying to update leads to the following error:

        Command: Update
        Error: Server sent unexpected return value (403 Forbidden) in response to OPTIONS
        Error: request for 'http://example.net/svn'
        Finished!

    Why does TortoiseSVN request something in the root? I have noticed that this happens after somebody else commits a copy or move operation. Checking out http://example.net/svn/Project1 again helps until the next time... The main question: how do I set up access rights for the user to avoid these errors? Note: granting the user any read or write access on the root directory is not an option, for security reasons.
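
    For reference, the setup described corresponds to a mod_authz_svn access file along these lines (a sketch; the user name is a placeholder):

        [/]
        user =

        [/Project1]
        user = rw

    One plausible explanation (an assumption, not a confirmed diagnosis): after a copy or move elsewhere in the repository, the client resolves the operation against the common repository root, so it issues requests such as OPTIONS against http://example.net/svn itself, which this configuration denies. A fresh checkout works until the next such commit because no root-level lookup is needed until then.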

    Read the article

  • Remote deploying wars to a Liferay installation

    - by iftrue
    With vanilla Tomcat, you can POST to URLs beneath SOMEURL/manager/ with a proper manager user role defined. The Liferay deployment of Tomcat, however, is missing the manager and host-manager applications, and when I copy the directories over from a vanilla Tomcat installation, I get the exception below:

        Exception: javax.servlet.ServletException: Error allocating a servlet instance
            org.apache.catalina.authenticator.AuthenticatorBase.invoke(AuthenticatorBase.java:558)
            org.apache.catalina.valves.ErrorReportValve.invoke(ErrorReportValve.java:102)
            org.apache.catalina.connector.CoyoteAdapter.service(CoyoteAdapter.java:298)
            org.apache.coyote.http11.Http11Processor.process(Http11Processor.java:852)
            org.apache.coyote.http11.Http11Protocol$Http11ConnectionHandler.process(Http11Protocol.java:588)
            org.apache.tomcat.util.net.JIoEndpoint$Worker.run(JIoEndpoint.java:489)
            java.lang.Thread.run(Thread.java:636)

        root cause: java.lang.SecurityException: Servlet of class
        org.apache.catalina.manager.HTMLManagerServlet is privileged and cannot
        be loaded by this web application
            (same stack frames as above)

    What's the proper way to remote-deploy wars to a Liferay instance? (Not portlets, in my case.)
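
    The SecurityException is Tomcat refusing to load the Manager servlet from a non-privileged context. A sketch of the usual remedy (assuming a standard Tomcat layout under the Liferay bundle; the docBase path is an assumption): declare the copied manager app as privileged in its own context file rather than just dropping the directory into webapps:

        <!-- conf/Catalina/localhost/manager.xml -->
        <Context path="/manager"
                 docBase="/opt/liferay/tomcat/manager"
                 privileged="true" />

    A user with the manager role is still needed in conf/tomcat-users.xml. Alternatively, Tomcat's deployment watcher can be used instead of the Manager app: copying a war into the configured appBase (or Liferay's deploy folder) avoids the privileged servlet entirely.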

    Read the article

  • Linux FHS: /srv vs /var ... where do I put stuff?

    - by wag2639
    My web development experience started with Fedora and RHEL, but I'm transitioning to Ubuntu. On Fedora/RHEL the default seems to be using the /var folder, while Ubuntu uses /srv. Is there any reason to use one over the other, and where does the line split? (It confused me so much that until very recently I thought /srv was /svr, for server/service.) My main concern deals with two types of folders:

        1. Default www and ftp directories.
        2. Specific application folders, like samba shares (possibly grouped
           under an smb folder) and web applications. Should web apps go in
           the www folder, or can I symlink each one to its own directory,
           like /srv/www/wordpress -> /srv/wordpress?

    I'm looking for best practice, industry standards, and qualitative reasons for which approach is best (or at least why it's favored).

    Read the article

  • Copying files within a Workgroup

    - by Andrew La Grange
    I have three boxes operating in a Windows Server workgroup within a closed network (no domain, no AD). There are several derivations of the scenario I'm about to outline, but I'm sure I will be able to retool the solution as and when I need to. Essentially the boxes are:

        2 x Windows Server 2008 R2 x64 Standard
        1 x Windows Server 2000 Standard

    I need to be able to schedule the copying and/or moving of files between various directories on each of the boxes. Each box has a different username and password for the administrator. I have PowerShell 2.0 on the two Win2K8 boxes (obviously). Previously I have used mapped network drives and command-line batches to copy the files, but I'd much rather use PowerShell if possible (with shares and/or $ notation). However, the Copy-Item cmdlet doesn't seem to be processing the Credential correctly. Perhaps some PowerShell gurus out there might be able to help me. Essentially I'd like to schedule a PS script to push backup files onto my Win2K box (old fileserver) periodically.
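
    One likely snag: Copy-Item's -Credential parameter is not supported by the FileSystem provider, so the credentials have to be established before the copy. A sketch using net use, with placeholder server names and paths throughout:

        # map the admin share on the old fileserver with its own credentials
        net use \\oldserver\d$ P@ssw0rd /user:oldserver\Administrator
        # plain Copy-Item now works over the authenticated session
        Copy-Item -Path 'D:\Backups\*.bak' -Destination '\\oldserver\d$\Backups\' -Force
        # drop the session when done
        net use \\oldserver\d$ /delete

    Run via Task Scheduler this works unattended; embedding a password in the script is the trade-off in a workgroup with no shared accounts.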

    Read the article

  • Error with FTP since binding via httpcfg

    - by Linda
    I was in a similar position to this question and bound two IP addresses using httpcfg. Since doing this, FTP does not seem to be working on IIS6 on Windows Server 2003. Any ideas what could be wrong? The command I ran was:

        httpcfg set iplisten -i xxx.xxx.x.x

    I get the following when I try to connect via FileZilla:

        Error: Connection timed out
        Error: Failed to retrieve directory listing

    The log file shows the following:

        #Software: Microsoft Internet Information Services 6.0
        #Version: 1.0
        #Date: 2009-08-17 13:54:05
        #Fields: date time c-ip cs-username cs-method cs-uri-stem sc-status sc-win32-status
        2009-08-17 13:54:05 91.85.70.17 Client [1]USER Client 331 0
        2009-08-17 13:54:05 91.85.70.17 Client [1]PASS - 230 0

    In the FTP site settings I have the site pointing to the IP address used with httpcfg and the port set to 21. Update: I can see a directory listing if I connect via the built-in command-line FTP client in Windows Vista. If I connect via Windows Explorer I start in the wrong folder and no files are listed, just directories.

    Read the article

  • Unmounting a ZFS pool while it is shared with sharenfs

    - by Ted W.
    I have a Solaris (OpenIndiana) system which is getting poor disk write performance. In order to enable the ZIL in this version of ZFS I need to add a line to /etc/system, and this will not take effect until I've unmounted and remounted the zpool. The catch is that this pool is shared via NFS to about 200 other servers to host users' home directories. I can guarantee that no users will be accessing the disks during this period of maintenance, but I would like to avoid having to issue an unmount on 200 systems just to unmount the disk on the Solaris box. My question is: with sharenfs, is it necessary to have all client systems disconnected before unmounting the filesystem on the host? If it's possible, how do you go about it? I've already tried unmounting the normal way, and it reports that the disk is busy. There is no lsof on Solaris, and pfiles (I think that's what it was) does not show anything obviously using the mounts.
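
    A sketch of the usual sequence (the dataset name is a placeholder; NFS clients do not normally hold a server-side mount busy, so the blocker is more likely a local process or the share itself):

        zfs unshare tank/home     # withdraw the NFS share first
        fuser -c /tank/home       # Solaris analogue of lsof for a mount point
        zfs unmount tank/home     # should succeed once local users are gone

    Clients with hard NFS mounts will simply block until the share returns, so a brief unshare/remount usually does not require touching the 200 client systems.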

    Read the article

  • EC2 out of space on root disk, moving it to ephemeral

    - by Joseph Misiti
    I am spawning a few test servers on EC2 that happen to be m1.larges; I am using these test servers for load balancing testing. Most of the servers I have used before have been backed by EBS, but these instances (Ubuntu 11.04) obviously come with a lot of ephemeral space located at /mnt. What I have noticed is that I am running out of space on the root disk. I am trying out this tutorial, http://www.turnkeylinux.org/docs/using-instance-storage, moving my /home and /usr directories to /mnt and then remounting them. This works, except it does not survive a reboot. Am I missing something here, or is the tutorial not completely correct? How do I make space on my / drive in a way that survives reboots?
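
    A sketch of making the relocation persistent with bind mounts in /etc/fstab (this assumes the trees were already copied to /mnt/home and /mnt/usr; note also that ephemeral storage survives a reboot but not a stop/start, so the copy must be redone after a stop/start):

        # /etc/fstab
        /mnt/home  /home  none  bind  0  0
        /mnt/usr   /usr   none  bind  0  0

    The likely reason the tutorial's manual mount --bind commands did not survive a reboot is simply that they were never added to fstab (or to a boot script that runs after /mnt is mounted).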

    Read the article

  • Trouble getting FTP login to work in IIS6

    - by Frank Rosario
    Hello all, I'm trying to set up an FTP site on IIS6 for one of my clients to pick up files from us. I've created the FTP site and set it not to isolate users (not necessary, as the FTP will be read-only with authentication). Here's the problem: the FTP is to be password protected, so I turned off anonymous access on the FTP site. I then created an ftpuser account on the machine and gave it read and browse-directory permissions on the FTP root directory. However, when I test the ftpuser login, I get a 530 "ftpuser cannot login" error. Yet if I browse to the same directory over HTTP (with anonymous access turned off there as well) and enter the ftpuser login info, I can download files and browse directories successfully. Why does ftpuser work over HTTP but not FTP? Shouldn't I be able to log in over FTP with the ftpuser login information I just created? Thanks in advance, - Frank

    Read the article

  • httpd.conf for case-insensitive file serving

    - by Anton Gogolev
    I'm a complete newbie with regard to managing Apache, so excuse me if I'm phrasing something incorrectly. I have a web site, say http://domain.com. The problem is that when I open http://domain.com/index.html in a web browser it displays the page, but when I access http://domain.com/Index.html (note the capital I), it responds with HTTP 404. How do I configure Apache to serve both of these files (and directories, for that matter) in a case-insensitive manner? The current httpd.conf is here. EDIT: Dan C, thanks for the hint. I basically want to allow users to download files from my server without being aware that Index.html and index.html are in fact different. I'm also very willing to know the ramifications of this decision.
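
    mod_speling can do this (a sketch; it assumes the module is available on this build of Apache, and CheckCaseOnly requires a reasonably recent 2.x release):

        # httpd.conf: match misses case-insensitively
        LoadModule speling_module modules/mod_speling.so
        CheckSpelling On
        CheckCaseOnly On   # correct case mismatches only, not general misspellings

    One ramification: matches are served via an external redirect, and ambiguous matches produce a 300 Multiple Choices page, so users end up on the canonically-cased URL.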

    Read the article

  • How to dump remote database without mysqldump?

    - by deceze
    I want to dump the database of my remotely hosted site at regular intervals using a shell script. Unfortunately the server is locked down pretty tight: it has no mysqldump installed, binary files can't be executed by normal users or in home directories (so I can't "install" it myself), and the database lives on a separate server, so I can't grab the files directly. The only thing I can do is log into the webserver via SSH and establish a connection to the database server using the mysql command-line client. How can I dump the contents to a file in SQL format, à la mysqldump? Bonus: if possible, how can I dump the contents directly to my end of the SSH connection?
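
    One approach that sidesteps the locked-down host entirely (a sketch; the host names are placeholders, and it assumes the database server accepts TCP connections from the webserver, which the working mysql client session suggests): run mysqldump on the local machine and tunnel the connection through the webserver:

        # forward local port 3307 to the DB server, via the webserver
        ssh -f -N -L 3307:dbhost:3306 user@webhost
        # a local mysqldump now writes the remote DB straight to this end
        mysqldump -h 127.0.0.1 -P 3307 --protocol=TCP -u dbuser -p dbname > dump.sql

    This also answers the bonus: the dump never touches the webserver's disk at all.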

    Read the article

  • Adding a virtual directory IIS 7.5 Windows 7 Ultimate x64

    - by Dave
    Trying to get my IIS 7.5 playing nice with VS 2008 on Windows 7 Ultimate 64-bit. I'm getting this error:

        System.Security.SecurityException: Request for the permission of type
        'System.Web.AspNetHostingPermission, System, Version=2.0.0.0,
        Culture=neutral, PublicKeyToken=b77a5c561934e089' failed.

    This happens when accessing a virtual directory outside C:\inetpub\wwwroot. I'd like to be able to create virtual directories outside the root if I can. I've added NETWORK SERVICE to the folder hosting the virtual directory, still no luck. This folder is on my C: drive, not a share. TIA
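
    AspNetHostingPermission failures usually mean the content is running under a partial-trust level. One commonly suggested check (a sketch, not a guaranteed fix; that partial trust is the cause here is an assumption) is to grant the application full trust in its web.config:

        <configuration>
          <system.web>
            <trust level="Full" />
          </system.web>
        </configuration>

    If the folder were on a UNC path, code access security would treat it as intranet-zone code; for a local folder, it is also worth checking whether a parent web.config or applicationHost.config lowers the trust level.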

    Read the article

  • What Linux permissions are needed for www?

    - by Xeoncross
    I know that 777 is full read/write/execute permission for owner/group/other, so that doesn't seem to be needed, as it leaves random users full permissions. What permissions need to be set on /var/www so that:

        1. source control like git or svn,
        2. normal users in a group like "websites" (or added to "www-data"),
        3. servers like Apache or lighttpd, and
        4. PHP/Perl/Ruby

    can all read, create, and run files there? (See the sketch after this question for one common arrangement.) If I'm correct, Ruby and PHP scripts are not "executed" directly but passed to an interpreter, so there is no need for execute permission on files in /var/www. Therefore, it seems like the correct permission would be chmod -R 1660, which would:

        1. make all files shareable by these four entities,
        2. make all files non-executable by mistake,
        3. block everyone else from the directory entirely, and
        4. set the permission mode to "sticky" for all future files.

    Is this correct? Update: I just realized that files and directories might need different permissions. I was talking about files above, so I'm not sure what the directory permissions would need to be.
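
    A common arrangement (a sketch of one convention, not the only correct answer; the group name is a placeholder) uses group ownership plus the setgid bit on directories, so new files inherit the group automatically:

        # give the tree to the shared group
        chgrp -R www-data /var/www
        # directories: rwxrwsr-x (the setgid bit makes new files inherit the group)
        find /var/www -type d -exec chmod 2775 {} +
        # files: rw-rw-r-- (readable by the server, writable by the group, never executable)
        find /var/www -type f -exec chmod 0664 {} +

    Directories always need the execute bit to be traversable, which is why files and directories get different modes; the "sticky for future files" effect hoped for from mode 1660 actually comes from setgid (2xxx) on directories, not the sticky bit (1xxx).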

    Read the article

  • What is the list of special variables available when writing a shell command for a context menu

    - by giovanni.pellicciotta
    When extending the Windows shell context menu (e.g. adding an 'Open command here' entry on directories), a 'command' key needs to be created in the registry. The value of this 'command' key can apparently be any valid command line. I want to know which 'special variables' are available for use inside this command line. For example, I use the following command for opening a cmd window from a directory's context menu (*):

        cmd.exe /e:on /f:on /s /k pushd "%V"

    I cannot find any reference to what %V actually means, or what the full list of such variables is.

    (*) The following registry keys are created for this:

        [HKEY_LOCAL_MACHINE\SOFTWARE\Classes\Directory\shell\cmdshell]
        @="Open Command Prompt Here"

        [HKEY_LOCAL_MACHINE\SOFTWARE\Classes\Directory\shell\cmdshell\command]
        @="cmd.exe /e:on /f:on /s /k pushd \"%V\""
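
    For what it's worth, a partial list gathered from common usage (Microsoft does not document these placeholders in one place, so treat the glosses below as best-effort rather than authoritative):

        %1, %L  - path of the selected item (%L forces the long-name form)
        %V      - path of the selected item, falling back to the folder itself
                  when the verb is invoked on a folder or its background
        %W      - the working directory
        %I      - a handle to an ITEMIDLIST, for shell-aware programs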

    Read the article

  • Why does Windows spooler require an administrator account?

    - by Software Monkey
    Does anyone know what changes I might need to make to allow restricted users to print using a printer configured for spooling? My Windows XP SP3 system currently requires me to use an admin account for printing if the printer is configured to spool documents before printing. If the printer is configured for direct printing, it works for all accounts. This used to work; some months back it just stopped, and I can't pin down why. The printer itself, an HP PSC 1200 (an old printer), is configured so that Everyone has Print authority and my specific (restricted) account has Full authority, that is, Print, Manage Printers and Manage Documents. My HDD is locked down for restricted users, giving them only read authority to the entire file system except their data directories, which is how I have run my systems for years. I assume there may be a directory somewhere that I need to allow users to write to.
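
    The last sentence is probably the key: spooling writes temporary .SPL/.SHD files to the spool directory, so a filesystem locked down to read-only would break exactly the spooled path while leaving direct printing working. A sketch of restoring write access (assuming the default spool location; the actual path can be checked under Printers > Server Properties > Advanced):

        rem grant Change access on the spool directory to ordinary users
        cacls "C:\WINDOWS\system32\spool\PRINTERS" /E /G Users:C

    Some drivers also need write access to the user's own %TEMP% directory.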

    Read the article

  • Overcrowded Windows XP Folders

    - by BlairHippo
    I know that, technically, an individual Windows XP directory can hold an immense number of files (over 4.29 billion, according to a quick Google search). However, is there a practical ceiling where too many files in one directory starts having an impact on reads to those files? If so, what factors would exacerbate or help the issue? I ask because my employer has several hundred XP machines in the field at client sites, and the performance on some of the older ones is getting "sludgy." The machines download and display client-defined images, and my supervisor and I suspect that our slacktastic approach to cache management could be to blame. (Some of the directories have tens of thousands of images in them.) I'm trying to gather evidence to support or contest the theory before spending time on a coding fix.
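
    One frequently cited factor on NTFS is 8.3 short-name generation: each new file in a directory with tens of thousands of entries forces a uniqueness scan of the existing short names. A sketch of disabling it (a system-wide change that affects newly created files only, and some legacy installers still rely on short names, so test first):

        rem disable 8.3 name generation on NTFS (Windows XP)
        fsutil behavior set disable8dot3 1

    Splitting the cache into subdirectories (e.g. by the first two characters of the file name) is the more robust fix, since the size of the directory index itself is what degrades.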

    Read the article

  • Missing APR on apache2 ./configure

    - by arby
    I want to build the latest stable version of Apache 2. I downloaded the source and put APR & APR-util in the srclib folder, then changed directories to ./srclib/apr and ran:

        ./configure --prefix=/usr/local/apr
        sudo make
        sudo make install

    This seemed to install APR OK, but when I run ./configure from the apr-util directory, I receive the error:

        configure: error: APR could not be located. Please use the --with-apr option.

    Using ./configure --prefix=/usr/local/apr-util --with-apr=/usr/local/apr, the error becomes:

        checking for APR... configure: error: the --with-apr parameter is incorrect.
        It must specify an install prefix, a build directory, or an apr-config file.

    Why can't it find APR?
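
    Two things worth checking (a sketch; the version number in the script name is an assumption for APR 1.x): --with-apr accepts either the install prefix or the apr-config script itself, and for APR 1.x that script is named apr-1-config under the prefix's bin directory:

        cd srclib/apr-util
        ./configure --prefix=/usr/local/apr-util \
                    --with-apr=/usr/local/apr/bin/apr-1-config
        make && sudo make install

    If that file does not exist, the earlier make install for APR may not have completed; checking /usr/local/apr/bin is the first step. (When building httpd itself, ./configure --with-included-apr uses the copies in srclib and avoids this problem entirely.)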

    Read the article

  • How to make FileZilla open all the required files with one click

    - by Omar Tariq
    Is there any way of configuring FileZilla so that I can open all the files on a server that I need to edit with just one click? For example, if the files are like this:

        /home/abc/def/one.txt
        /home/abc/def/yet/another/directory/two.txt
        /home/abc/def/ghi/yet/another/directory/three.txt

    then it is very time-consuming to navigate through each directory and open the required files. These are only 3 files, but what if we have around 10 to 20? Yes, copying the paths of the directories is one approach, but something built in, so that I could click a button like "open all the required files of this connection" and have FileZilla open all the files in the editor (as set in FileZilla preferences), would be great!

    Read the article

  • Postfix tutorial inconsistency

    - by Desmond Hume
    I'm following this tutorial to set up a Postfix/Dovecot mail server with Postfix Admin as a web front end. As regards the directory structure for virtual mail users, the author of the tutorial writes:

        Virtual mail users are those that do not exist as Unix system users.
        They thus don't use the standard Unix methods of authentication or
        mail delivery and don't have home directories. That is how we are
        managing things here: mail users are defined in the database created
        by Postfix Admin rather than existing as system users. Mail will be
        kept in subfolders per domain and account under /var/vmail - e.g.
        [email protected] will have a mail directory of /var/vmail/example.com/me.

    But when he gives instructions for configuring Postfix Admin, he says config.inc.php should contain:

        // Mailboxes
        // If you want to store the mailboxes per domain set this to 'YES'.
        // Examples:
        //   YES: /usr/local/virtual/domain.tld/[email protected]
        //   NO:  /usr/local/virtual/[email protected]
        $CONF['domain_path'] = 'NO';

    Is there an inconsistency?
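
    For what it's worth, Postfix Admin has two related settings, and the combination below (a sketch based on the stock config.inc.php; whether the tutorial sets the second one elsewhere is worth checking) is what produces the /var/vmail/example.com/me layout the tutorial describes:

        $CONF['domain_path'] = 'YES';      // create a subdirectory per domain
        $CONF['domain_in_mailbox'] = 'NO'; // mailbox dir is 'me', not the full address

    With domain_path set to 'NO', mailboxes would land directly under /var/vmail, so the quoted snippet does look inconsistent with the stated directory structure.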

    Read the article
