Search Results

Search found 10546 results on 422 pages for 'apache commons'.


  • htaccess problem

    - by Holian
    Hello! I have a few lines in my .htaccess:

        Options +FollowSymLinks
        RewriteEngine on
        RewriteCond %{HTTP_HOST} ^mydomain.org [NC]
        RewriteRule ^(.*)$ http://www.mydomain.org/$1 [L,R=301]
        # index.php to /
        RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*index\.php\ HTTP/
        RewriteRule ^(.*)index\.php$ /$1 [R=301,L]
        # forum
        RewriteCond %{REQUEST_URI} !^/forums/
        RewriteRule index.php/(.*) http://forum.mydomain.org/$1 [R=301,L]

    This code works well, but I don't know whether it is standard. I would like .htaccess to do the following:

        * mydomain.org goes to www.mydomain.org (works)
        * mydomain.org/index.php goes to www.mydomain.org (works)
        * forum.mydomain.org stays forum.mydomain.org (works?)
        * www.forum.mydomain.org goes to forum.mydomain.org (how?)

    Could anyone help me fix this code? Thank you.
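
    A minimal sketch of the missing rule, assuming www.forum.mydomain.org is served by the same .htaccess (mydomain.org is the question's placeholder):

        # send www.forum.mydomain.org to forum.mydomain.org
        RewriteCond %{HTTP_HOST} ^www\.forum\.mydomain\.org$ [NC]
        RewriteRule ^(.*)$ http://forum.mydomain.org/$1 [R=301,L]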

    Read the article

  • one domain name, two servers

    - by MarcoKotrotsos
    Good morning. My client wants to keep an old server (running PHP-Fusion) under the same domain name as a new server (running Drupal), so I have a question: is this possible, and most importantly, how would I do it? The old server's URL structure is projectname.domain.com, while the new server's is domain.com/projectname. Can these two servers run side by side on the same domain name, but with different URL structures? Thank you, Marco
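
    A hedged note: this works at the DNS level, because a sub-domain and the bare domain can resolve to different machines. A zone-file sketch (the IPs are placeholders):

        ; projectname.domain.com stays on the old box, domain.com moves to the new one
        domain.com.              IN A  203.0.113.10   ; new server (Drupal)
        projectname.domain.com.  IN A  203.0.113.20   ; old server (PHP-Fusion)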

    Read the article

  • Need ability to set configuration options using a single method that will work across multiple server configurations.

    - by JMC Creative
    I'm trying to set post_max_size and upload_max_filesize in a specific directory for a web application I'm building. I've tried the following in a .htaccess file in the script directory (upload.php is the script that needs the special configuration):

        <Files upload.php>
            php_value upload_max_filesize 9998M
            php_value post_max_size 9999M
        </Files>

    That doesn't work at all. I've tried it without the script-name specificity, where the only thing in the .htaccess file is:

        php_value upload_max_filesize 9998M
        php_value post_max_size 9999M

    This works on my PC-based XAMPP server, but throws a "500 Misconfiguration Error" on my production server. I've also tried creating a php.ini file in the directory with:

        post_max_size = 9999M
        upload_max_filesize = 9998M

    But this doesn't always work either. And lastly, using the following in the PHP script doesn't work, presumably because the settings have already been applied by the time the parser reaches the line:

        <?php
        ini_set('post_max_size','9999M');
        ini_set('upload_max_filesize','9998M');
        ?>
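
    A hedged note: post_max_size and upload_max_filesize are PHP_INI_PERDIR settings, so ini_set() at runtime can never change them (the request body has already been read by then); and php_value only works under mod_php, which would explain the 500 error if the production server runs PHP as CGI/FastCGI. In that case a .user.ini file (PHP 5.3+) is the usual per-directory mechanism:

        ; .user.ini in the script directory -- assumes PHP >= 5.3 running as CGI/FastCGI
        post_max_size = 9999M
        upload_max_filesize = 9998M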

    Read the article

  • How can I exclude a file in a folder from basic auth (regex help)?

    - by simon180
    Hi. I have a folder on my site which contains admin files, and I've added basic auth following a little unwanted attention. This works fine, but a couple of the admin functions won't work through basic auth because they handle file uploads, so I want to exclude those files from the auth. It shouldn't have any security implications, as a rogue user wouldn't be able to access the pages that create the session these functions require. I am using the following basic code to exclude a file:

        <FilesMatch "(index.php\/myadminfolder\/myurl\/myaction/someotherstuff?)$">
            Satisfy Any
            Order allow,deny
            Allow from all
            Deny from none
        </FilesMatch>

    The URL exclusion is not working. The URL to exclude is of the form:

        index.php/directory/subdirectory/action/uniqueid/blah

    What is the correct string to give FilesMatch to exclude any request that starts with the pattern index.php/directory/subdirectory/action, regardless of what comes after action? Thanks, Simon
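
    A hedged sketch of one common approach, assuming Apache 2.2-style access control as in the question: FilesMatch tests only the filename (here always index.php), never the path-info after it, so a URI-based condition is needed instead. SetEnvIf can mark the requests to let through (the path is the question's placeholder):

        # alongside the existing AuthType/AuthUserFile/Require lines
        SetEnvIf Request_URI "^/index\.php/directory/subdirectory/action" noauth
        Order allow,deny
        Allow from env=noauth
        Satisfy Any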

    Read the article

  • AWSTATS - manual update error (permissions)

    - by Lewis
    Error: Couldn't open file "/var/www/awstats/awstats032014.site.net.tmp.9198" for write: Permission denied
    Setup ('/etc/awstats/awstats.site.net.conf' file, web server or permissions) may be wrong.
    Check config file, permissions and AWStats documentation (in 'docs' directory).

    I get this error when manually triggering an AWStats update (via the browser link). I have set the permissions of /var/www/awstats/ to 775 and still get the error. If I create a new file in that folder, the default permissions come out as 774, which should work.
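
    A hedged sketch of the usual fix: 775 only helps if the web server's user owns the directory or belongs to its group, so "permission denied" here usually means the group is wrong rather than the mode bits. Assuming the web server runs as www-data (the account name varies by distribution):

        # give the web server's group write access to the AWStats data directory
        sudo chgrp -R www-data /var/www/awstats
        sudo chmod -R g+w /var/www/awstats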

    Read the article

  • Same index for all (sub-)directories?

    - by whatisthis
    Hi. I was wondering whether it is possible to write a .htaccess rule that makes the server use an index.php file in a SINGLE directory as the index file for every directory/sub-directory on my server, rather than placing the exact same index.php in 200+ directories. If my description isn't clear, what I essentially mean is: /files/index.php is to be used as the index for, for example, /files/morefiles, as well as for all directories and sub-directories within /files/, even though those directories have no index file themselves. Thanks to all in advance.
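
    A hedged sketch: mod_dir's DirectoryIndex accepts a local URL beginning with a slash, so a single line at the root can fall back to the shared index whenever a directory has no index.php of its own:

        # .htaccess at the document root -- assumes mod_dir and AllowOverride Indexes
        DirectoryIndex index.php /files/index.php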

    Read the article

  • nginx load balancing with IIS backend servers waiting for the Host header

    - by Elgreco08
    I have an Ubuntu 10.04 box with nginx 0.8.54 running as a load-balancing proxy named www.local.com. I have two IIS backend servers which respond based on the request's Host header:

        web1.local.com
        web2.local.com

    Problem: when I hit my nginx balancer on www.local.com, the backends respond with the blank default IIS page, since they are waiting for the right Host header (e.g. web1.local.com). My nginx.conf:

        upstream backend {
            server web1.local.com:80;
            server web2.local.com:80;
        }
        server {
            listen 80;
            location / {
                proxy_pass http://backend;
                proxy_set_header X-Real-IP $remote_addr;
                proxy_set_header Host $proxy_host;
            }
        }

    Any hint?
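
    A hedged hint: $proxy_host expands to the host part of the proxy_pass target, which here is the upstream group name "backend", and IIS has no binding for that. One common fix is to forward the client's original Host header and add a matching binding (www.local.com) to both IIS sites:

        location / {
            proxy_pass http://backend;
            proxy_set_header X-Real-IP $remote_addr;
            # forward the Host the client sent; assumes each IIS site
            # also has a binding for www.local.com
            proxy_set_header Host $host;
        }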

    Read the article

  • ModRewrite Domain

    - by Mike Knoop
    I've done a little research into ModRewrite rules and conditions but have not been able to find a satisfactory set of rules/conds which achieve the effect I'm looking for. Essentially, I have a directory on domain A (http://www.domaina.com/dir/) which I would like to redirect to a different directory on domain B (http://www.domainb.com/diff_dir/). Note that I only want to apply the rewrite rule if the user is attempting to access /dir/ on domaina. If they are accessing a different directory or root I do not want to rewrite the URL. Thank you!
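
    A minimal sketch, assuming the rule lives in domain A's root .htaccess (the domain names and paths are the question's placeholders):

        RewriteEngine on
        # only requests for /dir/... on domaina are redirected
        RewriteCond %{HTTP_HOST} ^(www\.)?domaina\.com$ [NC]
        RewriteRule ^dir/(.*)$ http://www.domainb.com/diff_dir/$1 [R=301,L]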

    Read the article

  • mod_rewrite all but two files causing loop

    - by mpounsett
    I'm trying to set up a web site to allow the creation of a semaphore file to close the site. The logic I want to follow is:

        1. when the semaphore file exists, and
        2. the request is not for /style.css or /favicon.ico,
        3. show the content of /closed.html

    I have 1 and 3 working, but my exceptions for 2 result in a processing loop when style.css or favicon.ico are requested. This is my most recent attempt:

        RewriteEngine on
        RewriteCond %{REQUEST_URI} !^/style.css
        RewriteCond %{REQUEST_URI} !^/favicon.ico
        RewriteCond /usr/local/etc/site/closed -f
        RewriteRule ^.*$ /closed.html [L]

    This is in a VirtualHost block, not in a Directory. There is no .htaccess file in play. I have also recently tried this, based on an answer I found elsewhere, but with the same (looping) result:

        RewriteCond %{REQUEST_URI} ^/style.css [OR]
        RewriteCond %{REQUEST_URI} ^/favicon.ico
        RewriteRule ^.*$ - [L]
        RewriteCond /usr/local/etc/site/closed -f
        RewriteRule ^.*$ /closed.html [L]

    I expect a request for /style.css or /favicon.ico to fail to match one of the first two rewrite conditions, which should prevent the URI from being rewritten, which should stop the mod_rewrite iteration. However, mod_rewrite seems to think the URI has been rewritten in those cases, and iterates over the rules again (and again, and again). The above works properly in all cases except style.css and favicon.ico; in those cases I exceed the loop limits. What am I missing here to cause the rewrite iteration to stop when someone requests style.css or favicon.ico?

    EDIT: Here's a loglevel 9 example of what happens using the first ruleset when a request arrives for /style.css. This is just the first two iterations; it continues to loop identically until the limit is reached. (The client IP, timestamp, and host/sid prefix are identical on every line and trimmed here for readability.)

        [rid#80c1db0a0/initial] (2) init rewrite engine with requested uri /style.css
        [rid#80c1db0a0/initial] (3) applying pattern '^.*$' to uri '/style.css'
        [rid#80c1db0a0/initial] (4) RewriteCond: input='/style.css' pattern='!^/style.css' => not-matched
        [rid#80c1db0a0/initial] (1) pass through /style.css
        [rid#80c1dd0a0/initial] (2) init rewrite engine with requested uri /style.css
        [rid#80c1dd0a0/initial] (3) applying pattern '^.*$' to uri '/style.css'
        [rid#80c1dd0a0/initial] (4) RewriteCond: input='/style.css' pattern='!^/style.css' => not-matched
        [rid#80c1dd0a0/initial] (1) pass through /style.css
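
    A hedged sketch of one workaround: in VirtualHost context every rewritten request is re-injected as an internal redirect that runs the ruleset again, so /closed.html itself also needs excluding (and the dots should be escaped, since an unescaped . matches any character):

        RewriteEngine on
        RewriteCond %{REQUEST_URI} !^/style\.css$
        RewriteCond %{REQUEST_URI} !^/favicon\.ico$
        RewriteCond %{REQUEST_URI} !^/closed\.html$
        RewriteCond /usr/local/etc/site/closed -f
        RewriteRule ^.*$ /closed.html [L]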

    Read the article

  • Overriding IndexIgnore in .htaccess file

    - by vsr
    I have IndexIgnore * in a directory for which I don't want to allow directory listing. Now, in a sub-directory, I want to override the IndexIgnore directive defined in parent directory and allow directory listing. The documentation for IndexIgnore says The IndexIgnore directive adds to the list of files to hide when listing a directory. How do I allow directory listing for this sub-directory?
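
    A hedged sketch, assuming Apache 2.3.10 or later, where mod_autoindex gained an IndexIgnoreReset directive for exactly this case (earlier versions cannot empty the inherited list):

        # .htaccess in the sub-directory
        Options +Indexes
        IndexIgnoreReset ON
        IndexIgnore .??*    # optionally re-hide dotfiles only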

    Read the article

  • httpd.conf ruined - can't access my VPS anymore

    - by Jazerix
    Okay, so this may be incredibly stupid, but I was configuring my httpd.conf file yesterday, and after a server restart I can no longer access the machine. Port 80 is working fine and displays my web pages, but when I try to access the server via SSH, the connection is refused. I also cannot reach Webmin on port 10000 or connect via FTP. :/ Do I need to recreate the whole site, or is there a way to get into it?
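
    A hedged suggestion, assuming the VPS provider offers an out-of-band console (httpd.conf alone shouldn't affect SSH, so something else likely changed with the restart): from that console, check which ports are actually listening:

        # run as root from the provider's console, since SSH is refused
        netstat -tlnp    # or: ss -tlnp -- lists listening ports and the owning processes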

    Read the article

  • Solr on Tomcat (Ubuntu OS) installation help

    - by Camran
    I have to install Solr on my Ubuntu server. However, Solr won't work without Tomcat (or another container) and Java. I have successfully installed tomcat6 and Java. BUT: a tomcat6 guide online says I should configure iptables to allow connections on port 8080, which I have done. The guide then says I can test tomcat6 by going to http://my_ip_adress:8080, but this just makes the browser load and wait for a response, and finally display "website not available". I have NO clue how to install Solr with Tomcat. Does anybody know how? How do I know tomcat6 works? BTW: when I run /etc/init.d/tomcat6 start it says OK. If you need anything else, let me know; I really need help with this one. Thanks. UPDATE: when executing sudo /etc/init.d/tomcat6 status, the response is "Tomcat servlet engine is running with pid 28641".
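
    A hedged sketch of the classic manual deployment for Solr of that era on Ubuntu's tomcat6 package (the x.y.z version and paths are placeholders):

        # copy the Solr web app and a Solr home directory into place
        sudo cp apache-solr-x.y.z/dist/apache-solr-x.y.z.war /var/lib/tomcat6/webapps/solr.war
        sudo cp -r apache-solr-x.y.z/example/solr /var/lib/solr
        # point Tomcat at solr/home with a context file
        sudo tee /etc/tomcat6/Catalina/localhost/solr.xml <<'EOF'
        <Context docBase="/var/lib/tomcat6/webapps/solr.war">
          <Environment name="solr/home" type="java.lang.String" value="/var/lib/solr" override="true"/>
        </Context>
        EOF
        sudo /etc/init.d/tomcat6 restart

    If the browser still hangs, checking from the server itself (curl http://localhost:8080) helps separate a Tomcat problem from a firewall one.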

    Read the article

  • MySql backup (MySqldump questions)

    - by Camran
    I have a VPS with Ubuntu 9 server. I need to back up my MySQL database. Can MySQL make backups automatically? If so, how? If not, how should I do it? The website is a classifieds site (PHP, MySQL, etc.). Thanks
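
    A hedged sketch: MySQL doesn't schedule backups itself, but the standard approach is mysqldump driven by cron (the user, password, and database name are placeholders):

        #!/bin/sh
        # /etc/cron.daily/mysql-backup -- compressed, date-stamped dump of one database
        mysqldump -u backupuser -pSECRET classifieds | gzip > /var/backups/classifieds-$(date +%F).sql.gz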

    Read the article

  • HTTPS load balancing based on some component of the URL

    - by user38118
    We have an existing application that we wish to split across multiple servers (for example: 1000 users total, 100 users per server across 10 servers). Ideally, we'd like to be able to relay the HTTPS requests to a particular server based on some component of the URL. For example:

        Users 1 through 100 go to http://server1.domain.com/
        Users 101 through 200 go to http://server2.domain.com/
        etc.

    where the incoming requests look like this:

        https://secure.domain.com/user/{integer user # goes here}/path/to/file

    Does anyone know of an easy way to do this? Pound looks promising, but it doesn't look like it supports routing based on the URL like this. Even better would be if it didn't need to be hard-coded: the load balancer could make a separate HTTP request to another server to ask "Hey, what server should I relay to for a request to URL {the URL that was requested goes here}?" and relay to the hostname returned in the HTTP response.
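
    A hedged sketch of the static version using nginx (not mentioned in the question, but its map directive can match the URL with regexes in reasonably recent 1.x releases); the hostnames and ranges are placeholders:

        # first matching regex wins
        map $uri $user_backend {
            ~^/user/([1-9]|[1-9][0-9]|100)/  backend1;   # users 1-100
            ~^/user/(1[0-9][0-9]|200)/       backend2;   # users 101-200
            default                          backend1;
        }
        upstream backend1 { server server1.domain.com:80; }
        upstream backend2 { server server2.domain.com:80; }
        server {
            listen 443 ssl;
            server_name secure.domain.com;
            # ssl_certificate / ssl_certificate_key omitted in this sketch
            location / {
                proxy_pass http://$user_backend;
            }
        }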

    Read the article

  • Debian 7 and PHP 5.4.4 error reporting

    - by milovan
    I use the default php.ini, and then in my PHP script (local.settings.php in Drupal) I simply set:

        ini_set('error_reporting', 'E_ALL & ~E_NOTICE & ~E_STRICT');

    According to the documentation this means "show all messages minus notices and strict warnings". But in my case it still shows strict warnings! I have no idea why, because I clearly stated "~E_STRICT". If I comment it out, I still see strict warnings, which means the php.ini default of "E_ALL & ~E_DEPRECATED & ~E_STRICT" didn't do its job either, as it also contains "~E_STRICT". On Debian 6 there was the Suhosin patch, which controlled the use of ini_set() in PHP scripts, especially when you tried to get more memory than the defined cap. On Debian 7 there is no Suhosin, nor any other security layer that might restrict ini_set(). So what might cause the ini_set() call not to take effect?
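
    A hedged observation: in the script that value is a quoted string, and ini_set() does not evaluate constant names inside strings (only php.ini parsing does), which would explain why the directive doesn't behave as written. Passing the bitmask expression unquoted is the usual fix:

        <?php
        // pass the computed bitmask, not a string containing constant names
        error_reporting(E_ALL & ~E_NOTICE & ~E_STRICT);
        // equivalent: ini_set('error_reporting', E_ALL & ~E_NOTICE & ~E_STRICT);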

    Read the article

  • Overload Protection

    - by Tyron
    Is there a simple way to redirect a visitor (via .htaccess or a PHP script) to a static page when the server is overloaded by too many requests? It doesn't have to protect against huge bursts of requests or DoS attacks. I think our server would be protected enough if we could prevent the standard website from being shown and instead show a single file, "overloaded.html". Also, how could I measure whether a server is overloaded in a typical managed-server environment (i.e. non-root access to a Linux server)?
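
    A hedged sketch of the PHP side, assuming a Linux host where sys_getloadavg() is available (it needs no root access; the threshold is a placeholder to tune):

        <?php
        // show a static page when the 1-minute load average is too high
        $load = sys_getloadavg();
        if ($load[0] > 10.0) {
            header('HTTP/1.1 503 Service Unavailable');
            header('Retry-After: 120');
            readfile(__DIR__ . '/overloaded.html');
            exit;
        }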

    Read the article

  • CentOS 5: Can't access webserver via http from host OS?

    - by adred
    My server is installed on a guest OS on VMware. It really bugs me that I can't access it from the host OS's browser, even though there is no discrepancy between the /etc/hosts, /etc/sysconfig/network, and httpd.conf files. Issuing the ifconfig command also returns the same IP. I have also enabled networking in the VMware settings, and I can ping the guest OS's IP from the host. Any insights, please?
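
    A hedged guess at the usual culprit: CentOS 5's default firewall blocks inbound port 80 even when Apache is running, which matches ping working while HTTP doesn't. A sketch of the fix, run as root on the guest:

        iptables -I INPUT -p tcp --dport 80 -j ACCEPT   # allow inbound HTTP
        service iptables save                           # persist across reboots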

    Read the article

  • permissions on upload folder not working

    - by Camran
    I have a PHP script which uploads images to a folder. I have these permissions on the upload folder:

        drwxrwxr-- 4 user user 4096 2010-06-02 16:20 temp_images

    Shouldn't these permissions be enough for files to be uploaded to the folder? It doesn't work; it only works when I set the permissions to 777. "user" has been added to the www-data group, still no luck. Any ideas why?
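
    A hedged explanation: PHP writes as the web server's user (typically www-data), and since the folder is owned user:user, www-data falls under "other" and only gets read access; adding user to the www-data group doesn't change the folder's own group. A sketch of the usual fix:

        sudo chgrp www-data temp_images
        sudo chmod 775 temp_images   # rwx for owner and group; no need for 777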

    Read the article

  • Ubuntu server security: is this enough?

    - by Camran
    I have a classifieds website which uses PHP 5 and MySQL, and also Java (Solr). I am new to Linux and VPSs. I have installed SSH and iptables, and I use PuTTY as a terminal. FileZilla is installed on my computer, and whenever I connect to my VPS the host field in FileZilla says "Sftp://ip-adress", so I am guessing it is a safe connection. I used whereis sshd to find out whether I had sshd installed, and it returned the places where it was installed, so it was already there; I didn't install it myself. Now, my question is: is this enough? What other security measures should I take? Any good articles about security and how to set it up on a VPS? Remember, I have Windows XP on my laptop, but the OS for my VPS is Ubuntu 9.10. Also, I have apache2 installed. Thanks
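
    A hedged sketch of a few standard first steps on Ubuntu, not a complete hardening guide:

        sudo apt-get update && sudo apt-get upgrade   # keep packages patched
        sudo apt-get install fail2ban                 # bans IPs that brute-force SSH
        sudo ufw allow 22/tcp                         # then open only what you serve
        sudo ufw allow 80/tcp
        sudo ufw enable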

    Read the article

  • Missed something? Can't upload files to server (permissions)

    - by Camran
    I can upload files as "root" to the Ubuntu server. Then I created a user (me), added the user to the group www-data, and assigned rwx permissions to www-data. But when I try to upload, delete, or modify files VIA FILEZILLA, I can't. Via the terminal I can change files using sudo. What should I do to be able to upload files without getting "permission denied" in FileZilla? If you need more input, let me know. Thanks
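
    A hedged sketch, assuming the web root is /var/www and the group write bit is what's missing; note also that group membership only takes effect on a fresh login, so the SFTP session must be reconnected after adding the user to www-data:

        sudo chgrp -R www-data /var/www
        sudo chmod -R g+w /var/www
        sudo chmod g+s /var/www   # new files inherit the www-data group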

    Read the article

  • mod_rewrite [L] flag not working as expected?

    - by bobobobo
    I thought the [L] flag indicated that "this rule should be the last rule processed for this HTTP request". However, when I have two rules like:

        RewriteRule ^test$ php/test.php [L]
        RewriteRule (.*) error.php

    what always happens is that requests to http://localhost/test go to error.php, not to test.php as I expected, since I put the [L] there. If you comment out the second rule, requests to http://localhost/test go to test.php as expected. What I'm really trying to do is catch 404 errors with mod_rewrite. It's possible what I'm trying to do is just plain wrong, but I still want to know why the catch-all rule is active even though I put an [L] after the ^test rule. I've seen a long listing in here where the server admin lists a bunch of paths that begin with the recognized directories, but I wanted to avoid doing that by simply using a nice catch-all rule.
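
    A hedged explanation and sketch: in per-directory (.htaccess) context, [L] only ends the current pass; the rewritten URL is re-injected and the ruleset runs again, so on the second pass /php/test.php matches the catch-all. Guarding the catch-all so it skips real files is one common way out:

        RewriteRule ^test$ php/test.php [L]
        # second pass: don't send existing files (like php/test.php) to error.php
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteRule (.*) error.php [L]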

    Read the article

  • Please Help Me Optimize This

    - by Zero
    I'm trying to optimize my .htaccess file to avoid performance issues. In my .htaccess file I have something that looks like this:

        RewriteEngine on
        RewriteCond %{HTTP_USER_AGENT} bigbadbot [NC,OR]
        RewriteCond %{HTTP_USER_AGENT} otherbot1 [NC,OR]
        RewriteCond %{HTTP_USER_AGENT} otherbot2 [NC]
        RewriteRule ^.* - [F,L]

    The first condition (bigbadbot) handles about 100 requests per second, whereas the other two conditions below it only handle a few requests per hour. My question is: since the first condition handles about 99% of the traffic, would it be better to split these rules into two separate rulesets? For example:

        RewriteEngine on
        RewriteCond %{HTTP_USER_AGENT} bigbadbot [NC]
        RewriteRule ^.* - [F,L]

        RewriteCond %{HTTP_USER_AGENT} otherbot1 [NC,OR]
        RewriteCond %{HTTP_USER_AGENT} otherbot2 [NC]
        RewriteRule ^.* - [F,L]

    Can someone tell me what would be better in terms of performance? Has anyone ever benchmarked this? Thanks!
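
    A hedged note: with [OR], mod_rewrite already skips the remaining conditions in the group as soon as one matches, so the gain from splitting should be small. A common alternative is to collapse the bots into a single regex so only one condition runs per request (the bot names are the question's placeholders):

        RewriteEngine on
        RewriteCond %{HTTP_USER_AGENT} (bigbadbot|otherbot1|otherbot2) [NC]
        RewriteRule ^.* - [F,L]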

    Read the article
