Search Results

Search found 1094 results on 44 pages for '301'.

Page 23/44

  • lighttpd: weird behavior on multiple rewrite rule matches

    - by netmikey
    I have a 20-rewrite.conf for my PHP application that looks like this:

        $HTTP["host"] =~ "www.mydomain.com" {
            url.rewrite-once += (
                "^/(img|css)/.*" => "$0",
                ".*"             => "/my_app.php"
            )
        }

    I want to be able to put the webserver into a kind of "maintenance" mode while I update my application from SCM. To do this, my idea was to enable an additional rewrite configuration file before this one. The 16-rewrite-maintenance.conf file looks like this:

        url.rewrite-once += (
            "^/(img|css)/.*" => "$0",
            ".*"             => "/maintenance_app.php"
        )

    Now, on the maintenance page, I have a logo that doesn't get loaded: I get a 404 error. The lighttpd debug log says the following:

        2012-12-13 20:28:06: (response.c.300) -- splitting Request-URI
        2012-12-13 20:28:06: (response.c.301) Request-URI : /img/content/logo.png
        2012-12-13 20:28:06: (response.c.302) URI-scheme : http
        2012-12-13 20:28:06: (response.c.303) URI-authority: localhost
        2012-12-13 20:28:06: (response.c.304) URI-path : /img/content/logo.png
        2012-12-13 20:28:06: (response.c.305) URI-query :
        2012-12-13 20:28:06: (response.c.300) -- splitting Request-URI
        2012-12-13 20:28:06: (response.c.301) Request-URI : /img/content/logo.png, /img/content/logo.png
        2012-12-13 20:28:06: (response.c.302) URI-scheme : http
        2012-12-13 20:28:06: (response.c.303) URI-authority: localhost
        2012-12-13 20:28:06: (response.c.304) URI-path : /img/content/logo.png, /img/content/logo.png
        2012-12-13 20:28:06: (response.c.305) URI-query :
        2012-12-13 20:28:06: (response.c.349) -- sanatising URI
        2012-12-13 20:28:06: (response.c.350) URI-path : /img/content/logo.png, /img/content/logo.png
        2012-12-13 20:28:06: (mod_access.c.135) -- mod_access_uri_handler called
        2012-12-13 20:28:06: (response.c.470) -- before doc_root
        2012-12-13 20:28:06: (response.c.471) Doc-Root : /www
        2012-12-13 20:28:06: (response.c.472) Rel-Path : /img/content/logo.png, /img/content/logo.png
        2012-12-13 20:28:06: (response.c.473) Path :
        2012-12-13 20:28:06: (response.c.521) -- after doc_root
        2012-12-13 20:28:06: (response.c.522) Doc-Root : /www
        2012-12-13 20:28:06: (response.c.523) Rel-Path : /img/content/logo.png, /img/content/logo.png
        2012-12-13 20:28:06: (response.c.524) Path : /www/img/content/logo.png, /img/content/logo.png
        2012-12-13 20:28:06: (response.c.541) -- logical -> physical
        2012-12-13 20:28:06: (response.c.542) Doc-Root : /www
        2012-12-13 20:28:06: (response.c.543) Rel-Path : /img/content/logo.png, /img/content/logo.png
        2012-12-13 20:28:06: (response.c.544) Path : /www/img/content/logo.png, /img/content/logo.png
        2012-12-13 20:28:06: (response.c.561) -- handling physical path
        2012-12-13 20:28:06: (response.c.562) Path : /www/img/content/logo.png, /img/content/logo.png
        2012-12-13 20:28:06: (response.c.618) -- file not found
        2012-12-13 20:28:06: (response.c.619) Path : /www/img/content/logo.png, /img/content/logo.png

    Any clue why lighttpd matches both rules (one from my application rewrite config and one from my maintenance rewrite config) and concatenates the results with a comma? That doesn't seem to make any sense. Shouldn't it stop after the first match with rewrite-once?
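    One possible way to keep the two rule sets from being merged, offered as a sketch only: it assumes 20-rewrite.conf is simply left out of the server's include list while maintenance is active, so that exactly one url.rewrite-once array exists (the host and file names are the ones from the question):

        # 16-rewrite-maintenance.conf -- enabled INSTEAD of 20-rewrite.conf, not in addition to it
        $HTTP["host"] =~ "www.mydomain.com" {
            url.rewrite-once = (
                "^/(img|css)/.*" => "$0",
                ".*"             => "/maintenance_app.php"
            )
        }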


  • .htaccess template, suggestions needed

    - by purpler
    # Defaults
    AddDefaultCharset UTF-8
    DefaultLanguage en-US
    FileETag None
    Header unset ETag
    ServerSignature Off
    SetEnv TZ Europe/Belgrade

    # Rewrites
    Options +FollowSymLinks
    RewriteEngine On
    RewriteBase /

    # Redirect to WWW
    RewriteCond %{HTTP_HOST} ^serpentineseo.com
    RewriteRule (.*) http://www.serpentineseo.com/$1 [R=301,L]

    # Redirect index to root
    RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*index\.html\ HTTP/
    RewriteRule ^(.*)index\.html$ /$1 [R=301,L]

    # Cache media files:
    ExpiresActive On
    ExpiresDefault A0

    # Month
    <filesMatch "\.(gif|jpg|jpeg|png|ico|swf|js)$">
        Header set Cache-Control "max-age=2592000, public"
    </filesMatch>

    # Week
    <FilesMatch "\.(css|pdf)$">
        Header set Cache-Control "max-age=604800"
    </FilesMatch>

    # 10 Min
    <FilesMatch "\.(html|htm|txt)$">
        Header set Cache-Control "max-age=600"
    </FilesMatch>

    # Do not cache
    <FilesMatch "\.(pl|php|cgi|spl|scgi|fcgi)$">
        Header unset Cache-Control
    </FilesMatch>

    # Compress output
    <IfModule mod_deflate.c>
        <FilesMatch "\.(html|js|css)$">
            SetOutputFilter DEFLATE
        </FilesMatch>
    </IfModule>

    # Error Documents
    ErrorDocument 206 /error/206.html
    ErrorDocument 401 /error/401.html
    ErrorDocument 403 /error/403.html
    ErrorDocument 404 /error/404.html
    ErrorDocument 500 /error/500.html

    # Prevent hotlinking
    RewriteCond %{HTTP_REFERER} !^$
    RewriteCond %{HTTP_REFERER} !^http://(www\.)?serpentineseo.com/.*$ [NC]
    RewriteRule \.(gif|jpg|png)$ http://www.serpentineseo.com/images/angryman.png [R,L]

    # Prevent offline browsers
    RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:[email protected] [OR]
    RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Custo [OR]
    RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR]
    RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR]
    RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR]
    RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR]
    RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR]
    RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR]
    RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR]
    RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR]
    RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR]
    RewriteCond %{HTTP_USER_AGENT} ^GetWeb! [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR]
    RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR]
    RewriteCond %{HTTP_USER_AGENT} ^HMView [OR]
    RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR]
    RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR]
    RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR]
    RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR]
    RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR]
    RewriteCond %{HTTP_USER_AGENT} ^larbin [OR]
    RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR]
    RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR]
    RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR]
    RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR]
    RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR]
    RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR]
    RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR]
    RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR]
    RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR]
    RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR]
    RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR]
    RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR]
    RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR]
    RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR]
    RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR]
    RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR]
    RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Wget [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Widow [OR]
    RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR]
    RewriteCond %{HTTP_USER_AGENT} ^Zeus
    RewriteRule ^.*$ http://www.google.com [R,L]

    # Protect against DOS attacks by limiting file upload size
    LimitRequestBody 10240000

    # Deny access to sensitive files
    <FilesMatch "\.(htaccess|psd|log)$">
        Order Allow,Deny
        Deny from all
    </FilesMatch>
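    One modest suggestion, offered as a hedged sketch rather than part of the template above: the bare Header, ExpiresActive and ExpiresDefault lines will cause a 500 error on servers where mod_headers or mod_expires is not loaded, so they can be wrapped in IfModule guards, for example:

        <IfModule mod_expires.c>
            ExpiresActive On
            ExpiresDefault A0
        </IfModule>

        <IfModule mod_headers.c>
            <FilesMatch "\.(gif|jpg|jpeg|png|ico|swf|js)$">
                Header set Cache-Control "max-age=2592000, public"
            </FilesMatch>
        </IfModule>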


  • .htaccess - Too many redirects

    - by Knocks X
    I am getting a "too many redirects" error from the following two .htaccess files. .htaccess on domain 1 Redirect 301 / http://www.domain2.com/ .htaccess on domain 2 RewriteEngine On RewriteBase / RewriteCond %{HTTP_HOST} !^www\. RewriteRule (.*) http://www.%{HTTP_HOST}/$1 [L,R] RewriteCond %{REQUEST_FILENAME} !-f RewriteCond %{REQUEST_FILENAME} !-d RewriteRule .* /forum/ [L,R] RedirectMatch permanent ^/$ /forum/ # Options +FollowSymlinks RewriteCond %{HTTP_REFERER} badsite1\.com [NC,OR] RewriteCond %{HTTP_REFERER} badsite2\.com [NC] RewriteRule .* - [F] Anyone know the reason for the too many redirects error?


  • Set HTTP condition for redirect rule

    - by Török Gábor
    I have a redirect rule in my .htaccess that forwards the user agent from A.html to B.html using the following pattern:

        Redirect 301 /A.html http://mysite.com/B.html

    Since the Redirect directive requires the target host to be set, is it possible to make this rule apply only on a specific host? I have both a test and a deploy domain, and I only want it on the deploy domain. I can set host conditions for Rewrite rules, but how can I do that for plain Redirects?
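    One possible way to make the redirect host-specific is to express it with mod_rewrite instead of mod_alias; a sketch, using the file names from the question and the deploy host as the condition:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?mysite\.com$ [NC]
        RewriteRule ^A\.html$ http://mysite.com/B.html [R=301,L]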


  • Hide .php add a slash

    - by Matthew
    This script works perfectly: it forces the trailing slash and hides the .php extension. BUT it does not redirect people going directly to the .php extension. How can I also force people who request file.php directly to /file/ ?

        RewriteEngine On
        RewriteRule ^(.*)/$ /$1.php [L]

        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_URI} !(.*)/$
        RewriteRule ^(.*)$ http://www.mysite.com/$1/ [R=301,L]
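    One common way to catch direct *.php requests without creating a redirect loop is to match the original request line rather than the already-rewritten URI. A sketch, typically placed above the existing rules (same host as in the question):

        # externally redirect a direct /file.php request to /file/
        RewriteCond %{THE_REQUEST} \s/+([^.?\s]+)\.php[\s?] [NC]
        RewriteRule ^ http://www.mysite.com/%1/ [R=301,L]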


  • Logging Redirect Location in Apache

    - by matthew
    Using the standard "Combined" log format, when Apache returns a 301 response it logs it in the access log, which is good, but it only logs the response code. It doesn't log the location to which the client is being directed. What argument do I need to add to the CustomLog format in order to get it to log this information? It wasn't obvious to me from the documentation.
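    For what it's worth, mod_log_config can log arbitrary response headers with the %{...}o placeholder, so one possible format (a sketch based on the standard combined format, with a made-up nickname) is:

        LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\" \"%{Location}o\"" combined_redirects
        CustomLog logs/access_log combined_redirects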


  • Can you please explain substitution in RewriteRule

    - by Scott
    I have the following statements in an .htaccess file:

        RewriteCond %{HTTP_HOST} ^myOldDomain\.com$ [NC]
        RewriteRule ^(.*)$ https://myNewDomaink.com/$1 [R=301,L]

    It works fine. I basically found some sample code and modified it for my specific purpose. What I don't quite understand is: why does $1 refer to the portion of the supplied URL after the hostname, and where is that documented? There is no backreference in the RewriteCond.
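    For reference, the mod_rewrite documentation distinguishes two kinds of backreferences: $N comes from the capture groups of the RewriteRule pattern, which in an .htaccess context is matched against the URL path (the part after the hostname), while %N comes from the groups of the last matched RewriteCond. A small illustrative sketch with placeholder names:

        # %1 captures "blog" from the host, $1 captures the path after the hostname
        RewriteCond %{HTTP_HOST} ^(blog)\.example\.com$ [NC]
        RewriteRule ^(.*)$ https://www.example.com/%1/$1 [R=301,L]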


  • Reverse a .htaccess redirection rule

    - by Aahan Krish
    Let me explain by example. Say I have this redirection rule in my .htaccess file:

        RedirectMatch 301 ^/([^/]+)/([^/]+)/$ http://www.example.com/$2

    What it basically does is redirect http://www.mysite.com/sports/test-post/ to http://www.mysite.com/test-post/. Now, how do I modify the rule to do the opposite, i.e. redirect http://www.mysite.com/test-post/ to http://www.mysite.com/sports/test-post/?
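    A sketch of one possible reverse rule, written with mod_rewrite so that already-prefixed URLs can be excluded and a redirect loop avoided ("sports" is the category from the example):

        RewriteEngine On
        RewriteCond %{REQUEST_URI} !^/sports/ [NC]
        RewriteRule ^([^/]+)/$ http://www.example.com/sports/$1/ [R=301,L]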


  • Monit checking URL follow redirects

    - by beck
    I am looking to use Monit to keep an eye on my site. I want it to treat the site like an external user would, so I am testing the URL, but it doesn't seem to follow redirects: the content check is being performed on the HTML of the redirect response itself.

        # request works:
        if failed url http://www.sharelatex.com/blog/posts/future.html content == "301"

        # request fails:
        if failed url http://www.sharelatex.com/blog/posts/future.html content == "actual content"

    Finding out how to get the URL check to follow 30x redirects would be great.


  • Mod_rewrite Mask URL

    - by user37143
    I need to rewrite https://mydomain1.com/changepassword to https://mydomain2.com/ssp, then hide the mydomain2.com/ssp URL in the browser and keep showing https://mydomain1.com/changepassword. I got the first part working with this rule:

        Options +FollowSymLinks
        RewriteEngine on
        RewriteRule /changepassword https://mydomain2.com/ssp/ [R=301,L]

    However, the browser then shows the https://mydomain2.com/ssp URL. How can I mask this and keep showing https://mydomain1.com/changepassword?
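    A redirect ([R]) by definition changes the address bar, so keeping the original URL means proxying the request instead of redirecting it. A sketch using the mod_rewrite [P] flag; it assumes mod_proxy and mod_proxy_http are available, and proxying to an https:// backend additionally requires SSLProxyEngine in the server or virtual-host config (not in .htaccess):

        # in the virtual host: allow proxying to an https:// backend
        SSLProxyEngine on

        # serve /changepassword by proxying, so the browser keeps the mydomain1.com URL
        RewriteEngine on
        RewriteRule ^/?changepassword$ https://mydomain2.com/ssp/ [P,L]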


  • How do you redirect from one directory to another and maintain the proper basepath?

    - by codeninja
    I need to redirect all /eula traffic to /tos:

        RewriteRule ^/eula/$ /tos [R=301,NC]

    but this rule doesn't seem to work -- mostly because the base path is being treated as the root when really there's another parent directory. What's happening with the above rule is

        /my/docs/eula -> /tos

    which is not right; it should be doing this:

        /my/docs/eula -> /my/docs/tos

    How do I write the rule for this without having to specify what the parent dir is?
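    One possible approach, assuming the rule can live in a .htaccess file inside the parent directory (/my/docs/ in the example): in per-directory context the pattern is matched without the leading slash, and a relative substitution is re-prefixed with that directory, so the parent path never has to be spelled out:

        # in /my/docs/.htaccess -- redirects .../my/docs/eula/ to .../my/docs/tos
        RewriteEngine On
        RewriteRule ^eula/?$ tos [R=301,NC,L]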


  • Can anyone tell me what's wrong with this .htaccess code

    - by mathew
    I am not able to get through with this code... I need to capture the request, and with this code I am able to redirect, but it cannot find the particular search page.

        RewriteCond %{THE_REQUEST} ^[A-Z]+\ /searcha\.php\?name=(www\.)?([^/\ ]+)[^\ ]*\ HTTP/
        RewriteRule ^.*$ http://www.mydomain.com/%2? [R=301,L]

    Can anyone tell me what I am missing?
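    A sketch of an alternative that matches the query string directly instead of the raw request line, which is often easier to get right (host and parameter names taken from the question):

        # redirect /searcha.php?name=foo (or name=www.foo) to http://www.mydomain.com/foo
        RewriteCond %{QUERY_STRING} (?:^|&)name=(?:www\.)?([^&]+) [NC]
        RewriteRule ^searcha\.php$ http://www.mydomain.com/%1? [R=301,L]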


  • Reversing a mod_rewrite rule

    - by KIRA
    I want to redirect accesses from http://www.domain.com/test.php?sub=subdomain&type=cars to http://subdomain.domain.com/cars. I already have mod_rewrite rules to do the opposite:

        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{HTTP_HOST} !^(www)\. [NC]
        RewriteCond %{HTTP_HOST} ^(.*)\.(.*)\.com [NC]
        RewriteRule (.*) http://www.%2.com/index.php?route=$1&name=%1 [R=301,L]

    What changes do I need to make to these rules to redirect requests from the script to the subdomain?
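    A sketch of a rule going in the other direction, matching the query-string parameters of the script URL (the parameter names and their order are taken from the example URL; this is an illustration, not the asker's final rule set):

        RewriteEngine on
        # http://www.domain.com/test.php?sub=subdomain&type=cars -> http://subdomain.domain.com/cars
        RewriteCond %{HTTP_HOST} ^www\.domain\.com$ [NC]
        RewriteCond %{QUERY_STRING} (?:^|&)sub=([^&]+)&type=([^&]+) [NC]
        RewriteRule ^test\.php$ http://%1.domain.com/%2? [R=301,L]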


  • Apache rewrite - optional parameters?

    - by Mayhem
    I'm creating SEO-friendly URLs for my news page. My links look like this: www.site.com/1234/the-pretty-url-string/

        RewriteRule ^([^/]*)/([^/]*)/$ /news.php?sid=$1&url=$2 [L]

    This works great, but I'd like to have more flexibility. I want to be able to accept URLs like:

        www.site.com/1234
        www.site.com/1234/

    so that I can do some PHP $_GET checks, figure out if anything is missing, and 301 to the proper URL of my choice. I would like the &url=$2 part to be optional.
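    A sketch of one way to make the second segment optional within a single rule (regular expression only, using the names from the question):

        # matches /1234, /1234/ and /1234/the-pretty-url-string/
        RewriteRule ^([0-9]+)(?:/([^/]*))?/?$ /news.php?sid=$1&url=$2 [L,QSA]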


  • What does the float number indicate when using varnishtop?

    - by Abs
    I have Varnish running and I wanted to see which HTTP response codes commonly occur, so I used:

        varnishtop -i TxStatus

    However, I am struggling to work out what the floating-point numbers next to each status mean. Is it the number of requests?

        list length 7

        1322.16 TxStatus 200
          60.43 TxStatus 302
           8.67 TxStatus 304
           3.14 TxStatus 500
           2.96 TxStatus 404
           0.80 TxStatus 301
           0.56 TxStatus 403


  • .htaccess to pass FULL URL into other PHP link as $_GET

    - by 4lvin
    From a .htaccess file, how do I pass/redirect the FULL URL to another URL as a GET variable? For example, http://www.test.com/foo/bar.asp should be redirected to:

        http://www.newsite.com/?url=http://www.test.com/foo/bar.asp

    i.e. with the full URL including the domain. I tried:

        RewriteEngine on
        RedirectMatch 301 ^/(.*)\.asp http://www.newsite.com/?url=%{REQUEST_URI}

    But it comes out like this:

        http://www.newsite.com/?url=?%{REQUEST_URI}
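    mod_alias's RedirectMatch does not expand %{...} server variables, which is why the literal text comes through. A possible sketch with mod_rewrite instead (host names as in the question; [NE] keeps the target URL from being re-escaped):

        RewriteEngine on
        RewriteRule ^(.*)\.asp$ http://www.newsite.com/?url=http://%{HTTP_HOST}/$1.asp [R=301,L,NE]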


  • .htaccess file ?

    - by user368993
    Hello guys, how do I create a .htaccess file for a 301 permanent redirect? I am looking for the exact code that we would put in the file. Looking forward to hearing from you all. Thanks in advance.
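    For illustration, a couple of common 301 patterns (a sketch with placeholder names, since the question does not say what should be redirected where):

        # redirect a single page
        Redirect 301 /old-page.html http://www.example.com/new-page.html

        # redirect an entire site to a new domain
        Redirect 301 / http://www.example.com/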


  • Redirect traffic to https

    - by Arpanet
    I need to redirect the following instances to https://www.mydomain.com:

        mydomain.com
        www.mydomain.com
        http://www.mydomain.com

    I have the following in my .htaccess already, but this only redirects mydomain.com to https://www.mydomain.com:

        <IfModule mod_rewrite.c>
            RewriteCond %{HTTPS} !=on
            RewriteCond %{HTTP_HOST} !^www\..+$ [NC]
            RewriteRule ^ https://www.%{HTTP_HOST}%{REQUEST_URI} [R=301,L]
        </IfModule>

    Grateful if anyone could perhaps tell me what I need to add. Thank you.
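    A sketch of one way to handle both cases (non-www and plain http) in a single rule set, assuming the canonical host is www.mydomain.com as in the question:

        <IfModule mod_rewrite.c>
            RewriteEngine On
            # redirect if the request is not HTTPS, or if the host has no www prefix
            RewriteCond %{HTTPS} !=on [OR]
            RewriteCond %{HTTP_HOST} !^www\. [NC]
            RewriteCond %{HTTP_HOST} ^(?:www\.)?(.+)$ [NC]
            RewriteRule ^ https://www.%1%{REQUEST_URI} [R=301,L]
        </IfModule>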


  • Removing .html and index.html from URL

    - by James Turner
    I'm having some problems trying to:

        - remove the .html extension from URLs
        - remove 'index.html' from a URL

    1) To remove the extension I have tried using this in my .htaccess file:

        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME}\.html -f
        RewriteRule ^(.*)$ $1.html

    However, when I click links in my HTML such as <a href="abcde.html"></a>, it doesn't remove the .html from the URL and I am left with www.website.com/abcde.html

    2) I tried using this to remove the index.html:

        RewriteCond %{THE_REQUEST} \/index\.(php|html)\ HTTP [NC]
        RewriteRule (.*)index\.(php|html)$ /$1 [R=301,L]

    But when I load an index.html file on my server, my URL looks something like www.website.com/folder// -- I am left with an extra / at the end. Can anyone help me out?
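    For what it's worth, the first block only rewrites internally, so it can never change what the address bar shows; hiding the extension usually takes an external redirect for direct *.html requests plus the internal map back. A sketch under that assumption (placeholder paths, not necessarily the asker's final config):

        RewriteEngine on

        # 1) externally redirect /folder/index.html (or /index.html) to /folder/ (or /)
        RewriteCond %{THE_REQUEST} \s/+(.*/)?index\.html[\s?] [NC]
        RewriteRule ^ /%1 [R=301,L]

        # 2) externally redirect /page.html to /page
        RewriteCond %{THE_REQUEST} \s/+([^\s?]+)\.html[\s?] [NC]
        RewriteRule ^ /%1 [R=301,L]

        # 3) internally map /page back onto page.html
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME}\.html -f
        RewriteRule ^(.*)$ $1.html [L]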


  • Is trailing slash automagically added on click of home page URL in browser?

    - by Question Overflow
    I am asking this because whenever I hover over a link to a home page (e.g. http://www.example.com), I notice that a trailing slash is always added (as shown in the browser's status bar), whether or not the link's href attribute ends with a slash. But whenever I am on the home page itself, the URL on display does not have a trailing slash. I tried adding a slash to the URL in the address bar, and with Firebug enabled I noticed that the site always returns a 200 OK status. An article I read discussing this states that having a slash at the end avoids a 301 redirection, but I am not seeing any redirection, even on this page. Could it be the browser that is appending the slash?


  • How to track subdomains with Google Analytics while having mod_rewrite redirect to a subdomain?

    - by Marek
    When users come directly to domain.com or www.domain.com, I am redirecting them to shop.domain.com via this .htaccess rewrite:

        RewriteEngine on
        RewriteCond %{HTTP_HOST} ^www.domain.com$ [OR]
        RewriteCond %{HTTP_HOST} ^domain.com$
        RewriteRule ^(.*)$ http://shop.domain.com/ [R=301,L]

    The content served by shop.domain.com has the following tracking-code parameters:

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-123456-6']);
        _gaq.push(['_setDomainName', '.domain.com']);
        _gaq.push(['_trackPageview']);

    All direct visits that arrive at shop.domain.com as a result of the rewrite from domain.com are tracked as referral traffic, showing my own domain.com as the referral source in Google Analytics. I would like to track these visits as direct traffic. How do I change the configuration so that mod_rewritten traffic coming to my subdomain from my own domain is tracked as direct traffic?
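    One possible tweak, offered as a hedged sketch rather than a confirmed fix: the classic ga.js API has an _addIgnoredRef method that tells the tracker to ignore a given referrer, so visits arriving via the redirect from the bare domain would be counted as direct rather than referral traffic:

        var _gaq = _gaq || [];
        _gaq.push(['_setAccount', 'UA-123456-6']);
        _gaq.push(['_setDomainName', '.domain.com']);
        _gaq.push(['_addIgnoredRef', 'domain.com']); // treat the bare domain as an ignored referrer
        _gaq.push(['_trackPageview']);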


  • Migrate add-on domain olddomain.com to newdomain.com

    - by eHx
    I have 2 domains that are registered at GoDaddy:

        domaina.com (not hosted, only the domain name is registered to GD)
        domainb.com (hosted at a different webhost, domain name registered to GD)

    domainb.com is an already working site with a different webhost, but the domain name is registered to GoDaddy (and I assume the nameservers are changed to point to the webhost). Now, I don't understand why this was done, but domainb.com is considered a subdomain on the host... meaning the files are in a separate folder on the server, e.g. public-html/domainb.com/public-html/FILES. The structure on the webhost is similar to this:

        HostNAME (main root folder)
            domainb.com (subdomain of hostname)
            domainc.com (etc...)
            domaind.com (etc...)

    I want to transfer the site domainb.com to domaina.com, meaning domaina.com will become the new website, without having to re-upload all the content and CMS. The old one will redirect to domaina.com once the transfer is done (using 301 redirects). Can anyone tell me how I can do this?
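    For the redirect mentioned at the end, a sketch of what the old domain's .htaccess could contain once domaina.com is serving the site (an illustration only; the hosting and DNS side of the migration is a separate matter):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?domainb\.com$ [NC]
        RewriteRule ^(.*)$ http://domaina.com/$1 [R=301,L]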

