Search Results

Search found 2190 results on 88 pages for 'htaccess'.

Page 34/88 | < Previous Page | 30 31 32 33 34 35 36 37 38 39 40 41  | Next Page >

  • Serving static files from a cookieless domain: an alternative cookieless directory

    - by Simone Nigro
    I'm trying to follow all of the "Google PageSpeed" guidelines. The "Minimize request overhead" rule calls for serving static content (images, JS, CSS, etc.) from a static (i.e. cookieless) server: https://developers.google.com/speed/docs/best-practices/request I don't want to buy a new server, and I was thinking of just making one directory of my site cookieless via .htaccess. In www.mysite.com/static/.htaccess:

        Header unset Cookie
        Header unset Set-Cookie

    I don't know whether this could be problematic. Searching Google, it seems nobody has ever adopted this kind of solution, so I suspect it is incorrect. What do you think? Alternatively, I could do this in www.mysite.com/.htaccess:

        <FilesMatch "\.(css|js|jpg|png|gif)$">
            Header unset Cookie
            Header unset Set-Cookie
        </FilesMatch>
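
    One caveat worth adding (an aside, not from the original post): in mod_headers, Header only edits response headers, so "Header unset Cookie" has no effect on the Cookie header the browser sends; that takes RequestHeader. A minimal sketch of the FilesMatch variant with that swapped in:

        <FilesMatch "\.(css|js|jpg|png|gif)$">
            # strip the incoming Cookie request header (requires mod_headers)
            RequestHeader unset Cookie
            # and make sure no Set-Cookie goes out with static responses
            Header unset Set-Cookie
        </FilesMatch>

    Note the browser still spends the bytes sending the cookie; only a separate cookieless hostname avoids that upload cost entirely.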

    Read the article

  • Custom 403 Error page not showing

    - by Rahul Sekhar
    I want to restrict access to certain folders (includes, xml and logs for example) and so I've given them 700 permissions, and all files within them 600 permissions. Firstly, is this the right approach to restrict access? I have a .htaccess file in my root that handles rewriting and error documents. There are two pages in the root - 403.php and 404.php - for 403 and 404 errors. And I have these rules added to my .htaccess file:

        ErrorDocument 404 /404.php
        ErrorDocument 403 /403.php

    Now, the 404 page works just fine. The 403 page does not show when I try to access the 'includes' folder - I get the standard Apache 403 error page instead, saying 'Additionally, a 404 Not Found error was encountered while trying to use an ErrorDocument to handle the request.' However, when I try going to the .htaccess file (in the web root) in my browser, I get my custom 403 error page. Why is this happening?
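
    A common alternative (a sketch of standard practice, not taken from this thread) is to leave normal filesystem permissions in place and deny web access in a per-directory .htaccess instead, which lets PHP keep including the files while Apache can still read and serve your custom 403 page:

        # includes/.htaccess - Apache 2.2 syntax
        Order allow,deny
        Deny from all
        # on Apache 2.4 this would be: Require all denied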

    Read the article

  • Force all URLs to www and force the main domain to non-www

    - by Digital site
    I was trying to force my domain to redirect without www, and succeeded with this code in .htaccess:

        RewriteCond %{HTTP_HOST} ^www\.domain\.com [NC]
        RewriteRule ^(.*) http://domain.com/$1 [R=301,L]

    However, this code redirects all www URLs to non-www, which is not what I want. I just want the main domain to go from www.mydomain.com to mydomain.com, and the rest of the URLs should be forced to www. Any idea how to add to or modify the code so I can achieve that through .htaccess? Update: Thanks to all. I found out that the swf file from piecemaker was corrupted and replaced it with a new one, so now it is all fine and works on both www and non-www. I'm still curious how to solve this issue with .htaccess anyway. Thanks again.
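
    One reading of the requirement (bare homepage without www, everything else forced to www) can be sketched as a pair of rules; domain.com stands in for the real domain:

        RewriteEngine On
        # homepage only: www -> non-www
        RewriteCond %{HTTP_HOST} ^www\.domain\.com$ [NC]
        RewriteRule ^$ http://domain.com/ [R=301,L]
        # any non-empty path: non-www -> www
        RewriteCond %{HTTP_HOST} ^domain\.com$ [NC]
        RewriteRule ^(.+)$ http://www.domain.com/$1 [R=301,L]

    The second rule requires a non-empty path, so the redirected homepage request doesn't bounce back and forth.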

    Read the article

  • mod_rewrite issue | Request exceeded the limit of 10 internal redirects

    - by Chris Anarko Meow
    OK, what I'm doing normally works, but since my rule "includes" itself, it's giving me issues, and I can't find a solution after hours of working on different options. I have a .htaccess with:

        RewriteEngine On
        RewriteBase /
        RewriteCond %{REQUEST_URI} !^/3.15.0/(.*)
        RewriteRule ^(.*)$ /3.15.0/$1 [L]

    This is for my software versions. I have a program that sometimes requests versions that are newer, while the server may be a couple of versions behind, so I want whatever is coming in to be forwarded to the latest version, which in this example is 3.15.0 (/var/www/nameblabla/3.15.0). My .htaccess is at /var/www/nameblabla/.htaccess, so the first condition is meant to ignore requests that already have the right path and version, and the rule should grab all other requests and forward them to 3.15.0 - without losing the rest of the path to the files inside, of course. So far I can only get it to redirect to that directory but lose the path; otherwise I get "Request exceeded the limit of 10 internal redirects". I guess this is because I'm including the 3.15.0 path. Any help, or another way to do this without mod_rewrite?
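
    For what it's worth, a sketch of the same ruleset with the dots escaped and, on Apache 2.4+, the END flag, which (unlike L) stops the per-directory ruleset from being re-run on the rewritten URL:

        RewriteEngine On
        RewriteBase /
        # skip requests already pointing into the versioned directory
        RewriteCond %{REQUEST_URI} !^/3\.15\.0/
        RewriteRule ^(.*)$ /3.15.0/$1 [END]

    On Apache 2.2, [L] plus the condition above should behave the same, as long as the rewritten URL really does start with /3.15.0/.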

    Read the article

  • Use mod_rewrite to redirect from example.com/dir to www.example.com/dir

    - by kavoir.com
    Assume / is the document root of my domain example.com. /.htaccess:

        RewriteEngine on
        RewriteCond %{HTTP_HOST} ^golfcoursesd\.com$ [NC]
        RewriteRule ^(.*)$ http://www.golfcoursesd.com/$1 [R=301,L]

    /dir/.htaccess:

        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /dir/index.php [L]
        </IfModule>

    I know how to redirect example.com/dir to www.example.com/dir; /.htaccess does that very job. However, the trick here is that I have to keep /dir/.htaccess to serve up virtual directories (such as /dir/state/AK/35827/, which aren't actual directories). The problem is that if I keep /dir/.htaccess, a request for http://example.com/dir/state/AK/35827/ does NOT redirect to http://www.example.com/dir/state/AK/35827/, the way http://example.com/ redirects to http://www.example.com/. Basically: how do I make http://example.com/dir/state/AK/35827/ correctly redirect to http://www.example.com/dir/state/AK/35827/ while still serving virtual URLs?
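
    Because a /dir/.htaccess that enables mod_rewrite overrides the parent's rules, one standard fix (a sketch) is to repeat the host canonicalization at the top of /dir/.htaccess, using %{REQUEST_URI} to keep the full path:

        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        # redirect any non-www host first, keeping the whole request path
        RewriteCond %{HTTP_HOST} !^www\. [NC]
        RewriteRule ^ http://www.golfcoursesd.com%{REQUEST_URI} [R=301,L]
        # then fall through to the virtual-directory front controller
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /dir/index.php [L]
        </IfModule>

    RewriteOptions Inherit is another route, though inherited parent rules run after the child's own, which is usually too late here.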

    Read the article

  • How do I make a simple .htaccess internal-redirect catch-all script while forwarding POST data?

    - by RB
    I just want to catch all requests and forward them internally to my catch-all page with all POST data intact. Catch-all page: http://www.mydomain.com/addons/redirect/catch-all.php I've tried so many combinations, but my server doesn't want to redirect internally if I specify more than catch-all.php:

        # Internally redirect all pages to "Catch" page
        Options +FollowSymLinks
        RewriteEngine on
        RewriteRule (.*) /addons/redirect/catch-all.php [L]

    Also, do I need [L], or is it useless for internal redirects? Then, what PHP code would I use to grab the POST data, use it, and finally redirect to the originally requested page? Would it be done as normal, using $_POST['variable_name'], or something different? And how would I go about finding the originally requested page, so I can send a PHP Location header to direct the user to it? Thanks! UPDATE: Ha, sick, never mind. The condition DOES work. Here's my code:

        # Internally redirect all pages to "Catch" page
        Options +FollowSymLinks
        RewriteEngine on
        RewriteCond %{REQUEST_URI} !^/robots.txt$
        RewriteCond %{REQUEST_URI} !\.(gif|jpe?g|png|css|js|pdf|doc|xml)$
        RewriteCond %{REQUEST_URI} !^/addons/redirect/catch-all\.php$
        RewriteRule (.*)$ /addons/redirect/catch-all.php?q=$1 [L]

    Thanks guys for the inspiration! Now time to get that PHP to work...

    Read the article

  • Redirect 'myapp.com' to 'www.myapp.com' in Rails without using .htaccess?

    - by Allan L.
    Using Morph Labs' Appspace to deploy a site means there's no automated way to redirect 'myapp.com' to 'www.myapp.com' (and no access to .htaccess). Is there an in-Rails way to do this? Would I need a plugin like subdomain-fu? More specifically, I'm trying to do something like: 'myapp.com' => 'www.myapp.com' and 'myapp.com/session/new' => 'www.myapp.com/session/new'. Basically, I always want the 'www' subdomain prepended on every request (because the SSL cert specifically has a common name of 'www.myapp.com').

    Read the article

  • What's right for me: .htaccess, form submission, or HTTP header authentication with PHP?

    - by Brook Julias
    I am creating a website with multiple sections - admin, client, user, and anonymous - each user group having less access than the next. I am wondering what form of authentication would be best for my use. I have heard that if you are just dealing with a website, then a web form is for you (because it's prettier). HTTP header authentication with PHP is said to get clunky/sloppy. .htaccess is pretty much the hardcore option among the authentication methods I have looked up, but is it too much?
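
    For reference, the ".htaccess" option the post alludes to is HTTP Basic authentication; a minimal sketch (paths hypothetical) that protects, say, the admin section independently of the PHP application:

        # in admin/.htaccess; the password file comes from:
        #   htpasswd -c /path/to/.htpasswd adminuser
        AuthType Basic
        AuthName "Admin area"
        AuthUserFile /path/to/.htpasswd
        Require valid-user

    For multiple tiers (admin/client/user), a session-based login form is usually the more practical fit; Basic auth shines for blunt, whole-directory protection.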

    Read the article

  • Redirecting via .htaccess to .php with arguments in current folder.

    - by Jengerer
    Hey, I'm trying to redirect something like foo/bar to ?foo=bar, so that www.mydomain.com/hey/foo/bar goes to www.mydomain.com/hey/?foo=bar, but I can't seem to get the syntax right. I tried the following:

        RewriteEngine on
        RewriteRule ^foo/(.*)$ ?foo=bar [NC]

    But this doesn't work. How would I accomplish this? I tried adding a forward slash behind the question mark, but that makes it link to the root directory. Thanks, Jengerer
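
    A sketch of one way this is commonly written, assuming the .htaccess sits in the hey/ folder itself and the target script is index.php (the script name is an assumption, not from the post):

        RewriteEngine on
        # hey/foo/bar -> hey/index.php?foo=bar
        RewriteRule ^foo/(.+)$ index.php?foo=$1 [NC,L]

    A relative substitution like this resolves inside the current folder, which sidesteps the root-directory problem a leading slash causes.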

    Read the article

  • Is there a better way to write this .htaccess directive?

    - by Bill H
    I want all CSS, JavaScript, and image file requests named like "filename.12345.css" to be re-routed to "filename.css". The ".12345" part will always be numbers, and its length can be anywhere from 11 to 15 characters. This directive seems to work OK, but I want to make sure there is no error in my logic:

        RewriteRule ^(.+)\.(.+)\.(js|css|jpg|gif|png)$ $1.$3

    Any help would be greatly appreciated.
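
    One tightening (a sketch): since the middle segment is always 11-15 digits, matching it explicitly keeps the rule from rewriting legitimately dotted names such as jquery.min.js:

        RewriteRule ^(.+)\.[0-9]{11,15}\.(js|css|jpg|gif|png)$ $1.$2 [L]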

    Read the article

  • Using .htaccess to redirect all files to another domain while excluding the root domain

    - by Shawn
    Recently I decided to restructure and redesign my website. My old website is located at www.example.com, and there are lots of blog posts under this domain, like: www.example.com/post1, www.example.com/post2, www.example.com/post3, ... I can redirect all those posts to another subdomain that points at another folder (not a subfolder - if I put all those posts in a subfolder, the subfolder name recurses into the URL). This works for me:

        RewriteEngine On
        RewriteRule ^(.*)$ http://pre.example.com/$1 [L,R=301]

    But the one thing I want is to not redirect the main domain - only all the posts:

        RewriteEngine on
        RewriteCond %{REQUEST_URI} !^/blog/   # the new blog
        # I tried the lines below
        #RewriteCond %{REQUEST_URI} !^/$
        #RewriteCond %{REQUEST_URI} !^www.example.com$
        RewriteRule ^(.*)$ http://pre.example.com/$1 [L,R=301]

    Is it possible to do that? Thx.
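
    A sketch that leaves only the homepage alone and sends everything else (including the old posts) to the new host:

        RewriteEngine On
        # don't touch the bare root request
        RewriteCond %{REQUEST_URI} !^/$
        # keep the new blog where it is
        RewriteCond %{REQUEST_URI} !^/blog/
        RewriteRule ^(.*)$ http://pre.example.com/$1 [L,R=301]

    The commented-out host-style condition can't work as written because REQUEST_URI never contains the hostname.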

    Read the article

  • How to make a better URL with .htaccess and multiple parameters?

    - by Landitus
    I have a very long and unfriendly URL, and I'm looking to improve the site's SEO: http://www.site.com/sub-site/index.php?page=nameofpage&locale=en_EN I would like to have this instead: http://www.site.com/sub-site/en/nameofpage All the URLs are hard-coded in links of the form:

        <a href="index.php?page=nameofpage&locale=en_EN">link</a>

    What is the best way to achieve this?
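
    A sketch for the English case, assuming the .htaccess sits in the sub-site/ folder (other locales would each need a similar line, or a RewriteMap in the main server config):

        RewriteEngine On
        RewriteBase /sub-site/
        # /sub-site/en/nameofpage -> index.php?page=nameofpage&locale=en_EN
        RewriteRule ^en/([^/]+)/?$ index.php?page=$1&locale=en_EN [L]

    The rewrite only maps incoming requests; the hard-coded hrefs in the HTML still have to be changed to the new /en/nameofpage form by hand or in the templating layer.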

    Read the article

  • Htaccess/robots.txt to allow search bots to explore main domain but not directory on other domain

    - by gX
    OK, I understand the title didn't make any sense, so here I've tried to explain it in detail. I'm using a host that gives me space for my domain and lets me "add on" other domains to it. So let's say I have a domain A, and I add on a domain B. Basically my host gives me a public_html where I can put stuff that shows when someone visits website A. But when I add the domain B, it has me put the content of B INSIDE that public_html, so that website B.com can also be visited by going to A.com/siteB... That's all good, except that Google has started indexing B.com as well as A.com/siteB. I'm OK with it indexing B.com, but I somehow want to prevent it from indexing A.com/siteB, so that when people search for B, it doesn't end up showing A.com/siteB. Any ideas? Let me know if the question is still unclear.
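
    One workable approach (a sketch; names hypothetical): put a .htaccess inside the addon folder that bounces any request arriving under A.com to the canonical host, so crawlers consolidate on B.com:

        # in public_html/siteB/.htaccess
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?A\.com$ [NC]
        RewriteRule ^(.*)$ http://B.com/$1 [R=301,L]

    Requests for B.com hit the same directory but fail the host condition, so they are served normally.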

    Read the article

  • Htaccess help: $1 being used before it's set? What?

    - by jiexi
    I have this snippet:

        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond $1 !^(index\.php|images|img|css|js|robots\.txt)
        RewriteRule (.*) index.php?/$1 [L]

    It won't allow me to access a file at website.com/js/main.php, but it will let me access index.php. According to my webhost, $1 is being called before it is set. Any solutions? I'll accept answers when I get back tomorrow. Thank you!
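
    For the record, mod_rewrite applies the RewriteRule pattern first and only then evaluates the RewriteCond lines, so $1 is in fact available there; the host's explanation is doubtful. A common equivalent (a sketch, in the usual CodeIgniter style) tests the request URI instead and puts the whitelist check first:

        RewriteEngine on
        # skip the front controller for whitelisted paths and real files/dirs
        RewriteCond %{REQUEST_URI} !^/(index\.php|images|img|css|js|robots\.txt)
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ index.php?/$1 [L]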

    Read the article

  • How come the [L] flag isn't working in my .htaccess file?

    - by George Edison
    Here are the rules:

        <IfModule mod_rewrite.c>
        RewriteEngine on
        RewriteRule ^$ index.php?action=home [L]
        RewriteRule ^[\w\W]*$ error.php [L]

    When a page matches the first rule, it is supposed to ignore any further rules. Yet accessing / results in error.php being invoked. Commenting out the second rule works as intended - the page redirects to index.php. What am I doing wrong? Also: is there a better way to write the last line? It's basically a catch-all.
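
    What bites here (stated with reasonable confidence) is that in per-directory (.htaccess) context, [L] only ends the current pass; the rewritten URL, index.php?action=home, is then fed back through the ruleset, where the catch-all matches it and rewrites it to error.php. A sketch that excludes the two scripts from the catch-all:

        RewriteEngine on
        RewriteRule ^$ index.php?action=home [L]
        # don't let the catch-all re-capture our own scripts on the second pass
        RewriteCond %{REQUEST_URI} !^/(index|error)\.php
        RewriteRule ^ error.php [L]

    On Apache 2.4+, replacing [L] with [END] on the first rule is the direct fix, and "RewriteRule ^ error.php" is the tidier catch-all spelling.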

    Read the article

  • How can I forward a query string using htaccess?

    - by Eric
    At present I am using this to rewrite URLs:

        RewriteEngine on
        RewriteRule ^([^/?\.]+)$ /page.php?name=$1 [NC]

    So mysite.com/home gets rewritten to mysite.com/page.php?name=home. How can I make it also rewrite mysite.com/home?param=value to mysite.com/page.php?name=home&param=value? Ideally, I'd like this to work for any name/value querystring pairs. Am I missing something obvious?
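
    mod_rewrite has a flag for exactly this: QSA (query string append), which merges the request's original query string into the one the rule creates; a sketch:

        RewriteEngine on
        # /home?param=value -> /page.php?name=home&param=value
        RewriteRule ^([^/?\.]+)$ /page.php?name=$1 [NC,QSA]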

    Read the article

  • Rewrite in MediaWiki: remove index.php with .htaccess

    - by tran cuong
    I've just installed MediaWiki on Apache, and I want the URLs to be localhost/Main_Page, localhost/Special:Recent_Changes, ... instead of localhost/index.php/Main_Page, localhost/index.php/Special:Recent_Changes. I've tried many times and in many ways, but it still doesn't work. Any suggestions for exactly what to do, step by step? The MediaWiki docs didn't talk about .htaccess; they only covered nginx and lighttpd.
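
    The commonly posted .htaccess variant for MediaWiki short URLs (a sketch, assuming the wiki lives in the web root; MediaWiki's own docs prefer doing this in the main server config) routes everything that isn't a real file or directory into index.php, and pairs with $wgArticlePath = "/$1" and $wgUsePathInfo = true in LocalSettings.php:

        RewriteEngine On
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ index.php/$1 [L,QSA]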

    Read the article

  • 301 redirect to different directory on Yahoo Small Business Hosting without .htaccess

    - by Vinay
    I have a website hosted with Yahoo Small Business Hosting, and I don't have access to use a .htaccess file. I have around 220 pages in a folder mysubfolder (http://example.com/myfolder/mysubfolder), and the age of the website is around 3 years. I am planning to move all 220 pages in mysubfolder to myfolder (one level up). All the pages in mysubfolder are indexed. What is the best way to do this so that it doesn't affect my SEO?

    Read the article

  • blocking bad bots with robots.txt in 2012 [closed]

    - by Rachel Sparks
    does it still work good? I have this: # Generated using http://solidshellsecurity.com services # Begin block Bad-Robots from robots.txt User-agent: asterias Disallow:/ User-agent: BackDoorBot/1.0 Disallow:/ User-agent: Black Hole Disallow:/ User-agent: BlowFish/1.0 Disallow:/ User-agent: BotALot Disallow:/ User-agent: BuiltBotTough Disallow:/ User-agent: Bullseye/1.0 Disallow:/ User-agent: BunnySlippers Disallow:/ User-agent: Cegbfeieh Disallow:/ User-agent: CheeseBot Disallow:/ User-agent: CherryPicker Disallow:/ User-agent: CherryPickerElite/1.0 Disallow:/ User-agent: CherryPickerSE/1.0 Disallow:/ User-agent: CopyRightCheck Disallow:/ User-agent: cosmos Disallow:/ User-agent: Crescent Disallow:/ User-agent: Crescent Internet ToolPak HTTP OLE Control v.1.0 Disallow:/ User-agent: DittoSpyder Disallow:/ User-agent: EmailCollector Disallow:/ User-agent: EmailSiphon Disallow:/ User-agent: EmailWolf Disallow:/ User-agent: EroCrawler Disallow:/ User-agent: ExtractorPro Disallow:/ User-agent: Foobot Disallow:/ User-agent: Harvest/1.5 Disallow:/ User-agent: hloader Disallow:/ User-agent: httplib Disallow:/ User-agent: humanlinks Disallow:/ User-agent: InfoNaviRobot Disallow:/ User-agent: JennyBot Disallow:/ User-agent: Kenjin Spider Disallow:/ User-agent: Keyword Density/0.9 Disallow:/ User-agent: LexiBot Disallow:/ User-agent: libWeb/clsHTTP Disallow:/ User-agent: LinkextractorPro Disallow:/ User-agent: LinkScan/8.1a Unix Disallow:/ User-agent: LinkWalker Disallow:/ User-agent: LNSpiderguy Disallow:/ User-agent: lwp-trivial Disallow:/ User-agent: lwp-trivial/1.34 Disallow:/ User-agent: Mata Hari Disallow:/ User-agent: Microsoft URL Control - 5.01.4511 Disallow:/ User-agent: Microsoft URL Control - 6.00.8169 Disallow:/ User-agent: MIIxpc Disallow:/ User-agent: MIIxpc/4.2 Disallow:/ User-agent: Mister PiX Disallow:/ User-agent: moget Disallow:/ User-agent: moget/2.1 Disallow:/ User-agent: mozilla/4 Disallow:/ User-agent: Mozilla/4.0 (compatible; BullsEye; Windows 95) Disallow:/ User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows 95) Disallow:/ User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows 98) Disallow:/ User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows NT) Disallow:/ User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows XP) Disallow:/ User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows 2000) Disallow:/ User-agent: Mozilla/4.0 (compatible; MSIE 4.0; Windows ME) Disallow:/ User-agent: mozilla/5 Disallow:/ User-agent: NetAnts Disallow:/ User-agent: NICErsPRO Disallow:/ User-agent: Offline Explorer Disallow:/ User-agent: Openfind Disallow:/ User-agent: Openfind data gathere Disallow:/ User-agent: ProPowerBot/2.14 Disallow:/ User-agent: ProWebWalker Disallow:/ User-agent: QueryN Metasearch Disallow:/ User-agent: RepoMonkey Disallow:/ User-agent: RepoMonkey Bait & Tackle/v1.01 Disallow:/ User-agent: RMA Disallow:/ User-agent: SiteSnagger Disallow:/ User-agent: SpankBot Disallow:/ User-agent: spanner Disallow:/ User-agent: suzuran Disallow:/ User-agent: Szukacz/1.4 Disallow:/ User-agent: Teleport Disallow:/ User-agent: TeleportPro Disallow:/ User-agent: Telesoft Disallow:/ User-agent: The Intraformant Disallow:/ User-agent: TheNomad Disallow:/ User-agent: TightTwatBot Disallow:/ User-agent: Titan Disallow:/ User-agent: toCrawl/UrlDispatcher Disallow:/ User-agent: True_Robot Disallow:/ User-agent: True_Robot/1.0 Disallow:/ User-agent: turingos Disallow:/ User-agent: URLy Warning Disallow:/ User-agent: VCI Disallow:/ User-agent: VCI WebViewer VCI WebViewer Win32 Disallow:/ 
User-agent: Web Image Collector Disallow:/ User-agent: WebAuto Disallow:/ User-agent: WebBandit Disallow:/ User-agent: WebBandit/3.50 Disallow:/ User-agent: WebCopier Disallow:/ User-agent: WebEnhancer Disallow:/ User-agent: WebmasterWorldForumBot Disallow:/ User-agent: WebSauger Disallow:/ User-agent: Website Quester Disallow:/ User-agent: Webster Pro Disallow:/ User-agent: WebStripper Disallow:/ User-agent: WebZip Disallow:/ User-agent: WebZip/4.0 Disallow:/ User-agent: Wget Disallow:/ User-agent: Wget/1.5.3 Disallow:/ User-agent: Wget/1.6 Disallow:/ User-agent: WWW-Collector-E Disallow:/ User-agent: Xenu's Disallow:/ User-agent: Xenu's Link Sleuth 1.1c Disallow:/ User-agent: Zeus Disallow:/ User-agent: Zeus 32297 Webster Pro V2.9 Win32 Disallow:/
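
    Worth remembering: robots.txt is purely advisory, and most of the bots on a list like this ignore it. The enforceable version of the same blacklist lives in .htaccess; a two-entry sketch (extend the pattern per agent):

        # tag requests from known bad agents, then refuse them
        SetEnvIfNoCase User-Agent "EmailCollector" bad_bot
        SetEnvIfNoCase User-Agent "WebStripper" bad_bot
        # Apache 2.2 syntax; 2.4 uses "Require not env bad_bot" inside <RequireAll>
        Order Allow,Deny
        Allow from all
        Deny from env=bad_bot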

    Read the article

  • mod_rewrite for clean URLs doesn't convert the URL to the clean form (but it's accessible) [on hold]

    - by deathlock
    Basically what I want to do is convert this: http://localhost/jariungu/user_caleg.php?idCaleg2014=3 into this: http://localhost/jariungu/caleg/3 I have managed to make /jariungu/caleg/3 direct to the original URL (as in, if I open that URL, it takes me to the appropriate page). The problem is that, once opened, the URL in the address bar returns to the original, ugly one. This is what I tried; could someone help?

        <IfModule mod_rewrite.c>
        Options +FollowSymlinks
        RewriteEngine On
        RewriteBase /jariungu/
        RewriteRule ^caleg\/([0-9]+)\/([a-zA-Z]+\s*[0-9]*)/?$ caleg.php?idCaleg2014=$1&namaCaleg=$2 [NC,L]
        RewriteRule ^caleg\/([0-9]+)/?$ caleg.php?idCaleg2014=$1 [NC,L]
        </IfModule>
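
    An internal rewrite never changes what the address bar shows; to surface the clean form, the ugly form has to be externally redirected to it, and the links in the HTML updated. A sketch using the names from the question (note the question maps the clean URL to caleg.php while the ugly example uses user_caleg.php; both are kept as posted). Matching on %{THE_REQUEST} keeps the redirect from firing on the internal rewrite and looping:

        RewriteEngine On
        RewriteBase /jariungu/
        # 301 the ugly form the client actually requested to the clean form
        RewriteCond %{THE_REQUEST} \?idCaleg2014=([0-9]+)
        RewriteRule ^user_caleg\.php$ caleg/%1? [R=301,L]
        # then map the clean form back onto the script internally
        RewriteRule ^caleg/([0-9]+)/?$ caleg.php?idCaleg2014=$1 [NC,L]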

    Read the article
