Search Results

Search found 4741 results on 190 pages for 'redirect p'.

Page 27/190 | < Previous Page | 23 24 25 26 27 28 29 30 31 32 33 34  | Next Page >

  • Configure nginx to reverse proxy a single url, and issue 301 redirects to everything else

    - by Martin
    I am using nginx to issue redirects for a domain we are changing, but one of our old mobile apps is broken by this redirect when it issues one specific POST request to the old domain. Here is the current nginx configuration. How could I add a reverse proxy for POST requests to the URL /post_url while redirecting everything else the same as now?

        server {
            listen 80;
            server_name olddomain.com www.olddomain.com;
            rewrite ^(.*) http://www.newdomain.com$1 permanent;
        }
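
    A minimal sketch of one possible layout, assuming the application that must receive the POST now lives on www.newdomain.com and still answers at the same /post_url path (the return 301 form below replaces the rewrite ... permanent from the question):

        server {
            listen 80;
            server_name olddomain.com www.olddomain.com;

            # Proxy only this one endpoint so the old mobile app keeps working
            location = /post_url {
                proxy_pass http://www.newdomain.com/post_url;
                proxy_set_header Host www.newdomain.com;
            }

            # Everything else gets a permanent redirect to the new domain
            location / {
                return 301 http://www.newdomain.com$request_uri;
            }
        }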

    Read the article

  • Nginx domain name redirects

    - by Srikar
    Let's say I have a website named xyz.co, and I also have other domain names with the same prefix, like xyz.com, xyz.it and xyz.co.it. Right now nginx works fine with server_name xyz.co in nginx.conf on port 80. I would like all the other domains to redirect to xyz.co, and I would also like the www.* versions of the above to redirect to xyz.co. How can I achieve this? Is it a change at the nginx webserver level, or do I need to make the change in DNS?
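
    A sketch of one way to do this at the nginx level, assuming DNS for every name, including the www. variants, already points at this server (the redirect itself cannot be expressed in DNS):

        # Catch-all for every alternate name, issuing a permanent redirect
        server {
            listen 80;
            server_name xyz.com xyz.it xyz.co.it
                        www.xyz.co www.xyz.com www.xyz.it www.xyz.co.it;
            return 301 http://xyz.co$request_uri;
        }

        # Existing site, unchanged
        server {
            listen 80;
            server_name xyz.co;
            # ... existing configuration ...
        }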

    Read the article

  • Reverse a .htaccess redirection rule

    - by Aahan Krish
    Let me explain by example. Say I have this redirection rule in my .htaccess file:

        RedirectMatch 301 ^/([^/]+)/([^/]+)/$ http://www.example.com/$2

    What it basically does is redirect http://www.mysite.com/sports/test-post/ to http://www.mysite.com/test-post/. Now, how do I modify the .htaccess rule to do the opposite, i.e. redirect http://www.mysite.com/test-post/ to http://www.mysite.com/sports/test-post/?
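
    A sketch of one possible reversal, assuming every single-segment URL should gain the /sports/ prefix and that RedirectMatch on this server accepts PCRE lookaheads (the lookahead keeps /sports/ itself from being prefixed again):

        RedirectMatch 301 ^/(?!sports/)([^/]+)/$ http://www.example.com/sports/$1/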

    Read the article

  • iptables intercept local traffic

    - by Anonymous
    I hope someone can help me out with a somewhat simple task. I'm trying to redirect a client on my router through my desktop PC, so I can dump the traffic and analyze it (it's a potential source of poisoning the network with malicious packets). However, I don't have a second NIC on hand, and I was hoping I could redirect all the traffic from that IP through my PC; in essence, to become a MITM for the client. Does anyone have any idea where to start?

        Current state:     (localip)-(router)-(internet)
        What I want to do: (localip)-(pc)-(router)-(internet)
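
    A sketch of one common single-NIC approach, using ARP spoofing rather than iptables alone (assumes the dsniff package for arpspoof; eth0, 192.168.1.10 as the client and 192.168.1.1 as the router are placeholders):

        # Let the PC forward the client's packets on to the router
        echo 1 > /proc/sys/net/ipv4/ip_forward

        # Tell the client we are the router, and the router we are the client
        arpspoof -i eth0 -t 192.168.1.10 192.168.1.1
        arpspoof -i eth0 -t 192.168.1.1 192.168.1.10

        # Capture the redirected traffic for later analysis
        tcpdump -i eth0 -w client.pcap host 192.168.1.10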

    Read the article

  • Serving index.html from a subdirectory

    - by xbonez
    In my document root I have two directories, home and foobar, both with their own index.html files. How can I set it up so that when someone visits my site at example.com, they see the contents of home/index.html? I tried using an index.php with a redirect in the document root, as well as a .htaccess redirect, but both of them change the URL in the browser to example.com/home/, which I would ideally like to avoid.
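
    A minimal sketch, assuming Apache with mod_rewrite enabled and .htaccess overrides allowed: an internal rewrite (no R flag) serves the file without changing the URL in the browser.

        # .htaccess in the document root
        RewriteEngine On
        RewriteRule ^$ home/index.html [L]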

    Read the article

  • Reversing a mod_rewrite rule

    - by KIRA
    I want to redirect accesses from http://www.domain.com/test.php?sub=subdomain&type=cars to http://subdomain.domain.com/cars. I already have mod_rewrite rules to do the opposite:

        RewriteEngine on
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{HTTP_HOST} !^(www)\. [NC]
        RewriteCond %{HTTP_HOST} ^(.*)\.(.*)\.com [NC]
        RewriteRule (.*) http://www.%2.com/index.php?route=$1&name=%1 [R=301,L]

    What changes do I need to make to these rules to redirect requests from the script to the subdomain?
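
    A sketch of the opposite direction, assuming the query string is exactly sub=...&type=... and that wildcard DNS for *.domain.com already resolves to the same server (the trailing ? drops the original query string from the target; %1 and %2 come from the last matched RewriteCond):

        RewriteCond %{HTTP_HOST} ^www\.domain\.com [NC]
        RewriteCond %{QUERY_STRING} ^sub=([^&]+)&type=([^&]+)$ [NC]
        RewriteRule ^test\.php$ http://%1.domain.com/%2? [R=301,L]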

    Read the article

  • Why are perfectly legitimate pages on my website registering in Google Webmaster Tools as 404?

    - by christian
    I have seen this question asked several times here, but never clearly answered. I suspect it has something to do with my .htaccess file:

        # BEGIN WordPress
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        </IfModule>
        # END WordPress

        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteRule ^moreinfo/(.*)$ http://www.kgstiles.com/moreinfo$1 [R=301]
        RewriteRule ^healthsolutions/(.*)$ http://www.kgstiles.com/healthsolutions$1 [R=301]
        RewriteRule ^(.*)\.html$ $1/ [R=301]
        RewriteRule ^(.*)\.htm$ $1/ [R=301]
        </IfModule>

    When I check a URL with a trailing slash at the end, it registers as a 404 (even though it renders fine in a browser), but without the trailing slash it returns 200 OK. If I try to strip the trailing slash via the .htaccess file, the browser gives me a 310 error (too many redirects). You can see both the 404 and the 310 with this URL: http://www.kgstiles.com/pureplantessentials.html, which redirects to http://www.kgstiles.com/pureplantessentials/ (which is a 404). What is the solution, and why might these pages be registering as 404s? Any help is appreciated! (I'm using WordPress, by the way.)
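
    For the .html-to-slash rules specifically, a sketch of a loop-resistant variant (an assumption on my part, not the original configuration): keying the condition on %{THE_REQUEST} means the rule only fires for the URL the browser actually asked for, so it cannot ping-pong with WordPress's own canonical redirects.

        <IfModule mod_rewrite.c>
        RewriteEngine On
        # Only redirect when the client literally requested a .htm/.html URL
        RewriteCond %{THE_REQUEST} \.html?\s [NC]
        RewriteRule ^(.*)\.html?$ /$1/ [R=301,L]
        </IfModule>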

    Read the article

  • Generate a proper 404 page for sites blocked via /etc/hosts instead of redirecting to localhost

    - by Mixhael
    I have blocked some websites by editing /etc/hosts and adding entries, one per line, in the following manner:

        0.0.0.0 www.domain.com

    And it works. The only thing is: when a blocked website is visited, the browser ends up at my http://localhost, resulting in a directory listing or whatever website is running in my localhost root environment. It's not a very big problem, but I would prefer a standard error saying the website cannot be visited (for instance a 404 page). Is this possible?
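
    A sketch of one way to get that behaviour, assuming the local web server currently answering these requests is Apache and that the hosts entries point at 127.0.0.1: a catch-all virtual host, listed before the real ones so it becomes the default, answers every unknown host name with a plain 404.

        # Default (first) vhost: any Host header not matched elsewhere lands here
        <VirtualHost *:80>
            ServerName blocked.invalid
            RedirectMatch 404 .*
        </VirtualHost>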

    Read the article

  • Ranking hit after site migration

    - by Ben
    I migrated my site from its old domain over a month ago. I followed Google Webmaster Tools' guidance completely, including 301 redirects from every existing URL to the new domain, and then submitted a change of address. Traffic continued as normal, but a few days after submitting the change of address it plummeted to about 20-30% of what it was previously. Most of my traffic comes from organic search, and for the keywords I had targeted before and performed well with, I am now ranking much, much lower. For low-competition keywords I've only lost a few places; for higher-competition terms I have really suffered. Things have started to pick up a bit (for one of my keywords I have risen from 195 to 100 in the last week), but it seems to be a very slow process. How seamless is this process normally? I was under the impression that it would not affect my rankings too severely, but it has now been a month since the move and recovery seems to be very slow, if happening at all. Is it likely that I've missed something? The only change is that I have moved what was the home page to be more of a sub-page, and in its place is now a magazine-style home page. I understand that links to the old site will now be pointing to the latter, which means that rankings for some keywords attributed to the old home page will take a hit, but even on other pages that seem to fit exactly the same page structure as the previous site I have seen a drop in rankings.

    Read the article

  • Serious 404 problem, suggestions for hunting them all down

    - by NRGdallas
    I have a bit of a situation coming up. Due to a complete website structure redesign that is basically inevitable, I expect the following:

      - About 90-95% of the roughly 12,000 URLs in our sitemap will change.
      - Out of those 12,000, I expect around 5000-6000 internal links to go dead in the process.
      - There are no external links to this site yet, as it is still in development.

    Is there a tool out there that can do the following:

      - I can feed it the sitemap.xml after the restructuring.
      - It parses each page's links for 404 errors on that page only.
      - It reports the pages/errors, preferably with just the URL the link is on, the URL of the error, and the anchor text.

    I have found a few tools, but all of them seem to be limited to 100 pages. Any advice for an intermediate webmaster to help with this situation? 301 redirects are not viable in this situation.

    Read the article

  • Redirect/rewrite dynamic URL to sub-domain and create DNS for subdomain

    - by Abdul Majeed
    I have created an application in PHP, and I would like to redirect the following URL to the corresponding sub-domain.

    Dynamic URL pattern: http://mydomain.com/mypage.php?user_name=testuser

    I wish to redirect this to the corresponding sub-domain: http://testuser.mydomain.com/

    How do I create a rewrite rule for this purpose? And how do I register DNS for the sub-domain without using cPanel? (I want to activate the sub-domain when the user registers with the system.)
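
    A sketch of the rewrite side, assuming Apache mod_rewrite with the rules in the document root's .htaccess, and a single wildcard DNS record (*.mydomain.com) pointing at the same server, which avoids registering each sub-domain individually at sign-up time:

        RewriteEngine On
        RewriteCond %{QUERY_STRING} (^|&)user_name=([^&]+) [NC]
        RewriteRule ^mypage\.php$ http://%2.mydomain.com/? [R=301,L]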

    Read the article

  • Rebuilt website from static html to CMS need to redirect indexed links

    - by Michael Dunn
    I have rebuilt a website that was originally created with static HTML pages; it now runs on a CMS. I need to find a way to redirect all the existing links to their new corresponding pages, which use friendly URL rewrites on the CMS-based website. I imagine there will be several hundred if not thousands, as I have pages and images linked from Google. What is the most efficient way to do this? Thanks in advance, Mike
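
    One commonly used approach for bulk redirects is an Apache RewriteMap, sketched below under a few assumptions: the site runs on Apache, you have access to the vhost configuration (RewriteMap cannot be declared in .htaccess), and old-to-new.map is a hypothetical plain-text file with one "old-path new-url" pair per line.

        # In the virtual host configuration
        RewriteEngine On
        RewriteMap legacy txt:/etc/apache2/old-to-new.map
        RewriteCond ${legacy:$1} !=""
        RewriteRule ^/(.+\.html?)$ ${legacy:$1} [R=301,L]

    Example map file line:

        about-us.html http://www.example.com/about/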

    Read the article

  • Robots.txt practices with .htaccess redirections (inherits)

    - by Jayhal
    I have a question about how to write robots.txt files for many domains and subdomains with redirects in place. We have a hosting account that uses primary and add-on domains. All of our domains and subdomains, including the primary domain, are redirected via .htaccess 301s to their own subdirectories in the primary domain's root directory. I'm confused about how I would write the robots.txt for certain directories.

    First, I wanted to confirm that I am right in understanding that for domains and subdomains, crawlers will look to the directory that acts as that URL's root directory for the crawling rules (robots.txt). Also, that a directory will not be affected by a robots.txt present in its parent directory if the directory has its own domain/subdomain, and that URL is the one being accessed by crawlers. (I'm pretty sure, but I wanted to confirm I didn't have a fundamentally flawed understanding of robots.txt.)

    In the original root directory on the account (where the primary domain was directed before the .htaccess was put in place), what should the robots.txt contain? When crawlers come to crawl our primary domain, will they look to the original root directory for the robots.txt, or will they reference the file contained in the new subdirectory where all the primary domain's site files are located? If the former, what should the root's robots.txt include, if anything at all? Would I be right to include a simple 'Disallow: /' for all agents, and then include more specific robots.txt files in each subdirectory with more specific instructions? Would that affect the crawling of the directory where the primary domain is now redirected?

    Any help is greatly appreciated, thanks!

    Read the article

  • Why does the rewrite directive cause a "301 Moved Permanently" with Nginx?

    - by Desmond Hume
    Below is a much simplified version of what I have in the configuration file of a server run by Nginx 1.2.5, yet it causes a 301 Moved Permanently with Location: http://example.com/phpmyadmin/ before serving data, which is not what I expected from the default behavior of the rewrite directive.

        server {
            listen 80;
            location /pma {
                rewrite ^ /phpmyadmin;
            }
            location /phpmyadmin {
                root /var/www;
                index Documentation.html;
            }
        }

    When I follow http://example.com/pma, the data is served, but the URL in the browser is changed to http://example.com/phpmyadmin/ when it was supposed to stay http://example.com/pma. How do I stop Nginx from sending the 301 Moved Permanently, so that it doesn't expose the actual directory structure on my server?
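
    A sketch of one explanation and fix, under the assumption that /var/www/phpmyadmin is a directory: rewriting to /phpmyadmin without a trailing slash makes nginx issue its automatic directory redirect, which is the externally visible 301. Rewriting internally to the slash-terminated path avoids it and keeps /pma in the address bar.

        location /pma {
            # Internal rewrite (no redirect); note the trailing slash in the target
            rewrite ^/pma/?(.*)$ /phpmyadmin/$1 last;
        }
        location /phpmyadmin {
            root /var/www;
            index Documentation.html;
        }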

    Read the article

  • GWT: reporting crawling errors for non existing links

    - by pixeline
    Google Webmaster Tools is reporting crawl errors for links that never existed, and if I check the "Linked from" tab for a given error link, it shows another URL that never existed. They all mention joomla/, which is not the CMS used on this domain (it's WordPress, fyi). Example:

        http://example.com/joomla/index.php/component/user/register
        Linked from: http://example.com/joomla/component/user/login?return=L2######

    What is going on?

    UPDATE 1: I tried something: I provided one of the faulty URLs to the "Fetch as Google" functionality. Instead of returning a 404, it returns a 301 to another Joomla page.

        HTTP/1.1 301 Moved Permanently
        Server: Apache/2.4.3
        X-Powered-By: PHP/5.4.4-10
        X-Pingback: http://example.com/xmlrpc.php
        Expires: Wed, 11 Jan 1984 05:00:00 GMT
        Cache-Control: no-cache, must-revalidate, max-age=0
        Pragma: no-cache
        Set-Cookie: PHPSESSID=1fgr5v2oip39miibuptd51s8h0; path=/
        Set-Cookie: woocommerce_items_in_cart=0; expires=Sat, 12-Jan-2013 11:44:01 GMT; path=/
        Location: http://example.com/joomla/component/user/register
        Content-Type: text/html; charset=iso-8859-1
        Content-Length: 387
        Date: Sat, 12 Jan 2013 12:44:01 GMT
        Via: 1.1 varnish
        Connection: keep-alive
        Accept-Ranges: bytes
        Age: 0

        <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
        <html><head>
        <title>301 Moved Permanently</title>
        </head><body>
        <h1>Moved Permanently</h1>
        <p>The document has moved <a href="http://example.com/joomla/component/user/register">here</a>.</p>
        <p>Additionally, a 301 Moved Permanently error was encountered while trying to use an ErrorDocument to handle the request.</p>
        </body></html>

    Read the article

  • URL Rewrite 2.0 Performance

    - by The Official Microsoft IIS Site
    Doing performance work is easy when you have the right tools for measuring gains or losses. I will share some thoughts about how to improve performance during rewriting, but please keep in mind that any change you make must be well thought out and with performance... Read more...

    Read the article

  • Redirect from https://mydomain.com to http://mydomain.com

    - by Charlie
    Many of my visitors have already bookmarked my site as https://mydomain.com. On the bad advice of a programmer, I put my whole Joomla site under SSL. I do not sell anything or provide any member services. I asked him many times if it would slow my site down; he said it wouldn't. I knew it did, and having researched on this site I realized it does slow the site down because the pages are not cached. Understood. Please, someone tell me how to get away from it now. I'm not sure how to approach this: should I add something to my .htaccess or to my main index.php file? I've looked all over the net; there is plenty of advice on adding redirects to go from http to https, but very few answers about the opposite, going from https to http. Thank you very much for your time. I appreciate it.
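
    A sketch of the .htaccess approach, assuming Apache with mod_rewrite and that the SSL virtual host stays in place so the old https bookmarks still connect before being redirected (place these lines before Joomla's own rewrite rules):

        RewriteEngine On
        RewriteCond %{HTTPS} on
        RewriteRule ^(.*)$ http://%{HTTP_HOST}/$1 [R=301,L]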

    Read the article

  • Ranking hit after WP site migration

    - by Ben
    I migrated my site from its old domain over a month ago. I followed WMT completely, including 301 redirects from every existing URL to the new domain, and then submitted a change of address. Traffic continued as normal, but a few days after submitting the change of address it plummeted to about 20-30% of what it was previously. Most of my traffic comes from organic search, and for the keywords I had targeted before and performed well with, I am now ranking much, much lower. For low-competition keywords I've only lost a few places; for higher-competition terms I have really suffered. Things have started to pick up a bit (for one of my keywords I have risen from 195 to 100 in the last week), but it seems to be a very slow process. How seamless is this process normally? I was under the impression that it would not affect my rankings too severely, but it has now been a month since the move and recovery seems to be very slow, if happening at all. Is it likely that I've missed something? The only change is that I have moved what was the home page to be more of a sub-page, and in its place is now a magazine-style home page. I understand that links to the old site will now be pointing to the latter, which means that rankings for some keywords attributed to the old home page will take a hit, but even on other pages that seem to fit exactly the same page structure as the previous site I have seen a drop in rankings. Any help would be greatly appreciated. Thanks!

    Read the article

  • web.config to redirect except some given IPs

    - by Alvin
    I'm looking for a web.config that is equivalent to the .htaccess file below, which redirects everyone to a coming-soon page except for the given IPs. Unfortunately I'm not familiar with IIS. Thank you.

        <IfModule mod_rewrite.c>
        RewriteEngine on
        RewriteCond %{REMOTE_HOST} !^123\.123\.123\.123
        RewriteCond %{REMOTE_HOST} !^321\.321\.321\.321
        RewriteCond %{REQUEST_URI} !/coming-soon\.html$
        RewriteRule (.*)$ /coming-soon.html [R=302,L]
        </IfModule>
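
    A sketch of an equivalent rule for the IIS URL Rewrite module (an assumption: the module is installed on the server; the IPs and the coming-soon page are taken from the question):

        <configuration>
          <system.webServer>
            <rewrite>
              <rules>
                <rule name="Coming soon" stopProcessing="true">
                  <match url=".*" />
                  <conditions>
                    <!-- Skip the allowed IPs and the coming-soon page itself -->
                    <add input="{REMOTE_ADDR}" pattern="^123\.123\.123\.123$" negate="true" />
                    <add input="{REMOTE_ADDR}" pattern="^321\.321\.321\.321$" negate="true" />
                    <add input="{REQUEST_URI}" pattern="/coming-soon\.html$" negate="true" />
                  </conditions>
                  <action type="Redirect" url="/coming-soon.html" redirectType="Found" />
                </rule>
              </rules>
            </rewrite>
          </system.webServer>
        </configuration>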

    Read the article

  • Website restyle, SEO migration plan?

    - by Goboozo
    I am currently in a project for one of my biggest clients. We have built a website that will replace the old website. When it comes to actual content it is largely the same; however, the presentation of the content has changed drastically, and from our point of view it is much more user-friendly (the main reason to update the site). Since the site's presentation has changed, we have some major changes in:

      - HTML & CSS: to change the presentation of the content
      - URLs: to make them more understandable (301 redirects have been taken care of and are in place)
      - Breadcrumbs: to enhance the navigation (we have made the breadcrumbs match the URLs exactly)
      - Pagination: added to enable content browsing
      - Title tags: added descriptive title tags to the major links and buttons

    Basically all user content, including meta tags, has remained the same. Since this company is rather successful and 90% of its clients come from Google's organic results, I am obliged to take all necessary precautions. People tell me I need a migration plan to prevent the site from being hurt in Google, but I have never worked with such a plan... So, based on the above: would you consider a migration plan necessary, and what precautions/actions would you recommend to prevent us from dropping in our SERP positions? Many thanks in advance for your answers.

    Read the article

  • Removing existing filtered pages from Google's index: noindex / 301 / canonical to non-filtered page?

    - by Noam
    I've decided to remove some of my site's pages from the Google index in order to focus the indexed pages on higher-quality ones. The pages I'm going to remove are already in the index. They are filtered pages that will continue to exist; I just don't want them in the Google index, because they add little quality compared to the same page without any filter selected. In Webmaster Tools I've specified "narrow" for the parameters that set these filters, but that doesn't seem to change anything in how Google handles these pages. So I'm considering three options:

      - Adding <meta name="robots" content="noindex" /> to the HTML header of these filtered pages
      - A 301 to the non-filtered page that contains the most similar information and will remain in the index
      - A canonical tag, which I'm not sure is exactly the mainstream use case, as these aren't really the same pages

    Which should I use?

    Read the article

  • Redirect non-www to www while preserving protocol

    - by Waleed Hamra
    I am aware that there are tons of questions in this section and on Server Fault dealing with redirections from non-www to www URLs, but I couldn't find one dealing with this issue while preserving the protocol. I am no mod_rewrite expert, and my code is just copy/pasted... here's what I have:

        RewriteCond %{HTTP_HOST} ^domain.tld$ [NC]
        RewriteRule ^(.*)$ http://www.domain.tld$1 [R=301,L]

    So now both http://domain.tld and https://domain.tld are forwarded to http://www.domain.tld. How do I make it so that https stays on https while http stays on http?
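
    A minimal sketch preserving the scheme (assuming Apache 2.4 or later, where %{REQUEST_SCHEME} is available; on older versions a pair of rules keyed on %{HTTPS} achieves the same thing):

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^domain\.tld$ [NC]
        RewriteRule ^(.*)$ %{REQUEST_SCHEME}://www.domain.tld/$1 [R=301,L]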

    Read the article

  • Migrating PR / rankings from one site to another

    - by sam
    I've got a client's company site with decent PR, backlinks and search engine rankings. The client wants to change their company name and therefore the URL, so I will set up a redirect between the old site and the new site. But I was wondering: is there a way to tell Google that they are moving while retaining all of the rankings? It is the same people, services and office building, same everything, just rebranded under a different name and URL. Additionally, if there is a way to do this, how does Google stop you buying expired domains and just pointing them at your site? For instance, I could buy several PR3 domains all relating to the same sector and point them at my site, or would Google catch on to this?

    Read the article

  • Removing 301 redirect from site root

    - by Jon Clements
    I'm having a look at a friend's website (a fairly old PHP-based one) which they've been advised needs restructuring. The key points are:

      - URLs should be lower case and more "friendly".
      - The root of the domain should not be redirected.

    The first point I'm happy with (the URLs needed tidying up anyway) and I have a draft plan of action; however, the second is baffling me, not only as to the best way to do it but also whether it should be done at all. Currently http://www.example.com/ is redirected to http://www.example.com/some-link-with-keywords/ using the following index.php in the root of the Apache2 instance:

        <?php
        $nextpage = "some-link-with-keywords/";
        header( "HTTP/1.1 301 Moved Permanently" );
        header( "Status: 301 Moved Permanently" );
        header( "Location: $nextpage" );
        exit(0); // This is optional but suggested, to avoid any accidental output
        ?>

    As far as I'm aware, this has been the case for around three years, and I'm sorely tempted to advise not to worry about it. It would appear that taking off the 301 could:

      - Potentially affect page ranking (as the 'homepage' would disappear - although it couldn't disappear because of the next point...)
      - Introduce maintenance issues, as existing users would still have the redirected page in their cache
      - Following the above, introduce duplicate content
      - Confuse Google/other search engines as to what the homepage actually is now

    I may be over-analysing this, but I have a feeling it's not as simple as removing the 301 from the root and 301'ing the previous target to the root... Any suggestions (including that it's not worth it) are sincerely appreciated.
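
    For the "301'ing the previous target to the root" part mentioned above, a sketch of what that could look like (an assumption for illustration, not an endorsement of making the change; uses mod_alias in the Apache configuration or .htaccess, with the keyword URL taken from the question):

        # Serve the real content at / (the redirecting index.php removed),
        # and send the old keyword URL permanently back to the root
        Redirect 301 /some-link-with-keywords/ http://www.example.com/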

    Read the article
