Search Results

Search found 5181 results on 208 pages for '301 redirect'.

Page 70/208 | < Previous Page | 66 67 68 69 70 71 72 73 74 75 76 77  | Next Page >

  • How do I use .htaccess to redirect to a URL containing HTTP_HOST?

    - by Jon Cram
    Problem: I need to redirect some short convenience URLs to longer actual URLs. The site in question uses a set of subdomains to identify development or live versions. I would like the URL to which certain requests are redirected to include the HTTP_HOST, so that I don't have to create a custom .htaccess file for each host.

    Host-specific example (snipped from the .htaccess file):

        Redirect /terms http://support.dev01.example.com/articles/terms/

    This works fine for the development version running at dev01.example.com. If I used the same line in the main .htaccess file for the development version running under dev02.example.com, I'd end up being redirected to the wrong place.

    Ideal rule (not sure of the correct syntax):

        Redirect /terms http://support.{HTTP_HOST}/articles/terms/

    This rule does not work and merely illustrates what I'd like to achieve. I could then use the exact same rule under many different hosts and get the correct result.

    Questions: Can this be done with mod_alias, or does it require the more complex mod_rewrite? How can it be achieved using either? I'd prefer a mod_alias solution if possible.

    Clarifications: I'm not staying on the same server. I'd like:

        http://example.com/terms/             -> http://support.example.com/articles/terms/
        https://secure.example.com/terms/     -> http://support.example.com/articles/terms/
        http://dev.example.com/terms/         -> http://support.dev.example.com/articles/terms/
        https://secure.dev.example.com/terms/ -> http://support.dev.example.com/articles/terms/

    I'd like to be able to use the same rule in the .htaccess file on both example.com and dev.example.com. That means referring to the HTTP_HOST as a variable rather than specifying it literally in the URL to which requests are redirected. I'll investigate the HTTP_HOST parameter as suggested, but I was hoping for a working example.
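
    Since mod_alias's Redirect cannot interpolate variables, this needs mod_rewrite. A minimal sketch, assuming the hosts follow the www./secure. prefix pattern shown above:

        RewriteEngine On
        # Capture the host minus any "www." or "secure." prefix into %1
        RewriteCond %{HTTP_HOST} ^(?:www\.|secure\.)?(.+)$ [NC]
        RewriteRule ^terms/?$ http://support.%1/articles/terms/ [R=301,L]

    The same rule then yields support.example.com on the live hosts and support.dev.example.com on the development hosts.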

    Read the article

  • Is there a way to "redirect" a click on a URL in a VirtualBox guest to open in the host OS browser?

    - by Graeme Donaldson
    I'm using VirtualBox OSE on Ubuntu 10.04. I have a Windows 7 guest VM which I use almost exclusively for MS Outlook to access my Exchange mailbox. If I click a URL in Outlook it obviously opens in IE in the guest VM. Is there any way to have it perform a redirect of some sort? If I click a URL inside the VM, I want it to load in my default browser on the Ubuntu host.

    Read the article

  • How come I can't redirect TCP ports on this wireless router?

    - by George Edison
    I am configuring a router to redirect TCP port 5900 (yes, this is for VNC) to a specific IP address on the network. Here is the situation: from a local computer on the same network, I can telnet to 192.168.1.64 (port 5900) just fine. However, when I try to telnet to the machine (port 5900) using its external IP address, the connection times out. The router is a Gigaset SE567, if that helps.

    Read the article

  • Redirection problem for my sites

    - by redirect-p
    I have a site example.com and another one test.example.com. Both have different configuration files, but when I enter the URL test.example.com, it redirects to example.com. Here is the configuration file for example.com:

        <VirtualHost *:80>
            ServerName example.com
            ServerAlias www.example.com
            DirectoryIndex index.html
            DocumentRoot my-document-path
            Options -Indexes
            ErrorDocument 404 /errors/404.html
            ErrorDocument 403 /errors/404.html
            <Location "/">
                SetHandler python-program
                PythonHandler django.core.handlers.modpython
                PythonPath "['path', 'path'] + sys.path"
                SetEnv DJANGO_SETTINGS_MODULE example.settings
                PythonInterpreter example
                PythonAutoReload On
                PythonDebug On
            </Location>
        </VirtualHost>
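
    Nothing here matches test.example.com, and Apache serves the first listed VirtualHost for any host it doesn't recognise, which would produce exactly this behaviour. A separate VirtualHost for the subdomain should fix it; a minimal sketch (the DocumentRoot is a placeholder):

        <VirtualHost *:80>
            ServerName test.example.com
            DocumentRoot /path/to/test-site
        </VirtualHost>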

    Read the article

  • URL Rewrite, ServerVariables, URL Parts, HTTP to HTTPS Redirect. Week 9

    - by OWScott
    Last week I gave an intro to URL Rewrite, covering the basics and giving a real-world example. This week I dive deeper and cover ServerVariables, the parts that make up the URL, and another real-world example: redirecting HTTP to HTTPS. This is week 9 of a 52-week series on various web administration tasks. Past and future videos can be found here. For reference, in the video I mentioned the following two blog posts: Viewing ServerVariables For a Site, and Parts of the URL available to URL Rewrite.
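
    For readers who want the gist in text form, the common HTTP-to-HTTPS pattern in URL Rewrite looks something like this (a sketch of the usual rule, not taken verbatim from the video):

        <rule name="HTTP to HTTPS" stopProcessing="true">
            <match url="(.*)" />
            <conditions>
                <!-- Only fire for plain-HTTP requests -->
                <add input="{HTTPS}" pattern="^OFF$" />
            </conditions>
            <action type="Redirect" url="https://{HTTP_HOST}/{R:1}" redirectType="Permanent" />
        </rule>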

    Read the article

  • URL rewrite from www.domain.com/subdirectory to http://domain.com/subdirectory

    - by chrizzbee
    I need a solution to the following problem: I use a CMS and want the backend to be available only at http://domain.com/backend, not at http://www.domain.com/backend. How do I have to change my .htaccess file to achieve this? I already have a rewrite rule from HTTP (non-www) to www. Here's what I currently have in my .htaccess file:

        ##
        # Uncomment the following lines to add "www." to the domain:
        #
        RewriteCond %{HTTP_HOST} ^shaba-baden\.ch$ [NC]
        RewriteRule (.*) http://www.shaba-baden.ch/$1 [R=301,L]
        #
        # Uncomment the following lines to remove "www." from the domain:
        #
        # RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
        # RewriteRule (.*) http://example.com/$1 [R=301,L]
        #
        # Make sure to replace "example.com" with your domain name.
        ##

    So, the first bit is the redirect from non-www to www; it works on the domain part of the URL. As explained, I need a rewrite rule that sends the backend login at http://www.shaba-baden.ch/contao to http://shaba-baden.ch/contao.
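
    One way to do this is to exclude the backend path from the www redirect and add a reverse rule for it; a sketch, assuming the backend lives under /contao:

        # Strip "www." for the backend only
        RewriteCond %{HTTP_HOST} ^www\.shaba-baden\.ch$ [NC]
        RewriteRule ^contao(/.*)?$ http://shaba-baden.ch/contao$1 [R=301,L]

        # Add "www." everywhere else
        RewriteCond %{HTTP_HOST} ^shaba-baden\.ch$ [NC]
        RewriteCond %{REQUEST_URI} !^/contao
        RewriteRule (.*) http://www.shaba-baden.ch/$1 [R=301,L]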

    Read the article

  • 410 Responses when your CMS host doesn't support them?

    - by leeand00
    Sending a 410 response for a page that no longer exists should make Google stop crawling that page. The site I am working on was recently migrated, and very little of the content was migrated with it. I've already turned the existing content into 301 redirects (the content that is on both the old and the new site), but now I would like to flush the old content from Google's memory by placing 410 responses in its path when Google returns to crawl for it and finds a 404. However, I asked our CMS host about it, and they said that our CMS does not support 410 responses. Is there some other way to serve a 410, like making a dead link 301-redirect to a page that delivers a 410 response, or something in the form of a meta tag?
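
    If the host allows .htaccess overrides, Apache can return 410s without any CMS support; a sketch with hypothetical paths:

        # mod_alias: mark a single page as gone
        Redirect gone /old-page.html

        # mod_rewrite: mark a whole section as gone with the [G] flag
        RewriteRule ^old-section/ - [G,L]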

    Read the article

  • Opening a new window from ASP.NET code-behind

    - by TATWORTH
    At http://weblogs.asp.net/infinitiesloop/archive/2007/09/25/response-redirect-into-a-new-window-with-extension-methods.aspx there is an excellent post on how to open a new window from code-behind. The purists may not like it, but it helped solve a problem for a client's client. Here is an update for VS2010 users:

        using System;
        using System.Web;
        using System.Web.UI;

        /// <summary>
        /// Response helper for opening a popup window from code-behind.
        /// </summary>
        public static class ResponseHelper
        {
            /// <summary>
            /// Redirect to a popup window.
            /// </summary>
            /// <param name="response">The response.</param>
            /// <param name="url">URL to open.</param>
            /// <param name="target">Target of the window: _self or _blank.</param>
            /// <param name="windowFeatures">Features such as the window bar.</param>
            /// <remarks>
            /// <list type="bullet">
            /// <item>From http://weblogs.asp.net/infinitiesloop/archive/2007/09/25/response-redirect-into-a-new-window-with-extension-methods.aspx</item>
            /// <item>Note: if you use this outside the context of a Page request, you can't redirect to a new window. The reason is the need to call the ResolveClientUrl method on Page, which can't be done if there is no Page. I could have just built my own version of that method, but it's more involved than you might think to do it right. So if you need to use this from an HttpHandler other than a Page, you are on your own.</item>
            /// <item>Beware of popup blockers.</item>
            /// <item>Note: obviously when you are redirecting to a new window, the current window will still be hanging around. Normally redirects abort the current request and no further processing occurs. But for these redirects, processing continues, since we still have to serve the response for the current window (which also happens to contain the script to open the new window, so it is important that it completes).</item>
            /// <item>Sample call: Response.Redirect("popup.aspx", "_blank", "menubar=0,width=100,height=100");</item>
            /// </list>
            /// </remarks>
            public static void Redirect(this HttpResponse response, string url, string target, string windowFeatures)
            {
                if ((String.IsNullOrEmpty(target) || target.Equals("_self", StringComparison.OrdinalIgnoreCase)) && String.IsNullOrEmpty(windowFeatures))
                {
                    // No popup needed: fall back to a standard redirect.
                    response.Redirect(url);
                }
                else
                {
                    // "as" cast so a non-Page handler yields null rather than an InvalidCastException.
                    Page page = HttpContext.Current.Handler as Page;
                    if (page == null)
                    {
                        throw new InvalidOperationException("Cannot redirect to new window outside Page context.");
                    }
                    url = page.ResolveClientUrl(url);
                    string script;
                    if (!String.IsNullOrEmpty(windowFeatures))
                    {
                        script = @"window.open(""{0}"", ""{1}"", ""{2}"");";
                    }
                    else
                    {
                        script = @"window.open(""{0}"", ""{1}"");";
                    }
                    script = String.Format(script, url, target, windowFeatures);
                    // Emit the window.open call into the page that is still being served.
                    ScriptManager.RegisterStartupScript(page, typeof(Page), "Redirect", script, true);
                }
            }
        }

    Read the article

  • Preventing Duplicates on Google

    - by abel
    I am currently using a rewrite rule to enable access to .php pages without using the php extension. However, to prevent old links from breaking, the pages can still be accessed via links containing the .php extension too. For example, domain.com/page.php can now be accessed at domain.com/page. All the links within the site now use the domain.com/page form. However, older incoming links will still point to the .php pages, meaning Google will index both pages and mark them as duplicates. I have two plans to remedy the situation:

    1. Use a PHP 301 redirect: when a page is accessed with the .php extension, redirect each page individually using a 301 redirect issued from PHP.
    2. Use a canonical tag: place a canonical tag on each page, pointing to the ".php"-less version.

    My question: are both methods equally effective in preventing Google from indexing my ".php" pages? Which method should be preferred, by convention or otherwise?
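
    A third option worth noting is doing the 301 centrally in .htaccess instead of per page in PHP; a sketch, assuming the existing rewrite maps /page internally to page.php:

        # Redirect explicit .php requests to the extensionless URL.
        # Matching against THE_REQUEST (the original request line) avoids
        # a loop when the extensionless URL is internally rewritten back to .php.
        RewriteCond %{THE_REQUEST} \.php[?\s] [NC]
        RewriteRule ^(.+)\.php$ /$1 [R=301,L]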

    Read the article

  • Do search engines rank internal redirects negatively?

    - by siverd
    A client is in the late stages (code complete) of a website redesign and unfortunately hasn't implemented 301 redirects to point high-traffic pages to the new URLs. As I understand it, our only option at this point is to create redirects within the CMS. Our CMS allows us to do this:

        www.mysite.com/category/current-page.html
        will redirect to
        www.mysite.com/new-category-name/new-page.html

    The site now uses custom logic on our 404 page to check this list of redirects, and if one exists, forwards the user to new-page.html. I understand that using 301 redirects would be the correct way to maintain our page rank, but I think that would require a code change which isn't possible. Question: how will search engines respond to this? Will they wait until the redirect happens and allow us to keep our page rank (authority, trust, etc.), or will they see the 404 page and down-rank us? Worst case: will they make our new-page.html start from a rank of "0"? Thanks for your help.
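
    One thing to check: what matters to crawlers is the HTTP status code, and a 404-page handler usually still returns a 404 (or at best a 302) rather than a 301, so authority would not be passed. If the web server configuration is reachable even though the application code is frozen, real 301s can be added there with no code change; an Apache sketch using the URLs above:

        RedirectMatch 301 ^/category/current-page\.html$ /new-category-name/new-page.html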

    Read the article

  • Duplicate domain names (.net and .com): create separate pages, or redirect?

    - by guisasso
    From an SEO point of view: this website has a good amount of traffic for a local business, but also ships some merchandise. While the .net domain (registered first) is associated with the local business (Google Places, Maps, etc.), the .com domain only redirects to the .net domain. Would it be good, bad, or okay to create a different page for the .com domain, for example a fairly simple one that links to the 10 categories of products this company sells? I know links are good, so there's that, but what else is good, or bad? Thanks in advance!
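
    For reference, the usual way to keep a secondary domain from competing with the primary is a domain-wide 301, which is what the .com apparently does already; a sketch with placeholder names:

        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?example\.com$ [NC]
        RewriteRule (.*) http://www.example.net/$1 [R=301,L]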

    Read the article

  • What are the Consequences of using Relative Location Headers?

    - by Alan Storm
    According to the spec, the Location header used in a redirect requires an absolute URL, including the server name:

        HTTP/1.1 301 Moved Permanently
        ...
        Location: http://example.com/foo/baz/bar

    However, in 2012, most web browsers will recognize a relative path and redirect you to the new location using the original server name:

        HTTP/1.1 301 Moved Permanently
        ...
        Location: /foo/baz/bar

    Are there any negative or surprising consequences to using relative URLs in Location headers? My particular concern is how Google and other search engines will interpret this, but if there's anything else I'm not thinking about, I'd love to hear it.

    Read the article

  • Googlebot can't access my site when crawling from the root domain

    - by PéCé
    I can't explain why I get this message for my root-domain result in Google:

        trocmalin.com/
        A description for this result is not available because of this site's robots.txt – learn more.

    Here are my site's specifics: vide-greniers.trocmalin.com is the site address; www.trocmalin.com redirects (301) to vide-greniers.trocmalin.com, and trocmalin.com redirects (301) to vide-greniers.trocmalin.com too. My robots.txt:

        User-agent: *
        Disallow: /orga/

        User-agent: *
        Disallow: /sitemap-update

    Google results for vide-greniers.trocmalin.com are rendered well, as are the sub-pages allowed for bots. But the result for my root domain (trocmalin.com) gives this message. Can you help me?
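
    As an aside, two separate records for the same user agent are ambiguous; many robots.txt parsers honour only one group per agent, so the directives are safer merged into a single record:

        User-agent: *
        Disallow: /orga/
        Disallow: /sitemap-update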

    Read the article

  • Meta Refresh for change of page name and content

    - by user3507399
    Hopefully just a quick one. I've got a client that is changing the name of a workshop they run. This means a change of URL and page title for keywords on which they have first-page rankings. The keywords are still relevant, so what I want to avoid is a 301 redirect to a page that has different keywords from the previous page. Is the best option to keep the old page live, with its URL and title, and use a meta refresh to redirect after a period of time (not instantly)? That way the SEO ranking is retained for the previous workshop name while they work on the ranking for the new name. Would a 301 redirect have an adverse effect? Thanks!
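
    For reference, a delayed meta refresh looks like this (a sketch; search engines are commonly reported to treat a timed refresh less favourably than a 301, and an instant one as roughly equivalent to a 301):

        <meta http-equiv="refresh" content="10; url=http://example.com/new-workshop-name/">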

    Read the article

  • Are multiple domain names and links from the same IP causing poor search engine rankings?

    - by John
    I have an ecommerce website which is not doing well in Google. I am trying to improve this, of course, and am looking at some possible reasons why. The website has four domain names, all of which have been indexed by Google. A few months ago I applied 301 redirects to any requests for two of the domain names, so now it is down to two (one is a .net, the other a .com.au; the others were .net.au and .com). I prefer to use my main domain name (the .com.au), but one of the names has been around for a long time and has more inbound links. According to a PageRank tool, both are PR2.

    It is a Classic ASP site and up until recently had a lot of querystring parameters. In the last week or so I added URL rewriting, so there are now no parameters for most pages. I don't do 301 redirects from the old URLs; instead I add the canonical META tag indicating the preferred new URL. At the same time I redesigned the site and improved title tags, META descriptions, and H tags, but it hasn't been long enough yet for Google to index many of these.

    I also looked at which pages Google has indexed, and strangely it has some odd ones: a lot of pages which are actual keyword searches (more a bunch of random letters than actual words). What I mean is that it is as if something had been typed into the search box on my site; there are no links to pages like this, and the only way of producing them is to type something into the search box. So I now add a META robots tag with noindex,nofollow whenever I render pages like this.

    Years ago I set up a fake price-comparison site which lists all my products and links back to my site. It has a different keyword-rich domain name, but it is on the same server and the same IP address. It has a completely different layout but the same product categories and product descriptions (although I have stripped the formatting out of them, so they are not identical except in text). I also have a few blog sites, again on the same server/IP, which all carry advertising for the website.

    My questions are:

    1. What should I do with the multiple domains: just use one, or continue with two or more?
    2. Should I add 301 redirects, not just the canonical META tag?
    3. Any idea why Google is indexing my search-results pages, and did I do the right thing with the META robots tag?
    4. Is the fake price-comparison site likely to be causing problems?
    5. Are all the links to the site from other domain names on the same IP address likely to be causing problems?

    Thanks for any help. Sorry for so many questions in one.
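
    On question 2: consolidating everything onto the preferred hostname with a 301, while keeping the canonical tags, is the usual approach. Since the site is Classic ASP, here is a sketch assuming the IIS URL Rewrite module, with example.com.au standing in for the preferred domain:

        <rule name="Canonical host" stopProcessing="true">
            <match url="(.*)" />
            <conditions>
                <!-- Redirect any request whose host is not the preferred one -->
                <add input="{HTTP_HOST}" pattern="^www\.example\.com\.au$" negate="true" />
            </conditions>
            <action type="Redirect" url="http://www.example.com.au/{R:1}" redirectType="Permanent" />
        </rule>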

    Read the article

  • How to redirect from HTTPS to HTTP without a warning message?

    - by user833985
    I have two web sites: one HTTP site and one HTTPS site. I validate the credentials in the HTTPS environment and return to HTTP once authorized. This works fine in IE, but in Mozilla I get the warning given below:

        Although this page is encrypted, the information you have entered is to be sent over an unencrypted connection and could easily be read by a third party. Are you sure you want to continue sending this information?

    How can I overcome this warning message? Currently I'm posting from the HTTPS .aspx page to the HTTP page using JavaScript.
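
    The warning is specifically about submitting form data from an encrypted page to an unencrypted URL; browsers do not warn about a plain redirect. One possible approach (a sketch, with a hypothetical URL): post the credentials to the HTTPS page itself, then redirect once they are validated.

        // In the HTTPS login page's code-behind, after successful validation:
        Response.Redirect("http://www.example.com/home.aspx");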

    Read the article

  • RewriteRule for URL Subdirectory Root

    - by JYerdon
    I have not found this in my searches on SE. I need this scenario to work:

    • If a user visits someurl.com/news/somefolder or someurl.com/news/somefolder/, they get redirected to someurl.com/somefolder.
    • If the user visits just someurl.com/news or /news/, they are allowed through to visit /news.

    Here is my current rule:

        RewriteRule ^news/(.*) /$1 [NC,R=301,L]

    How do I make it allow the second bullet point? The first seems to work with no issues. Thanks all!

    POST UPDATE: I now have the code

        RewriteCond %{REQUEST_URI} ^news
        RewriteRule ^/news news/ [NC,L]
        RewriteCond %{REQUEST_URI} ^/news/(.*)$
        RewriteRule ^news/(.*) /$1 [NC,R=301,L]

    BUT it doesn't allow me to go to the URL something.com/news/. Any thoughts?
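
    A single rule that only matches when something follows news/ covers both bullets; a sketch:

        # Redirect /news/<something> to /<something>, but leave /news and /news/ alone
        RewriteRule ^news/(.+)$ /$1 [NC,R=301,L]

    Because (.+) requires at least one character after the slash, /news and /news/ fall through and are served normally.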

    Read the article

  • List all versions of a package

    - by askb
    Is there an equivalent of this command:

        yum list kernel-headers --showduplicates

    on Ubuntu, with apt-get, apt-cache, etc.? The above command lists the various versions of the kernel-headers rpm available on F20/RHEL installations. The closest I can get is apt-cache showpkg; not sure if there is a better way?

        $ apt-cache showpkg linux-image
        Package: linux-image
        Versions:
        Reverse Depends:
          firmware-crystalhd,linux-image
          systemtap,linux-image
          fiaif,linux-image
        Dependencies:
        Provides:
        Reverse Provides:
        linux-image-3.13.0-27-lowlatency 3.13.0-27.50
        linux-image-3.13.0-27-generic 3.13.0-27.50
        linux-image-3.13.0-24-lowlatency 3.13.0-24.47
        linux-image-3.13.0-24-generic 3.13.0-24.47
        linux-image-3.13.0-24-lowlatency 3.13.0-24.46
        linux-image-3.13.0-24-generic 3.13.0-24.46

    I am expecting output similar to:

        $ yum list kernel-headers --showduplicates
        Loaded plugins: langpacks, refresh-packagekit
        Installed Packages
        kernel-headers.x86_64    3.11.10-301.fc20    @fedora
        Available Packages
        kernel-headers.x86_64    3.11.10-301.fc20    fedora
        kernel-headers.x86_64    3.14.4-200.fc20     updates

    This would help me simply downgrade or upgrade to a specific version.
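
    For what it's worth, apt-cache has closer equivalents; either of these lists every version of a package known to apt (linux-image-generic is just an example package name):

        $ apt-cache madison linux-image-generic
        $ apt-cache policy linux-image-generic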

    Read the article

  • How to handle CNAME host redirect to virtual directory?

    - by esac
    I have an internal website and a virtual directory, http://server2012/logs. I created a CNAME record on my DNS server: LOGS -> server2012. I would like to set it up so that http://LOGS redirects to http://server2012/logs. Ideally, I would still want all pages to appear in the browser as being served from the LOGS URL. So http://LOGS/network.html?site=32 is what is displayed in the browser, but it is really being served from http://server2012/logs/network.html?site=32. I've looked at URL Rewrite but can't seem to get it to work.
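
    Since the CNAME points at the same box, a host-matched rewrite can keep LOGS in the address bar; a sketch, assuming the URL Rewrite module and a site binding that accepts the LOGS host header:

        <rule name="LOGS host to /logs" stopProcessing="true">
            <!-- The lookahead keeps URLs already under /logs/ from being rewritten again -->
            <match url="^(?!logs/)(.*)" />
            <conditions>
                <add input="{HTTP_HOST}" pattern="^logs$" />
            </conditions>
            <action type="Rewrite" url="/logs/{R:1}" />
        </rule>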

    Read the article
