Search Results

Search found 24378 results on 976 pages for 'pinned site'.

  • Apache configuration: choose a site to display according to visitor's IP address

    - by user64294
    Hi all. I would like to set up a special configuration on our Apache web server: I want to serve a different site to users according to their IP address. We plan to upgrade our web sites, and during the upgrade we'll put up a maintenance site, so every user who connects to our sites will get that maintenance page. But in order to test the upgrade, I need to configure Apache so that only my IP address can reach the requested site. If my IP address is a.b.c.d and I ask for test_dot_com, I want to see it, but all other users, coming from a different IP address, should get the maintenance site even if they ask for test_dot_com. Is there a way to do this? PS: sorry, as I'm a new user I can't use more than one link, so test_dot_com = test.com. Thank you.
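
    One way to do this with mod_rewrite (a minimal sketch, assuming the module is enabled and a maintenance page exists at /maintenance.html; a.b.c.d stands in for the tester's address):

        # Send every visitor except the tester to the maintenance page
        RewriteEngine On
        RewriteCond %{REMOTE_ADDR} !^a\.b\.c\.d$
        # Don't rewrite the maintenance page itself, or the rule would loop
        RewriteCond %{REQUEST_URI} !^/maintenance\.html$
        RewriteRule ^ /maintenance.html [R=302,L]

    A 302 keeps search engines from indexing the maintenance URL; dropping the R flag serves the page without changing the address bar.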

    Read the article

  • Automatic server redirect that passes through as the referring site

    - by nsearle
    Hey all, as always, thanks in advance for your help and assistance. I am looking for a way to automatically redirect a user to a site while keeping the redirecting site as the referral site. Step by step, the process I need is:
    1. User clicks a link in an email.
    2. User is taken to testdomain.com.
    3. User is automatically redirected to test.com/landing.
    4. test.com sees testdomain.com as the referral site.
    5. Data is gathered via Google Analytics.
    I am unsure whether this PHP code will take care of it or not: header("Location: http://google.com", true, 303); I could just test it, and that is most likely what I am going to do, but I would like to understand a little more about WHY this would or wouldn't work.
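
    For what it's worth, a plain HTTP redirect usually will not make testdomain.com show up as the referrer: browsers generally carry the original Referer (or none at all) across a 3xx response, so Google Analytics never sees the intermediate domain. A common workaround is to tag the destination URL with campaign parameters instead. A minimal sketch of the redirect script on testdomain.com (the utm values are placeholders):

        <?php
        // The 302/303 status itself does not rewrite the Referer header,
        // so the traffic is labelled explicitly for Google Analytics.
        $target = 'http://test.com/landing'
                . '?utm_source=testdomain.com&utm_medium=redirect&utm_campaign=email';
        header('Location: ' . $target, true, 302);
        exit;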

    Read the article

  • Site resolves fine without "www", "www" creates database error

    - by PatrickS
    Working with BOA (Barracuda / Octopus / Aegir), I've installed a few Drupal sites without any problems, following the same process for each. BOA is running on Nginx. All sites go through Cloudflare's network, where I set the same DNS settings:
    - an A record for example.com pointing to IPADDRESS
    - an A record for www pointing to IPADDRESS
    The nameservers of each domain point to Cloudflare's respective nameservers. It all works, except for one site that works perfectly without "www" but with "www" returns the typical database error Drupal shows when it can't find the site's database: "Site off-line. The site is currently not available due to technical problems. Please try again later. Thank you for your understanding. If you are the maintainer of this site, please check your database settings in the settings.php file and ensure that your hosting provider's database server is running. For more help, see the handbook, or contact your hosting provider." In BOA, all sites have the same alias, basically a symlink, redirecting like this: www.example.com -> example.com
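
    If the alias symlink is being missed for this one site, an explicit entry in Drupal's multisite sites.php can map the www host onto the same site directory. A minimal sketch, assuming Drupal 7's standard multisite lookup (the domain is a placeholder; BOA/Aegir may manage this file for you):

        <?php
        // sites/sites.php
        // Requests for www.example.com use the same site directory, and
        // therefore the same settings.php and database, as example.com.
        $sites['www.example.com'] = 'example.com';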

    Read the article

  • HTTP traffic through PIX VPN from outside site

    - by fwrawx
    I have a remote site with a website that only allows access from the outside IP assigned to our local PIX. I have users connecting to the local network over a VPN who need to be able to view this remote site. I don't think this works because the packets want to come in and go out over the same (ext) interface. So I'm looking for a way to make this work using the PIX, or to set up a service on a server on the local network to act as a middle-man for the HTTP requests. The remote site doesn't support setting up a VPN to our PIX, and the remote website is dishing out pages over a non-standard port. Can I use squid or something similar to proxy just one site?

    Read the article

  • Tomcat and ASP site under IIS6 with SSL

    - by Rafe
    I've been working on migrating our company's website from its original server to a new one and am having two different but possibly related problems. The box this is sitting on is a Windows 2003 Server x64 running IIS 6. The Tomcat version is 5.5.x, as that is the version the original deployment was built on. There are two other sites on the server, one in plain HTML and another in PHP, and the one I am trying to migrate is a combination of Java and ASP (the introductory/sign-in pages are Java, as well as many reports used for our clients, while the administration pages are in ASP).
    First of all, I can only access the site if I enter the IP followed by :8080 (xxx.xxx.xxx.xxx:8080). The original setup had an index.html file in the root of the site with a bit of JavaScript in the header that pointed the site to 'www.mysite.com/app/public', but if I try going directly to the site without the 8080 I get a 'page not found' error, and the JavaScript redirector causes the same problem because it doesn't add the 8080 into the URL, even though on the original site the 8080 wasn't present, so I don't understand why it would be needed now. The JS redirect looks like this:
        <script language="JavaScript">
        <!--
        location.href = "/app/public/"
        location.replace("/app/public/");
        //-->
        </script>
    When setting the site up I used the command line to unbind IIS from all of the IPs on the system (there are 12 IPs on this box) because I was led to believe Tomcat wanted to use localhost, which wasn't accessible. I'm not sure if this was the right thing to do, but I'm throwing it in for the sake of completeness. And actually, at this point, trying to go to localhost from the server itself throws up a 'could not connect to localhost' error. If I go to localhost:8080 I get the Tomcat welcome page, and if I go to localhost:8080/app/public I get the intro page of our website. So I'm not sure what I'm even looking at in this case, that is, what the proper behavior should be.
    The second part of the problem is that if I do go to either the IP or localhost as above (localhost:8080/app/public) and click on our login link, it is supposed to take me to our login page, yet instead I receive a 'could not connect' error and the URL has changed to localhost:8443/app/secure. From my research I see that port 8443 is Tomcat's SSL port, and server.xml alludes to it as follows:
        <Connector port="8080" maxHttpHeaderSize="8192"
                   maxThreads="150" minSpareThreads="25" maxSpareThreads="75"
                   enableLookups="false" redirectPort="8443" acceptCount="100"
                   connectionTimeout="20000" disableUploadTimeout="true" />
    I have an SSL certificate assigned to the site via IIS and was under the impression that by default Tomcat allowed IIS to handle secure connections, but apparently something is munged because it's not working. There is another section in server.xml that reads like this:
        <Connector port="8009" enableLookups="false" redirectPort="443" protocol="AJP/1.3" />
    I'm not sure what that is for, although port 443 is the SSL port that IIS uses, so I'm confused as to what it is supposed to be doing. Another question I have is: when does the isapi_redirector actually come into play? How does it know when to try to serve pages through Tomcat and when not to? I've hunted around the 'net for an answer and have yet to find a clear explanation. Anyone have any pointers as to where to look for a solution to all of this?
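
    On the last question: the ISAPI redirector only forwards requests whose URLs match the mappings in its uriworkermap.properties file, sending them to Tomcat over the AJP connector (the port 8009 entry above); everything else is served by IIS itself. A minimal sketch of the two JK configuration files, assuming the standard Tomcat/IIS connector setup (the worker name is a placeholder):

        # uriworkermap.properties - which URLs go to Tomcat
        /app/*=worker1

        # workers.properties - where that worker lives
        worker.list=worker1
        worker.worker1.type=ajp13
        worker.worker1.host=localhost
        worker.worker1.port=8009

    With a mapping like that in place, plain http://www.mysite.com/app/public/ should reach Tomcat on port 80 without the :8080, since IIS hands the request across rather than Tomcat serving it directly.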

    Read the article

  • mac + parallels and https site test = router restarts

    - by Erik
    OK, I have an interesting and very frustrating problem, which I'll explain as best I can. I work as a graphic designer and web designer on a Mac and have a Comcast internet connection that comes through a Comcast-branded router (SMC8014), which then ties into an AirPort Extreme Base Station that runs my office network. I run OS 10.5.7 and also run Parallels 4.0.3 (running Windows XP) for testing websites in Internet Explorer and so on. OK, so that's the basic background. Here's my issue.
    I've been collaborating on an ecommerce website with another designer/developer, and when testing the site on the PC side we have started to run into some sort of network problem. The site is https, if that matters at all; I suspect it may. Basically, when I run Parallels for testing I am constantly having to restart the router in order to connect to the test site (it's hosted). The funny thing is I can access the rest of the internet fine, just not this site I'm working on, until I restart the router (it's sort of like the site is timing out). This never happens when just running the Mac side of things. It only becomes an issue when Parallels is open and I am doing page refreshes while making CSS or HTML edits via something like Coda or CSSEdit (connected to the hosting server via FTP). The real problem is that once the problem starts I only get about 2 or 3 page loads before I have to restart the router again. It's absolutely crippling; I cannot get any work done when I have to restart the router every couple of minutes.
    And if you think this problem is isolated to me, the answer is no. The designer/developer I'm collaborating with has an office a couple of miles away and experiences very similar problems under a slightly different setup. He also has Comcast as his internet provider, connects his router to an AirPort, and primarily works on a Mac. The main difference is that rather than using a virtualizer like Parallels to test the website on the PC, he uses a real live PC on his network. Once he fires up the PC to do testing he runs into the same issue described above: after a couple of page refreshes in Internet Explorer or another browser on the PC, the site becomes unresponsive and the router has to be restarted. Any thoughts on what is going on here would be greatly appreciated. Thanks in advance.

    Read the article

  • Why can't European users access my site?

    - by japancheese
    Hello, my site has been running just fine for the past couple of years, but all of a sudden, two days ago, European users started experiencing serious connection problems with the site. I really want to fix this for them, but what's the best way to figure out what the issue is? I have absolutely no connection problems with the site on my end, nor, it seems, do Asian or other American users. Using just-ping.com, some European servers come back with some packet loss. I tried doing some traceroutes from European servers to my own, but they all seemed to work just fine. I'd at least like to be able to tell users that if the problem does not lie with my server, then it lies somewhere out of my control, but I really want to figure out where the choke point is. Is there another way I might be able to find out why they can't seem to connect to the site? Just looking for ideas from people who have had a similar experience.

    Read the article

  • Consolidate multiple site files into single location

    - by seengee
    We have a custom PHP/MySQL CMS running on Linux/Apache that's rolled out to multiple sites (20+) on the same server. Each site uses exactly the same CMS files, with a few files customised per site. The customised files for each site are:
    /library/mysql_connect.php
    /public_html/css/*
    /public_html/ftparea/*
    /public_html/images/*
    There are also a couple of other random files inside /public_html/includes/ that are unique to each site. Other than this, each site on the server uses exactly the same files, with each site sitting within /home/username/. There is obviously a massive amount of replication here, as each time we want to deploy a system update we need to apply it to every user account. Given that the common site files are all stored in SVN, it would make far more sense if we were able to simply commit to SVN and deploy to a single location directly from there. Unfortunately, making a major architecture change at this stage could be problematic. In my mind the ideal scenario would mean creating an account like /home/commonfiles/ and having each site use these common files unless an account-specific file exists; for example, a request is made to /home/user/public_html/index.php, but as this file doesn't exist the request is then redirected to /home/commonfiles/public_html/index.php. I know that generally this approach is possible, similar to how Zend Framework (and probably others) redirect all requests that don't match a specific file to index.php. I'm just not sure how exactly to go about implementing it and whether it's actually advisable. Would really welcome any input/ideas people have got. Thanks.
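
    One way to get that fallback with Apache alone is a per-vhost mod_rewrite rule that tests whether the requested file exists under the account and otherwise serves the shared copy. A rough sketch, assuming mod_rewrite in the vhost and a shared tree at /home/commonfiles (both paths are placeholders, and the shared directory needs its own <Directory> access grant):

        RewriteEngine On
        # If the per-site copy of the file is missing...
        RewriteCond /home/username/public_html%{REQUEST_URI} !-f
        # ...serve the request from the shared CMS tree instead
        RewriteRule ^/?(.*)$ /home/commonfiles/public_html/$1 [L]

    Symlinking the truly common directories from each account into /home/commonfiles is a lower-tech alternative that achieves much the same thing.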

    Read the article

  • apache running but site not accessible

    - by Shyam
    I'm pretty new to server administration, so I am not able to get to the root of the problem. I am running Apache2 with mod_php on a 1GB Rackspace Cloud Server (Ubuntu 9.10). My site goes down often, and I have to restart apache2 to get it working again. I checked the "error.log" file and there were no signs of any error messages; I even searched for words like [error] / error / warn / [warn], but got no results. The site goes down even while Apache is still running: when the site was down, I checked the status with /etc/init.d/apache2 status and it gave "* Apache is running (pid 433)." Any suggestions on where I should look for the problem? Thanks a lot.

    Read the article

  • Access SharePoint 2010 site with Google Chrome Frame inside Internet Explorer 6

    - by aphoria
    Is it possible to use Google Chrome Frame in IE 6 to access a SharePoint 2010 site without displaying the WarnOnUnsupportedBrowsers message? The message I get is this: "Your Web browser will have problems displaying this web page. Changes to the site may not function properly. For a better experience, please update your browser to its latest version." Ideally, I'm hoping to only have to make changes to the client, but if the site itself needs to be modified I can probably get it done.

    Read the article

  • Help on using mod_rewrite to serve I18N static site

    - by Guandalino
    My static site www.example.com is translated into different languages, and the files are organized in this hierarchy:
    /
    /de
      index.html
      seite-1.html
    /en
      index.html
      page-1.html
    /it
      index.html
      pagina-1.html
    The root contains no files, just one subdirectory for each language the site is translated into, while each subdirectory contains pages whose content and file names are translated into the language matching the subdirectory name (de, en, it, etc.). The question is: how do I configure mod_rewrite so that when a client visits www.example.com it is taken to the correct version of the site, falling back to the English version if the required locale is not supported (i.e. the Accept-Language header doesn't exist or specifies a language the site is not available in, e.g. fr)? Thanks for any pointer, I'm here to provide further details or feedback. Best regards
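
    A deliberately simplified sketch of one way to do this in the site root's .htaccess: it only inspects the first language the browser lists, and sends anything it doesn't recognise to /en/ (mod_rewrite assumed enabled; add a block per supported language):

        RewriteEngine On
        # German and Italian browsers go to their translations
        RewriteCond %{HTTP:Accept-Language} ^de [NC]
        RewriteRule ^$ /de/ [R=302,L]
        RewriteCond %{HTTP:Accept-Language} ^it [NC]
        RewriteRule ^$ /it/ [R=302,L]
        # Everything else, including a missing header, falls back to English
        RewriteRule ^$ /en/ [R=302,L]

    Apache's mod_negotiation (Options MultiViews with language-suffixed files) is the other common route, but it requires renaming files rather than keeping per-language directories.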

    Read the article

  • Taking user out of MACHINENAME\Users group does not disallow them from authenticating with IIS site

    - by jayrdub
    I have a site that has anonymous access disabled and uses only IIS basic authentication. The site's home directory has only the MACHINENAME\Users group in its permissions. I have one user that I don't want to be able to log in to this site, so I thought all I would need to do is take that user out of the Users group, but doing so still allows him to authenticate. I know it is the Users group that is allowing authentication, because if I remove that group's permissions on the directory he is not allowed to log in. Is there something special about the Users group that means you are actually always a part of it? Is the only solution to revoke the Users group's permissions on the site's home directory and grant access to a new group that contains only the allowed users?

    Read the article

  • IIS7.5 is only resolving to the Default Web Site

    - by Dennis Burnham
    I am able to access only the Default Web Site on a Windows 2008 R2 server running IIS 7.5. The Default Web Site's binding is to "All Unassigned", the same way I have done it on a different machine running IIS 6 under Windows 2003 Server. The desired web site's bindings have the IP address of the server, and the site has the correct home directory. Regardless of what I do, the only page content I can see is the default index page in the wwwroot directory, which is the Default Web Site. What must I do to deliver the correct content from the web sites that are configured in IIS 7.5?

    Read the article

  • Add site to trusted sites through GPO

    - by Matt Bear
    I need to add a site to the Trusted Sites zone on all computers in my domain. I can do it with the "Site to Zone Assignment List" policy, but when I do, it locks Trusted Sites on the client computers ("this setting is managed by your administrator"). What I need is a way to add the site, make it persistent, and not affect the users' ability to add trusted sites of their own. (It's a development environment; sites are created and tested regularly, and the users need that ability.)

    Read the article

  • Continuous Flash music player while navigating site

    - by phx-zs
    I have a site that includes a Flash music player integrated into the layout, and I want users to be able to navigate around the site without interrupting the music. I've done plenty of research and thinking, and the following are the options I came up with (keeping in mind I want to be as SEO friendly as possible). Anyone have another idea?
    1. AJAX: I set up a version that swaps the main content div to whatever nav link they click, thereby not interrupting the Flash player. I set it up in the proper search-engine-friendly manner with direct links and jQuery/Ajax functions. If someone goes to site.com/ and clicks the Contact nav link, it loads what's in the main content div of site.com/contact.php into the main content div and changes the URL bar to site.com/#Contact. The same goes for if they go to site.com/contact.php and click About in the nav: it loads the About content and changes the URL bar to site.com/contact.php#About. Obviously this opens up a whole new can of worms with AJAX and hash navigation/history issues, and I would end up with people possibly linking to things like site.com/contact.php#About (which I think looks terrible and can't be too great for SEO).
    2. Store the Flash player vars somewhere and reload them with the page: I'm not sure how to go about this, but I thought about keeping my regular navigation without AJAX and having it so that when a user clicks a nav link, before it changes pages it stores the Flash player vars (current song and song position) somewhere, then loads them into Flash when the new page loads.
    3. Something with an iframe?
    4. A good alternative to a Flash player that would work for this type of application?
    Thanks!
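
    For the AJAX route, the usual pattern is to intercept the nav clicks, swap only the content region, and use the HTML5 history API instead of hash URLs, so the address bar keeps showing the real crawlable path. A minimal sketch with jQuery (the nav and #content selectors are placeholders for whatever the layout actually uses):

        // Swap the content region without reloading the page (or the Flash player)
        $('nav a').click(function (e) {
            e.preventDefault();
            var url = this.href;
            // Pull just the #content region of the target page into this one
            $('#content').load(url + ' #content > *', function () {
                history.pushState({ url: url }, '', url);  // keep a clean URL
            });
        });

        // Keep back/forward buttons working with the swapped content
        window.addEventListener('popstate', function (e) {
            if (e.state && e.state.url) {
                $('#content').load(e.state.url + ' #content > *');
            }
        });

    Browsers without pushState can simply be allowed to fall back to full page loads if the click handler checks for it first, which also keeps the direct links search-engine friendly.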

    Read the article

  • Blocking HTML and Javascript from being displayed on my site

    - by Tim Powell
    I am working on a new social networking site. One of its various functions is posting: you can post to Facebook and my site, or Twitter and my site. That being said, I couldn't help but try to post HTML while I was testing for SQL injection. When I did, I noticed that there were ways to manipulate the site, for instance using an element to completely screw up the CSS design, or redirecting a user to another site using JavaScript. I want to make my site a safe environment for my users, not a site that is used to distribute computer viruses, porn, and other things that might make someone tend to stay off of it. When I searched this topic, I found ways to "strip" the HTML out of the $post variable before submitting it to the database. However, I would just like to make it so you can post any text, including HTML and JavaScript, without the browser interpreting it as "run this..." code: I want to display it as plain text. I've seen it on Facebook, and when I looked at the source code of a post, it showed the <, /, and other markup characters as regular text. I tried "dissecting" Facebook's source code, but found nothing. I have tried using tags such as <pre> and <code>, but because of the lack of ability to style and control them, I gave up and went back to just allowing HTML. If anyone knows how to do this, please help me out. Thanks in advance, TP
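
    What sites like this typically do is not strip the markup at all but escape it on output, so the browser renders the characters as text instead of interpreting them. A minimal sketch in PHP, using $post as in the question:

        <?php
        // Convert <, >, &, and quotes to entities so <script> etc. display literally.
        // Escape at render time and keep the raw text in the database.
        $safe = htmlspecialchars($post, ENT_QUOTES, 'UTF-8');
        // Preserve the user's line breaks after escaping.
        echo nl2br($safe);

    If some HTML (say, links or bold text) should eventually be allowed, a whitelist filter such as HTML Purifier is the usual next step rather than hand-rolled stripping.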

    Read the article

  • Cross-domain structure of the site

    - by Coreal
    I have a web site on which I wish to enable instant messaging via IRC, and I am about to buy a VPS to host the ircd on. So I will have two servers: one for the web site and the other for the ircd. The question is: how do I use the web site's domain name for the ircd?

    Read the article

  • Ubuntu server 10.04 disconnects after short periods of inactivity on my site

    - by user57019
    I'm new to Ubuntu (installed it for the first time just a couple of days ago on my server). I have Ubuntu Server 10.04 and am just using the terminal, no GUI like GNOME. So far it's working pretty great except for one big thing. Whenever I go to sleep and there's no activity on my server (it's not a big site, so active users drop to 0 during the night), the server kind of disconnects, and the only thing that can bring the site back online is to restart the whole server. I've tried disabling power saving by using setterm, but that changes nothing. Even if I wake up the server by pressing a key, the site won't go back online. I've tried just restarting both Apache and MySQL (I'm using a LAMP server, btw) but not even that works. But as soon as I turn the power off and on at the server, everything works like normal for a few minutes of inactivity (~5-15 minutes, I'd guess) and then it's down again unless someone logs in to the site and is active. I was previously using XAMPP on my laptop with Windows XP and that worked 24/7, so I don't think it's anything to do with my router or ISP. This is driving me crazy! My site is down the whole time I'm at school, since I have no way to restart the server if it goes offline. Does anyone have a clue as to what could be wrong?

    Read the article

  • Time out when creating a site collection

    - by Daeko
    I am trying to create a site collection programmatically. It worked for about six months, but after the servers were updated (various patches) it doesn't work anymore (we have 3 servers: 1 development, 1 test, 1 production). It is still working in my development environment, which hasn't been updated, but not on the other two. I don't receive any error messages; it just hangs at the code that is supposed to add the site collection (see below). I am using Windows Server 2003 R2 and SharePoint 2007 (version 12.0.0.6421). It doesn't give me any errors, it just hangs until Internet Explorer comes back with a "request timed out" response. If I try to debug the code, it just stops there and nothing happens: no error messages or anything.
        public static string CreateSPAccountSite(string siteName)
        {
            string url = "";
            SPSecurity.RunWithElevatedPrivileges(delegate()
            {
                SPWeb web = SPContext.Current.Web;
                using (SPSite siteCollectionOuter = new SPSite(web.Site.ID))
                {
                    SPWebApplication webApp = siteCollectionOuter.WebApplication;
                    SPSiteCollection siteCollection = webApp.Sites;
                    SPSite site = siteCollection.Add("sites/" + siteName, siteName,
                        "Auto generated Site collection.", 1033, "STS#0",
                        siteCollectionOuter.Owner.LoginName,
                        siteCollectionOuter.Owner.Name,
                        siteCollectionOuter.Owner.Email); // Hangs here
                    site.PortalName = "Portal";
                    site.PortalUrl = mainUrl; // https://www.ourdomain.net
                    url = site.Url;
                }
            });
            return url; // Should be "https://www.outdomain.net/sites/siteName"
        }

    Read the article

  • Site on IIS 7.5 accessible via SSL on server, but not from local network

    - by bnieland
    I have a site set up on IIS 7.5 and added the following binding:
    Type: https
    Host Name: [Blank]
    Port: 443
    IP Address: 192.168.1.6
    Binding Information: [Blank]
    I can access the site via https://192.168.1.6 from the server itself. When I try to access the site from another machine (193.168.1.4) on the same subnet via https://192.168.1.6, I get no response, while http://192.168.1.6 returns the site as expected. I used Wireshark to examine the packets on the server, the first of which I have included as an image. There were two other packets, very similar to this one, and there was no response from the server. Any ideas?

    Read the article

  • How to Configure SSL on Particular Folder,Not overall Site in iis 7

    - by user66001
    I have a website, for example www.somesite.com, and I want to apply SSL settings to one particular folder in this site. I followed these steps: (1) import the certificate in IIS 7, (2) create the SSL binding, (3) select the folder in the website and set "Require SSL". Now my problem is that I want that one particular folder to require SSL, not the overall site. Following the steps above, I can browse the whole site with both http and https. I want the whole site to be browsed only over http and the particular folder to be browsed only over https, like:
    http://www.somesite.com
    https://www.somesite.com/somefolder
    Thanks in advance.
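
    Step 3 already covers the folder; the part IIS does not do on its own is keeping the rest of the site on plain http. For reference, the folder-level requirement can also be expressed in the site's web.config with a <location> block (a sketch only; the security/access section is often locked at server level, so it may need to be unlocked in applicationHost.config first):

        <configuration>
          <location path="somefolder">
            <system.webServer>
              <security>
                <!-- Reject plain-http requests for /somefolder with a 403.4 -->
                <access sslFlags="Ssl" />
              </security>
            </system.webServer>
          </location>
        </configuration>

    Note that this rejects http requests for the folder rather than redirecting them; an automatic http-to-https redirect for just that path would need a URL Rewrite rule along the lines of the ones in the article further down this page.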

    Read the article

  • Allow login from another site

    - by tunmise fasipe
    I have a web application where I store users' passwords and usernames. If you log on to this site, you can log in with your password and username to access your profile. There is another option that requires you to log in to my site from your own site and have your profile within your site, because you might already have a site that your clients know you by. This latter part is what I don't know how to implement. I have these ideas:
    1. A fixed IFrame within your site that contains my site: but I am concerned about size/layout, since different clients have different layouts/sizes for their content section, and I am also thinking about how to maintain the session.
    2. A web service: I don't know how feasible this is, since the password and ID are on my server. You may have to send them back and forth, and it means the client would have to code against my API. But I am not just returning data; I have to show them a page that contains the profile details.
    3. OpenID / single sign-on: just guessing, but the authentication and data reside on my server; there is nothing to access on your side in this case.
    4. Any other methods/better approach?
    Examples: like logging in to Facebook within my site and still being able to post updates and receive notifications. Facebook implements some of this with an IFrame, e.g. the Like button.

    Read the article

  • Tip/Trick: Fix Common SEO Problems Using the URL Rewrite Extension

    - by ScottGu
    Search engine optimization (SEO) is important for any publically facing web-site.  A large % of traffic to sites now comes directly from search engines, and improving your site’s search relevancy will lead to more users visiting your site from search engine queries.  This can directly or indirectly increase the money you make through your site. This blog post covers how you can use the free Microsoft URL Rewrite Extension to fix a bunch of common SEO problems that your site might have.  It takes less than 15 minutes (and no code changes) to apply 4 simple URL Rewrite rules to your site, and in doing so cause search engines to drive more visitors and traffic to your site.  The techniques below work equally well with both ASP.NET Web Forms and ASP.NET MVC based sites.  They also works with all versions of ASP.NET (and even work with non-ASP.NET content). [In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu] Measuring the SEO of your website with the Microsoft SEO Toolkit A few months ago I blogged about the free SEO Toolkit that we’ve shipped.  This useful tool enables you to automatically crawl/scan your site for SEO correctness, and it then flags any SEO issues it finds.  I highly recommend downloading and using the tool against any public site you work on.  It makes it easy to spot SEO issues you might have in your site, and pinpoint ways to optimize it further. Below is a simple example of a report I ran against one of my sites (www.scottgu.com) prior to applying the URL Rewrite rules I’ll cover later in this blog post:   Search Relevancy and URL Splitting Two of the important things that search engines evaluate when assessing your site’s “search relevancy” are: How many other sites link to your content.  Search engines assume that if a lot of people around the web are linking to your content, then it is likely useful and so weight it higher in relevancy. The uniqueness of the content it finds on your site.  If search engines find that the content is duplicated in multiple places around the Internet (or on multiple URLs on your site) then it is likely to drop the relevancy of the content. One of the things you want to be very careful to avoid when building public facing sites is to not allow different URLs to retrieve the same content within your site.  Doing so will hurt with both of the situations above.  In particular, allowing external sites to link to the same content with multiple URLs will cause your link-count and page-ranking to be split up across those different URLs (and so give you a smaller page rank than what it would otherwise be if it was just one URL).  Not allowing external sites to link to you in different ways sounds easy in theory – but you might wonder what exactly this means in practice and how you avoid it. 4 Really Common SEO Problems Your Sites Might Have Below are 4 really common scenarios that can cause your site to inadvertently expose multiple URLs for the same content.  When this happens external sites linking to yours will end up splitting their page links across multiple URLs - and as a result cause you to have a lower page ranking with search engines than you deserve. SEO Problem #1: Default Document IIS (and other web servers) supports the concept of a “default document”.  This allows you to avoid having to explicitly specify the page you want to serve at either the root of the web-site/application, or within a sub-directory.  
This is convenient – but means that by default this content is available via two different publically exposed URLs (which is bad).  For example: http://scottgu.com/ http://scottgu.com/default.aspx SEO Problem #2: Different URL Casings Web developers often don’t realize URLs are case sensitive to search engines on the web.  This means that search engines will treat the following links as two completely different URLs: http://scottgu.com/Albums.aspx http://scottgu.com/albums.aspx SEO Problem #3: Trailing Slashes Consider the below two URLs – they might look the same at first, but they are subtly different. The trailing slash creates yet another situation that causes search engines to treat the URLs as different and so split search rankings: http://scottgu.com http://scottgu.com/ SEO Problem #4: Canonical Host Names Sometimes sites support scenarios where they support a web-site with both a leading “www” hostname prefix as well as just the hostname itself.  This causes search engines to treat the URLs as different and split search rankling: http://scottgu.com/albums.aspx/ http://www.scottgu.com/albums.aspx/ How to Easily Fix these SEO Problems in 10 minutes (or less) using IIS Rewrite If you haven’t been careful when coding your sites, chances are you are suffering from one (or more) of the above SEO problems.  Addressing these issues will improve your search engine relevancy ranking and drive more traffic to your site. The “good news” is that fixing the above 4 issues is really easy using the URL Rewrite Extension.  This is a completely free Microsoft extension available for IIS 7.x (on Windows Server 2008, Windows Server 2008 R2, Windows 7 and Windows Vista).  The great thing about using the IIS Rewrite extension is that it allows you to fix the above problems *without* having to change any code within your applications.  You can easily install the URL Rewrite Extension in under 3 minutes using the Microsoft Web Platform Installer (a free tool we ship that automates setting up web servers and development machines).  Just click the green “Install Now” button on the URL Rewrite Spotlight page to install it on your Windows Server 2008, Windows 7 or Windows Vista machine: Once installed you’ll find that a new “URL Rewrite” icon is available within the IIS 7 Admin Tool: Double-clicking the icon will open up the URL Rewrite admin panel – which will display the list of URL Rewrite rules configured for a particular application or site: Notice that our rewrite rule list above is currently empty (which is the default when you first install the extension).  We can click the “Add Rule…” link button in the top-right of the panel to add and enable new URL Rewriting logic for our site.  Scenario 1: Handling Default Document Scenarios One of the SEO problems I discussed earlier in this post was the scenario where the “default document” feature of IIS causes you to inadvertently expose two URLs for the same content on your site.  For example: http://scottgu.com/ http://scottgu.com/default.aspx We can fix this by adding a new IIS Rewrite rule that automatically redirects anyone who navigates to the second URL to instead go to the first one.  We will setup the HTTP redirect to be a “permanent redirect” – which will indicate to search engines that they should follow the redirect and use the new URL they are redirected to as the identifier of the content they retrieve.  Let’s look at how we can create such a rule.  We’ll begin by clicking the “Add Rule” link in the screenshot above.  
This will cause the below dialog to display: We’ll select the “Blank Rule” template within the “Inbound rules” section to create a new custom URL Rewriting rule.  This will display an empty pane like below: Don’t worry – setting up the above rule is easy.  The following 4 steps explain how to do so: Step 1: Name the Rule Our first step will be to name the rule we are creating.  Naming it with a descriptive name will make it easier to find and understand later.  Let’s name this rule our “Default Document URL Rewrite” rule: Step 2: Setup the Regular Expression that Matches this Rule Our second step will be to specify a regular expression filter that will cause this rule to execute when an incoming URL matches the regex pattern.   Don’t worry if you aren’t good with regular expressions - I suck at them too. The trick is to know someone who is good at them or copy/paste them from a web-site.  Below we are going to specify the following regular expression as our pattern rule: (.*?)/?Default\.aspx$ This pattern will match any URL string that ends with Default.aspx. The "(.*?)" matches any preceding character zero or more times. The "/?" part says to match the slash symbol zero or one times. The "$" symbol at the end will ensure that the pattern will only match strings that end with Default.aspx.  Combining all these regex elements allows this rule to work not only for the root of your web site (e.g. http://scottgu.com/default.aspx) but also for any application or subdirectory within the site (e.g. http://scottgu.com/photos/default.aspx.  Because the “ignore case” checkbox is selected it will match both “Default.aspx” as well as “default.aspx” within the URL.   One nice feature built-into the rule editor is a “Test pattern” button that you can click to bring up a dialog that allows you to test out a few URLs with the rule you are configuring: Above I've added a “products/default.aspx” URL and clicked the “Test” button.  This will give me immediate feedback on whether the rule will execute for it.  Step 3: Setup a Permanent Redirect Action We’ll then setup an action to occur when our regular expression pattern matches the incoming URL: In the dialog above I’ve changed the “Action Type” drop down to be a “Redirect” action.  The “Redirect Type” will be a HTTP 301 Permanent redirect – which means search engines will follow it. I’ve also set the “Redirect URL” property to be: {R:1}/ This indicates that we want to redirect the web client requesting the original URL to a new URL that has the originally requested URL path - minus the "Default.aspx" in it.  For example, requests for http://scottgu.com/default.aspx will be redirected to http://scottgu.com/, and requests for http://scottgu.com/photos/default.aspx will be redirected to http://scottgu.com/photos/ The "{R:N}" regex construct, where N >= 0, is called a back-reference and N is the back-reference index. In the case of our pattern "(.*?)/?Default\.aspx$", if the input URL is "products/Default.aspx" then {R:0} will contain "products/Default.aspx" and {R:1} will contain "products".  We are going to use this {R:1}/ value to be the URL we redirect users to.  
Step 4: Apply and Save the Rule Our final step is to click the “Apply” button in the top right hand of the IIS admin tool – which will cause the tool to persist the URL Rewrite rule into our application’s root web.config file (under a <system.webServer/rewrite> configuration section): <configuration>     <system.webServer>         <rewrite>             <rules>                 <rule name="Default Document" stopProcessing="true">                     <match url="(.*?)/?Default\.aspx$" />                     <action type="Redirect" url="{R:1}/" />                 </rule>             </rules>         </rewrite>     </system.webServer> </configuration> Because IIS 7.x and ASP.NET share the same web.config files, you can actually just copy/paste the above code into your web.config files using Visual Studio and skip the need to run the admin tool entirely.  This also makes adding/deploying URL Rewrite rules with your ASP.NET applications really easy. Step 5: Try the Rule Out Now that we’ve saved the rule, let’s try it out on our site.  Try the following two URLs on my site: http://scottgu.com/ http://scottgu.com/default.aspx Notice that the second URL automatically redirects to the first one.  Because it is a permanent redirect, search engines will follow the URL and should update the page ranking of http://scottgu.com to include links to http://scottgu.com/default.aspx as well. Scenario 2: Different URL Casing Another common SEO problem I discussed earlier in this post is that URLs are case sensitive to search engines on the web.  This means that search engines will treat the following links as two completely different URLs: http://scottgu.com/Albums.aspx http://scottgu.com/albums.aspx We can fix this by adding a new IIS Rewrite rule that automatically redirects anyone who navigates to the first URL to instead go to the second (all lower-case) one.  Like before, we will setup the HTTP redirect to be a “permanent redirect” – which will indicate to search engines that they should follow the redirect and use the new URL they are redirected to as the identifier of the content they retrieve. To create such a rule we’ll click the “Add Rule” link in the URL Rewrite admin tool again.  This will cause the “Add Rule” dialog to appear again: Unlike the previous scenario (where we created a “Blank Rule”), with this scenario we can take advantage of a built-in “Enforce lowercase URLs” rule template.  When we click the “ok” button we’ll see the following dialog which asks us if we want to create a rule that enforces the use of lowercase letters in URLs: When we click the “Yes” button we’ll get a pre-written rule that automatically performs a permanent redirect if an incoming URL has upper-case characters in it – and automatically send users to a lower-case version of the URL: We can click the “Apply” button to use this rule “as-is” and have it apply to all incoming URLs to our site.  Because my www.scottgu.com site uses ASP.NET Web Forms, I’m going to make one small change to the rule we generated above – which is to add a condition that will ensure that URLs to ASP.NET’s built-in “WebResource.axd” handler are excluded from our case-sensitivity URL Rewrite logic.  URLs to the WebResource.axd handler will only come from server-controls emitted from my pages – and will never be linked to from external sites.  While my site will continue to function fine if we redirect these URLs to automatically be lower-case – doing so isn’t necessary and will add an extra HTTP redirect to many of my pages.  
The good news is that adding a condition that prevents my URL Rewriting rule from happening with certain URLs is easy.  We simply need to expand the “Conditions” section of the form above We can then click the “Add” button to add a condition clause.  This will bring up the “Add Condition” dialog: Above I’ve entered {URL} as the Condition input – and said that this rule should only execute if the URL does not match a regex pattern which contains the string “WebResource.axd”.  This will ensure that WebResource.axd URLs to my site will be allowed to execute just fine without having the URL be re-written to be all lower-case. Note: If you have static resources (like references to .jpg, .css, and .js files) within your site that currently use upper-case characters you’ll probably want to add additional condition filter clauses so that URLs to them also don’t get redirected to be lower-case (just add rules for patterns like .jpg, .gif, .js, etc).  Your site will continue to work fine if these URLs get redirected to be lower case (meaning the site won’t break) – but it will cause an extra HTTP redirect to happen on your site for URLs that don’t need to be redirected for SEO reasons.  So setting up a condition clause makes sense to add. When I click the “ok” button above and apply our lower-case rewriting rule the admin tool will save the following additional rule to our web.config file: <configuration>     <system.webServer>         <rewrite>             <rules>                 <rule name="Default Document" stopProcessing="true">                     <match url="(.*?)/?Default\.aspx$" />                     <action type="Redirect" url="{R:1}/" />                 </rule>                 <rule name="Lower Case URLs" stopProcessing="true">                     <match url="[A-Z]" ignoreCase="false" />                     <conditions logicalGrouping="MatchAll" trackAllCaptures="false">                         <add input="{URL}" pattern="WebResource.axd" negate="true" />                     </conditions>                     <action type="Redirect" url="{ToLower:{URL}}" />                 </rule>             </rules>         </rewrite>     </system.webServer> </configuration> Try the Rule Out Now that we’ve saved the rule, let’s try it out on our site.  Try the following two URLs on my site: http://scottgu.com/Albums.aspx http://scottgu.com/albums.aspx Notice that the first URL (which has a capital “A”) automatically does a redirect to a lower-case version of the URL.  Scenario 3: Trailing Slashes Another common SEO problem I discussed earlier in this post is the scenario of trailing slashes within URLs.  The trailing slash creates yet another situation that causes search engines to treat the URLs as different and so split search rankings: http://scottgu.com http://scottgu.com/ We can fix this by adding a new IIS Rewrite rule that automatically redirects anyone who navigates to the first URL (that does not have a trailing slash) to instead go to the second one that does.  Like before, we will setup the HTTP redirect to be a “permanent redirect” – which will indicate to search engines that they should follow the redirect and use the new URL they are redirected to as the identifier of the content they retrieve.  To create such a rule we’ll click the “Add Rule” link in the URL Rewrite admin tool again.  This will cause the “Add Rule” dialog to appear again: The URL Rewrite admin tool has a built-in “Append or remove the trailing slash symbol” rule template.  
When we select it and click the “ok” button we’ll see the following dialog which asks us if we want to create a rule that automatically redirects users to a URL with a trailing slash if one isn’t present: Like within our previous lower-casing rewrite rule we’ll add one additional condition clause that will exclude WebResource.axd URLs from being processed by this rule.  This will avoid an unnecessary redirect for happening for those URLs. When we click the “OK” button we’ll get a pre-written rule that automatically performs a permanent redirect if the URL doesn’t have a trailing slash – and if the URL is not processed by either a directory or a file.  This will save the following additional rule to our web.config file: <configuration>     <system.webServer>         <rewrite>             <rules>                 <rule name="Default Document" stopProcessing="true">                     <match url="(.*?)/?Default\.aspx$" />                     <action type="Redirect" url="{R:1}/" />                 </rule>                 <rule name="Lower Case URLs" stopProcessing="true">                     <match url="[A-Z]" ignoreCase="false" />                     <conditions logicalGrouping="MatchAll" trackAllCaptures="false">                         <add input="{URL}" pattern="WebResource.axd" negate="true" />                     </conditions>                     <action type="Redirect" url="{ToLower:{URL}}" />                 </rule>                 <rule name="Trailing Slash" stopProcessing="true">                     <match url="(.*[^/])$" />                     <conditions logicalGrouping="MatchAll" trackAllCaptures="false">                         <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />                         <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />                         <add input="{URL}" pattern="WebResource.axd" negate="true" />                     </conditions>                     <action type="Redirect" url="{R:1}/" />                 </rule>             </rules>         </rewrite>     </system.webServer> </configuration> Try the Rule Out Now that we’ve saved the rule, let’s try it out on our site.  Try the following two URLs on my site: http://scottgu.com http://scottgu.com/ Notice that the first URL (which has no trailing slash) automatically does a redirect to a URL with the trailing slash.  Because it is a permanent redirect, search engines will follow the URL and update the page ranking. Scenario 4: Canonical Host Names The final SEO problem I discussed earlier are scenarios where a site works with both a leading “www” hostname prefix as well as just the hostname itself.  This causes search engines to treat the URLs as different and split search rankling: http://www.scottgu.com/albums.aspx http://scottgu.com/albums.aspx We can fix this by adding a new IIS Rewrite rule that automatically redirects anyone who navigates to the first URL (that has a www prefix) to instead go to the second URL.  Like before, we will setup the HTTP redirect to be a “permanent redirect” – which will indicate to search engines that they should follow the redirect and use the new URL they are redirected to as the identifier of the content they retrieve.  To create such a rule we’ll click the “Add Rule” link in the URL Rewrite admin tool again.  This will cause the “Add Rule” dialog to appear again: The URL Rewrite admin tool has a built-in “Canonical domain name” rule template.  
When we select it and click the “ok” button we’ll see the following dialog which asks us if we want to create a redirect rule that automatically redirects users to a primary host name URL: Above I’m entering the primary URL address I want to expose to the web: scottgu.com.  When we click the “OK” button we’ll get a pre-written rule that automatically performs a permanent redirect if the URL has another leading domain name prefix.  This will save the following additional rule to our web.config file: <configuration>     <system.webServer>         <rewrite>             <rules>                 <rule name="Cannonical Hostname">                     <match url="(.*)" />                     <conditions logicalGrouping="MatchAll" trackAllCaptures="false">                         <add input="{HTTP_HOST}" pattern="^scottgu\.com$" negate="true" />                     </conditions>                     <action type="Redirect" url="http://scottgu.com/{R:1}" />                 </rule>                 <rule name="Default Document" stopProcessing="true">                     <match url="(.*?)/?Default\.aspx$" />                     <action type="Redirect" url="{R:1}/" />                 </rule>                 <rule name="Lower Case URLs" stopProcessing="true">                     <match url="[A-Z]" ignoreCase="false" />                     <conditions logicalGrouping="MatchAll" trackAllCaptures="false">                         <add input="{URL}" pattern="WebResource.axd" negate="true" />                     </conditions>                     <action type="Redirect" url="{ToLower:{URL}}" />                 </rule>                 <rule name="Trailing Slash" stopProcessing="true">                     <match url="(.*[^/])$" />                     <conditions logicalGrouping="MatchAll" trackAllCaptures="false">                         <add input="{REQUEST_FILENAME}" matchType="IsDirectory" negate="true" />                         <add input="{REQUEST_FILENAME}" matchType="IsFile" negate="true" />                         <add input="{URL}" pattern="WebResource.axd" negate="true" />                     </conditions>                     <action type="Redirect" url="{R:1}/" />                 </rule>             </rules>         </rewrite>     </system.webServer> </configuration> Try the Rule Out Now that we’ve saved the rule, let’s try it out on our site.  Try the following two URLs on my site: http://www.scottgu.com/albums.aspx http://scottgu.com/albums.aspx Notice that the first URL (which has the “www” prefix) now automatically does a redirect to the second URL which does not have the www prefix.  Because it is a permanent redirect, search engines will follow the URL and update the page ranking. 4 Simple Rules for Improved SEO The above 4 rules are pretty easy to setup and should take less than 15 minutes to configure on existing sites you already have.  The beauty of using a solution like the URL Rewrite Extension is that you can take advantage of it without having to change code within your web-site – and without having to break any existing links already pointing at your site.  Users who follow existing links will be automatically redirected to the new URLs you wish to publish.  And search engines will start to give your site a higher search relevancy ranking – which will list your site higher in search results and drive more traffic to it. 
Customizing your URL Rewriting rules further is easy to-do either by editing the web.config file directly, or alternatively, just double click the URL Rewrite icon within the IIS 7.x admin tool and it will list all the active rules for your web-site or application: Clicking any of the rules above will open the rules editor back up and allow you to tweak/customize/save them further. Summary Measuring and improving SEO is something every developer building a public-facing web-site needs to think about and focus on.  If you haven’t already, download and use the SEO Toolkit to analyze the SEO of your sites today. New URL Routing features in ASP.NET MVC and ASP.NET Web Forms 4 make it much easier to build applications that have more control over the URLs that are published.  Tools like the URL Rewrite Extension that I’ve talked about in this blog post make it much easier to improve the URLs that are published from sites you already have built today – without requiring you to change a lot of code. The URL Rewrite Extension provides a bunch of additional great capabilities – far beyond just SEO - as well.  I’ll be covering these additional capabilities more in future blog posts. Hope this helps, Scott

    Read the article
