Search Results

Search found 20157 results on 807 pages for 'friendly url'.


  • URL rewrite and domain frame

    - by Dennis
    I have registered the domain www.posti.sh at nic.sh. The website itself lives on the server at www.myskoob.com/postish. Unfortunately, nic.sh does not support frames, i.e. it cannot keep the domain as posti.sh while forwarding to www.myskoob.com/postish - so I thought about a URL rewrite on the server. I have no idea how rewriting works - explanations are welcome - but I would also like to ask whether this is generally possible. What I need is:
    - The server needs to recognize that the folder postish is being accessed.
    - Depending on the file that is opened, it needs to rewrite the URL to www.posti.sh/ followed by the corresponding filename.
    - The server needs to understand that a link to www.posti.sh/about.php maps to www.myskoob.com/postish/about.php, and likewise for other files - at the moment, when I type in posti.sh/about.php it redirects to http://www.myskoob.com/postishabout.php, which does not exist.
    - All this should work irrespective of whether the URL starts with "www" or not.
    - A plus, but not necessary, would be that it does not display the .php extensions.
    Would that generally be possible? If not, what would be the alternatives? If anyone knows how to do it, any code and/or approach would be much appreciated!
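    For reference, a minimal sketch of the kind of .htaccess rules that could do this, assuming posti.sh can be pointed at the same Apache server via a DNS A/CNAME record (rather than nic.sh's forwarding) and that mod_rewrite is available; host and folder names are taken from the question, the rest is illustrative and untested:

        # .htaccess in the document root of the server that also serves www.myskoob.com
        RewriteEngine On

        # Optional nicety: serve extension-less URLs such as posti.sh/about by
        # internally mapping them to the matching .php file in /postish
        RewriteCond %{HTTP_HOST} ^(www\.)?posti\.sh$ [NC]
        RewriteCond %{DOCUMENT_ROOT}/postish/$1.php -f
        RewriteRule ^([^.]+?)/?$ /postish/$1.php [L]

        # Serve everything else requested under posti.sh (with or without www)
        # from the /postish folder, without changing the address bar
        RewriteCond %{HTTP_HOST} ^(www\.)?posti\.sh$ [NC]
        RewriteCond %{REQUEST_URI} !^/postish/
        RewriteRule ^(.*)$ /postish/$1 [L]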

    Read the article

  • Present a static page URL as a different URL which is SEO-friendly

    - by Gaurav Sharma
    Hi, I have developed a site which has some static pages, like explore, home, and feedback. The links for these currently look as follows: website.com/views/explore.php, website.com/index.php, website.com/views/feedback.php. I want to serve a different, SEO-friendly URL for each of the URLs mentioned above. Is that possible? For example, website.com/views/explore.php should be converted/visible as website.com/explore, website.com/views/feedback.php should be converted/visible as website.com/give/feedback, and so on.
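    A hedged sketch of how this is commonly done with Apache mod_rewrite, in a .htaccess file at the site root (paths come from the question; the exact patterns are illustrative and untested):

        RewriteEngine On

        # Internal rewrites: the browser keeps the clean URL, the old script still runs
        RewriteRule ^explore/?$        views/explore.php   [L]
        RewriteRule ^give/feedback/?$  views/feedback.php  [L]
        # the home page can simply stay at / (index.php)

    The site's own links would then point at /explore and /give/feedback, and the old /views/*.php addresses can additionally be 301-redirected to the new ones so search engines only ever see one form of each page.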

    Read the article

  • Support the same URL in different forms

    - by Vikas Gulati
    I have seen many websites supporting multiple forms of the same URL. For example, www.example.com/question1/ OR www.example.com/Question1/ OR www.example.com/QUESTION1/ etc. all lead to one page, with say <link rel="canonical" href="http://www.example.com/question1/" > or a 301 redirect to www.example.com/question1/. Does this affect the page rank in any way, or is it just for a seamless user experience, or is there some other reason behind this? In fact, even stackoverflow/stackexchange does this. No matter what the text after the ID of the question in the URL is, they redirect you to the correct question!
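    For what it's worth, the 301 variant of this is typically implemented with a case-normalising rewrite; a sketch for Apache (this RewriteMap form has to live in the server or vhost config, not in .htaccess):

        RewriteEngine On
        RewriteMap lowercase int:tolower

        # any URL path containing an upper-case letter gets a 301 to its lower-case form
        RewriteCond %{REQUEST_URI} [A-Z]
        RewriteRule ^(.*)$ ${lowercase:$1} [R=301,L]

    IIS URL Rewrite offers the same idea through its built-in ToLower() function.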

    Read the article

  • Avoid SEO loss after URL structure change

    - by Eric Nguyen
    We recently re-wrote our site, moving from Umbraco to WordPress. This was done by third-party developers. I was the project manager, and it is my mistake that I hadn't noticed until now that the URLs changed in a way that affects SEO. The new site was launched last Thursday. The old URL for a "place" (a WordPress custom post type, in case you're a WordPress expert and want/need to point me to another discussion on WP Stackexchange) page was as follows: ourdomain.com/singapore/central/alexandra/an-interesting-place. Now it has been changed to ourdomain.com/places/an-interesting-place. I have already asked the third-party developers to work on rewriting the URLs to emulate the old URL structure. However, it's taking quite a lot of time (we have multiple custom post types, e.g. events etc., so it might be complicated; the developers seemed rather unclear about it when I first mentioned rewriting URLs for the custom post types). In the meantime, I wonder if there is a quicker workaround:

    1) Use .htaccess to rewrite ourdomain.com/singapore/central/alexandra/an-interesting-place to ourdomain.com/places/an-interesting-place. This should avoid 90% of the loss of search traffic. I suppose I can learn how to do this quite quickly, but there's no harm mentioning it here.

    2) Use rel="canonical" to indicate that ourdomain.com/places/an-interesting-place is the exact duplicate of ourdomain.com/singapore/central/alexandra/an-interesting-place.

    I will definitely go for both approaches (and I'm also changing the 404 page to cater for this temporary issue), but I wonder whether 2) is even feasible and whether I have missed anything. Is there anything else you could recommend in this situation? Let me know if my question is unclear anywhere.

    Clarifications: The old website is on a Windows Server EC2 instance completely separate from the Linux EC2 instance the new site is running on. In addition, the same domain "ourdomain.com" is used for both (an A record points it to an EC2 Elastic IP). Therefore, the old server is completely inaccessible at the moment, unless we use the IP address of the old server (which doesn't help me at all in this case). Even if the old server were accessible, I can't see where one could put the .htaccess or an HTML file to do a 301 redirect there. Unless I'm successful with approach 1) or the developers can rewrite the URLs in code, the 404 page is really my only choice.
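    As a rough illustration of option 1), a .htaccess rule on the new (Linux/WordPress) server could 301 the old location-style URLs to the new /places/ form. The pattern below assumes the old URLs always had exactly three location segments before the post slug; that assumption, and the list of paths to exclude, would need checking against the real data, and other custom post types are deliberately ignored here:

        RewriteEngine On

        # e.g. /singapore/central/alexandra/an-interesting-place  ->  /places/an-interesting-place
        RewriteCond %{REQUEST_URI} !^/(places|wp-admin|wp-content|wp-includes)/
        RewriteRule ^[^/]+/[^/]+/[^/]+/([^/]+)/?$ /places/$1 [R=301,L]

    On point 2), rel="canonical" would have to be served from the old URLs, but those now return 404 and carry no page at all, so the 301 is what actually preserves the old links' value.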

    Read the article

  • SEO URL Structure

    - by Neil
    Based on the following example URL structure: mysite.com/mypage.aspx?a=red&b=green&c=blue. Pages in the application use ASP.NET user controls, and some of these controls build a query string. To prevent duplicate keys being created, e.g. &pid=12&pid=10, I am researching methods of rewriting the URL:
    a) mysite.com/mypage.aspx/red/green/blue
    b) mysite.com/mypage.aspx?controlname=a,red|b,green|c,blue
    Pages using this structure would be publishing content that I would like to get indexed and ranked - articles and products (8,000 products to start, with thousands more being added later). My gut instinct tells me to go with the first method, but would it be overkill to add all that infrastructure if the second method will accomplish my goal of getting pages indexed AND ranked? So my question, looking at the pros and cons (Google ranking, time to implement, etc.): which method should I use? Thanks!

    Read the article

  • Make friendly URL in ASP.NET

    - by Khou
    How do I give my web app friendly URLs? Currently my app's URLs look like this: http://www.domain.com/Page.aspx?article=103, but I would like the URL to look like this: http://www.domain.com/Page.aspx?Google-likes-url-friendly. What would I need to do?

    Read the article

  • Rewriting URL to get Wordpress "permalink" type URL

    - by user1472575
    I would like my users to enter http://mywebsite.com/the-name-of-my-post and have the following execute: http://mywebsite.com/Default.aspx#&&the-name-of-my-post ...which is what the ScriptManager generates at runtime. I have created an ASP.NET site to replace a WordPress site that creates "permalinks". That site was around for about 2 years, so there are lots of bookmarks and references to these "permalinks" in the search engines etc. Also, are there any modules I have to include in my website to get this to work? Is there any configuration that I have to ask my hosting company to make so that this works?

    Read the article

  • Overview of getting and setting the URL and parts of the URL using angularjs and/or Javascript

    - by Sandy Good
    Getting and setting the URL, and different parts of the URL, are a basic part of application design: for page navigation, deep linking, providing a link to the user, querying data, and passing information to other pages. Both angularjs and JavaScript provide ways to get/set the URL and parts of the URL. I'm looking for the following information.

    Situation: Show a simple URL in the browser address bar to the user, but provide a more detailed URL with string parameters to the page that the user will not see. In other words, two different URLs would be used: a simple one that the user sees in the browser, and a more detailed one available to the page on load. Get URL info with PHP when the page initially loads, but don't reload the PHP page when the user needs more detailed info that is already loaded but not displayed yet. Set the URL with a more detailed URL for deep linking as the user drills down to more specific information. Get URL info in a controller or JavaScript when angularjs detects a change in the URL with routing.

    Hash or query string or both? Should I use a hash # in the URL, a query string, or both? Here is what I currently know and what I want:
    - A query string, http://www.name.com?mykey=itemID, will prevent angularjs from reloading the page. So I can change the URL by adding/changing the string at the end, thereby providing new info to the page, and keep the page from reloading.
    - I can change the URL and force a page reload with: window.location.href = "#Store/" + argUserPubId + "?itemID=home";
    - If home is the itemID string, I want the code to simply load the page and not display more detailed information. If there is a real itemID in the URL query string, I want the code to display the more detailed information.
    - Code from angularjs will run either from the controller specified in the routing, or from a controller specified in the HTML, or both. The angularjs code specified in the routing seems to run first, before the code specified in the HTML.
    - A different URL can be used for the page in the angularjs templateUrl: than the URL that was sent to the browser address bar: when('/Store/:StoreId', { templateUrl: function(params){return 'Client_Pages/Stores.php?storeID=' + params.StoreId;}, controller: 'storeParseData' }). This detects http://www.name.com/Store/StoreID in the browser, but SENDS http://www.name.com/Client_Pages/Stores.php?storeID=StoreID to the page. A function is used for the routing templateUrl: to set the template URL dynamically.

    So, when the user clicks something to see the details of an item, how should I configure the URL? Should I use angularjs $location or window.location.href? Should I use a longer URL with more parameters, a hash bang, or a query string? Should I use http://www.name.com/Store/StoreID/ItemID or http://www.name.com/Store/StoreID#ItemID or http://www.name.com/Store/StoreID?ItemID or http://www.name.com/Store#StoreID?ItemID or something else?

    Read the article

  • How to handle existing indexed mixed-case URLs?

    - by marcusstarnes
    I have an ASP.NET Web Forms application that has been live for a number of years and as such has quite a lot of indexed content on Google. Ideally, I'd prefer all URLs for the website to be lowercase, but I understand that having 2 versions of the same content indexed in search engines (MixedCase.aspx and mixedcase.aspx) will be bad for SEO. I was wondering:
    a) Should I just leave everything in its current mixed-case form and never change it? OR
    b) I can change the code so everything is lowercase from here on in, BUT is there a way of doing this so that the search engines are aware of the change and don't penalise me?

    Read the article

  • ASP.NET MVC support for URLs with hyphens

    - by John
    Is there an easy way to get the MvcRouteHandler to convert all hyphens in the action and controller sections of an incoming URL to underscores, as hyphens are not supported in method or class names? This would be so that I could support structures such as sample.com/test-page/edit-details mapping to action edit_details and controller test_pagecontroller, while continuing to use the MapRoute method. I understand I can specify an action name attribute and support hyphens in controller names without manually adding routes to achieve this; however, I am looking for an automated way, to avoid errors when adding new controllers and actions.

    Read the article

  • Using subdomains or directories for main categories?

    - by Matthieu
    I have a website which references places to travel to around the world. Those places are (of course) grouped by country. Here is an example of an actual URL on my website: http://awespot.org/country/105/iceland I am wondering if it would be better, in terms of SEO, to have the countries separated into subdomains: http://iceland.awespot.org/ I know subdomains are considered by Google as different websites, so I am weighing the 2 options: separating would mean also splitting the PageRank and the benefit of links to the website, but separating would enable me to create a "web" of related websites (all related to travel) that link to and benefit each other. I am only asking about SEO here; I know there are other questions raised by this possibility (user experience, administration... even password completion by web browsers).

    Read the article

  • Advanced Rails routing of short URLs and usernames off of the root URL

    - by Michael Waxman
    I want to have username URLs and Base 58 short URLs to resources, both off of the root URL, like this: http://mydomain.com/username #=> goes to the given user; http://mydomain.com/a3x9 #=> goes to the given story. I am aware of the possibility of usernames conflicting with short URLs, and I have a workaround, but what I can't figure out is the best way to set this up in Rails. Can I do it in the Rails routes? Should I do something with a piece of Rack middleware? Should I set up a routing controller? Please let me know the best way to do this. Thanks so much!

    Read the article

  • Apache URL rewriting doesn't work (using GoDaddy hosting)

    - by AzizAG
    I'm using a framework (CodeIgniter) to create my website. By default the URLs look like this: mysite.com/index.php?/etc/etc/etc, and I'm trying to remove the index.php?. I tried to remove it by doing this (it didn't work):
    RewriteEngine on
    RewriteCond %{REQUEST_FILENAME} !-f
    RewriteCond %{REQUEST_FILENAME} !-d
    RewriteRule ^(.*)$ /index.php?$1 [L]
    Note: it works on my localhost (when putting my website's files in the root directory). So, is this issue on my side or with the hosting company (GoDaddy)?
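    For comparison, the .htaccess variant that tends to work on shared hosts such as GoDaddy differs mainly in passing the path after a slash (path info) rather than after the question mark; treat this as an untested sketch:

        RewriteEngine On
        RewriteBase /
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        # slash (path info) instead of "?": some hosts never populate the query-string form
        RewriteRule ^(.*)$ index.php/$1 [L]

    On the CodeIgniter side this pairs with $config['index_page'] = ''; in application/config/config.php, and if URLs still misbehave, changing $config['uri_protocol'] from 'AUTO' to 'PATH_INFO' or 'REQUEST_URI' is a common companion fix.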

    Read the article

  • Masking a domain URL with a subdomain URL

    - by Vijay Kumar
    Hi, I have a subdomain (abc.example.in), but it currently points to mydomain/folder. The thing is, I want to mask the URL links so that the masked URL is shown under the subdomain. Can anyone help me out? I.e. mydomain/folder/path should appear as abc.example.in/path.

    Read the article

  • MODX friendly URLs on nginx + FPM + PHP 5.3 - friendly URLs not working

    - by okdan
    Hi, I'm using PHP 5.3 on nginx 0.8.53 with FPM, running MODX Revolution. I'm trying to get "friendly URLs" to work, but all I get is 404s. In the MODX config, friendly URLs is set to yes and friendly aliases is set to no (so it drops the suffix). My config file:
    server {
        listen 80;
        server_name .mydomain.net;
        # index index.php;
        root /home/mylogin/htdocs;
        location / {
            index index.php index.html;
            if (!-e $request_filename) {
                rewrite ^/(.*)$ /index.php?q=$1 last;
            }
        }
        # serve static files directly
        location ~* ^.+\.(jpg|jpeg|gif|css|png|js|ico)$ {
            root /home/mylogin/htdocs;
            access_log off;
            expires 30d;
            break;
        }
    }
    FastCGI MODX file:
    fastcgi_connect_timeout 60;
    fastcgi_send_timeout 300;
    fastcgi_buffers 4 32k;
    fastcgi_busy_buffers_size 32k;
    fastcgi_temp_file_write_size 32k;
    fastcgi_pass 127.0.0.1:9000;
    fastcgi_index index.php;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_ignore_client_abort on;
    fastcgi_intercept_errors on;
    fastcgi_read_timeout 300;
    fastcgi_param QUERY_STRING $query_string;
    fastcgi_param REQUEST_METHOD $request_method;
    fastcgi_param CONTENT_TYPE $content_type;
    fastcgi_param CONTENT_LENGTH $content_length;
    fastcgi_param SCRIPT_NAME $fastcgi_script_name;
    fastcgi_param REQUEST_URI $request_uri;
    fastcgi_param DOCUMENT_URI $document_uri;
    fastcgi_param DOCUMENT_ROOT $document_root;
    fastcgi_param SERVER_PROTOCOL $server_protocol;
    fastcgi_param GATEWAY_INTERFACE CGI/1.1;
    fastcgi_param SERVER_SOFTWARE nginx/$nginx_version;
    fastcgi_param REMOTE_ADDR $remote_addr;
    fastcgi_param REMOTE_PORT $remote_port;
    fastcgi_param SERVER_ADDR $server_addr;
    fastcgi_param SERVER_PORT $server_port;
    fastcgi_param SERVER_NAME $server_name;
    fastcgi_param REDIRECT_STATUS 200;

    Read the article

  • The Mysterious ARR Server Farm to URL Rewrite link

    - by OWScott
    Application Request Routing (ARR) is a reverse proxy plug-in for IIS7+ that does many things, including functioning as a load balancer. For this post, I'm assuming that you already have an understanding of ARR. Today I wanted to find out how the mysterious link between ARR and URL Rewrite is maintained. Let me explain...

    ARR is unique in that it doesn't work by itself. It sits on top of IIS7 and uses URL Rewrite. As a result, ARR depends on URL Rewrite to 'catch' the traffic and redirect it to an ARR Server Farm. As the last step of creating a new Server Farm, ARR will prompt you about creating the matching URL Rewrite rule. If you accept the prompt, it will create a URL Rewrite rule for you. If you say 'No', then you're on your own to create a URL Rewrite rule. When you say 'Yes', the Server Farm's checkbox for "Use URL Rewrite to inspect incoming requests" will be checked.

    However, I'm not a fan of this auto-rule. The problem is that if I make any changes to the URL Rewrite rule, which I always do, and then make the wrong change in ARR, it will blow away my settings. So, I prefer to create my own rule and manage it myself.

    Since I had some old rules that were managed by ARR, I wanted to update them so that they were no longer managed that way. I took a look at the config in applicationHost.config to try to find out what property would bind the two together. I assumed that there would be a property on the ServerFarm called something like urlRewriteRuleName that would serve as the link between ARR and URL Rewrite. I found no such property. After a bit of testing, I found that the name of the URL Rewrite rule is the only link between ARR and URL Rewrite. I wouldn't have guessed. The URL Rewrite rule needs to be exactly ARR_{ServerFarm Name}_loadBalance, although it's not case sensitive.

    Consider the auto-created URL Rewrite rule and the link between ARR and URL Rewrite that it establishes. As soon as I rename that rule to anything else, for example, site.com ARR Binding, the link between ARR and URL Rewrite is broken. To be certain of the relationship, I renamed it back again and sure enough, the relationship was reestablished.

    Why is this important? It's only important if you want to decouple the relationship between ARR and the URL Rewrite rule, but if you want to do so, the best way is to rename the URL Rewrite rule. If you uncheck the "Use URL Rewrite to inspect incoming requests" checkbox, it will delete your rule for you without prompting.

    Conclusion: The mysterious link between ARR and URL Rewrite only exists through the rule name. If you want to break the link, simply rename the URL Rewrite rule. It's completely safe to do so, and, in my opinion, this is a rule that you should manage yourself anyway.

    Read the article

  • Are there disadvantages to a literal + instead of an encoded + (%2B) in a URL?

    - by M_rk
    A client of mine has a product whose name ends with a plus sign (e.g. Google+) and would like the product's webpage to have a URL that is human-readable (i.e. a URL that doesn't contain %2B). Since our projects use the following .htaccess RewriteRule: RewriteRule ^(.*)$ index.php?$1 it is possible to use a URL-encoded space in a URL like that. However, while the URL would read like /google+, the actual meaning of the URL would be /google[space]. (The markup won't let me place a real space there.) Now my concern is that this would have disadvantages for SEO. Is this concern valid, and/or are there other pitfalls to this approach?
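    One detail worth knowing with that rule: the + survives the path match, but once $1 is pasted into the query string, query-string parsing decodes + as a space, which is exactly the /google[space] effect described above. A hedged workaround, if the plus should arrive intact, is mod_rewrite's B flag, which re-escapes the backreference (so /google+ reaches the script as google%2B and decodes back to a literal +); alternatively the script can read the untouched path from REQUEST_URI instead of the query string. A sketch:

        RewriteEngine On
        # B = escape non-alphanumeric characters in the backreference before substitution
        RewriteRule ^(.*)$ index.php?$1 [B,L]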

    Read the article

  • How can I unshorten a URL using Python?

    - by Andrew
    I want to be able to take a shortened or non-shortened URL and return its un-shortened form. How can I make a Python program to do this? Additional clarification:
    Case 1: shortened -- unshortened
    Case 2: unshortened -- unshortened
    e.g. bit.ly/silly in the input array should be google.com in the output array
    e.g. google.com in the input array should be google.com in the output array
    Thanks for the help!
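    A minimal sketch in present-day Python (the original question predates Python 3, so substitute urllib2 if needed): let the HTTP client follow the redirects and read back the final address. Short links that expand via JavaScript or meta refresh won't be resolved this way.

        import urllib.request

        def unshorten(url):
            # allow bare inputs like "bit.ly/silly"
            if not url.startswith(("http://", "https://")):
                url = "http://" + url
            # urlopen follows 301/302 redirects automatically;
            # geturl() returns the URL of the final response
            with urllib.request.urlopen(url) as response:
                return response.geturl()

        print(unshorten("bit.ly/silly"))       # -> the expanded target URL
        print(unshorten("http://google.com"))  # an already-long URL comes back (possibly normalised)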

    Read the article

  • SEO Friendly URL Rewriter Parameters

    - by Kristen
    I would appreciate your advice on how to incorporate parameters into SEO-friendly URLs. We have decided to have the "techie" parameters first, followed by the "SEO slug": /product/ABC123/fly-your-own-helicopter much like S.O. - if the SEO slug changes, is truncated, or is missing, we still have the product and ABC123 parameters; various articles say that having such extra data doesn't harm SEO ranking. We need to have additional parameters; we could use "-" to separate parameters as it makes them look similar to the SEO slug, or could/should we use something else? /product/ABC123-BOYTOY-2/boys/toys/fly-your-own-helicopter This is product=ABC123, in Category=BOYTOY, and Page=2. We also want to keep the hierarchy as flat as possible, and thus I think /product-ABC123-BOYTOY-2/boys/toys/fly-your-own-helicopter would be better - one level less. We have a number of "zones", e.g.
    /product-ABC123/seo-slug-for-product
    /category-BOYTOY/seo-slug-for-category
    /article-54321/terms-and-conditions
    It would help us a lot if we could just use our 5-digit page ID number instead, so these would become
    /12345-ABC123/seo-slug-for-product
    /23456-BOYTOY/seo-slug-for-category
    /54321/terms-and-conditions
    (Products and categories have a number of different page IDs for different templates; this style would take us straight to the right one.) I would appreciate your insight into which parameter separators to use, and whether the leading techie data is going to work well for us. In case relevant: Classic ASP application on IIS7 + MSSQL2008. Product & category codes contain A-Z, 0-9, "_" only.

    Read the article

  • SEO Friendly URL for search keywords

    - by Kyojimaru
    I have a website where you can search for brands, items, and content on the site. It was designed with a tab for each search type, but I want to make the URL user-friendly and good for SEO when changing tabs. Is it better to have a URL for the search results like this:
    www.example.com/search/{search_keyword}/{tab} or
    www.example.com/search/{search_keyword}?tab={tab} or
    www.example.com/search/?search={search_keyword}&tab={tab}
    where {search_keyword} is the keyword the user typed in, and {tab} is either brands / item / content? I ask because when I look at facebook, stackoverflow, and some other websites, they use a query string for their search keyword.
    Edit: My previous URL was only www.example.com/search/{search_keyword}, and I just added the tab design recently. Assuming I should go with option 1 from the above, should I make www.example.com/search/{search_keyword} the default for 1 of the 3 tabs and give only the other 2 tabs www.example.com/search/{search_keyword}/{tab} URLs, to retain the score for the page, or should I give all the tabs www.example.com/search/{search_keyword}/{tab} URLs and use a permanent redirect from www.example.com/search/{search_keyword} to one of the tab URLs?
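    If option 1 is chosen on an Apache host, the tab segment can be rewritten internally to whatever the existing search script already expects; a rough, untested sketch (the script name and parameter names here are made up for illustration):

        RewriteEngine On

        # /search/some-keyword/brands  ->  /search.php?search=some-keyword&tab=brands
        RewriteRule ^search/([^/]+)/(brands|item|content)/?$ search.php?search=$1&tab=$2 [L,QSA]

        # /search/some-keyword  ->  no tab, let the script pick its default
        RewriteRule ^search/([^/]+)/?$ search.php?search=$1 [L,QSA]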

    Read the article

  • URL Rewrite – Protocol (http/https) in the Action

    - by OWScott
    IIS URL Rewrite supports server variables for pretty much every part of the URL and HTTP header. However, there is one commonly used server variable that isn't readily available. That's the protocol: HTTP or HTTPS. You can easily check if a page request uses HTTP or HTTPS, but that only works in the conditions part of the rule. There isn't a variable available to dynamically set the protocol in the action part of the rule. What I wish is that there were a variable like {HTTP_PROTOCOL} which would have a value of 'HTTP' or 'HTTPS'. There is a server variable called {HTTPS}, but the values of 'on' and 'off' aren't practical in the action. You can also use {SERVER_PORT} or {SERVER_PORT_SECURE}, but again, they aren't useful in the action.

    Let me illustrate. The following rule will redirect traffic for http(s)://localtest.me/ to http://www.localtest.me/.

        <rule name="Redirect to www">
          <match url="(.*)" />
          <conditions>
            <add input="{HTTP_HOST}" pattern="^localtest\.me$" />
          </conditions>
          <action type="Redirect" url="http://www.localtest.me/{R:1}" />
        </rule>

    The problem is that it forces the request to HTTP even if the original request was for HTTPS.

    Interestingly enough, I planned to blog about this topic this week when I noticed in my twitter feed yesterday that Jeff Graves, a former colleague of mine, just wrote an excellent blog post about this very topic. He beat me to the punch by just a couple days. However, I figured I would still write my blog post on this topic. While his solution is an excellent one, I personally handle this another way most of the time. Plus, it's a commonly asked question that isn't documented well enough on the web yet, so having another article on the web won't hurt.

    I can think of four different ways to handle this, and depending on your situation you may lean towards any of the four. Don't let the choices overwhelm you though. Let's keep it simple: Option 1 is what I use most of the time, Option 2 is what Jeff proposed and is the safest option, and Option 3 and Option 4 need only be considered if you have a more unique situation. All four options will work for most situations.

    Option 1 – CACHE_URL, single rule

    There is a server variable that has the protocol in it: {CACHE_URL}. This server variable contains the entire URL string (e.g. http://www.localtest.me:80/info.aspx?id=5). All we need to do is extract the HTTP or HTTPS and we'll be set. This tends to be my preferred way to handle this situation. Indeed, Jeff did briefly mention this in his blog post: "... you could use a condition on the CACHE_URL variable and a back reference in the rewritten URL. The problem there is that you then need to match all of the conditions which could be a problem if your rule depends on a logical 'or' match for conditions."

    Thus the problem. If you have multiple conditions set to "Match Any" rather than "Match All" then this option won't work. However, I find that 95% of all rules that I write use "Match All" and therefore, being the lazy administrator that I am, I like this simple solution that only requires adding a single condition to a rule. The caveat is that if you use "Match Any" then you must consider one of the next two options.

    Enough with the preamble. Here's how it works. Add a condition that checks {CACHE_URL} with a pattern of "^(.+)://". Now you have a back-reference to the part before the ://, which is our treasured HTTP or HTTPS.
    In URL Rewrite 2.0 or greater you can check "Track capture groups across conditions", make that condition the first condition, and you have yourself a back-reference of {C:1}. The "Redirect to www" example, with support for maintaining the protocol, becomes:

        <rule name="Redirect to www" stopProcessing="true">
          <match url="(.*)" />
          <conditions trackAllCaptures="true">
            <add input="{CACHE_URL}" pattern="^(.+)://" />
            <add input="{HTTP_HOST}" pattern="^localtest\.me$" />
          </conditions>
          <action type="Redirect" url="{C:1}://www.localtest.me/{R:1}" />
        </rule>

    It's not as easy as it would be if Microsoft gave us a built-in {HTTP_PROTOCOL} variable, but it's pretty close. I also like this option since I often create rule examples for other people, and this type of rule is portable since it's self-contained within a single rule.

    Option 2 – Using a Rewrite Map

    For a safer rule that works for both "Match Any" and "Match All" situations, you can use the Rewrite Map solution that Jeff proposed. It's a perfectly good solution, with the only drawback being the ever so slight extra effort to set it up, since you need to create a rewrite map before you create the rule. In other words, if you choose to use this as your sole method of handling the protocol, you'll be safe. After you create a Rewrite Map called MapProtocol, you can use "{MapProtocol:{HTTPS}}" for the protocol within any rule action. Following is an example using a Rewrite Map.

        <rewrite>
          <rules>
            <rule name="Redirect to www" stopProcessing="true">
              <match url="(.*)" />
              <conditions trackAllCaptures="false">
                <add input="{HTTP_HOST}" pattern="^localtest\.me$" />
              </conditions>
              <action type="Redirect" url="{MapProtocol:{HTTPS}}://www.localtest.me/{R:1}" />
            </rule>
          </rules>
          <rewriteMaps>
            <rewriteMap name="MapProtocol">
              <add key="on" value="https" />
              <add key="off" value="http" />
            </rewriteMap>
          </rewriteMaps>
        </rewrite>

    Option 3 – CACHE_URL, Multi-rule

    If you have many rules that will use the protocol, you can create your own server variable which can be used in subsequent rules. This option is no easier to set up than Option 2 above, but you can use it if you prefer the easier-to-remember syntax of {HTTP_PROTOCOL} vs. {MapProtocol:{HTTPS}}. The potential issue with this rule is that if you don't have access to the server level (e.g. in a shared environment) then you cannot set server variables without permission.

    First, create a rule and place it at the top of the set of rules. You can create this at the server, site or subfolder level. However, if you create it at the site or subfolder level then the HTTP_PROTOCOL server variable needs to be approved at the server level. This can be achieved in IIS Manager by navigating to URL Rewrite at the server level, clicking on "View Server Variables" from the Actions pane, and adding HTTP_PROTOCOL. If you create the rule at the server level then this step is not necessary. Following is an example of the first rule, which creates HTTP_PROTOCOL, and then a rule that uses it. The "Create HTTP_PROTOCOL" rule only needs to be created once on the server.
        <rule name="Create HTTP_PROTOCOL">
          <match url=".*" />
          <conditions logicalGrouping="MatchAll" trackAllCaptures="false">
            <add input="{CACHE_URL}" pattern="^(.+)://" />
          </conditions>
          <serverVariables>
            <set name="HTTP_PROTOCOL" value="{C:1}" />
          </serverVariables>
          <action type="None" />
        </rule>

        <rule name="Redirect to www" stopProcessing="true">
          <match url="(.*)" />
          <conditions logicalGrouping="MatchAll" trackAllCaptures="false">
            <add input="{HTTP_HOST}" pattern="^localtest\.me$" />
          </conditions>
          <action type="Redirect" url="{HTTP_PROTOCOL}://www.localtest.me/{R:1}" />
        </rule>

    Option 4 – Multi-rule

    Just to be complete I'll include an example of how to achieve the same thing with multiple rules. I don't see any reason to use it over the previous examples, but I'll include an example anyway. Note that it will only work with the "Match All" setting for the conditions.

        <rule name="Redirect to www - http" stopProcessing="true">
          <match url="(.*)" />
          <conditions logicalGrouping="MatchAll" trackAllCaptures="false">
            <add input="{HTTP_HOST}" pattern="^localtest\.me$" />
            <add input="{HTTPS}" pattern="off" />
          </conditions>
          <action type="Redirect" url="http://www.localtest.me/{R:1}" />
        </rule>

        <rule name="Redirect to www - https" stopProcessing="true">
          <match url="(.*)" />
          <conditions logicalGrouping="MatchAll" trackAllCaptures="false">
            <add input="{HTTP_HOST}" pattern="^localtest\.me$" />
            <add input="{HTTPS}" pattern="on" />
          </conditions>
          <action type="Redirect" url="https://www.localtest.me/{R:1}" />
        </rule>

    Conclusion

    Above are four working examples of methods to call the protocol (HTTP or HTTPS) from the action of a URL Rewrite rule. You can use whichever method you most prefer. I've listed them in the order that I favor them, although I could see some people preferring Option 2 as their first choice. In any of the cases, hopefully you can use this as a reference for when you need to use the protocol in the rule's action when writing your URL Rewrite rules.

    Further information: Viewing all Server Variables for a site. URL Parts available to URL Rewrite Rules. Further URL Rewrite articles.

    Read the article

  • URL structure preference - to slash or not to slash?

    - by TheDeadMedic
    I'm using custom post types in WordPress 3.0 to manage 'courses' (or seminars, lectures, whatever term you'd prefer to have in mind). Now, for viewing a single 'course', the URL structure is: /course/course-name/ But for multiple courses? /courses/category/category-name/ Or... /course-category/category-name/ Or something entirely different?

    Read the article

  • What is the best URL strategy to handle multiple search parameters and operators?

    - by Jon Winstanley
    Searching with multiple parameters: in my app I would like to allow the user to do complex searches based on several parameters, using a simple syntax similar to the GMail functionality where a user can search for "in:inbox is:unread" etc. However, GMail does a POST with this information, and I would like the form to be a GET so that the information is in the URL of the search results page. Therefore I need the parameters to be formatted in the URL. Requirements:
    - Keep the URL as clean as possible
    - Avoid the use of invalid URL chars such as square brackets
    - Allow lots of search functionality
    - Have the ability to add more functions later
    I know StackOverflow allows the user to search by multiple tags in this way: http://stackoverflow.com/questions/tagged/c+sql However, I'd also like to allow users to search with multiple additional parameters. My current design is to use URLs such as these:
    http://example.com/search/tagged/c+sql/searchterm/transactions
    http://example.com/search/searchterm/transactions
    http://example.com/search/tagged/c+sql
    http://example.com/search/tagged/c+sql/not-tagged/java
    http://example.com/search/tagged/c+sql/created/yesterday
    http://example.com/search/created_by/user1234
    I intend to parse the URL after the search parameter, then decide how to construct my search query. Has anyone seen URL parameters like this implemented well on a website? If so, which sites do it best?

    Read the article
