Search Results

Search found 8370 results on 335 pages for 'seo friendly urls'.

  • SEO for a Domain Name That Is a Misspelling of a Dictionary Word [closed]

    - by Miles
    Possible Duplicate: How should I handle search engines auto-correcting the spelling of a site's name? My domain name is radiuus.com. Since the correct spelling of the word has only one u, when I search for "radiuus" on Google it thinks I'm misspelling the word, kindly lets me know, and returns results for "radius". Example: https://www.google.com/search?q=radiuus&oq=radiuus&sugexp=chrome,mod=2&sourceid=chrome&ie=UTF-8 What can I do to prevent this? And taking this one step further: what can I do to get my site to show up when someone searches for "radius", spelled the correct way? Thanks!!

  • SEO and JavaScript now that Google admits to parsing JS

    - by schlingel
    We're planning to build an HTML snapshot creation service to provide the Google crawlers with static HTML of our JS-driven single-page application. Is this still necessary and/or encouraged now that Google openly admits it parses JS? How should I tackle this evaluation? Are there tools that provide data on when snapshots are needed and when Google's parsing is sufficient? Is it better simply because it would be much faster compared to incremental JS rendering?
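
    A minimal sketch of what such a snapshot service can look like, assuming a Python/Flask front end and an illustrative prerender_page() helper (neither is from the question): static HTML is served when Google's (now historical) _escaped_fragment_ parameter from the AJAX-crawling scheme is present, and the live JS app otherwise.

        # Hedged sketch: serve prerendered HTML to crawlers that use the
        # _escaped_fragment_ AJAX-crawling convention; everyone else
        # gets the JS-driven single-page app.
        from flask import Flask, request, send_file

        app = Flask(__name__)

        def prerender_page(fragment):
            # Illustrative placeholder: look up or generate the static
            # HTML snapshot for this application state.
            return 'snapshots/%s.html' % (fragment or 'index')

        @app.route('/')
        def index():
            fragment = request.args.get('_escaped_fragment_')
            if fragment is not None:
                return send_file(prerender_page(fragment))
            return send_file('static/app.html')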

  • Comments Application SEO

    - by user1015448
    I am developing a commenting application. Users will be able to integrate this application into blogs. I am unsure how to make the comments searchable by search engines. What I want is for all posted comments to be included in search engine results when someone searches with relevant keywords. Please give me a hint on how to do this. Do I need to use meta tags? If so, how should I create them?
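
    The usual answer here is less about meta tags and more about making sure the comments are rendered into the page's HTML on the server, so crawlers see them without executing any script. A minimal sketch, assuming a Python/Flask host page and an illustrative load_comments() helper:

        from flask import Flask, render_template_string

        app = Flask(__name__)

        PAGE = """<!DOCTYPE html>
        <html><body>
        <article>{{ post }}</article>
        <section id="comments">
        {% for c in comments %}<p>{{ c }}</p>{% endfor %}
        </section>
        </body></html>"""

        def load_comments(post_id):
            # Illustrative placeholder for the real datastore lookup.
            return ['First comment', 'Another crawlable comment']

        @app.route('/post/<int:post_id>')
        def post(post_id):
            # Comments ship in the HTML response itself, so search
            # engines index them along with the post.
            return render_template_string(
                PAGE, post='post body here', comments=load_comments(post_id))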

  • SEO: meta description crawling issue [duplicate]

    - by user3707382
    This question already has an answer here: Meta Descriptions not working for Google search (3 answers) I have the following code, in which I include the title and description for the page, but Google crawled only the title, not the meta description; the description shown in search results was instead taken from text present in the page's HTML. Please tell me where I'm going wrong:

        <!DOCTYPE html>
        <html>
        <head>
        <title>title inserted here</title>
        <meta http-equiv="Content-Type" content="text/html;charset=utf-8">
        <meta name="description" content="description here"/>

  • Location-Based redirection and duplication in sub-directories affecting SEO

    - by Joshua
    I currently own the website www.xyz.com. The website has a sub-directory for each of the 3 target countries: .../en-US/ (United States), .../es-MX/ (Mexico), and .../es-DO/ (Dominican Republic). I have two main questions about this setup: Currently, the main domain/root (xyz.com) contains a blank index.php file, but I would like a user to be redirected to one of the sub-directories based on their regional location. What is the best way to accomplish this? I have looked at browser-language-based redirection, but how would I know whether to direct a user to the MX or DO site if the browser language is set to Spanish? Is there a way to detect a user's geographic location? Also, the 3 websites are practically identical except that each has a unique color scheme, and the US site is in English while the MX and DO sites are in Spanish. My problem is that I believe GoogleBot is penalizing/banning my site because the Spanish text on the MX and DO pages is nearly identical and is thus marked as duplicate/spam. Is there a way to avoid this?
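
    Geographic detection is usually done with an IP-geolocation database rather than the browser language. A hedged sketch using the geoip2 library and MaxMind's free GeoLite2 country database (the file path and the country-to-directory mapping are assumptions for illustration); for the duplicate-content half of the question, rel="alternate" hreflang annotations are the usual way to mark the trees as regional variants rather than spam:

        # pip install geoip2; the .mmdb file comes from MaxMind's free GeoLite2.
        import geoip2.database
        import geoip2.errors

        reader = geoip2.database.Reader('GeoLite2-Country.mmdb')

        # Map detected countries to sub-directories; unmapped countries
        # fall back to the US/English tree.
        SITES = {'US': '/en-US/', 'MX': '/es-MX/', 'DO': '/es-DO/'}

        def subdirectory_for(ip):
            try:
                country = reader.country(ip).country.iso_code
            except geoip2.errors.AddressNotFoundError:
                country = None
            return SITES.get(country, '/en-US/')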

  • SEO Expert Question

    - by CheatSEO
    I have worked with a website in the past, freedomist.com. This site gathers WordPress articles from multiple news source sites and then republishes them. The company that runs this site has about 50 other sites that do the same thing. They post links to sites such as Twitter and secondary WordPress sites. Is this an ethical way of increasing page rank? Is it against the terms of service of, let's say, Google?

  • Difference in stats on the same domain (SEO related)

    - by Duoweb
    Recently I started to optimize my company's website, with a little help from Market Samurai. One of the adjustments was a www redirect to the preferred domain. We currently rank #8 for our desired keyword with the domain without the www. If I add a custom URL (our domain with www), we see a difference in stats for BLP (backlinks to the page) {it's the website on top}. Can anyone tell me how I could merge these so we might rank even better for our keyword? What I did so far: 3 weeks ago I set www.duo.be as the preferred domain in GWT and put a 301 redirect in place so that duo.be gets redirected to www.duo.be. Both domains are in GWT, both with www.duo.be as the preferred domain. Though I see no change. Can anyone help me out, because I'm clueless. Market Samurai screenshot: http://i.stack.imgur.com/CYkb8.png
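
    For what it's worth, one standard complement to the 301 and the GWT preferred-domain setting is a canonical hint in the head of every page, so links pointing at either host get consolidated (the URL is just the example domain from the question):

        <link rel="canonical" href="http://www.duo.be/" />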

  • How good is it for SEO if you have a widget that lives on other sites?

    - by Genadinik
    I made a widget that would ideally live on other sites. Here is an example: http://www.comehike.com/outdoors/widget.php?type=hike&hike_id=108&width=500&height=500 I guess since the widget would link back to me, it would be an SEO boost for my site. Is that correct? Or would it be just an SEO boost for that particular URL? If it is just an SEO boost for the particular URL, it does me little good since that page does not link to any of my other pages. Am I thinking about this correctly? How are these things typically handled so that there is a benefit to my site's SEO?

  • SEO for duplicate sites with multiple domain extensions

    - by lock
    I run businesses in different countries and I have domains, for example:
    www.mydomain.com
    www.mydomain.us
    www.mydomain.ca
    www.mydomain.uk
    www.mydomain.com.au
    So, if I run the same website with the same content (of course there will be small changes, like the address, etc.) on all these domains, will it be considered spam, or will each domain rank well in its own country? Also, is there a solution if Google does consider this spam?
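
    The standard mechanism for telling Google these are regional variants of one site, rather than spam, is rel="alternate" hreflang markup in each page's head, with every domain listing all the others; a sketch using the domains from the question (the language-region mappings, such as .uk to en-GB, are assumptions):

        <link rel="alternate" hreflang="en-US" href="http://www.mydomain.us/" />
        <link rel="alternate" hreflang="en-CA" href="http://www.mydomain.ca/" />
        <link rel="alternate" hreflang="en-GB" href="http://www.mydomain.uk/" />
        <link rel="alternate" hreflang="en-AU" href="http://www.mydomain.com.au/" />
        <link rel="alternate" hreflang="x-default" href="http://www.mydomain.com/" />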

  • SEO: influence search results per device class (mobile/desktop)

    - by user32224
    We're currently building a new responsive website, and while working on the site map we figured out that we don't want to show certain sections on mobile devices. This can easily be done by hiding the navigation parts using CSS/media queries. The trouble, however, is that the hidden pages would still show up in search engines' results. If a user happens to click on one of these links, she might see a badly formatted page, as we'd use desktop/tablet-only code to show images and video. Is there any way to influence search engines to exclude certain pages when the search is done on a mobile device? Do search engines crawl pages once, or twice with a device-specific view each time? Could we set a noindex meta tag for a specific device class?
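
    Google does crawl with both desktop and smartphone user agents, and one documented pattern for serving them differently from the same URL is "dynamic serving" with a Vary: User-Agent header. A minimal sketch in Python/Flask (the user-agent test and template names are deliberately crude and illustrative):

        from flask import Flask, make_response, render_template, request

        app = Flask(__name__)

        @app.route('/section/')
        def section():
            ua = request.headers.get('User-Agent', '')
            # Crude illustrative device check; real sites use a UA library.
            template = 'section_mobile.html' if 'Mobi' in ua else 'section_desktop.html'
            resp = make_response(render_template(template))
            # Tell caches and crawlers the response depends on the user agent.
            resp.headers['Vary'] = 'User-Agent'
            return resp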

  • Is my htaccess setting hurting SEO?

    - by Ramanonos
    I have a site that I redirect to https. I do this to leverage wildcard SSL for my password-protected pages. Everything seems to work fine in testing: for example, whether you type in http or www, you always get redirected to the SSL https site. That said, I have about 200-300 external backlinks, many high quality, yet Google Webmaster Tools (along with SEOmoz) shows I have just 4... Huh? I'm embarrassed to say I just discovered this. This has led me to hypothesize that maybe my settings in htaccess are messed up, so Google isn't recognizing a link because it's recorded on another site as http instead of https. Maybe? At any rate, here are my simple htaccess settings for 301-redirecting www to http, and from http to https:

        RewriteCond %{SERVER_PORT} !443
        RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
        RewriteRule ^(.*)$ http://example.com/$1 [L,R=301]

        RewriteCond %{SERVER_PORT} 443
        RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
        RewriteRule ^(.*)$ https://example.com/$1 [L,R=301]

    Like I said, everything works fine for the redirect over https, so I'd rather not screw up what works. On the other hand, something is very wrong with Google finding all my backlinks, so I need to fix something... I'm just wondering whether Google isn't picking up my backlinks from other websites that record me as http because I'm at https. Maybe Google doesn't care and it's some other issue. Am I barking up the right tree? If so, any quick fixes? Thanks as always!

  • Why does Google not ban these sites using this SEO pattern? [on hold]

    - by saddam.bg
    I have seen some sites using a different kind of SEO to promote copyrighted material such as movies. They have also submitted their sites to Google Webmaster Tools and still have not been banned; their Alexa ranks are 7000 or less. On the other hand, I have run 5 movie affiliate sites, and all of them were banned by Google within a short period of time. I copied the URL of the homepage of solarmovie.me, pasted it into a Google search, and instead of the homepage URL I saw that a category or tag page shows as the homepage (www.solarmovie.me/watch-category/hollyw... Now, is solarmovie.me publishing its posts as a single page, or something else? I tried to find out what kind of SEO or coding that was, but I couldn't, since I have very little knowledge of coding. I have also seen the same thing with ALLUC.TO in a Google search (www.alluc.to/popular-links.html). Could anyone please help with this kind of SEO, so that I don't keep getting banned by Google or having my sites removed from the index? Please give me some good tips for this type of SEO. Thank you very much.

  • Do spaces in your URL (%20) have a negative impact on SEO?

    - by Kevin
    All the articles I Googled on this subject date back to 2004-2005. Basically, I am structuring precanned searches based on categories the client will input, for example: content/(term name)/index.htm. Does it matter if I use the raw term with a space, which is converted to %20 in the URL, or should I convert the term to use '-' and strip that before querying for results? I already have it working, but does anyone know if spaces definitely have a negative impact on SEO and ranking?
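
    Hyphens are generally the safer word separator, and slugging the term before it ever reaches the URL sidesteps %20 entirely; a small sketch of the usual normalization:

        import re

        def slugify(term):
            """Lowercase, collapse runs of non-alphanumerics to '-', trim."""
            return re.sub(r'[^a-z0-9]+', '-', term.lower()).strip('-')

        print(slugify('Toshiba Qosmio Notebook'))  # -> toshiba-qosmio-notebook

    Storing the slug alongside the category lets lookups compare slugs directly, instead of decoding %20 back to spaces at query time.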

  • Tool to Verify Site URLs/Sitemap?

    - by LockeCJ
    I'm moving a site from one e-commerce software package to another, and I've created URL Rewriter rules to do 301 redirects from the old URLs to the new ones. I've tested them with a small sample of URLs, but I'm looking for a tool that will let me test as many of the URLs as possible. Does anyone know of a tool that I can feed a list of URLs (or a sitemap.xml)? The tool would attempt to retrieve each URL and then report the status code for each. The result should be a list of URLs with their status codes, something like this:

        www.site.com/oldurlformat1/ 301 Moved Permanently
        www.site.com/newurlformat1/ 200 OK
        www.site.com/oldurlformat2/ 301 Moved Permanently
        www.site.com/newurlformat2/ 200 OK

    I can almost do this with wget, but getting the summary/report at the end is where I'm stuck.
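
    In the absence of a ready-made tool, a short script does exactly this; a sketch in Python with the requests library, using allow_redirects=False so each URL reports its own status rather than that of the redirect target:

        import requests

        # urls.txt: one URL per line, old and new formats mixed.
        with open('urls.txt') as f:
            urls = [line.strip() for line in f if line.strip()]

        for url in urls:
            # HEAD keeps it light; switch to requests.get if a server
            # mishandles HEAD requests.
            r = requests.head(url, allow_redirects=False)
            print(url, r.status_code, r.headers.get('Location', ''))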

  • What exactly is an SEO-friendly site?

    - by Tom
    Hey, I've seen web developers write in their CVs that they create "SEO friendly sites". I've also heard that WordPress and other CMSs are SEO friendly. So, what does "SEO friendly site" mean? I understand that titles and URLs are probably the most important things for ranking well in Google, but are there other things I should know? Thanks

  • How can I make clean search URLs?

    - by newbie
    If I have a search with a lot of different options, the URL becomes very long and looks very bad. Is there any way to make the URLs look better? Using POST for the search would keep URLs clean, but then people couldn't share search URLs.
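
    A common middle ground is to keep GET (so searches stay shareable) but only serialize options that differ from their defaults, which keeps typical URLs short; a small sketch where the parameter names and defaults are invented for illustration:

        try:
            from urllib.parse import urlencode  # Python 3
        except ImportError:
            from urllib import urlencode        # Python 2

        DEFAULTS = {'sort': 'relevance', 'page': 1, 'in_stock': False}

        def search_url(**options):
            # Drop any option still at its default value.
            changed = {k: v for k, v in options.items() if DEFAULTS.get(k) != v}
            return '/search/' + ('?' + urlencode(sorted(changed.items()))
                                 if changed else '')

        print(search_url(sort='price', page=1))  # -> /search/?sort=price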

  • How to generate SEF URLs manually without clicking the links on a Joomla site

    - by raghu-pandiri
    We have the sh404SEF component installed on my site. When we click any of the links on the site, an SEF URL is automatically generated and inserted into the database, but I want to generate all the available SEF URLs for my site without clicking all the links. How can I generate them? Is there any way to ping all of the normal URLs I have generated? How can I generate SEF URLs manually for my site's content?

  • GWT search-engine-friendly application

    - by user180152
    Hi friends, how can I develop a search-engine-friendly web app in GWT? Take the example of Stack Overflow itself: it's a web app, and it is SEO friendly, allowing users to find it through a search engine. If someone wanted to develop the same app in GWT, how could they make it SEO friendly? Since GWT consists of a single HTML file, how can we make its inner content visible to search engines? Any suggestion or comment will really help. Thank you.

  • How SEO friendly is this URL?

    - by The_AlienCoder
    I have a products catalog site. For the sake of SEO, I would have wanted my 'view details' link to look something like this: ~/products/26-productname or ~/products/26/productname. On my machine I'm using a URL-rewriting module and it works well. Unfortunately, my host (shared) does not support URL-rewriting modules or ASP.NET 4.0 for now. So I came up with a workaround that attempts to be SEO friendly. Instead of this: ~/Products/details.aspx?id=26 I decided to simply append the product name to the URL, i.e. ~/Products/details.aspx?product=26-Toshiba Qosmio Notebook. So my question is: how SEO friendly is such a URL, and is my attempt worth anything at all?

  • What is the best structure for an SEO-friendly URL?

    - by Aajahid
    I'm working on a website, converting its URLs to SEO-friendly URLs. I plan to use this: mysite.com/category-name/pageid-123-page-name. I looked at some similarly categorized, highly ranked websites. They have the same structure, except for one thing: in one case, the URL format was thissite.com/category-name/pageid-123-page-name.html; in another it was thatsite.com/category-name/pageid-123-page-name.php. Now, I know the text in URLs helps with SEO. Is it more helpful to have a file extension? If yes, which one is better? Or, if my current plan is okay, would it be better with a / at the end?

  • Using {% url ??? %} in Django templates

    - by user563247
    I have looked a lot on Google for answers on how to use the 'url' tag in templates, only to find many responses saying 'You just insert it into your template and point it at the view you want the URL for'. Well, no joy for me :( I have tried every permutation possible and have resorted to posting here as a last resort. So here it is. My urls.py looks like this:

        from django.conf.urls.defaults import *
        from login.views import *
        from mainapp.views import *
        import settings

        # Uncomment the next two lines to enable the admin:
        from django.contrib import admin
        admin.autodiscover()

        urlpatterns = patterns('',
            # Example:
            # (r'^weclaim/', include('weclaim.foo.urls')),
            (r'^login/', login_view),
            (r'^logout/', logout_view),
            ('^$', main_view),
            # Uncomment the admin/doc line below and add 'django.contrib.admindocs'
            # to INSTALLED_APPS to enable admin documentation:
            # (r'^admin/doc/', include('django.contrib.admindocs.urls')),
            # Uncomment the next line to enable the admin:
            (r'^admin/', include(admin.site.urls)),
            #(r'^static/(?P<path>.*)$', 'django.views.static.serve', {'document_root': '/home/arthur/Software/django/weclaim/templates/static'}),
            (r'^static/(?P<path>.*)$', 'django.views.static.serve', {'document_root': settings.MEDIA_ROOT}),
        )

    My views.py in my 'login' directory looks like:

        from django.shortcuts import render_to_response, redirect
        from django.template import RequestContext
        from django.contrib import auth

        def login_view(request):
            if request.method == 'POST':
                uname = request.POST.get('username', '')
                psword = request.POST.get('password', '')
                user = auth.authenticate(username=uname, password=psword)
                # if the user logs in and is active
                if user is not None and user.is_active:
                    auth.login(request, user)
                    return render_to_response('main/main.html', {},
                                              context_instance=RequestContext(request))
                    #return redirect(main_view)
                else:
                    return render_to_response('loginpage.html',
                                              {'box_width': '402', 'login_failed': '1'},
                                              context_instance=RequestContext(request))
            else:
                return render_to_response('loginpage.html',
                                          {'box_width': '400'},
                                          context_instance=RequestContext(request))

        def logout_view(request):
            auth.logout(request)
            return render_to_response('loginpage.html',
                                      {'box_width': '402', 'logged_out': '1'},
                                      context_instance=RequestContext(request))

    And finally the main.html, to which login_view points, looks like:

        <html>
        <body>
        test!
        <a href="{% url logout_view %}">logout</a>
        </body>
        </html>

    So why do I get 'NoReverseMatch' every time? (On a slightly different note: I had to use 'context_instance=RequestContext(request)' at the end of all my render_to_response calls, because otherwise the templates would not recognise {{ MEDIA_URL }} and I couldn't reference any CSS or JS files. I'm not too sure why that is. It doesn't seem right to me.)
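
    For reference, the usual fix for this NoReverseMatch on a Django of this vintage is to give each pattern an explicit name and reverse by that name; a bare logout_view inside {% url %} is treated as a name to reverse, and nothing registered matches it. A sketch of the two changes (the pattern names are my own choice):

        # urls.py -- wrap the patterns in url() so they can carry names
        from django.conf.urls.defaults import *
        from login.views import login_view, logout_view

        urlpatterns = patterns('',
            url(r'^login/$', login_view, name='login'),
            url(r'^logout/$', logout_view, name='logout'),
        )

    and in the template:

        <a href="{% url logout %}">logout</a>

    (On Django 1.5 and later the name must be quoted: {% url 'logout' %}. The RequestContext behaviour in the aside is expected, by the way: {{ MEDIA_URL }} is injected by a context processor, and context processors only run when a RequestContext is supplied.)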

  • SEO-friendly accordion menu

    - by strakastroukas
    Hello, currently I use the accordion menu provided by the ASP.NET AJAX Control Toolkit. The problem is that it is not SEO friendly. So what I am looking for is an accordion menu with the following characteristics:
    1) SEO friendliness
    2) Preservation of the selected index on post-backs
    3) Small in kilobytes
    4) Free of charge
    Do you have anything in mind?
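
    As a point of comparison for what "SEO friendly" means here: every panel's text should be present in the initial HTML rather than injected later, so crawlers index it even when it renders collapsed. A markup-only sketch using the native details element (a modern alternative that did not exist in the ASP.NET toolkit era, and which does not by itself remember the open panel across post-backs):

        <div class="accordion">
          <details open>
            <summary>Section 1</summary>
            <p>Full, crawlable text for section 1.</p>
          </details>
          <details>
            <summary>Section 2</summary>
            <p>Full, crawlable text for section 2.</p>
          </details>
        </div>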

  • C# byte[] to URL-Friendly String

    - by LorenVS
    Hello, I'm working on a quick captcha generator for a simple site I'm putting together, and I'm hoping to pass an encrypted key in the URL of the page. I could probably do this as a query string parameter easily enough, but I'm hoping not to (just because nothing else runs off the query string)... My encryption code produces a byte[], which is then transformed using Convert.ToBase64String(byte[]) into a string. This string, however, is still not quite URL friendly, as it can contain things like '/' and '='. Does anyone know of a better function in the .NET framework to convert a byte array to a URL-friendly string? I know all about System.Web.HttpUtility.UrlEncode() and its equivalents; however, they only work properly with query string parameters. If I URL-encode an '=' inside the path, my web server returns a 400 Bad Request error. Anyway, not a critical issue, but I'm hoping someone can give me a nice solution. EDIT: Just to be absolutely sure exactly what I'm doing with the string, I figured I would supply a little more information. The byte[] that results from my encryption algorithm should be fed through some sort of algorithm to make it into a URL-friendly string. After this, it becomes the content of an XElement, which is then used as the source document for an XSLT transformation and is used as part of the href attribute for an anchor. I don't believe the XSLT transformation is causing the issues, since what is coming through on the path appears to be an encoded query string parameter, but causes the HTTP 400. I've also tried HttpUtility.UrlPathEncode() on a base64 string, but that doesn't seem to do the trick either (I still end up with '/'s in my URL).
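
    Two notes that may help. System.Web has HttpServerUtility.UrlTokenEncode/UrlTokenDecode, which emit base64 with '-' and '_' plus a trailing padding-count digit, built for exactly this use. If you'd rather not depend on System.Web, a hand-rolled sketch of the standard "base64url" variant (RFC 4648) substitutes the two path-hostile characters and drops the padding, reversing the steps to decode:

        using System;

        static class UrlToken
        {
            // Encode bytes as base64url: '+' -> '-', '/' -> '_', no '=' padding.
            public static string Encode(byte[] data)
            {
                return Convert.ToBase64String(data)
                    .Replace('+', '-')
                    .Replace('/', '_')
                    .TrimEnd('=');
            }

            // Reverse the substitutions and restore padding before decoding.
            public static byte[] Decode(string token)
            {
                string s = token.Replace('-', '+').Replace('_', '/');
                switch (s.Length % 4)
                {
                    case 2: s += "=="; break;
                    case 3: s += "=";  break;
                }
                return Convert.FromBase64String(s);
            }
        }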

  • Is SEO knowledge important for web developers?

    - by splattne
    Looking at some SEO (search engine optimization) questions on Stack Overflow, I saw ambivalent reactions to them. Some were closed as "not programming related" or were downvoted; others were answered and upvoted. It seems that many developers think SEO is something "dirty" or that it belongs in the realm of spam. IMHO, designing for search engines and practising SEO techniques adds important value to the final product, like, for example, a good user interface. Should SEO really be left to specialized non-programmers? Shouldn't web developers have profound SEO knowledge? Or is it okay to apply SEO as a post-development process?

  • How to perform a 301 redirect of all .php URLs to clean URLs?

    - by spacedatdusk
    My htaccess isn't quite working the way I want it to. I've seen some similar threads on here, but they aren't quite what I need, and I don't know enough yet about htaccess to modify the code to suit my needs. This is what I have working so far: I've got all non-www URLs redirecting to www URLs, and I'm doing an internal rewrite of all URLs to the corresponding PHP file on the server. In the files I have relative links that are clean, without any file extension on them. This is what I still need to do: all the pages on my site are also accessible through URLs with .php on the end. For SEO reasons I want the URLs with .php to do an external 301 redirect to the clean URL without the extension. Here's what I have in my htaccess file, which is in the root folder on my server:

        Options +FollowSymLinks
        RewriteEngine On

        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteCond %{REQUEST_FILENAME}\.php -f
        RewriteRule ^(.*)$ $1.php

        RewriteBase /
        RewriteCond %{HTTP_HOST} ^domain.com [NC]
        RewriteRule ^(.*)$ http://www.domain.com/$1 [L,R=301]

    I'd appreciate any help. Thanks in advance!
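
    A hedged sketch of the usual missing piece, placed above the internal rewrite: match the literal ".php" in the client's original request line (%{THE_REQUEST}) rather than in the rewritten filename, so the external 301 cannot loop with the rule that adds the extension back internally:

        # Externally 301 any direct request for /page.php to /page.
        # %{THE_REQUEST} holds the original request line, so this does not
        # fire again after the internal rewrite appends .php.
        RewriteCond %{THE_REQUEST} ^[A-Z]+\ /([^\ ?]+)\.php[\ ?] [NC]
        RewriteRule ^ /%1 [R=301,L]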
