Search Results

Search found 8370 results on 335 pages for 'seo friendly urls'.


  • Does Google index iframes on Facebook fan pages? (Whole website content)

    - by user2536417
    I have a fairly simple question that I've tried to get help with on the Google Webmaster Help Q&A site, but so far no joy, so hopefully someone here can provide the information I'm looking for. I have a Facebook fan page for my website, and I have made an app that basically uses an iframe to put the site within a frame inside Facebook. Everything works well, but Google is not indexing this page. I am using <link rel="canonical" href="#" /> on my pages, so perhaps this is the issue?
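    For reference, a canonical tag is normally expected to point at an absolute URL rather than "#", which gives crawlers no usable target. A minimal sketch, using a hypothetical placeholder address rather than anything from the question:

        <!-- Hypothetical example: self-referencing canonical with an absolute URL -->
        <link rel="canonical" href="http://www.example.com/some-page/" />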

    Read the article

  • How do you enhance your website's speed without compromising design and accessibility?

    - by Thorn007
    How do you enhance your website's load speed without killing the design and accessibility? File compression, a CDN, gzip? What are the best tools for doing so? For example, Google has optimized their site without compromising the design. Also, many websites degrade the quality of their images through compression. Is there a way, or at least a best practice, to increase speed without compromising the design and accessibility? Note: sorry for being so vague, but I don't know how else to phrase this question.

    Read the article

  • 301 redirect from a country specific domain

    - by Raj
    I originally started using a .do domain extension for my site, but later realized that this country-specific domain would prevent us from appearing in search results for places outside of the Dominican Republic. We started using a .co domain extension and redirected all requests to the new domain using an HTTP 301. The "Crawl Stats" in Google Webmaster Tools show me that the .co domain is being crawled, but the "Index Status" shows the number of pages indexed as 0. The "Crawl Stats" for the .do domain say that it's being crawled, and the "Index Status" shows a number greater than 0. I also set a "Change of Address" in Google Webmaster Tools to have the .do domain point to the new .co domain. We're still not appearing in search results at all, even for very specific strings where I would expect to find us. Am I doing something wrong?

    Read the article

  • How to remove a page from a site without affecting Google SERPs

    - by Savas Zorlu
    I have a travel website. Just for information purposes, I had put up a weather page. Now I realize that this page is increasing my overall bounce rate, because people who are looking for the weather forecast land on that page, get what they want, and exit. What is the safest method to get rid of that page? Would it hurt my Google rank if I removed it completely? Or is there a better way to handle this situation? I realize that around 21 percent of my daily hits are on that page. I would have been happy if my aim were to provide weather data for the location; however, my site needs to focus on selling hotels. So I think I need to get rid of this weather page immediately. What do you think?
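    One option often mentioned for cases like this is a robots meta tag, which drops a page from search results while keeping it available to visitors and still letting its links be followed. A minimal sketch (whether this beats deleting or redirecting the page depends on the situation):

        <!-- Placed in the <head> of the weather page: keep it out of the index, keep following its links -->
        <meta name="robots" content="noindex, follow">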

    Read the article

  • How to make the most of GWT's "Search queries"?

    - by DisgruntledGoat
    I've been looking at the "Search queries" section in Google Webmaster Tools recently, and it seems like there is a lot of potential there in finding which pages on a site need improvement. I'm trying to figure out exactly what to sort or filter on. Do I look at pages with a low average position? Low impressions but high clicks? Pages that are rising up/falling down the rankings? What is the low-hanging fruit here?

    Read the article

  • SEO Tips - Updating Your Content and Avoiding Duplicate Filters

    If you are just getting into search engine optimization, it's important to be aware that content is the most important element of your website for earning high rankings. If you decide to hire an SEO consultant, make sure that you explain to them exactly what you want your website to look like and the content that you would like it to feature. Having fresh and relevant content on your website is key, as that will bring web crawlers back frequently. There are different ways of achieving this while avoiding having your website removed from a search engine's index due to duplicate content.

    Read the article

  • Upgrading blog to WordPress: keep the old one or redirect?

    - by Spazm
    I have had a blog for around 4 years now that has gained some success and gets tons of residual and organic traffic. I am using an outdated version of BlogEngine.NET and we are going to switch to WordPress. Currently, our blog is at ourwebsite.com/blog, and I don't really want to mess with our web server. I set up a new server just for WordPress, and instead of dealing with proxies and things that are above my head, the new blog will be at blog.oursite.com. I am trying to figure out the best way to go about doing this. We value our search engine rankings very much, as that is where 90% of our traffic comes from. Would I be best off: 1) importing all of the old blog posts from oursite.com/blog to blog.oursite.com, then redirecting all of the articles, categories, authors, pages and tags to the new blog and giving up the direct links to our current site; or 2) keeping all of our current articles as they are, only adding new content to the new blog.oursite.com, and just letting the old blog float around our site?

    Read the article

  • CSS being displayed by Google in search results

    - by vipul_vj
    I have written an article, HTML Image Tag, for the site and it has been indexed by Google. But when I search for it, Google displays:

        HTML Image Tag - ProgrammingBulls
        http://programmingbulls.com/html-image-tag-1
        content { font-family:verdana; font-size:14px; font-weight:normal } We often use images in a webpage. To insert images in our webpage < img tag is used in.

    Why is CSS displayed in the Google search result? I know that CSS and HTML markup are normally ignored by Google, but for some reason the CSS is being displayed.
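    If those CSS rules are ending up in the page body as plain text (for example because a <style> wrapper was lost), Google will treat them as page content and may quote them in the snippet. A minimal sketch of the usual placement, with the selector written as a class purely for illustration:

        <head>
          <!-- Keep style rules inside a <style> element (or an external stylesheet), not as bare text in the body -->
          <style>
            .content { font-family: verdana; font-size: 14px; font-weight: normal; }
          </style>
        </head>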

    Read the article

  • Google not showing any pages from my site in the index after three months [on hold]

    - by Alex Coisman
    Despite having a sitemap and using Google Webmaster Tools, it has been over 3 months and my site has not been added to the Google index at all. Here's the site: www.famouslefthandedpeople.com. As far as I know, I have done everything correctly; however, there must be something I am overlooking that is preventing Google from indexing the site. I do not have a robots.txt file, so allow/disallow isn't the issue. Although the content of the site is sparse, it is original and not duplicated internally or externally, so Panda/Penguin should not be a problem. I have reviewed the answers at "Why isn't my website in Google search results?" and I don't think they apply here. If it matters, I am using WordPress to create the site. What other factors should I be looking at in order to troubleshoot this?

    Read the article

  • Googlebot visit but no cache update - why?

    - by Mick
    I have made a new plain-vanilla HTML website. I have been making regular modifications to it on an almost daily basis. The site is hosted by HostMonster, and as part of their service they offer AWStats to let you know assorted details of visitors to the site. One thing is puzzling me. According to AWStats, a robot/spider calling itself "Googlebot" visited my site as recently as today (28th June 2011), but when I find my site on Google (e.g. by searching for "full reserve banking") the cache is dated only the 5th of June. I always thought that a visit from the Google robot was synonymous with a cache update. Am I wrong? Or have I accidentally put something in the site telling Google that nothing has been updated? EDIT: It seems a moderator has removed the name of my website, so there is now no chance that anyone could check whether I had made some error on my site :-( ... but anyway, in answer to paulmorriss' question, here is what AWStats was telling me:

    Read the article

  • Can you disavow a whole domain apart from the index page?

    - by Silver89
    Many years ago I may have bought a few sitewide links for some of my sites; these have now come back to haunt me and I need to sort them out. I've tried to contact the owners, but they're too lazy to bother changing the sites, so I figure it's time to disavow the links. But is there a way to disavow all of the sitewide links on the domain apart from the one on the index page, and would leaving the index page link be a benefit, or would it still be seen as spammy? Something like:

        # Contacted owner of shadyseo.com on 7/1/2012 to
        # ask for link removal but got no response
        domain:shadyseo.com
        !shadyseo.com/index.php

    Read the article

  • Is the structure of my site's navigation (via price/service tables) considered 'Duplicate Content' by Google?

    - by James Gadsby
    As I'm building my business website, I'm using service/price tables at the bottom of each service page to show customers and potential clients my other offerings. Given that there are 7 or 8 service pages, each with (according to Google) the same service descriptions below the original content for that service, would this count as duplicate content? If so, what could I do about it?

    Read the article

  • In addition to Google's First Click Free, should you whitelist search engine bots past a paywall?

    - by tobek
    Our site has subscription-only pages; non-subscribed visitors see a snippet preview. As per Google's First Click Free (FCF) requirements, for your first 5 hits to subscriber-only pages with .google. as the referrer, you see the full page. In addition to this, should we whitelist search engine bots so that they can index the full content? I assume this is not required for Google, which can use FCF to index our content, but what about other search engines? Is this considered cloaking? My gut says that whitelisting bots past the paywall is bad practice, but I wanted to confirm - any evidence or references would be amazing.

    Read the article

  • How to make Google recognize my domain name [closed]

    - by Milosh Belter
    Possible Duplicate: I cannot see my website in Google. I have a strange problem with my website. I have a website, let's say abcdefg.com. The website has been live for 2 months and Google still doesn't know about it. When searching for my domain name 'abcdefg', Google displays results for a similar phrase ('abcdef') but not for mine. How can I make Google recognize my domain name? The website and sitemaps have been submitted via Google Webmaster Tools.

    Read the article

  • A relatively new blog seems to be getting very poor Google indexing

    - by Genadinik
    I have a new blog that is 2 months old. In the first few weeks, it was getting indexed nicely, and my Google Webmaster Tools reports were showing that it was being crawled and beginning to rank for some terms. Then, as I kept writing, the Webmaster Tools report thinned out and showed fewer and fewer terms that this blog ranks for. Now there are only 4 terms, one of them being my name. Is there something I need to do to keep the old posts indexed and crawled? Thanks, Alex

    Read the article

  • Will a search engine lower the rank of my page if I have hidden iframes?

    - by Skurpi
    As a practice, all external content on our site is put in iframes to lower the risk of any external parties injecting content into our users' pages. We also do it to make sure our content shows up before banners, to make the site feel quicker. We now have an external script running which we want to put in an iframe, but it does not have any visible content to go with it, so I want to put the CSS "visibility: hidden;" on the iframe. I read in a forum somewhere that search engines will lower the rank of a page, or even drop the page, if an iframe has "the minimal size of 1x1px". Will a search engine lower the rank of my page if I have a hidden (or 1px big) iframe?

    Read the article

  • Subdomains vs. URL Path in shareable links

    - by Adam Matan
    I am building a web application for questions and answers. Each question/answer page has all the required metadata for Facebook and Twitter, and we encourage users to share these pages. I have a dilemma regarding the shared link structure.

    Option 1 - subdomains. Use questions.example.com and answers.example.com, followed by an ID and optional text. The text is ignored by the request, which only takes the ID into account:

        http://questions.example.com/<question_id>/<question_text>
        http://questions.example.com/12345/how-long-is-the-queue   # Example
        http://q.example.com/12345                                 # Example

    Option 2 - URL path. This is the format used by stackoverflow.com and trello.com:

        http://example.com/questions/<question_id>/<question_text>
        http://example.com/questions/12345/how-long-is-the-queue   # Example
        http://example.com/q/12345                                 # Example

    Server-wise, I can easily do both - I have a wildcard SSL certificate, and the Apache/Nginx configuration is pretty straightforward. Which option - subdomains or URL path - is preferred for shareable links?

    Read the article

  • Why won't Webmaster Tools let me set a preferred domain? (It says to verify, but the site should already be verified)

    - by Su'
    I've got a domain that I apparently forgot to set a preferred domain for, so I just tried to do it. Webmaster Tools instead popped up a little box: "Part of the process of setting a preferred domain is to verify that you own http://www.example.com/. Please verify http://www.example.com/." I'm running into some problems following these instructions. As far as I can tell, I already verified sometime in the past: there's a TXT DNS record with the gibberish Google tells you to set that I couldn't have come up with myself, and nothing is telling me this information is bad. So let's assume the site is somehow not actually verified. All the various methods' instructions start with "click the Manage Site button next to the site you want, and then click Verify this site." That button doesn't even exist on my screens. (It presumably goes away when you successfully verify?) Those instructions were all updated pretty recently, and the DNS one in particular just a couple of weeks ago, so it seems a bit unlikely they're inaccurate. I am not using Apps, and won't be, so I can't try out the verification through there. Note that I also have another domain that I have not done any verification for which is showing the same behavior (no such button, being told to verify when it seems impossible, etc.), so something appears to just be broken. I already have a no-www process in place server-side, so we can skip that; I'm just trying to get the box checked off in GWT. If I don't get any resolution, I'll eventually scrap the TXT record, see if the site gets un-verified (or whatever, since it thinks it isn't), and see if I can just restart the process. It's not urgent, so I'm just trying to figure out if I've gone blind to something or what. Did the button move?

    Read the article

  • Another website is mirroring my site

    - by Marlboro Goodluck
    A question for you all. There is a site of ill repute known as thedirty which has completely mirrored my site and now has links appearing on Google at the #1 spot using my content. I checked my log file and noticed that this site has been crawling mine for some time, and it also has 10k links from their site to mine. I have blocked visitors referred from this site and have already reported them as web spam to Google. I also disavowed the domain. How are they getting top links in Google (even overtaking mine) with such nefarious tactics? What are the steps to completely eliminating an issue like this?

    Read the article

  • What is the best taxonomy from Google's perspective?

    - by ZakGottlieb
    I was wondering what the best way is to structure a new website in Google's eyes. Currently, it contains two top-level categories (X and Y), and clicking a term under either one results in a URL of the form www.nameofsite.com/X/<X-type term> or www.nameofsite.com/Y/<Y-type term>. Technically, it is correct to group all X-type terms under X and Y-type terms under Y, but we could probably be more granular and break all articles into 5-6 top-level categories by splitting Y up into more specific categories. Given that the current URL structure will eventually result in thousands of X-type terms and Y-type terms under just two top-level categories, would it be more advisable to have several top-level categories, as suggested? Thank you in advance.

    Read the article

  • Hide from Google while developing

    - by user210757
    I will be building a (WordPress) website. While I am developing it, other team members will be pushing content. I'd like to have it hidden from Google while it is under development. It will be hosted on GoDaddy. I have thought of not pointing the domain name to it until it goes live and using "preview DNS", or buying a static IP for use during development. Or hosting the dev site in a subdirectory ("/dev/") until it is ready and then moving it up a level; if it were in the dev directory, I'd add .htaccess rules or a robots.txt to prevent crawling. Is any of this a bad idea? Will Google penalize me for any of this - for example, indexing the site by IP and then associating that with the domain later on? Any better ideas?

    Read the article

  • Actinic and Google Analytics: does it mess with my stats?

    - by tjcss
    My father's website is built using Actinic - not by me but by a local company - and since he went with them some years ago his traffic has never gone down, but has stayed more or less consistent, which is fine. My question is this: does using Actinic somehow confuse Analytics? It shows that 99% of all visitors come "direct", as in not via organic search. Before using Actinic, he would get 70 to 85% of new hits exclusively from organic search terms. So I'm wondering if Actinic somehow messes with these new hits and redirects them to a "home" page. I'm not sure exactly what I mean, but this change in stats is concerning and I'm struggling to find an explanation.

    Read the article

  • How to prevent search engines from indexing a section of a page?

    - by BrunoLM
    I have many pages with lots of text on them. But I will always have two sections of text, and I want to prevent one section from appearing in search results while the other section must be indexed:

        <p class="please-index-me">text</p>
        <p class="get-out">never index me please</p>

    I thought that maybe if I loaded the "please don't index me" text with JavaScript, search engines wouldn't look for it, but I am not sure that would work and it is not really a clean solution. I was wondering if there is a way to tell search engines, "hey, this text you can't grab, move on". So, is there a way to do it?
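    One workaround that is often suggested, shown here only as a sketch with a hypothetical file name and no guarantee from the search engines: move the excluded text to its own URL, disallow that URL in robots.txt, and pull it into the page with an iframe, so crawlers never fetch that text as part of the page.

        <p class="please-index-me">text</p>
        <!-- Hypothetical: /private/get-out.html holds the excluded text and is disallowed in robots.txt -->
        <iframe src="/private/get-out.html"></iframe>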

    Read the article

  • How long before Google updates the search terms matching my website?

    - by Camran
    I have a website whose title I changed about a month ago. It is a dynamic classifieds website built with PHP. The title changed from "Free classifieds" to "buy and sell free classifieds". The strange part is that after about two weeks the title shown in Google search results changed to the new title, BUT when I searched for "buy and sell free classifieds" my website didn't show up at all. I have gone through over 30 pages of search results and my site isn't listed. However, searching for "free classifieds" still displays my website at the same position it held before the title change. Any reason for this? How patient should I be? FYI, the website has a sitemap submitted and kept up to date, good meta tags, and is W3C valid, etc., so that is not the problem here. Thanks

    Read the article
