Search Results

Search found 1317 results on 53 pages for 'webmaster sean'.

Page 3/53 | < Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • Google Webmaster Tools is showing duplicate URLs based on page title differences

    - by Praveen Reddy
    I have 700+ title-tag duplicates showing in WMT. In each reported pair, the first link is a duplicate of the second one. I don't know how the first link got indexed by Google, since that link doesn't exist on the site. WMT shows the title of every page as the link. Original link: http://www.sitename.com/job/407/Swedish-plus-Any-other-Nordic-Language-Customer-Service-Representative-Dublin-Ireland. Duplicate link: http://www.sitename.com/job/407/Swedish-plus-Any-other-Nordic-Language-Customer-Service-Representative-Dublin-Ireland-Ireland. How can this happen? I have checked the entire site and couldn't find anywhere that the second version is linked. I have no images linked with the duplicated version of the URL, either.
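
    For reference, the usual way to tell Google which of two such near-identical URLs is the real one is a canonical link in the page head; a minimal sketch, reusing the original URL above:

        <head>
          <!-- canonical link on the job page, pointing at the intended URL -->
          <link rel="canonical" href="http://www.sitename.com/job/407/Swedish-plus-Any-other-Nordic-Language-Customer-Service-Representative-Dublin-Ireland" />
        </head>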

    Read the article

  • Google Webmaster Tools: parameters that only apply to some pages

    - by Imagine digital
    I'm trying to get my e-commerce website into Google and am still figuring out how it all works. I have seen the feature named URL Parameters, which allows me to set parameters that affect the page content to be indexed (one can also flag parameters that do not affect the page, but that doesn't apply to me). My question is whether and how I should add parameters that only exist on some pages of my site. Example: the homepage of my site is www.mysite.nl, with no parameters at all. But when a user clicks the navigation bar, it links to www.mysite.nl/itemList.php?category=...&subCategory=... The category and subCategory parameters define whether there is content on my itemList page and what that content is: matching products are pulled from my database based on those two variables. The question: how do I apply Google's URL Parameters feature correctly for my website?
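
    For what it's worth, a common companion to the URL Parameters tool is a canonical link on each parameterized listing page, so stray or reordered parameters collapse to one indexable URL; a minimal sketch, with hypothetical parameter values:

        <!-- in the head of itemList.php, assuming hypothetical values category=shoes, subCategory=boots -->
        <link rel="canonical" href="http://www.mysite.nl/itemList.php?category=shoes&amp;subCategory=boots" />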

    Read the article

  • Google Webmaster Tools showing duplicate meta descriptions for search directory

    - by Mike Flynn
    What is the best way to get rid of this error in Google Webmaster Tools? Do I really need to add "- Page 2" at the end of the description? The report looks like this:

        Description: Kansas basketball tournaments posted by organizations and teams for youth, AAU, and NCAA certified e…
        Pages:
          /youth-basketball-tournaments/kansas
          /youth-basketball-tournaments/kansas?page=2
          /youth-basketball-tournaments/kansas?page=3
          /youth-basketball-tournaments/kansas?page=9
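
    For reference, the usual fixes for paginated duplicates like this are page-specific descriptions plus rel="prev"/rel="next" links; a minimal sketch for the page-2 URL above (the description text is illustrative):

        <!-- in the head of /youth-basketball-tournaments/kansas?page=2 -->
        <meta name="description" content="Kansas basketball tournaments for youth, AAU, and NCAA teams - Page 2" />
        <link rel="prev" href="/youth-basketball-tournaments/kansas" />
        <link rel="next" href="/youth-basketball-tournaments/kansas?page=3" />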

    Read the article

  • Google Webmaster Tools and geotargeting at TLD and folder levels

    - by user3390918
    Hoping you can help us with this question: we are one of Canada's leading websites, with a PR8. For privacy's sake, let's call our main URL www.OurMainWebsite.com. As we expand globally, we are planning to build a website to serve the US market, but we want to keep the domain above as is from a branding perspective. Questions: Can we keep www.OurMainWebsite.com as the main Canada site and create www.OurMainWebsite.com/US as the US site? Can those two URLs be geotargeted as described, and wouldn't the fact that /US is a subfolder of the main TLD throw things off? Can I target a TLD to one country and a subfolder under that TLD to another? Thanks in advance for your help.
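
    For reference, a TLD-plus-subfolder split like this is usually paired with hreflang annotations so Google can tell the two markets apart; a minimal sketch, assuming English content for both regions:

        <!-- in the head of each version of a page -->
        <link rel="alternate" hreflang="en-ca" href="http://www.OurMainWebsite.com/" />
        <link rel="alternate" hreflang="en-us" href="http://www.OurMainWebsite.com/US/" />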

    Read the article

  • Still no structured data detected in Google Webmaster Tools [on hold]

    - by user6211
    Can you give me some suggestions as to what's wrong with my structured data? Google still cannot read it. It looks like this:

        <div class="identity">
          <div itemscope itemtype="http://schema.org/LocalBusiness">
            <a itemprop="url" href="http://MYDOMAIN.co.uk/"><div itemprop="name"><strong>MY_COMPANY</strong></div></a>
            <div itemprop="address" itemscope itemtype="http://schema.org/PostalAddress">
              <span itemprop="streetAddress">MY_ADDRESS</span>,
              <span itemprop="addressLocality">London</span>,
              <span itemprop="postalCode">SE5 MY_XYZ</span>,
              <span itemprop="addressCountry">UK</span>
            </div>
          </div>
        </div>

    Read the article

  • Change from static HTML file to meta tag for Google Webmaster verification

    - by Wilfred Springer
    I started verifying the server by putting a couple of static HTML files in place. Then I noticed that Google wants you to keep these files in place permanently. I didn't want to leave the static HTML files in, so I want to switch to an alternative verification mechanism and include the meta tag on the home page. Unfortunately, once your site is verified, you never seem to be able to change to an alternative way of verification. I tried removing the HTML pages. No luck whatsoever; Google still considers the site to be 'verified'. Does anybody know how to undo this? All I want to do is switch to the meta-tag-based method of site ownership verification.
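
    For reference, the meta-tag method puts a tag like the following in the head of the home page (the content value here is a placeholder; Webmaster Tools generates the real token):

        <meta name="google-site-verification" content="YOUR_VERIFICATION_TOKEN" />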

    Read the article

  • How to delete an Analytics property from the list in Webmaster Tools?

    - by toxalot
    When I look at the Google Analytics Property page in Webmaster Tools (where you associate a Google Analytics web property with a Webmaster Tools site), I see a list of a bunch of properties with weird names. They are test properties from when we were trying things out years ago. I can't view them or change their name in Analytics because they don't exist there. All their profiles were deleted years ago. Is there a way to remove these from the list in Webmaster Tools? Or, alternatively, a way to view them again in Analytics so I can give them a better name, at least. I know this doesn't matter in the big scheme of things, but I hate clutter.

    Read the article

  • Using Google Webmaster & Analytics, what data to look at to improve website performance?

    - by Rob
    Using data from Google Analytics and Webmaster Tools, what data should I be looking at to improve my website's performance? I want to improve the SEO, usability, and general performance of my website. EDIT: It's a portfolio website that we've done the initial SEO for; we've also optimised all images etc. and made the site as fast as possible. What kind of things should I be looking out for in the Analytics and Webmaster data to improve performance for both the SEO and each individual page?

    Read the article

  • How exactly is Google Webmaster Tools measuring "Site Performance"?

    - by Rémi
    I've been working for two months now on improving our response time (mainly server-side) on a new forum (a brand-new product from a technical point of view) we launched in Germany a few months ago, and I'm quite surprised by the results I get. I monitor our response time using Apache logs and our own implementation of a Boomerang beacon. Using my stats, I can see that our new product responds in about 680 ms where our old product was responding in about 1050 ms. On the other hand, Google Webmaster Tools tells us that our pages have an average response time of about 1500 ms today, where it was 700 ms three months ago with our old product.

    I figured that GWT was taking client-side metrics into account, so I added some measurements to our Boomerang beacon, and everything looks just fine. I've also run some random pages through YSlow and Google's Page Speed, and everything looks better than it was before. We even have an 82% on Google's Page Speed tool, which is quite good for a site with some ads in it :)

    Lately, we signed a deal with Akamai to use two of their products: CDN for our static files (we were using another CDN before, but it wasn't very effective) and RMA to improve network routes. We have also introduced a new aggressive cache mechanism to ensure that most of the pages served to crawlers are cached by our memcache grid. After checking my metrics, it seems that these changes have improved response time from 650 ms to about 500 ms, which is good (still not great, but it is definitely an improvement). Yet Webmaster Tools continues to report an increasing average response time where we see it decreasing over the same period.

    Have you ever seen this kind of weird behavior on your sites while doing performance improvements? Do you have any idea how to monitor the same thing Google does with Site Performance in Google Webmaster Tools, so that we could improve our site and constantly check whether it is what Google wants?

    Edit 2011/07/26: Thanks for your answers, guys! Nevertheless, I was not precise enough. The main issue we have is not with the Site Performance page but with the Crawl Stats one, for now. We probably found an issue on our side with some very slow pages (around 3000 ms!) and we are trying to fix them. I'll keep you posted as soon as I have more info. Thanks again!

    Read the article

  • Google Webmaster Tools showing 6 pages submitted, 0 indexed, yet I can see them all when I search in Google?

    - by sam
    I have a small 'brochure'-type site with 6 pages; I can see all the pages showing up in Google when I search for my site. But in Webmaster Tools, under the Sitemaps section, it says 6 submitted (the blue bar of the graph), while indexed pages (the red bar) shows 0, even though the pages do seem to be indexed. Any idea why this is? I don't really think it's that important, as the pages are still indexed, but it just seems odd.

    UPDATE 9/3/12: Having just looked in Google Webmaster Tools, it's showing 11 pages indexed under the Health > Index Status tab, but under the Optimization > Sitemaps tab it shows 6 URLs submitted and only 1 indexed. Please see below. [Screenshots: Index Status and Sitemap status]
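
    For reference, the "submitted" count simply reflects the number of <url> entries in the sitemap file; a minimal sketch of a six-URL sitemap, with hypothetical addresses:

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
          <url><loc>http://www.example.com/</loc></url>
          <url><loc>http://www.example.com/about</loc></url>
          <!-- ...one <url> entry per page, six in total... -->
        </urlset>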

    Read the article

  • How can I set parameters in Google Webmaster Tools so that my dynamic content is indexed?

    - by Werewolf
    I have read the questions about URL parameters in Google Webmaster Tools on this site and in the Google Webmaster Help Center, but I still have a problem. My site searches the database and shows some information. These two URLs display some data: http://mydomain.com/index.aspx?category=business http://mydomain.com/index.aspx?category=graphic&City=Paris In the URL Parameters section, I can only define the parameter category; how can Google detect the proper values (business, graphics, real estate...)? Not every word is a valid search value. And if my page name is default.aspx or anything else, where should I define it? If I use URL rewriting, like http://mydomain.com/search/category/business, must my settings change?

    Read the article

  • Google Webmaster Tools: soft 404 on a 301

    - by Daniel
    I see in Google Webmaster Tools that my page is generating soft 404 errors (https://support.google.com/webmasters/answer/181708?hl=en). Google says: "We recommend that you always return a 404 (Not found) or a 410 (Gone) response code in response to a request for a non-existing page." But I've got redirects set up that send the old pages to the proper new pages using a 301. The website's links changed because we adopted a framework, which makes them more consistent, but links to the old pages are still out there. Should I be worried about this? Is Google penalizing the site for it? (Using IIS 8, Tomcat, CF10, Win)

    Read the article

  • Why would URLs submitted in Google Webmaster Tools drop to 0?

    - by ambient
    Why would URLs submitted in Google Webmaster Tools drop to 0? It's a small site, only about 20 pages. I submitted the XML sitemap, and for about a week it said 20 URLs submitted. A day or so ago it had indexed about 17 of the pages, but today it not only says that 0 are indexed but also that 0 have been submitted. I did a site: search on Google and found that the pages are clearly indexed. Is this just an error in Google Webmaster Tools? Any help or thoughts would be appreciated. Thanks!

    Read the article

  • In Google webmaster tools, can a "soft 404" be triggered by the text on the page?

    - by Stephen Ostermiller
    I just ran across an error in Google Webmaster Tools that I have never seen before. I manage the website for my local community band (I play trombone). One of the pages on the site is a list of our upcoming performances. It is powered by a WordPress events plugin that uses a database of upcoming events entered through the administration interface. We just finished our summer and fall concerts, and our next performance will be our Christmas concert. I hadn't gotten around to adding that to the website yet, so there are no upcoming events shown on the page. In fact, the text on the page says: "No upcoming events listed under Performance. Check out past events for this category or view the full calendar." Yet in Google Webmaster Tools, this page is showing up as a "soft 404": the page is returning a 200 status, and Google is indicating that the 404 is "soft". I wouldn't have expected Googlebot to be sophisticated enough to parse that particular sentence. Is Googlebot able to detect that the text on the page indicates there is currently no content, and then treat it as a 404 page because of that? And if Google is treating this page as a soft 404 because of the text on the page, does that mean that, like regular 404 pages, the page won't show up in search results?

    Read the article

  • Cannot submit change of address to subdomain in Google Webmaster Tools?

    - by RCNeil
    I am pointing several domains to one URL, a URL which happens to include a subdomain. All of the domains use 301 redirects to point to this new address. One of the older domains (which used to be a site) is a 'property' in Webmaster Tools, as is the new site (the one with the subdomain). When registering a 'Change of Address' for the old site, Webmaster Tools suggests the following method:

    1. Set up your content on your new domain. (done)
    2. Redirect content from your old site using 301 redirects. (done)
    3. Add and verify your new site in Webmaster Tools. (done)

    Then, directly below that, to proceed, it says: "Tell us the URL of your new domain." But instead I get: "Your account doesn't contain any sites we can use for a change of address. Add and verify the new site, then try again." I have already submitted and verified the new site. The only reason I can fathom for this error is that the new site includes a subdomain. Although I don't foresee getting punished for this, as I am correctly 301-redirecting traffic anyway, I'm curious as to why the Change of Address submission isn't working for me. Has anyone else had experience with this?

    Read the article

  • Why are the stats for HTML Improvements in Google Webmaster Tools not decreasing?

    - by Kookoriko
    I have read that resolved HTML Improvements can take as long as 6 weeks to disappear from Google Webmaster Tools, but those numbers seem to increase without ever decreasing, even though I've been fixing almost everything that Google points out. I have checked some pages with the Fetch as Google tool, and the issues are resolved ("short meta descriptions", for example). Any idea why this is happening?

    Read the article

  • How to log in to Google Webmaster Tools with XMLHTTP

    - by darkandcold
    Hello, I have tried many times but I couldn't get it to work. I am trying to log in to Google Webmaster Tools to get the Search Queries list (top 20). I used XMLHTTP and AspTear, but no luck: it says my browser isn't cookie-enabled. How can I log in to Google Webmaster Tools via XMLHTTP with cookies enabled? Does XMLHTTP have any parameter for cookies?

    Read the article

  • Why is Google Webmaster Tools crawling invalid URLs and showing 500 errors?

    - by Amos Kane
    Google Webmaster Tools is reporting 12k+ 500 errors. Eeek! None of the URLs are valid; they all contain www.youtube.com. First, why is Google crawling these URLs if they don't exist? I supplied a sitemap, and they are of course not in the sitemap. I don't have a robots.txt blocking anything. I've checked for invalid redirects: none. I've also checked for unclosed tags or anything else that would throw www.youtube.com into a URL by accident: none. In every 'linked from' entry, the referring URL is also a bad URL with www.youtube.com in it. The Google tools report no malware, and I can't check the server logs because the host won't give me access. Really stuck! Any ideas appreciated!

    Read the article

  • How much time does Google Webmaster Tools need to generate content keywords if URL masking is enabled? [closed]

    - by user1439968
    Possible Duplicate: What is domain "masking" or "cloaking"? Why should it be avoided for a new web site? My real domain is domain.in, but URL masking has been enabled and the masked URL is domain2.in. I added the URL bputdoubts.21backlogs.in to Google Webmaster Tools a week ago, but content keywords haven't been generated. In this case, when can I expect the content keywords to be generated? And is it a problem for getting visitors from Google search if URL masking is enabled?

    Read the article

  • Authorship-verified website not included in "Author Stats" of Google Webmaster Tools?

    - by Yosi Mor
    In Google Webmaster Tools, is it normal for a website for which the Structured Data Testing Tool shows that "Authorship is working for this webpage" not to be listed in the "Author Stats" section (under "Labs")? I understand that successful verification with the Structured Data Testing Tool does not guarantee that Google will actually display authorship in the SERPs, and that Google decides this at its own discretion. However, shouldn't successful verification at least guarantee that the website is included in the "Author Stats" section (which purportedly covers "pages for which you are the verified author")? I would have assumed that, if Google is not yet displaying authorship for that site, it would show both its impressions and clicks as "<10". Are my assumptions incorrect?
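
    For reference, authorship verification in this era links the page and the Google+ profile in both directions; a minimal sketch, with a placeholder profile ID:

        <!-- on the webpage -->
        <link rel="author" href="https://plus.google.com/YOUR_PROFILE_ID" />
        <!-- and the Google+ profile lists the site under "Contributor to" -->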

    Read the article

  • Adding users to multiple/all sites in Google Webmaster Tools?

    - by Christian
    I didn't find an answer to this, but maybe I didn't use the right terms for my search. So I'm sorry if this is a duplicate. Anyway, my situation is this: the company I work at manages a lot of sites (100+), and we've recently put them all into Google Webmaster Tools under my Google account, which was tedious enough. Now two coworkers are supposed to be added as users for each site, so they can see the data and manage stuff there as well. But I can only find an option to add users for a single site, not for all sites that are currently associated with my account. Do I really have to go through more than a hundred sites one by one and add the two users to each of them, or is there some way to add both users to all/multiple sites at once?

    Read the article

  • How many Google +1's does a website need for Google Webmaster Tools to show characteristics?

    - by Asaph
    I have added the Google +1 button to my website and discovered the new Social Activity section in Google Webmaster Tools. Apparently, one of the interesting things you can learn about your audience is demographic data. But in GWT, the Social Activity > Audience section for my site (currently 82 +1's) says the following: "Your site doesn't have enough +1's yet to show characteristics." I'm not sure how many +1's are enough, though, and Google's official help page for the Audience section offers little insight: "The Audience page displays information about people who have +1'd your pages, including the total number of unique users, their location, and their age and gender. All information is anonymized; Google doesn't share personal information about people who have +1'd your pages. To protect privacy, Google won't display age, gender, or location data unless a certain minimum number of people have +1'd your content." But what is that "certain minimum number"? I've tried Googling this, but all I could find to date was this page, which doesn't answer the question. So how many +1's does a site need before GWT will show me audience demographic characteristics?
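
    For reference, the standard +1 button embed of the time looked like this; a minimal sketch (placement on the page is up to the site):

        <script type="text/javascript" src="https://apis.google.com/js/plusone.js"></script>
        <div class="g-plusone" data-size="medium"></div>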

    Read the article

  • How can I decrease relevancy of Creative Commons footer text? (In Google Webmaster Tools)

    - by anonymous coward
    I know that I may just have to link the image to make this happen, but I figured it was worth asking, just in case there's some other semantic markup or tips I could use... I have a site that uses the textual Creative Commons blurb in the footer. The markup is like so:

        <div class="footer">
          <!-- snip -->
          <!-- Creative Commons License -->
          <a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/3.0/us/"><img alt="Creative Commons License" style="border-width:0" src="http://i.creativecommons.org/l/by-nc-sa/3.0/us/80x15.png" /></a><br />
          This work by <a xmlns:cc="http://creativecommons.org/ns#" href="http://www.xmemphisx.com/" property="cc:attributionName" rel="cc:attributionURL">xMEMPHISx.com</a> is licensed under a
          <a rel="license" href="http://creativecommons.org/licenses/by-nc-sa/3.0/us/">Creative Commons Attribution-Noncommercial-Share Alike 3.0 United States License</a>.
          <!-- /Creative Commons License -->
        </div>

    Within Google Webmaster Tools, the list of relevant keywords is heavily saturated with text from that blurb. For instance, 50% of my top ten most relevant keywords (including the site name) come from it:

        [site name], license, [keyword], commons, creative, [keyword], alike, [keyword], attribution, [keyword]

    I have not done any extensive testing to find out whether this list even matters, and so far this doesn't impact performance in any way. The site is well designed for humans, and it is as findable as it needs to be at the moment. But, mostly out of curiosity: do you have any tips for decreasing the relevancy of the text from the Creative Commons footer blurb?

    Read the article

  • Why does Bing webmaster tool's SEO analyzer complain about multiple <h1> tags?

    - by Mathew Foscarini
    I used the Bing Webmaster Tools SEO analyzer on my website, and it reported: "There are multiple <h1> tags on the page." It recommends that there be only one <h1> tag per page. The page is a listing of blog posts for a category, so each blog entry is structured like this:

        <article>
          <header><h1><a>...</a></h1></header>
          <p>summary...</p>
        </article>
        <article>
          <header><h1><a>...</a></h1></header>
          <p>summary...</p>
        </article>
        <article>
          <header><h1><a>...</a></h1></header>
          <p>summary...</p>
        </article>
        <article>
          <header><h1><a>...</a></h1></header>
          <p>summary...</p>
        </article>

    How is this invalid? I thought this was the correct way to describe a post in HTML5.

    Read the article

  • How to fix this specific Google "Fetch as Googlebot" error appearing in my Webmaster Tools?

    - by UXdesigner
    Good day, I'm currently trying to find out why my website has lost all of its rank in Google. I don't even appear in Google results for the domain, although other sites that link to me do appear in the results. I think it's because I left my site alone for two months and came back to find 20k comment-spam entries, which I completely deleted and then dealt with by adding filters and a new Disqus comment service. The thing is, I added my site to Google Webmaster Tools and I'm finding several awful things. For example, when I click Fetch as Googlebot, I receive the error message below in response to my request, and I don't even know what the real problem is or how to fix it. I simply don't get it. This is what appears:

        Date: Wednesday, July 20, 2011 9:43:35 AM PDT
        Googlebot Type: Web
        Download Time (in milliseconds): 55

        HTTP/1.1 403 Forbidden
        Date: Wed, 20 Jul 2011 16:43:36 GMT
        Server: Apache
        Vary: Accept-Encoding
        Content-Encoding: gzip
        Content-Length: 248
        Keep-Alive: timeout=2, max=100
        Connection: Keep-Alive
        Content-Type: text/html; charset=iso-8859-1

        403 Forbidden
        Forbidden
        You don't have permission to access / on this server.
        Additionally, a 403 Forbidden error was encountered while trying to use an ErrorDocument to handle the request.

    Do you guys know anything about this problem? I need to have Google crawl my site again. I had really nice Google results for the past three years. Now there's nothing. Thanks,

    Read the article
