Search Results

Search found 1317 results on 53 pages for 'webmaster sean'.

Page 5/53 | < Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >

  • Tomorrow: Profit Rides into the DANGER ZONE!!!

    - by Aaron Lazenby
    On May 4 I'll be suiting up with Oracle social media maven Marius Ciortea -- Iceman and Maverick style -- for a flight in the Team Oracle stunt plane. World-renowned pilot Sean Tucker and his team were nice enough to invite us along to participate in aerial photo shoots over Oracle headquarters and the San Francisco Bay. I don't think we'll be able to recreate the epic tension generated between Tom Cruise and Val Kilmer in "Top Gun", but we'll do our best to get some good photos, videos, and interviews along the way. Check back on Wednesday for a full report.

    Read the article

  • Google Webmaster Tools Index dropped to Zero [closed]

    - by Brian Anderson
    Earlier this year I rebuilt my website using ZenCart. Immediately I saw a drop in index status from 59 pages to 0. I then signed up for Google Webmaster Tools and watched the Index Status take a dramatic drop from which it has never recovered. I have worked to add content, and I know I am not done, but I have not seen any recovery of the index since. What confuses me is that the sitemap status under Optimization shows 1239 pages submitted and 1127 indexed. Most of my pages have fallen off page one for relevant search terms, and some are as far back as page 7 or 8 where they used to be on the first page. I made some changes to robots.txt and sitemap.xml in the past week but have not seen any improvement. Can anyone tell me what might be going on here? My website is andersonpens.net. Thanks! Brian

    Read the article

  • A drop in SERP after following webmaster guidelines [on hold]

    - by digiwig
    So here's a puzzle for all you SEO gurus out there. I recently launched my own site. I had target keywords which were ranking very well for about a month, within the top five and even appearing in first place. In an attempt to maintain good positioning, I followed the guidelines: I added a robots.txt and an XML sitemap, redirected non-www to www, redirected index.php to the root domain, added .htaccess 301 redirects for old pages, added rich snippets, created a Google+ account and verified my picture, worked through each of the Webmaster Tools issues with duplicate titles and meta descriptions, improved the header-tag document outlines, and even created a few more blog posts to keep the content fresh and moving. So now my website appears on page 2 for my target keywords - and all after I followed the guidelines. What is happening? I see competitors with stagnant content superglued to position 1.
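
    For reference, a minimal .htaccess sketch of the redirects described above (an illustration only, assuming Apache with mod_rewrite; example.com stands in for the poster's domain):

        RewriteEngine On
        # 301 non-www requests to the www host
        RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
        RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
        # Collapse /index.php onto the root domain
        RewriteRule ^index\.php$ / [R=301,L]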

    Read the article

  • Googlebot sees “Sorry, we have no imagery here” on pages with Google Maps

    - by friism
    I have a site with Google Maps on most of the pages. When inspecting content keywords in Google Webmaster Tools, the keywords Googlebot identifies for the site include "imagery", "sorry" and "here". These turn out to be part of an error message returned by Google Maps: "Sorry, we have no imagery here". I cannot reproduce this error with normal clients, nor does "fetch as Google" show it. The problem is presumably that Googlebot tries to execute some of the Google Maps JavaScript but then shoots itself in the foot and records the error message. A Google search for "Sorry, we have no imagery here" shows that this problem is endemic to sites across the internet, including Yelp and many others. I'd like to convince Google that my site is not about imagery and being sorry, but I'd also like to keep the maps in place. I guess one option would be to transition to static maps, but that's not a great alternative. There is some related discussion on Webmaster World, but no resolution.
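
    One hedged sketch of the static-map idea mentioned above, done as progressive enhancement so crawlers and users get the same markup (coordinates and the element ID are placeholders; an API key or sensor parameter may also be required):

        <div id="map">
          <!-- Crawlers index this static image; client-side JS can swap in the interactive map -->
          <img src="https://maps.googleapis.com/maps/api/staticmap?center=37.53,-122.25&amp;zoom=14&amp;size=600x400"
               alt="Map of the listing location">
        </div>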

    Read the article

  • Google is not indexing my entire site despite having a sitemap

    - by Anusha
    I have an e-commerce website, www.beyondtime.in. I have been constantly monitoring Googlebot crawling on my website and my Webmaster Tools account. Lately, I have found two issues that I have not been able to understand. 1.) Googlebot has only been crawling www.beyondtime.in/telecom.php, even though that URL is not valid. What needs to be done to let Google crawl the other pages of the website as well? 2.) The second question is about the Google Webmaster account, where I've submitted my sitemap with 227 URLs. Of those, only 156 have been indexed, and none of the images on my website have been indexed by Google.

    Read the article

  • How to remove an old robots.txt from Google when the old file blocks the whole site

    - by KnowledgeSeeker
    I have a website which still shows the old robots.txt in Google Webmaster Tools:

        User-agent: *
        Disallow: /

    This blocks Googlebot from the whole site. I removed the old file and uploaded a new robots.txt with almost full access yesterday, but Webmaster Tools is still showing me the old version. The contents of the latest copy are:

        User-agent: *
        Disallow: /flipbook/
        Disallow: /SliderImage/
        Disallow: /UserControls/
        Disallow: /Scripts/
        Disallow: /PDF/
        Disallow: /dropdown/

    I submitted a request to remove the file using Google Webmaster Tools, but the request was denied. I would appreciate it if someone could tell me how to clear the old version from Google's cache and make Google read the latest robots.txt file.
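
    A hedged aside, not from the original question: Googlebot generally re-fetches robots.txt about once a day, so a stale copy usually clears on its own within a day or so. If an upstream cache is suspected, one sketch is to send explicit cache headers for the file (assuming Apache with mod_headers enabled):

        <Files "robots.txt">
          # Ask caches to revalidate the file after an hour
          Header set Cache-Control "max-age=3600, must-revalidate"
        </Files>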

    Read the article

  • Move site to new domain divided by language across subdomains

    - by mark
    I managed to find a nice domain for a fairly fledgling site of mine that actually hasn't been parked by scumbag squatters. Given the upcoming move I'm thinking I'd take the opportunity to split the content across subdomains according to language, much like Wikipedia, for example:

        current:
        www.old-domain.com/en/subject    # English
        www.old-domain.com/subjecto      # Spanish (default, so no locale in URL)

        proposed:
        en.new-domain.com/subject
        es.new-domain.com/subjecto

    The advantage of doing this is a fairly competitive keyword, such that I may wish to put a copy of my application on a Spanish slice in order to gain a few SERPs. Also pure vanity. Google's Webmaster Tools allows me to move to the new domain, and I can add the root domain and the subdomains but forward to only one. I'll 301 from the old domain appropriately, but is there anything I should know about Webmaster Tools in this respect, where effectively I'm moving to two addresses? (Feel free to dissuade me from doing this if it's a bad idea in comments.)
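
    A minimal sketch of the old-domain 301s described above (an illustration, assuming Apache with mod_rewrite in the old domain's .htaccess):

        RewriteEngine On
        # English content lived under /en/ on the old domain
        RewriteRule ^en/(.*)$ http://en.new-domain.com/$1 [R=301,L]
        # Everything else was the Spanish default
        RewriteRule ^(.*)$ http://es.new-domain.com/$1 [R=301,L]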

    Read the article

  • Google not recognizing microdata? [duplicate]

    - by user1795832
    This question already has an answer here: How long for data highlighter mark up to appear in structured data tool? I added schema.org microdata to one page of a site I help manage. Using the Google Webmaster Tools test, the page checks out and the tool displays the microdata properly. But when I go to the Structured Data page in Webmaster Tools, it keeps saying the site does not have any. I put the markup in 2 weeks ago. Is it just something that takes a while to be recognized? Or does microdata have to be on every page for it to be recognized?
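
    For reference, a minimal schema.org microdata sketch of the kind of markup being described (a hypothetical Organization snippet, not the poster's actual code):

        <div itemscope itemtype="http://schema.org/Organization">
          <span itemprop="name">Acme Widgets</span>
          <a itemprop="url" href="http://www.example.com/">Home</a>
        </div>

    The Structured Data report is known to lag the live testing tool, sometimes by weeks, so a two-week delay alone is not unusual.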

    Read the article

  • What does Enable/Disable mean in Bing's URL Normalization feature?

    - by DisgruntledGoat
    I'm in Bing Webmaster Tools, under Index URL Normalization. Many parameters are listed in the table with 3 other columns: Status, Source, Date. The "Source" column says "Webmaster" where I have added parameters, and "Bing" where I assume the parameter has been auto-detected. "Date" is probably the last date it detected the parameter. I've tried searching the help files but I can't find what the Status column means. The top of the page says: This feature allows you to specify query parameters for Bing’s crawler to ignore. But it's not clear whether "Enable" or "Disable" is related to this, and if so what happens in each case. Does anyone know?

    Read the article

  • What is the best way to deal with 404s from an external site that all point to the same page?

    - by Lee
    I started getting 404s showing up in my Google Webmaster Tools from a site linking to a specific category but with odd characters at the end of the URL - something like this: http://example.com/category/puppies%EF%BC%9A.textwidget%E8%A6%81%E7%B4%A0%E7%B7%A8%E9%9B%86. Google Webmaster Tools says there are about 120 of these links, and I imagine more will come. What is the best way to handle these links from an SEO point of view? I have heard that 301-redirecting too many links at one time can cause Google to ding the site, but I don't want the other site to keep producing broken links into mine. Any help on this would be appreciated.
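
    One hedged option, an assumption rather than anything from the original post: send any URL that starts with the real category path but carries trailing junk back to the clean page with a single pattern, e.g. with Apache's mod_alias:

        # Catch /category/puppies<anything-extra>; tighten the pattern
        # if legitimate deeper URLs exist under the category
        RedirectMatch 301 ^/category/puppies.+ http://example.com/category/puppies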

    Read the article

  • Redirect subdomain (weblog) to new domain without access to .htaccess

    - by fafa
    I have a problem that I can't find a solution for on the web. I have a blog with PR 1 at the subdomain "aaaa.domain.com", where "domain.com" is a hosted blog service. Now I want to buy a domain, "newdomain.com", and tell Google Webmaster Tools to redirect the old subdomain to this new domain and send its traffic there. I can't access .htaccess to use a 301 redirect; the only thing I can do is put code in the HTML. How can I do this? When I use "Change of Address" in Google Webmaster Tools it says: "Restricted to root level domains only".
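
    Without server access, the usual fallback is an HTML-only hint in the <head> of each old page - a sketch, with newdomain.com standing in for the real target; it passes weaker signals than a true 301:

        <!-- Point search engines at the new home, then send visitors there -->
        <link rel="canonical" href="http://www.newdomain.com/">
        <meta http-equiv="refresh" content="0; url=http://www.newdomain.com/">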

    Read the article

  • Geographic location settings

    - by JochemTheSchoolKid
    I am building a website on a .nl domain, and right now it only shows up on google.nl. I hope to make it findable in all Google versions (google.com, google.co.uk, and so on). The Google forums say to go to Webmaster Tools and change your geographic target there, but I have added the site and am not able to change it: there is no select box. I have no idea where else to search (yes, I searched on Google, of course) or where to ask about this particular problem, so maybe someone here can redirect me or explain what is possible and what is not. The question is: can I make a .nl domain findable in (almost) all Google search sites, and if so, how? Picture of my Google Webmaster Tools (nl): http://i.stack.imgur.com/ZuP4L.png

    Read the article

  • Moved sitemaps to a different subdomain and losing search referrals around the same time. Red herring or correlation?

    - by er1234
    We started to lose search referral traffic around the same time that I moved some of our sitemaps to a subdomain. Could this have hurt us? I followed Google's steps to creating a sitemap under a different subdomain. The new sitemaps.foo.com subdomain is being crawled and indexed well. Both www.foo.com and sitemaps.foo.com have been verified in Google Webmaster Tools. They appear as distinct sites. Is this correct? I can't find a way in Webmaster Tools to say "Hey, sitemaps.foo.com is really owned by www.foo.com, so show them together and make sure to attribute sitemaps.foo urls to www.foo". Our www.foo.com/robots.txt:

        Sitemap: http://www.foo.com/sitemap.xml
        Sitemap: http://sitemaps.foo.com/subdir/sitemap.xml.gz

    Read the article

  • My website is not getting any traffic. How do I get traffic? [closed]

    - by Divyanshu Negi
    Possible Duplicate: How can I increase the traffic to my site? I have done pretty decent SEO on my website, which is built in PHP; anyone can submit an article, and submissions are moderated by the moderators. The site has been online for more than a month, but the user count is still only 10-20, and total impressions are 700 according to Google Webmaster Tools. How long does Webmaster Tools take to refresh its data? For the last 3-4 days the impressions shown in the dashboard have stayed at 700. I am posting 2 articles each day. Please help; I am very disappointed after all my effort and really need some motivation to carry on my work. My website URL is http://www.viewloud.com

    Read the article

  • Issue with sitemap in GWT

    - by Anusha
    I have an e-commerce website, www.beyondtime.in. I have been constantly monitoring Googlebot crawling on my website and my Webmaster account. Lately, I have found two issues that I have not been able to understand and hence want your help with. 1.) Googlebot has only been crawling www.beyondtime.in/telecom.php, when that URL is not even valid, so kindly help me understand what needs to be done to let Google crawl the other pages of the website as well. 2.) The second question is about the Google Webmaster account, where I've submitted my sitemap with 227 URLs, but out of those only 156 have been indexed. Also, none of the images on my website have been indexed by Google. So kindly help me with this as well. Thanks

    Read the article

  • 301 redirect from a country specific domain

    - by Raj
    I originally started using a .do domain extension for my site, but later realized that this country specific domain would prevent us from appearing in search results for places outside of the Dominican Republic. We started using a .co domain extension and redirected all requests to the new domain using an HTTP 301. The "Crawl Stats" in Google Webmaster Tools shows me that the .co domain is being crawled, but the "Index Status" shows the number of pages indexed at 0. The "Crawl Stats" for the .do domain says that it's being crawled and the "Index Status" shows a number greater than 0. I also set a "Change Of Address" in Google Webmaster Tools to have the .do domain point to the new .co domain. We're still not appearing in search results at all even for very specific strings where I would expect to find us. Am I doing something wrong?
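
    For reference, a hedged sketch of the domain-level redirect described (assuming Apache on the old host; the example .do and .co names below are placeholders for the poster's actual domains):

        RewriteEngine On
        # Send everything on the old country-specific domain to the new one
        RewriteCond %{HTTP_HOST} ^(www\.)?example\.do$ [NC]
        RewriteRule ^(.*)$ http://example.co/$1 [R=301,L]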

    Read the article

  • Google indexing and ranking a custom domain served by Google App Engine

    - by Hugues
    I have a website served at "http://www.plugimmo.com", a custom domain served by Google App Engine at http://plugimmo.appspot.com. For a while I have tried to optimise Google indexing and ranking, with no success. The problem is that searching Google for the keywords in the title of my home page does not retrieve my website at all, not even in the first 1,000 results. When checking the Google cached version (cache:www.plugimmo.com), it says that the cached version is the one of 20-Aug-12 for "plugimmo.appspot.com". It looks like there are several issues: 1 - The cached version is really old. I have made a lot of changes since 20-Aug-12, and I saw the Googlebot crawling my site several times. 2 - The cached version is for "plugimmo.appspot.com". 3 - Looking at Google Webmaster Tools, I see that the number of pages indexed for www.plugimmo.com is 0, but that can't be the case given the number of changes I have made. My questions would therefore be the following: Why is the cached version so old, although I saw the Googlebot crawling the site many times since 20-Aug-12? Is there a problem with indexing a custom domain served by Google App Engine? Why is Google Webmaster Tools showing 0 pages indexed although new pages have been crawled and no errors have been reported in the indexing? Also, the site has been developed with Google Web Toolkit. I have followed the guidelines regarding crawling Ajax sites. The home page, when crawled by a robot, is redirected to http://www.plugimmo.com/HomeSnapshot.html. Thanks a lot for your help! Hugues
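
    For context, Google's Ajax crawling scheme that the poster refers to hinges on one tag in the host page (a sketch of the scheme's standard markup, not the site's actual code):

        <!-- In the <head> of the GWT host page: tells Googlebot to fetch
             the ?_escaped_fragment_= HTML snapshot and index that instead -->
        <meta name="fragment" content="!">

    HomeSnapshot.html appears to be playing the snapshot role here; the question is whether the crawler is being served it correctly.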

    Read the article

  • Soft 404 error on redirected outbound links

    - by Techlands
    I have a redirect script on my site which sends visitors to an affiliate site, but in the last month I've noticed that Google Webmaster Tools is reporting my outbound links as soft 404 errors. Here is the breakdown of how it's set up: my outbound links are coded like this: <a href="/f/c123" rel="nofollow" target="_blank">Link Title</a>, and the redirect script then performs a 302 redirect to the affiliate link. Originally I had the affiliate (CJ) links directly in the HTML, but I noticed over time that this had some impact on my site's traffic, so I changed them to a redirect script and my traffic returned. This worked with no issues for over a year, but now I'm getting soft 404 errors in Google Webmaster Tools. I did try adding a rule to my robots.txt to block any links starting with /f/, but I'm not sure if this will help or if Google will still report soft 404 errors. I am considering changing the a tag to a button tag and using an onclick event to load the link.
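
    For reference, the robots.txt rule the poster describes would look like this (a sketch; /f/ is the site's own redirect prefix):

        User-agent: *
        Disallow: /f/

    A hedged note: robots.txt only stops future crawling, so URLs Google has already discovered may keep appearing in reports until they age out of the index.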

    Read the article

  • Search ranking for important keywords has gone down drastically [duplicate]

    - by Vaivhav
    This question already has an answer here: How to diagnose a search engine ranking drop? Firstly, we are a small entrepreneurial team of 3 persons, and I am more of an amateur webmaster for the company's website, as we cannot really afford a technical guy/department right now. A few weeks ago, our website traffic and rankings for most keywords dropped overnight. I did a lot of reading after that and learned about Penguin 2.1, which people said is the reason for the drop. Something like this had never happened before. I have now gone through the entire Google Webmaster help section. It says there that if a manual penalty is taken against us, we would see a message on the Manual Actions page; so far, we haven't received any notice from Google for web spam. Some SEO guys I contacted said they found spam links in our backlink profile. I do believe I mistakenly purchased a cheap link/SEO scheme when I was still very new to SEO. That was more than a year back, but since then we have been legitimate. Moreover, how do I find out which links are spam and which are not? Our content is all original, refreshing and the best you will find in our niche. We also have a blog, but on a different domain (wordpress.com), from where we send anchored links to our business website. Is this a good thing to do? Now, how should we proceed and recover our traffic/rankings? I tried searching in Webmaster Tools for a way to reach Google and ask why the traffic decreased suddenly, but I couldn't find a contact form or anything similar. Can someone please go through our website and help make things clearer regarding the reason for the drop, along with a solution? I will really appreciate it, as I can't figure this out and it is taking a lot of time. Vaivhav
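
    A hedged pointer, not from the original question: if bad paid links are confirmed, Google's disavow tool accepts a plain-text file of URLs and domains, uploaded through Webmaster Tools. A sketch with hypothetical entries:

        # Links bought through the old paid scheme (lines starting with # are comments)
        domain:spammy-directory.example
        http://link-farm.example/page-linking-to-us.html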

    Read the article

  • Webmaster Tools is throwing out 404 errors for a link that is not on the page

    - by plantify
    Webmaster Tools is showing thousands of 404 errors where pages on the site supposedly refer to an incorrect URL. For example: URL not found, www.plantify.co.uk/shop/=, linked from http://www.plantify.co.uk/shop/gift-voucher and http://www.plantify.co.uk/shop/special-plant-offers. I have, of course, checked the source and cannot find any reference to this link on any page. The only consistent pattern is that the error is only reported on pages two levels deep: www.plantify.co.uk/shop does not report any error, whilst all pages at www.plantify.co.uk/shop/xxx (where xxx can be several different pages, such as gift-voucher) report it. I cannot reproduce this error. I have run a link checker (we use Screaming Frog) and it does not report this error. I have fetched these pages as a bot, and they do not show this error. I am at a total loss. I cannot even reproduce the issue, but it is most definitely an issue, as Webmaster Tools is reporting new errors every day. Is this perhaps Googlebot doing its own thing?

    Read the article

  • 301 redirect from HTTP to HTTPS - how to be sure Google is fetching the correct information?

    - by user33692
    I'm hoping somebody might be able to provide a bit of advice on an issue I am having. On one site we implemented a 301 redirect on the homepage from HTTP to HTTPS. We have links on the homepage to other parts of the site that are not under SSL (in fact, there is only one other page under SSL). When I go to our Webmaster Tools account I notice that we are not being provided with any webmaster information (e.g., search queries, backlinks, etc.) related to our homepage under SSL. I performed a Fetch as Google on the homepage and the information it returned is:

        HTTP/1.1 301 Moved Permanently
        Date: Fri, 08 Nov 2013 17:26:24 GMT
        Server: Apache/2.2.16 (Debian)
        Location: https://mysite.com/
        Vary: Accept-Encoding
        Content-Encoding: gzip
        Content-Length: 242
        Keep-Alive: timeout=15, max=100
        Connection: Keep-Alive
        Content-Type: text/html; charset=iso-8859-1

        <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
        <html><head>
        <title>301 Moved Permanently</title>
        </head><body>
        <h1>Moved Permanently</h1>
        <p>The document has moved <a href="https://mysite.com/">here</a>.</p>
        <hr>
        <address>Apache/2.2.16 (Debian) Server at mysite.com</address>
        </body></html>

    I am worried that Google Fetch is not getting the correct title tags and meta information from our homepage and that this is hurting our search results. Additionally, I am worried that we need to do something specific with the sitemap to ensure that Google correctly indexes all our pages and can flow from the HTTPS to the HTTP pages without issues. Does anybody have advice on how we can set this up correctly, or be sure that Google is fetching the correct information?
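
    A hedged aside, not from the original post: Fetch as Google retrieves exactly the URL requested, so fetching the HTTP homepage can only ever show the 301 above; the HTTPS version generally has to be verified as its own property in Webmaster Tools before its queries and backlinks appear. If a full HTTPS move is ever intended, the site-wide Apache redirect is a short sketch:

        RewriteEngine On
        # Send every HTTP request to its HTTPS equivalent
        RewriteCond %{HTTPS} off
        RewriteRule ^(.*)$ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]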

    Read the article

  • Massive 404 attack with non existent URLs. How to prevent this?

    - by tattvamasi
    The problem is a whole load of 404 errors, as reported by Google Webmaster Tools, for pages and queries that have never been there. One of them is viewtopic.php, and I've also noticed a scary number of attempts to check whether the site is a WordPress site (wp_admin) and probes for the cPanel login. I block TRACE already, and the server is equipped with some defense against scanning/hacking. However, this doesn't seem to stop it. The referrer is, according to Google Webmaster Tools, totally.me. I have looked for a solution to stop this, because it certainly isn't good for the poor real actual users, let alone the SEO concerns. I am using the Perishable Press mini blacklist (found here), a standard referrer blocker (for porn, herbal, casino sites), and even some software to protect the site (XSS blocking, SQL injection, etc). The server is using other measures as well, so one would assume that the site is safe (hopefully), but it isn't ending. Does anybody else have the same problem, or am I the only one seeing this? Is it what I think, i.e., some sort of attack? Is there a way to fix it, or better, to prevent this useless resource waste? EDIT: I've never used the question to thank for the answers, but I hope this can be done. Thank you all for your insightful replies, which helped me find my way out of this. I have followed everyone's suggestions and implemented the following: a honeypot; a script that listens for suspect URLs on the 404 page and sends me an email with the user agent/IP while returning a standard 404 header; and a script that rewards legitimate users, on the same custom 404 page, in case they end up clicking one of those URLs. In less than 24 hours I have been able to isolate some suspect IPs, all listed in Spamhaus. All the IPs logged so far belong to spam VPS hosting companies. Thank you all again; I would have accepted all answers if I could.
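
    For reference, one hedged fragment of this kind of defense: answering the common scanner probes named above with a hard 404 at the web-server layer, before they reach any application code (assuming Apache's mod_alias; extend the pattern list as needed):

        # Return a plain 404 for well-known scanner targets
        RedirectMatch 404 ^/(wp-admin|wp-login\.php|viewtopic\.php)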

    Read the article

  • Can Campaign URL tags cause a soft 404 error?

    - by user35306
    I was checking one of my company's websites in Webmaster Tools to analyze the cause of some soft 404 errors, and discovered that a few of the older errors had affiliate mp referral tags listed as the relative URLs. Since these are older problems and I don't see many of them coming up in the last few months, I don't think it's still a problem. I'm just curious whether it's possible to cause a soft 404 by improperly copying a campaign or referral tag into the URL.

    Read the article
