Search Results


  • Will a search engine lower the rank of my page if I have hidden iframes?

    - by Skurpi
    As a practice, all external content on our site is put in iframes to lower the risk of any external parties injecting stuff into our pages. We also do it to make sure our content shows up before banners, to make the site feel quicker. We now have an external script running which we want to put in an iframe, but it has no visible content to go with it, so I want to put CSS "visibility: hidden;" on the iframe. I read in a forum somewhere that search engines will lower the rank of a page, or even drop the page, if an iframe has "the minimal size of 1x1px". Will a search engine lower the rank of my page if I have a hidden (or 1px big) iframe?
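
    For concreteness, a minimal sketch of the markup being asked about (the script URL is hypothetical):

        <!-- external script isolated in a 1x1, CSS-hidden iframe -->
        <iframe src="https://scripts.example.com/external.html"
                width="1" height="1" style="visibility: hidden; border: 0;">
        </iframe>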

    Read the article

  • Does Google pick up anchor text that is in nested elements?

    - by dangerDAN
    When Google looks at anchor text on a website, will it pick up the text if it is inside nested elements? So, for example, going from:

        <a href="http://www.google.com/">Visit Google</a>

    to:

        <a href="http://www.google.com/">
          <div class="circle">
            <span>Visit Google</span>
          </div>
        </a>

    The reason I ask is that I want to use CSS3 styling for certain links on my website, to style them as circles. But the anchor text needs to be picked up for these links, so I want to know whether or not the above is bad practice in this case.

    Read the article

  • Does Google penalize pseudo-duplicate pages for different locations?

    - by mikewowb
    My company's site's home page was not specifically optimized for any location. Now I am planning to optimize it for Boston, and to create ten or so other landing pages for the other locations we serve. If we made these new pages by copying the original Boston one and changing the location's name (s/Boston/Montreal/), would Google consider them duplicate pages and penalize us? What is the best practice for this?
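
    For illustration, a hedged sketch of how two city pages could be differentiated beyond a plain s/Boston/Montreal/ substitution (all names and copy hypothetical):

        <!-- boston.html: location-specific title, description and body copy -->
        <title>Widget Repair in Boston, MA | Example Co</title>
        <meta name="description" content="Same-day widget repair across Boston: Back Bay, Cambridge, Somerville...">

        <!-- montreal.html: rewritten for the location, not just find-and-replace -->
        <title>Widget Repair in Montreal, QC | Example Co</title>
        <meta name="description" content="Same-day widget repair across Montreal: Plateau, Mile End, Laval...">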

    Read the article

  • How to redirect an international domain to a subfolder on the English site without hurting Google rankings?

    - by ernest1a
    I have two sites: www.example.de and www.main.com. www.main.com is the English version of www.example.de, which is in German. I want to keep only www.main.com: the English version will stay at www.main.com, and the German version will move to www.main.com/de. I am wondering what the best solution for the old www.example.de would be: Redirect everything from www.example.de to www.main.com/de using a 301 redirect? Or redirect everything from www.example.de to www.main.com/de/page-url-of-old-site.html, so each link actually gets its own address? Is that necessary, or will Google work out where each page belongs on the new site even if I redirect everything to the home page? Any other solution, maybe just setting the new domain in Google Webmaster Tools or anything like that?
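
    For reference, a minimal per-URL redirect sketch for the first two options, assuming www.example.de is served by Apache with mod_rewrite:

        # in the .htaccess (or vhost) of www.example.de
        RewriteEngine On
        RewriteCond %{HTTP_HOST} ^(www\.)?example\.de$ [NC]
        # keep the original path, so every old page maps to its own new address
        RewriteRule ^(.*)$ http://www.main.com/de/$1 [R=301,L]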

    Read the article

  • Does Google index iframes on Facebook fan pages? (Whole website content)

    - by user2536417
    I have a fairly simple question that I've tried to get help with from the guys on the Google Webmaster Help Q&A site, but so far no joy, so hopefully someone here can provide the information I'm looking for. I have a Facebook fan page for my website; I have made an app that basically uses an iframe and puts the site within a frame on Facebook. It all works well, but Google is not indexing this page. I am using <link rel="canonical" href="#" /> on my pages, so perhaps this is the issue?
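
    For comparison, a canonical tag normally points at the page's real absolute URL rather than "#" (URL hypothetical):

        <link rel="canonical" href="http://www.example.com/current-page.html" />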

    Read the article

  • What are the search engine effects of registering the same domain on multiple top-level domains (i.e. .com, .ie, .nl, etc.)?

    - by user1020317
    I'm looking to register a few more domains for my company. I have my-company.com at the moment, but now require my-company.com.au, my-company.nl and some others. I'm running through my options and wondering which is best: duplicate all the content of the .com site and make a replica at the other domains; buy the other domains but 301 redirect them back to the .com domain; or create a full new website with different content for the new domains, thus having no text duplication. We currently sell all over the world and would like to raise our search rankings in various countries. Can this be done by buying the domain in the country, and if so, how will the above methods affect our search rankings? Any other suggestions are welcome!
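
    Worth noting alongside the options above: hreflang annotations, which tell search engines which domain targets which country or language. A minimal sketch using the domains from the question (the language/region codes are assumptions):

        <link rel="alternate" hreflang="en" href="http://www.my-company.com/" />
        <link rel="alternate" hreflang="en-au" href="http://www.my-company.com.au/" />
        <link rel="alternate" hreflang="nl" href="http://www.my-company.nl/" />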

    Read the article

  • 301 redirect from a country specific domain

    - by Raj
    I originally started using a .do domain extension for my site, but later realized that this country-specific domain would prevent us from appearing in search results for places outside of the Dominican Republic. We started using a .co domain extension and redirected all requests to the new domain using an HTTP 301. The "Crawl Stats" in Google Webmaster Tools shows me that the .co domain is being crawled, but the "Index Status" shows the number of pages indexed as 0. The "Crawl Stats" for the .do domain says that it's being crawled, and its "Index Status" shows a number greater than 0. I also set a "Change of Address" in Google Webmaster Tools to point the .do domain at the new .co domain. We're still not appearing in search results at all, even for very specific strings where I would expect to find us. Am I doing something wrong?

    Read the article

  • How to make the most of GWT's "Search queries"?

    - by DisgruntledGoat
    I've been looking at the "Search queries" section in Google Webmaster Tools recently, and it seems like there is a lot of potential there in finding which pages on a site need improvement. I'm trying to figure out exactly what to sort or filter on. Do I look at pages with a low average position? Low impressions but high clicks? Pages that are rising up/falling down the rankings? What is the low-hanging fruit here?

    Read the article

  • Quickest way to research a set of pages' backlinks

    - by JeremyB
    I have a list of 300+ pages (they were chosen based on which pages rank for a keyword I'm interested in) and I want to compile a list of all the (known) inbound links to those pages. What's the fastest way to do this? It seems like the only tools out there (Yahoo Site Explorer, SEOMoz, Majestic) require you to either a) manually export each set of links by hand, or b) get data at the domain level (e.g. Majestic's clique hunter). Does anyone know of an efficient way to do this? I ask because I'm about to write a bunch of code, and I don't want to waste my time if there's another tool that will work. I know SEOMoz and Majestic have APIs, but I'm wondering if there's a more user-friendly option.

    Read the article

  • How do you enhance your website's speed without compromising the design and accessibility?

    - by Thorn007
    How do you enhance your website's load speed without killing the design and accessibility? File compression, CDNs, gzip? What are the best tools for doing so? For example, Google has optimized their site without compromising the design. Also, many websites kill the quality of their images with compression. Is there a way, more or less a best practice, to increase speed without compromising the design and accessibility? Note: sorry for being so vague, but I don't know how else to phrase this question.
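
    As one concrete starting point, a hedged sketch of server-side compression and caching, assuming Apache with mod_deflate and mod_expires enabled:

        # compress text assets on the fly
        AddOutputFilterByType DEFLATE text/html text/css application/javascript
        # let browsers cache images for a month
        ExpiresActive On
        ExpiresByType image/png "access plus 1 month"
        ExpiresByType image/jpeg "access plus 1 month"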

    Read the article

  • SEO Tips - Updating Your Content and Avoiding Duplicate Filters

    If you are just getting into search engine optimization, it's important to be aware that content is the most important thing on your website for earning high rankings. If you decide to hire an SEO consultant, make sure that you explain to them exactly what you want your website to look like and the content that you would like it to feature. Having fresh and relevant content on your website is key, as that will bring web crawlers back frequently. There are different ways of achieving this while avoiding getting your website removed from a search engine's index due to duplicate content.

    Read the article

  • How to remove a page from a site without affecting Google SERPs

    - by Savas Zorlu
    I have a travel website. Just for information purposes, I had put up a weather page. Now I realize that this page is increasing my overall bounce rate, because people who are looking for the weather forecast land on that page, get what they want, and exit. What is the safest method to get rid of that page? Would it hurt my Google rank if I removed it completely? Or is there a better way to handle this situation? I realize that around 21 percent of my daily hits are on that page. I would have been happy if my aim were to provide weather data for the location; however, my site needs to focus on selling hotels. So I think I need to get rid of this weather page immediately. What do you think?
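
    If the goal is de-indexing the page without deleting it, one common approach is a robots meta tag on the weather page only (a sketch, not a recommendation for this specific site):

        <!-- on the weather page: drop it from the index but keep crawling its links -->
        <meta name="robots" content="noindex, follow">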

    Read the article

  • Upgrading Blog to WordPress, keep old one or redirect?

    - by Spazm
    I have had a blog for around 4 years now that has gained some success and gets tons of residual and organic traffic. I am using an outdated version of BlogEngine.NET and we are going to switch to WordPress. Currently our blog is at ourwebsite.com/blog, and I don't really want to mess with our web server, so I set up a new server just for WordPress; instead of dealing with proxies and things that are above my head, the new blog will be at blog.oursite.com. I am trying to figure out the best way to go about this. We value our search engine rankings very much, as that is where 90% of our traffic comes from. Would I be best off: 1) importing all of the old blog posts from oursite.com/blog to blog.oursite.com, then redirecting all of the articles, categories, authors, pages and tags to the new blog and giving up the direct links to our current site; or 2) keeping all of our current articles as they are, only adding new content to the new blog.oursite.com, and just letting the old blog float around our site?
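
    For the first option, a minimal redirect sketch, assuming the old site runs Apache with mod_rewrite and the post slugs stay the same on the new blog:

        # on the old server: send every /blog/ URL to the same path on the new subdomain
        RewriteEngine On
        RewriteRule ^blog/(.*)$ http://blog.oursite.com/$1 [R=301,L]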

    Read the article

  • Can you disavow a whole domain apart from the index page?

    - by Silver89
    Many years ago I may have bought a few sitewide links for some of my sites; these have now come back to haunt me and I need to sort them out. I've tried to contact the owners, but they're too lazy to bother changing the sites, so I figure it's time to disavow the links. But is there a way to disavow all of the sitewide links on a domain apart from the one on the index page? And would leaving the index page link be a benefit, or would it still be seen as spammy? Something like:

        # Contacted owner of shadyseo.com on 7/1/2012 to
        # ask for link removal but got no response
        domain:shadyseo.com
        !shadyseo.com/index.php

    Read the article

  • Can I use asterisks in URLs?

    - by KajMagnus
    Are there any reasons I shouldn't use an asterisk (*) in a URL? Background: with asterisks, I could provide these nice and user-friendly (or what do you think?) URLs:

        example.com/some/folder/search-phrase*   means: search for pages with names starting with "search-phrase", located in /some/folder/
        example.com/some/**/*search-phrase*      means: search for any page with "search-phrase" anywhere in its name
        example.com/some/folder/*                means: list all pages in /some/folder/ (rather than showing the /some/folder/index page)

    Read the article

  • CSS being displayed in Google search results

    - by vipul_vj
    I have written an article, "HTML Image Tag", for the site and it has been indexed by Google. But when I search for it, Google displays:

        HTML Image Tag - ProgrammingBulls
        http://programmingbulls.com/html-image-tag-1
        content { font-family:verdana; font-size:14px; font-weight:normal } We often use images in a webpage. To insert images in our webpage the <img> tag is used...

    Why is CSS displayed in the Google search snippet? I know that CSS and HTML are ignored by Google, but for some reason the CSS is being displayed.
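
    For reference, style rules normally live in the head (or in an external stylesheet) so they cannot leak into the text search engines extract; a sketch, assuming the rules currently sit in the page body:

        <head>
          <style>
            .content { font-family: verdana; font-size: 14px; font-weight: normal; }
          </style>
        </head>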

    Read the article

  • Google not showing any pages from my site in the index after three months [on hold]

    - by Alex Coisman
    Despite having a sitemap and using Google Webmaster Tools, it has been over 3 months and my site has not been added to the Google index at all. Here's the site: www.famouslefthandedpeople.com As far as I know, I have done everything correctly. However, there must be something I am overlooking that is preventing Google from indexing the site. I do not have a robots.txt file, so allow/disallow isn't the issue. Although the content of the site is sparse, it is original and not duplicated internally or externally so Panda/Penguin should not be a problem. I have reviewed the answers at Why isn't my website in Google search results? and I don't think it applies here. If it matters, I am using WordPress to create the site. What other factors should I be looking at in order to troubleshoot this?
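
    One low-risk sanity check: a minimal robots.txt that explicitly allows crawling and advertises the sitemap (assuming the sitemap lives at /sitemap.xml):

        User-agent: *
        Disallow:

        Sitemap: http://www.famouslefthandedpeople.com/sitemap.xml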

    Read the article

  • Googlebot visit but no cache update - why?

    - by Mick
    I have made a new plain vanilla HTML website. I have been making regular modifications to it on an almost daily basis. The site is hosted by Hostmonster, and as part of their service they offer "awstats" to let you know assorted details of visitors to the site. One thing is puzzling me. According to awstats, a robot/spider calling itself "Googlebot" visited my site as recently as today (28th June 2011), but when I find my site on Google (e.g. by searching for "full reserve banking") the cache is dated only the 5th June. I always thought that a visit from the Google robot was synonymous with a cache update. Am I wrong? Or have I accidentally put something in the site telling Google that nothing has been updated? EDIT: It seems a moderator has removed the name of my website, so there is now no chance that anyone could check whether I had made some error on my site :-( ... but anyway, in answer to paulmorriss' question, here is what awstats was telling me:

    Read the article

  • Is the structure of my site's navigation (via price/service tables) considered 'Duplicate Content' by Google?

    - by James Gadsby
    As I'm building my business website, I'm using service/price tables at the bottom of each service page to show customers/potential clients my other offerings. Given that there are 7 or 8 service pages, each with (according to Google) the same service descriptions below the original content for that service, would this count as duplicate content? If so, what could I do about it?

    Read the article

  • In addition to Google's First Click Free, should you whitelist search engine bots past a paywall?

    - by tobek
    Our site has subscription-only pages; non-subscribed visitors see a snippet preview. As per Google's FCF requirements, for your first 5 hits to subscriber-only pages with .google. in the referrer, you see the full page. In addition to this, should we whitelist search engine bots so that they can index the full content? I assume this is not required for Google, which can use FCF to index our content, but what about other search engines? Is this considered cloaking? My gut says that whitelisting bots past the paywall is bad practice, but I wanted to confirm; any evidence or references would be amazing.
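
    For what it's worth, a hypothetical sketch of how such a whitelist is often wired up, assuming Apache (user-agent strings can be spoofed, so serious implementations also verify crawler IPs via reverse DNS):

        # flag known crawlers so the application can decide whether to serve full text
        SetEnvIfNoCase User-Agent "Googlebot|bingbot|Slurp" is_known_crawler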

    Read the article

  • A relatively new blog seems to be getting very poor Google indexing

    - by Genadinik
    I have a new blog that is 2 months old. In the first few weeks it was getting indexed nicely, and my Google Webmaster Tools reports were showing that it was getting crawled and beginning to rank for some terms. Then, as I kept writing, the Webmaster Tools report thinned out and showed fewer and fewer terms that this blog ranks for. Now there are only 4 terms, one of them being my name. Is there something I need to do to keep the old posts indexed and crawled? Thanks, Alex

    Read the article

  • Why won't Webmaster Tools let me set a preferred domain? (says to verify, but it should already be)

    - by Su'
    I've got a domain that I apparently forgot to set a preferred domain for, so I just tried to do it. Webmaster Tools instead popped up a little box: "Part of the process of setting a preferred domain is to verify that you own http://www.example.com/. Please verify http://www.example.com/." I'm running into some problems following these instructions. As far as I can tell, I already verified sometime in the past: there's a TXT DNS record with the gibberish Google tells you to set for it that I couldn't have come up with myself, and nothing is telling me this information is bad. So let's assume the site is somehow not actually verified. All the various methods' instructions start with "click the Manage Site button next to the site you want, and then click Verify this site." That button doesn't even exist on my screens. (It presumably goes away when you successfully verify?) Those instructions were all updated pretty recently, and the DNS one in particular just a couple of weeks ago, so it seems a bit unlikely they're inaccurate. I am not using Apps, and won't be, so I can't try the verification through there. Note that I also have another domain, for which I have not done any verification, that is showing the same behavior (no such button, being told to verify when it seems impossible, etc.), so something appears to just be broken. I already have a no-www process in place server-side, so we can skip that; I'm just trying to get the box checked off in GWT. If I don't get any resolution, I'll eventually scrap the TXT record, see if the site gets un-verified (or whatever, since it thinks it isn't), and see if I can just restart the process. It's not urgent, so I'm just trying to figure out if I've gone blind to something or what. Did the button move?
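
    One quick way to confirm the TXT record is actually visible to the outside world, assuming a shell with dig installed (domain hypothetical):

        dig TXT example.com +short
        # the google-site-verification=... string should appear in the output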

    Read the article

  • How to get Google to recognize my domain name [closed]

    - by Milosh Belter
    Possible Duplicate: I cannot see my website in google. I have a strange problem with my website. I have a website, let's say abcdefg.com. The website has been live for 2 months and Google still doesn't know it. When searching for my domain name 'abcdefg', Google displays results for a similar phrase (abcdef) but not for mine. How do I get Google to recognize my domain name? The website and sitemaps have been submitted via Google Webmaster Tools.

    Read the article
