Search Results

Search found 5969 results on 239 pages for 'seo man'.


  • How to configure Google sitemap links? [duplicate]

    - by Alexander Farber
    This question already has an answer here: What are the most important things I need to do to encourage Google Sitelinks? (5 answers) I run a WordPress 3.7.1–de_DE site, but don't have much experience with it yet. When my site comes up in a Google search, there are two links displayed underneath it. I believe these links are called "Google sitemap" links, and my question is how to configure them in WordPress. While the right link points to the /ueber-mich URL on the website, the left link was pointing to a non-existent /imprint, so I had to add that page as a workaround for now. I'd also like to change /imprint to the German /impressum anyway (currently I use mod_rewrite to redirect).
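
    A minimal sketch of the mod_rewrite redirect mentioned above, assuming an Apache .htaccess in the WordPress root; the /imprint and /impressum paths are the question's own, everything else is illustrative:

        <IfModule mod_rewrite.c>
        RewriteEngine On
        # send the old English URL to the German page with a permanent redirect
        RewriteRule ^imprint/?$ /impressum [R=301,L]
        </IfModule>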

  • How can I track scrolling in a Google Analytics custom report?

    - by SnowboardBruin
    I want to track scrolling on my website, since it's one long page (rather than multiple pages). I have seen several different methods, with and without an underscore in trackEvent, and with and without spaces after the commas:

        <script>
        ...
        ga('create', 'UA-45440410-1', 'example.com');
        ga('send', 'pageview');
        _gaq.push(['_trackEvent', 'Consumption', 'Article Load', '[URL]', 100, true]);
        _gaq.push(['_trackEvent', 'Consumption', 'Article Load', '[URL]', 75, false]);
        _gaq.push(['_trackEvent', 'Consumption', 'Article Load', '[URL]', 50, false]);
        _gaq.push(['_trackEvent', 'Consumption', 'Article Load', '[URL]', 25, false]);
        </script>

    It takes a day for counts to show up in Google Analytics, otherwise I would just tweak and test right now.
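
    A hedged sketch of one way to fire those scroll-depth events with the analytics.js ga() syntax shown above (assuming that snippet is already loaded on the page); the category, action, and non-interaction choices mirror the question's _gaq lines, while the threshold logic and variable names are illustrative:

        <script>
        // fire one Google Analytics event the first time each scroll depth is reached
        var thresholds = [25, 50, 75, 100];
        var fired = {};
        window.addEventListener('scroll', function () {
          var docHeight = document.documentElement.scrollHeight - window.innerHeight;
          var percent = docHeight > 0 ? Math.round(window.pageYOffset / docHeight * 100) : 100;
          thresholds.forEach(function (t) {
            if (percent >= t && !fired[t]) {
              fired[t] = true;
              // value = scroll depth; non-interaction flag on the 100% event, as in the question
              ga('send', 'event', 'Consumption', 'Article Load', window.location.href, t,
                 { nonInteraction: t === 100 });
            }
          });
        });
        </script>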

  • Will my traffic come back after my site redesign?

    - by Steve
    I screwed up. I launched my site after rebuilding it without setting up the proper 301s, and traffic immediately dropped about 60% (it's not really something I thought about). After about a week and a half, I set the 301s back up yesterday and resubmitted my sitemap to Google. Google has yet to index the whole thing, but traffic isn't getting any better. Is it likely to come back? If so, how long will it take? Has this happened to you? Any info is appreciated. I am really anxious!
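
    For context, a minimal sketch of the kind of 301 rules involved, assuming Apache mod_rewrite; the old and new paths are purely illustrative, since the question does not show its actual URL mapping:

        <IfModule mod_rewrite.c>
        RewriteEngine On
        # map each URL from the old design to its replacement (hypothetical paths)
        RewriteRule ^old-about\.html$ /about/ [R=301,L]
        RewriteRule ^old-products/(.*)$ /products/$1 [R=301,L]
        </IfModule>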

  • Hiding a particular page from search engines so it is not indexed

    - by user702325
    I have a page which I don't want search engines to index or crawl. I am not sure what I should put in my robots.txt file to tell search engines not to crawl/index that page. The page itself is generated dynamically and does not have a predefined template; all I know is its URL, which is pre-defined and will remain unchanged. The page is at, say, www.mysite.com/my-nonindexable-page/. Please suggest what I should do to achieve this. I am using WordPress for my website.
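
    Two common options, sketched with the example URL from the question; they are normally used one at a time, because a crawl-blocked page's meta tag will never be seen by the crawler:

        # Option 1: robots.txt in the site root - blocks crawling of the page
        User-agent: *
        Disallow: /my-nonindexable-page/

        <!-- Option 2: in the page's <head> - allows crawling but asks engines not to index it -->
        <meta name="robots" content="noindex, nofollow">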

  • URL is generating a /#!/splash-page

    - by user32642
    For some reason my site is generating a hashbang, /#!/splash-page, in the URL. For example, when I type www.modernvintage1005.com, the browser returns www.modernvintage1005.com/#!/splash-page, and every subsequent page is /#!/about, /#!/contact, and so forth. There's absolutely nothing on Google about this. There is a lot of rewrite help for eliminating index.php from the home page, but that's it. How do I rewrite it to just say domain.com, domain.com/about.html, etc.? Here is my .htaccess file if you need to see it:

        # Rewrite Rule
        <IfModule mod_rewrite.c>
        RewriteEngine On
        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        </IfModule>

        # compress text, html, javascript, css, xml:
        <IfModule mod_deflate.c>
        AddOutputFilterByType DEFLATE text/plain
        AddOutputFilterByType DEFLATE text/html
        AddOutputFilterByType DEFLATE text/xml
        AddOutputFilterByType DEFLATE text/css
        AddOutputFilterByType DEFLATE application/xml
        AddOutputFilterByType DEFLATE application/xhtml+xml
        AddOutputFilterByType DEFLATE application/rss+xml
        AddOutputFilterByType DEFLATE application/javascript
        AddOutputFilterByType DEFLATE application/x-javascript
        AddType x-font/otf .otf
        AddType x-font/ttf .ttf
        AddType x-font/eot .eot
        AddType x-font/woff .woff
        AddType image/x-icon .ico
        AddType image/png .png
        </IfModule>

        ## EXPIRES CACHING ##
        <IfModule mod_expires.c>
        ExpiresActive On
        ExpiresByType image/jpg "access 1 year"
        ExpiresByType image/jpeg "access 1 year"
        ExpiresByType image/gif "access 1 year"
        ExpiresByType image/png "access 1 year"
        ExpiresByType text/css "access 1 month"
        ExpiresByType application/pdf "access 1 month"
        ExpiresByType text/x-javascript "access 1 month"
        ExpiresByType application/x-shockwave-flash "access 1 month"
        ExpiresByType image/x-icon "access 1 year"
        ExpiresDefault "access 2 days"
        </IfModule>
        ## EXPIRES CACHING ##

  • Will search engines ever change to allow longer title and description tags? [closed]

    - by guisasso
    I was just wondering: the standard title length is 64 characters, while meta description tags get 150-160. I was thinking that it was probably done that way originally because of the screen resolutions back in the day, which could not really fit a lot of content. Google still displays search results in an incredibly small area fixed to the left side of the browser, and its simplicity is probably what makes it so popular. With websites such as Bing displaying a richer, more vivid search experience, do you think search engines will ever change to accept better and longer meta description tags and titles? (I'm asking because we work to accommodate their standards, but what if they change?)

  • My site disappeared from Google search, how long does it take to get back?

    - by Sweb Dizajn
    Due to damage by malicious code, Google wrote:

        Google Analytics web property: link has been removed from http://swebdizajn.com
        November 29, 2011
        Your Webmaster Tools site http://swebdizajn.com is no longer linked to a Google Analytics web property. Possible reasons are:
        - You are no longer the owner of the site in Google Analytics, and nobody else owns both the site and the property
        - Another site owner removed the link

    After that I restored from a backup and then accepted the Google message to tell them that all is well. How long will I have to wait for my site to return to the position it was in?

  • Are multiple domain names and links from the same IP causing poor search engine rankings?

    - by John
    I have an ecommerce website which is not doing so well in Google. I am trying to improve this, of course, and am looking at some possible reasons why it isn't doing well. The website has four domain names, all of which have been indexed by Google. A few months ago I applied 301 redirects to any requests for two of the domain names, so now it is down to two (one is a .net, the other a .com.au; the others were .net.au and .com). I prefer to use my main domain name (the .com.au), but one of the names has been around for a long time and has more inbound links. According to a PageRank tool, both are PR2. It is a Classic ASP site and until recently had a lot of querystring parameters. In the last week or so I added URL rewriting, so there are now no parameters for most pages. I don't do 301 redirects from the old URLs; instead I add the META canonical tag indicating the preferred new URL. At the same time I redesigned the site and improved title tags, META descriptions, and H tags, but it hasn't been long enough yet for Google to index many of these. I also looked at which pages Google has indexed, and strangely there are a lot of pages in the index which are actually keyword searches (more a bunch of random letters than actual words). What I mean is that it is as if someone had typed something into my search box; there are no links to pages like this, and the only way of reaching them is to type something into the search box. So I now add a META robots tag with noindex,nofollow whenever I render pages like this. Years ago I set up a fake price comparison site which lists all my products and links back to my site. It has a different keyword-rich domain name but is on the same server and same IP address. It has a completely different layout but does have the same product categories and product descriptions (although I have stripped the formatting out of them, so they are not identical except in text). I also have a few blog sites which, again, are on the same server/IP and all carry advertising for the website. My questions are:

    1. What should I do with the multiple domains: just use one, or continue with two or more?
    2. Should I add 301 redirects, not just the META canonical tag?
    3. Any idea why Google is indexing my search results pages, and did I do the right thing with the META robots tag?
    4. Is the fake price comparison site likely to be causing problems?
    5. Are all the links to the site from other domain names on the same IP address likely to be causing problems?

    Thanks for any help. Sorry for so many questions in one.
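
    On question 2, a minimal sketch of what a domain-level 301 looks like in an Apache .htaccess, with example.net and example.com.au standing in for the question's unnamed domains:

        <IfModule mod_rewrite.c>
        RewriteEngine On
        # send every request on the secondary .net domain to the preferred .com.au domain
        RewriteCond %{HTTP_HOST} ^(www\.)?example\.net$ [NC]
        RewriteRule ^(.*)$ http://www.example.com.au/$1 [R=301,L]
        </IfModule>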

  • Will loading meta tags dynamically from a database hurt the site?

    - by Nalaka526
    I have a website (ASP.NET MVC) whose content is mainly in the Sinhala language, so search engines will list my site only when someone searches for Sinhala words. But I need my site's pages to appear in search results when searched for with the appropriate English words too. So I'm planning to save HTML meta tags (in English) in the database and load them dynamically with the appropriate page content. Will loading the meta tags dynamically affect the site adversely?
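
    A sketch of the kind of per-page head markup being described, with placeholder English values standing in for whatever would actually be loaded from the database; crawlers only see this rendered output, not how it was produced:

        <head>
          <title>Placeholder English page title loaded from the database</title>
          <meta name="description" content="Placeholder English description for this page, stored in the database and rendered dynamically.">
        </head>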

  • My website's Google index count suddenly increased and then suddenly dropped

    - by Jeg Bagus
    Yesterday before I went to sleep, I checked my site's index and saw about 50 pages indexed on Google. This morning when I woke up, I had 250 pages indexed on Google, and my pages ranked better for several keywords. Then I added one page and two canonical links, added a 404 page header, and resubmitted the sitemap. After two hours it went back down to 50 indexed pages, and my rankings rolled back to the previous day's. What is actually happening? Is it because I resubmitted the sitemap? Google is still crawling my website. Are they trying to refresh the index?

  • Including the latest searches in search engine indexes

    - by drcelus
    My websites generally include a page with the latest (user-input) searches. I know it's not a good security practice to allow this, since you can find undesired content there. On the other hand, it boosts the number of pages indexed, since every new search can provide a link on Google and people can find you through related keywords that you are not using on your web pages. What is the rationale behind including or excluding these results in a search engine's index?

  • Can I benefit from links to pages on my site which have a `noindex` meta tag?

    - by Noam
    I'm trying to understand if and how I can benefit from people linking to pages on my site that have a noindex meta tag. Two actions I'm considering: (1) removing the robots.txt disallow for these pages, to make sure inner links get the propagated link juice; (2) adding a canonical tag pointing to the most similar page that doesn't have a noindex meta tag. Are these valid approaches that might help? Are there any others I should consider?
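
    A minimal sketch of the two signals being considered, as they would appear in a page's <head>; the canonical URL is a placeholder, not one from the question:

        <head>
          <!-- keep this page out of the index but let crawlers follow its links -->
          <meta name="robots" content="noindex, follow">
          <!-- point consolidated link signals at the closest indexable page (hypothetical URL) -->
          <link rel="canonical" href="http://www.example.com/similar-indexable-page/">
        </head>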

  • 301 redirect from HTTP to HTTPS - how to be sure Google is fetching the correct information?

    - by user33692
    I'm hoping somebody might be able to provide a bit of advice on an issue I am having. I have one site where we implemented a 301 redirect on the homepage from HTTP to HTTPS. We have links on the homepage to other parts of the site that are not under SSL (in fact there is only one other page under SSL). When I go to our Webmaster Tools account, I notice that we are not being provided with any webmaster information (e.g. search queries, backlinks, etc.) related to our homepage under SSL. I performed a Fetch as Google on the homepage and the information it returned is:

        HTTP/1.1 301 Moved Permanently
        Date: Fri, 08 Nov 2013 17:26:24 GMT
        Server: Apache/2.2.16 (Debian)
        Location: https://mysite.com/
        Vary: Accept-Encoding
        Content-Encoding: gzip
        Content-Length: 242
        Keep-Alive: timeout=15, max=100
        Connection: Keep-Alive
        Content-Type: text/html; charset=iso-8859-1

        <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
        <html><head>
        <title>301 Moved Permanently</title>
        </head><body>
        <h1>Moved Permanently</h1>
        <p>The document has moved <a href="https://mysite.com/">here</a>.</p>
        <hr>
        <address>Apache/2.2.16 (Debian) Server at mysite.com</address>
        </body></html>

    I am worried that the Google fetch is not getting the correct title tags and meta information from our homepage and that this is hurting our search results. I am also worried that we need to do something specific with the sitemap to ensure that Google correctly indexes all our pages and can move between the HTTPS and HTTP pages without issues. Does anybody have any advice on how we can set this up correctly, or on how to be sure that Google is fetching the correct information?
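
    For reference, one common way to express a homepage-only HTTP-to-HTTPS 301 in an Apache .htaccess; this is a sketch, not necessarily how the site in the question implements it, and mysite.com is the placeholder the question already uses:

        <IfModule mod_rewrite.c>
        RewriteEngine On
        # redirect only the homepage (empty path) from HTTP to HTTPS
        RewriteCond %{HTTPS} off
        RewriteRule ^$ https://mysite.com/ [R=301,L]
        </IfModule>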

  • htaccess 301 redirect help needed

    - by John
    Due to some issues on my site, many pages are visible as duplicates using www.example.com/page.html?task=view, but their content is exactly the same as www.example.com/page.html. One option is an HTTP 301 redirect from www.example.com/page.html?task=view to www.example.com/page.html whenever anybody fetches the page with arguments, but links like www.example.com/page.html?task=view will remain visible to the outside world. Another option is canonicalization, which I don't want to use as it is difficult to insert the tag in the Joomla CMS. I want to hide www.example.com/page.html?task=view from the external world. Is it possible to change the URL from www.example.com/page.html?task=view to www.example.com/page.html? I mean, if there is an href link to www.example.com/page.html?task=view on my web page, it should be visible to the external world without any arguments. This is different from using a 301 in .htaccess to convert an externally accessed www.example.com/page.html?task=view to the version without arguments.
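
    A hedged sketch of the 301 variant the question describes (redirecting requests that carry task=view back to the bare URL), assuming Apache mod_rewrite; note it changes what the server does on request, not the href links printed in the HTML:

        <IfModule mod_rewrite.c>
        RewriteEngine On
        # if the query string contains task=view, redirect to the same path with no arguments
        RewriteCond %{QUERY_STRING} (^|&)task=view(&|$)
        RewriteRule ^(.*)$ /$1? [R=301,L]
        </IfModule>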

  • How will this affect my SEO ranking?

    - by dunc
    I run a fishkeeping website based on the WordPress (PHP) CMS. I've recently put a fairly complex "filter" into place which searches my content for mentions of fish species profiles and turns them into active links. For example,

        asdasd this is a test about abdomen to see if the caudal fin will work asdadasdas try again with abdomen and A. panduro and Apistogramma panduro

    ...becomes

        asdasd this is a test about abdomen to see if the caudal fin will work asdadasdas try again with abdomen and <a href="/?p=1703" class="link_species">A. panduro</a> and <a href="/?p=1703" class="link_species">Apistogramma panduro</a>

    On the rest of my website, the species are linked with pretty URLs such as /species/apistogramma-panduro/, but due to the way this filter works, the only information I have access to is the ID of the post. As such, I'm using /?p=1703 or whatever the ID is. What I'd like to know is: how much will this affect my SEO rating/ranking? Will it be detrimental if I don't rewrite the function? Thanks in advance.

  • How will the search rank be impacted if I move my mobile website to a single-page application?

    - by rahul
    I have two different versions of my site: a desktop version and a mobile-optimised version. That is, for the same URL the server renders different HTML for different user agents. I had been using the Vary header for this scheme, as recommended by Google. However, I now want to move the mobile website to a single-page application for several reasons. If Google stops seeing anything on my mobile web version but the desktop version continues to work as it is, how would the search rank be impacted, given that the mobile web version gets more traffic than the desktop version? And how would the Vary header come into play?
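
    For reference, a minimal sketch of how the Vary header mentioned above is often added for dynamic serving, assuming Apache with mod_headers; this is illustrative, not the question's actual configuration:

        <IfModule mod_headers.c>
        # tell caches and crawlers that the HTML served at this URL differs by User-Agent
        Header append Vary User-Agent
        </IfModule>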

  • How to configure Google sitemap links in WordPress? (without editing its HTML or PHP source code)

    - by Alexander Farber
    I run a WordPress 3.7.1–de_DE site, but don't have much experience with it yet. When my site comes up in a Google search, there are two Google sitemap links displayed underneath it. One of them points to a non-existent webpage, /imprint, though, and I had to add a page at that URL to work around this (and I want the URL to actually be /impressum anyway, since the site is in German and has German URLs). How can I configure these Google sitemap links in WordPress (without editing its HTML or PHP source code)?

  • How to specify the importance of HTML elements?

    - by Julien Fouilhé
    Is it possible to specify which elements of a page are important, or, more specifically, which elements are not important? I'm using the new HTML5 elements (nav, header, footer, section, article, aside...), but Google's description of my website's pages sometimes shows my login form (which is in the header of my page). Is there a way to resolve this problem? Thank you.

  • Should I use subdomains or subfolders for my user groups?

    - by bilygates
    Hello, I run a photography website where each user has their own subdomain (i.e. user.site.com). I'm thinking of adding user groups, but I'm unable to decide whether I should also give each group a separate subdomain or simply a subfolder:

        Subfolders (www.site.com/groups/my-group)
        Pros: Easier to maintain from a technical point of view.
        Cons: Harder to memorize. The URLs can get really long (www.site.com/groups/my-group/albums/my-album/).

        Subdomains (my-group.site.com)
        Pros: Easier to memorize. Shorter URLs. One might have the impression that such a URL is somewhat more "independent" from the main site.
        Cons: Group and user names belong to the same namespace, so we need to check for collisions when creating a new user/group. One cannot determine the content of a page from the URL alone: is x.site.com a user page or a group page?

    What's your opinion on the matter? I should note that DeviantArt.com uses the second option (that's where I got the idea). Thank you in advance!

  • Does the Instant Preview in Google Webmaster Tools take robots.txt into account?

    - by rockyraw
    Is that the way to go if I want to see visually what Googlebot sees? I'm trying to check a folder which I have just blocked in my robots.txt. If I fetch the folder as Googlebot, it fetches OK, so that tells me nothing about whether the block is working. I know there's a tool to check for blocking, but it depends on the robots.txt you feed it. Therefore I've tried the Instant Preview, and I don't get a preview of what the bot sees (the "pre-render"), so I think that means the robots.txt blocks it. However, I don't see that the bot tried to access my updated robots.txt beforehand, so I'm not sure how it knows that this folder is blocked? (It does preview another new folder that is not blocked.)

  • Google showing meta descriptions from other pages in the SERPs

    - by ojek
    Recently I added some content to my website and submitted sitemap files to Google. Now that Google has indexed those pages, I have discovered that some of the words and sentences listed in Google that lead to my website have their meta descriptions somehow mixed up. Here is how it works: after I put a sentence into Google to check my website's ranking, I can see the title of page1 in the results, a link to page1, and a description from page2. Since my website is a forum, if Google mixes up the links of threads, it leads my users to a different kind of material than they were looking for. Is there anything I can do about it?

  • Is there a way to learn why Google penalized a site?

    - by pawelbrodzinski
    Is there any way to learn for sure why Google penalized a specific site? I'm thinking of the situation where the webmaster or site administrator is aware of Google's rules and is sure they aren't breaking any, but the site is penalized nevertheless. The only information you get from Google is that they processed your reconsideration request, but they say neither what the result is nor what the reason for the penalty is if they keep the site penalized. You can try to get information on the Google webmasters forum or here, but most of the time these are only speculations. Considering the site administrator has tried to find out what's wrong but failed, is there a source which can tell them what the problem is?

  • How to recover my inclusion in Google results after being penalized for receiving comment spam?

    - by UXdesigner
    My website had very high search engine rankings, especially in Google. But I left the website alone for a couple of months and didn't notice that the comments had filled up with spam, about 20k spam comments. Then I checked my Google results and I'm out of Google! After years of good results with no spam, how can I recover from that? The spam problem has been solved completely: no more spam, and the website is very legit and very nice. Well, at least I think I was penalized; I don't see any other reason.

  • Creating sites with local IPs that point to a distant server

    - by fatnjazzy
    Hi. We are a company distributed over several places in Europe (real offices). Each office has its own domain: company.de, company.co.uk, company.ch, and so on. Our website servers are located in one place, and we can't distribute our site to different locations. How can we create a local IP in each location that points to our main server, so Google will see us as having a local IP? Explanation: Google has decided to increase your PR if you have a local IP; they think that buying a server in a local market means you are very serious about your business. We have 8 employees in each office and can't have a separate server in each; does that mean we are not serious about our business? No, and this is why I need to create this illusion. Thanks.

  • Moved sitemaps to a different subdomain and losing search referrals around the same time. Red herring or correlation?

    - by er1234
    We started to lose search referral traffic around the same time that I moved some of our sitemaps to a subdomain. Could this have hurt us? I followed Google's steps for creating a sitemap under a different subdomain. The new sitemaps.foo.com subdomain is being crawled and indexed well. Both www.foo.com and sitemaps.foo.com have been verified in Google Webmaster Tools, and they appear as distinct sites. Is this correct? I can't find a way in Webmaster Tools to say "Hey, sitemaps.foo.com is really owned by www.foo.com, so show them together and make sure to attribute sitemaps.foo URLs to www.foo". Our www.foo.com/robots.txt contains:

        Sitemap: http://www.foo.com/sitemap.xml
        Sitemap: http://sitemaps.foo.com/subdir/sitemap.xml.gz
