Googlebot fetches my pages very frequently — rel="nofollow", meta noindex, or robots.txt Disallow?
- by trante
Googlebot fetches pages on my site very frequently, and this slows my website down. I don't want Googlebot to crawl so often.
I already decreased the crawl rate in Google Webmaster Tools.
But I'm also considering these three tools:
Adding rel="nofollow" to links to my inner pages, so Googlebot won't follow and index them.
Adding a "noindex" meta tag, so Google will remove the page from its index and won't fetch it again.
Adding Disallow: /mySomeFolder/ to robots.txt, so Googlebot won't crawl those pages.
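For concreteness, the three options above would look roughly like this (folder and file names are just placeholders from my example):

```
<!-- 1. On an individual link: hint to crawlers not to follow this link -->
<a href="/mySomeFolder/page.html" rel="nofollow">Inner page</a>

<!-- 2. In the <head> of a page: remove the page from Google's index -->
<meta name="robots" content="noindex">

# 3. In robots.txt at the site root: block crawling of the whole folder
User-agent: *
Disallow: /mySomeFolder/
```

One thing I'm not sure about: as far as I understand, Googlebot still has to fetch a page in order to see the noindex meta tag, so only the robots.txt rule would actually stop the requests. Is that right?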
I'm planning to use these methods on my 56,000 pages, except for the 6-7 most important ones.
Which method would you prefer, and what are the advantages and disadvantages of each? Or won't any of this change my website's speed at all?