Handling SEO for infinite pages that cause slow external API calls
Posted by Noam on Pro Webmasters, 2012-09-08
I have an 'infinite' number of pages on my site which rely on an external API. Generating each page takes time (about 1 minute). Links in the site point to such pages, and when a user clicks one, the page is generated while they wait. Since I cannot pre-create them all, I am trying to figure out the best SEO approach for handling these pages.
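To make the setup concrete, here is a minimal sketch of the flow, assuming a Python handler where the function names and the in-process cache are hypothetical: the first request for a page triggers the slow API call, and the result is kept so later visitors don't wait again.

```python
import time
from functools import lru_cache

def call_external_api(page_id):
    # Stand-in for the slow external API call (roughly a minute per page).
    time.sleep(60)
    return {"id": page_id, "content": "..."}

@lru_cache(maxsize=4096)
def render_page(page_id):
    # Built on first request only; subsequent hits are served from the cache,
    # but anyone (or any crawler) landing on a cold page pays the full minute.
    payload = call_external_api(page_id)
    return "<html><body>{}</body></html>".format(payload["content"])
```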
Options:
- Serve really simple pages to web spiders, and only have real users fetch the data and trigger full page generation. I'm a bit 'afraid' Google will see this as low-quality content, which might also look duplicated.
- Put them under a directory in my site (e.g. /non-generated/) and disallow it in robots.txt (see the sketch after this list). The problem here is I don't want users to have to deal with a different URL when they want to share the page or make sense of it. I thought about redirecting real users from this URL back to the regular hierarchy, and that way 'fooling' Google into not reaching them. Again, I'm not sure Google will like me for that.
- Let Google crawl these pages. The main problem is I can't control the rate of the API calls, and my site would seem slower than it should from a spider's perspective (if it only crawled the already-generated pages, it would think the site is much faster).
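For the second option, a hypothetical robots.txt along these lines is what I had in mind (the /non-generated/ path is just the example directory mentioned above):

```
# Keep all crawlers out of the on-demand pages; real users would be
# redirected from /non-generated/ back to the normal URL hierarchy.
User-agent: *
Disallow: /non-generated/
```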
Which approach would you suggest?