Is this form of cloaking likely to be penalised?
- by Flo
I'm looking to create a website that is considerably JavaScript-heavy, built with Backbone.js, with most content passed as JSON and loaded via Backbone. I just need some advice or opinions on the likelihood of my website being penalised for serving plain HTML (text, images, everything) to search engine bots and a JS front-end version to normal users.
This is my basic plan for my site:
I plan on the first request to any page being HTML, which will give only about 1/4 of the page, and thereafter loading the remaining 3/4 with Backbone.js. That way, non-JavaScript users get a 'bit' of the experience.
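For context, here's a minimal sketch of how that second request could look on the client, assuming Backbone and its dependencies are already on the page; the /api/page-content endpoint and the 'rest' element are hypothetical names of mine:

// Hypothetical model wrapping the remaining 3/4 of the page
var PageContent = Backbone.Model.extend({
    url: '/api/page-content' // hypothetical endpoint returning the rest as JSON
});

new PageContent().fetch({
    success: function (model) {
        // render the remaining content from the JSON payload
        document.getElementById('rest').innerHTML = model.get('html');
    }
});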
Once a new user has visited and been detected to have JS, a cookie will be saved on their machine, and requests thereafter will be AJAX only. For example:
if (isAjax || hasJsCookie) {
    // pass JSON instead of full HTML
}
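Fleshed out a little, this is roughly what I have in mind, assuming a Node/Express back end; req.xhr reflects the X-Requested-With header, and 'hasJS' is a hypothetical cookie name the front end sets once JavaScript is confirmed:

// client side, run once on first visit to flag JS support
document.cookie = 'hasJS=1; path=/';

// server side (Express)
var express = require('express');
var cookieParser = require('cookie-parser');

var app = express();
app.use(cookieParser());

app.get('/page', function (req, res) {
    // req.xhr is true when X-Requested-With: XMLHttpRequest is present
    if (req.xhr || req.cookies.hasJS) {
        res.json({ title: 'Page', body: '...' }); // JSON for the Backbone front end
    } else {
        res.send('<html>...first 1/4 of the page...</html>'); // plain HTML fallback
    }
});

app.listen(3000);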
Search engine served content:
That entire experience of loading via AJAX will be stripped if, for example, a Google bot is detected; the same content will be served, but all as HTML.
I thought about just allowing search engines to index the first 1/4 of the content, but as I'm concerned about internal links and picking up every bit of content, I thought it would be better to give search engines the entire content.
I plan to do this by simply checking the user agent against a list of known bots:
if (isBot) {
    // serve plain HTML
}
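One way that user-agent check could be sketched (the substring list here is illustrative, not exhaustive):

var BOT_PATTERNS = ['googlebot', 'bingbot', 'slurp', 'baiduspider'];

function isBot(userAgent) {
    var ua = (userAgent || '').toLowerCase();
    return BOT_PATTERNS.some(function (pattern) {
        return ua.indexOf(pattern) !== -1;
    });
}

// e.g. in Express: if (isBot(req.get('User-Agent'))) { /* render full HTML */ }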
In addition, I plan to use clean URLs for the entire website despite it being full AJAX; providing AJAX content at www.example.com/#/page and normal HTML at www.example.com/page is therefore kind of out of the question. I'd rather avoid the practice of using # when technology such as the HTML5 pushState API is around.
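For what it's worth, Backbone already supports those clean URLs through the History API; a minimal sketch (the route and handler names are my own):

var AppRouter = Backbone.Router.extend({
    routes: {
        'page': 'showPage' // matches www.example.com/page, no hash needed
    },
    showPage: function () {
        // fetch the remaining content as JSON and render it
    }
});

new AppRouter();
Backbone.history.start({ pushState: true }); // real URLs instead of #/page

Note that with pushState: true the server also has to respond to /page directly, which fits the plan above anyway.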
So my question is really just asking the opinion of the masses: is it likely that my website will be penalised?
And can you suggest an alternative that avoids the 'noscript' method?