When load balancing, must all copies of a static web page be exactly the same?
- by Gilles Blanchette
I am used to getting answers for everything on the web, but not this time...
Yesterday I enabled Amazon's weighted DNS functionality to load balance 7 websites between two different IP addresses (a 50%-50% split). Both servers run IIS 8.5, and the sites run well on both.
Today I found out that Google Webmaster Tools is reporting fetch errors for the file robots.txt, with close to 50% of access attempts failing. The robots.txt file is fine and accessible (even via Google's URL testing page) on both servers.
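For reference, here is a small sketch of how one could check whether both origins really serve identical content for a given path: request it from each server IP directly, sending the site's Host header so the DNS weighting is bypassed. The IP addresses and hostname below are placeholders, not my real setup:

```python
import hashlib
import urllib.request

def fetch_body(ip, host, path="/robots.txt"):
    # Request the path from one specific origin server, bypassing DNS
    # round-robin by using the raw IP and setting the Host header.
    req = urllib.request.Request(f"http://{ip}{path}", headers={"Host": host})
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.read()

def same_content(body_a, body_b):
    # Compare the two response bodies by SHA-256 hash so that even
    # large files are cheap to check for an exact byte-level match.
    return hashlib.sha256(body_a).hexdigest() == hashlib.sha256(body_b).hexdigest()

if __name__ == "__main__":
    # Placeholder IPs and hostname for illustration only.
    a = fetch_body("203.0.113.10", "www.example.com")
    b = fetch_body("203.0.113.20", "www.example.com")
    print("identical" if same_content(a, b) else "DIFFERENT")
```

Running this against both servers would show whether Google is actually receiving two different robots.txt files depending on which IP the weighted DNS hands out.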
Let's say the current version of the static web pages is on the first server and an updated version of the same pages is on the second. Could that be the problem?
When load balancing, can static web pages be slightly different from one host server to the other?
Thank you for your help.