I've been asked to take over the maintenance of an existing site that's being reworked.
At present it serves localised content for several languages, but via a fairly unhelpful mechanism: search engines have effectively only indexed the English version, and any deep links resolve to the English content as well.
So new localised sites are being built under separate domains (not just for this reason; there are other benefits too). What we then want to do is redirect users to the correct new site, where appropriate.
For humans this isn't a problem. We can send them through a gateway page on their first visit, capture their language preference in a cookie, then redirect them to the new localised content as soon as it's available.
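To make the question concrete, here's a rough sketch of the human flow I have in mind (using Flask and made-up example.* domains purely for illustration, not our actual stack):

```python
from flask import Flask, request, redirect

app = Flask(__name__)

# Hypothetical mapping of language preference -> new localised domain
LOCALISED_SITES = {
    "de": "https://example.de/",
    "fr": "https://example.fr/",
    "en": "https://example.com/",
}

@app.route("/")
def index():
    lang = request.cookies.get("lang")
    if lang is None:
        # First visit: send the user through the gateway page to pick a language
        return redirect("/gateway")
    # Returning visitor: redirect to the localised site, falling back to English
    return redirect(LOCALISED_SITES.get(lang, LOCALISED_SITES["en"]), code=302)

@app.route("/gateway")
def gateway():
    # Renders language choices and sets the "lang" cookie on submission (omitted)
    return "Language selection page"
```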
For search engines this isn't so good... In principle I'm happy to simply bypass the gateway page and redirect known spiders to the new site, but that means serving radically different content (a different URL, even!) to human and robot visitors. Won't this be regarded as cloaking and cause us grief?
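Continuing the sketch above, the spider bypass I'm considering would look something like this (the user-agent list is illustrative, not exhaustive):

```python
KNOWN_SPIDERS = ("googlebot", "bingbot", "yandexbot", "baiduspider")

def is_spider(user_agent: str) -> bool:
    """Crude user-agent check for well-known crawlers."""
    ua = (user_agent or "").lower()
    return any(bot in ua for bot in KNOWN_SPIDERS)

@app.before_request
def redirect_spiders():
    # Crawlers never see the gateway: send them straight to the new
    # localised site with a 301 so the old URLs drop out of the index.
    if request.path == "/" and is_spider(request.headers.get("User-Agent", "")):
        return redirect(LOCALISED_SITES["en"], code=301)
```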
Anyone know a better way to handle this?