How is timezone handled in the lifecycle of an ADO.NET + SQL Server DateTime column?
- by stimpy77
Using SQL Server 2008. This is a fairly junior question, but the information I can find on Google dances around the topic quite a bit, so I'd really appreciate a detailed explanation of how this works...
Let's say I have a datetime column and in ADO.NET I set it to DateTime.UtcNow.
1) Does SQL Server store the DateTime.UtcNow value as-is, or does it apply an offset based on the timezone of the machine where the server is installed, and then reverse that offset when the value is queried? I think I know the answer is "of course it stores it without offsetting it again," but I want to be certain.
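To make the scenario concrete, here's roughly what my insert looks like (the table, column, and connection string are invented just for this question):

```csharp
using System;
using System.Data.SqlClient;

class InsertUtcExample
{
    static void Main()
    {
        // Placeholder connection string for illustration only.
        using (var conn = new SqlConnection("Server=.;Database=Test;Integrated Security=true"))
        {
            conn.Open();
            using (var cmd = new SqlCommand(
                "INSERT INTO Events (OccurredAt) VALUES (@occurredAt)", conn))
            {
                // DateTime.UtcNow has Kind == DateTimeKind.Utc on the .NET side;
                // the question is what happens to that on the way into a
                // plain datetime column.
                cmd.Parameters.AddWithValue("@occurredAt", DateTime.UtcNow);
                cmd.ExecuteNonQuery();
            }
        }
    }
}
```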
So then I query for it and cast it from, say, an IDataReader column to a DateTime. As far as I know, System.DateTime carries metadata (the DateTime.Kind property, a DateTimeKind of Utc, Local, or Unspecified) that tracks whether the instance is a UTC time or a local, offset time, and that state can change the behavior of .ToLocalTime() and .ToUniversalTime(). So,
2) Does this cast System.DateTime object already know that it is a UTC instance, or does it assume that it has been offset?
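And here's how I'm reading the value back (same hypothetical table as above); the Kind of the resulting DateTime is exactly what I'm unsure about:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class ReadBackExample
{
    static void Main()
    {
        // Placeholder connection string for illustration only.
        using (var conn = new SqlConnection("Server=.;Database=Test;Integrated Security=true"))
        {
            conn.Open();
            using (var cmd = new SqlCommand("SELECT OccurredAt FROM Events", conn))
            using (IDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                {
                    // Or equivalently: (DateTime)reader["OccurredAt"]
                    DateTime dt = reader.GetDateTime(0);

                    // This is my question: is dt.Kind here
                    // DateTimeKind.Utc, Local, or Unspecified?
                    Console.WriteLine("{0} (Kind = {1})", dt, dt.Kind);
                }
            }
        }
    }
}
```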
Now let's say that instead of DateTime.UtcNow I use DateTime.Now when performing an ADO.NET INSERT or UPDATE.
3) Does ADO.NET pass the local-time offset to SQL Server, and does SQL Server store the DateTime.Now value along with any offset metadata?
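For context, on the client side the two values do start out with different Kind metadata before they ever reach ADO.NET; this standalone snippet (no database involved) shows what I mean:

```csharp
using System;

class KindDemo
{
    static void Main()
    {
        DateTime local = DateTime.Now;    // Kind == DateTimeKind.Local
        DateTime utc = DateTime.UtcNow;   // Kind == DateTimeKind.Utc

        Console.WriteLine(local.Kind);    // prints "Local"
        Console.WriteLine(utc.Kind);      // prints "Utc"

        // ToUniversalTime() consults Kind: on a Local value it subtracts
        // the machine's UTC offset; on a Utc value it is a no-op.
        Console.WriteLine(local.ToUniversalTime().Kind); // prints "Utc"
    }
}
```

What I can't tell is whether any of that Kind information survives the round trip through a datetime column.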
Again I query for it and cast it from, say, an IDataReader column to a DateTime.
4) Does this cast System.DateTime object already know that it is a local (offset) time, or does it assume that it is UTC?