Why does GetSqlDecimal throw when GetDecimal doesn't?

Posted by I. J. Kennedy on Stack Overflow
Published on 2010-03-30T22:48:04Z

Filed under: c# | sql

I have a database table that has a column of type money, allowing nulls. Using a SqlDataReader named reader, I can do

decimal d = reader.GetDecimal(1);

which works, unless of course we're reading a null. If I try using SqlDecimal instead (and I thought the whole point of the SqlTypes was to deal with nulls), I get an invalid cast exception, whether or not the value is null.

SqlDecimal s = reader.GetSqlDecimal(1);  // throws an invalid cast exception

What am I doing wrong? Do I really have to write a conditional statement to shepherd the value from the database to a SqlDecimal variable?
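
To make it concrete, the conditional I mean looks something like the sketch below; the class and method names, and the hard-coded ordinal, are just placeholders for illustration:

using System.Data.SqlClient;
using System.Data.SqlTypes;

static class MoneyReadSketch
{
    // Sketch of the conditional "shepherding" described above.
    // "MoneyReadSketch" and "ReadNullableMoney" are made-up names.
    public static SqlDecimal ReadNullableMoney(SqlDataReader reader, int ordinal)
    {
        if (reader.IsDBNull(ordinal))
            return SqlDecimal.Null;        // map the database NULL to SqlDecimal.Null

        // GetDecimal works for the non-null case; the decimal result
        // converts implicitly to SqlDecimal.
        return reader.GetDecimal(ordinal);
    }
}

// usage, with the same reader and column as before
SqlDecimal s = MoneyReadSketch.ReadNullableMoney(reader, 1);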

© Stack Overflow or respective owner
