Why shouldn't I always use nullable types in C#?
Posted by Matthew Vines on Stack Overflow
Published on 2009-05-06T16:50:47Z
Indexed on 2010/04/07 10:13 UTC
I've been searching for some good guidance on this since the concept was introduced in .NET 2.0.
Why would I ever want to use non-nullable data types in C#? (A better question may be: why wouldn't I choose nullable types by default, and only use non-nullable types when that explicitly makes sense?)
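To make the distinction concrete, here is a minimal sketch of the two kinds of declarations side by side (the names and values are arbitrary, just for illustration):

    using System;

    class NullableDemo
    {
        static void Main()
        {
            // Non-nullable value type: always holds a value; defaults to 0.
            int count = 0;

            // Nullable value type: shorthand for Nullable<int>; can represent "no value".
            int? maybeCount = null;

            Console.WriteLine(count);
            Console.WriteLine(maybeCount.HasValue ? maybeCount.Value.ToString() : "no value");
        }
    }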
Is there a 'significant' performance hit to choosing a nullable data type over its non-nullable peer?
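For context on the kind of overhead I mean: int? is just Nullable<int>, a struct that carries the value plus a HasValue flag. A minimal sketch to see the size difference, assuming a runtime where System.Runtime.CompilerServices.Unsafe is available (on older frameworks it is a separate NuGet package), with the exact numbers depending on the runtime and alignment:

    using System;
    using System.Runtime.CompilerServices;

    class NullableSize
    {
        static void Main()
        {
            // Nullable<int> stores the int plus a bool flag, so with
            // alignment it usually occupies about twice the space of int.
            Console.WriteLine(Unsafe.SizeOf<int>());   // 4
            Console.WriteLine(Unsafe.SizeOf<int?>());  // typically 8 (runtime-dependent)
        }
    }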
I much prefer to check my values against null instead of Guid.Empty, string.Empty, DateTime.MinValue, <= 0, etc., and to work with nullable types in general. The only reason I don't choose nullable types more often is the itchy feeling in the back of my head that it's more than just backwards compatibility that forces that extra '?' character to explicitly allow a null value.
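To show concretely what I mean, here is a small sketch comparing the two styles (the Load* helpers are made up purely for the example):

    using System;

    class SentinelVsNull
    {
        // Hypothetical helpers, just for illustration.
        static Guid LoadCustomerIdOrEmpty() => Guid.Empty;  // non-nullable: "missing" is a sentinel
        static Guid? LoadCustomerIdOrNull() => null;        // nullable: "missing" is null

        static void Main()
        {
            // Sentinel-value style:
            Guid id = LoadCustomerIdOrEmpty();
            if (id == Guid.Empty)
            {
                Console.WriteLine("no customer (sentinel check)");
            }

            // Nullable style:
            Guid? maybeId = LoadCustomerIdOrNull();
            if (maybeId == null)
            {
                Console.WriteLine("no customer (null check)");
            }
            else
            {
                Console.WriteLine(maybeId.Value);
            }
        }
    }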
Is there anybody out there who always (or almost always) chooses nullable types rather than non-nullable types?
Thanks for your time,