What rules govern cross-version compatibility for .NET applications and the C# language?

Posted by John Feminella on Stack Overflow, 2010-02-25.
For some reason I've always had trouble remembering the backwards/forwards compatibility guarantees made by the framework, so I'd like to put that to bed forever.

Suppose I have two assemblies, A and B. A is older and references .NET 2.0 assemblies; B references .NET 3.5 assemblies. I have the source for A and B, Ax and Bx, respectively; they are written in C# at the 2.0 and 3.0 language levels. (That is, Ax uses no features that were introduced later than C# 2.0; likewise Bx uses no features that were introduced later than 3.0.)
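
To make the setup concrete, here is a minimal sketch (class and method names are hypothetical, not from any real code base) of what the two sources might look like. The Ax-style class sticks to C# 2.0 constructs and the .NET 2.0 base libraries; the Bx-style class uses C# 3.0 syntax, and its LINQ calls pull in the .NET 3.5 System.Core assembly.

    using System.Collections.Generic;
    using System.Linq; // needed only by the Bx-style class below

    // Ax-style: generics and an anonymous method; nothing newer than C# 2.0,
    // and no assembly references beyond the .NET 2.0 base class libraries.
    public class WidgetsV2
    {
        public List<string> StartingWith(List<string> names, string prefix)
        {
            return names.FindAll(delegate(string n) { return n.StartsWith(prefix); });
        }
    }

    // Bx-style: var, a lambda, and LINQ extension methods, i.e. C# 3.0 syntax
    // plus a compile-time and runtime dependency on System.Core (3.5).
    public class WidgetsV3
    {
        public List<string> StartingWith(List<string> names, string prefix)
        {
            var matches = names.Where(n => n.StartsWith(prefix));
            return matches.ToList();
        }
    }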

I have two environments, C and D. C has the .NET 2.0 framework installed; D has the .NET 3.5 framework installed.

Now, which of the following can/can't I do?

Running:

  1. run A on C? run A on D?
  2. run B on C? run B on D? (see the sketch below)
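
To clarify what the running scenarios measure: both machines load the same already-compiled binaries, so the question is which CLR and which framework assemblies each machine supplies. A tiny diagnostic like the following (hypothetical, not part of either assembly) makes that visible at runtime.

    using System;

    public static class RuntimeProbe
    {
        public static void Report()
        {
            // .NET 3.5 layers new assemblies on top of the 2.0 runtime, so on
            // both C and D this reports a 2.0.x CLR version; what differs is
            // which framework assemblies are available to be loaded.
            Console.WriteLine("CLR version: {0}", Environment.Version);
            Console.WriteLine("mscorlib loaded from: {0}", typeof(object).Assembly.Location);
        }
    }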

Compiling:

  1. compile Ax on C? compile Ax on D?
  2. compile Bx on C? compile Bx on D? (see the sketch below)
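
The compile scenarios come down to which compiler and reference assemblies each machine ships. A hypothetical one-liner from Bx like the one below is a syntax error to the C# 2.0 compiler installed with .NET 2.0 on C, which has no grammar for var, lambdas, or extension-method calls, whereas the 3.5 toolchain on D accepts it; Ax-style source, being an older subset of the language, is generally accepted by either compiler.

    using System.Linq;

    public static class BxSample
    {
        public static int CountShortNames(string[] names)
        {
            // C# 3.0 syntax (var, lambda, LINQ) plus a reference to System.Core.
            var shortOnes = names.Where(n => n.Length < 5);
            return shortOnes.Count();
        }
    }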

Rewriting:

  1. rewrite Ax to use features from the C# 3 language level, and compile it on D, while having it still work on C? (see the sketch below)
  2. rewrite Bx to use features from the C# 4 language level on another environment E that has .NET 4, while having it still work on D?
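
For item 1, the distinction the sketch below tries to capture (with hypothetical names) is between C# 3.0 features that are pure compiler sugar, emitting IL that needs nothing beyond the 2.0 libraries already on C, and features that add a hard reference to the 3.5-only System.Core assembly.

    using System.Collections.Generic;

    public class RewrittenAx
    {
        // Auto-implemented property and object initializer: C# 3.0 syntax,
        // but the compiled assembly still references only 2.0 assemblies.
        public string Name { get; set; }

        public static RewrittenAx Create(string name)
        {
            return new RewrittenAx { Name = name };
        }

        // A lambda bound to a delegate type that already exists in .NET 2.0
        // (Predicate<T> via List<T>.FindAll) is likewise sugar-only.
        public static List<int> Positives(List<int> xs)
        {
            return xs.FindAll(x => x > 0);
        }

        // By contrast, adding `using System.Linq;` and query operators would
        // introduce a reference to System.Core (3.5), which C cannot satisfy.
    }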

Referencing from another assembly:

  1. reference B from A and have a client app on C use it?
  2. reference B from A and have a client app on D use it?
  3. reference A from B and have a client app on C use it?
  4. reference A from B and have a client app on D use it? (see the sketch below)
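
To illustrate the reference chains, here is a hypothetical stand-in collapsed into one file (in the actual scenarios these would be three separate assemblies): whether the client works on C or D turns on whether every assembly it pulls in, directly or transitively, can resolve the framework assemblies that assembly was compiled against.

    using System;
    using System.Linq;

    // Stand-in for B: compiled against 3.5 and using System.Core at runtime.
    public static class BStyleLibrary
    {
        public static int LongestWordLength(string sentence)
        {
            return sentence.Split(' ').Max(w => w.Length);
        }
    }

    // Stand-in for A: needs nothing newer than 2.0 itself, but if A
    // references B, any client of A transitively needs B's dependencies too.
    public static class AStyleLibrary
    {
        public static string Describe(string sentence)
        {
            return "Longest word: " + BStyleLibrary.LongestWordLength(sentence) + " letters";
        }
    }

    public static class ClientApp
    {
        public static void Main()
        {
            Console.WriteLine(AStyleLibrary.Describe("what rules govern compatibility"));
        }
    }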

More importantly, what rules govern the truth or falsity of these hypothetical scenarios?
