How does my .NET 4 application know .NET 4 is not installed?

Posted by Tergiver on Stack Overflow
Published on 2012-04-05T16:16:49Z

Filed under: c# | .NET

I developed an application that targets .NET 4 the other day and XCOPY-installed it to a Windows XP machine. I had told the owner of the machine that he would need to install .NET Framework 4 to run my app, and he told me he had (not a reliable source). When I ran the application, I was presented with a message box saying that the app requires .NET Framework 4 and asking whether I would like to install it. Clicking the Yes button took me to the Microsoft web site, and a few clicks later .NET 4 was installed and the application launched successfully.

Now, I don't normally develop applications that target the latest version of .NET; I always target the lowest version I can (what features do I really need?). So this was my first .NET 4 app (and I only targeted 4 because it used a library that did).

In the past, XCOPY-installing a .NET application to a machine that didn't have the correct version of .NET installed resulted in the application simply crashing on startup, with no useful information presented to the user. So where did this helpful message box come from?

  1. Was it built into my app because I targeted .NET X?
  2. Was it something already installed on the target machine?
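
For reference, a project built against .NET Framework 4 in Visual Studio typically carries an app.config like the sketch below; the supportedRuntime element declares which CLR the executable needs, and my understanding (not confirmed here) is that this declaration, together with the loader shim already present on the machine, is what produces the friendly install prompt when the runtime is missing. This is an illustrative example, not the actual config from the project in question:

    <?xml version="1.0" encoding="utf-8"?>
    <!-- Illustrative app.config for a project targeting .NET Framework 4.
         The supportedRuntime element tells the CLR loader which runtime the app requires. -->
    <configuration>
      <startup>
        <supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.0" />
      </startup>
    </configuration>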

I love this feature; I just want to know precisely how to leverage it in the future.
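
As a hedged sketch of the manual alternative, for cases where an app compiled against an older framework (or a native bootstrapper) wants to detect .NET 4 itself rather than rely on the loader's prompt: the registry path and the "Install" value below are the ones Microsoft documents for the .NET Framework 4 Full profile, while the class and method names are made up for this example.

    using Microsoft.Win32;

    static class RuntimeCheck
    {
        // Returns true if the .NET Framework 4 (Full profile) is registered on this machine.
        // Compile this against an older framework (e.g. .NET 2.0) so it can run before 4 is present.
        public static bool IsNet4FullInstalled()
        {
            using (RegistryKey key = Registry.LocalMachine.OpenSubKey(
                @"SOFTWARE\Microsoft\NET Framework Setup\NDP\v4\Full"))
            {
                if (key == null)
                    return false;

                object install = key.GetValue("Install");
                return install is int && (int)install == 1;
            }
        }
    }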

© Stack Overflow or respective owner
