I am currently running Windows 7 Ultimate 64-bit with a dual-monitor setup driven by an NVIDIA GeForce 7950 GT graphics card. One monitor is dedicated to this machine and the other is connected to a DVI KVM switch.
When I switch to my other computer, Windows 7 disables that monitor. When I switch back, however, it does not re-enable it. The only case in which the second monitor comes back automatically is when I switch back after Windows has put the monitors into power-save mode. Otherwise I have to open the NVIDIA Control Panel every time and re-enable the monitor manually.
Under Windows XP I could simply disable the NVIDIA service to stop it from auto-detecting the monitor (that approach does not solve the problem under Windows 7), and under Vista there was a registry hack that prevented it. That setting appears to have been removed in Windows 7.
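For reference, the XP workaround was just a matter of stopping the NVIDIA helper service and keeping it from starting at boot, roughly like this from an elevated command prompt (the service name NVSvc is from memory and may vary by driver version, so confirm it with sc query first):

    REM List services and confirm the NVIDIA service name before touching it.
    sc query state= all | findstr /i nvidia

    REM Stop the service, then disable it so it does not start at boot.
    REM "NVSvc" (NVIDIA Display Driver Service) is the name as I recall it on XP.
    net stop NVSvc
    sc config NVSvc start= disabled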
I have found similar questions posted on this site, but nothing that matches my problem exactly. The following question comes closest, but it does not provide a solution:
http://superuser.com/questions/96683/how-to-fix-monitor-detection-on-windows-7
Is there a way in Windows 7 to disable monitor auto-detection?
Update: I just added a second graphics card to my Windows 7 64-bit machine and plugged one monitor into each card. Now, when I use the KVM switch to switch back and forth, the second monitor is re-enabled like it should be. There are, however, a few quirks. If a program is maximized on the second monitor and has focus, it moves to monitor 1 when I switch. If a program is maximized on the second monitor and does not have focus, it behaves as though it were minimized when I switch, and when I bring it back up it shows up maximized on monitor 1.
That is definitely better than it was, but I am still looking for a way to disable the auto-detection.