As far as I know, you can't, thencrow.
At work, I run two 21" Sony Trinitron monitors on an ATI card with dual outputs, and full-screen applications (which probably render through DirectX) don't use both monitors. I can see why they can't, too...
For all of you with a dual setup, try this experiment. Take a window and drag it around one monitor, being careful never to let any part of it cross onto the other monitor. Even on a crappy video card, the movement should look smooth. Do the same on the other monitor; again, it should look smooth. Now drag the window quickly back and forth between the monitors and watch what happens: it should suddenly become very choppy. The effect is especially pronounced at high resolution with a large window (I run both monitors at 1600x1200).
I believe the reason is that, while the video card supports hardware blitting (fast rectangular memory copies, which is what lets an image as big as a window move smoothly), it can't blit directly into another card's memory. At the very least, all that image data has to travel across the system bus to the other card. I see the same slowdown even with my single dual-output card, which makes me suspect it's really just two video cards moulded together.
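To make the idea concrete, here's a rough sketch (purely illustrative, not real driver code) of a blit as a rectangular memory copy. A window sitting entirely on one monitor is a single copy into one framebuffer; a window straddling the seam has to be split into two partial copies, and on real hardware the second one would have to cross the system bus to the other card:

```python
# Illustrative model of a "blit": a rectangular copy between framebuffers.
# Buffer sizes match the setup described above (1600x1200, 32-bit pixels).

WIDTH, HEIGHT, BPP = 1600, 1200, 4

def make_framebuffer(fill=0):
    return bytearray(bytes([fill]) * (WIDTH * HEIGHT * BPP))

def blit(src, dst, x, y, w, h):
    """Copy the w x h rectangle at (x, y) from src into dst, row by row."""
    for row in range(h):
        start = ((y + row) * WIDTH + x) * BPP
        dst[start:start + w * BPP] = src[start:start + w * BPP]

window = make_framebuffer(fill=0xFF)  # stand-in for the window's pixels

# Window entirely on one monitor: one blit into one framebuffer.
fb_left = make_framebuffer()
blit(window, fb_left, x=100, y=100, w=800, h=600)

# Window straddling the seam: the same rectangle must be split into two
# blits, one per framebuffer -- and on real hardware the right-hand copy
# would have to travel over the bus to the second card.
fb_right = make_framebuffer()
blit(window, fb_left,  x=WIDTH - 400, y=100, w=400, h=600)  # left half
blit(window, fb_right, x=0,           y=100, w=400, h=600)  # right half
```

On real hardware the single-monitor case stays entirely in one card's video memory, which is why it's fast; the split case is what falls back to the bus.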
If simple blitting routines perform this poorly across the bus (or, in my case, even within a single dual-output card), imagine trying to support polygonal rendering across it! Or texture mapping! Or anything, really...