The thing to remember is that NVidia cards use 32-bit precision for floating-point calculations, while Microsoft, after screwing NVidia in the lawsuit over the XBox profits, made the DX9 standard require only 24-bit precision (which is what ATi uses).
As such, NVidia had a choice: either (a) run the cards at 16-bit precision and admit they aren't DX9 compliant, or (b) do on-the-fly conversion from 32-bit down to 24-bit and take a big performance hit to stay DX9 compliant.
The NVidia cards are the ones with the technologically superior floating point precision (in this particular case), but due to the DX9 standard, they're the ones getting screwed.
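To see why the precision gap matters at all, here's a quick Python sketch comparing what actually gets stored in a 16-bit half float versus a 32-bit single float (Python's `struct` module has no 24-bit format, so ATi's in-between precision can't be shown directly; this just illustrates the two endpoints NVidia was choosing between):

```python
import struct

def roundtrip(value, fmt):
    # Pack a Python float into the given IEEE format and read it back,
    # revealing how much precision that format actually keeps.
    return struct.unpack(fmt, struct.pack(fmt, value))[0]

v = 1.0 / 3.0
fp32 = roundtrip(v, '<f')  # 32-bit single precision (NVidia's full-precision path)
fp16 = roundtrip(v, '<e')  # 16-bit half precision (the fast "partial precision" path)

print(fp32)  # close to 1/3, error only in the 8th decimal place or so
print(fp16)  # noticeably off already in the 4th decimal place
```

The half-precision result drifts from the true value thousands of times sooner than the single-precision one, which is why "just run everything at 16-bit" wasn't a free lunch for shader quality.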
__________________
Eat antimatter, Posleen-boy!