05-28-2003, 09:12 PM | #1 (permalink) |
Insane
|
Corruption in Video Card Market
Link
Nvidia, ATI Accused of Altering Benchmarks
Graphics chips may artificially inflate results, test maker says.
Tom Krazit, IDG News Service
Wednesday, May 28, 2003

An organization that produces benchmark tests for measuring the performance of graphics chips has updated its test after it found code in Nvidia's graphics chip drivers that detected certain tests and altered the chip's performance to inflate results.

Futuremark, which developed the 3DMark03 benchmark, posted a statement on its Web site Friday detailing allegations that some Nvidia drivers have "detection mechanisms" that trigger a higher level of performance when certain tests within 3DMark03 are detected. ATI Technologies, Nvidia's rival, also appears to have altered its drivers in order to boost test results, Futuremark said.

Futuremark said it identified eight instances where Nvidia's Detonator FX 44.03 and 43.51 WHQL drivers detected specific 3DMark03 tests and inflated the benchmark results to the detriment of overall image quality. The allegations were first lodged by ExtremeTech, a hardware enthusiast Web site.

Tough Competition

3DMark03 has been criticized by some as favoring products from ATI over those of Nvidia, said Peter Glaskowsky, editor in chief of the Microprocessor Report in San Jose, California. The benchmark contains certain DirectX calls that favor ATI, he said, and Nvidia appears to have been trying to rectify the situation by altering its drivers to convert that type of DirectX call into one that works better on Nvidia hardware. DirectX is an API (application program interface) developed by Microsoft to help games run on Windows operating systems, and 3DMark03 is designed to measure performance on hardware running DirectX 9.0, the latest version.

"It looks like what Nvidia is doing is trying to counteract their disadvantage, but some of the methods described [by Futuremark] aren't the correct methods for dealing with that problem," Glaskowsky said.

In some instances the 3DMark03 "shaders"--pieces of code that render the appearance of a surface, such as a road or a tree--are discarded in favor of ones found in the Nvidia drivers that work more efficiently, according to Futuremark's statement. In other cases, code in the drivers artificially reduces the workload demanded by the benchmark test, thereby increasing performance, Futuremark said.

Nvidia's Side of the Story

Nvidia, based in Santa Clara, California, did not respond to specific questions about Futuremark's allegations, choosing instead to issue a statement. "Since Nvidia is not part of the Futuremark beta program--a program which costs hundreds of thousands of dollars to participate in--we do not get a chance to work with Futuremark on writing the shaders like we would with a real applications developer," the statement said. "We don't know what they did but it looks like they have intentionally tried to create a scenario that makes our products look bad."

A new version of 3DMark03 with slightly different code that blocks the Nvidia drivers from detecting specific tests reduced the benchmark score of a system with Nvidia's GeForceFX 5900 Ultra and the 44.03 driver by 24 percent, Futuremark said.

Making Alterations?

In fact, ATI also appears to have been altering its software drivers to improve its test scores, Futuremark said. The new benchmark reduced the performance of a system with ATI's Catalyst 3.4 driver and the Radeon 9800 Pro by almost 2 percent. This was due to a drop in performance of some 8 percent for one particular part of the test, suggesting that ATI's drivers also had been making adjustments to produce better results. 3DMark03's margin of error was 3 percent.

ATI's drivers make use of an "optimization" that reorders the 3DMark03 instructions for DirectX 9.0 to take advantage of its architecture, said Patricia Mikula, public relations manager for ATI. Changing the code order isn't cheating, because ATI's drivers still produce the same result, just in a different way, according to Mikula. Nevertheless, ATI will remove the optimization from an upcoming driver release to avoid creating the perception that it is doing anything wrong, Mikula said.

Playing Games

Chip makers love to cite benchmark results in marketing materials when announcing a new chip. Benchmarks are virtually the only method of comparing the performance of competing chips or systems, but they are often criticized for not mirroring real-world performance. For that reason, gaming benchmarks carry more weight among certain groups of PC buyers. Benchmarks exist for popular PC games such as Doom, Quake, and others. If a user is interested in a specific game or type of game, he or she can purchase the graphics card or system that wins the performance crown for that game.

But according to Futuremark, benchmarks devised to measure performance in particular games don't demonstrate the overall performance of the hardware, and are more ripe for exploitation by driver cheats than Futuremark's own tests.

"What you want in a benchmark is something that correctly reflects the performance of games, not just the ones now, but the ones that haven't been developed yet," Glaskowsky said. 3DMark03 was supposed to do just that, but it doesn't accurately portray the performance of both companies' hardware, he said.

-------------------------
Rather lengthy, but quite humorous!
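For the curious, here's roughly how the "detection mechanisms" Futuremark describes are generally assumed to work: a minimal C++ sketch, not actual driver code, with every name in it (ShaderBlob, hand_tuned, select_shader, fnv1a) invented for illustration. It also shows why Futuremark's slightly altered benchmark defeated the trick: reordering the shader code changes the bytes, the fingerprint no longer matches, and the driver falls back to the real shader.

[code]
// Hypothetical sketch of app-specific shader substitution in a driver.
// NOT actual Nvidia/ATI code; all names here are invented.
#include <cstddef>
#include <cstdint>
#include <unordered_map>

struct ShaderBlob {
    const uint8_t* bytes;
    size_t len;
};

// Fingerprint the incoming shader bytecode (64-bit FNV-1a hash).
static uint64_t fnv1a(const uint8_t* data, size_t len) {
    uint64_t h = 14695981039346656037ull;
    for (size_t i = 0; i < len; ++i) {
        h ^= data[i];
        h *= 1099511628211ull;
    }
    return h;
}

// Known benchmark shaders mapped to faster hand-written replacements.
static std::unordered_map<uint64_t, ShaderBlob> hand_tuned;

// Called when an application submits a shader: if the fingerprint matches
// a known benchmark shader, silently run the replacement instead.
// Changing even one instruction in the benchmark changes the fingerprint,
// so the lookup misses and the score drops back to honest numbers.
ShaderBlob select_shader(const ShaderBlob& incoming) {
    const uint64_t h = fnv1a(incoming.bytes, incoming.len);
    auto it = hand_tuned.find(h);
    return it != hand_tuned.end() ? it->second : incoming;
}

int main() {
    const uint8_t fake[] = {0x01, 0x02, 0x03};
    ShaderBlob incoming{fake, sizeof fake};
    // The table is empty here, so the original shader passes through untouched.
    return select_shader(incoming).bytes == incoming.bytes ? 0 : 1;
}
[/code]

Drivers have also reportedly keyed off things as crude as the executable's file name, which is even easier to defeat: just rename the file.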
05-28-2003, 10:49 PM | #2 (permalink) |
Junkie
Location: Right here
|
It actually appears that neither card maker is doing anything wrong. We might be holding on to an old model of technological innovation.
The article asserts that the cards "inflated" the scores by using more efficient rendering and by other, unspecified means of reducing the demand on the card. Just think of the possibilities, however, of software optimized for various cards--ATI, Nvidia, etc. Each card would reach that higher performance with code that makes calls similar to the benchmark program's.
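Something like what a game engine could do with vendor-specific render paths. A toy C++ sketch of the idea, with every name (RenderPath, pick_render_path, the vendor strings) assumed for illustration; a real engine would get the vendor string from DirectX or OpenGL at startup.

[code]
// Toy sketch: an application openly choosing a render path tuned per vendor,
// rather than the driver guessing behind the application's back.
// All names invented; not from any real engine.
#include <cstdio>
#include <cstring>

enum class RenderPath { Generic, NvidiaTuned, AtiTuned };

// Pick a path from the vendor string the graphics API reports.
RenderPath pick_render_path(const char* vendor) {
    if (std::strstr(vendor, "NVIDIA")) return RenderPath::NvidiaTuned;
    if (std::strstr(vendor, "ATI"))    return RenderPath::AtiTuned;
    return RenderPath::Generic;  // safe fallback for everything else
}

int main() {
    // In a real engine this string would come from the driver.
    const char* vendor = "NVIDIA Corporation";
    switch (pick_render_path(vendor)) {
        case RenderPath::NvidiaTuned: std::puts("NV-tuned shaders");  break;
        case RenderPath::AtiTuned:    std::puts("ATI-tuned shaders"); break;
        default:                      std::puts("generic shaders");   break;
    }
}
[/code]

Done in the open by the application, that's optimization; done secretly by the driver against one specific benchmark, that's where the cheating accusations start.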
__________________
"The theory of a free press is that truth will emerge from free discussion, not that it will be presented perfectly and instantly in any one account." -- Walter Lippmann "You measure democracy by the freedom it gives its dissidents, not the freedom it gives its assimilated conformists." -- Abbie Hoffman |
05-29-2003, 11:03 AM | #3 (permalink) |
Psycho
Location: Where hockey pucks run rampant
|
Rewriting shaders behind an application's back in a way that changes the output under non-controlled circumstances is absolutely, positively wrong and indefensible.
Rewriting a shader so that it does exactly the same thing, but in a more efficient way, is generally acceptable compiler optimization, but there is a range of defensibility from completely generic instruction scheduling that helps almost everyone, to exact shader comparisons that only help one specific application. Full shader comparisons are morally grungy, but not deeply evil.

The significant issue that clouds current ATI / Nvidia comparisons is fragment shader precision. Nvidia can work at 12 bit integer, 16 bit float, and 32 bit float. ATI works only at 24 bit float. There isn't actually a mode where they can be exactly compared. DX9 and ARB_fragment_program assume 32 bit float operation, and ATI just converts everything to 24 bit. For just about any given set of operations, the Nvidia card operating at 16 bit float will be faster than the ATI, while the Nvidia operating at 32 bit float will be slower. When DOOM runs the NV30 specific fragment shader, it is faster than the ATI, while if they both run the ARB2 shader, the ATI is faster.

When the output goes to a normal 32 bit framebuffer, as all current tests do, it is possible for Nvidia to analyze data flow from textures, constants, and attributes, and change many 32 bit operations to 16 or even 12 bit operations with absolutely no loss of quality or functionality. This is completely acceptable, and will benefit all applications, but will almost certainly induce hard to find bugs in the shader compiler. You can really go overboard with this -- if you wanted every last possible precision savings, you would need to examine texture dimensions and track vertex buffer data ranges for each shader binding. That would be a really poor architectural decision, but benchmark pressure pushes vendors to such lengths if they avoid outright cheating.

If really aggressive compiler optimizations are implemented, I hope they include a hint or pragma for "debug mode" that skips all the optimizations.

John Carmack

taken from: http://slashdot.org/comments.pl?sid=65617&cid=6051216

** just to fan the flames **
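To make the data-flow idea concrete, here's a toy C++ sketch of a compiler pass in the spirit of what Carmack describes: demote fp32 operations to fp16 only when every input looks safe, and honor the "debug mode" switch he asks for. Everything in it (Op, Operand, SrcKind, the safety rule) is invented and grossly simplified; a real compiler would need genuine range and precision analysis.

[code]
// Toy sketch of precision demotion in a shader compiler. All types and the
// safety heuristic are invented for illustration; real analysis is far subtler.
#include <vector>

// Where an operand's value comes from, as far as the compiler can tell.
enum class SrcKind { Texture8Bit, ConstantSmall, Attribute, Unknown };

struct Operand { SrcKind kind; };

struct Op {
    std::vector<Operand> inputs;
    bool use_fp16 = false;  // default to full fp32 precision
};

// Heuristic: 8-bit texture reads and small constants are bounded tightly
// enough that fp16 usually loses nothing visible. Anything with an unknown
// range (in this simplified sketch, attributes too) must stay fp32.
static bool fits_in_fp16(const Operand& o) {
    return o.kind == SrcKind::Texture8Bit || o.kind == SrcKind::ConstantSmall;
}

// The debug_mode flag is the pragma Carmack asks for: skip every
// optimization so miscompiles can be ruled out when chasing rendering bugs.
void assign_precisions(std::vector<Op>& shader, bool debug_mode) {
    if (debug_mode) return;  // leave everything at fp32
    for (Op& op : shader) {
        bool all_safe = true;
        for (const Operand& in : op.inputs) {
            if (!fits_in_fp16(in)) { all_safe = false; break; }
        }
        op.use_fp16 = all_safe;  // demote only when every input is bounded
    }
}

int main() {
    Op safe;  safe.inputs  = {{SrcKind::Texture8Bit}, {SrcKind::ConstantSmall}};
    Op risky; risky.inputs = {{SrcKind::Attribute}};
    std::vector<Op> shader = {safe, risky};
    assign_precisions(shader, /*debug_mode=*/false);
    // shader[0].use_fp16 is now true; shader[1].use_fp16 stays false.
}
[/code]

The "going overboard" Carmack warns about would mean feeding this pass per-binding texture dimensions and vertex buffer ranges, which is exactly the kind of complexity benchmark pressure invites.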
__________________
Lead me, follow me, or get out of my way! |
05-29-2003, 12:21 PM | #4 (permalink) |
Psychopathic Akimbo Action Pirate
Location: ...between Christ and Belial.
|
Yeah, it's really disappointing to see this. nVidia fanboys and ATI fanboys are really having a go at it right now, trading accusations from both sides: "Well, nVidia/ATI had to start cheating because ATI/nVidia was doing it first!"
__________________
On the outside I'm jazz, but my soul is rock and roll. Sleep is a waste of time. Join the Insomniac Club. "GYOH GWAH-DAH GREH BLAAA! SROH WIH DIH FLIH RYOHH!!" - The Locust |
05-29-2003, 12:47 PM | #5 (permalink) |
Sultana ruined my evil persona
Location: Los Angeles
|
...and the war continues.
I myself was debating my next card purchase. I wanted an ATI 9800 Pro, or the same card as an All-In-Wonder. But looking at the new Nvidia cards without the mammoth fan duct, plus the fact that Nvidia makes a better Linux driver, has made me want to buy an Nvidia again. At this point, what good is 10-20 more fps gonna do if my eyes can't see the difference? As long as it supports the new features, of course.
__________________
His pants are tight...but his morals are loose!! Last edited by Krycheck; 05-29-2003 at 03:59 PM.. |
05-29-2003, 08:17 PM | #6 (permalink) |
Psycho
|
Do you measure hp and torque at the flywheel or wheels?
BHP or net? Variables have a place in every equation. Some engineer can always rationalise the testing method, the outcome, and why ambient conditions figured into the standard deviation. Benchmarks be damned. I like the way my ATI works. |
Tags |
card, corruption, market, video |