Quote:
Originally Posted by xepherys
Well, this is partly true. So, you have SLI'd 8800GTX cards... does having an 80fps frame rate improve your experience that much over having a 70fps rate? Obviously preventing the frame rate from dropping below 30fps is important, more so than average frame rate, but on a console with a properly coded game, you'll always be above 30fps.
|
COD4 has an amazing frame rate on console, but the frame rate isn't what I'm referring to. The image quality, antialiasing, and effects that a decent gaming PC allows are superior to a console's. The $100 video card and $70 CPU I was using last year ran Bioshock embarrassingly better than the 360 version, and I saw the 360 version in 1080p on a 42" LCD TV with absolutely no ghosting. The 360 version looks horrid compared to even a PC that's low-end by today's standards. The gap in COD4 isn't nearly as big, but it's there.
Quote:
Originally Posted by xepherys
For MOST games a 2.8GHz dual-core and a 2.8GHz single-core are going to perform within a very tiny margin of difference.
|
This is flat-out wrong. Two years ago you'd have been right, but nowadays almost all PC games use dual-core CPUs. Tom's Hardware published an article about this exact issue this past week; check it out. It tests a midrange video card with single-core and dual-core CPUs, and the results are as expected: single-core CPUs have no place in PC gaming nowadays, even in games made last year.
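To give a rough idea of why a second core shows up directly in frame rate (this is just a sketch, not how any particular engine actually schedules its work, and the function names and timings are made up):

// Sketch: main thread renders the current frame while a worker thread
// simulates the next one. Hypothetical workloads, ~8 ms each.
#include <chrono>
#include <future>
#include <iostream>
#include <thread>

// Stand-in for a frame's worth of physics/AI work.
int simulate(int frame) {
    std::this_thread::sleep_for(std::chrono::milliseconds(8));
    return frame + 1;
}

// Stand-in for submitting draw calls for an already-simulated frame.
void render(int frame) {
    std::this_thread::sleep_for(std::chrono::milliseconds(8));
    std::cout << "rendered frame " << frame << "\n";
}

int main() {
    int frame = 0;
    for (int i = 0; i < 5; ++i) {
        // Kick off simulation of the next frame on a second core...
        std::future<int> next = std::async(std::launch::async, simulate, frame);
        // ...while this core renders the current one.
        render(frame);
        frame = next.get();
    }
}

On a single core those two 8 ms chunks run back to back, about 16 ms per frame; on a dual core they overlap and you're back near 8 ms. That's the kind of gap the Tom's Hardware tests are showing.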