Quote:
Originally Posted by Lasereth
For the average computer user the 360 version is clearly better because the PC version requires knowledge on how to make a good gaming PC to get it to look good. But if you do have that knowledge, the PC experience is great.
Well, this is partly true. So you have SLI'd 8800GTX cards... does an 80fps frame rate improve your experience that much over 70fps? Obviously keeping the frame rate from ever dropping below 30fps matters more than the average frame rate, but on a console with a properly coded game, you'll always be above 30fps.
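To put rough numbers on that, here's some quick frame-time math (Python, purely for illustration):

    # Frame time is the inverse of frame rate: ms_per_frame = 1000 / fps
    for fps in (80, 70, 30):
        print(f"{fps:2d} fps -> {1000 / fps:5.1f} ms per frame")

    # 80 fps -> 12.5 ms, 70 fps -> 14.3 ms, 30 fps -> 33.3 ms.
    # The 80-vs-70 gap is under 2 ms per frame; a dip below 30 fps
    # means 33+ ms frames, which is stutter you can actually feel.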
Yes, the overall experience in a single-player game (or from your own POV in a multiplayer game) can be nicer... though I've played CoD4 on my 360 at 1080p and on my PC at 1080p and saw very little actual difference. *shrug*
Quote:
Originally Posted by Willravel
Does that requirement take into account dual cores? A 2.8GHz Core 2 Duo has more computational power than a single core 2.8GHz.
Tsk tsk... also only partly true. If the software isn't coded to spread its work across multiple cores, a dual-core or quad-core isn't very helpful. Granted, the OS can offload its own tasks to the core the game isn't using, but the CPU is rarely the limiting factor in gameplay, and when it is, OS overhead usually isn't the culprit. For MOST games, a 2.8GHz dual-core and a 2.8GHz single-core are going to perform within a very tiny margin of each other.
EDIT: To clarify, Will is technically correct that the dual-core has more computational power. The issue is whether your game is designed to take advantage of that additional power. It'd be like upgrading from a V8 to a V16 but having the fuel injection system only actually feed 8 of the cylinders. The engine is surely capable of producing more power... but isn't configured to do so.
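If you want to see the effect for yourself, here's a quick Python sketch (the busy loop is just a made-up stand-in for game logic, not anything from a real engine; it uses processes rather than threads because Python's GIL keeps threads on one core's worth of work). The same CPU-bound job runs serially and then split across two workers, and only the second version benefits from a second core:

    import time
    from multiprocessing import Pool

    def work(n):
        # CPU-bound busy loop standing in for per-frame game logic
        total = 0
        for i in range(n):
            total += i * i
        return total

    if __name__ == "__main__":
        N = 10_000_000

        # "Single-core coded" path: both chunks run one after the other
        # on one core; the second core just sits idle.
        start = time.perf_counter()
        work(N)
        work(N)
        print(f"serial:   {time.perf_counter() - start:.2f}s")

        # "Multi-core coded" path: the same work split across two
        # processes, so a dual-core can actually run them side by side.
        start = time.perf_counter()
        with Pool(processes=2) as pool:
            pool.map(work, [N, N])
        print(f"parallel: {time.perf_counter() - start:.2f}s")

On a single-core chip both paths take about the same time; on a dual-core, only the second one speeds up. That's the whole point about games that weren't written for it.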