Comparing CPU frequencies
I have two computers: a one-year-old laptop and a three-year-old desktop. I'm wondering why the two perform about the same when their clock speeds are so different.
For pure floating-point work such as rendering, performance is identical, while for memory/CPU-intensive tasks the laptop is slightly ahead. The latter is a factorization task that uses about 600 MB of memory. Here are the specs:
laptop:
Intel Pentium M 738, 1.4 GHz, 2 MB L2 cache
FSB 100 MHz, bus speed 400 MHz
1 GB + 128 MB DDR SDRAM @ 133 MHz
FSB:DRAM 3:4, timings 2-2-3
desktop:
Intel Pentium 4, 2.8 GHz, 512 KB L2 cache
FSB 133 MHz, bus speed 533 MHz
512 MB + 512 MB DDR SDRAM @ 133 MHz
FSB:DRAM 1:1, timings 2.5-3-3
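
One thing that struck me while writing the specs down (assuming my arithmetic is right): the theoretical peak memory bandwidth should be about the same on both machines, since both run the DDR SDRAM at 133 MHz:

    133 MHz × 2 (double data rate) × 8 bytes per transfer ≈ 2.1 GB/s

So if the factorization run were limited by memory bandwidth rather than by the core, similar times would at least be plausible.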
The main difference, apart from the CPU frequency, is the L2 cache size, but I can't see why that would matter much, since the data isn't reused during solving.
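
In case it helps, this is roughly the kind of micro-benchmark I have in mind when I say compute-bound vs. memory-bound (a simplified C sketch, not my actual factorization code):

    #include <stdio.h>
    #include <stdlib.h>
    #include <time.h>

    #define N (16 * 1024 * 1024)   /* 16M doubles = 128 MB, far larger than either L2 cache */

    int main(void)
    {
        double *a = malloc(N * sizeof *a);
        if (!a) return 1;
        for (size_t i = 0; i < N; i++)
            a[i] = 1.0 + i * 1e-9;

        /* Compute-bound: many dependent FLOPs on a value that stays in a register. */
        clock_t t0 = clock();
        double x = 1.000001;
        for (long i = 0; i < 200 * 1000 * 1000L; i++)
            x = x * 1.0000001 + 1e-12;
        double t_cpu = (double)(clock() - t0) / CLOCKS_PER_SEC;

        /* Memory-bound: one streaming pass over an array too large to cache,
           with almost no arithmetic per element. */
        t0 = clock();
        double sum = 0.0;
        for (size_t i = 0; i < N; i++)
            sum += a[i];
        double t_mem = (double)(clock() - t0) / CLOCKS_PER_SEC;

        printf("compute-bound loop: %.3f s (x = %g)\n", t_cpu, x);
        printf("memory-bound pass:  %.3f s (sum = %g)\n", t_mem, sum);
        free(a);
        return 0;
    }

My expectation would be that the first loop scales with clock speed while the second one mostly doesn't, but I'd like to understand why the real workloads behave the way they do.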