>>>Essentially we will be mirroring a fairly large database locally,
Ok. How often do you have to sync? This could be a painful process. Gigabit ethernet is fast - but I'm not sure it's fast enough. It's worth calculating the transfer time.
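For example, a back-of-the-envelope estimate (the 500 GB size and 70% link efficiency are just placeholder assumptions - plug in your real figures):

    # rough sync-time estimate; all numbers here are assumptions
    db_size_gb = 500          # placeholder: size of the mirrored database
    link_gbps = 1.0           # gigabit ethernet
    efficiency = 0.7          # realistic throughput after TCP/protocol overhead

    mb_per_s = link_gbps * 1000 / 8 * efficiency     # ~87 MB/s
    minutes = db_size_gb * 1024 / mb_per_s / 60
    print(f"~{minutes:.0f} minutes per full sync")   # ~98 minutes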
>>>and then writing scripts that parse that data looking at particular fields and attempting to find patterns.
On this note.... here are the questions. These are just informal weekend thoughts, OK? I'm on my day off (and am running on minimum coffee).
- How much data do you expect to hold in memory at one time?
- Can you realistically work on a small chunk of data at a time, independent of the rest? (This has implications for your programming too; see the sketch after this list.)
- Will you be doing large hashes/sorts or other calculations that will spill to disk, causing contention between reading and writing activity?
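On the chunking question, here's a minimal sketch of the chunk-at-a-time idea in Python. I'm using sqlite3 purely as a stand-in for whatever database you're actually mirroring, and the table/column names are made up:

    import sqlite3

    def fetch_in_chunks(cursor, chunk_size=10_000):
        # Yield rows in fixed-size batches so only one chunk sits in memory.
        while True:
            rows = cursor.fetchmany(chunk_size)
            if not rows:
                break
            yield rows

    # 'mirror.db', 'records', and the field names are placeholders.
    conn = sqlite3.connect("mirror.db")
    cur = conn.cursor()
    cur.execute("SELECT field_a, field_b FROM records")

    for chunk in fetch_in_chunks(cur):
        for field_a, field_b in chunk:
            pass  # run your pattern matching on one small batch at a time

    conn.close()

If you can structure the scripts like this, memory use stays flat no matter how big the mirror gets, and the disk sees mostly sequential reads.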
On CPUs... I suggest you look at the Intel models, unless you are scaling past 4 cores.
The AMD/Intel commodity processors are well compared on tomshardware.com, by the way, i.e. they have graphs. AnandTech is another site, although like Tom's, they only deal with retail/commodity processors.