I’ve read that AMD’s 2.0 GHz processors are comparable to Intel’s 3.0 GHz
processors. That seems totally counterintuitive. Why would that be?
To be totally honest, this is one of those areas that I actually spend very
little time on. There are hardware geeks out there who will happily rant and
rave about how one is better than the other. In fact, I kinda hope some of them
will enlighten us with a comment on this article.
But clearly, all Gigahertz are not created equal.
In general, as your question alludes to, there’s much more than gigahertz at
play in how fast your computer operates. It’s a combination of the
processor’s internal architecture, the efficiency of its interfaces to the
motherboard, the composition of the motherboard itself, and other items that all
combine to make systems faster, or slower, when compared to others. On top of
that, add the efficiency of many peripheral devices, like your video adapter,
which can also have a dramatic effect on the speed, or the perceived speed, of
your computer.
I couldn’t tell you the equation, but I know it’s complex. :-)
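That said, the simplest piece of the puzzle is easy to illustrate: a processor’s effective throughput is roughly its clock speed multiplied by how much work it completes per clock tick (often called instructions per cycle, or IPC). Here’s a minimal sketch of that idea; the IPC numbers are made up purely for illustration, not measurements of any actual AMD or Intel chip.

```python
# Rough, illustrative model: effective throughput ~ clock speed x IPC.
# The IPC values below are hypothetical, chosen only to show how a
# lower-clocked processor can keep pace with a higher-clocked one.

def effective_throughput(clock_ghz, instructions_per_cycle):
    """Approximate billions of instructions completed per second."""
    return clock_ghz * instructions_per_cycle

# Hypothetical chips: "A" does more work per tick, "B" ticks faster.
chip_a = effective_throughput(clock_ghz=2.0, instructions_per_cycle=1.5)
chip_b = effective_throughput(clock_ghz=3.0, instructions_per_cycle=1.0)

print(f"Chip A: {chip_a:.1f} billion instructions/second")
print(f"Chip B: {chip_b:.1f} billion instructions/second")
# Both print 3.0 -- the same effective speed despite a 50% clock difference.
```

And that’s before you account for memory, motherboard, and peripheral effects, which is exactly why the real equation is so complex.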
What I can tell you is that I don’t think it’s worth spending a lot of
energy on, unless it’s a matter of curiosity or you really have a need to
squeeze every last bit of performance out of your machine.
Personally, I look at the applications and load that the server will be
handling. If you expect to get “close” to stressing the slower of the two
processors, in my mind, you’re too close for the other one as well. A 20%
difference in throughput between two processors, for example, is not
make-or-break to me. What is, is cost, and other factors that add up to, say,
100% or greater differences. So when selecting a server, which I have done in
recent years, I try to look at the total picture … cost, disk capacity &
speed, network bandwidth, memory usage and capacity, my expectations for video
performance … and somewhere in the middle of all that … processor speed.
But that’s just me.
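If you wanted to make that kind of “total picture” comparison explicit, here’s a minimal sketch of the idea. Every spec, price, and score below is hypothetical, invented just to show the shape of the trade-off; your own workload and priorities would set the real values.

```python
# Hypothetical "total picture" server comparison. All specs and prices
# are made-up illustrations, not real products or a recommendation.

servers = {
    "Server A": {"cost_usd": 2000, "disk_tb": 4, "net_gbps": 1,
                 "ram_gb": 32, "cpu_score": 100},
    "Server B": {"cost_usd": 4200, "disk_tb": 4, "net_gbps": 1,
                 "ram_gb": 32, "cpu_score": 120},
}

for name, s in servers.items():
    # A 20% processor edge (120 vs. 100) matters far less than a
    # 100%+ cost difference when everything else is equal.
    dollars_per_cpu_point = s["cost_usd"] / s["cpu_score"]
    print(f"{name}: CPU score {s['cpu_score']} at "
          f"${dollars_per_cpu_point:.0f} per point of performance")
```

In this made-up example, the faster processor costs more than twice as much per unit of performance, which is the kind of lopsided trade-off that decides it for me long before gigahertz does.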