
Is Moore’s Law over?

Question: My current computer is about 8 months old. It’s still being sold at Best Buy as a new computer for about the same price. Also, many of the other models are about the same as mine. Nothing like this was around a decade ago. Is Moore’s Law finished? Have we hit a speed barrier in new computers? Or are market forces simply responding to “good enough” computing?

Moore’s Law is often inaccurately quoted as saying that computer speeds double every set number of years. In reality, what Gordon Moore observed back in 1965 (and revised in 1975) is that the number of transistors that can be packed onto a single chip doubles roughly every two years.
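To put numbers on that observation, here’s a minimal back-of-the-envelope sketch of what “doubling every two years” compounds into, assuming (purely for illustration) the oft-cited starting point of Intel’s 1971 4004 and its roughly 2,300 transistors:

```python
# Back-of-the-envelope sketch of "doubling roughly every two years".
# The starting figures (Intel 4004, 1971, ~2,300 transistors) are
# commonly cited approximations, used here only for illustration.
count = 2_300
for year in range(1971, 2025, 2):
    print(f"{year}: ~{count:,} transistors")
    count *= 2
```

Compounding alone takes the sketch past a hundred billion transistors by the early 2020s, which is roughly the neighborhood of today’s largest real chips.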

Now, I can’t tell you whether that still holds true. There are certainly physical limits that manufacturers must run into at some point, but some other interesting things have been happening as well.


Good enough

My gut reaction, at least at the retail-shelf level, is to agree with what you called “good enough computing”. Computers are relatively inexpensive, and with so much of what we do migrating online, the pressure for a purely faster CPU is probably lessening.

However, as I said, there are some other interesting pressures at work.

Look in your pocket

Much of the recent advancement in CPU technology has happened in small, low-powered devices. In fact, you can probably see the improvements being made in mobile phones and tablets every year or two. I certainly have more computing power in my cell phone than was even available only two years ago!

Similarly, rather than increasing computational power, many chip makers are now looking instead to reduce power consumption. Once again, once the device is “good enough” to perform the tasks we want it to, we now – sometimes desperately – want that battery to last much, much longer than it does. So mobile devices, and even traditional laptops, are getting more energy-efficient CPUs.

Multiple cores

Another trend that’s actually more in line with the “number of transistors on a chip” rule has to do with the number of CPUs on a chip.

We used to think that making a single CPU go as fast as possible was the best way to get the most performance out of a machine. As we reach an assortment of what I’ll call practical and physical limits, that approach no longer makes as much sense. So instead, chip makers started making dual-core processors – two complete CPUs on a single chip; then quad-core; then six-core. The new desktop machine I’m currently recording this Answercast on has twelve CPU cores.

The clock speed – the megahertz or gigahertz – you’re looking at may not appear to be growing, but once you realize there are four, six, or twelve processors inside instead of just one, the processing ability in front of you has still increased dramatically.
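To see what that buys you, here’s a minimal Python sketch of a CPU-bound job split across cores; the workload function and numbers are invented for illustration:

```python
import math
from multiprocessing import Pool, cpu_count

def crunch(n: int) -> int:
    # Invented CPU-bound busywork to keep one core occupied.
    return sum(math.isqrt(i) for i in range(n))

if __name__ == "__main__":
    chunks = [2_000_000] * 12  # twelve pieces of work
    # One worker process per core: on a 12-core machine all twelve
    # pieces run simultaneously instead of one after another, even
    # though no single core got any faster.
    with Pool(processes=cpu_count()) as pool:
        results = pool.map(crunch, chunks)
    print(sum(results))
```

The clock speed never changes; the wall-clock time drops because the work is spread out.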

So, while Moore’s Law may be slowing down some, I think the fundamental concept remains. There are definitely continued improvements being made every year.

What might be changing is that those improvements now manifest in other ways – ways beyond simply increasing your chip’s speed every couple of years.


13 comments on “Is Moore’s Law over?”

  1. Not an expert, but I think the “physical limitations” that the manufacturers have encountered are related to heat.

    As the chips have gotten more and more densely packed, they do not have enough surface area for transferring heat away from the chip. Once upon a time, CPUs just needed to have some clear space around them for natural convection to keep them cool. Then they needed case fans. Then they needed solid heat sinks with fins. Now they have sealed liquid-cooled heat sinks with a dedicated fan (in addition to the case fan) and a special grease to transfer the heat from the chip to the sink.

    So maybe after the next breakthrough in heat-transfer technology, we will have a jump in processor performance.

    I can’t remember where I heard this theory. Nowhere reputable, but it sounded good to me at the time.

    • Not so much heat, but trying to make a smaller transistor. Haswell CPUs released months ago have 22nm transistors. Right now Intel is producing 14nm transistors on the newest models of processors. A human hair is about 100,000nm wide – roughly 7,000 times the width of one of those transistors (see the quick scale check after this thread). Next in a few years is 10nm, then maybe 8. CPUs are getting far more power efficient, generating less heat, and doing more, and now there are setups that don’t require a fan at all. The heat theory worked at the time, but now not so much.

      • It’s a little of what Michael Henry said, and a little of what Billy Bob said, combined. Each new Intel architecture increases the number of transistors per core as they become smaller and more heat-efficient. And every architecture has a limit for heat dissipation. Having learned from past endeavors, Intel has been able to start newer architectures closer to the edge of that heat-transfer limit. Thus we saw a big jump from early to late Pentium 4, from Pentium 4 to Pentium D, from Pentium D to Core 2 Duo, and from Core 2 Duo to the modern i* series.

        A new architecture is needed for the next “level” of CPU, and as has been mentioned, it is in the works. But things have slowed down until that happens, because the current i* series is maxed out for heat dissipation. Moore’s law is still relatively intact. We’ll see big jumps in speed again once smaller transistors that generate less heat (more efficient) are mastered. Then things will slow again until the next hurdle for smaller transistors is cleared.

        The cycle is the same, but we no longer start with 2.4GHz chips that can easily be overclocked to 4.5GHz on air (anyone remember the Pentium D 805? Wonderful CPUs!). There will always be some leeway for overclocking to remain safe under warranty, but the leeway is becoming less and less as the technology advances and Intel grows confident the chips will last.

        Will it continue at the same rate forever? Almost certainly not, as we’ll run into barriers for smaller transistors. But it’s quite possible that a technology which isn’t even on the drawing board now will be the wave of the future (no transistors at all). And feel free to substitute AMD or any manufacturer for Intel if it pleases you. The architectures, problems, and challenges are basically the same for all – at least for now.

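A quick scale check on the figures in this thread: a human hair is on the order of 100,000nm across, not 100nm, so the widths compare like this (all figures are rough, order-of-magnitude approximations):

```python
# Rough, order-of-magnitude widths in nanometers.
hair_nm = 100_000      # a human hair is roughly 0.1mm across
transistor_nm = 14     # the newest process node mentioned above
print(f"~{hair_nm / transistor_nm:,.0f} transistor widths per hair")  # ~7,143
```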
  2. This is a great question. I thought I was the only one who noticed this. I have been in the market for a new laptop for a while and have taken screenshots of new computers at Costco so I can see the improvements over time versus the cost. Lately, I have not seen much improvement in specs at all.

  3. More CPUs can theoretically make things faster if the software is written to take advantage of multiple CPUs. Sadly, a lot of the software we use isn’t, so performance has stagnated. But as Leo has said, it’s good enough.

    • I just upgraded an Acer Aspire One netbook about two weeks ago. I didn’t touch the CPU, but I upgraded from 1GB of RAM to 2GB, replaced the 5,400rpm hard drive with an SSD, and also upgraded from Windows XP to Windows 7 32-bit. The netbook actually ran faster with Windows XP installed once I did a few registry hacks, but Windows 7 is just an all-around better OS, so I was willing to take a small performance hit by upgrading to it.

  4. “They” will continue developing faster computing systems, for the simple reason that there’s a challenge in doing it.
    You can ask whether you need more than you have today, but the same question was raised when the first computer with a 200MB hard disk thundered onto the market.
    Or the miscalculation Microsoft made that 640KB of memory was all you needed.
    The computer as we know it is just in its infancy.
    A hundred years from now, people will wonder how by jove those cavemen managed to get anything done on those Tinkerbell machines.

    • “640KB memory was all you needed” – that statement may have been said by someone, sometime – but it was never said officially by Microsoft. Bill Gates never said it.

  5. In fact, Leo, with a little more research I believe you’ll find that Gordon Moore actually said that for a given COST of creating a processor, the number of transistors on it will double – in eighteen months.
    It was a comment on the fabrication process, not on physics and electronics per se, just that foundries were getting cleverer at doing it. He wasn’t saying that processors would continue endlessly increasing in density. That’s another story.

  6. I don’t want a 12-core CPU that runs at 4 GHz; I want a 4-core CPU that runs at 12 GHz! Most computational problems require serial processing, so they cannot be multi-threaded.

    • You’ll never see 12GHz with the current technology, as heat will prevent it from happening. But most software developers are now taking multiple processors into consideration. There are countless computational programs made to utilize multiple cores now, and some run more efficiently on video card processors than on CPUs – so it’s not like it can’t be done. Yes, you calculate one problem per core, but you can do 8 problems at once with 8 cores – so still 8x as fast (see the sketch after this thread). I contributed to Folding At Home for years with several multi-core CPU and high-end video card machines (until my electric bill became a problem and I decided to lower my carbon footprint for the environment).

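The 8x in the reply above is the best case, for jobs with no serial portion at all. When part of the work must run one step at a time, Amdahl’s law caps the speedup. A small illustrative calculation – the 10% serial fraction is an assumed figure, not a measurement:

```python
# Amdahl's law: speedup = 1 / (serial_fraction + parallel_fraction / cores)
def speedup(serial_fraction: float, cores: int) -> float:
    return 1 / (serial_fraction + (1 - serial_fraction) / cores)

# With an assumed 10% serial portion, 8 cores give well under 8x:
for cores in (1, 2, 4, 8, 12):
    print(f"{cores:>2} cores -> {speedup(0.10, cores):.2f}x")
# 8 cores -> ~4.71x; even unlimited cores would cap out at 10x.
```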
  7. Leo, why use more than a dual-core processor if so few programs use this technology? It’s just manipulation to get us to buy their CPUs. As I understand it, a company must redesign a program from scratch to use multiple cores. Leo, do you play a game, listen to music, use a torrent, surf the web, develop a map for an FPS game, use Photoshop, and listen to a radio channel in a different browser tab all at the same time? I research PC components on the internet and learn a lot about them before I buy, but I will never take advice from a PC clerk when making a purchase at that store, because in my country they push only the “good” ones.

    • I actually do an unhealthy number of things at the same time. :-) That being said, one application I do care about – my video editing software – will use all 12 cores at the same time if the job calls for it. But yes, between virtual machines, web browsers, the occasional World of Warcraft session, and more, I do tend to … stress … my machines a tad. I certainly don’t recommend 12 cores unless you know you have a need, and most people don’t. Two cores is a great base, and quad core provides a nice buffer to keep the machine responsive even when other applications are CPU-intensive.

