I’ve written before that most people have already encountered the Multicore Crisis but just didn’t realize it. Now I thought I would post a chart that graphically shows how the crisis has been with us longer than people might suppose:
The Multicore Crisis has to do with a shift in the behavior of Moore’s Law. The law basically says that we can expect the number of transistors on a chip to double every 18-24 months. For a long time, it meant that clock speeds, and hence the ability of a chip to run the same program faster, would also double along the same timeline. This was a fabulous thing for software makers and hardware makers alike. Software makers could write relatively bloated software (we’ve all complained about Microsoft for that!), secure in the knowledge that by the time they finished it and had it on the market for a short while, computers would be twice as fast anyway. Hardware makers loved it because, with machines getting so much faster so quickly, people always had a good reason to buy new hardware.
Alas, this trend has ground to a halt! It’s easy to see from the chart above that relatively little progress has been made since the curve flattens out around 2002. Here we are 5 years later in 2007. Five years is roughly three doublings at the historical pace, so the 3 GHz chips of 2002 should be running at about 24 GHz by now, but in fact, Intel’s latest Core 2 Extreme is running at about 3 GHz. Doh! I hate when this happens! In fact, Intel announced in 2003 that it was moving away from trying to increase clock speed and toward adding more cores. Four cores are available today, and soon there will be 8, 16, 32, or more cores.
What does this mean? First, Moore’s Law didn’t stop working. We are still getting twice as many transistors. The Core 2 now includes 2 complete CPUs for the price of one! However, unless you have software that’s capable of taking advantage of this, it will do you no good. It turns out there is precious little software that benefits, if we look at articles such as Jeff Atwood’s comparison of 4 core vs 2 core performance. Blah! Intel says that software has to start obeying Moore’s Law. What they mean is software will have to radically change how it is written to exploit all these new cores. The software factories are going to have to retool, in other words.
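To make that a little more concrete, here is a minimal sketch in Java of the sort of retooling involved: instead of marching through an array on one core, you split the work across however many cores the machine reports. This is my own toy example, not Intel’s or anyone else’s code, and names like ParallelSum are made up for illustration:

    import java.util.ArrayList;
    import java.util.List;
    import java.util.concurrent.*;

    public class ParallelSum {
        public static long sum(final long[] data) throws Exception {
            // One worker thread per available core.
            int cores = Runtime.getRuntime().availableProcessors();
            ExecutorService pool = Executors.newFixedThreadPool(cores);
            int chunk = (data.length + cores - 1) / cores;
            List<Future<Long>> parts = new ArrayList<Future<Long>>();

            // Hand each core its own slice of the array.
            for (int i = 0; i < data.length; i += chunk) {
                final int lo = i;
                final int hi = Math.min(i + chunk, data.length);
                parts.add(pool.submit(new Callable<Long>() {
                    public Long call() {
                        long s = 0;
                        for (int j = lo; j < hi; j++) s += data[j];
                        return s;
                    }
                }));
            }

            // Combine the partial sums from each core.
            long total = 0;
            for (Future<Long> f : parts) total += f.get();
            pool.shutdown();
            return total;
        }

        public static void main(String[] args) throws Exception {
            long[] data = new long[10000000];
            for (int i = 0; i < data.length; i++) data[i] = i;
            System.out.println(sum(data));
        }
    }

Even a trivial loop like this now needs explicit partitioning, thread management, and a merge step. That’s the retooling: it touches how programs are structured, not just how fast they run.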
With more and more computing moving into the cloud on the twin afterburners of SaaS and Web 2.0, we’re going to see more and more centralized computing built on utility infrastructure using commodity hardware. That means we have to learn to use thousands of these little cores. Google did it, but only with some pretty radical new tooling.
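The best-known piece of that Google tooling is MapReduce, which they described in a paper back in 2004. Here is a deliberately toy, single-machine sketch of the idea, just to show its shape (everything here is a simplified stand-in; the real thing shards both phases across thousands of commodity machines):

    import java.util.HashMap;
    import java.util.Map;

    public class WordCount {
        public static Map<String, Integer> count(String[] documents) {
            Map<String, Integer> counts = new HashMap<String, Integer>();
            for (String doc : documents) {
                // "Map" phase: emit each word from each document.
                for (String word : doc.toLowerCase().split("\\s+")) {
                    // "Reduce" phase: sum the counts per word.
                    Integer n = counts.get(word);
                    counts.put(word, n == null ? 1 : n + 1);
                }
            }
            return counts;
        }

        public static void main(String[] args) {
            String[] docs = { "the quick brown fox", "the lazy dog" };
            System.out.println(count(docs)); // e.g. {the=2, quick=1, ...}
        }
    }

The appeal of the model is that the map and reduce steps are independent enough that the framework, not the programmer, can spread them across all those little cores and machines.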
You’ll see more from me on the multicore topic (you can always click it in the tag cloud on the left), but I did this chart on a lark and thought it showed the problem pretty clearly so I wanted to get a post out.