Brett King

Moore’s Law – Why computers are increasingly disruptive to industry

In Technology Innovation on October 25, 2009 at 06:57

Excerpt from Chapter 9 – Deep Impact: Technology and Disruptive Innovation

You’ve undoubtedly heard of “Silicon Valley”, right? But do you know why it is called Silicon Valley? You might think it is because of all the dot-com and Web 2.0 companies that inhabit this region of California. But you’d be wrong. We have to go much further back, to the 1950s, to find the origin of the term. It has everything to do with computer chips, because microchips are made of silicon…

Well, in 1947 a gentleman by the name of William Shockley, along with John Bardeen and Walter Brattain, invented the transistor at Bell Labs. For this, the three were awarded the Nobel Prize in Physics in 1956. Shockley’s attempts to commercialize the transistor are what led to the formation of a cluster of companies in California specializing in the manufacture of these components. During the 50s and 60s there was a great deal of speculation in the markets about ‘tronics’, that is, the ability to capitalize on these ‘new’ technologies and advances.

On April 19th, 1965, Gordon Moore, then at Fairchild Semiconductor and later the co-founder of Intel Corporation, published an article in Electronics Magazine entitled “Cramming more components onto integrated circuits”. In that article he stated a law on computing power that has remained consistent for more than 40 years, a law that drives technology development today and for the near future.

The complexity for minimum component costs has increased at a rate of roughly a factor of two per year … Certainly over the short term this rate can be expected to continue, if not to increase. Over the longer term, the rate of increase is a bit more uncertain, although there is no reason to believe it will not remain nearly constant for at least 10 years. That means by 1975, the number of components per integrated circuit for minimum cost will be 65,000. I believe that such a large circuit can be built on a single wafer – Gordon Moore’s prediction in 1965.

The term “Moore’s Law” was reportedly coined around 1970 by the Caltech professor and VLSI pioneer Carver Mead. Moore’s original prediction was that component counts would double every year; in 1975 he revised this to a doubling roughly every two years, the form in which the law is usually quoted today. Since 1965, that law has held true and remains the backbone of classical computing platform development. What this all means is that since 1965 we have been able to predict both the reduction in costs and the improvements in the computing capability of microchips, and those predictions have held true.

In reality, what does this mean? Let’s put it in perspective. In 1965 the number of transistors that fitted on an integrated circuit could be counted in the tens. In 1971 Intel introduced the 4004 microprocessor with 2,300 transistors. In 1978 Intel introduced the 8086 microprocessor, with 29,000 transistors, and with it the IBM PC was effectively born (the first IBM PC actually used the closely related 8088 chip). In 2006 Intel’s Itanium 2 processor carried 1,700,000,000 transistors. What does that mean? Transistors are now so small that more than a million of them could fit on the head of a pin. While all this was happening, the cost of these transistors was also falling exponentially, as per Moore’s prediction.
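The doubling arithmetic behind these numbers is easy to check for yourself. Here is a back-of-envelope sketch, assuming a doubling every two years and starting from the Intel 4004 figures quoted above; it is a rough illustration of the exponential trend, not a precise model of chip history.

```python
# Rough check of Moore's Law against the transistor counts quoted above.
# Assumption: component counts double every two years.

def projected_transistors(base_year, base_count, target_year, doubling_years=2):
    """Project a transistor count forward assuming exponential doubling."""
    doublings = (target_year - base_year) / doubling_years
    return base_count * 2 ** doublings

# Project from the Intel 4004 (1971, 2,300 transistors) to 2006.
projection = projected_transistors(1971, 2_300, 2006)
print(f"Projected for 2006: {projection:,.0f}")
print("Actual Itanium 2 (2006): 1,700,000,000")
```

The strict two-year doubling projects a few hundred million transistors for 2006 — within an order of magnitude of the Itanium 2’s 1.7 billion, which shows that actual growth over that stretch slightly outpaced even a doubling every two years.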

In real terms this means that a mainframe computer of the 1970s that cost over $1 million has less computing power than your iPhone has today. It means that the USB memory stick you carry around in your pocket would have taken a room full of hard disk platters in the 70s. Have you ever watched the movie Apollo 13? Remember how they were trying to work out how to fire up the Apollo Guidance Computer without breaking their remaining power allowance? Well, that computer, which was at the height of computing technology at the time, had around 32k of memory and ran at an effective clock speed of 1.024 MHz. When the IBM PC XT launched in 1983 it was already about 8 times faster than the Apollo computer. The next generation of smartphones we will be using in the next 2-3 years will have 1 GHz processor chips. That is roughly 1,000 times the clock speed of the Apollo Guidance Computer, and far more than that in actual computing throughput…

These numbers are so mind-blowing that if we apply them to the world outside computing, things get a little bizarre. For example, if a house had shrunk at the same pace as transistors, you would not be able to see it without a microscope. In 1978 a commercial flight between New York and Paris cost around US$900 and took seven hours. If Moore’s Law had applied to aviation in the same way as computing, then that flight today would cost a cent or two and would take less than a second.
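The aviation analogy can be worked through in a few lines. This sketch assumes a Moore’s-Law-style improvement factor (a doubling every two years) applied to the 1978 flight figures above, and takes “today” as 2009, when this excerpt was posted; both are assumptions for illustration.

```python
# The New York-Paris analogy above, worked through.
# Assumptions: doubling every two years, 1978 baseline, "today" = 2009.

DOUBLING_YEARS = 2
years_elapsed = 2009 - 1978
improvement = 2 ** (years_elapsed / DOUBLING_YEARS)  # roughly 46,000x

cost_today = 900 / improvement             # dollars
duration_today = 7 * 3600 / improvement    # seconds

print(f"Improvement factor: {improvement:,.0f}x")
print(f"Flight cost today:  ${cost_today:.3f}")
print(f"Flight time today:  {duration_today:.2f} seconds")
```

With a two-year doubling the flight comes out at about two cents and roughly half a second; assume the faster 18-month doubling sometimes quoted and the cost drops below a penny. Either way, the figures land in the ballpark the analogy describes.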

Now you know why your technology budget is the way it is…

