The future of computing | The Economist

The era of predictable improvement in computer hardware is ending. What comes next?

IN 1971 the fastest car in the world was the Ferrari Daytona, capable of 280kph (174mph). The world’s tallest buildings were New York’s twin towers, at 415 metres (1,362 feet). In November that year Intel launched the first commercial microprocessor chip, the 4004, containing 2,300 tiny transistors, each the size of a red blood cell.

Since then chips have improved in line with the prediction of Gordon Moore, Intel’s co-founder. According to his rule of thumb, known as Moore’s law, processing power doubles roughly every two years as smaller transistors are packed ever more tightly onto silicon wafers, boosting performance and reducing costs. A modern Intel Skylake processor contains around 1.75 billion transistors (half a million of them would fit in the space of a single transistor from the 4004), and collectively they deliver about 400,000 times as much computing muscle. This exponential progress is difficult to relate to the physical world. If cars and skyscrapers had improved at such rates since 1971, the fastest car would now be capable of a tenth of the speed of light; the tallest building would reach halfway to the Moon.
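Those comparisons are easy to verify with back-of-envelope arithmetic. The sketch below, in Python, uses the 1971 benchmarks and the 400,000-fold gain quoted above; the speed of light and the Earth-Moon distance are standard reference values added for the calculation, not figures from the article.

```python
# Back-of-envelope check of the article's comparisons.
TRANSISTORS_4004 = 2_300        # Intel 4004, 1971
TRANSISTORS_SKYLAKE = 1.75e9    # a modern Skylake processor
PERFORMANCE_GAIN = 400_000      # "computing muscle", per the article

# Transistor counts alone have grown roughly 760,000-fold.
print(f"transistor ratio: {TRANSISTORS_SKYLAKE / TRANSISTORS_4004:,.0f}x")

SPEED_OF_LIGHT_KPH = 1.079e9    # ~299,792 km/s expressed in km/h
MOON_DISTANCE_KM = 384_400      # mean Earth-Moon distance

car_kph = 280 * PERFORMANCE_GAIN            # Ferrari Daytona, scaled
print(f"car: {car_kph:.3g} km/h, "
      f"{car_kph / SPEED_OF_LIGHT_KPH:.2f} of light speed")

tower_km = 415 * PERFORMANCE_GAIN / 1000    # twin towers, scaled
# About 43% of the Earth-Moon distance; the article rounds to "halfway".
print(f"tower: {tower_km:,.0f} km, "
      f"{tower_km / MOON_DISTANCE_KM:.0%} of the way to the Moon")
```

Running it gives a car at roughly 0.10 of light speed and a tower stretching about 166,000 km, both consistent with the article's rounded claims.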

The impact of Moore’s law is visible all around us. Today 3 billion people carry smartphones in their pockets: each is more powerful than a room-sized supercomputer from the 1980s. Countless industries have been upended by digital disruption. Abundant computing power has even slowed nuclear tests, because atomic weapons are more easily tested with simulated explosions than with real ones. Moore’s law has become a cultural trope: people inside and outside Silicon Valley expect technology to get better every year.

But now, after five decades, the end of Moore’s law is in sight (see Technology Quarterly). Making transistors smaller no longer guarantees that they will be cheaper or faster. This does not mean progress in computing will suddenly stall, but the nature of that progress is changing. Chips will still get better, but at a slower pace (number-crunching power is now doubling only every 2.5 years, says Intel). And the future of computing will be defined by improvements in three other areas, beyond raw hardware performance.
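The shift from a two-year to a 2.5-year doubling cadence sounds modest, but it compounds. A small illustration (the ten-year horizon is an assumption for the example, not a figure from the article):

```python
# Performance compounds as 2 ** (years / doubling_period), so a
# slower doubling cadence cuts deeply into long-run gains.
YEARS = 10
for period in (2.0, 2.5):
    gain = 2 ** (YEARS / period)
    print(f"doubling every {period} years: {gain:.0f}x in {YEARS} years")
# doubling every 2.0 years: 32x in 10 years
# doubling every 2.5 years: 16x in 10 years
```

Over a decade, the slower cadence yields half the improvement from the same elapsed time, which is why attention is shifting to gains beyond raw hardware.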

Source: The future of computing | The Economist