SAN FRANCISCO — For decades, the computer industry has been guided by a faith that engineers would always find a way to make the components on computer chips smaller, faster and cheaper.
But a decision by a global alliance of chip makers to back away from reliance on Moore’s Law, a principle that has guided tech companies from the giant mainframes of the 1960s to today’s smartphones, shows that the industry may need to rethink the central tenet of Silicon Valley’s innovation ethos.
Chip scientists are nearly at the point where they are manipulating materials as small as individual atoms. When they hit that mark within the next five years or so, they may bump up against the boundaries of how tiny semiconductors can become. After that, they may have to look for alternatives to silicon, which is used to make computer chips, or for new design ideas in order to make computers more powerful.
It is hard to overstate the importance of Moore’s Law to the entire world. Despite its official sound, it is not actually a scientific rule like Newton’s laws of motion. Instead, it describes the pace of change in a manufacturing process that has made computers exponentially more affordable.
In 1965, the Intel co-founder Gordon Moore first observed that the number of components that could be etched onto the surface of a silicon wafer was doubling at regular intervals and would do so for the foreseeable future.
When Dr. Moore made his observation, the densest memory chips stored only about 1,000 bits of information. Today's densest memory chips have roughly 20 billion transistors (in a memory chip, roughly one transistor per bit). To put it another way, the iPad 2, which went on the market in 2011 for $500 and fits in your lap, had more computing power than the world's most powerful supercomputer in the 1980s, a device called the Cray 2 that was about the size of an industrial washing machine and cost more than $15 million at the time.
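The growth described above is easy to sanity-check. A rough back-of-the-envelope calculation (using the article's figures of about 1,000 bits in 1965 and roughly 20 billion transistors today, and treating one transistor as roughly one bit of memory) shows how the doubling cadence falls out of those two data points:

```python
from math import log2

# Illustrative figures taken from the article, not precise benchmarks.
bits_1965 = 1_000          # densest memory chips when Moore wrote in 1965
transistors_today = 20e9   # roughly 20 billion transistors today

# How many doublings does it take to get from one figure to the other?
doublings = log2(transistors_today / bits_1965)
print(round(doublings, 1))          # about 24.3 doublings

# Spread over the roughly five decades since Moore's observation,
# that works out to one doubling about every two years.
years = 2016 - 1965
print(round(years / doublings, 1))  # about 2.1 years per doubling
```

That figure of one doubling roughly every two years matches the cadence the chip industry long planned around.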
That iPad 2, mind you, is slow compared to newer models.
Without those remarkable improvements, today’s computer industry wouldn’t exist. The vast cloud-computing data centers run by companies like Google and Amazon would be impossibly expensive to build. There would be no smartphones with apps that allow you to order a ride home or get dinner delivered. And scientific breakthroughs like decoding the human genome or teaching machines to listen would not have happened.
Signaling their belief that the industry's way of forecasting the future of computing needs to change, the semiconductor industry associations of the United States, Europe, Japan, South Korea and Taiwan will issue one final report based on a chip technology forecasting system called the International Technology Roadmap for Semiconductors.
Nearly every big chip maker, including Intel, IBM and Samsung, belongs to the organization, though Intel says it is not participating in the last report.