
The Death of Moore’s Law Will Not Kill off Computational Disruption

Exponential increases in computational power generate most of the rapid social change in our time. Some of the changes are largely good. The increase in the amount and speed of information promotes the availability of more diverse and expert views on policy and politics. The rise of genomics and personalized medicine can lead to longer and healthier lives. Even energy production, both of fossil fuels and the greener variety, is boosted by computational power. But computation is also the cause of domestic turbulence, as automation replaces some kinds of jobs, and of danger abroad, as it empowers the organization of non-state terrorist actors.

Moore’s law is thought to encapsulate ongoing computational improvements. This law, named after Gordon Moore, one of the founders of Intel, is in reality a prediction of a regularity: that the number of transistors that can be fitted onto a silicon computer chip doubles every eighteen months to two years. This week Moore’s law turned fifty, and there are widespread predictions and fears that it will die before sixty, because of the physical impossibility of shrinking transistors further and the expense of trying to do so.
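To give a sense of what that regularity implies, here is a rough back-of-the-envelope sketch in Python, assuming, purely for illustration, a clean two-year doubling period: fifty years of such doubling compounds to a roughly thirty-three-million-fold increase in transistor counts.

    # Illustrative sketch only: how a two-year doubling period compounds over fifty years.
    # The clean two-year period is an assumption; Moore's own estimates ranged from
    # eighteen months to two years.
    DOUBLING_PERIOD_YEARS = 2
    YEARS = 50

    doublings = YEARS / DOUBLING_PERIOD_YEARS   # 25 doublings
    growth_factor = 2 ** doublings              # about 33.5 million
    print(f"{doublings:.0f} doublings -> roughly {growth_factor:,.0f}x more transistors per chip")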

But the computational revolution has deeper and broader roots than Moore’s law, and thus the rate of computational and social change will continue even after the law’s demise and may indeed accelerate. The technologist Ray Kurzweil shows that Moore’s law is actually part of a more general exponential growth in computation that has been gaining force for over a hundred years. Integrated circuits replaced discrete transistors, which previously replaced vacuum tubes, which in their time had replaced electromechanical methods of computation. Thus, there is reason to suspect that new forms of computation, such as optical computing or carbon nanotube computing, will replace silicon as the paradigm for exponential growth in hardware.

Focusing only on the exponential increase in hardware capability also substantially understates the acceleration of computational capacity. Computational capacity advances with progress in software as well as progress in hardware. One study showed that performance on one kind of computational task had improved by approximately forty-three thousand times over the last fifteen years through improvements in software algorithms alone. Like many creative human endeavors, progress in software alternates between breakthroughs and periods of consolidation where gains are less spectacular. But in general it is a force multiplier for the gains in computational hardware.
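To put that forty-three-thousand-fold figure in perspective, here is a minimal sketch, using only the numbers quoted above, of the doubling period it implies: roughly one doubling per year from software improvements alone, on top of whatever the hardware delivers.

    import math

    # Illustrative sketch only: the doubling period implied by a ~43,000x software
    # improvement over 15 years (figures taken from the paragraph above).
    GAIN = 43_000
    YEARS = 15

    doublings = math.log2(GAIN)                # about 15.4 doublings
    years_per_doubling = YEARS / doublings     # about 0.97 years
    print(f"~{doublings:.1f} doublings, i.e. one roughly every {years_per_doubling:.2f} years")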

Gains in connectivity also increase the effective power of computation. The results of the faster and greater collaboration made possible across long distances are reflected in exponential growth in the volume of scientific knowledge, which powers innovation.

Thus, I am quite confident that the exponential change in computation will continue. As I have described elsewhere, computation is enveloping more and more spheres of life—from law to education to medicine, increasing the rate of technological and social change. Social change will continue to accelerate and democratic politics may thus have an even bumpier ride.

Reader Discussion

Law & Liberty welcomes civil and lively discussion of its articles. Abusive comments will not be tolerated. We reserve the right to delete comments - or ban users - without notification or explanation.

on April 22, 2015 at 16:40:53 pm

Keep in mind that shrinking transistor size is not in itself the cause of exponential computing power. Surprisingly, this acceleration actually has a lot to do with heat dissipation: shrinking transistors means reducing heat generation.

A lot more goes into the exponential growth in computing power than just transistor size.

Memory access speeds are important. Modern CPUs speed up memory access by guessing which data in RAM they will need, and they are surprisingly good at it. Accelerated networks play an important part, as do faster hard disks.
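To see that guessing (caching and prefetching) at work, here is a minimal sketch in Python with NumPy; the array size is arbitrary and exact timings vary by machine, but reading memory in the order it is laid out is typically several times faster than striding across it.

    import time
    import numpy as np

    n = 4000
    a = np.random.rand(n, n)   # NumPy stores this row-major: each row is contiguous in RAM

    t0 = time.perf_counter()
    row_total = sum(a[i, :].sum() for i in range(n))   # sequential reads: prefetcher-friendly
    t1 = time.perf_counter()
    col_total = sum(a[:, j].sum() for j in range(n))   # strided reads: defeats the prefetcher
    t2 = time.perf_counter()

    print(f"row-wise:    {t1 - t0:.3f}s")
    print(f"column-wise: {t2 - t1:.3f}s")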

A lot of growth comes from technicians who use fast and powerful computers to create a new generation of faster and more powerful computers, which in turn are used to help make the next generation of even faster and more powerful computers.

The fact that we are reaching the limit on how small we can make transistors does not mean exponential growth will stop. It only means that one minor aspect of accelerated growth is reaching its end.

Here is an insider tip. What is the next thing to look for? Silicon photonics. Google "Michal Lipson silicon photonics". She predicts that this technology will take up the slack left by the limitation of transistor shrinkage. A lot of big tech companies are betting big that she is right, including IBM and Intel.

Scott Amorian
on April 22, 2015 at 20:02:29 pm

Scott:

I was hoping you might add to this discussion. You are right about the parallel benefit of heat dissipation coming from smaller components. Goodness, the clunkers we used to use - and then, voila! I could simply not see the components our machines were placing, and lo and behold, gone were all the stupid heatsink strategies and associated problems.

I'll have to look up the photonics thing. It has been some time since I was current on this stuff.

gabe
on April 23, 2015 at 14:59:23 pm

Thanks, Gabe. Silicon photonics is about replacing electrical wires with light paths on printed circuits. It allows faster data movement, and it allows multiple signals on a single light path: where an electrical wire can carry only one signal per wire, a light path can carry multiple signals per path. I believe that means that chips will continue to shrink even though transistors can't get any smaller. Electrical wires have issues with resistance, capacitance and inductance. Light paths don't have those issues.

Technically speaking, Moore's "Law" is about the number of transistors on a chip doubling every two years. But most of us learned that Moore's "Law" is about the cost of computers halving while the power doubles every two years, so you'll come across different interpretations. That's why some people say Moore's "Law" is dead and others say it is not.

Computer power and efficiency are still on track to double every two years, even though transistor size is approaching its physical limit. Today, silicon photonics; tomorrow, maybe quantum computing.

Scott Amorian
on April 23, 2015 at 16:59:41 pm

OK - so it seems like this is the logical progression from photonic cables (sometimes used for internet connectivity, if I recall) that one of my former employers designed and manufactured (about 10 yrs ago) for a number of providers.

It should work and work quite well. My only hope is that we do the manufacturing here and keep the technology here rather than giving the dang thing away (see Boeing and composite technology and capital equipment on the 787 program).

Even with this advancement, I think your point holds that *smarter* software has an almost equal capacity for increasing speed and effectiveness.

seeya
gabe

gabe
