Tuesday, January 25, 2022

Understanding Moore's Law

Those of us attentive to computing are generally familiar with the existence of something called "Moore's Law." However, a really satisfying explanation of what Moore's Law actually is seems a rarer thing--with the result that there is a great deal of confusion about what it means.

Simply put, Moore's Law has to do with "integrated circuits," or, in more everyday usage, microchips--small wafers ("chips") of semiconducting material, usually silicon, containing an electronic circuit. Within these chips transistors amplify, regulate and switch the electric signals passing through them, enabling the chip to store and move electronic data. Placing more transistors inside a chip means that more such activity can go on inside it at once, giving the chip, and the device incorporating it, more "parallelism"--the ability to do more at once, and therefore to work faster. All other things being equal, one can only put more transistors on a same-sized chip if the transistors are themselves smaller--which means that the electrons passing through them travel shorter distances, increasing the speed at which the system executes its operations yet again.

Since the microchip's invention in the late 1950s, manufacturers have steadily increased the number of transistors in their chips by shrinking transistor size--a process that also caused the cost of each transistor to fall. In 1965 electronics engineer Gordon Moore published a short paper titled "Cramming More Components Into Integrated Circuits" in which he noted that the "density at minimum cost per transistor" doubled every year. He extrapolated from that trend that within five years manufacturers would be making chips with twenty times as many transistors on them, each costing just a tenth of their 1965 price, and that this pattern would continue for "at least ten years."

Moore's prediction (which, it is worth recalling, he never called a "law") was inexactly borne out during those years. He proved somewhat overoptimistic, transistor density not quite doubling annually, and today different versions of the "law" get quoted with varying claims about doubling times. (Some say one year, some say eighteen months, some say two years, while claims about the implications for processing power and price also vary.) However, the swift doubling in the number of transistors per chip, and the fall in the price of computing power that went with it, continued for a lot longer than the ten years he suggested, going on for a half century past that point. The result is that where an efficiently made chip had fifty transistors on it in 1965, such chips now contain billions of transistors--even as the low price of these densely transistorized chips means that hundreds of billions of them are manufactured annually, permitting them to be stuffed into just about everything we use.
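To get a feel for how much the quoted doubling time matters, here is a rough back-of-the-envelope sketch in Python. The fifty-transistor baseline for 1965 comes from the paragraph above; the 2020 target year and the particular doubling periods compared are simply illustrative assumptions, not figures from Moore or anyone else.

```python
# Back-of-the-envelope: how many transistors per chip would different
# doubling periods predict, starting from the ~50-transistor chips of 1965?
# (1965 baseline taken from the text above; target year and the doubling
# periods compared are illustrative assumptions.)

BASELINE_YEAR = 1965
BASELINE_TRANSISTORS = 50
TARGET_YEAR = 2020

for doubling_years in (1.0, 1.5, 2.0):  # one year, eighteen months, two years
    doublings = (TARGET_YEAR - BASELINE_YEAR) / doubling_years
    projected = BASELINE_TRANSISTORS * 2 ** doublings
    print(f"Doubling every {doubling_years:.1f} years -> ~{projected:.1e} transistors")

# Doubling every 1.0 years -> ~1.8e+18 transistors
# Doubling every 1.5 years -> ~5.4e+12 transistors
# Doubling every 2.0 years -> ~9.5e+09 transistors
#
# Only a roughly two-year doubling lands in the "billions" actually seen in
# today's chips, which is why the quoted doubling time matters so much.
```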

Nonetheless, Moore's Law has certain in-built limitations. The most significant of these is the physical limit to transistor miniaturization. One cannot make a silicon transistor smaller than about a nanometer (a billionth of a meter, roughly the width of a few silicon atoms), after all, while even before one gets to that point transistors become so small that the electrons whose movements they are supposed to control simply pass (or "tunnel") through their walls.

Of course, when Moore presented his "Law" the prospect of atom-scale transistors, or even of tunneling, seemed remote in the extreme. Transistors in 1971 were drawn on a ten micrometer (millionth of a meter) scale--ten thousand nanometers in the terms more commonly used today. By 2017, however, the transistors in commercially made chips were just a thousandth that size, a mere ten nanometers across. The following year the mass production of seven nanometer transistors got underway at the major chipmakers, leaving very little room for further size reduction.

This has led a good many observers to declare that "Moore's Law is dead," or will be before too much longer. The claim is controversial--perhaps more than it ought to be. After all, few dispute that chip speeds cannot go on increasing indefinitely through the shrinking of transistors on silicon wafers--and that shrinking is exactly what Moore's Law was concerned with, not the possibility or impossibility of continued progress in computing power generally. The result is that those convinced the exponential increase of computing power is virtually bound to continue as before might do better to set aside claims about Moore's Law continuing, and instead speak of Ray Kurzweil's "Law of Accelerating Returns."
