
The tyranny of numbers was a problem faced in the 1960s by computer engineers. Engineers were unable to increase the performance of their designs due to the huge number of components involved. In theory, every component needed to be wired to every other component (or at least to many others), and these connections were typically strung and soldered by hand. To improve performance, more components would be needed, and it seemed that future designs would consist almost entirely of wiring.

History

The Cray-1 contained 50 miles of wiring.

The first known recorded use of the term in this context was by Jack Morton, Vice President of Bell Labs, in an article celebrating the 10th anniversary of the invention of the transistor for the "Proceedings of the IRE" (Institute of Radio Engineers), June 1958 [1]. Referring to the problems many designers were having, he wrote:

For some time now, electronic man has known how 'in principle' to extend greatly his visual, tactile, and mental abilities through the digital transmission and processing of all kinds of information. However, all these functions suffer from what has been called 'the tyranny of numbers.' Such systems, because of their complex digital nature, require hundreds, thousands, and sometimes tens of thousands of electron devices.

— Jack Morton, The Tyranny of Numbers

At the time, computers were typically built up from a series of "modules", each module containing the electronics needed to perform a single function. A complex circuit like an adder would generally require several modules working in concert. The modules were typically built on printed circuit boards of a standardized size, with a connector on one edge that allowed them to be plugged into the power and signaling lines of the machine, and were then wired to other modules using twisted pair or coaxial cable.

Since each module was relatively custom, modules were assembled and soldered by hand or with limited automation, and as a result they suffered major reliability problems. A single bad component or solder joint could render the entire module inoperative. Even with properly working modules, the mass of wiring connecting them together was another source of construction and reliability problems. As computers grew in complexity and the number of modules increased, making a machine actually work became more and more difficult. This was the "tyranny of numbers".

Motivation for the integrated circuit


It was precisely this problem that Jack Kilby was thinking about while working at Texas Instruments. Theorizing that germanium could be used to make all common electronic components (transistors, resistors, capacitors, etc.), he set about building a single-slab component that combined the functionality of an entire module. Although Kilby succeeded, it was Robert Noyce's silicon version and the associated fabrication techniques that made the integrated circuit (IC) truly practical.

Unlike modules, ICs were built using photoetching techniques on an assembly line, greatly reducing their cost. Although any given IC might have the same chance of working or not working as a module, ICs cost so little that if one didn't work you simply threw it away and tried another. In fact, early IC assembly lines had failure rates of around 90% or more, which kept their prices high. The U.S. Air Force and NASA were major purchasers of early ICs, as their small size and light weight outweighed any cost concerns. Both demanded high reliability, and the industry's response not only delivered that reliability but also improved yields, which drove prices down.

ICs from the early 1960s were not complex enough for general computer use, but as their complexity increased through the decade, practically all computers switched to IC-based designs. The result was what are today referred to as third-generation computers, which became commonplace during the early 1970s. The progeny of the integrated circuit, the microprocessor, eventually superseded individual ICs as well, placing the entire collection of modules onto a single chip.

Seymour Cray was particularly well known for making complex designs work in spite of the tyranny of numbers. His attention to detail and his ability to fund several attempts at a working design meant that pure engineering effort could overcome the problems he faced. Yet even Cray eventually succumbed to the problem during the CDC 8600 project, which ultimately led to his leaving Control Data.

References

  • "The Chip that Jack Built". Texas Instruments. Archived from the original on 4 January 2012.