History Software Engineering

William Shockley’s Silicon Transistor – 1947 A.D.



Silicon Transistor

John Bardeen (1908–1991), Walter Houser Brattain (1902–1987), William Shockley (1910–1989)

“A transistor is an electronic switch: current flows from one terminal to another only when voltage is applied to a third terminal. Combined with the laws of Boolean algebra, this simple device has become the building block for microprocessors, memory systems, and the entire computer revolution.
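The pairing of switches with Boolean algebra can be sketched in a few lines of Python (a minimal illustration invented here, not taken from the source): model a transistor as a voltage-controlled switch, wire two in series to form a NAND gate, and note that every other logic operation can then be derived from NAND alone.

```python
# Minimal sketch (an assumed model, not from the source): a transistor as a
# voltage-controlled switch that conducts only when its control input is high.
def switch(control: bool) -> bool:
    return control

# Two switches in series between the output and ground form a NAND gate:
# the output is pulled low only when both switches conduct.
def nand(a: bool, b: bool) -> bool:
    path_to_ground = switch(a) and switch(b)
    return not path_to_ground

# NAND is universal: NOT, AND, and OR can all be derived from it.
def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))
```

Chaining billions of such switches, each one driving the control inputs of others, is in essence what a microprocessor does.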

Any technology that can use one signal to switch another on and off can be used to create a computer. Charles Babbage did it with rods, cogs, and steam power. Konrad Zuse and Howard Aiken did it with relays, and ENIAC used tubes. Each technology was faster and more reliable than the previous.

Likewise, transistors have several advantages over vacuum tubes: they use less power, so they generate less heat, they switch faster, and they are less susceptible to physical shock. All of these advantages arise because transistors are smaller than tubes—and the smaller the transistor, the bigger the advantage.

Modern transistors trace their lineage back to a device built in 1947 by John Bardeen and Walter Brattain, working in William Shockley’s group at AT&T’s Bell Laboratories. The team was trying to build an amplifier that could detect ultra-high-frequency radio waves, but the vacuum tubes they had simply weren’t fast enough. So they tried working with semiconductor crystals, since radios based on semiconductor diodes, called cat’s-whisker detectors, had been in use almost since the birth of radio in the 1890s.

A cat’s whisker radio uses a sharp piece of wire (the “whisker”) that’s jabbed into a piece of semiconducting germanium; by moving the wire along the semiconductor and varying the pressure, the semiconductor and the wire work together to create a diode, a device allowing current to pass in only one direction. The Bell Labs team built a contraption that attached two strips of gold foil to the crystal and then applied power to the germanium. The result was an amplifier: a signal injected into one wire was stronger when it came out of the other. Today we call this device a point-contact transistor.

For their discovery of the transistor, Bardeen, Brattain, and Shockley were awarded the Nobel Prize in Physics in 1956.”

SEE ALSO Semiconductor Diode (1874), First LED (1927)

“The first transistor ever made, built in 1947 by John Bardeen, William Shockley, and Walter H. Brattain of Bell Labs.”

Fair Use Source: B07C2NQSPV


Boolean Algebra – 1854 A.D.



Boolean Algebra

George Boole (1815–1864), Claude Shannon (1916–2001)

“George Boole was born into a shoemaker’s family in Lincolnshire, England, and schooled at home, where he learned Latin, mathematics, and science. But Boole’s family fell on hard times, and at age 16 he was forced to support his family by becoming a schoolteacher—a profession he would continue for the rest of his life. In 1838, he wrote the first of many papers on mathematics, and in 1849 he was appointed as the first professor of mathematics at Queen’s College in Cork, Ireland.

Today Boole is best known for his invention of a mathematics for describing and reasoning about logical propositions, what we now call Boolean logic. Boole introduced his ideas in his 1847 monograph, “The Mathematical Analysis of Logic,” and perfected them in his 1854 monograph, “An Investigation of the Laws of Thought.”

Boole’s monographs presented a general set of rules for reasoning with symbols, which today we call Boolean algebra. He created a way—and a notation—for reasoning about what is true and what is false, and how these notions combine when reasoning about complex logical systems. He is also credited with formalizing the mathematical concepts of AND, OR, and NOT, from which all logical operations on binary numbers can be derived. Today many computer languages refer to such true/false values as Booleans or simply Bools in recognition of his contribution.
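Boole’s operations map directly onto modern programming languages. As a small illustration (mine, not the source’s), Python’s `and`, `or`, and `not` operate on its built-in `bool` type, and composite operations such as XOR can be derived from the three primitives, just as the text describes:

```python
from itertools import product

# XOR derived purely from AND, OR, and NOT:
def xor(a: bool, b: bool) -> bool:
    return (a or b) and not (a and b)

# Print the truth table for the basic and derived operations.
for a, b in product([False, True], repeat=2):
    print(a, b, "AND:", a and b, "OR:", a or b, "XOR:", xor(a, b))
```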

Boole died at the age of 49 from pneumonia. His work was carried on by other logicians but didn’t receive notice in the broader community until 1936, when Claude Shannon, then a graduate student at the Massachusetts Institute of Technology (MIT), realized that the Boolean algebra he had learned in an undergraduate philosophy class at the University of Michigan could be used to describe electrical circuits built from relays. This was a huge breakthrough, because it meant that complex relay circuits could be described and reasoned about symbolically, rather than through trial and error. Shannon’s wedding of Boolean algebra and relays let engineers discover bugs in their diagrams without having to first build the circuits, and it allowed many complex systems to be refactored, replacing them with relay systems that were functionally equivalent but had fewer components.”
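Shannon’s method can be illustrated with a short sketch (the circuit expressions below are invented for illustration): two relay networks are interchangeable exactly when their Boolean expressions agree on every combination of inputs, which can be checked exhaustively on paper—or in code—rather than by building the circuits.

```python
from itertools import product

def equivalent(f, g, n: int) -> bool:
    """True if Boolean functions f and g agree on all 2**n input combinations."""
    return all(f(*bits) == g(*bits) for bits in product([False, True], repeat=n))

# A hypothetical relay network with three contacts...
original = lambda a, b: (a and b) or (a and not b)
# ...refactored to a single contact with identical behavior.
refactored = lambda a, b: a

print(equivalent(original, refactored, 2))  # → True
```

The refactored circuit uses fewer components yet is functionally equivalent—precisely the kind of simplification Shannon’s insight made systematic.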

SEE ALSO Binary Arithmetic (1703), Manchester SSEM (1948)

“A circuit diagram analyzed using George Boole’s “laws of thought”—what today is called Boolean algebra. Boole’s laws were used to analyze complicated telephone switching systems.”

Fair Use Source: B07C2NQSPV