Bibliography Software Engineering

B00I8W50SO ISBN-13: 978-1937785338

See: Good Math: A Geek’s Guide to the Beauty of Numbers, Logic, and Computation (Pragmatic Programmers) 1st Edition

Fair Use Source:

Bibliography Python


See: Doing Math with Python: Use Programming to Explore Algebra, Statistics, Calculus, and More! 1st Edition

Fair Use Source:


The Enigma Machine – Circa 1930 AD

Return to Timeline of the History of Computers or History

The Enigma Machine, Circa 1930 – Fair Use Source: B085FW7J86

“The Enigma machine used electric-powered mechanical rotors to both encrypt and decrypt text-based messages sent over radio waves. The device had German origins and would become an important technological development during the Second World War.”

“The device looked like a large square or rectangular mechanical typewriter. On each key press, the rotors would move and record a seemingly random character that would then be transmitted to all nearby Enigma machines. However, these characters were not random, and were defined by the rotation of the rotor and a number of configuration options that could be modified at any time on the device. Any Enigma machine with a specific configuration could read or “decrypt” messages sent from another machine with an identical configuration. This made the Enigma machine extremely valuable for sending crucial messages while avoiding interception.”
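The shared-configuration idea described above can be illustrated with a toy single-rotor cipher. This is a hypothetical sketch, far simpler than a real Enigma (which chained multiple rotors through a reflector and plugboard); it shows only the key property that an identically configured machine can reverse the encryption, while any other configuration cannot.

```python
# Toy rotor cipher: a single rotor that advances on every key press.
ALPHABET = "ABCDEFGHIJKLMNOPQRSTUVWXYZ"

def encrypt(message, rotor_start):
    """Shift each letter by the rotor position, stepping the rotor per press."""
    position = rotor_start
    out = []
    for ch in message:
        out.append(ALPHABET[(ALPHABET.index(ch) + position) % 26])
        position += 1  # the rotor advances after every key press
    return "".join(out)

def decrypt(ciphertext, rotor_start):
    """An identically configured machine reverses the shifts."""
    position = rotor_start
    out = []
    for ch in ciphertext:
        out.append(ALPHABET[(ALPHABET.index(ch) - position) % 26])
        position += 1
    return "".join(out)

secret = encrypt("ATTACK", rotor_start=3)
assert decrypt(secret, rotor_start=3) == "ATTACK"   # same configuration works
assert decrypt(secret, rotor_start=4) != "ATTACK"   # wrong configuration fails
```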

“While a sole inventor of the rotary encryption mechanism used by the machine is hard to pinpoint, the technology was popularized by a two-man company called Chiffriermaschinen AG based in Germany. In the 1920s, Chiffriermaschinen AG traveled throughout Germany demonstrating the technology, which led to the German military adopting it in 1928 to secure top-secret military messages in transit.”

“The ability to avoid the interception of long-distance messages was a radical development that had never before been possible. In the software world of today, the interception of messages is still a popular technique that hackers try to employ, often called a man-in-the-middle attack. Today’s software uses similar (but much more powerful) techniques to those that the Enigma machine used a hundred years ago to protect against such attacks.”

“While the Enigma machine was an incredibly impressive technology for its time, it was not without flaws. Because the only criterion for interception and decryption was an Enigma machine with an identical configuration to the sender, a single compromised configuration log (or private key, in today’s terms) could render an entire network of Enigma machines useless.”

“To combat this, any groups sending messages via the Enigma machine changed their configuration settings on a regular basis. Reconfiguring Enigma machines was a time-consuming process. First, the configuration logs had to be exchanged in person, as secure ways of sharing them remotely did not yet exist. Sharing configuration logs between a network of two machines and two operators might not be painful. But a larger network, say 20 machines, required multiple messengers to deliver the configuration logs — each increasing the probability of a configuration log being intercepted and stolen, or potentially even leaked or sold.”

“The second problem with sharing configuration logs was that manual adjustments to the machine itself were required for the Enigma machine to be able to read, encrypt, and decrypt new messages sent from other Enigma machines. This meant that a specialized and trained staff member had to be present in case a configuration update was needed. This all occurred in an era prior to software, so these configuration adjustments required tampering with the hardware and adjusting the physical layout and wiring of the plugboard. The adjuster needed a background in electronics, which was very rare in the early 1900s.”

“As a result of how difficult and time-consuming it was to update these machines, updates typically occurred on a monthly basis — daily for mission-critical communication lines. If a key was intercepted or leaked, all transmissions for the remainder of the month could be intercepted by a malicious actor — the equivalent of a hacker today.”

“The type of encryption these Enigma machines used is now known as a symmetric key algorithm, which is a special type of cipher that allows for the encryption and decryption of a message using a single cryptographic key. This family of encryption is still used today in software to secure data in transit (between sender and receiver), but with many improvements on the classic model that gained popularity with the Enigma machine.”
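The defining property of a symmetric key algorithm — one key both encrypts and decrypts — can be sketched in a few lines. This is an illustrative construction only (a hash-derived XOR keystream), not a vetted cipher; real software uses standardized algorithms such as AES.

```python
import hashlib

def keystream(key: bytes, length: int) -> bytes:
    """Derive a pseudorandom keystream from the key (illustration only)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def xor_cipher(key: bytes, data: bytes) -> bytes:
    """Symmetric: applying the same key twice returns the original data."""
    ks = keystream(key, len(data))
    return bytes(a ^ b for a, b in zip(data, ks))

key = b"shared-secret"
ciphertext = xor_cipher(key, b"meet at dawn")
assert xor_cipher(key, ciphertext) == b"meet at dawn"           # same key works
assert xor_cipher(b"wrong-key", ciphertext) != b"meet at dawn"  # wrong key fails
```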

“In software, keys can be made much more complex. Modern key generation algorithms produce keys so complex that attempting every possible combination (brute forcing or brute force attack) with the fastest possible modern hardware could easily take more than a million years. Additionally, unlike the Enigma machines of the past, software keys can change rapidly.”

“Depending on the use case, keys can be regenerated at every user session (per login), at every network request, or at a scheduled interval. When this type of encryption is used in software, a leaked key might expose you for a single network request in the case of per-request regeneration, or worst-case scenario, a few hours in the case of per-login (per-session) regeneration.”

“If you trace the lineage of modern cryptography far back, you will eventually reach the Enigma machines of the 1930s and the Second World War. It’s safe to say that the Enigma machine was a major milestone in securing remote communications. From this, we can conclude that the Enigma machine was an essential development in what would later become the field of software security.”

“The Enigma machine was also an important technological development for those who would be eventually known as “hackers.” The adoption of Enigma machines by the Axis Powers during World War II resulted in extreme pressure for the Allies to develop encryption-breaking techniques. General Dwight D. Eisenhower himself claimed that doing so would be essential for victory against the Nazis.”

“In September of 1932, a Polish mathematician named Marian Rejewski was provided a stolen Enigma machine. At the same time, Hans-Thilo Schmidt, a German who spied for the French, was able to provide him with valid configurations for September and October of 1932. This allowed Marian to intercept messages from which he could begin to analyze the mystery of Enigma machine encryption.”

“Marian was attempting to determine how the machine worked, both mechanically and mathematically. He wanted to understand how a specific configuration of the machine’s hardware could result in an entirely different encrypted message being output.”

“Marian’s attempted decryption was based on a number of theories as to what machine configuration would lead to a particular output. By analyzing patterns in the encrypted messages and coming up with theories based on the mechanics of the machine, Marian and two coworkers, Jerzy Różycki and Henryk Zygalski, eventually reverse engineered the system. With the deep understanding of Enigma rotor mechanics and board configuration that the team developed, they were able to make educated guesses as to which configurations would result in which encryption patterns. They could then reconfigure a board with reasonable accuracy and, after several attempts, begin reading encrypted radio traffic. By 1933 the team was intercepting and decrypting Enigma machine traffic on a daily basis.”

“Much like the hackers of today, Marian and his team intercepted and reverse engineered encryption schemes to get access to valuable data generated by a source other than themselves. For these reasons, I would consider Marian Rejewski and the team assisting him as some of the world’s earliest hackers.”

“In the following years, Germany would continually increase the complexity of its Enigma machine encryption. This was done by gradually increasing the number of rotors required to encrypt a character. Eventually the complexity of reverse engineering a configuration would become too difficult for Marian’s team to break in a reasonable time frame. This development was also important, because it provided a look into the ever-evolving relationship between hackers and those who try to prevent hacking.”

“This relationship continues today, as creative hackers continually iterate and improve their techniques for breaking into software systems. And on the other side of the coin, smart engineers are continually developing new techniques for defending against the most innovative hackers.”

Fair Use Source: B085FW7J86

Artificial Intelligence DevSecOps-Security-Privacy History

Quantum Computer Factors “15” – 2001 AD

Return to Timeline of the History of Computers


Quantum Computer Factors “15”

Peter Shor (b. 1959), Isaac Chuang (b. 1968)

“Speed is the promise of quantum computers—not just faster computations, but mind-blowingly, seemingly impossibly fast computations.

That’s a problem for anyone who sends encrypted information over the internet, because the public key cryptography algorithms that secure the majority of the internet’s data depend on the difficulty of factoring large numbers. There is no known algorithm for efficiently factoring large numbers on a conventional computer. But in 1994, mathematician Peter Shor devised an algorithm for efficiently factoring large numbers on a quantum computer. This means that an organization with a working quantum computer could decrypt practically every message sent over the internet—provided the organization had a large enough quantum computer.
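The heart of Shor's algorithm is order finding: given a number a coprime to N, find the smallest period r with a^r mod N = 1; an even r then yields factors via gcd(a^(r/2) ± 1, N). For a number as small as 15, the period step can be brute-forced classically, as in this sketch (a quantum computer performs that step exponentially faster for large N):

```python
from math import gcd

def period(a, n):
    """Brute-force the smallest r > 0 with a**r % n == 1 -- the step
    Shor's algorithm delegates to a quantum computer."""
    r, value = 1, a % n
    while value != 1:
        value = (value * a) % n
        r += 1
    return r

def shor_classical(n, a):
    """Turn an even period r into factors via gcd(a**(r//2) +/- 1, n)."""
    r = period(a, n)
    if r % 2:
        return None  # odd period: retry with a different a
    x = pow(a, r // 2, n)
    return gcd(x - 1, n), gcd(x + 1, n)

assert shor_classical(15, 7) == (3, 5)  # the factors of 15
```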

One way to measure a quantum computer is by the number of qubits it can process at a time. In 2001, a team of scientists at IBM’s Almaden Research Center led by Isaac Chuang successfully factored the number 15, yielding the factors 3 and 5, with a quantum computer that had 7 qubits.

Although factoring the number 15 might not seem like a big deal, the IBM researchers proved that quantum computers aren’t just theoretical—they actually work. Now the race was on to make a quantum computer that was large enough to compute something that could not be computed on a conventional machine.

Since then, quantum computers have gotten steadily bigger, and the factoring algorithms have also improved. In 2012, Shor’s algorithm was used on a 10-qubit machine to factor the number 21. That same year, a team in China factored the number 143 with a 4-qubit computer using an improved algorithm. Astonishingly, two years after the Chinese team published its findings, a group of researchers at Kyoto University pointed out that the Chinese system had also factored the numbers 3,599, 11,663, and 56,153, without the authors even being aware of it!

Cryptographers at the US National Institute of Standards and Technology are now racing to develop “post-quantum” encryption algorithms that aren’t based on factoring and, as a result, won’t be vulnerable to anyone who has a quantum computer.”

SEE ALSO The Qubit (1983)

In 2001, a team of scientists used a quantum computer with 7 qubits to factor the number 15, yielding the factors 3 and 5. Since then, quantum computers have factored much larger numbers.

Fair Use Sources: B07C2NQSPV

Chirgwin, Richard. “Quantum Computing Is So Powerful It Takes Two Years to Understand What Happened.” The Register, December 4, 2014.

Chu, Jennifer. “The Beginning of the End for Encryption Schemes?” MIT News release, March 3, 2016.

IBM. “IBM’s Test-Tube Quantum Computer Makes History.” News release, December 19, 2001.


Quantum Computers Factorization of Large Integers Algorithm by Peter Shor – 1994 AD

Return to Timeline of the History of Computers

Peter Shor devises an algorithm that lets quantum computers factor large integers quickly. This is the first practically interesting problem for which quantum computers promise a significant speed-up, and it therefore generates a lot of interest in quantum computing.

Fair Use Sources:

History Software Engineering

Zero-Knowledge Mathematical Proofs – 1985 AD

Return to Timeline of the History of Computers


Zero-Knowledge Proofs

Shafi Goldwasser (b. 1958), Silvio Micali (b. 1954), Charles Rackoff (b. 1948)

How do you prove that you know a secret without revealing the secret? Three computer scientists, Shafi Goldwasser, Silvio Micali, and Charles Rackoff, figured out a way in 1985, establishing a new branch of cryptography rich with applications that are only now beginning to be realized.

Zero-knowledge proofs are a mathematical technique for demonstrating facts about a proof without revealing the proof itself. The demonstration involves interaction between two parties: a prover, who wants to prove that some mathematical statement is true, and a verifier, who checks the proof. The verifier asks the prover a question, and the prover sends back a string of bits, called a witness, that could be generated only if the statement is true.

For example, consider assigning colors to the states or countries on a map. In 1976, mathematicians proved that any two-dimensional map can be colored using just four colors such that no countries that touch are colored with the same color. But doing the same with just three colors is much harder and can’t be done with all maps. Until the invention of zero-knowledge proofs, the only way for a person to show that a specific map could be colored with three colors was to do just that: produce the map colored with three colors.

Using zero-knowledge proofs, the witness demonstrates that a specific map can be colored with just three colors, with no two touching countries sharing a color, and it does this without revealing the coloring of any country.
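One round of the classic interactive protocol for graph 3-coloring can be sketched as follows. This is a simplified illustration under stated assumptions: commitments are simulated with hashes, the graph and coloring are invented examples, and a real protocol repeats many rounds so that a cheating prover is caught with overwhelming probability.

```python
import hashlib, os, random

edges = [(0, 1), (1, 2), (0, 2), (2, 3)]   # a small example graph
coloring = {0: 0, 1: 1, 2: 2, 3: 0}        # the prover's secret 3-coloring

def commit(value, nonce):
    """Hash-based commitment (illustrative stand-in for a real scheme)."""
    return hashlib.sha256(nonce + bytes([value])).hexdigest()

def zk_round():
    # Prover: randomly relabel the three colors, then commit to every vertex.
    perm = random.sample([0, 1, 2], 3)
    permuted = {v: perm[c] for v, c in coloring.items()}
    nonces = {v: os.urandom(16) for v in coloring}
    commitments = {v: commit(permuted[v], nonces[v]) for v in coloring}
    # Verifier: challenge one random edge.
    u, v = random.choice(edges)
    # Prover opens only those two commitments; verifier checks them.
    assert commit(permuted[u], nonces[u]) == commitments[u]
    assert commit(permuted[v], nonces[v]) == commitments[v]
    # The verifier learns that these two vertices differ -- nothing more,
    # since the color labels are freshly permuted each round.
    return permuted[u] != permuted[v]

assert all(zk_round() for _ in range(50))  # every challenged edge checks out
```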

Building a practical system from zero-knowledge proofs requires the application of both cryptography and engineering—hard work, but some practical systems have emerged, including password-authentication systems that don’t actually send the password, anonymous credential systems that allow a person to establish (for example) that the credential holder is over 18 without revealing his or her age or name, and digital money schemes that let people spend digital coins anonymously but still detect if an anonymous coin is spent twice (called double spending).

For their work in cryptography, Goldwasser and Micali won the A.M. Turing Award in 2012.

SEE ALSO Secure Multi-Party Computation (1982), Digital Money (1990)

Zero-knowledge proofs let a prover demonstrate possession of a fact without revealing that fact. For example, you can prove that there exists a way to color a map with just three colors, without revealing the completed map.

Fair Use Source: B07C2NQSPV

History Software Engineering

Computer Proves Mathematical Theorem – 1955 AD

Return to Timeline of the History of Computers


Computer Proves Mathematical Theorem

Allen Newell (1927–1992), John Clifford Shaw (1922–1991), Herbert Simon (1916–2001)

““Kind of crude, but it works, boy, it works!” So said Allen Newell to Herb Simon on Christmas Day 1955, about the program that the two of them had written with the help of computer programmer John Clifford Shaw. The program, Logic Theorist, had been given the basic definitions and axioms of mathematics and programmed to randomly combine symbols into successively complex mathematical statements and then check each one for validity. If the program discovered a true statement, it added that statement to its list of truths and kept going.

As Newell said, the approach was kind of crude, but it worked. As Logic Theorist ran, it discovered more and more mathematical truths. Over time, the program proved 38 of the 52 theorems in Principia Mathematica, the classic text on mathematics written by Alfred North Whitehead (1861–1947) and Bertrand Russell (1872–1970).
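Logic Theorist's strategy of growing a list of known truths can be echoed in a toy forward-chaining prover. This is a deliberately tiny sketch with invented example statements; the real program searched the far richer symbolic logic of Principia Mathematica.

```python
def prove(axioms, rules, goal, max_rounds=10):
    """Forward-chain: each rule maps a frozenset of premises to a conclusion.
    Derive new truths from what is known until the goal appears (or nothing
    new can be derived), mirroring Logic Theorist's growing list of truths."""
    known = set(axioms)
    for _ in range(max_rounds):
        if goal in known:
            return True
        new = {c for premises, c in rules if premises <= known} - known
        if not new:
            return False  # nothing further derivable
        known |= new
    return goal in known

rules = [
    (frozenset({"P", "P->Q"}), "Q"),  # instances of modus ponens
    (frozenset({"Q", "Q->R"}), "R"),
]
assert prove({"P", "P->Q", "Q->R"}, rules, "R") is True
assert prove({"P", "Q->R"}, rules, "R") is False  # missing a premise
```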

Simon sent a letter to Lord Russell telling him of the program’s contributions. In one case, it turned out that Logic Theorist had found a proof to one of Russell’s theorems that was more elegant than the one published in the text. Russell wryly wrote back: “I am delighted to know that ‘Principia Mathematica’ can now be done by machinery. I wish Whitehead and I had known of this possibility before we wasted 10 years doing it by hand.”

For Newell, Shaw, and Simon, their program kindled expectations that many of the secrets of thought and intelligence would be cracked in only a few years.

When they wrote Logic Theorist, Newell and Shaw were both computer researchers at RAND Corporation, a research and development think tank. Simon, a political scientist and economist at Carnegie Mellon, was working for RAND as a consultant. Newell ended up moving to Carnegie Mellon, where he and Simon started one of the first AI laboratories. Together Newell and Simon shared the 1975 A.M. Turing Award for their work on AI.”

SEE ALSO Algorithm Influences Prison Sentence (2013)

“To write Principia Mathematica, Whitehead and Russell painstakingly derived most of modern mathematics from a small number of axioms and rules of inference. Using a similar approach, the program Logic Theorist was also able to find and prove mathematical truths.”

Fair Use Source: B07C2NQSPV

History Software Engineering

The Bit – Binary Digit 0 or 1 – 1948 AD

Return to Timeline of the History of Computers


The Bit

Claude E. Shannon (1916–2001), John W. Tukey (1915–2000)

“It was the German mathematician Gottfried Wilhelm Leibniz (1646–1716) who first established the rules for performing arithmetic with binary numbers. Nearly 250 years later, Claude E. Shannon realized that a binary digit—a 0 or a 1—was the fundamental, indivisible unit of information.

Shannon earned his PhD from MIT in 1940 and then took a position at the Institute for Advanced Study in Princeton, New Jersey, where he met and collaborated with the institute’s leading mathematicians working at the intersection of computing, cryptography, and nuclear weapons, including John von Neumann, Albert Einstein, Kurt Gödel, and, for two months, Alan Turing.

In 1948, Shannon published “A Mathematical Theory of Communication” in the Bell System Technical Journal. The article was inspired in part by classified work that Shannon had done on cryptography during the war. In it, he created a mathematical definition of a generalized communications system, consisting of a message to be sent, a transmitter to convert the message into a signal, a channel through which the signal is sent, a receiver, and a destination, such as a person or a machine “for whom the message is intended.”

Shannon’s paper introduced the word bit, a binary digit, as the basic unit of information. While Shannon attributed the word to American statistician John W. Tukey, and the word had been used previously by other computing pioneers, Shannon provided a mathematical definition of a bit: rather than just a 1 or a 0, it is information that allows the receiver to limit possible decisions in the face of uncertainty. One of the implications of Shannon’s work is that every communications channel has a theoretical upper bound—a maximum number of bits that it can carry per second. As such, Shannon’s theory has been used to analyze practically every communications system ever developed—from handheld radios to satellite communications—as well as data-compression systems and even the stock market.
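Shannon's definition of information can be made concrete with a small calculation: the entropy of a message source, in bits, is the average information per symbol, and a fair coin flip carries exactly one bit. This sketch computes Shannon entropy directly from his formula.

```python
from math import log2

def entropy(probabilities):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probabilities if p > 0)

# A fair coin flip carries exactly one bit of information...
assert entropy([0.5, 0.5]) == 1.0
# ...while a biased coin carries less, because the outcome is predictable.
assert entropy([0.9, 0.1]) < 1.0
# Eight equally likely messages need log2(8) = 3 bits to distinguish.
assert round(entropy([1 / 8] * 8), 9) == 3.0
```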

Shannon’s work illuminates a relationship between information and entropy, thus establishing a connection between computation and physics. Indeed, noted physicist Stephen Hawking framed much of his analysis of black holes in terms of the ability to destroy information and the problems created as a result.”

SEE ALSO Vernam Cipher (1917), Error-Correcting Codes (1950)

Mathematician and computer scientist Claude E. Shannon.

Fair Use Source: B07C2NQSPV


Colossus – 1943 A.D.

Return to Timeline of the History of Computers



Colossus

Thomas Harold Flowers (1905–1998), Sidney Broadhurst (1893–1969), W. T. Tutte (1917–2002)

“Colossus was the first electronic digital computing machine, designed and successfully used during World War II by the United Kingdom to crack the German High Command military codes. “Electronic” means that it was built with tubes, which made Colossus run more than 500 times faster than the relay-based computing machines of the day. It was also the first computer to be manufactured in quantity.

A total of 10 “Colossi” were clandestinely built at Bletchley Park, Britain’s ultra-secret World War II cryptanalytic center, between 1943 and 1945 to crack the wireless telegraph signals encrypted with a special system developed by C. Lorenz AG, a German electronics firm. After the war the Colossi were destroyed or dismantled for their parts to protect the secret of the United Kingdom’s cryptanalytic prowess.

Colossus was far more sophisticated than the electromechanical Bombe machines that Alan Turing designed to crack the simpler Enigma cipher used by the Germans for battlefield encryption. Whereas Enigma used between three and eight encrypting rotors to scramble characters, the Lorenz system involved 12 wheels, with each wheel adding more mathematical complexity, and thus required a cipher-cracking machine with considerably more speed and agility.

Electronic tubes provided Colossus with the speed that it required. But that speed meant that Colossus needed a similarly fast input system. It used punched paper tape running at 5,000 characters per second, the tape itself moving at 27 miles per hour. Considerable engineering kept the tape properly tensioned, preventing rips and tears.

The agility was provided by a cryptanalysis technique designed by Alan Turing called Turingery, which inferred the cryptographic pattern of each Lorenz cipher wheel, and a second algorithm. The second algorithm, designed by British mathematician W. T. Tutte, determined the starting position of the wheels, which the Germans changed for each group of messages. The Colossi themselves were operated by a group of cryptanalysts that included 272 women from the Women’s Royal Naval Service (WRNS) and 27 men.”

SEE ALSO Manchester SSEM (1948)

The Colossus computing machine was used to read Nazi codes at Bletchley Park, England, during World War II.

Fair Use Source: B07C2NQSPV


Atanasoff-Berry Computer – 1942 A.D.

Return to Timeline of the History of Computers


Atanasoff-Berry Computer

John Vincent Atanasoff (1903–1995), Clifford Edward Berry (1918–1963)

“Built at Iowa State College (now Iowa State University) by professor John Atanasoff and graduate student Clifford Berry, the Atanasoff-Berry Computer (ABC) was an automatic, electronic digital desktop computer.

Atanasoff, a physicist and inventor, created the ABC to solve general systems of linear equations with up to 29 unknowns. At the time, it took a human computer eight hours to solve a system with eight unknowns; systems with more than 10 unknowns were not often attempted. Atanasoff started building the computer in 1937; he successfully tested it in 1942, and then abandoned it when he was called for duty in World War II. Although the machine was largely forgotten, it changed the course of computing decades later.

The machine was based on electronics, rather than relays and mechanical switches, performed math with binary arithmetic, and had a main memory that used an electrical charge (or its absence) in small capacitors to represent 1s and 0s—the same approach used by modern dynamic random access memory (DRAM) modules. The whole computer weighed 700 pounds.

Ironically, the lasting value of the ABC was to invalidate the original ENIAC patent, which had been filed by J. Presper Eckert and John Mauchly in June 1947. The ENIAC patent was the subject of substantial litigation, and the US Patent and Trademark Office did not issue the patent until 1964 as a result. With the patent in hand, the American electronics company Sperry Rand (which had bought the Eckert-Mauchly Computer Corporation in 1950) immediately demanded huge fees from all companies selling computers. At the time, patents were good for 18 years from the date of issuance, meaning that the ENIAC patent might stifle the computing industry until 1982.

It turned out that Mauchly had visited Iowa State and studied the ABC in June 1941—but had failed to mention the ABC as prior work in his patent application. In 1967, Honeywell sued Sperry Rand, claiming that the patent was invalid because of the omission. The US District Court for the District of Minnesota agreed and invalidated the ENIAC patent six years later.”


“A working reconstruction of the Atanasoff-Berry Computer, built by engineers at Iowa State University between 1994 and 1997.”

Fair Use Source: B07C2NQSPV

History Software Engineering

Z3 Computer – 1941 A.D.

Return to Timeline of the History of Computers


Z3 Computer

Konrad Zuse (1910–1995)

“The Z3 was the world’s first working programmable, fully automatic digital computer. The machine executed a program on punched celluloid tape and could perform addition, subtraction, multiplication, division, and square roots on 22-bit binary floating-point numbers (because binary math was more efficient than decimal); it had 64 words of 22-bit memory for storing results. The machine could convert decimal floating points to binary for input, and binary floating points back to decimal for output.
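The decimal-to-binary floating-point conversion described above can be sketched as normalizing a value into a mantissa and a power-of-two exponent. The field widths here are illustrative, not the Z3's actual 22-bit word layout.

```python
def to_binary_float(value, mantissa_bits=16):
    """Normalize a positive value into (mantissa, exponent) so that
    value ~= mantissa * 2**exponent, with the mantissa an integer."""
    exponent = 0
    while value >= 2:
        value /= 2
        exponent += 1
    while value < 1:
        value *= 2
        exponent -= 1
    mantissa = round(value * 2**mantissa_bits)
    return mantissa, exponent - mantissa_bits

def from_binary_float(mantissa, exponent):
    """Convert back for decimal output, as the Z3 did."""
    return mantissa * 2.0**exponent

m, e = to_binary_float(6.25)
assert from_binary_float(m, e) == 6.25  # 6.25 = 0b110.01 is exact in binary
```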

Graduating with a degree in civil engineering in 1935, German inventor Konrad Zuse immediately started building his first computer, the Z1 (constructed 1935–1938), in his parents’ apartment in Berlin. The Z1 was a mechanical calculator controlled by holes punched in celluloid film. The machine used 22-bit binary floating-point numbers and supported Boolean logic; it was destroyed in December 1943 during an Allied air raid.

Drafted into military service in 1939, Zuse started work on the Z2 (1939), which improved on the Z1’s design by using telephone relays for the arithmetic and control logic. DVL, the German Research Institute for Aviation, was impressed by the Z2 and gave Zuse funds to start his company, Zuse Apparatebau (Zuse Apparatus Construction, later renamed Zuse KG), to build the machines.

In 1941, Zuse designed and built the Z3. Like the Z1 and Z2, it was controlled by punched celluloid tape, but it also had support for loops, allowing it to be used for solving many typical engineering calculations.

With the success of the Z3, Zuse started working on the Z4, a more powerful machine with 32-bit floating-point math and conditional jumps. The partially completed machine was moved from Berlin to Göttingen in February 1945 to prevent it from falling into Soviet hands, and was completed there just before the end of the war. It remained in operation until 1959.

Surprisingly, it seems that the German military never made use of these sophisticated machines—instead, the machines were largely funded as a research project.”

SEE ALSO Atanasoff-Berry Computer (1942), Binary-Coded Decimal (1944)

“The control console, calculator, and storage cabinets of the Z3 computer by Konrad Zuse.”

Fair Use Source: B07C2NQSPV


Church-Turing Thesis – 1936 A.D.

Return to Timeline of the History of Computers


Church-Turing Thesis

David Hilbert (1862–1943), Alonzo Church (1903–1995), Alan Turing (1912–1954)

“Computer science theory seeks to answer two fundamental questions about the nature of computers and computation: are there theoretical limits regarding what is possible to compute, and are there practical limits?

American mathematician Alonzo Church and British computer scientist Alan Turing each published an answer to these questions in 1936. They did it by answering a challenge posed by the eminent German mathematician David Hilbert eight years earlier.

Hilbert’s challenge, the Entscheidungsproblem (German for “decision problem”), asked if there was a mathematical procedure—an algorithm—that could be applied to determine if any given mathematical proposition was true or false. Hilbert had essentially asked if the core work of mathematics, the proving of theorems, could be automated.

Church answered Hilbert by developing a new way of describing mathematical functions and number theory called the Lambda calculus. With it, he showed that the Entscheidungsproblem could not be solved in general: there was no general algorithmic procedure for proving or disproving theorems. He published his paper in April 1936.

Turing took a radically different approach: he created a mathematical definition of a simple, abstract machine that could perform computation. Turing then showed that such a machine could in principle perform any computation and run any algorithm—it could even simulate the operation of other machines. Finally, he showed that while such machines could compute almost anything, there was no way to know if a computation would eventually complete, or if it would continue forever. Thus, the Entscheidungsproblem was unsolvable.
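Turing's abstract machine — a tape, a read/write head, and a state table — is simple enough to simulate in a few lines. This is a minimal sketch with an invented example program; the step budget echoes Turing's result that we cannot in general decide whether a machine will ever halt.

```python
def run_turing_machine(program, tape, state="start", max_steps=1000):
    """Simulate a Turing machine. program maps (state, symbol) ->
    (symbol_to_write, move 'L'/'R', next_state). Returns the final tape
    on halt, or None if the step budget is exhausted."""
    cells = dict(enumerate(tape))
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            return "".join(cells[i] for i in sorted(cells))
        symbol = cells.get(head, "_")          # "_" is the blank symbol
        write, move, state = program[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return None  # did not halt within the budget

# A tiny machine that flips every bit, then halts at the first blank cell.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
assert run_turing_machine(flip, "1011") == "0100_"
```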

Turing went to Princeton University in September 1936 to study with Church, where the two discovered that the radically different approaches were, in fact, mathematically equivalent. Turing’s paper was published in November 1936; he stayed on and completed his PhD in June 1938, with Church as his PhD advisor.”

SEE ALSO Colossus (1943), EDVAC First Draft Report (1945), NP-Completeness (1971)

Statue of Alan Turing at Bletchley Park, the center of Britain’s codebreaking operations during World War II.

Fair Use Source: B07C2NQSPV


Differential Analyzer – 1931 A.D.

Return to Timeline of the History of Computers


Differential Analyzer

Vannevar Bush (1890–1974), Harold Locke Hazen (1901–1980)

“Differential equations are used to describe and predict the behavior of various phenomena in our constantly changing and complex world. They can predict ocean wave heights, population growth, how far a baseball might fly, how quickly plastic will decay, and so on. Some of these mathematical mysteries could be solved by hand, but other, more complex scenarios such as simulating nuclear explosions are far too labor intensive and intricate for a manual approach. To overcome this limitation, machines were needed to aid human cognition.

Designed and built at MIT between 1928 and 1931 by Vannevar Bush and his graduate student Harold Locke Hazen, the differential analyzer combined six mechanical integrators, allowing complex differential equations to be analyzed. Bush designed the differential analyzer in part because he had been trying to solve a differential equation that required multiple sequences of integration. He thought it would be faster to design and build a machine to solve the equations he confronted, rather than solve the equations directly.

An analog computer, the differential analyzer used electric motors to drive a variety of gears and shafts that powered six wheel-and-disc integrators that were connected to 18 rotating shafts. Dozens of analyzers were built along the lines of the original plans. It was a breakthrough machine that enabled advances to be made in understanding seismology, electrical networks, meteorology, and ballistic calculations.
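The integration that the analyzer's wheel-and-disc mechanisms performed continuously is done numerically in modern software. As a sketch, here is Euler's method applied to the baseball example from the text: projectile motion under gravity (air resistance ignored, for simplicity).

```python
from math import radians, cos, sin

def flight_distance(speed, angle_deg, dt=0.0001):
    """Integrate dx/dt = vx, dy/dt = vy, dvy/dt = -g step by step,
    returning the horizontal distance when the projectile lands."""
    g = 9.81                                  # m/s^2
    vx = speed * cos(radians(angle_deg))
    vy = speed * sin(radians(angle_deg))
    x = y = 0.0
    while True:
        vy -= g * dt                          # integrate the acceleration
        x += vx * dt                          # integrate the velocities
        y += vy * dt
        if y <= 0:                            # back at ground level
            return x

# A 45-degree launch at 40 m/s lands near the analytic range v**2/g ~ 163 m.
assert abs(flight_distance(40, 45) - 40**2 / 9.81) < 1.0
```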

Because it was mechanical, imperfections in the machining, or simple wear on the parts, made each analyzer’s results less accurate over time. It was also slow to set up. So after becoming director of the Carnegie Institution for Science in Washington, DC, in 1938, Bush started working on a replacement machine, based on tubes, called the Rockefeller Differential Analyzer. Completed in 1942, it had 2,000 tubes and 150 motors and was an important calculating machine in World War II.”

SEE ALSO Thomas Arithmometer (1851)

Vannevar Bush with his differential analyzer, a mechanical computer designed to solve differential equations.

Fair Use Source: B07C2NQSPV

History Software Engineering

Boolean Algebra – 1854 A.D.

Return to Timeline of the History of Computers


Boolean Algebra

George Boole (1815–1864), Claude Shannon (1916–2001)

“George Boole was born into a shoemaker’s family in Lincolnshire, England, and schooled at home, where he learned Latin, mathematics, and science. But Boole’s family landed on hard times, and at age 16 he was forced to support his family by becoming a school teacher—a profession he would continue for the rest of his life. In 1838, he wrote his first of many papers on mathematics, and in 1849 he was appointed as the first professor of mathematics at Queen’s College in Cork, Ireland.

Today Boole is best known for his invention of mathematics for describing and reasoning about logical propositions, what we now call Boolean logic. Boole introduced his ideas in his 1847 monograph, “The Mathematical Analysis of Logic,” and perfected them in his 1854 monograph, “An Investigation of the Laws of Thought.”

Boole’s monographs presented a general set of rules for reasoning with symbols, which today we call Boolean algebra. He created a way—and a notation—for reasoning about what is true and what is false, and how these notions combine when reasoning about complex logical systems. He is also credited with formalizing the mathematical concepts of AND, OR, and NOT, from which all logical operations on binary numbers can be derived. Today many computer languages refer to such numbers as Booleans or simply Bools in recognition of his contribution.

Boole died at the age of 49 from pneumonia. His work was carried on by other logicians but didn’t receive notice in the broader community until 1936, when Claude Shannon, then a graduate student at the Massachusetts Institute of Technology (MIT), realized that the Boolean algebra he had learned in an undergraduate philosophy class at the University of Michigan could be used to describe electrical circuits built from relays. This was a huge breakthrough, because it meant that complex relay circuits could be described and reasoned about symbolically, rather than through trial and error. Shannon’s wedding of Boolean algebra and relays let engineers discover bugs in their diagrams without having to first build the circuits, and it allowed many complex systems to be refactored, replacing them with relay systems that were functionally equivalent but had fewer components.”

SEE ALSO Binary Arithmetic (1703), Manchester SSEM (1948)

“A circuit diagram analyzed using George Boole’s “laws of thought”—what today is called Boolean algebra. Boole’s laws were used to analyze complicated telephone switching systems.”

Fair Use Source: B07C2NQSPV


Thomas Arithmometer – 1851 A.D.

Return to Timeline of the History of Computers


Thomas Arithmometer

Gottfried Wilhelm Leibniz (1646–1716), Charles Xavier Thomas de Colmar (1785–1870)

“German philosopher and mathematician Gottfried Leibniz became interested in mechanical calculation after seeing a pedometer while visiting Paris in 1672. He invented a new type of gear that could advance a 10-digit dial exactly 0 to 9 places, depending on the position of a lever, and used it in a machine with multiple dials and levers called the stepped reckoner. Designed to perform multiplication with repeated additions and division by repeated subtractions, the reckoner was hard to use because it didn’t automatically perform carry operations; that is, adding 1 to 999 did not produce 1,000 in a single operation. Worse, the machine had a design flaw—a bug—that prevented it from working properly. Leibniz built only two of them.

More than 135 years later, Charles Xavier Thomas de Colmar left his position as inspector of supply for the French army and started an insurance company. Frustrated by the need to perform manual arithmetic, Thomas designed a machine to help with math. Thomas’s arithmometer used Leibniz’s mechanism, now called a Leibniz wheel, but combined it with other gears, cogs, and sliding levers to create a machine that could reliably add and subtract numbers up to three digits, and multiply and divide as well. Thomas patented the machine, but his business partners at the insurance firm were not interested in commercializing it.

Twenty years later, Thomas once again turned his attention to the arithmometer. He demonstrated a version at the 1844 French national exhibition and entered competitions again in 1849 and 1851. By 1851, he had simplified the machine’s operation and extended its capabilities, giving it six sliders for setting numbers and 10 dials for displaying results. Aided by three decades’ advance in manufacturing technology, Thomas was able to mass-produce his device. By the time of his death, his company had sold more than a thousand of the machines—the first practical calculator that could be used in an office setting—and Thomas was recognized for his genius in creating it. The size of the arithmometer was approximately 7 inches (18 centimeters) wide by 6 inches (15 centimeters) tall.”
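The arithmetic strategy both machines shared, multiplication by repeated addition and division by repeated subtraction, can be sketched in a few lines. This is a simplified software analogy, not a model of the Leibniz wheel itself:

```python
def multiply(a, b):
    """Multiply by adding 'a' into an accumulator 'b' times,
    as the stepped reckoner and arithmometer did mechanically."""
    total = 0
    for _ in range(b):
        total += a
    return total

def divide(a, b):
    """Divide by subtracting the divisor until it no longer fits,
    counting the subtractions; returns (quotient, remainder)."""
    quotient = 0
    while a >= b:
        a -= b
        quotient += 1
    return quotient, a

print(multiply(123456, 654))  # a 6-digit by 3-digit product
print(divide(1000, 7))        # quotient and remainder
```

Note that a real Leibniz wheel advanced a dial 0 to 9 places per turn, so the machine effectively added one shifted digit of the multiplier at a time rather than looping the full count as this sketch does.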

SEE ALSO Curta Calculator (1948)

“This Thomas Arithmometer can multiply two 6-digit decimal numbers to produce a 12-digit number. It can also divide.”

Fair Use Source: B07C2NQSPV