
Timeline of the History of Computers

Return to History or This Year in History

c. 2500 BC – Sumerian Abacus

c. 700 BC – Scytale

c. 150 BC – Antikythera Mechanism

c. 60 – Programmable Robot

c. 850 – On Deciphering Cryptographic Messages

c. 1470 – Cipher Disk

1613 – First Recorded Use of the Word Computer

1621 – Slide Rule

1703 – Binary Arithmetic

1758 – Human Computers Predict Halley’s Comet

1770 – The “Mechanical Turk”

1792 – Optical Telegraph

1801 – The Jacquard Loom

1822 – The Difference Engine

1833 – Michael Faraday discovered silver sulfide became a better conductor when heated

1836 – Electrical Telegraph

1843 – Ada Lovelace Writes a Computer Program

1843 – Fax Machine Patented

1843 – Edgar Allan Poe’s “The Gold-Bug”

1849 to early 1900s – Silicon Valley After the Gold Rush

1851 – Thomas Arithmometer

1854 – Boolean Algebra

1864 – First Electromagnetic Spam Message

1870 – Mitsubishi founded

1874 – Baudot Code

1874 – Semiconductor Diode Conceived

1876 – Ericsson Corporation founded in Sweden

1885 – Stanford University

1885 – William Burroughs’ adding machine

1890 – Herman Hollerith Tabulating the US Census

1890 – Hakunetsusha (a forerunner of Toshiba) founded in Japan

1891 – Strowger Step-by-Step Switch

1898 – Nippon Electric Limited Partnership – NEC Corporation founded in Japan

1890s to 1930s – Radio Engineering

Early 1900s – Electrical Engineering

1904 – “Diode” or Two-Element Amplifier actually invented

1904 – Three-Element Amplifier or “Triode”

1906 – Vacuum Tube or “Audion”

1907 – Lee DeForest coins the term “radio” to refer to wireless transmission when he formed his DeForest Radio Telephone Company

1909 – Charles Herrold in San Jose started first radio station in USA with regularly scheduled programming, including songs, using an arc transmitter of his own design. Herrold was one of Stanford’s earliest students and founded his own College of Wireless and Engineering in San Jose

1910 – Radio Broadcasting business pioneered by Lee DeForest with broadcast from New York of a live performance by Italian tenor Enrico Caruso

1910 – Hitachi founded in Japan

1912 – Sharp Corporation founded in Japan and takes its name from one of its founder’s first inventions, the Ever-Sharp mechanical pencil

1914 – Floating-Point Numbers

1917 – Vernam Cipher

1918 – Panasonic, then Matsushita Electric, founded in Japan

1920 – Rossum’s Universal Robots

1927 – Fritz Lang’s Metropolis

1927 – First LED

1928 – Electronic Speech Synthesis

1930 – The Enigma Machine

1931 – Differential Analyzer

1935 – Fujitsu founded as Fuji Telecommunications Equipment Manufacturing in Japan. Fujitsu is the world’s second-oldest IT company, after IBM and ahead of Hewlett-Packard

1936 – Church-Turing Thesis

1939 – Hewlett-Packard founded in a one-car garage in Palo Alto, California by Bill Hewlett and David Packard

1939 – Toshiba formed in Japan by merger of Shibaura Seisakusho and Tokyo Denki

1941 – Z3 Computer

1942 – Atanasoff-Berry Computer

1942 – Isaac Asimov’s Three Laws of Robotics

1942 – Seiko Corporation founded in Japan

1944 – Delay Line Memory

1944 – Binary-Coded Decimal

1945 – Vannevar Bush’s “As We May Think”

1945 – EDVAC First Draft Report – The von Neumann architecture

1946 – Trackball

1946 – Williams Tube Random Access Memory

1947 – Actual Bug Found – First “debugging”

1947 – William Shockley’s Silicon Transistor

1948 – The Bit – Binary Digit 0 or 1

1948 – Curta Calculator

1948 – Manchester SSEM

1949 – Whirlwind Computer

1950 – Error-Correcting Codes (ECC)

1951 – Turing Test of Artificial Intelligence (AI)

1951 – Magnetic Tape Used for Computers

1951 – Core Memory

1951 – Microprogramming

1952 – Computer Speech Recognition

1953 – First Transistorized Computer

1955 – Artificial Intelligence (AI) Coined

1955 – Computer Proves Mathematical Theorem

1956 – First Disk Storage Unit

1956 – The Byte

1956 – Robby the Robot from Forbidden Planet

1957 – FORTRAN Programming Language

1957 – First Digital Image

1958 – The Bell 101 Modem

1958 – SAGE Computer Operational

1959 – IBM 1401 Computer

1959 – DEC PDP-1

1959 – Quicksort Algorithm

1959 – SABRE Airline Reservation System

1960 – COBOL Programming Language

1960 – Recommended Standard 232 (RS-232)

1961 – ANITA Electronic Calculator

1961 – Unimate – First Mass-Produced Robot

1961 – Time-Sharing – The Original “Cloud Computing”

1961 – Shinshu Seiki Company founded in Japan (now called Seiko Epson Corporation) as a subsidiary of Seiko to supply precision parts for Seiko watches.

1962 – Spacewar! Video Game

1962 – Virtual Memory

1962 – Digital Long Distance Telephone Calls

1963 – Sketchpad Interactive Computer Graphics

1963 – ASCII Character Encoding

1963 – Seiko Corporation in Japan developed world’s first portable quartz timer (Seiko QC-951)

1964 – RAND Tablet Computer

1964 – Teletype Model 33 ASR

1964 – IBM System/360 Mainframe Computer

1964 – BASIC Programming Language

1965 – First Liquid-Crystal Display (LCD)

1965 – Fiber Optics – Optical-Fiber

1965 – DENDRAL Artificial Intelligence (AI) Research Project

1965 – ELIZA – The First “Chatbot”

1965 – Touchscreen

1966 – Star Trek Premieres

1966 – Dynamic RAM

1966 – Linear predictive coding (LPC) proposed by Fumitada Itakura of Nagoya University and Shuzo Saito of Nippon Telegraph and Telephone (NTT).[71]

1967 – Object-Oriented Programming

1967 – First ATM (Automated Teller Machine)

1967 – Head-Mounted Display

1967 – Programming for Children

1967 – The Mouse

1968 – Carterfone Decision

1968 – Software Engineering

1968 – HAL 9000 Computer from 2001: A Space Odyssey

1968 – First “Spacecraft” “Guided by Computer”

1968 – Cyberspace Coined—and Re-Coined

1968 – Mother of All Demos

1968 – Dot Matrix Printer – Shinshu Seiki (now called Seiko Epson Corporation) launched the world’s first mini-printer, the EP-101 (“EP” for Electronic Printer), which was soon incorporated into many calculators

1968 – Interface Message Processor (IMP)

1969 – ARPANET / Internet

1969 – Digital Imaging

1969 – Network Working Group Request for Comments (RFC): 1

1969 – Utility Computing – Early “Cloud Computing”

1969 – Perceptrons Book – Dark Ages of Neural Networks Artificial Intelligence (AI)

1969 – UNIX Operating System

1969 – Seiko Epson Corporation in Japan developed world’s first quartz watch timepiece (Seiko Quartz Astron 35SQ)

1970 – Fair Credit Reporting Act

1970 – Relational Databases

1970 – Floppy Disk

1971 – Laser Printer

1971 – NP-Completeness

1971 – @Mail Electronic Mail

1971 – First Microprocessor – General-Purpose CPU – “Computer on a Chip”

1971 – First Wireless Network

1972 – C Programming Language

1972 – Cray Research Supercomputers – High-Performance Computing (HPC)

1972 – Game of Life – Early Artificial Intelligence (AI) Research

1972 – HP-35 Calculator

1972 – Pong Game from Atari – Nolan Bushnell

1973 – First Cell Phone Call

1973 – Danny Cohen first demonstrated a form of packet voice as part of a flight simulator application, which operated across the early ARPANET.[69][70]

1973 – Xerox Alto from Xerox Palo Alto Research Center (PARC)

1973 – Sharp Corporation produced the first LCD calculator

1974 – Data Encryption Standard (DES)

1974 – The Institute of Electrical and Electronics Engineers (IEEE) publishes a paper entitled “A Protocol for Packet Network Interconnection”.[82]

1974 – Network Voice Protocol (NVP) tested over ARPANET in August 1974, carrying barely audible 16 kbps CVSD-encoded voice.[71]

1974 – The first successful real-time conversation over ARPANET achieved using 2.4 kbps LPC, between Culler-Harrison Incorporated in Goleta, California, and MIT Lincoln Laboratory in Lexington, Massachusetts.[71]

1974 – First Personal Computer: The Altair 8800 Invented by MITS in Albuquerque, New Mexico

1975 – Colossal Cave Adventure – Text-based “Video” Game

1975 – The Shockwave Rider SciFi Book – A Prelude of the 21st Century Big Tech Police State

1975 – AI Medical Diagnosis – Artificial Intelligence in Medicine

1975 – BYTE Magazine

1975 – Homebrew Computer Club

1975 – The Mythical Man-Month

1975 – The name Epson was coined for the next generation of printers based on the EP-101, which was released to the public (“Epson” meaning “son of the Electronic Printer”).[7] Epson America Inc. was established to sell printers for Shinshu Seiki Co.

1976 – Public Key Cryptography

1976 – Acer founded

1976 – Tandem NonStop

1976 – Dr. Dobb’s Journal

1977 – RSA Encryption

1977 – Apple II Computer

The TRS-80 Model I, the Apple II, and the Commodore PET 2001-8 together constitute what Byte Magazine called the “1977 Trinity” of home computing.

1977 – Danny Cohen and Jon Postel of the USC Information Sciences Institute, and Vint Cerf of the Defense Advanced Research Projects Agency (DARPA), agree to separate IP from TCP, and create UDP for carrying real-time traffic.

1978 – First Internet Spam Message

1978 – France’s Minitel Videotext

1979 – Secret Sharing for Encryption

1979 – Dan Bricklin Invents VisiCalc Spreadsheet

1980 – Timex Sinclair ZX80 Computer

1980 – Flash Memory

1980 – RISC Microprocessors – Reduced Instruction Set Computer CPUs

1980 – Commercially Available Ethernet Invented by Robert Metcalfe of 3Com

1980 – Usenet

1981 – IBM Personal Computer – IBM PC

1981 – Simple Mail Transfer Protocol (SMTP) Email

1981 – Japan’s Fifth Generation Computer Systems

1982 – Sun Microsystems was founded on February 24, 1982.[2]

1982 – AutoCAD

1982 – First Commercial UNIX Workstation

1982 – PostScript

1982 – Microsoft and the IBM PC Clones

1982 – First CGI Sequence in Feature Film – Star Trek II: The Wrath of Khan

1982 – National Geographic Moves the Pyramids – Precursor to Photoshop

1982 – Secure Multi-Party Computation

1982 – TRON Movie

1982 – Home Computer Named Machine of the Year by Time Magazine

1983 – The Qubit – Quantum Computers

1983 – WarGames

1983 – 3-D Printing

1983 – Computerization of the Local Telephone Network

1983 – First Laptop

1983 – MIDI Computer Music Interface

1983 – Microsoft Word

1983 – Nintendo Entertainment System – Video Games

1983 – Domain Name System (DNS)

1983 – IPv4 Flag Day – TCP/IP

1984 – Text-to-Speech (TTS)

1984 – Apple Macintosh

1984 – VPL Research, Inc. – Virtual Reality (VR)

1984 – Quantum Cryptography

1984 – Telebit TrailBlazer Modems Break 9600 bps

1984 – Verilog Language

1984 – Dell founded by Michael Dell

1984 – Cisco Systems was founded in December 1984

1985 – Connection Machine – Parallelization

1985 – First Computer-Generated TV Host – Max Headroom – CGI

1985 – Zero-Knowledge Mathematical Proofs

1985 – FCC Approves Unlicensed Wireless Spread Spectrum

1985 – NSFNET National Science Foundation “Internet”

1985 – Desktop Publishing – with Macintosh, Aldus PageMaker, LaserJet, LaserWriter and PostScript

1985 – Field-Programmable Gate Array (FPGA)

1985 – GNU Manifesto from Richard Stallman

1985 – AFIS Stops a Serial Killer – Automated Fingerprint Identification System

1986 – Software Bug Fatalities

1986 – Pixar Animation Studios

1986 – D-Link Corporation founded in Taipei, Taiwan

1987 – Digital Video Editing

1987 – GIF – Graphics Interchange Format

1988 – MPEG – Moving Picture Experts Group – Coding-Compressing Audio-Video

1988 – CD-ROM

1988 – Morris Worm Internet Computer Virus

1988 – Linksys founded

1989 – World Wide Web-HTML-HTTP Invented by Tim Berners-Lee

1989 – Asus was founded in Taipei, Taiwan

1989 – SimCity Video Game

1989 – ISP Provides Internet Access to the Public

1990 – GPS Is Operational – Global Positioning System

1990 – Digital Money is Invented – DigiCash – Precursor to Bitcoin

1991 – Pretty Good Privacy (PGP)

1991 – DARPA’s Report “Computers at Risk: Safe Computing in the Information Age”

1991 – Linux Kernel Operating System Invented by Linus Torvalds

1992 – Boston Dynamics Robotics Company Founded

1992 – JPEG – Joint Photographic Experts Group

1992 – First Mass-Market Web Browser NCSA Mosaic Invented by Marc Andreessen

1992 – Unicode Character Encoding

1993 – Apple Newton

1994 – First Banner Ad – Wired Magazine

1994 – RSA-129 Encryption Cracked

1995 – DVD

1995 – E-Commerce Startups – eBay, Amazon and DoubleClick Launched

1995 – AltaVista Web Search Engine

1995 – Gartner Hype Cycle

1996 – Universal Serial Bus (USB)

1996 – Juniper Networks founded

1997 – IBM Computer Is World Chess Champion

1997 – PalmPilot

1997 – E Ink

1998 – Diamond Rio MP3 Player

1998 – Google

1999 – Collaborative Software Development

1999 – Blog Is Coined

1999 – Napster P2P Music and File Sharing

2000 – USB Flash Drive

2000 – Sharp Corporation’s Mobile Communications Division created the world’s first commercial camera phone, the J-SH04, in Japan

2000 – Fortinet founded

2001 – Wikipedia

2001 – Apple iTunes

2001 – Advanced Encryption Standard (AES)

2001 – Quantum Computer Factors “15”

2002 – Home-Cleaning Robot

2003 – CAPTCHA

2004 – Product Tracking

2004 – Facebook

2004 – First International Meeting on Synthetic Biology

2005 – Video Game Enables Research into Real-World Pandemics

2006 – Apache Hadoop Makes Big Data Possible

2006 – Differential Privacy

2007 – Apple iPhone

2008 – Bitcoin

2010 – Air Force Builds Supercomputer with Gaming Consoles

2010 – Cyber Weapons

2011 – Smart Homes via the Internet of Things (IoT)

2011 – IBM Watson Wins Jeopardy!

2011 – World IPv6 Day

2011 – Social Media Enables the Arab Spring

2012 – DNA Data Storage

2013 – Algorithm Influences Prison Sentence

2013 – Subscription Software “Popularized”

2014 – Data Breaches

2014 – Over-the-Air Vehicle Software Updates

2015 – Google Releases TensorFlow

2016 – Augmented Reality Goes Mainstream

2016 – Computer Beats Master at Game of Go

~2050 – Artificial General Intelligence (AGI)?

~9999 – The Limits of Computation?




The Enigma Machine – Circa 1930 AD

Return to Timeline of the History of Computers or History

The Enigma Machine, Circa 1930 – Fair Use Source: B085FW7J86

“The Enigma machine used electric-powered mechanical rotors to both encrypt and decrypt text-based messages sent over radio waves. The device had German origins and would become an important technological development during the Second World War.”

“The device looked like a large square or rectangular mechanical typewriter. On each key press, the rotors would move and record a seemingly random character that would then be transmitted to all nearby Enigma machines. However, these characters were not random, and were defined by the rotation of the rotor and a number of configuration options that could be modified at any time on the device. Any Enigma machine with a specific configuration could read or “decrypt” messages sent from another machine with an identical configuration. This made the Enigma machine extremely valuable for sending crucial messages while avoiding interception.”
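The configuration-dependent rotor scheme described above can be sketched in a few lines. This is a toy single-rotor model for illustration only, not the actual Enigma wiring; the seed-derived permutation and the function names are assumptions of this sketch.

```python
import random
import string

ALPHABET = string.ascii_uppercase

def make_wiring(seed):
    """Derive a rotor permutation from a shared secret seed (a stand-in
    for agreeing on a physical rotor; real Enigma wirings were fixed)."""
    rng = random.Random(seed)
    perm = list(range(26))
    rng.shuffle(perm)
    return perm

def rotor_crypt(text, seed, start_pos, decrypt=False):
    """Encrypt or decrypt with one stepping rotor. Only a machine with
    the identical configuration (seed, start_pos) can round-trip it."""
    perm = make_wiring(seed)
    inverse = [0] * 26
    for i, p in enumerate(perm):
        inverse[p] = i
    table = inverse if decrypt else perm
    out = []
    for step, ch in enumerate(text):
        offset = (start_pos + step) % 26  # the rotor advances on every key press
        i = ALPHABET.index(ch)
        out.append(ALPHABET[(table[(i + offset) % 26] - offset) % 26])
    return "".join(out)
```

With identical settings on both ends the message round-trips; with any other settings it decrypts to gibberish, which is why a leaked configuration log compromised the whole network.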

“While a sole inventor of the rotary encryption mechanism used by the machine is hard to pinpoint, the technology was popularized by a two-man company called Chiffriermaschinen AG based in Germany. In the 1920s, Chiffriermaschinen AG traveled throughout Germany demonstrating the technology, which led to the German military adopting it in 1928 to secure top-secret military messages in transit.”

“The ability to avoid the interception of long-distance messages was a radical development that had never before been possible. In the software world of today, the interception of messages is still a popular technique that hackers try to employ, often called a man-in-the-middle attack. Today’s software uses similar (but much more powerful) techniques to those that the Enigma machine used a hundred years ago to protect against such attacks.”

“While the Enigma machine was an incredibly impressive technology for its time, it was not without flaws. Because the only criterion for interception and decryption was an Enigma machine with an identical configuration to the sender, a single compromised configuration log (or private key, in today’s terms) could render an entire network of Enigma machines useless.”

“To combat this, any groups sending messages via the Enigma machine changed their configuration settings on a regular basis. Reconfiguring Enigma machines was a time-consuming process. First, the configuration logs had to be exchanged in person, as secure ways of sharing them remotely did not yet exist. Sharing configuration logs between a network of two machines and two operators might not be painful. But a larger network, say 20 machines, required multiple messengers to deliver the configuration logs — each increasing the probability of a configuration log being intercepted and stolen, or potentially even leaked or sold.”

“The second problem with sharing configuration logs was that manual adjustments to the machine itself were required for the Enigma machine to be able to read, encrypt, and decrypt new messages sent from other Enigma machines. This meant that a specialized and trained staff member had to be present in case a configuration update was needed. This all occurred in an era prior to software, so these configuration adjustments required tampering with the hardware and adjusting the physical layout and wiring of the plugboard. The adjuster needed a background in electronics, which was very rare in the early 1900s.”

“As a result of how difficult and time-consuming it was to update these machines, updates typically occurred on a monthly basis — daily for mission-critical communication lines. If a key was intercepted or leaked, all transmissions for the remainder of the month could be intercepted by a malicious actor — the equivalent of a hacker today.”

“The type of encryption these Enigma machines used is now known as a symmetric key algorithm, which is a special type of cipher that allows for the encryption and decryption of a message using a single cryptographic key. This family of encryption is still used today in software to secure data in transit (between sender and receiver), but with many improvements on the classic model that gained popularity with the Enigma machine.”
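The defining symmetric property, one key for both directions, can be shown with a minimal sketch. The hash-chained keystream below is a toy construction assumed for illustration; production software uses vetted ciphers such as AES-GCM or ChaCha20.

```python
import hashlib

def keystream(key: bytes):
    """Endless byte stream derived from the key by hashing a counter.
    Toy construction for illustration only."""
    counter = 0
    while True:
        block = hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        yield from block
        counter += 1

def xor_crypt(data: bytes, key: bytes) -> bytes:
    # Symmetric: the same call both encrypts and decrypts, because
    # XOR with the same keystream is self-inverse.
    return bytes(b ^ k for b, k in zip(data, keystream(key)))
```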

“In software, keys can be made much more complex. Modern key generation algorithms produce keys so complex that attempting every possible combination (brute forcing or brute force attack) with the fastest possible modern hardware could easily take more than a million years. Additionally, unlike the Enigma machines of the past, software keys can change rapidly.”
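The "more than a million years" claim is easy to sanity-check. Assuming, hypothetically, a 128-bit key and a trillion guesses per second:

```python
# Back-of-the-envelope brute-force estimate (hypothetical attacker speed).
keyspace = 2 ** 128                      # possible 128-bit keys
guesses_per_second = 10 ** 12            # one trillion guesses per second
seconds_per_year = 60 * 60 * 24 * 365
years_to_exhaust = keyspace / (guesses_per_second * seconds_per_year)
# Roughly 10**19 years -- far beyond "a million years".
```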

“Depending on the use case, keys can be regenerated at every user session (per login), at every network request, or at a scheduled interval. When this type of encryption is used in software, a leaked key might expose you for a single network request in the case of per-request regeneration, or worst-case scenario, a few hours in the case of per-login (per-session) regeneration.”

“If you trace the lineage of modern cryptography far back, you will eventually reach World War II in the 1930s. It’s safe to say that the Enigma machine was a major milestone in securing remote communications. From this, we can conclude that the Enigma machine was an essential development in what would later become the field of software security.”

“The Enigma machine was also an important technological development for those who would be eventually known as “hackers.” The adoption of Enigma machines by the Axis Powers during World War II resulted in extreme pressure for the Allies to develop encryption-breaking techniques. General Dwight D. Eisenhower himself claimed that doing so would be essential for victory against the Nazis.”

“In September of 1932, a Polish mathematician named Marian Rejewski was provided a stolen Enigma machine. At the same time, a French spy named Hans-Thilo Schmidt was able to provide him with valid configurations for September and October of 1932. This allowed Marian to intercept messages from which he could begin to analyze the mystery of Enigma machine encryption.”

“Marian was attempting to determine how the machine worked, both mechanically and mathematically. He wanted to understand how a specific configuration of the machine’s hardware could result in an entirely different encrypted message being output.”

“Marian’s attempted decryption was based on a number of theories as to what machine configuration would lead to a particular output. By analyzing patterns in the encrypted messages and coming up with theories based on the mechanics of the machine, Marian and two coworkers, Jerzy Różycki and Henryk Zygalski, eventually reverse engineered the system. With the deep understanding of Enigma rotor mechanics and board configuration that the team developed, they were able to make educated guesses as to which configurations would result in which encryption patterns. They could then reconfigure a board with reasonable accuracy and, after several attempts, begin reading encrypted radio traffic. By 1933 the team was intercepting and decrypting Enigma machine traffic on a daily basis.”

“Much like the hackers of today, Marian and his team intercepted and reverse engineered encryption schemes to get access to valuable data generated by a source other than themselves. For these reasons, I would consider Marian Rejewski and the team assisting him as some of the world’s earliest hackers.”

“In the following years, Germany would continually increase the complexity of its Enigma machine encryption. This was done by gradually increasing the number of rotors required to encrypt a character. Eventually the complexity of reverse engineering a configuration would become too difficult for Marian’s team to break in a reasonable time frame. This development was also important, because it provided a look into the ever-evolving relationship between hackers and those who try to prevent hacking.”

“This relationship continues today, as creative hackers continually iterate and improve their techniques for breaking into software systems. And on the other side of the coin, smart engineers are continually developing new techniques for defending against the most innovative hackers.”

Fair Use Source: B085FW7J86


Curta Calculator – 1948 AD

Return to Timeline of the History of Computers


Curta Calculator

Curt Herzstark (1902–1988)

“The Curta is perhaps the most elegant, compact, and functional mechanical calculator ever manufactured. Designed by Austrian engineer Curt Herzstark, it is the only digital mechanical pocket calculator ever invented. Handheld and powered by a crank on the top, the Curta can add, subtract, multiply, and divide.

Curt Herzstark’s father, Samuel Jacob Herzstark, was a highly regarded Austrian importer and manufacturer of mechanical calculators and other precision instruments. Herzstark finished high school and apprenticed at his father’s company, which he took over when his father died in 1937.

At the time, mechanical calculators were big and heavy desktop affairs. After one of Herzstark’s customers complained that he didn’t want to go back to the office just to add up a column of numbers, Herzstark started designing a handheld calculator. He had an early prototype working in January 1938, just two months before Germany invaded and annexed Austria. Despite Herzstark being half-Jewish, the Nazis let him continue to operate the factory, provided that it cease all civilian production and devote itself to creating devices for the Reich.

In 1943, two of Herzstark’s employees were arrested for distributing transcripts of English radio broadcasts; Herzstark was subsequently arrested for aiding the employees and for “indecent contact with Aryan women.” He was sent to the Buchenwald concentration camp, where he was recognized by one of his former employees, who was now a guard. The guard told the head of the camp’s factory about the mechanical calculator. The Germans then instructed Herzstark to finish his project, so that the camp could give the device to Hitler as a present after Germany won the war. That never happened: Buchenwald was liberated on April 11, 1945, and Hitler killed himself 19 days later.

After liberation, Herzstark took the drawings he had done at the camp to a machine shop and had three working prototypes eight weeks later. The first calculators were produced commercially in the fall of 1948.”

SEE ALSO Antikythera Mechanism (c. 150 BCE), Thomas Arithmometer (1851)

“The Curta mechanical calculator, pictured here, is the only digital mechanical pocket calculator ever invented.”

Fair Use Source: B07C2NQSPV

History Software Engineering

The Bit – Binary Digit 0 or 1 – 1948 AD

Return to Timeline of the History of Computers


The Bit

Claude E. Shannon (1916–2001), John W. Tukey (1915–2000)

“It was the German mathematician Gottfried Wilhelm Leibniz (1646–1716) who first established the rules for performing arithmetic with binary numbers. Nearly 250 years later, Claude E. Shannon realized that a binary digit—a 0 or a 1—was the fundamental, indivisible unit of information.

Shannon earned his PhD from MIT in 1940 and then took a position at the Institute for Advanced Study in Princeton, New Jersey, where he met and collaborated with the institute’s leading mathematicians working at the intersection of computing, cryptography, and nuclear weapons, including John von Neumann, Albert Einstein, Kurt Gödel, and, for two months, Alan Turing.

In 1948, Shannon published “A Mathematical Theory of Communication” in the Bell System Technical Journal. The article was inspired in part by classified work that Shannon had done on cryptography during the war. In it, he created a mathematical definition of a generalized communications system, consisting of a message to be sent, a transmitter to convert the message into a signal, a channel through which the signal is sent, a receiver, and a destination, such as a person or a machine “for whom the message is intended.”

Shannon’s paper introduced the word bit, a binary digit, as the basic unit of information. While Shannon attributed the word to American statistician John W. Tukey, and the word had been used previously by other computing pioneers, Shannon provided a mathematical definition of a bit: rather than just a 1 or a 0, it is information that allows the receiver to limit possible decisions in the face of uncertainty. One of the implications of Shannon’s work is that every communications channel has a theoretical upper bound—a maximum number of bits that it can carry per second. As such, Shannon’s theory has been used to analyze practically every communications system ever developed—from handheld radios to satellite communications—as well as data-compression systems and even the stock market.
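Shannon's definition is directly computable. A short sketch of per-symbol entropy in bits (the function name is ours):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Average information per symbol in bits: H = -sum(p * log2(p)).
    A fair coin flip per symbol gives exactly 1 bit."""
    counts = Counter(message)
    n = len(message)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())
```

A perfectly predictable message carries zero bits per symbol; a uniformly random binary one carries exactly one.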

Shannon’s work illuminates a relationship between information and entropy, thus establishing a connection between computation and physics. Indeed, noted physicist Stephen Hawking framed much of his analysis of black holes in terms of the ability to destroy information and the problems created as a result.”

SEE ALSO Vernam Cipher (1917), Error-Correcting Codes (1950)

Mathematician and computer scientist Claude E. Shannon.

Fair Use Source: B07C2NQSPV

History Software Engineering

Binary-Coded Decimal – 1944 A.D.

Return to Timeline of the History of Computers


Binary-Coded Decimal

Howard Aiken (1900–1973)

“There are essentially three ways to represent numbers inside a digital computer. The most obvious is to use base 10, representing each of the numbers 0–9 with its own bit, wire, punch-card hole, or printed symbol (e.g., 0123456789). This mirrors the way people learn and perform arithmetic, but it’s extremely inefficient.

The most efficient way to represent numbers is to use pure binary notation: with binary, n bits represent 2^n possible values. This means that 10 wires can represent any number from 0 to 1023 (2^10 – 1). Unfortunately, it’s complex to convert between decimal notation and binary.

The third alternative is called binary-coded decimal (BCD). Each decimal digit becomes a set of four binary digits, representing the numbers 1, 2, 4, and 8, and counting in sequence 0000, 0001, 0010, 0011, 0100, 0101, 0110, 0111, 1000, and 1001. BCD is four times more efficient than base 10, yet it’s remarkably straightforward to convert between decimal numbers and BCD. Further, BCD has the profound advantage of allowing programs to exactly represent the numeric value 0.01—something that’s important when performing monetary computations.
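The straightforward digit-by-digit conversion described above can be shown in a few lines (the helper names are illustrative):

```python
def to_bcd(number: int) -> str:
    # Each decimal digit maps to its own 4-bit group (weights 8-4-2-1).
    return " ".join(format(int(d), "04b") for d in str(number))

def from_bcd(bcd: str) -> int:
    # Decoding is just as direct: read each 4-bit group as one digit.
    return int("".join(str(int(group, 2)) for group in bcd.split()))
```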

Early computer pioneers experimented with all three systems. The ENIAC computer built in 1943 was a base 10 machine. At Harvard University, Howard Aiken designed the Mark 1 computer to use a modified form of BCD. And in Germany, Konrad Zuse’s Z1, Z2, Z3, and Z4 machines used binary floating-point arithmetic.

After World War II, IBM went on to design, build, and sell two distinct lines of computers: scientific machines that used binary numbers, and business computers that used BCD. Later, IBM introduced System/360, which used both methods. On modern computers, BCD is typically supported with software, rather than hardware.

In 1972, the US Supreme Court ruled that computer programs could not be patented. In Gottschalk v. Benson, the court ruled that converting binary-coded decimal numerals into pure binary was “merely a series of mathematical calculations or mental steps, and does not constitute a patentable ‘process’ within the meaning of the Patent Act.”

SEE ALSO Binary Arithmetic (1703), Floating-Point Numbers (1914), IBM System/360 (1964)

Howard Aiken inspects one of the four paper-tape readers of the Mark 1 computer.

Fair Use Source: B07C2NQSPV


Colossus – 1943 A.D.

Return to Timeline of the History of Computers



Thomas Harold Flowers (1905–1998), Sidney Broadhurst (1893–1969), W. T. Tutte (1917–2002)

“Colossus was the first electronic digital computing machine, designed and successfully used during World War II by the United Kingdom to crack the German High Command military codes. “Electronic” means that it was built with tubes, which made Colossus run more than 500 times faster than the relay-based computing machines of the day. It was also the first computer to be manufactured in quantity.

A total of 10 “Colossi” were clandestinely built at Bletchley Park, Britain’s ultra-secret World War II cryptanalytic center, between 1943 and 1945 to crack the wireless telegraph signals encrypted with a special system developed by C. Lorenz AG, a German electronics firm. After the war the Colossi were destroyed or dismantled for their parts to protect the secret of the United Kingdom’s cryptanalytic prowess.

Colossus was far more sophisticated than the electromechanical Bombe machines that Alan Turing designed to crack the simpler Enigma cipher used by the Germans for battlefield encryption. Whereas Enigma used between three and eight encrypting rotors to scramble characters, the Lorenz system involved 12 wheels, with each wheel adding more mathematical complexity, and thus required a cipher-cracking machine with considerably more speed and agility.

Electronic tubes provided Colossus with the speed that it required. But that speed meant that Colossus needed a similarly fast input system. It used punched paper tape running at 5,000 characters per second, the tape itself moving at 27 miles per hour. Considerable engineering kept the tape properly tensioned, preventing rips and tears.

The agility was provided by a cryptanalysis technique designed by Alan Turing called Turingery, which inferred the cryptographic pattern of each Lorenz cipher wheel, and a second algorithm. The second algorithm, designed by British mathematician W. T. Tutte, determined the starting position of the wheels, which the Germans changed for each group of messages. The Colossi themselves were operated by a group of cryptanalysts that included 272 women from the Women’s Royal Naval Service (WRNS) and 27 men.”

SEE ALSO Manchester SSEM (1948)

The Colossus computing machine was used to read Nazi codes at Bletchley Park, England, during World War II.

Fair Use Source: B07C2NQSPV


Z3 Computer – 1941 A.D.

Return to Timeline of the History of Computers


Z3 Computer

Konrad Zuse (1910–1995)

“The Z3 was the world’s first working programmable, fully automatic digital computer. The machine executed a program on punched celluloid tape and could perform addition, subtraction, multiplication, division, and square roots on 22-bit binary floating-point numbers (because binary math was more efficient than decimal); it had 64 words of 22-bit memory for storing results. The machine could convert decimal floating points to binary for input, and binary floating points back to decimal for output.

Graduating with a degree in civil engineering in 1935, German inventor Konrad Zuse immediately started building his first computer, the Z1 (constructed 1935–1938), in his parents’ apartment in Berlin. The Z1 was a mechanical calculator controlled by holes punched in celluloid film. The machine used 22-bit binary floating-point numbers and supported Boolean logic; it was destroyed in December 1943 during an Allied air raid.

Drafted into military service in 1939, Zuse started work on the Z2 (1939), which improved on the Z1’s design by using telephone relays for the arithmetic and control logic. DVL, the German Research Institute for Aviation, was impressed by the Z2 and gave Zuse funds to start his company, Zuse Apparatebau (Zuse Apparatus Construction, later renamed Zuse KG), to build the machines.

In 1941, Zuse designed and built the Z3. Like the Z1 and Z2, it was controlled by punched celluloid tape, but it also had support for loops, allowing it to be used for solving many typical engineering calculations.

With the success of the Z3, Zuse started working on the Z4, a more powerful machine with 32-bit floating-point math and conditional jumps. The partially completed machine was moved from Berlin to Göttingen in February 1945 to prevent it from falling into Soviet hands, and was completed there just before the end of the war. It remained in operation until 1959.

Surprisingly, it seems that the German military never made use of these sophisticated machines—instead, the machines were largely funded as a research project.”

SEE ALSO Atanasoff-Berry Computer (1942), Binary-Coded Decimal (1944)

“The control console, calculator, and storage cabinets of the Z3 computer by Konrad Zuse.”

Fair Use Source: B07C2NQSPV


Church-Turing Thesis – 1936 A.D.

Return to Timeline of the History of Computers


Church-Turing Thesis

David Hilbert (1862–1943), Alonzo Church (1903–1995), Alan Turing (1912–1954)

“Computer science theory seeks to answer two fundamental questions about the nature of computers and computation: are there theoretical limits regarding what is possible to compute, and are there practical limits?

American mathematician Alonzo Church and British computer scientist Alan Turing each published an answer to these questions in 1936. They did it by answering a challenge posed by the eminent German mathematician David Hilbert eight years earlier.

Hilbert’s challenge, the Entscheidungsproblem (German for “decision problem”), asked if there was a mathematical procedure—an algorithm—that could be applied to determine if any given mathematical proposition was true or false. Hilbert had essentially asked if the core work of mathematics, the proving of theorems, could be automated.

Church answered Hilbert by developing a new way of describing mathematical functions and number theory called the Lambda calculus. With it, he showed that the Entscheidungsproblem could not be solved in general: there was no general algorithmic procedure for proving or disproving theorems. He published his paper in April 1936.

Turing took a radically different approach: he created a mathematical definition of a simple, abstract machine that could perform computation. Turing then showed that such a machine could in principle perform any computation and run any algorithm—it could even simulate the operation of other machines. Finally, he showed that while such machines could compute almost anything, there was no way to know if a computation would eventually complete, or if it would continue forever. Thus, the Entscheidungsproblem was unsolvable.
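The abstract machine Turing defined (a tape, a read/write head, and a finite rule table) can be sketched in a few lines of modern code. The sketch below is our own illustration, not Turing's notation; the rule table implements one concrete example, incrementing a binary number:

```python
# A minimal sketch of a Turing-style machine: a sparse tape, a head
# position, and a table mapping (state, symbol) -> (symbol, move, state).
def run_turing_machine(rules, tape, state="start", halt="halt", max_steps=1000):
    cells = dict(enumerate(tape))  # sparse tape; missing cells read as blank "_"
    pos = 0
    for _ in range(max_steps):
        if state == halt:
            return "".join(cells[i] for i in sorted(cells)).strip("_")
        symbol = cells.get(pos, "_")
        new_symbol, move, state = rules[(state, symbol)]
        cells[pos] = new_symbol
        pos += 1 if move == "R" else -1
    # Echoing the Entscheidungsproblem: in general there is no way to
    # know in advance whether a machine will ever reach its halt state.
    raise RuntimeError("no halt within step limit")

# Example rule table: increment a binary number. Walk right to the end
# of the input, then carry 1s leftward.
rules = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),
    ("carry", "1"): ("0", "L", "carry"),
    ("carry", "0"): ("1", "R", "halt"),
    ("carry", "_"): ("1", "R", "halt"),
}
print(run_turing_machine(rules, "1011"))  # 11 + 1 = 12, i.e. "1100"
```

The step limit stands in for Turing's key insight: a simulator can run any rule table, but no procedure can decide, for every table and input, whether the machine will eventually halt.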

Turing went to Princeton University in September 1936 to study with Church, where the two discovered that the radically different approaches were, in fact, mathematically equivalent. Turing’s paper was published in November 1936; he stayed on and completed his PhD in June 1938, with Church as his PhD advisor.”

SEE ALSO Colossus (1943), EDVAC First Draft Report (1945), NP-Completeness (1971)

Statue of Alan Turing at Bletchley Park, the center of Britain’s codebreaking operations during World War II.

Fair Use Source: B07C2NQSPV


Metropolis by Fritz Lang – 1927 A.D.

Return to Timeline of the History of Computers



Metropolis

Fritz Lang (1890–1976)

“In 1927, German film director Fritz Lang was already visualizing what life in the year 2026 would look like. Technology featured prominently in the cityscape of his black-and-white silent film Metropolis—considered by many to be one of the most influential sci-fi movies of all time. His dystopian vision depicted oppressed workers below ground toiling at mindless, repetitive tasks on machines that ran the city. Above ground was a paradise where the city’s elite lived out indulgent lives. Interpretations of Lang’s technology-driven world can be seen in movies such as Blade Runner.

The plot of Metropolis involves a female robot built to resemble the deceased wife of the city’s leader. Later, the mad scientist who created her transforms the heroine of the story—a nanny named Maria—into the female robot. To complete this transformation, the scientist uses vast amounts of electric energy and futuristic technology.

While the robot embodies humanity’s continued fascination with how advancing technology may impact and integrate into people’s lives, the fact that the robot is female is rare. Most robots featured in fiction and pop culture of that era were male or genderless. This robot—portrayed as the leader’s wife and then Maria—is depicted as a strong, clearly feminine being. The cultural impact of that role has since been seen in numerous female characters and imagery, such as Beyoncé’s “Sweet Dreams” interlude during a world tour, featuring a video of the singer in a robot costume that borrows strongly from “Maria.”

In 2006, Carnegie Mellon University (CMU) inducted Maria into its Robot Hall of Fame. On the Hall of Fame’s website, CMU recognizes the robot Maria as the “singular most powerful image of early science fiction films and continuing inspiration in the creation of female robotic imagery in both science and science fiction.””

SEE ALSO Rossum’s Universal Robots (1920)

A poster for a 1984 rerelease of Metropolis, the 1927 film by German film director Fritz Lang.

Fair Use Source: B07C2NQSPV


Semiconductor Diode – 1874 A.D.

Return to Timeline of the History of Computers


Semiconductor Diode

Michael Faraday (1791–1867), Karl Ferdinand Braun (1850–1918)

“Semiconductors are curious devices: not quite conductors like copper, gold, or silver, not quite insulators like plastic or rubber. In 1833, Michael Faraday discovered that the chemical silver sulfide became a better conductor when heated, unlike metals that lose their conductivity under the same conditions. Separately, in 1874, Karl Ferdinand Braun, a 24-year-old German physicist, discovered that a metal sulfide crystal touched with a metal probe would conduct electricity in only one direction. This “one direction” characteristic is what defines diodes or rectifiers, the simplest electronic components.”

“In 1904 the British chemist John Ambrose Fleming had invented the two-element amplifier, or ‘diode’, and a few months before DeForest the Austrian physicist Robert von Lieben had already built a three-element amplifier, or triode.” (Fair Use: B07XVF5RSP)

“Braun’s discovery was a curiosity until the invention of radio. The diode proved critical in allowing radio to make the transition from wireless telegraphy to the transmission and reception of the human voice. The diode of choice for these early radio sets was frequently called a cat’s whisker diode, because it consisted of a crystal of galena, a form of lead sulfide, in contact with a spring of metal (the “whisker”). By carefully manipulating the pressure and orientation of the metal against the crystal, an operator could adjust the electrical properties of the semiconductor until they were optimal for radio reception. Powered only by the radio waves themselves, a crystal set was only strong enough to faintly produce sounds in an earphone.”

“Crystal radio receivers were used onboard ships and then in homes until they were replaced by new receivers based on vacuum tubes, which could amplify the faint radio waves so that they were strong enough to power a speaker and fill a room with speech or music. But tubes didn’t mark the end of the crystal radio: the devices remained popular for people who couldn’t get tubes—such as on the front lines in World War II—as well as among children learning about electronics. In the 1940s, scientists at Bell Labs turned their attention to semiconductor radios once again in an effort to perfect microwave communications. In the process, they discovered the transistor.”

“Braun went on to make other fundamental contributions to physics and electronics. In 1897, he invented the cathode-ray tube (CRT), which would become the basis of television. He shared the 1909 Nobel Prize with Guglielmo Marconi (1874–1937) “in recognition of their contributions to the development of wireless telegraphy.””

SEE ALSO: Silicon Transistor (1947)

Crystal Detector, made by the Philmore Manufacturing Company. To use this device, the operator would connect a wire to each of the two flanges and press the metal “whisker” into the semiconductor crystal.

Fair Use Sources:

Main: B07C2NQSPV

Secondary: B07XVF5RSP


Binary Arithmetic – 1703 A.D.

Return to Timeline of the History of Computers


Binary Arithmetic

Gottfried Wilhelm Leibniz (1646–1716)

“All information inside a computer is represented as a series of binary digits—0s and 1s—better known as bits. To represent larger numbers—or characters—requires combining multiple binary digits together into binary numbers, also called binary words.

We write decimal numbers with the least significant digit on the right-hand side; each successive digit to the left represents 10 times as much as the previous digit, so the number 123 can be explained as:

123 = 1 × 100 + 2 × 10 + 3 × 1

Which is also equal to:

123 = 1 × 10² + 2 × 10¹ + 3 × 10⁰

Binary numbers work the same way, except that the multiplier is 2 rather than 10. So the number one hundred twenty-three would be written:

1111011 = 1 × 2⁶ + 1 × 2⁵ + 1 × 2⁴ + 1 × 2³ + 0 × 2² + 1 × 2¹ + 1 × 2⁰

Although forms of binary number systems can be traced back to ancient China, Egypt, and India, it was German mathematician Gottfried Wilhelm Leibniz who worked out the rules for binary addition, subtraction, multiplication, and division and then published them in his essay, “Explication de l’arithmétique binaire, qui se sert des seuls caractères 0 & 1; avec des remarques sur son utilité, et sur ce qu’elle donne le sens des anciennes figures chinoises de Fohy” (“Explanation of binary arithmetic, which uses only characters 0 & 1; with remarks about its utility and the meaning it gives to the ancient Chinese figures of Fuxi”).

One of the advantages of binary arithmetic, he wrote, is that there is no need to memorize multiplication tables or to perform trial multiplications to compute divisions: all one needs to do is apply a small set of straightforward rules.

All modern computers use binary notation and perform arithmetic using the same laws that Leibniz first devised.
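The positional expansion and the rule-based addition described above can be illustrated with a short modern sketch (the function names are ours, not Leibniz's):

```python
# Sketch of binary representation and Leibniz-style addition, which
# needs no multiplication tables, only the column rules 0+0=0, 0+1=1,
# 1+0=1, and 1+1=0 carry 1.

def to_binary(n):
    """Repeatedly divide by 2, collecting remainders (least significant first)."""
    digits = ""
    while n > 0:
        digits = str(n % 2) + digits
        n //= 2
    return digits or "0"

def add_binary(a, b):
    """Column-by-column addition of two binary strings, with a carry."""
    a, b = a.zfill(len(b)), b.zfill(len(a))  # pad to equal length
    result, carry = "", 0
    for da, db in zip(reversed(a), reversed(b)):
        total = int(da) + int(db) + carry
        result = str(total % 2) + result
        carry = total // 2
    return ("1" + result) if carry else result

print(to_binary(123))               # "1111011", matching the expansion above
print(add_binary("1111011", "1"))   # 123 + 1 = 124, i.e. "1111100"
```

The carry rule in `add_binary` is exactly the small set of rules Leibniz pointed to: each column is resolved by inspection, with no memorized tables or trial multiplications.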

SEE ALSO Floating-Point Numbers (1914), Binary-Coded Decimal (1944), The Bit (1948)

A table from Gottfried Wilhelm Leibniz’s essay “Explanation of Binary Arithmetic,” published in the Mémoires de l’Académie Royale des Sciences in 1703, shows the rules for adding, subtracting, multiplying, and dividing binary numbers.”

Fair Use Source: B07C2NQSPV