Python Software Engineering

Guido van Rossum – Python Creator

See also: Python, Python Bibliography and Bibliography of Python Libraries and Web Frameworks, Python Programming Courses

Guido van Rossum (Dutch: [ˈɣido vɑn ˈrɔsʏm, -səm]; born 31 January 1956) is a Dutch programmer best known as the creator of the Python programming language, for which he was the “Benevolent dictator for life” (BDFL) until he stepped down from the position in July 2018.[5][6] He remained a member of the Python Steering Council through 2019, and withdrew from nominations for the 2020 election.[7] (WP)


Fair Use Sources:

Artificial Intelligence Cloud Data Science - Big Data Hardware and Electronics History Networking Operating Systems Software Engineering

Timeline of the History of Computers

Return to History or This Year in History

c. 2500 BC – Sumerian Abacus

c. 700 BC – Scytale

c. 150 BC – Antikythera Mechanism

c. 60 – Programmable Robot

c. 850 – On Deciphering Cryptographic Messages

c. 1470 – Cipher Disk

1613 – First Recorded Use of the Word Computer

1621 – Slide Rule

1703 – Binary Arithmetic

1758 – Human Computers Predict Halley’s Comet

1770 – The “Mechanical Turk”

1792 – Optical Telegraph

1801 – The Jacquard Loom

1822 – The Difference Engine

1833 – Michael Faraday discovered silver sulfide became a better conductor when heated

1836 – Electrical Telegraph

1843 – Ada Lovelace Writes a Computer Program

1843 – Fax Machine Patented

1843 – Edgar Allan Poe’s “The Gold-Bug”

1849 to early 1900s – Silicon Valley After the Gold Rush

1851 – Thomas Arithmometer

1854 – Boolean Algebra

1864 – First Electromagnetic Spam Message

1870 – Mitsubishi founded

1874 – Baudot Code

1874 – Semiconductor Diode conceived of

1876 – Ericsson Corporation founded in Sweden

1885 – Stanford University

1885 – William Burroughs’ adding machine

1890 – Herman Hollerith Tabulating the US Census

1890 – Toshiba founded in Japan

1891 – Strowger Step-by-Step Switch

1898 – Nippon Electric Limited Partnership – NEC Corporation founded in Japan

1890s to 1930s – Radio Engineering

Early 1900s – Electrical Engineering

1904 – “Diode” or Two-Element Amplifier actually invented

1904 – Three-Element Amplifier or “Triode”

1906 – Vacuum Tube or “Audion”

1907 – Lee DeForest coins the term “radio” to refer to wireless transmission when he formed his DeForest Radio Telephone Company

1909 – Charles Herrold in San Jose started first radio station in USA with regularly scheduled programming, including songs, using an arc transmitter of his own design. Herrold was one of Stanford’s earliest students and founded his own College of Wireless and Engineering in San Jose

1910 – Radio Broadcasting business pioneered by Lee DeForest with broadcast from New York of a live performance by Italian tenor Enrico Caruso

1910 – Hitachi founded in Japan

1912 – Sharp Corporation founded in Japan and takes its name from one of its founder’s first inventions, the Ever-Sharp mechanical pencil

1914 – Floating-Point Numbers

1917 – Vernam Cipher

1918 – Panasonic, then Matsushita Electric, founded in Japan

1920 – Rossum’s Universal Robots

1927 – Fritz Lang’s Metropolis

1927 – First LED

1928 – Electronic Speech Synthesis

1930 – The Enigma Machine

1931 – Differential Analyzer

1935 – Fujitsu founded as Fuji Telecommunications Equipment Manufacturing in Japan. Fujitsu is the second oldest IT company after IBM and before Hewlett-Packard

1936 – Church-Turing Thesis

1939 – Hewlett-Packard founded in a one-car garage in Palo Alto, California by Bill Hewlett and David Packard

1939 – Toshiba founded in Japan

1941 – Z3 Computer

1942 – Atanasoff-Berry Computer

1942 – Isaac Asimov’s Three Laws of Robotics

1942 – Seiko Corporation founded in Japan



1944 – Delay Line Memory

1944 – Binary-Coded Decimal

1945 – Vannevar Bush’s “As We May Think”

1945 – EDVAC First Draft Report – The von Neumann architecture

1946 – Trackball

1946 – Williams Tube Random Access Memory

1947 – Actual Bug Found – First “debugging”

1947 – William Shockley’s Silicon Transistor

1948 – The Bit – Binary Digit 0 or 1

1948 – Curta Calculator

1948 – Manchester SSEM

1949 – Whirlwind Computer

1950 – Error-Correcting Codes (ECC)

1951 – Turing Test of Artificial Intelligence (AI)

1951 – Magnetic Tape Used for Computers

1951 – Core Memory

1951 – Microprogramming

1952 – Computer Speech Recognition

1953 – First Transistorized Computer

1955 – Artificial Intelligence (AI) Coined

1955 – Computer Proves Mathematical Theorem

1956 – First Disk Storage Unit

1956 – The Byte

1956 – Robby the Robot from Forbidden Planet

1957 – FORTRAN Programming Language

1957 – First Digital Image

1958 – The Bell 101 Modem

1958 – SAGE Computer Operational

1959 – IBM 1401 Computer

1959 – DEC PDP-1

1959 – Quicksort Algorithm

1959 – SABRE Airline Reservation System

1960 – COBOL Programming Language

1960 – Recommended Standard 232 (RS-232)

1961 – ANITA Electronic Calculator

1961 – Unimate – First Mass-Produced Robot

1961 – Time-Sharing – The Original “Cloud Computing”

1961 – Shinshu Seiki Company founded in Japan (now called Seiko Epson Corporation) as a subsidiary of Seiko to supply precision parts for Seiko watches.

1962 – Spacewar! Video Game

1962 – Virtual Memory

1962 – Digital Long Distance Telephone Calls

1963 – Sketchpad Interactive Computer Graphics

1963 – ASCII Character Encoding

1963 – Seiko Corporation in Japan developed world’s first portable quartz timer (Seiko QC-951)

1964 – RAND Tablet Computer

1964 – Teletype Model 33 ASR

1964 – IBM System/360 Mainframe Computer

1964 – BASIC Programming Language

1965 – First Liquid-Crystal Display (LCD)

1965 – Fiber Optics – Optical-Fiber

1965 – DENDRAL Artificial Intelligence (AI) Research Project

1965 – ELIZA – The First “Chatbot”

1965 – Touchscreen

1966 – Star Trek Premieres

1966 – Dynamic RAM

1966 – Linear predictive coding (LPC) proposed by Fumitada Itakura of Nagoya University and Shuzo Saito of Nippon Telegraph and Telephone (NTT).[71]

1967 – Object-Oriented Programming

1967 – First ATM Machine

1967 – Head-Mounted Display

1967 – Programming for Children

1967 – The Mouse

1968 – Carterfone Decision

1968 – Software Engineering

1968 – HAL 9000 Computer from 2001: A Space Odyssey

1968 – First “Spacecraft” “Guided by Computer”

1968 – Cyberspace Coined—and Re-Coined

1968 – Mother of All Demos

1968 – Dot Matrix Printer – Shinshu Seiki (now called Seiko Epson Corporation) launched the world’s first mini-printer, the EP-101 (“EP” for Electronic Printer), which was soon incorporated into many calculators

1968 – Interface Message Processor (IMP)

1969 – ARPANET / Internet

1969 – Digital Imaging

1969 – Network Working Group Request for Comments (RFC): 1

1969 – Utility Computing – Early “Cloud Computing”

1969 – Perceptrons Book – Dark Ages of Neural Networks Artificial Intelligence (AI)

1969 – UNIX Operating System

1969 – Seiko Epson Corporation in Japan developed world’s first quartz watch timepiece (Seiko Quartz Astron 35SQ)

1970 – Fair Credit Reporting Act

1970 – Relational Databases

1970 – Floppy Disk

1971 – Laser Printer

1971 – NP-Completeness

1971 – @Mail Electronic Mail

1971 – First Microprocessor – General-Purpose CPU – “Computer on a Chip”

1971 – First Wireless Network

1972 – C Programming Language

1972 – Cray Research Supercomputers – High-Performance Computing (HPC)

1972 – Game of Life – Early Artificial Intelligence (AI) Research

1972 – HP-35 Calculator

1972 – Pong Game from Atari – Nolan Bushnell

1973 – First Cell Phone Call

1973 – Danny Cohen first demonstrated a form of packet voice as part of a flight simulator application, which operated across the early ARPANET.[69][70]

1973 – Xerox Alto from Xerox Palo Alto Research Center (PARC)

1973 – Sharp Corporation produced the first LCD calculator

1974 – Data Encryption Standard (DES)

1974 – The Institute of Electrical and Electronics Engineers (IEEE) publishes a paper entitled “A Protocol for Packet Network Interconnection”.[82]

1974 – Network Voice Protocol (NVP) tested over ARPANET in August 1974, carrying barely audible 16 kbps CVSD-encoded voice.[71]

1974 – The first successful real-time conversation over ARPANET achieved using 2.4 kbps LPC, between Culler-Harrison Incorporated in Goleta, California, and MIT Lincoln Laboratory in Lexington, Massachusetts.[71]

1974 – First Personal Computer: The Altair 8800 Invented by MITS in Albuquerque, New Mexico

1975 – Colossal Cave Adventure – Text-based “Video” Game

1975 – The Shockwave Rider SciFi Book – A Prelude of the 21st Century Big Tech Police State

1975 – AI Medical Diagnosis – Artificial Intelligence in Medicine

1975 – BYTE Magazine

1975 – Homebrew Computer Club

1975 – The Mythical Man-Month

1975 – The name Epson was coined for the next generation of printers based on the EP-101, which was released to the public (Epson: “son of the Electronic Printer”).[7] Epson America Inc. was established to sell printers for Shinshu Seiki Co.

1976 – Public Key Cryptography

1976 – Acer founded

1976 – Tandem NonStop

1976 – Dr. Dobb’s Journal

1977 – RSA Encryption

1977 – Apple II Computer

The TRS-80 Model I pictured alongside the Apple II and the Commodore PET 2001-8. These three computers constitute what Byte Magazine called the “1977 Trinity” of home computing.

1977 – Danny Cohen and Jon Postel of the USC Information Sciences Institute, and Vint Cerf of the Defense Advanced Research Projects Agency (DARPA), agree to separate IP from TCP, and create UDP for carrying real-time traffic.

1978 – First Internet Spam Message

1978 – France’s Minitel Videotext

1979 – Secret Sharing for Encryption

1979 – Dan Bricklin Invents VisiCalc Spreadsheet

1980 – Timex Sinclair ZX80 Computer

1980 – Flash Memory

1980 – RISC Microprocessors – Reduced Instruction Set Computer CPUs

1980 – Commercially Available Ethernet Invented by Robert Metcalfe of 3Com

1980 – Usenet

1981 – IBM Personal Computer – IBM PC

1981 – Simple Mail Transfer Protocol (SMTP) Email

1981 – Japan’s Fifth Generation Computer Systems

1982 – Sun Microsystems was founded on February 24, 1982.[2]

1982 – AutoCAD

1982 – First Commercial UNIX Workstation

1982 – PostScript

1982 – Microsoft and the IBM PC Clones

1982 – First CGI Sequence in Feature Film – Star Trek II: The Wrath of Khan

1982 – National Geographic Moves the Pyramids – Precursor to Photoshop

1982 – Secure Multi-Party Computation

1982 – TRON Movie

1982 – Home Computer Named Machine of the Year by Time Magazine

1983 – The Qubit – Quantum Computers

1983 – WarGames

1983 – 3-D Printing

1983 – Computerization of the Local Telephone Network

1983 – First Laptop

1983 – MIDI Computer Music Interface

1983 – Microsoft Word

1983 – Nintendo Entertainment System – Video Games

1983 – Domain Name System (DNS)

1983 – IPv4 Flag Day – TCP/IP

1984 – Text-to-Speech (TTS)

1984 – Apple Macintosh

1984 – VPL Research, Inc. – Virtual Reality (VR)

1984 – Quantum Cryptography

1984 – Telebit TrailBlazer Modems Break 9600 bps

1984 – Verilog Language

1984 – Dell founded by Michael Dell

1984 – Cisco Systems was founded in December 1984

1985 – Connection Machine – Parallelization

1985 – First Computer-Generated TV Host – Max Headroom – CGI

1985 – Zero-Knowledge Mathematical Proofs

1985 – FCC Approves Unlicensed Wireless Spread Spectrum

1985 – NSFNET National Science Foundation “Internet”

1985 – Desktop Publishing – with Macintosh, Aldus PageMaker, LaserJet, LaserWriter and PostScript

1985 – Field-Programmable Gate Array (FPGA)

1985 – GNU Manifesto from Richard Stallman

1985 – AFIS Stops a Serial Killer – Automated Fingerprint Identification System

1986 – Software Bug Fatalities

1986 – Pixar Animation Studios

1986 – D-Link Corporation founded in Taipei, Taiwan

1987 – Digital Video Editing

1987 – GIF – Graphics Interchange Format

1988 – MPEG – Moving Picture Experts Group – Coding-Compressing Audio-Video

1988 – CD-ROM

1988 – Morris Worm Internet Computer Virus

1988 – Linksys founded

1989 – World Wide Web-HTML-HTTP Invented by Tim Berners-Lee

1989 – Asus was founded in Taipei, Taiwan

1989 – SimCity Video Game

1989 – ISP Provides Internet Access to the Public

1990 – GPS Is Operational – Global Positioning System

1990 – Digital Money is Invented – DigiCash – Precursor to Bitcoin

1991 – Pretty Good Privacy (PGP)

1991 – DARPA’s Report “Computers at Risk: Safe Computing in the Information Age”

1991 – Linux Kernel Operating System Invented by Linus Torvalds

1992 – Boston Dynamics Robotics Company Founded

1992 – JPEG – Joint Photographic Experts Group

1992 – First Mass-Market Web Browser NCSA Mosaic Invented by Marc Andreessen

1992 – Unicode Character Encoding

1993 – Apple Newton

1994 – First Banner Ad – Wired Magazine

1994 – RSA-129 Encryption Cracked

1995 – DVD

1995 – E-Commerce Startups – eBay, Amazon and DoubleClick Launched

1995 – AltaVista Web Search Engine

1995 – Gartner Hype Cycle

1996 – Universal Serial Bus (USB)

1996 – Juniper Networks founded

1997 – IBM Computer Is World Chess Champion

1997 – PalmPilot

1997 – E Ink

1998 – Diamond Rio MP3 Player

1998 – Google

1999 – Collaborative Software Development

1999 – Blog Is Coined

1999 – Napster P2P Music and File Sharing

2000 – USB Flash Drive

2000 – Sharp Corporation’s Mobile Communications Division created the world’s first commercial camera phone, the J-SH04, in Japan

2000 – Fortinet founded

2001 – Wikipedia

2001 – Apple iTunes

2001 – Advanced Encryption Standard (AES)

2001 – Quantum Computer Factors “15”

2002 – Home-Cleaning Robot

2003 – CAPTCHA

2004 – Product Tracking

2004 – Facebook

2004 – First International Meeting on Synthetic Biology

2005 – Video Game Enables Research into Real-World Pandemics

2006 – Apache Hadoop Makes Big Data Possible

2006 – Differential Privacy

2007 – Apple iPhone

2008 – Bitcoin

2010 – Air Force Builds Supercomputer with Gaming Consoles

2010 – Cyber Weapons

2011 – Smart Homes via the Internet of Things (IoT)

2011 – IBM Watson Wins Jeopardy!

2011 – World IPv6 Day

2011 – Social Media Enables the Arab Spring

2012 – DNA Data Storage

2013 – Algorithm Influences Prison Sentence

2013 – Subscription Software “Popularized”

2014 – Data Breaches

2014 – Over-the-Air Vehicle Software Updates

2015 – Google Releases TensorFlow

2016 – Augmented Reality Goes Mainstream

2016 – Computer Beats Master at Game of Go

~2050 – Artificial General Intelligence (AGI)

~9999 – The Limits of Computation?


Fair Use Sources:

Hardware and Electronics History Networking


Return to Timeline of the History of Computers

Telefonaktiebolaget L M Ericsson (lit. Telephone Stock Company L.M. Ericsson), commonly known as Ericsson, is a Swedish multinational networking and telecommunications company headquartered in Stockholm. The company offers services, software and infrastructure in information and communications technology for telecommunications operators, traditional telecommunications and Internet Protocol (IP) networking equipment, mobile and fixed broadband, operations and business support services, cable television, IPTV, video systems, and an extensive services operation.

Ericsson had a 27% market share in the 2G/3G/4G mobile network infrastructure market in 2018, thus being the largest such non-Chinese company.[3]

The company was founded in 1876 by Lars Magnus Ericsson[4] and was taken over by the Wallenberg family in 1960; today, the family, through its holding company Investor AB, owns a controlling 22.53% voting power. As of 2016 it is headquartered in Stockholm, Sweden. The company employs around 95,000 people and operates in around 180 countries.[5][6] Ericsson holds over 49,000 granted patents as of September 2019, including many in wireless communications.[7] Ericsson is the inventor of Bluetooth technology.[8] Ericsson leads the implementation of 5G worldwide, partly through the use of massive MIMO technology.[9][10]

Fair Use Sources:


Radio Engineering – 1890s to 1930s AD

Return to Timeline of the History of Computers

Radio Engineering

“At the same time, San Francisco’s inhabitants showed a voracious interest in the radio technology invented in Europe at the turn of the century. The Italian inventor Guglielmo Marconi, then in Britain, had galvanized the sector with his long-distance radio transmissions, beginning in 1897 and culminating with the radio message from the US President Theodore Roosevelt to the British king Edward VII of 1903. Marconi’s company set up radio stations on both sides of the Atlantic to communicate with ships at sea. However, it was not yet trivial how to create a wireless communication system.”

“In 1906 an independent with a degree from Yale, Lee DeForest, had built a vacuum tube in New York without quite understanding its potential as a signal amplifier. In fact his invention, the “audion”, was useful to amplify electrical signals, and therefore to wireless transmissions. (In 1904 the British chemist John Ambrose Fleming had invented the two-element amplifier, or “diode”, and a few months before DeForest the Austrian physicist Robert von Lieben had already built a three-element amplifier, or “triode”). In 1910 DeForest moved to San Francisco and got into radio broadcasting, a business that he had pioneered in January when he had broadcast from New York a live performance by legendary Italian tenor Enrico Caruso. In fact, DeForest is the one who started using the term “radio” to refer to wireless transmission when he formed his DeForest Radio Telephone Company in 1907. However, his early broadcasts did not use the audion yet. Interest in radio broadcasting was high in the Bay Area, even if there were no mass-produced radios yet. A year earlier, in 1909, Charles Herrold in San Jose had started the first radio station in the US with regularly scheduled programming, including songs, using an arc transmitter of his own design. Charles Herrold had been one of Stanford’s earliest students and founded his own College of Wireless and Engineering in San Jose.

The Bay Area stumbled into electronics almost by accident. In 1909 another Stanford alumnus, Cyril Elwell, had founded the Poulsen Wireless Telephone and Telegraph Company in Palo Alto, later renamed the Federal Telegraph Corporation (FTC), to commercialize a new European invention. In 1903 the Danish engineer Valdemar Poulsen invented an arc transmitter for radio transmission, but no European company was doing anything with it. Elwell understood its potential was not only technological but also legal: it allowed him to create radio products without violating Marconi’s patents. Elwell acquired the US rights for the Poulsen arc. His radio technology, adequately funded by a group of San Francisco investors led by Beach Thompson, blew away the competition of the East Coast. In 1912 he won a contract with the Navy, which was by far the biggest consumer of radio communications. Thus commercial radiotelegraphy developed first in the US. The “startup” was initially funded by Stanford’s own President, David Starr Jordan, and employed Stanford students, notably Edwin Pridham. Jordan had just inaugurated venture-capital investment in the region.

In need of better receiver amplifiers for the arc transmissions, FTC hired Lee DeForest, who by 1912 had finally realized that his audion could be used as an amplifier. The problem with long-distance telephone and radio transmissions was that the signal was lost en route as it became too faint. DeForest’s vacuum tube enabled the construction of repeaters that restored the signal at intermediate points. The audion could dramatically reduce the cost of long-distance wireless communications. FTC began applying the audion to develop a geographically distributed radiotelegraphy system. The first tower they had built, in July 1910, was on a San Francisco beach and it was 90 meters tall. Yet the most impressive of all was inaugurated in 1912 at Point San Bruno (just south of the city), a large complex boasting the tallest antenna in the world (130 meters).

By the end of 1912 FTC had grown; it had stations in Texas, Hawaii, Arizona, Missouri and Washington besides California. However, the Poulsen arc remained the main technology for radiotelephony (voice transmission) and, ironically, FTC was no longer in that business. Improvements to the design by recent Cornell graduate Leonard Fuller (mostly during World War I, when the radio industry was nationalized to produce transmitters for the Navy) that allowed the audion to amplify a signal a million times eventually led FTC to create the first global wireless communication system. The audion was still used only for receivers, while most transmitters were arc-based. It was only in 1915 that DeForest realized that a feedback loop of audions could be used to build transmitters as well. DeForest had already (in 1913) sold the patent for his audion to Graham Bell’s AT&T in New York, and AT&T had already used it to set up the first coast-to-coast telephone line (January 1915), just in time for the Panama-Pacific International Exposition. Meanwhile, DeForest had moved to New York. There, in 1916, he stunned the nation by broadcasting the results of the presidential elections with music and commentary from New York to stations within a range of 300 kilometers, and this time using an audion transmitter. Radiotelephony would switch from the Poulsen arc to his audion during the 1920s. In due time Leonard Fuller took Elwell’s place as chief engineer of FTC, and in 1920 Navy engineer and former Marconi engineer Haraden Pratt was hired to launch commercial wireless telegraph service, and sugar magnate Rudolph Spreckels bought control of FTC.

The wireless industry was booming throughout the US, aided by sensational articles in the mainstream press. Earle Ennis had opened a company (Western Wireless Equipment Company) to sell wireless equipment for ships. He also ran a radio broadcast to deliver news to ships at sea. In 1910 he organized the first air-to-ground radio message, thus showing that the same technology could be used by the nascent airline industry.

Because of its maritime business, the Bay Area became one of the largest centers for amateur radio. The Bay Counties Wireless Telegraph Association was founded in 1907 by (then) amateurs such as Haraden Pratt, Ellery Stone and Lewis Clement.

Quite a bit of innovation in radio engineering came from the “ham” radio amateurs. The first wireless communications were, by definition, done by independents who set up their own equipment. This was the first “virtual” community as they frequently never met in person. The first magazine devoted to radio engineering, Modern Electrics, was launched in April 1908 in New York by Hugo Gernsback, a 24-year-old Jewish immigrant from Luxembourg. It reached a circulation of 52,000 in 1911, the year when it started publishing science-fiction stories (thus also becoming de facto the first science-fiction magazine). Amateur wireless associations popped up throughout the country, such as the Radio Club of Salt Lake City in Utah, founded in September 1909, and the Wireless Association of Central California, formed in May 1910 in Fresno. From a social point of view, the beauty of ham radio was that it blurred class boundaries: they were known by codes such as 6ZAF, not by their last names, and it made no difference whether they were rural teenagers, Stanford PhD students or professional radio engineers. They were all on the same level.

Among the amateurs of the second decade were Charlie Litton, an eleven-year old prodigy who operated an amateur station in Redwood City in 1915, and Frederick Terman, a teenager who operated an amateur station in Palo Alto in 1917. Some of those amateurs went on to create small companies. Little did they know that their hobby would in time of war constitute a strategic industry for the Air Force, Navy and Army: during World War I (in 1918) Elwell’s technology would be a pillar of naval communications for the US. The Navy had set up radio stations all over the place. In January 1918 the President of the US, Woodrow Wilson, proudly spoke live to Europe, the Far East and Latin America.

Magnavox Corp. was founded in 1910 in Napa (north of the bay). It was the brainchild of Peter Jensen (one of the Danish engineers imported by FTC to commercialize the Poulsen arc) and Edwin Pridham (a Stanford graduate who also worked at FTC). In 1917 they introduced a new type of electrical loudspeaker.

Alas, after World War I it became obvious that radio technology was strategic, and it couldn’t be left in the hands of West-Coast independents. The US government basically forced a large East-Coast company, General Electric, to buy the US business of Marconi. The US government also helped the new company to acquire the most important radio patents. Thus a new giant, RCA, was born and soon became the dominant player in consumer electronics, as the number of radios grew from 5,000 in 1920 to 25 million in 1924. Hence FTC was doomed and other Bay Area-based radio companies had to live with only military applications.

Ham-radio amateurs were the first “garage nerds” of the San Francisco Bay Area, a place isolated from the rest of the country (reaching any other city required a long journey by ship, by train or by coach). Bill Eitel presided the Santa Clara County Amateur Radio Association, formed in 1921, before he went on to launch his own “startup”. The First National Radio Conference took place in Washington in February 1922, and it pitted the five big corporations that owned all the patents (American Telephone & Telegraph, General Electric, Western Electric, Westinghouse and RCA) against the ham-radio amateur clubs. That conference established their legal legitimacy. A few weeks later, in April 1922, the first transpacific two-way amateur communication was established between 6ZAC (Clifford Down) in Hawaii and 6ZAF (A.H. Babcock) in Berkeley. The ham-radio operators became heroes in countless cases of natural disasters, especially in the Western states, at a time when there was no other way to communicate rapidly with the aid workers. A teenager, known as 6BYQ, sent out the first alarm when a dam broke in 1928 in Santa Paula, near Los Angeles, causing a flood that caused massive destruction. Ham-radios helped in September 1932 when a landslide wiped out the mining town of Tehachapi, east of Los Angeles, and in March 1933 when an earthquake struck Long Beach, south of Los Angeles. Ham-radios were the first “consumers” of the vacuum tubes made in the Bay Area.

Radio engineering created two worlds in the Bay Area that would greatly influence its future: a high-tech industry and a community of high-tech amateurs.

SEE ALSO: Electrical Engineering – Early 1900s

Fair Use Sources:


Artificial Intelligence Cloud Data Science - Big Data History Software Engineering

The Limits of Computation? – ~9999 AD

Return to Timeline of the History of Computers


The Limits of Computation?

Seth Lloyd (b. 1960)

“Each generation of technology has seen faster computations, larger storage systems, and improved communications bandwidth. Nevertheless, physics may impose fundamental limits on computing systems that cannot be overcome. The most obvious limit is the speed of light: a computer in New York City will never be able to request a web page from a server in London and download the results with a latency of less than 0.01 seconds, because light takes 0.0186 seconds to travel the 5,585 kilometers each direction, consistent with Einstein’s Theory of Special Relativity. On the other hand, recently some scientists have claimed that they can send information without sending light particles by using quantum entanglement, something Einstein dismissively called spooky action at a distance. Indeed, in 2013, scientists in China measured the speed of information propagation due to quantum entanglement and found that it was at least 10,000 times faster than the speed of light.

Computation itself may also have a fundamental limit, according to Seth Lloyd, a professor of mechanical engineering and physics at MIT. In 2000, Lloyd showed that the ultimate speed of a computer was limited by the energy that it had available for calculations. Assuming that the computations would be performed at the scale of individual atoms, a central processor of 1 kilogram occupying the volume of 1 liter has a maximum speed of 5.4258 × 10^50 operations per second—roughly 10^41, or a billion billion billion billion times faster than today’s laptops.

Such speeds may seem unfathomable today, but Lloyd notes that if computers double in speed every two years, then this is only 250 years of technological progress. Lloyd thinks that such technological progress is unlikely. On the other hand, in 1767, the fastest computers were humans.

Because AI is increasingly able to teach and train itself across all technological and scientific domains—doing so at an exponential rate while sucking in staggering amounts of data from an increasingly networked and instrumented world—perhaps it is appropriate that a question mark be the closing punctuation for the title of this entry.”
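The figures in the passage above are easy to sanity-check. A minimal sketch, assuming the quoted 5,585 km New York–London distance and a laptop-class speed of about 10^10 operations per second (the laptop figure is an assumption chosen for illustration):

```python
import math

# 1. Speed-of-light latency: New York <-> London at c in vacuum.
C_KM_PER_S = 299_792.458
DISTANCE_KM = 5_585
one_way_s = DISTANCE_KM / C_KM_PER_S
print(f"one-way light latency: {one_way_s:.4f} s")  # ~0.0186 s, so a
# request/response round trip can never beat 2 * 0.0186 ≈ 0.037 s.

# 2. Lloyd's bound vs. doubling computer speed every two years.
LLOYD_BOUND = 5.4258e50  # ops/s for a 1 kg, 1 L computer (Lloyd, 2000)
LAPTOP_OPS = 1e10        # assumed laptop-class speed, ops/s
doublings = math.log2(LLOYD_BOUND / LAPTOP_OPS)
print(f"{doublings:.0f} doublings ≈ {2 * doublings:.0f} years")
# ~135 doublings ≈ ~270 years: the same order as the ~250 years quoted,
# with the exact figure depending on the assumed laptop speed.
```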

SEE ALSO Sumerian Abacus (c. 2500 BCE), Slide Rule (1621), The Difference Engine (1822), ENIAC (1943), Quantum Cryptography (1984)

Based on our current understanding of theoretical physics, a computer operating at the maximum speed possible would not be physically recognizable by today’s standards. It would probably appear as a sphere of highly organized mass and energy.

Fair Use Sources: B07C2NQSPV

Lloyd, Seth. “Ultimate Physical Limits to Computation.” Nature 406, no. 8 (August 2000): 1047–54.

Yin, Juan, et al. “Bounding the Speed of ‘Spooky Action at a Distance.’” Physical Review Letters 110, no. 26 (2013).

Data Science - Big Data History

DNA Data Storage – 2012 AD

Return to Timeline of the History of Computers


DNA Data Storage

George Church (b. 1954), Yuan Gao (dates unavailable), Sriram Kosuri (dates unavailable), Mikhail Neiman (1905–1975)

“In 2012, George Church, Yuan Gao, and Sriram Kosuri, all with the Harvard Medical School’s Department of Genetics, announced that they had successfully stored 5.27 megabits of digitized information in strands of deoxyribonucleic acid (DNA), the biological molecule that is the carrier of genetic information. The stored information included a 53,400-word book, 11 JPEG images, and a JavaScript program. The following year, scientists at the European Bioinformatics Institute (EMBL-EBI) successfully stored and retrieved an even larger amount of data in DNA, including a 26-second audio clip of Martin Luther King’s “I Have a Dream” speech, 154 Shakespeare sonnets, the famous Watson and Crick paper on DNA structure, a picture of EMBL-EBI headquarters, and a document that described the methods the team used to accomplish the experiment.

Although first demonstrated in 2012, the concept of using DNA as a recording, storage, and retrieval mechanism goes back to 1964, when a physicist named Mikhail Neiman published the idea in the Soviet journal Radiotekhnika.

To accomplish this storage and retrieval, first a digital file represented as 1s and 0s is converted to the letters A, C, G, and T. These letters are the four chemical bases that make up DNA. The resulting long string of letters is then used to manufacture synthetic DNA molecules, with the sequence of the original bits corresponding to the sequence of nucleic acids. To decode the DNA and reconstitute the digital file, the DNA is put through a sequencing machine that translates the letters back into the original 1s and 0s of the original digital files. Those files can then be displayed on a screen, played through a speaker, or even run on a computer’s CPU.
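The bit-to-base conversion described above can be sketched as a simple round trip. The 2-bits-per-base mapping below (00→A, 01→C, 10→G, 11→T) is an illustrative assumption, not the actual 2012 encoding; Church's team used a sparser one-bit-per-base scheme, and real systems add addressing and error correction.

```python
# Illustrative DNA storage round trip: bytes -> A/C/G/T string -> bytes.
# The 2-bits-per-base mapping is an assumption chosen for simplicity.
ENC = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
DEC = {base: bits for bits, base in ENC.items()}

def to_dna(data: bytes) -> str:
    """Encode each byte as four bases, most-significant bits first."""
    return "".join(ENC[(b >> s) & 0b11] for b in data for s in (6, 4, 2, 0))

def from_dna(dna: str) -> bytes:
    """Decode each group of four bases back into one byte."""
    out = bytearray()
    for i in range(0, len(dna), 4):
        byte = 0
        for base in dna[i:i + 4]:
            byte = (byte << 2) | DEC[base]
        out.append(byte)
    return bytes(out)

message = b"DNA"
strand = to_dna(message)
print(strand)                       # prints "CACACATGCAAC"
assert from_dna(strand) == message  # lossless round trip
```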

In the future, DNA could allow digital archives to reliably store vast amounts of digitized data: a single gram of DNA has the potential to store 215 million gigabytes of data, allowing all the world’s information to be stored in a space the size of a couple of shipping containers.”

SEE ALSO Magnetic Tape Used for Computers (1951), DVD (1995)

To store information in DNA, a digital file represented as 1s and 0s is converted to the letters A, C, G, and T, the four chemical bases that make up DNA.

Fair Use Sources: B07C2NQSPV

DevSecOps-Security-Privacy History

Advanced Encryption Standard (AES) – 2001 AD

Return to Timeline of the History of Computers


Advanced Encryption Standard

Vincent Rijmen (b. 1970), Joan Daemen (b. 1965)

“After the US government adopted the Data Encryption Standard (DES) in 1977, it quickly became the most widely used encryption algorithm in the world. But from the start, there were concerns about the algorithm’s security. DES had an encryption key of just 56 bits, which meant there were only 72,057,594,037,927,936 possible encryption keys, leaving experts to speculate whether anyone with the means had built special-purpose computers for cracking DES-encrypted messages.

DES had other problems. Because the algorithm was designed to be implemented in hardware, software implementations were surprisingly slow. As a result, many academic cryptographers proposed new ciphers in the 1980s and 1990s. These algorithms found increasing use—in web browsers, for instance—but none had the credence that came with having gone through the government’s standards-making process.

So, in 1997, the US National Institute of Standards and Technology (NIST) announced a multiyear competition to decide upon the nation’s next encryption standard. NIST invited cryptographers all over the world to submit not only their best algorithms, but their recommendations for how the algorithms should be evaluated.

Adding another nail to the DES coffin, in 1998 the Electronic Frontier Foundation (EFF), a tiny civil liberties organization, announced that it had built one of those mythical DES-cracking machines, and for less than $250,000. Called Deep Crack, the machine could try 90 billion DES keys a second, allowing it to crack, on average, a DES-encrypted message in just 4.6 days.
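The Deep Crack figures quoted above can be checked with a little arithmetic: a brute-force search examines, on average, half the key space before hitting the right key.

```python
# Back-of-the-envelope check of the Deep Crack numbers quoted above.
DES_KEYS = 2 ** 56          # 72,057,594,037,927,936 possible 56-bit keys
KEYS_PER_SECOND = 90e9      # Deep Crack's quoted search rate

# On average, half the key space is searched before the key is found.
average_seconds = DES_KEYS / 2 / KEYS_PER_SECOND
average_days = average_seconds / 86_400
print(f"{average_days:.1f} days")   # 4.6 days, matching the figure above
```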

In total, there were 15 credible submissions from nine different countries to the NIST contest. After considerable public analysis and three public conferences, the winner was decided in 2001: an algorithm called Rijndael, developed by two Belgian cryptographers, Vincent Rijmen and Joan Daemen. Rijndael is now called the Advanced Encryption Standard (AES). It can be run with 128-bit, 192-bit, or 256-bit keys, allowing for unprecedented levels of security. It can run on tiny 8-bit microcontrollers, and nearly all modern microprocessors now have special AES instructions, allowing them to encrypt at blindingly fast speeds.”

SEE ALSO Data Encryption Standard – DES (1974)

One of the 29 circuit boards from the Electronic Frontier Foundation’s encryption breaking machine, Deep Crack.

Fair Use Sources: B07C2NQSPV

History Software Engineering

ALGOL Programming Language Invented – 1958 AD

Return to Timeline of the History of Computers

ALGOL (/ˈælɡɒl, -ɡɔːl/; short for “Algorithmic Language”)[1] is a family of imperative computer programming languages originally developed in 1958. ALGOL heavily influenced many other languages and was the standard method for algorithm description used by the Association for Computing Machinery (ACM) in textbooks and academic sources for more than thirty years, until object-oriented languages displaced it.[2]

In the sense that the syntax of most modern languages is “Algol-like”,[3] it was arguably the most influential of the four high-level programming languages among which it was roughly contemporary: FORTRAN, Lisp, and COBOL.[4] It was designed to avoid some of the perceived problems with FORTRAN and eventually gave rise to many other programming languages, including PL/I, Simula, BCPL, B, Pascal, and C.

Fair Use Sources:

Data Science - Big Data History Software Engineering

SQL Relational Database Programming Language Invented by Edgar Codd of IBM – 1974 AD

Return to Timeline of the History of Computers

SQL is a relational database programming language, developed at IBM in 1974 and built on Edgar Codd’s relational model of data. It remains one of the most important languages in the programming world today.
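As a minimal illustration of SQL’s declarative style, the sketch below runs a few statements through Python’s built-in sqlite3 module. The table and its contents are invented for the example.

```python
# Minimal SQL demonstration using Python's standard-library sqlite3 module.
# The table name, columns, and rows are invented for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")          # throwaway in-memory database
conn.execute("CREATE TABLE languages (name TEXT, year INTEGER)")
conn.executemany(
    "INSERT INTO languages (name, year) VALUES (?, ?)",
    [("SQL", 1974), ("Prolog", 1972), ("Python", 1991)],
)

# A declarative query: say *what* rows are wanted, not *how* to find them.
rows = conn.execute(
    "SELECT name FROM languages WHERE year < 1980 ORDER BY year"
).fetchall()
print(rows)   # [('Prolog',), ('SQL',)]
conn.close()
```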

Fair Use Sources:

See also: Relational Databases (1970 AD), Microsoft SQL Server (1989)

History Software Engineering

Prolog Programming Language Invented by Alain Colmerauer – 1972 AD

Return to Timeline of the History of Computers

The Prolog programming language was developed by Alain Colmerauer and colleagues in 1972 at the University of Marseilles.

Fair Use Sources:

History Python Software Engineering

Python Programming Language Invented by Guido van Rossum – 1991 AD

Return to Timeline of the History of Computers

Guido van Rossum began development of Python in 1989 and released it to the public in 1991.

Fair Use Sources:

History Software Engineering

First Transistorized Computer – 1953 AD

Return to Timeline of the History of Computers


First Transistorized Computer

Tom Kilburn (1921–2001), Richard Grimsdale (1929–2005), Douglas Webb (b. 1929), Jean H. Felker (1919–1994)

“With the invention of the transistor in 1947, the next step was to use it as a replacement for the vacuum tube. Tubes had a significant advantage compared to relays—they were a thousand times faster—but tubes required an inordinate amount of electricity, produced huge amounts of heat, and failed constantly. Transistors used a fraction of the power, produced practically no heat at all, and were more reliable than tubes. And because transistors were smaller than tubes, a transistorized machine would run inherently faster, because electrons had a shorter distance to move.

The University of Manchester demonstrated its prototype transistorized computer on November 16, 1953. The machine made use of the “point-contact” transistor, a piece of germanium that was in contact with two wires held in very close proximity to each other—the two “points.” The Manchester machine had 92 point-contact transistors and 550 diodes. The system had a word size of 48 bits. (Many of today’s microprocessors can operate on words that are 8, 16, 32, or 64 bits.) A few months later, Jean H. Felker at Bell Labs created the TRADIC (transistor digital computer) for the US Air Force, with 700 point-contact transistors and more than 10,000 diodes.

This point-contact transistor was soon replaced by the bipolar junction transistor, so named because it is formed by a junction involving two kinds of semiconductors. Manchester updated its prototype in 1955 with a new design that used 250 of these junction transistors. Called the Metrovick 950, that computer was manufactured by Metropolitan-Vickers, a British electrical engineering company.

In 1956, the Advanced Development Group at MIT Lincoln Lab used more than 3,000 transistors to build the TX-0 (Transistorized eXperimental computer zero), a transistorized version of the Whirlwind and the forerunner to Digital Equipment Corporation’s (DEC) PDP-1.”

SEE ALSO William Shockley’s Silicon Transistor (1947), Whirlwind (1949), PDP-1 (1959)

Close-up of the prototype of the Manchester transistorized computer.

Fair Use Source: B07C2NQSPV

History Software Engineering

Microprogramming – 1951 AD

Return to Timeline of the History of Computers



Maurice Wilkes (1913–2010)

“By 1951, the basic structure of stored-program computers had been worked out: a central processing unit (CPU) that had registers for storing numbers, an arithmetic logic unit (ALU) for performing mathematical operations, and logic for moving data between the CPU and memory. But the internal design of these early CPUs was a mess. Each instruction was implemented with a different set of wires and circuits, some with components in common, and others with their own individual logic.

British computer scientist Maurice Wilkes realized that the design of the CPU could be made more regular after seeing the design of the Whirlwind, which was controlled by a crisscrossing matrix of wires. Some of the wires were connected by a diode where they crossed. Voltage was applied to each horizontal wire in sequence. If a diode was present, the corresponding vertical wire would be energized and activate different parts of the CPU.

Wilkes realized that each line of the diode matrix in the Whirlwind could be viewed as a set of microoperations that the CPU followed, a kind of “microprogram.” He formalized this idea in a lecture at the 1951 Manchester University Computer Inaugural Conference, immodestly titled “The Best Way to Design an Automatic Calculating Machine.” In the lecture, later published by the university, Wilkes proposed that his idea might seem at once obvious, because it described nothing more than a formalized way of creating a CPU using the same basic wires, diodes, and electronic switches that were already in use, as well as extravagant, because it might use more components than would be used otherwise. But, Wilkes argued, it resulted in a system that was easier to design, test, and extend.
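Wilkes’s diode matrix can be sketched as a table whose rows are microinstructions and whose columns are control lines. The control-line names and the three-row ADD microprogram below are invented for illustration; they are not from any real machine.

```python
# Sketch of Wilkes's idea: each row of the diode matrix is one
# microinstruction, and a 1 (a diode at the crossing) energizes the
# corresponding vertical control line. Names here are hypothetical.
CONTROL_LINES = ["load_A", "load_B", "alu_add", "write_mem"]

# A microprogram for a hypothetical ADD instruction: rows pulsed in order.
ADD_MICROPROGRAM = [
    [1, 0, 0, 0],   # row 0: load first operand into register A
    [0, 1, 0, 0],   # row 1: load second operand into register B
    [0, 0, 1, 1],   # row 2: add, then write the result back to memory
]

def run(microprogram):
    """Apply voltage to each horizontal wire in turn and report which
    control lines the diodes on that row activate."""
    trace = []
    for row in microprogram:
        active = [name for name, diode in zip(CONTROL_LINES, row) if diode]
        trace.append(active)
    return trace

print(run(ADD_MICROPROGRAM))
# [['load_A'], ['load_B'], ['alu_add', 'write_mem']]
```

Extending the instruction set then amounts to adding rows to the matrix rather than wiring new one-off circuits, which is the regularity Wilkes was after.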

Wilkes was right. Microprogramming dramatically simplified the creation of CPUs, allowing instruction sets to become more complex. It also created unexpected flexibility: when IBM released System/360 in 1964, its engineers used microprogramming to allow the new computers to emulate the instructions of the IBM 1401, making it easier for customers to make the transition.”

SEE ALSO Whirlwind (1949), IBM 1401 (1959), IBM System/360 (1964)

“Maurice Wilkes (front left), designer of the EDSAC, one of the earliest stored-program electronic computers.”

Fair Use Source: B07C2NQSPV

History Software Engineering

Manchester SSEM – 1948 AD

Return to Timeline of the History of Computers


Manchester SSEM

Frederic Calland Williams (1911–1977), Tom Kilburn (1921–2001)

“The defining characteristic of the digital computer is that it stores both program and data in a single memory bank. In a modern computer, this arrangement lets one program load a second program into memory and execute it. On the limited-memory machines of the 1950s, intermixing program and data made it possible to squeeze out more functionality by writing programs that literally modified themselves, now called self-modifying code. Modern computers use this ability to load code into the computer’s memory and execute it—the fundamental capability that makes a computer a general-purpose machine. But none of the machines built before the Manchester Small-Scale Experimental Machine (SSEM) were actually digital computers, at least not in the modern sense. Either they were hardwired to perform a particular calculation, like the Atanasoff-Berry Computer; or they read their instructions from some kind of punched tape, like the Konrad Zuse machines; or the program was set on wires and switches, like ENIAC. They were really calculators, not computers.

The SSEM, nicknamed Baby by its creators at the University of Manchester, was built for testing and demonstrating the storage tube that Frederic Williams had designed in 1946 (the Williams tube, an early form of random access memory). Baby filled a 20-foot-square room and consisted of eight racks of equipment, the Williams storage tube, many radio tubes, and meters that reported voltages. Each tube held 1,024 bits. As the program ran and changed what was stored in its memory, the arrangement of dots on the storage tube changed.

Because the program was stored in memory, and relied on self-modifying code, it was easy for Kilburn to make changes. The first program that Baby successfully ran, written by Kilburn, was designed to find the highest factor of 2^18 (262,144). The program ran in 52 minutes and found the right answer: 2^17 (131,072), averaging 1.5 milliseconds per instruction. The original program was just 17 instructions long.
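The Baby’s task is easy to restate in modern terms. The SSEM had no divide instruction and tested divisibility by repeated subtraction; the sketch below uses the modulo operator for brevity but performs the same downward search.

```python
# Restatement of the SSEM's first program: search downward from n - 1
# for the highest proper factor of n. (Kilburn's original used repeated
# subtraction, since the Baby had no division hardware.)
def highest_proper_factor(n: int) -> int:
    for candidate in range(n - 1, 0, -1):
        if n % candidate == 0:
            return candidate
    return 1  # unreachable for n > 1, since 1 divides everything

print(highest_proper_factor(2 ** 18))   # 131072, i.e. 2**17
```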

Arriving at the correct answer was no easy feat. As Williams reportedly stated, “The spots on the display tube entered a mad dance. In early trials, it was a dance of death leading to no useful result . . . But one day it stopped, and there, shining brightly in the expected place, was the expected answer.””

SEE ALSO Z3 Computer (1941), Atanasoff-Berry Computer (1942), Williams Tube (1946)

“Recreation of the Manchester Small-Scale Experimental Machine (a.k.a., the Manchester “Baby”) at the Museum of Science and Industry in Manchester, UK.”

Fair Use Source: B07C2NQSPV


Curta Calculator – 1948 AD

Return to Timeline of the History of Computers


Curta Calculator

Curt Herzstark (1902–1988)

“The Curta is perhaps the most elegant, compact, and functional mechanical calculator ever manufactured. Designed by Austrian engineer Curt Herzstark, it is the only digital mechanical pocket calculator ever invented. Handheld and powered by a crank on the top, the Curta can add, subtract, multiply, and divide.

Curt Herzstark’s father, Samuel Jacob Herzstark, was a highly regarded Austrian importer and manufacturer of mechanical calculators and other precision instruments. Herzstark finished high school and apprenticed at his father’s company, which he took over when his father died in 1937.

At the time, mechanical calculators were big and heavy desktop affairs. After one of Herzstark’s customers complained that he didn’t want to go back to the office just to add up a column of numbers, Herzstark started designing a handheld calculator. He had an early prototype working in January 1938, just two months before Germany invaded and annexed Austria. Despite Herzstark being half-Jewish, the Nazis let him continue to operate the factory, provided that it cease all civilian production and devote itself to creating devices for the Reich.

In 1943, two of Herzstark’s employees were arrested for distributing transcripts of English radio broadcasts; Herzstark was subsequently arrested for aiding the employees and for “indecent contact with Aryan women.” He was sent to the Buchenwald concentration camp, where he was recognized by one of his former employees, who was now a guard. The guard told the head of the camp’s factory about the mechanical calculator. The Germans then instructed Herzstark to finish his project, so that the camp could give the device to Hitler as a present after Germany won the war. That never happened: Buchenwald was liberated on April 11, 1945, and Hitler killed himself 19 days later.

After liberation, Herzstark took the drawings he had done at the camp to a machine shop and had three working prototypes eight weeks later. The first calculators were produced commercially in the fall of 1948.”

SEE ALSO Antikythera Mechanism (c. 150 BCE), Thomas Arithmometer (1851)

“The Curta mechanical calculator, pictured here, is the only digital mechanical pocket calculator ever invented.”

Fair Use Source: B07C2NQSPV