Categories
Artificial Intelligence Cloud Data Science - Big Data History

Limited hangout

“A limited hangout or partial hangout is, according to former special assistant to the Deputy Director of the Central Intelligence Agency Victor Marchetti, “spy jargon for a favorite and frequently used gimmick of the clandestine professionals. When their veil of secrecy is shredded and they can no longer rely on a phony cover story to misinform the public, they resort to admitting—sometimes even volunteering—some of the truth while still managing to withhold the key and damaging facts in the case. The public, however, is usually so intrigued by the new information that it never thinks to pursue the matter further.”[1][2]” (WP)

Modified limited hangout

“In a March 22, 1973, meeting between president Richard Nixon, John Dean, John Ehrlichman, John Mitchell, and H. R. Haldeman, Ehrlichman incorporated the term into a new and related one, “modified limited hangout”.[3][4]

The phrase was coined in the following exchange:[5]” (WP)

PRESIDENT: You think, you think we want to, want to go this route now? And the — let it hang out, so to speak?

DEAN: Well, it’s, it isn’t really that —
HALDEMAN: It’s a limited hang out.
DEAN: It’s a limited hang out.
EHRLICHMAN: It’s a modified limited hang out.

PRESIDENT: Well, it’s only the questions of the thing hanging out publicly or privately.

“Before this exchange, the discussion captures Nixon outlining to Dean the content of a report that Dean would create, laying out a misleading view of the role of the White House staff in events surrounding the Watergate burglary. In Ehrlichman’s words: “And the report says, ‘Nobody was involved,'”. The document would then be shared with the United States Senate Watergate Committee investigating the affair. The report would serve the administration’s goals by protecting the President, providing documentary support for his false statements should information come to light that contradicted his stated position. Further, the group discusses having information on the report leaked by those on the Committee sympathetic to the President, to put exculpatory information into the public sphere.[5]” (WP)

“The phrase has been cited as a summation of the strategy of mixing partial admissions with misinformation and resistance to further investigation, and is used in political commentary to accuse people or groups of following a Nixon-like strategy.[6]” (WP) However, this “strategy” has been used since time immemorial.

“Writing in The Washington Post, Mary McGrory described a statement by Pope John Paul II regarding sexual abuse by priests as a “modified, limited hangout”.[7]” (WP)

References

  1. ^ Victor Marchetti (August 14, 1978) The Spotlight
  2. ^ “720 F2d 631 Hunt v. Liberty Lobby Dc”. OpenJurist. 1983-11-28. Retrieved 2016-07-13.
  3. ^ Frost/Nixon: The Complete Interviews. David Frost, Richard Nixon. Paradine Television, 1977.
  4. ^ Safire, William (26 March 1989). “On Language; In Nine Little Words”. New York Times. Retrieved 23 June 2013.
  5. a b “Transcript of a recording of a meeting among the president, John Dean, John Erlichman, H. R. Haldeman, and John Mitchell on March 22, 1973 from 1:57 to 3:43 p.m.” History and Politics Out Loud. Retrieved 2006-08-27.
  6. ^ Carroll, Jon (2002-05-01). “The Richard Nixon playbook”. San Francisco Chronicle. Retrieved 2006-08-27.
  7. ^ McGrory, Mary (2002-04-25). “From Rome, A ‘Limited Hangout’”. The Washington Post. Washington, D.C. p. A29. Retrieved 2010-04-30.

Sources:

Fair Use Sources:

Categories
History

This Year in History

Know your History, it repeats itself! See also Timeline of the History of Computers and This Year in History.

History in the year of:

Sources:

Fair Use Sources:

Categories
Apple macOS Operating Systems Software Engineering

macOS – OSX – OS X

This article is about the current Apple operating system for Mac computers. For pre-2001 versions, see Classic Mac OS.

Developer: Apple Inc.
Written in: C, C++,[1] Objective-C, Swift,[2] assembly language
OS family: Unix, Macintosh
Working state: Current
Source model: Closed source (with open source components)
Initial release: March 24, 2001
Latest release: 11.2.3[3] (20D91)[4] (March 8, 2021)
Latest preview: 11.3 beta 3[5] (20E5196f)[6] (March 2, 2021)
Marketing target: Personal computing
Available in: 39 languages[7]
List of languages (as of macOS Catalina): Arabic, Catalan, Croatian, Chinese (Hong Kong), Chinese (Simplified), Chinese (Traditional), Czech, Danish, Dutch, English (Australia), English (United Kingdom), English (United States), Finnish, French (Canada), French (France), German, Greek, Hebrew, Hindi, Hungarian, Indonesian, Italian, Japanese, Korean, Malay, Norwegian, Polish, Portuguese (Brazil), Portuguese (Portugal), Romanian, Russian, Slovak, Spanish (Latin America), Spanish (Spain), Swedish, Thai, Turkish, Ukrainian, Vietnamese
Update method: System Preferences (10.14–), Mac App Store (10.8–10.13.6), Software Update (10.0–10.7.5)
Platforms: ARM64 (11.0–), x86-64 (10.4.7–), IA-32 (10.4.4–10.6.8), PowerPC (10.0–10.5.8)
Kernel type: Hybrid (XNU)
Default user interface: Aqua (Graphical)
License: Commercial software, proprietary software
Preceded by: Classic Mac OS, NeXTSTEP
Official website: www.apple.com/macos
Support status: Supported

macOS (/ˌmækoʊˈɛs/;[8] previously Mac OS X and later OS X) is a series of proprietary graphical operating systems developed and marketed by Apple Inc. since 2001. It is the primary operating system for Apple’s Mac computers. Within the market of desktop, laptop and home computers, and by web usage, it is the second most widely used desktop OS, after Microsoft Windows.[9][10]” (WP)

“macOS is the direct successor to the classic Mac OS, the line of Macintosh operating systems with nine releases from 1984 to 1999. macOS adopted the Unix kernel and inherited technologies developed between 1985 and 1997 at NeXT, the company that Apple co-founder Steve Jobs created after leaving Apple in 1985. Releases from Mac OS X 10.5 Leopard[11] and thereafter are UNIX 03 certified.[12] Apple’s mobile operating system, iOS, has been considered a variant of macOS.[13]” (WP)

“The first desktop version, Mac OS X 10.0, was released in March 2001, with its first update, 10.1, arriving later that year. The “X” in Mac OS X and OS X is the Roman numeral for the number 10 and is pronounced as such. The X was a prominent part of the operating system’s brand identity and marketing in its early years, but gradually receded in prominence after the release of Snow Leopard in 2009. Apple began naming its releases after big cats, which lasted until OS X 10.8 Mountain Lion. Since OS X 10.9 Mavericks, releases have been named after locations in California.[14] Apple shortened the name to “OS X” in 2012 and then changed it to “macOS” in 2016, adopting the nomenclature that they were using for their other operating systems, iOS, watchOS, and tvOS. With Big Sur, Apple advanced the macOS major version number for the first time, changing it from the 10 used for all previous releases to 11.” (WP)

“macOS has supported three major processor architectures. It first supported PowerPC-based Macs in 1999. Starting in 2006, with the Mac transition to Intel processors, it ran on Macs using Intel x86 processors. Most recently, starting in 2020, with the Mac transition to Apple Silicon, it runs on Macs using 64-bit ARM-based Apple Silicon processors.” (WP)

Sources:

Fair Use Sources:

Categories
Artificial Intelligence Cloud Data Science - Big Data Hardware and Electronics History Networking Operating Systems Software Engineering

Timeline of the History of Computers

Return to History or This Year in History

c. 2500 BC – Sumerian Abacus

c. 700 BC – Scytale

c. 150 BC – Antikythera Mechanism

c. 60 – Programmable Robot

c. 850 – On Deciphering Cryptographic Messages

c. 1470 – Cipher Disk

1613 – First Recorded Use of the Word Computer

1621 – Slide Rule

1703 – Binary Arithmetic

1758 – Human Computers Predict Halley’s Comet

1770 – The “Mechanical Turk”

1792 – Optical Telegraph

1801 – The Jacquard Loom

1822 – The Difference Engine

1833 – Michael Faraday discovered that silver sulfide becomes a better conductor when heated

1836 – Electrical Telegraph

1843 – Ada Lovelace Writes a Computer Program

1843 – Fax Machine Patented

1843 – Edgar Allan Poe’s “The Gold-Bug”

1849 to early 1900s – Silicon Valley After the Gold Rush

1851 – Thomas Arithmometer

1854 – Boolean Algebra

1864 – First Electromagnetic Spam Message

1870 – Mitsubishi founded

1874 – Baudot Code

1874 – Semiconductor Diode conceived of

1876 – Ericsson Corporation founded in Sweden

1885 – Stanford University

1885 – William Burroughs’ adding machine

1890 – Herman Hollerith Tabulating the US Census

1890 – Toshiba founded in Japan

1891 – Strowger Step-by-Step Switch

1898 – Nippon Electric Limited Partnership – NEC Corporation founded in Japan

1890s to 1930s – Radio Engineering

Early 1900s – Electrical Engineering

1904 – “Diode” or Two-Element Amplifier actually invented

1904 – Three-Element Amplifier or “Triode”

1906 – Vacuum Tube or “Audion”

1907 – Lee DeForest coins the term “radio” to refer to wireless transmission when he formed his DeForest Radio Telephone Company

1909 – Charles Herrold in San Jose started first radio station in USA with regularly scheduled programming, including songs, using an arc transmitter of his own design. Herrold was one of Stanford’s earliest students and founded his own College of Wireless and Engineering in San Jose

1910 – Radio Broadcasting business pioneered by Lee DeForest with broadcast from New York of a live performance by Italian tenor Enrico Caruso

1910 – Hitachi founded in Japan

1912 – Sharp Corporation founded in Japan and takes its name from one of its founder’s first inventions, the Ever-Sharp mechanical pencil

1914 – Floating-Point Numbers

1917 – Vernam Cipher

1918 – Panasonic, then Matsushita Electric, founded in Japan

1920 – Rossum’s Universal Robots

1927 – Fritz Lang’s Metropolis

1927 – First LED

1928 – Electronic Speech Synthesis

1930 – The Enigma Machine

1931 – Differential Analyzer

1935 – Fujitsu founded as Fuji Telecommunications Equipment Manufacturing in Japan. Fujitsu is the second oldest IT company after IBM and before Hewlett-Packard

1936 – Church-Turing Thesis

1939 – Hewlett-Packard founded in a one-car garage in Palo Alto, California by Bill Hewlett and David Packard

1939 – Toshiba founded in Japan

1941 – Z3 Computer

1942 – Atanasoff-Berry Computer

1942 – Isaac Asimov’s Three Laws of Robotics

1942 – Seiko Corporation founded in Japan

1943 – ENIAC

1943 – Colossus

1944 – Delay Line Memory

1944 – Binary-Coded Decimal

1945 – Vannevar Bush’s “As We May Think”

1945 – EDVAC First Draft Report – The von Neumann architecture

1946 – Trackball

1946 – Williams Tube Random Access Memory

1947 – Actual Bug Found – First “debugging”

1947 – William Shockley’s Silicon Transistor

1948 – The Bit – Binary Digit 0 or 1

1948 – Curta Calculator

1948 – Manchester SSEM

1949 – Whirlwind Computer

1950 – Error-Correcting Codes (ECC)

1951 – Turing Test of Artificial Intelligence (AI)

1951 – Magnetic Tape Used for Computers

1951 – Core Memory

1951 – Microprogramming

1952 – Computer Speech Recognition

1953 – First Transistorized Computer

1955 – Artificial Intelligence (AI) Coined

1955 – Computer Proves Mathematical Theorem

1956 – First Disk Storage Unit

1956 – The Byte

1956 – Robby the Robot from Forbidden Planet

1957 – FORTRAN Programming Language

1957 – First Digital Image

1958 – The Bell 101 Modem

1958 – SAGE Computer Operational

1959 – IBM 1401 Computer

1959 – DEC PDP-1

1959 – Quicksort Algorithm

1959 – SABRE Airline Reservation System

1960 – COBOL Programming Language

1960 – Recommended Standard 232 (RS-232)

1961 – ANITA Electronic Calculator

1961 – Unimate – First Mass-Produced Robot

1961 – Time-Sharing – The Original “Cloud Computing”

1961 – Shinshu Seiki Company founded in Japan (now called Seiko Epson Corporation) as a subsidiary of Seiko to supply precision parts for Seiko watches.

1962 – Spacewar! Video Game

1962 – Virtual Memory

1962 – Digital Long Distance Telephone Calls

1963 – Sketchpad Interactive Computer Graphics

1963 – ASCII Character Encoding

1963 – Seiko Corporation in Japan developed world’s first portable quartz timer (Seiko QC-951)

1964 – RAND Tablet Computer

1964 – Teletype Model 33 ASR

1964 – IBM System/360 Mainframe Computer

1964 – BASIC Programming Language

1965 – First Liquid-Crystal Display (LCD)

1965 – Fiber Optics – Optical-Fiber

1965 – DENDRAL Artificial Intelligence (AI) Research Project

1965 – ELIZA – The First “Chatbot” – 1965

1965 – Touchscreen

1966 – Star Trek Premieres

1966 – Dynamic RAM

1966 – Linear predictive coding (LPC) proposed by Fumitada Itakura of Nagoya University and Shuzo Saito of Nippon Telegraph and Telephone (NTT).[71]

1967 – Object-Oriented Programming

1967 – First ATM Machine

1967 – Head-Mounted Display

1967 – Programming for Children

1967 – The Mouse

1968 – Carterfone Decision

1968 – Software Engineering

1968 – HAL 9000 Computer from 2001: A Space Odyssey

1968 – First “Spacecraft” “Guided by Computer”

1968 – Cyberspace Coined—and Re-Coined

1968 – Mother of All Demos

1968 – Dot Matrix Printer – Shinshu Seiki (now called Seiko Epson Corporation) launched the world’s first mini-printer, the EP-101 (“EP” for Electronic Printer), which was soon incorporated into many calculators

1968 – Interface Message Processor (IMP)

1969 – ARPANET / Internet

1969 – Digital Imaging

1969 – Network Working Group Request for Comments (RFC): 1

1969 – Utility Computing – Early “Cloud Computing”

1969 – Perceptrons Book – Dark Ages of Neural Networks Artificial Intelligence (AI)

1969 – UNIX Operating System

1969 – Seiko Epson Corporation in Japan developed world’s first quartz watch timepiece (Seiko Quartz Astron 35SQ)

1970 – Fair Credit Reporting Act

1970 – Relational Databases

1970 – Floppy Disk

1971 – Laser Printer

1971 – NP-Completeness

1971 – @Mail Electronic Mail

1971 – First Microprocessor – General-Purpose CPU – “Computer on a Chip”

1971 – First Wireless Network

1972 – C Programming Language

1972 – Cray Research Supercomputers – High-Performance Computing (HPC)

1972 – Game of Life – Early Artificial Intelligence (AI) Research

1972 – HP-35 Calculator

1972 – Pong Game from Atari – Nolan Bushnell

1973 – First Cell Phone Call

1973 – Danny Cohen first demonstrated a form of packet voice as part of a flight simulator application, which operated across the early ARPANET.[69][70]

1973 – Xerox Alto from Xerox Palo Alto Research Center (PARC)

1973 – Sharp Corporation produced the first LCD calculator

1974 – Data Encryption Standard (DES)

1974 – The Institute of Electrical and Electronics Engineers (IEEE) publishes a paper entitled “A Protocol for Packet Network Interconnection”.[82]

1974 – Network Voice Protocol (NVP) tested over ARPANET in August 1974, carrying barely audible 16 kbps CVSD-encoded voice.[71]

1974 – The first successful real-time conversation over ARPANET achieved using 2.4 kbps LPC, between Culler-Harrison Incorporated in Goleta, California, and MIT Lincoln Laboratory in Lexington, Massachusetts.[71]

1974 – First Personal Computer: The Altair 8800 Invented by MITS in Albuquerque, New Mexico

1975 – Colossal Cave Adventure – Text-based “Video” Game

1975 – The Shockwave Rider SciFi Book – A Prelude of the 21st Century Big Tech Police State

1975 – AI Medical Diagnosis – Artificial Intelligence in Medicine

1975 – BYTE Magazine

1975 – Homebrew Computer Club

1975 – The Mythical Man-Month

1975 – The name Epson was coined for the next generation of printers based on the EP-101, which was released to the public (EPSON: “E-P-SON,” son of the Electronic Printer).[7] Epson America Inc. was established to sell printers for Shinshu Seiki Co.

1976 – Public Key Cryptography

1976 – Acer founded

1976 – Tandem NonStop

1976 – Dr. Dobb’s Journal

1977 – RSA Encryption

1977 – Apple II Computer

1977 – Danny Cohen and Jon Postel of the USC Information Sciences Institute, and Vint Cerf of the Defense Advanced Research Projects Agency (DARPA), agree to separate IP from TCP, and create UDP for carrying real-time traffic.

1978 – First Internet Spam Message

1978 – France’s Minitel Videotext

1979 – Secret Sharing for Encryption

1979 – Dan Bricklin Invents VisiCalc Spreadsheet

1980 – Timex Sinclair ZX80 Computer

1980 – Flash Memory

1980 – RISC Microprocessors – Reduced Instruction Set Computer CPUs

1980 – Ethernet Becomes Commercially Available – Robert Metcalfe of 3Com

1980 – Usenet

1981 – IBM Personal Computer – IBM PC

1981 – Simple Mail Transfer Protocol (SMTP) Email

1981 – Japan’s Fifth Generation Computer Systems

1982 – Sun Microsystems was founded on February 24, 1982.[2]

1982 – AutoCAD

1982 – First Commercial UNIX Workstation

1982 – PostScript

1982 – Microsoft and the IBM PC Clones

1982 – First CGI Sequence in Feature Film – Star Trek II: The Wrath of Khan

1982 – National Geographic Moves the Pyramids – Precursor to Photoshop

1982 – Secure Multi-Party Computation

1982 – TRON Movie

1982 – Home Computer Named Machine of the Year by Time Magazine

1983 – The Qubit – Quantum Computers

1983 – WarGames

1983 – 3-D Printing

1983 – Computerization of the Local Telephone Network

1983 – First Laptop

1983 – MIDI Computer Music Interface

1983 – Microsoft Word

1983 – Nintendo Entertainment System – Video Games

1983 – Domain Name System (DNS)

1983 – IPv4 Flag Day – TCP/IP

1984 – Text-to-Speech (TTS)

1984 – Apple Macintosh

1984 – VPL Research, Inc. – Virtual Reality (VR)

1984 – Quantum Cryptography

1984 – Telebit TrailBlazer Modems Break 9600 bps

1984 – Verilog Language

1984 – Dell founded by Michael Dell

1984 – Cisco Systems was founded in December 1984

1985 – Connection Machine – Parallelization

1985 – First Computer-Generated TV Host – Max Headroom – CGI

1985 – Zero-Knowledge Mathematical Proofs

1985 – FCC Approves Unlicensed Wireless Spread Spectrum

1985 – NSFNET National Science Foundation “Internet”

1985 – Desktop Publishing – with Macintosh, Aldus PageMaker, LaserJet, LaserWriter and PostScript

1985 – Field-Programmable Gate Array (FPGA)

1985 – GNU Manifesto from Richard Stallman

1985 – AFIS Stops a Serial Killer – Automated Fingerprint Identification System

1986 – Software Bug Fatalities

1986 – Pixar Animation Studios

1986 – D-Link Corporation founded in Taipei, Taiwan

1987 – Digital Video Editing

1987 – GIF – Graphics Interchange Format

1988 – MPEG – Moving Picture Experts Group – Coding-Compressing Audio-Video

1988 – CD-ROM

1988 – Morris Worm Internet Computer Virus

1988 – Linksys founded

1989 – World Wide Web-HTML-HTTP Invented by Tim Berners-Lee

1989 – Asus was founded in Taipei, Taiwan

1989 – SimCity Video Game

1989 – ISP Provides Internet Access to the Public

1990 – GPS Is Operational – Global Positioning System

1990 – Digital Money is Invented – DigiCash – Precursor to Bitcoin

1991 – Pretty Good Privacy (PGP)

1991 – DARPA’s Report “Computers at Risk: Safe Computing in the Information Age”

1991 – Linux Kernel Operating System Invented by Linus Torvalds

1992 – Boston Dynamics Robotics Company Founded

1992 – JPEG – Joint Photographic Experts Group

1992 – First Mass-Market Web Browser NCSA Mosaic Invented by Marc Andreessen

1992 – Unicode Character Encoding

1993 – Apple Newton

1994 – First Banner Ad – Wired Magazine

1994 – RSA-129 Encryption Cracked

1995 – DVD

1995 – E-Commerce Startups – eBay, Amazon and DoubleClick Launched

1995 – AltaVista Web Search Engine

1995 – Gartner Hype Cycle

1996 – Universal Serial Bus (USB)

1996 – Juniper Networks founded

1997 – IBM Computer Is World Chess Champion

1997 – PalmPilot

1997 – E Ink

1998 – Diamond Rio MP3 Player

1998 – Google

1999 – Collaborative Software Development

1999 – Blog Is Coined

1999 – Napster P2P Music and File Sharing

2000 – USB Flash Drive

2000 – Sharp Corporation’s Mobile Communications Division created the world’s first commercial camera phone, the J-SH04, in Japan

2000 – Fortinet founded

2001 – Wikipedia

2001 – Apple iTunes

2001 – Advanced Encryption Standard (AES)

2001 – Quantum Computer Factors “15”

2002 – Home-Cleaning Robot

2003 – CAPTCHA

2004 – Product Tracking

2004 – Facebook

2004 – First International Meeting on Synthetic Biology

2005 – Video Game Enables Research into Real-World Pandemics

2006 – Apache Hadoop Makes Big Data Possible

2006 – Differential Privacy

2007 – Apple iPhone

2008 – Bitcoin

2010 – Air Force Builds Supercomputer with Gaming Consoles

2010 – Cyber Weapons

2011 – Smart Homes via the Internet of Things (IoT)

2011 – IBM Watson Wins Jeopardy!

2011 – World IPv6 Day

2011 – Social Media Enables the Arab Spring

2012 – DNA Data Storage

2013 – Algorithm Influences Prison Sentence

2013 – Subscription Software “Popularized”

2014 – Data Breaches

2014 – Over-the-Air Vehicle Software Updates

2015 – Google Releases TensorFlow

2016 – Augmented Reality Goes Mainstream

2016 – Computer Beats Master at Game of Go

~2050 – Hahahaha! – Artificial General Intelligence (AGI)

~9999 – The Limits of Computation?

Sources:

Fair Use Sources:

Categories
History

iPhone Introduced – 2007 AD

Return to Timeline of the History of Computers

2007

iPhone

Steve Jobs (1955–2011)

“Rarely do consumers line up two days before the release of a product—armed with sleeping bags and changes of clothes—to make sure they can buy it. But that is exactly what preceded the launch of the Apple iPhone on June 29, 2007.

The iPhone’s design and functionality changed the entire smartphone concept by bundling together capabilities that had never been married before: telephony, messaging, internet access, music, a vibrant color screen, and an intuitive, touch-based interface. Without the physical buttons that were common on other smartphones at the time, the entire surface was available for presenting information. The keyboard appeared only when needed—and it was much easier to type accurately, thanks to behind-the-scenes AI that invisibly adjusted the sensitive area around each key in response to what letters the user was forecast to press next.

The following year, Apple introduced its next big thing: specialized programs called apps, downloadable over the air. The original iPhone shipped with a few built-in apps and a web browser. Apple CEO Steve Jobs had envisioned that third-party developers would only be able to write web apps. Early adopters, however, started overcoming Apple’s security mechanisms by “jailbreaking” their phones and installing their own native apps. Jobs realized that if users were that determined to run native apps, Apple might as well supply the content and make a profit.

The Apple iTunes App Store opened in 2008 with 500 apps. Suddenly that piece of electronics in your pocket was more than a phone to make calls or check email—it became a super gadget, able to play games, manipulate photographs, track your workout, and much more. In October 2013, Apple announced that there were a million apps available for the iPhone, many of them realizing new location-based services, such as ride-sharing, dating, and localized restaurant reviews, to name a few.

While the iPhone has largely been celebrated, it has also been accused of ushering in the era of “smartphone addiction,” with the average person, according to a 2016 study, now checking his or her smartphone 2,617 times a day. Since the original release in 2007, more than 1 billion iPhones have been sold worldwide, and it still holds the record for taking only three months to get to 1 million units sold.”
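The adaptive keyboard described above can be sketched in a few lines of Swift. This is only an illustrative guess at the general idea, not Apple’s actual algorithm; the Key type, the keyForTouch function, and the nextLetterProbability table are hypothetical names invented for the example.

```swift
import CoreGraphics

// Hypothetical sketch: a key's effective touch target grows when the language
// model predicts that its letter is likely to be typed next.
struct Key {
    let letter: Character
    let frame: CGRect            // the key's visible on-screen frame
}

/// Chooses the key for a touch, discounting distance for likely letters so
/// that probable keys tolerate sloppier taps. Probabilities range 0.0...1.0.
func keyForTouch(at point: CGPoint,
                 keys: [Key],
                 nextLetterProbability: [Character: Double]) -> Key? {
    func score(_ key: Key) -> Double {
        let dx = Double(point.x - key.frame.midX)
        let dy = Double(point.y - key.frame.midY)
        let distance = (dx * dx + dy * dy).squareRoot()
        let p = nextLetterProbability[key.letter] ?? 0.01   // small floor for unexpected keys
        return distance / (0.5 + p)                         // lower score wins
    }
    return keys.min { score($0) < score($1) }
}
```

A production keyboard is far more sophisticated, but the basic idea of biasing touch targets by predicted likelihood is captured here.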

SEE ALSO Touchscreen (1965), Augmented Reality Goes Mainstream (2016)

More than 1 billion iPhones have been sold worldwide since the product’s release in 2007.

Fair Use Sources: B07C2NQSPV

2007, iPhone

9to5 Staff. “Jobs’ Original Vision for the iPhone: No Third-Party Native Apps.” 9To5 Mac (website), October 21, 2011. https://9to5mac.com/2011/10/21/jobs-original-vision-for-the-iphone-no-third-party-native-apps/.

Categories
Apple macOS History

Apple iTunes – 2001 AD

Return to Timeline of the History of Computers

2001

iTunes

Steve Jobs (1955–2011), Jeff Robbin (dates unavailable), Bill Kincaid (b. 1956), Dave Heller (dates unavailable)

“The music business at the end of the 20th century was in an epic fight to maintain its profitable business model. Music had become 1s and 0s and was being widely shared, without compensation, among users through online services such as Napster. The industry was filing suit against both the services and their users to protect copyrights.

Apple cofounder Steve Jobs saw an opportunity and in 2000 purchased SoundJam MP, a program that functioned as a music content manager and player. It was developed by two former Apple software engineers, Bill Kincaid and Jeff Robbin, along with Dave Heller, who all took up residence at Apple and evolved the product into what would become iTunes.

iTunes debuted on January 9, 2001, at Macworld Expo in San Francisco. For the first two years, iTunes was promoted as a software jukebox that offered a simple interface to organize MP3s and convert CDs into compressed audio formats. In October 2001, Apple released a digital audio player, the iPod, which would neatly sync with a user’s iTunes library over a wire. This hardware release set the stage for the next big evolution, which came with iTunes version 4 in 2003—the iTunes Music Store, which launched with 200,000 songs. Now users could buy licensed, high-quality digital music from Apple.

Buying music from a computer company was a radical concept. It flipped the traditional business model and gave the music industry an organized, legitimate mechanism in the digital space to profit from, and protect, their intellectual property.

The music labels agreed to participate in the iTunes model and allowed Jobs to sell their inventory in part because he agreed to copy-protect their songs with Digital Rights Management (DRM). (Apple significantly eased the DRM-based restrictions for music in 2009.) Consumers embraced iTunes in part because they could buy single songs again—no longer did they have to purchase an entire album to get one or two tracks.

In the following years, iTunes would snowball into a media juggernaut adding music videos, movies, television shows, audio books, podcasts, radio, and music streaming—all of which were integrated with new products and services from Apple, including Apple TV, the iPhone, and the iPad.”

SEE ALSO MPEG (1988), Diamond Rio MP3 Player (1998)

Apple’s Steve Jobs announces the release of new upgrades to iTunes and other Apple products at a press conference in San Francisco, California, on September 1, 2010.

Fair Use Sources: B07C2NQSPV

Categories
History Software Engineering

Apple Swift Programming Language Invented – 2014 AD

Return to Timeline of the History of Computers

Created by Apple and released on June 2, 2014, the Swift programming language helps create programs and apps for iOS, macOS, the Apple Watch, and Apple TV.
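A minimal, self-contained Swift snippet gives a feel for the language; the Release type and the sample data are illustrative only.

```swift
// A small taste of Swift: value types, type inference, closures, and string interpolation.
struct Release {
    let name: String
    let year: Int
}

let releases = [Release(name: "Swift 1.0", year: 2014),
                Release(name: "Swift 5.0", year: 2019)]

for release in releases.sorted(by: { $0.year < $1.year }) {
    print("\(release.name) shipped in \(release.year)")
}
```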

Fair Use Sources:

Categories
History Software Engineering

Apple’s Dylan Programming Language, Resembling ALGOL Syntax, Invented – 1990 AD

Return to Timeline of the History of Computers

Engineers at Apple developed the Dylan programming language in the early 1990s. Dylan was designed to resemble the syntax of the ALGOL programming language.

Fair Use Sources:

Categories
History Software Engineering Swift

Objective-C Programming Language Invented by Brad Cox and Tom Love – 1988 AD

Return to Timeline of the History of Computers

Developed in the mid-1980s by Brad Cox and Tom Love, the Objective-C programming language was officially licensed by NeXT in 1988.

Fair Use Sources:

Categories
History Software Engineering

Apple II Computer – 1977 AD

Return to Timeline of the History of Computers

1977

Apple II

Steve Jobs (1955–2011), Steve Wozniak (b. 1950), Randy Wigginton (dates unavailable)

The 1977 Apple II, shown here with two Disk II floppy disk drives and a 1980s-era Apple Monitor II.

The Apple II series (trademarked with square brackets as “Apple ][” and rendered on later models as “Apple //”) is a family of home computers, one of the first highly successful mass-produced microcomputer products,[1] designed primarily by Steve Wozniak, manufactured by Apple Computer (now Apple Inc.), and launched in 1977 with the original Apple II.

Apple II in common 1977 configuration with 9″ monochrome monitor, game paddles, and Red Book recommended Panasonic RQ-309DS cassette deck

“If the Altair 8800 was the machine that put computers in the hands of individual hobbyists, then the Apple II was the machine that put computers in the hands of everyday people. Steve Wozniak, the lead designer, and Randy Wigginton, the programmer, demonstrated the first prototype at the legendary Homebrew Computer Club in December 1976 and, along with Steve Jobs, the team’s financial wizard and chief promoter, introduced it to the public in April 1977 at the West Coast Computer Faire. The Apple II was the first successful mass-produced personal computer.” (B07C2NQSPV)

“The Apple II was based on the Apple I, which Wozniak designed and built himself. The Apple I was sold as a single-board computer: purchasers needed to supply their own keyboard, monitor—or a television and a radio frequency (RF) modulator—and a case. The Apple II, in contrast, came with keyboard and case, although it still needed an RF modulator to display on a TV.” (B07C2NQSPV)

“The Apple II was widely popular with techies, schools, and the general consumer.” (B07C2NQSPV)

“It offered BASIC in ROM, so users could start to write and run programs as soon as the machine powered on. It came with a reliable audiocassette interface, making it easy to save and load programs on a low-cost, consumer-grade cassette deck. It even had color graphics, a first for the industry.” (B07C2NQSPV)

“In 1978, Apple introduced a low-cost 5¼-inch external floppy drive, which used software and innovative circuit design to eliminate electronic components. Faster and more reliable than the cassette, and capable of random access, the disk turned the Apple II from a curiosity into a serious tool for education and business. Then in 1979, VisiCalc, the first personal spreadsheet program, was introduced. Designed specifically to run on the Apple II, VisiCalc helped drive new sales of the computer.” (B07C2NQSPV)

“The Apple II was a runaway success, with Apple’s revenues growing from $775,000 to $118 million from September 1977 through September 1980. Apple ultimately released seven major versions of the Apple II. Between 5 million and 6 million computers would ultimately be sold.” (B07C2NQSPV)

The TRS-80 Model I pictured alongside the Apple II and the Commodore PET 2001-8. These three computers constitute what Byte Magazine called the “1977 Trinity” of home computing.

SEE ALSO: First Personal Computer (1974), BYTE Magazine (1975), Homebrew Computer Club (1975), VisiCalc (1979)

Apple II advertisement from the December 1977 issue of BYTE magazine.

Fair Use Sources:

Categories
Artificial Intelligence Bibliography Cloud Data Science - Big Data Hardware and Electronics History Networking Operating Systems Software Engineering

The Computer Book: From the Abacus to Artificial Intelligence

Fair Use Source: B07C2NQSPV (TCB)

The Computer Book: From the Abacus to Artificial Intelligence, 250 Milestones in the History of Computer Science by Simson L. Garfinkel and Rachel H. Grunspan

Publication Date: January 15, 2019
Publisher: Sterling; Illustrated Edition (January 15, 2019)
Print Length: 742 pages
ASIN: B07C2NQSPV

THE COMPUTER BOOK – FROM THE ABACUS TO ARTIFICIAL INTELLIGENCE, 250 MILESTONES IN THE HISTORY OF COMPUTER SCIENCE

Simson L. Garfinkel and Rachel H. Grunspan

STERLING and the distinctive Sterling logo are registered trademarks of Sterling Publishing Co., Inc.

Text © 2018 Techzpah LLC

ISBN 978-1-4549-2622-1

Contents

Introduction

Acknowledgments

Notes and Further Reading

Photo Credits

Introduction

“The evolution of the computer likely began with the human desire to comprehend and manipulate the environment. The earliest humans recognized the phenomenon of quantity and used their fingers to count and act upon material items in their world. Simple methods such as these eventually gave way to the creation of proxy devices such as the abacus, which enabled action on higher quantities of items, and wax tablets, on which pressed symbols enabled information storage. Continued progress depended on harnessing and controlling the power of the natural world—steam, electricity, light, and finally the amazing potential of the quantum world. Over time, our new devices increased our ability to save and find what we now call data, to communicate over distances, and to create information products assembled from countless billions of elements, all transformed into a uniform digital format.

These functions are the essence of computation: the ability to augment and amplify what we can do with our minds, extending our impact to levels of superhuman reach and capacity.

These superhuman capabilities that most of us now take for granted were a long time coming, and it is only in recent years that access to them has been democratized and scaled globally. A hundred years ago, the instantaneous communication afforded by telegraph and long-distance telephony was available only to governments, large corporations, and wealthy individuals. Today, the ability to send international, instantaneous messages such as email is essentially free to the majority of the world’s population.

In this book, we recount a series of connected stories of how this change happened, selecting what we see as the seminal events in the history of computing. The development of computing is in large part the story of technology, both because no invention happens in isolation, and because technology and computing are inextricably linked; fundamental technologies have allowed people to create complex computing devices, which in turn have driven the creation of increasingly sophisticated technologies.

The same sort of feedback loop has accelerated other related areas, such as the mathematics of cryptography and the development of high-speed communications systems. For example, the development of public key cryptography in the 1970s provided the mathematical basis for sending credit card numbers securely over the internet in the 1990s. This incentivized many companies to invest money to build websites and e-commerce systems, which in turn provided the financial capital for laying high-speed fiber optic networks and researching the technology necessary to build increasingly faster microprocessors.

In this collection of essays, we see the history of computing as a series of overlapping technology waves, including:

Human computation. More than people who were simply facile at math, the earliest “computers” were humans who performed repeated calculations for days, weeks, or months at a time. The first human computers successfully plotted the trajectory of Halley’s Comet. After this demonstration, teams were put to work producing tables for navigation and the computation of logarithms, with the goal of improving the accuracy of warships and artillery.

Mechanical calculation. Starting in the 17th century with the invention of the slide rule, computation was increasingly realized with the help of mechanical aids. This era is characterized by mechanisms such as Oughtred’s slide rule and mechanical adding machines such as Charles Babbage’s difference engine and the arithmometer.

Connected with mechanical computation is mechanical data storage. In the 18th century, engineers working on a variety of different systems hit upon the idea of using holes in cards and tape to represent repeating patterns of information that could be stored and automatically acted upon. The Jacquard loom used holes on stiff cards to enable automated looms to weave complex, repeating patterns. Herman Hollerith managed the scale and complexity of processing population information for the 1890 US Census on smaller punch cards, and Émile Baudot created a device that let human operators punch holes in a roll of paper to represent characters as a way of making more efficient use of long-distance telegraph lines. Boole’s algebra lets us interpret these representations of information (holes and spaces) as binary—1s and 0s—fundamentally altering how information is processed and stored.
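As a small, concrete illustration of that last point, the Swift sketch below reads a made-up row of holes and spaces as binary digits and interprets them as a number; the “#”/“.” notation and the five-position row are purely hypothetical.

```swift
// Illustrative only: read a row of punched positions as binary digits.
// '#' marks a hole (1), '.' marks a blank (0).
let row = "#.#.#"                          // a hypothetical 5-position card row
let bits = row.map { $0 == "#" ? 1 : 0 }   // [1, 0, 1, 0, 1]
let value = bits.reduce(0) { $0 * 2 + $1 } // interpret the bits as a base-2 number
print(bits, "=", value)                    // [1, 0, 1, 0, 1] = 21
```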

With the capture and control of electricity came electric communication and computation. Charles Wheatstone in England and Samuel Morse in the US both built systems that could send digital information down a wire for many miles. By the end of the 19th century, engineers had joined together millions of miles of wires with relays, switches, and sounders, as well as the newly invented speakers and microphones, to create vast international telegraph and telephone communications networks. In the 1930s, scientists in England, Germany, and the US realized that the same electrical relays that powered the telegraph and telephone networks could also be used to calculate mathematical quantities. Meanwhile, magnetic recording technology was developed for storing and playing back sound—technology that would soon be repurposed for storing additional types of information.

Electronic computation. In 1906, scientists discovered that a beam of electrons traveling through a vacuum could be switched by applying a slight voltage to a metal mesh, and the vacuum tube was born. In the 1940s, scientists tried using tubes in their calculators and discovered that they ran a thousand times faster than relays. Replacing relays with tubes allowed the creation of computers that were a thousand times faster than the previous generation.

Solid state computing. Semiconductors—materials that can change their electrical properties—were discovered in the 19th century, but it wasn’t until the middle of the 20th century that scientists at Bell Laboratories discovered and then perfected a semiconductor electronic switch—the transistor. Faster still than tubes, semiconductors use dramatically less power and can be made smaller than the eye can see. They are also incredibly rugged. The first transistorized computers appeared in 1953; within a decade, transistors had replaced tubes everywhere, except for the computer’s screen. That wouldn’t happen until the widespread deployment of flat-panel screens in the 2000s.

Parallel computing. Year after year, transistors shrank in size and got faster, and so did computers . . . until they didn’t. The year was 2005, roughly, when the semiconductor industry’s tricks for making each generation of microprocessors run faster than the previous pretty much petered out. Fortunately, the industry had one more trick up its sleeve: parallel computing, or splitting up a problem into many small parts and solving them more or less independently, all at the same time. Although the computing industry had experimented with parallel computing for years (ENIAC was actually a parallel machine, way back in 1943), massively parallel computers weren’t commercially available until the 1980s and didn’t become commonplace until the 2000s, when scientists started using graphics processing units (GPUs) to solve problems in artificial intelligence (AI).
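The “split the problem into many small parts” idea can be sketched with Swift’s Dispatch library; the chunk count and the summing task below are arbitrary choices made for the example.

```swift
import Dispatch

// Sketch of data-parallel summation: each chunk of the array is summed by its
// own task running on its own core, and the partial results are combined at the end.
let numbers = Array(1...1_000_000)
let chunkCount = 8
let chunkSize = (numbers.count + chunkCount - 1) / chunkCount
var partialSums = [Int](repeating: 0, count: chunkCount)

partialSums.withUnsafeMutableBufferPointer { buffer in
    // concurrentPerform runs the closure for each index in parallel across CPU cores.
    DispatchQueue.concurrentPerform(iterations: chunkCount) { i in
        let start = i * chunkSize
        let end = min(start + chunkSize, numbers.count)
        buffer[i] = numbers[start..<end].reduce(0, +)   // each task writes only its own slot
    }
}

print(partialSums.reduce(0, +))   // 500000500000
```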

Artificial intelligence. Whereas the previous technology waves always had at their hearts the purpose of supplementing or amplifying human intellect or abilities, the aim of artificial intelligence is to independently extend cognition, evolve a new concept of intelligence, and algorithmically optimize any digitized ecosystem and its constituent parts. Thus, it is fitting that this wave be last in the book, at least in a book written by human beings. The hope of machine intelligence goes back millennia, at least to the time of the ancient Greeks. Many of computing’s pioneers, including Ada Lovelace and Alan Turing, wrote that they could imagine a day when machines would be intelligent. We see manifestations of this dream in the cultural icons Maria, Robby the Robot, and the Mechanical Turk—the chess-playing automaton. Artificial intelligence as a field started in the 1950s. But while it is possible to build a computer with relays or even Tinkertoy® sets that can play a perfect game of tic-tac-toe, it wasn’t until the 1990s that a computer was able to beat the reigning world champion at chess and then eventually the far more sophisticated game of Go. Today we watch as machines master more and more tasks that were once reserved for people. And no longer do machines have to be programmed to perform these tasks; computing has evolved to the point that AIs are taught to teach themselves and “learn” using methods that mimic the connections in the human brain. Continuing on this trajectory, over time we will have to redefine what “intelligent” actually means.

Given the vast history of computing, then, how is it possible to come up with precisely 250 milestones that summarize it?

We performed this task by considering many histories and timelines of computing, engineering, mathematics, culture, and science. We developed a set of guiding principles. We then built a database of milestones that balanced generally accepted seminal events with those that were lesser known. Our specific set of criteria appears below. As we embarked on the writing effort, we discovered many cases in which multiple milestones could be collapsed to a single cohesive narrative story. We also discovered milestones within milestones that needed to be broken out and celebrated on their own merits. Finally, while researching some milestones, we uncovered other inventions, innovations, or discoveries that we had neglected our first time through. The list we have developed thus represents 250 milestones that we think tell a comprehensive account of computing on planet Earth. Specifically:

We include milestones that led to the creation of thinking machines—the true deus ex machina. The milestones that we have collected show the big step-by-step progression from early devices for manipulating information to the pervasive society of machines and people that surrounds us today.

We include milestones that document the results of the integration of computers into society. In this, we looked for things that were widely used and critically important where they were applied.

We include milestones that were important “firsts,” from which other milestones cascaded or from which important developments derive.

We include milestones that resonated with the general public so strongly that they influenced behavior or thinking. For example, HAL 9000 resonates to this day even for people who haven’t seen the movie 2001: A Space Odyssey.

We include milestones that are on the critical path of current capabilities, beliefs, or application of computers and associated technologies, such as the invention of the integrated circuit.

We include milestones that are likely to become a building block for future milestones, such as using DNA for data storage.

And finally, we felt it appropriate to illuminate a few milestones that have yet to occur. They are grounded in enough real-world technical capability, observed societal urges, and expertise by those who make a living looking to the future, as to manifest themselves in some way—even if not exactly how we portray them.

Some readers may be confused by our use of the word kibibyte, which means 1,024 bytes, rather than kilobyte, which literally means 1,000 bytes. For many years, the field of information technology used the International System of Units (SI) prefixes incorrectly, using the word kilobyte to refer to both. This caused a growing amount of confusion that came to a head in 1999, when the General Conference on Weights and Measures formally adopted a new set of prefixes (kibi-, mebi-, and gibi-) to accurately denote binary magnitudes common in computing. We therefore use those terms where appropriate.
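The arithmetic behind the distinction is easy to check; a quick Swift illustration:

```swift
// Illustrative arithmetic: SI (decimal) prefixes vs. binary prefixes.
let kilobyte = 1_000                         // kilo = 10^3
let kibibyte = 1 << 10                       // kibi = 2^10 = 1,024
let megabyte = 1_000_000                     // mega = 10^6
let mebibyte = 1 << 20                       // mebi = 2^20 = 1,048,576

print(kibibyte - kilobyte)                   // 24: the gap per "K"
print(Double(mebibyte) / Double(megabyte))   // 1.048576: the gap widens at each step
```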

The evolution of computing has been a global project with contributions from many countries. While much of this history can be traced to the United States and the United Kingdom, we have worked hard to recognize contributions from countries around the world. We have also included the substantial achievements of women computing pioneers. The world’s first programmer was a woman, and many innovative programmers in the 1940s and 1950s were women as well.

Looking back over the collection of 250 milestones, we see some lessons that have emerged that transcend time and technology:

The computer is devouring the world. What was once a tool for cracking Nazi codes and designing nuclear bombs has found its way into practically every aspect of the human and nonhuman experience on the planet. Today computers are aggressively shedding their ties to mundane existence in machine rooms and on the desk: they drive around our cities, they fly, they travel to other worlds and even beyond the solar system. People created computers to process information, but no longer will they reside in that box; computers will inherit the world.

The industry relies on openness and standardization. The steady push for these qualities has benefitted both users and the industry at large. It’s obvious how openness benefits users: open systems and common architectures make it possible for customers to move from one system to another, which forces vendors to compete on price and innovate in performance. This relentless competition has frequently brought new companies and new capital into the market—and frequently killed firms that couldn’t keep up. And yet, this competition has also been good for the vendors themselves, allowing companies that were more efficient, or had superior technology, to readily acquire new customers.

There is a heavy economic and innovation penalty for secrecy. Computing systems enable the free flow of information, but success in computing requires it. We noted how the UK ceded what could have become an insurmountable lead over the Americans in commercializing computing technology in the 1950s, as well as in cryptographic technology in the 1970s, because of wartime restrictions that reflected an earlier era. The US went down a different path: the multidecade collaboration between the US government and private American firms didn’t just benefit the US military and domestic economy, it benefited the world, because those innovations were widely shared outside the confines of their original purpose. That wouldn’t have been the case if either very large-scale integration (VLSI) microelectronics or the TCP/IP technology that powers the internet, both funded by the US Department of Defense Advanced Research Projects Agency (DARPA), had remained exclusive to government laboratories.

Invention and innovation are incredibly different. It’s not just enough to come up with a new idea; it takes time and effort to move that idea into the market and the public’s hands. Perhaps the best example here is Xerox®, which invented the modern personal computer in the early 1970s—including the graphical user interface (GUI), the word processor, the laser printer, and even the Ethernet local area network—and then failed to market the technology or make it affordable. In the end, Xerox didn’t just lose the market — it lost its inventors, who moved to positions at companies such as Apple and Microsoft where they believed they could make a difference.

Any way you look at it, the development of the computer is one of the most important events in the history of the planet. Our lives are now inextricably linked with it, and it is possible that intelligent machines will outlast our own species. So to all of the future AIs who might be reading this book, we hope that you find reading it as enjoyable as we humans found writing it.”