Categories
Artificial Intelligence, Cloud, Data Science - Big Data, DevSecOps-Security-Privacy, History, Networking, Software Engineering

IBM History

Return to Timeline of the History of Computers, This Year in History or History

“Nobody ever lost their job for recommending the purchase of IBM products.” —COMPUTER INDUSTRY FOLK WISDOM

“More than any other company since World War II, IBM has shaped the way the modern world goes about its business. Large corporations and governments began to use IBM’s products before 1900. Its computers served as global computing gearboxes for decades before the public “discovered” the Internet in the 1990s. Many of IBM’s computers had been part of the Internet since the early 1970s and part of even older networks since the 1960s. The US census of 1890 was the first in the world to be done using automation tools — the punch card — and that too came from what would come to be IBM. For a long time, the company has been at the center of much of what makes a modern society function.” Fair Use Source: B08BSXJCBP

“By working in conference rooms and data centers for over a century, IBM made this achievement possible. For that reason, few people outside those two places knew what it did, or how. They just knew that it was big, important, and usually well run. What they understood was largely the product of a century-long marketing and public relations campaign by IBM to manage carefully what we imagine when thinking about the firm. Its influence proved so powerful for so long that whenever there were problems at IBM — and there always seemed to be — the information technology world was affected, including the operation of large enterprises and government agencies, stock markets, and even how national governments armed themselves for global wars.” (B08BSXJCBP)

“So what? We live in an increasingly dangerous world, profoundly influenced by computing, so understanding the role of one of the world’s most important providers of such technologies is crucial and urgent. We face three problems: ongoing acts of terrorism; a cyberwar involving the United States, Russia, and China but also affecting other countries caught in the crossfire, evidenced by cyber attacks on German elections, Chinese hacking of companies, and” the alleged hoax of “Russian influence on the U.S. presidential election in 2016, for example; and a global political and economic environment that is becoming increasingly uncertain as nations flirt with trade restrictions and efforts to keep jobs from migrating to other countries.” (B08BSXJCBP)

IBM has been at the heart of outsourcing most of its American and European jobs to low-cost, “slave wage” labor in Communist China and India.

“In the thick of all these conditions, information processing plays a profound role, and in the middle of that role stands a few technology companies, notably IBM. Which would be more important for the security of a nation under a cyberattack, IBM or Netflix, IBM or Apple? For decades, commercial enterprises and government agencies in the United States and in other nations considered IBM a national treasure.” (B08BSXJCBP)

It is no longer true that IBM is a so-called “national treasure,” since IBM, with the help of the UniParty of Democrats and Republicans, outsourced the vast majority of its American jobs to “slave wage” countries like India and Communist China.

“When the West needed computing for national defense, it turned to IBM. In World War II, IBM provided the Allies with machines to organize national economies for the war effort; in the Cold War, it implemented a national air defense system, assisted in making “space travel” possible, and did intelligence work. IBM has nearly a century of experience dealing with Russian counterintelligence operations—today’s hacking and intelligence operations are not new to it.” (B08BSXJCBP)

IBM, like the rest of Big Tech (Google, Amazon, Apple, Microsoft, Facebook), at best ignores, and at worst is indirectly or even directly complicit in, the military hacking and intelligence operations of Communist China and its ChiCom state-sponsored companies. This is due to Big Tech’s close, embedded work with the Chinese Communist government and its “companies.”

“We again face a time when many countries need the skills long evident at IBM. Nevertheless, it is a company that has suffered chronic problems, a malaise that while it tries to shake it off leaves open questions about its long-term viability. Understanding what this company is capable of doing begins by appreciating its history. Such insight helps employees, citizens, companies, and entire industries and nations understand what they can do to ensure that IBM is there when they need it. The company is too important to do otherwise. That is what led me to write this book.” (B08BSXJCBP)

“IBM is a company that has a century-long history of not being generous in explaining how it interacts with the world. Like most large multinational corporations, it works to control what the public knows about it, including its global practices. Why, for example, several years ago, was IBM willing to share with China the guts of some of its critical software in exchange for being allowed to sell in that country?” (B08BSXJCBP)

Big Tech, especially Google and IBM, is completely in bed with the Chinese Communist Party and its apparatchiks and nomenklatura.

“Why does it have a history of also doing confidential work for the U.S. intelligence and military communities? During World War II, when it was a ‘tiny company’, the Allies and the Axis” (IBM helped the National Socialists or Nazis) “used its products. Is IBM as American a company as it was 30 or 50 years ago? With an estimated 75 percent of its workforce now located outside the United States, some tough questions have to be asked. Such national security interests are addressed in this book and head-on in the last chapter, because this company may be one of those too critical to allow to fail.” (B08BSXJCBP)

Too Big to Fail: Too critical to the Chinese Communists and India?

“Business historians, economists, and business management professors have their own concerns as well. Scholars and journalists have studied IBM for decades. Historians are interested in how large corporations function, why they exist for decades, their effects on national economies, and how they influence their own industries. A crucial question raised by IBM’s experience is how it became an iconic company yet also experienced periods of severe business crises that nearly killed it. Across all of IBM’s history, nearly lethal troubles accompanied its successes. How could that be? What lessons for other firms can IBM’s story teach? What can be learned that scholars and managers can apply in their explorations of how other firms flourished, failed, or are floundering? Answering such questions is central to this book.” (B08BSXJCBP)

“IBM’s influence on our lives is significant, but the company remains little appreciated. Occasionally we hear about it, such as when its stock goes up or down, in the 1980s when it introduced the world to the term “Personal Computer” and in the process made it now “O.K.” for corporations, not just geeks and commercial artists, to use PCs. Did you know that selling computers is now the tiniest piece of IBM’s business?” (B08BSXJCBP)

Especially after IBM sold its PC business to the Chinese Communist, Beijing-based Lenovo.

“Did you know that it is the world’s largest software firm, or that it operates in 178 countries? Did you know that it almost went out of business several times, including as recently as 1993? Or that as this book was being written in 2017, observers thought IBM was on a slow march to extinction while still generating billions of dollars in profits each year? It is time to pull aside the veil to see how this fascinating and powerful company was able to thrive for over a century while being both respected and disliked, and to understand what essentially has been its positive impact on the world while at the same time it demonstrated toughness against its enemies and in its constant battle to survive and thrive.” (B08BSXJCBP)

“Today IBM functions under ugly storm clouds, but let a blogger friendly to it describe what I mean: “International Business Machines might be the most iconic company in the entire multitrillion-dollar tech industry. For decades, its name was synonymous with technology, to the point where ‘IBM’ was all but shorthand for computing hardware. Its century-plus history might even make it the oldest tech company in a world where tech titans rise and fall every few years. It’s also one of the world’s largest tech companies, trailing only a handful of others in the global market-cap rankings.” Here is the clincher: “But it’s probably bound to be the worst-performing tech stock on the Dow Jones Industrial Average for the foreseeable future. High performance isn’t a requirement to remain in the Dow, but if IBM can’t do something about its flatlining revenue, it might eventually force the Dow’s handlers to do the unthinkable and replace it with a more appropriate company.”1 What is going on?” (B08BSXJCBP)

“One of the important, little understood findings presented in this book is the profound influence of prior events on what the company does today. Some of its long-serving senior executives are aware, for example, that our grandparents received Social Security payments because of IBM, since nobody else at the time could calculate and print checks quickly enough, or in the millions needed, permanently assisting millions of older Americans out of poverty. Many are aware that IBM could radically define and then build computers that do what one expected of them, thanks to a “bet your company” life-threatening decision in the 1960s that led the majority of the world’s large organizations to finally start using computers. IBM employees wrote software and managed its implementation so that humans could “go to the moon” for the first time and be brought safely back to earth. They are aware that it was IBM’s introduction of the PC in 1981, not Apple’s introduction of the Macintosh, that led the world to finally embrace this technology by the hundreds of millions. It is a company taking the half-century promise of artificial intelligence and turning it into actions that smartly do things humans cannot do, such as advise a doctor based on all human knowledge of a medical condition or calculate more precise weather forecasts. This is happening now, and IBM is making millions of dollars providing such capabilities. We do not know whether IBM is going to be around in 20 or 100 years, but we do know that it is a large, technologically muscular company in the thick of what is going on with computing. Generations of managers, economists, and professionals, and tens of millions of customers, knew about the role of this company during the twentieth century. Now the rest of us should, too.” (B08BSXJCBP)

“What made IBM iconic included technological prowess, enormous business success, massive visibility, and hundreds of thousands of aggressive, smart, ambitious men and women used to success and always fearful of failure. It was the “IBM Way.” For over a half century, it was said no worker ever lost their job for recommending that their firm acquire IBM’s products, because those products normally worked. IBMers would make them work, and “everyone” seemed to think IBM was one of the best-run firms in the world. They joked about IBMers as too serious, focused, polished in their presentations, and facile in dealing with all manner of technology. Competitors feared and hated them; customers accepted them as the safe bet.” (B08BSXJCBP)

“IBM’s iconic role thus left IBMers, their customers, and the public in dozens of countries ill prepared for its near-death experience in the early 1990s. A fired CEO, John F. Akers, almost went into hiding; he never spoke publicly of IBM for the rest of his life. His successor, Louis V. Gerstner Jr., observed the IBM culture as a customer and now had to face a depressed yet combative workforce. He had worked at Nabisco as a turnaround leader and came into IBM as the butt of cookie jokes but with the hope that he could save the firm. He brought the company back to iconic status. Afterward he reported that the biggest problem he faced was IBM’s culture, invented by Thomas Watson Sr. and his son Thomas Watson Jr., remade partly by Charlie Chaplin’s character the “Little Tramp,” and battered by hundreds of competitors, including Steve Jobs at Apple. To any IBM employee, the company always felt small, because it was a firm filled with characters, more a collection of fantastic personalities than a faceless corporation, an ecosystem with its own culture.” (B08BSXJCBP)

“IBM’s corporate culture is central in understanding much about “Big Blue.” That is also a clue for answering a central question about IBM: How is it that a company viewed as so stable and reliable for decades had so many ups and downs over the course of its 130-year history? The company’s history from its origins in the 1880s to the 1970s was essentially a story of repeated successes, despite enormous difficulties. By the end of the 1970s, however, the company had entered a new era in which it was now large, difficult to run, and slow to make decisions and to take timely actions, and so its subsequent history took on a very different tone. It continued to grow, shrink, reconfigure itself, grow again, and spin off vast sums of profitable revenue while laying off tens of thousands of employees almost without the public hearing about it. How could that be? Observers had been predicting its demise since the mid-1960s, loudly in the early 1990s, and again after 2012. Yet there it stood as this book was being published: bloodied, anemic, slow to move, and grey around the cultural temples but also vigorous, employing vast numbers of young employees around the world while having shed tens of thousands of older ones” (B08BSXJCBP), (Meaning IBM, like all of Big Tech, especially Facebook and Google, is focused on using young “wage slaves” from Communist China and India) “financially sound, and still a major player in one of the world’s most important industries. Again, how could that be? Our purpose is to answer that question.” (B08BSXJCBP)

Fair Use Source: Primary B08BSXJCBP

Secondary Fair Use Sources:

B07C2NQSPV, B07XVF5RSP

Categories
History

First Disk Storage Unit – 1956 A.D.

Return to Timeline of the History of Computers

1956

First Disk Storage Unit

Reynold B. Johnson (1906–1998)

“Faster than tape but slower than main memory, magnetic disk drives have been an important part of computing since they were invented by IBM and publicly demonstrated on September 14, 1956.

The IBM 305 RAMAC (Random Access Method of Accounting and Control) was designed to store accounting and inventory files that had previously been stored as boxes of IBM punch cards or on tape. To do this, the RAMAC shipped with the IBM 350 disk storage unit, a new device that stored data on 50 spinning disks, each 24 inches (61 centimeters) in diameter and revolving at 1,200 revolutions per minute (RPM). Arranged in 100-character blocks that could be randomly accessed, read, and rewritten, the RAMAC made it possible for a computer with only a few kilobytes of main memory to rapidly access 5 million characters—the equivalent of 64,000 punch cards.

Unlike modern drives, which have a head for every disk, the RAMAC had a single head that moved up and down to select the disk, and then in and out to select the specific block where data would be read or written. The average access time was six-tenths of a second.

The RAMAC also came equipped with a rotating drum memory that spun at 6,000 RPM and stored 3,200 characters on 32 tracks of 100 characters each.

Over the 60 years that followed, the capacity of disk-drive systems increased from 3 megabytes to 10 terabytes—a factor of 3 million—thanks to improvements in electronics, magnetic coatings, drive heads, and mechanical head-positioning systems. But the time it takes for the disk to reposition its head to read the data, something called the seek time, only dropped from an average of 600 milliseconds to 4.16 milliseconds, a factor of just 144. That’s because reducing seek times depended on improving mechanical systems, which, unlike electronics, are subject to constraints resulting from friction and momentum: in all the years since the RAMAC was introduced, rotation rates have only increased from 1,200 RPM to 10,000 RPM for even the most expensive hard drives.”
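
The arithmetic behind these comparisons is easy to check. Below is a minimal sketch (not from the source) that reproduces the quoted ratios, using the figures given above and, for the punch-card equivalence mentioned earlier, an assumed standard 80-column IBM card.

```python
# A minimal sketch (assumptions noted) checking the figures quoted above.

ramac_capacity_mb = 3            # megabytes, the starting figure used in the passage
modern_capacity_mb = 10_000_000  # 10 terabytes expressed in megabytes
print(f"capacity growth: ~{modern_capacity_mb / ramac_capacity_mb:,.0f}x")  # ~3.3 million

ramac_seek_ms = 600.0            # average access time of the RAMAC
modern_seek_ms = 4.16            # average seek time of a modern drive
print(f"seek-time improvement: ~{ramac_seek_ms / modern_seek_ms:.0f}x")     # ~144

# Punch-card equivalence from the earlier paragraph: 5 million characters at
# 80 characters per card (an assumed standard IBM card) is roughly 62,500 cards,
# in line with the "64,000 punch cards" figure quoted above.
print(5_000_000 // 80, "cards")
```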

SEE ALSO Magnetic Tape Used for Computers (1951), Floppy Disk (1970), Flash Memory (1980)

“The RAMAC actuator and disk stack, with fifty 24-inch (61-centimeter) disks spinning at 1,200 RPM, held 5 million characters of information.”

Fair Use Source: B07C2NQSPV

Categories
History, Software Engineering

Actual Bug Found – First “Debugging” – 1947 A.D.

Return to Timeline of the History of Computers

1947

Actual Bug Found

Howard Aiken (1900–1973), William “Bill” Burke (dates unavailable), Grace Murray Hopper (1906–1992)

“Harvard professor Howard Aiken completed the Mark II computer in 1947 for the Naval Proving Ground in Dahlgren, Virginia. With 13,000 high-speed electromechanical relays, the Mark II processed 10-digit decimal numbers, performed floating-point operations, and read its instructions from punched paper tape. Today we still use the phrase “Harvard architecture” to describe computers that separately store their programs from their data, unlike the “von Neumann” machines that store code and data in the same memory.

But what makes the Mark II memorable is not the way it was built or its paper tape, but what happened on September 9, 1947. On that day at 10:00 a.m., the computer failed a test, producing the number 2.130476415 instead of 2.130676415. The operators ran another test at 11:00 a.m., and then another at 3:25 p.m. Finally, at 3:45 p.m., the computer’s operators, including William “Bill” Burke, traced the problem to a moth that was lodged inside Relay #70, Panel F. The operators carefully removed the bug and affixed it to the laboratory notebook, with the notation “First actual case of bug being found.”

Burke ended up following the computer to Dahlgren, where he worked for several years. One of the other operators was the charismatic pioneer Grace Murray Hopper, who had volunteered for the US Navy in 1943, joined the Harvard staff as a research fellow in 1946, and then moved to the Eckert-Mauchly Computer Corporation in 1949 as a senior mathematician, where she helped the company to develop high-level computer languages. Grace Hopper didn’t actually find the bug, but she told the story so well, and so many times, that many histories now erroneously credit her with the discovery. As for the word bug, it had been used to describe faults in machines as far back as 1875; according to the Oxford English Dictionary, in 1889, Thomas Edison told a journalist that he had stayed up two nights in a row discovering, and fixing, a bug in his phonograph.”
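
As a side note to the Harvard-architecture remark above, here is a minimal sketch (not from the source; the class names are purely illustrative) of the distinction: a Harvard-style machine keeps its program and its data in separate stores, while a von Neumann machine keeps both in a single memory.

```python
# Illustrative only: two toy "machines" showing where code and data live.

class HarvardMachine:
    def __init__(self, program, data):
        self.program = list(program)  # instruction store (e.g., punched paper tape)
        self.data = list(data)        # entirely separate data store

class VonNeumannMachine:
    def __init__(self, program, data):
        self.memory = list(program) + list(data)  # code and data share one memory

harvard = HarvardMachine(["LOAD", "ADD", "STORE"], [2.130676415])
von_neumann = VonNeumannMachine(["LOAD", "ADD", "STORE"], [2.130676415])
print(harvard.program, harvard.data)  # two separate stores
print(von_neumann.memory)             # one unified store
```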

SEE ALSO COBOL Computer Language (1960)

“The moth found trapped between points at Relay #70, Panel F, of the Mark II Aiken Relay Calculator while it was being tested at Harvard University. The operators affixed the moth to the computer log with the entry “First actual case of bug being found.””

Fair Use Source: B07C2NQSPV

Categories
History, Software Engineering

Binary-Coded Decimal – 1944 A.D.

Return to Timeline of the History of Computers

1944

Binary-Coded Decimal

Howard Aiken (1900–1973)

“There are essentially three ways to represent numbers inside a digital computer. The most obvious is to use base 10, representing each of the numbers 0–9 with its own bit, wire, punch-card hole, or printed symbol (e.g., 0123456789). This mirrors the way people learn and perform arithmetic, but it’s extremely inefficient.

The most efficient way to represent numbers is to use pure binary notation: with binary, n bits represent 2^n possible values. This means that 10 wires can represent any number from 0 to 1023 (2^10 − 1). Unfortunately, it’s complex to convert between decimal notation and binary.

The third alternative is called binary-coded decimal (BCD). Each decimal digit becomes a set of four binary digits, representing the numbers 1, 2, 4, and 8, and counting in sequence 0000, 0001, 0010, 0011, 0100, 0101, 0110, 0111, 1000, and 1001. BCD is four times more efficient than base 10, yet it’s remarkably straightforward to convert between decimal numbers and BCD. Further, BCD has the profound advantage of allowing programs to exactly represent the numeric value 0.01—something that’s important when performing monetary computations.

Early computer pioneers experimented with all three systems. The ENIAC computer built in 1943 was a base 10 machine. At Harvard University, Howard Aiken designed the Mark 1 computer to use a modified form of BCD. And in Germany, Konrad Zuse’s Z1, Z2, Z3, and Z4 machines used binary floating-point arithmetic.

After World War II, IBM went on to design, build, and sell two distinct lines of computers: scientific machines that used binary numbers, and business computers that used BCD. Later, IBM introduced System/360, which used both methods. On modern computers, BCD is typically supported with software, rather than hardware.

In 1972, the US Supreme Court ruled that computer programs could not be patented. In Gottschalk v. Benson, the court ruled that converting binary-coded decimal numerals into pure binary was “merely a series of mathematical calculations or mental steps, and does not constitute a patentable ‘process’ within the meaning of the Patent Act.”
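
To make the representations discussed above concrete, here is a minimal sketch (not from the source; the helper names are illustrative) that encodes a number in pure binary and in BCD, and then performs the BCD-to-binary conversion at issue in Gottschalk v. Benson.

```python
# Illustrative BCD helpers: four bits per decimal digit.

def to_bcd(n: int) -> str:
    """Encode a non-negative integer as BCD, one 4-bit group per decimal digit."""
    return " ".join(format(int(digit), "04b") for digit in str(n))

def from_bcd(bcd: str) -> int:
    """Decode a BCD string (space-separated 4-bit groups) back to an integer."""
    return int("".join(str(int(group, 2)) for group in bcd.split()))

n = 1956
print(format(n, "b"))       # pure binary: 11110100100 (11 bits)
print(to_bcd(n))            # BCD: 0001 1001 0101 0110 (16 bits)

# BCD-to-pure-binary conversion, the "series of mathematical calculations"
# the Supreme Court ruled unpatentable in Gottschalk v. Benson:
print(format(from_bcd("0001 1001 0101 0110"), "b"))  # 11110100100
```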

SEE ALSO Binary Arithmetic (1703), Floating-Point Numbers (1914), IBM System/360 (1964)

Howard Aiken inspects one of the four paper-tape readers of the Mark 1 computer.

Fair Use Source: B07C2NQSPV

Categories
History

ENIAC – 1943 A.D.

Return to Timeline of the History of Computers

1943

ENIAC

John Mauchly (1907–1980), J. Presper Eckert (1919–1995)

Four ENIAC panels and one of its three function tables, on display at the School of Engineering and Applied Science at the University of Pennsylvania

“ENIAC was the first electronic computer, which means it computed with tubes rather than relays. Designed by John Mauchly and J. Presper Eckert at the Moore School of Electrical Engineering at the University of Pennsylvania, ENIAC had 17,468 vacuum tubes, was 8 feet (2.4 meters) high by 3 feet (0.9 meters) deep by 100 feet (30.5 meters) long, and weighed more than 30 tons.

Glen Beck (background) and Betty Snyder (foreground) program ENIAC in BRL building 328. (U.S. Army photo, ca. 1947-1955)

ENIAC had an IBM punch-card reader for input and a card punch for output, but the machine had no memory for data or programs. Instead, numbers under calculation were kept in one of the computer’s 20 accumulators, each of which could store 10 decimal digits and perform addition or subtraction. Other hardware could perform multiplication, division, and even square roots. ENIAC wasn’t programmed in today’s sense. Instead, a set of panels had 1,200 10-position rotary switches that would energize different circuits in a specific sequence, causing electronic representations of numbers to flow through different parts of the machine at predetermined times, and leading the machine computation to take place.
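
As a rough illustration of the accumulators described above, here is a minimal sketch (not from the source; ENIAC’s actual ring counters and pulse timing are not modeled) of a ten-digit decimal register that supports only addition and subtraction, wrapping on overflow.

```python
# Illustrative ten-digit decimal accumulator, loosely modeled on the description above.

class DecimalAccumulator:
    DIGITS = 10                  # each ENIAC accumulator held 10 decimal digits
    MODULUS = 10 ** DIGITS

    def __init__(self) -> None:
        self.value = 0

    def add(self, n: int) -> None:
        self.value = (self.value + n) % self.MODULUS      # overflow wraps around

    def subtract(self, n: int) -> None:
        self.value = (self.value - n) % self.MODULUS

acc = DecimalAccumulator()
acc.add(9_999_999_999)
acc.add(1)            # carries past the ten-digit limit
print(acc.value)      # 0
```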

Programmers Betty Jean Jennings (left) and Fran Bilas (right) operate ENIAC’s main control panel at the Moore School of Electrical Engineering. (U.S. Army photo from the archives of the ARL Technical Library)

ENIAC was built to perform complex ballistics calculations for the US Army, but John von Neumann (1903–1957) at the Manhattan Project learned about ENIAC, so the machine’s first official use was actually to perform computations for the development of the hydrogen bomb.

Cpl. Irwin Goldstein (foreground) sets the switches on one of ENIAC’s function tables at the Moore School of Electrical Engineering. (U.S. Army photo)[24]

Ironically, the men who built the hardware never considered the need for, or the complexity of, programming the machine. They left the job of making the machine actually calculate to six human computers: Frances “Betty” Snyder Holberton (1917–2001), Betty “Jean” Jennings Bartik (1924–2011), Kathleen McNulty Mauchly Antonelli (1921–2006), Marlyn Wescoff Meltzer (1922–2008), Ruth Lichterman Teitelbaum (1924–1986), and Frances Bilas Spence (1922–2012).

Those women, some of the world’s first programmers, had to devise and then debug their own algorithms. But the women were not acknowledged in their own time. In 2014, Kathy Kleiman produced the documentary The Computers, which finally told the women’s story.”

Parts from four early computers, 1962. From left to right: ENIAC board, EDVAC board, ORDVAC board, and BRLESC-I board, showing the trend toward miniaturization.
ENIAC on a Chip, University of Pennsylvania (1995) – Computer History Museum

SEE ALSO First Recorded Use of the Word Computer (1613), EDVAC First Draft Report (1945)

“ENIAC, the first electronic computer, was built to perform calculations for the US Army. Pictured operating the machine are Corporal Irwin Goldstein, Private First Class Homer Spence, Betty Jean Jennings, and Frances Bilas.”

Fair Use Source: B07C2NQSPV

Categories
Data Science - Big Data, History

Herman Hollerith Tabulating the US Census – 1890 A.D.

Return to Timeline of the History of Computers

1890

Tabulating the US Census

Herman Hollerith (1860–1929)

Herman Hollerith circa 1888

Herman Hollerith (February 29, 1860 – November 17, 1929) was an American businessman, inventor, and statistician who developed an electromechanical tabulating machine for punched cards to assist in summarizing information and, later, in accounting. His invention of the punched card tabulating machine, patented in 1889, marks the beginning of the era of semiautomatic data processing systems, and his concept dominated that landscape for nearly a century.[1][2]

Hollerith founded a company that was amalgamated in 1911 with several other companies to form the Computing-Tabulating-Recording Company. In 1924, the company was renamed “International Business Machines” (IBM) and became one of the largest and most successful companies of the 20th century. Hollerith is regarded as one of the seminal figures in the development of data processing.[3]

“When the US Constitution was ratified, it mandated that the government conduct an “actual enumeration” of every free person in the union every 10 years. As the number of people in the nation grew, the enumeration took longer and longer to complete. The 1880 Census counted 50,189,209 people. It took 31,382 people to perform the count and eight years to tabulate the results, producing 21,458 pages of published reports. So, in 1888, the Census Bureau held a competition to find a better way to process and tabulate the data.

American inventor Herman Hollerith had worked briefly at the Census Bureau prior to the 1880 census and in 1882 joined the faculty of MIT, where he taught mechanical engineering and experimented with mechanical tabulation systems. His early systems used long rolls of paper tape with data represented as punched holes. Then, on a railroad trip to the American West, Hollerith saw how conductors made holes on paper tickets corresponding to a person’s hair color, eye color, and so on, so that tickets couldn’t be reused by other passengers. Hollerith immediately switched his systems to use paper cards.”

Replica of Hollerith tabulating machine with sorting box, circa 1890. The “sorting box” was an adjunct to, and controlled by, the tabulator. The “sorter”, an independent machine, was a later development.[11]

“Hollerith entered the 1888 competition and won, his system being dramatically faster than those of the two other entrants. On January 8, 1889, he was awarded a US patent on “method, system and apparatus for compiling statistics,” originally filed September 23, 1884.”

Hollerith punched card

“Hollerith’s system consisted of a slightly curved card measuring 3.25 by 7.375 inches (83 millimeters by 187 millimeters). A human operator punched holes in the card with a device called a Pantographic Card Punch, with holes in specific locations to signify a person’s gender, marital status, race, ownership and indebtedness of farms and homes, and other information. For tabulation, the cards were passed through a reader with micro switches to detect the presence of holes and electromechanical circuits to perform the actual tabulation.”
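
The encoding and tabulating scheme described above maps naturally onto a small sketch. The following is illustrative only (not from the source, and not Hollerith’s actual card layout): each attribute value gets a fixed hole position, a card is the set of positions punched for one person, and tabulation is a count of the holes seen at each position.

```python
# Illustrative punched-card tabulation; field names and positions are made up.

from collections import Counter

POSITIONS = {
    ("sex", "male"): 0, ("sex", "female"): 1,
    ("marital", "single"): 2, ("marital", "married"): 3,
}

def punch_card(person: dict) -> set:
    """Return the set of hole positions punched for one person."""
    return {POSITIONS[(field, value)] for field, value in person.items()}

cards = [
    punch_card({"sex": "female", "marital": "married"}),
    punch_card({"sex": "male", "marital": "single"}),
    punch_card({"sex": "female", "marital": "single"}),
]

# The tabulator keeps one counter per hole position and advances it
# whenever a passing card has a hole there.
tally = Counter(position for card in cards for position in card)
for (field, value), position in POSITIONS.items():
    print(field, value, tally[position])
```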

SEE ALSO The Jacquard Loom (1801), ENIAC (1943)

“A woman with a Hollerith Pantographic Card Punch, which creates holes in specific locations to signify a person’s gender, marital status, and other information. This photo is from the 1940 US census.”

Fair Use Source: B07C2NQSPV