The Hewlett-Packard Company, commonly shortened to Hewlett-Packard or HP (/ˈhjuːlɪt ˈpækərd/ HEW-lit PAK-ərd), was an American multinational information technology company headquartered in Palo Alto, California, that developed and provided a wide variety of hardware components, as well as software and related services, to consumers, small and medium-sized businesses (SMBs) and large enterprises, including customers in the government, health and education sectors. The company was founded in a one-car garage in Palo Alto, California by Bill Hewlett and David Packard in 1939, and initially produced a line of electronic test and measurement equipment. The HP Garage at 367 Addison Avenue is now designated an official California Historical Landmark, and is marked with a plaque calling it the “Birthplace of ‘Silicon Valley’”.
HP is the third oldest IT company after IBM and Fujitsu.
The company got its first big contract in 1938, providing its test and measurement instruments for production of Walt Disney’s hugely successful animated film Fantasia. This success led Hewlett and Packard to formally establish the Hewlett-Packard Company on January 1, 1939. The company grew into a multinational corporation widely respected for its products and for its management style and culture, known as the HP Way, which was adopted by other businesses worldwide. HP was the world’s leading PC manufacturer from 2007 until the second quarter of 2013, when Lenovo moved ahead of HP.[1][2][3] HP specialized in developing and manufacturing computing, data storage, and networking hardware, designing software and delivering services. Major product lines included personal computing devices, enterprise and industry-standard servers, related storage devices, networking products, software and a diverse range of printers and other imaging products. HP marketed its products directly to households, small- to medium-sized businesses and enterprises, as well as via online distribution, consumer-electronics and office-supply retailers, software partners and major technology vendors. HP also offered services and a consulting business for its products and partner products.
In 1999, Hewlett-Packard Company spun off its electronic and bio-analytical test and measurement instruments business as Agilent Technologies; HP retained focus on its later products, including computers and printers. It merged with Compaq in 2002, and acquired EDS in 2008, leading to combined revenues of $118.4 billion that year and a Fortune 500 ranking of 9 in 2009. In November 2009, HP announced its acquisition of 3Com,[4] with the deal closing on April 12, 2010.[5] On April 28, 2010, HP announced its buyout of Palm, Inc. for $1.2 billion.[6] On September 2, 2010, HP won its bidding war for 3PAR with a $33 a share offer ($2.07 billion), which Dell declined to match.[7]
On November 1, 2015, the company spun off its enterprise products and services business Hewlett Packard Enterprise. Hewlett-Packard retained the personal computer and printer businesses and was renamed HP Inc.[8]
Fujitsu Limited (富士通株式会社, Fujitsū Kabushiki-gaisha) is a Japanese multinational information technology equipment and services company headquartered in Tokyo.[3] In 2018, it was the world’s fourth-largest IT services provider measured by global IT services revenue (after IBM, Accenture and AWS).[4] Fortune named Fujitsu as one of the world’s most admired companies[5] and a Global 500 company.[6]
Fujitsu mainly makes computing products, but the company and its subsidiaries also offer a diverse range of products and services in the areas of personal computing, enterprise computing, including x86, SPARC and mainframe-compatible server products, as well as storage products, telecommunications, advanced microelectronics, and air conditioning. It has approximately 140,000 employees, and its products and services are available in over 100 countries.[2]
Fujitsu is the second-oldest IT company after IBM and before Hewlett-Packard, established on June 20, 1935,[7] under the name Fuji Telecommunications Equipment Manufacturing (富士電気通信機器製造, Fuji Denki Tsūshin Kiki Seizō), as a spin-off of the Fuji Electric Company, itself a joint venture founded in 1923 between the Furukawa Electric Company and the German conglomerate Siemens. Despite its connections to the Furukawa zaibatsu, Fujitsu escaped the Allied occupation of Japan after the Second World War mostly unscathed.
In 1954, Fujitsu manufactured Japan’s first computer, the FACOM 100 mainframe,[8][9] and in 1961 launched its second-generation (transistorized) computer, the FACOM 222 mainframe.[10] The 1968 FACOM 230 “5” Series marked the beginning of its third-generation computers.[11] Fujitsu offered mainframe computers from 1955 until at least 2002.[12] Fujitsu’s computer products have included minicomputers,[13] small business computers,[14] servers[15] and personal computers.[16]
John Bardeen (1908–1991), Walter Houser Brattain (1902–1987), William Shockley (1910–1989)
“A transistor is an electronic switch: current flows from one terminal to another unless voltage is applied to a third terminal. Combined with the laws of Boolean algebra, this simple device has become the building block for microprocessors, memory systems, and the entire computer revolution.
Any technology that can use one signal to switch another on and off can be used to create a computer. Charles Babbage did it with rods, cogs, and steam power. Konrad Zuse and Howard Aiken did it with relays, and ENIAC used tubes. Each technology was faster and more reliable than the previous.
Likewise, transistors have several advantages over vacuum tubes: they use less power, so they generate less heat; they switch faster; and they are less susceptible to physical shock. All of these advantages arise because transistors are smaller than tubes—and the smaller the transistor, the bigger the advantage.
Modern transistors trace their lineage back to a device manufactured by John Bardeen, Walter Brattain, and William Shockley at AT&T’s Bell Laboratories in 1947. The team was trying to build an amplifier that could detect ultra-high frequency radio waves, but the tubes that they had just weren’t fast enough. So they tried working with semiconductor crystals, as radios based on semiconductor diodes called cat’s whiskers had been used since nearly the birth of radio in the 1890s.
A cat’s whisker radio uses a sharp piece of wire (the “whisker”) that’s jabbed into a piece of semiconducting germanium; by moving the wire along the semiconductor and varying the pressure, the semiconductor and the wire work together to create a diode, a device allowing current to pass in only one direction. The Bell Labs team built a contraption that attached two strips of gold foil to the crystal and then applied power to the germanium. The result was an amplifier: a signal injected into one wire was stronger when it came out of the other. Today we call this device a point-contact transistor.
For their discovery of the transistor, Bardeen, Brattain, and Shockley were awarded the Nobel Prize in 1956.”
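The excerpt’s central idea, that a transistor is a switch and that switches combined under the laws of Boolean algebra yield arbitrary logic, can be made concrete with a short sketch. The Python below is an idealization for illustration only: transistor models a voltage-controlled switch, and nand wires two such switches in series, loosely echoing how real gates are built. The function names and construction are ours, not from the text, and this is not a circuit simulation.

# A toy model: a transistor as a voltage-controlled switch,
# composed into Boolean logic gates.

def transistor(gate_voltage: bool, source: bool) -> bool:
    """An idealized switch: the source signal passes through
    only while the gate (control) voltage is applied."""
    return source and gate_voltage

def nand(a: bool, b: bool) -> bool:
    # Two switches in series conduct only when both inputs
    # are on; inverting that gives NAND.
    return not transistor(a, transistor(b, True))

# NAND is functionally complete: every other Boolean
# function can be built from it alone.
def not_(a: bool) -> bool:
    return nand(a, a)

def and_(a: bool, b: bool) -> bool:
    return not_(nand(a, b))

def or_(a: bool, b: bool) -> bool:
    return nand(not_(a), not_(b))

if __name__ == "__main__":
    for a in (False, True):
        for b in (False, True):
            print(a, b, "NAND:", nand(a, b), "AND:", and_(a, b), "OR:", or_(a, b))

Because NAND alone suffices to express NOT, AND, and OR, a single reliable switching device is enough, in principle, to build an entire processor; that is the sense in which the transistor became the building block of the computer revolution.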
John Mauchly (1907–1980), J. Presper Eckert (1919–1995)
Four ENIAC panels and one of its three function tables, on display at the School of Engineering and Applied Science at the University of Pennsylvania
“ENIAC was the first electronic computer, which means it computed with tubes rather than relays. Designed by John Mauchly and J. Presper Eckert at the Moore School of Electrical Engineering at the University of Pennsylvania, ENIAC had 17,468 vacuum tubes, was 8 feet (2.4 meters) high by 3 feet (0.9 meters) deep by 100 feet (30.5 meters) long, and weighed more than 30 tons.
Glen Beck (background) and Betty Snyder (foreground) program ENIAC in BRL building 328. (U.S. Army photo, ca. 1947-1955)
ENIAC had an IBM punch-card reader for input and a card punch for output, but the machine had no memory for data or programs. Instead, numbers under calculation were kept in one of the computer’s 20 accumulators, each of which could store 10 decimal digits and perform addition or subtraction. Other hardware could perform multiplication, division, and even square roots. ENIAC wasn’t programmed in today’s sense. Instead, a set of panels had 1,200 10-position rotary switches that would energize different circuits in a specific sequence, causing electronic representations of numbers to flow through different parts of the machine at predetermined times, thereby carrying out the computation.
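As a rough illustration of the accumulator just described, here is a toy model in Python: ten decimal digits, with addition and subtraction as the only operations. Representing negative values in ten’s complement is our simplifying assumption, a standard trick of decimal machines, not a claim about ENIAC’s exact signed-number scheme.

# A toy ten-digit decimal accumulator: add and subtract only.

MODULUS = 10 ** 10  # ten decimal digits

class Accumulator:
    def __init__(self) -> None:
        self.value = 0

    def add(self, n: int) -> None:
        self.value = (self.value + n) % MODULUS

    def subtract(self, n: int) -> None:
        # Subtract by adding the ten's complement, which lets
        # the same adding circuitry handle both operations.
        self.value = (self.value + (MODULUS - n % MODULUS)) % MODULUS

    def digits(self) -> str:
        return f"{self.value:010d}"

acc = Accumulator()
acc.add(1234567890)
acc.subtract(234567890)
print(acc.digits())  # prints 1000000000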
ENIAC was built to perform complex ballistics calculations for the US Army, but John von Neumann (1903–1957) at the Manhattan Project learned about ENIAC, and so the machine’s first official use was actually to perform computations for the development of the hydrogen bomb.
Cpl. Irwin Goldstein (foreground) sets the switches on one of ENIAC’s function tables at the Moore School of Electrical Engineering. (U.S. Army photo)[24]
Ironically, the men who built the hardware never considered the need for, or the complexity of, programming the machine. They left the job of making the machine actually calculate to six human computers: Frances “Betty” Snyder Holberton (1917–2001), Betty “Jean” Jennings Bartik (1924–2011), Kathleen McNulty Mauchly Antonelli (1921–2006), Marlyn Wescoff Meltzer (1922–2008), Ruth Lichterman Teitelbaum (1924–1986), and Frances Bilas Spence (1922–2012).
Those women, some of the world’s first programmers, had to devise and then debug their own algorithms. But the women were not acknowledged in their own time. In 2014, Kathy Kleiman produced the documentary The Computers, which finally told the women’s story.”
Parts from four early computers, 1962. From left to right: ENIAC board, EDVAC board, ORDVAC board, and BRLESC-I board, showing the trend toward miniaturization.
ENIAC on a Chip, University of Pennsylvania (1995) – Computer History Museum
“ENIAC, the first electronic computer, was built to perform calculations for the US Army. Pictured operating the machine are Corporal Irwin Goldstein, Private First Class Homer Spence, Betty Jean Jennings, and Frances Bilas.”
John Vincent Atanasoff (1903–1995), Clifford Edward Berry (1918–1963)
“Built at Iowa State College (now Iowa State University) by professor John Atanasoff and graduate student Clifford Berry, the Atanasoff-Berry Computer (ABC) was an automatic, electronic digital computer.
Atanasoff, a physicist and inventor, created the ABC to solve general systems of linear equations with up to 29 unknowns. At the time, it took a human computer eight hours to solve a system with eight unknowns; systems with more than 10 unknowns were not often attempted. Atanasoff started building the computer in 1937; he successfully tested it in 1942, and then abandoned it when he was called for duty in World War II. Although the machine was largely forgotten, it changed the course of computing decades later.
The machine was based on electronics, rather than relays and mechanical switches, performed math with binary arithmetic, and had a main memory that used an electrical charge (or its absence) in small capacitors to represent 1s and 0s—the same approach used by modern dynamic random access memory (DRAM) modules. The whole computer weighed 700 pounds.
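The kind of computation Atanasoff mechanized can be sketched in a few lines. The ABC worked by eliminating unknowns from pairs of equations; the Python below does the same overall job with textbook Gaussian elimination and back-substitution, using floating point rather than the machine’s fixed-point binary words, so it illustrates the method, not the hardware.

# Solve A x = b by Gaussian elimination with partial pivoting.

def solve(A, b):
    n = len(A)
    # Build the augmented matrix [A | b].
    M = [row[:] + [rhs] for row, rhs in zip(A, b)]
    for col in range(n):
        # Pick the row with the largest pivot to limit rounding error.
        pivot = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[pivot] = M[pivot], M[col]
        # Eliminate this unknown from the rows below.
        for r in range(col + 1, n):
            factor = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= factor * M[col][c]
    # Back-substitution.
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        s = sum(M[r][c] * x[c] for c in range(r + 1, n))
        x[r] = (M[r][n] - s) / M[r][r]
    return x

# Example: a 3-unknown system (the ABC could handle up to 29).
A = [[2.0, 1.0, -1.0],
     [-3.0, -1.0, 2.0],
     [-2.0, 1.0, 2.0]]
b = [8.0, -11.0, -3.0]
print(solve(A, b))  # approximately [2.0, 3.0, -1.0]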
Ironically, the lasting value of the ABC was to invalidate the original ENIAC patent, which had been filed by J. Presper Eckert and John Mauchly in June 1947. The ENIAC patent was the subject of substantial litigation, and the US Patent and Trademark Office did not issue the patent until 1964 as a result. With the patent in hand, the American electronics company Sperry Rand (whose predecessor, Remington Rand, had bought the Eckert-Mauchly Computer Corporation in 1950) immediately demanded huge fees from all companies selling computers. At the time, patents were good for 17 years from the date of issuance, meaning that the ENIAC patent might stifle the computing industry until 1981.
It turned out that Mauchly had visited Iowa State and studied the ABC in June 1941—but had failed to mention the ABC as prior work in his patent application. In 1967, Honeywell sued Sperry Rand, claiming that the patent was invalid because of the omission. The US District Court for the District of Minnesota agreed and invalidated the ENIAC patent six years later.”
“Although the electroluminescent property of some crystals was discovered in England in 1907, it took more than a decade of work by the self-taught Russian scientist Oleg Vladimirovich Losev to develop a theory (based on Einstein’s photoelectric theory) of how the effect worked, and to produce devices that could be used in practical applications. In total, Losev published 16 academic papers in Russian, British, and German scientific journals between 1924 and 1930, comprehensively describing the devices in the process. He went on to come up with novel applications for light-emitting diodes (LEDs) and other semiconductors, including a “light relay device,” a radio receiver, and a solid-state amplifier, before dying of starvation during the Siege of Leningrad in 1942.
LEDs were rediscovered in 1962 by four different groups of American researchers. This time the technology would not be lost. Compared with incandescent, fluorescent, and nixie tubes of the day, LEDs consumed far less power and produced practically no heat. They had just three disadvantages: they could make only red light, they were not very bright, and they were fantastically expensive—more than $200 each at the beginning.
By 1968, improvements in production let companies push the price of LEDs down to five cents each. At that price, LEDs started showing up in calculators, wristwatches, laboratory equipment, and, of course, computers. Indeed, LEDs arranged as individual lights and seven-segment numeric displays were one of the primary outputs for the first generation of microcomputers in the mid-1970s. Even the early LEDs could be switched on and off millions of times a second, resulting in their use in fiber-optic communications. In 1980, infrared LEDs started showing up in television remotes.
Although blue and ultraviolet LEDs were invented in the 1970s, a number of breakthroughs were required to make them bright enough for practical use. Today those challenges have been overcome. Indeed, the bright-white LED house lights that have largely replaced both incandescent and fluorescent light bulbs are based on an ultraviolet LED that stimulates a white phosphor.”
SEE ALSO First Liquid-Crystal Display (1965)
“Eight decades after they were invented in 1927, light-emitting diodes finally became bright enough and cheap enough to replace incandescent light bulbs on a massive scale.”
Michael Faraday (1791–1867), Karl Ferdinand Braun (1850–1918)
“Semiconductors are curious devices: not quite conductors like copper, gold, or silver, not quite insulators like plastic or rubber. In 1833, Michael Faraday discovered that the chemical silver sulfide became a better conductor when heated, unlike metals that lose their conductivity under the same conditions. Separately, in 1874, Karl Ferdinand Braun, a 24-year-old German physicist, discovered that a metal sulfide crystal touched with a metal probe would conduct electricity in only one direction. This “one direction” characteristic is what defines diodes or rectifiers, the simplest electronic components.”
“In 1904 the British chemist John Ambrose Fleming had invented the two-element amplifier, or ‘diode’, and a few months before De Forest the Austrian physicist Robert von Lieben had already built a three-element amplifier, or triode.” (Fair Use: B07XVF5RSP)
“Braun’s discovery was a curiosity until the invention of radio. The diode proved critical in allowing radio to make the transition from wireless telegraphy to the transmission and reception of the human voice. The diode of choice for these early radio sets was frequently called a cat’s whisker diode, because it consisted of a crystal of galena, a form of lead sulfide, in contact with a spring of metal (the “whisker”). By carefully manipulating the pressure and orientation of the metal against the crystal, an operator could adjust the electrical properties of the semiconductor until they were optimal for radio reception. Powered only by the radio waves themselves, a crystal set produced a signal just strong enough to make faint sounds in an earphone.”
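To see why one-way conduction was the key to receiving voice, consider this small Python sketch: an idealized diode half-wave rectifies an amplitude-modulated carrier, and a crude one-pole low-pass filter then recovers the audio envelope. All the numbers here (sample rate, frequencies, the filter constant) are illustrative assumptions, not a model of a real crystal set.

# Idealized diode detection of an AM radio signal.

import math

def ideal_diode(v: float) -> float:
    """Pass current in one direction only: clip negative voltages."""
    return v if v > 0 else 0.0

fs = 100_000          # samples per second (assumed)
carrier_hz = 10_000   # radio-frequency carrier (assumed)
audio_hz = 400        # the "voice" being transmitted (assumed)

# AM signal: a carrier whose amplitude follows the audio waveform.
n_samples = fs // 100  # 10 milliseconds of signal
signal = [
    (1.0 + 0.5 * math.sin(2 * math.pi * audio_hz * t / fs))
    * math.sin(2 * math.pi * carrier_hz * t / fs)
    for t in range(n_samples)
]

# The diode keeps only one polarity of the carrier.
rectified = [ideal_diode(v) for v in signal]

# A one-pole low-pass filter smooths away the carrier,
# leaving the slowly varying envelope: the audio.
alpha = 0.05
envelope, y = [], 0.0
for v in rectified:
    y += alpha * (v - y)
    envelope.append(y)

print(f"envelope range: {min(envelope):.2f} .. {max(envelope):.2f}")

Without the diode, the positive and negative half-cycles of the carrier would average to zero in the earphone; rectifying first is what makes the audio recoverable.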
“Crystal radio receivers were used onboard ships and then in homes until they were replaced by new receivers based on vacuum tubes, which could amplify the faint radio waves so that they were strong enough to power a speaker and fill a room with speech or music. But tubes didn’t mark the end of the crystal radio: the devices remained popular with people who couldn’t get tubes—such as soldiers on the front lines in World War II—as well as with children learning about electronics. In the 1940s, scientists at Bell Labs turned their attention to semiconductor radios once again in an effort to perfect microwave communications. In the process, they discovered the transistor.”
“Braun went on to make other fundamental contributions to physics and electronics. In 1897, he invented the cathode-ray tube (CRT), which would become the basis of television. He shared the 1909 Nobel Prize with Guglielmo Marconi (1874–1937) “in recognition of their contributions to the development of wireless telegraphy.””
Crystal Detector, made by the Philmore Manufacturing Company. To use this device, the operator would connect a wire to each of the two flanges and press the metal “whisker” into the semiconductor crystal.