history of computers
Eckert and Mauchly with the ENIAC
Computer pioneers Presper Eckert and John Mauchly founded the Eckert-Mauchly Computer Corp. to construct machines based on their experience with ENIAC and EDVAC. The only machine the company built was BINAC. Before completing the UNIVAC, the company became a division of Remington Rand.
Heinz Nixdorf founded Nixdorf Computer Corp. in Germany. It remained an independent corporation until merging with Siemens in 1990.
ElectroData computer in use, 1955
Burroughs buys ElectroData. Calculator manufacturer Burroughs gained entry to the computer industry by purchasing the southern California company ElectroData Corporation. The combined firm became a giant in the calculating machine business and expanded into electronics and digital computers as those technologies developed. Burroughs created many computer systems in the 1960s and 1970s and eventually merged with Sperry Rand (maker of UNIVAC computers) to form Unisys.
Digital Equipment Corp.
A group of engineers led by Ken Olsen left MIT's Lincoln Laboratory to found a company based on the new transistor technology. In August, they formally created Digital Equipment Corp. It initially set up shop in a largely vacant woolen mill in Maynard, Mass., where all aspects of product development — from management to manufacturing — took place.
In Minneapolis, the original Engineering Research Associates group led by Bill Norris left Sperry Rand to form a new company, Control Data Corp., which soon released its model 1604 computer.
Tandy Radio Shack is founded. Tandy Radio Shack (TRS) was formed by the 1963 merger of Tandy Leather Company and Radio Shack. TRS began by selling a variety of electronic products, mainly to hobbyists. The TRS-80 Model I computer, introduced in 1977, was a major step in introducing home computers to the public. Like the Commodore PET and the Apple II, which were introduced within months of the TRS-80, the computer came assembled and ready to run.
Commodore Business Machines founder Jack Tramiel
Commodore Business Machines (CBM) is founded. Its founder, Jack Tramiel, emigrated to the US after WWII, where he began repairing typewriters. In 1955, he moved to Toronto and established Commodore International, which went on to make mechanical and electronic calculators. In 1977, Commodore released the Commodore PET computer; in 1981, the VIC-20; and, in 1982, the Commodore 64. CBM purchased competitor Amiga Corporation in 1984. Despite at one time being the largest single supplier of computers in the world, by 1984 internal disputes and market pressures led to financial problems. The company declared bankruptcy in 1994.
Ivan Sutherland and David Evans, 1969
Evans & Sutherland is formed. In 1968, David Evans and Ivan Sutherland, both professors of computer science, founded a company to develop a special graphics device known as a frame buffer, a high-speed memory for capturing and displaying video images. Based in Salt Lake City, Utah, the two founders trained a generation of computer graphics pioneers — either at E&S or at the University of Utah computer science department. Sutherland left the firm in 1975, and Evans retired in the early 1990s, but E&S continues today as a major supplier of military and commercial graphics systems.
Xerox Corp. bought Scientific Data Systems for nearly $1 billion — 90 times the latter's earnings. The SDS series of minicomputers logged more sales in the early 1960s than did Digital Equipment Corp. Xerox renamed the series the XDS computers but eventually closed the division and ceased manufacturing the equipment.
Engineers at PARC circa 1972
Xerox opens Palo Alto Research Center (PARC). In 1970, Xerox Corporation hired Dr. George Pake to lead a new research center in Palo Alto, California. PARC attracted some of the United States’ top computer scientists, and produced many groundbreaking inventions that transformed computing—most notably the personal computer graphical user interface, Ethernet, the laser printer, and object-oriented programming. Xerox was unable to market the inventions from PARC, but others did, including Steve Jobs (Apple), Bob Metcalfe (3Com), and Charles Geschke and John Warnock (Adobe).
RCA Spectra 70 advertisement
RCA sells its computer division. RCA was founded in 1919 to make vacuum tubes for radio, then a new invention. RCA began designing and selling its own computers in the early 1950s, competing with IBM and several other companies. By the 1970s, RCA, like other computer makers, was struggling to compete against IBM. RCA made its machines IBM-compatible, but ultimately even this strategy proved unsuccessful. In 1971, RCA announced it would no longer build computers, selling its computer business to Sperry Rand.
IMSAI 8080 System
IMSAI is founded. In 1973, Bill Millard left his regular job in management to found the consulting firm Information Management Services or IMS. The following year, while he was working on a client’s project, he developed a small computing system using the then-new Intel 8080 microprocessor. He realized this computer might attract other buyers and so placed an advertisement in the hobbyist magazine “Popular Electronics,” offering it in kit form. The IMSAI 8080, as it was known, sold briskly and eventually about 20,000 units were shipped. The company was eventually purchased by one of its dealers and is today a division of the Fischer-Freitas Company, which still offers reproductions of the original for sale to hobbyists.
Xerox closes its computer division. After acquiring computer maker Scientific Data Systems (SDS) in 1969, Xerox redesigned SDS’s well-known Sigma line of computers. Xerox struggled against competitors like IBM and in 1975 closed the division. Most of the rights to the machines were sold to Honeywell.
Doug and Gary Carlston at Broderbund Headquarters
Broderbund is founded. In 1980, brothers Doug and Gary Carlston formed a company to market the games Doug had created. Their first games were Galactic Empire, Galactic Trader and Galactic Revolution. They continued to have success with popular games such as Myst (1993) and Riven (1997) and with a wide range of home products such as Print Shop and language tutors. In 1998, Broderbund was acquired by The Learning Company, which, a year later, was itself acquired by Mattel, Inc.
Connection Machine 2 with DataVault
Thinking Machines is founded. Thinking Machines Corporation (TMC) was formed by MIT graduate student Danny Hillis and others to develop a new type of supercomputer. Their idea was to use many individual processors of moderate power rather than one extremely powerful processor. Their first machine, called The Connection Machine (CM-1), had 64,000 processors and began shipping in 1986. TMC later produced several larger, more powerful machines — the CM-2 and CM-5. Competition from more established supercomputer firms forced the company into bankruptcy in 1993.
Early Netscape diskette
Netscape Communications Corporation is founded. Netscape was originally founded as Mosaic Communications Corporation in April 1994 by Marc Andreessen, Jim Clark and others. Its name was soon changed to Netscape, and it delivered its first browser in October 1994. On the day of Netscape's initial public offering in August 1995, its share price went from $28 to $54 in the first few minutes of trading, valuing the company at $2 billion. Netscape hired many of Silicon Valley’s programmers to provide new features and products and began the Internet boom of the 1990s.
Yahoo! founders Jerry Yang and David Filo, 2000
Yahoo is founded. Founded by Stanford graduate students Jerry Yang and David Filo, Yahoo started out as "Jerry's Guide to the World Wide Web" before being renamed. Yahoo originally resided on two machines, Akebono and Konishiki, both named after famous Sumo wrestlers. Yahoo would quickly expand to become one of the Internet’s most popular search engines.
Intel and Zilog introduced new microprocessors. Five times faster than its predecessor, the 8008, the Intel 8080 could address four times as many bytes for a total of 64 kilobytes. The Zilog Z-80 could run any program written for the 8080 and included twice as many built-in machine instructions.
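The 64-kilobyte figure follows directly from the width of the address bus: with n address lines a processor can distinguish 2^n byte locations. A quick illustrative sketch (modern Python with a hypothetical `addressable_bytes` helper, not period code):

```python
# Address-space arithmetic behind the 8080's 64-kilobyte limit:
# an n-bit address bus can select 2**n distinct byte locations.
def addressable_bytes(address_bits: int) -> int:
    """Number of bytes reachable with a given address-bus width."""
    return 2 ** address_bits

KB = 1024
print(addressable_bytes(14) // KB)  # Intel 8008: 14-bit addresses -> 16 KB
print(addressable_bytes(16) // KB)  # Intel 8080 and Zilog Z-80: 16-bit -> 64 KB
```

The jump from 14 to 16 address bits is what yields "four times as many bytes."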
The Motorola 68000 microprocessor exhibited a processing speed far greater than its contemporaries. This high-performance processor found its place in powerful workstations intended for the graphics-intensive programs common in engineering.
Introduction to VLSI Systems
California Institute of Technology professor Carver Mead and Xerox Corp. computer scientist Lynn Conway wrote a manual of chip design, "Introduction to VLSI Systems." Demystifying the planning of very large scale integrated (VLSI) systems, the text expanded the ranks of engineers capable of creating such chips. The authors had observed that computer architects seldom participated in the specification of the standard integrated circuits with which they worked. The authors intended "Introduction to VLSI Systems" to fill a gap in the literature and introduce all electrical engineering and computer science students to integrated system architecture.
David Miller of AT&T Bell Labs patented the optical transistor, a component central to digital optical computing. Called the Self-ElectroOptic-Effect Device, or SEED, the transistor involved a light-sensitive switch built with layers of gallium arsenide and gallium aluminum arsenide. Beams of light triggered electronic events that caused the light either to be transmitted or absorbed, thus turning the switch on or off.
Within a decade, research on the optical transistor led to successful work on the first all-optical processor and the first general-purpose all-optical computer. Bell Labs researchers first demonstrated the processor there in 1990. A computer using the SEED also contained lasers, lenses, and fast light switches, but it still required programming by a separate, non-optical computer. In 1993, researchers at the University of Colorado unveiled the first all-optical computer capable of being programmed and of manipulating instructions internally.
Compaq beat IBM to market when it announced the Deskpro 386, the first computer to use Intel's new 80386 chip, a 32-bit microprocessor with 275,000 transistors. At 4 million operations per second and with 4 kilobytes of memory, the 80386 gave PCs as much speed and power as older mainframes and minicomputers.
The 386 chip brought with it the introduction of a 32-bit architecture, a significant improvement over the 16-bit architecture of previous microprocessors. It had two operating modes, one that mirrored the segmented memory of older x86 chips, allowing full backward compatibility, and one that took full advantage of its more advanced technology. The new chip made graphical operating environments for IBM PC and PC-compatible computers practical. The architecture that allowed Windows and IBM OS/2 has remained in subsequent chips.
Motorola unveiled the 68030 microprocessor. A step up from the 68020, it was a 32-bit enhanced microprocessor combining a central processing unit core, a data cache, an instruction cache, an enhanced bus controller, and a memory management unit in a single VLSI device — all operating at speeds of at least 20 MHz.
Compaq and other PC-clone makers developed the Enhanced Industry Standard Architecture (EISA) — unlike IBM's MicroChannel, it retained compatibility with existing machines. EISA used a 32-bit bus (the pathway by which two devices communicate), an improvement over the 16-bit bus of the original Industry Standard Architecture. IBM's competitors developed EISA to avoid paying IBM a fee for its MCA bus.
Intel released the 80486 microprocessor and the i860 RISC/coprocessor chip, each of which contained more than 1 million transistors. The RISC microprocessor had a 32-bit integer arithmetic and logic unit (the part of the CPU that performs operations such as addition and subtraction), a 64-bit floating-point unit, and a clock rate of 33 MHz.
The 486 chips remained similar in structure to their predecessors, the 386 chips. What set the 486 apart was its optimized instruction set, with an on-chip unified instruction and data cache and an optional on-chip floating-point unit. Combined with an enhanced bus interface unit, the microprocessor doubled the performance of the 386 without increasing the clock rate.
Motorola announced the 68040 microprocessor, with about 1.2 million transistors. Although promised for January 1990, technical difficulties delayed shipment until 1991. A 32-bit, 25-MHz microprocessor, the 68040 integrated a floating-point unit and included instruction and data caches. Apple used this third generation of 68000 chips in its Macintosh Quadra computers.
Intel Pentium Processor diagram
The Pentium microprocessor is released. The Pentium was the fifth generation of the ‘x86’ line of microprocessors from Intel, the basis for the IBM PC and its clones. The Pentium introduced several advances that made programs run faster such as the ability to execute several instructions at the same time and support for graphics and music.
graphics & games
The DAC-1 System at GM Research Labs, 1965
DAC-1 computer-aided design program is released. In 1959, the General Motors Research Laboratories appointed a special research team to investigate the use of computers in designing automobiles. In 1960, IBM joined the project, producing the first commercially available computer-aided design program, known as DAC-1. Out of that project came the IBM 2250 display terminal as well as many advances in computer time-sharing and the use of a single processor by two or more terminals.
Original Atari Pong Game Screenshot
Pong is released. In 1966, Ralph Baer designed a ping-pong game for his Odyssey gaming console. Nolan Bushnell played this game at a Magnavox product show in Burlingame, California. Bushnell hired young engineer Al Alcorn to design a car-driving game, but when it became apparent that this was too ambitious for the time, he had Alcorn design a version of ping-pong instead. The game was tested in bars in Grass Valley and Sunnyvale, California, where it proved very popular. Pong would revolutionize the arcade industry and launch the modern video game era.
SuperPaint system in 1973
SuperPaint is completed. SuperPaint was the first digital computer drawing system to use a frame buffer—a special high-speed memory—and the ancestor of all modern paint programs. It could create sophisticated animations, in up to 16.7 million colors, had adjustable paintbrushes, video magnification, and used a graphics tablet for drawing. It was designed by Richard Shoup and others at the Xerox Palo Alto Research Center (PARC). Its designers won a technical Academy Award in 1998 for their invention.
Atari VCS Prototype
Atari launches the Video Computer System game console. Atari released the Atari Video Computer System (VCS) later renamed the Atari 2600. The VCS was the first widely successful video game system, selling more than twenty million units throughout the 1980s. The VCS used the 8-bit MOS 6507 microprocessor and was designed to be connected to a home television set. When the last of Atari’s 8-bit game consoles were made in 1990, more than 900 video game titles had been released.
Pixar is founded. Pixar was originally called the Special Effects Computer Group at Lucasfilm (launched in 1979). The group created the computer animated segments of films such as “Star Trek II: The Wrath of Khan” and “Young Sherlock Holmes.” In 1986, Apple Computer co-founder Steve Jobs paid 10 million dollars to Lucasfilm to purchase the Group and renamed it Pixar. Over the next decade, Pixar made highly-successful (and Oscar-winning) animated films. It was bought by Disney in 2006.
Virtual reality was the hot topic at Siggraph's 1989 convention in Boston. The Silicon Graphics booth featured the new technology, designed by the computer-aided design software company Autodesk and the computer company VPL. The term describes a computer-generated 3-D environment that allows a user to interact with the realities created there. The computer must calculate and display sensory information quickly enough to fool the senses.
Howard Rheingold described virtual reality as "shared and objectively present like the physical world, composable like a work of art, and as unlimited and harmless as a dream." First practical for accomplishing such tasks as flight simulation, virtual reality soon spread much further, promising new ground in video games, education, and travel. Computer users are placed into the virtual environment in a variety of ways, from a large monitor to a head-mounted display or a glove.
VideoToaster Installed at Local Television Station
Video Toaster is introduced by NewTek. The Video Toaster was a video editing and production system for the Amiga line of computers and included custom hardware and special software. Much more affordable than any other computer-based video editing system, the Video Toaster was not only for home use. It was popular with public access stations and was even good enough to be used for broadcast television shows like Home Improvement.
Original Movie Poster for Terminator 2: Judgment Day
“Terminator 2: Judgment Day” opens. Director James Cameron’s sequel to his 1984 hit “The Terminator” featured ground-breaking special effects done by Industrial Light & Magic. Made for a record $100 million, it was the most expensive movie ever made at the time. Most of this cost was due to the expense of computer-generated special effects (such as image morphing) throughout the film. Terminator 2 is one of many films that critique civilization’s frequent blind trust in technology.
Box Art for Doom
“Doom” is released. id Software released Doom in late 1993. An immersive first-person shooter-style game, Doom became popular on many different platforms before losing popularity to games like Halo and Counter-Strike. Doom players were also among the first to customize the game’s levels and appearance. Doom would spawn several sequels and a 2005 film.
AT&T designed its Dataphone, the first commercial modem, specifically for converting digital computer data to analog signals for transmission across its long distance network. Outside manufacturers incorporated Bell Laboratories' digital data sets into commercial products. The development of equalization techniques and bandwidth-conserving modulation systems improved transmission efficiency in national and global systems.
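The conversion a modem performs — digital bits into analog tones — can be sketched with frequency-shift keying, in which each bit value maps to a distinct audio frequency. A minimal modern sketch (the sample rate, baud rate, and frequency pairing are illustrative, loosely modeled on later Bell 103-style acoustic modems, not the Dataphone's actual specification):

```python
import math

# Frequency-shift keying: each bit value is sent as a burst of a sine
# wave at one of two frequencies. Parameters below are illustrative.
SAMPLE_RATE = 8000               # samples per second
BAUD = 300                       # bits per second
FREQ = {0: 1070.0, 1: 1270.0}    # hertz per bit value

def modulate(bits):
    """Convert a bit sequence into a list of analog waveform samples."""
    samples_per_bit = SAMPLE_RATE // BAUD
    samples = []
    for bit in bits:
        f = FREQ[bit]
        for n in range(samples_per_bit):
            samples.append(math.sin(2 * math.pi * f * n / SAMPLE_RATE))
    return samples

wave = modulate([1, 0, 1, 1])
print(len(wave))  # 4 bits, each stretched over SAMPLE_RATE // BAUD samples
```

A receiver reverses the process, detecting which frequency dominates each bit interval — the step improved by better equalization and modulation schemes.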
Online transaction processing made its debut in IBM´s SABRE reservation system, set up for American Airlines. Using telephone lines, SABRE linked 2,000 terminals in 65 cities to a pair of IBM 7090 computers, delivering data on any flight in less than three seconds.
JOSS (Johnniac Open Shop System) conversational time-sharing service began on Rand's Johnniac. Time-sharing arose, in part, because the length of batch turn-around times impeded the solution of problems. Time-sharing aimed to bring the user back into "contact" with the machine for online debugging and program development.
Acoustically coupled modem
John van Geen of the Stanford Research Institute vastly improved the acoustically coupled modem. His receiver reliably detected bits of data despite background noise heard over long-distance phone lines. Inventors developed the acoustically coupled modem to connect computers to the telephone network by means of the standard telephone handset of the day.
Citizens and Southern National Bank in Valdosta, Ga., installed the country´s first automatic teller machine.
Computer-to-computer communication expanded when the Department of Defense established four nodes on the ARPANET: the University of California Santa Barbara, UCLA, SRI International, and the University of Utah. Viewing ARPANET as a comprehensive resource-sharing network, its designers set out with several goals: direct use of distributed hardware services; direct retrieval from remote, one-of-a-kind databases; and the sharing of software subroutines and packages not available on the users' primary computer due to incompatibility of hardware or languages.
Ray Tomlinson in 2001
The first e-mail is sent. Ray Tomlinson of the research firm Bolt, Beranek and Newman sent the first e-mail when he was supposed to be working on a different project. Tomlinson, who is credited with choosing the "@" sign for use in e-mail addresses, sent his message over a military network called ARPANET. When asked to describe the contents of that first e-mail, Tomlinson said it was "something like QWERTYUIOP."
Wozniak's "blue box"
Steve Wozniak built his "blue box," a tone generator used to make free phone calls. Wozniak sold the boxes in dormitories at the University of California, Berkeley, where he studied as an undergraduate. "The early boxes had a safety feature — a reed switch inside the housing operated by a magnet taped onto the outside of the box," Wozniak remembered. "If apprehended, you removed the magnet, whereupon it would generate off-frequency tones and be inoperable ... and you tell the police: It's just a music box."
Robert Metcalfe devised the Ethernet method of network connection at the Xerox Palo Alto Research Center. He wrote: "On May 22, 1973, using my Selectric typewriter ... I wrote ... "Ether Acquisition" ... heavy with handwritten annotations — one of which was "ETHER!" — and with hand-drawn diagrams — one of which showed `boosters' interconnecting branched cable, telephone, and radio ethers in what we now call an internet.... If Ethernet was invented in any one memo, by any one person, or on any one day, this was it."
Robert M. Metcalfe, "How Ethernet Was Invented", IEEE Annals of the History of Computing, Volume 16, No. 4, Winter 1994, p. 84.
Telenet, the first commercial packet-switching network and civilian equivalent of ARPANET, was born. The brainchild of Larry Roberts, Telenet linked customers in seven cities. Telenet represented the first value-added network, or VAN — so named because of the extras it offered beyond the basic service of linking computers.
The Queen of England sends her first e-mail. Elizabeth II, Queen of the United Kingdom, sent an e-mail on March 26 from the Royal Signals and Radar Establishment (RSRE) in Malvern as part of a demonstration of networking technology.
The Shockwave Rider
John Shoch and Jon Hupp at the Xerox Palo Alto Research Center discover the computer "worm," a short program that searches a network for idle processors. Initially designed to provide more efficient use of computers and for testing, the worm had the unintended effect of invading networked computers, creating a security threat.
Shoch took the term "worm" from the book "The Shockwave Rider," by John Brunner, in which an omnipotent "tapeworm" program runs loose through a network of computers. Brunner wrote: "No, Mr. Sullivan, we can´t stop it! There´s never been a worm with that tough a head or that long a tail! It´s building itself, don´t you understand? Already it´s passed a billion bits and it´s still growing. It´s the exact inverse of a phage — whatever it takes in, it adds to itself instead of wiping... Yes, sir! I´m quite aware that a worm of that type is theoretically impossible! But the fact stands, he´s done it, and now it´s so goddamn comprehensive that it can´t be killed. Not short of demolishing the net!" (247, Ballantine Books, 1975).
USENET established. USENET was invented as a means for providing mail and file transfers using a communications standard known as UUCP. It was developed as a joint project by Duke University and the University of North Carolina at Chapel Hill by graduate students Tom Truscott, Jim Ellis, and Steve Bellovin. USENET enabled its users to post messages and files that could be accessed and archived. It would go on to become one of the main areas for large-scale interaction for interest groups through the 1990s.
Richard Bartle and Roy Trubshaw circa 1999
The first Multi-User Domain (or Dungeon), MUD1, goes on-line. Richard Bartle and Roy Trubshaw, two students at the University of Essex, write a program that allows many people to play against each other on-line. MUDs become popular with college students as a means of adventure gaming and for socializing. By 1984, there are more than 100 active MUDs and variants around the world.
The ARPANET splits into the ARPANET and MILNET. Due to the success of the ARPANET as a way for researchers in universities and the military to collaborate, it was split into military (MILNET) and civilian (ARPANET) segments. This was made possible by the adoption of TCP/IP, a networking standard, three years earlier. The ARPANET itself was decommissioned in 1990.
The modern Internet gained support when the National Science Foundation formed the NSFNET, linking five supercomputer centers at Princeton University, Pittsburgh, the University of California at San Diego, the University of Illinois at Urbana-Champaign, and Cornell University. Soon, several regional networks developed; eventually, the government reassigned pieces of the ARPANET to the NSFNET. The NSF allowed commercial use of the Internet for the first time in 1991, and in 1995, it decommissioned the backbone, leaving the Internet a self-supporting industry.
The NSFNET initially transferred data at 56 kilobits per second, an improvement on the overloaded ARPANET. Traffic continued to increase, though, and in 1987, the NSF awarded Merit Network Inc., IBM, and MCI a contract to expand the Internet by providing access points around the country to a network with a bandwidth of 1.5 megabits per second. In 1992, the network upgraded to T-3 lines, which transmit information at about 45 megabits per second.
Stewart Brand and Larry Brilliant lecturing on the Well, 1999
The Whole Earth 'Lectronic Link (WELL) is founded. Stewart Brand and Larry Brilliant started an on-line Bulletin Board System (BBS) to build a “virtual community” of computer users at low cost. Journalists were given free memberships in the early days, leading to many articles about it and helping it grow to thousands of members around the world.
Robert Morris' worm flooded the ARPANET. Then-23-year-old Morris, the son of a computer security expert for the National Security Agency, sent a nondestructive worm through the Internet, causing problems for about 6,000 of the 60,000 hosts linked to the network. A researcher at Lawrence Livermore National Laboratory in California discovered the worm. "It was like the Sorcerer's Apprentice," Dennis Maxwell, then a vice president of SRI, told the Sydney (Australia) Sunday Telegraph at the time. Morris was sentenced to three years of probation, 400 hours of community service, and a fine of $10,050.
Morris, who said he was motivated by boredom, programmed the worm to reproduce itself and spread through the networked computers. The reproduced copies eventually grew large enough to fill the computers' memories, disabling them.
The World Wide Web was born when Tim Berners-Lee, a researcher at CERN, the high-energy physics laboratory in Geneva, developed HyperText Markup Language. HTML, as it is commonly known, allowed the Internet to expand into the World Wide Web, using specifications he developed such as URL (Uniform Resource Locator) and HTTP (HyperText Transfer Protocol). A browser, such as Netscape or Microsoft Internet Explorer, follows links and sends a query to a server, allowing a user to view a site.
Berners-Lee based the World Wide Web on Enquire, a hypertext system he had developed for himself, with the aim of allowing people to work together by combining their knowledge in a global web of hypertext documents. With this idea in mind, Berners-Lee designed the first World Wide Web server and browser — available to the general public in 1991. Berners-Lee founded the W3 Consortium, which coordinates World Wide Web development.
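The mechanism described above — documents linking to other documents by URL, fetched over HTTP — takes only a few lines of HTML. A minimal hypothetical page (the title and text are invented for illustration; info.cern.ch was the address of the first web server):

```html
<!-- A minimal hypothetical web page. The <a> tag's href attribute holds
     a URL; following the link makes the browser issue an HTTP request. -->
<html>
  <head><title>A Minimal Page</title></head>
  <body>
    <p>Read about the project at
       <a href="http://info.cern.ch/">the CERN server</a>.</p>
  </body>
</html>
```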
Screen Capture from Original Mosaic Browser
The Mosaic web browser is released. Mosaic was the first commercial software that allowed graphical access to content on the Internet. Developed by Eric Bina and Marc Andreessen at the University of Illinois’s National Center for Supercomputing Applications (NCSA), Mosaic was originally designed for a Unix system running the X Window System. By 1994, Mosaic was available for several other operating systems, such as the Mac OS, Windows, and AmigaOS.
people & pop culture
First meeting of SHARE, the IBM users group, convened. User groups became a significant educational force allowing companies to communicate innovations and users to trade information.
Vietnam War protesters attacked university computer centers. At the University of Wisconsin, the toll was one human and four machines.
Time magazine altered its annual tradition of naming a "Man of the Year," choosing instead to name the computer its "Machine of the Year." In introducing the theme, Time publisher John A. Meyers wrote, "Several human candidates might have represented 1982, but none symbolized the past year more richly, or will be viewed by history as more significant, than a machine: the computer."
His magazine, he explained, had chronicled the change in public opinion with regard to computers. A senior writer contributed: "Computers were once regarded as distant, ominous abstractions, like Big Brother. In 1982, they truly became personalized, brought down to scale, so that people could hold, prod and play with them." At Time, the main writer on the project completed his work on a typewriter, but Meyers noted that the magazine's newsroom would upgrade to word processors within a year.
The use of computer-generated graphics in movies took a step forward with Disney's release of "Tron." One of the first movies to use such graphics, "Tron" also featured computers in its plot: it followed the adventures of a hacker split into molecules and transported inside a computer. Computer animation, done by III, Abel, MAGI, and Digital Effects, accounted for about 30 minutes of the film.
In his novel "Neuromancer," William Gibson coined the term "cyberspace." He also spawned a genre of fiction known as "cyberpunk" in his book, which described a dark, complex future filled with intelligent machines, computer viruses, and paranoia.
Gibson introduced cyberspace as: "A consensual hallucination experienced daily by billions of legitimate operators, in every nation, by children being taught mathematical concepts... A graphic representation of data abstracted from the banks of every computer in the human system. Unthinkable complexity. Lines of light ranged in the nonspace of the mind, clusters and constellations of data. Like city lights, receding..." (p. 51).
Still from Pixar's Tin Toy
Pixar's "Tin Toy" became the first computer-animated film to win an Academy Award, taking the Oscar for best animated short film. "Tin Toy" told the story of a wind-up toy's first encounter with a boisterous baby. To animate the baby's facial expressions, programmers defined more than 40 facial muscles on the computer for the animator to control.
From its founding in 1986, one of Pixar's primary projects involved a renderer called RenderMan, a standard for describing 3-D scenes. RenderMan describes objects, light sources, cameras, atmospheric effects, and other information so that a scene can be rendered on a variety of systems. The company went on to other successes, including 1995's "Toy Story," the first full-length feature film created entirely by computer animation.
robots & artificial intelligence
Norbert Wiener published "Cybernetics," a major influence on later research into artificial intelligence. He drew on his World War II experiments with anti-aircraft systems that anticipated the course of enemy planes by interpreting radar images. Wiener coined the term "cybernetics" from the Greek word for "steersman."
In addition to "cybernetics," historians note Wiener for his analysis of brain waves and for his exploration of the similarities between the human brain and the modern computing machine capable of memory association, choice, and decision making.
MIT´s Servomechanisms Laboratory demonstrated computer-assisted manufacturing. The school´s Automatically Programmed Tools project created a language, APT, used to instruct milling machine operations. At the demonstration, the machine produced an ashtray for each attendee.
UNIMATE, the first industrial robot, began work at General Motors. Obeying step-by-step commands stored on a magnetic drum, the 4,000-pound arm sequenced and stacked hot pieces of die-cast metal.
The brainchild of Joe Engelberger and George Devol, UNIMATE originally automated the manufacture of TV picture tubes.
Researchers designed the Rancho Arm at Rancho Los Amigos Hospital in Downey, California as a tool for the handicapped. The Rancho Arm´s six joints gave it the flexibility of a human arm. Acquired by Stanford University in 1963, it holds a place among the first artificial robotic arms to be controlled by a computer.
A Stanford team led by Ed Feigenbaum created DENDRAL, the first expert system, or program designed to execute the accumulated expertise of specialists. DENDRAL applied a battery of "if-then" rules in chemistry and physics to identify the molecular structure of organic compounds.
Marvin Minsky developed the Tentacle Arm, which moved like an octopus. It had twelve joints designed to reach around obstacles. A PDP-6 computer controlled the arm, powered by hydraulic fluids. Mounted on a wall, it could lift the weight of a person.
Victor Scheinman´s Stanford Arm made a breakthrough as the first successful electrically powered, computer-controlled robot arm. By 1974, the Stanford Arm could assemble a Ford Model T water pump, guiding itself with optical and contact sensors. The Stanford Arm led directly to commercial production. Scheinman went on to design the PUMA series of industrial robots for Unimation, robots used for automobile assembly and other industrial tasks.
SRI International´s Shakey became the first mobile robot controlled by artificial intelligence. Equipped with sensing devices and driven by a problem-solving program called STRIPS, the robot found its way around the halls of SRI by using information about its environment to plan a route. Shakey used a TV camera, laser range finder, and bump sensors to collect data, which it transmitted to a DEC PDP-10 and PDP-15. The computers radioed commands back to Shakey, which then moved at a speed of 2 meters per hour.
David Silver at MIT designed the Silver Arm, a robotic arm to do small-parts assembly using feedback from delicate touch and pressure sensors. The arm´s fine movements corresponded to those of human fingers.
Hirose´s Soft Gripper
Shigeo Hirose´s Soft Gripper could conform to the shape of a grasped object, such as this wine glass filled with flowers. The design Hirose created at the Tokyo Institute of Technology grew from his studies of flexible structures in nature, such as elephant trunks and snake spinal cords.
Speak & Spell creators
Texas Instruments Inc. introduced Speak & Spell, a talking learning aid for ages 7 and up. Its debut marked the first electronic duplication of the human vocal tract on a single chip of silicon. Speak & Spell utilized linear predictive coding to formulate a mathematical model of the human vocal tract and predict a speech sample based on previous input. It transformed digital information processed through a filter into synthetic speech and could store more than 100 seconds of linguistic sounds.
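The core idea of linear predictive coding, that each speech sample can be estimated as a weighted sum of the samples before it, can be sketched in a few lines. This is only an illustration of the principle; the coefficients and sample values below are invented, not TI´s actual vocal-tract model.

```python
# Minimal sketch of the linear prediction idea behind Speak & Spell´s
# speech synthesis. Coefficients and samples here are made-up examples.

def lpc_predict(samples, coeffs):
    """Predict the next speech sample as a weighted sum of past samples."""
    # Take the most recent len(coeffs) samples, newest first.
    recent = samples[-len(coeffs):][::-1]
    return sum(a * s for a, s in zip(coeffs, recent))

# Hypothetical 3-tap predictor applied to a short sample history.
history = [0.0, 0.2, 0.5, 0.7]
coeffs = [1.2, -0.5, 0.1]          # assumed filter coefficients
prediction = lpc_predict(history, coeffs)
```

In a real LPC codec, only the filter coefficients and a small excitation signal are stored, which is how Speak & Spell squeezed over 100 seconds of speech into a tiny memory.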
Shown here are the four individuals who began the Speak & Spell program: From left to right, Gene Frantz, Richard Wiggins, Paul Breedlove, and George Brantingham.
In development since 1967, the Stanford Cart successfully crossed a chair-filled room without human intervention in 1979. Hans Moravec rebuilt the Stanford Cart in 1977, equipping it with stereo vision. A television camera, mounted on a rail on the top of the cart, took pictures from several different angles and relayed them to a computer. The computer gauged the distance between the cart and obstacles in its path.
The Musical Instrument Digital Interface was introduced at the first North American Music Manufacturers show in Los Angeles. MIDI is an industry-standard electronic interface that links electronic music synthesizers. The MIDI information tells a synthesizer when to start and stop playing a specific note, what sound that note should have, how loud it should be, and other information.
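The start/stop, pitch, and loudness information described above travels as short byte messages. A minimal sketch of the standard three-byte MIDI "note on" and "note off" messages (the particular note and velocity values are arbitrary examples):

```python
# Sketch of MIDI 1.0 channel voice messages for starting/stopping a note.

def note_on(channel, note, velocity):
    # Status byte 0x90 plus channel (0-15), then note number and velocity.
    return bytes([0x90 | channel, note, velocity])

def note_off(channel, note):
    # Status byte 0x80 plus channel; release velocity of 0.
    return bytes([0x80 | channel, note, 0])

msg = note_on(0, 60, 100)   # middle C, moderately loud, on channel 1
```

A synthesizer receiving `msg` begins sounding middle C until it sees the matching note-off message.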
Raymond Kurzweil, a pioneer in developing the electronic keyboard, predicted that MIDI and other advances would make traditional musical instruments obsolete. In the 21st century, he wrote in his book "The Age of Intelligent Machines," "There will still be acoustic instruments around, but they will be primarily of historical interest, much like harpsichords are today.... While the historically desirable sounds of pianos and violins will continue to be used, most music will use sounds with no direct acoustic counterpart.... There will not be a sharp division between the musician and nonmusician."
software & languages
Richard Stallman, a programmer at MIT’s Artificial Intelligence Lab, experienced a significant shift in attitudes during the late 1970s. Whereas the MIT hacker culture was one of sharing and openness, the commercial software world moved towards secrecy and access to source code became ever more restricted.
Stallman set out to develop a free alternative to the popular Unix operating system. This operating system, called GNU (for GNU's Not Unix), was going to be free of charge but would also allow users the freedom to change and share it. Based on this philosophy, Stallman founded the Free Software Foundation (FSF) in 1985.
While the GNU work did not immediately result in a full operating system, it provided the necessary tools for creating Linux. The software developed as part of the GNU project continues to form a large part of Linux, which is why the FSF asks for it to be called GNU/Linux.
Aldus announced its PageMaker program for use on Macintosh computers, sparking widespread interest in desktop publishing. Two years later, Aldus released a version for IBMs and IBM-compatible computers. Developed by Paul Brainerd, who founded Aldus Corp., PageMaker allowed users to combine graphics and text easily enough to make desktop publishing practical.
Chuck Geschke of Adobe Systems Inc., which merged with Aldus in 1994, remembered: "John Sculley, a young fellow at Apple, got three groups together — Aldus, Adobe, and Apple — and out of that came the concept of desktop publishing. Paul Brainerd of Aldus is probably the person who first uttered the phrase. All three companies then took everybody who could tie a tie and speak two sentences in a row and put them on the road, meeting with people in the printing and publishing industry and selling them on this concept. The net result was that it turned around not only the laser printer but, candidly, Apple Computer. It really turned around that whole business."
The C++ programming language emerged as the dominant object-oriented language in the computer industry when Bjarne Stroustrup published "The C++ Programming Language." Stroustrup, at AT&T Bell Laboratories, said his motivation stemmed from a desire to write event-driven simulations that needed a language faster than Simula. He developed a preprocessor that allowed Simula-style programs to be implemented efficiently in C.
Stroustrup wrote in the preface to "The C++ Programming Language": "C++ is a general purpose programming language designed to make programming more enjoyable for the serious programmer. Except for minor details, C++ is a superset of the C programming language. In addition to the facilities provided by C, C++ provides flexible and efficient facilities for defining new types.... The key concept in C++ is class. A class is a user-defined type. Classes provide data hiding, guaranteed initialization of data, implicit type conversion for user-defined types, dynamic typing, user-controlled memory management, and mechanisms for overloading operators.... C++ retains C's ability to deal efficiently with the fundamental objects of the hardware (bits, bytes, words, addresses, etc.). This allows the user-defined types to be implemented with a pleasing degree of efficiency."
Apple engineer William Atkinson designed HyperCard, a software tool that simplified the development of in-house applications. HyperCard differed from previous programs of its sort because Atkinson made it interactive rather than language-based and geared it toward building user interfaces rather than processing data. In HyperCard, programmers built stacks of cards connected by hypertext links. Apple distributed the program free with Macintosh computers until 1992.
HyperCard users could look through existing HyperCard stacks as well as add to or edit the stacks. As a stack author, a programmer employed various tools to create his own stacks, linked together as a sort of slide show. At the lowest level, the program linked cards sequentially in chronological order, but the HyperTalk programming language allowed more sophisticated links.
Box Art for SimCity
Maxis released SimCity, a video game that helped launch a series of simulators. Maxis cofounder Will Wright built on his childhood interest in plastic models of ships and airplanes, eventually starting up a company with Jeff Braun and designing a computer program that allowed the user to create his own city. A number of other Sims followed in the series, including SimEarth, SimAnt, and SimLife.
In SimCity, a player starts with an untouched earth. Acting as mayor or city planner, the player creates a landscape and then constructs buildings, roads, and waterways. As the city grows, the mayor must provide basic services like health care and education, and make decisions about where to direct money and how to build a revenue base. Challenges come in the form of natural disasters, airplane crashes, and monster attacks.
Microsoft shipped Windows 3.0 on May 22. Compatible with DOS programs, the first successful version of Windows finally offered good enough performance to satisfy PC users. For the new version, Microsoft revamped the interface and created a design that allowed PCs to support large graphical applications for the first time. It also allowed multiple programs to run simultaneously on Intel 80386-based PCs.
Microsoft released Windows amid a $10 million publicity blitz. In addition to making sure consumers knew about the product, Microsoft lined up a number of other applications ahead of time that ran under Windows 3.0, including versions of Microsoft Word and Microsoft Excel. As a result, PCs moved toward the user-friendly concepts of the Macintosh, making IBM and IBM-compatible computers more popular.
Linus Torvalds, 1991
Designed by Finnish university student Linus Torvalds, Linux was released to several Usenet newsgroups on September 17th, 1991. Almost immediately, enthusiasts began developing and improving Linux, such as adding support for peripherals and improving its stability. In February 1992, Linux became free software or (as its developers preferred to say after 1998) open source. Linux typically incorporated elements of the GNU operating system and became widely used.
Pretty Good Privacy is introduced. Pretty Good Privacy, or PGP, is an e-mail encryption program. Its inventor, software engineer Phil Zimmermann, created it as a tool for people to protect themselves from intrusive governments around the world. Zimmermann posted PGP on the Internet in 1991, where it was available as a free download. The United States government, concerned about the strength of PGP, which rivaled some of the best secret codes in use at the time, opened a criminal investigation of Zimmermann but dropped it in 1996. PGP became the most widely used encryption system for e-mail in the world.
IBM 726 Dual Tape Drives
Magnetic tape allows for inexpensive mass storage of information and so is a key part of the computer revolution. The IBM 726 was one of the first practical high-speed magnetic tape systems for electronic digital computers. Announced on May 21, 1952, the system used a unique vacuum-column method of keeping a loop of tape circulating between two points, allowing the tape drive to start and stop the tape in a split second. The Model 726 was first sold with IBM’s first electronic digital computer, the Model 701, and could store 2 million digits per tape—an enormous amount at the time. It rented for $850 a month.
IBM's Rey Johnson in front of RAMAC 350 Disk File
The era of magnetic disk storage dawned with IBM´s shipment of a 305 RAMAC to Zellerbach Paper in San Francisco. The IBM 350 disk file served as the storage component for the Random Access Method of Accounting and Control. It consisted of 50 magnetically coated metal platters capable of storing 5 million characters of data. The platters, stacked one on top of the other, rotated with a common drive shaft.
The IBM 1301 Disk Storage System
IBM 1301 Disk Storage Unit is released. The IBM 1301 Disk Drive was announced on June 2nd, 1961 for use with IBM’s 7000-series of mainframe computers. Maximum capacity was 28 million characters and the disks rotated at 1,800 R.P.M. The 1301 leased for $2,100 per month or could be purchased for $115,500. The drive had one read/write arm for each disk as well as flying heads, both of which are still used in today’s disk drives.
Tom Kilburn in front of Manchester Atlas console
Virtual memory emerged from a team under the direction of Tom Kilburn at the University of Manchester on its Atlas computer (1962). Virtual memory permitted a computer to treat its secondary storage as an extension of main memory, letting it switch rapidly among multiple programs or users, a key requirement for timesharing.
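The mechanism rests on translating program ("virtual") addresses into physical ones through a page table. The toy sketch below illustrates that translation; the 512-byte page size and the mappings are invented for the example, and the Atlas used its own page size and a hardware lookup.

```python
# Toy illustration of the address translation at the heart of virtual memory.

PAGE_SIZE = 512

# Hypothetical page table: virtual page number -> physical page number.
page_table = {0: 3, 1: 7, 2: 1}

def translate(virtual_addr):
    """Map a virtual address to a physical one via the page table."""
    vpn, offset = divmod(virtual_addr, PAGE_SIZE)
    if vpn not in page_table:
        # A real system would fetch the page from secondary storage here.
        raise LookupError("page fault: page %d not resident" % vpn)
    return page_table[vpn] * PAGE_SIZE + offset

phys = translate(1 * PAGE_SIZE + 10)   # virtual page 1, offset 10
```

A miss in the table (a "page fault") is what lets the machine keep only the active pages of each program in core while the rest sit on drum or disk.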
Four Views of the IBM 1311 Including Removable Disk Pack
IBM 1311 Disk Storage Drive is announced. Announced on October 11, 1962, the IBM 1311 was the first disk drive IBM made with a removable disk pack. Each pack weighed about ten pounds, held six disks, and had a capacity of 2 million characters. The disks rotated at 1,500 RPM and were accessed by a hydraulic actuator with one head per disk. The 1311 offered some of the advantages of both tapes and disks.
The IBM Photo Digital Storage System, code-named Cypress
IBM 1360 Photo-Digital Storage System is delivered. In 1967, IBM delivered the first of its photo-digital storage systems to Lawrence Livermore National Laboratory. The system could read and write up to a trillion bits of information—the first such system in the world. The 1360 used thin strips of film on which bit patterns were written and then developed by a photographic system housed in the machine. The system used sophisticated error correction and a pneumatic robot to move the film strips to and from a storage unit. Only five were built.
IBM 23FD 8" floppy disk drive
An IBM team, originally led by David Noble, invented the 8-inch floppy diskette. It was initially designed for use in loading microcode into the controller for the "Merlin" (IBM 3330) disk pack file. It quickly won widespread acceptance as a program and data-storage medium. Unlike hard drives, a user could easily transfer a floppy in its protective jacket from one drive to another.
Original Shugart SA400 5 1/4" disk drive
The 5 1/4" flexible disk drive and diskette were introduced by Shugart Associates in 1976. This was the result of a request by Wang Laboratories to produce a disk drive small enough to use with a desktop computer, since 8" floppy drives were considered too large for that purpose. By 1978, more than 10 manufacturers were producing 5 1/4" floppy drives.
Shugart ST506 5MB Hard Disk Drive
Seagate Technology created the first hard disk drive for microcomputers, the ST506. The disk held 5 megabytes of data, five times as much as a standard floppy disk, and fit in the space of a floppy disk drive. A hard disk drive stores digital data on rigid metallic platters coated on both sides with a thin layer of magnetic material.
Seagate Technology grew out of a 1979 conversation between Alan Shugart and Finis Conner, who had worked together at Memorex. The two men decided to found the company after developing the idea of scaling down a hard disk drive to the same size as the then-standard 5 1/4-inch floppies. Upon releasing its first product, Seagate quickly drew such big-name customers as Apple Computer and IBM. Within a few years, it had sold 4 million units.
IBM 3380 Disk System
Hard disks are an essential part of the computer revolution, allowing fast, random access to large amounts of data. IBM announced its most successful mainframe hard disk (what IBM called a “Direct Access Storage Device,” or DASD), the 3380, in June of 1980, and shipped the first units the following year. The 3380 initially came in six models (later growing to many more), with prices at introduction ranging from $81,000 to $142,200. The base model stored 2.5 GB of data; later models extended this to 20 GB. IBM sold over 100,000 3380s, generating tens of billions of dollars in revenue and making the 3380 one of IBM’s most successful products of all time.
Sony 3 1/2" floppy diskette
Sony introduced and shipped the first 3 1/2" floppy drives and diskettes in 1981. The first significant company to adopt the 3 1/2" floppy for general use was Hewlett-Packard in 1982, an event which was critical in establishing momentum for the format and which helped it prevail over the other contenders for the microfloppy standard, including 3", 3 1/4", and 3.9" formats.
Able to hold 550 megabytes of prerecorded data, CD-ROMs grew out of music compact discs (CDs). The first general-interest CD-ROM product released after Philips and Sony announced the CD-ROM in 1984 was "Grolier´s Electronic Encyclopedia," which came out in 1985. The 9 million words in the encyclopedia took up only 12 percent of the available space. That same year, computer and electronics companies worked together to set a standard for the disks so any computer would be able to access the information.
Original Bernoulli Box
The Bernoulli Box is released. A removable cartridge-based storage system whose flexible disks were held close to the read/write head by the Bernoulli aerodynamic effect, the Bernoulli Box allowed people to move large files between computers when few alternatives (such as a network) existed. Offering many times the storage of a regular floppy disk, the cartridges came in capacities ranging from 5MB to 230MB.
IBM 3480 Cartridge Tape System
Magnetic tape allows for inexpensive mass storage of information and so is a key part of the computer revolution. Announced in March 1984, IBM’s new 3480 cartridge tape system sought to replace the traditional reels of magnetic tape in the computer center with a 4” x 5” cartridge that held more information (200MB) and offered faster access to it. IBM withdrew the system in 1989, but the format caught on with other computer makers, who continued producing 3480-compatible storage systems with increased capacity in the same physical format for several years afterward.
Early Zip Drive with Disks
The Iomega Zip Disk is released. The initial Zip system allowed 100MB to be stored on a cartridge roughly the size of a 3 ½ inch floppy disk. Later versions increased the capacity of a single disk to 250MB and then 750MB.