A Critical History of Computer Graphics and Animation

Section 1:
The history of computing technology


The study of the history of CGI (computer generated imagery) is an important part of our overall educational experience, not necessarily to build on the historical precedent, but to gain an understanding of the evolution of our discipline and to gain a respect for the key developments that have brought us to where we are. The discipline is so recent in its early developments, and changing so rapidly, that we are in fact living its history as it evolves. Yet we have been so busy advancing the discipline that we have often neglected to accurately record that history. So we must agree upon certain past events in order to begin to develop a definitive record of what has transpired in this evolutionary process.

We must learn from the past, as we develop a theory and methodology which is tuned to the capabilities and qualities inherent in software, hardware, animation techniques, etc. that are part of our broad, contemporary, and creative computer graphics environment.

Sylvan Chasen of Lockheed, in a 1981 paper, characterized the evolution of the graphics discipline in stages similar to those of a human life. He placed "conception to birth", or the gestational period, at 1950-1963; childhood from 1964 to 1970; adolescence from 1970 to 1981; and adulthood from 1981 onward.

Herbert Freeman, in the introduction to his 1980 IEEE compilation of computer graphics papers, presents a succinct overview of the first two decades of the development of the CGI discipline. Like many other disciplines, computer graphics and animation have a rich (albeit relatively short) history that involves the following four eras, which are very much linked and related:

  • pioneers
  • innovators
  • adapters
  • followers

Early pioneers include artists and researchers. These visionaries saw the possibilities of the computer as a resource for making and interacting with pictures, and pushed the limits of an evolving technology to take it where computer scientists never imagined it could go. Their work motivated the work of others as they tried to realize the potential of this new vision.

Many of the so-called innovators were housed in universities and research labs, and were working toward solving fundamental problems of making "pictures" of data using the computer.

The early adapters included pioneering CGI production facilities, artists, researchers, and research labs and industries with an interest in converting much of this early work into a viable (and marketable) tool for realizing their disparate goals. The late seventies and early eighties saw a second wave of adapters, consisting primarily of special effects production companies, equipment and software developers, universities, and motion picture companies.

As the technology advanced and the acceptance of this new approach to image making increased, the industry likewise evolved, and many of the current contributors, or followers (a descriptor not intended to be demeaning or derogatory), came into being. These include effects production companies, universities, commercial companies, and research labs.

 

 


In order to adequately discuss the beginnings of computer graphics, we need to step back further in history and investigate a number of contributions that influenced the way we do things. Some of these innovations are still used, in one form or another, today.

http://www.computerhistory.org/timeline/


Computing or calculating instruments date back to the abacus, used by early civilizations as a positional counting notation on counting tables, called abaci. They were really not calculators per se, but provided a technique for keeping track of sums and carries in addition. Although the abacus existed in various forms centuries earlier, the abacus as we know it is attributed to the Chinese in about 1200 A.D.

John Napier in 1617 introduced a calculation aid for multiplication, called Napier's Bones. The bones consisted of a set of wooden rods, each marked with a counting number at the top and multiples of that number down the length of the rod. When the rods are aligned against a row of multiples, any multiple of the top number can be read off from right to left by adding the digits in each parallelogram in the appropriate row. Multiplication is thus reduced to addition.
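
The bones reduce each single-digit multiplication to diagonal addition, which is easy to express in modern terms. The following minimal Python sketch (our own illustration, not from the original source) splits each rod cell into a tens half and a units half and reads the product off the diagonals:

```python
def napier_multiply(n: int, digit: int) -> int:
    """Multiply n by a single digit the way Napier's bones do: each rod
    cell holds a two-digit partial product, tens above the diagonal and
    units below, and the answer is read off by adding the diagonals."""
    rods  = [int(c) for c in str(n)]            # one rod per digit of n
    tens  = [(d * digit) // 10 for d in rods]   # upper triangle of each cell
    units = [(d * digit) % 10  for d in rods]   # lower triangle of each cell

    m, carry, out = len(rods), 0, []
    for j in range(m + 1):                      # j = 0 is the rightmost diagonal
        s = carry
        if j < m:
            s += units[m - 1 - j]               # units entry on this diagonal
        if j > 0:
            s += tens[m - j]                    # tens entry on this diagonal
        out.append(s % 10)
        carry = s // 10
    return int("".join(str(d) for d in reversed(out)))

assert napier_multiply(425, 6) == 2550          # cells 2/4, 1/2, 3/0 read diagonally
```

Multi-digit multipliers were handled one digit at a time, shifting and adding the partial results just as in longhand multiplication.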

Napier also invented the logarithm, which was used in the first slide rule introduced in approximately 1622.
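
The logarithm turns multiplication into addition (log ab = log a + log b), and the slide rule embodies this mechanically: each number sits on a scale at a distance proportional to its logarithm, so sliding one scale along another adds lengths and thereby multiplies numbers. A tiny sketch of the principle (again our own illustration):

```python
import math

def slide_rule_multiply(a: float, b: float) -> float:
    """Multiply by adding logarithmic lengths, as a slide rule does."""
    length = math.log10(a) + math.log10(b)   # slide one scale along the other
    return 10 ** length                      # read the product off the scale

print(slide_rule_multiply(2, 8))             # ~16.0, to floating-point precision
```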

http://www.sliderule.ca/intro.htm

 

 


Abacus


Napier's Bones


Slide Rules



Several automatic mechanical "calculators" were built in the 1600s, including the Schickard implementation of Napier's Bones, the Pascaline automatic adder, and the Leibniz automatic multiplier. Each of these devices was considered an "analog" device.

In contrast, most modern computational devices are "digital". One of the earliest implementations of a digital system is attributed to Jacquard in 1801. The Jacquard loom was developed by Joseph-Marie Jacquard of France. He used punched cards to control the weaving actions of a loom, which made possible much more intricate patterns in woven cloth. Jacquard's approach was a variation on the original punched-card design of Jacques de Vaucanson in 1745. De Vaucanson was a toy maker (most famous for a mechanical duck), and his idea of automating the weaving process was not well accepted by weavers (a situation not unlike that of the modern-day computer ink and paint process in traditional animation).

http://www.columbia.edu/acis/history/jacquard.html
http://www.swarthmore.edu/Humanities/pschmid1/essays/pynchon/vaucanson.html


The punched-card idea was adopted later by Charles Babbage in the 1830s to control his Analytical Engine, and later still by Herman Hollerith for tabulating the 1890 census. Babbage's Analytical Engine (which he never completed) was designed to use Jacquard's punched cards to control an automatic calculator, which could make decisions based on the results of previous computations. It was intended to employ several features later used in modern computers, including sequential control, branching, and looping.

An assistant to Babbage was Augusta Ada Lovelace, daughter of the English poet Lord Byron and a mathematician in her own right, who created a "program" for the Analytical Engine to compute a mathematical sequence known as the Bernoulli numbers. Based on this work, Ada is now credited as being the first computer programmer, and in 1979 a modern programming language was named Ada in her honor.
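
Her published notes show how the Engine would compute the Bernoulli numbers. For comparison, here is a minimal modern Python sketch of the same sequence (our own, using the standard recurrence, not a transcription of her program):

```python
from fractions import Fraction
from math import comb

def bernoulli(m: int) -> Fraction:
    """m-th Bernoulli number, from the recurrence
    B_n = -1/(n+1) * sum_{k<n} C(n+1, k) * B_k, with B_0 = 1."""
    B = [Fraction(1)]
    for n in range(1, m + 1):
        s = sum(comb(n + 1, k) * B[k] for k in range(n))
        B.append(-s / (n + 1))
    return B[m]

print([str(bernoulli(m)) for m in range(9)])
# ['1', '-1/2', '1/6', '0', '-1/30', '0', '1/42', '0', '-1/30']
```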

http://www.agnesscott.edu/lriddle/women/love.htm


In 1878, Oberlin Smith devised a crude magnetic recording device made of a silk thread covered with steel dust. In theory, when exposed to a magnetic field, the steel dust particles would align with the magnet, creating a digital pattern. Smith applied for a patent, but never followed through with the application. After concluding that he could not establish a useful pattern on the threads, he published his results in 1888 and dropped his investigations.

In 1898, inventor Valdemar Poulsen of Denmark filed a patent for a "Method of, and apparatus for, effecting the storing up of speech or signals by magnetically influencing magnetisable bodies". His idea was that a wire, when touched by an electromagnet at different points and times, would store a signal that could later be retrieved to recover the same energy that caused the magnetization in the first place. He developed his idea as a "telephone answering machine" called the Telegraphone, and started a company to market it. Another of Poulsen's devices can be considered the original version of the hard disk: it consisted of a 4.5 inch diameter steel disk with a raised spiral on its surface, which was traced by the electromagnet as the disk rotated, magnetizing the disk in the same fashion as the wire.

Further contributions to magnetic recording were few until Fritz Pfleumer developed magnetic tape, a strip of paper covered with magnetic dust. (The first paper tape was coated with high-grade ferric-oxide barn paint, rust red in color, and a cloud of red dust sprayed into the air as the tape was used.) The German General Electric company (AEG) bought the patents from Pfleumer and marketed the first true tape recorder, the Magnetophon, in 1936.

http://www.amps.net/newsletters/issue27/27_poulsen.htm


Pascaline adder

Analog: relating to, or being a device in which data are represented by continuously variable, measurable, physical quantities, such as length, width, voltage, or pressure; a device having an output that is proportional to the input.

Digital: A description of data which is stored or transmitted as a sequence of discrete symbols from a finite set, most commonly this means binary data represented using electronic or electromagnetic signals.

Ref: dictionary.com


Jacquard's loom


Telegraphone

The U.S. Census Bureau was concerned about the difficulty of tabulating the 1890 census. One of its statisticians, Herman Hollerith, envisioned a machine that could automate the process, based on an idea similar to that used in the Jacquard loom. Hollerith designed punches for his system, which he called the Hollerith Electric Tabulating System. A pin would pass through a hole in the census card to make an electrical connection with mercury placed beneath; the resulting current activated a mechanical counter, and the information was tabulated. The tabulating system was featured in a 1900 issue of Scientific American magazine.

The 80-column punch card introduced in 1928 became the standard input medium for computers until the late 70s, when interactive systems became usable. It measured 7 3/8 inches wide by 3 1/4 inches high by 0.007 inches thick. Prior to 1929, this was a standard size for many US banknotes, and Hollerith apparently chose it so that he could store cards in boxes made for the Treasury Department.
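
Each of the 80 columns encoded one character as punches in twelve rows (12, 11, and 0 through 9). A rough Python sketch of the familiar zone-and-digit convention for digits and letters (our own simplification; real encodings varied):

```python
def punches(ch: str) -> list[str]:
    """Card rows punched for one character: a digit is one punch in its
    own row; a letter adds a zone punch (row 12, 11, or 0) to a digit punch."""
    if ch.isdigit():
        return [ch]
    ch = ch.upper()
    if "A" <= ch <= "I":
        return ["12", str(ord(ch) - ord("A") + 1)]
    if "J" <= ch <= "R":
        return ["11", str(ord(ch) - ord("J") + 1)]
    if "S" <= ch <= "Z":
        return ["0", str(ord(ch) - ord("S") + 2)]
    raise ValueError(f"no punch code in this sketch for {ch!r}")

# One column per character, up to the card's 80-column capacity:
print([punches(c) for c in "CENSUS1890"])
```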

http://www.cs.uiowa.edu/~jones/cards/history.html
http://www.computer.org/portal/pages/annals/content/punchedcards.html


The "IBM card" was the source of a popular phrase which became the topic for a great article by Steven Lubar of the Smithsonian in 1991, titled "Do not fold, spindle or mutilate": A cultural history of the punch card.

"Do not fold, spindle or mutilate..."


Hollerith obtained over 30 patents for his research, but he was not a marketer. He felt he had a chokehold on the census tabulating machine, and he charged the Census Bureau more than it would have cost to do the work by hand. As a result, the Bureau developed, and in fact patented, its own version of the machine. Hollerith almost closed the doors on his company, but he was able to attract a brilliant man, Thomas J. Watson, and his company survived and would later become International Business Machines (IBM).


Hollerith Punch Card Machine


Punch Card


 

The turn of the century saw a significant number of electronics-related contributions. One of the most significant was the vacuum tube, invented by Lee de Forest in 1906. It was an improvement on the Fleming tube, or Fleming valve, introduced by John Ambrose Fleming two years earlier. De Forest's tube contained three components: the anode, the cathode, and a control grid. The grid could regulate the flow of electrons between the anode and cathode, so the tube could act as a switch or an amplifier. A PBS special on the history of the computer used monkeys (the cathode) throwing pebbles (the electrons) through a gate (the grid) at a target (the anode) to explain the operation of the triode tube.
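
In circuit terms, the grid voltage gates the current flowing from cathode to plate (anode). The following toy model (entirely our own idealization, using the classical 3/2-power law often applied to triodes; the constants K and mu are invented for illustration) shows both behaviors: drive the grid negative enough and the tube switches off, while small grid changes produce large, amplified swings in plate current:

```python
def plate_current(v_grid: float, v_plate: float,
                  K: float = 1e-3, mu: float = 20.0) -> float:
    """Idealized triode: plate current follows I = K * (Vg + Vp/mu)^(3/2),
    clipped to zero below cutoff."""
    drive = v_grid + v_plate / mu          # the grid term dominates, by the factor mu
    return K * max(0.0, drive) ** 1.5      # no conduction once the grid is too negative

# Sweep the grid from cutoff into conduction at a fixed 250 V plate supply:
for vg in (-20.0, -10.0, -5.0, 0.0):
    print(f"Vg = {vg:6.1f} V  ->  Ip = {plate_current(vg, 250.0):.4f} A")
```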

http://www.nobel.se/physics/educational/integrated_circuit/history/
http://www.pbs.org/transistor/album1/addlbios/deforest.html

 


Simulation of a Vacuum Tube

 

Engineers were interested in the functionality of the vacuum tube, but were intent on discovering an alternative. Much like a light bulb, the vacuum tube generated a lot of heat and had a tendency to burn out in a very short time. It required a lot of electricity, and it was slow, big, and bulky, requiring fairly large enclosures. For example, the first digital computer (the ENIAC) weighed over thirty tons, consumed 200 kilowatts of electrical power, and contained around 19,000 vacuum tubes that got very hot very fast and as a result constantly burned out, making it very unreliable.

A special kind of vacuum tube, the Cathode Ray Tube (CRT), was invented in 1885. It produces images when an electron beam generated by the cathode strikes a phosphorescent surface. The practicality of this tube was shown in 1897, when the German scientist Ferdinand Braun introduced a CRT with a fluorescent screen, known as the cathode ray oscilloscope; the screen would emit a visible light when struck by a beam of electrons.

This invention would lead to the modern television. Philo Farnsworth introduced the image dissector in 1927, and the first 60-line "raster scanned" image was shown (it was an image of a dollar sign). Farnsworth has been called one of the greatest inventors of all time, but he suffered in obscurity for a long period because of an unfortunate set of circumstances. RCA challenged the patents that Farnsworth received in 1930 for the technology that became television, and although he won the litigation, it took so long that his patents expired, while RCA maintained a public relations campaign to promote one of its engineers as the actual inventor. (See Note 1 below.)

Variations of the CRT have been used throughout the history of computer graphics, and the CRT remained the graphics display device of choice until the LCD display was introduced roughly 100 years later. The three main variations of the CRT are the vector display, the "storage tube" display (developed in 1949), and the raster display.

 


Vacuum tube & Transistor

 

http://entertainment.howstuffworks.com/tv2.htm


 

The specter of World War II, and the need to calculate complex values such as weapons trajectories and firing tables for the Army, pushed the military to replace its mechanical computers, which were error prone. Lt. Herman Goldstine of the Aberdeen Proving Ground contracted with the University of Pennsylvania's Moore School of Engineering to design a digital device, and in 1943 two men at the school, Dr. John W. Mauchly and J. Presper Eckert, Jr., were awarded a contract to develop the preliminary designs for this electronic computer. The ENIAC (Electronic Numerical Integrator and Computer) was placed in operation at the Moore School in 1944; final assembly took place during the fall of 1945, and it was formally announced in 1946.


The ENIAC was the prototype from which most other modern computers evolved, and all of the major components and concepts of today's digital computers were embedded in its design. ENIAC could distinguish the sign of a number, compare numbers, add, subtract, multiply, divide, and compute square roots. Its electronic accumulators combined the functions of an adding machine and a storage unit. No central memory unit existed; storage was localized within the circuitry of the computer.


The primary aim of the designers was to achieve speed by making ENIAC as all-electronic as possible. The only mechanical elements in the final product were actually external to the calculator itself: an IBM card reader for input, a card punch for output, and their 1,500 associated relays.


An interesting side note occurred after the delivery of the prototype to the military, when Eckert and Mauchly formed a company to commercialize the computer. Disputes arose over who owned the patents for the design, and the two were forced to resign from the faculty of the University. The concept of "technology transfer" from university research labs to the private sector, which is common today, had no counterpart in the late 40s, or even into the 80s. (See Note 2 below.)


The ENIAC

The revolution in electronics can be traced to the successful replacement of the tube following the invention of the transistor in 1947 by a team at Bell Labs (Shockley, Bardeen, and Brattain). Based on semiconductor technology, the transistor, like the vacuum tube, functioned as a switch or an amplifier. Unlike the tube, the transistor was small, had a very stable temperature, and was fast and very reliable. Because of its size and low heat, transistors could be arranged in large numbers in a small area, allowing the devices built from them to decrease significantly in size.

http://www.bellsystemmemorial.com/belllabs_transistor.html


The transistor still had to be soldered into circuits by hand, so size was still limited. Nevertheless, the ubiquitous presence of the transistor resulted in all sorts of mid-sized devices, from radios to computers, that were introduced in the 50s. The next breakthrough, credited with spawning an entire industry of miniature electronics, came in 1958 with the invention (independently, by two individuals) of the integrated circuit. The integrated circuit (IC, or chip), invented by Jack St. Clair Kilby of Texas Instruments and Robert Noyce of Fairchild Semiconductor, allowed an entire circuit (transistors, capacitors, resistors, wires, ...) to be fabricated on a single chip of silicon.

http://www.nobel.se/physics/educational/integrated_circuit/history/


The earliest transistor (source: Bell Labs)

A photo of the first integrated circuit can be found at
http://www.computer-museum.org/collection/ti-icphoto.html

   

Notes:

1. The following text is from Time's accounting of the 100 great inventors of all time:
    "As it happens, [Vladamir] Zworykin had made a patent application in 1923, and by 1933 had developed a camera tube he called an Iconoscope. It also happens that Zworykin was by then connected with the Radio Corporation of America, whose chief, David Sarnoff, had no intention of paying royalties to Farnsworth for the right to manufacture television sets. "RCA doesn't pay royalties," he is alleged to have said, "we collect them."
     And so there ensued a legal battle over who invented television. RCA's lawyers contended that Zworykin's 1923 patent had priority over any of Farnsworth's patents, including the one for his Image Dissector. RCA's case was not strong, since it could produce no evidence that in 1923 Zworykin had produced an operable television transmitter. Moreover, Farnsworth's old [high school] teacher, [Justin] Tolman, not only testified that Farnsworth had conceived the idea when he was a high school student, but also produced the original sketch of an electronic tube that Farnsworth had drawn for him at that time. The sketch was almost an exact replica of an Image Dissector.
     In 1934 the U.S. Patent Office rendered its decision, awarding priority of invention to Farnsworth. RCA appealed and lost, but litigation about various matters continued for many years until Sarnoff finally agreed to pay Farnsworth royalties.
     But he didn't have to for very long. During World War II, the government suspended sales of TV sets, and by the war's end, Farnsworth's key patents were close to expiring. When they did, RCA was quick to take charge of the production and sales of TV sets, and in a vigorous public-relations campaign, promoted both Zworykin and Sarnoff as the fathers of television."

Ref: http://www.time.com/time/time100/scientist/profile/farnsworth.html
and   http://www.farnovision.com/chronicles/

2. The ENIAC patents (which covered basic patents relating to the design of electronic digital computers) were filed in 1947 by John W. Mauchly and J. Presper Eckert arising from the work conducted at the Moore School of Electrical Engineering at the University of Pennsylvania. In 1946, Eckert and Mauchly left the Moore School and formed their own commercial computer enterprise, the Electronic Control Company, which was later incorporated as the Eckert-Mauchly Computer Corporation. In 1950 Remington Rand acquired Eckert-Mauchly and the rights to the ENIAC patent eventually passed to Sperry Rand as a result of a merger of the Sperry Corporation and Remington Rand in 1955. After the patent was granted to the Sperry Rand Corporation in 1964, the corporation demanded royalties from all major participants in the computer industry. Honeywell refused to cooperate, so Sperry Rand then filed a patent infringement suit against Honeywell in 1967. Honeywell responded in the same year with an antitrust suit charging that the Sperry Rand-IBM cross-licensing agreement was a conspiracy to monopolize the computer industry, and also that the ENIAC patent was fraudulently procured and invalid.

Ref: Charles Babbage Institute, Honeywell vs. Sperry Litigation Records, 1947-1972. Also see a first-person account by Charles McTiernan in an Anecdotes article in the Annals of the History of Computing.

 
   

 


Next: The emergence of Computer Graphics
 