A Critical History of Computer Graphics and Animation

Lesson 1:
The history of computing technology


The study of the history of CGI (computer generated imagery) is an important part of our overall educational experience, not necessarily to build on the historical precedent, but to gain an understanding of the evolution of our discipline and to gain a respect for the key developments that have brought us to where we are. The discipline is so recent in its early developments and so rapidly changing that we are in fact living it, and it evolves as we speak. Yet we have been so busy advancing the discipline that we have often neglected to accurately record its history. So we must agree upon certain past events in order to begin developing a definitive record of what has transpired in this evolutionary process.

We must learn from the past, as we develop a theory and methodology which is tuned to the capabilities and qualities inherent in software, hardware, animation techniques, etc. that are part of our broad, contemporary, and creative computer graphics environment.

Herbert Freeman, in an introduction to his 1980 IEEE compilation of computer graphics papers, presents a succinct overview of the first two decades of the development of the CGI discipline. Like many other disciplines, computer graphics and animation has a rich (albeit relatively short) history that involves the following four eras, which are very much linked and related:

  • pioneers
  • innovators
  • adapters
  • followers

Early pioneers include artists and researchers. These visionaries saw the possibilities of the computer as a resource for making and interacting with pictures, and pushed the limits of an evolving technology to take it where computer scientists never imagined it could go. Their work motivated the work of others as they tried to realize the potential of this new vision.

Many of the so-called innovators were housed in universities and research labs, and were working toward solving fundamental problems of making "pictures" of data using the computer.

The early adapters included pioneering CGI production facilities, artists, researchers, and research labs and industries with an interest in converting much of this early work into a viable (and marketable) tool for realizing their disparate goals. The late seventies and early eighties saw the second wave of adapters, which consisted primarily of special effects production companies, equipment and software developers, universities, motion picture companies, etc.

As the technology advanced and the acceptance of this new approach to image making increased, the industry likewise evolved, and many of the current contributors, or followers (this descriptor is not intended to be demeaning or derogatory), came into being. These include effects production companies, universities, companies, and research labs.

In order to adequately discuss the beginnings of computer graphics, we need to step back further in history and investigate a number of contributions that influenced the way we do things. Some of these innovations are still used in one form or another today.

http://www.computer.org/history/development/index.html


Computing or calculating instruments date back to the abacus, used by early man to represent a positional counting notation in counting tables, called abaci. They were really not calculators per se, but provided a technique to keep track of sums and carries in addition. Although the abacus existed as far back as 5 A.D., the abacus as we know it is attributed to the Chinese around 1200 A.D.

John Napier in 1617 introduced a calculation aid for multiplication, called Napier's Bones. They consist of a set of wooden rods, each marked with a counting number at the top and multiples of that number down the length of the rod. When the rods for the digits of a number are aligned side by side, any single-digit multiple of that number can be read off from right to left by adding the digits in each parallelogram in the appropriate row. Multiplication is thus reduced to addition.
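
To make the arithmetic behind the rods concrete, here is a minimal sketch in Python (an illustration added for this discussion, not part of the historical record): each "bone" stores the one-digit multiples of its digit, and multiplying a multi-digit number by a single digit reduces to reading one row of the rods and adding the diagonal pairs with carries.

    def rod(digit):
        # One "bone": the multiples 1*digit .. 9*digit, each split into (tens, units).
        return [divmod(digit * m, 10) for m in range(1, 10)]

    def multiply_by_digit(number, d):
        # Read row d across the rods for `number` and add along the diagonals,
        # right to left, carrying as needed -- just as a user of the bones would.
        pairs = [rod(int(c))[d - 1] for c in str(number)]   # (tens, units) per rod
        result_digits, carry, tens_from_right = [], 0, 0
        for tens, units in reversed(pairs):
            total = units + tens_from_right + carry
            result_digits.append(total % 10)
            carry = total // 10
            tens_from_right = tens
        leftover = tens_from_right + carry
        while leftover:
            result_digits.append(leftover % 10)
            leftover //= 10
        return int("".join(map(str, reversed(result_digits))))

    print(multiply_by_digit(425, 6))   # 2550, the same answer as 425 * 6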

Napier also invented the logarithm, which formed the basis of the first slide rule, introduced in approximately 1622.

http://www.sliderule.ca/intro.htm
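
The slide rule works because logarithms turn multiplication into addition: log(ab) = log(a) + log(b), so sliding one logarithmic scale along another physically adds the two logarithms. A minimal sketch of that idea (again an added illustration, not from the original lesson):

    import math

    a, b = 3.0, 7.0
    # Adding the logarithms and exponentiating the sum recovers the product,
    # which is exactly what aligning two logarithmic scales does mechanically.
    product = 10 ** (math.log10(a) + math.log10(b))
    print(product)   # approximately 21.0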

Several automatic "calculators" were built in the 1600s, including the Schickard implementation of Napier's Bones, the Pascaline automatic adder, and the Leibniz automatic multiplier. Each of these devices was considered an "analog" device.

Modern computational devices are "digital". One of the earliest implementations of a digital system is attributed to Jacquard in 1801. The Jacquard loom was developed by Joseph-Marie Jacquard of France. He used punched cards to control the weaving actions of a loom, which made much more intricate patterns in woven cloth possible. Jacquard's approach was a variation on the original punched-card design of Jacques de Vaucanson in 1745. De Vaucanson was a toy maker (most famous for a mechanical duck), and his idea of automating the weaving process was not well accepted by weavers (a situation not unlike that of the modern-day computer ink and paint process in traditional animation).


The punched-card idea was adopted later by Charles Babbage about 1830 to control his Analytical Engine, and later by Herman Hollerith for tabulating the 1890 census. The Babbage Analytical Engine (which was never completed by him) was designed to use Jacquard's punched cards to control an automatic calculator, which could make decisions based on the results of previous computations. It was intended to employ several features later used in modern computers, including sequential control, branching, and looping.

An assistant to Babbage was the mathematician Augusta Ada Lovelace, daughter of the English poet Lord Byron, who created a "program" for the Analytical Engine. Had the Analytical Engine ever actually been built, Ada's program would have been able to compute a mathematical sequence known as the Bernoulli numbers. Based on this work, Ada is now credited as being the first computer programmer and, in 1979, a modern programming language was named Ada in her honor.
http://www.columbia.edu/acis/history/jacquard.html

http://www.maxmon.com/


In 1878, Oberlin Smith devised a crude magnetic recording device made of a silk thread covered with steel dust. In theory, when exposed to a magnetic field, the steel dust particles would align with the magnet, creating a digital pattern. Smith applied for a patent, but never followed through with the application. He concluded that he would not be able to establish a useful pattern on the thread; he published his results in 1888, but dropped his investigations. In 1898, inventor Valdemar Poulsen of Denmark filed a patent for a "Method of, and apparatus for, effecting the storing up of speech or signals by magnetically influencing magnetisable bodies". His idea was that a wire, when touched with an electromagnet at different points and times, would store a signal that later could be retrieved to recover the same energy that caused the magnetization in the first place. He developed his idea as a "telephone answering machine" called the Telegraphone and started a company to market it.

Another of Poulsen's devices can be considered the original version of the hard disk. It consisted of a 4.5 inch diameter steel disk with a raised spiral on the surface, which was traced by the electromagnet as the disk rotated, magnetizing the disk in the same fashion as the wire. Further contributions to magnetic recording were few until Fritz Pfleumer developed magnetic tape, a strip of paper covered with magnetic dust. The German General Electric company bought the patents from Pfleumer and marketed the first true tape recorder, the Magnetophone, in 1936.

Analog: relating to, or being a device in which data are represented by continuously variable, measurable, physical quantities, such as length, width, voltage, or pressure; a device having an output that is proportional to the input.

Digital: a description of data which is stored or transmitted as a sequence of discrete symbols from a finite set; most commonly this means binary data represented using electronic or electromagnetic signals.

Ref: dictionary.com

The U.S. Census Bureau was concerned about the difficulty of tabulating the 1890 census. One of its statisticians, Herman Hollerith, envisioned a machine that could automate the process, based on an idea similar to that used in the Jacquard loom. Hollerith designed punches for his system, which he called the Hollerith Electric Tabulating System. A pin would go through a hole in the census card to make an electrical connection with mercury placed beneath; the resulting electrical current activated a mechanical counter, and the information was tabulated. The tabulating system was featured in a 1900 issue of Scientific American magazine.

The 80 column punch card introduced by Hollerith in 1928 became the standard input medium for computers until the late 70s when interactive systems became usable. It was sized at 7 3/8 inches wide by 3 1/4 inches high by .007 inches thick. Prior to 1929, this was a standard size for many US banknotes, and Hollerith apparently chose it so that he could store cards in boxes made for the Treasury Department.
http://www.cs.uiowa.edu/~jones/cards/history.html


The "IBM card" was the source of a popular phrase, one that became the topic of a great article by Steven Lubar of the Smithsonian in 1991, titled "Do not fold, spindle or mutilate": A cultural history of the punch card.
http://ccat.sas.upenn.edu/slubar/fsm.html


Hollerith obtained over 30 patents for his research, but he was not a marketer. He felt he had a chokehold on the census tabulating machine, and he charged the Census Bureau more than it would have cost to do the work by hand. As a result, the Bureau developed, and in fact patented, its own version of the machine. Hollerith almost closed the doors on his company, but he was able to attract a brilliant man, Thomas J. Watson, and the company would later become International Business Machines (IBM).

The turn of the century saw a significant number of electronics-related contributions. One of the most significant was the triode vacuum tube, invented by Lee de Forest in 1906. It was an improvement on the Fleming tube, or Fleming valve, introduced by John Ambrose Fleming two years earlier. The vacuum tube contains three components: the anode, the cathode, and a control grid. It could control the flow of electrons between the anode and cathode using the grid, and could therefore act as a switch or an amplifier. A recent PBS special on the history of the computer used monkeys (the cathode) throwing pebbles (the electrons) through a gate (the grid) at a target (the anode) to explain the operation of the triode tube.

http://www.nobel.se/physics/educational/integrated_circuit/history/

http://www.pbs.org/transistor/album1/addlbios/deforest.html

A special kind of vacuum tube, the Cathode Ray Tube (CRT), was invented in 1885. In a CRT, images are produced when an electron beam generated by the cathode strikes a phosphorescent anode surface. The practicality of this tube was shown in 1897, when German scientist Ferdinand Braun introduced a CRT with a fluorescent screen, known as the cathode ray oscilloscope. The screen would emit a visible light when struck by a beam of electrons. This invention would lead to the modern television when Philo Farnsworth introduced the image dissector in 1927, and the first 60-line "raster scanned" image was shown. (It was an image of a dollar sign.) Variations of the CRT have been used throughout the history of computer graphics, and it was the graphics display device of choice until LCD displays were introduced roughly 100 years later. The three main variations of the CRT are the vector display, the "storage tube" CRT (developed in 1949), and the raster display.

Engineers were interested in the functionality of the vacuum tube, but were intent on discovering an alternative. Much like a light bulb, the vacuum tube generated a lot of heat and had a tendency to burn out in a very short time. It required a lot of electricity, and it was slow, big, and bulky, requiring fairly large enclosures. For example, the first digital computer (the ENIAC) weighed over thirty tons, consumed 200 kilowatts of electrical power, and contained around 19,000 vacuum tubes that got very hot very fast and as a result constantly burned out, making it very unreliable.

The specter of World War II, and the need to calculate complex values, e.g., weapons trajectories and firing tables to be used by the army, pushed the military to replace their mechanical computers, which were error prone. Lt. Herman Goldstine of the Aberdeen Proving Grounds contracted with two professors at the University of Pennsylvania's Moore School of Engineering to design a digital device. Dr. John W. Mauchly and J. P. Eckert, Jr. were awarded the contract in 1943 to develop the preliminary designs for this electronic computer. The ENIAC (Electronic Numerical Integrator and Computer) was placed in operation at the Moore School in 1944. Final assembly took place during the fall of 1945, and it was formally announced in 1946.


The ENIAC was the prototype from which most other modern computers evolved. All of the major components and concepts of today's digital computers were embedded in the design. ENIAC could determine the sign of a number, compare numbers, add, subtract, multiply, divide, and compute square roots. Its electronic accumulators combined the functions of an adding machine and a storage unit. No central memory unit existed, and storage was localized within the circuitry of the computer.


The primary aim of the designers was to achieve speed by making ENIAC as all-electronic as possible. The only mechanical elements in the final product were actually external to the calculator itself. These were an IBM card reader for input, a card punch for output, and the 1,500 associated relays.


An interesting side note: after the delivery of the prototype to the military, Eckert and Mauchly formed a company to commercialize the computer. Disputes arose over who owned the patents for the design, and the professors were forced to resign from the faculty of the University. The concept of "technology transfer" from university research labs to the private sector, which is common today, had no counterpart in the late 40s, or even into the 80s.

The revolution in electronics can be traced to the successful replacement of the tube by the transistor, discovered in 1947 by a team at Bell Labs (Shockley, Bardeen, and Brattain). Based on semiconductor technology, the transistor, like the vacuum tube, functioned as a switch or an amplifier. Unlike the tube, the transistor was small, ran at a very stable temperature, and was fast and very reliable. Because of its size and low heat, transistors could be arranged in large numbers in a small area, allowing the devices built from them to decrease significantly in size.

The transistor still had to be soldered into circuits by hand, so size was still limited. The ubiquitous presence of the transistor resulted in all sorts of mid-sized devices, from radios to computers, being introduced in the 50s. The next breakthrough, credited with spawning an entire industry of miniature electronics, came in 1958 with the discovery (independently by two individuals) of the integrated circuit. The integrated circuit (IC, or chip), invented by Jack St. Clair Kilby of Texas Instruments and Robert Noyce of Fairchild Electronics, allowed the entire circuit (transistors, capacitors, resistors, wires, ...) to be fabricated out of silicon on a single chip.


http://www.nobel.se/physics/educational/integrated_circuit/history/


Next: The emergence of Computer Graphics