A Critical History of Computer Graphics and Animation

Section 12:
Analog approaches, non-linear editing, and compositing


As seen in the previous section, the 1970s and 1980s saw computer graphics proliferate throughout the commercial production world. Most of these companies and efforts were built around evolving digital computer technology, and relied on computer programs and related CG systems to allow the artist to affordably and efficiently create synthetic imagery. A parallel technology evolution was taking place at the same time, only in the analog world of image-making technology and production. For example, Computer Image Corporation (CIC) developed complex hardware and software systems that included ANIMAC, SCANIMATE and CAESAR. All of these systems worked by scanning in existing artwork and then manipulating it: making it squash, stretch, spin and fly around the screen, colorizing it, scaling it, and so on. Bell Telephone, DuPont, ABC TV, NBC, and CBS Sports were among the many corporations that made use of this style of computer graphics.

Perhaps one of the earliest pioneers of this analog computer animation approach was Lee Harrison III. In the early 1960s, he experimented with animating figures using analog circuits and a cathode ray tube. Ahead of his time, he rigged up a body suit with potentiometers and created the first working motion capture rig, animating 3D figures in real time on his CRT screen. He made several short films with this system, which he called ANIMAC.

Harrison studied at the School of Fine Arts at Washington University in St. Louis. After a stint in the Coast Guard, he joined McDonnell Aircraft in St. Louis as a technical illustrator. He returned to Washington University to study engineering, then took a job as an engineer at Philco Corporation in Philadelphia and later as a bio-cybernetic engineer at the Denver Research Institute at the University of Denver.

It was while he was at Philco that he decided to chase his idea of systematically creating animated figures. His concept was to view a stick figure as a collection of lines that could be independently moved and positioned to form an animated character. Each of the lines would be displayed on a CRT and controlled with a vector deflection of the CRT's electron beam. Each figure would be composed of bones, skin, joints, wrinkles, eyes, and moving lips, all drawn in sequence to create what Harrison called a "cathode ray marionette."

In order to accomplish this task, he founded a company in Philadelphia called Lee Harrison and Associates. His ideas were realized with a hardware system designed and developed by his new company, which he called ANIMAC. Harrison participated in the Ars Electronica Festival in 1992, along with computer video artists Bill Etra and Bill Diamond and others. A facsimile of a paper entitled Notes for an Early Animation Device (reference) (PDF), which describes the design and operation of ANIMAC, is included in the archives of the Festival. The following is from an interview with Jeff Schier, also published in the Festival proceedings:

"WE STARTED OUT by developing what later became ANIMAC. At first we called our machine "The Bone Generator" because it made sections of straight lines that could be hooked together and could be individually animated or moved in three dimensional space. To determine what a bone was you had to determine where it was to start in X, Y, Z space, in which direction it went from there, and for how long, in order to determine its length. The parameters that determined which direction it was going in also determined the actual length projected onto the face of the tube. lf you saw a bone from the side you saw its full length but if it were pointing toward you, you saw only a portion of it.

A bone was composed of a bi-stable multi-vibrator or a flip-flop. To start it was to essentially put a signal on a line that governed the opening of a lot of sampling gates. The inputs to the gates were the parameters that governed the position and some of the qualities and characteristics of that bone. To program it we had a patch panel. We always had a navel point on our figures and we'd always flip back to the navel point. We'd go up and out an arm and go back to the navel point, go up and out another arm and back to the navel, go up and out to the head. Those were all fly-back bones and we would fly-back by just collapsing the information that was contained on a capacitor.

In order to determine the length of a bone we used time as the basis. We'd start drawing in a certain direction determined by the specific parameters and we'd go in that direction until we'd turned that bone off and then essentially we'd wait there until we drew another bone. The length was determined by plugging a timing circuit into a place which was reset after each bone. When you started a bone you also started that counter and that flip-flop was plugged into the counter that would turn that bone off. It was pretty much all digital. The next bone would be plugged into another count and so forth and you varied the counts depending. A count represented some number of high frequency units that was part of the clock network of the whole machine.

The patch panel was color-coded and it was a big patch panel we got out of the junkyard someplace. If you understood the code you could actually see the bones on this patch panel. There would be a certain color like green and the output might be a blue. If you were going to bone number one, you brought a start pulse that was located somewhere and you'd plug into the first bone and then you'd plug in the output of the first bone into the second bone and so forth. The inputs to the parameter gates were not located on that panel. They were located down a little lower on the face of the Animac and there were hundreds of them. You had all of these hundreds of inputs required to make the thing happen and to change it over time. After this, the main thrust of our development was to make things change over time which eventually culminated in what we called key frame programming where we would turn knobs until we got what we wanted."
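Although ANIMAC was built from flip-flops, gates and a patch panel rather than software, the bone mechanism Schier describes translates naturally into a few lines of code. The sketch below, written in Python with made-up bone directions and durations, chains bones together and flies back to a shared navel point between limbs; it illustrates the idea only and is not a model of the actual circuitry.

# A minimal digital sketch of the "bone" idea: each bone is a direction in
# 3D space plus a duration (how long the beam travels that way), and the
# figure is drawn by chaining bones, flying back to a shared "navel" point
# between limbs. All names and numbers are invented for illustration.

NAVEL = (0.0, 0.0, 0.0)

# (dx, dy, dz, duration) -- the direction the beam travels, and for how long
left_arm  = [(-1.0,  0.5, 0.0, 0.4), (-1.0, -0.2, 0.0, 0.3)]
right_arm = [( 1.0,  0.5, 0.0, 0.4), ( 1.0, -0.2, 0.0, 0.3)]
head      = [( 0.0,  1.0, 0.0, 0.5)]

def trace_limb(start, bones):
    """Return the 3D points the beam visits while drawing one limb."""
    points = [start]
    x, y, z = start
    for dx, dy, dz, duration in bones:
        x, y, z = x + dx * duration, y + dy * duration, z + dz * duration
        points.append((x, y, z))
    return points

# Each limb is a fly-back bone chain that starts again at the navel.
for limb in (left_arm, right_arm, head):
    print(trace_limb(NAVEL, limb))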

The "skin" was added to the bones by superimposing "springs" that modulated the stick vectors with circular sweeps of spinning vectors. The thickness of the bones, or displacement of the rings from the center of the line, was voltage modulated by a "skin scanner." The scanner was constructed from a "flying spot scanner," a vector camera pointing at an intensity graph with higher brightness representing a larger bone displacement. The "joints" or connection of bones to skin were formed by drawing the bones in a specified order, the endpoints being momentarly held till the next bone was drawn. A synthetic mouth, lips and eyeballs were created through parabolas and sine waves modulated with precise control from voltage sources. The entire figure was manipulated in three dimensions by passing the control signals through a three dimensional (3D) rotation matrix. These control signals were formed from horizontal and vertical sweep generators, with camera angle, size and position voltages run through rotation matrices constructed from adders, multipliers and sine/cosine generators.

To give the illusion of depth, an additional camera tracked the intensity of the skin, giving the illusion of an edge by modulating the skin brightness and leaving it in silhouette. This same camera scanned a texture and superimposed it on the skin surface of the bone.

Harrison's motion capture rigging was a prelude to things to come. The animation harness was fabricated from potentiometers, with Lincoln Logs used as armatures. Manipulating the harness tied tactile movement to control voltages, making the character "dance" like the person in the harness.
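In software terms the harness amounts to reading a voltage from each potentiometer and scaling it into a joint angle every frame, a direct ancestor of how motion capture data is handled today. The toy example below assumes a 0-5 volt pot and a ±90 degree joint; both ranges, and the joint names, are invented for illustration.

# A toy version of the harness: each potentiometer reading is mapped
# linearly into a joint angle that would drive the figure in real time.

POT_RANGE = (0.0, 5.0)        # volts (assumed)
ANGLE_RANGE = (-90.0, 90.0)   # degrees of joint rotation (assumed)

def pot_to_angle(volts):
    lo_v, hi_v = POT_RANGE
    lo_a, hi_a = ANGLE_RANGE
    return lo_a + (volts - lo_v) / (hi_v - lo_v) * (hi_a - lo_a)

# One sampled "frame" of harness readings: shoulder, elbow, hip, knee.
frame = {"shoulder": 3.1, "elbow": 1.2, "hip": 2.5, "knee": 4.0}
joint_angles = {name: pot_to_angle(v) for name, v in frame.items()}
print(joint_angles)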

The ANIMAC was largely a proof of concept prototyped with vacuum tubes mounted on 2 by 4's, using a Heathkit oscillator as the master clock and driving an XY oscilloscope for the display. Harrison's company went public and was renamed Computer Image Corporation in 1969.

That same year ANIMAC was converted into a transistorized version. To capitalize on the scan-processing experiments, the approach used for moving an animated character was applied instead to moving logos and high-contrast graphics about the screen. The skin was "unraveled" and became small movable rasters called "flags." The skin scanner was modified to point at the high-contrast artwork of a logo or corporate graphic; the intensity of the scanned image filled the undulating flag, which could then be flown and spun across the surface of the screen. The multiple-bone mechanism was simplified into five flag generators. The XY display was re-scanned by a video camera with 5 levels of colorization and combined with a background graphic for recording onto video tape. The new machine that incorporated all of these modifications was called SCANIMATE.

SCANIMATE allowed interactive control (scaling, rotation, translation), recording and playback of video overlay elements to generate complex 2D animations for television. In fact, most of the 2D flying logos and graphics elements for television advertising and promotion in the 1970s were produced using SCANIMATE systems.
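The moves SCANIMATE produced with analog scan processing correspond, in digital terms, to simple 2D affine transforms applied frame by frame. The sketch below treats a "logo" as a handful of 2D points and scales, rotates and translates it over a short move; the parameters and the move itself are invented, and this is only a stand-in for what the hardware did with deflection voltages.

import numpy as np

# Illustrative 2D scale/rotate/translate of a "logo", frame by frame.

def transform(points, scale, angle_deg, translate):
    a = np.deg2rad(angle_deg)
    rot = np.array([[np.cos(a), -np.sin(a)],
                    [np.sin(a),  np.cos(a)]])
    return points @ (scale * rot).T + translate

logo = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], dtype=float)  # a square "logo"

# A short move: the logo spins and grows as it flies across the screen.
for frame in range(5):
    t = frame / 4.0
    out = transform(logo, scale=0.5 + t, angle_deg=90 * t,
                    translate=np.array([2.0 * t, 0.5 * t]))
    print(frame, np.round(out, 2).tolist())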

In 1972 Harrison won an Emmy award for his technical achievements. As computer graphics systems became more powerful in the 1980s, Harrison's analog systems began to be superseded by digitally rendered keyframe animation, and they are no longer used in production. Several are still running, however; Dave Sieg brought one to the SIGGRAPH 98 conference as part of a CG history exhibition.

Computer Image went on to develop a system called CAESAR, which stood for “Computer Animated Events with Single Axis Rotation.” CAESAR was targeted at moving cartoon characters’ limbs in an attempt to automate the Saturday morning cartoon production process. CAESAR used most of Scanimate’s analog processing technology with digital parameter storage on a Data General minicomputer. According to Dave Sieg:

CAESAR was never mass produced to my knowledge. A later product, System V was produced, with an improved digital computer, and it replaced Scanimate in several facilities. While at Image West, I also worked on development of a hybrid analog/digital system called VersEFx. But ultimately, the analog image quality produced by such CRT rescan systems could not compete with that of totally digital systems.

There were several major production facilities that used the equipment made by CIC. Two of the most prominent were Dolphin Productions in New York and Image West in Los Angeles.

 

 

 

 

 

 


Image courtesy of Lee Harrison

 

 

Edwin J. Tajchman (left) & Lee Harrison III receiving the National Academy of Television Arts and Science award for "Outstanding Achievement in Engineering Development"

Above three images from Notes for an Early Animation Device, presented at the Ars Electronica Endo Und Nano Festival, 1992

 

 

 

 

 

 

 

 

 



Dave Sieg in front of his working Scanimate at
SIGGRAPH 98
For more about Scanimate, visit Sieg's historical web site at

http://scanimate.zfx.com/

 

Dave Sieg wrote an article for the SIGGRAPH newsletter about his experiences with Scanimate as part of the SIGGRAPH 98 History Project, called
Scanimation in the Analog Days


Animac Stick Man

 


Interview with Lee Harrison


Scanimate operation

 


The Body Human

 

Note: The Real videos on this page (except Electric Company) are no longer available on the host streaming server. I will try to get new links ASAP. Sorry.


Interview with Dave Sieg at SIGGRAPH 98

 


The Electric Company

 

 


Images of productions on Scanimate

Dolphin Productions

Text to come soon.

 

 


The Scanimate installed at Dolphin in 1974

 

 

 

Image West

In 1977, Computer Image decided it could find a larger market by having a facility in Hollywood, so it borrowed heavily and set up a new company, Image West, Ltd. The company, while very good technically, did not know the Hollywood market well, and was eventually foreclosed on by its bank. The bank then approached Computer Image's largest customer, Omnibus, Inc. of Canada, and offered to have it take over Image West. Omnibus agreed, and operated the company until 1982, when it sold Image West and went on to form Omnibus Computer Graphics Inc., using digital technology licensed from the New York Institute of Technology. Image West continued to operate Scanimates until 1986, when it discontinued operations.


Operation of Scanimate - 1981


Image West Demo 15


Ron Hays

The New Television Workshop at WGBH supported the creation and broadcast of experimental works by artists. One of its projects was the Music Image Workshop, primarily the work of Ron Hays, who used the Paik-Abe video synthesizer to create elaborate visual scores set to music. The workshop was funded by the Rockefeller Foundation and the National Endowment for the Arts from 1972 through 1974. Hays worked closely with WGBH producer and director David Atwood to create both live broadcasts and finished works. Works by other artists were also presented under the auspices of the Music Image Workshop.

Hays later produced a short film with Michael Tilson Thomas, called Space for Head and Hands. It was an improvisation by Hays with piano by Thomas. He also produced animation for the Julie Christie movie Demon Seed, and a video art compilation of music, computer graphics and art for Odyssey called Ron Hays Music Image. This animation was produced using the Scanimate system.


Sergeant Pepper

 

 


 

 

 

Quantel was founded in 1973 in Newbury, UK. Its focus was to create technology for use in television production. Its first product was the DFS 3000, the first digital framestore. The DFS 3000 is most widely known as the device that enabled the first video inset for television broadcast: at the 1976 Montreal Olympics, a video image inset into the main picture showed a close-up of the Olympic torch as the runner entered the stadium. Quantel followed this with a series of digital effects devices, including the DPE 5000 and the DLS 6000.

The most widely known Quantel product is the Paintbox, introduced in 1981 and still in production use at facilities all over the world. The Paintbox is considered one of the first commercial paint systems, and was first shown at the spring NAB show in 1980. Quantel obtained patents related to the system, and challenged the entire digital image community with a series of controversial infringement lawsuits. (More can be read about these challenges in Note 1 below.) The Paintbox spawned a range of digital technologies from Quantel.

The next device, released in 1982, was the Mirage, the first digital effects machine able to manipulate 3D images in 3D space. It also used an interesting method for transforming one image into another: a two-dimensional "particle system" mapped pixels from one image onto pixels from the second image. As the pixel tiles moved over time, the first image appeared to disintegrate and then restructure itself into the second image.
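Quantel never published the Mirage internals, but the effect described above can be imitated with a simple tile-based scheme: cut the first image into tiles, give each tile a destination in the second image, and move and cross-fade the tiles over the course of the transition. The sketch below is such a stand-in, not the actual Mirage algorithm, and the image data is random placeholder content.

import numpy as np

# Simplified tile ("particle") transition from image A to image B.

def make_tile_map(h, w, tile, rng):
    """Give every source tile a fixed destination tile position."""
    positions = [(y, x) for y in range(0, h - tile + 1, tile)
                        for x in range(0, w - tile + 1, tile)]
    perm = rng.permutation(len(positions))
    return [(positions[i], positions[j]) for i, j in enumerate(perm)]

def transition_frame(img_a, img_b, tile_map, tile, t):
    """Frame at time t in [0, 1]: tiles drift and cross-fade from A to B."""
    out = np.zeros_like(img_a)
    for (sy, sx), (ty, tx) in tile_map:
        y = int(round(sy + (ty - sy) * t))
        x = int(round(sx + (tx - sx) * t))
        patch_a = img_a[sy:sy + tile, sx:sx + tile]
        patch_b = img_b[ty:ty + tile, tx:tx + tile]
        out[y:y + tile, x:x + tile] = (1 - t) * patch_a + t * patch_b
    return out

rng = np.random.default_rng(0)
a = rng.random((64, 64))                 # stand-ins for the two video frames
b = rng.random((64, 64))
tiles = make_tile_map(64, 64, 8, rng)
mid_frame = transition_frame(a, b, tiles, 8, 0.5)
print(mid_frame.shape)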

Another radical leap was made in 1986, when Quantel introduced Harry, the world's first non-linear editor (NLE). NLEs are now standard in the editing business, and have made linear editing suites largely obsolete. In some ways Harry did for video editors what Paintbox had done for graphic designers, giving them a tool that moved their trade forward by leaps and bounds. The Harry combined several minutes of digital-disk storage with a 2D graphics system and a crude but elegant means of assembling video clips. It was the only digital non-linear editing workstation capable of producing broadcast-quality material for almost a decade.
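What made Harry "non-linear" is easiest to see in miniature: because the source material sits on random-access storage, an edit is just an ordered list of clip references that can be trimmed and rearranged at will, with no tape-to-tape re-recording. The toy model below uses invented clip names and frame counts purely to illustrate the idea.

# A toy edit: clips on random-access storage, referenced by an edit
# decision list that can be reordered without re-recording anything.

clips = {
    "interview": list(range(300)),   # 300 frames of source material
    "logo_fly":  list(range(120)),
    "titles":    list(range(90)),
}

# The edit decision list: (clip name, in point, out point), in any order.
edl = [
    ("titles",    0,  60),
    ("interview", 30, 210),
    ("logo_fly",  0, 120),
]

def render(edl, clips):
    """Concatenate the referenced frame ranges into the finished programme."""
    out = []
    for name, in_pt, out_pt in edl:
        out.extend(clips[name][in_pt:out_pt])
    return out

print(len(render(edl, clips)), "frames")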

Other important Quantel contributions can be seen below from their timeline at http://www.quantel.com

1975: DFS 3000 The world's first digital framestore.
1977: DSC 4002 The first portable digital standards converter.
1978: DPE 5000 The first commercially successful digital effects machine.
1980: DLS 6000 Digital still storage for on-air presentation.
1981: Paintbox The industry standard graphics kit.
1982: Mirage The first digital effects machine able to manipulate 3D images in 3D space.
1986: Harry The first NLE also makes multilayering of live video a practical proposition.
1989: V-series The second generation Paintbox. Faster, smaller and more powerful.
1990: Picturebox Integrates the storage, presentation and management of stills.
1990: Harriet Manipulating live graphics over video.
1992: Henry The effects editor. Offers simultaneous layering of multiple live video sources.
1992: Hal The video design suite. The first dedicated graphics and compositing centre.


Note: The Paintbox was used extensively in producing the graphics for the 1985 television series Max Headroom.

There were several other video image manipulation technologies that competed with the Harry. For example, in 1981 Ampex introduced the ADO® system, which created digital special effects, allowing rotation and perspective of video images.

The earliest digital compositing suites for video (some still in evidence today) were Quantel's dedicated hardware systems, like the Harry. Quantel's Domino console was the first to take digital compositing to film resolutions. Many post houses continue to use Quantel systems for their ability to process large dataflows in real time, something that later open systems running on PCs, which offered the flexibility of running multiple software packages, were often less able to provide.
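Whatever the hardware, the operation at the heart of these compositing systems is layering one image over another through an alpha matte. A minimal version of the standard "over" operation, shown here for premultiplied RGBA pixels with placeholder image data, looks like this:

import numpy as np

# Porter-Duff "over" for premultiplied RGBA arrays of shape (..., 4).
def over(fg, bg):
    alpha_fg = fg[..., 3:4]
    return fg + bg * (1.0 - alpha_fg)

h, w = 4, 4
bg = np.zeros((h, w, 4))
bg[..., :3] = 0.2                       # a dark grey backdrop
bg[..., 3] = 1.0
fg = np.zeros((h, w, 4))
fg[1:3, 1:3] = [0.5, 0.5, 0.0, 0.5]     # a half-transparent premultiplied patch

comp = over(fg, bg)
print(comp[2, 2])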

Flame, developed by the Australian Garry Tregaskis, was the first software-only compositing system to run on the Silicon Graphics platform, arriving as the general computing power needed to process layers of video and effects became available. Part of the appeal was that the underlying workstations could also be devoted to other tasks, such as editing and 3D graphics.

A plethora of software products running on Unix and NT systems followed. Some evolved from the proprietary software of production studios like ILM or Digital Domain, or, in the case of Digital Fusion, NYPD in Australia. Others were developed from the ground up as commercial products by the creators of non-linear editing, 3D or other graphics systems. Adobe's After Effects, one of the least expensive desktop video compositors, is still basically an adaptation of Photoshop with keyframe animation.

 

 

 

 

 

 

 

 

 

 


Notes:

1. Dick Phillips wrote about the Quantel lawsuits in a 1998 SIGGRAPH newsletter. The following is an excerpt from the article, which can be read in its entirety at

http://www.siggraph.org/publications/newsletter/v32n3/contributions/phillips.html


In fact, it was as recent as 1996 that digital paint systems became the subject of a lawsuit. This is surprising, especially when you consider that [Smith97] points out that the first digital paint program can be traced back to 1969. But it was indeed in January 1996 that Adobe Systems Inc. was sued by Quantel Ltd. for alleged infringement of five of their patents by Adobe’s Photoshop product. The stakes were huge; Quantel was seeking damages of $138 million, to be trebled if willful infringement was determined. Moreover, Quantel was seeking an injunction to stop Adobe from selling Photoshop.

 


Next: Flight simulators