A Critical History of Computer Graphics and Animation

Section 13:
Flight simulators

Rolfe and Staples, in their 1986 book Flight Simulation, note that "the object of flight simulation is to reproduce on the ground the behavior of an aircraft in flight." This involves the physical simulator, whose real and perceived motion satisfies the motion cues expected by the human pilot, as well as the extremely important visual system, which generates and displays a simulated perspective view of the outside world. The two are inextricably linked, since visual cues can alter the perception of motion, and vice versa. In 1986 there were over 500 simulators in use. We will review a few that have had an impact on the field of computer graphics and animation.

Flight Simulation

The early visual systems, dating from the late 1950s and early 1960s, included film systems and closed-circuit television (CCTV) systems. The film systems used 35mm or 70mm footage shot from a real airplane; servo-driven optics distorted the image to simulate a range of other flight paths in response to pilot input. The CCTV systems moved a camera with a special optical lens over a physical terrain model, or terrain board. Although both approaches could achieve high degrees of realism, neither was very practical: variations were not easy to present, and the range of situations that might confront a pilot in flight was limited. These early systems were replaced in the early 1970s by CGI systems, or CIGs (Computer Image Generators) as they were often called.


The first computer image generation systems for simulation were produced by the General Electric Company (USA) for the space program. Early versions of these systems produced a patterned "ground plane" image, while later systems were able to generate images of three-dimensional objects.



J.M. Rolfe and K.J. Staples, Flight Simulation, Cambridge University Press, London, 1986

The physical simulator had a mock-up of the cockpit mounted on computer controllable pneumatic devices to simulate motion.

The visual component of the simulator generated images that were presented to the training pilot on display devices mounted inside the physical simulator.

CCTV Terrain Board

Night-only systems usually used vector devices rather than the raster scan displays that give images their complexity today. The first of these systems was produced by the McDonnell-Douglas Electronics Corporation in 1971 and was called VITAL II (Virtual Image Takeoff And Landing). It was certified by the FAA for commercial flight training for Pacific Southwest Airlines in 1972. The system presented only a night scene of an airport, showing the light pattern of the runways. As an FAA commercial Phase II system, it had to be capable of showing directional lights (visible only from certain directions), flashing lights or beacons, runway end identifier lights, and visual approach slope indicators, which appear white when the correct approach slope is maintained and red otherwise.
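The approach slope indicator logic described above can be sketched as a small function. This is an illustrative simplification under assumed names and tolerances, not how VITAL II computed its lights (a real two-bar VASI actually shows red-over-white when on slope):

```python
import math

def vasi_color(distance_m: float, height_m: float,
               nominal_deg: float = 3.0, tolerance_deg: float = 0.25) -> str:
    """Simplified single-indication VASI logic, matching the text's
    description: white when the approach slope is held, red when the
    aircraft sinks below it."""
    # Angle from the light unit up to the aircraft.
    approach_deg = math.degrees(math.atan2(height_m, distance_m))
    return "white" if approach_deg >= nominal_deg - tolerance_deg else "red"
```

On a nominal 3° glide path, an aircraft 1000 m out at about 52 m altitude reads white; at 20 m it has dropped well below the slope and reads red.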

Phase III systems, on the other hand, were required to show day scenes of greater detail. The airport had to be recognizable, and terrain and physical landmarks had to be visible. In order to adequately represent these complex visual images, raster systems were employed.

Military simulation systems require significantly more complexity than the takeoff/landing visual simulations of commercial systems: they must simulate complicated maneuvers such as carrier landings, high-speed flight, and ground attack situations. The vector devices used for many commercial systems were not sufficient, and the industry embarked on serious research and development efforts that both contributed to and took advantage of parallel CGI efforts in areas such as special effects for movies.


The Link Company was one of the first companies to develop flight simulators. Edwin Link worked for his father in a piano and organ company in New York, and designed his first "Link trainer", the Link "Blue Box", in his basement between 1927 and 1929. The U.S. Army Air Corps recognized the value of the trainer and pushed the development of Link's original idea to include more sophisticated control and monitoring devices. It was very successful during the 1930s and into World War II, used by German as well as U.S. and British pilots. Initially there was no visual feedback; the trainer was used primarily to familiarize the pilot with instruments, and to give some rudimentary motion feedback in response to control stick manipulation. An early form of visual feedback for a later Link Trainer was a cyclorama: the scene from the cockpit was painted on the walls of the training room.

In 1939, Link developed a trainer that allowed for transatlantic navigation and night training, called the celestial navigation trainer. It displayed stars on a dome over the physical trainer, and the stars could be relocated to correspond with changes in time as well as location.

With the desire for better motion control of the trainer came the development of analog and later digital control mechanisms. In the early 1960s, Link developed the Link Mark I computer to accomplish real time simulation by computing aircraft equations of motion.

Link merged with the General Precision Equipment Corporation in 1954, and in 1968 was purchased by the Singer Company, of sewing machine fame. In 1981, the year Ed Link died, the Link division split into the Link Simulation Systems Division (Maryland) and the Link Flight Simulation Division (California). In 1988 CAE Industries of Canada purchased the Maryland operation, and in 1990 consolidated it with the Flight Simulation Division to form CAE-Link Corporation.

In the 1960s, the General Precision Group of Singer-Link began working with NASA to develop simulators for the Gemini space program. Singer held the contract for the simulators under the direction of prime contractor McDonnell-Douglas, which supplied cabin and instrumentation mock-ups. Fully functional simulators came on line at Cape Canaveral and Houston in 1964.

In 1978, Singer-Link developed the DIG, or "Digital Image Generator." This device is considered one of the world's first-generation computer-generated image (CGI) systems.

In the 1980s, during the height of the Cold War, Singer-Link fielded simulators for numerous military systems, including the B-52G, AH-64 Apache, B-1B, F-16C, P-3C Orion, and S-3B Viking aircraft. Singer-Link also developed a number of ship and submarine trainers for ASW (Anti-Submarine Warfare), mine warfare, and sonar system training. In the 1990s, Singer-Link developed commercial flight simulators such as one for the MD-88. Singer-Link simulators and the DIG were featured in an episode of the television show Nova called "Why Planes Crash".


"Computers in Spaceflight: The NASA Experience" http://www.hq.nasa.gov/pao/History/computers/Ch9-2.html

The Link Trainer with the cyclorama visual image

The Link Celestial Navigation System
You can see the "cockpit", with the navigator and pilot trainees, in the upper image, situated directly under the projection dome.







Evans & Sutherland Computer Corporation

Evans & Sutherland, with its connection to the University of Utah, attracted a large number of the leading CG researchers of the late 1960s and 1970s. The company developed algorithmic approaches to high-performance image making, as well as the hardware workstations that could support the software. It developed the LDS-1 and LDS-2, followed by a line of graphics workstations called the E&S Picture System, which were used by most of the CGI production companies through the 1980s. E&S also developed the CT-5 and CT-6 flight simulators. After many years of successful marketing, the company renamed its simulation products from CT (Continuous Texture) to ESIG (Evans & Sutherland Image Generator).

E&S developed turnkey simulation systems for military and commercial training, including systems for air, sea, and land simulation. Their expertise in this area also opened a market for digital theater products, such as planetarium theater systems, domed theater presentations, and digital projection systems. The visualization industry also was partial to E&S products, and they also developed graphics acceleration products for professional workstations and personal computers.

E&S holds many CG-related patents. In the late 1980s, it threatened a number of workstation manufacturers with patent infringement claims over several technologies, most notably clipping.



The following is from the E&S history page at http://www.es.com/about_eands/history.asp
Also available as multimedia video clips in Real and Windows Media format at http://www.es.com/about_eands/

For more than 30 years, Evans & Sutherland has been the power behind the scenes, providing complete visualization solutions for a wide variety of applications. Whether it's for training simulation, education, entertainment, or business, E&S creates the technology that makes it come alive.

It was 1968 when Professor David Evans, founder of the University of Utah's computer science department, convinced his friend and associate, Dr. Ivan Sutherland, to leave his teaching position at Harvard and join him in a new venture in Utah. That year, the two professors began a collaboration that would shape the history of the computer industry. Their collaboration, based on their theory that computers could be used interactively for a variety of tasks, became Evans & Sutherland Computer Corporation. Established in abandoned barracks on the campus of the University of Utah, the company began by recruiting students from the university and looking for new ways to use computers as tools.

Although E&S is hailed as a leader in computer graphics technology, David Evans contended that developing graphics was only part of the dream. In starting the company, he had a different idea: that computers were simulators. Simulators can replace real objects on occasions when a simulation can be built more cheaply than the physical model can be. "The company began with graphics because we thought they were an essential link between the human user and the simulation."

The strongly academic environment surrounding E&S provided a uniquely creative and academic work environment that shaped and refined some of the most innovative minds in computer graphics. Many of today's computer graphics visionaries began their careers at Evans & Sutherland. Industry leaders such as Jim Clark, who started Silicon Graphics, Ed Catmull, founder of Pixar Animation Studios, and John Warnock, president and founder of Adobe, trace their roots back to E&S and the tutelage of Dave Evans and Ivan Sutherland.

With its emphasis on computers as simulators, training became a natural market for E&S, so the company continued to develop and enhance its simulation systems. Then, in the mid-1970's, E&S established a partnership with Rediffusion, a British simulation company, that gave E&S exclusive rights to provide visual systems for Rediffusion's commercial flight training simulators. Today, approximately 80% of the world's commercial airline pilots are trained on simulators using E&S visual systems.

Like the computer graphics industry as a whole, E&S saw significant growth and enormous change during its first 15 years. The company stayed on the leading edge of computer graphics technology as it broadened its product line of visual systems for simulation. And, using the technologies developed for simulation, E&S began exploring some new applications such as planetarium systems.

During that time, the company outgrew the old barracks on the University of Utah campus, and E&S became one of the first residents of the university's new research park. Employment continued to grow as the company's business base grew. Then, in 1975, Ivan Sutherland left E&S to pursue other interests. Currently a research fellow at Sun Microsystems, he is also a member of the E&S Board of Directors.
1978 marked another milestone for E&S when the company went public.

In the 1980's, a worldwide recession and changing marketplace brought serious challenges to the company. But E&S continued to lead the simulation industry in providing the highest quality, most realistic visual systems in the world as it looked for new markets for its technology, such as digital projectors for planetariums and entertainment applications.

The early 1990's were a period of change for the company as Dr. Evans retired. In 1994, James Oyler joined E&S as President and CEO. Under Mr. Oyler's leadership, the company has continued to lead the visual systems industry for both military and commercial simulation applications as it leverages its technologies into entertainment, education and workstation applications.

Today, the company's businesses include visual systems for all kinds of military and commercial training, including systems for air, sea, and land simulation; digital theater products, such as planetarium theater systems, domed theater presentations, and digital projection systems; virtual set products for video and television producers; visualization products for land developers; and graphics acceleration products for professional workstations.

As E&S enters a new millennium, the company will continue its role as the pioneer, innovator, and leader in computer graphics as it explores new markets and takes visual system technology to new levels of realism.




E&S history - Pt. 1

E&S history - Pt. 2

E&S history - Pt. 3

E&S history - Pt. 4

E&S Product Overview

Multimedia clips reviewing company history in Microsoft Windows Media format.

CT-5 Simulator
Part 1

CT-5 Simulator
Part 2

CT-6 Simulator

NASA's requirements for flight simulators far exceeded the state of the art when the first astronaut crews reported for duty in 1959. Feeling obligated to prepare the astronauts for every possible contingency, NASA required hundreds of training hours in high fidelity simulators. Each crewman in the Mercury, Gemini, and Apollo programs spent one third or more of his total training time in simulators.

The primary simulator for the first manned spacecraft was the Mercury Procedures Simulator (MPS), of which two existed: one at the Langley Research Center, the other at the Mission Control Center at Cape Canaveral. Analog computers calculated the equations of motion for these simulators, providing signals for the cockpit displays.

The Gemini Mission Simulators used between 1963 and 1966 in the space program operated on a mix of analog and digital data and thus were a transition between the nearly all-analog Mercury equipment and the nearly all-digital Apollo and later equipment. Three DDP-224 digital computers performed the data processing tasks in the Mission Simulator. Built by Computer Control Corporation, which was later absorbed by Honeywell Corporation in 1966, the three computers provided the simulator with display signals, a functional simulation of the activities of the onboard computer, and signals to control the scene generators.

Scene depiction in the Gemini era still depended on the use of television cameras and fake "spacescapes", as in aircraft simulators. Models or large photographs of the earth from space provided scenes that were picked up by a television camera on a moving mount. Signals from the computers moved the camera, thus changing the scene visible from the spacecraft "windows," actually CRTs. A planetarium type of projection was also used on one of the moving-base simulators at Johnson Space Center to project stars, horizon, and target vehicles.

The visual images for the Apollo trainers, and the later Shuttle and Skylab trainers, moved to entirely digital control. Window scenes were entirely computer generated. Coupled with sophisticated image processing techniques, the simulations could represent not only the environment of Earth and near space, but also the surfaces and environments of the Moon, Mars, and other celestial bodies.



Flight over Miranda, moon of Uranus


The U.S. Navy maintained an active simulation program, particularly at its Naval Training Systems Center in Florida. Most notable was its early development of the VTRS, or Visual Technology Research Simulator. This simulator was an example of a "target tracked" system, which dynamically placed an image of the air-to-air combat target within a larger scene, in this case on a spherical screen surrounding the trainer cockpit. In the interest of computation speed, the area of most visual interest (the target and its nearby surroundings) was rendered in higher detail and inset into a lower-resolution background display representing the surrounding terrain outside the field of interest.
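The inset idea described above — a high-detail area of interest pasted into a coarser surround — can be sketched as a simple frame-compositing step. The function below is a hypothetical illustration on 2-D pixel grids, not the VTRS implementation:

```python
def composite_inset(background, inset, top, left):
    """Paste a high-detail inset into a low-detail background frame.

    background and inset are 2-D lists of pixel values. The inset
    (the 'area of interest' around the target) overwrites the
    matching background region, so only that small window needs to
    be rendered at full detail each frame.
    """
    frame = [row[:] for row in background]   # copy; leave input intact
    for r, row in enumerate(inset):
        for c, px in enumerate(row):
            frame[top + r][left + c] = px
    return frame
```

In a real system the inset's position would be driven by the tracked target (or the pilot's gaze), moving the high-resolution window around the wide-field background every frame.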

NTSC also developed, for American Airlines, a head-mounted display that used the target tracking approach also employed by Singer-Link in its ESPRIT (Eye-Slaved Projected Raster Inset) system. The NTSC system used eye and head tracking technologies and projected the image from a lens on the helmet, so the higher-resolution region was always coordinated with the pilot's view.




The Air Force was also interested in low-altitude simulation of high-resolution, high-detail terrain. Its ASPT (Advanced Simulator for Pilot Training), installed in 1974 at Williams AFB in Arizona, was one of the first examples of a multiple-display, multiple-CGI-channel "butted" display system, which became the model for most simulators built during the 1980s and 1990s. The field of view was divided among multiple CRTs surrounding the pilot, each fed with a signal from an independent but synchronized computer image generator. By aligning the boundary of each display with that of its neighbors, the mosaic gave the impression of a continuous image. The ASPT used seven CRTs with complex optics to eliminate overlapped images, in what was called the "Pancake Window" (Farrand Optical Company). Each pentagonal window provided more than an 86° field of view.
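The key geometric property of a butted system — each channel's frustum edge coinciding exactly with its neighbor's — can be illustrated with a small sketch. This is a simplified one-dimensional strip of azimuth slices (the ASPT's seven pentagonal windows actually tiled a dome, not a strip), with assumed function and parameter names:

```python
def butted_channels(total_fov_deg: float, n_channels: int):
    """Divide a wide horizontal field of view among n butted channels.

    Each channel gets an equal azimuth slice, centered on the pilot.
    The right edge of one slice is exactly the left edge of the next,
    so independent but synchronized image generators produce a mosaic
    that reads as one continuous image.
    """
    slice_deg = total_fov_deg / n_channels
    start = -total_fov_deg / 2.0
    return [(start + i * slice_deg, start + (i + 1) * slice_deg)
            for i in range(n_channels)]
```

For a 210° field split over seven channels, each image generator renders a 30° slice; any misalignment at the shared boundaries would appear as a visible seam, which is why the channels must be both geometrically aligned and frame-synchronized.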

An alternate approach emerged from the Air Force Human Resources Laboratory. The Pancake Window mosaic display was becoming difficult to maintain, and the search for a low-cost, full-color replacement for the dim, monochrome Pancake Window led to experiments with rear-projection screens and CRT projectors. A bright, clear real image was formed at approximately arm's length from the pilot's eye. An example of this technology was the Boeing VIDS (Visual Integrated Display System).

The E&S CT-5 and CT-6 systems were configured like the ASPT system. The early CT-5, coupled with the Rediffusion physical simulator, cost around $20M. Other similar systems included the GE Compuscene IV, made for Martin Marietta, and the Singer-Link DIG.





Boeing VIDS


Researchers experimented with many different approaches to generating complex imagery for simulation. The low-res environment coupled with the hi-res target described above gave only moderately acceptable results. Hardware, like that developed at E&S and elsewhere, made image generation faster, as did algorithms embedded in the software. General Electric used texture mapping to achieve reasonable results, as in the GE "cell texture" image at right.

Honeywell experimented with assembling real images stored on videodiscs, retrieving, transforming, and stitching them together in real time based on the desired field of view and range.

Vought researchers experimented with mosaicking images obtained from aerial photographs into a terrain database. A similar approach took advantage of the databases of the Defense Mapping Agency, which combine height maps representing elevations (for the geometry) with terrain and cultural imagery.
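The height-map-to-geometry step in such terrain databases can be sketched simply: a regular elevation grid is turned into triangles, onto which the photographic or cultural imagery is then mapped. The function below is an illustrative sketch with assumed names, not any vendor's actual pipeline:

```python
def heightmap_to_triangles(heights, cell_size=1.0):
    """Convert a regular elevation grid into a triangle list.

    heights is a 2-D list of elevation samples on a uniform grid of
    spacing cell_size. Each grid cell becomes two triangles, producing
    the terrain geometry that imagery would be draped over.
    Vertices are (x, y, z) tuples with z taken from the height map.
    """
    tris = []
    rows, cols = len(heights), len(heights[0])
    for r in range(rows - 1):
        for c in range(cols - 1):
            def p(rr, cc):
                return (cc * cell_size, rr * cell_size, heights[rr][cc])
            a, b = p(r, c), p(r, c + 1)          # top edge of cell
            d, e = p(r + 1, c), p(r + 1, c + 1)  # bottom edge of cell
            tris.append((a, b, d))
            tris.append((b, e, d))
    return tris
```

A 100x100 elevation grid yields 99 x 99 x 2 = 19,602 triangles, which is why real-time terrain systems of the era depended heavily on level-of-detail reduction.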









Another interesting approach to generating terrain data and effects in simulations is attributed to Geoff Gardner at Grumman Data Systems. Gardner presented a method to generate terrain, clouds, and other objects in a SIGGRAPH 84 paper, realistic smoke and clouds in a 1985 paper, and an extension to smoke and fire in a 1992 paper. The models he presented used quadrics, or ellipsoids (which are computationally very inexpensive for operations like view intersection), covered with a texture that determined the transparency each ellipsoid should possess. The transparency varied from the center of the object to the edges of the ellipsoid according to a mathematical function. Gardner was able to use these ellipsoids and transparent textures to model terrain, trees, clouds, smoke, fire, and other elements.
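The core idea — transparency growing from an opaque center to a fully transparent limb, perturbed by a texture so the silhouette looks irregular rather than geometric — can be sketched as below. The texture function and blending here are illustrative assumptions, not Gardner's exact formulation:

```python
import math

def gardner_transparency(u: float, v: float, r: float,
                         t_center: float = 0.0, t_edge: float = 1.0) -> float:
    """Gardner-style transparency for a point on an ellipsoid.

    u, v are texture coordinates on the surface; r is the normalized
    distance from the center of the projected silhouette (0) to the
    limb (1). The base transparency rises from an opaque core to a
    fully transparent edge, and a cheap sum-of-sinusoids texture
    perturbs it so clouds and smoke get soft, irregular boundaries.
    """
    # A few octaves of sines in two directions, amplitude halving each octave.
    tex = sum((0.5 ** i) * (math.sin(2 ** i * u) + math.sin(2 ** i * v))
              for i in range(4)) / 4.0
    base = t_center + (t_edge - t_center) * r * r   # quadratic falloff
    return max(0.0, min(1.0, base + 0.25 * tex))
```

Because each primitive is just an ellipsoid with a procedural texture, a whole cloud bank or smoke column can be built from a handful of quadrics instead of thousands of polygons, which is what made the approach attractive for real-time image generators.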

GE "cell texture"

Gardner's transparent texture on ellipsoids

Gardner's ellipsoid approach used for smoke and fire

Gardner ellipsoids - 1984

Grumman non-edge CIG
(Gardner - 1981)



Evans & Sutherland, as well as other simulation manufacturers and researchers in simulation and CGI in general, extended the ideas of flight simulation to other vehicles, including maritime vessels and automobiles. For example, in the mid-80s NTSC contracted with Ohio State to develop a submarine piloting trainer for the pilots who maneuver the ships into dock at the Norfolk base in Virginia.

E&S modified the databases of its flight simulators to support accurate automobile simulation.

Nakamae's group at Hiroshima University experimented with accurate lighting approaches to represent nighttime driving scenarios.



Feast of Lights

Light & Shadow

Still Life Etude

Visitor on a Foggy Night


