Computer Graphics In TV & The Movies

Staffordshire University

School of Computing

B.Sc (Hons) Applied Computing (Year 2)

Tutor: C. Mayer & G. Banks

Student: Sean Michael Lewis

Date: 7th March 1995

Contents
Computers In The Movies
  • Westworld & Futureworld
  • 2001
  • Star Trek II
  • Death Becomes Her
Computers In Television
  • Space Precinct
  • Into The Future

Introduction

Until recently, the most spectacular computer graphics effects have been reserved for science fiction films.  These films required pictures of aliens, pre-historic creatures, or non-earth environments that are difficult to create in any other way.  As computer graphics has matured, however, it has made its way into other film genres.

In the early days of computer graphics in film, e.g. TRON (1982) and The Last Starfighter (1984), CRAY mainframe computers were required in order to generate the high-resolution animated graphics for the big screen.  Barely a decade later, high-resolution graphics are being produced for screen and TV on Commodore Amiga computers and Video Toaster hardware, for TV series like Babylon 5 and SeaQuest DSV.

Computers in the film industry are creating ever more fantastic and realistic effects: giving humans cartoon-like qualities (The Mask), allowing humans and cartoons to interact (Who Framed Roger Rabbit), and creating cinematic backgrounds for animated films such as Disney’s Beauty and the Beast.

One of the leading companies in the field of computer special effects is the world-renowned Industrial Light & Magic (ILM) effects team.  Responsible for such films as Ghost, Death Becomes Her, Who Framed Roger Rabbit and The Mask, it is their work that will feature most prominently in this paper.

Computers In The Movies

Computer graphics have been introduced into feature films slowly and cautiously.  In the 1970s, the era of the vector display, almost no feature-film producer took advantage of this relatively cheap and adaptable technology, which could give a stylish high-tech look.  In the 1980s, when many more variations became available, the wireframe display suited film directors’ imaginations.  Superman’s body was analysed by the computer as a wireframe of green lines in Superman III, as was the landscape of the alien planetoid in Alien five years earlier.  There have even been occasions when directors have asked for hidden lines to be left in a computer-generated image in order to make it more obviously computer-generated; George Lucas did so for the X-Wing fighters’ console displays in Star Wars.

Westworld & Futureworld

The first application of computer graphics in the movies was in Westworld, produced in 1973, in which a robot gunman (Yul Brynner) had eyesight that took the form of a mosaic of ‘quantized’ patterns.  The technique involved sampling a photographic image, averaging the intensity of the light over blocks to give from 50 to 60 pixels at equal intervals, and separating the red, green and blue signals.  The resulting large squares of colour were enough to provide a recognisable image, especially when the object moved.
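
The block-averaging at the heart of this quantizing effect is simple enough to sketch in code.  The following is a minimal greyscale illustration with an assumed block size; it is not the actual Westworld process, which worked from colour-separated film scans.

```python
def pixelate(image, block):
    # Average each block x block tile of a greyscale image (a list of
    # rows of intensities), producing the large flat squares that made
    # up the robot's-eye view.
    h, w = len(image), len(image[0])
    out = [[0.0] * w for _ in range(h)]
    for by in range(0, h, block):
        for bx in range(0, w, block):
            ys = range(by, min(by + block, h))
            xs = range(bx, min(bx + block, w))
            mean = sum(image[y][x] for y in ys for x in xs) / (len(ys) * len(xs))
            for y in ys:
                for x in xs:
                    out[y][x] = mean
    return out

# A smooth horizontal gradient collapses into two flat bands.
img = [[float(x) for x in range(8)] for _ in range(8)]
mosaic = pixelate(img, 4)
```

The coarser the block size, the more obviously ‘computerised’ the result looks, which is exactly what the film wanted.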

Gary Demos and John Whitney Jr, the son of the computer graphics pioneer, were responsible for this achievement.  Along with Richard Taylor, they soon became the key figures in the Entertainment Technology Group of Information International Inc. (Triple I).  Their influence was particularly important at this stage, since there was great resistance among feature-film professionals to the idea that computers could generate ‘realistic’ graphic images.  With the powerful Cray computers available through Triple I’s computer print compositing system and microfilm plotter business, they demonstrated that computer-generated images could approach the realism of photographs.

When feature-film makers were satisfied with the images computers could produce on futuristic screens and instrument panels, the next step was to use the computer image for special effects: to help characters materialise and de-materialise, or to fire lasers.  Triple I again led the way with the materialisation of Samurai warriors in Futureworld.  A still photograph of the warriors was digitised as geometric patterns on the VDU.  Each point in the pattern could be used to generate a geometric shape; a few dozen points were chosen and became the foci for triangles which diminished in size, became more numerous and returned to the simple live-action image.  Instead of relying on optical techniques, in which the foreground character is filmed against a blue background (blue screen), Triple I averaged the information in two scenes, one with the Samurai and one without, and the computer recorded the differences – the Samurai in various stages of their materialisation.  The Samurai could then be inserted back into the landscape when holes of the right size were electronically cut into the background image.
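
The differencing principle can be illustrated with a short sketch: compare a frame containing the actors against an otherwise identical ‘clean’ frame, and keep only the pixels that differ.  This is a guess at the idea in miniature, not Triple I’s actual system.

```python
def difference_matte(scene, clean_plate, threshold=0.1):
    # Wherever the two greyscale frames (lists of rows) differ by more
    # than the threshold, keep the foreground pixel; everywhere else
    # mark the pixel as empty (None) so the background can show through.
    return [[s if abs(s - c) > threshold else None
             for s, c in zip(row_s, row_c)]
            for row_s, row_c in zip(scene, clean_plate)]

clean = [[0.2, 0.2, 0.2],
         [0.2, 0.2, 0.2],
         [0.2, 0.2, 0.2]]          # the empty landscape
scene = [[0.2, 0.9, 0.2],
         [0.2, 0.9, 0.2],
         [0.2, 0.2, 0.2]]          # the same shot with a figure present
matte = difference_matte(scene, clean)
```

The extracted pixels form the matte of the figure, which can then be re-inserted into the background in whatever state of materialisation is required.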

2001

Computer graphics really began to affect feature films more radically when they were applied to the creation of vehicles rather than the glowing lines around them.  Up until this point film makers relied on the model maker’s art for their most spectacular journeys, crashes and chases.  For 2001 a 55-foot model had to be manipulated by a skilled team in order to simulate the slow flypast of a huge, detailed spaceship.  Nowadays a computer can hold a representation (model) of the object internally and produce views of this model on screen, or directly onto film, as required.  This three-dimensional computer model can be produced by one of two basic methods: (i) a real, or imagined, object may be digitised and its surface modelled; or (ii) the object may be produced by combining shapes to build up the shape required – this is referred to as constructive solid geometry.
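
Constructive solid geometry can be demonstrated in miniature: model each solid as a point-membership test, then combine solids with boolean set operations.  The helper names below are illustrative, not taken from any real modelling package.

```python
# A solid is a function answering "is this point inside?"; CSG then
# builds complex shapes by combining simple ones with booleans.
def sphere(cx, cy, cz, r):
    return lambda x, y, z: (x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2 <= r * r

def box(x0, y0, z0, x1, y1, z1):
    return lambda x, y, z: x0 <= x <= x1 and y0 <= y <= y1 and z0 <= z <= z1

def union(a, b):
    return lambda x, y, z: a(x, y, z) or b(x, y, z)

def intersection(a, b):
    return lambda x, y, z: a(x, y, z) and b(x, y, z)

def difference(a, b):
    return lambda x, y, z: a(x, y, z) and not b(x, y, z)

# Example: a cube with a spherical hollow carved out of its centre.
hollow_cube = difference(box(-1, -1, -1, 1, 1, 1), sphere(0, 0, 0, 0.5))
```

A renderer would then produce views of such a model by testing (or intersecting rays with) these combined solids.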

Star Trek II

Star Trek II: The Wrath Of Khan provided an impressively complex sequence of photographic realism in the 60-second ‘Genesis Device’ sequence.  The production of this sequence required co-ordinated input from eight designers, and up to ten exposures of film were needed to create the various parts of the image.  The appearance of the barren planet was painted with the aid of a program for making craters, written by Tom Duff (with a little help from a box of mud and a large stone).  The effect of the wall of fire was the result of an idea by Bill Reeves using particle systems.  Reeves realised that to produce convincing-looking flames it was possible to make use of the random generation of particles and particle systems.  These particles may be thought of as points of light ejected in a random direction by the initial explosion, following suitable paths through space, fading in brightness and eventually dying.  As particles died, more were generated by randomly-created particle systems representing secondary explosions.  The initial explosion of the Genesis device created 25,000 particles; later, when the fire had engulfed the entire planet, there were 750,000.  The position of each individual particle was computed for each frame so that it could be drawn in the correct position.  In fact, each particle was drawn as a short straight line representing the path of its motion during that frame (motion blurring).
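
Reeves’ scheme can be sketched as follows.  The emission count, fade rate and lifetimes here are illustrative stand-ins, not the Genesis sequence’s actual parameters, and the secondary explosions are omitted for brevity.

```python
import random

def emit(n, origin=(0.0, 0.0), speed=1.0, max_life=10):
    # Eject n points of light from the origin in random directions,
    # each with a random lifetime and full brightness.
    return [{"pos": list(origin),
             "vel": [random.uniform(-speed, speed), random.uniform(-speed, speed)],
             "life": random.randint(1, max_life),
             "bright": 1.0}
            for _ in range(n)]

def step(particles):
    # One frame: move each particle along its path, fade it, and cull
    # the dead.  Drawing the old-to-new position as a short line gives
    # the motion blurring mentioned above.
    survivors = []
    for p in particles:
        p["pos"][0] += p["vel"][0]
        p["pos"][1] += p["vel"][1]
        p["bright"] *= 0.9
        p["life"] -= 1
        if p["life"] > 0:
            survivors.append(p)
    return survivors

random.seed(1)
particles = emit(1000)          # the initial explosion
frames_alive = 0
while particles:
    particles = step(particles)
    frames_alive += 1
```

In the film, dying particles could also spawn fresh particle systems of their own, which is how the fire spread to engulf the whole planet.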

Death Becomes Her

The execution of the effects in this film won ILM an Oscar.  The primary difficulty for the most complex scenes lay in matching the created imagery to the human actors.

In nearly all of ILM’s previous films, their computer imagery either started with a human and ended with an alien creature, or vice versa.  They had never had to start with a real human, go through a series of changes, and then end up with a human again.  This increased the difficulty of any digital effects immensely because of the subtlety of human motion.

In one of the scenes in this film, Madeline (Meryl Streep) takes a plunge that should kill her.  She is not killed, however; instead her neck is broken and, as a result of the fall, twisted 180 degrees around to face her back.

The execution of this effect involved the use of rod puppets, motion-control cameras, blue-screen shots, and digital imagery and compositing.  Manipulated by five puppeteers positioned beneath the stage floor, the broken Madeline slowly climbs to her feet and makes a few halting steps towards Ernest (Bruce Willis), who is talking on the phone.

To make Madeline’s body orientation believable, Streep had to act this entire scene backward.  What’s more, she had to act it with a tight blue lycra baggy covering her head.  The blue baggy was used to hold in her hair so that it would not obscure her neck and shoulders, and to later guide the removal and replacement of her head.

The scene was shot using hand-directed motion-control cameras that recorded the camera’s movements automatically.  This allowed the effects team to immediately shoot a duplicate clean plate of the same set (no actors present).  From this clean plate, Smythe’s group then extracted the parts of the image that were hidden by Streep’s baggy-covered head.  Once these elements were composited over the live-action shot, it looked as if her head was completely gone.

To capture Streep’s real head so they could put it on her body the wrong way, they then shot her against a blue screen.  She was dressed in a full body suit of blue with only her head exposed.  Sitting in a rotating chair, she was then filmed speaking her lines, and the chair was turned at the appropriate times to simulate her turning her head from one side to the other as she walked.

This was all filmed using the same motion-control camera move recorded from the live-action pass, to ensure that when the elements were composited together they would fit.  As she was filmed in the chair, a large ring of surrounding lights on computer-controlled dimmers simulated the lighting environment of her pass through the live-action set.

Most of the effects in this film were accomplished using physical techniques beyond the scope of this paper, but it does serve as an example of how the digital and the physical are being used in ever-growing collaboration in the quest for believability.  Similarly, the digital effects of Death Becomes Her show that the technology’s increasing sophistication is bringing computer graphics into realms not previously open to it.

Computers In Television

Thanks to television, we all know about computer graphics.  Commercials, children’s programmes, science documentaries and news programmes have all drawn on the resources and flexibility of computer graphics.

In 1981 the top end of the market for computer graphics for television was the Ampex AVA system, which first retailed for £200,000; this included a PDP 11-34 computer, a display controller and a DM-9160 disk drive.

Computer graphics effects first started to make a significant impact on television in the 1970s, when viewers were assaulted by a deluge of wobbly, tumbling titles produced on analogue computers.  One of the problems associated with the use of computer graphics in the television industry is its association with the primitive graphics of previous generations of hardware.

It is unfortunate that so much of the talent expended on producing showreel material for television remains unseen.  Computer graphics has yet to establish a constituency in its own right.  It is considered a tool for commercials, titles or cartoons, but the entertainment value inherent in the extraordinary visual range of computer graphics is almost never taken up by the programme controllers.  It remains for some enterprising producer to create a series called Computer Graphics Showcase on which to air these computer-generated images to the general public.

However, computer graphics are now being used more and more in the production of science-fiction series for television, such as Babylon 5 and a new series still in production, Space Precinct.  These series use computer graphics on a regular basis and show just what can be done, even on a television show that may not command the same resources and finances as the film industry.

Space Precinct

The computer graphics department attached to Gerry Anderson’s new production for television, Space Precinct, contains half a dozen Silicon Graphics workstations.  Each of these workstations is fitted with 128 Mbytes of RAM, an ultra-fast R4400 RISC processor and an 8-Gigabyte hard disk.  Each workstation has enough space to store 6,500 frames of broadcast-quality video and, in any ten-day production period, these computers may be used to produce up to 25,000 frames of computer-manipulated and enhanced film.  At 25 frames per second that is around a quarter of an hour of film per episode.  All the pieces of film that relate to the special-effects sequences are then transferred to video, each frame being scanned in at PAL resolution of 720 × 576 pixels (for movie work, frames are normally scanned at 2,000 or more lines).
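
These figures can be sanity-checked with a little arithmetic.  The frame format is an assumption here (uncompressed PAL-sized frames of 720 × 576 pixels at 3 bytes per pixel); the article does not state how the frames were actually stored.

```python
# Rough check of the storage and running-time figures quoted above.
bytes_per_frame = 720 * 576 * 3                    # assumed uncompressed RGB
frames_on_disk = (8 * 1024 ** 3) // bytes_per_frame
runtime_minutes = 25000 / 25 / 60                  # 25,000 frames at 25 fps

print(frames_on_disk)    # roughly in line with the 6,500 frames quoted
print(runtime_minutes)   # about a quarter of an hour, as stated
```

The assumed format gives a little under 7,000 frames per 8-Gbyte disk, which is broadly consistent with the 6,500 quoted once filesystem overheads are allowed for.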

The Silicon Graphics workstations run a range of different specialist video image-manipulation software packages.  Which package is used depends on the type of effect the user wants to achieve; some are better at producing one particular effect than the others.  The images are all kept in a standard bit-mapped form to allow them to be transferred over the network, and between the different packages, easily.  The main packages used are Discreet Logic’s FLINT and Parallax Software’s Matador.  Other packages are used for the production of a handful of special effects: Dynamation is used to create effects such as an ion storm, vortices, lightning flashes and sparks, while Elastic Reality is used for morphing and warping effects.  They even use an Amiga computer running LightWave 3D to generate lens flare – that is, light bursting directly into the camera.

Into The Future

It would seem that the way in which films of all kinds, not just science-fiction and fantasy films, are made is changing forever.  Purely optical techniques are being replaced by far more sophisticated and versatile digital techniques.  Human actors no longer have complete dominance of the stage; they are being joined by a host of other actors in the form of animatronics and computer animation.

Many people in the film industry are predicting that by the end of the decade every frame of every film will have been processed by computer.  Computer graphics may have started out in the special effects departments of the movie makers, but there are now powerful economic reasons why it will quickly be applied to other areas as well.

A recent example of this new trend was the production of crowd scenes in the film Miracle on 34th Street.  In this film it was both impracticable and far too costly to hire a vast number of extras for the enormous crowds; instead a small number of extras were multiplied by computer and used to fill the appropriate shots of New York.

In fact, George Lucas’s company Industrial Light & Magic is going even further.  It has already demonstrated the capability to create digital actors who look as real as real actors.  But the cost of creating these actors runs to a few hundred thousand dollars per second – even real actors are cheaper!

But this development does point to the fact that the creative process is shifting from in front of the camera to behind it.  To quote George Lucas: “Actors will never be obsolete, but in time those skills once used by a performer in front of the camera will instead be used by people behind the scenes, and they will be called animators, not actors”.

If this prediction is even remotely true, it looks as if technology is ready to take over in yet another area of human endeavour.  For the acting profession this may be bad news, but for the viewer it could herald the beginning of an exciting new era where any visual experience is possible – an era where films as we know them are merging with computer technology to become interactive multimedia entertainment.

References
  • Computer Illusion in Film and TV, Chris Baker, Alpha Books, 1994.
  • Creative Computer Graphics, Annabel Jankel & Rocky Morton, Cambridge University Press, 1984.
  • Electronics Today International, January 1995.
  • Masters Of Illusion, NBC Television, 1993.