
USER STORY SEPTEMBER 5, 2001
All-Purpose Animation

Mixing broadcast and Web media: then and now

by Bob Self
Special to Creative Mac
[email protected]

In the summer of 2000 I was hired by David E. Kelley Productions to produce character animations for the Fox series Boston Public. The animations would be photographed practically while playing back on a computer screen as part of a fictitious Web site.

To achieve this effect, many Web designers would turn to Flash, but the Flash format restricts animators in their use of certain types of graphics and effects. Flash offered limited control over motion through time, made certain looks and textures hard to achieve because of bitmap-versus-vector issues, and its simplified keyframe and timing controls discouraged subtle or last-minute changes. Instead, I used several products to achieve the visual results and flexible workflow I wanted: I created the art in Adobe Photoshop, animated in Adobe After Effects and delivered the final product using Apple QuickTime. Unfortunately, I didn't prepare for Fox.com asking me to post the animations as low-bandwidth files on the show's real Web site.

How it was
At the beginning of the 2000/2001 television season, there were limited tools for creating character animation for both television and the Web. In the 2001/2002 season, things are looking a lot better. What follows is the process I used to make the original animations for the show and to repurpose them for the Web, along with how I would go about this differently today given the advances in current software.

I created five human characters, a horse and a gorilla based on the script for the first episode. The clothing on the human bodies was to look like simple line art. The character heads, horse and gorilla would be photographic.


Bob Self: The man and his gorilla.

Each body part was created on its own layer in Photoshop, making the character setup in After Effects a breeze. A typical forward-facing character would consist of one layer each for the torso, upper right arm, lower right arm, right hand, upper right leg, lower right leg, right sock and right shoe, with the same parts on the left side. The head would consist of several layers with different expressions, such as a smile or a frown, and open and closed eyes.


The horse's beginnings in Adobe Photoshop

I saved a flattened copy of my layered setup files as a JPEG and e-mailed it from my workstation to the show's producers. When the producers wanted changes such as "make the vice principal's coat a different color" or "change the shape of the English teacher's torso," I made alterations to the corresponding layers and sent out a new JPEG within minutes. Photoshop's adjustment layers let me make nondestructive changes to the hue and saturation of body parts, so I could always return to previous versions. The characters' photographic heads were to be black and white, but I had the actors photographed in color and used adjustment layers to desaturate and adjust levels in case the producers decided to use color heads later.
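The same principle -- keep the color source untouched and apply the black-and-white look only on output -- can be sketched outside Photoshop. This is a rough analogue only, assuming Python with the Pillow library and a made-up file name, not part of the actual production pipeline:

```python
# A rough analogue of the adjustment-layer idea: the color original is never
# overwritten, and the black-and-white look is applied only on the way out.
# File names are hypothetical; assumes the Pillow imaging library.
from PIL import Image, ImageEnhance, ImageOps

head_color = Image.open("teacher_head_color.tif")      # color source stays untouched on disk

head_bw = ImageEnhance.Color(head_color).enhance(0.0)  # fully desaturate
head_bw = ImageOps.autocontrast(head_bw)               # rough stand-in for a Levels pass

head_bw.save("teacher_head_bw.jpg", quality=85)        # delivery copy only
# If the producers later decide they want color heads, only this export step changes.
```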

I imported my layered Photoshop files into After Effects as compositions to begin animating. A comparison of my layers palette in Photoshop and my timeline window in After Effects illustrates the benefits of this workflow: the layers in both programs retain their naming, stacking order and transparency settings. My intended output would be 640x480 or 480x360 pixels, depending on which of the prop department's computers the animation would be playing back on, but I built my Photoshop layouts at 720x540 so I could retain the option of scaling them for broadcast-quality output with non-square pixels at 720x486.

I used two different import approaches depending on the complexity of a given layout. If a layout involved complex interactions between two characters, or between a character and his environment, I would import the entire scene (characters, props, foreground overlays and backgrounds) as one file with anywhere from 30 to 60 layers. Although organizationally complex, this allowed me to see and manipulate any element in the scene. If a layout was simpler, I would import each character as a separate comp to cut down on the number of layers I had to manage at one time. After Effects lets you precompose any number of layers at any time (creating an intermediate composition within the main composition), so I could rearrange my layer groupings as needed.
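For anyone wondering where the 720x540 working size mentioned above comes from: it is the square-pixel equivalent of a 720x486 D1 NTSC frame, assuming the 0.9 pixel aspect ratio Adobe's applications used at the time. A quick sanity check:

```python
# Square-pixel working size vs. the non-square D1 NTSC frame it becomes.
# Assumes the 0.9 pixel aspect ratio figure used by Adobe applications of that era.
working_width, working_height = 720, 540
pixel_aspect_ratio = 0.9

broadcast_height = working_height * pixel_aspect_ratio   # 540 * 0.9 = 486
print(f"{working_width}x{working_height} square-pixel art scales to "
      f"{working_width}x{broadcast_height:.0f} for broadcast")
```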


The animated horse in Adobe After Effects

After the character setup was done in Photoshop, I moved each layer's anchor point using After Effects' Pan Behind tool so that body parts would pivot at the intended location rather than at the geometric center. After Effects 4.1 did not allow parent/child relationships between layers, an annoying limitation for character animation that forced clumsy workarounds during character setup. To move a character's arm, I had to animate the rotation of the upper arm and then animate the position of the lower arm to maintain the illusion that the two pieces were connected at the elbow. This meant that every joint downstream from an animated joint had to be animated as well.
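The geometry behind that workaround is simple but tedious to do by hand. Here is a minimal sketch, assuming a bare-bones 2D rig with made-up pivot positions (nothing here comes from the actual project files):

```python
import math

def rotate_about(point, pivot, degrees):
    """Rotate a 2D point around a pivot (screen coordinates, degrees)."""
    rad = math.radians(degrees)
    dx, dy = point[0] - pivot[0], point[1] - pivot[1]
    return (pivot[0] + dx * math.cos(rad) - dy * math.sin(rad),
            pivot[1] + dx * math.sin(rad) + dy * math.cos(rad))

# Hypothetical rig: the shoulder pivot and the elbow's rest position, in pixels.
shoulder = (300.0, 200.0)
elbow_rest = (300.0, 260.0)   # straight down from the shoulder

# Keyframing the upper arm's rotation to 30 degrees moves the elbow...
elbow_new = rotate_about(elbow_rest, shoulder, 30.0)

# ...so without parenting, the lower arm's position keyframe at the same frame
# has to be set to the new elbow location by hand to keep the arm "attached".
print(f"lower-arm position keyframe: ({elbow_new[0]:.1f}, {elbow_new[1]:.1f})")
```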

To deliver work-in-progress samples to the producers, I rendered DV QuickTime files out of After Effects, which I output through FireWire and a DV/analog converter to VHS tape. Sending VHS ensured that the production team could easily view the animations in any office at the studio. As the production dates drew closer and the animations became more polished, I rendered them to Sorenson video and burned the files to CD. The producers could then see the cartoons in the proper context, on the same type of screen that would be used on set.

On occasion, the producers would come to my studio for over-the-shoulder fine-tuning of the details or timing of the animation. My choice of After Effects and Photoshop really paid off. While developing the look of some chunky vomit, I was able to quickly alter the source art in Photoshop and have the results instantly update in my After Effects composition. When minute timing changes of two or three frames were requested on actions nested deep within other actions, After Effects' time remapping let me assign new frame numbers to existing frames, giving me complete control over the pacing and order of the entire sequence. The combination of round-trip Photoshop editing, time remapping and RAM previews allowed me to provide the producers with what seemed like real-time results on what was not actually a real-time system.
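Conceptually, time remapping is just a lookup from output frames to source frames, interpolated between keyframes. A minimal sketch of that idea, with made-up keyframe values and simple linear interpolation:

```python
# Time remapping as a lookup: (output_frame, source_frame) keyframe pairs,
# linearly interpolated. The keyframe values below are purely illustrative.

def remap(output_frame, keys):
    """Return the source frame to show at a given output frame."""
    keys = sorted(keys)
    if output_frame <= keys[0][0]:
        return keys[0][1]
    for (o0, s0), (o1, s1) in zip(keys, keys[1:]):
        if o0 <= output_frame <= o1:
            t = (output_frame - o0) / (o1 - o0)
            return s0 + t * (s1 - s0)
    return keys[-1][1]

# Stretching the middle of an action to play at half speed just means
# repointing which source frame appears at which output frame.
keys = [(0, 0), (20, 20), (60, 40), (80, 60)]
for f in (0, 20, 40, 60, 80):
    print(f, "->", round(remap(f, keys), 1))
```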

The final QuickTime files were delivered to set on CD for playback within a dummy Web page. The night before one sequence was scheduled to shoot, the director asked for a significant change in timing to tie in with his intended staging of the live-action scene. The new version was on set and ready to go first thing the next morning.

I was ready if the need for a broadcast version of the sequences arose; instead, I needed a low-bandwidth version of the animations to play on an actual Web site, and I couldn't simply hand over the existing files in a Web-ready format. The download time for a QuickTime file would be far too long for an animation that looked like Flash. A streaming version could be made for RealPlayer, but the image size would have to be much smaller than what viewers saw on TV to accommodate streaming on slow connections.
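Some rough arithmetic shows why. The clip size and modem throughput below are illustrative assumptions, not figures from the production:

```python
# Back-of-the-envelope download math; both numbers are assumptions for illustration.
clip_megabytes = 5.0   # hypothetical Sorenson QuickTime clip
modem_kbps = 48.0      # realistic throughput of a "56k" dial-up connection

seconds = clip_megabytes * 8 * 1000 / modem_kbps
print(f"~{seconds / 60:.0f} minutes to download")   # roughly 14 minutes
```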

What Fox.com really wanted was a Flash animation, and the only way to make one was to reanimate the sequences in a program that could generate Flash format (.swf) files.


Repurposing animation in Adobe LiveMotion

Adobe had recently released a new Web animation product called LiveMotion that could output .swf files and could import layered Photoshop files in a manner similar to After Effects. LiveMotion's timeline was also similar to After Effects' timeline. I reduced a copy of my Photoshop layouts to 480x360 pixels, the size I wanted the on-line versions to be. I imported the layered files into LiveMotion and then, using my After Effects timeline as a guide, placed the necessary keyframes in the LiveMotion timeline. Once my LiveMotion animation matched my original animation, I used LiveMotion's export settings to convert my Photoshop (.psd) art to Web-friendly JPEG (.jpg) art and output the whole shebang as a small, self-contained .swf file. But it wasn't small enough for Fox.com, which wanted as many bitmap objects as possible to be redrawn as vectors. The actors' heads would have to remain bitmaps, but all body parts, props and background elements were to be recreated as vector art to reduce the final .swf file size even more.

I had to start animations for future episodes of Boston Public, so Fox.com took on the task of making the .swf files. They used my original layered Photoshop files as templates for tracing the art with vector tools, and a copy of the final QuickTime movie as a timing guide. The results were very low-bandwidth copies of my original art and animation. A few artistic compromises were made here and there, but they were simply the result of an imperfect process.

And that imperfect process is the reason for this history lesson.

How it is
Since last year, both Macromedia Flash and Adobe After Effects have evolved to version 5. Flash has added features that make character animation easier, and After Effects has added the ability to output .swf files. Given the same Boston Public assignment, here is what I would do today...

In order to generate vector objects, I would create most of my art in Illustrator, still using Photoshop for character heads and other photographic elements. I would import my layered Illustrator and Photoshop layouts into After Effects, maintaining a round-trip relationship and a high level of artistic and editorial control. After Effects 5 can now work with parent/child hierarchies, so rotating a character's upper arm automatically moves the lower arm without additional keyframes or non-intuitive nested compositions. This feature alone would have saved hours. I could still output QuickTime movies, but now I could output .swf files as well, and with vector art the file sizes would be small enough to meet Fox.com's delivery requirements. Features like motion blur that I might use in the final QuickTime renderings can simply be turned off when I output the .swf file, since some effects are not possible or practical in Flash.
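What parenting automates is exactly the bookkeeping from the earlier elbow sketch, extended down a whole chain of joints. A minimal illustration, again assuming a toy 2D rig with hypothetical names and numbers:

```python
# A minimal sketch of what layer parenting automates in a simple 2D hierarchy.
import math

class Layer:
    def __init__(self, name, anchor, parent=None):
        self.name, self.anchor, self.parent = name, anchor, parent
        self.rotation = 0.0   # degrees, relative to the parent

    def world_rotation(self):
        return self.rotation + (self.parent.world_rotation() if self.parent else 0.0)

    def world_anchor(self):
        """Anchor position after applying every ancestor's rotation."""
        if self.parent is None:
            return self.anchor
        px, py = self.parent.world_anchor()
        rad = math.radians(self.parent.world_rotation())
        dx = self.anchor[0] - self.parent.anchor[0]
        dy = self.anchor[1] - self.parent.anchor[1]
        return (px + dx * math.cos(rad) - dy * math.sin(rad),
                py + dx * math.sin(rad) + dy * math.cos(rad))

upper_arm = Layer("upper arm", anchor=(300, 200))
lower_arm = Layer("lower arm", anchor=(300, 260), parent=upper_arm)
hand      = Layer("hand",      anchor=(300, 310), parent=lower_arm)

upper_arm.rotation = 30.0   # one keyframe on the parent...
print(lower_arm.world_anchor(), hand.world_anchor())   # ...and the children follow
```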

If I'd had After Effects 5 last spring, animation for both the television show and the related Web site would have required considerably less work. Software companies have been working hard to meet the needs of the animation industry. It is only a matter of time until digital animators can stop focusing on creation and delivery limitations and get back to issues like squash and stretch and overlapping action.



