
Animation Production Pipeline

Nora Willett, Ivan Lee, Oliver Castaneda


This project explores the production pipeline used to create a short or feature film in animation studios. Each part of the process is described with examples from industry and from the animated short "Croak," which we created with Autodesk Maya and Final Cut Pro. The short can be found here:

The Steps in the Animation Production Pipeline

Storyboarding

When a film is in the early stages of production, the storyboard is one of the first pieces to be completed. The storyboard, a series of pictures with captions that outline the story, later guides camera placement and animation timing. The storyboarding process begins with an initial rough draft. Then one of the creators acts out and describes the pictures in the story to his or her colleagues in order to receive feedback. After the suggestions are taken into consideration, the storyboard is redrawn with the new ideas. This process repeats until a final story is agreed upon.

One panel of the storyboard for Pixar's film "Finding Nemo."

One part of the storyboard for "Croak."

Voice Recording

Before the animators begin working on the different scenes, the voices are recorded for the film. Prior to bringing in the actors to record the final lines, people within the studio record scratch voices. These recordings give the animators ideas for the action based on the movement of the actors, and they also influence the timing of the scenes.

One of the actors recording for Pixar's "Ratatouille."

Concept Design

Concept design is the process where artists visualize what the director wants the film to look like. In this process, the characters and environment are created through sketches. Later on, the modelers use these sketches as guidelines when they create models in the computer.

Character designs from Pixar's "Up."

Character designs for "Croak."


Modeling

Given the sketches of the characters, modelers use programs such as Autodesk Maya and 3D Studio Max to create the 3D computer characters. When modeling, the base object can be chosen from a cube, sphere, cylinder, cone, or other primitive. Through the manipulation of vertices, faces, and edges, the original mesh is transformed into the final character.
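The idea of pushing a primitive's vertices around can be sketched in plain Python (this is an illustration of the concept, not Maya's API):

```python
# A mesh is ultimately just a list of vertex positions (plus faces connecting
# them), and modeling tools work by moving those vertices.
# Vertex positions for a unit cube, indexed 0-7.
cube_vertices = [
    (0, 0, 0), (1, 0, 0), (1, 1, 0), (0, 1, 0),   # bottom face
    (0, 0, 1), (1, 0, 1), (1, 1, 1), (0, 1, 1),   # top face
]

def translate_vertex(vertices, index, offset):
    """Move one vertex by an (x, y, z) offset, as a modeling tool would."""
    x, y, z = vertices[index]
    dx, dy, dz = offset
    vertices[index] = (x + dx, y + dy, z + dz)

# Pull one corner of the cube upward, deforming the primitive.
translate_vertex(cube_vertices, 6, (0, 0, 0.5))
print(cube_vertices[6])  # (1, 1, 1.5)
```

Real modeling tools apply the same kind of edit to whole selections of vertices, edges, or faces at once.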

The transformation of the real frog in "Croak" from its beginning stage to the final result.


Texturing

For texturing, one must "unwrap" the mesh into 2D, producing UV coordinates, and then paint the texture in either 3D or 2D.
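The simplest form of unwrapping is a planar projection, sketched below in plain Python (an assumption for illustration; real unwrapping tools, including Maya's, also handle seams and distortion):

```python
# Project each 3D vertex onto the XY plane, then normalize the result into
# the [0, 1] x [0, 1] UV square that textures are painted in.
def planar_uvs(vertices):
    xs = [v[0] for v in vertices]
    ys = [v[1] for v in vertices]
    min_x, min_y = min(xs), min(ys)
    span_x = max(xs) - min_x or 1.0   # avoid dividing by zero on flat spans
    span_y = max(ys) - min_y or 1.0
    return [((x - min_x) / span_x, (y - min_y) / span_y) for x, y, _ in vertices]

quad = [(2, 2, 0), (4, 2, 0), (4, 6, 0), (2, 6, 0)]
print(planar_uvs(quad))  # [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0), (0.0, 1.0)]
```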


Rigging

There are two parts to rigging a character for animation in our film. The first is constructing the skeletal rig of the model. The skeletal rig is made from joints connected together. When rigging, you can choose to use a combination of inverse and forward kinematics. Inverse kinematics is usually used for legs, since it allows the animator to place the foot in the correct spot and have the rest of the leg follow. Forward kinematics is usually used to rig the arms and the back; this type of rigging requires the animator to move every joint individually. Once the joints are connected properly, the rigger adds controls to them, which lets the animator key only the controls rather than worry about the joints. Once the joints are connected to controls, the rig is bound to the mesh through "skinning," and the rigger then paints weights to specify how each joint affects the mesh geometry.
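The forward-kinematics half of this can be sketched for a two-joint limb in 2D (an illustration of the concept, not production rigging code):

```python
import math

# In forward kinematics, each joint's rotation is applied in sequence down
# the chain, and the end of the limb simply follows from those rotations.
def forward_kinematics(lengths, angles_deg):
    """Return the (x, y) positions of each joint, starting from the origin."""
    x, y, total_angle = 0.0, 0.0, 0.0
    positions = [(x, y)]
    for length, angle in zip(lengths, angles_deg):
        total_angle += math.radians(angle)   # rotations accumulate down the chain
        x += length * math.cos(total_angle)
        y += length * math.sin(total_angle)
        positions.append((x, y))
    return positions

# A thigh and shin, each 1 unit long: rotate the hip 90 degrees down,
# then bend the knee back 90 degrees.
joints = forward_kinematics([1.0, 1.0], [-90.0, 90.0])
print(joints[-1])  # foot position, approximately (1.0, -1.0)
```

Inverse kinematics runs the other way: given a desired foot position, a solver searches for joint angles that reach it, which is why it is the more convenient choice for legs.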

Secondly, we need to create the extreme positions, or modes, of the face to give our character expressions. Maya interpolates between these extremes, called "blend shapes," allowing us to combine different shapes in different proportions to make new expressions.
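The mixing itself is linear per vertex, which can be sketched in plain Python (an illustration of the math, not Maya's implementation):

```python
# Each blend-shape target is a full copy of the face mesh; weights linearly
# mix the per-vertex offsets of each target from the neutral shape:
#     result = base + sum over targets of weight * (target - base)
def blend_shapes(base, targets, weights):
    result = list(base)
    for target, w in zip(targets, weights):
        result = [
            tuple(r + w * (t - b) for r, t, b in zip(rv, tv, bv))
            for rv, tv, bv in zip(result, target, base)
        ]
    return result

# One mouth-corner vertex: neutral at the origin, a "smile" target pulls it
# one way, a "frown" target another (positions made up for illustration).
neutral = [(0.0, 0.0, 0.0)]
smile   = [(1.0, 0.0, 0.0)]
frown   = [(0.0, 1.0, 0.0)]
half_and_half = blend_shapes(neutral, [smile, frown], [0.5, 0.5])
print(half_and_half)  # [(0.5, 0.5, 0.0)]
```

Because the mix is linear, any combination of weights between 0 and 1 gives a plausible in-between expression.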

Another way to affect the mesh is by using clusters or lattice deformers.


Animation

Once we have our models, scene, and cameras set up, we can start animating based on our storyboard. Unlike in traditional animation, where the animator must draw each individual frame, in computer animation the animator only sets the main key frames and the computer "tweens" between them.

Autodesk Maya gives animators the graph editor and the dope sheet to help with animation. The graph editor is used to change how the computer interpolates between the main key frames set by the animator. The dope sheet is used to change the timing of the animation.
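Tweening can be sketched in plain Python (an illustration only; linear interpolation is just one of the curve types a graph editor offers):

```python
# The animator sets key frames; the computer fills in every frame between
# them. Here, values between two keys are linearly interpolated.
def tween(keyframes, frame):
    """keyframes: sorted list of (frame, value) pairs set by the animator."""
    for (f0, v0), (f1, v1) in zip(keyframes, keyframes[1:]):
        if f0 <= frame <= f1:
            t = (frame - f0) / (f1 - f0)
            return v0 + t * (v1 - v0)
    raise ValueError("frame outside keyed range")

# Key a ball's height at frames 1, 12, and 24; the computer fills in the rest.
keys = [(1, 0.0), (12, 10.0), (24, 0.0)]
print(tween(keys, 18))  # 5.0 -- halfway back down
```

Changing the interpolation curve between the same keys (the graph editor's job) changes the motion's feel without touching the keys themselves, while shifting the key frames in time (the dope sheet's job) changes the timing.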

Animation is tiring but rewarding. After many hours spent setting key frames and tweaking the timing, the characters feel like they are alive.

Special Effects

Maya 2011 provides a number of powerful tools for simulating water. Given the numerous varieties of water, we begin here with a focus on river and waterfall simulation, and touch briefly on other water effects at the end.

Water simulation can be approached in a number of ways. Useful tools include particle emitters, the relatively new nParticles, Maya fluids, and the built-in ocean and pond shaders.

Particle emitters can be the most efficient method of simulating fluids. Particle effects produce a vast number of individually customizable particles. Maya can calculate particle collisions, random lifespans, and display attributes such as faded tail lengths, and shaders can be used to color individual particles. At the same time, there is no calculation of interparticle effects; despite randomizable initial parameters, each particle acts individually.
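The core loop of such a particle system can be sketched in plain Python (an illustration only, not Maya's emitter): each particle gets a randomized lifespan and velocity, is pulled down by gravity, and is simulated independently, with no interparticle effects.

```python
import random

GRAVITY = -9.8  # units per second squared, pulling along -y

def emit(count, rng):
    """Spawn particles at the emitter with randomized velocity and lifespan."""
    return [
        {
            "pos": [0.0, 0.0],
            "vel": [rng.uniform(-1.0, 1.0), rng.uniform(2.0, 4.0)],
            "life": rng.uniform(0.5, 1.5),  # seconds until the particle dies
        }
        for _ in range(count)
    ]

def step(particles, dt):
    """Advance every particle one time step; drop the ones whose life ran out."""
    alive = []
    for p in particles:
        p["vel"][1] += GRAVITY * dt
        p["pos"][0] += p["vel"][0] * dt
        p["pos"][1] += p["vel"][1] * dt
        p["life"] -= dt
        if p["life"] > 0:
            alive.append(p)
    return alive

rng = random.Random(42)  # fixed seed so the run is repeatable
particles = emit(100, rng)
for _ in range(30):               # simulate one second at 30 steps per second
    particles = step(particles, 1.0 / 30.0)
print(len(particles))  # roughly half the particles outlive one second
```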

Maya fluids are built into Maya 2011 to allow users to quickly create containers filled with predefined fluids. Such fluids are often very render-efficient and have properties, such as density and velocity, that are common to fluids. The fog displayed in this project is accomplished using Maya fluids. The screenshot to the right was one of our drafts for a river and would have provided a much more cartoonish look.

nParticles were introduced in Maya 2009 as a powerful new tool for fluid dynamics. An nParticle is generally much larger than a standard particle, with its own mass, collision density, collision radius, and numerous other attributes. More powerful mental ray shaders can be assigned to nParticles, allowing for more flexible display properties. Once the display properties are finalized, the nParticles are converted to an output mesh, which attempts to generate a smooth mesh for every cluster of nParticles. Important attributes at this stage include surface tension, which determines how taut the surface remains. To the right, a transparent shader creates a dynamic, transparent liquid flowing smoothly. (Note: this image was generated by following a Maya tutorial; all other environmental objects in the shot were provided.)

Finally, Maya has some built-in ocean and pond shaders that directly contain ocean-specific attributes. The following shot was generated early on as a possible solution for water creation. Unfortunately, this high level of realism comes at the cost of the water remaining entirely static. We also decided this was not the level of detail we wanted for our cartoon-like film.

Example of using Maya's built-in ocean plane and shader to create a static ocean.

Below are some rough drafts from various stages of the quarter.

I began with an online tutorial for creating a shower-like effect. By altering the attributes and adding multiple emitters, I created my first draft of a waterfall.

Using multiple particle emitters to create a waterfall effect.

At this stage, I was analyzing photos of real waterfalls and attempted to imitate the layer of blue behind the waterfall, accomplished by keeping both a solid blue plane behind the waterfall and a layer of blue particles with very long tails. I also discovered that simply adding more particles generally made the waterfall look fuller, so I added as many as possible while trying not to kill the render time. Finally, a much more realistic effect came from simulating water hitting rocks and spraying outward, so I placed emitters deliberately spraying forward at various points within the fall. This was very similar to our final waterfall.

We also used nParticles to create a smooth river scene with reflections.

We also attempted to use nParticles for waterfalls. This would have been ideal, given the interparticle interactions. Unfortunately, we quickly discovered that this was simply not feasible given the numerous hours of render time required for even a few frames of such a waterfall. An example of such a waterfall (not created by the authors of this project) can be seen here:

A very early prototype created in 3ds Max can be seen here:


Lighting

Lighting is a very important part of the pipeline since it contributes to the feeling of a scene. Many different types of lights are available to artists as they create the mood in a shot: ambient, directional, point, spot, area, and volume lights. For each light, the color and intensity can be changed; with spot lights, the cone angle and dropoff can also be adjusted. Lighting on a computer offers several advantages over lighting in real life: lights can be set to illuminate only certain objects, and the artist can choose which lights cast shadows. With so much control over the lighting in a scene, the artist can create whatever mood the director desires.
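Two of these ideas, intensity falling off with distance for a point light and a spot light's cone angle deciding what is lit, can be sketched in plain Python (an illustration only, not Maya's lighting model):

```python
import math

def point_light_intensity(intensity, light_pos, surface_pos):
    """Inverse-square falloff: received intensity = intensity / distance^2."""
    d2 = sum((l - s) ** 2 for l, s in zip(light_pos, surface_pos))
    return intensity / d2

def in_spot_cone(light_pos, aim_dir, cone_angle_deg, surface_pos):
    """True if the surface point falls inside the spot light's cone.

    aim_dir must be a unit vector pointing where the light is aimed.
    """
    to_point = [s - l for l, s in zip(light_pos, surface_pos)]
    norm = math.sqrt(sum(c * c for c in to_point))
    cos_angle = sum(a * t for a, t in zip(aim_dir, to_point)) / norm
    return cos_angle >= math.cos(math.radians(cone_angle_deg / 2))

# A light of intensity 100 hanging 5 units above the origin.
print(point_light_intensity(100.0, (0, 5, 0), (0, 0, 0)))  # 4.0
# A 40-degree spot aimed straight down from the same spot.
print(in_spot_cone((0, 5, 0), (0, -1, 0), 40.0, (0, 0, 0)))   # True: on axis
print(in_spot_cone((0, 5, 0), (0, -1, 0), 40.0, (10, 0, 0)))  # False: outside
```

Widening the cone angle or raising the intensity in this sketch mirrors the knobs an artist turns on a Maya spot light.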

The changes that result from adding lights to a shot in "Croak."


Rendering

When you have reached the time to render your shots, you are almost done! Autodesk Maya offers Maya Software, Maya Hardware, and mental ray (from Mental Images) as the renderers you can choose from. You also have the choice of rendering from inside Maya or from the command prompt. Rendering from the command prompt offers definite advantages, since you can queue multiple files to render one after another. Rendering outputs a picture file for every frame.
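As a sketch of why command-prompt rendering helps, a script like the following can queue several scene files in a row (the `Render` flag names come from Maya's command-line renderer but may differ between versions; the commands here are echoed rather than executed so the script can be dry-run without Maya installed):

```shell
# Queue several shots for Maya's command-line renderer, one after another.
# -r mr selects mental ray; -s and -e set the start and end frames.
for scene in shot01.mb shot02.mb shot03.mb; do
    echo Render -r mr -s 1 -e 240 "$scene"   # drop "echo" to actually render
done
```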

The first picture is rendered with Maya Software and the second is rendered with Mental Ray.


Editing

During editing, the whole film comes together. In the steps leading up to the final editing process, a minimal amount of editing has already been done. Once the storyboard is complete, each picture is scanned into the computer. These are then edited together with the voice recordings and some sound effects to create an animatic, which gives the director and those working on the film a rough idea of how it will look. As the animators finish animating different shots, they playblast (create a rough video of) the shots. These playblasts replace the storyboard drawings in the animatic. Finally, as the shots are rendered, the final frames are put into the animatic. Through this process, the rough animatic transforms into the final film.

A screenshot of editing "Croak" in Final Cut Pro.

Troubleshooting and Future Work

There is clearly some work left to be done for this project. With more time, we would have loved to use nParticles for the waterfall, achieving a much more realistic effect. Unfortunately, we did not have the hardware or the time to commit to rendering such a realistic waterfall. Furthermore, you can see that the water is very choppy at the bottom of the waterfall. We are unsure why this happened, but it appears that the particles from the waterfall were interacting disruptively with the nParticles making up the pond, and the resulting output mesh rendered differently in every frame, leading to frantic choppiness. We attempted to resolve this by putting the two bodies of water in separate physics systems. While this rendered well in some scenes, the pond would disappear entirely in others. With more time, we would like to resolve and rerender this scene.

2011-03-29 17:43