
Prototype to Production: Building a Lighting Workflow in Houdini for Animation

Soon after the adoption of USD, Walt Disney Animation Studios (WDAS) began exploring, and then transitioning, its legacy Maya-based lighting workflow to Houdini Solaris. We present the history of this journey and the workflow and tool development from prototype to use on the first feature production, Moana 2, and discuss the lessons learned and the adjustments made from the original MVP workflow to the one used today.
https://dl.acm.org/doi/10.1145/3744199.3744636

Frozen 2: Effects Vegetation Pipeline

Walt Disney Animation Studios’ “Frozen 2” takes place in the Enchanted Forest, which is full of vegetation (e.g., distinctive leaves and foliage) that is manipulated by other characters, including the wind character, Gale. “Frozen 2” also has multiple scenes where a large portion of the forest is on fire. The quantity and scale of vegetation effects in “Frozen 2” presented a challenge to our Effects department. We developed two workflows, the Vegetation Asset workflow and the Fire Tree workflow, to help us achieve high quality artistic performance of procedural tree animation and fire tree simulations on “Frozen 2”. Using the new workflows we not only saw an order of magnitude improvement in the work efficiency of our Effects artists, but also saw an increase in work satisfaction and overall artistic quality since the workflows handled the data management of various assets in the shot, allowing artists to concentrate more on their craft.


Crowd Promotion Pipeline

The Crowds department can animate a large number of characters and agents using procedural animation. A secondary layer of animation can be applied to a series of base cycles to complement the performance using scripting and coding techniques. However, it is often necessary for the artists to customize the animation performance for a subset of the characters. To address the large scale and scope of crowds in Walt Disney Animation Studios’ productions, the Crowds department has developed automated systems that allow animators to use familiar tools to refine the performance of crowd characters on demand. For characters that require custom performances not achievable by the procedural animation system, the Crowd Promotion system maps procedural animation authored in Houdini onto the heavier, fully-rigged character representation, which can then be manipulated by the in-house animation, hair, and cloth simulation tools in Maya. Once the performance is refined, for efficiency, the characters can be demoted back to the lightweight procedural representation while still preserving the hand-crafted animation.


Virtual Window Shader

The feature film “Big Hero 6” is set in a fictional city with numerous scenes encompassing hundreds of buildings. The objects visible inside the windows, especially during nighttime, play a vital role in portraying the realism of the scene. Unfortunately, it can be expensive to individually model each room in every building. Thus, the production team needed a way to render building interiors with reasonable parallax effects, without adding geometry in an already large scene. This paper describes a novel building interior visualization system using a Virtual Window Shader (Shader) written for a ray-traced global illumination (GI) multi-bounce renderer [Eisenacher et al. 2013]. The Shader efficiently creates an illusion of geometry and light sources inside building windows using only pre-baked textures.


Crowd Artist Work


Zootopia crowd pipeline

Disney’s 55th feature animated film Zootopia takes place in a modern animal metropolis. Bringing this bustling city to life required creating a universe in which moose drive cars, lions take selfies, and wildebeest herds roam the sidewalks. Many different species of animals of various sizes and proportions inhabit this city and interact with each other as well as with objects and vehicles in their environment, creating some unprecedented challenges for our crowd pipeline. This required us to rethink how we approach the crowd toolset, and to develop tools flexible enough to handle such a wide variety of cases. Building on the work done on Big Hero 6 [Hamed et al. 2015], we constructed a modular design in which a reliable core set the foundation over which tools could be developed and abstracted, providing a framework for artists to easily construct tools, build on each other's work, and tackle increasingly complex tasks effectively.


Stereoscopic visualization as a tool for learning astronomy concepts

The goal of this project is to create a stereoscopic application for use in a classroom setting, including installation of the supporting hardware. The Virtual Galaxy is a scientific visualization of the Local Group of galaxies and the solar system. It is an interactive stereoscopic application developed using the Vizard Virtual Reality Toolkit, designed to run on a 3D desktop PC or laptop, in a classroom, and in a CAVE environment. Navigation can be controlled with a keyboard and mouse as well as with a wand and head-tracker system. The application also allows the user to modify the inter-pupillary distance (IPD) at runtime, adjusting the separation between the left-eye and right-eye images. The system was developed using Python scripting, and the individual galaxy and planet models were built in 3D Studio MAX. This educational tool is currently used in two descriptive astronomy courses in the Physics department at Purdue University, taken by engineering students as well as students from other departments. The objects in the system are rendered to scale so that students can appreciate the large variation in the sizes of objects found in the universe, the velocity required to travel through space, and the distances between two planets, two stars, or even two galaxies.
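The runtime IPD adjustment described above amounts to offsetting the virtual camera horizontally for each eye. A minimal Python sketch of that idea (the function name and the simple parallel-camera model are illustrative assumptions, not the Vizard API):

```python
def stereo_eye_positions(center, ipd):
    """Offset a virtual camera horizontally by half the
    inter-pupillary distance (IPD) to get the two eye positions.
    Increasing the IPD at runtime widens the separation between
    the left-eye and right-eye images, changing perceived depth."""
    cx, cy, cz = center
    half = ipd / 2.0
    left = (cx - half, cy, cz)
    right = (cx + half, cy, cz)
    return left, right

# Example: a viewpoint at (0, 1.6, 0) with a 64 mm IPD.
left, right = stereo_eye_positions((0.0, 1.6, 0.0), 0.064)
```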

[Figures: 3D CAVE display, open and closed with head tracker, and classroom installations]

Impact of Visuo-Haptic Simulation for Teaching Nano Particle Behavior

This project is an educational study built around the Falcon, a three-degree-of-freedom force-feedback device. A student holds the ball interface of the Falcon and controls the position of a selected nano particle by clicking the button on the ball interface. The feedback force is calculated from the equations described in the following sections and is exerted on the student’s hand through the ball interface.



Visual Rendering
The software is developed using Visual C++ with the CHAI3D and OpenGL libraries. It simulates the behavior of nano particles under Brownian motion in both 2D and 3D. As seen in Fig. 1, a student can click on any nano particle and drag it to any location on the screen or onto any substrate. The software allows the user to modify the size of the cubes, the number of cubes in the environment, and the temperature. As the temperature increases, the forces acting on the nano particles increase, so the particles vibrate more strongly.


Haptic Rendering
To let the student experience the interaction forces, haptic rendering was developed so that the student can control the position of a nano particle. When the haptic device is connected, the haptic cursor is displayed as a small square on the screen. The student grasps a nano particle by clicking the haptic controller button while the cursor is over it. Once a nano particle is grabbed, the student feels the forces exerted on it, rendered according to Eqn. (1) below:



       Force = amp1·sin(2π·freq1·timef) + amp2·sin(2π·freq2·timef) + amp3·sin(2π·freq3·timef)        (1)

         where:
                 amp1, amp2, and amp3 are three amplitudes selected randomly within a given range,
                 freq1, freq2, and freq3 are three random frequencies generated between zero and a defined maximum frequency, and
                 timef is the elapsed time.



These sinusoids simulate the random forces that a nano particle experiences in Brownian motion. The temperature determines the range from which amp1, amp2, and amp3 are chosen, so at higher temperatures the amplitudes, and therefore the rendered force, are larger.
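Eqn. (1) and the temperature-dependent amplitude range can be sketched as follows. This is a minimal Python illustration, not the Visual C++/CHAI3D code: the linear scaling of the amplitude range with temperature and the names are assumptions.

```python
import math
import random

def haptic_force(temperature, timef, max_freq=100.0):
    """Sum of three sinusoids per Eqn. (1): each term has a random
    amplitude (range set by the temperature) and a random frequency
    between zero and max_freq. Hotter particles jitter harder."""
    force = 0.0
    for _ in range(3):
        amp = random.uniform(0.0, temperature)   # amplitude range grows with T
        freq = random.uniform(0.0, max_freq)
        force += amp * math.sin(2.0 * math.pi * freq * timef)
    return force
```

Because each amplitude is at most `temperature`, the rendered force magnitude is bounded by three times that value.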

[Fig. 1 and photos of the haptic setup]

Virtual Ecosystems on CUDA

This project implemented the system from the paper “Interactive Modeling of Virtual Ecosystems” by Bedrich Beneš, Nathan Andrysco, and Ondrej Št’ava in CUDA to improve performance.



System Description

The system enables quick and intuitive creation of models of virtual trees and ecosystems. The user defines intuitive plant parameters, initial positions, and obstacles.

This project involved moving the collision-check implementation of the above system from the CPU to the GPU using CUDA.



Problems with the CPU Implementation and Goals

  • Collision Check

    • Each bud is checked serially on the CPU for collisions in voxel space, so execution is slow.

  • Implementing Collision Checks in CUDA

    • Execute bud collision checks in parallel.

    • Implement the complete voxel space in CUDA.


CUDA Implementation

  • Moving Voxels to the GPU

    • Voxels are implemented as a one-dimensional array of booleans representing the 3D space.

    • With voxel dimensions of 512×512×512, the grid occupies 512×512×512×1 byte = 128 MB on the GPU.
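The 3D-to-1D flattening behind that layout can be sketched as follows (shown in Python for brevity; the x-fastest ordering and the names are illustrative assumptions about the CUDA implementation):

```python
# Flatten a 3D voxel coordinate into the 1D occupancy array
# stored on the GPU (one boolean flag per voxel).
def voxel_index(x, y, z, dim=512):
    return x + y * dim + z * dim * dim

# Memory check: a 512^3 grid of 1-byte flags is 128 MB.
total_bytes = 512 ** 3
assert total_bytes == 128 * 1024 * 1024

# A bud collides when its voxel is already occupied, e.g.:
#   occupied = bytearray(dim ** 3)
#   hit = occupied[voxel_index(x, y, z, dim)]
```

On the GPU, each thread can evaluate one bud's index and occupancy flag independently, which is what makes the per-bud collision check parallelizable.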



Results

The CUDA implementation yielded a large performance improvement for large numbers of nodes, as can be seen in the time comparison chart.

[Figures: virtual ecosystem render and time comparison chart]

©2019 by Norman Moses Joseph. Proudly created with Wix.com
