Challenge:  Create a system that can produce hours of animated content in real time via a puppeteer user interface.  
Solution:  Build a "Video Game for Puppeteers" that combines a library of triggered animations with a live lip-sync engine.
Below is a pitch I built for BBDO/M&M's, demonstrating a Live Digital Puppetry System.  Everything was recorded live to tape.  All animation was puppeteered live behind the scenes, and no Motion Capture was involved.  The audio lip sync for Red was done live using a custom real-time Voice Phoneme Recognition algorithm.  This was produced in 2011-2012.
(Note:  This video does not seem to play properly in Chrome, and I have not had time to troubleshoot this.)
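As a rough illustration only (the actual recognition algorithm was custom and is not shown here), the sketch below shows the general shape of a real-time lip-sync stage: take the phoneme just recognized from the voice actor's audio, look up a target mouth shape, and ease blend-shape weights toward it every frame.  The viseme names and phoneme table are hypothetical stand-ins, not the production data.

```cpp
// Illustrative sketch only -- not the original engine.  Hypothetical viseme
// names and phoneme table, standing in for a real-time lip-sync stage that
// turns recognized phonemes into smoothed mouth blend-shape weights.
#include <algorithm>
#include <cstdio>
#include <map>
#include <string>
#include <vector>

enum class Viseme { Rest, AA, EE, OO, FV, MBP, L };

// Hypothetical phoneme-to-viseme lookup; a real table covers the full phoneme set.
Viseme visemeFor(const std::string& phoneme) {
    static const std::map<std::string, Viseme> table = {
        {"AA", Viseme::AA}, {"AE", Viseme::AA},
        {"IY", Viseme::EE}, {"EH", Viseme::EE},
        {"UW", Viseme::OO}, {"OW", Viseme::OO},
        {"F",  Viseme::FV}, {"V",  Viseme::FV},
        {"M",  Viseme::MBP}, {"B",  Viseme::MBP}, {"P", Viseme::MBP},
        {"L",  Viseme::L},
    };
    auto it = table.find(phoneme);
    return it != table.end() ? it->second : Viseme::Rest;
}

struct LipSync {
    std::map<Viseme, float> weights;  // current blend-shape weights, 0..1

    // Ease every weight toward its target each frame so the mouth does not pop.
    void update(Viseme target, float dt, float speed = 12.0f) {
        weights.emplace(target, 0.0f);              // make sure the target has an entry
        const float alpha = std::min(1.0f, speed * dt);
        for (auto& [viseme, weight] : weights)
            weight += ((viseme == target ? 1.0f : 0.0f) - weight) * alpha;
    }
};

int main() {
    LipSync mouth;
    const std::vector<std::string> phonemes = {"HH", "EH", "L", "OW"};  // "hello"
    for (const auto& p : phonemes) {
        mouth.update(visemeFor(p), 1.0f / 30.0f);   // one 30 fps frame per phoneme
        std::printf("%-2s -> %zu viseme weights tracked\n", p.c_str(), mouth.weights.size());
    }
    return 0;
}
```

Because the phoneme stream arrives live from the voice actor, a stage like this has to favor low latency and smooth transitions over perfect per-phoneme accuracy.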

Live Digital Puppetry



Below are some additional videos I worked on for BBDO/M&M's while serving as a Technology Consultant at (the now defunct) Semerad VFX.
I was the Technical Director and Systems Architect responsible for designing, implementing, and maintaining the real-time animation system used to perform Ms. Green live.  The videos below were all recorded live and taken from actual broadcasts.  They were all done using redundant full-body Motion Capture systems and a facial puppeteering rig, with the Voice Actor in an ADR booth while everyone watched a live feed monitor.
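As a rough sketch of how such a pipeline can fit together (the types and health check below are assumptions for illustration, not the broadcast system), each frame the engine picks the healthiest of the redundant motion-capture feeds and layers the facial-rig and lip-sync channels on top before rendering:

```cpp
// Illustrative sketch only -- hypothetical types standing in for a pipeline
// that merges redundant body capture with separate face and mouth channels.
#include <array>
#include <cstdio>
#include <optional>
#include <vector>

struct BodyPose { std::vector<float> jointRotations; };
struct FacePose { std::vector<float> blendWeights; };

// One incoming capture feed; "fresh" stands in for whatever health check
// a real system would use (timestamps, dropout counters, etc.).
struct MocapFeed {
    bool fresh = false;
    BodyPose pose;
};

// Redundancy: take the first feed that delivered a valid frame this tick.
std::optional<BodyPose> selectBody(const std::array<MocapFeed, 2>& feeds) {
    for (const auto& feed : feeds)
        if (feed.fresh) return feed.pose;
    return std::nullopt;  // both dropped out: the caller holds the last good pose
}

// Everything the renderer needs for one frame of the character.
struct CharacterFrame {
    BodyPose body;   // full-body motion capture
    FacePose face;   // facial puppeteering rig
    FacePose mouth;  // live lip-sync weights
};

int main() {
    std::array<MocapFeed, 2> feeds;
    feeds[1].fresh = true;                            // primary dropped, backup is live
    feeds[1].pose.jointRotations.assign(3, 0.5f);

    CharacterFrame frame;
    if (auto body = selectBody(feeds)) frame.body = *body;
    std::printf("body joints this frame: %zu\n", frame.body.jointRotations.size());
    return 0;
}
```

Keeping each channel independent is what lets the body puppeteer, facial puppeteer, and voice actor all perform at once without blocking one another.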
After only a few practice runs, all of the puppeteers were working in sync, and we were able to produce hours of live, spontaneous, and engaging content that would not have been possible otherwise.  
(Note:  The entire system was capable of 1080p HDTV at 30 fps, but unfortunately these broadcasts were recorded in SD, so none of these videos are very high quality.)
A selection of screen grabs.