Digitized a musical

 

Role

Technical Animator with Leo & Hyde

Technologies

    • Unreal Engine Sequencer

    • MetaHumans

    • Xsens MVN

    • StretchSense

 
 

Leo & Hyde embraced motion capture with open arms, even though it was entirely new to their company. They understood its potential to take their musical performances to new heights.


The retargeting process involved a comprehensive performance capture setup that let me stream facial, body, and finger movements in real time onto a MetaHuman in Unreal Engine, with latency kept to a minimum for a smooth, seamless performance. The result was spectacular and left the audience spellbound.
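To illustrate the latency-minimizing side of real-time streaming, here is a minimal Python sketch that reads a UDP stream of pose packets and always applies only the newest one, discarding anything stale. The port number, packet contents, and apply_pose callback are placeholders for illustration, not the actual Xsens MVN or Live Link protocol used in the production.

    import socket
    import time

    # Hypothetical UDP port for the incoming pose stream (placeholder value).
    POSE_PORT = 9763

    def latest_pose(sock):
        """Drain the socket buffer and return only the newest packet,
        so a slow frame never forces the character to replay old poses."""
        sock.setblocking(False)
        newest = None
        while True:
            try:
                newest, _ = sock.recvfrom(4096)
            except BlockingIOError:
                break  # buffer empty: newest now holds the freshest packet, if any
        return newest

    def run(apply_pose, fps=60):
        """Poll once per frame and push the freshest pose onto the character."""
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        sock.bind(("0.0.0.0", POSE_PORT))
        while True:
            packet = latest_pose(sock)
            if packet is not None:
                apply_pose(packet)  # e.g. decode and drive the MetaHuman rig
            time.sleep(1.0 / fps)

The design choice worth noting is that dropping stale packets trades a little smoothness for responsiveness, which matters when a live audience is watching the performer and the avatar at the same time.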

I enjoyed my role on the project, which involved using Sequencer to assign camera switches to specific parts of the song, adding an exciting element similar to a live broadcast. The result was a cohesive, dynamic performance that kept the audience engaged from start to finish.
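To show the idea of tying camera switches to parts of a song, here is a small Python sketch that turns a list of song sections into a frame-accurate camera-cut schedule. The section names, timecodes, camera labels, and frame rate are made up for illustration and are not the production data; in the actual show this mapping was authored in Sequencer rather than in a script.

    # Hypothetical song sections and camera choices, purely for illustration.
    SONG_SECTIONS = [
        {"name": "intro",   "start_s": 0.0,   "camera": "WideStage"},
        {"name": "verse 1", "start_s": 14.5,  "camera": "CloseUp"},
        {"name": "chorus",  "start_s": 42.0,  "camera": "CraneSweep"},
        {"name": "bridge",  "start_s": 78.25, "camera": "AudiencePOV"},
    ]

    FPS = 30  # assumed sequence frame rate

    def camera_cut_schedule(sections, fps=FPS):
        """Convert section start times into camera cuts,
        each cut running until the next section begins."""
        cuts = []
        ordered = sorted(sections, key=lambda s: s["start_s"])
        for current, nxt in zip(ordered, ordered[1:] + [None]):
            start_frame = round(current["start_s"] * fps)
            end_frame = round(nxt["start_s"] * fps) if nxt else None  # None = run to end
            cuts.append((start_frame, end_frame, current["camera"]))
        return cuts

    if __name__ == "__main__":
        for start, end, camera in camera_cut_schedule(SONG_SECTIONS):
            print(f"frames {start}-{end if end is not None else 'end'}: cut to {camera}")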

 