Researched Unreal Engine at De Montfort University

 

Role

Unreal Engine Programmer with De Montfort University

Technologies

    • Sequencer

    • Live Link

    • Blueprinting

 
 

This project was delivered with De Montfort University as part of Audience of the Future - an R&D fund designed to explore what the next form of consumer entertainment might be. What can be accomplished in VR, AR, and other new mediums?

Well, this is what we accomplished over the course of a few days in the university's OptiTrack motion capture volume. The brief was fairly open - translate the movement of a performer into something abstract on-screen. Rather than simply mapping the motion onto an avatar, it should have some effect on the virtual world.

What I settled on was a light controlled by the performer's movement. The rotation of his left and right arms independently controlled the colour and brightness of the light: as he moved the bow the brightness would change, and as he moved his other hand up and down the body of the cello (altering the pitch of the sound) the colour would change to suit.
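In the piece itself this mapping was built in Blueprints, but the logic boils down to something like the C++ sketch below. Everything here is illustrative: the class and bone names, rotation ranges, and colour endpoints are assumptions, and it presumes a skeletal mesh already being driven by Live Link from the mocap volume.

    // Hypothetical C++ equivalent of the Blueprint logic (names and ranges are illustrative).
    #include "GameFramework/Actor.h"
    #include "Components/SkeletalMeshComponent.h"
    #include "Components/PointLightComponent.h"
    #include "PerformerLightActor.generated.h"

    UCLASS()
    class APerformerLightActor : public AActor
    {
        GENERATED_BODY()
    public:
        APerformerLightActor() { PrimaryActorTick.bCanEverTick = true; }

        UPROPERTY(EditAnywhere) USkeletalMeshComponent* PerformerMesh = nullptr; // driven by Live Link
        UPROPERTY(EditAnywhere) UPointLightComponent* StageLight = nullptr;

        virtual void Tick(float DeltaSeconds) override
        {
            Super::Tick(DeltaSeconds);
            if (!PerformerMesh || !StageLight) return;

            // Bow-arm pitch drives brightness.
            const float BowPitch = PerformerMesh->GetSocketRotation(TEXT("hand_r")).Pitch;
            StageLight->SetIntensity(FMath::GetMappedRangeValueClamped(
                FVector2D(-45.f, 45.f), FVector2D(500.f, 8000.f), BowPitch));

            // Fingering-arm pitch drives colour, blended through HSV for a smooth hue shift.
            const float FingerPitch = PerformerMesh->GetSocketRotation(TEXT("hand_l")).Pitch;
            const float Alpha = FMath::GetMappedRangeValueClamped(
                FVector2D(-45.f, 45.f), FVector2D(0.f, 1.f), FingerPitch);
            StageLight->SetLightColor(FLinearColor::LerpUsingHSV(
                FLinearColor::Blue, FLinearColor::Red, Alpha));
        }
    };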

The project was also polished and packaged to run on headsets, with a much more elaborate environment and additional environmental effects.

For this one, the audio track was synced with the animation track, but broken up into 5 separate stems (cello, vocals, drums etc.). Each stem was placed on a separate rock with a falloff distance just beyond the player start, so the player could hear the complete mix when standing in the centre, but also each individual stem when approaching and bending down to the rocks.
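In practice this amounts to spawning each stem as a looping sound at its rock with a tight attenuation radius. The sketch below is a rough illustration rather than the project's actual setup: the radii, falloff distance, and function name are assumptions.

    // Hypothetical sketch: one stem per rock, fading out just beyond the player start.
    #include "Kismet/GameplayStatics.h"
    #include "Components/AudioComponent.h"
    #include "Sound/SoundAttenuation.h"

    UAudioComponent* SpawnStemOnRock(UWorld* World, USoundBase* Stem, const FVector& RockLocation)
    {
        // Attenuation settings created on the fly; in practice these would live in an asset.
        USoundAttenuation* Attenuation = NewObject<USoundAttenuation>();
        Attenuation->Attenuation.AttenuationShape = EAttenuationShape::Sphere;
        Attenuation->Attenuation.AttenuationShapeExtents = FVector(150.f, 0.f, 0.f); // full-volume radius (cm)
        Attenuation->Attenuation.FalloffDistance = 600.f;                            // silent beyond ~6 m

        // Looping stems with overlapping radii blend naturally as the player moves between rocks.
        return UGameplayStatics::SpawnSoundAtLocation(World, Stem, RockLocation,
            FRotator::ZeroRotator, 1.f, 1.f, 0.f, Attenuation);
    }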

Care had to be taken to point the directional audio precisely at the player avatar, while still overlapping the attenuation ranges enough that dynamic blending could take place - where 2 or 3 separate stems could merge to create a completely new sound.
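A small per-tick helper along these lines keeps each stem aimed at the player. Again, this is a sketch under assumptions (cone-shaped attenuation on the stems, a single local player) rather than the project's actual code.

    // Hypothetical sketch: aim a cone-attenuated stem at the player avatar each tick.
    #include "Kismet/GameplayStatics.h"
    #include "Components/AudioComponent.h"
    #include "GameFramework/Pawn.h"

    void AimStemAtPlayer(UAudioComponent* StemComponent, const UObject* WorldContext)
    {
        const APawn* Player = UGameplayStatics::GetPlayerPawn(WorldContext, 0);
        if (!Player || !StemComponent)
        {
            return;
        }

        // Rotate the component so its forward (cone) axis faces the player.
        const FVector ToPlayer = Player->GetActorLocation() - StemComponent->GetComponentLocation();
        StemComponent->SetWorldRotation(ToPlayer.Rotation());
    }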

 