As part of Unite 2015 Melbourne, Deakin Motion Lab created an opening performance in two parts. Both were motion-captured dance with interactive visuals built in Unity: first, a preview of another performance, Pinoke, a robot-dancer interaction piece; and second, a re-imagining of Vox Lumen, an earlier Deakin Motion Lab production.
I created many pieces of tech: a ripple simulation, multiple particle simulations, a trail renderer, a mesh-building solution… all GPU-based compute shaders, which were new to me!
The tech behind the mesh building is actually maze generation! Mike Bostock's Visualizing Algorithms was a big help there. Simply revealing triangles in the order they were originally assigned looks bad! I mean, it looks okay, but it can look better. I tried both random traversal and randomised depth-first traversal; each created a unique look, with random traversal proving the most suitable. The trails are miniaturised versions of my tunnel, but running entirely on the GPU instead.
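To make the idea concrete, here's a minimal sketch of the randomised depth-first traversal in Python (the production version lives in Unity compute shaders, and the function name and grid abstraction here are mine, not from the actual project): treat the mesh as a grid of cells, walk it depth-first choosing a random unvisited neighbour each step, and reveal one triangle per visited cell in visit order.

```python
import random

def randomized_dfs_order(width, height, start=(0, 0), seed=42):
    """Visit every cell of a width x height grid via randomised
    depth-first traversal, returning the visit order.

    Treating each cell as one mesh triangle (or quad), revealing
    geometry in this order grows the mesh organically outward from
    the start point instead of in flat assignment order.
    """
    rng = random.Random(seed)
    visited = {start}
    order = [start]
    stack = [start]
    while stack:
        x, y = stack[-1]
        # Collect in-bounds, unvisited neighbours of the current cell.
        neighbours = [(x + dx, y + dy)
                      for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))
                      if 0 <= x + dx < width and 0 <= y + dy < height
                      and (x + dx, y + dy) not in visited]
        if neighbours:
            nxt = rng.choice(neighbours)  # random branch = the "maze" look
            visited.add(nxt)
            order.append(nxt)
            stack.append(nxt)
        else:
            stack.pop()  # dead end: backtrack
    return order
```

Swapping the stack for "pick any visited cell with an unvisited neighbour, at random" gives the plain random-traversal variant, which grows as a blob rather than as long winding corridors.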
The end of the video hints at how this works: a ripple simulation over a plane of points, with a selective pixel-pulling effect applied on top. Motion capture pulls the height of the ripple simulation up and down to match the dancer's hands.
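The standard way to run a ripple over a plane of points is the two-buffer height-field algorithm, which is what a GPU version maps onto naturally (one compute-shader thread per point). This is a sketch of that general technique in plain Python for readability, not the production shader; in the real piece, the mocap-driven hand positions would write heights into the current buffer each frame.

```python
def ripple_step(curr, prev, damping=0.98):
    """One step of the classic two-buffer height-field ripple.

    Each interior point's new height is half the sum of its four
    neighbours, minus its own height from the previous step, then
    damped so waves die out. Every point updates independently,
    which is why this parallelises trivially on the GPU.
    """
    h, w = len(curr), len(curr[0])
    nxt = [[0.0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            spread = (curr[y][x - 1] + curr[y][x + 1] +
                      curr[y - 1][x] + curr[y + 1][x]) / 2.0
            nxt[y][x] = (spread - prev[y][x]) * damping
    return nxt

# Poke one point (e.g. a hand position) and step: the disturbance
# spreads outward to the neighbouring points as a ring.
curr = [[0.0] * 9 for _ in range(9)]
prev = [[0.0] * 9 for _ in range(9)]
curr[4][4] = 1.0
nxt = ripple_step(curr, prev)
```

On each frame you step, then swap the roles of the two buffers; the "minus previous height" term is what turns a simple blur into an oscillating wave.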
I had a hand in creating the particle effects in Stephen Jeal’s Nebula segment, and he provided the models for the Island scene.