Komanda: a real-time content editing system

What if we could create a film or an experience in real-time?

We created Komanda to build and drive immersive experiences. It acts as a conductor's baton, letting us take a group of people through a shared journey, usually on mobile VR headsets.

At the top of the screen, the command interface displays what the audience sees, together with the temperature, performance, and battery status of each device.
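Komanda's actual telemetry format isn't documented here, but as a minimal sketch, a per-headset status report of the kind the interface displays might look like this (all field names and thresholds are hypothetical):

```python
from dataclasses import dataclass

@dataclass
class DeviceStatus:
    """Hypothetical per-headset report feeding the command interface."""
    device_id: str
    temperature_c: float  # headset temperature, degrees Celsius
    fps: float            # rendering performance, frames per second
    battery_pct: int      # remaining battery, 0-100

def needs_attention(status: DeviceStatus) -> bool:
    # Illustrative thresholds: flag headsets that are overheating,
    # dropping frames, or about to run out of charge mid-show.
    return status.temperature_c > 45.0 or status.fps < 55.0 or status.battery_pct < 15
```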

The lower part shows the available scenes and a visual representation of the effect commands:

  • Camera movements
  • Particle sensitivity (how much the particles react to the music; a minimal sketch of this mapping follows the list)
  • Light sensitivity (how much the lights react to the music)
  • Sun intensity
  • Particle morphing
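The sensitivity commands boil down to scaling how strongly a live audio signal drives a visual parameter. Here is a minimal sketch of that idea; the linear mapping and value ranges are assumptions, not Komanda's published behaviour:

```python
def reactive_level(audio_level: float, sensitivity: float) -> float:
    """Scale a live audio level (0.0-1.0) by an operator-set sensitivity knob.

    Assumed linear mapping: at sensitivity 0 the visuals ignore the music;
    at higher values they follow it more strongly. The result is clamped
    so the renderer always receives a value in [0, 1].
    """
    return max(0.0, min(1.0, audio_level * sensitivity))

# Example: the same beat drives particles and lights with different strengths.
beat = 0.8
particle_drive = reactive_level(beat, sensitivity=0.5)  # 0.4
light_drive = reactive_level(beat, sensitivity=1.5)     # 1.0 (clamped)
```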

These commands, together with the scene triggers, are driven from a physical interface (a MIDI controller). In short, we decide in real time:

  • What scene comes next
  • When to cut
  • How to move the camera
  • Which effects to apply in a scene, and how strongly (a sketch of the MIDI mapping follows this list)
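
Komanda's internals aren't published, but the control flow is easy to sketch: pads trigger scene cuts while knobs stream effect values. A minimal illustration using the mido Python MIDI library follows; the note/CC mapping, scene names, and send_command callback are all hypothetical:

```python
import mido  # a common Python MIDI library, standing in for whatever Komanda uses

# Hypothetical mapping from controller pads and knobs to commands.
SCENE_NOTES = {36: "intro", 37: "particles", 38: "finale"}
EFFECT_CCS = {1: "particle_sensitivity", 2: "light_sensitivity", 3: "sun_intensity"}

def run(send_command):
    """Translate incoming MIDI into scene triggers and effect commands."""
    with mido.open_input() as port:  # opens the default MIDI input device
        for message in port:         # blocks, yielding messages as they arrive
            if message.type == "note_on" and message.note in SCENE_NOTES:
                # A pad press cuts to the mapped scene.
                send_command("scene", SCENE_NOTES[message.note])
            elif message.type == "control_change" and message.control in EFFECT_CCS:
                # A knob turn streams a 0-127 value, normalised to 0.0-1.0.
                send_command(EFFECT_CCS[message.control], message.value / 127.0)
```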

Because the latency is very low, the system is particularly effective for experiences with live music: we can adapt the edit to the music as it plays.

Born as a student project, the idea was originally conceived for live film editing but was labelled a “dull idea” by the course leader (he had sensible arguments).

Early concepts for Fables, © Guillaume Couche, 2014

It subsequently became a live music video editing system, which we developed to build our real-time composited music video for Matterflow.

Komanda is what we use to run the Miroshot live immersive experience.

Contact

Any questions, ideas, or suggestions?

hello@wolfinmotion.com

And follow us on Instagram to get the latest updates.

© Wolf In Motion, 2016-2018