Voice Control of Stage Lighting

Waiting for the App Store review process is boring. So a few weeks ago, to pass the time, I threw together an Apple Watch app for stage lighting. The concept is simple: tap a button and talk out loud to the lighting desk. Through the power of Siri and a little text parsing, the correct command gets sent to a ChamSys MagicQ lighting console over wifi. No more suicidal phones falling from the grid! The watch itself isn't actually sending any DMX; it's talking to MagicQ via the ChamSys Remote Ethernet Protocol.
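
There isn't much plumbing involved in that last hop. Here's a rough Python sketch of the idea (the IP address, the phrase table and the command strings below are all invented for illustration; the real CREP command syntax is documented in the MagicQ manual, and 6553 is MagicQ's default remote-protocol UDP port):

```python
import socket

# Hypothetical phrase-to-command table. The real CREP command strings are
# defined in the MagicQ manual; "1G" (go on playback 1) is a stand-in.
PHRASES = {
    "go playback one": "1G",
    "release playback one": "1R",
}

CONSOLE = ("192.168.1.30", 6553)  # console IP (made up) and default CREP UDP port

def send_spoken_command(text: str) -> None:
    """Turn dictated text into a console command and fire it over UDP."""
    command = PHRASES.get(text.strip().lower())
    if command is None:
        return  # nothing we recognise was said
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(command.encode("ascii"), CONSOLE)

send_spoken_command("Go playback one")
```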

Made using RubyMotion, code on GitHub

Music is Our Demons (Filous Remix) by The Glitch Mob and It’s All Good by GRiZ

May 2015

A More Complex Mapping

Started just after the release of Mad Liberation by GRiZ, this project sat unfinished and abandoned until I decided to just point a camera at it and hit record. It was my first attempt at a more complex mapping that wasn't just a box or some simple quads. Made using Quartz Composer, CoGe, Ableton and MadMapper, it's basically built from a mask with the visuals (triggered by MIDI) sitting underneath.
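
The MIDI-triggering half is conceptually tiny. Here's a sketch in Python using the mido library (the note numbers and clip names are made up; in the real setup Ableton sent the notes and CoGe did the triggering):

```python
import mido

# Hypothetical note-to-clip mapping, purely for illustration.
CLIPS = {60: "glitch_loop.mov", 62: "strobe.mov", 64: "noise_wash.mov"}

with mido.open_input() as port:           # default MIDI input
    for msg in port:
        if msg.type == "note_on" and msg.velocity > 0:
            clip = CLIPS.get(msg.note)
            if clip:
                print(f"trigger {clip}")  # a real player would start the clip here
```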

Music is Too Young For Tragedy by GRiZ

October 2012

Evesham Comedy Festival

This project was my first go at using CoGe for something in the real world. Having been handed a logo for the festival, I set to work loading effects to make the projection audio reactive. The video includes some pre-show shots and some live shots too.

June 2012

More Projection Mapping

These experiments were done whilst assessing the relative merits of different mapping solutions, including MaxMSP/Jitter, Quartz Composer and MadMapper. The subject is a simple foldaway display board system and, if you look closely in one shot, a mirror as well.

May 2012

Video Mapped Backdrop

This was my first large-scale project, created for Imaginary Friend's production of The Last 5 Years. It was made entirely in Max/Jitter using my own mapping library Maptk, a custom JavaScript media loading engine and a queuing system triggered by a ChamSys MagicQ lighting desk.

The 4 screens were mapped using 2 projectors running off a single laptop to form the backdrop for the show.
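
Stripped of the Max plumbing, the queuing side boils down to something very small: each trigger from the desk advances one step and loads the next piece of media. A toy Python version (how the desk actually signals the laptop, whether MIDI, serial or network, is glossed over here, and the file names are invented):

```python
class CueList:
    """A toy cue list: one incoming trigger advances one step."""

    def __init__(self, cues):
        self.cues = list(cues)
        self.position = -1  # before the first cue

    def go(self):
        """Advance to the next cue; ignore extra triggers past the end."""
        if self.position + 1 < len(self.cues):
            self.position += 1
            media = self.cues[self.position]
            print(f"cue {self.position + 1}: load {media}")

show = CueList(["opening.mov", "city_scroll.mov", "finale.mov"])
show.go()  # cue 1: load opening.mov
show.go()  # cue 2: load city_scroll.mov
```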

March 2012

Human Interaction

Towards the middle of 2011 I started experimenting a bit more with pitch analysis. This simple example involved analysing a live input of a musician playing (with MaxMSP) and lighting their face (via a fullscreen Jitter window on a laptop screen) with varying colours depending on the pitch.
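
In Max the pitch tracking would come from an object like fiddle~ or sigmund~; the colour-mapping half is easy to sketch on its own. The octave-wrapped hue scheme below is just one reasonable mapping, not necessarily the one the patch used:

```python
import colorsys
import math

A4 = 440.0

def pitch_to_rgb(freq_hz: float) -> tuple[int, int, int]:
    """Map a detected pitch onto a hue: one full trip around the colour
    wheel per octave, so the same note class always gets the same colour."""
    if freq_hz <= 0:
        return (0, 0, 0)  # silence -> black
    octave_position = math.log2(freq_hz / A4) % 1.0  # 0..1 within the octave
    r, g, b = colorsys.hsv_to_rgb(octave_position, 1.0, 1.0)
    return (int(r * 255), int(g * 255), int(b * 255))

print(pitch_to_rgb(440.0))  # A -> one fixed colour
print(pitch_to_rgb(880.0))  # A an octave up -> the same colour
```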

This video also shows that a rubbish camera needs more light, hence the dark and out-of-focus result! The switching between colours was actually a smooth transition, not the fade through black it appears to be most of the time.

April 2011

Audio Analysis and Projection Mapping Combined

After the previous few projection mapping experiments, I decided it was time to make things a bit more interesting and add some interaction. Sticking with the radiator as a subject, this time I analysed a live input to create a 14-band VU meter. This is also my first example of a pretty accurate mapping :)

Made in MaxMSP/Jitter.
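
The analysis step translates readily outside Max too. A rough numpy equivalent (the 40 Hz lower edge and the logarithmic band spacing are my guesses, not values from the patch):

```python
import numpy as np

def band_levels(frame: np.ndarray, sample_rate: int, n_bands: int = 14) -> np.ndarray:
    """Reduce one frame of audio to n_bands levels for a VU-style display.
    Bands are spaced logarithmically from 40 Hz up to Nyquist."""
    spectrum = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
    freqs = np.fft.rfftfreq(len(frame), 1.0 / sample_rate)
    edges = np.geomspace(40.0, sample_rate / 2.0, n_bands + 1)
    levels = np.empty(n_bands)
    for i in range(n_bands):
        in_band = (freqs >= edges[i]) & (freqs < edges[i + 1])
        levels[i] = spectrum[in_band].mean() if in_band.any() else 0.0
    return levels

# Example: a 1 kHz tone should light up mostly one band.
sr = 44100
t = np.arange(2048) / sr
print(band_levels(np.sin(2 * np.pi * 1000 * t), sr).round(2))
```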

February 2011

Lighting up a Radiator

Bored of boxes and doors, I moved on to the obvious next subject: a radiator! This project (in MaxMSP/Jitter) involved mesh warping alpha masks to cut out the grilles in the radiator and let the video show through.
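
Once a mask has been warped into place over the grilles, the compositing itself is just a per-pixel multiply. A toy numpy version of that step (the real thing ran on the GPU inside Jitter):

```python
import numpy as np

def mask_composite(video_frame: np.ndarray, mask: np.ndarray) -> np.ndarray:
    """Show the video only where the (warped) mask is white.
    video_frame: H x W x 3 uint8; mask: H x W floats in 0..1."""
    return (video_frame * mask[..., None]).astype(np.uint8)

frame = np.full((4, 4, 3), 200, dtype=np.uint8)  # stand-in video frame
mask = np.zeros((4, 4))
mask[1:3, 1:3] = 1.0                             # "grille" cut-out
print(mask_composite(frame, mask)[:, :, 0])      # video survives only inside the mask
```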

February 2011

Projection Mapping, Take 2

Some more mesh warping in MaxMSP/Jitter, this time with the projector waaaay off-axis, and with a bit of processing to make the video appear masked onto the door panels. At this point it became apparent that anti-aliasing would make a noticeable difference.

Again, more stupendous shaky camera action can be seen here.

February 2011

Projection Mapping, Take 1

My first experiment in mesh warping a video onto a real object. Made in MaxMSP/Jitter using nothing more than an OpenGL mesh and a 2x2 control matrix. As this only uses a mesh, there is no perspective correction applied in the transformation, and I didn’t put in much effort to preserve the video aspect ratio either.
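
For the curious, the reason a bare 2x2 mesh can't be perspective-correct: it linearly blends its four corner positions, whereas true corner-pinning needs a homography with a perspective divide. A quick sketch of that bilinear blend (the corner coordinates are invented):

```python
import numpy as np

# Four control points (the 2x2 matrix): where the video's corners land,
# listed as top-left, top-right, bottom-left, bottom-right.
corners = np.array([[60, 40], [580, 90], [20, 420], [540, 470]], dtype=float)

def warp(u: float, v: float) -> np.ndarray:
    """Bilinear warp of video coordinates (u, v in 0..1) onto the surface.
    This is what a 2x2 OpenGL mesh does: interpolate between the corners.
    There is no perspective division, hence no perspective correction."""
    tl, tr, bl, br = corners
    top = (1 - u) * tl + u * tr
    bottom = (1 - u) * bl + u * br
    return (1 - v) * top + v * bottom

print(warp(0.5, 0.5))  # centre of the video lands at the average of the corners
```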

The wobbly camera effect (even after a stabilising filter) must have been a result of my pure excitement whilst recording the clips that make up this video!

February 2011

Visualising a Tap Dance

Another audio reactive patch made in MaxMSP/Jitter.

The patch was an initial attempt to visualise the sound of someone tap dancing.
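
Tap hits are sharp transients, so the heart of a patch like this is onset detection. As a rough illustration only (not the actual patch logic, which lived in Max), an energy-based detector in Python:

```python
import numpy as np

def onsets(samples: np.ndarray, sample_rate: int, hop: int = 512) -> np.ndarray:
    """Crude energy-based onset detector: flag frames whose short-term
    energy jumps well above the previous frame's. The 4x threshold is
    arbitrary, but tap hits are sharp enough for even this to catch."""
    frames = samples[: len(samples) // hop * hop].reshape(-1, hop)
    energy = (frames ** 2).mean(axis=1)
    rises = np.flatnonzero(energy[1:] > 4.0 * (energy[:-1] + 1e-9)) + 1
    return rises * hop / sample_rate  # onset times in seconds

# Example: two synthetic "taps" in a second of near-silence.
sr = 44100
audio = np.random.randn(sr) * 0.001
for t in (0.25, 0.70):
    audio[int(t * sr):int(t * sr) + 200] += np.random.randn(200) * 0.5
print(onsets(audio, sr).round(2))
```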

January 2011

An Audio Reactive Video

An audio reactive patch made in MaxMSP/Jitter.

All the chops and effects in this clip were made by the patch; I simply loaded the video, played the music and let the computer do the rest.

December 2010