The future of Lighting Control


smalljoshua


Just over a month ago, Paul Smith (@Smiffy) wrote a guest post on Rob Sayer's (@indyld) On Stage Lighting blog titled Lighting Control – Where Are We Going? (an interesting read in itself).

 

Imagine my thoughts today then, when I saw this on my Twitter feed.

 

Kinect MIDI Controller

 

There is certainly some interesting potential in this. Once it becomes available as source, is there anyone out there who fancies having a play with it? Unfortunately, I've not got any experience with 3D position mapping (of either input or output) or the associated geometric algorithms it requires (and I don't know C++ well enough either). So although I have a few UI ideas, I've just not got the skills (currently) to expand on them.
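
For the curious, the core of such a mapping needn't be much code. Here's a rough Python sketch of the idea - a tracked hand height scaled onto a MIDI controller value - where get_hand_height() is only a stand-in for whatever the Kinect skeleton data ends up exposing; the MIDI side is just the standard three-byte Control Change message.

```python
# Rough sketch: map a tracked hand height to a MIDI CC value.
# get_hand_height() is a placeholder for real Kinect skeleton data.

def get_hand_height():
    """Placeholder: return the tracked right hand's height in metres."""
    return 1.2  # would come from the Kinect skeleton tracker once released

def hand_to_cc(height_m, min_h=0.5, max_h=2.0):
    """Scale a hand height (metres) into a 0-127 MIDI CC value."""
    t = (height_m - min_h) / (max_h - min_h)
    t = max(0.0, min(1.0, t))            # clamp to the calibrated range
    return int(round(t * 127))

def cc_message(channel, controller, value):
    """Build a raw three-byte MIDI Control Change message."""
    return bytes([0xB0 | (channel & 0x0F), controller & 0x7F, value & 0x7F])

msg = cc_message(channel=0, controller=1, value=hand_to_cc(get_hand_height()))
print(msg.hex())  # send this out of whatever MIDI port is to hand
```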

 

Anyone fancy taking a shot? You may just become the next big thing in Console Interface Design.

 

This could also become a great basis for discussing the next generation of console interface ideas.

 

Josh


Personally, I think gesture-controlled lighting is going to be a tad limited. Imagine this:

 

You've just programmed a show that you're really proud of by pointing at fixtures and waving at them as if you were God (which will really confuse people for a while - lightning and smoke from the palm of your hand?)...

 

All good so far.

Come show night, you get the sudden urge to scratch your head, or some other itch to that effect. You move, and the console mistakes it for whatever the equivalent of the GO button is.

Your career comes out looking slightly worse.

 

For that reason, I think gesture control will be limited to programming time.

 

Even then, we have other problems to work around...

Setting intensity, position, RGB/CMY, focus, etc. is going to be far more accurate and speedier with conventional controls: people have a habit of not being perfectly still even when they want to be, so any value such a system produces will fluctuate - conventional faders and encoders just stay where they are!
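
To make the fluctuation point concrete: any gesture-derived value would need heavy filtering before it behaved anything like a fader. A rough sketch of the sort of smoothing involved (the smoothing factor and dead-band are purely illustrative):

```python
# Rough sketch: smooth a noisy gesture-derived value so it behaves more
# like a fader. Alpha and the dead-band threshold are illustrative guesses.

class SmoothedValue:
    def __init__(self, alpha=0.2, deadband=2.0):
        self.alpha = alpha          # low-pass smoothing factor (0-1)
        self.deadband = deadband    # ignore changes smaller than this
        self.filtered = None
        self.output = None

    def update(self, raw):
        if self.filtered is None:
            self.filtered = self.output = raw
            return self.output
        # Exponential moving average takes the shake out of the raw reading.
        self.filtered += self.alpha * (raw - self.filtered)
        # Dead-band: only move the output once the change looks deliberate.
        if abs(self.filtered - self.output) > self.deadband:
            self.output = self.filtered
        return self.output

dimmer = SmoothedValue()
for reading in [128, 131, 127, 129, 180, 182, 181]:   # jitter, then a real move
    print(round(dimmer.update(reading)))
```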

 

That said though, if a machine can deduce that you have selected a line of fixtures and you are producing a sine-wave (or other shape) with your fingers, then programming chases will be pretty quick!
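
As a rough illustration of what the console might do with such a captured shape (not any particular desk's effects engine - just the maths of spreading a sine wave along a fixture line as a chase):

```python
import math

# Rough sketch: spread a sine wave across a line of fixtures to build a chase.
# The machine would derive the wave from the captured finger movement; here it
# is generated directly just to show the idea.

def sine_chase_step(num_fixtures, step, steps_per_cycle=16, low=0, high=255):
    """Return one chase step: an intensity per fixture, offset along the line."""
    levels = []
    for fixture in range(num_fixtures):
        phase = 2 * math.pi * (step / steps_per_cycle + fixture / num_fixtures)
        level = low + (high - low) * (math.sin(phase) + 1) / 2
        levels.append(int(round(level)))
    return levels

# Print the first few steps for a line of 8 fixtures.
for step in range(4):
    print(sine_chase_step(8, step))
```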


The PlayStation generation haven't had their chance at designing their own user interface yet; they're still using the old farts' interfaces designed by my generation.

 

When they do, we "won't get it".

 

That's the way it's supposed to be...


The PlayStation generation haven't had their chance at designing their own user interface yet; they're still using the old farts' interfaces designed by my generation.

<snip>

 

Consider me, a product of the PlayStation generation, an old fart. I like my panels of controls!

 

Meanwhile, I await the development of large (40"+) ultra-high-definition touchscreens, so that someone can make a hybrid of MagicQ (or another console) with shape-gesture capture for fixture arrays - so at least we get halfway to surface-less gesture control...


For that reason, I think gesture control will be limited to programming time.

I definitely agree with that, 100%.

 

Setting intensity

A simple thumb/forefinger pinch would work quite nicely there.
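
Something along these lines, presumably - the open/closed pinch distances here are calibration guesses, not anything the Kinect actually reports:

```python
# Rough sketch: map thumb-forefinger pinch distance to a dimmer level.
# The open/closed distances are illustrative calibration values.

def pinch_to_level(distance_m, closed=0.02, open_=0.15):
    """Fully pinched = 0%, fully open = 100% (returned as a 0-255 DMX value)."""
    t = (distance_m - closed) / (open_ - closed)
    t = max(0.0, min(1.0, t))
    return int(round(t * 255))

print(pinch_to_level(0.08))   # a roughly half-open pinch -> mid intensity
```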

 

position

Once the lights are selected, position is controlled simply by where you point your hands.

 

RGB/CMY

A Minority Report-style HUD, with hue on the X-axis and saturation on the Y-axis, would be a good start here - then just select from that HUD.
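
The maths behind such a HUD pick is simple enough: treat X as hue and Y as saturation, then convert to whatever mixing system the fixture uses. A quick sketch using Python's standard colorsys module, with brightness pinned at full for simplicity:

```python
import colorsys

# Rough sketch: turn a pick on a hue/saturation HUD into RGB (and CMY) levels.
# x and y are normalised 0.0-1.0 coordinates of where the hand selects.

def hud_pick_to_rgb(x, y):
    """X-axis = hue, Y-axis = saturation, brightness pinned at full."""
    r, g, b = colorsys.hsv_to_rgb(x, y, 1.0)
    return int(r * 255), int(g * 255), int(b * 255)

def rgb_to_cmy(r, g, b):
    """Complementary CMY values for subtractive colour-mixing fixtures."""
    return 255 - r, 255 - g, 255 - b

rgb = hud_pick_to_rgb(0.33, 0.8)   # a fairly saturated green-ish pick
print(rgb, rgb_to_cmy(*rgb))
```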

 

going to be far more accurate and speedier with conventional controls: people have a habit of not being perfectly still even when they want to be, so any value such a system produces will fluctuate - conventional faders and encoders just stay where they are!

I'll agree it's potentially less accurate, but infinitely quicker (I would imagine) once you're used to it.

 

That said though, if a machine can deduce that you have selected a line of fixtures and you are producing a sine-wave (or other shape) with your fingers, then programming chases will be pretty quick!

Agreed, all these little things that make your life easier should be seen as a bonus.

 

Josh


Thanks for the shout out, Josh. :)

 

Looking at the comments thus far, it's pretty much the response I thought we'd get from the article. New ideas and technologies always take a while to catch on, and I'm not saying that the ideas I discussed are the way forward - they are merely put forward as an option. On a somewhat more conventional note, it took Egor Popovski at MA Middle East about two years to get me to try the MA1. I would look at it every trade show, they would try to demo it to me (I was a devoted Hog III user at the time), and I would always walk away from the experience with the utterly British feeling that it was just 'too German'. :)

 

Eventually, Egor's final strategy was to say "screw this, here's an MA1 Light - take it away and use it on some shows for a month". Now, of course, the MA 1 & 2 range is one of only two manufacturers' console ranges that I will spec (the other being the ETC Eos, should you be wondering).

 

As with every prediction of the future, it's just a logical assembly of current technology developments and their potential for modification to work within our business. I happen to think that while there are certainly a lot of technological hurdles to overcome, it wouldn't be impossible, and of all the thoughts I had about the future of control, the most plausible was the one that I blogged about at Rob's site. The others ranged from voice recognition (I can barely hear the spot ops at a live gig; a computer would struggle) through to thought control (how often do you look at the stage and think that a pink might work, when it probably wouldn't? The console would need to be able to distinguish an idea from a command... not easy).

 

Keep the comments coming though - I think it's an interesting discussion that's certainly worth continuing.

 

Cheers

 

Smiffy


One immediate use that I can think of for the Kinect controller is to operate a follow-spot. Another would be to synchronise moving-heads with dancers - gesture control, but done by the performers, not the LX-op. There's already a large community dedicated to hacking this device, and the potential is enormous. If you wanted to build a device with these capabilities from scratch, it would cost a fortune - as it is, you can get one off the shelf for around £100 - what a bargain!

One immediate use that I can think of for the Kinect controller is to operate a follow-spot.

Interestingly, I've just been reading a 'teardown' report on the Kinect. As it stands, I don't think it would be much use for a followspot. The system works by projecting a grid of dots from a Class 1 infrared laser. A 1.3-megapixel IR CMOS image sensor then feeds a processor which derives XYZ locations for each segment. The sensors are motor-driven to keep them aimed at the action. Within the confines of a living room, both in terms of ambient light levels and dimensions, the system works fine. However, I suspect that scaling the target up to a typical stage space, with its much higher illumination levels and inherent IR levels, will pose problems.
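
For anyone curious about the principle, depth recovery in a structured-light system of this sort comes down to triangulation: each projected dot shifts sideways in the camera image by an amount inversely proportional to its distance. A rough sketch of the relation (the focal length and baseline figures are illustrative, not Kinect specifications):

```python
# Rough sketch of the triangulation behind a structured-light depth sensor:
# depth is inversely proportional to the sideways shift (disparity) of each
# projected dot between its reference position and where the camera sees it.
# The baseline and focal length below are illustrative, not Kinect specifics.

def depth_from_disparity(disparity_px, focal_length_px=580.0, baseline_m=0.075):
    """Classic triangulation relation: Z = f * b / d."""
    if disparity_px <= 0:
        return float('inf')   # dot not displaced -> effectively at infinity
    return focal_length_px * baseline_m / disparity_px

for d in (40, 20, 10, 5):
    print(f"disparity {d:>2} px -> depth {depth_from_disparity(d):.2f} m")
```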


Just..... no.

 

You'll look enough of an idiot jumping around in front of your TV at home; I honestly don't think there's much potential for gesture control of lighting in a professional setting.

 

The whole Natal system doesn't lend itself nicely to accurately transferring *relative* human points to an *absolute* location. E.g. if I point my hand at the stage, then moving my head one centimetre left or right will cause the actual location I'm pointing at to be out by a fair margin more. And I don't want to have to stand perfectly still in a calibrated position to get something to work.
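
To put a back-of-envelope number on that - assuming the pointing ray is taken through head and hand roughly 0.6 m apart, and a 10 m throw to the stage:

```python
# Back-of-envelope sketch of how a small head movement blows up into a large
# pointing error when the ray is defined by head and hand positions.
# The head-to-hand distance and throw distance are illustrative assumptions.

def pointing_error(head_shift_m, arm_length_m=0.6, throw_m=10.0):
    """If the head moves sideways while the hand stays put, the ray pivots
    about the hand, and the point it hits on stage moves by roughly this much."""
    angle = head_shift_m / arm_length_m              # small-angle approximation
    return angle * (throw_m - arm_length_m)

print(f"{pointing_error(0.01) * 100:.0f} cm off target for a 1 cm head shift")
```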

 

I've done 3D gesture capture before with three Wiimotes (a similar sort of concept to the Project Natal system, but using pre-peak-detected coordinates from a camera image rather than a greyscale image, and about three years ago), and although it's not too tricky to do the maths and filtering, it is just too tricky to map it into a real-world coordinate. Also: do you have the space of a normal living room available where your lighting desk lives - and if you're standing a couple of metres away from the sensor, can you still see the stage?

 

The concept of some more interactive methods of controlling lighting could be interesting, but I'd say that Natal just isn't the right technology for it.

 

Augmented reality would be a better bet, as then the system can work out where you're pointing on stage and map accordingly. That way you don't have to worry about the whole sensor / standing-in-front-of-something problem. Might be a better route to go down.

 

e2a: I forgot to mention the followspot thing, but Brian has pretty much summed up what I was going to say. With the usual "automated movers for followspots" tracking people / identifying people argument, the Natal sensor just isn't powerful enough to work over that size area.


The coverage of the sensor will be a problem, but I don't think the ambient light levels will be too much of an issue. I think the sensor is 'tuned' to the fairly narrow band of IR which the device uses to project the depth pattern (look up 'structured light').

 

A colleague and I have got some funding to research the Kinect as a user interface for novel applications. We will definitely be looking at how it might be used to control lighting and sound systems. I agree, though, that it is unlikely to be a suitable control mechanism for every aspect of lighting programming/operation.


After reading the article, it seemed to me that what the authors want is something like - dare I say it - the Light Console! Something that allows you to play the light without getting in the way. I think that's right. I read the bit about gesture control etc. with something approaching panic! For my money, what we need in the future is increased simplicity - putting the equipment back in its place rather than allowing ourselves to be led by the nose by the gear.

 

 


I can see some kind of holographic interface - similar to that used in Iron Man when Tony Stark is redesigning his suit.

 

A 3D map of the stage with all the fixtures shown - point at a fixture to select it and Minority Report-style tabs pop up - all able to be manipulated in real time before being 'thrown' back at the desk to save it and move on to the next state, possibly with some voice commands, or drop back onto the MS Table for a plan view.

 

Of course, this'll be the point where the desk programmer has to spend some time storing their biometrics on the desk so that it can recognise them ...

 

"Here's the basic show on my portable solid state recorder and this one has my body profile for upload ..."


Not quite gesture, but for focussing movers: if there were a device combining a focus remote and a tracking system, it could be positioned on the stage or merely held, the fixture(s) selected from the keypad, and they could then be told to focus on the position of the remote...
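
The maths for that last step is straightforward: given the fixture's rigging position and the remote's reported position, pan and tilt are just a vector and a couple of arctangents. A rough sketch (ignoring the fixture's own orientation calibration, which a real system would need):

```python
import math

# Rough sketch: point a mover at a tracked remote. Positions are (x, y, z) in
# metres in a shared venue coordinate system; the fixture is assumed to hang
# pointing straight down at pan 0 / tilt 0, which a real rig would calibrate.

def pan_tilt_to_target(fixture_pos, target_pos):
    dx = target_pos[0] - fixture_pos[0]
    dy = target_pos[1] - fixture_pos[1]
    dz = target_pos[2] - fixture_pos[2]               # negative when the target is below
    pan = math.degrees(math.atan2(dy, dx))            # rotation around the vertical axis
    horizontal = math.hypot(dx, dy)
    tilt = math.degrees(math.atan2(horizontal, -dz))  # 0 = straight down
    return pan, tilt

fixture = (0.0, 0.0, 8.0)      # hung 8 m up, over the downstage centre line
remote = (3.0, 4.0, 1.5)       # the focus remote, held at head height
print(pan_tilt_to_target(fixture, remote))
```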

My opinion is that we will probably not see many radical changes in the actual "surface design" over the next few generations of consoles - integration of multi-touch, yes, but to the basic console layout, no. Why? Because the current UI works, and it works well.

 

I think the big thing will be the console knowing the venue and tracking the talent - look at Cast Software's BlackTrax. That is where I think next-gen consoles are heading. The LD will focus once again on making things look good, telling his console (and lights) "P/T position: Track #1, Zoom: Headshot..." instead of creating focus positions and continually updating them based on small blocking changes. The talent will have yet another body pack and, no matter what, they won't be able to miss their special.
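
The "Zoom: Headshot" part is also simple once the desk knows the throw distance - the beam angle needed to hold a constant spot size is a one-liner. A rough sketch (the 0.6 m "headshot" diameter is an assumption, not anything from BlackTrax):

```python
import math

# Rough sketch: the zoom (beam) angle needed to keep a constant-size spot on a
# tracked performer, recalculated as the throw distance changes.

def zoom_for_spot(throw_m, spot_diameter_m=0.6):
    """Full beam angle, in degrees, for a given spot size at a given throw."""
    return math.degrees(2 * math.atan(spot_diameter_m / (2 * throw_m)))

for throw in (4, 8, 12, 16):
    print(f"{throw:>2} m throw -> {zoom_for_spot(throw):.1f} degree beam")
```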

 

Ingrained in this sort of change is that pre-vis will become a bigger aspect of programming, because you need a relatively accurate model of (at least) the lighting rig so that your system can track accurately. So 3D CAD models will become more prevalent, I believe.

 

The LX operator will probably find their role changing slightly in smaller theatres too. Show control tends to make more sense being operated by lighting, and ACN is a great protocol for automation, so I believe that as ACN becomes more accepted, our consoles will become "ACN controllers" as opposed to "lighting desks". Would that be a bad thing?

 

It may even evolve so that desks become modular - not just our desks, but audio too, all tied together on the same backbone, all speaking the same language... That could mean that on smaller shows you pull out just one console and patch 8 channels of digital audio signal processing, 12 dimmers and 8 movers into it. On larger shows, you pull out multiple consoles: the live audio technician patches all his DASP stuff, the playback programmer patches all his stuff to his console, the stage automation technician grabs his stuff, and the lighting techs grab what they need. Then for the show run, which would use a lot less crew (maybe just one audio and one LX), everything gets simplified down so that audio can easily operate playback and live (with tracking surround sound etc.) and LX can do automation and lights.

 

As for Minority Report-style control (or Iron Man-style control)... thanks, but no thanks. I need to control a large number of items simultaneously. Those systems of control seem more oriented towards fine detail work - tweaking one aspect at a time, looking in closer, and so on...


