motion analysis and gesture recognition


stevie


Hello there,

 

I am working on a project involving the analysis of a performer's motion and gestures, and I am looking for suggestions as to how best to achieve this under stage lighting and effects conditions. Any input gratefully received.

 

stevie


Birmingham University's Elec. Eng. department does huge amounts of work on gesture recognition and general vision processing (I did my final-year project counting people on CCTV; if you think I can add anything to the party, feel free to PM me).

My first and most obvious thought is to convert everything to mono with the threshold calculated dynamically, then split the image up into fairly small blocks (I think I went 9 x 9 pixels) and do separate thresholds for each block.

Further, if you have the time (I didn't, but then my lighting conditions, though pretty poor, weren't stage levels of bad), you could try looking for beams of light (i.e. areas of above-average intensity), working out what direction they run, and then, if they intersect with the performer's face, subtracting a certain amount from the brightness along that route. If you manage to do that in real time, I take my hat off to you.
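To make the per-block thresholding idea concrete, here's a rough sketch of the sort of thing I mean, assuming Python with OpenCV and a camera on device 0; the 9-pixel block size and the offset constant are just illustrative numbers to tune, not anything specific to stevie's rig:

import cv2

# Rough sketch: per-block (adaptive) thresholding of a live camera feed.
# adaptiveThreshold computes a separate threshold for each pixel from the mean
# of its local block, which is the same idea as thresholding small tiles.
cap = cv2.VideoCapture(0)  # assumed camera index

while True:
    ok, frame = cap.read()
    if not ok:
        break

    grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

    # blockSize=9 mirrors the 9x9 blocks mentioned above; C=5 is an arbitrary
    # offset subtracted from the local mean - both need tuning per venue.
    mask = cv2.adaptiveThreshold(grey, 255,
                                 cv2.ADAPTIVE_THRESH_MEAN_C,
                                 cv2.THRESH_BINARY, 9, 5)

    cv2.imshow("per-block threshold", mask)
    if cv2.waitKey(1) & 0xFF == ord('q'):
        break

cap.release()
cv2.destroyAllWindows()

ADAPTIVE_THRESH_MEAN_C thresholds each pixel against the mean of its local neighbourhood, which is effectively the per-block dynamic threshold described above; the Gaussian variant is there if hard block edges cause trouble.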

Lastly, as an entirely different thought, how about sticking to daylight gigs, where the lampie doesn't bother trying to outshine the sun, and gigs lit for video, where the lighting should be a whole lot more reasonable and easier to ignore.


How about flooding the stage with a lot of near IR?

 

A dozen or so 1Ks with primary red and blue gel in them should do (it might be wise to contrive some airflow between the two gel frames to stop the rear one melting). Or use IR PAR 64s (a specialist product aimed at the security camera market, but it is available).

 

Then use an IR-pass filter on the front of the camera. All of a sudden the nasty dynamic stage lighting is washed out in the band that matters to the motion-tracking camera.

 

A heatshield on all the visible light sources (together with cold mirrors on the same) might help with this as well.

 

We can get cameras that can see outside the visible spectrum, so why not use that fact...

 

Finally, in software, using a Kalman filter to provide course and speed information for each point being tracked (and constraining its acceleration values to something realistic) will possibly help to remove much of the noise; look at the papers on automatically deriving course data from radar images for ideas.
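To show roughly what that would involve, here's a minimal constant-velocity Kalman filter sketch in Python/NumPy for a single tracked point; the frame rate and the process/measurement noise values are made-up numbers that would need tuning against how fast the performer can plausibly accelerate:

import numpy as np

# Constant-velocity Kalman filter for one tracked 2D point.
# State x = [px, py, vx, vy]; measurements are noisy (px, py) detections.
dt = 1 / 25.0                     # assumed frame interval (25 fps)

F = np.array([[1, 0, dt, 0],      # state transition: position += velocity*dt
              [0, 1, 0, dt],
              [0, 0, 1,  0],
              [0, 0, 0,  1]])
H = np.array([[1, 0, 0, 0],       # we only observe position
              [0, 1, 0, 0]])
Q = np.eye(4) * 1e-2              # process noise: limits plausible acceleration
R = np.eye(2) * 4.0               # measurement noise: detector jitter in pixels

x = np.zeros(4)                   # initial state
P = np.eye(4) * 100.0             # initial uncertainty

def kalman_step(z):
    """Advance the filter by one frame using measurement z = (px, py)."""
    global x, P
    # Predict
    x = F @ x
    P = F @ P @ F.T + Q
    # Update
    y = np.asarray(z) - H @ x                 # innovation
    S = H @ P @ H.T + R                       # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)            # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x[:2], x[2:]                       # smoothed position, velocity

# Example: feed in one noisy detection per frame.
pos, vel = kalman_step((120.0, 243.5))

In practice you'd run one of these per tracked point and reject detections whose innovation is implausibly large, which is where most of the lighting noise gets thrown away.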

 

Sounds like a fun project.

 

Regards, Dan.


you could try looking for beams of light (i.e. areas of above-average intensity), working out what direction they run, and then, if they intersect with the performer's face, subtracting a certain amount from the brightness along that route. If you manage to do that in real time, I take my hat off to you.

 

I feel you do have one advantage, in that you have a lighting console that's outputting all the lighting changes, so you could theoretically interface that information with the video in order to indicate when lighting changes are occurring and compensate accordingly. Quite how it would work, I don't know. But it's best to make use of all the information available, and the DMX output is one source...
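As a rough sketch of that interfacing idea, assuming the console's output happens to be available as Art-Net over Ethernet (a wired DMX feed would need a USB interface and different code instead), something like this in Python could flag frames where lots of channels jump, so the vision code knows a lighting change is in progress:

import socket
import struct

# Sketch: watch Art-Net (DMX over UDP, port 6454) and flag big lighting
# changes so the vision pipeline can treat those frames with suspicion.
ARTNET_PORT = 6454
prev_levels = None

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", ARTNET_PORT))

while True:
    packet, addr = sock.recvfrom(1024)
    # ArtDmx packets start with "Art-Net\0" and carry OpCode 0x5000 (little-endian).
    if len(packet) < 18 or packet[:8] != b"Art-Net\x00":
        continue
    opcode = struct.unpack("<H", packet[8:10])[0]
    if opcode != 0x5000:
        continue

    length = struct.unpack(">H", packet[16:18])[0]   # DMX data length, big-endian
    levels = packet[18:18 + length]                  # one byte per channel, 0-255

    if prev_levels is not None and len(prev_levels) == len(levels):
        # Crude "scene change" metric: how many channels moved by more than 20.
        jumps = sum(1 for a, b in zip(prev_levels, levels) if abs(a - b) > 20)
        if jumps > 10:                               # arbitrary threshold to tune
            print("lighting change in progress - distrust this video frame")
    prev_levels = levels

How you then compensate - pausing the tracker, relaxing thresholds, or re-learning the background - is the interesting part, but at least the cue timing comes for free from the console.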


