For one of my modules at uni I'm attempting to make an algorithmic composition that can be controlled by your movements (using data from a webcam).
I'm still at the prototype stage, learning how the hell to use the Gem objects in Pure Data, and so far I've come up with this:
Which does this to the input of my webcam:
I still haven't got the pix_data object to work (arrrrrrggggg!), but here's what the patch does at the moment: it shows only movement (thanks to the pix_movement object), then the green and blue channels are removed (using colourRGB) to reduce noise and make the video input easier to analyze. Finally, the pix_blob object tracks the 'centre of gravity' of the image and gives me its X and Y coordinates, plus a size value I can use as a rough measure of distance.
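For anyone who'd rather see the idea outside of Pd: here's a rough NumPy sketch of the same three-step pipeline (movement-only mask, red channel only, intensity-weighted centre of gravity). This is just an illustration of the concept, not what Gem actually does internally, and the threshold value is my own assumption.

```python
import numpy as np

def motion_centroid(prev_frame, curr_frame, threshold=30):
    """Approximate the patch's pipeline on two RGB frames (H x W x 3 uint8):
    1. pix_movement -> keep only pixels that changed between frames
    2. colourRGB    -> drop the green and blue channels to cut noise
    3. pix_blob     -> centre of gravity (x, y) plus total 'mass' as size
    """
    # Step 1: frame difference, thresholded into a motion mask
    diff = np.abs(curr_frame.astype(int) - prev_frame.astype(int))
    motion = diff.max(axis=2) > threshold  # True where something moved

    # Step 2: keep only the red channel, and only where there was motion
    red = curr_frame[:, :, 0].astype(float) * motion

    # Step 3: intensity-weighted centre of gravity, like pix_blob's output
    total = red.sum()
    if total == 0:
        return None  # no motion detected this frame
    ys, xs = np.indices(red.shape)
    cx = (xs * red).sum() / total
    cy = (ys * red).sum() / total
    return cx, cy, total  # total acts as a crude blob-size / distance cue
```

Feed it consecutive webcam frames and the returned (x, y, size) triple is roughly the data stream the patch sends on for composition control.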
I may post updates on this project and eventually audio, maybe even the patch when I'm done.
Back to the metaphorical banging of my head against the wall that is Pure Data...