Realtime control of audio and video through physical motion:

STEIM's BigEye

Richard Povall
Associate Professor of Computer Music & New Media
Oberlin Conservatory of Music | TIMARA Department

Abstract: An overview of BigEye, developed at STEIM, Europe's leading research centre for the development of interactive performance instruments. BigEye analyses a video signal in realtime and maps the resulting data to MIDI. It thus becomes possible, for example, to follow an individual colour as it moves through a space and to control any MIDI-equipped software or device with that motion. Functions such as x, y, acceleration, size, speed, relative x and relative y are available through BigEye's own scripting language.
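To make the mapping concrete, the following is a minimal sketch of the same idea in Python rather than in BigEye's own scripting language (whose syntax is not reproduced here): a single colour is isolated in each video frame, the blob's normalised position, size, and speed are computed, and the values are sent out as MIDI continuous controllers. The libraries (OpenCV and mido), the HSV colour range, and the controller numbers are illustrative assumptions and are not part of BigEye.

    # Illustrative sketch only: colour tracking to MIDI, not BigEye's actual implementation.
    import cv2
    import mido

    LOWER = (100, 120, 80)     # assumed HSV range for the tracked colour (blue-ish)
    UPPER = (130, 255, 255)

    cap = cv2.VideoCapture(0)  # default camera
    port = mido.open_output()  # default MIDI output port

    prev_x = prev_y = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER, UPPER)      # keep only the chosen colour
        m = cv2.moments(mask)
        if m["m00"] > 0:                           # the colour is present in the frame
            h, w = frame.shape[:2]
            x = m["m10"] / m["m00"] / w            # centroid, normalised 0..1
            y = m["m01"] / m["m00"] / h
            size = m["m00"] / (h * w * 255)        # fraction of the frame covered
            # map position and size to MIDI continuous controllers 1-3 (arbitrary choice)
            port.send(mido.Message("control_change", control=1, value=int(x * 127)))
            port.send(mido.Message("control_change", control=2, value=int(y * 127)))
            port.send(mido.Message("control_change", control=3, value=min(127, int(size * 127))))
            if prev_x is not None:
                # speed = frame-to-frame displacement of the centroid, scaled into MIDI range
                speed = ((x - prev_x) ** 2 + (y - prev_y) ** 2) ** 0.5
                port.send(mido.Message("control_change", control=4, value=min(127, int(speed * 400))))
            prev_x, prev_y = x, y
        cv2.imshow("colour-tracking sketch", mask)
        if cv2.waitKey(1) == 27:                   # Esc quits
            break

    cap.release()
    cv2.destroyAllWindows()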

The paper also presents a contextual overview of other motion-sensing systems, including other video-based/software solutions, hardware sensing systems (such as that developed at the CNUCE/CNR in Pisa, Italy), and wireless body suit systems such as the MidiDancer, developed by Mark Coniglio of Troika Ranch in New York.

Keywords: multimedia, performance, programming, interactive composition, digital video

 

Contents: Introduction | Background | Tools: Systems and Software | A word about digital video | BigEye | Conclusions | Works Cited

Acknowledgments

Parts of this paper were first published in the Proceedings of the 1997 Art & Technology Symposium, Connecticut College, New London, Connecticut, USA.