My research concerns sound, music, and movement interaction, from modeling to applications (music, dance, pedagogy, gaming, rehabilitation).
Keywords: multimodal interfaces, gesture control of digital media, sound and music computing, gesture modeling and recognition, interactive machine learning, sensorimotor learning with auditory feedback.
Recent projects as coordinator
LEGOS project, Sensorimotor Learning in Gesture-Sound Interactive Systems, Ircam and Université Paris Descartes, funded by the ANR (French National Research Agency)
Urban Musical Game, presented at the Futur en Seine festival, 2011
Before 2004, I conducted research in biomedical optics for ten years, at EPFL (Switzerland) and at the Beckman Laser Institute, University of California, Irvine.