2023

Loriette, W. Liu, F. Bevilacqua, and B. Caramiaux, "Describing movement learning using metric learning", PLOS ONE, vol. 18, no. 2, p. e0272509, 2023.

2022

Liu, M. A. Magalhaes, W. E. Mackay, M. Beaudouin-Lafon, and F. Bevilacqua. Motor Variability in Complex Gesture Learning: Effects of Movement Sonification and Musical Background. ACM Trans. Appl. Percept., vol. 19, no. 1, Jan. 2022.

Ley-Flores, E. Alshami, A. Singh, F. Bevilacqua, N. Bianchi-Berthouze, O. Deroy, A. Tajadura-Jiménez. Effects of pitch and musical sounds on body-representations when moving with sound, Scientific reports, vol. 12, no. 1, pp. 1–20, 2022.

J. Françoise, G. Meseguer-Brocal, and F. Bevilacqua, “Movement Analysis and Decomposition with the Continuous Wavelet Transform”, in Proceedings of the 8th International Conference on Movement and Computing, in MOCO ’22. 2022.

Paredes, J. Françoise, and F. Bevilacqua, “Entangling Practice with Artistic and Educational Aims: Interviews on Technology-based Movement-Sound Interactions”, in NIME 2022, PubPub, 2022.

Caramiaux, A. Altavilla, J. Françoise, and F. Bevilacqua, “Gestural Sound Toolkit: Reflections on an Interactive Design Project”, in International Conference on New Interfaces for Musical Expression, PubPub, 2022.

Antoniadis, S. Paschalidou, A. Duval, J.-F. Jégo, and F. Bevilacqua, “Rendering embodied experience into multimodal data: concepts, tools and applications for Xenakis’ piano performance”, in Xenakis 22: Centenary International Symposium, 2022.

2021

Bevilacqua, B. Matuszewski, G. Paine, N. Schnell. On Designing, Composing and Performing Networked Collective Interactions. Organised Sound, Cambridge University Press (CUP), 2021, 26 (3), pp. 333-339.

Paine, F. Bevilacqua, B. Matuszewski. Editorial: Collective and networked sound practices. Organised Sound, Cambridge University Press (CUP), 2021, 26 (3), pp. 303-304.

Scurto, B. V. Kerrebroeck, B. Caramiaux, F. Bevilacqua. Designing deep reinforcement learning for human parameter exploration. ACM Transactions on Computer-Human Interaction (TOCHI), 28(1), 1-35. 2021

Ley-Flores, E. Alshami, A. Singh, F. Bevilacqua, N. Bianchi-Berthouze, O. Deroy, and A. Tajadura- Jiménez, “Bodies moving with sound: Effects of pitch and musical sounds on body-representations”. 2021.

Ley-Flores, L. Turmo Vidal, N. Berthouze, A. Singh, F. Bevilacqua, and A. Tajadura-Jiménez, "SoniBand: Understanding the effects of metaphorical movement sonifications on body perception and physical activity," in Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, pp. 1–16, 2021.

Liu, A. Dementyev, D. Schwarz, E. Fléty, W. E. Mackay, M. Beaudouin-Lafon, and F. Bevilacqua, "SonicHoop: Using interactive sonification to support aerial hoop practices," in Proceedings of the 2021 CHI Conference on Human Factors in Computing Systems, pp. 1–16, 2021.

Sanchez, B. Caramiaux, J. Françoise, F. Bevilacqua, and W. E. Mackay, "How do people train a machine? Strategies and (mis)understandings," Proceedings of the ACM on Human-Computer Interaction, vol. 5, no. CSCW1, pp. 1–26, 2021.

Scurto, B. Caramiaux, and F. Bevilacqua, “Prototyping machine learning through diffractive art practice,” in Designing Interactive Systems Conference 2021, pp. 2013–2025, 2021.

Walton, B. Caramiaux, S. F. Alaoui, F. Bevilacqua, and W. E. Mackay, "Reconciling technology-driven and experiential approaches for movement-based design," in IHM '20.21, 32e Conférence Francophone pour l'Interaction Homme-Machine, 2021.

Glowinski, C. Levacher, F. Buchheit, C. Malagoli, B. Matuszewski, S. Schaerlaeken, C. Noera, K. Edwards, C. Chiorri, F. Bevilacqua, and D. Grandjean, "Emotional, cognitive, and motor development in youth orchestras," in Together in Music: Coordination, expression, participation, p. 250, Oxford University Press, 2021.

2020

Caramiaux, J. Françoise, W. Liu, T. Sanchez, and F. Bevilacqua. Machine learning approaches for motor learning: A short review. Frontiers in Computer Science, 2:16, 2020.

O. Boyer, F. Bevilacqua, E. Guigon, S. Hanneton, and A. Roby-Brami. Modulation of ellipses drawing by sonification. Experimental Brain Research, pages 1–14, 2020.

Voillot, F. Bevilacqua, and J. Chevrier, "Le corps au coeur de l'apprentissage grâce au numérique: Proposition d'un nouveau paradigme pour l'éducation à la petite enfance," in Les dossiers de l'écran: Controverses, paniques morales et usages éducatifs des écrans (L. Tessier and A. S.-M., coord.), Editions du Croquant, 2020.

Schwarz, W. Liu, and F. Bevilacqua. A survey on the use of 2D touch interfaces for musical expression. In New Interfaces for Musical Expression (NIME), 2020.

2019

Lemouton, F. Bevilacqua, R. Borghesi, E. Fléty, S. Haapamäki. Following Orchestra Conductors: the IDEA Open Movement Dataset. MOCO '19: 6th International Conference on Movement and Computing, Oct 2019, Tempe, AZ, USA, pp. 1–6.

Scurto, W. Liu, B. Matuszewski, F. Bevilacqua, J.-L. Frechin, U. Petrevski, N. Schnell. Entrain: Encouraging Social Interaction in Collective Music Making. ACM SIGGRAPH Studio, Jul 2019, Los Angeles, United States.

Ley-Flores, F. Bevilacqua, N. Bianchi-Berthouze, and A. Tajadura-Jiménez. Altering body perception and emotion in physically inactive people through movement sonification. In 2019 8th International Conference on Affective Computing and Intelligent Interaction (ACII), IEEE, 2019, pp. 1-7.

Voillot, F. Bevilacqua, J. Chevrier, C. Eliot. Exploring Embodied Learning for Early Childhood Education. In Proceedings of the 18th ACM International Conference on Interaction Design and Children, 2019, pp. 747-750.

Guellaï, A. Callin, F. Bevilacqua, D. Schwarz, A. Pitti, S. Boucenna, M. Gratier. Sensus Communis: Some Perspectives on the Origins of Non-synchronous Cross-Sensory Associations. Frontiers in Psychology, 10:523, 2019.

Matuszewski, N. Schnell, F. Bevilacqua. Interaction Topologies in Mobile-Based Situated Networked Music Systems. Wireless Communications and Mobile Computing, 2019.

2018

Delle Monache, D. Rocchesso, F. Bevilacqua, G. Lemaitre, S. Baldan, and A. Cera, “Embodied sound design”, International Journal of Human-Computer Studies 118 (2018), 47–59.

Françoise, F. Bevilacqua, "Motion-Sound Mapping through Interaction: An Approach to User-Centered Design of Auditory Feedback Using Machine Learning," ACM Transactions on Interactive Intelligent Systems (TiiS) 8 (2), 16, 2018.

Caramiaux, F. Bevilacqua, M. M. Wanderley, and C. Palmer, "Dissociable effects of practice variability on learning motor and timing skills," PLOS ONE, vol. 13, no. 3, p. e0193580, 2018.

F. Bevilacqua, I. Peyre, M. Segalen, V. Marchand-Pauvert, P. Pradat-Diehl, and A. Roby-Brami, Exploring different movement sonification strategies for rehabilitation in clinical settings, Proceedings of the 5th International Conference on Movement and Computing, ACM, 2018, p. 42.

Bevilacqua, O. Houix, N. Misdariis, P. Susini, D. Schwarz. MIMES: outil de prototypage et d'exploration pour la création sonore. Abstract for the Workshop « IHM pour l'exploration et la créativité », Conférence francophone sur l'Interaction Homme-Machine, Oct 2018, Brest, France.

Matuszewski and F. Bevilacqua, Toward a web of audio things, Sound and Music Computing Conference, 2018.

Matuszewski, J. Larralde, and F. Bevilacqua, Designing movement driven audio applications using a web-based interactive machine learning toolkit, Web Audio Conference (WAC), 2018.

Roby-Brami, I. Peyre, M. Segalen, A. Lackmy-Vallee, V. Marchand-Pauvert, P. Pradat-Diehl, F. Bevilacqua, Upper limb rehabilitation with movement-sound coupling after brain lesions, Annals of Physical and Rehabilitation Medicine 61 (2018), e488.

Scurto and F. Bevilacqua, Appropriating music computing practices through human-AI collaboration, Journées d'Informatique Musicale (JIM 2018), 2018. <hal-01791504v1>

Scurto, F. Bevilacqua, and B. Caramiaux, Perceiving agent collaborative sonic exploration in interactive reinforcement learning, Proceedings of the 15th Sound and Music Computing Conference (SMC 2018), 2018.

Tajadura-Jiménez, F. Cuadrado, P. Rick, N. Bianchi-Berthouze, A. Singh, A. Väljamäe, and F. Bevilacqua, Designing a gesture-sound wearable system to motivate physical activity by altering body perception, Proceedings of the 5th International Conference on Movement and Computing, ACM, 2018, p. 46.

2017

O. Boyer, A. Portron, F. Bevilacqua, and J. Lorenceau, “Continuous auditory feedback of eye movements: An exploratory study toward improving oculomotor control,” Frontiers in Neuroscience, vol. 11, 2017.

Lemaitre, H. Scurto, J. Françoise, F. Bevilacqua, O. Houix, and P. Susini, "Rising tones and rustling noises: Metaphors in gestural depictions of sounds," PLOS ONE, vol. 12, pp. 1–30, July 2017.

Boyer, F. Bevilacqua, P. Susini, and S. Hanneton, “Investigating three types of continuous auditory feedback in visuo-manual tracking,” Experimental Brain Research, vol. 235, pp. 235–691, 2017.

Fdili Alaoui, J. Françoise, T. Schiphorst, K. Studd, and F. Bevilacqua, "Seeing, sensing and recognizing Laban movement qualities," in Proceedings of the 2017 CHI Conference on Human Factors in Computing Systems, CHI '17, (New York, NY, USA), pp. 4009–4020, ACM, 2017.

Bevilacqua, J. Larralde, and B. Matuszewski, “Prototypes for exploring gestural interaction using smartphones,” in Adjunct proceedings of the Interact’17 Conference, 2017.

Dubos, F. Bevilacqua, J. Larralde, J. Chevrier, and J.-F. Jégo, “Designing gestures for interactive systems: Towards multicultural perspectives,” in Proceedings of the Interact’17 Conference (extended abstract), 2017.

Caramiaux, F. Bevilacqua, C. Palmer, and M. Wanderley, “Individuality in piano performance depends on skill learning,” in Proceedings of the 4th International Conference on Movement Computing, p. 14, ACM, 2017.

Scurto, F. Bevilacqua, and J. Françoise, "Shaping and exploring interactive motion-sound mappings using online clustering techniques," in International Conference on New Interfaces for Musical Expression (NIME'17), 2017.

Schwarz, G. Lorieux, E. Lizère, A. Tarrès, and F. Bevilacqua, “A topo-phonic table for tangible sonic interaction,” in International Computer Music Conference (ICMC), 2017.

Bevilacqua, N. Schnell, J. Françoise, E. O. Boyer, D. Schwarz, B. Caramiaux, Designing action-sound metaphors using motion sensing and descriptor-based synthesis of recorded sound materials, Routledge Companion to Embodied Music Interaction, pp. 391-401, 2017.

Caramiaux, J. Françoise, F. Bevilacqua, Dynamic Bayesian networks for musical interaction, Routledge Companion to Embodied Music Interaction, 2017.

Tajadura-Jiménez, A. Väljamäe, F. Bevilacqua, N. Bianchi-Berthouze, Principles for Designing Body-Centered Auditory Feedback, ch. 18, pp. 371–403. Wiley-Blackwell, 2017.

2016

Bevilacqua, E. O. Boyer, J. Françoise, O. Houix, P. Susini, A. Roby-Brami, S. Hanneton, Sensori-motor Learning With Movement Sonification: A Perspective From Recent Interdisciplinary Studies, Frontiers in Neuroscience, section Neuroprosthetics, vol. 10, article 385, 2016.

Bevilacqua, N. Schnell, From Musical Interfaces To Musical Interactions, in Human Computer Confluence: Transforming Human Experience Through Symbiotic Technologies, Eds. A. Gaggioli, A. Ferscha, G. Riva, S. Dunne, I. Viaud-Delmon, Published by De Gruyter Open Ltd, Warsaw/Berlin, 124-139, 2016

Bevilacqua, B. Caramiaux, and J. Françoise, Perspectives on real-time computation of movement coarticulation, in Proceedings of the 3rd International Symposium on Movement and Computing, MOCO ’16, 35:1–35:5, ACM, 2016.

Françoise, F. Bevilacqua, and T. Schiphorst, Gaussbox: Prototyping movement interaction with interactive visualizations of machine learning, in Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, CHI EA ’16, 3667–3670, ACM, 2016.

Françoise, O. Chapuis, S. Hanneton, and F. Bevilacqua, Soundguides: Adapting continuous auditory feedback to users, in Proceedings of the 2016 CHI Conference Extended Abstracts on Human Factors in Computing Systems, CHI EA '16, 2829–2836, ACM, 2016.

Manitsaris, A. Tsagaris, A. Glushkova, F. Moutarde, and F. Bevilacqua, Fingers gestures early-recognition with a unified framework for RGB or depth camera, in Proceedings of the 3rd International Symposium on Movement and Computing, MOCO '16, 26:1–26:8, ACM, 2016.

Houix, S. Delle Monache, H. Lachambre, F. Bevilacqua, D. Rocchesso, and G. Lemaitre, "Innovative Tools for Sound Sketching Combining Vocalizations and Gesture," in Proceedings of the Audio Mostly Conference 2016, (Norrköping, Sweden), pp. 12–19, 2016.

2015

Tajadura-Jiménez, N. Bianchi-Berthouze, E. Furfaro, and F. Bevilacqua, Sonification of Surface Tapping Changes Behavior, Surface Perception, and Emotion, IEEE MultiMedia, vol. 22, pp. 48–57, 2015.

F. Alaoui, F. Bevilacqua, and C. Jacquemin, Interactive visuals as metaphors for dance movement qualities, ACM Transactions on Interactive Intelligent Systems (TiiS), vol. 5, no. 3, 2015.

2014

Schnell and F. Bevilacqua, Engaging with Recorded Sound Materials through Metaphorical Actions, Contemporary Music Review, 38(3), 34–48, 2014

Caramiaux, J. Françoise, N. Schnell, F. Bevilacqua, Mapping Through Listening, Computer Music Journal, 38(3), 34–48, 2014

B. Caramiaux, F. Bevilacqua, T. Bianco, N. Schnell, O. Houix, P. Susini, The Role of Sound Source Perception in Gestural Sound Description, ACM Transactions on Applied Perception, vol. 11, no. 1, 2014.

B. Zamborlin, F. Bevilacqua, M. Gillies, and M. d’Inverno, Fluid gesture interaction design: applications of continuous recognition for the design of modern gestural interfaces, ACM Transactions on Interactive Intelligent Systems, 3(4), pp. 30-45, 2014

J. Françoise, N. Schnell, and F. Bevilacqua. MaD: Mapping by Demonstration for Continuous Sonification. In ACM SIGGRAPH 2014 Emerging Technologies, SIGGRAPH '14, pages 16:1–16:1, Vancouver, BC, Canada, 2014.

J. Françoise, N. Schnell, R. Borghesi, and F. Bevilacqua, Probabilistic Models for Designing Motion and Sound Relationships. In Proceedings of the 2014 International Conference on New Interfaces for Musical Expression (NIME’14), 2014.

Françoise, S. Fdili Alaoui, T. Schiphorst, F. Bevilacqua, Vocalizing dance movement for interactive sonification of Laban effort factors, Proceedings of the 2014 Conference on Designing Interactive Systems, 2014.

Boyer, Q. Pyanet, S. Hanneton, F. Bevilacqua, Learning Movement Kinematics with a Targeted Sound, Lecture Notes in Computer Science, Vol. 8905, 218-233, 2014.

Houix, N. Misdariis, P. Susini, F. Bevilacqua, F. Gutierrez, Sonically Augmented Artifacts: Design Methodology Through Participatory Workshops,  Lecture Notes in Computer Science, Vol 8905, 20-4, 2014

2013

E. O. Boyer, B. M. Babayan, F. Bevilacqua, M. Noisternig, O. Warusfel, A. Roby-Brami, S. Hanneton, and I. Viaud-Delmon, From ear to hand: the role of the auditory-motor loop in pointing to an auditory source, Frontiers in Computational Neuroscience, vol. 7, iss. 26, 2013

J. Françoise, N. Schnell, F. Bevilacqua, A Multimodal Probabilistic Model for Gesture-based Control of Sound Synthesis, Proceedings of the 21st ACM international conference on Multimedia (MM’13), Barcelona, 2013

J. Françoise, N. Schnell, F. Bevilacqua , Gesture-based Control of Physical Modeling Sound Synthesis: a Mapping-by-Demonstration Approach, Proceedings of the 21st ACM international conference on Multimedia (MM’13), Barcelona, 2013

F. Bevilacqua, N. Schnell, N. Rasamimanana, J. Bloit, E. Fléty, B. Caramiaux, J. Françoise, E. Boyer, De-Mo: designing action-sound relationships with the MO interfaces, CHI'13 EA on Human Factors in Computing Systems, 2907-2910, 2013.

F. Bevilacqua, S. Fels, A. R. Jensenius, M. J. Lyons, N. Schnell, A. Tanaka, SIG NIME: music, technology, and human-computer interaction, CHI'13 EA on Human Factors in Computing Systems, 2529-2532, 2013.

B. Caramiaux, F. Bevilacqua, A. Tanaka, Beyond recognition: using gesture variation for continuous interaction, CHI'13 EA on Human Factors in Computing Systems, 2109-2118, 2013.

S. Fdili Alaoui, C. Jacquemin, F. Bevilacqua, Chiseling bodies: an augmented dance performance, CHI'13 EA on Human Factors in Computing Systems, 2915-2918, 2013.

D. Fober, S. Letz, Y. Orlarey, F. Bevilacqua, Programming Interactive Music Scores with INScore. In Proceedings of the Sound and Music Computing Conference 2013, pp. 185-190.

E. Furfaro, N. Berthouze, F. Bevilacqua, A. Tajadura-Jiménez, Sonification of surface tapping: Influences on behavior, emotion and surface perception. In Interactive Sonification Workshop (ISon 2013), 2013.

J. Françoise, I. Lallemand, T. Artières, F. Bevilacqua, N. Schnell, D. Schwarz, « Perspectives pour l'apprentissage interactif du couplage geste-son », Actes des Journées d'Informatique Musicale (JIM 2013), Paris, 2013.

2012

F. Bevilacqua, F. Baschet, S. Lemouton, The Augmented String Quartet: Experiments and Gesture Following, Journal of New Music Research 41 (1), 103-119, 2012.

S. Fdili Alaoui, F. Bevilacqua, B. Bermudez, C. Jacquemin, Dance Interaction with physical model visualization based on movement qualities, International Journal of Arts and Technologies, vol. 4, n° 6, 2013

B. Caramiaux, M. M. Wanderley, F. Bevilacqua, Segmenting and Parsing Instrumentalists' Gestures, Journal of New Music Research 41 (1), 13-29, 2012.

T. Bianco, V. Freour, I. Cossette, F. Bevilacqua, R. Caussé, Measures of Facial Muscle Activation, Intra-oral Pressure and Mouthpiece Force in Trumpet Playing, Journal of New Music Research 41 (1), 49-65, 2012.

N. Rasamimanana, F. Bevilacqua, J. Bloit, N. Schnell, E. Fléty, A. Cera, U. Petrevski, J.-L. Frechin, The urban musical game: using sport balls as musical interfaces, CHI 2012, extended abstracts, pp 1027-1030, 2012

S. Fdili Alaoui, B. Caramiaux, M. Serrano, F. Bevilacqua, Movement Qualities as Interaction Modality, Designing Interactive Systems (DIS’2012), Newcastle, 2012

J. Françoise, B. Caramiaux, F. Bevilacqua, A Hierarchical Approach for the Design of Gesture-to-Sound Mappings, Conference on Sound and Music Computing, Copenhague, 2012

M. Kimura, N. Rasamimanana, F. Bevilacqua, N. Schnell, B. Zamborlin, E. Fléty, Extracting Human Expression For Interactive Composition with the Augmented Violin, International Conference on New Interfaces for Musical Expression (NIME 2012), 2012

I. Viaud-Delmon, J. Mason, N. Karim, F. Bevilacqua, O. Warusfel, A Sounding Body in a Sounding Space: the Building of Space in Choreography – Focus on Auditory-motor Interactions, Dance Research 29, 431-447, 2012.

2011

Bevilacqua, F., Schnell, N., and Alaoui, S. F. (2011). Gesture capture: Paradigms in interactive music/dance systems. In Klein, G. and Noeth, S., editors, Emerging Bodies, pages 183–193. transcript Verlag.

F. Bevilacqua, N. Schnell, N. Rasamimanana, B. Zamborlin, F. Guedy, Online Gesture Analysis and Control of Audio Processing, Musical Robots and Interactive Multimodal Systems, Springer Tracts in Advanced Robotics, Volume 74, Springer Verlag, 2011.

N. Rasamimanana, F. Bevilacqua, N. Schnell, F. Guedy, C. Maestracci, B. Zamborlin, J.-L. Frechin, U. Petrevski, « Modular Musical Objects Towards Embodied Control Of Digital Music », Tangible Embedded and Embodied Interaction, 2011.

N. Schnell, F. Bevilacqua, F. Guédy, N. Rasamimanana, Playing and Replaying – Sound, Gesture and Music Analysis and Re-Synthesis for the Interactive Control and Re-Embodiment of Recorded Music, in H. von Loesch and S. Weinzierl (Eds.), Gemessene Interpretation – Computergestützte Aufführungsanalyse im Kreuzverhör der Disziplinen, Klang und Begriff, Volume 4, Schott Verlag, Mainz, 2011.

N. Schnell, F. Bevilacqua, N. Rasamimanana, J. Bloit, F. Guedy, E. Fléty, Playing the MO – Gestural Control and Re-Embodiment of Recorded Sound and Music, International Conference on New Interfaces for Musical Expression (NIME), Oslo, June 2011.

B. Caramiaux, F. Bevilacqua, N. Schnell, Sound Selection by Gestures, International Conference on New Interfaces for Musical Expression (NIME), Oslo, June 2011.

B. Caramiaux, P. Susini, T. Bianco, F. Bevilacqua, O. Houix, N. Schnell, N. Misdariis, Gestural Embodiment of Environmental Sounds: an Experimental Study, International Conference on New Interfaces for Musical Expression (NIME), Oslo, June 2011.

B. Bermudez, S. DeLahunta, M. Hoogenboom, C. Ziegler, F. Bevilacqua, S. Fdili Alaoui, « The Double Skin/Double Mind Interactive Installation », The Journal for Artistic Research (JAR), January 2011.

2010

Bevilacqua, F., Zamborlin, B., Sypniewski, A., Schnell, N., Guédy, F., Rasamimanana, N. « Continuous realtime gesture following and recognition », In Embodied Communication and Human-Computer Interaction, volume 5934 of Lecture Notes in Computer Science, pages 73–84. Springer Berlin / Heidelberg, 2010.

Bloit, J., Rasamimanana, N., Bevilacqua, F., Modeling and segmentation of audio descriptor profiles with Segmental Models, Pattern Recognition Letters 31 (12), 2010.

Bianco, T., Fréour, V., Rasamimanana, N., Maniatakos, F., Bevilacqua, F., Causse, R. ,« Parametric modeling of gestural co-articulation effects in music performance », Lecture Notes in Computer Science (LNCS) volume 5934, Springer Verlag, 2010.

B. Caramiaux, F. Bevilacqua, N. Schnell « Towards a Gesture-Sound Cross-Modal Analysis », Lecture Notes in Computer Science (LNCS) volume 5934, Springer Verlag, 2010.

Caramiaux, B., Bevilacqua, F., Schnell, N. Analysing Gesture and Sound Similarities with a HMM-based Divergence Measure. Proceedings of the Sound and Music Computing Conference (SMC 2010), Barcelona, Spain, 2010.

T. Bianco, M. Wanderley, F. Bevilacqua, Mimicry of tone production: results from a pilot experiment, Sound and Music Computing Conference (SMC 2010), Barcelona, Spain, 2010.

G. Leslie, D. Schwarz, O. Warusfel, F. Bevilacqua, B. Zamborlin, P. Jodlowski, N. Schnell, « Grainstick: A Collaborative, Interactive Sound Installation », International Computer Music Conference (ICMC), New York, 2010.

Bevilacqua F., Zamborlin B., Rasamimanana N., Schnell N., Contrôle gestuel continu de médias sonores et visuels, Forum on tactile and gestural interaction, Lille, 2010.

2009

N. Rasamimanana, F. Kaiser, F. Bevilacqua « Perspectives on gesture-sound relationships informed from acoustic instrument studies », Organised Sound, vol. 14, n° 2, 2009

S. Dahl, F. Bevilacqua, N. Rasamimanana, R. Bresin, M. Clayton, L. Leante, «Gesture in Performance», Musical Gestures: Sound, Movement, and Meaning (R. I. Godøy and M. Leman, editors), Routledge, 2009.

Bloit, J., Rasamimanana, N., Bevilacqua, F., « Towards morphological sound description using segmental models », International Conference on Digital Audio Effects (DAFx), Milan, Italy, 2009.

N. Rasamimanana, D. Bernardin, M. Wanderley, and F. Bevilacqua. String bowing gestures at varying bow stroke frequencies: A case study. In Gesture in Human-Computer Interaction and Simulation, Lecture Notes In Artificial Intelligence, Vol. 5085, p216-226, Springer-Verlag, Berlin, Heidelberg, 2009.

G. Leslie, D. Schwarz, O. Warusfel, F. Bevilacqua, P. Jodlowski, « Wavefield synthesis for interactive sound installations », 127th AES Convention, New York, 2009.

2008

N. Rasamimanana and F. Bevilacqua. Effort-based analysis of bowing movements: evidence of anticipation effects, Journal of New Music Research, vol. 37, n° 4, 2008.

N. Rasamimanana, F. Guedy, N. Schnell, J.-P. Lambert, F. Bevilacqua. Three pedagogical scenarios using the Sound and Gesture Lab, in Proceedings of the 4th i-Maestro Workshop on Technology-Enhanced Music Education, Genova, Italy, pp. 5-10, 2008.

Koerselman T., Larkin O., Ong B., Leroy N., Lambert J.-P., Schwarz D., Guédy F., Schnell N., Bevilacqua F., Ng K., "SDIF Integration in i-Maestro Gesture Tools". 4th i-Maestro Workshop on Technology-Enhanced Music Education, June 2008, pp. 15-20.

2007

F. Bevilacqua, Momentary Notes on Capturing Gestures, in Capturing Intentions, Editors EG/PC and the Amsterdam School of the Arts, 2007.

S. deLahunta and F. Bevilacqua, "Sharing Descriptions of Movement", International Journal of Performance and Digital Media, Vol. 3, No. 1, 2007.

F. Bevilacqua, F. Guédy, N. Schnell, E. Fléty, N. Leroy, « Wireless sensor interface and gesture-follower for music pedagogy », Proc. of the International Conference on New Interfaces for Musical Expression (NIME 07), pp. 124-129, 2007.

Guédy, F., Leroy, N., Bevilacqua, F., Grosshauser, T., Schnell, N., « Pedagogical experimentation using the Sound and Gesture Lab », 3rd i-Maestro Workshop, Barcelona, 2007.

Ng K., Larkin O., Koerselman T., Ong B., Schwarz D., Bevilacqua F., “The 3D Augmented Mirror: Motion Analysis for String Practice Training”. International Computer Music Conference (ICMC). Copenhagen, 2007, vol. II, p. 53-56

Viaud-Delmon I., Bresson J., Pachet F., Bevilacqua F., Roy P., Warusfel O., Eartoy : interactions ludiques par l'audition. Journées d'Informatique Musicale (JIM'07), Lyon, 2007.

2006

F. Bevilacqua, N. Rasamimanana, N. Schnell, « Interfaces gestuelles, captation du mouvement et création artistique », L'inouï #2, Revue de l'Ircam (Editions Léo Scheer, Berlin), 2006.

N. Rasamimanana, E. Fléty, F. Bevilacqua « Gesture Analysis of Violin Bow Strokes », Gesture in Human-Computer Interaction and Simulation, Lecture Notes in Computer Science, Vol. 3881, 145-155, 2006

F. Bevilacqua, N. Rasamimanana, E. Fléty, S. Lemouton, F. Baschet « The augmented violin project: research, composition and performance report », 6th International Conference on New Interfaces for Musical Expression (NIME 06), Paris, 2006

N. Leroy, E. Fléty, F. Bevilacqua « Reflective Optical Pickup For Violin », 6th International Conference on New Interfaces for Musical Expression, 2006

E. Schoonderwaldt, N. Rasamimanana, F. Bevilacqua « Combining accelerometer and video camera: Reconstruction of bow velocity profiles », 6th International Conference on New Interfaces for Musical Expression, 2006

Schnell N., Bevilacqua F., Schwarz D., Rasamimanana N., Guédy F., Technology and Paradigms to Support the Learning of Music Performance. 2nd i-Maestro Workshop on Technology Enhanced Music Education, Leeds, 2006.

2005

F. Bevilacqua, R. Muller, N. Schnell « MnM: a Max/MSP mapping toolbox », New Interfaces for Musical Expression, Vancouver, 2005

F. Bevilacqua, R. Muller « A Gesture follower for performing arts », Gesture Workshop, 2005

N. Schnell, R. Borghesi, D. Schwarz, F. Bevilacqua, R. Müller «FTM — Complex data structures for Max», International Computer Music Conference (ICMC), Barcelona, 2005

2001-2004

F. Bevilacqua, E. Fléty « Captation et analyse du mouvement pour l’interaction entre danse et musique », Rencontres Musicales Pluridisciplinaires – le corps & la musique, Lyon, 2004

E. Fléty, N. Leroy, J. Ravarini, F. Bevilacqua « Versatile sensor acquisition system utilizing Network Technology », International Conference on New Interfaces for Musical Expression (NIME), Hamamatsu, 2004

C. Dobrian and F. Bevilacqua, “Gestural Control of Music Using the Vicon 8 Motion Capture System”, Proceedings of the New Interfaces for Musical Expression 2003 Conference, Montréal, Quebec, Canada.

F. Bevilacqua, J. Ridenour, and D. J. Cuccia, "Mapping Music to Gesture: A study using 3D motion capture data", Proceedings of the Workshop/Symposium on Sensing and Input for Media-centric Systems, Santa Barbara, CA, 2002.

F. Bevilacqua, L. Naugle, and I. Valverde, "Virtual dance and music environment using motion capture", Proceedings of the IEEE Multimedia Technology and Applications Conference, Irvine, CA, 2001.