Vilelmini Kala

May 2021

Workshop: From Data to Artistic Creations

Presenters: Vilelmini Kalampratsidou
                          Katerina El Raheb
                          Marina Stergiou
                          Pandelis Diamantides
Presented in: https://moco21.movementcomputing.org/ (also known as SloMOCO)
Presentation Date: May 14th, 2021
Publication Link: https://www.slomoco.surf/projects/from-data-to-artistic-creations

This workshop aims at practical solutions to the requirements around using biometric data in art projects, focusing on ongoing research. The workshop is guided by the concept of the transdisciplinary project Transition to 8: bridging social issues, tech and contemporary art. Transition to 8 is a collaboration between computer researchers, sound producers, artists, and psychologists. Participants will share their insights on creating works of art and/or musical compositions that use physiological data (heart rate, breathing, galvanic skin response) to create audiovisual artworks. What is the role of bodily reactions in creating and inspiring sound and visual art? We would like to set the stage for creative practitioners to discuss their needs, their desires, and what inspires them during the process of artistic research.

This workshop is held in the framework of the project Transition to 8: Bridging social issues, tech and contemporary art - Co-financed by Greece and the European Union.

Instagram: @transitionto8, Project page: transitionto8.com

February - June 2021

SPECIAL ISSUE: WEARABLES FOR TRANSDISCIPLINARY MOVEMENT AND COMPUTING

Guest Editors: Antonia Zaferiou, Ph.D., Assistant Professor, Stevens Institute of Technology, Hoboken, NJ, USA
                                Vilelmini Kalampratsidou, Ph.D., Adjunct Researcher, Athena Research & Innovation Center, Athens, Greece
                                Gregory Corness, Ph.D., Associate Professor, Columbia College Chicago, Chicago, IL, USA
                                Frédéric Bevilacqua, Ph.D., Head Researcher, Institut de Recherche et Coordination Acoustique/Musique (IRCAM), Paris, France
Journal: Wearable Technologies, Cambridge University Press
Link: https://www.cambridge.org/core/journals/wearable-technologies/information/call-for-papers/special-issue-wearables-for-transdisciplinary-movement-and-computing

Wearable technologies play an important role in innovations in fields intersecting movement and computing, fuelled by research across many disciplines. This special section of Wearable Technologies will gather research on wearable technology for accessing movement data, used in analysis or embodied interface design, targeting applications in the performing arts, entertainment, sports, education, and human-computer interaction.

The International Conference on Movement and Computing (MOCO) and its community develops computational technology to support and understand human movement practice (e.g., computational analysis). The MOCO community also welcomes the study of movement as a means of interacting with computers (e.g., human-computer and human-machine interfaces). This requires an interdisciplinary understanding of movement that ranges from biomechanics to embodied cognition and the phenomenology of bodily experience.

As an extension of MOCO, this special collection bridges the two creative communities of art and science and bolsters the innovative nature of MOCO’s transdisciplinary community of academics and practitioners. Novel technology, methodologies, and perspectives described in this special collection will provide cross-cutting opportunities in broader wearable technology practices.

This special section focuses on research in wearable technologies for transdisciplinary movement and computing research and practices.

We welcome original contributions, along with review papers, as well as revisions of proceedings of the International Conference on Movement and Computing (MOCO). We require submissions based on previous MOCO proceedings to be extended or revised significantly (e.g., adding depth to the description of the methodology, or applicability to wearables), leading to at least 30% new scientific material or results. Authors are highly encouraged to reframe their paper for the wide, transdisciplinary Wearable Technologies readership. Submissions must confirm in the cover letter that this requirement is met.

May 2021

REAL-TIME PROXY-CONTROL OF RE-PARAMETERIZED PERIPHERAL SIGNALS USING A CLOSED-LOOP INTERFACE

Authors: Vilelmini Kalampratsidou, Steven Kemper, Elizabeth Torres
Published in: J. Vis. Exp.
Link: https://www.jove.com/t/61943

The fields that develop methods for sensory substitution and sensory augmentation have aimed to control external goals using signals from the central nervous system (CNS). Less frequent, however, are protocols that update external signals self-generated by interactive bodies in motion. There is a paucity of methods that combine the body-heart-brain biorhythms of one moving agent to steer those of another moving agent during dyadic exchange. Part of the challenge in accomplishing such a feat has been the complexity of a setup using multimodal bio-signals with different physical units, disparate time scales, and variable sampling frequencies.

In recent years, the advent of wearable biosensors that can non-invasively harness multiple signals in tandem has opened the possibility of re-parameterizing and updating the peripheral signals of interacting dyads, in addition to improving brain- and/or body-machine interfaces. Here we present a co-adaptive interface that updates efferent somatic-motor output (including kinematics and heart rate) using biosensors; parameterizes the stochastic bio-signals, sonifies this output, and feeds it back in re-parameterized form as visuo/audio-kinesthetic reafferent input. We illustrate the methods using two types of interactions, one involving two humans and another involving a human and their avatar interacting in near real time. We discuss the new methods in the context of possible new ways to measure the influence of external input on internal somatic-sensory-motor control.
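The re-parameterization step described above can be sketched as follows: rescale a bio-signal with arbitrary physical units into a unitless range, then map that range onto an audio parameter. This is a minimal illustration under stated assumptions, not the published pipeline; the frequency band and the heart-rate values are hypothetical.

```python
import numpy as np

def reparameterize(signal):
    """Rescale a bio-signal with arbitrary physical units into a
    unitless [0, 1] range, so streams with different units (heart
    rate, kinematics, etc.) can be combined on one scale."""
    s = np.asarray(signal, dtype=float)
    span = s.max() - s.min()
    if span == 0:
        return np.zeros_like(s)
    return (s - s.min()) / span

def map_to_pitch(normalized, low_hz=220.0, high_hz=880.0):
    """Sonify a normalized signal by mapping it linearly onto a
    frequency band (here: A3 to A5, an arbitrary choice)."""
    return low_hz + np.asarray(normalized) * (high_hz - low_hz)

# Hypothetical heart-rate trace in beats per minute
heart_rate_bpm = np.array([72.0, 75.0, 71.0, 80.0, 78.0])
norm = reparameterize(heart_rate_bpm)     # unitless, 0..1
pitches = map_to_pitch(norm)              # Hz, ready for a synth
```

In a closed-loop setting this mapping would run on each incoming sample, with the resulting pitch fed back to the performer as reafferent audio input.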

July 2020

SONIFICATION OF HEART RATE VARIABILITY CAN ENTRAIN BODIES IN MOTION

Authors: Vilelmini Kalampratsidou, Elizabeth Torres
Presented in: 7th International Conference on Movement and Computing (MOCO'20)
DOI: https://doi.org/10.1145/3401956.3404186

In this work, we introduce a co-adaptive closed-loop interface driven by audio augmented with a parameterization of the dancer's heart rate in near real time. In our setup, two salsa dancers perform their routine dance (previously choreographed and well rehearsed) and a spontaneously improvised piece led by the male dancer. They first dance their pieces while listening to the original version of the song (baseline condition). Then we ask them to dance while listening to the music as altered by the heart rate extracted from the female dancer in near real time. Salsa dancing is always led by the male; as such, the dancers' challenge is to adapt their movements, as a dyad, to the real-time change induced by the female's heart activity.

Our work offers a new co-adaptive setup for dancers, along with new data types and analytical methods to study two forms of dance: well-rehearsed choreography and improvisation. We show that the small variations in heart activity, despite the robustness of autonomic function, can distinguish well between these two modes of dance.
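One simple way to quantify such variations is to derive inter-beat (RR) intervals from R-peak times and compare their dispersion across conditions. The sketch below is a minimal stand-in for the paper's analysis; the R-peak timestamps are invented for illustration.

```python
import numpy as np

def rr_intervals(r_peak_times):
    """Inter-beat (RR) intervals, in seconds, from R-peak timestamps."""
    return np.diff(np.asarray(r_peak_times, dtype=float))

def hrv_signature(rr):
    """A minimal variability signature: mean interval and its
    coefficient of variation (dispersion relative to the mean)."""
    rr = np.asarray(rr, dtype=float)
    mean = rr.mean()
    return mean, rr.std() / mean

# Hypothetical R-peak times (seconds) for the two dance conditions
rehearsed  = [0.0, 0.80, 1.61, 2.41, 3.22]   # steady, well-trained rhythm
improvised = [0.0, 0.70, 1.55, 2.20, 3.15]   # more variable beat-to-beat

mean_r, cv_r = hrv_signature(rr_intervals(rehearsed))
mean_i, cv_i = hrv_signature(rr_intervals(improvised))
```

A higher coefficient of variation in the improvised condition would mirror the kind of small but systematic heart-activity differences the study reports.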

July 2020

7th INTERNATIONAL CONFERENCE ON MOVEMENT AND COMPUTING

Scientific Community: Movement and Computing
Date: July 15–17, 2020
Location: Virtual conference presented from New Jersey, USA
Organizing Committee: Antonia Zaferiou, Stevens Institute of Technology
                                                  Vilelmini Kalampratsidou, Rutgers Center for Cognitive Science, Rutgers University
                                                  Carla Caballero Sánchez, Miguel Hernandez University of Elche
                                                  Steven Kemper, Mason Gross School of the Arts, Rutgers University
                                                  Sara Pixley, Rutgers Center for Cognitive Science, Rutgers University
Link: https://moco20.movementcomputing.org/

The international conference on Movement and Computing (MOCO) aims to gather academics and practitioners interested in the computational study, modeling, representation, segmentation, recognition, classification, or generation of movement information. MOCO is positioned within emerging interdisciplinary domains between art & science.

October 2019

BODILY SIGNAL ENTRAINMENT IN THE PRESENCE OF MUSIC

Authors: Vilelmini Kala, Elizabeth Torres
Presented in: 6th International Conference on Movement and Computing (MOCO'19)
Location: Tempe, Arizona, USA
DOI: https://doi.org/10.1145/3347122.3347125

As a user listens to music, their bodily biorhythms can entrain with the music's rhythms. This work describes a human-computer interface used to characterize the evolution of the stochastic signatures of physiological rhythms across the central and the peripheral nervous systems in the presence (or absence) of music. We track heart, EEG, and kinematic variability under different music-driven conditions to identify the parameter manifold and context with maximal signal-to-noise ratio, as well as to identify regions of maximal and minimal statistical co-dependence of present events on past events.
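A toy version of the signal-to-noise comparison can be sketched by taking the mean level of a physiological series over the spread of its moment-to-moment fluctuations. This is a simplified stand-in for the stochastic-signature analysis described above; the heart-rate traces are hypothetical.

```python
import numpy as np

def snr(series):
    """Simplified signal-to-noise ratio: mean level of the series
    divided by the standard deviation of its moment-to-moment
    fluctuations (first differences)."""
    s = np.asarray(series, dtype=float)
    return abs(s.mean()) / np.diff(s).std()

# Hypothetical heart-rate traces (bpm) with and without music
with_music    = [70, 71, 70, 72, 71, 70]   # entrained, low fluctuation
without_music = [70, 75, 66, 74, 65, 73]   # noisier beat-to-beat level

snr_music = snr(with_music)
snr_silence = snr(without_music)
```

Comparing such ratios across conditions is one way to locate the context with maximal signal-to-noise ratio, as the abstract describes.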

Spring 2019

DANCE FROM THE HEART

Research Team: Postdoc Vilelmini Kala, Associate Professor Elizabeth Torres
Research Lab: Sensory-Motor Integration Lab, Center for Cognitive Science, Rutgers University, New Brunswick, NJ
Residency: Mana Contemporary, Hoboken, NJ
Computer Composition: Doctoral student Michael Zavorskas and Assistant Professor Steven Kemper, Music Department, Mason Gross School of the Arts
Choreography: Joseph Albano, Albano Ballet Company of America
Dancer: Vilelmini Kala
Related publication: https://www.academia.edu/40076651/Dance_from_the_heart_A_dance_performance_of_sounds_led_by_the_dancers_heart

This project is about sonifying the little fluctuations that exist in our body motions even when we are seemingly at rest. Here we present an example of sonifying heart activity. We utilize the ECG signal, collected by wearable biosensors on the dancing bodies, to extract musical features and characterize the heart activity, and from these we create music. In this work, we transform a dancer's body into a musical instrument. We aim to create a new type of performance, whereby people can experience an audiovisual form of dance. This form of dance would expand the audience's perception beyond the visual into the auditory, via the sounds that the body in motion can produce.
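One possible way to turn ECG-derived beats into musical material is to map each inter-beat interval onto a pitch: faster heart, higher note. The mapping below is a hypothetical sketch for illustration; the band limits, note range, and RR values are assumptions, not the performance's actual mapping.

```python
def bpm_from_rr(rr_seconds):
    """Instantaneous heart rate, in beats per minute, from one
    inter-beat (RR) interval."""
    return 60.0 / rr_seconds

def bpm_to_midi_note(bpm, low_bpm=50.0, high_bpm=120.0,
                     low_note=48, high_note=72):
    """Linearly map a heart rate onto a two-octave MIDI range
    (C3..C5), clamping rates outside the expected band."""
    frac = (bpm - low_bpm) / (high_bpm - low_bpm)
    frac = min(max(frac, 0.0), 1.0)
    return round(low_note + frac * (high_note - low_note))

# Hypothetical RR intervals (seconds): resting, moderate, vigorous
notes = [bpm_to_midi_note(bpm_from_rr(rr)) for rr in (1.0, 0.8, 0.6)]
```

The resulting note numbers could then be sent to any MIDI synthesizer, so the dancer's heart quite literally plays the instrument.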

The video below demonstrates the development process.