Piano Street Magazine

Avatar Piano Duet in Metaverse

November 2nd, 2022 in Articles | 1 comment

Artistic research is an expanding field at institutions worldwide. In a performance at Cremona Musica, pianist Giusy Caruso presented a “Contrapuntal dialogue between a pianist and her avatar in the Metaverse”. Piano Street had a chance to experience the performance and to talk to Caruso about her artistic research project.

The pianist Giusy Caruso is a concert artist and academic scholar, and one of the pioneers of artistic research in Italy. She lives in Belgium, where she is a professor at the Royal Conservatory of Antwerp and is affiliated with the Institute for Psychoacoustics and Electronic Music (IPEM) of the University of Ghent and the Laboratoire de Musicologie (LaM) of the University of Brussels.

The performance in Cremona opened with the Disklavier playing back Caruso’s pre-recorded performance of the first movement of the “Memento Mori Collection” by Belgian composer Wim Henderickx (b. 1962). Caruso then entered the scene wearing a suit with markers, essential for the motion capture system to identify the pianist’s movements and thus create an avatar. In this way the pianist was projected into virtual reality, interacting with the piano’s string-board while the piano continued to play the MIDI recording. The second part of the performance was a dialogue with the avatar, tracked by the motion capture system, in the piece “Piano Phase for Two Pianos” by Steve Reich (b. 1936).
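
For readers curious about the technical side of the setup: a Disklavier can be driven like an ordinary MIDI output device, so a pre-recorded performance can be streamed to it in real time. The sketch below, using the Python library mido, shows roughly how such a playback might work; the file name and port name are assumptions for illustration, not details of the actual Cremona setup.

```python
import mido

# Hypothetical file containing the pre-recorded first movement
midi_file = mido.MidiFile("memento_mori_mvt1.mid")

# The Disklavier shows up as a regular MIDI output port; the port name is an assumption.
with mido.open_output("Disklavier") as port:
    # play() yields the messages in real time, preserving the recorded timing,
    # so the acoustic piano reproduces the original performance.
    for message in midi_file.play():
        port.send(message)
```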

Piano Street: Giusy, thank you for your exciting performance at Cremona Musica in duet with your Avatar in the Metaverse. It offers a whole new take on musical performance. How did you initially get involved in this new digitized stage for performance?

Giusy Caruso: Thank you, Patrick, for this question, which gives me the opportunity to go deeper into my aesthetic approach and artistic research activity. I started working with motion tracking technology in 2015 at the Art and Science Interaction Lab (ASIL) of the Institute of Psychoacoustics and Electronic Music (IPEM) in Ghent, Belgium, as part of my PhD research in music performance. At that time, my research investigated the relation between gestural approach and interpretation in piano performance.

I myself conducted some experiments using motion tracking while playing a contemporary piano work, the 72 Études Karnatiques by the French composer Jacques Charpentier (1933–2017), to study my corporeal reactions in relation to my score analysis and practice. This experience led me to work and interact creatively with this technology and to discover the possibilities of a system that maps gestures and provides quantitative data on the displacement, velocity and acceleration of the movements (precious information for musicians who want to become aware of and improve their bodily approach to playing an instrument).
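
As a rough illustration of the kind of quantitative data such a system can provide, the sketch below derives displacement, velocity and acceleration from the trajectory of a single reflective marker. The sampling rate, marker choice and example data are assumptions made for the sake of the example, not measurements from Caruso’s study.

```python
import numpy as np

def kinematics_from_marker(positions, fps=120.0):
    """Derive basic kinematic measures from one motion-capture marker trajectory.

    positions: array of shape (n_frames, 3), the 3D position of a single
               reflective marker (e.g. on the wrist) in metres.
    fps:       capture rate of the infrared cameras (120 Hz assumed here).
    """
    dt = 1.0 / fps
    # Frame-to-frame displacement and the cumulative path travelled by the marker
    step = np.diff(positions, axis=0)            # shape (n_frames - 1, 3)
    displacement = np.linalg.norm(step, axis=1)  # metres per frame
    path_length = displacement.cumsum()

    # Velocity and acceleration via finite differences
    velocity = displacement / dt                 # m/s
    acceleration = np.diff(velocity) / dt        # m/s^2
    return path_length, velocity, acceleration

# Hypothetical usage with synthetic data standing in for a captured wrist marker
wrist = np.cumsum(np.random.normal(0.0, 0.002, size=(600, 3)), axis=0)
path, vel, acc = kinematics_from_marker(wrist, fps=120.0)
print(f"peak velocity: {vel.max():.2f} m/s, peak acceleration: {acc.max():.1f} m/s^2")
```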

The system also creates a virtual agent (commonly an avatar) based on the real movements of performers, who move in the scene with reflective markers on their bodies captured by infrared cameras. Starting from this visualisation of corporeal measurements, I developed the idea – together with the engineer Paolo Pelluco and the designer Samuele Polistina, founder of LWT3, a Milan-based company for data analysis and visualisation and for the design and development of ICT infrastructures – of also creating digitized performances. Motion tracking technology used on stage provides an augmented visualisation of a musician’s expressive ‘choreography’ and an intriguing, innovative approach to live music production, because it is also a way to disseminate both artistic and scientific findings.
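
To make the idea of a marker-driven virtual agent a little more concrete, here is a minimal sketch of how avatar joints might be estimated from clusters of reflective markers. The marker names and joint assignments are entirely hypothetical; a real rig, such as the one behind the LWT3 performances, has its own labelling scheme and skeleton-solving pipeline.

```python
import numpy as np

# Hypothetical assignment of reflective markers to avatar joints
JOINT_MARKERS = {
    "head":        ["head_front", "head_back"],
    "right_wrist": ["rwrist_in", "rwrist_out"],
    "left_wrist":  ["lwrist_in", "lwrist_out"],
    "pelvis":      ["hip_left", "hip_right", "sacrum"],
}

def avatar_pose(frame):
    """Estimate each avatar joint as the centroid of its assigned markers.

    frame: dict mapping marker name -> 3D position (metres) for one
           motion-capture frame, as reported by the infrared cameras.
    Returns a dict mapping joint name -> 3D position used to pose the avatar.
    """
    pose = {}
    for joint, markers in JOINT_MARKERS.items():
        points = np.array([frame[m] for m in markers if m in frame])
        if len(points):                  # skip joints whose markers were occluded
            pose[joint] = points.mean(axis=0)
    return pose

# Example frame with made-up marker positions
frame = {name: np.random.rand(3) for names in JOINT_MARKERS.values() for name in names}
print(avatar_pose(frame))
```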

Giusy Caruso dressed up for her concert at the Cremona Musica Piano Experience, 23 September 2022.

PS: In Cremona we saw your interaction with your Avatar on a big screen with you both performing together. We also saw you using a VR headset. How much does your VR experience differ from what we as an audience experience in the hall?

GC: My experience was completely immersive, because by wearing the VR headset I entered the meta-performative scene, where I could watch a virtual agent playing the first part of the piece Piano Phase. The avatar positioned at the second piano in the metaverse scene reproduced my expressive movements, captured by the motion tracking system, together with the audio recorded on the Yamaha Disklavier the day before my public performance.

This then allowed me to simulate a real interaction and dialogue with my own avatar, following both the music track and the corporeal cues that enabled me to play with my counterpart, in sync and out of sync, as required by Steve Reich’s piece.

Spectators can see the projection of the virtual scene and watch the performance in real time, taking part in an intriguing “augmented” and hybrid (i.e. phygital) experience created by the three-dimensional perspective of the stage space and by the doubling of me as a real pianist in dialogue with my avatar in the metaverse. In the days after the public concert, spectators could enjoy the post-produced performance completely immersed in the metaverse perspective by wearing the VR headset at LWT3’s stand, specially prepared for the Cremona International Exposition.

This performance is the result of an empirical experiment on physical and digital presence in virtual performative scenes carried out in Belgium, at IPEM’s Art and Science Interaction Lab in Ghent, by the researchers Bavo Van Kerrebrouck, Pieter-Jan Maes and me. I am currently developing the artistic part of this project at the Royal Conservatoire of Antwerp in collaboration with LWT3 in Milan, which supported my XR performances and made them possible on stage.

PS: Music is a language we all understand. The Metaverse though is an unfamiliar environment that feels enigmatic. What are your ideas on how to create a digitally “augmented” experience?

GC: We are in the ‘phygital’ era, i.e. one of hybrid approaches (physical + digital) in hybrid spaces (real + virtual). In fact, since the first appearance of the Internet we have been projected into and absorbed by a ‘virtual’ world, with possibilities for interaction in cyberspace. People got used to living and communicating in virtual environments through smartphones or laptops (e.g. chats, email, platforms, channels…). With the development of AR/VR projections, people can now also have their avatars interact in the ‘metaverse’ (a term coined by Neal Stephenson in his novel Snow Crash as early as 1992). This is the present, and this is the future, which certainly appears enigmatic because it still needs to be fully explored.

In this context, artists also want to experiment with new ‘augmented’ ways of communicating and creating their artefacts, thereby tracing a futuristic techno-aesthetics. What’s next in music? These are all factors that spurred my artistic research and my performance in the metaverse, which integrates techno-metalanguages (audiovisuals) to enhance and extend the possibilities of musical enjoyment and perception for new-generation audiences.

Watch an example of the Avatar Piano Project:

Concerto for Piano and Motion Capture
February 2020 in Museo della Scienza e della Tecnica – Milano, Italy
Giusy Caruso, piano

Comments

  • Douglas Johnson says:

    Does not do it for me.
