2024 | Visualization Research Centre, HKBU
Project Overview
In Remote Gestures, Dr. Roberto Alonso Trillo and Dr. Peter A C Nelson explored the potential for remote artistic collaboration by integrating sound, visual art, and robotics within a 360-degree immersive environment. The project enabled musicians and visual artists to interact and create in real time, regardless of physical distance. By transforming the sound and motion of a violin bow into visual patterns, robotic brush movements, and complex soundscapes, the artists fostered a fluid interplay between music, movement, and visual expression.
Technical Narrative
The performance unfolded in three stages. First, the audience experienced the violinist's telepresence, with his live performance streamed into the immersive space. In the second stage, the visual and audio streams were fully integrated into the immersive environment. In the final stage, both performers used sensors embedded in their MetaBows to control a robotic arm, creating a tangible link between music, movement, and visual art.
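Both TouchDesigner and Max/MSP accept Open Sound Control (OSC) messages, a common transport for streaming live sensor data of this kind. As a minimal sketch, the function below encodes an OSC 1.0 message in pure Python; the `/metabow/imu` address and the three-float payload are illustrative assumptions, not the project's actual message schema.

```python
import struct

def _pad(b: bytes) -> bytes:
    # OSC strings end with at least one null byte and are
    # padded to a multiple of 4 bytes.
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *floats: float) -> bytes:
    """Encode a minimal OSC 1.0 message carrying float32 arguments."""
    data = _pad(address.encode("ascii"))                    # address pattern
    data += _pad(("," + "f" * len(floats)).encode("ascii")) # type tag string
    for f in floats:
        data += struct.pack(">f", f)                        # big-endian float32
    return data

# Hypothetical bow-orientation packet (values are placeholders).
msg = osc_message("/metabow/imu", 0.5, -0.25, 1.0)
```

A bridge process could forward such packets over UDP with `socket.sendto`, which is how OSC is typically carried between applications on a network.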
The performance comprised three distinct phases: first, violin bow motions were mapped to visual patterns on screen; second, a robotic arm played the cello strings; and third, the violinist remotely controlled a robotic calligraphy brush. This combination of visual elements and digital sound compositions created an immersive, multisensory experience, with live painting projections and surround sound enveloping the audience in a unique artistic fusion.
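The bow-to-robot mapping described above can be sketched as a simple scale-and-clamp function: bow orientation angles become small, bounded offsets for the arm's tool centre point. The axis choices, 90-degree input range, and 5 cm safety limit below are assumptions for illustration, not the project's actual calibration.

```python
def clamp(value: float, low: float, high: float) -> float:
    """Limit a value to the interval [low, high]."""
    return max(low, min(high, value))

def bow_to_pose_offset(pitch_deg: float, roll_deg: float,
                       max_offset_m: float = 0.05) -> tuple:
    """Map bow pitch/roll (degrees) to x/y offsets (metres) for the
    robot's tool centre point: linear scaling over a +/-90 degree
    range, clamped so the arm never strays beyond max_offset_m."""
    x = clamp(pitch_deg / 90.0, -1.0, 1.0) * max_offset_m
    y = clamp(roll_deg / 90.0, -1.0, 1.0) * max_offset_m
    return (x, y)

print(bow_to_pose_offset(45.0, -90.0))  # → (0.025, -0.05)
```

Clamping before scaling keeps an out-of-range sensor reading from commanding an unsafe arm excursion, which matters when a live performer shares the stage with the robot.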
Leads: Roberto Alonso TRILLO, Peter A C NELSON
Collaborator: ---
Partner: ---
Date: 2024
Team: Daniel Stempfer (Robot Scripting), Marlen Runz (Robot Mapping), Alexis Mailles (Interaction Design), Davor Vincze (Music), Jiafan Weng (Production), Roberto Alonso TRILLO (Performer_Violin), Kung Chi Shing (Performer_Violin), Karen Yu (Performer_Percussion)
Hardware: nVis with 29.6 surround sound system, MetaBow, UR3e 6 DoF robotic arm, 2 custom strings, a mic-equipped resonator, a ceiling-mounted camera
Software: TouchDesigner, MaxMSP, JackTrip, a Python bridge, a custom Python version of AllHands in the bow bridging app, Robot Motion GUI, Blender
Performance Date: 2024.06.12
Duration: 17 minutes, performed 5 times
Funding: Innovation and Technology Fund, HKSAR