// UniSSON - Unity SuperCollider Sound Object Notation
Exposing Interface: Visualising sonic interactions in multi-player digital performance - UniSSON (Unity SuperCollider Sound Object Notation)
This project introduces interdisciplinary research by Jules Rawlinson and Marcin Pietruszewski (with input from Owen Green and Dave Murray-Rust) into action, agency and legibility in electronic music performance, with a specific focus on small ensembles. The research was made possible by financial support from the University of Edinburgh's Challenge Investment Fund.
View the slide deck outlining the UniSSON research as presented at the Convergence International Conference/Festival of Music, Technology and Ideas at De Montfort University, UK, September 2019.
In electronic music, especially in group performance, it can be hard for audiences and performers alike to gauge who is doing what, because physical movement is so decoupled from sonic results. On the other hand, mutual connectivity offers radical possibilities for making the musical instrument something that is fluid, shared and responsive. The project builds on existing strands of work in creative computing, computer music and musicology, but seeks to make newly playful use of these techniques whilst also addressing accessibility by working with software that is both widely adopted and open source.
The aims of this research were to develop a framework for graphical representation of multiplayer sonic/musical interactions in audiovisual performance; to explore data visualisation and machine listening/learning tactics for sharing musical gestures in performance; to design creative alternatives to standard user interfaces for the analysis of multiplayer digital sound/music; to devise collaborative strategies for integrated and embedded audiovisual performance; and to release software utilities for multiplayer experimentation.
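As context for the machine-listening and visualisation aims: SuperCollider and Unity are separate applications, and the conventional bridge between such tools is Open Sound Control (OSC) messages sent over the network. As a minimal sketch (not UniSSON's actual code; the address pattern and descriptor values are hypothetical), here is how one frame of per-player sound descriptors can be packed into the OSC 1.0 wire format using only the Python standard library:

```python
import struct

def osc_pad(b: bytes) -> bytes:
    """Null-terminate and pad bytes to a 4-byte boundary, as OSC 1.0 requires."""
    return b + b"\x00" * (4 - len(b) % 4)

def osc_message(address: str, *args: float) -> bytes:
    """Encode an OSC message whose arguments are all float32.

    Layout: padded address string, padded type-tag string (',' plus one
    'f' per float), then each float as big-endian 32-bit IEEE 754.
    """
    msg = osc_pad(address.encode("ascii"))
    msg += osc_pad(("," + "f" * len(args)).encode("ascii"))
    for a in args:
        msg += struct.pack(">f", a)
    return msg

# e.g. one frame of descriptors (loudness, spectral centroid) for one
# player, ready to send over UDP to a listening visualiser:
packet = osc_message("/player1/features", 0.5, 1200.0)
```

The resulting bytes could be handed to a plain UDP socket; on the SuperCollider side the equivalent is built in (`NetAddr.sendMsg`), so a sketch like this mainly clarifies what travels on the wire.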
The main output from this research is a suite of software tools piloting frameworks for analysis and visualisation (UniSSON, Unity SuperCollider Sound Object Notation) in multiplayer performance. The main success of the analysis and visualisation tools is in presenting a multitemporal and multiresolution view of sonic data, allowing the simultaneous display of 'instant', 'recent' and 'long-term' data across a number of important sound-based parameters in accessible ways, which contributes to audience engagement and collaborative performance.
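The multitemporal idea above can be sketched in a few lines: keep the latest value of a sound descriptor alongside two rolling windows of different lengths, so a visualiser can draw 'instant', 'recent' and 'long-term' views of the same stream each frame. This is a hypothetical illustration of the principle, not UniSSON's actual implementation; the class and window lengths are invented:

```python
from collections import deque

class MultiTemporalView:
    """Track one streaming sound descriptor (e.g. loudness) at three
    time scales: the latest value, a short rolling window, and a long
    rolling window. Illustrative sketch only."""

    def __init__(self, recent_len=50, longterm_len=1000):
        self.instant = 0.0                        # most recent value
        self.recent = deque(maxlen=recent_len)    # short window
        self.longterm = deque(maxlen=longterm_len)  # long window

    def push(self, value):
        """Feed one new analysis frame into all three time scales."""
        self.instant = value
        self.recent.append(value)
        self.longterm.append(value)

    def summary(self):
        """One row a visualiser could render per frame: the raw value
        plus running averages over the two windows."""
        mean = lambda w: sum(w) / len(w) if w else 0.0
        return {
            "instant": self.instant,
            "recent": mean(self.recent),
            "longterm": mean(self.longterm),
        }
```

Because `deque(maxlen=...)` discards the oldest entries automatically, each window stays bounded, and the three numbers diverge naturally as the performance evolves: the 'instant' value jumps with every gesture while the 'long-term' average drifts slowly.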
There's a video overview of the software below; audio starts at 45 seconds!
This research investigates poetic ways of productively exploring the tension between legibility and co-agency in performance. Real-time video display of visualised sonic data both reveals and structures collective and individual activity through the capture, transformation, and re-presentation of interactions. Explorations of visual form and graphic mapping provide new opportunities for performers and audiences to engage with provocation and process in immersive audiovisual performance.
Alongside a set of functional criteria, success was measured against a set of musical criteria: whether players feel they have an improved sense of who is doing what in comparison to practice with an orthodox interface; whether players feel the visualisation helps prompt decisions about what to do next; whether players feel the interface both reflects their contribution to the interactive web and helps structure musical flow; whether audience members find the visualisation aids their attention to the sound; and whether audience members get a sense of who is contributing particular types of sound to the overall flow.
If you'd like to try UniSSON (Mac only, sorry), please get in touch via email.
UniSSON files to follow