Viale Oriani 32, 40137 Bologna
Composing in real-time
Within the Ipercello system, the electronic sounds belonging to the composition are driven directly by the performer through audio analysis and/or inertial sensors positioned under the cello bow. The idea is to develop compositional processes (also at the level of macroform) starting from the performer's gesture: real-time audio analysis of timbre and performance patterns, using software libraries developed at IRCAM and at CNMAT, is combined with a motion-tracking system (accelerometers + gyroscopes) developed at the Centre for Speckled Computing in Edinburgh, with which a well-established collaboration is ongoing.
The interactive environment is built in Max/MSP.
The system was developed by Nicola Baroni as part of a research project carried out at the University of Edinburgh and concluded in 2016: https://nicolabaroni.com/phd/
Several concert performances have taken place at international festivals including Maskfest, Forfest, Stanford, SonicWarehouse, 5 days in Milan, Rive-Gauche, Spazio-Musica and Eterotopie.
The main results were presented at universities including Stanford and Kennesaw (US)
https://ccrma.stanford.edu/events/concert-nicola-baroni-presents-zadig-voltaire-meets-kaftka-celloand-hypercello, Gothenburg and Vienna, and at the EMS 2014 conference in Berlin: http://www.ems-network.org/spip.php?article405
CV Nicola Baroni
The recital can take the form of a monographic program, with the following options:
_Nicola Baroni, K_Messages for Ipercello from short stories by Kafka (2014-17)
Vor dem Gesetz, The Wish to be a Red Indian, Odradek, The Trees, The Metamorphosis
_Massimiliano Messieri, Zadig, 21 Capricci for Cello and Ipercello (2000-2012)
from the novel Zadig by Voltaire
Or, as happens more frequently, as a mixed program, in combination with repertoire for solo cello.
Further options may include cello duo and cello ensemble
Interactive projects have also been developed with a dance company
The system underlying the project makes explicit reference to the concept of the Hypercello conceived by Tod Machover and developed at MIT since the 1990s, and to IRCAM's Augmented Violin project. The technology of hyperinstruments (also called augmented instruments) is based on the real-time analysis and computation (physical computing) of the performers' movement and sound: the capture of the performance gesture (sensing) thus becomes a dynamic basis for algorithmic processes, whose modular structure the performer manages and modifies in real time. A collateral study aims to define and explore timbre parameters of classical musical instruments in their interaction with performing practices and gestures.
In this project the musical productions are based on the concept of interactive composition. In this sense, the augmented cello and the software increasingly become a single entity, at times difficult to distinguish from the score and the compositional processes: the virtual sounds emerge as a seamless consequence of the actions taking place on stage, in the manner of an ecosystem.
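As a rough illustration of the sensing-to-algorithm principle described above: a single value of bow-gesture energy, derived from the accelerometer data, can be mapped continuously onto a synthesis parameter. The actual environment is built in Max/MSP; the Python sketch below is only a schematic analogy, and the function names, thresholds and parameter ranges in it are hypothetical, not taken from the project.

```python
import math

def gesture_energy(ax, ay, az):
    """Rough energy of a bow gesture: magnitude of the acceleration
    vector (m/s^2) minus gravity. Values and units are illustrative."""
    return abs(math.sqrt(ax * ax + ay * ay + az * az) - 9.81)

def energy_to_cutoff(energy, lo=200.0, hi=8000.0, ceiling=20.0):
    """Map gesture energy onto a filter cutoff in Hz.
    Exponential scaling so the mapping feels pitch-like;
    'ceiling' is an assumed maximum energy for normalisation."""
    t = min(energy / ceiling, 1.0)      # normalise to 0..1
    return lo * (hi / lo) ** t          # lo at rest, hi at full energy

# A calm stroke keeps the filter nearly closed;
# an energetic stroke opens it towards the upper bound.
calm = energy_to_cutoff(gesture_energy(0.3, 0.1, 9.9))
fast = energy_to_cutoff(gesture_energy(6.0, 4.0, 14.0))
```

In the real system such mappings are modular and can be re-routed by the performer during the piece; the sketch shows only the smallest unit of that chain, one sensor stream driving one parameter.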
Amplification: four speakers, with cables of a length suitable for the size of the room.
A mixer is not strictly necessary.
The remaining technical equipment is provided by the musicians.
The project can be realized in concert form, or as a lecture-presentation and/or workshop for students and active participants.