Here is the ICMC 2018 workshop program!
- Making Musebots @ ICMC: A Workshop
Time: Aug 5, 14:00-18:00
- Beyond Performance Recordings: Strategies for Capturing, Taxonomizing, and Preserving Live-Coded and Improvisatory Electronic Music Practices
Time: Aug 5, 10:00-11:00
- Soundcool: Smartphones, Tablets and Kinect for Collaborative Creation
Jorge Sastre, Roger Dannenberg
Time: Aug 5, 11:30-13:00
- Controlling DC Motors and Solenoids for Kinetic Sound Art and Music Workshop
Time: Aug 5, 10:00-13:00
- HASGS: Composing for a Hybrid Augmented Saxophone of Gestural Symbiosis
- Workshop on Design Strategies for Audio-Haptic Composition
Time: Aug 5, 10:00-12:00
- Soundcamp – Samsung's way for mobile music creation
JeongWook Seo, Tomasz Rybicki, Marcin Glinski, Seunghun Kim
Time: Aug 5, 14:00-18:00
- Music for Meditation and Meditation for Music
Kim Je Chang
Time: Aug 5, 14:00-18:00
- Live Coding with Csound
Time: Aug 5, 14:00-18:00
- Designing Interactive Real-Time Sound and Video Works in Max/MSP
Esther Lamneck and Cort Lippe
Time: Aug 5, 14:00-18:00
- Improvisation and Voice
Time: Aug 5, TBD
** Participants are required to bring their own laptops! **
What does it mean to “capture” a piece of music? Is a copy of a musical score enough to fully articulate and reproduce a performance—especially considering the broad and diverse approaches electronic and electroacoustic composers require? A performer could be composing onstage, building an instrument, developing an algorithm, or collaborating globally from their laptop. Where acoustic music often relies on a causal relationship between a note on a page and a particular sound in a listener’s ears, live-coded and improvisatory electronic music provides infinitely more options for performers and performer/composers. As such, we must rethink our understanding of what it means to “record” a performance.
We will lead children and adults to generate a creative work using the Soundcool system, a free system designed for young people to work with electroacoustic music. The workshop will be developed around an animation of a modern city, and we will construct a soundscape based on city sounds. The sound creation is preceded by listening to and selecting sounds, which are then used in the Soundcool system. We will introduce the basic modules of Soundcool, such as Player, SamplePlayer, and Keyboard. We will learn how to control modules with apps on iOS and Android (please bring a mobile phone or tablet if you have one). We will use Soundcool effects modules to generate more complex textures. Finally, all workshop participants will perform a real-time collaborative creation. For adults, we will discuss concepts and education with Soundcool.
Combining computer music control techniques with the direct actuation of sounding materials in the physical world can produce exciting sonic results. While pioneering artists have used motors in their work for decades, advances in low-cost electronics and easy-to-use microcontroller platforms have made this technology more accessible to musicians and sound artists. This workshop will teach participants the basics of motor control for the purposes of creating kinetic sound art and music. Topics covered will include an introduction to DC motors and solenoids as well as how to control these motors using the Arduino microcontroller platform and a computer. These technical concepts will be put into context through a discussion of contemporary kinetic sound art and robotic musical instruments. Participants will be able to explore the sonic possibilities of motor control themselves by composing short studies for a group performance at the end of the session.
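As a taste of the kind of computation such a study might involve, here is a minimal Python sketch (names and parameters are illustrative, not from the workshop materials) that turns a step-sequencer rhythm into solenoid pulse timings; on the hardware side, an Arduino would simply switch the driver pin high at each on-time and low at each off-time:

```python
def pattern_to_pulses(pattern, step_ms=250, pulse_ms=20):
    """Turn a step-sequencer pattern ('x' = strike, '.' = rest) into
    (on_ms, off_ms) pairs for a solenoid.  A short pulse width keeps
    the coil from overheating while still producing a strike."""
    pulses = []
    for i, step in enumerate(pattern):
        if step == 'x':
            on = i * step_ms
            pulses.append((on, on + pulse_ms))
    return pulses

print(pattern_to_pulses("x.x."))  # [(0, 20), (500, 520)]
```

The same pulse list could be streamed to the microcontroller over a serial connection, or baked directly into the Arduino sketch for a fixed study.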
This project is part of the research driven by the saxophonist and sound designer Henrique Portovedo, designated Multidimensionality of Contemporary Performance. Starting as an artistic exploratory project, the conception and development of the HASGS (Hybrid Augmented System of Gestural Symbiosis) for saxophone became a research project as well. The project has been developed at the Portuguese Catholic University, the University of California Santa Barbara, ZKM Karlsruhe, and McGill University Montreal, with insights from researchers such as Henrique Portovedo, Paulo Ferreira Lopes, Ricardo Mendes, Curtis Roads, Clarence Barlow, and Marcelo Wanderley. In this workshop we will explore compositional techniques and approaches, taking as a starting point some of the pieces already composed for the instrument.
By examining the relationships between sound and touch, new compositional and performance strategies start to emerge for practitioners using digital technologies. In this workshop we will explore why vibrotactile interfaces, which offer physical feedback to the performer, may be viewed as an important approach to addressing potential limitations of the current physical dynamic systems used to mediate the digital performer's control of various sorts of musical (and other) information. We will examine methods where feedback is artificially introduced to the performer's and audience's bodies, offering different information about what is occurring within the sonic domain. We will explore mapping strategies, as well as the placement of vibration on the skin. Participants are encouraged to bring their own music, sounds, or instruments and experience how they feel in addition to how they sound.
Further information on this research project and recent events can be found at:
This workshop is intended to introduce how Samsung has worked to create a mobile music-creating ecosystem. We will introduce two services: the Samsung Professional Audio SDK and Soundcamp. Samsung Professional Audio is an Android framework for creating virtual instruments on mobile devices through low-latency audio and MIDI/audio connections. Soundcamp, in turn, is a mobile digital audio workstation application based on Samsung Professional Audio. Soundcamp hosts other music applications built on Samsung Professional Audio, so participants will learn how to join our mobile music-creating ecosystem for Android. After the introduction and demonstration, the presenters and participants will share their experiences with mobile music creation systems as well as our services.
The full title of this workshop is "An Experiment to Find a Technique to Improve the Cognizing Power of a Meditator by Releasing the Congestions Inside Bodily Organs with the Help of Sound Resonance Phenomena." In this four-hour workshop, after a short lecture we will attempt a one-hour seated meditation practice together (30 minutes of short, shallow breathing plus 30 minutes of observation of painful bodily sensations). After the meditation, we will try to find the exact frequency of the congestions in the bodily organs of a selected meditator with the help of AI technology, and then try to recreate that same frequency. We believe that this experiment will be able to release the congestions of a meditator quickly and help him or her reach a deep meditation stage. Through this experiment we will try to find a formula for this process of releasing congestions, and if that is possible, we hope this technique will help the many meditators who suffer from energy stagnant inside the bodily organs. In this workshop, musicians may find a new role in society: helping meditators by releasing congestions inside bodily organs with the help of resonance phenomena. In addition, if we are able to arrange the sounds produced by this process properly, we hope to create a new style of music with artistic value.
This workshop will introduce users to live coding techniques and practices using the Csound sound and music computing system. Attendees will work through a series of practice-based exercises to explore live coding with Csound themselves. We will use the presenter's csound-live-code project to explore various approaches to sound design and real-time event generation. We will explore topics such as: metronomes and time; hexadecimal notation for percussion writing; score generation using real-time, callback, and event-time (i.e., temporal recursion) coding approaches; and more. Modern Csound 6 syntax and practices will be used for the workshop.
The target audience for this workshop includes those new to live coding and/or language-based systems; those looking to employ live coding as part of their composition workflow; and those seeking to perform music live with code. No prior knowledge of Csound is necessary.
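The temporal-recursion idea mentioned above can be sketched outside Csound as well. Below is a minimal Python illustration (all names are ours, not from the csound-live-code project) of an event callback that reschedules itself on a logical-time clock, which is the essence of the technique:

```python
import heapq
import itertools

class Clock:
    """A minimal logical-time scheduler for demonstrating temporal recursion."""
    def __init__(self):
        self.queue = []
        self.ids = itertools.count()  # tie-breaker so heapq never compares callbacks

    def schedule(self, time, fn, *args):
        heapq.heappush(self.queue, (time, next(self.ids), fn, args))

    def run(self, until):
        # Pop and fire events in time order up to the given logical time.
        while self.queue and self.queue[0][0] <= until:
            time, _, fn, args = heapq.heappop(self.queue)
            fn(self, time, *args)

notes = []

def metronome(clock, now, period):
    notes.append(now)                                # stand-in for triggering an instrument
    clock.schedule(now + period, metronome, period)  # temporal recursion: reschedule itself

clock = Clock()
clock.schedule(0.0, metronome, 0.5)  # start a half-second pulse
clock.run(2.0)
print(notes)  # [0.0, 0.5, 1.0, 1.5, 2.0]
```

In Csound itself the same pattern is typically written with an instrument that calls `schedule` on its own instrument number, but the control flow is identical.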
The workshop will be dedicated to building collaborative pieces which explore Esther’s use of the rich sonic material of the Hungarian Tárogató in improvisatory live electronic environments. Composers/Sound designers and/or visual designers are invited to submit material they would like to develop during the workshop consisting of Max/MSP/Jitter patches along with musical sketches/ideas destined for live electronic art environments. Participants will be encouraged to collaborate during the workshop.
The short-term goal will be to create etudes/sketches during the workshop, and the long-term goal will be to identify and begin collaborations with Esther for subsequent performances by her or the New Music Ensemble at New York University, which Esther directs. Materials (no notated scores) along with any questions can be sent to Esther Lamneck at email@example.com.
We will look at various real-time analysis tools with the goal of taking information from a performance and mapping this data to audio and visual control, allowing performers to influence musical and visual parameters in an improvisatory interactive environment.
The workshop will investigate the extended voice, computer/electronic performance, and improvisation through listening, thoughtfulness, and practice. The voice, as an instrument, will be examined with a focus on the use of extended techniques as a catalyst for improvisation. The use of computers and electronics as transformative and provocative tools will be explored. The role of computers and electronics as performing instruments, in the context of improvisation, will also be investigated.
Performance will be a central component of the workshop with hands-on engagement by participants. Listening exercises will act as a foundation bringing participants into the mental space necessary for successful improvisatory performance.