Workshops

The ICMC 2018 workshop program is listed below.

Making Musebots @ ICMC: A Workshop
Presenter: Arne Eigenfeldt
Time: Aug 5, 14:00-18:00

Description
This four-hour workshop will introduce Musebots, a specification and set of tools for collaboratively creating networked generative music agents. The workshop will introduce the concept of Musebots and present existing examples, as well as the latest web-based version, which uses WebRTC and Web Audio. The second half of the workshop will involve creating and/or adapting musebot templates, so it is aimed at ICMC participants who are familiar with JavaScript (for the web-based musebots), Max/MSP, or Max for Live. Musebot templates also exist in Pd, Java, Extempore, and SuperCollider; however, the presenter is less familiar with these platforms (you are welcome to come, but support will be limited!).

** Participants are required to bring their own laptops! **
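For readers curious what a browser-based generative agent looks like at its simplest, the sketch below uses only the standard Web Audio API to play random notes from a small pitch set. It is an illustration only and not the official Musebot template: the actual templates, message conventions, and WebRTC networking will be provided in the workshop.

```ts
// A minimal generative "agent" in the browser, using only the Web Audio API.
// This is not the Musebot specification or template, just an illustration of
// the kind of sound-making process the web-based templates wrap.

const scale = [60, 62, 65, 67, 70]; // MIDI pitches the agent can choose from
const mtof = (m: number) => 440 * Math.pow(2, (m - 69) / 12);

// Note: most browsers require a user gesture (e.g. a click) before an
// AudioContext is allowed to make sound; call ctx.resume() from a handler.
const ctx = new AudioContext();

function playNote(pitch: number, when: number, dur: number): void {
  const osc = ctx.createOscillator();
  const amp = ctx.createGain();
  osc.frequency.setValueAtTime(mtof(pitch), when);
  amp.gain.setValueAtTime(0, when);
  amp.gain.linearRampToValueAtTime(0.2, when + 0.02); // short attack
  amp.gain.linearRampToValueAtTime(0, when + dur);    // fade to silence
  osc.connect(amp).connect(ctx.destination);
  osc.start(when);
  osc.stop(when + dur + 0.05);
}

// Simple generative loop: every half second, play a random note 70% of the time.
setInterval(() => {
  if (Math.random() < 0.7) {
    const pitch = scale[Math.floor(Math.random() * scale.length)];
    playNote(pitch, ctx.currentTime, 0.4);
  }
}, 500);
```

A real musebot would additionally exchange messages with the other agents in the ensemble so they can coordinate; that networking layer is what the Musebot specification and templates provide.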

Links:
2015 community musebots: https://www.youtube.com/watch?v=JCpcjrsuOmM
2018 musebots: https://aeigenfeldt.wordpress.com/musebots/

Beyond Performance Recordings: Strategies for Capturing, Taxonomizing, and Preserving Live-Coded and Improvisatory Electronic Music Practices
Presenter: Hunter Ewen
Time: Aug 5, 10:00-11:00

Description
What does it mean to “capture” a piece of music? Is a copy of a musical score enough to fully articulate and reproduce a performance—especially considering the broad and diverse approaches electronic and electroacoustic composers require? A performer could be composing onstage, building an instrument, developing an algorithm, or collaborating globally from their laptop. Where acoustic music often relies on a causal relationship between a note on a page and a particular sound in a listener’s ears, live-coded and improvisatory electronic music provides infinitely more options for performers and performer/composers. As such, we must rethink our understanding of what it means to “record” a performance.

Link: http://www.hunterewen.com/main.html

Soundcool: Smartphones, Tablets and Kinect for Collaborative Creation
Presenters: Jorge Sastre, Roger Dannenberg
Time: Aug 5, 11:30-13:00

Description
We will guide children and adults in generating a creative work using the Soundcool system, a free system designed for young people to work with electroacoustic music. The workshop will be built around an animation of a modern city, and we will construct a soundscape based on city sounds. Sound creation is preceded by listening to and selecting sounds, which are then used in the Soundcool system. We will introduce the basic modules of Soundcool, such as Player, SamplePlayer and Keyboard. We will learn how to control modules with apps on iOS and Android (please bring a mobile phone or tablet if you have one). We will use Soundcool effects modules to generate more complex textures. Finally, a real-time creation by all workshop participants will be performed. For adults, we will discuss the concepts behind Soundcool and its use in education.

Link: http://soundcool.org/

Controlling DC Motors and Solenoids for Kinetic Sound Art and Music Workshop
Presenter: Steven Kemper
Time: Aug 5, 10:00-13:00

Description
Combining computer music control techniques with the direct actuation of sounding materials in the physical world can produce exciting sonic results. While pioneering artists have used motors in their work for decades, advances in low-cost electronics and easy-to-use microcontroller platforms have made this technology more accessible to musicians and sound artists. This workshop will teach participants the basics of motor control for the purposes of creating kinetic sound art and music. Topics covered will include an introduction to DC motors and solenoids, as well as how to control these motors using the Arduino microcontroller platform and a computer. These technical concepts will be put into context through a discussion of contemporary kinetic sound art and robotic musical instruments. Participants will be able to explore the sonic possibilities of motor control themselves by composing short studies for a group performance at the end of the session.
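As a rough illustration of the computer side of such a setup, the sketch below uses the Web Serial API (available in Chromium-based browsers) to send single-byte commands to an Arduino over USB. The one-byte protocol here ('S' pulses a solenoid, '0' through '9' set a DC motor's PWM speed) and the firmware that would interpret it are hypothetical; the workshop materials in the linked PDF may organize the control differently.

```ts
// Computer-side sketch for driving an Arduino-based motor/solenoid rig over
// serial from the browser (Web Serial API, Chromium-based browsers only).
// The single-byte command protocol and the matching Arduino firmware are
// hypothetical and exist only for this illustration.

// Must be called from a user gesture (e.g. a button click).
async function openArduino(baudRate = 9600): Promise<any> {
  const port = await (navigator as any).serial.requestPort(); // untyped without w3c-web-serial typings
  await port.open({ baudRate });
  return port;
}

async function sendCommand(port: any, command: string): Promise<void> {
  const writer = port.writable.getWriter();
  await writer.write(new TextEncoder().encode(command));
  writer.releaseLock();
}

// Example study: pulse the solenoid on a steady grid while the motor speed
// ramps up and back down ('0'-'9' = duty cycle, 'S' = solenoid pulse).
async function study(port: any): Promise<void> {
  const speeds = ["3", "5", "7", "9", "7", "5", "3", "0"];
  for (const speed of speeds) {
    await sendCommand(port, speed);
    await sendCommand(port, "S");
    await new Promise((resolve) => setTimeout(resolve, 250)); // ~eighth notes at 120 BPM
  }
}
```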

Link: Kemper_ICMC_Workshop.pdf

HASGS: Composing for a Hybrid Augmented Saxophone of Gestural Symbiosis
Presenter: Henrique Portovedo
Time: TBD

Description
This project is part of the research led by saxophonist and sound designer Henrique Portovedo, entitled Multidimensionality of Contemporary Performance. What began as an exploratory artistic project, the conception and development of the HASGS (Hybrid Augmented System of Gestural Symbiosis) for saxophone, has also become a research project. The project has been developed at the Portuguese Catholic University, the University of California Santa Barbara, ZKM Karlsruhe, and McGill University Montreal, with insights from researchers such as Henrique Portovedo, Paulo Ferreira Lopes, Ricardo Mendes, Curtis Roads, Clarence Barlow, and Marcelo Wanderley. In this workshop we will explore compositional techniques and approaches, taking as a starting point some of the pieces already composed for the instrument.

Link: https://www.henriqueportovedo.com/

Workshop on Design Strategies for Audio-Haptic Composition
Presenter: Lauren Hayes
Time: Aug 5, 10:00-12:00

Description
By examining the relationships between sound and touch, new compositional and performance strategies start to emerge for practitioners using digital technologies. In this workshop we will explore why vibrotactile interfaces, which offer physical feedback to the performer, may be viewed as an important approach to addressing potential limitations of the current physical dynamic systems used to mediate the digital performer's control of various sorts of musical (and other) information. We will examine methods in which feedback is artificially introduced to the performer's and audience's bodies, offering different information about what is occurring within the sonic domain. We will explore mapping strategies, as well as the placement of vibration on the skin. Participants are encouraged to bring their own music, sounds, or instruments and experience how they feel in addition to how they sound.
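As one simple illustration of such a mapping strategy, the browser sketch below tracks the short-term RMS level of a microphone signal with a Web Audio AnalyserNode and turns louder moments into vibration pulses via the Vibration API (navigator.vibrate, supported on many Android devices). The threshold and scaling are arbitrary demonstration values; the dedicated vibrotactile interfaces used in the workshop allow far finer control over where and how vibration is delivered.

```ts
// Map a sonic feature (short-term RMS level) onto a tactile one (vibration
// pulse length) using only standard browser APIs. Threshold and scaling are
// arbitrary values chosen for the illustration.

async function startAudioToVibration(): Promise<void> {
  const ctx = new AudioContext();
  const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
  const source = ctx.createMediaStreamSource(stream);

  const analyser = ctx.createAnalyser();
  analyser.fftSize = 1024;
  source.connect(analyser);

  const buf = new Float32Array(analyser.fftSize);

  setInterval(() => {
    analyser.getFloatTimeDomainData(buf);
    let sum = 0;
    for (let i = 0; i < buf.length; i++) sum += buf[i] * buf[i];
    const rms = Math.sqrt(sum / buf.length);

    // Louder sound -> longer vibration pulse, capped at 200 ms.
    if (rms > 0.05 && "vibrate" in navigator) {
      navigator.vibrate(Math.min(200, Math.round(rms * 1000)));
    }
  }, 100);
}
```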

Link:
Further information on this research project and recent events can be found at:
https://www.pariesa.com/home/category/Sound%20and%20Touch

Soundcamp – Samsung's Way for Mobile Music Creation
Presenters: JeongWook Seo, Tomasz Rybicki, Marcin Glinski, Seunghun Kim
Time: Aug 5, 14:00-18:00

Description
This workshop introduces how Samsung has worked to create a mobile music-creation ecosystem. We will introduce two services: the Samsung Professional Audio SDK and Soundcamp. Samsung Professional Audio is an Android framework for creating virtual instruments on mobile devices through low-latency audio and MIDI/audio connections. Soundcamp is a mobile digital audio workstation (DAW) application built on Samsung Professional Audio. Soundcamp hosts other music applications that run on Samsung Professional Audio, so participants will learn how to join this mobile music-creation ecosystem for the Android OS. After the introduction and demonstration, the presenters and participants will share their experiences with mobile music-creation systems as well as with these services.

Link: Soundcamp workshop ICMC 2018.pdf

Music for Meditation and Meditation for Music
Presenter: Kim Je Chang
Time: Aug 5, 14:00-18:00

Description
The full title of this workshop is "An experiment to find a technique to improve the cognizing power of meditators by releasing the congestions inside bodily organs with the help of sound resonance phenomena." In this four-hour workshop, after a short lecture we will attempt a one-hour meditation practice together on chairs (30 minutes of short, shallow breathing followed by 30 minutes of observation of painful bodily sensations). After the meditation, we will try to find the exact frequency of the congestions in the bodily organs of a selected meditator with the help of AI technology, and then try to generate exactly the same frequency. We believe that this experiment will be able to release the congestions of a meditator quickly and help him or her reach a deep meditation stage. Through this experiment we will try to work out a formula for this process of releasing congestions and, if possible, we hope that this technique will help the many meditators who suffer from stagnant energy inside their bodily organs. In this workshop, musicians may find a new role in society: helping professional meditators by releasing congestions inside bodily organs with the help of resonance phenomena. In addition, if we are able to arrange the sounds produced by this process properly, we hope to create a new style of music with artistic value.

Live Coding with Csound
Presenter: Steven Yi
Time: Aug 5, 14:00-18:00

Description
This workshop will introduce users to live coding techniques and practices using the Csound sound and music computing system. Attendees will work through a series of practice-based exercises to explore live coding with Csound themselves. We will use the presenter's csound-live-code project to explore various approaches to sound design and real-time event generation. We will explore topics such as: metronomes and time; hexadecimal notation for percussion writing; score generation using realtime, callback, and event-time (i.e., temporal recursion) coding approaches; and more. Modern Csound 6 syntax and practices will be used for the workshop.

The target audience for this workshop includes those new to live coding and/or language-based systems; those looking to employ live coding as part of their composition workflow; and those seeking to perform music live with code.  No prior knowledge of Csound is necessary.
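To give a flavour of one of these topics, the sketch below illustrates temporal recursion in Csound 6 orchestra code: a Clock instrument that reschedules itself one beat into the future and triggers a short tone each time. The surrounding TypeScript assumes the @csound/browser WebAssembly package and its Csound()/compileOrc()/start() bindings, an assumption made only to keep this page's examples in one language; the workshop itself will use the presenter's csound-live-code project, which supplies its own setup and helper code.

```ts
// Temporal recursion in Csound 6: instrument "Clock" reschedules itself each
// beat and fires a short tone. The orchestra string is plain Csound code; the
// TypeScript wrapper assumes the @csound/browser package (package name and
// method names are assumptions, not the csound-live-code setup).
import { Csound } from "@csound/browser";

const orc = `
sr = 48000
ksmps = 64
nchnls = 2
0dbfs = 1

instr Clock
  ;; temporal recursion: reschedule this instrument one beat later
  schedule("Clock", 1, 0.1)
  ;; trigger a short tone on a pitch from a whole-tone set above 220 Hz
  schedule("Beep", 0, 0.3, 220 * (2 ^ (int(random(0, 5)) * 2 / 12)))
endin

instr Beep
  asig = vco2(0.2, p4)
  asig = asig * linen:a(1, 0.01, p3, 0.25)
  out(asig, asig)
endin

;; start the clock when the orchestra loads
schedule("Clock", 0, 0.1)
`;

export async function startSketch(): Promise<void> {
  const csound = await Csound();   // assumed async factory from @csound/browser
  await csound.compileOrc(orc);    // compile the orchestra above
  await csound.start();            // begin real-time performance
  // Live coding then proceeds by compiling new or changed instrument
  // definitions into the already-running engine.
}
```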

Link: http://kunstmusik.com/

Collaborate on Building Improvisatory Interactive Realtime Works in Max/MSP
Presenters: Esther Lamneck and Cort Lippe
Time: Aug 5, 14:00-18:00

Description
The workshop will be dedicated to building collaborative pieces that explore Esther's use of the rich sonic material of the Hungarian tárogató in improvisatory live electronic environments. Composers, sound designers, and/or visual designers are invited to submit material they would like to develop during the workshop, consisting of Max/MSP/Jitter patches along with musical sketches/ideas intended for live electronic art environments. Participants will be encouraged to collaborate during the workshop.

The short-term goal will be to create etudes/sketches during the workshop, and the long-term goal will be to identify and begin collaborations with Esther for subsequent performances by her or the New Music Ensemble at New York University, which Esther directs. Materials (no notated scores) along with any questions can be sent to Esther Lamneck at el2@nyu.edu.

We will look at various real-time analysis tools, with the goal of taking information from a performance and mapping this data to audio and visual control, allowing performers to influence musical and visual parameters in an improvisatory interactive environment.

Link: https://steinhardt.nyu.edu/faculty/Esther_Lamneck

Improvisation and Voice
Presenter: Paul Botelho
Time: Aug 5, 10:00-12:00

Description
The workshop will investigate the extended voice, computer/electronic performance, and improvisation through listening, thoughtfulness, and practice. The voice, as an instrument, will be examined with a focus on the use of extended techniques as a catalyst for improvisation. The use of computers and electronics as transformative and provocative tools will be explored, and the role of computers and electronics as performing instruments, in the context of improvisation, will also be investigated.

Performance will be a central component of the workshop with hands-on engagement by participants. Listening exercises will act as a foundation bringing participants into the mental space necessary for successful improvisatory performance.

Link: ICMC 2018 Botelho – Improvisation and Voice Workshop Blurb.pdf