IASPM 2014 – Dynamic Popular Music


Dynamic Popular Music – The First Stages of a New Art Form

Keith Hennigan, Trinity College Dublin

Keith begins with an entertainingly ‘sci-fi’ way of looking at musical creativity – that is, speculating about the opportunity to make different choices at various stages in a composition’s development (and in its playback timeline). Dynamic Music is categorised (after Collins) as ‘Interactive’ (where the music changes in direct response to the user) and ‘Adaptive’ (where the user interacts with an additional element that in turn affects the music). Keith adds ‘Generative’ to the taxonomy to include music that changes due to internal systems [JB note – he mentions that technical problems prevented a live iPhone demo, but I infer he was going to show us something like this – http://www.generativemusic.com/].

So what does a Dynamic Music work require in order to exist? Keith’s own perspective was originally that of a producer, but today he takes the listener’s view, and the following requirements are framed accordingly. DM needs two things: the originating content (sounds, recordings, audio assets, etc.) and something to create change (a system or means of interaction that allows the listener to participate in, or drive, that change).
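Purely to illustrate that two-part requirement, here is a minimal sketch (my own, not Keith’s formulation) separating the asset pool from whatever system selects or reshapes it – the asset names and the toy ‘change system’ are invented:

```python
import random
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class DynamicWork:
    # (1) the originating content: stems, samples, recordings, audio assets
    assets: List[str]
    # (2) the change system: interaction or generative logic that drives change
    change_system: Callable[[List[str]], List[str]]

    def render_pass(self) -> List[str]:
        """One playback pass: the change system decides what is heard."""
        return self.change_system(self.assets)


# Example: a purely generative change system that reorders the asset pool.
work = DynamicWork(
    assets=["pad_A", "beat_1", "vocal_take_2", "field_recording"],
    change_system=lambda pool: random.sample(pool, k=len(pool)),
)

print(work.render_pass())  # a different arrangement on every call
```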

We hear four examples. The first is The National Mall by Bluebrain, a piece based on physical maps – the system that drives the music is keyed to the geography of the city (e.g. dynamics are mapped to physical height). The work can only be fully experienced in situ, in Washington DC. The trigger mechanism is GPS (via iPhone/tablet etc.). The system uses a secondary data stream (physical location) which in turn drives the music engine’s selections (of texture, dynamics and, presumably, sample choice).
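A rough sketch of the kind of mapping being described – GPS position selecting textures, elevation scaling dynamics – might look like the following; the zone boundaries, coordinates and asset names are entirely invented, and this is not Bluebrain’s actual engine:

```python
from dataclasses import dataclass


@dataclass
class GPSFix:
    lat: float
    lon: float
    elevation_m: float  # height above some reference level


# Hypothetical zones of the National Mall, each mapped to an audio texture.
ZONES = {
    "west_end": {"lon_range": (-77.050, -77.040), "texture": "strings_pad"},
    "mid_mall": {"lon_range": (-77.040, -77.025), "texture": "choir_layer"},
    "east_end": {"lon_range": (-77.025, -77.010), "texture": "brass_swells"},
}


def select_music(fix: GPSFix) -> dict:
    """Pick a texture from the listener's longitude; scale dynamics by elevation."""
    texture = "ambient_bed"  # fallback outside all zones
    for zone in ZONES.values():
        lo, hi = zone["lon_range"]
        if lo <= fix.lon < hi:
            texture = zone["texture"]
            break
    # Map elevation (say 0-30 m above the Mall) onto a 0.0-1.0 dynamic level.
    dynamics = max(0.0, min(1.0, fix.elevation_m / 30.0))
    return {"texture": texture, "dynamics": round(dynamics, 2)}


print(select_music(GPSFix(lat=38.8895, lon=-77.030, elevation_m=12.0)))
# -> {'texture': 'choir_layer', 'dynamics': 0.4}
```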

Example 2 is Radiohead’s PolyFauna. The music is constructed from remix material from the Radiohead album The King of Limbs. Sound sources are placed within a 3D virtual landscape; audio stems are triggered and the sonic palette responds to the user’s virtual location (much like the sound engine of a conventional computer game, where the mix responds to the avatar’s position in real time).
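Again as a sketch only (not the actual PolyFauna code), the game-style behaviour described – stems placed in a virtual space, with the mix following the user’s position – can be illustrated with simple distance-based gain; the stem names and positions are made up:

```python
import math

# Audio stems placed at fixed (x, y, z) points in a 3D virtual landscape.
STEM_POSITIONS = {
    "drums_loop":    (0.0, 0.0, 0.0),
    "synth_drone":   (40.0, 0.0, 10.0),
    "vocal_texture": (-25.0, 5.0, 30.0),
}

HEARING_RADIUS = 60.0  # beyond this distance a stem falls silent


def mix_levels(avatar_pos):
    """Return a gain (0.0-1.0) per stem, falling off linearly with distance."""
    levels = {}
    for stem, stem_pos in STEM_POSITIONS.items():
        dist = math.dist(avatar_pos, stem_pos)
        levels[stem] = round(max(0.0, 1.0 - dist / HEARING_RADIUS), 2)
    return levels


# As the user's virtual position changes, the sonic palette changes with it.
print(mix_levels((0.0, 0.0, 0.0)))    # loudest near the drums
print(mix_levels((35.0, 0.0, 10.0)))  # the synth drone comes to the fore
```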

Example 3 is the Bronze format – generative music without user input, where internal randomisation in the software drives change in the music. There is currently only one artist with a release in the format – Gwilym Gold (http://gwilymgold.com/). These are generative works to some extent, but the song form (and the vocal track) remain unchanged on each playback.
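To illustrate the behaviour described (the internals of Bronze itself are not public), here is a sketch in which the software’s own randomisation varies the accompaniment on each playback while the song form and vocal stay fixed; the section names and layer pools are invented:

```python
import random

SONG_FORM = ["intro", "verse 1", "chorus", "verse 2", "chorus", "outro"]  # fixed

# Hypothetical alternative accompaniment layers available in each section.
ACCOMPANIMENT_POOL = {
    "intro":   ["piano_a", "piano_b", "tape_loop"],
    "verse 1": ["pad_warm", "pad_cold"],
    "chorus":  ["synth_stack_1", "synth_stack_2", "strings"],
    "verse 2": ["pad_warm", "pad_cold", "sub_bass"],
    "outro":   ["piano_a", "tape_loop"],
}


def playback():
    """One playback: same form, same vocal, different accompaniment choices."""
    rendition = []
    for section in SONG_FORM:
        rendition.append({
            "section": section,
            "vocal": f"vocal_{section}",  # unchanged on every playback
            "accompaniment": random.choice(ACCOMPANIMENT_POOL[section]),
        })
    return rendition


for step in playback():
    print(step)
```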

The final example is Björk’s Biophilia project and its related apps. Every song on the album had an associated app that functioned like a sound toy, although some were adaptive songs in their own right. The project also had a quasi-educational intention. In ‘Virus’, virtual cells fight each other, and progression in the app (and playback of the song) is linked to the health/survival of the cell – also a commentary on the song’s life/death lyric theme.
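As a sketch of that coupling between game state and playback – the numbers, thresholds and section names are invented, and this is not the Biophilia app’s actual logic – song progression might be gated on the cell’s health:

```python
SONG_SECTIONS = ["intro", "verse", "chorus", "bridge", "final chorus"]


def advance_song(cell_health: float, current_section: int) -> int:
    """Unlock later sections only as the cell's health declines.

    cell_health runs from 1.0 (untouched) down to 0.0 (overcome by the virus);
    each section 'unlocks' as the infection progresses, echoing the lyric's
    life/death theme.
    """
    sections_unlocked = int((1.0 - cell_health) * len(SONG_SECTIONS))
    return min(max(current_section, sections_unlocked), len(SONG_SECTIONS) - 1)


# If the user keeps the cell healthy, playback stays stuck near the start...
print(SONG_SECTIONS[advance_song(cell_health=0.95, current_section=0)])  # intro
# ...and only as the virus wins does the song move towards its end.
print(SONG_SECTIONS[advance_song(cell_health=0.2, current_section=2)])   # final chorus
```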

Dynamic music comes to us as a software artefact. Smartphones have given us a new and compact set of hardware controls that was unavailable until very recently, app downloads provide rapid dissemination, and the ubiquity of phones turns a very large number of people into potential customers for dynamic music. DM is the music of possibilities. Keith ends by speculating about future directions (including musicological ones) for DM.
