Sound and Process: Music, Technology, and Creativity
The technology of music is both pragmatic and expressive. Instruments, software, notation, interfaces, and other technologies all have practical value: what most efficiently makes the desired sound? What allows a performer to best manipulate their tools? What most clearly helps to communicate the ideas and vision of a composer, improviser, or performer?
But how does the UX of music tech affect creative engagement with it? What has the ubiquitous 7-white/5-black piano keyboard layout ever done for music? What’s behind the huge resurgence of modular synths in an era when even a mid-range laptop offers orders of magnitude more power for making electronic music? Why are “alternative controllers” still relegated to the fringes of music when isomorphic keyboards, for example, have been around for at least a hundred years? As performers and creators, how can we feel invested in our work, engage our audiences, and explore new musical realms in an era of music-tech industrialization? What does it mean to engage your whole body in playing an instrument, versus the all-too-common mouse-n’-slouch? What do various systems of musical notation hope to communicate?
And what does this all mean for the performer, composer, improviser, and listener?