keynote – about electronic music and instrument making


this keynote, about the specificities of electronic music instruments as well as their similarities with acoustic instruments and instrument making, was given during the wocmat 2015 conference (international workshop on computer music and audio technology) at the nctu (national chiao tung university), Taiwan, in parallel to workshops (in collaboration with liao lin-ni and christelle séry) on electronic music composition.

this talk presents some of the challenges that have to be faced in electronic music and computer music design (klangregie). it does not intend to provide an exhaustive review of what electronic music instruments are, but rather to formulate open questions about this topic, and to recall the huge variety of fields implied by the term “electronic music” and the difficulty of classifying the different techniques in terms of classical organology.


what is an electronic music instrument?

  • definition of “instrument” => anything that produces sound can be considered a music instrument, as long as it is used with a musical intention
  • example: music performance by erikm on a vinyl disc player
  • examples of electric and electronic devices that can be considered as instrument
  • is a computer, a speaker, etc. itself the instrument, or only a part of it?
  • reminder: the different parts of an acoustical music instrument
  • transposition to an electronic music instrument
  • cultural acceptance of electronic music instruments
  • why is the electric guitar universally recognized as a music instrument, unlike many other electronic instruments?

specificities of electronic music instruments

  • first revolution: amplification
    • example: andrea neumann performing with her “inside piano”
    • stereophony, surround, spatialization, 3d audio
    • perspective reversal: the sound no longer radiates from the source to the listener (centrifugal radiation), but comes from around the listener toward him. the subject is no longer the instrument, but the listener
  • second revolution: recording
    • recording revolutionizes our representation of time in music and sound (haptic relation to the sound, semantic and musical structure…).
    • audio recording did not create this rupture. it existed previously in other forms of art (example: sound collages, kurt schwitters, ursonate). but audio recording was one of the tools that made this rupture far more obvious and allowed it to be developed.
    • digital technology makes it possible to build instruments that allow even more precise and complex manipulations of the sound
  • third revolution: signal processing
    • signal processing, an engineering field based on applied mathematics, relies on a model that does not exist in nature
    • many natural acoustic phenomena can be imitated thanks to signal processing elements, but acceptable physical models are in general very complex, requiring many different signal processing modules. moreover, good mathematical models are still missing for highly non-linear instruments, like cymbals for instance.
    • however, the opposite is also true: very simple signal processing effects can hardly be imitated acoustically
    • signal processing introduced previously unheard categories of timbre; likewise, recording should never be considered as a way to achieve a faithful reproduction of a live situation…
    • there is much more potential in using signal processing (and also physical models) to extend timbres and musical possibilities than in trying to imitate acoustic instruments
  • fourth revolution: programming
    • fields: digital signal processing, computer-aided algorithmic composition, automated synth control, score following, computer-generated scores, computer-generated music, machine learning, machine improvisation, ….
    • is programming really a conceptual novelty in instrument making? => yes and no actually.
    • why use automated processes for generating music?
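the claim that very simple signal processing effects can hardly be imitated acoustically is easy to illustrate with ring modulation: multiplying a signal by a sine carrier is a one-line operation, yet the resulting inharmonic sum and difference tones have no simple acoustic counterpart. a minimal sketch (the sample rate and frequencies below are illustrative, not from the talk):

```python
import math

SR = 44100  # sample rate in Hz

def ring_modulate(signal, carrier_freq, sr=SR):
    """multiply the input by a sine carrier: each input frequency f
    is replaced by the sum and difference frequencies f ± carrier_freq,
    an inharmonic spectrum with no simple acoustic equivalent."""
    return [s * math.sin(2 * math.pi * carrier_freq * n / sr)
            for n, s in enumerate(signal)]

# a 440 Hz sine tone, ring-modulated by a 100 Hz carrier:
# the energy moves to 340 Hz and 540 Hz, and the 440 Hz partial vanishes
tone = [math.sin(2 * math.pi * 440 * n / SR) for n in range(SR)]
out = ring_modulate(tone, 100)
```

by the product-to-sum identity, each output sample equals 0.5·(cos(a−b) − cos(a+b)), which is exactly the pair of shifted partials described above.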
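the question of automated processes for generating music can be made concrete with the smallest possible example of computer-aided algorithmic composition: a constrained random walk over a scale. this sketch is a generic illustration, not a method from the talk; the scale, step sizes, and midi note numbers are assumptions chosen for clarity:

```python
import random

SCALE = [60, 62, 64, 65, 67, 69, 71, 72]  # C major scale as MIDI note numbers

def random_walk_melody(length, seed=None):
    """generate a melody by a constrained random walk over the scale:
    each note moves at most two scale degrees away from the previous one,
    so the result stays singable while remaining unpredictable."""
    rng = random.Random(seed)
    idx = rng.randrange(len(SCALE))
    notes = []
    for _ in range(length):
        notes.append(SCALE[idx])
        step = rng.choice([-2, -1, 0, 1, 2])
        idx = min(max(idx + step, 0), len(SCALE) - 1)  # clamp to the scale
    return notes

melody = random_walk_melody(16, seed=1)
```

even this toy generator already raises the keynote’s open question: the program produces the notes, but the musical intention remains in the rules the composer chose.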

traps and challenges with electronic music instruments

  • trap 1: the “wow” effect (focus on a technical demonstration at the expense of music)
  • trap 2: wearing two or more hats
  • dilemma: newness vs. stability
  • trap 3: searching for transparency alone
  • challenge: human-user interfaces (hui)
  • challenge: notating electronic music