{"id":2952,"date":"2013-10-18T22:16:26","date_gmt":"2013-10-18T22:16:26","guid":{"rendered":"http:\/\/blog.soton.ac.uk\/digitalhumanities\/?p=2952"},"modified":"2013-10-15T22:16:50","modified_gmt":"2013-10-15T22:16:50","slug":"sotondh-small-grants-investigation-into-synthesizer-parameter-mapping-and-interaction-for-sound-design-purposes-post-1","status":"publish","type":"post","link":"https:\/\/digitalhumanities.soton.ac.uk\/small-grants\/2952","title":{"rendered":"sotonDH Small Grants: Investigation into Synthesizer Parameter Mapping and Interaction for Sound Design Purposes \u2013 Post 1"},"content":{"rendered":"

sotonDH Small Grants: Investigation into Synthesizer Parameter Mapping and Interaction for Sound Design Purposes – Post 1 by Darrell Gibson

Introduction

The phrase "sound design" first came into use in the film industry in the 1970s [1].  Since then it has been used in many different contexts and now means different things to different people, so it is worth clarifying its meaning here.  Sound design is taken to be the generation, synthesis, recording (studio and location) and manipulation of sound: that is, creating and shaping sounds to meet a given set of requirements or a specification.  On this definition, synthesizer programming, generating and recording found sounds, Foley, and applying effects during audio production are all forms of sound design.  Today sound design is required in many areas, including music production, soundscapes, film, television, theatre, computer/video games, live sound and sound art.  One important part of the discipline is synthesizer programming, where the designer configures a synthesizer's available parameters to produce a desired output sound.  Whether the synthesizer is implemented in hardware or software, in the standard synthesizer model the parameters are accessed through controls such as dials, sliders and buttons.  This method relies on the designer having extensive knowledge of the synthesis paradigm used by each synthesizer, its internal architecture and the sound design possibilities of each parameter.

The sheer number of parameters that synthesizers possess, often hundreds and sometimes thousands, compounds these difficulties, and for some forms of synthesis (e.g. Frequency Modulation and Wavetable) the parameters' relationship to the generated sound is far from simple.  In addition, sound designers need creative and critical listening skills, which take considerable practice to develop, in order to move the process towards a defined sonic result.  This combination of factors means that it can be very difficult to learn to design sounds with synthesizers, and it often places effective design beyond the reach of traditional musicians and casual users.  Over the years synthesizer manufacturers have addressed this problem by supplying their devices with extensive banks of preset patches.  Although this satisfies users who only want predesigned sounds, it detracts from the creative process and can be restrictive.  It is also of limited value to those wishing to learn the intricacies of synthesizer programming: the best they can hope for is to audition preset patches until they find something close to the desired sound and then attempt to modify it by selectively "tweaking" the parameters used.  This situation is also undesirable for experienced sound designers, who will often have a clear idea of the sound they are aiming for but, without considerable experience of a particular synthesizer, may not find it obvious how to create it from scratch or how to move from a preset to the desired sound.  This is particularly evident with more complex synthesis systems.  In addition, there is normally no way of working between multiple target sounds, so designers cannot arrange the targets in different configurations and explore the sound space they define.  These limitations, combined with the historical origins of sound design in Foley, have perhaps led to designers still working more with recorded sound than with synthetic sound.
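
To make the final point concrete, here is a minimal sketch of working between two target sounds, assuming (purely for illustration) that a patch can be represented as a flat dictionary of parameter values normalised to 0.0–1.0; linearly interpolating between two target patches is the simplest way to traverse the sound space they define.

```python
# Minimal sketch: exploring the sound space between two target patches by
# linear interpolation. Assumes (hypothetically) that a patch is a flat
# mapping of parameter names to values normalised to the range 0.0-1.0.

def interpolate_patches(patch_a, patch_b, t):
    """Blend two patches; t=0.0 gives patch_a, t=1.0 gives patch_b."""
    if patch_a.keys() != patch_b.keys():
        raise ValueError("patches must define the same parameters")
    return {name: (1.0 - t) * patch_a[name] + t * patch_b[name]
            for name in patch_a}

# Two illustrative target sounds (parameter names are invented).
bright = {"cutoff": 0.9, "resonance": 0.2, "attack": 0.05}
mellow = {"cutoff": 0.3, "resonance": 0.1, "attack": 0.40}

# Audition nine points along the line between the two targets.
for i in range(9):
    t = i / 8
    print(round(t, 3), interpolate_patches(bright, mellow, t))
```

A real system would need more than a straight line between two points, but even this sketch shows the appeal: the designer manipulates one value (t) instead of every underlying parameter.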

Synthesizer Interfaces

One feature that distinguishes synthesizers from traditional instruments is that they present two interfaces to the user: one for programming the sound generator and one for the actual musical input.  During a performance the user can potentially interact with either interface, or both, offering a very rich form of performance and expression.  However, this depends on the performer understanding the tonal possibilities that a particular synthesizer patch offers and being able to access them via the synthesizer's interface.  Over the years there has been significant research into both synthesizer performance interfaces and programming interfaces.

Synthesizer Performance Interfaces

When the first electronic hardware synthesizers were designed there was no established structure for the performance interface.  As a result, new performance interfaces emerged, such as the Theremin [2] and the Trautonium [3].  As time progressed, however, manufacturers standardised on a representation of the traditional piano keyboard, a performance interface that offers a logical layout and familiarity to those who have learnt traditional keyboard instruments [4], [5].  Many manufacturers, such as Moog, ARP, Korg and Roland, adopted this interface for their "all-in-one" designs.  The advent of MIDI (Musical Instrument Digital Interface) in the 1980s [6] provided a standardised mechanism for separating the performance interface from the sound-generating synthesizer.  Although MIDI was designed primarily for keyboard devices, the standard allows enough flexibility that new input devices have been designed which depart from traditional designs [7], [8], [9].
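
The separation MIDI affords is visible at the byte level: a note-on event is just three bytes, so any controller that can emit them can drive any MIDI sound generator.  The sketch below builds one such message by hand; it is illustrative only, though the status-byte layout (0x90 plus the channel number) follows the MIDI 1.0 specification [6].

```python
# Minimal sketch of MIDI's separation between the performance interface
# and the sound generator: a note-on message is three raw bytes, so any
# input device that can produce them can drive any MIDI synthesizer.

def note_on(channel, note, velocity):
    """Build a raw MIDI note-on message (channel 0-15, note/velocity 0-127)."""
    if not (0 <= channel <= 15 and 0 <= note <= 127 and 0 <= velocity <= 127):
        raise ValueError("out-of-range MIDI value")
    # Status byte 0x90 marks note-on; the low nibble carries the channel.
    return bytes([0x90 | channel, note, velocity])

# Middle C (note 60) at moderate velocity on channel 1 (index 0).
msg = note_on(0, 60, 100)
print(msg.hex())  # -> "903c64"
```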

Synthesizer Programming Interfaces

As previously mentioned, the programming interface typically presents the user with knobs, dials, sliders, etc. that directly control the synthesizer's parameters.  This is a direct mapping of the synthesis parameters and does not relate to the output sound.  It follows directly from the original electronic hardware synthesizers, such as the Moog Modular [10], where the controls were wired directly to the electronic components.  Various proposed solutions examine the mapping between the synthesis engine and the programming interface to see whether the relationship can be made more intuitive and less technical [11], [12], [13].
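
To make the one-to-one arrangement concrete, here is a minimal sketch in which each panel control writes straight to a single engine parameter; the engine, parameter names and ranges are hypothetical, not taken from any real synthesizer.

```python
# Minimal sketch of the standard one-to-one programming interface: each
# panel control is bound directly to one engine parameter, so the user
# must already know what "filter_cutoff" or "mod_index" means technically.
# All names and ranges here are illustrative assumptions.

class SynthEngine:
    def __init__(self):
        # Raw synthesis parameters, exposed exactly as the engine sees them.
        self.params = {"osc_detune": 0.0, "filter_cutoff": 1000.0,
                       "mod_index": 2.0, "env_attack": 0.01}

class PanelControl:
    """A dial or slider bound directly to a single engine parameter."""
    def __init__(self, engine, param, lo, hi):
        self.engine, self.param, self.lo, self.hi = engine, param, lo, hi

    def set_position(self, pos):
        # pos is the physical control position, 0.0 (min) to 1.0 (max).
        self.engine.params[self.param] = self.lo + pos * (self.hi - self.lo)

engine = SynthEngine()
cutoff_dial = PanelControl(engine, "filter_cutoff", 20.0, 20000.0)
cutoff_dial.set_position(0.25)          # a quarter turn of the dial
print(engine.params["filter_cutoff"])   # 5015.0 Hz - a number, not a sound
```

The last line illustrates the problem the cited research addresses: the control reports an engineering quantity, and it is left to the designer to know what that quantity does to the output sound.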

Synthesizer Interface Mapping

Because synthesizers possess two interfaces, the mapping between them ultimately affects the expressiveness of the synthesizer as an instrument.  Parameters of the synthesised sound can be changed or modulated so that differently articulated sounds are produced.  Assuming the performance interface can capture suitable physical expressions, the original sounds and their articulations can be mapped onto it.  The choice of which parameters are mapped, and in what quantities, ultimately determines the expressiveness of the instrument [14].  As a result, the expressive control of both systems has been considered extensively.
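
A minimal sketch of such a mapping layer follows, in which a single performance gesture (MIDI key velocity) drives several synthesis parameters at once; the parameter names, ranges and pairings are illustrative assumptions rather than any published mapping.

```python
# Minimal sketch of a one-to-many mapping between the performance interface
# and the engine: one gesture (key velocity) is spread across several
# synthesis parameters, each with its own illustrative response range.

velocity_map = {
    "filter_cutoff": (500.0, 8000.0),  # value at velocity 0 -> value at 127
    "env_attack":    (0.20, 0.01),     # harder playing -> faster attack
    "mod_index":     (1.0, 6.0),       # harder playing -> brighter FM tone
}

def apply_velocity(velocity, mapping):
    """Map one 0-127 MIDI velocity onto many engine parameter values."""
    v = velocity / 127.0
    return {param: lo + v * (hi - lo) for param, (lo, hi) in mapping.items()}

print(apply_velocity(100, velocity_map))
```

Choosing which parameters appear in the mapping, and over what ranges, is exactly the design decision that Hunt et al. [14] argue shapes the expressiveness of the resulting instrument.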

Research Questions

These issues raise three distinct questions about how synthesizers are used for sound design.  First, can sound design be performed without an in-depth knowledge of the underlying synthesis technique?  Second, can a large number of synthesizer parameters be controlled intuitively through a small set of interface controls that relate to the sounds themselves?  Finally, can multiple sets of complex synthesizer parameters be controlled and explored simultaneously?

References

1. Whittington, W. B., Sound Design and Science Fiction. University of Texas Press, 2007.

2. Douglas, A., Electrical Synthesis of Music. Electronics & Power, Volume 10, Issue 3, pp. 83–86, March 1964.

3. Glinsky, A., Theremin: Ether Music and Espionage. University of Illinois Press, 2000.

4. Moog, R. A. & T. L. Rhea, Evolution of the Keyboard Interface: The Bösendorfer 290 SE Recording Piano and the Moog Multiply-Touch-Sensitive Keyboards. Computer Music Journal, Volume 14, Issue 2, pp. 52–60, 1990.

5. Goebl, W., R. Bresin & A. Galembo, The Piano Action as the Performer's Interface: Timing Properties, Dynamic Behaviour, and the Performer's Possibilities. In Proceedings of the Stockholm Music Acoustics Conference (SMAC'03), August 6–9, Volume 1, pp. 159–162. Stockholm, Sweden: Department of Speech, Music, and Hearing, Royal Institute of Technology, 2003.

6. MIDI Manufacturers Association, The Complete MIDI 1.0 Detailed Specification. Los Angeles, CA, USA, 2000.

7. Jordà, S., G. Geiger, M. Alonso & M. Kaltenbrunner, The reacTable: Exploring the Synergy between Live Music Performance and Tabletop Tangible Interfaces. In Proceedings of the 1st International Conference on Tangible and Embedded Interaction, pp. 139–146. ACM, 2007.

8. Collins, N., C. Kiefer, M. Z. Patoli & M. White, Musical Exoskeletons: Experiments with a Motion Capture Suit. In Proceedings of New Interfaces for Musical Expression (NIME), Sydney, Australia, 2010.

9. Rothman, P., The Ghost: An Open-Source, User Programmable MIDI Performance Controller. In Proceedings of the International Conference on New Interfaces for Musical Expression, pp. 431–435, 2010.

10. Jenkins, M., Analog Synthesizers: Understanding, Performing, Buying. Focal Press, 2007.

11. Wanderley, M. M. & P. Depalle, Gestural Control of Sound Synthesis. Proceedings of the IEEE, Volume 92, Issue 4, pp. 632–644, 2004.

12. Hunt, A. & M. M. Wanderley, Mapping Performer Parameters to Synthesis Engines. Organised Sound, 7(2), pp. 97–108, 2002.

13. Goudeseune, C., Interpolated Mappings for Musical Instruments. Organised Sound, 7(2), pp. 85–96, 2002.

14. Hunt, A., M. Wanderley & M. Paradis, The Importance of Parameter Mapping in Electronic Instrument Design. Journal of New Music Research, Volume 32, Issue 4, pp. 429–440, 2003.

 <\/p>\n","protected":false},"excerpt":{"rendered":"

sotonDH Small Grants: Investigation into Synthesizer Parameter Mapping and Interaction for Sound Design Purposes \u2013 Post 1 by Darrell Gibson Introduction The phrase \u201csound design\u201d first started being used in the film industry in the 1970s [1].\u00a0 Since this time it has been used in many different contexts and now means different things, to different people so it is worth …<\/p>\n","protected":false},"author":93693,"featured_media":0,"comment_status":"closed","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[198239],"tags":[],"_links":{"self":[{"href":"https:\/\/digitalhumanities.soton.ac.uk\/wp-json\/wp\/v2\/posts\/2952"}],"collection":[{"href":"https:\/\/digitalhumanities.soton.ac.uk\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/digitalhumanities.soton.ac.uk\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/digitalhumanities.soton.ac.uk\/wp-json\/wp\/v2\/users\/93693"}],"replies":[{"embeddable":true,"href":"https:\/\/digitalhumanities.soton.ac.uk\/wp-json\/wp\/v2\/comments?post=2952"}],"version-history":[{"count":3,"href":"https:\/\/digitalhumanities.soton.ac.uk\/wp-json\/wp\/v2\/posts\/2952\/revisions"}],"predecessor-version":[{"id":3007,"href":"https:\/\/digitalhumanities.soton.ac.uk\/wp-json\/wp\/v2\/posts\/2952\/revisions\/3007"}],"wp:attachment":[{"href":"https:\/\/digitalhumanities.soton.ac.uk\/wp-json\/wp\/v2\/media?parent=2952"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/digitalhumanities.soton.ac.uk\/wp-json\/wp\/v2\/categories?post=2952"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/digitalhumanities.soton.ac.uk\/wp-json\/wp\/v2\/tags?post=2952"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}