
Adaptive music

Article snapshot taken from Wikipedia, available under the Creative Commons Attribution-ShareAlike license.

Adaptive music is music that changes in response to real-time events or user interactions, and is found most commonly in video games. It may change in volume, arrangement, tempo, and other parameters. Adaptive music is a staple of the role-playing game genre, where it is often used to change the tone and intensity of the music as the player enters and leaves combat. Music video games, in which interaction with the music is a core gameplay element, also have fundamentally adaptive soundtracks.

History

The first example of adaptive music is generally said to have been in Space Invaders by Taito in 1978. The game's simple background music, a four-note ostinato which repeats continuously throughout gameplay, increases in tempo as time goes on and the aliens descend upon the player. However, this music could also be considered sound effects for the aliens' movement, so some argue this is not an example of adaptive music.

Other early examples of adaptive music include Frogger by Konami from 1981, where the music abruptly switches once the player reaches a safe point in the game, and Sheriff by Nintendo from 1979, where different pieces of music play in response to events such as a condor flying overhead or bandits approaching the player.

George Lucas's video game development group LucasArts (formerly Lucasfilm Games) created and patented the iMUSE interactive music system in the early 1990s, which was used to synchronise video game music with game events. The first game to use this system was Monkey Island 2: LeChuck's Revenge in 1991.

Techniques

Vertical orchestration

Vertical orchestration is a technique in which the music's arrangement is changed: musical layers are added and removed in response to game events, altering the music's texture, intensity, and emotional feel without interrupting its flow. Layers are generally faded in and out to make transitions smoother.

In video games, this technique may be used for more subtle game events than horizontal re-sequencing, such as an increase or decrease in intensity during a battle.

In Dead Space 2, the background music appears to be arranged into four layers, each a stereo track corresponding to a specific level of "fear". These layers are then mixed individually or collectively during gameplay depending on a variety of game variables, such as the player's distance from enemies.
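
A minimal sketch of how vertical orchestration might be implemented is shown below. It assumes a hypothetical audio engine that exposes per-layer volume control; the layer names, intensity thresholds, and fade speed are illustrative rather than taken from any particular game.

    class Layer:
        """One continuously looping stem whose volume is faded toward a target."""
        def __init__(self, name, fade_speed=0.5):
            self.name = name
            self.volume = 0.0             # current volume, 0.0-1.0
            self.target = 0.0             # desired volume set by game logic
            self.fade_speed = fade_speed  # volume change per second

        def update(self, dt):
            # Move the current volume toward the target for a smooth fade.
            if self.volume < self.target:
                self.volume = min(self.target, self.volume + self.fade_speed * dt)
            else:
                self.volume = max(self.target, self.volume - self.fade_speed * dt)

    class VerticalMixer:
        """Maps a single 'intensity' value onto target volumes for each layer."""
        def __init__(self):
            # All stems loop in sync; only their volumes ever change.
            self.layers = [Layer("pads"), Layer("rhythm"),
                           Layer("percussion"), Layer("brass")]
            self.thresholds = [0.0, 0.25, 0.5, 0.75]

        def set_intensity(self, intensity):
            # Each successive layer becomes audible at a higher intensity threshold.
            for layer, threshold in zip(self.layers, self.thresholds):
                layer.target = 1.0 if intensity >= threshold else 0.0

        def update(self, dt):
            for layer in self.layers:
                layer.update(dt)
                # A real engine would apply layer.volume to the playing track here.

In a game loop, set_intensity would be driven by variables such as enemy proximity, and update would be called every frame with the elapsed time.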

Horizontal re-sequencing

Horizontal re-sequencing is the technique in which different pieces of music are transitioned between: musical pieces arranged in a "branching" sequence are switched between in response to game events. The simplest kind of transition is a crossfade; when triggered by an event, the old piece fades out while the new piece fades in. Another kind is phrase branching, in which the change to the next piece begins only once the current musical phrase has ended. A third approach uses dedicated "bridge" transitions, short sections of music composed specifically to join two pieces together.

In video games, this technique may be used for more significant game events, such as a change in location, beginning of a battle, or opening of a menu, as it generally draws more attention and makes a greater impact than vertical orchestration.
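
A simplified sketch of phrase branching is given below. It assumes hypothetical cue data in which each piece of music lists the timestamps of its phrase boundaries; the cue names and structure are illustrative, not drawn from any specific game or middleware.

    class MusicCue:
        """A piece of music with the timestamps (in seconds) where phrases end."""
        def __init__(self, name, phrase_ends):
            self.name = name
            self.phrase_ends = phrase_ends

    class HorizontalSequencer:
        def __init__(self, starting_cue):
            self.current = starting_cue
            self.position = 0.0   # playback position within the current cue
            self.pending = None   # cue queued by a game event

        def request_transition(self, next_cue):
            # Phrase branching: don't switch immediately; wait for the phrase to end.
            self.pending = next_cue

        def update(self, dt):
            previous = self.position
            self.position += dt
            if self.pending and self._crossed_phrase_end(previous, self.position):
                # A real engine would start the new cue (or a composed bridge) here.
                self.current = self.pending
                self.pending = None
                self.position = 0.0

        def _crossed_phrase_end(self, start, end):
            return any(start < t <= end for t in self.current.phrase_ends)

    # Example: switch from exploration music to battle music at the next phrase end.
    exploration = MusicCue("exploration", phrase_ends=[8.0, 16.0, 24.0])
    battle = MusicCue("battle", phrase_ends=[4.0, 8.0, 12.0])
    sequencer = HorizontalSequencer(exploration)
    sequencer.request_transition(battle)   # e.g. triggered when combat begins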

Algorithmic generation

See also: Generative music and Algorithmic composition

Some video games generate musical content live using algorithms instead of relying solely on pre-made musical pieces (such as in horizontal re-sequencing and vertical orchestration).

Spore uses an embedded version of the music software Pure Data to generate music according to game state such as the phase of gameplay, the player's actions in the "creature editor", and the duration of the gameplay session. In addition, Ape Out features a procedurally generated jazz soundtrack which changes based on the intensity of gameplay and the player's inputs.
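
As a toy illustration (not modelled on either game), the sketch below chooses notes at random from a scale, with a single intensity value controlling how dense the music is and how high the notes sit; the scale, probabilities, and parameter names are all assumptions made for the example.

    import random

    # C minor pentatonic scale as MIDI note numbers (one octave).
    SCALE = [60, 63, 65, 67, 70]

    def generate_bar(intensity, beats=8):
        """Generate one bar of notes; higher intensity -> denser, higher notes."""
        notes = []
        for beat in range(beats):
            # Probability of placing a note on this beat grows with intensity.
            if random.random() < 0.3 + 0.6 * intensity:
                pitch = random.choice(SCALE)
                # Push some notes up an octave as intensity rises.
                if random.random() < intensity:
                    pitch += 12
                notes.append((beat, pitch))
        return notes

    # Example: regenerate the upcoming bar from the current game intensity.
    current_intensity = 0.8   # e.g. derived from how much chaos is on screen
    print(generate_bar(current_intensity))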

Blending music and sound effects


Some video games, such as Rez and Extase, synchronise their sound effects with the background music to blend the two together. This is done by briefly delaying the playback of sound effects after they are triggered by the player, so that they land in time with the music.
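
A minimal sketch of this kind of beat quantisation is shown below, assuming a fixed tempo and a hypothetical engine that can schedule a sound after a delay; the tempo and function names are illustrative.

    import math

    BPM = 120.0
    SECONDS_PER_BEAT = 60.0 / BPM

    def next_beat_time(current_time):
        """Return the time (in seconds) of the next beat after current_time."""
        return math.ceil(current_time / SECONDS_PER_BEAT) * SECONDS_PER_BEAT

    def schedule_sound(sound, trigger_time):
        """Delay a player-triggered sound so it starts on the next beat."""
        play_time = next_beat_time(trigger_time)
        delay = play_time - trigger_time
        # A real engine would queue `sound` to start after `delay` seconds.
        return delay

    # Example: a shot fired at 3.1 s is held for about 0.4 s so it plays on the
    # beat at 3.5 s.
    print(schedule_sound("laser_shot", 3.1))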

Uses


As goal or reward

The music game Sound Shapes uses an adaptive soundtrack to reward the player. As the player improves at the game and collects more "coins", the soundtrack, which is entirely composed of the melodies and beats created by these "coins", intensifies.

References

  1. Redhead, Tracy (4 July 2024). Interactive Technologies and Music Making: Transmutable Music (1st ed.). London: Routledge. doi:10.4324/9781003273554. ISBN 978-1-003-27355-4.
  2. Sporka, Adam; Valta, Jan (2 October 2017). "Design and implementation of a non-linear symphonic soundtrack of a video game". New Review of Hypermedia and Multimedia. 23 (4): 229–246. Bibcode:2017NRvHM..23..229S. doi:10.1080/13614568.2017.1416682. S2CID 46835283.
  3. Fritsch, Melanie (2021). In Summers, Tim (ed.). The Cambridge Companion to Video Game Music. Cambridge Companions to Music. Cambridge, United Kingdom: Cambridge University Press. ISBN 978-1-108-60919-7.
  4. Sweet, Michael (2 October 2014). Writing Interactive Music for Video Games: A Composer's Guide. Addison-Wesley Professional. p. 99. ISBN 978-0321961587. "Frustrated with the state of music in games at the time, two composers at LucasArts, Peter McConnell and Michael Land, created one of the first adaptive music systems, called iMuse. iMuse (Interactive MUsic Streaming Engine) let composers insert branch and loop markers into a sequence that would allow the music to change based on the decisions of the player. The iMuse engine was one of the first significant contributions to interactive music for video games. Its importance in shaping many of the techniques that you see in video games today cannot be overemphasized. (...) Other excellent iMuse titles include Grim Fandango (1998), which features an incredible jazz-based soundtrack composed by Peter McConnell. (...)"
  5. Collins, Karen (8 August 2008). Game Sound: An Introduction to the History, Theory, and Practice of Video Game Music and Sound Design. The MIT Press. pp. 102, 146. ISBN 978-0262033787.
  6. Kamp, Michiel; Summers, Tim; Sweeney, Mark, eds. (2016). Ludomusicology: Approaches to Video Game Music. Sheffield: Equinox. pp. 188–189. ISBN 9781781791974.
  7. Sweet, Michael (13 June 2016). "Top 6 Adaptive Music Techniques in Games – Pros and Cons". Designing Music NOW. Archived from the original on 13 November 2018. Retrieved 13 November 2018.
  8. Kosak, Dave (20 February 2008). "The Beat Goes on: Dynamic Music in Spore". GameSpy. IGN Entertainment, Inc.
  9. Wright, Steven (27 February 2019). "How 'Ape Out' Creates a Soundscape Worthy of Smashing". Variety. Variety Media.