The phrase loudness war (or loudness race) refers to the practice of recording music at progressively higher levels, to create CDs that are as loud as possible, or louder than CDs from competing artists or recording labels. A CD mastered at a higher level sounds louder than others when played on the same equipment at the same volume setting. One reason for this practice is that when two CDs are compared, the louder one tends to sound better on first impression. Higher levels can result in better-sounding recordings on low-quality reproduction systems, such as web audio formats, AM radio, mono television and telephones, but since most of the material affected is delivered via CD audio, the practice is largely seen as detrimental to overall quality, given that one of the initial benefits of the CD was its enhanced dynamic range.
To educated ears this practice is unnecessary: listeners who want louder music can simply turn up the volume on their playback equipment. If a CD is broadcast by a radio station, the station's own processing equipment compresses everything it transmits to much more uniform levels of absolute amplitude, regardless of how loudly the original recording was mastered.
This practice often results in a form of distortion known as clipping. The loudness war has reached a point at which most pop CDs, and many classical and jazz CDs, contain large amounts of digital clipping, making them harsh and fatiguing to listen to, especially (and ironically) on high-quality equipment. On CDs where clipping does not occur, or occurs less often than it would with simple digital amplification, a process known as limiting is used instead. Limiting reduces the distortion in the final product, but it has the side effect of significantly degrading transient response, most often heard as weakened drum impact; taken to severe levels, it can also flatten the natural dynamics of other instruments in the mix and reduce sonic clarity. Both methods can be relatively transparent in moderate cases, but at the levels now commonly demanded this is seldom possible.
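To illustrate the distinction, the following is a minimal sketch in Python (using NumPy; the gain and threshold values are illustrative assumptions, not figures drawn from actual mastering practice) of how digital amplification with hard clipping differs from a crude peak limiter:

 import numpy as np
 
 FULL_SCALE = 1.0  # digital full scale, normalized to 1.0
 
 def amplify_and_clip(samples, gain):
     """Raise the level and hard-clip anything exceeding full scale."""
     boosted = samples * gain
     return np.clip(boosted, -FULL_SCALE, FULL_SCALE)
 
 def simple_limiter(samples, gain, threshold=0.9):
     """Raise the level, but squash samples above the threshold instead of
     letting them clip (a crude stand-in for the smoothed gain reduction
     a real limiter applies)."""
     boosted = samples * gain
     peaks = np.abs(boosted) > threshold
     boosted[peaks] = np.sign(boosted[peaks]) * (
         threshold + (np.abs(boosted[peaks]) - threshold) * 0.1
     )
     return np.clip(boosted, -FULL_SCALE, FULL_SCALE)
 
 # a synthetic "drum hit": a sharp transient with an exponential decay
 t = np.linspace(0, 1, 44100, endpoint=False)
 hit = 0.5 * np.exp(-8 * t) * np.sin(2 * np.pi * 80 * t)
 
 clipped = amplify_and_clip(hit, gain=3.0)  # transient flattened at full scale
 limited = simple_limiter(hit, gain=3.0)    # transient rounded off instead

In both cases the average level rises, but the clipped version distorts the waveform outright, while the limited version trades distortion for reduced transient impact, as described above.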
Further, current compression equipment allows engineers to create a recording with a nearly uniform dynamic level. When that level is set very close to the maximum allowed by the CD format, the result is nearly non-stop distortion throughout the disc.
This situation has been widely condemned. Some have petitioned their favorite groups to rerelease CDs with less distortion. Others have even said that recording engineers who knowingly push their recording equipment past clipping should be blacklisted and not allowed to "victimize artists or music lovers." Many have suggested boycotting recordings whose loudness they feel significantly lessens their satisfaction with the product (often to the point where they listen to it less, or not at all) in order to communicate their disdain for the practice to the offending parties, though it is often stated that the music industry would interpret such a boycott as wanton piracy.
It should be made clear that this distortion is different from other kinds of distortion such as overdrive or feedback (see Overdrive (music)), which are created by electronic musical instruments, not by the recording process, and which can be an intentional and integral part of the performance (see Jimi Hendrix). Digital clipping is created by recording engineers, not by musicians, though musicians have been accused of requesting the sort of loudness that encourages this phenomenon. Ironically, analog-style distortion is sometimes used in the mastering process to achieve similar results, either through analog tape saturation prior to digital transfer or through computer software that emulates the process; this is notably more common in European recordings than in North American ones.
Another consequence of the loudness war is that even if there is no distortion, every song on a CD, and every moment within each song, will have the same dynamic (i.e., loudness) level, with no rise or fall or any sense of dynamic shaping. The music has been flattened against the ceiling, so to speak. Pop music in general has not been interested in the expressive possibilities of crescendos, diminuendos, sudden loudness or quietness, or any of the other dynamic devices available to musicians, but the loudness war has eliminated even the possibility of dynamic expressiveness in recorded pop music.
History
(Note: Some of these examples are given as RMS (root mean square) power values. For CD audio, an RMS value is calculated from the square root of the mean of the squared sample values and expressed in decibels relative to digital full scale. It is a common way to quantify the absolute loudness of a recording, though differences in musical arrangement can cause discrepancies between the measured value and the perceived loudness.)
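As a rough sketch of how such a figure is obtained (a minimal Python example, assuming the samples have already been normalized so that digital full scale corresponds to ±1.0):

 import numpy as np
 
 def rms_dbfs(samples):
     """Return the RMS level of a block of audio samples in dB relative
     to digital full scale (0 dBFS), assuming full scale is +/-1.0."""
     rms = np.sqrt(np.mean(np.square(samples)))
     return 20 * np.log10(rms)
 
 # example: a full-scale sine wave measures roughly -3 dBFS
 t = np.linspace(0, 1, 44100, endpoint=False)
 sine = np.sin(2 * np.pi * 440 * t)
 print(round(rms_dbfs(sine), 1))  # approximately -3.0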
Because audio listening is highly subjective, the precise stages of the loudness war and its impact vary considerably depending on whom you ask. The practice of focusing on loudness in mastering, whether for competition or other reasons, can be traced back to the introduction of the compact disc itself, and has also been said to exist during the period when vinyl was the primary release medium for popular music. However, because of the limitations of the vinyl format, loudness and compression on a released recording had to be restricted to keep the physical medium playable, restrictions that do not apply to digital playback media such as CDs, and as a result the practice never reached the significance that it has in the digital music age. Extraordinarily hot recording levels like those showcased by Japanese noise artist Merzbow, which are significantly louder than even the norm for popular CDs today, would be impossible on vinyl.
The stages of CD loudness are often split across the three decades of the medium's existence. Since CDs did not become the primary listening medium for popular music until the tail end of the 1980s, there was little motivation for competitive loudness practices during most of that decade. CD players were also very expensive and thus largely confined to high-end systems, which benefited little from elevated recording levels, another major contributing factor. As a result, there were two common approaches to mastering CDs: either the highest peak of the recording was placed at, or close to, digital full scale, or digital levels were referenced along the lines of the more familiar analog VU meter. In the latter approach a certain point (usually -6 dB, or 50% of the disc's amplitude on a linear scale) was treated in the same way as the saturation point (marked 0 dB) of analog recording, with several dB of the CD's recording level held in reserve just as headroom above the saturation point was reserved on analog tape (often referred to as the "red zone", signified by a red bar on the meter display), except that on the digital medium no saturation of the peaks actually occurred. The RMS level of the average rock song during most of the decade was usually around -18 dB.
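The -6 dB figure follows directly from the logarithmic definition of the decibel scale; a short sketch of the conversion (the function names are illustrative only):

 import math
 
 def dbfs_to_linear(db):
     """Convert a level in dB relative to full scale to a linear amplitude
     ratio, where full scale = 1.0."""
     return 10 ** (db / 20)
 
 def linear_to_dbfs(ratio):
     """Convert a linear amplitude ratio to dB relative to full scale."""
     return 20 * math.log10(ratio)
 
 print(dbfs_to_linear(-6.0))   # about 0.50, i.e. half of full scale
 print(linear_to_dbfs(0.5))    # about -6.02 dB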
At the turn of the decade CDs louder than this common reference level began to surface, and CDs steadily became more likely to have peaks pushed past the digital limit so long as the amplification did not clip more than approximately 2-4 digital samples, resulting in recordings where the peaks on an average rock or beat-heavy pop CD hovered close to full scale (usually within about 3 dB) but only occasionally reached it. Guns N' Roses's 1987 album Appetite for Destruction is an early example of this, with RMS levels averaging -15 dB across its tracks. In the early '90s, however, some mastering engineers took this a step further and treated a CD's levels exactly as they would those of an analog tape, equating digital full scale with the analog saturation point, so that the recording was just loud enough for each (or almost every) beat to peak at or over full scale. Though there were some early cases (such as Metallica's self-titled Black Album in 1991), albums mastered in this fashion generally did not appear until 1992; Alice in Chains's Dirt, Rage Against the Machine's self-titled debut and Faith No More's Angel Dust are examples from that year. This period (1988-1992) was an extremely erratic time for CD mastering: the loudness of CDs varied enormously depending on the philosophies of the engineer (and others involved in the mastering process) as CD mastering became more lenient than it had been in the early stages of the medium's existence. 1994 was arguably the year in which this style of "hot" mastering became commonplace, though exceptions, such as Soundgarden's Superunknown from the same year, still existed. The most common loudness for a rock CD in terms of RMS power was around -12 dB, though depending on how an album was mixed it could be higher or lower (as was the case with the Type O Negative album Bloody Kisses, a consistently-peaking melodic metal album released in 1993 averaging -14 dB, as well as the aforementioned 1991 Metallica release). Overall, most rock and pop CDs released in the '90s followed this method to some extent.
However, once CDs were being taken to this level, the idea of making CDs "hotter" still took hold within the industry, since CDs had become noticeably louder than they were in the previous decade. Engineers, musicians and labels developed their own ideas of how much the peaks could be compromised, and some became fascinated with making one CD louder than another. During the late '90s, the restraint of simply stopping at a general transparency point was steadily abandoned. While the process was for the most part gradual, some pushed the format to the limit as soon as the means arose: Oasis's widely popular album (What's the Story) Morning Glory? hit -8 dB on many of its tracks, something almost completely unheard of in the year it was released (1995), and Iggy Pop, who in 1997 assisted in the remix and remaster of the 1973 album Raw Power by his former band The Stooges, produced what is arguably the loudest rock CD ever recorded, hitting -4 dB in places, a level still barely touched by today's standards. Eventually the standards of loudness would reach their limit in the '00s. While some may debate the severity of the loudness war during its beginnings in the 1990s, there is little doubt among the vast majority of audio enthusiasts that nearly all rock and pop CDs released on large corporate labels (and not just RIAA-member ones; the independent metal label Century Media has been cited as a major offender, for example) in the 21st century are simply unacceptable sound-wise, with -10 dB being the standard for the past several years, very often pushed to -9 dB and on some occasions a dB or two louder. Exceptions to today's hot standards are practically non-existent at this point, and the chances of the situation reversing are slim, because knowledge of and concern over the loudness war remain mostly confined to audio enthusiasts and professionals in the audio field, who make up an extremely small portion of music buyers.