Burst error

Article snapshot taken from Wikipedia, licensed under the Creative Commons Attribution-ShareAlike license.

Contiguous sequence of errors occurring in a communications channel

In telecommunications, a burst error or error burst is a contiguous sequence of symbols, received over a communication channel, such that the first and last symbols are in error and there exists no contiguous subsequence of m correctly received symbols within the error burst. The integer parameter m is referred to as the guard band of the error burst. The last symbol in a burst and the first symbol in the following burst are accordingly separated by m correct symbols or more. The parameter m should be specified when describing an error burst.
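
To make the definition concrete, the following minimal Python sketch (not from the source; the function name find_bursts and the 0/1 error-flag representation are illustrative assumptions) splits a received error pattern into bursts separated by at least m correctly received symbols.

    def find_bursts(errors, m):
        """Split an error pattern into bursts separated by >= m correct symbols.

        errors -- sequence of 0/1 flags, where 1 marks a symbol received in error
        m      -- guard band: the minimum run of correct symbols between bursts
        Returns a list of (start, end) index pairs.  Each burst begins and ends
        on an erroneous symbol, and no run of m or more correct symbols occurs
        inside a burst.
        """
        bursts = []
        start = None        # index of the first error in the current burst
        last_error = None   # index of the most recent error seen
        for i, flag in enumerate(errors):
            if flag:
                if start is not None and i - last_error > m:
                    # The gap of correct symbols reached m, so the previous
                    # burst is closed and a new one begins here.
                    bursts.append((start, last_error))
                    start = i
                elif start is None:
                    start = i
                last_error = i
        if start is not None:
            bursts.append((start, last_error))
        return bursts

For example, find_bursts([0, 1, 1, 0, 1, 0, 0, 0, 1, 1], m=2) returns [(1, 4), (8, 9)]: the run of three correct symbols between positions 4 and 8 is at least m long, so the errors on either side of it belong to separate bursts.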

Channel model

The Gilbert–Elliott model is a simple channel model introduced by Edgar Gilbert and E. O. Elliott that is widely used for describing burst error patterns in transmission channels and enables simulation of the digital error performance of communications links. It is based on a Markov chain with two states, G (for good or gap) and B (for bad or burst). In state G the probability of transmitting a bit correctly is k, and in state B it is h; usually it is assumed that k = 1. Gilbert provided equations for deriving the other three parameters (the G-to-B and B-to-G transition probabilities and h) from a given success/failure sequence. In his example, the sequence was too short to estimate h reliably (the equations yielded a negative probability), so Gilbert assumed h = 0.5.
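
The model is straightforward to simulate. The sketch below is an illustration of the two-state Markov chain described above rather than an implementation from the cited papers; the function name gilbert_elliott and the parameter names p_gb and p_bg (the G-to-B and B-to-G transition probabilities) are assumptions made here for readability.

    import random

    def gilbert_elliott(n, p_gb, p_bg, k=1.0, h=0.5, seed=None):
        """Simulate n bit transmissions over a Gilbert-Elliott channel.

        p_gb -- probability of moving from the good state G to the bad state B
        p_bg -- probability of moving from the bad state B back to G
        k    -- probability that a bit is received correctly in state G
        h    -- probability that a bit is received correctly in state B
        Returns a list of 0/1 flags, where 1 marks a bit received in error.
        """
        rng = random.Random(seed)
        in_good_state = True                 # start the chain in state G
        errors = []
        for _ in range(n):
            p_correct = k if in_good_state else h
            errors.append(0 if rng.random() < p_correct else 1)
            # Markov transition for the next bit.
            if in_good_state:
                in_good_state = rng.random() >= p_gb
            else:
                in_good_state = rng.random() < p_bg
        return errors

Because the chain typically remains in each state for many bits in a row, errors generated in state B cluster into bursts separated by long error-free gaps produced in state G; with k = 1, as is usually assumed, every error occurs in state B.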

References

  1. Federal Standard 1037C
  2. Gilbert, E. N. (1960), "Capacity of a burst-noise channel", Bell System Technical Journal, 39 (5): 1253–1265, doi:10.1002/j.1538-7305.1960.tb03959.x.
  3. Elliott, E. O. (1963), "Estimates of error rates for codes on burst-noise channels", Bell System Technical Journal, 42 (5): 1977–1997, doi:10.1002/j.1538-7305.1963.tb00955.x.
  4. Lemmon, J. J. (2002), "Wireless link statistical bit error model", NTIA Report 02-394, US National Telecommunications and Information Administration.
