
q-Gaussian distribution

Article snapshot taken from Wikipedia, available under the Creative Commons Attribution-ShareAlike license.
This article is about the Tsallis q-Gaussian. For a different q-analog, see Gaussian q-distribution.
q-Gaussian
[Figure: probability density plots of q-Gaussian distributions]
Parameters: q < 3 (shape, real); β > 0 (real)
Support: x ∈ (−∞, +∞) for 1 ≤ q < 3; x ∈ [−1/√(β(1−q)), +1/√(β(1−q))] for q < 1
PDF: (√β / C_q) e_q(−β x²)
CDF: see text
Mean: 0 for q < 2, otherwise undefined
Median: 0
Mode: 0
Variance: 1/(β(5 − 3q)) for q < 5/3; ∞ for 5/3 ≤ q < 2; undefined for 2 ≤ q < 3
Skewness: 0 for q < 3/2
Excess kurtosis: 6(q − 1)/(7 − 5q) for q < 7/5

The q-Gaussian is a probability distribution arising from the maximization of the Tsallis entropy under appropriate constraints. It is one example of a Tsallis distribution. The q-Gaussian is a generalization of the Gaussian in the same way that Tsallis entropy is a generalization of standard Boltzmann–Gibbs entropy or Shannon entropy. The normal distribution is recovered as q → 1.

The q-Gaussian has been applied to problems in the fields of statistical mechanics, geology, anatomy, astronomy, economics, finance, and machine learning. The distribution is often favored over the Gaussian for its heavy tails when 1 < q < 3. For q < 1 the q-Gaussian is the PDF of a bounded random variable, which makes it more suitable than the Gaussian for modeling the effect of external stochasticity in biology and other domains. A generalized q-analog of the classical central limit theorem was proposed in 2008, in which the independence constraint on the i.i.d. variables is relaxed to an extent defined by the q parameter, with independence recovered as q → 1. However, a proof of such a theorem is still lacking.

In the heavy-tail region 1 < q < 3, the distribution is equivalent to the Student's t-distribution, with a direct mapping between q and the degrees of freedom. A practitioner using one of these distributions can therefore parameterize the same distribution in two different ways. The q-Gaussian form may be chosen when the system is non-extensive, or when there is no natural connection to small sample sizes.

Characterization

Probability density function

The standard q-Gaussian has the probability density function

f(x) = \frac{\sqrt{\beta}}{C_q}\, e_q(-\beta x^2)

where

e_q(x) = \left[1 + (1-q)x\right]_+^{\frac{1}{1-q}}

is the q-exponential and the normalization factor C_q is given by

C_q = \frac{2\sqrt{\pi}\,\Gamma\!\left(\frac{1}{1-q}\right)}{(3-q)\sqrt{1-q}\,\Gamma\!\left(\frac{3-q}{2(1-q)}\right)} \quad \text{for } -\infty < q < 1,
C_q = \sqrt{\pi} \quad \text{for } q = 1,
C_q = \frac{\sqrt{\pi}\,\Gamma\!\left(\frac{3-q}{2(q-1)}\right)}{\sqrt{q-1}\,\Gamma\!\left(\frac{1}{q-1}\right)} \quad \text{for } 1 < q < 3.

Note that for q < 1 the q-Gaussian distribution is the PDF of a bounded random variable.
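A minimal sketch in Python (not part of the article; the function names q_exponential, normalization_Cq and q_gaussian_pdf are our own) of the density defined above, using SciPy's gamma function for the normalization constant C_q:

import numpy as np
from scipy.special import gamma


def q_exponential(x, q):
    # e_q(x) = [1 + (1 - q) x]_+^{1/(1-q)}, reducing to exp(x) as q -> 1
    if np.isclose(q, 1.0):
        return np.exp(x)
    base = np.maximum(1.0 + (1.0 - q) * np.asarray(x, dtype=float), 0.0)
    return base ** (1.0 / (1.0 - q))


def normalization_Cq(q):
    # the three cases for C_q given above (q = 1, q < 1, 1 < q < 3)
    if np.isclose(q, 1.0):
        return np.sqrt(np.pi)
    if q < 1:
        return (2.0 * np.sqrt(np.pi) * gamma(1.0 / (1.0 - q))
                / ((3.0 - q) * np.sqrt(1.0 - q) * gamma((3.0 - q) / (2.0 * (1.0 - q)))))
    return (np.sqrt(np.pi) * gamma((3.0 - q) / (2.0 * (q - 1.0)))
            / (np.sqrt(q - 1.0) * gamma(1.0 / (q - 1.0))))


def q_gaussian_pdf(x, q, beta):
    # f(x) = sqrt(beta)/C_q * e_q(-beta x^2), valid for q < 3
    return np.sqrt(beta) / normalization_Cq(q) * q_exponential(-beta * np.asarray(x, dtype=float) ** 2, q)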

Cumulative distribution function

For 1 < q < 3, the cumulative distribution function is

F(x) = \frac{1}{2} + \frac{\sqrt{q-1}\,\Gamma\!\left(\frac{1}{q-1}\right) x \sqrt{\beta}\; {}_2F_1\!\left(\tfrac{1}{2}, \tfrac{1}{q-1}; \tfrac{3}{2}; -(q-1)\beta x^2\right)}{\sqrt{\pi}\,\Gamma\!\left(\frac{3-q}{2(q-1)}\right)},

where {}_2F_1(a, b; c; z) is the hypergeometric function. Since the hypergeometric series converges only for |z| < 1 while x is unbounded, the Pfaff transformation can be used to map the argument into the unit disk.

For q < 1,

F(x) = \begin{cases}
0 & x < -\frac{1}{\sqrt{\beta(1-q)}}, \\
\frac{1}{2} + \frac{\sqrt{1-q}\,\Gamma\!\left(\frac{5-3q}{2(1-q)}\right) x \sqrt{\beta}\; {}_2F_1\!\left(\tfrac{1}{2}, \tfrac{1}{q-1}; \tfrac{3}{2}; -(q-1)\beta x^2\right)}{\sqrt{\pi}\,\Gamma\!\left(\frac{2-q}{1-q}\right)} & -\frac{1}{\sqrt{\beta(1-q)}} < x < \frac{1}{\sqrt{\beta(1-q)}}, \\
1 & x > \frac{1}{\sqrt{\beta(1-q)}}.
\end{cases}
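For reference, a hedged sketch (ours, not the article's) of the closed form for 1 < q < 3, using SciPy's Gauss hypergeometric function, which handles the non-positive argument that arises here:

import numpy as np
from scipy.special import gamma, hyp2f1


def q_gaussian_cdf(x, q, beta):
    # CDF of the standard q-Gaussian for 1 < q < 3 (heavy-tailed case)
    z = -(q - 1.0) * beta * x ** 2          # 2F1 argument, always <= 0
    numerator = (np.sqrt(q - 1.0) * gamma(1.0 / (q - 1.0)) * x * np.sqrt(beta)
                 * hyp2f1(0.5, 1.0 / (q - 1.0), 1.5, z))
    denominator = np.sqrt(np.pi) * gamma((3.0 - q) / (2.0 * (q - 1.0)))
    return 0.5 + numerator / denominator


# for q = 2 and beta = 1 the q-Gaussian reduces to the standard Cauchy, so F(1) should be 0.75
print(q_gaussian_cdf(1.0, 2.0, 1.0))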

Entropy

Just as the normal distribution is the maximum information entropy distribution for fixed values of the first moment E(X) and second moment E(X^2) (with the fixed zeroth moment E(X^0) = 1 corresponding to the normalization condition), the q-Gaussian distribution is the maximum Tsallis entropy distribution for fixed values of these three moments.
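As a quick numerical illustration (our own, not from the article), the constrained moments can be checked by quadrature; for 1 < q < 5/3 the density should integrate to 1, have zero mean, and have variance 1/(β(5 − 3q)) as stated in the summary table above:

import numpy as np
from scipy.special import gamma
from scipy.integrate import quad

q, beta = 1.3, 2.0
C_q = np.sqrt(np.pi) * gamma((3 - q) / (2 * (q - 1))) / (np.sqrt(q - 1) * gamma(1 / (q - 1)))
pdf = lambda x: np.sqrt(beta) / C_q * (1 + (q - 1) * beta * x ** 2) ** (-1 / (q - 1))
norm = quad(pdf, -np.inf, np.inf)[0]
mean = quad(lambda x: x * pdf(x), -np.inf, np.inf)[0]
var = quad(lambda x: x ** 2 * pdf(x), -np.inf, np.inf)[0]
print(norm, mean, var, 1 / (beta * (5 - 3 * q)))   # ~1, ~0, and the two variance values agree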

Related distributions

Student's t-distribution

While it can be justified by an interesting alternative form of entropy, statistically it is a scaled reparametrization of the Student's t-distribution introduced by W. Gosset in 1908 to describe small-sample statistics. In Gosset's original presentation the degrees of freedom parameter ν was constrained to be a positive integer related to the sample size, but it is readily observed that Gosset's density function is valid for all real values of ν. The scaled reparametrization introduces the alternative parameters q and β which are related to ν.

Given a Student's t-distribution with ν degrees of freedom, the equivalent q-Gaussian has

q = \frac{\nu + 3}{\nu + 1} \quad \text{with} \quad \beta = \frac{1}{3-q},

with inverse

\nu = \frac{3-q}{q-1}, \quad \text{but only if } \beta = \frac{1}{3-q}.

Whenever β ≠ 1/(3 − q), the function is simply a scaled version of the Student's t-distribution.
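A quick consistency check in Python (our own sketch, not from the article) that the mapping above reproduces SciPy's standard Student's t density when β = 1/(3 − q):

import numpy as np
from scipy.special import gamma
from scipy.stats import t

nu = 4.0                                   # degrees of freedom
q = (nu + 3) / (nu + 1)                    # q = (nu + 3)/(nu + 1)
beta = 1 / (3 - q)                         # beta = 1/(3 - q)
C_q = np.sqrt(np.pi) * gamma((3 - q) / (2 * (q - 1))) / (np.sqrt(q - 1) * gamma(1 / (q - 1)))
xs = np.linspace(-5, 5, 101)
q_pdf = np.sqrt(beta) / C_q * (1 + (q - 1) * beta * xs ** 2) ** (-1 / (q - 1))
print(np.max(np.abs(q_pdf - t.pdf(xs, df=nu))))   # ~1e-16: the two densities coincide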

It is sometimes argued that the distribution is a generalization of the Student's t-distribution to negative and/or non-integer degrees of freedom. However, the theory of the Student's t-distribution extends trivially to all real degrees of freedom; for ν < 0 the support of the distribution becomes compact rather than infinite.

Three-parameter version

As with many distributions centered on zero, the q-Gaussian can be trivially extended to include a location parameter μ. The density is then defined by

\frac{\sqrt{\beta}}{C_q}\, e_q\!\left(-\beta (x-\mu)^2\right).

Generating random deviates

The Box–Muller transform has been generalized to allow random sampling from q-Gaussians. The standard Box–Muller technique generates pairs of independent normally distributed variables from equations of the following form.

Z_1 = \sqrt{-2\ln(U_1)}\,\cos(2\pi U_2)
Z_2 = \sqrt{-2\ln(U_1)}\,\sin(2\pi U_2)

The generalized Box–Muller technique generates pairs of q-Gaussian deviates that are not independent. In practice, only a single deviate will be generated from a pair of uniformly distributed variables. The following formula generates deviates from a q-Gaussian with specified parameter q and β = 1/(3 − q):

Z = \sqrt{-2\ln_{q'}(U_1)}\,\cos(2\pi U_2)

where \ln_q denotes the q-logarithm and q' = \frac{1+q}{3-q}.

These deviates can be transformed to generate deviates from an arbitrary q-Gaussian by

Z' = \mu + \frac{Z}{\sqrt{\beta(3-q)}}
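A sketch of the recipe above in Python (our own; the function and argument names are illustrative). It natively produces deviates with β = 1/(3 − q) and then applies the shift-and-rescale step for arbitrary μ and β:

import numpy as np


def q_log(u, q):
    # q-logarithm: ln_q(u) = (u^(1-q) - 1)/(1 - q), reducing to ln(u) as q -> 1
    if np.isclose(q, 1.0):
        return np.log(u)
    return (u ** (1.0 - q) - 1.0) / (1.0 - q)


def q_gaussian_deviates(n, q, beta=None, mu=0.0, rng=None):
    # generalized Box-Muller: one q-Gaussian deviate per pair of uniforms
    rng = np.random.default_rng() if rng is None else rng
    if beta is None:
        beta = 1.0 / (3.0 - q)                      # the value produced natively
    q_prime = (1.0 + q) / (3.0 - q)
    u1 = rng.uniform(size=n)
    u2 = rng.uniform(size=n)
    z = np.sqrt(-2.0 * q_log(u1, q_prime)) * np.cos(2.0 * np.pi * u2)
    return mu + z / np.sqrt(beta * (3.0 - q))       # Z' = mu + Z / sqrt(beta (3 - q))


samples = q_gaussian_deviates(100_000, q=1.5, rng=np.random.default_rng(0))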

Applications

Physics

It has been shown that the momentum distribution of cold atoms in dissipative optical lattices is a q-Gaussian.

The q-Gaussian distribution is also obtained as the asymptotic probability density function of the position of a mass undergoing one-dimensional motion subject to two forces: a deterministic force of the type F_1(x) = -2x/(1 - x^2) (determining an infinite potential well) and a stochastic white-noise force F_2(t) = \sqrt{2(1-q)}\,\xi(t), where \xi(t) is a white noise. Note that in the overdamped/small-mass approximation the above-mentioned convergence fails for q < 0, as recently shown.
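As a rough illustration (our own sketch, not taken from the cited papers; the integration scheme, step size, and clipping guard are our own choices), the dynamics just described can be integrated with an Euler–Maruyama scheme for a value q < 1; the long-time histogram of positions should resemble a bounded q-Gaussian on (−1, 1):

import numpy as np

rng = np.random.default_rng(0)
q, dt, n_steps, n_paths = 0.5, 1e-4, 20_000, 2_000
x = np.zeros(n_paths)                                # all paths start at the origin
noise_amp = np.sqrt(2.0 * (1.0 - q) * dt)            # strength of the white-noise force over one step
for _ in range(n_steps):
    drift = -2.0 * x / (1.0 - x ** 2)                # deterministic force F_1(x)
    x = x + drift * dt + noise_amp * rng.standard_normal(n_paths)
    x = np.clip(x, -0.999, 0.999)                    # crude guard against leaving the well
hist, edges = np.histogram(x, bins=50, range=(-1.0, 1.0), density=True)
print(hist[:3], hist[23:27], hist[-3:])              # small near the edges, peaked near the origin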

Finance

Financial return distributions in the New York Stock Exchange, NASDAQ and elsewhere have been interpreted as q-Gaussians.


Notes

  1. Tsallis, C. Nonadditive entropy and nonextensive statistical mechanics-an overview after 20 years. Braz. J. Phys. 2009, 39, 337–356
  2. d'Onofrio A. (ed.) Bounded Noises in Physics, Biology, and Engineering. Birkhauser (2013)
  3. Umarov, Sabir; Tsallis, Constantino; Steinberg, Stanly (2008). "On a q-Central Limit Theorem Consistent with Nonextensive Statistical Mechanics" (PDF). Milan J. Math. 76. Birkhauser Verlag: 307–328. doi:10.1007/s00032-008-0087-y. S2CID 55967725. Retrieved 2011-07-27.
  4. Hilhorst, H.J. (2010), "Note on a q-modified central limit theorem", Journal of Statistical Mechanics: Theory and Experiment, 2010 (10): 10023, arXiv:1008.4259, Bibcode:2010JSMTE..10..023H, doi:10.1088/1742-5468/2010/10/P10023, S2CID 119316670.
  5. Wolfram Language documentation: TsallisQGaussianDistribution. https://reference.wolframcloud.com/language/ref/TsallisQGaussianDistribution.html
  6. W. Thistleton, J.A. Marsh, K. Nelson and C. Tsallis, Generalized Box–Muller method for generating q-Gaussian random deviates, IEEE Transactions on Information Theory 53, 4805 (2007)
  7. Douglas, P.; Bergamini, S.; Renzoni, F. (2006). "Tunable Tsallis Distributions in Dissipative Optical Lattices" (PDF). Physical Review Letters. 96 (11): 110601. Bibcode:2006PhRvL..96k0601D. doi:10.1103/PhysRevLett.96.110601. PMID 16605807.
  8. Domingo, Dario; d’Onofrio, Alberto; Flandoli, Franco (2017). "Boundedness vs unboundedness of a noise linked to Tsallis q-statistics: The role of the overdamped approximation". Journal of Mathematical Physics. 58 (3). AIP Publishing: 033301. arXiv:1709.08260. Bibcode:2017JMP....58c3301D. doi:10.1063/1.4977081. ISSN 0022-2488. S2CID 84178785.
  9. Borland, Lisa (2002-08-07). "Option Pricing Formulas Based on a Non-Gaussian Stock Price Model". Physical Review Letters. 89 (9). American Physical Society (APS): 098701. arXiv:cond-mat/0204331. Bibcode:2002PhRvL..89i8701B. doi:10.1103/physrevlett.89.098701. ISSN 0031-9007. PMID 12190447. S2CID 5740827.
  10. L. Borland, The pricing of stock options, in Nonextensive Entropy – Interdisciplinary Applications, eds. M. Gell-Mann and C. Tsallis (Oxford University Press, New York, 2004)
