
Inverse distribution

Not to be confused with the inverse distribution function.

In probability theory and statistics, an inverse distribution is the distribution of the reciprocal of a random variable. Inverse distributions arise in particular in the Bayesian context of prior distributions and posterior distributions for scale parameters. In the algebra of random variables, inverse distributions are special cases of the class of ratio distributions, in which the numerator random variable has a degenerate distribution.

Relation to original distribution

In general, given the probability distribution of a random variable X with strictly positive support, it is possible to find the distribution of the reciprocal, Y = 1 / X. If the distribution of X is continuous with density function f(x) and cumulative distribution function F(x), then the cumulative distribution function, G(y), of the reciprocal is found by noting that

$$G(y) = \Pr(Y \leq y) = \Pr\left(X \geq \frac{1}{y}\right) = 1 - \Pr\left(X < \frac{1}{y}\right) = 1 - F\left(\frac{1}{y}\right).$$

Then the density function of Y is found as the derivative of the cumulative distribution function:

$$g(y) = \frac{1}{y^{2}}\, f\left(\frac{1}{y}\right).$$
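
As a quick numerical illustration of this change-of-variables formula, the following minimal sketch (assuming NumPy and SciPy are available) checks it against a distribution that is closed under reciprocals: if $X$ is lognormal$(0,1)$, then $1/X$ is again lognormal$(0,1)$, so the transformed density must coincide with the original one.

```python
import numpy as np
from scipy import stats

# If X ~ LogNormal(0, 1), then Y = 1/X ~ LogNormal(0, 1) as well, so the
# transformed density g(y) = f(1/y) / y^2 must equal f itself.
f = stats.lognorm(s=1.0).pdf           # density of X
g = lambda y: f(1.0 / y) / y**2        # density of Y = 1/X via the formula

ys = np.linspace(0.1, 5.0, 50)
assert np.allclose(g(ys), f(ys))       # the two densities coincide
```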

Examples

Reciprocal distribution

The reciprocal distribution has a density function of the form

$$f(x) \propto x^{-1} \quad \text{for } 0 < a < x < b,$$

where $\propto$ means "is proportional to". It follows that the inverse distribution in this case is of the form

$$g(y) \propto y^{-1} \quad \text{for } 0 \leq b^{-1} < y < a^{-1},$$

which is again a reciprocal distribution.

Inverse uniform distribution

Parameters: $0 < a < b$, with $a, b \in \mathbb{R}$
Support: $[b^{-1}, a^{-1}]$
PDF: $\dfrac{y^{-2}}{b-a}$
CDF: $\dfrac{b - y^{-1}}{b-a}$
Mean: $\dfrac{\ln(b) - \ln(a)}{b-a}$
Median: $\dfrac{2}{a+b}$
Variance: $\dfrac{1}{ab} - \left(\dfrac{\ln(b) - \ln(a)}{b-a}\right)^{2}$

If the original random variable $X$ is uniformly distributed on the interval $(a, b)$, where $a > 0$, then the reciprocal variable $Y = 1/X$ has the reciprocal distribution, which takes values in the range $(b^{-1}, a^{-1})$, and the probability density function in this range is

$$g(y) = \frac{y^{-2}}{b-a},$$

and is zero elsewhere.

The cumulative distribution function of the reciprocal, within the same range, is

$$G(y) = \frac{b - y^{-1}}{b-a}.$$

For example, if $X$ is uniformly distributed on the interval $(0, 1)$, then $Y = 1/X$ has density $g(y) = y^{-2}$ and cumulative distribution function $G(y) = 1 - y^{-1}$ when $y > 1$.
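
The tabulated mean above can be checked by simulation; this minimal sketch (with arbitrarily chosen $a$ and $b$) compares a Monte Carlo estimate against $(\ln b - \ln a)/(b-a)$.

```python
import numpy as np

# Monte Carlo check of the inverse-uniform mean (ln b - ln a) / (b - a);
# a and b are arbitrary illustrative values with 0 < a < b.
a, b = 2.0, 5.0
rng = np.random.default_rng(42)
y = 1.0 / rng.uniform(a, b, size=2_000_000)

print(y.mean())                           # ≈ 0.3054
print((np.log(b) - np.log(a)) / (b - a))  # (ln 5 - ln 2) / 3 ≈ 0.3054
```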

Inverse t distribution

Let X be a t distributed random variate with k degrees of freedom. Then its density function is

$$f(x) = \frac{1}{\sqrt{k\pi}}\, \frac{\Gamma\left(\frac{k+1}{2}\right)}{\Gamma\left(\frac{k}{2}\right)}\, \frac{1}{\left(1 + \frac{x^{2}}{k}\right)^{\frac{1+k}{2}}}.$$

The density of Y = 1 / X is

$$g(y) = \frac{1}{\sqrt{k\pi}}\, \frac{\Gamma\left(\frac{k+1}{2}\right)}{\Gamma\left(\frac{k}{2}\right)}\, \frac{1}{y^{2}\left(1 + \frac{1}{y^{2}k}\right)^{\frac{1+k}{2}}}.$$

With $k = 1$, the distributions of $X$ and $1/X$ are identical ($X$ is then standard Cauchy distributed). If $k > 1$, the distribution of $1/X$ is bimodal.
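
The $k = 1$ case can be verified numerically; the sketch below (assuming SciPy) confirms that the transformed density reproduces the original Student $t$ density with one degree of freedom.

```python
import numpy as np
from scipy import stats

# For k = 1 the t distribution is standard Cauchy, and 1/X has the same
# law, so g(y) = f(1/y) / y^2 should reproduce f itself.
f = stats.t(df=1).pdf
g = lambda y: f(1.0 / y) / y**2

ys = np.linspace(0.2, 4.0, 40)
assert np.allclose(g(ys), f(ys))  # identical densities when k = 1
```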

Reciprocal normal distribution

See also: Propagation of uncertainty § Reciprocal and shifted reciprocal

If a variable $X$ follows a normal distribution $\mathcal{N}(\mu, \sigma^{2})$, then the inverse or reciprocal $Y = \frac{1}{X}$ follows a reciprocal normal distribution:

$$f(y) = \frac{1}{\sqrt{2\pi}\,\sigma y^{2}}\, e^{-\frac{1}{2}\left(\frac{1/y - \mu}{\sigma}\right)^{2}}.$$

[Figure: graph of the density of the inverse of the standard normal distribution]

If $X$ follows a standard normal distribution $\mathcal{N}(0, 1)$, then $Y = 1/X$ follows a reciprocal standard normal distribution, which is heavy-tailed and bimodal, with modes at $\pm\tfrac{1}{\sqrt{2}}$ and density

$$f(y) = \frac{e^{-\frac{1}{2y^{2}}}}{\sqrt{2\pi}\,y^{2}},$$

and its first and higher-order moments do not exist. For such inverse distributions and for ratio distributions, probabilities for intervals can still be defined; these can be computed either by Monte Carlo simulation or, in some cases, by using the Geary–Hinkley transformation.
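
For instance, an interval probability for $Y = 1/X$ with standard normal $X$ can be obtained both by Monte Carlo simulation and exactly, since $\Pr(c < 1/X < d) = \Pr(1/d < X < 1/c)$ for $0 < c < d$; the sketch below (with arbitrary endpoints) compares the two.

```python
import numpy as np
from scipy import stats

# Monte Carlo estimate of an interval probability for Y = 1/X with
# X ~ N(0, 1); the moments of Y do not exist, but P(c < Y < d) does.
rng = np.random.default_rng(1)
y = 1.0 / rng.standard_normal(5_000_000)

c, d = 0.5, 2.0
mc = np.mean((y > c) & (y < d))
# Exact: P(c < 1/X < d) = P(1/d < X < 1/c) for 0 < c < d.
exact = stats.norm.cdf(1.0 / c) - stats.norm.cdf(1.0 / d)
print(mc, exact)  # both ≈ Φ(2) − Φ(0.5) ≈ 0.2858
```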

However, in the more general case of a shifted reciprocal function $1/(p - B)$, with $B$ following a general normal distribution $\mathcal{N}(\mu, \sigma^{2})$, mean and variance statistics do exist in a principal value sense if the difference between the pole $p$ and the mean $\mu$ is real-valued. The mean of this transformed random variable (the reciprocal shifted normal distribution) is then a scaled Dawson function:

$$\frac{\sqrt{2}}{\sigma}\, F\left(\frac{p - \mu}{\sqrt{2}\,\sigma}\right).$$

In contrast, if the shift $p - \mu$ is purely complex, the mean exists and is a scaled Faddeeva function, whose exact expression depends on the sign of the imaginary part $\operatorname{Im}(p - \mu)$. In both cases, the variance is a simple function of the mean. Therefore, the variance has to be considered in a principal value sense if $p - \mu$ is real, while it exists if the imaginary part of $p - \mu$ is non-zero. Note that these means and variances are exact, as they do not rely on a linearisation of the ratio. The exact covariance of two ratios with a pair of different poles $p_{1}$ and $p_{2}$ is similarly available. The case of the inverse of a complex normal variable $B$, shifted or not, exhibits different characteristics.
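
The principal value mean can be checked numerically; this sketch (with arbitrary $p$, $\mu$, $\sigma$) evaluates the principal value integral with SciPy's Cauchy-weighted quadrature and compares it against the scaled Dawson function expression.

```python
import numpy as np
from scipy import integrate, special, stats

# Principal-value mean of 1/(p - B) for B ~ N(mu, sigma^2) with a real
# pole shift p - mu, compared with the scaled Dawson-function formula.
# p, mu, sigma are arbitrary illustrative values.
p, mu, sigma = 1.0, 0.3, 0.8

# quad with weight='cauchy' computes the PV of pdf(x) / (x - p);
# negate it because 1/(p - x) = -1/(x - p).
pdf = stats.norm(mu, sigma).pdf
pv, _ = integrate.quad(pdf, mu - 12 * sigma, mu + 12 * sigma,
                       weight='cauchy', wvar=p)
print(-pv)                                                       # PV mean
print(np.sqrt(2) / sigma * special.dawsn((p - mu) / (np.sqrt(2) * sigma)))
```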

Inverse exponential distribution

If $X$ is an exponentially distributed random variable with rate parameter $\lambda$, then $Y = 1/X$ has the following cumulative distribution function: $F_{Y}(y) = e^{-\lambda/y}$ for $y > 0$. Note that the expected value of this random variable does not exist. The reciprocal exponential distribution finds use in the analysis of fading wireless communication systems.
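
This CDF follows from $\Pr(Y \leq y) = \Pr(X \geq 1/y) = e^{-\lambda/y}$, which the following minimal sketch (with an arbitrary rate) confirms by simulation.

```python
import numpy as np

# Empirical check of F_Y(y) = exp(-lambda / y) for Y = 1/X with
# X ~ Exponential(rate=lam); lam is chosen arbitrarily.
lam = 1.5
rng = np.random.default_rng(7)
y = 1.0 / rng.exponential(scale=1.0 / lam, size=1_000_000)

t = 2.0
print(np.mean(y <= t))   # empirical CDF at t
print(np.exp(-lam / t))  # e^{-1.5/2} ≈ 0.4724
```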

Inverse Cauchy distribution

If $X$ is a Cauchy distributed $(\mu, \sigma)$ random variable, then $1/X$ is a Cauchy $(\mu/C, \sigma/C)$ random variable, where $C = \mu^{2} + \sigma^{2}$.
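
A simulation sketch (with arbitrary parameters, assuming SciPy) can check this closure property with a Kolmogorov–Smirnov test:

```python
import numpy as np
from scipy import stats

# Check that 1/X for X ~ Cauchy(mu, sigma) follows
# Cauchy(mu/C, sigma/C) with C = mu^2 + sigma^2.
mu, sigma = 2.0, 1.0
C = mu**2 + sigma**2
rng = np.random.default_rng(3)
y = 1.0 / stats.cauchy(loc=mu, scale=sigma).rvs(size=200_000, random_state=rng)

ks = stats.kstest(y, stats.cauchy(loc=mu / C, scale=sigma / C).cdf)
print(ks.pvalue)  # typically large: no evidence against the stated law
```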

Inverse F distribution

If $X$ is an $F(\nu_{1}, \nu_{2})$ distributed random variable, then $1/X$ is an $F(\nu_{2}, \nu_{1})$ random variable.
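
This exchange of the degrees of freedom can likewise be checked by simulation (arbitrary $\nu_{1}$, $\nu_{2}$, assuming SciPy):

```python
import numpy as np
from scipy import stats

# The reciprocal of an F(nu1, nu2) variate should be F(nu2, nu1).
nu1, nu2 = 5, 8
rng = np.random.default_rng(9)
x = stats.f(nu1, nu2).rvs(size=200_000, random_state=rng)

ks = stats.kstest(1.0 / x, stats.f(nu2, nu1).cdf)
print(ks.pvalue)  # typically large: consistent with F(nu2, nu1)
```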

Reciprocal of binomial distribution

If $X$ is distributed according to a binomial distribution with $n$ trials and probability of success $p$, then no closed form for the reciprocal distribution is known. However, the mean of the shifted reciprocal $1/(1+X)$ can be calculated exactly:

$$E\left[\frac{1}{1+X}\right] = \frac{1}{p(n+1)}\left(1 - (1-p)^{n+1}\right).$$
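
This closed form can be verified by summing directly over the binomial support; a minimal sketch (assuming SciPy, with arbitrary $n$ and $p$):

```python
import numpy as np
from scipy import stats

# Direct check of E[1/(1+X)] for X ~ Binomial(n, p) against the
# closed form (1 - (1-p)^(n+1)) / (p (n+1)); n and p are arbitrary.
n, p = 20, 0.3
k = np.arange(n + 1)
direct = np.sum(stats.binom(n, p).pmf(k) / (1 + k))
closed = (1 - (1 - p) ** (n + 1)) / (p * (n + 1))
print(direct, closed)  # both ≈ 0.1586
```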


An asymptotic approximation for the non-central moments of the reciprocal distribution is known.

$$E\left[(1+X)^{-a}\right] = O\left((np)^{-a}\right) + o\left(n^{-a}\right),$$

where $O(\cdot)$ and $o(\cdot)$ are the big and little $o$ order functions and $a$ is a real number.

Reciprocal of triangular distribution

For a triangular distribution with lower limit a, upper limit b and mode c, where a < b and a ≤ c ≤ b, the mean of the reciprocal is given by

$$\mu = \frac{2\left(\frac{a\ln\left(\frac{a}{c}\right)}{a-c} + \frac{b\ln\left(\frac{c}{b}\right)}{b-c}\right)}{a-b}$$

and the variance by

$$\sigma^{2} = \frac{2\left(\frac{\ln\left(\frac{c}{a}\right)}{a-c} + \frac{\ln\left(\frac{b}{c}\right)}{b-c}\right)}{a-b} - \mu^{2}.$$

Both moments of the reciprocal are only defined when the triangle does not cross zero, i.e. when a, b, and c are either all positive or all negative.
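
The mean formula can be checked against a Monte Carlo estimate; a minimal sketch with arbitrary positive $a < c < b$:

```python
import numpy as np

# Monte Carlo check of the reciprocal-triangular mean formula;
# a, b, c are arbitrary positive values with a < c < b.
a, b, c = 1.0, 4.0, 2.0
rng = np.random.default_rng(11)
y = 1.0 / rng.triangular(a, c, b, size=2_000_000)

mu = 2 * (a * np.log(a / c) / (a - c) + b * np.log(c / b) / (b - c)) / (a - b)
print(y.mean(), mu)  # both ≈ 0.4621
```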

Other inverse distributions

Other inverse distributions include

inverse-chi-squared distribution
inverse-gamma distribution
inverse-Wishart distribution
inverse matrix gamma distribution

Applications

Inverse distributions are widely used as prior distributions in Bayesian inference for scale parameters.
