Pairwise independence

Set of random variables of which any two are independent

In probability theory, a pairwise independent collection of random variables is a set of random variables any two of which are independent. Any collection of mutually independent random variables is pairwise independent, but some pairwise independent collections are not mutually independent. Pairwise independent random variables with finite variance are uncorrelated.

A pair of random variables X and Y are independent if and only if the random vector (X, Y) with joint cumulative distribution function (CDF) F_{X,Y}(x,y) satisfies

F_{X,Y}(x,y) = F_X(x) F_Y(y),

or equivalently, their joint density f_{X,Y}(x,y) satisfies

f_{X,Y}(x,y) = f_X(x) f_Y(y).

That is, the joint distribution is equal to the product of the marginal distributions.
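For example, if X and Y are indicators of two independent fair coin flips (each equal to 1 with probability 1/2), then F_{X,Y}(0, 0) = P(X ≤ 0) P(Y ≤ 0) = (1/2)(1/2) = 1/4, and the joint mass function likewise factors at every point.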

Unless the context makes clear otherwise, in practice the modifier "mutual" is usually dropped, so that independence means mutual independence. A statement such as "X, Y, Z are independent random variables" means that X, Y, Z are mutually independent.

Example

Pairwise independence does not imply mutual independence, as shown by the following example attributed to S. Bernstein.

Suppose X and Y are two independent tosses of a fair coin, where we designate 1 for heads and 0 for tails. Let the third random variable Z be equal to 1 if exactly one of those coin tosses resulted in "heads", and 0 otherwise (that is, Z = X ⊕ Y, the exclusive or of X and Y). Then jointly the triple (X, Y, Z) has the following probability distribution:


(X, Y, Z) = \begin{cases}
(0,0,0) & \text{with probability } 1/4, \\
(0,1,1) & \text{with probability } 1/4, \\
(1,0,1) & \text{with probability } 1/4, \\
(1,1,0) & \text{with probability } 1/4.
\end{cases}

Here the marginal probability distributions are identical: f_X(0) = f_Y(0) = f_Z(0) = 1/2 and f_X(1) = f_Y(1) = f_Z(1) = 1/2. The bivariate distributions also agree: f_{X,Y} = f_{X,Z} = f_{Y,Z}, where f_{X,Y}(0,0) = f_{X,Y}(0,1) = f_{X,Y}(1,0) = f_{X,Y}(1,1) = 1/4.

Since each of the pairwise joint distributions equals the product of their respective marginal distributions, the variables are pairwise independent:

  • X and Y are independent, and
  • X and Z are independent, and
  • Y and Z are independent.

However, X, Y, and Z are not mutually independent, since f_{X,Y,Z}(x,y,z) ≠ f_X(x) f_Y(y) f_Z(z); for example, at (x, y, z) = (0, 0, 0) the left side equals 1/4 while the right side equals 1/8 = (1/2)^3. In fact, any one of {X, Y, Z} is completely determined by the other two (each is the sum, modulo 2, of the other two). That is as far from independence as random variables can get.
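
A short enumeration makes this concrete. The following Python sketch (function names are ours, for illustration) tabulates the joint distribution above, verifies that each bivariate pmf factors into its marginals, and exhibits the point at which the trivariate pmf fails to factor:

    from itertools import product

    # Bernstein's example: each of the four outcomes has probability 1/4.
    outcomes = {(0, 0, 0): 0.25, (0, 1, 1): 0.25,
                (1, 0, 1): 0.25, (1, 1, 0): 0.25}

    def marginal(i):
        """Marginal pmf of the i-th coordinate of (X, Y, Z)."""
        pmf = {0: 0.0, 1: 0.0}
        for xyz, pr in outcomes.items():
            pmf[xyz[i]] += pr
        return pmf

    def pair_pmf(i, j):
        """Joint pmf of the i-th and j-th coordinates."""
        pmf = {}
        for xyz, pr in outcomes.items():
            key = (xyz[i], xyz[j])
            pmf[key] = pmf.get(key, 0.0) + pr
        return pmf

    # Pairwise independence: every bivariate pmf factors into its marginals.
    for i, j in [(0, 1), (0, 2), (1, 2)]:
        mi, mj, pij = marginal(i), marginal(j), pair_pmf(i, j)
        assert all(abs(pij.get((a, b), 0.0) - mi[a] * mj[b]) < 1e-12
                   for a, b in product([0, 1], repeat=2))

    # Not mutually independent: the trivariate pmf does not factor.
    m = [marginal(i) for i in range(3)]
    print(outcomes[(0, 0, 0)], "vs", m[0][0] * m[1][0] * m[2][0])

Running it confirms pairwise independence and prints 0.25 vs 0.125, the 1/4 versus 1/8 discrepancy noted above.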

Probability of the union of pairwise independent events

Bounds on the probability that the sum of Bernoulli random variables is at least one, commonly known as the union bound, are provided by the Boole–Fréchet inequalities. While these bounds assume only univariate information, several bounds that use knowledge of general bivariate probabilities have also been proposed. Denote by {A_i : i ∈ {1, 2, ..., n}} a set of n Bernoulli events with probability of occurrence P(A_i) = p_i for each i. Suppose the bivariate probabilities are given by P(A_i ∩ A_j) = p_{ij} for every pair of indices (i, j). Kounias derived the following upper bound:

\mathbb{P}\left(\bigcup_i A_i\right) \le \sum_{i=1}^{n} p_i - \max_{j \in \{1,2,\ldots,n\}} \sum_{i \neq j} p_{ij},

which subtracts the maximum weight of a star spanning tree on a complete graph with n nodes (where the edge weights are given by p_{ij}) from the sum of the marginal probabilities \sum_i p_i.
Hunter and Worsley tightened this upper bound by optimizing over spanning trees τ ∈ T as follows:

\mathbb{P}\left(\bigcup_i A_i\right) \le \sum_{i=1}^{n} p_i - \max_{\tau \in T} \sum_{(i,j) \in \tau} p_{ij},

where T is the set of all spanning trees on the graph. These bounds are not the tightest possible with general bivariates p_{ij}, even when feasibility is guaranteed, as shown in Boros et al. However, when the variables are pairwise independent (p_{ij} = p_i p_j), Ramachandra and Natarajan showed that the Kounias–Hunter–Worsley bound is tight, by proving that the maximum probability of the union of events admits a closed-form expression:

\max \mathbb{P}\left(\bigcup_i A_i\right) = \min\left(\sum_{i=1}^{n} p_i - p_n \sum_{i=1}^{n-1} p_i,\ 1\right)   (1)

where the probabilities are sorted in increasing order as 0 ≤ p_1 ≤ p_2 ≤ … ≤ p_n ≤ 1. The tight bound in Eq. 1 depends only on the sum of the smallest n − 1 probabilities, \sum_{i=1}^{n-1} p_i, and the largest probability p_n. Thus, while the ordering of the probabilities plays a role in the derivation of the bound, the ordering among the smallest n − 1 probabilities {p_1, p_2, ..., p_{n−1}} is inconsequential, since only their sum is used.
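
These formulas are easy to check numerically. The sketch below is a minimal pure-Python illustration (the function names are ours, not from the references): it computes the Kounias bound by scanning all stars, the Hunter–Worsley bound with a greedy Prim-style maximum spanning tree over the edge weights p_{ij}, and the closed-form Eq. 1. Under pairwise independence (p_{ij} = p_i p_j) all three coincide, as the tightness result above promises.

    from itertools import combinations

    def kounias_bound(p, pij):
        """Kounias bound: subtract the heaviest star (all edges into one node)."""
        n = len(p)
        return sum(p) - max(
            sum(pij[frozenset({i, j})] for i in range(n) if i != j)
            for j in range(n))

    def hunter_worsley_bound(p, pij):
        """Hunter-Worsley bound: subtract the weight of a maximum spanning
        tree on the complete graph with edge weights p_ij (greedy Prim)."""
        n = len(p)
        in_tree, weight = {0}, 0.0
        while len(in_tree) < n:
            # Pick the heaviest edge joining the tree to a node outside it.
            w, k = max((pij[frozenset({i, j})], j)
                       for i in in_tree for j in range(n) if j not in in_tree)
            weight += w
            in_tree.add(k)
        return sum(p) - weight

    def pairwise_independent_union_bound(p):
        """Tight closed-form bound of Eq. 1: only the largest probability and
        the sum of the remaining n-1 probabilities matter."""
        q = sorted(p)
        return min(sum(q) - q[-1] * sum(q[:-1]), 1.0)

    # Demo under pairwise independence: p_ij = p_i * p_j.
    p = [0.1, 0.2, 0.3, 0.4]
    pij = {frozenset(e): p[e[0]] * p[e[1]]
           for e in combinations(range(len(p)), 2)}
    print(kounias_bound(p, pij))                # 0.76
    print(hunter_worsley_bound(p, pij))         # 0.76
    print(pairwise_independent_union_bound(p))  # 0.76 -- all three coincide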

Comparison with the Boole–Fréchet union bound

It is useful to compare the tightest bounds on the probability of the union under arbitrary dependence and under pairwise independence, respectively. The tightest Boole–Fréchet upper union bound (assuming only univariate information) is given as:

\max \mathbb{P}\left(\bigcup_i A_i\right) = \min\left(\sum_{i=1}^{n} p_i,\ 1\right)   (2)

As shown in Ramachandra and Natarajan, it can be easily verified that the ratio of the two tight bounds in Eq. 2 and Eq. 1 is upper bounded by 4/3, and the maximum value of 4/3 is attained when

\sum_{i=1}^{n-1} p_i = 1/2 \quad \text{and} \quad p_n = 1/2,

where the probabilities are sorted in increasing order as 0 ≤ p_1 ≤ p_2 ≤ … ≤ p_n ≤ 1. In other words, in the best-case scenario, the pairwise independence bound in Eq. 1 provides an improvement of 25% over the univariate bound in Eq. 2.
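
For instance, with n = 3 and p = (1/4, 1/4, 1/2), so that the two smallest probabilities sum to 1/2 and p_n = 1/2, reusing the illustrative helper from the sketch above:

    p = [0.25, 0.25, 0.5]
    boole_frechet = min(sum(p), 1.0)                # Eq. 2: 1.0
    pairwise = pairwise_independent_union_bound(p)  # Eq. 1: 1 - 0.5*0.5 = 0.75
    print(boole_frechet / pairwise)                 # 1.3333... = 4/3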

Generalization

More generally, we can talk about k-wise independence, for any k ≥ 2. The idea is similar: a set of random variables is k-wise independent if every subset of size k of those variables is independent. k-wise independence has been used in theoretical computer science, for example to prove a theorem about the problem MAXEkSAT.

k-wise independence is used in the proof that k-independent hash functions yield secure, unforgeable message authentication codes.
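
As a concrete illustration of such a family (a standard construction, not necessarily the one analyzed in the cited MAC result), one can evaluate a uniformly random polynomial of degree k − 1 over a prime field; any k distinct inputs then hash to independent, uniform values. A minimal sketch, with the prime and names chosen for illustration:

    import random

    def make_k_independent_hash(k, prime=2_147_483_647):
        """Draw h uniformly from the family h(x) = (a_{k-1} x^{k-1} + ...
        + a_1 x + a_0) mod prime, all coefficients uniform in [0, prime)."""
        coeffs = [random.randrange(prime) for _ in range(k)]
        def h(x):
            acc = 0
            for a in reversed(coeffs):  # Horner evaluation of the polynomial
                acc = (acc * x + a) % prime
            return acc
        return h

    h = make_k_independent_hash(k=2)  # k = 2 gives a pairwise independent family
    print(h(42), h(43))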

References

  1. Gut, A. (2005). Probability: A Graduate Course. Springer-Verlag. ISBN 0-387-27332-8. pp. 71–72.
  2. Hogg, R. V.; McKean, J. W.; Craig, A. T. (2005). Introduction to Mathematical Statistics (6th ed.). Upper Saddle River, NJ: Pearson Prentice Hall. ISBN 0-13-008507-3. Definition 2.5.1, page 109.
  3. Hogg, R. V.; McKean, J. W.; Craig, A. T. (2005). Introduction to Mathematical Statistics (6th ed.). Upper Saddle River, NJ: Pearson Prentice Hall. ISBN 0-13-008507-3. Remark 2.6.1, p. 120.
  4. Boole, G. (1854). An Investigation of the Laws of Thought, On Which Are Founded the Mathematical Theories of Logic and Probability. Walton and Maberly, London. See Boole's "major" and "minor" limits of a conjunction on page 299.
  5. Fréchet, M. (1935). "Généralisations du théorème des probabilités totales". Fundamenta Mathematicae. 25: 379–387.
  6. Kounias, E. G. (1968). "Bounds for the probability of a union, with applications". The Annals of Mathematical Statistics. 39 (6): 2154–2158. doi:10.1214/aoms/1177698049.
  7. Hunter, D. (1976). "An upper bound for the probability of a union". Journal of Applied Probability. 13 (3): 597–603. doi:10.2307/3212481. JSTOR 3212481.
  8. Worsley, K. J. (1982). "An improved Bonferroni inequality and applications". Biometrika. 69 (2): 297–302. doi:10.1093/biomet/69.2.297.
  9. Boros, Endre; Scozzari, Andrea; Tardella, Fabio; Veneziani, Pierangela (2014). "Polynomially computable bounds for the probability of the union of events". Mathematics of Operations Research. 39 (4): 1311–1329. doi:10.1287/moor.2014.0657.
  10. Ramachandra, Arjun Kodagehalli; Natarajan, Karthik (2023). "Tight Probability Bounds with Pairwise Independence". SIAM Journal on Discrete Mathematics. 37 (2): 516–555. arXiv:2006.00516. doi:10.1137/21M140829.