Variance of random sum
The Blackwell–Girshick equation is an equation in probability theory that allows for the calculation of the variance of a random sum of random variables, that is, a sum whose number of terms is itself random. It is the counterpart, for the variance, of Wald's equation for the expectation of such composite distributions.
Let \(N\) be a random variable with values in \(\mathbb{N}_0\), let \(X_1, X_2, X_3, \ldots\) be independent and identically distributed random variables, which are also independent of \(N\), and assume that the second moment exists for \(N\) and all \(X_i\). Then, the random variable defined by
\[ Y := \sum_{i=1}^{N} X_i \]
has the variance
\[ \operatorname{Var}(Y) = \operatorname{Var}(N)\operatorname{E}(X_1)^2 + \operatorname{E}(N)\operatorname{Var}(X_1). \]
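As a quick numerical sanity check, the variance formula can be compared against a Monte Carlo estimate. The sketch below uses an illustrative choice of distributions (not from the article): \(N \sim \mathrm{Binomial}(10, 1/2)\) and \(X_i \sim \mathrm{Uniform}(0,1)\), both simulated with Python's standard library.

```python
import random
import statistics

def random_sum_sample(n_dist, x_dist):
    """Draw N from n_dist, then sum N i.i.d. draws from x_dist."""
    return sum(x_dist() for _ in range(n_dist()))

random.seed(0)

# Illustrative assumption: N ~ Binomial(10, 0.5), X_i ~ Uniform(0, 1).
n_dist = lambda: sum(random.random() < 0.5 for _ in range(10))
x_dist = random.random

samples = [random_sum_sample(n_dist, x_dist) for _ in range(100_000)]
empirical_var = statistics.variance(samples)

# Blackwell-Girshick prediction: Var(N) E(X_1)^2 + E(N) Var(X_1).
var_n, e_n = 10 * 0.5 * 0.5, 10 * 0.5   # binomial variance and mean
e_x, var_x = 0.5, 1 / 12                # uniform(0, 1) mean and variance
predicted = var_n * e_x**2 + e_n * var_x   # = 0.625 + 0.41666... ~ 1.042

print(empirical_var, predicted)
```

With 100,000 samples the empirical variance should agree with the predicted value \(2.5 \cdot 0.25 + 5 \cdot \tfrac{1}{12} \approx 1.042\) to about two decimal places.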
The Blackwell–Girshick equation can be derived using conditional variance and variance decomposition. If the \(X_i\) are natural number-valued random variables, the derivation can be done elementarily using the chain rule and the probability-generating function.
Proof
For each \(n \in \mathbb{N}_0\), let \(\chi_n\) be the random variable which is 1 if \(N\) equals \(n\) and 0 otherwise, and let \(Y_n := X_1 + \cdots + X_n\). Then
\[
\operatorname{E}(Y^2)
= \sum_{n=0}^{\infty} \operatorname{E}(\chi_n Y_n^2)
= \sum_{n=0}^{\infty} \operatorname{P}(N=n)\operatorname{E}(Y_n^2)
= \sum_{n=0}^{\infty} \operatorname{P}(N=n)\bigl(n\operatorname{Var}(X_1) + n^2\operatorname{E}(X_1)^2\bigr)
= \operatorname{E}(N)\operatorname{Var}(X_1) + \operatorname{E}(N^2)\operatorname{E}(X_1)^2.
\]
By Wald's equation, under the given hypotheses, \(\operatorname{E}(Y) = \operatorname{E}(N)\operatorname{E}(X_1)\). Therefore,
\[
\operatorname{Var}(Y) = \operatorname{E}(Y^2) - \operatorname{E}(Y)^2
= \operatorname{E}(N)\operatorname{Var}(X_1) + \operatorname{E}(N^2)\operatorname{E}(X_1)^2 - \operatorname{E}(N)^2\operatorname{E}(X_1)^2
= \operatorname{E}(N)\operatorname{Var}(X_1) + \operatorname{Var}(N)\operatorname{E}(X_1)^2
\]
as desired.
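The key identity in the proof, \(\operatorname{E}(Y^2) = \operatorname{E}(N)\operatorname{Var}(X_1) + \operatorname{E}(N^2)\operatorname{E}(X_1)^2\), can be verified exactly for a small finite case. The sketch below uses an illustrative toy choice: \(N\) uniform on \(\{0,1,2,3\}\) and \(X_i \sim \mathrm{Bernoulli}(1/3)\), with rational arithmetic so the comparison is exact rather than approximate.

```python
from fractions import Fraction as F
from itertools import product

# Illustrative toy case: N uniform on {0, 1, 2, 3}, X_i ~ Bernoulli(1/3).
p_n = {n: F(1, 4) for n in range(4)}   # P(N = n)
p = F(1, 3)                            # success probability of each X_i

# E(Y^2) by brute-force enumeration of N and all Bernoulli outcomes.
e_y2 = F(0)
for n, q_n in p_n.items():
    for xs in product((0, 1), repeat=n):
        prob = q_n
        for x in xs:
            prob *= p if x == 1 else 1 - p
        e_y2 += prob * sum(xs) ** 2

# Closed form from the proof: E(N) Var(X_1) + E(N^2) E(X_1)^2.
e_n = sum(n * q for n, q in p_n.items())
e_n2 = sum(n * n * q for n, q in p_n.items())
closed = e_n * p * (1 - p) + e_n2 * p ** 2

print(e_y2, closed, e_y2 == closed)
```

Both computations yield \(13/18\), so the enumerated second moment matches the closed form exactly in this toy case.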
Example
Let \(N\) have a Poisson distribution with expectation \(\lambda\), and let \(X_1, X_2, \ldots\) follow a Bernoulli distribution with parameter \(p\). In this case, \(Y\) is also Poisson distributed with expectation \(\lambda p\), so its variance must be \(\lambda p\). We can check this with the Blackwell–Girshick equation: \(N\) has variance \(\lambda\) while each \(X_i\) has mean \(p\) and variance \(p(1-p)\), so we must have
\[ \operatorname{Var}(Y) = \lambda p^2 + \lambda p (1-p) = \lambda p. \]
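This thinned-Poisson example can be simulated directly. The sketch below uses illustrative parameters \(\lambda = 4\) and \(p = 1/4\) (not from the article), with a small Poisson sampler based on Knuth's multiplication method since Python's standard library has no built-in one; both the mean and the variance of the random sum should then be close to \(\lambda p = 1\).

```python
import math
import random

def poisson_sample(lam, rng):
    """Knuth's method: count uniforms until their product drops below exp(-lam)."""
    limit, prod, k = math.exp(-lam), rng.random(), 0
    while prod > limit:
        prod *= rng.random()
        k += 1
    return k

rng = random.Random(42)
lam, p = 4.0, 0.25   # illustrative parameters

# Y = sum of N Bernoulli(p) trials with N ~ Poisson(lam): a thinned Poisson.
ys = []
for _ in range(100_000):
    n = poisson_sample(lam, rng)
    ys.append(sum(rng.random() < p for _ in range(n)))

mean = sum(ys) / len(ys)
var = sum((y - mean) ** 2 for y in ys) / (len(ys) - 1)

# Blackwell-Girshick: Var(Y) = lam*p**2 + lam*p*(1 - p) = lam*p = 1.0
print(mean, var)
```

Since a thinned Poisson is again Poisson, the empirical mean and variance should both land near \(\lambda p = 1\), matching the equation's prediction.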
For an example of an application, see: Mühlenthaler, M.; Raß, A.; Schmitt, M.; Wanka, R. (2021). "Exact Markov chain-based runtime analysis of a discrete particle swarm optimization algorithm on sorting and OneMax". Natural Computing: 1–27.
References
Blackwell, D. A.; Girshick, M. A. (1979). Theory of games and statistical decisions. Courier Corporation.