Poisson-type random measure

Poisson-type random measures are a family of three random counting measures which are closed under restriction to a subspace, i.e. closed under thinning. They are the only distributions in the canonical non-negative power series family of distributions to possess this property and include the Poisson distribution, negative binomial distribution, and binomial distribution. The PT family of distributions is also known as the Katz family of distributions or the Panjer or (a,b,0) class of distributions, and may be retrieved through the Conway–Maxwell–Poisson distribution.

Throwing stones

Let $K$ be a non-negative integer-valued random variable ($K\in \mathbb {N} _{\geq 0}=\mathbb {N} _{>0}\cup \{0\}$) with law $\kappa$, mean $c\in (0,\infty )$ and, when it exists, variance $\delta ^{2}>0$. Let $\nu$ be a probability measure on the measurable space $(E,{\mathcal {E}})$. Let $\mathbf {X} =\{X_{i}\}$ be a collection of iid random variables (stones) taking values in $(E,{\mathcal {E}})$ with law $\nu$.

The random counting measure $N$ on $(E,{\mathcal {E}})$ depends on the pair of deterministic probability measures $(\kappa ,\nu )$ through the stone throwing construction (STC)

$$N_{\omega }(A)=N(\omega ,A)=\sum _{i=1}^{K(\omega )}\mathbb {I} _{A}(X_{i}(\omega ))\quad {\text{for}}\quad \omega \in \Omega ,\ A\in {\mathcal {E}}$$

where $K$ has law $\kappa$ and the iid $X_{1},X_{2},\dotsc$ have law $\nu$; that is, $N$ is a mixed binomial process.
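
The construction can be simulated directly. The following is a minimal sketch (not from the article; the choice of $\kappa$, $\nu$, and the test set below are arbitrary illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)

def stone_throwing_counts(sample_K, sample_X, indicator_sets, n_sims=10_000):
    """Simulate N(A) for each set A via the stone throwing construction:
    draw K, throw K iid stones X_i with law nu, and count how many land in each set."""
    counts = np.zeros((n_sims, len(indicator_sets)), dtype=int)
    for s in range(n_sims):
        k = sample_K()                         # number of stones K ~ kappa
        stones = sample_X(k)                   # iid stones X_1, ..., X_K ~ nu
        for j, in_A in enumerate(indicator_sets):
            counts[s, j] = in_A(stones).sum()  # N(A) = sum_i 1_A(X_i)
    return counts

# Illustrative example: K ~ Poisson(c), stones uniform on E = [0, 1], A = [0, 0.3]
c = 5.0
counts = stone_throwing_counts(
    sample_K=lambda: rng.poisson(c),
    sample_X=lambda k: rng.uniform(0.0, 1.0, size=k),
    indicator_sets=[lambda x: x <= 0.3],
)
print(counts[:, 0].mean())  # approximately c * nu(A) = 5.0 * 0.3 = 1.5
```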

Let ${\mathcal {E}}_{+}=\{f:E\to \mathbb {R} _{+}\}$ be the collection of positive ${\mathcal {E}}$-measurable functions. The probability law of $N$ is encoded in the Laplace functional

$$\mathbb {E} e^{-Nf}=\mathbb {E} \,(\mathbb {E} e^{-f(X)})^{K}=\mathbb {E} \,(\nu e^{-f})^{K}=\psi (\nu e^{-f})\quad {\text{for}}\quad f\in {\mathcal {E}}_{+}$$

where $\psi (\cdot )$ is the probability generating function (pgf) of $K$. The mean and variance are given by

$$\mathbb {E} Nf=c\,\nu f$$

and

$$\mathbb {V}\mathrm{ar}\,Nf=c\,\nu f^{2}+(\delta ^{2}-c)(\nu f)^{2}$$

The covariance for arbitrary $f,g\in {\mathcal {E}}_{+}$ is given by

$$\mathbb {C}\mathrm{ov}(Nf,Ng)=c\,\nu (fg)+(\delta ^{2}-c)\,\nu f\,\nu g$$
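
For example, when $K$ is Poisson with mean $c$ (so that $\delta ^{2}=c$), these formulas specialize to the familiar Laplace functional and moments of a Poisson random measure with mean measure $c\nu$:

$$\psi (t)=e^{c(t-1)},\qquad \mathbb {E} e^{-Nf}=e^{-c\,\nu (1-e^{-f})},\qquad \mathbb {V}\mathrm{ar}\,Nf=c\,\nu f^{2},\qquad \mathbb {C}\mathrm{ov}(Nf,Ng)=c\,\nu (fg).$$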

When $K$ is Poisson, negative binomial, or binomial, it is said to be Poisson-type (PT). For a measurable partition $A,\ldots ,B$ of $E$, the joint distribution of the collection $N(A),\ldots ,N(B)$ is, for $i,\ldots ,j\in \mathbb {N} _{\geq 0}$ with $i+\cdots +j=k$,

$$\mathbb {P} (N(A)=i,\ldots ,N(B)=j)=\mathbb {P} (N(A)=i,\ldots ,N(B)=j\mid K=k)\,\mathbb {P} (K=k)={\frac {k!}{i!\cdots j!}}\,\nu (A)^{i}\cdots \nu (B)^{j}\,\mathbb {P} (K=k)$$
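
For example, take the two-set partition $A$ and $B=E\setminus A$ with $\nu (A)=a$, and let $K$ be Poisson with mean $c$. Then

$$\mathbb {P} (N(A)=i,\,N(B)=j)={\frac {(i+j)!}{i!\,j!}}\,a^{i}(1-a)^{j}\,{\frac {e^{-c}c^{i+j}}{(i+j)!}}={\frac {e^{-ca}(ca)^{i}}{i!}}\cdot {\frac {e^{-c(1-a)}(c(1-a))^{j}}{j!}},$$

so the restricted counts are independent Poisson variables; this is the splitting property referred to below.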

The following result extends the construction of a random measure $N=(\kappa ,\nu )$ to the case when the collection $\mathbf {X}$ is expanded to $(\mathbf {X} ,\mathbf {Y} )=\{(X_{i},Y_{i})\}$ where $Y_{i}$ is a random transformation of $X_{i}$. Heuristically, $Y_{i}$ represents some properties (marks) of $X_{i}$. We assume that the conditional law of $Y$ follows some transition kernel according to $\mathbb {P} (Y\in B\mid X=x)=Q(x,B)$.

Theorem: Marked STC

Consider the random measure $N=(\kappa ,\nu )$ and a transition probability kernel $Q$ from $(E,{\mathcal {E}})$ into $(F,{\mathcal {F}})$. Assume that, given the collection $\mathbf {X}$, the variables $\mathbf {Y} =\{Y_{i}\}$ are conditionally independent with $Y_{i}\sim Q(X_{i},\cdot )$. Then $M=(\kappa ,\nu \times Q)$ is a random measure on $(E\times F,{\mathcal {E}}\otimes {\mathcal {F}})$. Here $\mu =\nu \times Q$ is understood as $\mu (dx,dy)=\nu (dx)Q(x,dy)$. Moreover, for any $f\in ({\mathcal {E}}\otimes {\mathcal {F}})_{+}$ we have that $\mathbb {E} e^{-Mf}=\psi (\nu e^{-g})$ where $\psi (\cdot )$ is the pgf of $K$ and $g\in {\mathcal {E}}_{+}$ is defined by $e^{-g(x)}=\int _{F}Q(x,dy)\,e^{-f(x,y)}$.
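
A simple special case (an illustration, not part of the source statement) is that of marks drawn independently of the stones, $Q(x,dy)=\rho (dy)$, together with a function $f(x,y)=h(y)$ of the mark alone. Then $e^{-g(x)}=\rho e^{-h}$ does not depend on $x$, so

$$\mathbb {E} e^{-Mf}=\psi (\nu e^{-g})=\psi (\rho e^{-h}),$$

which is the Laplace functional of the STC random measure $(\kappa ,\rho )$: the marks by themselves again form a mixed binomial process with the same count variable $K$.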

The following corollary is an immediate consequence.

Corollary: Restricted STC

The quantity $N_{A}=(N\mathbb {I} _{A},\nu _{A})$ is a well-defined random measure on the measurable subspace $(E\cap A,{\mathcal {E}}_{A})$ where ${\mathcal {E}}_{A}=\{A\cap B:B\in {\mathcal {E}}\}$ and $\nu _{A}(B)=\nu (A\cap B)/\nu (A)$. Moreover, for any $f\in {\mathcal {E}}_{+}$, we have that $\mathbb {E} e^{-N_{A}f}=\psi (\nu e^{-f}\mathbb {I} _{A}+b)$ where $b=1-\nu (A)$.

Note that $\psi (\nu e^{-f}\mathbb {I} _{A}+1-a)=\psi _{A}(\nu _{A}e^{-f})$, where $a=\nu (A)$, $\psi _{A}$ denotes the pgf of the restricted count $N\mathbb {I} _{A}$, and we use $\nu e^{-f}\mathbb {I} _{A}=a\,\nu _{A}e^{-f}$.
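
For instance, if $K$ is Poisson with mean $c$, so that $\psi (t)=e^{c(t-1)}$, then with $a=\nu (A)$

$$\mathbb {E} e^{-N_{A}f}=\psi (a\,\nu _{A}e^{-f}+1-a)=e^{ca\,(\nu _{A}e^{-f}-1)}=\psi _{ca}(\nu _{A}e^{-f}),$$

so the restriction is again a Poisson random measure with mean rescaled from $c$ to $ca$; the bone condition below formalizes exactly this pattern.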

Collecting bones

The probability law of the random measure is determined by its Laplace functional and hence by the generating function of $K$.

Definition: Bone

Let $K_{A}=N\mathbb {I} _{A}$ be the counting variable of $N$ restricted to $A\subset E$. When $\{N\mathbb {I} _{A}:A\subset E\}$ and $K=N\mathbb {I} _{E}$ share the same family of laws subject to a rescaling $h_{a}(\theta )$ of the parameter $\theta$, where $a=\nu (A)$, then $K$ is called a bone distribution. The bone condition for the pgf is given by $\psi _{\theta }(at+1-a)=\psi _{h_{a}(\theta )}(t)$.
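
For example (a routine verification, with the negative binomial written in terms of its failure probability $q$), the binomial and negative binomial pgfs satisfy the bone condition with $h_{a}(p)=ap$ and $h_{a}(q)=aq/(1-(1-a)q)$ respectively:

$$\psi _{n,p}(at+1-a)=(1-p+p(at+1-a))^{n}=(1-ap+ap\,t)^{n}=\psi _{n,ap}(t),$$

$$\psi _{r,q}(at+1-a)=\left({\frac {1-q}{1-q(at+1-a)}}\right)^{r}=\left({\frac {1-q'}{1-q't}}\right)^{r}=\psi _{r,q'}(t),\qquad q'={\frac {aq}{1-(1-a)q}}.$$

The Poisson case was verified above, with $h_{a}(c)=ac$.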

Equipped with the notion of a bone distribution and the bone condition, the main result for the existence and uniqueness of Poisson-type (PT) random counting measures is given as follows.

Theorem: existence and uniqueness of PT random measures

Assume that $K\sim \kappa _{\theta }$ with pgf $\psi _{\theta }$ belongs to the canonical non-negative power series (NNPS) family of distributions and that $\{0,1\}\subset \operatorname {supp} (K)$. Consider the random measure $N=(\kappa _{\theta },\nu )$ on the space $(E,{\mathcal {E}})$ and assume that $\nu$ is diffuse. Then for any $A\subset E$ with $\nu (A)=a>0$ there exists a mapping $h_{a}:\Theta \rightarrow \Theta$ such that the restricted random measure is $N_{A}=(\kappa _{h_{a}(\theta )},\nu _{A})$, that is,

$$\mathbb {E} e^{-N_{A}f}=\psi _{h_{a}(\theta )}(\nu _{A}e^{-f})\quad {\text{for}}\quad f\in {\mathcal {E}}_{+}$$

if and only if $K$ is Poisson, negative binomial, or binomial (Poisson-type).
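
Combining the computations above, the explicit parameter rescalings $h_{a}$ for the three PT families (with the negative binomial written in terms of its failure probability $q$) are

$$h_{a}(c)=ac\ \text{(Poisson)},\qquad h_{a}(q)={\frac {aq}{1-(1-a)q}}\ \text{(negative binomial, }r\text{ fixed)},\qquad h_{a}(p)=ap\ \text{(binomial, }n\text{ fixed)}.$$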

The proof of this theorem is based on a generalized additive Cauchy equation and its solutions. The theorem states that, out of all NNPS distributions, only the PT distributions have the property that their restrictions $N\mathbb {I} _{A}$ share the same family of distributions as $K$, that is, they are closed under thinning. The PT random measures are the Poisson random measure, negative binomial random measure, and binomial random measure. The Poisson random measure is additive with independence on disjoint sets, whereas the negative binomial has positive covariance and the binomial has negative covariance. The binomial process is a limiting case of the binomial random measure where $p\rightarrow 1,\ n\rightarrow c$.
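
The closure under thinning can also be checked numerically. The sketch below uses illustrative parameters (not from the article); note that numpy's negative binomial is parameterized by the success probability, so `1 - q` is passed. It thins a negative binomial count and compares it with the negative binomial predicted by the bone condition:

```python
import numpy as np

rng = np.random.default_rng(1)

def thin(counts, a):
    """Binomial thinning: keep each of the K points independently with probability a."""
    return rng.binomial(counts, a)

# Illustrative parameters: K ~ negative binomial with r and failure probability q,
# thinning probability a = nu(A)
r, q, a = 3, 0.4, 0.3
k = rng.negative_binomial(r, 1 - q, size=200_000)   # numpy takes success prob 1 - q
k_thinned = thin(k, a)

# Rescaled parameter predicted by the bone condition: q' = a q / (1 - (1 - a) q)
q_rescaled = a * q / (1 - (1 - a) * q)
k_reference = rng.negative_binomial(r, 1 - q_rescaled, size=200_000)

print(k_thinned.mean(), k_reference.mean())  # both approx r * q' / (1 - q')
print(k_thinned.var(), k_reference.var())    # both approx r * q' / (1 - q')**2
```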

Distributional self-similarity applications

The "bone" condition on the pgf ψ θ {\displaystyle \psi _{\theta }} of K {\displaystyle K} encodes a distributional self-similarity property whereby all counts in restrictions (thinnings) to subspaces (encoded by pgf ψ A {\displaystyle \psi _{A}} ) are in the same family as ψ θ {\displaystyle \psi _{\theta }} of K {\displaystyle K} through rescaling of the canonical parameter. These ideas appear closely connected to those of self-decomposability and stability of discrete random variables. Binomial thinning is a foundational model to count time-series. The Poisson random measure has the well-known splitting property, is prototypical to the class of additive (completely random) random measures, and is related to the structure of Lévy processes, the jumps of Kolmogorov equations (Markov jump process), and the excursions of Brownian motion. Hence the self-similarity property of the PT family is fundamental to multiple areas. The PT family members are "primitives" or prototypical random measures by which many random measures and processes can be constructed.

References

  1. Bastian, Caleb; Rempala, Gregory. "Throwing stones and collecting bones: Looking for Poisson-like random measures". Mathematical Methods in the Applied Sciences, 2020. doi:10.1002/mma.6224
  2. Katz, L. "Unified treatment of a broad class of discrete probability distributions". In Classical and Contagious Discrete Distributions, pp. 175–182. Pergamon Press, Oxford, 1965.
  3. Panjer, Harry H. "Recursive evaluation of a family of compound distributions". ASTIN Bulletin. 1981;12(1):22–26.
  4. Conway, R. W.; Maxwell, W. L. "A queuing model with state dependent service rates". Journal of Industrial Engineering. 1962;12.
  5. Çinlar, Erhan. Probability and Stochastics. Springer-Verlag New York; 2011.
  6. Kallenberg, Olav. Random Measures, Theory and Applications. Springer; 2017.
  7. Steutel, F. W.; van Harn, K. "Discrete analogues of self-decomposability and stability". The Annals of Probability. 1979:893–899.
  8. Al-Osh, M. A.; Alzaid, A. A. "First-order integer-valued autoregressive (INAR(1)) process". Journal of Time Series Analysis. 1987;8(3):261–275.
  9. Scotto, Manuel G.; Weiß, Christian H.; Gouveia, Sónia. "Thinning models in the analysis of integer-valued time series: a review". Statistical Modelling. 2015;15(6):590–618.