Lee–Carter model

Numerical algorithm for mortality forecasting

The Lee–Carter model is a numerical algorithm used in mortality forecasting and life expectancy forecasting. The input to the model is a matrix of age-specific mortality rates ordered monotonically by time, usually with ages in columns and years in rows. The output is a forecasted matrix of mortality rates in the same format as the input.

The model uses singular value decomposition (SVD) to find:

  • A univariate time series vector $\mathbf{k}_t$ that captures 80–90% of the mortality trend (the subscript $t$ refers to time),
  • A vector $\mathbf{b}_x$ that describes the relative mortality at each age (the subscript $x$ refers to age), and
  • A scaling constant (referred to here as $s_1$ but unnamed in the literature).

$\mathbf{k}_t$ is usually linear, implying that gains in life expectancy are fairly constant year after year in most populations. Before computing the SVD, the age-specific mortality rates are transformed into $\mathbf{A}_{x,t}$ by taking their logarithms and then centering them, i.e. subtracting their age-specific means over time. The age-specific mean over time is denoted by $\mathbf{a}_x$. The subscript $x,t$ indicates that $\mathbf{A}_{x,t}$ spans both age and time.
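
A minimal sketch of this transformation in Python with NumPy, using a small synthetic rates matrix purely for illustration (the array names are hypothetical):

    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical input: age-specific mortality rates, years in rows, ages in columns
    rates = rng.uniform(0.001, 0.2, size=(30, 10))  # 30 years x 10 age groups

    log_rates = np.log(rates)             # take logarithms
    a_x = log_rates.mean(axis=0)          # age-specific mean over time (a_x)
    A = log_rates - a_x                   # centered matrix A_{x,t} used as the SVD input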

Many researchers adjust the $\mathbf{k}_t$ vector by fitting it to empirical life expectancies for each year, using the $\mathbf{a}_x$ and $\mathbf{b}_x$ generated by the SVD. When adjusted in this way, the changes to $\mathbf{k}_t$ are usually small.

To forecast mortality, $\mathbf{k}_t$ (adjusted or not) is projected $n$ years into the future using an ARIMA model. The corresponding forecast $\mathbf{A}_{x,t+n}$ is recovered by multiplying $\mathbf{k}_{t+n}$ by $\mathbf{b}_x$ and by the first diagonal element of $\mathbf{S}$ (where $\mathbf{U}\mathbf{S}\mathbf{V}^{*}=\operatorname{svd}(\mathbf{A}_{x,t})$). The actual mortality rates are recovered by adding back the age-specific means $\mathbf{a}_x$ and taking exponentials.

Because of the linearity of $\mathbf{k}_t$, it is generally modeled as a random walk with trend. Life expectancy and other life table measures can be calculated from this forecasted matrix after adding back the means and taking exponentials to yield regular mortality rates.
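
A random walk with trend (also called a random walk with drift, i.e. an ARIMA(0,1,0) model with a constant term) takes the form

    $\mathbf{k}_{t} = \mathbf{k}_{t-1} + d + e_{t}$

where $d$ is the drift (the average annual change in $\mathbf{k}_t$) and $e_{t}$ is a random error term, so the point forecast $n$ years beyond the last observed year $T$ is simply $\mathbf{k}_{T+n} = \mathbf{k}_{T} + n\,d$.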

In most implementations, confidence intervals for the forecasts are generated by simulating multiple mortality forecasts using Monte Carlo methods. A band of mortality between the 5th and 95th percentiles of the simulated results is considered a valid forecast. These simulations are done by extending $\mathbf{k}_t$ into the future using randomization based on the standard error of $\mathbf{k}_t$ derived from the input data.
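
A minimal sketch of such a simulation, assuming a fitted time index $\mathbf{k}_t$ (here a hypothetical NumPy array k) and the random-walk-with-drift specification above:

    import numpy as np

    rng = np.random.default_rng(1)
    # k: hypothetical fitted time index k_t, one value per observed year
    k = np.array([10.0, 9.2, 8.1, 7.5, 6.4, 5.8, 4.9, 4.1, 3.0, 2.2])

    diffs = np.diff(k)
    drift = diffs.mean()                   # average annual change in k_t
    sigma = diffs.std(ddof=1)              # volatility of the annual changes

    horizon, n_sims = 20, 5000
    shocks = rng.normal(0.0, sigma, size=(n_sims, horizon))
    # each simulated path: last observed k plus cumulative drift and random shocks
    paths = k[-1] + np.cumsum(drift + shocks, axis=1)

    # 5th-95th percentile band for each forecast year
    lower, upper = np.percentile(paths, [5, 95], axis=0)

This sketch propagates only the innovation uncertainty of the random walk; implementations often also resample the drift estimate itself to reflect parameter uncertainty.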

Algorithm

The algorithm seeks to find the least squares solution to the equation:

$\ln(\mathbf{m}_{x,t}) = \mathbf{a}_{x} + \mathbf{k}_{t}\mathbf{b}_{x} + \epsilon_{x,t}$

where $\mathbf{m}_{x,t}$ is the matrix of mortality rates for each age $x$ in each year $t$.

  1. Compute $\mathbf{a}_{x}$, the average over time of $\ln(\mathbf{m}_{x,t})$ for each age:
    $\mathbf{a}_{x} = \frac{1}{T}\sum_{t=1}^{T}\ln(\mathbf{m}_{x,t})$
  2. Compute $\mathbf{A}_{x,t}$, which will be used in the SVD:
    $\mathbf{A}_{x,t} = \ln(\mathbf{m}_{x,t}) - \mathbf{a}_{x}$
  3. Compute the singular value decomposition of $\mathbf{A}_{x,t}$:
    $\mathbf{U}\mathbf{S}\mathbf{V}^{*} = \operatorname{svd}(\mathbf{A}_{x,t})$
  4. Derive $\mathbf{b}_{x}$, $s_{1}$ (the leading singular value, which provides the scaling), and $\mathbf{k}_{t}$ from $\mathbf{U}$, $\mathbf{S}$, and $\mathbf{V}^{*}$:
    $\mathbf{b}_{x} = (u_{1,1}, u_{2,1}, \ldots, u_{x,1})$
    $\mathbf{k}_{t} = (v_{1,1}, v_{1,2}, \ldots, v_{1,t})$
  5. Forecast $\mathbf{k}_{t}$ for $n$ additional years using a standard univariate ARIMA model:
    $\mathbf{k}_{t+n} = \operatorname{ARIMA}(\mathbf{k}_{t}, n)$
  6. Use the forecasted $\mathbf{k}_{t+n}$, together with the original $\mathbf{b}_{x}$ and $\mathbf{a}_{x}$, to calculate the forecasted mortality rate for each age (a worked implementation sketch follows this list):
    $\mathbf{m}_{x,t+n} = \exp(\mathbf{a}_{x} + s_{1}\mathbf{k}_{t+n}\mathbf{b}_{x})$
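
A minimal end-to-end sketch of these steps in Python with NumPy, using a synthetic input matrix and a random walk with drift in place of a general ARIMA model (all array names are hypothetical):

    import numpy as np

    rng = np.random.default_rng(0)
    # Hypothetical input: age-specific mortality rates, years in rows, ages in columns
    T, X = 40, 20
    rates = np.exp(-4.0 + 0.05 * np.arange(X) - 0.02 * np.arange(T)[:, None]
                   + 0.01 * rng.standard_normal((T, X)))

    # Steps 1-2: log-transform and center by the age-specific means a_x
    log_rates = np.log(rates)
    a_x = log_rates.mean(axis=0)
    A = (log_rates - a_x).T               # A_{x,t}: ages in rows, years in columns

    # Step 3: singular value decomposition of the centered matrix
    U, S, Vt = np.linalg.svd(A, full_matrices=False)

    # Step 4: age pattern b_x, leading singular value s_1, and time index k_t
    b_x = U[:, 0]
    s1 = S[0]
    k_t = Vt[0, :]

    # Step 5: project k_t forward n years with a random walk with drift
    n = 10
    drift = np.diff(k_t).mean()
    k_future = k_t[-1] + drift * np.arange(1, n + 1)

    # Step 6: recover forecasted rates: m_{x,t+n} = exp(a_x + s_1 * k_{t+n} * b_x)
    m_future = np.exp(a_x + s1 * np.outer(k_future, b_x))   # shape (n, X): years x ages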

Discussion

Without applying SVD or some other method of dimension reduction, the table of mortality data is a highly correlated multivariate data series, and the complexity of these multidimensional time series makes them difficult to forecast. SVD has become widely used as a method of dimension reduction in many different fields, including by Google in its PageRank algorithm.

The Lee–Carter model was introduced by Ronald D. Lee and Lawrence Carter in 1992 with the article "Modeling and Forecasting U.S. Mortality". The model grew out of their work in the late 1980s and early 1990s attempting to use inverse projection to infer rates in historical demography. The model has been used by the United States Social Security Administration, the US Census Bureau, and the United Nations. It has become the most widely used mortality forecasting technique in the world today.

There have been extensions to the Lee–Carter model, most notably to account for missing years, correlated male and female populations, and large-scale coherence in populations that share a mortality regime (Western Europe, for example). Many related papers can be found on Professor Ronald Lee's website.

Implementations

There are a few software packages for forecasting with the Lee–Carter model.

References

  1. "The Lee-Carter Method for Forecasting Mortality, with Various Extensions and Applications | SOA" (PDF). Archived from the original (PDF) on March 7, 2019. Retrieved September 28, 2010.
  2. Lee, Ronald D; Carter, Lawrence R (September 1992). "Modeling and Forecasting U.S. Mortality". Journal of the American Statistical Association. 87 (419): 659–671. doi:10.2307/2290201.
  3. Lee, Ronald (June 5, 2003). "Reflections on Inverse Projection: Its Origins, Development, Extensions, and Relation to Forecasting".
  4. Federico Girosi; Gary King. "Understanding the Lee-Carter Mortality Forecasting Method" (PDF). Harvard University. Retrieved April 12, 2023.