
TVA: BE 0862.986.729

BNP Paribas 001-4174957-56

RPM Nivelles

Tel: +32 (0)10 84 07 50

[email protected]

www.reacfin.com

Reacfin s.a./n.v.

EXECUTIVE SUMMARY

Developing a robust and accurate mortality forecasting tool is of the utmost importance in the context of ever-evolving mortality trends. Such phenomena are especially important in rich countries. Miscalculating future mortality rates might prove an expensive error for the insurance industry, impacting for instance both the private and public pension systems. In order to respond to such needs, Reacfin’s team developed a mortality forecasting tool that estimates likely future mortality rates based on the Lee-Carter (LC) model.

The tool design aims to strike a balance between user friendliness, by automating most of the processes and facilitating user interaction, and the complexity of the underlying algorithms, which are based on sound actuarial techniques.

In a nutshell, the tool fits an LC model on the historical mortality observations of a given population (e.g. the inhabitants of a country) and obtains expected future mortality rates via projections. The tool offers a unique set of features, such as optimal smoothing and the ability to parametrize the projection. Robustness and backtesting analyses are provided as well.

User guide

Thanatos - Prospective Mortality Tool Release 2.0

© Reacfin s.a. – May 2017


1 INTRODUCTION

2 TOOL PROCESS SYNOPSIS

3 IMPORT DATA PHASE

4 DATA SELECTION PHASE

5 DATA SMOOTHING PHASE

5.1 Smoothing based on the 2D Locfit

5.2 Smoothing based on the 2D Poisson P-Splines

5.3 Smoothing based on the Graduation by Discrete Beta Kernel Techniques

6 CALIBRATION PHASE

6.1 Fitting

6.1.1 Lee-Carter

6.1.2 Lee-Carter smoothed by penalized least squares

6.1.3 Lee-Carter smoothed by penalized log-likelihood

6.2 Robustness

7 COEFFICIENT SMOOTHING PHASE

8 PROJECTION PHASE

8.1 Deterministic projections

8.2 Stochastic projections

9 CLOSURE PHASE

10 BACKTESTING PHASE

11 EXPORT PHASE

12 BIBLIOGRAPHY


1 INTRODUCTION

This user guide accompanies the Prospective Mortality Tool Release 2.0. It provides a description of the tool interface and a walk-through of the predefined demo version.

We assume that the user is familiar with the actuarial methods employed in the context of life insurance and mortality forecasting techniques. Consequently, this document focuses on the tool utilization in terms of features and functionality. However, the interested reader should refer to [1] for a more elaborate description of the Lee-Carter (LC) model.

Mortality forecasting is a field of actuarial science and a major topic in ageing studies that focuses on estimating likely future mortality rates. This is especially important in rich countries with a high proportion of ageing people, since ageing populations are expensive in terms of pensions (both public and private). Consequently, the continuing increases in life expectancy beyond previously held limits have brought to the fore the critical importance of mortality forecasting.

The developments in the field since 1980 can be grouped into three broad approaches: extrapolation, expectation and explanation. Most developments have taken place in extrapolative forecasting, where the two-factor LC method (age and period) and its variants proved to be among the most successful in terms of accuracy (see [2]).

Despite its utmost importance, as stated above, there are surprisingly few software packages for forecasting based on the LC model. Hence, Reacfin’s team developed a web-based tool providing a complete solution that employs the aforementioned model. This tool proves useful in accurately estimating life tables based on the mortality observations of a given population (e.g. the inhabitants of a country).

The tool facilitates the calculation of such prospective mortality tables by providing, on the one hand, an easy-to-use interface and, on the other hand, powerful statistical algorithms used in a seamless way. Additionally, the tool enables the user to assess the robustness of the results via multiple tests and a backtesting procedure.

As a final remark, we note that the insured population mortality is usually different from the general population mortality. This is due to the fact that only a subset of the general population, not necessarily sharing the same mortality features, takes out insurance policies. In order to tackle this issue, Reacfin’s team has already put forward an Adverse Selection Tool, called Hygie. Its input relies on the output produced by Thanatos. The results are then employed for pricing, valuation and profitability purposes.

2 TOOL PROCESS SYNOPSIS

Figure 1 depicts the overall process synopsis. The application process is divided into successive phases that follow one another in a linear manner, meaning that later phases become available only after the completion of the previous ones. For each phase a corresponding interface is provided in order to allow for user interaction.


Figure 1. Process synopsis.

The entire analysis performed by the tool starts with the Import data phase. From the imported data set the user can select a subset in terms of age and time frame. This selection is performed via the Data selection phase. At this stage the user can also choose whether or not to perform a Data smoothing on the selected data. The subsequent phase is the Calibration. This phase has two steps. The Fitting step is compulsory and its result serves as input in the following phases. On the contrary, the Robustness step’s purpose is only to provide an analysis of the results produced by the Fitting step. The results of the Robustness step can be saved and downloaded for further analysis. The next phase is the Coefficient smoothing. Similar to the Data smoothing, this phase is optional and, if needed, the user can bypass it. The next phase, named the Projection, is composed of two steps, namely the Deterministic projection and the Stochastic projection. The Deterministic step is compulsory and its output is necessary for the remainder of the phases. The Stochastic step’s purpose is only to provide an analysis and its results can be saved and downloaded for further study. Before the last compulsory phase, named Backtesting, the user might opt to perform or not the Closure phase. The Export is the final phase and it enables the user to save the results, as well as the entire set of selected models and the parameters thereof employed in the analysis.

For each phase within the tool, this document shall present the inputs, the parameters and the outputs, where relevant.

3 IMPORT DATA PHASE

Figure 2 depicts the first interface of the tool, namely the Import Data. As the name suggests, its main purpose is to allow the user to select the data file that serves as basis for the calculation of the life tables. In order to do so, data for a certain country and a certain gender can be selected. The input file contains the number of deaths and the exposure to risk for the population considered.

In the demo version, as indicated in the interface, the data is limited. Hence, only the data until 1995 and concerning six different countries (i.e. Belgium, France, Germany, Luxembourg, Slovenia and The Netherlands) is provided.



Figure 2. Import Data interface.

After the user selects the data type in terms of country and gender, the actual data must be loaded by clicking the “Import Data” button.

It is important to remark that, at this stage, the number of deaths is automatically set to 1 for cells where no deaths are recorded. By doing so we avoid instantaneous mortality rates equal to zero, which would lead to computational errors further on (e.g. when taking logarithms).

4 DATA SELECTION PHASE

The Data selection phase allows for fine tuning of the input data enabling the user to select a subset of it. During the loading phase, based on the input data, the tool will identify the maximum and minimum values in terms of time frame and age.

The following parameters are available to the user via the interface (see Figure 3):

- Time Frame
- Age Frame

As presented in Figure 3, the user can restrict the input data, via the available sliders, by selecting a specific “Time Frame” (from 𝑡𝑚𝑖𝑛 to 𝑡𝑚𝑎𝑥) and “Age Frame” (from 𝑥𝑚𝑖𝑛 to 𝑥𝑚𝑎𝑥 ). We underline that the maximum age that can be selected must correspond to a value where the exposure to risk is positive for all years. This is necessary in order to avoid computational errors. This typically occurs for high age values where scarce data is available. We therefore recommend not using high age values in the calibration, but rather employing a closure model instead (see Section 9).

After performing the selection, the data must be reloaded by clicking the “Refresh Data” button. As a result, the 3D surfaces representing the forces of mortality and the annual returns by age are shown as a function of age 𝑥 and year 𝑡. The force of mortality surface graph depicts the logarithms of the instantaneous forces of mortality (log(𝜇𝑥,𝑡)). The forces of mortality are estimated as:

\[
\mu_{x,t} = \frac{D_{x,t}}{ER_{x,t}} \tag{1}
\]

where 𝐷𝑥,𝑡 denotes the number of deaths for the population of age 𝑥 during year 𝑡 and 𝐸𝑅𝑥,𝑡 denotes the exposure to risk.
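As an illustration of Eq. (1) and of the zero-death adjustment described in Section 3, a minimal R sketch is given below; the variable names (deaths, exposure) are hypothetical and assume matrices indexed by age (rows) and year (columns):

```r
# deaths, exposure: matrices with ages in rows and years in columns (hypothetical inputs)
compute_log_mu <- function(deaths, exposure) {
  deaths[deaths == 0] <- 1          # avoid zero rates, as done at the import stage
  mu <- deaths / exposure           # Eq. (1): instantaneous forces of mortality
  log(mu)                           # the tool works with log(mu_{x,t})
}
```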


The annual returns by age graph shows the evolution of the instantaneous forces of mortality (𝜇𝑥,𝑡/𝜇𝑥,𝑡−1 − 1).

Figure 3. Data selection interface

5 DATA SMOOTHING PHASE

During this phase, as the name suggests, the user can perform a smoothing procedure in order to reduce the high local volatility in the input data by applying low-pass filters.

The following parameters are available to the user via the interface (see Figure 4):

- Perform data smoothing
- Smoothing Methodology
- Smoothing Parameters

The user can choose to perform a data smoothing or not via the corresponding dropdown menu (i.e. Perform data smoothing).

The type of smoothing algorithm is selected via the Smoothing Methodology dropdown menu. Three different smoothing algorithms are available: “2D-locfit”, “2D Poisson P-Splines” and “Graduation by Discrete Beta Kernel Techniques”.

The values and the types of Smoothing Parameters depend on the selected Smoothing Methodology. Consequently, for each distinct smoothing algorithm a different set of smoothing parameters is available. An in-depth description of each Smoothing Methodology and the corresponding parameters is provided in Sections 5.1, 5.2 and 5.3, respectively.

After employing the smoothing algorithm (by clicking on the corresponding “Smoothing” button), the graphical representations of the gross “Forces of Mortality Surface” and “Smoothed Forces of Mortality Surface” are available on the interface. Obviously, in case the user chooses not to perform smoothing, the second graph will be the same as the gross data graph. Note that the values in the graph correspond to the logarithms of the instantaneous forces of mortality (log(𝜇𝑥,𝑡)).


Figure 4. Data smoothing interface

5.1 Smoothing based on the 2D Locfit

The 2D-locfit methodology employs a local linear regression, meaning that for each log(𝜇𝑥,𝑡) observation a weighted linear regression is performed on the neighboring values (across ages and years) in order to compute the smoothed values [3].

Figure 5. 2D Locfit data smoothing and the corresponding parameters

The value of the Smoothing parameter represents the number of neighboring points considered for the calculation, expressed as a percentage of the total number of observations in the input data set. For instance, if the parameter is 10% and there are 200 observations in total, 20 observations will be considered in each neighborhood.

Obviously, the choice of the smoothing parameter is not a trivial matter, since a too high value will lead to an almost (if not) flat surface, whereas a too low value will render the smoothing impact immaterial. In order to help the user in this matter, an algorithm employing Generalized Cross-Validation (GCV) was designed, providing as output a pre-calculated value of the parameter. In a nutshell, the algorithm tests multiple parameter values and selects the one for which the GCV statistic is minimal. The user can then further modify this value via the interface, if needed.
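For illustration, a minimal R sketch of such a two-dimensional local fit using the locfit package cited in [3]; the data frame df and the nearest-neighbour fraction nn are hypothetical, and the GCV-based selection of nn is left to the tool:

```r
library(locfit)

# df: data frame with one row per (age, year) cell and columns age, year, log_mu (hypothetical)
fit <- locfit(log_mu ~ lp(age, year, nn = 0.10), data = df)  # nn = share of neighbours used

# smoothed surface evaluated on the original grid
df$log_mu_smooth <- predict(fit, newdata = df)
```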

5.2 Smoothing based on the 2D Poisson P-Splines

In this section we will provide a brief description of the smoothing methodology based on the “2D Poisson P-Splines”. Basically a P-spline stands for a "penalized B-spline". This refers to using the B-spline representation where the coefficients are determined partly by the data to be fitted, and partly by an additional penalty function that aims to impose smoothness to avoid overfitting.

In a nutshell, B-splines of order 𝑛 are combinations of piecewise polynomial functions of degree smaller than 𝑛 (see [4]). The B-splines employed in the tool are equally spaced along the age and year axes. The functions composing the B-splines are defined over a domain 𝑠0 ≤ 𝑠 ≤ 𝑠𝑚 and are composed of piecewise polynomials. The points where the polynomials meet are known as knots or break-points.

Figure 6. 2D Poisson P-Splines data smoothing and the corresponding parameters

The results obtained in this smoothing phase rely on a smoothing function resulting from the Kronecker matrix product on the B-splines corresponding to the two axes (i.e. age and year). By default the degree of the B-splines on each axis is three.

The advantage of this method, in comparison to the one presented in Section 5.1, relies on the fact that the employed statistical software package is specifically adapted to mortality data, as weights (such as the exposure to risk) can be defined. Consequently, the mortality surface obtained via this method is smoother, especially for the higher age values. Note that the underlying model assumes that the numbers of deaths follow a Poisson distribution.

It should be noted that the function has several adjustable parameters. However, in order to keep the tool user friendly, it was decided to provide a set of optimal parameters (see Figure 6). The calculation of the optimal parameters relies on the minimization of the Bayesian Information Criterion (BIC) based on the likelihood function.
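The user guide does not name the tool's internal package; purely as a stand-in illustration of two-dimensional Poisson P-spline smoothing, the sketch below uses the mgcv R package with a tensor-product P-spline basis. All variable names are hypothetical, and note that mgcv selects its smoothing parameters by its own criteria (GCV/REML) rather than the BIC mentioned above:

```r
library(mgcv)

# df: long-format data, one row per (age, year) cell, with columns age, year, deaths, exposure
fit <- gam(deaths ~ te(age, year, bs = "ps") + offset(log(exposure)),
           family = poisson(link = "log"), data = df)   # Poisson counts, 2D P-spline surface

# smoothed log forces of mortality: fitted log deaths minus the log exposure offset
df$log_mu_smooth <- predict(fit) - log(df$exposure)
```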

Page 9: Thanatos - Prospective Mortality - Reacfin · relies on the output produced by Thanatos. The results are then employed for pricing, valuation and profitability purposes. 2 TOOL PROCESS

8 | P a g e

User guide

Prospective Mortality Tool

© Reacfin s.a.

5.3 Smoothing based on the Graduation by Discrete Beta Kernel Techniques

The kernel smoothing techniques are popular for the estimation of a real valued function by employing noisy observations. This is especially helpful when no parametric model of the function is known. Their popularity resides in their conceptual simplicity and practical and theoretical properties. We underline that such smoothing techniques are powerful methods that rely on the adaptive characteristic of the smoothing parameters and are most appropriate for low dimensional surfaces.

Figure 7. Graduation by Discrete Beta Kernel data smoothing and the corresponding parameters

There are multiple options for performing kernel smoothing. However, in the context of mortality rate graduation very few contributions have been made. The one employed in the tool is based on the dbkGrad package available in R. An in-depth description of this specific smoothing technique in terms of analytical formulation is beyond the scope of this user manual. The interested reader should refer to [5] for a more elaborate description of the method.

The dbkGrad package smoothing technique allows for both fixed and adaptive Discrete Beta Kernel smoothing. In our particular case of a mortality surface, the adaptive nature of the smoothing parameters justifies the usage of this package, since the less reliable data at high ages require a different treatment than the data at lower ages.

As presented in Figure 7, the tool allows the user to select as input the values of two smoothing and two sensitivity parameters corresponding to the years and ages. By default the parameters are set to a precomputed value. It should be underlined that the calculation of the optimal default parameters via the leave-one-out cross-validation method (LOOCV) might prove computationally intensive and require some time to complete.

6 CALIBRATION PHASE

The Calibration phase is divided into two different steps, namely Fitting and Robustness. The purpose of the Fitting step is to calibrate the LC model parameters to the preprocessed data as obtained during the data smoothing phase. The purpose of the Robustness step is to estimate the robustness of the fitting. This is done by performing a comparative analysis (in terms of general shape) of the LC parameters calibrated on two distinct time frames (see Figure 11).

The smoothing of the input data performed in Section 5 might create issues, since fictive numbers of deaths are computed based on smoothed mortality rates. For this reason, methods that better address this issue were included in the fitting phase, applying the smoothing directly at the level of the LC coefficients (see Section 6.1.2 and Section 6.1.3). It is important to underline that, if the user chooses a smoothed LC variant, it is advised not to perform the input data smoothing phase.

6.1 Fitting

As presented in Figure 8, the Fitting step performs the calibration of the basic LC model [1]. Additionally, a visual representation of the resulting parameter values is provided.

We recall that the basic LC model is defined analytically as follows:

\[
\log(\mu_{x,t}) = \alpha_x + \beta_x \kappa_t \tag{2}
\]

where the age coefficients are denoted by 𝛼𝑥, 𝛽𝑥 and the time coefficient is denoted by 𝜅𝑡. In the case of the generic LC model we have the following analytical form:

\[
\log(\mu_{x,t}) = \sum_i \beta_x^{(i)} \kappa_t^{(i)} \gamma_{t-x}^{(i)} \tag{3}
\]

where by \(\beta_x^{(i)}\) and \(\kappa_t^{(i)}\) we denote the age and the time coefficients, respectively. Additionally, \(\gamma_{t-x}^{(i)}\) denotes the cohort terms (see [6]).

Note that from the dropdown menu we can select three different calibration methods, namely the vanilla Lee-Carter (presented in Section 6.1.1), the Lee-Carter smoothed by penalized least squares (presented in Section 6.1.2) and the Lee-Carter smoothed by penalized log-likelihood (presented in Section 6.1.3).

For all the above enumerated LC methods, the tool precomputes the optimal calibration period that will serve as an input in the fitting phase. The user has the option to employ this optimal value or to further adjust it.

The optimal calibration period is calculated based on a mean deviance statistic as described in [7] [8]. As depicted in Figure 8, the method estimates the optimal year frame based on the underlying assumption that the 𝜅𝑡 coefficients follow a linear trend.

We underline that, in order to have a sufficient number of years selected for the calibration, the user cannot select a period smaller than 20 years in the tool.


6.1.1 Lee-Carter

Figure 8. Lee-Carter interface

After the user selects the desired calibration methodology, the corresponding algorithm is triggered by clicking the “Calibration” button. As a result, two sets of graphs are produced, namely the “Calibrated Parameters” and “Calibrated Forces of Mortality Surface”, respectively.

The “Calibrated Parameters” graph depicts the calibrated values of the coefficients 𝛼𝑥, 𝛽𝑥 and 𝜅𝑡. The “Calibrated Forces of Mortality Surface” is a 3D plot of the logarithms of the forces of mortality log(𝜇𝑥,𝑡), as given by Eq. (2).

We underline that the calibration is performed based on the Goodman algorithm (see [9]).

As may happen in most cases, the estimated 𝛽𝑥 coefficients can exhibit an irregular pattern. This will result in irregular projected life tables. A good method to produce regular projected life tables is to smooth the 𝛽𝑥 coefficients as described in Sections 6.1.2 and 6.1.3. Such a procedure is described in detail in [10] (Section 2), where the smoothing is performed via a penalized least-squares or maximum likelihood analysis. The optimal value of the smoothing parameter is selected via a cross-validation methodology.
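For readers who want to experiment outside the tool, a minimal R sketch of a Lee-Carter fit is given below. It uses the classical SVD decomposition rather than the Goodman iterative algorithm employed by the tool, so it is only an approximation of what the Calibration phase produces; log_mu is a hypothetical age-by-year matrix of log forces of mortality:

```r
# log_mu: matrix of log forces of mortality, ages in rows, years in columns (hypothetical)
fit_lc_svd <- function(log_mu) {
  alpha <- rowMeans(log_mu)                   # average log mortality by age
  Z     <- sweep(log_mu, 1, alpha)            # centered matrix
  s     <- svd(Z)                             # rank-1 approximation gives beta and kappa
  beta  <- s$u[, 1] / sum(s$u[, 1])           # normalised so that sum(beta) = 1 (see Eq. 10)
  kappa <- s$d[1] * s$v[, 1] * sum(s$u[, 1])  # sum(kappa) = 0 then holds by construction (Eq. 9)
  list(alpha = alpha, beta = beta, kappa = kappa)
}
```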

6.1.2 Lee-Carter smoothed by penalized least squares

This method consists in smoothing the coefficients 𝛽𝑥 by penalized least squares. This implies solving an optimisation problem in which the following objective function must be minimized:

\[
\sum_x \sum_t \left( \ln(\hat{\mu}_{x,t}) - \alpha_x - \beta_x \kappa_t \right)^2 + \beta' P_\beta \beta \tag{4}
\]

where

\[
P_\beta = \pi_\beta \, \Delta' \Delta \tag{5}
\]

with

\[
\Delta = \begin{pmatrix}
1 & -2 & 1 & 0 & 0 & \cdots & & \\
0 & 1 & -2 & 1 & 0 & \cdots & & \\
\vdots & & \ddots & \ddots & \ddots & & & \vdots \\
 & & \cdots & 0 & 1 & -2 & 1
\end{pmatrix} \tag{6}
\]

It should be noted that, in the above analytical representation of the smoothed coefficients, 𝜋𝛽 represents the smoothing parameter. A large value of 𝜋𝛽 means that a stronger smoothing is applied to the 𝛽𝑥 coefficients. This parameter can be adjusted by the user and can take a value equal to 10^𝑖 with 𝑖 ∈ {1, …, 6}.

The second term of the function that needs to be minimised (𝛽′𝑃𝛽𝛽) represents the penalization imposed on 𝛽𝑥, in terms of least squares on the second-order differences of the elements of the vector 𝛽𝑥. By employing the Newton-Raphson optimization algorithm we can find the parameters 𝛼𝑥, 𝛽𝑥, 𝜅𝑡 given the coefficient 𝜋𝛽 as provided by the user. The optimal smoothing parameter can be calculated by the tool via leave-one-out cross-validation. Given the fact that this procedure is computationally intensive, this feature is not available in the demo version.
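As a small illustration of Eqs. (5) and (6), the second-order difference matrix Δ and the penalty matrix P_β can be built in a couple of lines of R; the dimension p and the smoothing parameter pi_beta are hypothetical values:

```r
p       <- 90          # number of ages (hypothetical)
pi_beta <- 1e3         # smoothing parameter (hypothetical)

Delta  <- diff(diag(p), differences = 2)   # (p-2) x p second-order difference matrix, Eq. (6)
P_beta <- pi_beta * t(Delta) %*% Delta     # penalty matrix, Eq. (5)

# the penalty added to the least-squares criterion of Eq. (4) is then t(beta) %*% P_beta %*% beta
```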

Figure 9. Lee-Carter smoothed by penalized least squares interface

6.1.3 Lee-Carter smoothed by penalized log-likelihood

In this section we present an alternative to the method presented in Section 6.1.2 for the calculation of the smoothed LC coefficients. In the remainder of this section we briefly present the method and the corresponding user interface. For an in-depth description of the analytical approach, the interested reader should consult reference [10] (Section 3).

The crux of this method relies on the fact that the likelihood is proportional to the Poisson likelihood. Consequently, we can consider that the number of deaths follows a Poisson distribution with mean equal to 𝐸𝑅𝑥,𝑡𝜇𝑥,𝑡. The model parameters will be calculated by maximizing the following function:

\[
\ln\big(\mathcal{L}(\alpha, \beta, \kappa)\big) - \frac{1}{2}\,\beta' P_\beta \beta \tag{7}
\]

resulting in:


\[
\sum_x \sum_t \Big( D_{x,t}\,(\alpha_x + \beta_x \kappa_t) - ER_{x,t}\,\exp(\alpha_x + \beta_x \kappa_t) \Big) - \frac{1}{2}\,\beta' P_\beta \beta \tag{8}
\]

with \(P_\beta\) expressed as in Eq. (5).

By employing the Newton-Raphson optimization algorithm we can find the parameters 𝛼𝑥 , 𝛽𝑥, 𝜅𝑡 given the coefficient 𝜋𝛽 as provided by the user.

Figure 10. Lee-Carter smoothed by penalized log-likelihood

The optimal smoothing parameter can be calculated by the tool via leave-one-out cross-validation. Given the fact that this procedure is computationally intensive, this feature is not available in the demo version.

6.2 Robustness

The aim of the Robustness interface is to perform a comparative analysis of the Lee-Carter parameters on two different time frames in order to assess the robustness of the calibration.

Similarly to the “Data Selection” phase, the tool will automatically compute the time frame and the age frame based on the input data set uploaded. Additionally, the same age restriction is applied, meaning that the maximum age frame is the highest age before 𝑥𝑚𝑎𝑥 for which the exposure to risk is positive for all years. Via the interface the user can then restrict the data that will be employed in the analysis. Basically, a subset of the “Age Frame” and two subsets of the “Time Frame” will be selected as input for the analysis.


Figure 11. Robustness interface. The dotted line represents the Time Frame 1 and the red continuous line represents the Time Frame 2

After performing the comparative analysis (by clicking the “Robustness” button) two distinct sets of calibrated Lee Carter parameters are obtained. The calibration employs the Goodman algorithm (see [9]) and is performed on the data smoothed employing the locfit model (see Section 5).

The results are graphically presented as depicted in Figure 11. Note that the purpose of the comparison is focused on the trend and not on the values themselves. For instance, as presented in Figure 11, it is noticeable that the trends for the 𝛼𝑥 and 𝜅𝑡 parameters are very similar. On the other hand, for the 𝛽𝑥 parameter there is a difference especially for the lower ages.

Given the bilinear form of the LC model (see Eq. (2)), one could obtain the same mortality rates for various sets of calibrated parameters. In other words, there is no unique solution. In order to fix this issue, the following constraints are imposed:

\[
\sum_t \kappa_t = 0 \tag{9}
\]

\[
\sum_x \beta_x = 1 \tag{10}
\]

Based on the two constraints presented above, it is noticeable that, when modifying the time frame, all coefficients will shift by a certain degree.
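The constraints (9) and (10) can be imposed on any equivalent parameterization by a simple rescaling; a short R sketch with hypothetical vectors alpha, beta and kappa:

```r
# alpha, beta, kappa: an arbitrary Lee-Carter parameterization (hypothetical vectors)
normalize_lc <- function(alpha, beta, kappa) {
  shift <- mean(kappa)
  alpha <- alpha + beta * shift    # absorb the mean of kappa into alpha, so sum(kappa) = 0
  kappa <- kappa - shift
  scale <- sum(beta)
  beta  <- beta / scale            # rescale so that sum(beta) = 1
  kappa <- kappa * scale           # compensate so that beta * kappa is unchanged
  list(alpha = alpha, beta = beta, kappa = kappa)
}
```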

7 COEFFICIENT SMOOTHING PHASE

During this phase the user can apply a smoothing algorithm on the LC parameters. This procedure complements the smoothing that can be applied at the input data level (see Section 5) or during the calibration phase (see Sections 6.1.2 and 6.1.3). Although the smoothing can be applied at the input data level, at the calibration level and on the LC coefficients, we strongly recommend employing only one of them. This is due to the fact that applying several procedures at the same time will result in an almost flat surface. This will prevent us from capturing the local behavior in our results, since it will be removed by this enhanced low-pass filtering. However, as this may be required in particular situations, or when we want to emphasize a global average behavior, the program was designed in a flexible manner, allowing the user to perform several smoothing phases. In this particular case a warning message will appear, but the user remains, of course, free to choose the combination.

Similarly to the Data smoothing interface (as presented in Section 5), the following parameters are available to the user via the interface (see Figure 12) in order to perform the smoothing:

- Perform model coefficients smoothing
- Smoothing Methodology
- Smoothing parameter

Figure 12. Coefficient smoothing interface

The user can choose to perform a coefficient smoothing or not via the corresponding dropdown menu (“Perform model coefficients smoothing”).

The type of smoothing algorithm is selected via the corresponding dropdown menu named “Smoothing Methodology”. It is important to note that only the “2D-locfit” smoothing algorithm, which employs a local linear regression, is available.

The value of the “Smoothing parameter” represents the number of neighboring points that will be considered, expressed as a percentage of the total number of observations in the input data set.

We recall that, in the data smoothing phase, the tool provides a pre-computed value in order to help the user select an appropriate parameter. However, for the LC coefficient smoothing parameter no such value is computed and a default value of 10% is provided.

After the parameter selection, the user can trigger the algorithm by clicking on the “Smoothing” button. The result is presented via two graphs, namely “Smoothed Model Parameters” and “Smoothed Forces of Mortality Surface”. The “Smoothed Model Parameters” graph represents the smoothed LC coefficients. The “Smoothed Forces of Mortality Surface” is a 3D graph depicting the forces of mortality logarithms as derived from the LC model after employing the smoothing algorithm.


8 PROJECTION PHASE

Of course, having models that fit past mortality has no added value in itself. The goal is to estimate future mortality by projecting the life tables built on past data. To do so, an underlying projection of the life table must be estimated based on past data. The basis for such an estimation relies on the calibrated model parameters. We underline that, since the tool relies on the LC model, the age coefficients do not need to be projected, as the age frame does not evolve. Conversely, the time coefficients have to be projected.

We employ a parametric Auto Regressive Integrated Moving Average (ARIMA) model (see [11]) for the projection.

For our specific problem we have the following parametric ARIMA(𝑝, 1, 𝑞) with 𝑝, 𝑞 ∈ {0,1,2}. The meaning of the ARIMA parameters is the following: 𝑝 is the order (number of time lags) of the autoregressive model, 𝑑 = 1 is the degree of differencing (the number of times the data have had past values subtracted), and 𝑞 is the order of the moving-average model.

Basically, a stationary linear process {𝑋𝑡} is called ARMA(𝑝, 𝑞) with 𝑝 ≥ 0, 𝑞 ≥ 0 if there exist two sets of constants 𝑎1, …, 𝑎𝑝 and 𝑏1, …, 𝑏𝑞 with 𝑎𝑝 ≠ 0, 𝑏𝑞 ≠ 0, and ξ𝑡 follows a white noise process. This results in the following iterative representation:

\[
X_t - \sum_{i=1}^{p} a_i X_{t-i} = \xi_t + \sum_{j=1}^{q} b_j \xi_{t-j} \tag{11}
\]

By applying this to our specific mortality time series of the variable 𝜅𝑡, and by rendering it stationary via differencing, we obtain an ARIMA(𝑝, 1, 𝑞) with drift 𝑑 defined as follows:

\[
\kappa_t - \kappa_{t-1} - \sum_{i=1}^{p} a_i \left( \kappa_{t-i} - \kappa_{t-i-1} \right) = d + \xi_t + \sum_{j=1}^{q} b_j \xi_{t-j} \tag{12}
\]

The above analytical approach is employed in the tool via a dedicated statistical package.

Besides the calculation of the projected 𝜅𝑡 and the new mortality surface, the tool will also calculate two statistical indicators, namely the Akaike information criterion (AIC) and the Bayesian information criterion (BIC) (see [12] [13]).

The AIC is a measure of the relative quality of statistical models for a given set of data. AIC will find a trade-off between the number of coefficients and quality of fit.

Given a collection of models for the data, AIC estimates the quality of each model, relative to each of the other models. Hence, AIC provides a means for model selection. The BIC is a criterion closely related to the AIC.
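To illustrate this step outside the tool, a hedged R sketch using the forecast package is given below; the tool's own statistical package is not named in this guide, and kappa is a hypothetical numeric vector of fitted time coefficients:

```r
library(forecast)

# kappa: fitted time coefficients, one value per calibration year (hypothetical)
fit <- Arima(kappa, order = c(0, 1, 0), include.drift = TRUE)   # ARIMA(p,1,q) with drift, here p = q = 0

# alternatively, let the AIC choose p and q in {0,1,2} with d fixed to 1
fit_auto <- auto.arima(kappa, d = 1, max.p = 2, max.q = 2, ic = "aic")

proj <- forecast(fit_auto, h = 50, level = 95)   # 50-year projection with a 95% confidence band
```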


The Projection phase is divided into two sections, “Deterministic Projection” and “Stochastic Projection”, respectively. Performing the “Deterministic Projection” step is compulsory since its result will serve as input for the following steps. On the other hand, the “Stochastic Projection” might prove useful in developing scenarios for future mortality events. The “Stochastic Projection” result will not be employed in the subsequent steps.

8.1 Deterministic projections

Before performing the deterministic projection on the data, the user can adjust the following parameters:

- Projection methodology
- First considered year in the projection of the time parameter
- Ultimate Time for Projection series
- Confidence interval for the projection of the time parameter
- Parameter 𝑝
- Parameter 𝑞

The current version of the tool allows only for an ARIMA methodology to be selected via the dropdown menu (“Projection methodology”) in order to estimate the projections.

For the “First considered year in the projection of the time parameter” (denoted as tmin,calib), the selection will be based on the restricted time frame as defined during the “Data Selection” phase (see Section 4). Let us denote by tmin and tmax the minimum and maximum of the time frame as selected by the user. Then tmin,calib will belong to the interval [tmin; tmax]. The last calibration year will automatically be the last year of the time frame, tmax, and the projection will be performed from that point onwards. Consequently, the calibration will be performed on the interval starting from tmin,calib and ending with tmax. Note that the tool provides a pre-computed optimal first year as a starting point for the projection. This estimation is based on maximizing the coefficient of determination (𝑅2) of a linear regression. Additionally, in order to have sufficient data for the fitting, the past ten years are not included in the selection. In conclusion, all the years considered for testing in order to estimate the optimal starting point belong to the interval [tmin; max(tmin; tmax − 10)].

The “Ultimate Time for Projection series” parameter will represent the last year considered in the projection.

In order to provide a more detailed analysis, the result of the projections will include a depiction of the projection confidence interval, defined by the user selection in terms of “Confidence interval for the projection of the time parameter”. Note that this is a theoretical confidence interval based on the normal distribution of the error term.


Figure 13. Deterministic projection interface.

Parameter 𝒑 and Parameter 𝒒 will allow the user to set the order of the autoregressive model (as number of time lags) and the order of the moving-average model, respectively. Note that optimal 𝑝 and 𝑞 values obtained by minimizing the AIC are proposed to the user.

After the parameter selection, the user can trigger the algorithm by clicking on the “Projection” button. As depicted in Figure 13, the result is summarised in two graphical representations: “Projected Time Parameters” and “Projected Forces of Mortality Surface”.

The “Projected Time Parameters” graph presents the projected time coefficient. The best estimate projection is depicted as the red line, whereas the theoretical confidence interval is depicted as the blue area of the graph.

The “Projected Forces of Mortality Surface” graph is a 3D plot representing the logarithms of the forces of mortality as derived from the (smoothed) calibrated and projected coefficients. In other words, this is a combination of the past life table and the projected life table.
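A one-line R sketch of how this projected surface can be assembled from the pieces computed so far; alpha and beta come from the calibration and kappa_proj from the projection, all hypothetical:

```r
# alpha, beta: age coefficients; kappa_proj: projected time coefficients (hypothetical vectors)
log_mu_proj <- alpha + outer(beta, kappa_proj)   # log mu_{x,t} = alpha_x + beta_x * kappa_t, Eq. (2)
mu_proj     <- exp(log_mu_proj)                  # projected forces of mortality surface
```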

8.2 Stochastic projections

As presented in the synopsis (see Figure 1), the stochastic projection result is not used for the remainder of the process. The purpose of this step is to enable developing scenarios for future mortality events. This type of data is quite useful for instance in developing internal models or performing profitability studies.

Before performing the stochastic projection on the data, the user can adjust the following parameters:

- Projection methodology
- Number of simulations
- First considered year in the projection of the time parameter
- Ultimate Time for Projection
- Confidence interval for the projection of the time parameter
- Parameter p
- Parameter q

The “Number of simulations” defines the number of simulations used to estimate the confidence interval. This number is limited to between 100 and 10,000, with a step of 100.

Figure 14. Stochastic Projection interface.

All the other remaining parameters are similar to the ones employed in the “Deterministic projections” step: Projection methodology, First considered year in the projection of the time parameter and Ultimate Time for Projection. Consequently, we will not provide an in-depth description thereof in the remainder of this section. For a detailed description please refer to Section 8.1.

After the parameter selection, the user can trigger the algorithm by clicking on the “Projection” button. As depicted in Figure 14, the result is summarized in a graphical representation of the “Projected Time Parameters”. The graph depicts the projected time coefficients with drift calibrated on the selected years. The red line represents the best estimate projection and is obtained by setting the error term to 0. Furthermore, a confidence interval based on the simulations (represented by the blue area) is presented.

The results of the analysis are downloadable by means of two separate Life Tables representing the lower and upper bound resulting from the lower and upper bound projection of the time coefficients. Note that the result depends on the chosen confidence interval, on the number of simulations and on the random number generator.
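As a simplified sketch of the stochastic step, the R code below simulates future paths of the time coefficient for the special case of an ARIMA(0,1,0) with drift, i.e. a random walk with drift; the drift, volatility and horizon values are hypothetical:

```r
# kappa_last: last fitted time coefficient; drift, sigma: estimated from the calibration (hypothetical)
simulate_kappa <- function(kappa_last, drift, sigma, horizon, n_sim) {
  shocks <- matrix(rnorm(horizon * n_sim, mean = drift, sd = sigma), nrow = horizon)
  kappa_last + apply(shocks, 2, cumsum)    # each column is one simulated future path of kappa
}

paths <- simulate_kappa(kappa_last = -60, drift = -1.2, sigma = 1.5, horizon = 50, n_sim = 1000)
ci    <- apply(paths, 1, quantile, probs = c(0.025, 0.975))   # simulated 95% confidence band per year
```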


9 CLOSURE PHASE

The observations corresponding to the oldest people are often either missing or of poor quality, thus resulting in an undesired increase in volatility. This is due to the natural fact that the number of people is decreasing with age. In order to have reliable mortality values for these high ages, a closure model is required. Additionally, the closure model should fit as much as possible with the rest of the mortality surface.

In order to enrich the scarce data corresponding to the oldest ages, different closure models are available in the tool:

- Denuit-Goderniaux (see [14])
- Kannisto (see [15])
- Free ultimate age
- Closure with constant rates

In the following we will provide a brief description of the available closure models emphasizing their analytical form. The rationale behind this is to shed some light into the relationship between the model parameters, as available in our tool for the user, and the impact thereof on the results. For an in-depth presentation of each model, the reader may refer to the above referenced publications.

Under the Denuit-Goderniaux model, the logarithm of the mortality rates follows a quadratic function of the age expressed analytically as follows:

\[
\log(q_x) = a + b x + c x^2 + \varepsilon_x \tag{13}
\]

where 𝜀𝑥 denotes a normally distributed random variable with mean 0 and variance 𝜎2 (𝜀𝑥 ~ 𝒩(0, 𝜎2)). Two constraints are imposed, in terms of ultimate age (𝑞𝜔 = 1) and of an inflexion constraint (𝑞′𝜔 = 0), leading to:

\[
\log(q_x) = c\,(\omega - x)^2 + \varepsilon_x \tag{14}
\]

Additionally, the calibration of the parameter 𝑐 normally relies on a linear regression performed on a range of previous ages. This, however, would result in a gap between the last rates obtained by the LC model and the first closed rates. In order to avoid this problem, the linear regression is performed on a so-called single age. This allows finding the coefficient 𝑐 directly and avoids such a gap.
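A minimal R sketch of this single-age variant of the Denuit-Goderniaux closure, assuming q_x0 is the last mortality rate produced by the LC model at age x0 and omega is the chosen ultimate age (all values hypothetical):

```r
# single-age Denuit-Goderniaux closure: log(q_x) = c * (omega - x)^2, with c fixed so that
# the closed curve passes through the last LC rate q_x0 at age x0
close_dg <- function(q_x0, x0, omega) {
  c_hat <- log(q_x0) / (omega - x0)^2
  ages  <- x0:omega
  setNames(exp(c_hat * (omega - ages)^2), ages)   # q_omega = 1 by construction
}

close_dg(q_x0 = 0.35, x0 = 95, omega = 120)       # hypothetical inputs
```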

The Kannisto model is calibrated on the age values and is defined as follows:

\[
\mu_x = \frac{\Phi_1 \exp(\Phi_2 x)}{1 + \Phi_1 \exp(\Phi_2 x)} \tag{15}
\]

where

\[
\Phi_1 = \exp\left( \frac{1}{n} \sum_{k=1}^{n} \operatorname{logit}(\mu_{x_k}) - \Phi_2 \bar{x} \right) \tag{16}
\]

\[
\Phi_2 = \frac{\sum_{k=1}^{n} \operatorname{logit}(\mu_{x_k}) \, (x_k - \bar{x})}{\sum_{k=1}^{n} (x_k - \bar{x})^2} \tag{17}
\]
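Equations (16) and (17) are simply the closed-form solution of a linear regression of logit(μ_x) on age; a short R sketch, where mu and ages are hypothetical vectors of observed forces of mortality at the calibration ages:

```r
logit <- function(p) log(p / (1 - p))

# mu: observed forces of mortality at the calibration ages; ages: corresponding ages (hypothetical)
fit_kannisto <- function(mu, ages) {
  y     <- logit(mu)
  x_bar <- mean(ages)
  phi2  <- sum(y * (ages - x_bar)) / sum((ages - x_bar)^2)        # Eq. (17)
  phi1  <- exp(mean(y) - phi2 * x_bar)                            # Eq. (16)
  function(x) phi1 * exp(phi2 * x) / (1 + phi1 * exp(phi2 * x))   # Eq. (15), closure at any age x
}
```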

Under the Free ultimate age model, the ultimate age is not necessarily the same for all years. With 𝜔𝑡 being the last age for year 𝑡, and 𝑥0 the maximum age used in the calibration, the completion model is given by:

\[
\log(q_{x,t}) = \frac{\log(q_{x_0,t})}{(\omega_t - x_0)^2} \, (\omega_t - x)^2 \tag{18}
\]

for ages higher than 𝑥0. The retained value for 𝜔𝑡 minimizes the following expression, with side being equal to 10:

\[
\sum_{x = x_0 - side}^{x_0} \left[ \log(q_{x,t}) - \frac{\log(q_{x_0,t})}{(\omega_t - x_0)^2} \, (\omega_t - x)^2 \right]^2 \tag{19}
\]

The “Closure with constant rates” model is a simplified closure model. The forces of mortality 𝜇𝑥,𝑡 for ages going from 𝑥𝑚𝑎𝑥 to the ultimate age chosen by the user are kept constant for each year 𝑡. This approach, besides being straightforward and easy to implement, can also be considered the most risk-averse in the insurance scope. Basically, by underestimating the forces of mortality we consequently underestimate the probabilities of death as well. This is valid especially in the case of policies where benefits are paid until death, such as annuities. The importance resides in the fact that the insurer should foresee lower death probabilities than the actual ones in order to increase the provisions. It has to be noticed that, in certain cases, it is not advisable to apply this approach, for instance in the case of whole life insurance policies.

We underline that all the above models are applied for each year independently.

Before performing the table closure, the user can adjust the following parameters:

- Perform table closure?
- Closure methodology

The Closure phase interface, as depicted in Figure 15, allows the user to completely bypass applying a closure model. This can be done via the corresponding dropdown menu “Perform table closure?”. In case the user decides to employ a model, the selection between the aforementioned types of models is done via the “Closure methodology” dropdown menu. Note that the interface is dynamic, allowing for the selection of a different set of parameters for each distinct type of closure model.

We underline that, for the “Free ultimate age” model, no parameters are needed.

In case the user selects a “Denuit-Goderniaux” model, the tool allows for the selection between “Fixed” or “Adapted” types of calculation corresponding to the “Determination of the initial age for closure” parameter. “Fixed” implies that the calibration of the parameter 𝑐 is performed based only on the value of 𝑎𝑚𝑖𝑛, as defined by the “Initial Age for Closure model” parameter (see infra). “Adapted” implies that the calibration of the parameter 𝑐 is performed based on the optimal value of the age 𝑥. Basically, the value of 𝑐 is calculated for all the ages 𝑥 ranging from 𝑎𝑚𝑖𝑛 to max(𝑎𝑚𝑖𝑛; 𝑥𝑚𝑎𝑥 − 10). The 𝑅2 coefficient is computed by comparing the observed and closed values of log(𝑞𝑥). The optimal age 𝑥𝑜𝑝𝑡 is selected as the age for which the 𝑅2 is maximum. The “Ultimate Age” is the last considered age 𝜔, for which the death probability is equal to one.

Figure 15. Closure interface.

In case the user chooses a “Kannisto” model, the tool allows for the selection of the minimum and maximum ages that define the calculation range. Basically, the minimum age is 𝑥𝑚𝑖𝑛 = 𝑥1 and the maximum age is 𝑥𝑚𝑎𝑥 = 𝑥𝑛, corresponding to the parameters 𝛷1 and 𝛷2 as defined in Eq. (16) and Eq. (17). Although in theory there is no limitation for the ultimate age, the user has to choose one via the Last Closure Age parameter. This is necessary in order to obtain a finite life table.

In case the user selects a “Closure with constant rates” model, the tool allows for the selection of the maximum age.

After the parameter selection, the user can trigger the algorithm by clicking on the “Closure” button. As a result (see Figure 15), two distinct 3D graphs are produced, namely:

- Forces of Mortality Surface
- Closed Forces of Mortality Surface


The two graphs show a comparative analysis before and after applying the closure model on the log(𝑞𝑥). We recall that the log(𝑞𝑥) values are derived from the LC coefficients obtained in the “Projection” phase. Needless to say, the two surfaces will be the same if no closure model is employed.

10 BACKTESTING PHASE

The goal of this phase, as its name suggests, is to assess the quality of the projection via a backtest procedure. The first step consists in calculating life tables employing the LC model based on three distinct time frames. The time frames are obtained via different last calibration years, whereas the first calibration year will be equal to the first selected year 𝑡𝑚𝑖𝑛 as defined in the “Data Selection” phase (see Section 4).

Figure 16. Backtesting interface.

In order to assess the predictive power of the model, we divide the available data into two sets. The last calibration year will define the separation between the two sets. The first set will be employed to calibrate the model and project the mortality features, while the second set is used to assess the prediction.

It is important to underline that the data employed are obtained after applying the “Data Smoothing” phase with a locfit procedure, whereas the LC coefficients are not smoothed. In other words, no “Coefficient Smoothing” phase is applied and the calibration as detailed in Section 6.1.1 is applied. This stage is followed by the projection and the closure phases. The applied closure methodology is by default “Denuit-Goderniaux” with an “Initial Age for Closure model” (𝑎𝑚𝑖𝑛) being equal to max(𝑥𝑚𝑖𝑛; 𝑥𝑚𝑎𝑥 − 5).

We underline that the result of the backtest procedure will not impact the final life table result.

In order to perform the backtest the user can select the following parameters:

- Life expectancy type
- Age for life expectancy calculation
- Last calibration year 1
- Last calibration year 2
- Last calibration year 3
- Backtesting confidence interval

Via the “Life expectancy type” dropdown menu the user can select either Period (life expectancy computed on the mortality rates of a single year) or Prospective (life expectancy computed on the mortality rates of the cohort).

The “Age for life expectancy calculation” will define the age for which the remaining life expectancy has to be computed.

The life expectancy will be computed based on a best estimate projection of the LC time coefficient. The coefficient projection’s lower and upper limits will be calculated as well. The calculation will not be based on simulations, but will rely instead on the theoretical confidence interval. This confidence interval will be derived from the normal distribution of the error term resulting from the projection of the time coefficient. The value of this interval will be selected via the “Backtesting confidence interval”.
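As an illustration of the Period option, the remaining life expectancy at a given age can be computed from a vector of one-year death probabilities with a standard approximation; the q vector is hypothetical and the complete expectation is approximated by adding one half:

```r
# q: one-year death probabilities q_x, q_{x+1}, ... up to the ultimate age (hypothetical vector)
period_life_expectancy <- function(q) {
  p_surv <- cumprod(1 - q)        # k-year survival probabilities from age x
  sum(p_surv) + 0.5               # complete expectation of life, e_x ~ 0.5 + sum over k of kpx
}
```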

The three “Last calibration year” parameters define the three distinct time frames underlying the backtest procedure.

After the parameter selection, the user can trigger the algorithm by clicking on the “Backtesting” button. As a result (see Figure 16), a complete representation of the three distinct time frames is provided. The output graph presents the life expectancies and the corresponding confidence intervals for each year of the three distinct time frames. Additionally, the observed life expectancy is provided.

11 EXPORT PHASE

Via the “Export phase” all the results and additional reports can be downloaded for further use. The following exportable data is provided:

- Life Table
- Reduction Rates
- Report
- Model choices
- Model parameters

The “Life Table” represents the result obtained after having performed each compulsory step.

The “Reduction Rates” output is a list of all the calculated reduction rates. By definition, a reduction rate is the ratio between the mortality rate of a given year and age and the mortality rate at the same age in the last year selected in the input data set. Basically, for age 𝑥 and future year 𝑡, the ratio is 𝑞𝑥,𝑡/𝑞𝑥,𝑡𝑚𝑎𝑥. This provides a coefficient table that can be employed to transform the last observed/modelled mortality rates into future mortality rates.
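A short R sketch of this definition; q_future is a hypothetical matrix of projected rates with ages in rows and future years in columns, and q_last the rates of the last observed year:

```r
# q_future: projected mortality rates, ages in rows, future years in columns (hypothetical)
# q_last:   mortality rates at the last year of the input data set, same ages (hypothetical)
reduction_rates <- sweep(q_future, 1, q_last, "/")   # q_{x,t} / q_{x,t_max}, per age and year
```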


The “Report” is a pdf file containing a summary of the LC analysis. The user can add further comments in the corresponding text input controls and dropdown menus, followed by clicking on the “Insert Comments” button.

The “Model choices” is a table containing the different selections made by the user in terms of model choices.

The “Model parameters” is a table containing the complete list of parameters corresponding to the selected models.

Figure 17. Export interface.

12 BIBLIOGRAPHY

[1] R. D. Lee and L. Carter, "Modeling and Forecasting U.S. Mortality," Journal of the American Statistical Association, vol. 87, no. 419, pp. 659-671, 1992.

[2] H. Booth and L. Tickle, "Mortality modelling and forecasting: a review of methods," Annals of Actuarial Science, vol. 3, no. 1-2, pp. 3-43, 2008.

[3] C. Loader, "Locfit : An Introduction," Statistical Computing and Graphics Newsletter, April 1997. [Online]. Available: www.statistik.lmu.de/~leiten/Lehre/Material/GLM_0708/Tutorium/locfit.pdf.

[4] Wikipedia, "B-spline," [Online]. Available: https://en.wikipedia.org/wiki/B-spline.

[5] A. Mazza and A. Punzo, "DBKGrad: An R Package for Mortality Rates Graduation by Fixed and Adaptive Discrete Beta Kernel Techniques," Journal of Statistical Software, vol. 57, no. 2, 2014.

[6] A. J. Cairns, D. Blake, K. Dowd, G. D. Coughlan, D. Epstein, A. Ong and I. Balevich, "A quantitative comparison of stochastic mortality models using data from England and Wales and the United States," North American Actuarial Journal, vol. 13, no. 1, pp. 1-35, 2009.

[7] H. Booth, R. Hyndman, L. Tickle and P. de Jong, "Lee-Carter mortality forecasting: a multi-country comparison of variants and extensions," Demographic Research, vol. 15, no. 9, pp. 289-310, 2006.

[8] H. Booth, J. Maindonald and L. Smith, "Applying Lee-Carter under conditions of variable mortality decline," Population Studies, vol. 56, no. 3, pp. 325-336, 2002.

[9] L. Goodman, "Simple models for the analysis of association in cross-classifications having ordered categories," Journal of the American Statistical Association, vol. 74, pp. 537-552, 1979.

[10] A. Delwarde, M. Denuit and P. Eilers, "Smoothing the Lee-Carter and Poisson log-bilinear models for mortality forecasting: a penalized log-likelihood approach," Statistical Modelling, vol. 7, pp. 29-48, 2007.

[11] Wikipedia, "Autoregressive integrated moving average (ARIMA)," [Online]. Available: https://en.wikipedia.org/wiki/Autoregressive_integrated_moving_average#Estimation_of_coefficients.

[12] Wikipedia, "Akaike information criterion (AIC)," [Online]. Available: https://en.wikipedia.org/wiki/Akaike_information_criterion.

[13] Wikipedia, "Bayesian information criterion (BIC)," [Online]. Available: https://en.wikipedia.org/wiki/Bayesian_information_criterion.

[14] M. Denuit and A.-C. Goderniaux, "Closing and projecting lifetables using log-linear models," Bulletin of the Swiss Association of Actuaries, pp. 29-49, 2005.

[15] V. Kannisto, "Presentation at a workshop on old-age mortality," Odense University, Odense, Denmark, 1992.


Excellence: our outstanding feature

To deliver more than is expected from us, we attract the best people and develop their skills in the most cutting-edge techniques, supported by a robust and rigorous knowledge management framework.

Innovation: our founding ambition

Leveraging our profound academic roots, we are dedicated to creating inventive solutions by combining our extensive professional experience with the latest scientific research.

Integrity: our commitment

We put work ethics, the client's best interest and confidentiality at the foundation of our work. We are fully independent and dedicated to telling the truth.

Solution-driven: our focus

We produce tangible, long-term, sustainable value for our clients. We help our clients not only to reach the top, we help them reach the stable top.

Reliability: our characteristic

We never compromise on the quality of our work, the respect of deadlines & budgets and our other commitments. We don’t produce reports, we deliver results!

ALM, Portfolio Management & Quantitative Finance

- Implementation and calibration of stochastic models for valuation, trading and risk management purposes
- Time series analysis & modelling
- Pricing of financial instruments & development of ALM models
- Design/review/implementation of systematic trading & hedging strategies
- Business intelligence in ALM or Portfolio Management
- Tools development (valuation, pricing, hedging, portfolio replication, etc.)
- Design of Capital Management solutions

Insurance specialities

Life, Health and Pension

- DFA* Models
- Capital Requirement assessment
- Business valuation support
- Product development (pricing, profitability, ...) & Reserving
- Model validation

Non-Life

- Reserving: triangle methods, individual claims modelling
- Pricing: frequency and severity modelling, large claims analysis, credibility methods, commercial constraints
- DFA models: cash-flow projections, calibration of models
- Reinsurance: modelling covers, optimal reinsurance programs

Qualitative Risk Management, Restructuring & Operations

- Organization & Governance
- Businesses restructuring & change management
- Implementation and industrialization of processes
- Internal & regulatory reporting (KRIs & KPIs dashboards)
- Model Review frameworks
- Model Documentation

Risk & Portfolio Management


Xavier Maréchal

CEO

M +32 497 48 98 48

[email protected]

Aurélie Miller

Director

M +32 486 31 60 99

[email protected]

François Ducuroir

Managing Partner

M +32 472 72 32 05

[email protected]

Samuel Mahy

Director

M +32 498 04 23 90

[email protected]

Maciej Sterzynski

Managing Partner

M +32 485 97 09 16

[email protected]

Fabien Verdicq

Director

M +32 498 23 76 30

[email protected]

Reacfin s.a. is a consulting firm and a spin-off of the University of Louvain.

We develop innovative solutions and robust practical tools to manage our customers’ risks, products, capital & portfolios.

Linking Academic Excellence

with Market Best Practice

Reacfin s.a./n.v.

Place de l'Université 25

B-1348 Louvain-la-Neuve

www.reacfin.com

+ 32 (0)10 84 07 50