
Showing papers by "Jens Perch Nielsen published in 2015"


Journal ArticleDOI
TL;DR: It is proposed to model the counts of deaths directly by using a Poisson regression with an age–period–cohort structure, but without offset, to avoid the need for an exposure measure; the model is reparameterized in terms of freely varying parameters to sidestep the identification problem.
Abstract: It is of considerable interest to forecast future mesothelioma mortality. No measures of exposure are available, so it is not straightforward to apply a dose–response model. It is proposed to model the counts of deaths directly by using a Poisson regression with an age–period–cohort structure, but without offset. Traditionally, the age–period–cohort model is viewed as suffering from an identification problem. It is shown how to reparameterize the model in terms of freely varying parameters to avoid this problem, how to conduct inference and how to construct distribution forecasts.
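The identification problem mentioned in the abstract can be made concrete with a small numerical check: on an age-period grid the cohort index equals period minus age, so the three linear time trends are perfectly collinear. A minimal numpy sketch (synthetic grid, not the authors' reparameterization):

```python
import numpy as np

# Illustrative only: the age-period-cohort identification problem.
# On a grid of ages and periods, cohort = period - age, so the linear
# parts of the three time scales are perfectly collinear.
ages = np.arange(5)
periods = np.arange(5)
A, P = np.meshgrid(ages, periods, indexing="ij")
C = P - A  # cohort index

# Design with intercept and the three linear time trends.
X = np.column_stack([np.ones(A.size), A.ravel(), P.ravel(), C.ravel()])
rank = np.linalg.matrix_rank(X)
print(rank)  # 3, not 4: one linear trend is redundant
```

Any two of the three linear trends span the third, which is why the model must be recast in terms of freely varying parameters (for example, via second differences) before fitting.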

27 citations


Journal ArticleDOI
TL;DR: In this paper, the authors investigated the double chain ladder model when other knowledge is available, focusing on two specific types of prior knowledge, namely prior knowledge on the number of zero-claims for each underwriting year and prior knowledge about the relationship between the development of the claim and its mean severity.
Abstract: Double chain ladder demonstrated how the classical chain ladder technique can be broken down into separate components. It was shown that under certain model assumptions and via one particular estimation technique, it is possible to interpret the classical chain ladder method as a model of the observed number of counts with a built-in delay function from when a claim is reported until it is paid. In this paper, we investigate the double chain ladder model further and consider the case when other knowledge is available, focusing on two specific types of prior knowledge, namely prior knowledge on the number of zero-claims for each underwriting year and prior knowledge about the relationship between the development of the claim and its mean severity. Both types of prior knowledge readily lend themselves to be included in the double chain ladder framework.
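For orientation, the classical chain ladder step that double chain ladder decomposes can be sketched on a toy cumulative run-off triangle (synthetic numbers; the count/delay decomposition and the prior-knowledge extensions are not reproduced here):

```python
import numpy as np

# A minimal chain ladder sketch on a 3x3 cumulative run-off triangle.
# Rows are underwriting years, columns are development years; NaN marks
# the unobserved lower triangle to be forecast.
tri = np.array([
    [100.0, 150.0, 165.0],
    [110.0, 170.0, np.nan],
    [120.0, np.nan, np.nan],
])

n = tri.shape[0]
factors = []
for j in range(n - 1):
    rows = ~np.isnan(tri[:, j + 1])
    # Volume-weighted development factor for column j -> j + 1.
    factors.append(tri[rows, j + 1].sum() / tri[rows, j].sum())

# Fill the lower triangle by rolling the last observed value forward.
full = tri.copy()
for i in range(n):
    for j in range(n - 1):
        if np.isnan(full[i, j + 1]):
            full[i, j + 1] = full[i, j] * factors[j]
```

With these numbers the second development factor is 165/150 = 1.1, so the second underwriting year develops to 170 × 1.1 = 187.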

21 citations


Journal ArticleDOI
TL;DR: In this paper, a portfolio selection problem of an investor with a deterministic savings plan who aims to have a target wealth value at retirement is solved, where the investor is an expected power utility-maximizer.
Abstract: We solve a portfolio selection problem of an investor with a deterministic savings plan who aims to have a target wealth value at retirement. The investor is an expected power utility-maximizer. The target wealth value is the maximum wealth that the investor can have at retirement. By constraining the investor to have no more than the target wealth at retirement, we find that the lower quantiles of the terminal wealth distribution increase, so the risk of poor financial outcomes is reduced. The drawback of the optimal strategy is that the possibility of gains above the target wealth is eliminated.

21 citations


Journal ArticleDOI
TL;DR: In this paper, the authors show that recent published mortality projections with unobserved exposure can be understood as structured density estimation, where the structured density is only observed on a sub-sample corresponding to historical calendar times.
Abstract: This paper shows that recently published mortality projections with unobserved exposure can be understood as structured density estimation. The structured density is only observed on a sub-sample corresponding to historical calendar times. The mortality forecast is obtained by extrapolating the structured density to future calendar times, using the fact that the components of the density are identified within sample. The new method is illustrated on the important practical problem of forecasting mesothelioma mortality for the UK population. Full asymptotic theory is provided. The theory is given in such generality that it also introduces mathematical statistical theory for the recent continuous chain ladder model. This allows a modern approach to classical reserving techniques used every day in non-life insurance companies around the globe. Applications to mortality data and non-life insurance data are provided, along with relevant small-sample simulation studies.

19 citations


Journal ArticleDOI
TL;DR: In this article, the structural assumption is made that the density is a product of one-dimensional functions and the theory is quite general in assuming the shape of the region where density is observed.
Abstract: This paper generalizes recent proposals of density forecasting models and develops theory for this class of models. In density forecasting, the density of observations is estimated in regions where the density is not observed. Identification of the density in such regions is guaranteed by structural assumptions on the density that allow exact extrapolation. In this paper, the structural assumption is made that the density is a product of one-dimensional functions. The theory is quite general in the shape it allows for the region where the density is observed. Such models arise naturally when the time point of an observation can be written as the sum of two terms (e.g. onset and incubation period of a disease). The developed theory also allows for a multiplicative factor of seasonal effects, which are present in many actuarial, biostatistical, econometric and statistical studies. Smoothing estimators based on backfitting are proposed, and full asymptotic theory is derived for them. A practical example from the insurance business is given, producing a within-year budget of reported insurance claims. A small-sample study supports the theoretical results.
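The extrapolation idea can be illustrated with a discrete toy version: a product density g(i)h(j) observed only on a run-off region i + j < n identifies both components up to scale, so the unobserved corner can be filled in. A hand-rolled sketch (synthetic values; not the paper's backfitting estimator):

```python
import numpy as np

# Discrete sketch of multiplicative density forecasting: f(i, j) = g(i) * h(j)
# is observed only where i + j < n, and the product structure lets us
# extrapolate into the unobserved corner i + j >= n.
n = 4
g = np.array([0.4, 0.3, 0.2, 0.1])
h = np.array([0.5, 0.25, 0.15, 0.10])
full = np.outer(g, h)

observed = np.full((n, n), np.nan)
for i in range(n):
    for j in range(n - i):
        observed[i, j] = full[i, j]

# Ratios along a row (or column) are free of the other component,
# so each component is identified within the observed region.
h_ratio = observed[0, 1] / observed[0, 0]   # h[1] / h[0]
g_ratio = observed[1, 0] / observed[0, 0]   # g[1] / g[0]

# Extrapolate an unobserved cell: f(3, 1) = f(3, 0) * h[1] / h[0].
pred = observed[3, 0] * h_ratio
```

The predicted cell matches the true (but unobserved) value exactly here because the toy data satisfy the product assumption without noise; the paper's smoothing estimators handle the noisy case.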

18 citations


Journal ArticleDOI
TL;DR: In this paper, the authors take the actuarial long-term view and base their prediction on yearly data from 1872 through 2014, focusing on nonlinear relationships between a set of covariates.
Abstract: One of the most studied questions in economics and finance is whether empirical models can be used to predict equity returns or premiums. In this paper, we take the actuarial long-term view and base our prediction on yearly data from 1872 through 2014. While many authors favor the historical mean or other parametric methods, this article focuses on nonlinear relationships between a set of covariates. A bootstrap test of the true functional form of the conditional expected returns confirms that yearly returns on the S&P500 are predictable. The inclusion of prior knowledge in our nonlinear model shows notable improvement in the prediction of excess stock returns compared to a fully nonparametric model. Statistically, a bias and dimension reduction method is proposed to impose more structure on the estimation process, as an adequate way to circumvent the curse of dimensionality.
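As a point of reference, the fully nonparametric building block is a kernel regression of returns on a covariate. A minimal Nadaraya-Watson sketch on synthetic data (the paper's estimator adds prior knowledge and a bias/dimension reduction step on top; data, bandwidth and seed here are illustrative):

```python
import numpy as np

# Synthetic regression data: y = sin(x) + noise stands in for
# "return on covariate" data; this is not the S&P500 series.
rng = np.random.default_rng(0)
x = rng.uniform(-2, 2, 200)
y = np.sin(x) + rng.normal(0, 0.1, 200)

def nw(x0, x, y, h=0.3):
    """Local constant (Nadaraya-Watson) estimate at x0 with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return np.sum(w * y) / np.sum(w)

fit = nw(0.5, x, y)  # should be near sin(0.5) ~ 0.48
```

Fully nonparametric fits like this degrade quickly as covariates are added, which is the curse of dimensionality the paper's structured approach is designed to circumvent.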

16 citations


Journal ArticleDOI
TL;DR: In this paper, the authors derive explicit expressions for the maximum likelihood estimators in terms of development factors that are geometric averages, and derive the distribution of these estimators; the analysis is invariant to traditional measures of exposure.
Abstract: The log-normal reserving model is considered. The contribution of the paper is to derive explicit expressions for the maximum likelihood estimators. These are expressed in terms of development factors which are geometric averages. The distribution of the estimators is derived. It is shown that the analysis is invariant to traditional measures of exposure.
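The flavour of the result can be sketched numerically: under a log-normal model the natural development factor is a geometric average of individual link ratios, whereas classical chain ladder uses a volume-weighted arithmetic ratio. A toy comparison on synthetic cumulative claims (illustrative numbers, not the paper's full estimator):

```python
import numpy as np

# Cumulative claims for three underwriting years at two development points.
c0 = np.array([100.0, 110.0, 120.0])   # development year 0
c1 = np.array([150.0, 170.0, 175.0])   # development year 1

links = c1 / c0                               # individual link ratios
geo_factor = np.exp(np.mean(np.log(links)))   # geometric average (log-normal flavour)
cl_factor = c1.sum() / c0.sum()               # chain ladder volume-weighted average
```

The two factors are close but not identical; the geometric average is the one that falls out of maximizing a log-normal likelihood, since it is the exponential of the arithmetic mean of the log link ratios.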

9 citations


Posted Content
TL;DR: In this paper, an investment strategy is analyzed for an investor with a savings plan for retirement that constrains the terminal wealth accumulated after the savings period by setting an upper and a lower bound.
Abstract: We analyze an investment strategy for an investor with a savings plan for retirement that consists of constraining the terminal wealth accumulated after the savings period by setting an upper and a lower bound. We carry out a simulation of the terminal wealth after a savings period of thirty years using daily, weekly, monthly and yearly updates. We calculate the percentiles of the final wealth and the corresponding lifetime annuity that the pension saver will receive during the consumption period. We observe that the simulated values converge to the theoretical values of the percentiles as the frequency of updates increases. Finally, in the numerical example the effect of inflation is also considered.
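The simulation set-up can be sketched as follows: terminal wealth after thirty years under geometric Brownian motion, simulated at two update frequencies, with a lower percentile read off the Monte Carlo sample (parameters are illustrative, and the paper's upper and lower wealth bounds are omitted):

```python
import numpy as np

# Monte Carlo sketch of terminal wealth per unit initial investment after
# 30 years of geometric Brownian motion, at two update frequencies.
rng = np.random.default_rng(1)
mu, sigma, years, n_paths = 0.05, 0.15, 30, 10_000

def terminal_wealth(steps_per_year):
    dt = 1.0 / steps_per_year
    n = years * steps_per_year
    z = rng.normal(size=(n_paths, n))
    # Exact log-increments, so finer steps change only the discretization,
    # not the terminal distribution.
    log_w = np.sum((mu - 0.5 * sigma**2) * dt + sigma * np.sqrt(dt) * z, axis=1)
    return np.exp(log_w)

p5_yearly = np.percentile(terminal_wealth(1), 5)
p5_monthly = np.percentile(terminal_wealth(12), 5)
```

As in the paper's experiment, percentile estimates at different update frequencies agree up to Monte Carlo error, converging to the theoretical log-normal quantiles as the number of paths grows.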

7 citations


Posted Content
TL;DR: Mortality among males below 90 years of age is predicted to peak at 2079 deaths in 2017, and the response-only model is a simple benchmark that forecasts just as well as more complicated models.
Abstract: Background: It is of considerable interest to forecast the future burden of mesothelioma mortality. Data on deaths are available, whereas no measure of asbestos exposure is available. Methods: We compare two Poisson models: first, a response-only model with an age-cohort specification; second, a dose-response model using a synthetic exposure measure. Results: The response-only model has 5% higher peak mortality than the dose-response model. The former performs better in out-of-sample comparison. Conclusion: Mortality among males below 90 years of age is predicted to peak at 2079 deaths in 2017. The response-only model is a simple benchmark that forecasts just as well as more complicated models.

4 citations


Journal ArticleDOI
TL;DR: In this article, a new bias-reducing method for kernel hazard estimation called global polynomial adjustment (GPA) is proposed; it is a global correction applicable to any kernel hazard estimator.
Abstract: This paper introduces a new bias reducing method for kernel hazard estimation. The method is called global polynomial adjustment (GPA). It is a global correction which is applicable to any kernel hazard estimator. The estimator works well from a theoretical point of view as it asymptotically reduces bias with unchanged variance. A simulation study investigates the finite-sample properties of GPA. The method is tested on local constant and local linear estimators. From the simulation experiment we conclude that the global estimator improves the goodness-of-fit. An especially encouraging result is that the bias-correction works well for small samples, where traditional bias reduction methods have a tendency to fail.
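For context, a local constant kernel hazard estimator of the kind GPA corrects can be sketched as a smoothed occurrence count over a crude at-risk count (synthetic exponential lifetimes with true hazard 0.5; the global polynomial correction itself is not reproduced):

```python
import numpy as np

# Synthetic lifetimes: exponential with scale 2, so the true hazard is 0.5.
rng = np.random.default_rng(2)
t = rng.exponential(scale=2.0, size=5000)

def hazard(x0, t, h=0.5):
    """Local constant kernel hazard: smoothed deaths over the at-risk count.

    This is a crude occurrence/exposure approximation; GPA would multiply
    an estimator like this by a globally fitted polynomial correction.
    """
    k = np.exp(-0.5 * ((t - x0) / h) ** 2) / (h * np.sqrt(2 * np.pi))
    occurrence = k.sum()            # kernel-weighted deaths near x0
    exposure = np.sum(t >= x0)      # individuals still at risk at x0
    return occurrence / exposure

lam = hazard(1.0, t)  # should be close to the true hazard 0.5
```

Estimators of this type carry smoothing bias of order h squared, which is exactly the term a bias-reduction method such as GPA targets.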


Book ChapterDOI
21 Aug 2015
TL;DR: In this paper, the authors demonstrate the use of nonparametric intensity estimation, including construction of pointwise confidence sets, for analyzing rating transition data and find that transition intensities away from the class studied here for illustration strongly depend on the direction of the previous move but that this dependence vanishes after 2-3 years.
Abstract: We demonstrate the use of non-parametric intensity estimation, including the construction of pointwise confidence sets, for analyzing rating transition data. We find that transition intensities away from the rating class studied here for illustration depend strongly on the direction of the previous move, but that this dependence vanishes after 2-3 years.