Showing papers in "Scandinavian Actuarial Journal in 2011"


Journal ArticleDOI
TL;DR: In this article, a multi-dimensional data analysis approach to the Lee-Carter (LC) model of mortality trends is proposed, which combines estimates for the three modes: (1) time, (2) age groups and (3) different populations.
Abstract: In this paper, we focus on a Multi-dimensional Data Analysis approach to the Lee–Carter (LC) model of mortality trends. In particular, we extend the bilinear LC model and specify a new model based on a three-way structure, which incorporates a further component in the decomposition of the log-mortality rates. A multi-way component analysis is performed using the Tucker3 model. The suggested methodology allows us to obtain combined estimates for the three modes: (1) time, (2) age groups and (3) different populations. From the results obtained by the Tucker3 decomposition, we can jointly compare, in both a numerical and graphical way, the relationships among all three modes and obtain a time-series component as a leading indicator of the mortality trend for a group of populations. Further, we carry out a correlation analysis of the estimated trends in order to assess the reliability of the results of the three-way decomposition. The model's goodness of fit is assessed using an analysis of the residuals. Fin...

77 citations
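For orientation, the sketch below shows only the classical bilinear Lee–Carter step that the paper generalises, fitted by SVD on a synthetic matrix of log-mortality rates; the three-way Tucker3 decomposition proposed in the article is not reproduced, and the data and normalisations are illustrative assumptions.

```python
import numpy as np

# Minimal sketch of the classical bilinear Lee-Carter step (single population):
# log m(x,t) = a_x + b_x * k_t, fitted by SVD on synthetic log-rates.
rng = np.random.default_rng(0)
ages, years = 10, 20
log_m = (-8 + 0.08 * np.arange(ages)[:, None] - 0.02 * np.arange(years)[None, :]
         + 0.01 * rng.standard_normal((ages, years)))   # synthetic log-mortality

a_x = log_m.mean(axis=1)                      # age profile
centred = log_m - a_x[:, None]
U, s, Vt = np.linalg.svd(centred, full_matrices=False)
b_x = U[:, 0] / U[:, 0].sum()                 # usual normalisation: sum(b_x) = 1
k_t = s[0] * Vt[0, :] * U[:, 0].sum()         # time index, sum(k_t) close to 0
fitted = a_x[:, None] + np.outer(b_x, k_t)    # rank-1 reconstruction
print("share of variance explained by first component:", s[0]**2 / (s**2).sum())
```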


Journal ArticleDOI
TL;DR: In this article, the authors consider the composite Lognormal-Pareto model proposed by Cooray & Ananda (2005) and suitably modified by Scollnik (2007), and extend it to allow for heterogeneity with respect to the threshold, letting it vary among observations.
Abstract: This paper further considers the composite Lognormal-Pareto model proposed by Cooray & Ananda (2005) and suitably modified by Scollnik (2007). This model is based on a Lognormal density up to an unknown threshold value and a Pareto density thereafter. Instead of using a single threshold value applying uniformly to the whole data set, the model proposed in the present paper allows for heterogeneity with respect to the threshold and lets it vary among observations. Specifically, the threshold value for a particular observation is seen as the realization of a positive random variable, and the mixed composite Lognormal-Pareto model is obtained by averaging over the population of interest. The performance of the composite Lognormal-Pareto model and of its mixed extension is compared using the well-known Danish fire losses data set.

75 citations
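A minimal sketch of a composite density of this kind: a truncated lognormal body below a threshold and a Pareto tail above it, combined with a mixing weight. The weight r and threshold theta are left as free inputs here (in Cooray & Ananda (2005) and Scollnik (2007) they are tied to continuity conditions), the random-threshold mixing of the paper is not implemented, and all parameter values are illustrative.

```python
import numpy as np
from scipy import stats

def composite_lnorm_pareto_pdf(x, r, theta, mu, sigma, alpha):
    """Sketch of a composite density: truncated lognormal body on (0, theta],
    Pareto tail on (theta, inf), combined with weight r."""
    x = np.asarray(x, dtype=float)
    lnorm = stats.lognorm(s=sigma, scale=np.exp(mu))
    body = np.where(x <= theta, lnorm.pdf(x) / lnorm.cdf(theta), 0.0)
    tail = np.where(x > theta, alpha * theta**alpha / x**(alpha + 1), 0.0)
    return r * body + (1 - r) * tail

x = np.linspace(0.1, 20.0, 5)
print(composite_lnorm_pareto_pdf(x, r=0.7, theta=5.0, mu=1.0, sigma=0.5, alpha=2.5))
```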


Journal ArticleDOI
TL;DR: In this article, recursive formulae are derived for the probabilities of a wide variety of mixed Poisson distributions, including transformed mixing random variables, shifted and truncated mixing distributions, compound distributions, and tail probabilities.
Abstract: Recursive formulae are derived for the probabilities of a wide variety of mixed Poisson distributions. Known results are unified and extended. Related formulae are discussed for transformed mixing random variables, shifted and truncated mixing distributions, compound distributions, and tail probabilities. Applications of these models are briefly discussed.

72 citations
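As one concrete, well-known instance of such a recursion (not the paper's general result): mixing a Poisson over a gamma distribution gives the negative binomial, whose probabilities satisfy a Panjer-type recursion p_k = (a + b/k) p_{k-1}. The sketch below checks this against scipy; the parameter values are illustrative.

```python
import numpy as np
from scipy import stats

# Negative binomial as a gamma-mixed Poisson: Panjer recursion p_k = (a + b/k) p_{k-1}
r, p = 3.0, 0.4                      # illustrative parameters
a, b = 1 - p, (r - 1) * (1 - p)
probs = [p**r]                       # p_0
for k in range(1, 10):
    probs.append((a + b / k) * probs[-1])

print(np.allclose(probs, stats.nbinom.pmf(np.arange(10), r, p)))  # True
```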


Journal ArticleDOI
TL;DR: In this paper, the authors consider a classical continuous-time risk model in which claims are reinsured under a treaty with retention level b, where the maximal retention level means 'no reinsurance' and b=0 means 'full reinsurance'.
Abstract: In this paper we consider a classical continuous time risk model, where the claims are reinsured by some reinsurance with retention level b, where the maximal value of b means 'no reinsurance' and b=0 means 'full reinsurance'. The insurer can change the retention level continuously. To prevent negative surplus the insurer has to inject additional capital. The problem is to minimise the expected discounted cost of these capital injections over all admissible reinsurance strategies. We show that an optimal reinsurance strategy exists. For some special cases we are able to give the optimal strategy explicitly; in other cases the method is illustrated only numerically.

43 citations
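The sketch below only illustrates the objective being minimised: the expected discounted capital injections for one fixed proportional retention level b, estimated by plain Monte Carlo under assumed Poisson arrivals, exponential claims and a loaded reinsurance premium. It is not the paper's optimal control solution, and every parameter and modelling choice is an assumption made for illustration.

```python
import numpy as np

def expected_discounted_injections(b, u0=10.0, lam=1.0, mean_claim=1.0,
                                   gross_premium=1.5, loading=0.2, delta=0.03,
                                   horizon=100.0, n_paths=2000, seed=1):
    """Monte Carlo sketch: expected discounted capital injections when a fixed
    proportional retention level b is kept over the whole horizon
    (b = 1: no reinsurance, b = 0: full reinsurance)."""
    rng = np.random.default_rng(seed)
    # reinsurance premium assumed proportional to the ceded expected claims
    net_premium = gross_premium - (1 - b) * (1 + loading) * lam * mean_claim
    total = np.zeros(n_paths)
    for i in range(n_paths):
        t, surplus = 0.0, u0
        while True:
            w = rng.exponential(1 / lam)        # time to next claim
            if t + w > horizon:
                break
            t += w
            surplus += net_premium * w - b * rng.exponential(mean_claim)
            if surplus < 0.0:                   # inject capital back to zero
                total[i] += -surplus * np.exp(-delta * t)
                surplus = 0.0
    return total.mean()

print(expected_discounted_injections(b=0.8))
```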



Journal ArticleDOI
TL;DR: In this paper, the authors supplement the literature by adding the log-folded-normal and log-folded-t families, and fit the newly proposed distributions to data representing the total damage done by 827 fires in Norway in the year 1988.
Abstract: A rich variety of probability distributions has been proposed in the actuarial literature for fitting of insurance loss data. Examples include: lognormal, log-t, various versions of Pareto, loglogistic, Weibull, gamma and its variants, and generalized beta of the second kind distributions, among others. In this paper, we supplement the literature by adding the log-folded-normal and log-folded-t families. Shapes of the density function and key distributional properties of the ‘folded’ distributions are presented along with three methods for the estimation of parameters: method of maximum likelihood; method of moments; and method of trimmed moments. Further, large and small-sample properties of these estimators are studied in detail. Finally, we fit the newly proposed distributions to data which represent the total damage done by 827 fires in Norway for the year 1988. The fitted models are then employed in a few quantitative risk management examples, where point and interval estimates for several value-at-r...

32 citations
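One natural reading of the log-folded-normal construction is X = exp(|Z|) with Z normal, which has support [1, infinity) and density f_Y(ln x)/x, where f_Y is the folded-normal density. The sketch below implements that density and checks numerically that it integrates to one; the exact parameterisation used in the paper may differ.

```python
import numpy as np
from scipy import stats
from scipy.integrate import quad

def log_folded_normal_pdf(x, mu, sigma):
    """Sketch of a log-folded-normal density: X = exp(|Z|), Z ~ N(mu, sigma^2).
    Support is [1, inf); the paper's exact parameterisation may differ."""
    x = np.asarray(x, dtype=float)
    y = np.log(x)
    folded = (stats.norm.pdf(y, loc=mu, scale=sigma)
              + stats.norm.pdf(y, loc=-mu, scale=sigma))
    return np.where(x >= 1.0, folded / x, 0.0)

# quick check that the density integrates to ~1
print(quad(log_folded_normal_pdf, 1.0, np.inf, args=(0.5, 1.0))[0])
```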


Journal ArticleDOI
TL;DR: In this article, a non-trivial extension to a stochastic instantaneous interest rate is presented for Erlang claims number processes, and for the Ho-Lee-Merton and Vasicek interest rate models.
Abstract: Formulas have been obtained for the moments of the discounted aggregate claims process, for a constant instantaneous interest rate, and for a claims number process that is an ordinary or a delayed renewal process. In this paper, we present explicit formulas for the first two moments and the joint moment of this risk process, for a non-trivial extension to a stochastic instantaneous interest rate. Examples are given for Erlang claims number processes, and for the Ho–Lee–Merton and the Vasicek interest rate models.

30 citations
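For the constant-rate starting point mentioned in the abstract, the first moment of the discounted compound Poisson aggregate claims is E[Z(t)] = lam * E[X] * (1 - exp(-delta*t)) / delta. The sketch below checks this by simulation under assumed exponential claim sizes; the paper's stochastic interest rate extensions (Ho–Lee–Merton, Vasicek) are not reproduced.

```python
import numpy as np

# First moment of discounted aggregate claims with a CONSTANT force of interest
# delta and Poisson(lam) arrivals, checked by simulation.
rng = np.random.default_rng(2)
lam, mean_claim, delta, t, n_paths = 2.0, 1.5, 0.05, 10.0, 20_000

closed_form = lam * mean_claim * (1 - np.exp(-delta * t)) / delta

sims = np.empty(n_paths)
for i in range(n_paths):
    n = rng.poisson(lam * t)
    arrival_times = np.sort(rng.uniform(0.0, t, size=n))  # Poisson arrivals given N
    claims = rng.exponential(mean_claim, size=n)
    sims[i] = np.sum(np.exp(-delta * arrival_times) * claims)

print(closed_form, sims.mean())
```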


Journal ArticleDOI
TL;DR: The notion of distortion of copulas, a natural extension of distortion within the univariate framework, is examined, and the formula developed by Genest & Rivest (2001) for computing the distribution of the probability integral transformation of a random vector is extended to the distortion framework.
Abstract: This article examines the notion of distortion of copulas, a natural extension of distortion within the univariate framework. We study three approaches to this extension: (1) distortion of the margins alone while keeping the original copula structure; (2) distortion of the margins while simultaneously altering the copula structure; and (3) synchronized distortion of the copula and its margins. When applying distortion within the multivariate framework, it is important to preserve the properties of a copula function. For the first two approaches, this is a rather straightforward result; however, for the third approach, the proof has been exquisitely constructed in Morillas (2005). These three approaches unify the different types of multivariate distortion that are scattered across the literature. Our contribution in this paper is to further consider this unifying framework: we give numerous examples to illustrate, and we examine their properties particularly with some aspects of ordering multivariate ...

29 citations
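A minimal sketch of approach (1), distorting the margins while keeping the copula structure, using a Clayton copula, exponential margins and a power distortion g(u) = u^gamma; all of these choices are illustrative assumptions, and approaches (2) and (3) from the article are not implemented.

```python
import numpy as np

def clayton_copula(u, v, theta=2.0):
    """Clayton copula C(u, v) = (u^-theta + v^-theta - 1)^(-1/theta)."""
    return (u**(-theta) + v**(-theta) - 1.0)**(-1.0 / theta)

def power_distortion(u, gamma=0.7):
    """A simple distortion g(u) = u^gamma (increasing, g(0) = 0, g(1) = 1)."""
    return u**gamma

# Approach (1): distort the margins, keep the copula structure.
# Joint DF of the distorted risk: H(x, y) = C(g(F1(x)), g(F2(y))).
F1 = lambda x: 1.0 - np.exp(-x)          # Exp(1) margin (illustrative)
F2 = lambda y: 1.0 - np.exp(-0.5 * y)    # Exp(0.5) margin (illustrative)

x, y = 1.0, 2.0
undistorted = clayton_copula(F1(x), F2(y))
distorted = clayton_copula(power_distortion(F1(x)), power_distortion(F2(y)))
print(undistorted, distorted)
```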


Journal ArticleDOI
TL;DR: This work is among the first to either compute or approximate finite time ruin probabilities in the perturbed risk model, and develops an efficient recursive procedure by recognizing a repeating structure in the probability matrices the authors work with.
Abstract: In this paper, we consider a class of perturbed risk processes that have an underlying Markov structure, including Markov-modulated risk processes, and Sparre–Andersen risk processes when both inter-claim times and claim sizes are phase-type. We apply the Erlangization method to the risk process in the class in order to obtain an accurate approximation of the finite time ruin probability. In addition, we develop an efficient recursive procedure by recognizing a repeating structure in the probability matrices we work with. We believe the present work is among the first to either compute or approximate finite time ruin probabilities in the perturbed risk model.

29 citations
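The idea of Erlangization is to replace the fixed horizon T by an Erlang(n, n/T) random horizon, which concentrates at T as n grows, so that the ruin probability before the random horizon approximates the finite-time ruin probability. The brute-force Monte Carlo sketch below illustrates this in a plain classical compound Poisson model; the paper instead computes these quantities analytically for perturbed Markov-structured models, and all parameters here are illustrative.

```python
import numpy as np

def ruin_before(u0, horizon, lam, mean_claim, premium, rng):
    """Did ruin occur before `horizon` in a classical compound Poisson model?"""
    t, surplus = 0.0, u0
    while True:
        w = rng.exponential(1 / lam)
        if t + w > horizon:
            return False
        t += w
        surplus += premium * w - rng.exponential(mean_claim)
        if surplus < 0.0:
            return True

# Erlangization: psi(u, T) is approximated by the ruin probability before an
# Erlang(n, n/T) horizon, i.e. a Gamma(shape=n, scale=T/n) horizon.
rng = np.random.default_rng(3)
u0, T, n_stages, n_paths = 5.0, 10.0, 8, 10_000
fixed = np.mean([ruin_before(u0, T, 1.0, 1.0, 1.2, rng) for _ in range(n_paths)])
erlang = np.mean([ruin_before(u0, rng.gamma(n_stages, T / n_stages),
                              1.0, 1.0, 1.2, rng) for _ in range(n_paths)])
print(fixed, erlang)
```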


Journal ArticleDOI
TL;DR: In this paper, a simple Poisson cluster model for the payment numbers and the corresponding total payments for insurance claims arriving in a given year is considered, and conditions are studied under which the predictions are asymptotically linear as the number of past payments becomes large.
Abstract: We consider a simple Poisson cluster model for the payment numbers and the corresponding total payments for insurance claims arriving in a given year. Due to the Poisson structure one can give reasonably explicit expressions for the prediction of the payment numbers and total payments in future periods given the past observations of the payment numbers. One can also derive reasonably explicit expressions for the corresponding prediction errors. In the (a, b) class of Panjer's claim size distributions, these expressions can be evaluated by simple recursive algorithms. We study the conditions under which the predictions are asymptotically linear as the number of past payments becomes large. We also demonstrate that, in other regimes, the prediction may be far from linear. For example, a staircase-like pattern may arise as well. We illustrate how the theory works on real-life data, also in comparison with the chain ladder method.

29 citations
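As a reference point for the comparison mentioned at the end of the abstract, the sketch below runs the standard development-factor chain ladder on a small illustrative cumulative run-off triangle; it is not the paper's Poisson cluster predictor.

```python
import numpy as np

# Standard chain ladder on an illustrative 4x4 cumulative triangle (NaN = future).
tri = np.array([[100., 160., 185., 195.],
                [110., 175., 200., np.nan],
                [120., 190., np.nan, np.nan],
                [130., np.nan, np.nan, np.nan]])

n = tri.shape[0]
factors = []
for j in range(n - 1):
    rows = ~np.isnan(tri[:, j + 1])
    factors.append(tri[rows, j + 1].sum() / tri[rows, j].sum())

filled = tri.copy()
for i in range(n):
    for j in range(n - 1):
        if np.isnan(filled[i, j + 1]):
            filled[i, j + 1] = filled[i, j] * factors[j]

print(np.round(filled, 1))                                  # completed triangle
print(np.sum(filled[:, -1] - np.nanmax(tri, axis=1)))       # total reserve
```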


Journal ArticleDOI
TL;DR: In this article, the authors propose a methodology for assessing risk dependencies in the valuation of the Solvency II capital requirement, which is relevant to the analysis of risk diversification in the aggregation process.
Abstract: In the valuation of the Solvency II capital requirement, the correct appraisal of risk dependencies acquires particular relevance. These dependencies refer to the recognition of risk diversification in the aggregation process, and there are different levels of aggregation and hence different types of diversification. For instance, for a non-life company, at the first level the risk components of each single line of business (e.g. premium, reserve, and CAT risks) need to be combined into the overall portfolio; the second level regards the aggregation of different kinds of risks, for example market and underwriting risk; and finally various solo legal entities could be joined together in a group. Solvency II allows companies to capture these diversification effects in capital requirement assessment, but the identification of a proper methodology can represent a delicate issue. Indeed, while internal models based on simulation approaches usually permit to obtain the portfolio multivariate distribution only in the i...
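For context, the aggregation the abstract refers to is often written, at each level, as a square-root formula over a correlation matrix, SCR_agg = sqrt(sum_ij rho_ij * SCR_i * SCR_j). The sketch below evaluates it for illustrative stand-alone capital charges and an illustrative correlation matrix; the paper's own copula/simulation-based treatment is not reproduced.

```python
import numpy as np

# Square-root (correlation matrix) aggregation of stand-alone capital charges.
# The SCR values and the correlation matrix below are purely illustrative.
scr = np.array([100.0, 60.0, 40.0])          # e.g. premium, reserve, CAT risk
rho = np.array([[1.00, 0.50, 0.25],
                [0.50, 1.00, 0.25],
                [0.25, 0.25, 1.00]])

scr_agg = np.sqrt(scr @ rho @ scr)
print(scr_agg, "diversification benefit:", scr.sum() - scr_agg)
```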

Journal ArticleDOI
TL;DR: In this paper, the authors established statistical claims models for the coherence between externally inflicted water damage to private buildings in Norway and selected meteorological variables, based on these models and downscaled climate predictions from the Hadley centre HadAM3H climate model, the estimated loss level of a future scenario period (2071-2100) is compared to that of a control period (1961-1990).
Abstract: The insurance industry, like other parts of the financial sector, is vulnerable to climate change. Life as well as non-life products are affected, and knowledge of future loss levels is valuable. Risk and premium calculations may be updated accordingly, and dedicated loss-preventive measures can be communicated to customers and regulators. We have established statistical claims models for the coherence between externally inflicted water damage to private buildings in Norway and selected meteorological variables. Based on these models and downscaled climate predictions from the Hadley Centre HadAM3H climate model, the estimated loss level of a future scenario period (2071–2100) is compared to that of a control period (1961–1990). In spite of substantial estimation uncertainty, our analyses identify an incontestable increase in the claims level along with some regional variability. Of the uncertainties inherently involved in such predictions, only the error due to model fit is quantifiable.

Journal ArticleDOI
TL;DR: In this paper, two different approaches to how one can include diagonal effects in non-life claims reserving based on run-off triangles are presented, which can be interpreted as extensions of the multiplicative Poisson models introduced by Hachemeister & Stanard (1975) and Mack (1991).
Abstract: In this paper we present two different approaches to how one can include diagonal effects in non-life claims reserving based on run-off triangles. Empirical analyses suggest that the approaches in Zehnwirth (2003) and Kuang et al. (2008a, 2008b) do not work well with low-dimensional run-off triangles because estimation uncertainty is too large. To overcome this problem we consider similar models with a smaller number of parameters. These are closely related to the framework considered in Verbeek (1972) and Taylor (1977, 2000); the separation method. We explain that these models can be interpreted as extensions of the multiplicative Poisson models introduced by Hachemeister & Stanard (1975) and Mack (1991).
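A minimal sketch in the spirit of the separation method mentioned in the abstract: incremental claims on an illustrative run-off triangle are fitted with a multiplicative Poisson GLM containing a development-year factor and a calendar-year (diagonal) factor. This is not the authors' exact parameterisation; the triangle values and the factor coding are assumptions made for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Illustrative 4x4 incremental run-off triangle (NaN = future cells).
tri = np.array([[100., 60., 25., 10.],
                [110., 65., 25., np.nan],
                [120., 70., np.nan, np.nan],
                [130., np.nan, np.nan, np.nan]])

rows = []
for i in range(4):
    for j in range(4):
        if not np.isnan(tri[i, j]):
            rows.append({"claims": tri[i, j], "dev": j, "cal": i + j})
df = pd.DataFrame(rows)

# Multiplicative Poisson model with development-year and diagonal effects.
fit = smf.glm("claims ~ C(dev) + C(cal)", data=df,
              family=sm.families.Poisson()).fit()
print(fit.params)   # log development and calendar-year (diagonal) effects
```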

Journal ArticleDOI
TL;DR: A semi-Markov model is constructed and parameterised for the life history of a woman with BC in which events such as diagnosis, treatment, recovery and recurrence are incorporated and the impact of adverse selection under various moratoria on the use of genetic information is shown.
Abstract: Gui et al. (2006), in Part III of a series of papers, proposed a dynamic family history model of breast cancer (BC) and ovarian cancer in which the development of a family history was represented explicitly as a transition between states, and then applied this model to life insurance and critical illness insurance. In this study, we extend the model to income protection insurance. In this paper, Part IV of the series, we construct and parameterise a semi-Markov model for the life history of a woman with BC, in which events such as diagnosis, treatment, recovery and recurrence are incorporated. In Part V, we then show: (a) estimates of premium ratings depending on genotype or family history; and (b) the impact of adverse selection under various moratoria on the use of genetic information.


Journal ArticleDOI
TL;DR: The aim is to replace the ad-hoc Bayesian derivation of the Whittaker method of graduation with a formal Bayesian specification and to show that, with this specification, it is possible to complete the graduation without making a smoothing parameter selection.
Abstract: The Whittaker method of graduation has been known and used for a long time and has remained popular due to its possession of a number of ideal properties. They include being nonparametric and having an easy to understand foundation. The latter means that it makes sense and thus the user of the method has a good idea of what it can and cannot do. As well, there is a statistical derivation available that uses Bayesian notions. A problem with the derivation is that it is more intuitive than precise and as such does not provide a useful frame of reference for the graduator. Regardless of the point of view, the graduation cannot be completed until the smoothing parameter is selected and this has always relied on the judgment of the analyst. In this paper, three tasks will be undertaken. The first is to replace the ad-hoc Bayesian derivation of the method with a formal Bayesian specification. The second is to show that with this specification it is possible to complete the graduation without making an ...
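For reference, classical Whittaker graduation minimises a fidelity-plus-smoothness criterion and has the closed-form solution v = (W + h K'K)^(-1) W u, with K the z-th order difference matrix. The sketch below applies it with a hand-picked smoothing parameter h; the paper's contribution, a formal Bayesian specification that removes this manual choice, is not reproduced here.

```python
import numpy as np

def whittaker(u, w, h, z=2):
    """Classical Whittaker graduation: minimise
         sum_x w_x (v_x - u_x)^2 + h * sum_x (Delta^z v_x)^2,
    giving v = (W + h K'K)^{-1} W u with K the z-th order difference matrix."""
    n = len(u)
    W = np.diag(w)
    K = np.diff(np.eye(n), n=z, axis=0)     # (n - z) x n difference matrix
    return np.linalg.solve(W + h * K.T @ K, W @ u)

# illustrative crude mortality rates, equal weights, hand-picked h
u = np.array([0.010, 0.013, 0.011, 0.016, 0.018, 0.017, 0.022, 0.025])
v = whittaker(u, w=np.ones_like(u), h=5.0, z=2)
print(np.round(v, 4))
```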

Journal Article
TL;DR: In this article, a model for life insurance is presented which can be used when interest rates and future lifetimes are random, and expressions for the mean values and the standard deviations of a future life insurance payment are obtained.
Abstract: A model for life insurance is presented which can be used when interest rates and future lifetimes are random. Expressions for the mean values and the standard deviations of a future life insurance payment are obtained. These can be used in determining contingency reserves and premium margins for possible adverse interest and mortality experience for a portfolio of life insurance policies. Four complete examples are considered, including four tables of values.
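A minimal simulation sketch of the quantities the abstract describes: the mean and standard deviation of the present value of a unit death benefit when both the future lifetime and the yearly interest rates are random. The lifetime distribution and the interest rate model below are crude illustrative assumptions, not the paper's model.

```python
import numpy as np

# Mean and standard deviation of the present value of a unit benefit paid at the
# end of the year of death, with random lifetime and random yearly rates.
rng = np.random.default_rng(4)
n_paths = 20_000

lifetimes = rng.integers(1, 41, size=n_paths)   # year of death: 1..40 (illustrative)
pv = np.empty(n_paths)
for i, k in enumerate(lifetimes):
    rates = rng.normal(0.04, 0.01, size=k)      # i.i.d. random yearly interest rates
    pv[i] = np.prod(1.0 / (1.0 + rates))        # discount factor over k years

print("mean PV:", round(pv.mean(), 4), "standard deviation:", round(pv.std(), 4))
```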

Journal ArticleDOI
TL;DR: This part extends a comprehensive model of a life history of a woman at risk of breast cancer into an IPI market model by incorporating rates of insurance-buying behaviour, in order to estimate the possible costs of adverse selection in terms of increased premiums under various moratoria on the use of genetic information.
Abstract: In Part IV we presented a comprehensive model of a life history of a woman at risk of breast cancer (BC), in which relevant events such as diagnosis, treatment, recovery and recurrence were represented explicitly, and corresponding transition intensities were estimated. In this part, we study some applications to income protection insurance (IPI) business. We calculate premiums based either on genetic test results or more practically on a family history of BC. We then extend the model into an IPI market model by incorporating rates of insurance-buying behaviour, in order to estimate the possible costs of adverse selection, in terms of increased premiums, under various moratoria on the use of genetic information.

Journal ArticleDOI
TL;DR: In this paper, the authors extended the analysis of ruin-related quantities to the delayed renewal risk models and derived a general equation that is used as a framework to obtain the deficit at ruin among many random variables associated with ruin.
Abstract: The main focus of this paper is to extend the analysis of ruin-related quantities to the delayed renewal risk models. First, the background for the delayed renewal risk model is introduced and a general equation that is used as a framework is derived. The equation is obtained by conditioning on the first drop below the initial surplus level. Then, we consider the deficit at ruin among many random variables associated with ruin. The properties of the distribution function (DF) of the proper deficit are examined in particular.

Journal ArticleDOI
TL;DR: In this paper, the authors considered a multi-threshold compound Poisson surplus process, where the initial surplus is between any two consecutive thresholds, and the insurer has the option to choose the respective premium rate and interest rate.
Abstract: We consider a multi-threshold compound Poisson surplus process. When the initial surplus is between any two consecutive thresholds, the insurer has the option to choose the respective premium rate and interest rate. Also, the model allows for borrowing the current amount of deficit whenever the surplus falls below zero. Starting from the integro-differential equations satisfied by the Gerber–Shiu function that appear in Yang et al. (2008), we consider exponentially and phase-type(2) distributed claim sizes, in which cases we are able to transform the integro-differential equations into ordinary differential equations. As a result, we obtain explicit expressions for the Gerber–Shiu function.

Journal Article
TL;DR: In this article, a continuous-time model of a reinsurance market is presented, which contains the principal components of uncertainty transparent in such a market: Uncertainty about the time instants at which accidents take place, and uncertainty about claim sizes given that accidents have occurred.
Abstract: In this paper a continuous-time model of a reinsurance market is presented, which contains the principal components of uncertainty transparent in such a market: Uncertainty about the time instants at which accidents take place, and uncertainty about claim sizes given that accidents have occurred. Due to random jumps at random time points of the underlying claims processes, the absence of arbitrage opportunities is not sufficient to give unique premium functionals in general. Market preferences are derived under a necessary condition for a general exchange equilibrium. Information constraints are found under which premiums of risks are determined. It is demonstrated how general reinsurance treaties can be uniquely split into proportional contracts and nonproportional ones. Several applications to reinsurance markets are given, and the results are compared to the corresponding theory of the classical one-period model of a reinsurance syndicate. This paper attempts to reach a synthesis between the c...

Journal Article
TL;DR: In this article, the authors defend the conventional actuarial approach against criticisms raised by Hoem (1984), and show that the conventional method of calculating the exposed-to-risk is logically consistent with the aggregate expected number of deaths.
Abstract: This paper defends the conventional actuarial approach against criticisms raised by Hoem (1984). It shows that the conventional method of calculating the exposed-to-risk is logically consistent with the aggregate expected number of deaths. In addition, maximum likelihood estimates are shown to be closely related to the estimates of decremental probabilities derived using the conventional actuarial approach.
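For context, the conventional actuarial calculation the paper defends relates the observed deaths D and the central exposed-to-risk Ec through the approximation E = Ec + D/2 for the initial exposed-to-risk, giving the estimate q_hat = D/E, while D/Ec estimates the force of mortality. A tiny worked example with illustrative numbers:

```python
import math

# Conventional exposed-to-risk relationships (illustrative numbers):
# D deaths observed, Ec central exposed-to-risk in person-years.
D, Ec = 30, 1450.0
mu_hat = D / Ec                 # estimate of the force of mortality
q_hat = D / (Ec + D / 2)        # conventional actuarial estimate of q
print(f"mu_hat = {mu_hat:.5f}")
print(f"q_hat  = {q_hat:.5f}")
print(f"1 - exp(-mu_hat) = {1 - math.exp(-mu_hat):.5f}")  # close to q_hat
```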

Journal ArticleDOI
TL;DR: In this article, a detailed proof of what corresponds to part (C) of the theorem above is given, and part (A) will follow in a follow-up paper.
Abstract: It has been pointed out to the author that it would be appropriate to give a more detailed treatment of the results (referred to in section 4 of the preceding paper) sketched rather summarily on page 223 of [4], in particular with regard to the general case discussed above in section 8. I take this opportunity, graciously afforded me by the editors of this journal, to give here a detailed proof of what corresponds to part (C) of the theorem above; part (A) will follow.