
Showing papers in "Insurance Mathematics & Economics in 2012"


Journal ArticleDOI
TL;DR: In this article, the authors consider two well-known datasets from actuarial science, fit a number of parametric distributions to these data, and find that the skew-normal and skew-student are reasonably competitive compared to other models in the literature when describing insurance data.
Abstract: This paper analyzes whether the skew-normal and skew-student distributions recently discussed in the finance literature are reasonable models for describing claims in property-liability insurance. We consider two well-known datasets from actuarial science and fit a number of parametric distributions to these data. Also the non-parametric transformation kernel approach is considered as a benchmark model. We find that the skew-normal and skew-student are reasonably competitive compared to other models in the literature when describing insurance data. In addition to goodness-of-fit tests, tail risk measures such as value at risk and tail value at risk are estimated for the datasets under consideration.
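The tail risk measures named above can be computed empirically in a few lines. The following is a minimal sketch on a simulated lognormal stand-in sample, not the paper's datasets or its fitted skew-normal/skew-student models:

```python
import numpy as np

# Hypothetical claim data: a lognormal sample stands in for insurance losses.
rng = np.random.default_rng(42)
claims = rng.lognormal(mean=8.0, sigma=1.2, size=100_000)

def var_tvar(x, level=0.99):
    """Empirical value at risk (quantile) and tail value at risk
    (mean loss beyond the VaR threshold)."""
    var = np.quantile(x, level)
    tvar = x[x > var].mean()
    return var, tvar

var99, tvar99 = var_tvar(claims, 0.99)
```

By construction the TVaR exceeds the VaR at the same level, which is why it is the more conservative of the two tail measures.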

127 citations


Journal ArticleDOI
TL;DR: In this paper, the optimal time-consistent investment and reinsurance strategies for an insurer under Heston's stochastic volatility (SV) model have been discussed, where the surplus process of the insurer is approximated by a Brownian motion with drift.
Abstract: This paper considers the optimal time-consistent investment and reinsurance strategies for an insurer under Heston’s stochastic volatility (SV) model. Such an SV model applied to insurers’ portfolio problems has not yet been discussed as far as we know. The surplus process of the insurer is approximated by a Brownian motion with drift. The financial market consists of one risk-free asset and one risky asset whose price process satisfies Heston’s SV model. Firstly, a general problem is formulated and a verification theorem is provided. Secondly, the closed-form expressions of the optimal strategies and the optimal value functions for the mean–variance problem without precommitment are derived under two cases: one is the investment–reinsurance case and the other is the investment-only case. Thirdly, economic implications and numerical sensitivity analysis are presented for our results. Finally, some interesting phenomena are found and discussed.

122 citations


Journal ArticleDOI
TL;DR: In this paper, the stochastic dynamic programming approach is used to investigate the optimal asset allocation for a defined-contribution pension plan with downside protection under stochastic inflation, and the closed-form solution is derived under the CRRA utility function.
Abstract: In this paper, the stochastic dynamic programming approach is used to investigate the optimal asset allocation for a defined-contribution pension plan with downside protection under stochastic inflation. The plan participant invests the fund wealth and the stochastic interim contribution flows into the financial market. The nominal interest rate model is described by the Cox–Ingersoll–Ross ( Cox et al., 1985 ) dynamics. To cope with the inflation risk, the inflation indexed bond is included in the asset menu. The retired individuals receive an annuity that is indexed by inflation and a downside protection on the amount of this annuity is considered. The closed-form solution is derived under the CRRA utility function. Finally, a numerical application is presented to characterize the dynamic behavior of the optimal investment strategy.

106 citations


Journal ArticleDOI
TL;DR: In this article, the optimal excess-of-loss reinsurance and investment strategies under a constant elasticity of variance (CEV) model for an insurer are considered, and explicit expressions for optimal strategies and optimal value functions of the two problems are derived by stochastic control approach and variable change technique.
Abstract: The optimal excess-of-loss reinsurance and investment strategies under a constant elasticity of variance (CEV) model for an insurer are considered in this paper. The insurer’s surplus process is approximated by a Brownian motion with drift; the insurer can purchase excess-of-loss reinsurance and invest his or her surplus in a financial market consisting of one risk-free asset and one risky asset whose price follows a CEV model, and the objective is to maximize the expected exponential utility of terminal wealth. Two problems are studied, one being a reinsurance–investment problem and the other being an investment-only problem. Explicit expressions for the optimal strategies and optimal value functions of the two problems are derived via the stochastic control approach and a change-of-variable technique. Moreover, several interesting results are found, and some sensitivity analysis and numerical simulations are provided to illustrate our results.

103 citations


Journal ArticleDOI
TL;DR: In this paper, a hidden Markov chain (MC) is used to measure the dependence between international financial markets, allowing the unobserved time-varying dependence parameter to vary according to both a restricted ARMA process and an unobserved two-state MC.
Abstract: Measuring dynamic dependence between international financial markets has recently attracted great interest in financial econometrics because the observed correlations rose dramatically during the 2008–09 global financial crisis. Here, we propose a novel approach for measuring dependence dynamics. We include a hidden Markov chain (MC) in the equation describing dependence dynamics, allowing the unobserved time-varying dependence parameter to vary according to both a restricted ARMA process and an unobserved two-state MC. Estimation is carried out via the inference for the margins in conjunction with filtering/smoothing algorithms. We use block bootstrapping to estimate the covariance matrix of our estimators. Monte Carlo simulations compare the performance of regime switching and no switching models, supporting the regime-switching specification. Finally the proposed approach is applied to empirical data, through the study of the S&P500 (USA), FTSE100 (UK) and BOVESPA (Brazil) stock market indexes.

91 citations


Journal ArticleDOI
TL;DR: In this paper, a parallel dual approach to the direct parametric modelling and projecting of mortality rates is proposed, where the feasibility of projecting mortality improvement rates (as opposed to projecting mortality rates) using parametric predictor structures that are amenable to simple time series forecasting is investigated.
Abstract: We investigate the modelling of mortality improvement rates and the feasibility of projecting mortality improvement rates (as opposed to projecting mortality rates), using parametric predictor structures that are amenable to simple time series forecasting. This leads to our proposing a parallel dual approach to the direct parametric modelling and projecting of mortality rates. Comparisons of simulated life expectancy predictions (by the cohort method) using the England and Wales population mortality experiences for males and females under a variety of controlled data trimming exercises are presented in detail and comparisons are also made between the parallel modelling approaches.

83 citations


Journal ArticleDOI
TL;DR: In this paper, the class of Log phase-type (LogPH) distributions is proposed as a parametric alternative to the generalized Pareto distribution (GPD) for fitting heavy tailed data.
Abstract: Many insurance loss data are known to be heavy-tailed. In this article we study the class of Log phase-type (LogPH) distributions as a parametric alternative in fitting heavy tailed data. Transformed from the popular phase-type distribution class, the LogPH introduced by Ramaswami exhibits several advantages over other parametric alternatives. We analytically derive its tail related quantities including the conditional tail moments and the mean excess function, and also discuss its tail thickness in the context of extreme value theory. Because of its denseness proved herein, we argue that the LogPH can offer a rich class of heavy-tailed loss distributions without separate modeling for the tail side, which is the case for the generalized Pareto distribution (GPD). As a numerical example we use the well-known Danish fire data to calibrate the LogPH model and compare the result with that of the GPD. We also present fitting results for a set of insurance guarantee loss data.

76 citations


Journal ArticleDOI
Abstract: Consider a renewal risk model in which claim sizes and inter-arrival times correspondingly form a sequence of independent and identically distributed random pairs, with each pair obeying a dependence structure described via the conditional distribution of the inter-arrival time given the subsequent claim size being large. We study large deviations of the aggregate amount of claims. For a heavy-tailed case, we obtain a precise large-deviation formula, which agrees with existing ones in the literature.

67 citations


Journal ArticleDOI
TL;DR: In this article, the authors derive the optimal consumption rate and focus on the impact of mortality rate uncertainty versus simple lifetime uncertainty in the retirement phase where this risk plays a greater role.
Abstract: We extend the lifecycle model (LCM) of consumption over a random horizon (also known as the Yaari model) to a world in which (i) the force of mortality obeys a diffusion process as opposed to being deterministic, and (ii) consumers can adapt their consumption strategy to new information about their mortality rate (also known as health status) as it becomes available. In particular, we derive the optimal consumption rate and focus on the impact of mortality rate uncertainty versus simple lifetime uncertainty — assuming that the actuarial survival curves are initially identical — in the retirement phase where this risk plays a greater role. In addition to deriving and numerically solving the partial differential equation (PDE) for the optimal consumption rate, our main general result is that when the utility preferences are logarithmic the initial consumption rates are identical. But, in a constant relative risk aversion (CRRA) framework in which the coefficient of relative risk aversion is greater (smaller) than one, the consumption rate is higher (lower) and a stochastic force of mortality does make a difference. That said, numerical experiments indicate that, even for non-logarithmic preferences, the stochastic mortality effect is relatively minor from the individual’s perspective. Our results should be relevant to researchers interested in calibrating the lifecycle model as well as those who provide normative guidance (also known as financial advice) to retirees.

65 citations


Journal ArticleDOI
TL;DR: A hierarchical risk aggregation method that is flexible in high dimensions is studied, together with an algorithm for numerical approximation that introduces dependence between originally independent marginal samples through reordering.
Abstract: For high-dimensional risk aggregation purposes, most popular copula classes are too restrictive in terms of attainable dependence structures. These limitations aggravate with increasing dimension. We study a hierarchical risk aggregation method which is flexible in high dimensions. With this method it suffices to specify a low dimensional copula for each aggregation step in the hierarchy. Copulas and margins of arbitrary kind can be combined. We give an algorithm for numerical approximation which introduces dependence between originally independent marginal samples through reordering.
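The reordering idea in the final sentence can be sketched as follows; the Gaussian copula, the gamma/lognormal margins, and the sample size are illustrative assumptions, not the paper's specification:

```python
import numpy as np

# Start from independently simulated marginal samples and rearrange them so
# that their ranks match the ranks of a copula sample. The margins are
# preserved exactly; only the joint ordering changes.
rng = np.random.default_rng(0)
n = 50_000

x = rng.gamma(shape=2.0, scale=1.0, size=n)   # margin 1, drawn independently
y = rng.lognormal(0.0, 0.5, size=n)           # margin 2, drawn independently

# Target dependence: a Gaussian copula with correlation 0.7.
rho = 0.7
z = rng.multivariate_normal([0, 0], [[1, rho], [rho, 1]], size=n)

# Reorder: sort each margin, then place values according to the copula ranks.
x_dep = np.sort(x)[np.argsort(np.argsort(z[:, 0]))]
y_dep = np.sort(y)[np.argsort(np.argsort(z[:, 1]))]

s = x_dep + y_dep  # aggregate with the intended dependence
```

Because the reordering is a permutation of each sample, the marginal distributions are untouched while the rank correlation of the pair now follows the copula sample.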

62 citations


Journal ArticleDOI
TL;DR: In this paper, the authors modify an existing four-factor model of mortality to enable better fitting to a wider age range; using data from seven developed countries, their empirical results show that the proposed model has a better fit to the actual data, is robust, and has good forecasting ability.
Abstract: Stochastic modeling of mortality rates focuses on fitting linear models to logarithmically adjusted mortality data from the middle or late ages. Whilst this modeling enables insurers to project mortality rates and hence price mortality products, it does not provide a good fit for younger-age mortality. Mortality rates below the early 20s are important to model as they give an insight into estimates of the cohort effect for more recent years of birth. Given the cumulative nature of life expectancy, it is also important to be able to forecast mortality improvements at all ages. When we attempt to fit existing models to a wider age range, 5–89, rather than 20–89 or 50–89, their weaknesses are revealed as the results are not satisfactory. The linear innovations in existing models are not flexible enough to capture the non-linear profile of mortality rates that we see at the lower ages. In this paper, we modify an existing four-factor model of mortality to enable better fitting to a wider age range, and using data from seven developed countries our empirical results show that the proposed model has a better fit to the actual data, is robust, and has good forecasting ability.

Journal ArticleDOI
TL;DR: In this article, the authors show that, among a general class of individualized reinsurance contracts, the excess-of-loss treaty is the optimal reinsurance form for minimizing certain risk measures of the retained loss of an insurer with dependent risks.
Abstract: In the individual risk model, one is often concerned about positively dependent risks. Several notions of positive dependence have been proposed to describe such dependent risks. In this paper, we assume that the risks in the individual risk model are positively dependent through the stochastic ordering (PDS). The PDS risks include independent, comonotonic, conditionally stochastically increasing (CI) risks, and other interesting dependent risks. By proving the convolution preservation of the convex order for PDS random vectors, we show that in individualized reinsurance treaties, to minimize certain risk measures of the retained loss of an insurer, the excess-of-loss treaty is the optimal reinsurance form for an insurer with PDS dependent risks among a general class of individualized reinsurance contracts. This extends the study in Denuit and Vermandele (1998) on individualized reinsurance treaties to dependent risks. We also derive the explicit expressions for the retentions in the optimal excess-of-loss treaty in a two-line insurance business model.

Journal ArticleDOI
TL;DR: In this paper, the authors investigated the expected discounted value of a payment at the time of death under the assumption of a constant force of mortality, where the payment depends on the price of a stock at that time and possibly also on the history of the stock price.
Abstract: Motivated by the Guaranteed Minimum Death Benefits in various deferred annuities, we investigate the calculation of the expected discounted value of a payment at the time of death. The payment depends on the price of a stock at that time and possibly also on the history of the stock price. If the payment turns out to be the payoff of an option, we call the contract for the payment a (life) contingent option. Because each time-until-death distribution can be approximated by a combination of exponential distributions, the analysis is made for the case where the time until death is exponentially distributed, i.e., under the assumption of a constant force of mortality. The time-until-death random variable is assumed to be independent of the stock price process which is a geometric Brownian motion. Our key tool is a discounted joint density function. A substantial series of closed-form formulas is obtained, for the contingent call and put options, for lookback options, for barrier options, for dynamic fund protection, and for dynamic withdrawal benefits. In a section on several stocks, the method of Esscher transforms proves to be useful for finding among others an explicit result for valuing contingent Margrabe options or exchange options. For the case where the contracts have a finite expiry date, closed-form formulas are found for the contingent call and put options. From these, results for De Moivre’s law are obtained as limits. We also discuss equity-linked death benefit reserves and investment strategies for maintaining such reserves. The elasticity of the reserve with respect to the stock price plays an important role. Whereas in the most important applications the stopping time is the time of death, it could be different in other applications, for example, the time of the next catastrophe.
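As a rough numerical companion to the abstract, the expected discounted value of a contingent call payoff can be estimated by Monte Carlo. This sketch uses hypothetical parameters and does not reproduce the paper's closed-form formulas or its discounted joint density approach:

```python
import numpy as np

# Time of death T ~ Exp(lam), independent of a geometric Brownian motion
# stock price; we estimate E[e^{-delta*T} (S_T - K)_+]. All parameter values
# are illustrative, chosen so that the discounted payoff has finite variance.
rng = np.random.default_rng(11)
n = 500_000
lam, delta = 0.04, 0.05            # force of mortality, discount rate
s0, mu, sigma, k = 100.0, 0.02, 0.2, 100.0

t = rng.exponential(1 / lam, size=n)                      # times of death
z = rng.standard_normal(n)
s_t = s0 * np.exp((mu - 0.5 * sigma ** 2) * t + sigma * np.sqrt(t) * z)
value = np.mean(np.exp(-delta * t) * np.maximum(s_t - k, 0.0))
```

The estimate is bounded above by E[e^{-delta*T} S_T] = s0·lam/(lam + delta − mu), which for these parameters equals 100 · 0.04/0.07 ≈ 57.1, since the call payoff never exceeds the stock price.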

Journal ArticleDOI
TL;DR: In this article, a finite mixture of Skew Normal distributions is proposed for characterizing insurance loss data; the authors adopt a Bayesian approach to estimate the model, providing the likelihood and the priors for all unknown parameters, and implement an adaptive Markov Chain Monte Carlo algorithm to approximate the posterior distribution.
Abstract: The derivation of loss distribution from insurance data is a very interesting research topic but at the same time not an easy task. To find an analytic solution to the loss distribution may be misleading although this approach is frequently adopted in the actuarial literature. Moreover, it is well recognized that the loss distribution is strongly skewed with heavy tails and presents small, medium and large size claims which hardly can be fitted by a single analytic and parametric distribution. Here we propose a finite mixture of Skew Normal distributions that provides a better characterization of insurance data. We adopt a Bayesian approach to estimate the model, providing the likelihood and the priors for all unknown parameters; we implement an adaptive Markov Chain Monte Carlo algorithm to approximate the posterior distribution. We apply our approach to the well-known Danish fire loss data, and relevant risk measures, such as Value-at-Risk and Expected Shortfall, are evaluated as well.

Journal ArticleDOI
TL;DR: This paper illustrates the modeling of dependence structures of non-life insurance risks using the Bernstein copula, including its flexibility in mapping inhomogeneous dependence structures and its easy use in a simulation context due to its representation as mixture of independent Beta densities.
Abstract: This paper illustrates the modeling of dependence structures of non-life insurance risks using the Bernstein copula. We conduct a goodness-of-fit analysis and compare the Bernstein copula with other widely used copulas. Then, we illustrate the use of the Bernstein copula in a value-at-risk and tail-value-at-risk simulation study. For both analyses we utilize German claims data on storm, flood, and water damage insurance for calibration. Our results highlight the advantages of the Bernstein copula, including its flexibility in mapping inhomogeneous dependence structures and its easy use in a simulation context due to its representation as mixture of independent Beta densities. Practitioners and regulators working toward appropriate modeling of dependences in a risk management and solvency context can benefit from our results.
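The mixture-of-Beta-densities representation mentioned in the results can be sketched directly. The bivariate sample, Bernstein order, and Gaussian-copula data below are hypothetical choices, not the paper's German claims data:

```python
import numpy as np
from math import factorial

# Bernstein copula density as a weighted mixture of products of Beta
# densities. The mixture weights are two-dimensional first differences of an
# empirical copula computed from pseudo-observations (rank data).
rng = np.random.default_rng(5)
n, m = 2000, 10                                # sample size, Bernstein order

z = rng.multivariate_normal([0, 0], [[1, 0.6], [0.6, 1]], size=n)
u = (np.argsort(np.argsort(z[:, 0])) + 1) / (n + 1)   # pseudo-observations
v = (np.argsort(np.argsort(z[:, 1])) + 1) / (n + 1)

def emp_cop(s, t):
    """Empirical copula at (s, t)."""
    return np.mean((u <= s) & (v <= t))

# Nonnegative weights summing to one: probabilities of the m x m grid boxes.
w = np.array([[emp_cop((i + 1) / m, (j + 1) / m) - emp_cop((i + 1) / m, j / m)
               - emp_cop(i / m, (j + 1) / m) + emp_cop(i / m, j / m)
               for j in range(m)] for i in range(m)])

def beta_pdf(x, a, b):
    """Beta(a, b) density for integer parameters."""
    norm = factorial(a + b - 1) / (factorial(a - 1) * factorial(b - 1))
    return norm * x ** (a - 1) * (1 - x) ** (b - 1)

def bernstein_density(s, t):
    """Bernstein copula density: a w-weighted mixture of Beta products."""
    return sum(w[i, j] * beta_pdf(s, i + 1, m - i) * beta_pdf(t, j + 1, m - j)
               for i in range(m) for j in range(m))
```

Because the weights are box probabilities of the empirical copula, they are nonnegative and sum to one, so the density is a proper mixture — the representation that makes simulation from the Bernstein copula straightforward.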

Journal ArticleDOI
TL;DR: In this paper, the authors introduce a premium principle which relies on Cumulative Prospect Theory by Kahneman and Tversky and examine its properties; some special cases of this premium principle have already been studied in the actuarial literature.
Abstract: The aim of this paper is to introduce a premium principle which relies on Cumulative Prospect Theory by Kahneman and Tversky. Some special cases of this premium principle have already been studied in the actuarial literature. In the paper, properties of this premium principle are examined.

Journal ArticleDOI
TL;DR: In this paper, the authors propose an AR(1)–ARCH(1) model as an alternative to the Lee–Carter model, the first discrete-time stochastic model to consider the increased life expectancy trends in mortality rates, which is still broadly used today.
Abstract: With the decline in the mortality level of populations, national social security systems and insurance companies of most developed countries are reconsidering their mortality tables taking into account the longevity risk. The Lee and Carter model is the first discrete-time stochastic model to consider the increased life expectancy trends in mortality rates and is still broadly used today. In this paper, we propose an alternative to the Lee–Carter model: an AR(1)–ARCH(1) model. More specifically, we compare the performance of these two models with respect to forecasting age-specific mortality in Italy. We fit the two models, with Gaussian and t-student innovations, for the matrix of Italian death rates from 1960 to 2003. We compare the forecast ability of the two approaches in out-of-sample analysis for the period 2004–2006 and find that the AR(1)–ARCH(1) model with t-student innovations provides the best fit among the models studied in this paper.
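A minimal simulation of the proposed process class may help fix ideas: an AR(1) mean equation for a (log) age-specific death rate whose innovation variance follows an ARCH(1) recursion. Parameter values are illustrative, not the paper's Italian estimates, and the t-student variant is omitted:

```python
import numpy as np

# AR(1)-ARCH(1): x_t = phi0 + phi1 * x_{t-1} + e_t, with the innovation
# variance var(e_t) = a0 + a1 * e_{t-1}^2 (Gaussian innovations here).
rng = np.random.default_rng(1)

phi0, phi1 = -0.02, 0.98        # AR(1) coefficients (illustrative)
a0, a1 = 0.001, 0.3             # ARCH(1) coefficients (illustrative)

T = 500
x = np.empty(T)
x[0] = -4.0                     # e.g. a log mortality rate at the start
e_prev = 0.0
for t in range(1, T):
    sigma2 = a0 + a1 * e_prev ** 2
    e = np.sqrt(sigma2) * rng.standard_normal()
    x[t] = phi0 + phi1 * x[t - 1] + e
    e_prev = e
```

The ARCH term lets the volatility of the mortality series cluster in time, which is the feature this specification adds over a plain AR(1).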

Journal ArticleDOI
TL;DR: In this paper, the authors studied the well-known Haezendonck-Goovaerts risk measures on their natural domain, that is on Orlicz spaces and, in particular, on ORlicz hearts.
Abstract: In this paper, we study the well-known Haezendonck–Goovaerts risk measures on their natural domain, that is, on Orlicz spaces and, in particular, on Orlicz hearts. We provide a dual representation as well as the optimal scenario in such a representation and investigate the properties of the minimizer x∗ (which we call the Orlicz quantile) in the definition of the Haezendonck–Goovaerts risk measure. Since Orlicz quantiles fail to satisfy an internality property, bilateral Orlicz quantiles are also introduced and analyzed.

Journal ArticleDOI
TL;DR: In this article, the problem of optimal investment, consumption and life insurance acquisition for a wage earner with CRRA (constant relative risk aversion) preferences is solved by a dynamic programming approach, and the HJB equation is shown to have a closed-form solution.
Abstract: This paper considers the problem of optimal investment, consumption and life insurance acquisition for a wage earner who has CRRA (constant relative risk aversion) preferences. The market model is complete and continuous; the uncertainty is driven by Brownian motion, and the stock price has a mean-reverting drift. The problem is solved by a dynamic programming approach, and the HJB equation is shown to have a closed-form solution. Numerical experiments explore the impact the market price of risk has on the optimal strategies.

Journal ArticleDOI
TL;DR: In this paper, a portfolio of n dependent risks X1, …, Xn is considered and the stochastic behavior of the aggregate claim amount S = X1 + ⋯ + Xn is studied.
Abstract: In this paper, we consider a portfolio of n dependent risks X1, …, Xn and we study the stochastic behavior of the aggregate claim amount S = X1 + ⋯ + Xn. Our objective is to determine the amount of economic capital needed for the whole portfolio and to compute the amount of capital to be allocated to each risk X1, …, Xn. To do so, we use a top–down approach. For (X1, …, Xn), we consider risk models based on multivariate compound distributions defined with a multivariate counting distribution. We use the TVaR to evaluate the total capital requirement of the portfolio based on the distribution of S, and we use the TVaR-based capital allocation method to quantify the contribution of each risk. To simplify the presentation, the claim amounts are assumed to be continuously distributed. For multivariate compound distributions with continuous claim amounts, we provide general formulas for the cumulative distribution function of S, for the TVaR of S and the contribution to each risk. We obtain closed-form expressions for those quantities for multivariate compound distributions with gamma and mixed Erlang claim amounts. Finally, we treat in detail the multivariate compound Poisson distribution case. Numerical examples are provided in order to examine the impact of the dependence relation on the TVaR of S, the contribution to each risk of the portfolio, and the benefit of the aggregation of several risks.
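The TVaR-based allocation described above can be illustrated by Monte Carlo; a simple common-shock construction stands in for the paper's multivariate compound distributions, and all parameters are hypothetical:

```python
import numpy as np

# Two dependent risks built from a shared gamma shock, their aggregate S,
# the TVaR of S, and the TVaR-based contribution of each risk.
rng = np.random.default_rng(7)
n = 200_000

shock = rng.gamma(2.0, 1.0, size=n)          # common factor -> dependence
x1 = rng.gamma(3.0, 1.0, size=n) + shock
x2 = rng.gamma(1.5, 2.0, size=n) + shock
s = x1 + x2

level = 0.99
var_s = np.quantile(s, level)
tail = s > var_s
tvar_s = s[tail].mean()                      # TVaR of the aggregate

# TVaR-based allocation: contribution of X_i is E[X_i | S > VaR(S)].
alloc1 = x1[tail].mean()
alloc2 = x2[tail].mean()
```

The contributions E[X_i | S > VaR(S)] add up to the TVaR of the aggregate, which is what makes this a full allocation of the capital requirement.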

Journal ArticleDOI
TL;DR: In this paper, a non-self-financing portfolio optimization problem under the framework of multi-period mean-variance with Markov regime switching and a stochastic cash flow is investigated.
Abstract: This paper investigates a non-self-financing portfolio optimization problem under the framework of multi-period mean–variance with Markov regime switching and a stochastic cash flow. The stochastic cash flow can be explained as capital additions or withdrawals during the investment process. Specially, the cash flow is the surplus process or the risk process of an insurer at each period. The returns of assets and amount of the cash flow all depend on the states of a stochastic market which are assumed to follow a discrete-time Markov chain. We analyze the existence of optimal solutions, and derive the optimal strategy and the efficient frontier in closed-form. Several special cases are discussed and numerical examples are given to demonstrate the effect of cash flow.

Journal ArticleDOI
TL;DR: In this paper, the authors proposed a Delta-Gamma hedging technique for mortality risk, where the risk factor against which to hedge is the difference between the actual mortality intensity in the future and its forecast intensity, the forward intensity.
Abstract: One of the major concerns of life insurers and pension funds is the increasing longevity of their beneficiaries. This paper studies the hedging problem of annuity cash flows when mortality and interest rates are stochastic. We first propose a Delta–Gamma hedging technique for mortality risk. The risk factor against which to hedge is the difference between the actual mortality intensity in the future and its “forecast” today, the forward intensity. We specialize the hedging technique first to the case in which mortality intensities are affine, then to Ornstein–Uhlenbeck and Feller processes, providing actuarial justifications for this selection. We show that, without imposing no arbitrage, we can get equivalent probability measures under which the HJM condition for no arbitrage is satisfied. Last, we extend our results to the presence of both interest rate and mortality risk. We provide a UK calibrated example of Delta–Gamma hedging of both mortality and interest rate risk.

Journal ArticleDOI
TL;DR: The mean-semivariance-skewness-semikurtosis model for fuzzy portfolios is presented together with its four corresponding variants, and a genetic algorithm integrating fuzzy simulation is designed for the optimization models.
Abstract: The aim of this paper is to consider the moments and the semi-moments (i.e., semi-kurtosis) for portfolio selection with fuzzy risk factors (i.e., trapezoidal risk factors). In order to measure the leptokurtosis of fuzzy portfolio return, notions of moments (i.e., kurtosis) and semi-moments (i.e., semi-kurtosis) for fuzzy portfolios are originally introduced in this paper, and their mathematical properties are studied. As an extension of the mean-semivariance-skewness model for fuzzy portfolios, the mean-semivariance-skewness-semikurtosis model is presented and its four corresponding variants are also considered. We also design a genetic algorithm integrating fuzzy simulation for our optimization models.

Journal ArticleDOI
Qihe Tang, Fan Yang
TL;DR: In this paper, the authors derive exact asymptotics for the Haezendonck–Goovaerts risk measure, which is defined via a convex Young function and a parameter q ∈ (0, 1) representing the confidence level.
Abstract: In this paper, we are interested in the calculation of the Haezendonck–Goovaerts risk measure, which is defined via a convex Young function and a parameter q ∈ (0, 1) representing the confidence level. We mainly focus on the case in which the risk variable follows a distribution function from a max-domain of attraction. For this case, we restrict the Young function to be a power function and we derive exact asymptotics for the Haezendonck–Goovaerts risk measure as q ↑ 1. As a complementary case, we also consider an exponentially distributed risk variable with a general Young function, and we obtain an analytical expression for the Haezendonck–Goovaerts risk measure.
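For the simplest power Young function, φ(t) = t, the Haezendonck–Goovaerts risk measure reduces to the expected shortfall, which gives a convenient numerical sanity check. The sketch below evaluates H_q[X] = inf_x { x + E[(X − x)_+]/(1 − q) } on a grid for an Exp(1) risk; the exponential law and the grid bounds are illustrative choices:

```python
import numpy as np

# Monte Carlo evaluation of the Haezendonck-Goovaerts risk measure with
# Young function phi(t) = t, where H_q[X] = inf_x { x + E[(X-x)_+]/(1-q) }.
# For an Exp(1) risk this equals the expected shortfall ES_q = -ln(1-q) + 1.
rng = np.random.default_rng(3)
x_samples = rng.exponential(1.0, size=200_000)
q = 0.95

grid = np.linspace(0.0, 8.0, 401)
vals = [x0 + np.maximum(x_samples - x0, 0.0).mean() / (1 - q) for x0 in grid]
hg = min(vals)

es_exact = -np.log(1 - q) + 1.0   # closed form for the Exp(1) distribution
```

The grid minimum is attained near x = VaR_q(X) = −ln(1 − q), consistent with the expected-shortfall representation.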

Journal ArticleDOI
TL;DR: In this paper, a stochastic projection model is proposed in order to represent the future evolution of mortality and disability transition intensities, and a risk model based on portfolio risk reserve is proposed and different rules to calculate solvency capital requirements for life underwriting risk are examined.
Abstract: The aim of the paper is twofold. Firstly, it develops a model for risk assessment in a portfolio of life annuities with long term care benefits. These products are usually represented by a Markovian Multi-State model and are affected by both longevity and disability risks. Here, a stochastic projection model is proposed in order to represent the future evolution of mortality and disability transition intensities. Data from the Italian National Institute of Social Security (INPS) and from Human Mortality Database (HMD) are used to estimate the model parameters. Secondly, it investigates the solvency in a portfolio of enhanced pensions. To this aim a risk model based on the portfolio risk reserve is proposed and different rules to calculate solvency capital requirements for life underwriting risk are examined. Such rules are then compared with the standard formula proposed by the Solvency II project.

Journal ArticleDOI
TL;DR: In this article, the authors propose a specific analytic valuation framework for reverse mortgages with mortality risk, interest rate risk, and housing price risk that helps determine fair premiums when the present value of premiums equals the present value of contingent losses.
Abstract: For the valuation of reverse mortgages with tenure payments, this article proposes a specific analytic valuation framework with mortality risk, interest rate risk, and housing price risk that helps determine fair premiums when the present value of premiums equals the present value of contingent losses. The analytic valuation of reverse mortgages with tenure payments is more complex than the valuation with a lump sum payment. This study therefore proposes a dimension reduction technique to achieve a closed-form solution for reverse annuity mortgage insurance, conditional on the evolution of interest rates. The technique provides strong accuracy, offering important implications for lenders and insurers.

Journal ArticleDOI
TL;DR: In this paper, a tailor-made bootstrap was proposed to improve the methodology for forecasting mortality in order to enhance model performance and increase forecasting power by capturing the dependence structure of neighboring observations in the population.
Abstract: The risk profile of an insurance company involved in annuity business is heavily affected by the uncertainty in future mortality trends. It is problematic to capture accurately future survival patterns, in particular at retirement ages when the effects of the rectangularization phenomenon and random fluctuations are combined. Another important aspect affecting the projections is related to the so-called cohort-period effect. In particular, the mortality experience of countries in the industrialized world over the course of the twentieth century would suggest a substantial age–time interaction, with the two dominant trends affecting different age groups at different times. From a statistical point of view, this indicates a dependence structure. Also the dependence between ages is an important component in the modeling of mortality ( Barrieu et al., 2011 ). It is observed that the mortality improvements are similar for individuals of contiguous ages ( Wills and Sherris, 2008 ). Moreover, considering the data subdivided by single years of age, the correlations between the residuals for adjacent age groups tend to be high (as noted in Denton et al., 2005 ). This suggests that there is value in exploring the dependence structure, also across time, in other words the inter-period correlation. The aim of this paper is to improve the methodology for forecasting mortality in order to enhance model performance and increase forecasting power by capturing the dependence structure of neighboring observations in the population. To do this, we adapt the methodology for measuring uncertainty in projections in the Lee–Carter context and introduce a tailor-made bootstrap instead of an ordinary bootstrap. The approach is illustrated with an empirical example.
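The idea of resampling while preserving the dependence of neighboring observations can be sketched with a generic circular block bootstrap; this is a standard device, not the paper's tailor-made scheme, and the residual series below is a toy example:

```python
import numpy as np

# Resample a serially dependent series in contiguous blocks so that the
# local dependence structure of neighboring observations survives the
# resampling (unlike an ordinary i.i.d. bootstrap).
rng = np.random.default_rng(13)
resid = rng.standard_normal(100).cumsum() * 0.1   # toy dependent residuals
resid = resid - resid.mean()

def block_bootstrap(series, block_len, rng):
    """Circular block bootstrap: draw blocks of length block_len, wrapping
    around the end of the series, and concatenate them."""
    n = len(series)
    ext = np.concatenate([series, series[:block_len]])   # wrap around
    n_blocks = int(np.ceil(n / block_len))
    starts = rng.integers(0, n, size=n_blocks)
    out = np.concatenate([ext[s:s + block_len] for s in starts])
    return out[:n]

boot = block_bootstrap(resid, block_len=5, rng=rng)
```

Within each block the original ordering of the residuals is kept, so short-range (inter-period) correlation is carried into the bootstrap sample.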

Journal ArticleDOI
TL;DR: In this article, the authors proposed alternative analytical methods for the calculation of risk measures for variable annuity guaranteed benefits on a stand-alone basis based on the study of geometric Brownian motion and its integral.
Abstract: With the increasing complexity of investment options in life insurance, more and more life insurers have adopted stochastic modeling methods for the assessment and management of insurance and financial risks. The most prevalent approach in market practice, Monte Carlo simulation, has been observed to be time-consuming and sometimes extremely costly. In this paper we propose alternative analytical methods for the calculation of risk measures for variable annuity guaranteed benefits on a stand-alone basis. The techniques for analytical calculations are based on the study of geometric Brownian motion and its integral. Another novelty of the paper is a quantitative model that assesses both market risk on the liability side and revenue risk on the asset side in the same framework from the viewpoint of risk management. As we demonstrate by numerous examples on quantile risk measure and conditional tail expectation, the methods and numerical algorithms developed in this paper appear to be both accurate and computationally efficient.
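As a point of comparison for the analytical methods the paper proposes, the plain Monte Carlo approach it aims to replace can be sketched for the quantile risk measure (VaR) and conditional tail expectation of a maturity-guarantee liability under geometric Brownian motion. All parameter values (fund dynamics, guarantee level, discount rate) are illustrative assumptions, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative parameters (assumptions, not from the paper)
S0, mu, sigma, T = 100.0, 0.06, 0.2, 10.0   # fund dynamics
G = 100.0                                    # guaranteed maturity benefit
r = 0.03                                     # discount rate
n = 100_000                                  # simulated scenarios

# Terminal fund value under geometric Brownian motion
Z = rng.standard_normal(n)
ST = S0 * np.exp((mu - 0.5 * sigma**2) * T + sigma * np.sqrt(T) * Z)

# Discounted guarantee shortfall: the stand-alone liability of a
# maturity-benefit rider pays (G - S_T)+ at time T
L = np.exp(-r * T) * np.maximum(G - ST, 0.0)

# Quantile risk measure and conditional tail expectation at 95%
alpha = 0.95
var95 = np.quantile(L, alpha)
cte95 = L[L >= var95].mean()
```

The cost of this approach is the large number of scenarios needed for stable tail estimates, which is precisely what closed-form expressions based on the law of geometric Brownian motion and its integral avoid.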

Journal ArticleDOI
TL;DR: In this article, an explicit expression for the optimal investment strategy, under the constant elasticity of variance (CEV) model, which maximizes the expected HARA utility of the final value of the surplus at the maturity time is given.
Abstract: We give an explicit expression for the optimal investment strategy, under the constant elasticity of variance (CEV) model, which maximizes the expected HARA utility of the final value of the surplus at the maturity time. To do this, the corresponding HJB equation is transformed into a linear partial differential equation by applying a Legendre transform. We also prove that the optimal investment strategy corresponding to the HARA utility function converges a.s. to the one corresponding to the exponential utility function.
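The closed-form strategy itself comes from solving the transformed HJB equation, but the CEV price dynamics underlying the result are straightforward to simulate. The sketch below is a hypothetical Euler–Maruyama discretization of dS = μS dt + σS^β dW with illustrative parameters; β = 1 recovers geometric Brownian motion, while β < 1 makes the local volatility σS^(β−1) rise as the price falls, the leverage effect the CEV model is designed to capture:

```python
import numpy as np

rng = np.random.default_rng(1)

def simulate_cev(s0, mu, sigma, beta, T=1.0, steps=252, n_paths=20_000, rng=rng):
    """Euler-Maruyama terminal values of the CEV model
    dS = mu*S dt + sigma*S**beta dW; beta = 1 is geometric Brownian motion."""
    dt = T / steps
    s = np.full(n_paths, s0, dtype=float)
    for _ in range(steps):
        dw = rng.standard_normal(n_paths) * np.sqrt(dt)
        # Floor at a tiny positive value so S**beta stays well defined
        s = np.maximum(s + mu * s * dt + sigma * np.power(s, beta) * dw, 1e-12)
    return s

gbm = simulate_cev(100.0, 0.05, 0.2, beta=1.0)
# sigma rescaled so both models have 20% local volatility at S = 100
cev = simulate_cev(100.0, 0.05, 0.2 * 100.0**0.5, beta=0.5)
```

Such simulated terminal distributions are one way to sanity-check how the elasticity parameter changes the risk faced by the optimal strategy, though the paper's results are analytical rather than simulation-based.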

Journal ArticleDOI
TL;DR: In this article, the authors argue that risk sharing is always beneficial (with respect to convex order or second degree stochastic dominance) provided the risk-averse agents share the total losses appropriately (whatever the distribution of the losses, their correlation structure and individual degrees of risk aversion).
Abstract: Using a standard reduction argument based on conditional expectations, this paper argues that risk sharing is always beneficial (with respect to convex order or second degree stochastic dominance) provided the risk-averse agents share the total losses appropriately (whatever the distribution of the losses, their correlation structure and individual degrees of risk aversion). Specifically, all agents hand their individual losses over to a pool and each of them is liable for the conditional expectation of his own loss given the total loss of the pool. We call this risk sharing mechanism the conditional mean risk sharing. If all the conditional expectations involved are non-decreasing functions of the total loss then the conditional mean risk sharing is shown to be Pareto-optimal. Explicit expressions for the individual contributions to the pool are derived in some special cases of interest: independent and identically distributed losses, comonotonic losses, and mutually exclusive losses. In particular, conditions under which this payment rule leads to a comonotonic risk sharing are examined.
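The conditional mean risk sharing rule can be illustrated numerically in the i.i.d. case mentioned in the abstract, where by symmetry each agent's contribution E[X_i | S] reduces to S/n. The following sketch (illustrative parameters, exponential losses) checks this empirically by binning simulated pool totals:

```python
import numpy as np

rng = np.random.default_rng(7)

# n agents hand i.i.d. exponential losses over to a pool; under conditional
# mean risk sharing each agent is liable for E[X_i | S], which by symmetry
# equals S / n_agents here.
n_agents, n_sims = 4, 200_000
X = rng.exponential(scale=1.0, size=(n_sims, n_agents))
S = X.sum(axis=1)

# Empirical conditional mean of agent 1's loss within narrow bins of S
bins = np.quantile(S, np.linspace(0, 1, 21))
idx = np.digitize(S, bins[1:-1])
cond_mean = np.array([X[idx == k, 0].mean() for k in range(20)])
bin_total = np.array([S[idx == k].mean() for k in range(20)])

# cond_mean should track bin_total / n_agents across all bins
err = np.max(np.abs(cond_mean - bin_total / n_agents))
```

Note that S/n is non-decreasing in S, so in this symmetric case the Pareto-optimality condition stated in the abstract is satisfied and the resulting allocation is comonotonic.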