
Showing papers by "Jens Perch Nielsen published in 2006"


Posted Content
TL;DR: In this article, an extended version of the Cox model with time-varying regression coefficients is used to analyse temporal covariate effects on customer lifetime durations.
Abstract: The Cox model (Cox, 1972) is widely used in customer lifetime duration research, but it assumes that the regression coefficients are time invariant. In order to analyse the temporal covariate effects on the duration times, we propose to use an extended version of the Cox model where the parameters are allowed to vary over time. We apply this methodology to real insurance policy cancellation data and we conclude that the kind of contracts held by the customer and the concurrence of an external insurer in the cancellation influence the risk of the customer leaving the company, but the effect differs as time goes by.
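The idea of letting a covariate effect vary over time can be illustrated with a small simulation (a hypothetical setup of our own, not the authors' data or estimator): durations are generated with a hazard ratio that changes at t = 1, and piecewise occurrence/exposure rates recover the early and late effects separately.

```python
import numpy as np

rng = np.random.default_rng(0)

n = 20000
h0 = 0.5                             # baseline hazard
beta_early, beta_late = 1.0, 0.0     # log hazard ratio before / after t = 1

def simulate(x, size):
    """Inverse-transform sampling from a piecewise-constant hazard."""
    e = rng.exponential(size=size)           # unit-rate cumulative hazards
    h1 = h0 * np.exp(beta_early * x)         # hazard on [0, 1)
    h2 = h0 * np.exp(beta_late * x)          # hazard on [1, inf)
    return np.where(e < h1, e / h1, 1.0 + (e - h1) / h2)

t0, t1 = simulate(0, n), simulate(1, n)      # durations for the two groups

def rate(t, lo, hi):
    """Deaths per unit of exposure inside the interval [lo, hi)."""
    exposure = np.clip(t, lo, hi) - lo
    deaths = ((t >= lo) & (t < hi)).sum()
    return deaths / exposure.sum()

beta_hat = [np.log(rate(t1, lo, hi) / rate(t0, lo, hi))
            for lo, hi in [(0.0, 1.0), (1.0, np.inf)]]
print(beta_hat)   # close to [1.0, 0.0]: the covariate effect fades over time
```

A time-invariant Cox fit would average these two regimes into a single coefficient and miss that the effect disappears after the first year.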

68 citations


Journal ArticleDOI
TL;DR: In this paper, a representative member of a family of new pension schemes introduced in the new millennium to alleviate the opacity and solvency problems of traditional with-profits schemes is analysed using contingent claims pricing theory. The authors find that, in terms of market value, smoothing is an illusion, but also that the return smoothing mechanism implies a dynamic asset allocation strategy consistent with traditional pension saving advice and with the recommendations of state-of-the-art dynamic portfolio choice models.
Abstract: Traditional with-profits pension saving schemes have been criticized for their opacity, plagued by embedded options and guarantees, and have recently created enormous problems for the solvency of the life insurance and pension industry. This has fueled creativity in the industry’s product development departments, and this paper analyzes a representative member of a family of new pension schemes that have been introduced in the new millennium to alleviate these problems. The complete transparency of the new scheme’s smoothing mechanism means that it can be analyzed using contingent claims pricing theory. We explore the properties of this pension scheme in detail and find that in terms of market value, smoothing is an illusion, but also that the return smoothing mechanism implies a dynamic asset allocation strategy which corresponds with traditional pension saving advice and the recommendations of state-of-the-art dynamic portfolio choice models.
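The "smoothing is an illusion" finding can be sketched with a stylized Monte Carlo (the crediting rule below is our own simplification, not the scheme analysed in the paper): because the member ultimately receives account plus buffer, the risk-neutral value of the terminal payout equals the initial deposit no matter how returns are smoothed along the way, even though the credited returns are visibly smoother than the fund's returns.

```python
import numpy as np

rng = np.random.default_rng(1)

n_paths, T = 100_000, 10
r, sigma, kappa = 0.03, 0.15, 0.2    # riskless rate, fund vol, crediting speed
F0 = 100.0                           # initial deposit

# Fund value follows GBM under the risk-neutral measure (annual steps).
z = rng.standard_normal((n_paths, T))
log_ret = (r - 0.5 * sigma**2) + sigma * z
F = F0 * np.exp(np.cumsum(log_ret, axis=1))

# Policy account is credited the riskless rate plus a fraction of the buffer.
A = np.full(n_paths, F0)
credited = np.empty((n_paths, T))
for t in range(T):
    buffer = F[:, t] - A                     # undistributed reserve
    credited[:, t] = r + kappa * buffer / A  # smoothed crediting rate
    A = A * np.exp(credited[:, t])

payout = F[:, -1]                            # member gets account + buffer = fund
market_value = np.exp(-r * T) * payout.mean()   # ~ F0: value-neutral under Q
smoothness = credited.std(axis=1).mean() / log_ret.std(axis=1).mean()  # < 1
```

The crediting rule changes the path of the account but not the market value of what is eventually paid out, which is the sense in which smoothing is an illusion.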

51 citations


Journal ArticleDOI
TL;DR: In this article, a tailor-made semiparametric asymmetric kernel density estimator is developed for the estimation of actuarial loss distributions. The estimator is obtained by transforming the data with the generalized Champernowne distribution initially fitted to the data.
Abstract: We develop a tailor-made semiparametric asymmetric kernel density estimator for the estimation of actuarial loss distributions. The estimator is obtained by transforming the data with the generalized Champernowne distribution initially fitted to the data. Then the density of the transformed data is estimated by use of local asymmetric kernel methods to obtain superior estimation properties in the tails. We find in a vast simulation study that the proposed semiparametric estimation procedure performs well relative to alternative estimators. An application to operational loss data illustrates the proposed method.
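A minimal sketch of the transformation approach, assuming the c = 0 special case of the generalized Champernowne distribution and substituting a boundary-reflected Gaussian kernel for the paper's local asymmetric kernels:

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(2)
x = rng.lognormal(mean=0.0, sigma=1.0, size=1000)   # stand-in loss sample

# Champernowne CDF with c = 0: T(y) = y^a / (y^a + M^a).
# M is fixed at the sample median (the distribution's median); a by MLE.
M = np.median(x)

def neg_loglik(a):
    # density: t(y) = a * M^a * y^(a-1) / (y^a + M^a)^2
    return -np.sum(np.log(a) + a * np.log(M) + (a - 1) * np.log(x)
                   - 2 * np.log(x**a + M**a))

a = minimize_scalar(neg_loglik, bounds=(0.1, 20.0), method="bounded").x

y = x**a / (x**a + M**a)                 # transformed data in (0, 1)
h = 1.06 * y.std() * len(y) ** (-1 / 5)  # Silverman's rule-of-thumb bandwidth

def g_hat(u):
    """Gaussian KDE on (0, 1) with reflection at both boundaries."""
    u = np.atleast_1d(u)[:, None]
    k = lambda d: np.exp(-0.5 * (d / h) ** 2) / (h * np.sqrt(2 * np.pi))
    return (k(u - y) + k(u + y) + k(u - (2 - y))).mean(axis=1)

def f_hat(t):
    """Back-transformed density estimate on the original loss scale."""
    t = np.asarray(t, dtype=float)
    jac = a * M**a * t**(a - 1) / (t**a + M**a) ** 2   # T'(t)
    return g_hat(t**a / (t**a + M**a)) * jac

grid = np.linspace(1e-4, 1 - 1e-4, 2000)
gv = g_hat(grid)
total_mass = ((gv[1:] + gv[:-1]) / 2 * np.diff(grid)).sum()  # ~ 1
```

Because the Champernowne transformation maps the heavy tail into a neighbourhood of 1, ordinary kernel smoothing on the transformed scale yields a tail-adaptive estimate after back-transformation.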

25 citations


Journal ArticleDOI
TL;DR: In this paper, the benefits of applying sophisticated statistical techniques to the quantification of operational risk are considered, emphasizing the importance of capturing tail behavior; non-parametric smoothing techniques built on a parametric base are compared with extreme value theory.
Abstract: This paper considers the benefits of applying sophisticated statistical techniques to challenges faced in the quantification of operational risk. The evolutionary nature of operational risk modeling to establish capital charges is recognized, emphasizing the importance of capturing tail behavior. Non-parametric smoothing techniques are considered along with a parametric base with a particular view to comparison with extreme value theory. This is presented without detailed proofs with the aim of demonstrating to practitioners the practical benefits of such techniques. The smoothed estimators, embedded in a credibility approach, support analysis of pooled data across lines of business or across risk types/regions.
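The pooled-data credibility step can be sketched with a standard Bühlmann blend (a generic illustration on simulated loss frequencies, not the paper's smoothed estimators): each business line's own experience is weighted against the pooled mean according to how informative the line's own data are.

```python
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical annual operational-loss counts for four business lines.
years = 8
true_means = [5.0, 8.0, 12.0, 20.0]
data = np.array([rng.poisson(m, size=years) for m in true_means], dtype=float)

line_means = data.mean(axis=1)
grand_mean = data.mean()

# Buhlmann structure parameters estimated from the data.
v = data.var(axis=1, ddof=1).mean()                 # expected process variance
a = max(line_means.var(ddof=1) - v / years, 1e-9)   # variance of true line means
z = years / (years + v / a)                         # credibility weight in (0, 1)

blended = z * line_means + (1 - z) * grand_mean     # per-line capital inputs
```

Lines with volatile or scarce data are shrunk toward the pooled mean; as more years accumulate, z approaches 1 and the line's own experience dominates.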

20 citations


Journal ArticleDOI
TL;DR: A semiparametric approach to operational risk modelling that is able to take underreporting into account and that allows itself to be guided by prior knowledge of distributional shape is introduced.
Abstract: Not all claims are reported when a financial operational risk database is created. The probability of reporting increases with the size of the operational risk loss and approaches one for very large losses. Operational risk losses come from many different sources and can be expected to follow a wide variety of distributional shapes. Therefore, an approach to operational risk modelling based on one or two favourite parametric models is deemed to fail. In this paper we introduce a semiparametric approach to operational risk modelling that is able to take underreporting into account and that allows itself to be guided by prior knowledge of distributional shape.
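The underreporting correction can be illustrated with inverse-probability weighting under a known reporting function (a hypothetical logistic form chosen here for illustration; the paper's semiparametric estimator is more general): small losses are under-represented in the database, so a naive average overstates the typical loss, while reweighting each reported loss by the inverse of its reporting probability undoes the bias.

```python
import numpy as np

rng = np.random.default_rng(4)

true_losses = rng.lognormal(mean=1.0, sigma=1.0, size=50_000)

def p_report(x):
    """Hypothetical reporting probability: rises with size, -> 1 for large x."""
    return 1.0 / (1.0 + np.exp(-(np.log(x) - 1.0) / 0.5))

# The database only contains losses that happened to be reported.
reported = true_losses[rng.random(true_losses.size) < p_report(true_losses)]

naive_mean = reported.mean()                 # biased up: small losses missing
w = 1.0 / p_report(reported)                 # inverse-probability weights
ipw_mean = np.sum(w * reported) / np.sum(w)  # corrected estimate
true_mean = true_losses.mean()
```

The same weighting idea applies to the density itself: the observed density is proportional to p(x) times the true density, so dividing by p(x) and renormalizing recovers the underlying loss distribution.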

15 citations


Journal ArticleDOI
TL;DR: In this article, the authors describe some of the pitfalls which can result from the use of the standardised mortality ratio (SMR) while evaluating the development of mortality over time, in particular when SMRs are applied to insurance portfolios varying dramatically over time.
Abstract: Almost all over the world, decreasing mortality rates and increasing life expectancy have led to greater interest in estimating and predicting mortality. Here we describe some of the pitfalls which can result from the use of the standardised mortality ratio (SMR) while evaluating the development of mortality over time, in particular when SMRs are applied to insurance portfolios varying dramatically over time. Although an excellent comparative study of a single-figure index for a number of countries was recently done by Macdonald et al. (1998), we advocate care when attempting to extend this type of method to insurance data. Here we promote the use of genuine multiplicative modelling such as in Felipe et al. (2001), who compared the mortality rates in Denmark and Spain. The starting point for our study was the two-dimensional mortality estimator of Nielsen & Linton (1995), which considers mortality as a function of chronological time and age. From the principle of marginal integration (see Nielsen & Linton, 1995, and Linton et al., 2003), estimators of the multiplicative model can be obtained from this two-dimensional estimator. An application of the method is provided for mortality data of the United States of America, England & Wales, France, Italy, Japan and Russia.
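The SMR pitfall for portfolios that change dramatically over time can be shown with a two-age toy example (hypothetical numbers): age-specific mortality is identical in both periods, yet the single-figure SMR moves because the exposure mix ages.

```python
# Reference (standard table) rates and the portfolio's actual rates,
# which are the same in both periods: no mortality change has occurred.
ref_rate = {"young": 0.002, "old": 0.020}
actual_rate = {"young": 0.0016, "old": 0.020}   # young lives 20% lighter

def smr(exposure):
    """Standardised mortality ratio: observed / expected deaths."""
    observed = sum(exposure[a] * actual_rate[a] for a in exposure)
    expected = sum(exposure[a] * ref_rate[a] for a in exposure)
    return observed / expected

smr_period1 = smr({"young": 9000, "old": 1000})  # young-dominated portfolio
smr_period2 = smr({"young": 1000, "old": 9000})  # the portfolio has aged
print(smr_period1, smr_period2)  # ~0.905 vs ~0.998, despite unchanged rates
```

A multiplicative model of the kind the authors advocate, which estimates age and calendar-time components separately, would correctly report no change over time here.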

9 citations


Journal ArticleDOI
TL;DR: In this paper, the authors investigate the model of Froot & Stein (1998), a model which has very strong implications for risk management and argue that their conclusions are too strong and need to be qualified.
Abstract: We investigate the model of Froot & Stein (1998), a model which has very strong implications for risk management. We argue that their conclusions are too strong and need to be qualified. We also argue that their analysis is incorrect and incomplete. Specifically, there are some unusual consequences of their model, which may be linked to the chosen pricing formula.

5 citations


Journal ArticleDOI
TL;DR: In this paper, the authors show that the economic intuition behind the paper of Froot and Stein (1998) is correct and that their result can be obtained when the market is reformulated in a discrete time setting, where modern market theory is employed.
Abstract: Høgh, Linton and Nielsen (2006) showed that the famous result in the award-winning paper of Froot and Stein (1998) is not correct, in the sense that their result does not follow from their assumptions. In this paper we show that the economic intuition behind the paper of Froot and Stein (1998) is correct and that their result can be obtained when the market is reformulated in a discrete time setting where modern market theory is employed.