Author

Lluís Bermúdez

Bio: Lluís Bermúdez is an academic researcher from the University of Barcelona. The author has contributed to research in topics including overdispersion and bivariate analysis. The author has an h-index of 12 and has co-authored 34 publications receiving 613 citations.

Papers
Journal ArticleDOI
TL;DR: In this article, the authors introduce different multivariate Poisson regression models in order to relax the independence assumption, including zero-inflated models to account for excess of zeros and overdispersion.
Abstract: When actuaries face the problem of pricing an insurance contract that contains different types of coverage, such as a motor insurance or homeowner's insurance policy, they usually assume that the types of claim are independent. However, this assumption may not be realistic: several studies have shown that there is a positive correlation between types of claim. Here we introduce different multivariate Poisson regression models in order to relax the independence assumption, including zero-inflated models to account for the excess of zeros and overdispersion. These models have been largely ignored to date, mainly because of their computational difficulties. Bayesian inference based on MCMC helps to solve this problem (and also lets us derive, for several quantities of interest, posterior summaries to account for uncertainty). Finally, these models are applied to an automobile insurance claims database with three different types of claims. We analyse the consequences for pure and loaded premiums when the independence assumption is relaxed by using different multivariate Poisson regression models and their zero-inflated versions.

88 citations
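As a concrete illustration of how zero-inflation enters a count model, the sketch below fits a univariate zero-inflated Poisson by maximum likelihood on synthetic data. It is only a minimal stand-in for the paper's approach: the paper works with multivariate Poisson models with covariates and fits them by Bayesian MCMC, and all parameter values and data here are invented for illustration.

```python
# Minimal sketch: zero-inflated Poisson (ZIP) likelihood fit by maximum
# likelihood. Synthetic data; not the paper's Bayesian MCMC setup.
import numpy as np
from scipy.optimize import minimize
from scipy.special import gammaln

rng = np.random.default_rng(0)
n = 5000
pi_true, lam_true = 0.3, 1.5                 # inflation probability, Poisson mean
zeros = rng.random(n) < pi_true              # structural zeros
y = np.where(zeros, 0, rng.poisson(lam_true, n))

def neg_loglik(params, y):
    # params = (logit of pi, log of lambda) so the optimizer is unconstrained
    pi = 1.0 / (1.0 + np.exp(-params[0]))
    lam = np.exp(params[1])
    logp_pois = -lam + y * np.log(lam) - gammaln(y + 1)
    ll = np.where(y == 0,
                  np.log(pi + (1 - pi) * np.exp(-lam)),  # structural or Poisson zero
                  np.log(1 - pi) + logp_pois)
    return -ll.sum()

res = minimize(neg_loglik, x0=[0.0, 0.0], args=(y,))
pi_hat = 1 / (1 + np.exp(-res.x[0]))
lam_hat = np.exp(res.x[1])
print(f"pi_hat={pi_hat:.3f}, lambda_hat={lam_hat:.3f}")  # recovers roughly 0.3 and 1.5
```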

Journal ArticleDOI
TL;DR: In this paper, a 2-finite mixture of bivariate Poisson regression models is proposed to model the overdispersion in the data, and it is shown that a simple zero-inflated bivariate Poisson model does not suffice.
Abstract: In a recent paper Bermudez [2009] used bivariate Poisson regression models for ratemaking in car insurance, and included zero-inflated models to account for the excess of zeros and the overdispersion in the data set. In the present paper, we revisit this model in order to consider alternatives. We propose a 2-finite mixture of bivariate Poisson regression models to demonstrate that the overdispersion in the data requires more structure if it is to be taken into account, and that a simple zero-inflated bivariate Poisson model does not suffice. At the same time, we show that a finite mixture of bivariate Poisson regression models embraces zero-inflated bivariate Poisson regression models as a special case. Additionally, we describe a model in which the mixing proportions are dependent on covariates when modelling the way in which each individual belongs to a separate cluster. Finally, an EM algorithm is provided in order to ensure the models’ ease-of-fit. These models are applied to the same automobile insurance claims data set as used in Bermudez [2009] and it is shown that the modelling of the data set can be improved considerably.

83 citations
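The sketch below shows the E and M steps of an EM algorithm for a 2-component Poisson mixture on synthetic counts. It is a deliberately stripped-down, univariate analogue of the paper's model, which mixes bivariate Poisson regressions and lets covariates drive the mixing proportions; all values here are illustrative assumptions.

```python
# Minimal EM sketch for a 2-component (univariate) Poisson mixture.
import numpy as np
from scipy.stats import poisson

rng = np.random.default_rng(1)
n = 4000
z = rng.random(n) < 0.4                       # latent cluster labels
y = np.where(z, rng.poisson(0.5, n), rng.poisson(4.0, n))

w, lam = 0.5, np.array([1.0, 3.0])            # initial guesses
for _ in range(200):
    # E-step: posterior responsibility of component 0 for each observation
    p0 = w * poisson.pmf(y, lam[0])
    p1 = (1 - w) * poisson.pmf(y, lam[1])
    r = p0 / (p0 + p1)
    # M-step: responsibility-weighted updates of the mixing weight and means
    w = r.mean()
    lam = np.array([(r * y).sum() / r.sum(),
                    ((1 - r) * y).sum() / (1 - r).sum()])

print(w, lam)   # roughly 0.4 and component means near (0.5, 4.0)
```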

Journal ArticleDOI
TL;DR: In this article, the authors discuss the use of the standard model for the calculation of the solvency capital requirement (SCR) when the company aims to use the specific parameters of the model on the basis of the experience of its portfolio.
Abstract: In this work we discuss the use of the standard model for the calculation of the solvency capital requirement (SCR) when the company aims to use parameters of the model specific to the experience of its own portfolio. In particular, this analysis focuses on the formula presented in the latest quantitative impact study (CEIOPS, 2010) for non-life underwriting premium and reserve risk. One of the keys of the standard model for premium and reserve risk is the correlation matrix between lines of business. We show how this correlation matrix can be estimated from a quantitative perspective, and we also consider the possibility of using a credibility model that merges the qualitative and quantitative perspectives to estimate it.

61 citations
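For readers unfamiliar with the standard formula, the sketch below shows how a correlation matrix between lines of business aggregates per-line volatilities into an overall volatility, via sigma_agg = sqrt(v' C v) / (sum of volumes) with v_i = sigma_i * V_i. The volumes, volatilities, and correlations are invented for illustration and are not the CEIOPS calibration.

```python
# Sketch of correlation-matrix aggregation across lines of business.
import numpy as np

V     = np.array([100.0, 60.0, 40.0])      # volume measure per line (illustrative)
sigma = np.array([0.10, 0.15, 0.08])       # per-line volatility (illustrative)
C = np.array([[1.00, 0.50, 0.25],
              [0.50, 1.00, 0.25],
              [0.25, 0.25, 1.00]])         # correlation between lines (illustrative)

v = sigma * V                              # volatility scaled by volume
agg_sigma = np.sqrt(v @ C @ v) / V.sum()   # aggregate volatility of the portfolio
print(f"aggregate volatility = {agg_sigma:.4f}")
# Lower off-diagonal correlations, e.g. estimated from portfolio data as the
# paper proposes, directly lower agg_sigma and hence the resulting SCR.
```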

Journal ArticleDOI
TL;DR: It is shown that, in some instances, a common risk measure such as Value-at-Risk is not subadditive when certain dependence structures are considered, and that higher risk evaluations are obtained under independence between random variables than under comonotonicity.
Abstract: This paper examines why a financial entity's solvency capital might be underestimated if the total amount required is obtained directly from a risk measurement. Using Monte Carlo simulation, we show that, in some instances, a common risk measure such as Value-at-Risk is not subadditive when certain dependence structures are considered. Higher risk evaluations are obtained under independence between random variables than under comonotonicity. The paper stresses, therefore, the relationship between dependence structures and capital estimation.

55 citations
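A short Monte Carlo sketch of the paper's central point: with sufficiently heavy-tailed marginals, the 99.5% Value-at-Risk of a sum under independence can exceed the comonotonic (perfectly dependent) case, so VaR fails subadditivity. The Pareto marginals with tail index 0.7 are an illustrative assumption, not the paper's exact setup.

```python
# Monte Carlo check: VaR of a sum of heavy-tailed risks, independence vs
# comonotonicity. Marginals are Pareto with P(X > x) = x**(-alpha), alpha < 1.
import numpy as np

rng = np.random.default_rng(2)
alpha, q, n = 0.7, 0.995, 2_000_000
ppf = lambda u: (1.0 - u) ** (-1.0 / alpha)     # Pareto quantile function

u1, u2 = rng.random(n), rng.random(n)
var_indep = np.quantile(ppf(u1) + ppf(u2), q)   # independent copula
var_comon = np.quantile(ppf(u1) + ppf(u1), q)   # comonotonic: same uniform driver

print(var_indep, var_comon)
print(var_indep > var_comon)   # True: independence yields the *higher* VaR here
```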


Cited by

Book
04 Nov 2005
TL;DR: In this article, the authors provide an essential guide to managing modern financial risk by combining coverage of stochastic order and risk measure theories with the basics of risk management, including dependence concepts and dependence orderings.
Abstract: The increasing complexity of insurance and reinsurance products has seen a growing interest amongst actuaries in the modelling of dependent risks. For efficient risk management, actuaries need to be able to answer fundamental questions such as: Is the correlation structure dangerous? And, if yes, to what extent? Therefore tools to quantify, compare, and model the strength of dependence between different risks are vital. Combining coverage of stochastic order and risk measure theories with the basics of risk management and stochastic dependence, this book provides an essential guide to managing modern financial risk.
* Describes how to model risks in incomplete markets, emphasising insurance risks.
* Explains how to measure and compare the danger of risks, model their interactions, and measure the strength of their association.
* Examines the type of dependence induced by GLM-based credibility models, the bounds on functions of dependent risks, and probabilistic distances between actuarial models.
* Detailed presentation of risk measures, stochastic orderings, copula models, dependence concepts and dependence orderings.
* Includes numerous exercises allowing a cementing of the concepts by all levels of readers.
* Solutions to tasks as well as further examples and exercises can be found on a supporting website.

590 citations
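As a taste of the dependence-modelling toolkit the book covers, the sketch below samples from a Clayton copula by conditional inversion and checks the empirical Kendall's tau against its closed form tau = theta / (theta + 2). The copula family and parameter are arbitrary illustrative choices, not drawn from the book.

```python
# Sampling a Clayton copula by conditional inversion, then verifying the
# closed-form Kendall's tau = theta / (theta + 2) empirically.
import numpy as np
from scipy.stats import kendalltau

rng = np.random.default_rng(3)
theta, n = 2.0, 20_000
u = rng.random(n)
v = rng.random(n)
# Given U1 = u, invert the conditional copula CDF to get the second coordinate
u2 = (u ** (-theta) * (v ** (-theta / (theta + 1.0)) - 1.0) + 1.0) ** (-1.0 / theta)

tau_emp, _ = kendalltau(u, u2)
print(tau_emp, theta / (theta + 2))   # both close to 0.5 for theta = 2
```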

Journal Article
TL;DR: A review of Foundations of Economic Analysis of Law, by Steven Shavell, 2004, The Belknap Press of Harvard University Press, 737 pages. The foundations of the economic analysis of law were laid by Coase, Becker, and Calabresi, followed by Posner, Landes, and Ehrlich, with Shavell belonging to the third generation.
Abstract: Foundations of Economic Analysis of Law, by Steven Shavell, 2004, The Belknap Press of Harvard University Press, 737 pages. Steven Shavell has contributed to the foundations of economic analysis of law in different manners. According to Posner (2006), he is a member of the third generation of economic analysts of law: Coase, Becker, and Calabresi being the first group, with Posner himself, Landes, and Ehrlich forming the second. Shavell has published several books and more than 100 articles on economics and on the economics of law. He has contributed to the principal-agent theory (Shavell, 1979b) and, more particularly, to the moral hazard literature (Shavell, 1979a). This book proposes an overview of the fields in the economics of law to which the author has contributed. It also covers in detail other fields and many contributions to the literature. The emphasis is on theory, but some empirical facts are mentioned. The book has twenty-nine chapters in seven parts or sections, a comprehensive list of references (786 references in the References section of the book), and two indexes (authors and subjects). It covers many subjects related to the economic analysis of basic law. Particular attention is devoted to the positive analysis of law, although the normative aspect is also well covered. The book is addressed to two broad audiences: economists and individuals interested in law with no formal background in economics. There is no formal economic analysis in the text (but formal models are sometimes sketched in footnotes) and no detailed discussion of legal doctrine. The subjects covered are important for any legal system: laws related to property, accidents, contracts, crimes, and their litigation process. Specialized subjects such as labor, bankruptcy, or environmental law are not covered. However, for the readers of the Journal of Risk and Insurance, accident law is discussed in detail (one section including five chapters that will be analyzed below). Chapter 1, the introduction to the book, presents the author's basic philosophy with regard to the economics of law. He first distinguishes the positive analysis of the economics of law from its normative analysis. Using his example for automobile accidents, the positive analysis is concerned with how a liability system affects accidents and litigation expenses, whereas the normative analysis looks at the social desirability of a liability system. Two standard and important assumptions are made for the normative analysis. First, the normative analysis does not take any of the distributive aspects into account; this is left to the income tax system and other transfer mechanisms. Second, the notions of fairness and morality are not integrated in the analysis, although a significant effort is made to do so in part seven of the book. The first four parts of the book treat areas related to private law: property law, liability for accidents, contract law, and civil litigation. They are called private because they are enforced by the activities or suits of private parties. The first of the four parts is devoted to property law. Chapter 2 covers the rationale of ownership and the emergence of property rights. The chapter defines concepts that are not often discussed in the standard economic literature. For example, the author treats property rights, their justification, and their emergence. Property rights are themselves divided into two types of rights: possessory rights and transfer rights. The justification of property rights is mainly related to incentives: incentives to work, incentives to maintain and improve things, and incentives to transfer things. Their emergence occurs when the advantages are greater than the costs of instituting and maintaining them. Chapter 3 is devoted to the division of property rights while chapter 4 discusses, in detail, the acquisition and transfer of property rights, including transfer after death. Chapter 5 concerns the issues of conflict and cooperation associated with the use of property rights …

276 citations

Posted Content
TL;DR: In this article, the authors examined whether there is an optimal intensity for R&D subsidies through an analysis of their impact on private research effort and showed a non-linear relationship between the percentage of subsidy received and the firms' research effort.
Abstract: The effectiveness of R&D subsidies can vary substantially depending on their characteristics. Specifically, the amount and intensity of such subsidies are crucial issues in the design of public schemes supporting private R&D. Public agencies determine the intensities of R&D subsidies for firms in line with their eligibility criteria, although assessing the effects of R&D projects accurately is far from straightforward. The main aim of this paper is to examine whether there is an optimal intensity for R&D subsidies through an analysis of their impact on private R&D effort. We examine the decisions of a public agency to grant subsidies, taking into account not only the characteristics of the firms but also, as few previous studies have done to date, those of the R&D projects. In determining the optimal subsidy we use both parametric and non-parametric techniques. The results show a non-linear relationship between the percentage of subsidy received and the firms' R&D effort. These results have implications for technology policy, particularly for the design of R&D subsidies that ensure enhanced effectiveness.

254 citations
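The sketch below mimics the paper's two estimation routes on synthetic data: a parametric quadratic fit of R&D effort against subsidy intensity, whose interior maximum plays the role of an "optimal intensity", and a non-parametric Nadaraya-Watson kernel smoother. The inverted-U relationship is assumed for illustration and is not taken from the paper's data.

```python
# Parametric (quadratic OLS) vs non-parametric (kernel) fit of R&D effort
# against subsidy intensity, on synthetic inverted-U data.
import numpy as np

rng = np.random.default_rng(4)
s = rng.uniform(0, 1, 500)                          # subsidy intensity in [0, 1]
effort = 1.0 + 2.0 * s - 2.0 * s**2 + rng.normal(0, 0.2, 500)

beta = np.polyfit(s, effort, deg=2)                 # parametric: quadratic OLS
peak = -beta[1] / (2 * beta[0])                     # interior optimum if beta[0] < 0
print(f"estimated optimal intensity = {peak:.2f}")  # close to 0.5 here

def kernel_fit(x0, x, y, h=0.1):
    """Nadaraya-Watson estimate of E[y | x = x0] with a Gaussian kernel."""
    w = np.exp(-0.5 * ((x - x0) / h) ** 2)
    return (w * y).sum() / w.sum()

print(kernel_fit(0.5, s, effort))                   # agrees with the quadratic peak
```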

01 Jun 2016
TL;DR: A team of researchers at Google's DeepMind Technologies has been working on a means to increase the capabilities of computers by combining aspects of data processing and artificial intelligence, and has come up with what they are calling a differentiable neural computer (DNC).
Abstract: A team of researchers at Google's DeepMind Technologies has been working on a means to increase the capabilities of computers by combining aspects of data processing and artificial intelligence, and has come up with what they are calling a differentiable neural computer (DNC). In their paper published in the journal Nature, they describe the work they are doing and where they believe it is headed. To make the work more accessible to the public, team members Alexander Graves and Greg Wayne have posted an explanatory page on the DeepMind website. [13]

248 citations