
Showing papers on "Efficient frontier published in 2015"


Journal ArticleDOI
TL;DR: In this article, the authors studied an optimal investment-reinsurance problem for an insurer with a surplus process represented by the Cramer-Lundberg model, where the insurer faces the decision-making problem of choosing to purchase reinsurance, acquire new business and invest its surplus in the financial market such that the mean of its terminal wealth is maximized and its variance is minimized simultaneously.
Abstract: This paper studies an optimal investment–reinsurance problem for an insurer with a surplus process represented by the Cramer–Lundberg model. The insurer is assumed to be a mean–variance optimizer. The financial market consists of one risk-free asset and one risky asset. The market price of risk depends on a Markovian, affine-form, square-root stochastic factor process, while the volatility and appreciation rate of the risky asset are given by non-Markovian, unbounded processes. The insurer faces the decision-making problem of choosing to purchase reinsurance, acquire new business and invest its surplus in the financial market such that the mean of its terminal wealth is maximized and its variance is minimized simultaneously. We adopt a backward stochastic differential equation approach to solve the problem. Closed-form expressions for the efficient frontier and efficient strategy of the mean–variance problem are derived. Numerical examples are presented to illustrate our results in two special cases, the constant elasticity of variance model and Heston’s model.
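The closed-form frontier itself is model-specific, but as a hedged illustration of what such continuous-time mean–variance results typically look like, the efficient frontier is usually a parabola in the mean–variance plane; the sketch below is a generic form with placeholder constants A and B, not the paper's exact expression.

```latex
% Illustrative only: the generic parabolic shape of a continuous-time
% mean-variance efficient frontier.  A and B are placeholder constants (A the
% expected terminal wealth of the purely risk-free strategy, B > 0 an
% aggregate of the market price of risk over [0, T]); the paper's closed form
% has model-specific coefficients.
\[
  \mathrm{Var}\bigl[X^{*}(T)\bigr]
    = \frac{\bigl(\mathbb{E}\bigl[X^{*}(T)\bigr] - A\bigr)^{2}}{B},
  \qquad \mathbb{E}\bigl[X^{*}(T)\bigr] \ge A .
\]
```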

93 citations


Journal ArticleDOI
Lei Fang
TL;DR: In this paper, a new approach for resource allocation based on efficiency analysis under a centralized decision-making environment is presented, which provides a level-wise improvement path to direct the DMUs to reach their ultimate targets on the efficient frontier in an implementable and realistic manner.
Abstract: The existing centralized resource allocation models often assume that all of the DMUs are efficient after resource allocation. For a DMU with a very low efficiency score, this implies a dramatic reduction of its resources, which can cause organizational resistance. In addition, in reality, it is particularly difficult for the DMUs to achieve their target efficiencies in a single step, especially when they are far from the efficient frontier. Thus, gradual progress towards benchmarking targets is gaining importance. In this paper, we present a new approach for resource allocation based on efficiency analysis under a centralized decision-making environment. Through our approach, the central decision-maker can obtain a sequence of intermediate benchmark targets based on efficiency analysis, which provides a level-wise improvement path to direct the DMUs to reach their ultimate targets on the efficient frontier in an implementable and realistic manner. Numerical examples are presented to illustrate the application procedure of the proposed approach.
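For readers who want a concrete feel for the efficiency scores this line of work builds on, here is a minimal sketch (not the paper's allocation model) of the classic input-oriented CCR envelopment program solved with SciPy; the DMU input/output data are made up for illustration.

```python
# A minimal sketch (not the paper's allocation model) of the classic
# input-oriented CCR envelopment program, solved with scipy.optimize.linprog.
# The DMU input/output data below are made up for illustration.
import numpy as np
from scipy.optimize import linprog

X = np.array([[2.0, 4.0], [3.0, 2.0], [5.0, 6.0], [4.0, 8.0]])  # inputs, one row per DMU
Y = np.array([[1.0], [1.5], [2.0], [1.2]])                      # outputs, one row per DMU
n, m = X.shape   # number of DMUs, number of inputs
s = Y.shape[1]   # number of outputs

def ccr_efficiency(o):
    """CCR efficiency score (theta) of DMU o under constant returns to scale."""
    # decision vector z = [theta, lambda_1, ..., lambda_n]
    c = np.r_[1.0, np.zeros(n)]                       # minimize theta
    A_in = np.hstack([-X[o].reshape(-1, 1), X.T])     # sum_j lam_j x_ij <= theta * x_io
    A_out = np.hstack([np.zeros((s, 1)), -Y.T])       # sum_j lam_j y_rj >= y_ro
    res = linprog(c,
                  A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[o]],
                  bounds=[(None, None)] + [(0.0, None)] * n,
                  method="highs")
    return res.x[0]

for o in range(n):
    print(f"DMU {o}: efficiency = {ccr_efficiency(o):.3f}")
```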

74 citations


Posted Content
TL;DR: In this paper, the authors used portfolio optimization techniques to determine the most favorable investment portfolio in particular stock indices of three companies, namely Microsoft Corporation, Christian Dior Fashion House and Shevron Corporation.
Abstract: In this paper, portfolio optimization techniques were used to determine the most favorable investment portfolio. In particular, stock indices of three companies, namely Microsoft Corporation, Christian Dior Fashion House and Shevron Corporation, were evaluated. Using this data, the amounts invested in each asset when a portfolio is chosen on the efficient frontier were calculated. In addition, the portfolio with minimum variance, the tangency portfolio and the optimal Markowitz portfolio are presented.
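As a hedged illustration of the quantities the paper reports, the sketch below computes the global minimum-variance and tangency portfolios for three assets in closed form; the expected returns, covariances and risk-free rate are invented placeholders, not the paper's estimates for the three stocks.

```python
# A hedged illustration of the quantities reported above: closed-form
# minimum-variance and tangency portfolios for three assets.  The expected
# returns, covariance matrix and risk-free rate are invented placeholders,
# not the paper's estimates for the three stocks.
import numpy as np

mu = np.array([0.10, 0.07, 0.09])          # assumed annual expected returns
Sigma = np.array([[0.040, 0.006, 0.010],   # assumed covariance matrix
                  [0.006, 0.025, 0.004],
                  [0.010, 0.004, 0.030]])
rf = 0.02                                  # assumed risk-free rate
ones = np.ones(len(mu))
inv = np.linalg.inv(Sigma)

w_minvar = inv @ ones / (ones @ inv @ ones)          # global minimum-variance weights
w_tan = inv @ (mu - rf) / (ones @ inv @ (mu - rf))   # tangency (maximum Sharpe) weights

for name, w in [("min-variance", w_minvar), ("tangency", w_tan)]:
    ret, vol = w @ mu, np.sqrt(w @ Sigma @ w)
    print(f"{name:12s} weights={np.round(w, 3)}  return={ret:.3f}  vol={vol:.3f}")
```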

71 citations


Journal ArticleDOI
TL;DR: A heuristic approach is applied to approximate the cardinality constrained efficient frontier of the portfolio selection problem considering the below-mean absolute semi-deviation as a measure of risk.
Abstract: We present a cardinality constrained credibility mean-absolute semi-deviation model. We prove relationships for possibility and credibility moments for LR-fuzzy variables. The return on a given portfolio is modeled by means of LR-type fuzzy variables. We solve the portfolio selection problem using an evolutionary procedure with a DSS. We select the best portfolio from the Pareto front with a ranking strategy based on Fuzzy VaR. We introduce a cardinality constrained multi-objective optimization problem for generating efficient portfolios within a fuzzy mean-absolute deviation framework. We assume that the return on a given portfolio is modeled by means of LR-type fuzzy variables, whose credibility distributions collect the contemporary relationships among the returns on individual assets. To consider credibility measures of risk and return on a given portfolio enables us to work with its Fuzzy Value-at-Risk. The relationship between credibility expected values for LR-type fuzzy variables and possibilistic moments for LR-fuzzy numbers having the same membership function are analyzed. We apply a heuristic approach to approximate the cardinality constrained efficient frontier of the portfolio selection problem considering the below-mean absolute semi-deviation as a measure of risk. We also explore the impact of adding a Fuzzy Value-at-Risk measure that supports the investor's choices. A computational study of our multi-objective evolutionary approach and the performance of the credibility model are presented with a data set collected from the Spanish stock market.
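As a small hedged aside on the risk measure itself (ignoring the LR-fuzzy/credibility layer of the paper), the below-mean absolute semi-deviation of a portfolio can be computed from scenario returns as follows; the scenario matrix and weights are made up for illustration.

```python
# A crisp (non-fuzzy) sketch of the below-mean absolute semi-deviation used as
# the risk measure above, computed on scenario portfolio returns.  The
# LR-fuzzy/credibility machinery of the paper is not reproduced; the scenario
# matrix and weights are made-up illustrations.
import numpy as np

R = np.array([[ 0.02, -0.01,  0.03],   # scenario returns, one row per scenario
              [ 0.01,  0.04, -0.02],
              [-0.03,  0.02,  0.01],
              [ 0.05, -0.02,  0.00]])
w = np.array([0.5, 0.3, 0.2])           # hypothetical portfolio weights

port = R @ w                                             # portfolio return per scenario
semi_dev = np.mean(np.maximum(port.mean() - port, 0.0))  # average shortfall below the mean
print(f"mean return = {port.mean():.4f}, below-mean semi-deviation = {semi_dev:.4f}")
```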

68 citations


Posted Content
TL;DR: In this article, the DEA's best practice production frontier was estimated from the observed inputs/outputs of the sample banks, and the inputs and outputs of identified efficient banks defined the efficient frontier and enveloped those of inefficient banks.
Abstract: The contradiction in 1987 between identified efficient banks and their low profitability performance was due to the methodology. Because of the pervasiveness of LDC loans, the empirically-based DEA recognized this overriding emphasis as 'best practice.' What led to this, of course, was something akin to herd behavior - where one bank after another had entered the LDC loan market in a significant way. As discussed above, DEA’s best practice production frontier was estimated from the observed inputs/outputs of the sample banks. The inputs/outputs of identified efficient banks defined the efficient frontier and enveloped those of inefficient banks. That is, in 1987, DEA identified the LDC herd mentality as a positive contributor to bank efficiency and defined the efficient frontier based on the inputs/outputs of these problem-loan banks. As noted above in Table 4, this problem was clearly recognized by the low profitability performance of the 1987 efficient banks. These DEA-identified best practice banks were in fact financial 'bad practice' banks. Again, this contradiction was apparent from the huge LDC loan write offs. In general, a DEA problem of this type can be identified by measuring DMU performance to validate results and by knowledge of the managerial/institutional characteristics of the DMUs and the exogenous factors impacting them. In this way, DEA results can be adjusted as needed to provide the correct interpretation of DMU efficiency. By 1992, banking had normalized and DEA-identified best practice banks were also financial good practice banks. These results were consistent with the endogenous and exogenous aspects of banking at that time. Efficient banks had higher returns on all six profitability measures than inefficient banks, and all of their returns were positive. Also, their returns on foreign activities were higher than on domestic activities. On the other hand, inefficient banks had positive returns on five of six measures, excluding consolidated net income to total equity capital. Also, their returns on foreign activities were higher than on domestic activities. Thus, in 1992 both efficient and inefficient banks had positive returns on foreign activities, although they were higher for efficient banks. The fact that efficient banks had higher returns on all measures was, with one exception, a complete reversal from 1987. In conclusion, the prescription for the improved input/output efficiency of banks may be summarized as follows: Management should always focus on overall efficiency, but with particular attention to input variables - especially cash and real capital - and to foreign loans among the outputs. Further, when efficiency is in a state of flux, for better or worse, management should be alert to both foreign loan outputs and cash inputs.

68 citations


Journal ArticleDOI
TL;DR: In this paper, the authors examined the use of second-order stochastic dominance as both a technique for constructing portfolios and also as a way to measure performance and found that it is a preference-free technique that will suit any risk-averse investor, and does not require normally distributed returns.
Abstract: We examine the use of second-order stochastic dominance as both a technique for constructing portfolios and also as a way to measure performance. As a preference-free technique, second-order stochastic dominance will suit any risk-averse investor, and it does not require normally distributed returns. Using in-sample data, we construct portfolios such that their second-order stochastic dominance over a benchmark is most probable. The empirical results based on 21 years of daily data suggest that this portfolio choice technique significantly outperforms the benchmark out-of-sample. Moreover, its performance is typically better - and frequently much better - than several alternative portfolio-choice approaches using equal weights, mean-variance optimization, or minimum-variance methods.
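As a hedged sketch of the underlying dominance criterion (not the authors' portfolio-construction procedure), the snippet below checks empirical second-order stochastic dominance between two return samples via the equivalent lower-partial-moment condition; the return series are random placeholders.

```python
# A hedged sketch of the dominance criterion itself (not the authors'
# portfolio-construction procedure): sample X second-order stochastically
# dominates sample Y when E[(t - X)+] <= E[(t - Y)+] for every threshold t.
# The two return series below are random placeholders.
import numpy as np

rng = np.random.default_rng(0)
a = rng.normal(0.0006, 0.010, 1000)   # candidate portfolio daily returns (assumed)
b = rng.normal(0.0004, 0.012, 1000)   # benchmark daily returns (assumed)

def ssd_dominates(x, y, tol=1e-12):
    """True if sample x second-order stochastically dominates sample y."""
    grid = np.union1d(x, y)   # kink points of both lower-partial-moment curves
    lpm_x = np.array([np.mean(np.maximum(t - x, 0.0)) for t in grid])
    lpm_y = np.array([np.mean(np.maximum(t - y, 0.0)) for t in grid])
    return bool(np.all(lpm_x <= lpm_y + tol))

print("a SSD-dominates b:", ssd_dominates(a, b))
print("b SSD-dominates a:", ssd_dominates(b, a))
```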

66 citations


Journal ArticleDOI
Wenbin Liu, Zhongbao Zhou, Debin Liu, Helu Xiao
TL;DR: In this article, the classic DEA models are used to sample portfolios to take into account sufficient diversification, and thus can be used as an effective tool in computing the PE for their performance assessments.
Abstract: Traditional DEA models and nonlinear (diversification) DEA models are often used in the performance evaluation of portfolios. However, the diversification models are usually very complicated to compute, except for the most basic ones. Moreover, the classic DEA models still need to be further justified and tested, since it is not clear whether they are over-linearised relative to the diversification models. The existing studies on performance evaluation via the classic DEA models generally focus on the selection of inputs and outputs. In this work, we investigate the theoretical foundation of DEA models for portfolios from the perspective of portfolio sampling. We show that the classic DEA models provide an effective and practical way to approximate the portfolio efficiency (PE). We further verify this approach through different portfolio models with various frictions and bounds on the market. Through the comprehensive simulations carried out in this study, we show that, with adequate data sets, the classic DEA models can effectively sample portfolios so as to take sufficient diversification into account, and thus can be used as an effective tool in computing the PE for performance assessments. This study can be viewed as a justification of the classic DEA performance assessments and of the way to introduce other efficiency notions (such as allocation efficiency, scale efficiency, etc.) into the assessment of portfolios.

62 citations


Journal ArticleDOI
TL;DR: The conclusion can be interpreted as saying that the mean-variance problem for the AAI explains certain counter-intuitive investor behaviors, by which the attitude to risk exposure, for an AAI facing model uncertainty, depends on positive past experience.
Abstract: In this paper, an ambiguity-averse insurer (AAI), whose surplus process is approximated by a Brownian motion with drift, hopes to manage risk by both investing in a Black–Scholes financial market and transferring some risk to a reinsurer, but worries about uncertainty in model parameters. She chooses to find investment and reinsurance strategies that are robust with respect to this uncertainty, and to optimize her decisions in a mean-variance framework. By the stochastic dynamic programming approach, we derive closed-form expressions for a robust optimal benchmark strategy and its corresponding value function, in the sense of viscosity solutions, which allows us to find a mean-variance efficient strategy and the efficient frontier. Furthermore, economic implications are analyzed via numerical examples. In particular, our conclusion in the mean-variance framework differs qualitatively, for certain parameter ranges, from model-uncertainty robustness conclusions in the framework of utility functions: model un...

58 citations


01 Jan 2015
TL;DR: In this paper, the authors consider convex parametric multiobjective optimization problems under data uncertainty and characterize the location of the robust Pareto frontier with respect to the corresponding original Pareto frontier.
Abstract: Motivated by Markowitz portfolio optimization problems under uncertainty in the problem data, we consider general convex parametric multiobjective optimization problems under data uncertainty. This uncertainty is treated by a robust multiobjective formulation in the spirit of Ben-Tal and Nemirovski. For this novel formulation, we investigate its relationship to the original multiobjective formulation as well as to its scalarizations. Further, we provide a characterization of the location of the robust Pareto frontier with respect to the corresponding original Pareto frontier and show that standard techniques from multiobjective optimization can be employed to characterize this robust efficient frontier. We illustrate our results based on a standard mean–variance problem. If time allows, we consider the same problem from the point of view of set-valued optimization. Based on canonical ordering structures, this leads to different robust formulations, which will be discussed in detail. We show how these more general approaches can be tackled numerically based on a sequence of semi-infinite optimization problems.

58 citations


Journal ArticleDOI
TL;DR: In this article, a non-stochastic robust portfolio optimization approach is proposed for land-use portfolio selection in the context of agricultural crops, in contrast to the classical mean-variance approach, which requires information on the covariance of uncertain returns between all combinations of the economic options and also assumes that returns are normally distributed.

56 citations


Journal ArticleDOI
TL;DR: In this article, the authors find evidence to support the use of SunGard APT and Axioma multi-factor models for portfolio construction and risk control for global stocks during the period 1997-2011.

Journal ArticleDOI
TL;DR: A generalized equilibrium efficient frontier data envelopment analysis approach (GEEFDEA) is proposed, which improves and strengthens the EEFDEA approach and is applied to the data set of the 2012 London Olympic Games.

Journal ArticleDOI
TL;DR: This paper applies modern portfolio theory (MPT) to formulate an optimal stage investment of groundwater contamination remediation in China and finds that the efficient frontier of investment displays an upward-sloping shape in risk-return space.

Journal ArticleDOI
TL;DR: In this paper, the authors studied the optimization problem of DC pension plan under mean-variance criterion and derived the closed-forms of efficient frontier and strategies, which can be used to show the economic behavior of the efficient frontier.
Abstract: This paper studies the optimization problem of a DC pension plan under the mean–variance criterion. The financial market consists of cash, a bond and a stock. Similar to Guan and Liang (2014), we assume that the instantaneous interest rate is an affine process encompassing the Cox–Ingersoll–Ross (CIR) model and the Vasicek model. However, we assume that the expected return of the stock follows a completely different mean-reverting process, which can well display the bear and bull features of the market, and that the market price of the stock index follows an Ornstein–Uhlenbeck process. The pension manager thus has to undertake the risks of the interest rate and the market price of the stock index. Besides, a special stochastic contribution rate is formulated. The goal of the pension manager is to maximize the expected terminal value and minimize the variance of the terminal value. We use the technique developed by Guan and Liang (2014) to tackle this problem and derive closed forms of the efficient frontier and strategies. Numerical analysis is given at the end of this paper to show the economic behavior of the efficient frontier and strategies.
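As a hedged illustration (assumed notation with placeholder parameters, following the common Guan-and-Liang-style setup the abstract refers to), an affine short-rate process that nests both cited models can be written as follows.

```latex
% Illustrative affine short-rate dynamics with placeholder parameters
% a, b, sigma_1, sigma_2 (assumed notation, not necessarily the paper's):
\[
  \mathrm{d}r_t = (a - b\,r_t)\,\mathrm{d}t
    + \sqrt{\sigma_1 r_t + \sigma_2}\,\mathrm{d}W_t ,
\]
% where sigma_2 = 0 recovers the CIR model and sigma_1 = 0 recovers the
% Vasicek model.
```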

Journal ArticleDOI
TL;DR: In this article, the authors studied the time-consistent investment strategy for a defined contribution (DC) pension plan under the mean-variance criterion, where two background risks are taken into account: the inflation risk and the salary risk, and the effects of the inflation and stochastic income on the equilibrium strategy and the equilibrium efficient frontier are illustrated by mathematical and numerical analysis.
Abstract: This paper studies the time-consistent investment strategy for a defined contribution (DC) pension plan under the mean–variance criterion. Since the time horizon of a pension fund management problem is relatively long, two background risks are taken into account: the inflation risk and the salary risk. Meanwhile, there are a risk-free asset, a stock and an inflation-indexed bond available in the financial market. The extended Hamilton–Jacobi–Bellman (HJB for short) equation of the equilibrium value function and the verification theorem corresponding to our problem are presented. The closed-form time-consistent investment strategy and the equilibrium efficient frontier are obtained by stochastic control technique. The effects of the inflation and stochastic income on the equilibrium strategy and the equilibrium efficient frontier are illustrated by mathematical and numerical analysis. Finally, we compare in detail the time-consistent results in our paper with the pre-commitment one and find the distinct properties of these two results.

Journal ArticleDOI
TL;DR: In this paper, it was shown that if the market portfolio is replaced by the equal or entropy weighted portfolio among many others, no relative arbitrages can be constructed under the same conditions using functionally generated portfolios.
Abstract: In stochastic portfolio theory, a relative arbitrage is an equity portfolio which is guaranteed to outperform a benchmark portfolio over a finite horizon. When the market is diverse and sufficiently volatile, and the benchmark is the market or a buy-and-hold portfolio, functionally generated portfolios introduced by Fernholz provide a systematic way of constructing relative arbitrages. In this paper we show that if the market portfolio is replaced by the equal or entropy weighted portfolio among many others, no relative arbitrages can be constructed under the same conditions using functionally generated portfolios. We also introduce and study a shape-constrained optimization problem for functionally generated portfolios in the spirit of maximum likelihood estimation of a log-concave density.
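For concreteness, a minimal sketch of the entropy-weighted portfolio mentioned above (the portfolio functionally generated by the Shannon entropy of the market weights) is given below; the market weights are invented for illustration.

```python
# A minimal sketch of the entropy-weighted portfolio: the portfolio
# functionally generated by S(m) = -sum_i m_i log m_i assigns each stock a
# weight proportional to -m_i log m_i.  The market weights are invented.
import numpy as np

m = np.array([0.50, 0.25, 0.15, 0.10])            # market capitalization weights
entropy_terms = -m * np.log(m)
pi_entropy = entropy_terms / entropy_terms.sum()  # entropy-weighted portfolio
pi_equal = np.full_like(m, 1.0 / len(m))          # equal-weighted portfolio

print("market weights:  ", np.round(m, 4))
print("entropy-weighted:", np.round(pi_entropy, 4))
print("equal-weighted:  ", np.round(pi_equal, 4))
```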

Journal ArticleDOI
TL;DR: It is concluded that background risk can better reflect the investment risk of the real economic environment, which makes investors choose a portfolio more suitable for them.

Journal ArticleDOI
TL;DR: A mean-variance portfolio selection problem in a complete market with unbounded random coefficients is considered. Adapted processes are used to model the market coefficients, and it is assumed that only the interest rate is bounded, while the appreciation rate, volatility and market price of risk are unbounded.

Journal ArticleDOI
TL;DR: In this paper, an asset and liability management problem is considered in a continuous-time mean-variance framework, where the interest rate is driven by the Vasicek model and the liability process is governed by a Brownian motion with drift.

Journal ArticleDOI
TL;DR: In this paper, the authors examined the role of U.S. timberland assets in a mixed portfolio from the risk perspective, and found that large-cap and small-cap stocks are generally risk intensifiers, whereas treasury bonds, treasury bills, and timberland assets are risk diversifiers.

Book ChapterDOI
01 Jan 2015
TL;DR: This chapter discusses the relationship between this classification of facets and the distinction in Olesen and Petersen (Manage Sci 42:205–219, 1996) between non-full dimensional and full dimensional efficient facets, and reviews recent uses of efficient faces and facets in the literature.
Abstract: Data Envelopment Analysis (DEA) employs mathematical programming to measure the relative efficiency of Decision Making Units (DMUs). One of the topics of this chapter is the development of indicators to determine whether or not the specification of the input and output space is supported by data in the sense that the variation in data is sufficient for estimation of a frontier of the same dimension as the input output space. Insufficient variation in data implies that some inputs/outputs can be substituted along the efficient frontier but only in fixed proportions. Data thus locally support variation in a subspace of a lower dimension rather than in the input output space of full dimension. The proposed indicators are related to the existence of so-called Full Dimensional Efficient Facets (FDEFs). To characterize the facet structure of the CCR- or the BCC-estimators (Charnes et al. Eur J Oper Res 2:429–444, 1978; Banker et al. Manage Sci 30(9):1078–1092, 1984) of the efficient frontier, we derive a dual representation of the technologies. This dual representation is derived from polar cones. Relying on the characterization of efficient faces and facets in Steuer (Multiple criteria optimization. Theory, computation and application, 1986), we use the dual representation to define the FDEFs. We provide small examples where no FDEFs exist, both for the CCR- and the BCC-estimator. Thrall (Ann Oper Res 66:109–138, 1996) introduces a distinction between interior and exterior facets. In this chapter we discuss the relationship between this classification of facets and the distinction in Olesen and Petersen (Manage Sci 42:205–219, 1996) between non-full dimensional and full dimensional efficient facets. Procedures for identification of all interior and exterior facets are discussed and a specific small example using Qhull to generate all facets is presented. In Appendix B we present the details of the input to and the output from Qhull. It is shown that the existence of well-defined marginal rates of substitution along the estimated strongly efficient frontier segments requires the existence of FDEFs. A test for the existence of FDEFs is developed, and a technology called EXFA that relies only on FDEFs and the extension of these facets is proposed, both in the context of the CCR-model and the BCC-model. This technology is related to the Cone-Ratio DEA. The EXFA technology is used to define the EXFA efficiency index providing a lower bound on the efficiency rating of the DMU under evaluation. An upper bound on the efficiency rating is provided by a technology defined as the (non-convex) union of the input output sets generated from FDEFs only. Finally, we review recent uses of efficient faces and facets in the literature.

Journal ArticleDOI
TL;DR: In this article, the financial content of a portfolio selection model is discussed in order to justify that it can be integrated into a decision support system designed for investors interested in socially responsible investment but initially reluctant to pay a financial cost in exchange for increasing the social responsibility of their portfolios.

Journal ArticleDOI
TL;DR: In this paper, the authors focused on building investment portfolios using the Markowitz Portfolio Theory (MPT), with a derivation based on the Capital Asset Pricing Model (CAPM) used to calculate the weights of individual securities in portfolios.
Abstract: This paper is focused on building investment portfolios by using the Markowitz Portfolio Theory (MPT). A derivation based on the Capital Asset Pricing Model (CAPM) is used to calculate the weights of individual securities in portfolios. The calculated portfolios include a portfolio copying the benchmark, made using the CAPM model, portfolios with low and high beta coefficients, and a random portfolio. From all the asset classes, only stocks were selected for the examined sample. Stocks in each portfolio are put together according to predefined criteria. All stocks were selected from the Dow Jones Industrial Average (DJIA) index, which serves as the benchmark, too. Portfolios were compared based on their risk and return profiles. The results of this work will provide general recommendations on the optimal approach to choosing securities for an investor's portfolio.
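As a hedged, self-contained sketch of the CAPM step described above (with randomly generated placeholder return series rather than DJIA data), beta can be estimated against the benchmark and plugged into the CAPM expected-return formula like this.

```python
# A self-contained sketch of the CAPM step described above, with randomly
# generated placeholder return series rather than DJIA data: estimate beta
# against the benchmark by covariance, then apply the CAPM expected-return
# formula E[r_i] = r_f + beta_i * (E[r_m] - r_f).
import numpy as np

rng = np.random.default_rng(1)
r_mkt = rng.normal(0.0005, 0.010, 250)                 # benchmark daily returns (assumed)
r_stock = 1.2 * r_mkt + rng.normal(0.0, 0.005, 250)    # one stock's daily returns (assumed)
rf = 0.0001                                            # assumed daily risk-free rate

cov = np.cov(r_stock, r_mkt)
beta = cov[0, 1] / cov[1, 1]                           # OLS slope of stock on market
exp_ret = rf + beta * (r_mkt.mean() - rf)              # CAPM expected daily return
print(f"beta = {beta:.2f}, CAPM expected daily return = {exp_ret:.5f}")
```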

Journal ArticleDOI
TL;DR: In this article, the authors extend the use of Rao (1982b)'s quadratic entropy (RQE) to modern portfolio theory and argue that the RQE of a portfolio is a valid, flexible and unifying approach to measuring portfolio diversification.
Abstract: This paper extends the use of Rao (1982b)’s Quadratic Entropy (RQE) to modern portfolio theory. It argues that the RQE of a portfolio is a valid, flexible and unifying approach to measuring portfolio diversification. The paper demonstrates that portfolio’s RQE can encompass most existing measures, such as the portfolio variance, the diversification ratio, the normalized portfolio variance, the diversification return or excess growth rates, the Gini-Simpson indices, the return gaps, Markowitz’s utility function and Bouchaud’s general free utility. The paper also shows that assets selected under RQE can protect portfolios from mass destruction (systemic risk) and an empirical illustration suggests that this protection is substantial.
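To make the measure concrete, the sketch below computes a portfolio's RQE as the quadratic form w'Dw for a covariance-based dissimilarity matrix; the particular choice of D and all numbers are illustrative assumptions, not necessarily those of the paper.

```python
# A sketch of Rao's Quadratic Entropy of a portfolio, RQE(w) = w' D w, for a
# dissimilarity matrix D between assets.  The covariance-based choice
# D_ij = (sigma_i^2 + sigma_j^2)/2 - sigma_ij is one illustrative option (an
# assumption here); with it, RQE equals the weighted average asset variance
# minus the portfolio variance.  All numbers are made up.
import numpy as np

Sigma = np.array([[0.040, 0.006, 0.010],   # made-up covariance matrix
                  [0.006, 0.025, 0.004],
                  [0.010, 0.004, 0.030]])
w = np.array([0.40, 0.35, 0.25])           # hypothetical portfolio weights

var = np.diag(Sigma)
D = 0.5 * (var[:, None] + var[None, :]) - Sigma   # pairwise dissimilarities, D_ii = 0
rqe = w @ D @ w
print(f"RQE = {rqe:.5f} (check: {w @ var - w @ Sigma @ w:.5f})")
```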

Journal ArticleDOI
TL;DR: An original methodology named "simulation based DEA", which combines an agronomic whole-farm model and an efficient frontier method for in-depth exploration of the determinants of eco-efficiency and ecological intensification pathways, is proposed.
Abstract: Growing awareness of the multiple environmental impacts of livestock production has created a need to extend the definition of efficiency to a multidimensional eco-efficiency concept. Our paper proposes an original methodology, named "simulation based DEA" that combines an agronomic whole-farm model and an efficient frontier method for in-depth exploration of the determinants of eco-efficiency. GAMEDE, the whole-farm model we use, is randomly parameterized for key management practices and structural parameters to generate a large dataset of simulated dairy systems. The upper and lower bounds set for the stochastic choice of parameter values are a key point in the methodology and rely on expert knowledge derived from participatory modelling. For each simulation, numerous indicators describing the functioning and the performance of the production system are calculated and a set of inputs, outputs, and undesirable outputs, are then used to define the production technology in an efficiency frontier analysis. Data envelopment analysis, the efficiency frontier method applied, provides a multidimensional eco-efficiency score representing the increase in outputs that is possible with no increase in inputs or undesirable outputs. The eco-efficiency score can be linked to all the indicators of the production system calculated by the whole-farm model, after which it becomes possible to explore the managerial, structural, economic, agronomic, zootechnical, and environmental factors that explain different levels of eco-efficiency. In our case study, dairy farming in Reunion Island, livestock production is strongly constrained by land scarcity. Consequently the most eco-efficient farms appear to be intensive systems with high forage productivity to ensure feed self-sufficiency. While most studies on efficiency are restricted to a narrow dataset, the proposed methodology is innovative in that it makes it possible to cover a wide range of possible livestock farming systems in a given territory, including systems that do not exist at the present time, and to characterize them using multiple descriptive variables, at a limited cost in time and in the cost of surveys. Coupling whole-farm models and efficiency frontier analysis is a promising way to accurately identify the determinants of eco-efficiency and ecological intensification pathways. A whole-farm model is used to generate a dataset for efficient frontier analysis. Monitoring the production process enables eco-efficiency to be assessed. A whole-farm model is randomly launched thanks to expert knowledge on parameters. Robustness methods are implemented to control the influence of simulated outliers. Under land constraints, feed self-sufficiency is critical for dairy farm eco-efficiency.

Journal ArticleDOI
TL;DR: In this article, a portfolio selection problem is formulated as a bi-criteria optimization problem which maximizes the expected portfolio return and minimizes the maximum individual risk of the assets in the portfolio.
Abstract: In this paper, we introduce a new portfolio selection method. Our method is innovative and flexible. An explicit solution is obtained, and the selection method allows for investors with different degrees of risk aversion. The portfolio selection problem is formulated as a bi-criteria optimization problem which maximizes the expected portfolio return and minimizes the maximum individual risk of the assets in the portfolio. The efficient frontier obtained using our method is compared with various efficient frontiers in the literature and found to be superior to the others in the mean-variance space.
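One plausible reading of that bi-criteria formulation (assumed notation for illustration only; the paper's exact model may differ) is sketched below, where mu is the expected-return vector, sigma_i the individual risk of asset i, and w a fully invested long-only weight vector.

```latex
% One plausible reading (assumed notation, not necessarily the paper's exact
% model): maximize expected portfolio return while minimizing the largest
% individual risk contribution, over fully invested long-only weights.
\[
  \max_{w}\; \mu^{\top} w ,
  \qquad
  \min_{w}\; \max_{1 \le i \le n} w_i \sigma_i ,
  \qquad
  \text{s.t.}\;\; \mathbf{1}^{\top} w = 1,\; w \ge 0 .
\]
```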

Journal ArticleDOI
TL;DR: A computationally efficient incremental approach, Safe and Reachable Frontier Detection (SRFD), is introduced, which processes locally updated map data to generate only the safe and reachable frontier information.

Journal ArticleDOI
TL;DR: In this article, a data-oriented, non-parametric approach is proposed to evaluate the economic efficiency of a set of alternative models of an energy-consuming device, based on data envelopment analysis (DEA).
Abstract: The market for an energy-consuming device offers a range of models that will meet consumers’ needs for an energy service with different levels of energy efficiency. A more efficient model is likely to have greater up-front costs, but the increased efficiency will eventually translate into energy cost savings over the device’s lifespan. Cost-effectiveness indicators (namely, net benefit and benefit-cost ratio) can be used to assess whether a more efficient model can be a better alternative for consumers. However, whereas these indicators express to what extent the additional benefits outweigh the additional costs, they do not indicate how efficiently each model allocates capital and energy to provide the energy service. They, therefore, lack the economic efficiency dimension of the problem. This paper introduces a data-oriented, non-parametric approach to evaluate such efficiency for a set of alternative models of an energy-consuming device. It relies on data envelopment analysis (DEA) to calculate relative efficiency coefficients. The coefficients establish an input efficient frontier for the energy service provided and indicate the models that provide the energy service at the least cost. DEA is further extended to calculate the highest cost-effectiveness achievable and indicate the most cost-effective alternatives. The approach proves useful to support consumers’ decision-making when shopping for energy-consuming equipment, to guide manufacturers when benchmarking the models they produce, and to inform energy efficiency policy-making and program designing.

Journal ArticleDOI
TL;DR: This paper characterizes optimal solutions of the portfolio problem associated with quasiconvex risk measures, describes the shape of the efficient frontier in the mean-risk space, and investigates some particular cases.
Abstract: In this paper, we focus on the portfolio optimization problem associated with a quasiconvex risk measure (satisfying some additional assumptions). For coherent/convex risk measures, the portfolio optimization problem has been already studied in the literature. Following the approach of Ruszczynski and Shapiro [Ruszczynski A, Shapiro A (2006) Optimization of convex risk functions. Math. Oper. Res. 31(3):433–452.], but by means of quasiconvex analysis and notions of subdifferentiability, we characterize optimal solutions of the portfolio problem associated with quasiconvex risk measures. The shape of the efficient frontier in the mean-risk space and some particular cases are also investigated.

Journal ArticleDOI
TL;DR: In this paper, the authors argue that the extent to which a CVC investor and its corporate sponsor may learn from new ventures depends on the nature of its risk attitude and, more in general, on its portfolio diversification (low risk) or concentration (high risk) strategy.
Abstract: When seizing new investment opportunities, CVC investors face a tension between learning rewards and risks in the form of market and technological uncertainties. Based on an inductive qualitative study relying upon a unique, longitudinal dataset of 260 CVC deals carried out by the top CVC investors in the biopharmaceutical industry between 2003 and 2013, we argue that the extent to which a CVC investor (and its corporate sponsor) may learn from new ventures depends on the nature of its risk attitude and, more in general, on its portfolio diversification (low risk) or concentration (high risk) strategy. In so doing, we identify four typologies of CVC portfolio strategies that allow for growth and learning options available to the parent sponsor, showing that there is a curvilinear (U-shaped) relationship between learning propensity and portfolio diversification. We also develop a tool for determining a CVC opportunity set that may help a fund to optimally allocate capital based on its own risk-return preferences. Theory for CVC decision-making is advanced by furthering two propositions requiring future empirical validation.