
Showing papers on "Kelly criterion" published in 2014


Journal ArticleDOI
TL;DR: In this article, a multivariate version of the Kelly criterion, based only on the first and second moments of the asset excess returns, is presented, together with a simple numerical algorithm to manage virtually arbitrarily large portfolios according to so-called fractional Kelly strategies.
Abstract: The Kelly criterion is a money management principle that beats any other approach in many respects. In particular, it maximizes the expected growth rate and the median of the terminal wealth. However, until recently application of the Kelly criterion to multivariate portfolios has seen little analysis. We briefly introduce the Kelly criterion and then present its multivariate version based only on the first and the second moments of the asset excess returns. Additionally, we provide a simple numerical algorithm to manage virtually arbitrarily large portfolios according to so-called fractional Kelly strategies.

24 citations
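The multivariate rule described above has a compact closed form under the usual second-moment (continuous-time) approximation: with excess-return mean vector mu and covariance matrix Sigma, the growth-optimal weights are Sigma^{-1} mu, and a fractional Kelly strategy scales them by a constant lambda in (0, 1]. A minimal numpy sketch; the input numbers are illustrative assumptions, not from the paper:

```python
import numpy as np

# Illustrative inputs: annualized excess-return means and covariance
# for three assets (assumed values, not from the paper).
mu = np.array([0.05, 0.03, 0.07])
Sigma = np.array([[0.040, 0.006, 0.010],
                  [0.006, 0.090, 0.012],
                  [0.010, 0.012, 0.160]])

# Full-Kelly weights under the second-moment approximation: f* = Sigma^{-1} mu.
f_full = np.linalg.solve(Sigma, mu)

# Fractional Kelly: scale by lambda in (0, 1] to trade growth for safety.
lam = 0.5
print("full Kelly:", f_full.round(3))
print("half Kelly:", (lam * f_full).round(3))
```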


Proceedings Article
27 Jul 2014
TL;DR: It is shown that when traders are willing to risk only a small fraction of their wealth in any period, belief heterogeneity can persist indefinitely, whereas if bets are large in proportion to wealth then only the most accurate belief type survives; the market price is more accurate in the long run when traders with less accurate beliefs also survive.
Abstract: We investigate the limiting behavior of trader wealth and prices in a simple prediction market with a finite set of participants having heterogeneous beliefs. Traders bet repeatedly on the outcome of a binary event with fixed Bernoulli success probability. A class of strategies, including (fractional) Kelly betting and constant relative risk aversion (CRRA), is considered. We show that when traders are willing to risk only a small fraction of their wealth in any period, belief heterogeneity can persist indefinitely; if bets are large in proportion to wealth, then only the most accurate belief type survives. The market price is more accurate in the long run when traders with less accurate beliefs also survive. That is, the survival of traders with heterogeneous beliefs, some less accurate than others, allows the market price to better reflect the objective probability of the event in the long run.

19 citations
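A schematic simulation of this setting, not the authors' exact market mechanism: traders with fixed beliefs place (fractional) Kelly bets each round against a wealth-weighted market price; the price rule and all parameters are illustrative assumptions. With a small Kelly fraction c, wealth shares move slowly and heterogeneity persists for a long time; with c = 1, the most accurate trader tends to absorb the wealth.

```python
import numpy as np

rng = np.random.default_rng(0)
p_true = 0.6                                  # Bernoulli success probability
beliefs = np.array([0.50, 0.60, 0.75])        # heterogeneous trader beliefs
wealth = np.ones(3)
c = 0.1                                       # fractional Kelly; try c = 1.0

for t in range(5000):
    price = wealth @ beliefs / wealth.sum()   # wealth-weighted price (assumption)
    outcome = rng.random() < p_true
    for i, p in enumerate(beliefs):
        if p > price:                         # back YES at the market price
            f = c * (p - price) / (1 - price)             # Kelly stake fraction
            wealth[i] *= (1 + f * (1 - price) / price) if outcome else (1 - f)
        elif p < price:                       # back NO at price 1 - price
            f = c * (price - p) / price
            wealth[i] *= (1 - f) if outcome else (1 + f * price / (1 - price))

print("final wealth shares:", (wealth / wealth.sum()).round(3))
```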


Posted Content
01 Jan 2014
TL;DR: In this article, the authors survey how, over the last two decades, risk-sensitive control has evolved into an innovative and successful framework for solving dynamically a wide range of practical investment management problems.
Abstract: Over the last two decades, risk-sensitive control has evolved into an innovative and successful framework for solving dynamically a wide range of practical investment management problems.

15 citations


Posted Content
TL;DR: By using money management, an investor may determine the optimal leverage factor to apply on each trade in order to maximize the profitability of investing; the stopping of losses is suggested as the best strategy.
Abstract: By using money management, an investor may determine the optimal leverage factor to apply on each trade, for maximizing the profitability of investing. Research suggests that the stopping of losses ...

6 citations


Proceedings ArticleDOI
27 Apr 2014
TL;DR: An analogy between gambling and information engines is shown, together with an extension of gambling to the continuous-valued case, which can be useful for investments in the stock market using options.
Abstract: In information theory, mutual information is a known bound on the gain in the growth rate due to knowledge of side information on a gambling result; the betting strategy that reaches that bound is named the Kelly criterion. In physics, it was recently shown that mutual information is also a bound on the amount of work that can be extracted from a single heat bath using measurement-based control protocols; extraction that is done using “Information Engines”. However, to the best of our knowledge, no relation between these two fields has been presented before. In this paper, we briefly review the two fields and then show an analogy between gambling, where bits are converted to wealth, and information engines, where bits representing measurement results are converted to energy. This enables us to use well-known methods and results from one field to solve problems in the other. We present three such cases: maximal work extraction when the joint distribution of X and Y is unknown, work extraction when some energy is lost in each cycle, e.g., due to friction, and an analysis of systems with memory. In all three cases, the analogy enables us to use known results to reach new ones.

6 citations


Journal ArticleDOI
TL;DR: In this paper, the authors investigate the limiting behavior of trader wealth and prices in a simple prediction market with a finite set of participants having heterogeneous beliefs, and show that the market price is more accurate in the long run when traders with less accurate beliefs also survive.
Abstract: We investigate the limiting behavior of trader wealth and prices in a simple prediction market with a finite set of participants having heterogeneous beliefs. Traders bet repeatedly on the outcome of a binary event with fixed Bernoulli success probability. A class of strategies, including (fractional) Kelly betting and constant relative risk aversion (CRRA), is considered. We show that when traders are willing to risk only a small fraction of their wealth in any period, belief heterogeneity can persist indefinitely; if bets are large in proportion to wealth, then only the most accurate belief type survives. The market price is more accurate in the long run when traders with less accurate beliefs also survive. That is, the survival of traders with heterogeneous beliefs, some less accurate than others, allows the market price to better reflect the objective probability of the event in the long run.

5 citations


Posted Content
TL;DR: In this article, the authors provide a method to obtain maximum growth while staying, with high probability, above a predetermined ex-ante discrete-time smooth wealth path; shortfalls below the path are penalized with a convex function of the shortfall so as to force the investor to remain above the path.
Abstract: The optimal capital growth strategy, or Kelly strategy, has many desirable properties, such as maximizing the asymptotic long-run growth of capital. However, it has considerable short-run risk, since the utility is logarithmic, with essentially zero Arrow-Pratt risk aversion. Most investors favor a smooth wealth path with high growth. In this paper we provide a method to obtain the maximum growth while staying above a predetermined ex-ante discrete-time smooth wealth path with high probability; shortfalls below the path are penalized with a convex function of the shortfall so as to force the investor to remain above the wealth path. This results in a lower investment fraction than the Kelly strategy, with less risk and a lower but still maximal growth rate under the constraints. A mixture model with Markov transitions between several normally distributed market regimes is used for the dynamics of asset prices. The investment model allows the determination of the optimal constrained growth wagers at discrete points in time in an attempt to stay above the ex-ante path.

5 citations
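One way to formalize the objective the abstract describes (a sketch with illustrative notation, not necessarily the paper's exact formulation): choose investment fractions $x_t$ to maximize expected log terminal wealth while convexly penalizing shortfalls of wealth $W_t$ below the ex-ante path $b_t$,

$$\max_{\{x_t\}} \; \mathbb{E}\left[\log W_T \;-\; \sum_{t=1}^{T} c_t\,\varphi\big((b_t - W_t)^{+}\big)\right],$$

with $\varphi$ convex and increasing (e.g. $\varphi(u) = u^2$) and penalty weights $c_t > 0$ chosen large enough to keep the wealth above the path with the desired probability.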


Posted Content
TL;DR: This article proves that Pareto's theory of the circulation of elites results from the authors' wealth evolution model, the Kelly criterion for optimal betting, and Keynes' observation of "animal spirits" that drive the economy and make human financial decisions prone to excessive risk-taking.
Abstract: We prove that Pareto's theory of the circulation of elites results from our wealth evolution model, the Kelly criterion for optimal betting, and Keynes' observation of the "animal spirits" that drive the economy and make human financial decisions prone to excessive risk-taking.

5 citations


Patent
08 Sep 2014
TL;DR: In this paper, the expected inverse assets measure is proposed for determining optimal leverage: it optimizes the expected future inverse assets, conditioned on the assets having some estimated linear return distribution, and is shown to outperform the Kelly criterion under a simple cross-evaluation method, whereby the optimal leverage according to one method is measured using the other method's utility function.
Abstract: The question of how much should be placed at risk on a given investment, relative to the total assets available for investment, is basically that of determining the optimal leverage. The approach taken by the method described in this specification is to optimize the expected future inverse assets, conditioned on the assets having some estimated linear return distribution. The expected-inverse-assets measure is shown to outperform the Kelly criterion, an existing well-known method for calculating optimal leverage, using a simple cross-evaluation method, whereby the optimal leverage according to one method is measured using the other method's utility function. The expected-inverse-assets measure outperforms the Kelly criterion in the two analytic scenarios considered: a Gaussian distribution of log-returns and a Bernoulli distribution of linear returns. Example usage of the expected-inverse-assets utility, or objective function, is provided by the specification of a system for processing histograms that represent the forecast return distributions of investments. It is also shown how this system can be applied specifically to leveraging with market equities, leveraging with debt, leveraging in insurance, and leveraging in a retirement portfolio.

5 citations
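Minimizing expected inverse assets $\mathbb{E}[1/W]$ is equivalent to maximizing the CRRA utility $-1/W$. For the Bernoulli linear-return scenario mentioned in the abstract, both optimal leverages have closed forms; the sketch below is an illustrative reading of that comparison, not code from the patent.

```python
import numpy as np

# Bernoulli linear returns: +r with probability p, -r with probability q = 1 - p.
p, r = 0.55, 0.02
q = 1 - p

# Kelly (log utility): maximize p*log(1 + L*r) + q*log(1 - L*r) over leverage L.
L_kelly = (p - q) / r

# Expected inverse assets: minimize p/(1 + L*r) + q/(1 - L*r); the first-order
# condition gives L = (sqrt(p) - sqrt(q)) / (r * (sqrt(p) + sqrt(q))).
L_inv = (np.sqrt(p) - np.sqrt(q)) / (r * (np.sqrt(p) + np.sqrt(q)))

print(f"Kelly leverage:          {L_kelly:.2f}")   # 5.00
print(f"inverse-assets leverage: {L_inv:.2f}")     # ~2.51, more conservative
```

The inverse-assets leverage comes out near half Kelly here, consistent with $-1/W$ being a more risk-averse utility than $\log W$.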



Journal ArticleDOI
TL;DR: An analogy between gambling and information engines is demonstrated, from which follows an extension of gambling to the continuous-valued case, shown to be useful for investments in currency exchange rates or in the stock market using options.
Abstract: In information theory, one area of interest is gambling, where mutual information characterizes the maximal gain in wealth growth rate due to knowledge of side information; the betting strategy that achieves this maximum is named the Kelly strategy. In the field of physics, it was recently shown that mutual information can characterize the maximal amount of work that can be extracted from a single heat bath using measurement-based control protocols, i.e., using "information engines". However, to the best of our knowledge, no relation between gambling and information engines has been presented before. In this paper, we briefly review the two concepts and then demonstrate an analogy between gambling, where bits are converted into wealth, and information engines, where bits representing measurements are converted into energy. From this analogy follows an extension of gambling to the continuous-valued case, which is shown to be useful for investments in currency exchange rates or in the stock market using options. Moreover, the analogy enables us to use well-known methods and results from one field to solve problems in the other. We present three such cases: maximum work extraction when the probability distributions governing the system and measurements are unknown, work extraction when some energy is lost in each cycle, e.g., due to friction, and an analysis of systems with memory. In all three cases, the analogy enables us to use known results in order to obtain new ones.
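The central information-theoretic fact here is easy to verify numerically: at fair odds, Kelly (proportional) betting on $p(x|y)$ achieves a growth-rate gain over betting on the marginal $p(x)$ exactly equal to the mutual information $I(X;Y)$. A sketch with an arbitrary joint distribution; the numbers are illustrative:

```python
import numpy as np

# Arbitrary joint distribution of outcome X and side information Y.
pxy = np.array([[0.35, 0.15],
                [0.10, 0.40]])               # rows: x, columns: y
px, py = pxy.sum(axis=1), pxy.sum(axis=0)
px_given_y = pxy / py                        # conditionals p(x|y), column-wise

odds = 1 / px                                # fair odds on X

# Growth rate with bets b(x) = p(x): zero at fair odds.
W0 = np.sum(px * np.log(px * odds))

# Growth rate with side information, bets b(x|y) = p(x|y).
W1 = sum(pxy[x, y] * np.log(px_given_y[x, y] * odds[x])
         for x in range(2) for y in range(2))

# Mutual information I(X;Y).
I = sum(pxy[x, y] * np.log(pxy[x, y] / (px[x] * py[y]))
        for x in range(2) for y in range(2))

print(W1 - W0, I)                            # equal: growth-rate gain = I(X;Y)
```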

Proceedings ArticleDOI
27 Mar 2014
TL;DR: A meta-model is built which uses the Kelly criterion to determine an optimal allocation over a set of investment strategies, thus simultaneously capturing regimes operating in the data over different time horizons; in order to detect changes in the relevant data regime, and hence in the investment allocations, a forecasting algorithm based on a Kalman filter is used.
Abstract: In this paper we present a smart portfolio management methodology, which advances existing portfolio management techniques at two distinct levels. First, we develop a set of investment models that target regimes found in the data over different time horizons. We then build a meta-model which uses the Kelly criterion to determine an optimal allocation over these investment strategies, thus simultaneously capturing regimes operating in the data over different time horizons. Finally, in order to detect changes in the relevant data regime, and hence investment allocations, we use a forecasting algorithm which relies on a Kalman filter. We call our combined method, that uses both the Kelly criterion and the Kalman filter, the K2 algorithm. Using a large-scale historical dataset of both stocks and indices, we show that our K2 algorithm gives better risk adjusted returns in terms of the Sharpe ratio, better average gain to average loss ratio and higher probability of success compared to existing benchmarks, when measured in out-of-sample tests.
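A schematic of the filtering ingredient only, not the authors' K2 algorithm: a one-dimensional local-level Kalman filter tracks a strategy's drifting mean return, and the filtered estimate could then feed a Kelly-style allocation. The model and all parameters are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)

# Local-level model: hidden mean return mu_t follows a slow random walk;
# observed returns are mu_t plus noise.
q_var, r_var = 1e-8, 1e-4          # process / observation noise (assumed)
mu_true = 0.0005
mu_hat, P = 0.0, 1e-4              # filter state and its variance

for t in range(500):
    if t == 250:
        mu_true = -0.0005          # regime change halfway through
    r = mu_true + rng.normal(0.0, np.sqrt(r_var))
    P += q_var                     # predict
    K = P / (P + r_var)            # Kalman gain
    mu_hat += K * (r - mu_hat)     # update
    P *= (1.0 - K)

f = mu_hat / r_var                 # Kelly-style fraction from the estimate
print(f"filtered mean {mu_hat:.5f}, Kelly-style fraction {f:.2f}")
```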

Journal ArticleDOI
TL;DR: In this article, the author shows how to calculate the variance of the Kelly criterion ratio when the optimal fraction is estimated from empirical data.
Abstract: Investing according to the Kelly criterion will theoretically outperform any other sizing strategy. However, the value of the optimal fraction will generally need to be estimated from empirical data. This means that our estimate will invariably have a degree of uncertainty attached to it. In this note I show how to calculate the variance of the estimated Kelly criterion ratio.
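The uncertainty this note quantifies can be illustrated with a bootstrap: estimate $\hat f = \hat\mu/\hat\sigma^2$ (the continuous-approximation Kelly fraction) from a sample of returns and resample to gauge its sampling variability. A sketch under assumed Gaussian returns, not the note's own derivation:

```python
import numpy as np

rng = np.random.default_rng(42)
mu, sigma, n = 0.001, 0.02, 1000          # assumed daily drift, vol, sample size
returns = rng.normal(mu, sigma, n)

kelly = lambda x: x.mean() / x.var()      # f = mu / sigma^2 (continuous approx.)
f_hat = kelly(returns)

# Bootstrap the sampling distribution of the estimated Kelly fraction.
boot = np.array([kelly(rng.choice(returns, n)) for _ in range(2000)])
print(f"f_hat = {f_hat:.2f}, bootstrap std = {boot.std():.2f}")
# The true fraction is 2.5; even with 1000 observations the standard
# error is of the same order as the estimate itself.
```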

Journal ArticleDOI
TL;DR: The approach suggested here proposes that classification need not offer a conclusion on every instance within a data set: if an algorithm finds instances in which the attributes pertaining to a patient's disease offer little or no information, no classification should be offered.
Abstract: In binary classification, two-way confusion matrices, with corresponding measures such as sensitivity and specificity, have become so ubiquitous that those who review results may not realize there are other and more realistic ways to visualize data. This is particularly true when risk and reward considerations are important. The approach suggested here proposes that classification need not offer a conclusion on every instance within a data set. If an algorithm finds instances (e.g., patient cases in a medical data set) in which attributes pertaining to a patient's disease offer little or no information, there should be no classification offered. From the physician's perspective, disclosure of nil information should be welcome because it might prevent potentially harmful treatment. It follows from this that the developer of a classifier can provide summary results amenable to helping the consumer decide whether it is prudent to pass or act (commission versus omission). It is not always about balancing sensitivity and specificity in all cases, but about optimizing action on some cases. The explanation is centered on John Kelly's link of gambling with Shannon information theory. In addition, Graham's margin of safety, Bernoulli's utiles, and the Hippocratic Oath are important. An example problem is provided using a Netherlands Cancer Institute breast cancer data set. Recurrence score, a popular molecular-based assay for breast cancer prognosis, was found to have an uninformative zone. The uninformative subset had been grouped with positive results to garner higher sensitivity. Yet, because of a positive result, patients might be advised to undergo potentially harmful treatment in the absence of useful information.
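The "uninformative zone" idea can be sketched as a classifier that withholds a verdict whenever the predicted probability falls in a band where the attributes carry little information. Thresholds and function names below are illustrative, not from the paper.

```python
import numpy as np

def classify_with_abstention(probs, lo=0.35, hi=0.65):
    """Return +1 (act), -1 (pass), or 0 (no classification offered)
    when the predicted probability lies in the uninformative band."""
    probs = np.asarray(probs, dtype=float)
    out = np.zeros(len(probs), dtype=int)
    out[probs >= hi] = 1
    out[probs <= lo] = -1
    return out

print(classify_with_abstention([0.90, 0.50, 0.20, 0.55]))  # [ 1  0 -1  0]
```

Reporting the abstention rate alongside sensitivity and specificity on the classified subset then lets the consumer weigh commission against omission.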


Journal ArticleDOI
TL;DR: In this article, the authors show that the Kelly criterion provides an optimal leverage for balance sheet rebalancing that prevents wipe-out from excessive leverage; the resulting capital is potentially countercyclical and could also reinforce the central bank monetary policy transmission mechanism.
Abstract: The Basel capital is a “margin” requirement imposed by regulators to cushion banks against extreme falls in prices of assets held, and is often a function of value-at-risk (VaR). The way banks adjust their balance sheets to maintain the requirement is equivalent to leverage targeting that has been shown to cause procyclical risk. The 2008 crisis revealed that Basel 2 capital was insufficient to protect banks against crisis losses, but the industry believes the current Basel 3 requirements are too high for sustainable business. Is there an optimal capital? Balance sheet rebalancing with a target leverage can be described by a multiplicative game or process. Most players will lose money even if the game has a positive expectation, because excessive leverage causes the majority to get wiped out over time and the system achieves a “winner takes all” effect. Fortunately, the Kelly criterion provides an optimal leverage that prevents this. By using empirical data for balance sheet simulation, we show that the Kelly criterion gives an optimal capital vis-a-vis Basel 2, Basel 2.5 and expected shortfall. This capital approach provides the best survival strategy over the economic cycle. The article suggests how this can be computed in practice for an actual bank. The Kelly-based capital is also potentially countercyclical, which addresses procyclical risks and could reinforce the central bank monetary policy transmission mechanism.
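The "winner takes all" mechanism is straightforward to reproduce: a leverage-targeting balance sheet is a multiplicative process, and leverage above the Kelly level $\mu/\sigma^2$ destroys median equity even though the expected return stays positive. An illustrative simulation with assumed parameters, not the paper's empirical balance-sheet data:

```python
import numpy as np

rng = np.random.default_rng(7)
mu, sigma = 0.0004, 0.01                  # assumed daily asset return moments
T, n_banks = 2500, 1000                   # ten years, many simulated banks

def median_equity(leverage):
    r = rng.normal(mu, sigma, (n_banks, T))
    equity = np.prod(1 + leverage * r, axis=1)   # leverage-targeting rebalancing
    return np.median(equity)

L_kelly = mu / sigma**2                   # = 4.0 here
for L in (1.0, L_kelly, 3 * L_kelly):
    print(f"leverage {L:5.1f}: median terminal equity {median_equity(L):8.3f}")
# Moderate leverage compounds; three times the Kelly leverage drives the
# median bank's equity toward zero despite the positive expected return.
```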

Journal ArticleDOI
TL;DR: In this paper, a simple strategy is proposed for games where a player has zero edge; simulation results show that the strategy yields a high percentage of profitable outcomes, and that combining it with the Kelly criterion yields an even higher percentage in games where the player has an edge.
Abstract: This paper proposes a simple strategy for games where a player has zero edge. We present preliminary results from simulation runs that show that the strategy results in a high percentage of profitable outcomes. We also show that combining the strategy with the Kelly Criterion results in an even higher percentage of profitable outcomes in games where the player has an edge.
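The abstract does not spell out the zero-edge strategy itself, so no attempt is made to reproduce it here. As background for the second claim, the standard Kelly fraction for a binary game with net odds b and win probability p is f* = (bp - q)/b, and a quick simulation (parameters assumed) shows the high share of profitable sessions it produces when an edge exists:

```python
import numpy as np

rng = np.random.default_rng(3)
p, b = 0.55, 1.0                          # win probability, net odds (assumed)
q = 1 - p
f = (b * p - q) / b                       # Kelly fraction = 0.10 here

wins = rng.random((10_000, 200)) < p      # 10,000 sessions of 200 bets each
growth = np.where(wins, 1 + f * b, 1 - f)
final = np.prod(growth, axis=1)
print(f"profitable sessions: {np.mean(final > 1):.1%}")   # roughly 3 in 4
```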

Journal ArticleDOI
TL;DR: In this paper, a solution to Proebsting's paradox is provided, which appears to show that the investment rule known as the Kelly criterion can lead a decision maker to invest a higher fraction of his wealth the more unfavorable the odds he faces are and, as a consequence, risk an arbitrarily high proportion of his own wealth on the outcome of a single event.

Posted Content
TL;DR: In this article, the authors introduce the notion of robust forward criteria, which addresses the issues of ambiguity in model specification and in the specification of preferences and investment horizon, and which describes the evolution of time-consistent ambiguity-averse preferences.
Abstract: We combine forward investment performance processes and ambiguity averse portfolio selection. We introduce the notion of robust forward criteria which addresses the issues of ambiguity in model specification and in preferences and investment horizon specification. It describes the evolution of time-consistent ambiguity averse preferences. We first focus on establishing dual characterizations of the robust forward criteria. This offers various advantages as the dual problem amounts to a search for an infimum whereas the primal problem features a saddle-point. Our approach is based on ideas developed in Schied (2007) and Zitkovic (2009). We then study in detail non-volatile criteria. In particular, we solve explicitly the example of an investor who starts with a logarithmic utility and applies a quadratic penalty function. The investor builds a dynamical estimate of the market price of risk $\hat \lambda$ and updates her stochastic utility in accordance with the so-perceived elapsed market opportunities. We show that this leads to a time-consistent optimal investment policy given by a fractional Kelly strategy associated with $\hat \lambda$. The leverage is proportional to the investor's confidence in her estimate $\hat \lambda$.
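Schematically, and with illustrative notation rather than the paper's exact formulas, the resulting policy has the fractional Kelly form

$$\pi_t \;=\; c_t\,\frac{\hat\lambda_t}{\sigma_t}, \qquad 0 < c_t \le 1,$$

where $\hat\lambda_t$ is the dynamical estimate of the market price of risk, $\sigma_t$ the volatility, and the fraction $c_t$ increases with the investor's confidence in her estimate $\hat\lambda_t$.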

Posted Content
TL;DR: In this article, the authors determine the Kelly criterion for a game with a variable pay-off and show that the Kelly fraction satisfies a fundamental integral equation and is smaller than the classical Kelly fraction for the same game with the constant average pay-off.
Abstract: We determine the Kelly criterion for a game with a variable pay-off. The Kelly fraction satisfies a fundamental integral equation and is smaller than the classical Kelly fraction for the same game with the constant average pay-off.
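A natural reading of the "fundamental integral equation" (sketched here from first principles; the paper's exact form may differ): with a random pay-off $X$ per unit staked, the growth rate $G(f) = \mathbb{E}[\ln(1 + fX)]$ is maximized at the root of

$$G'(f) \;=\; \mathbb{E}\!\left[\frac{X}{1 + fX}\right] \;=\; \int \frac{x}{1 + fx}\, \mathrm{d}P(x) \;=\; 0,$$

which, for a two-point pay-off (win $b$ with probability $p$, lose the stake with probability $q$), reduces to the classical fraction $f^* = p - q/b$.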

Journal ArticleDOI
TL;DR: The Kelly strategy is attractive to practitioners because of its robust and optimal properties, e.g. that it dominates any other strategy in the long run and minimizes the time to reach a target as discussed by the authors.
Abstract: The Kelly strategy is attractive to practitioners because of its robust and optimal properties, e.g. that it dominates any other strategy in the long run and minimizes the time to reach a target, a...

Posted Content
TL;DR: In this article, the authors extend the diffusion model to jump-diffusion processes, where the driving Brownian motions are augmented by a class of Poisson random measures, introduced in Section 7.1 of the book.
Abstract: In Part I of this book, asset prices and factor processes were represented by diffusion processes, driven by correlated Brownian motions. In Part II we extend the theory — using as far as possible the same general approach — to jump-diffusion processes, where the driving Brownian motions are augmented by a class of Poisson random measures, which we introduce in Section 7.1 below. There are at least three good reasons for doing so:
1. The diffusion framework cannot accommodate credit-related assets such as CDS (credit default swaps). A CDS is a swap consisting of regular premium payments on one side and, on the other, a contingent payment should some reference entity trigger a default event. Obviously, the main modelling question is how to represent the default time and the size of the contingent payment. There are many ways to do this, but even if default is represented by, say, a barrier hitting time of some diffusion process, the fact remains that the CDS investor's portfolio will take a hit at the moment of default, so the portfolio value cannot be represented as a continuous process.
2. The asset price distributions implied by diffusion models can be unrealistic. Specifically, all the models presented so far have 'thin tails', i.e. the asset return distributions are similar to the Gaussian distribution in the tails. It is however well known that even quite standard financial data series such as stock indices have fatter tails than that: thin tails are the exception rather than the rule. From an asset allocation point of view, this may or may not be a significant factor. It may be that inaccurate representation of the tails is swamped by other deficiencies of the model, such as — notably — the inability to estimate growth rates over long time horizons. On the other hand, from a risk-management perspective tail behaviour is the only thing that matters when one is computing VaR or CVaR, so one could hardly be satisfied if this were grossly misrepresented.
3. The absence of explicit liquidity modelling in the diffusion model means that some risks certainly present in practice are ignored. To see the point, consider the simple Merton model of Chapter 1. The decision variable h(t) is the fraction of wealth invested in the risky asset, and this can be adjusted at will. This implies that it is always possible to avoid running into bankruptcy in this model. For example, suppose h(t) ≡ h*, a constant as in the Merton strategies; then the portfolio value is log-normal and the probability of hitting zero in finite time is zero. To see what is happening in detail, suppose the initial stock price is $100 and our investor is 10 times leveraged with initial capital $1,000, so he starts with $10,000 in stock, financed by borrowing $9,000 from the bank. If there is a crash and the stock price drops by 15% to $85, then the investor is wiped out: his stock is now worth $8,500 while he still owes $9,000 to the bank. If, however, he can trade sufficiently fast that he can rebalance the portfolio to maintain the 10-to-1 leverage ratio every time the stock price falls by $1, then he loses a lot of money but remains solvent. The evolution of his portfolio is shown in Figure 7.1, and its final net value is $181.77…
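The closing example can be checked directly. A back-of-envelope sketch, ignoring borrowing costs and rebalancing after each $1 fall, lands near the quoted final value; the small remaining gap is attributable to the exact rebalancing convention:

```python
# 10x leverage on $1,000 of capital: $10,000 of stock at $100, with the
# 10-to-1 leverage ratio reset after every $1 fall in the price.
equity, leverage = 1000.0, 10
for price in range(100, 85, -1):          # falls 100 -> 99, ..., 86 -> 85
    equity *= 1 + leverage * (-1.0 / price)
print(f"final equity: ${equity:.2f}")     # ~$180.7, near the quoted $181.77
```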