Author

Uday Rajan

Other affiliations: Carnegie Mellon University
Bio: Uday Rajan is an academic researcher at the University of Michigan. He has contributed to research on the topics of market liquidity and orders (exchange), has an h-index of 25, and has co-authored 89 publications receiving 3,283 citations. His previous affiliations include Carnegie Mellon University.


Papers
Journal ArticleDOI
TL;DR: In this article, the authors use data on securitized subprime mortgages issued in the period 1997-2006 to show that a statistical default model estimated in a low-securitization period breaks down in a high-securitization period in a systematic manner: it underpredicts defaults among borrowers for whom soft information is more valuable.
Abstract: Statistical default models, widely used to assess default risk, are subject to a Lucas critique. We demonstrate this phenomenon using data on securitized subprime mortgages issued in the period 1997-2006. As the level of securitization increases, lenders have an incentive to originate loans that rate high based on characteristics that are reported to investors, even if other, unreported variables imply a lower borrower quality. Consistent with this behavior, we find that over time lenders set interest rates only on the basis of variables that are reported to investors, ignoring other credit-relevant information. The change in lender behavior alters the data-generating process by transforming the mapping from observables to loan defaults. To illustrate this effect, we show that a statistical default model estimated in a low-securitization period breaks down in a high-securitization period in a systematic manner: it underpredicts defaults among borrowers for whom soft information is more valuable. Regulations that rely on such models to assess default risk may therefore be undermined by the actions of market participants.

392 citations
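The breakdown described in the abstract can be sketched with a toy simulation. Everything below is invented for illustration (the default-probability function, the screening rule, the decile model); it is not the paper's estimation, but it shows the mechanism: a model fit on loans that were screened using soft information underpredicts defaults once that screening stops.

```python
import random

random.seed(0)

def default_prob(hard, soft):
    # True default probability depends on both the reported ("hard") signal
    # and the lender-only ("soft") signal. Invented functional form.
    return max(0.0, 0.3 - 0.2 * hard - 0.2 * soft)

def originate(n, screen_on_soft):
    loans = []
    for _ in range(n):
        hard = random.random()   # reported to investors (e.g., a credit score)
        soft = random.random()   # observed only by the lender
        if screen_on_soft and soft < 0.5:
            continue             # low-securitization lender rejects weak soft signals
        loans.append((hard, random.random() < default_prob(hard, soft)))
    return loans

def bucket(hard):
    return min(9, int(hard * 10))

# "Estimate" a statistical default model in the low-securitization period:
# the empirical default rate per hard-information decile.
train = originate(50_000, screen_on_soft=True)
model = {}
for b in range(10):
    sample = [d for h, d in train if bucket(h) == b]
    model[b] = sum(sample) / len(sample)

# Apply the model in the high-securitization period, after soft screening stops.
test = originate(50_000, screen_on_soft=False)
predicted = sum(model[bucket(h)] for h, _ in test) / len(test)
realized = sum(d for _, d in test) / len(test)
print(f"predicted default rate {predicted:.3f}, realized {realized:.3f}")
```

The model systematically underpredicts in the second period because the hard signal carried the soft signal's screening effect in the training data, and that relationship no longer holds once originations change.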

Journal ArticleDOI
TL;DR: In this article, a dynamic limit order market is modeled as a stochastic sequential game, and an algorithm based on Pakes and McGuire (2001) is proposed to find a stationary Markov-perfect equilibrium.
Abstract: We model a dynamic limit order market as a stochastic sequential game. Since the model is analytically intractable, we provide an algorithm based on Pakes and McGuire (2001) to find a stationary Markov-perfect equilibrium. Given the stationary equilibrium, we generate artificial time series and perform comparative dynamics. We demonstrate that the order flow displays persistence. As we know the data generating process, we can compare transaction prices to the true value of the asset, as well as explicitly determine the welfare gains accruing to investors. Due to the endogeneity of order flow, the midpoint of the quoted prices is not a good proxy for the true value. Further, transaction costs paid by market order submitters are negative on average. The effective spread is negatively correlated with true transaction costs, and largely uncorrelated with changes in investor surplus. As a policy experiment, we consider the effect of a change in tick size, and find that it has a very small positive impact on investor surplus.

319 citations
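The paper's distinction between the conventional effective spread and the true transaction cost can be illustrated with hypothetical numbers (the prices below are made up; only the definitions follow standard microstructure usage):

```python
def effective_half_spread(price, midquote, direction):
    # Conventional measure: signed distance of the trade price from the
    # midquote (direction is +1 for buyer-initiated, -1 for seller-initiated).
    return direction * (price - midquote)

def true_cost(price, value, direction):
    # Cost relative to the fundamental value, which is observable in a
    # simulation with a known data-generating process but not in real data.
    return direction * (price - value)

# Hypothetical trade: a buyer pays 100.02 against a 100.00 midquote, but the
# fundamental value has already moved to 100.05 because order flow is informative.
p, m, v, q = 100.02, 100.00, 100.05, +1
print(round(effective_half_spread(p, m, q), 2))  # 0.02: looks like a cost
print(round(true_cost(p, v, q), 2))              # -0.03: actually a gain
```

Because the midquote lags the true value when order flow is endogenous, the two measures can disagree even in sign, which is why the effective spread can be negatively correlated with true transaction costs in the model.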

Journal ArticleDOI
TL;DR: In this paper, a dynamic limit order market is modeled as a stochastic sequential game with rational traders and an algorithm based on Pakes and McGuire (2001) is proposed to find a stationary Markov-perfect equilibrium.
Abstract: We model a dynamic limit order market as a stochastic sequential game with rational traders. Since the model is analytically intractable, we provide an algorithm based on Pakes and McGuire (2001) to find a stationary Markov-perfect equilibrium. We then generate artificial time series and perform comparative dynamics. Conditional on a transaction, the midpoint of the quoted prices is not a good proxy for the true value. Further, transaction costs paid by market order submitters are negative on average, and negatively correlated with the effective spread. Reducing the tick size is not Pareto improving but increases total investor surplus.

308 citations

Journal ArticleDOI
TL;DR: In this paper, the authors consider informed traders in a limit order market for a single asset, where traders randomly arrive at the market, after choosing whether to purchase information about the common value.
Abstract: We consider informed traders in a limit order market for a single asset. The asset has a common value; in addition, each trader has a private value for it. Traders randomly arrive at the market, after choosing whether to purchase information about the common value. They may either post prices or accept posted prices. If a trader's order has not executed, he randomly reenters the market, and may change his previous order. The model is thus a dynamic stochastic game with asymmetric information. We numerically solve for equilibrium in the model, and simulate market outcomes. Agents' incentives to acquire information and their subsequent equilibrium trading behavior change systematically with the underlying volatility of the asset. Agents with no intrinsic benefit from trade have the highest value for information and also tend to supply liquidity. However, these agents reduce their liquidity provision when the asset volatility is high. In equilibrium, the limit order market acts as a "volatility multiplier": prices are more volatile than the fundamental value of the asset. This effect increases when the fundamental volatility of the asset is higher or when there is asymmetric information across traders, due to a change in the composition of trader types that choose to provide liquidity. Further, changes in the microstructure noise are negatively correlated with changes in the estimated fundamental value, implying that asset betas estimated from high-frequency data will be incorrect.

216 citations
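The abstract's last claim, that betas estimated from high-frequency prices are distorted when changes in microstructure noise move against the fundamental, can be sketched with a toy calculation. The true beta of 1.0, the -0.4 noise loading, and all variances are invented for illustration, not taken from the paper:

```python
import random

random.seed(2)
n = 20_000
# Market factor returns and fundamental returns with a true market beta of 1.0.
market = [random.gauss(0, 1) for _ in range(n)]
fund = [m + random.gauss(0, 0.5) for m in market]
# Observed returns add a change in microstructure noise that is negatively
# correlated with the fundamental return (loading -0.4, invented).
noise_change = [-0.4 * f + random.gauss(0, 0.2) for f in fund]
obs = [f + dn for f, dn in zip(fund, noise_change)]

def beta(returns, factor):
    mr = sum(returns) / len(returns)
    mf = sum(factor) / len(factor)
    cov = sum((r - mr) * (f - mf) for r, f in zip(returns, factor)) / len(factor)
    var = sum((f - mf) ** 2 for f in factor) / len(factor)
    return cov / var

print(f"true beta {beta(fund, market):.2f}, "
      f"estimated from observed prices {beta(obs, market):.2f}")
```

The beta estimated from observed prices is attenuated toward roughly 0.6 here, because the negatively correlated noise cancels part of the fundamental comovement with the market.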

Journal ArticleDOI
TL;DR: In this paper, the authors show that over time lenders set interest rates only on the basis of variables that are reported to investors, ignoring other credit-relevant information, and that among borrowers with similar reported characteristics, the set that receives loans becomes worse over time along the unreported information dimension.

212 citations


Cited by
Journal ArticleDOI
TL;DR: This paper summarizes and explains the main events of the liquidity and credit crunch of 2007-08, starting with the trends leading up to the crisis and explaining how four different amplification mechanisms magnified losses in the mortgage market into large dislocations and turmoil in financial markets.
Abstract: This paper summarizes and explains the main events of the liquidity and credit crunch in 2007-08. Starting with the trends leading up to the crisis, I explain how these events unfolded and how four different amplification mechanisms magnified losses in the mortgage market into large dislocations and turmoil in financial markets.

3,033 citations

Journal ArticleDOI
TL;DR: The financial market turmoil in 2007 and 2008 led to the most severe financial crisis since the Great Depression and threatened to have large repercussions on the real economy. The bursting of the housing bubble forced banks to write down several hundred billion dollars in bad loans caused by mortgage delinquencies; at the same time, the stock market capitalization of the major banks declined by more than twice as much.
Abstract: The financial market turmoil in 2007 and 2008 has led to the most severe financial crisis since the Great Depression and threatens to have large repercussions on the real economy. The bursting of the housing bubble forced banks to write down several hundred billion dollars in bad loans caused by mortgage delinquencies. At the same time, the stock market capitalization of the major banks declined by more than twice as much. While the overall mortgage losses are large on an absolute scale, they are still relatively modest compared to the $8 trillion of US stock market wealth lost between October 2007, when the stock market reached an all-time high, and October 2008. This paper attempts to explain the economic mechanisms that caused losses in the mortgage market to amplify into such large dislocations and turmoil in the financial markets, and describes common economic threads that explain the plethora of market declines, liquidity dry-ups, defaults, and bailouts that occurred after the crisis broke in summer 2007. To understand these threads, it is useful to recall some key factors leading up to the housing bubble. The US economy was experiencing a low interest rate environment, both because of large capital inflows from abroad, especially from Asian countries, and because the Federal Reserve had adopted a lax interest rate policy. Asian countries bought US securities both to peg the exchange rates at an export-friendly level and to hedge against a depreciation of their own currencies against the dollar, a lesson learned from the Southeast Asian crisis of the late 1990s. The Federal Reserve Bank feared a deflationary period after the bursting of the Internet bubble and thus did not counteract the buildup of the housing bubble. At the same time, the banking system underwent an important transformation.

2,434 citations

Posted Content
TL;DR: Based on within-stock variation, algorithmic trading and liquidity are found to be positively related; quoted and effective spreads narrow and adverse selection declines under autoquote, indicating that algorithmic trading causally improves liquidity.
Abstract: Algorithmic trading has sharply increased over the past decade. Does it improve market quality, and should it be encouraged? We provide the first analysis of this question. The NYSE automated quote dissemination in 2003, and we use this change in market structure, which increases algorithmic trading, as an exogenous instrument to measure the causal effect of algorithmic trading on liquidity. For large stocks in particular, algorithmic trading narrows spreads, reduces adverse selection, and reduces trade-related price discovery. The findings indicate that algorithmic trading improves liquidity and enhances the informativeness of quotes.

1,190 citations
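The identification strategy in this abstract, using the autoquote rollout as an instrument for algorithmic trading, can be sketched with simulated data. All coefficients, the confounder, and the data-generating process below are invented for illustration; the point is only that the Wald/IV estimator recovers a causal slope that naive OLS misses when an unobserved factor drives both variables:

```python
import random

random.seed(1)
n = 10_000
zs, xs, ys = [], [], []
for _ in range(n):
    z = 1.0 if random.random() < 0.5 else 0.0  # instrument: autoquote in place?
    u = random.gauss(0, 1)                     # unobserved confounder (e.g., volatility)
    x = z + 0.8 * u + random.gauss(0, 1)       # algorithmic trading intensity
    y = -0.5 * x + 1.5 * u + random.gauss(0, 1)  # spread; true causal effect is -0.5
    zs.append(z); xs.append(x); ys.append(y)

def mean(v):
    return sum(v) / len(v)

def cov(a, b):
    ma, mb = mean(a), mean(b)
    return sum((p - ma) * (q - mb) for p, q in zip(a, b)) / len(a)

ols = cov(xs, ys) / cov(xs, xs)  # biased: x is correlated with the confounder u
iv = cov(zs, ys) / cov(zs, xs)   # Wald/IV estimator: uses only z-driven variation in x
print(f"OLS slope {ols:.2f}, IV slope {iv:.2f} (true effect -0.50)")
```

With these invented parameters the OLS slope is pushed toward zero (or above it) by the confounder, while the IV slope sits near the true -0.5, which is the logic behind instrumenting algorithmic trading with the exogenous market-structure change.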

Journal ArticleDOI
TL;DR: In this paper, the New York Stock Exchange's 2003 automation of quote dissemination is used as an exogenous instrument to measure the causal effect of algorithmic trading on liquidity. The results indicate that AT improves liquidity and enhances the informativeness of quotes.
Abstract: Algorithmic trading (AT) has increased sharply over the past decade. Does it improve market quality, and should it be encouraged? We provide the first analysis of this question. The New York Stock Exchange automated quote dissemination in 2003, and we use this change in market structure that increases AT as an exogenous instrument to measure the causal effect of AT on liquidity. For large stocks in particular, AT narrows spreads, reduces adverse selection, and reduces trade-related price discovery. The findings indicate that AT improves liquidity and enhances the informativeness of quotes. TECHNOLOGICAL CHANGE HAS REVOLUTIONIZED the way financial assets are traded. Every step of the trading process, from order entry to trading venue to back office, is now highly automated, dramatically reducing the costs incurred by intermediaries. By reducing the frictions and costs of trading, technology has the potential to enable more efficient risk sharing, facilitate hedging, improve liquidity, and make prices more efficient. This could ultimately reduce firms’ cost of capital. Algorithmic trading (AT) is a dramatic example of this far-reaching technological change. Many market participants now employ AT, commonly defined as the use of computer algorithms to automatically make certain trading decisions, submit orders, and manage those orders after submission. From a starting point near zero in the mid-1990s, AT is thought to be responsible for

1,002 citations