Institution

New York University

Education · New York, New York, United States
About: New York University is an education organization based in New York, New York, United States. It is known for research contributions in the topics of Population and Poison control. The organization has 72,380 authors who have published 165,545 publications, receiving 8,334,030 citations. The organization is also known as: NYU & University of the City of New York.


Papers
Book
11 Aug 2011
TL;DR: The authors describe an algorithm that reconstructs a close approximation of 1-D and 2-D signals from their multiscale edges and show that the evolution of wavelet local maxima across scales characterizes the local shape of irregular structures.
Abstract: A multiscale Canny edge detection is equivalent to finding the local maxima of a wavelet transform. The authors study the properties of multiscale edges through the wavelet theory. For pattern recognition, one often needs to discriminate different types of edges. They show that the evolution of wavelet local maxima across scales characterizes the local shape of irregular structures. Numerical descriptors of edge types are derived. The completeness of a multiscale edge representation is also studied. The authors describe an algorithm that reconstructs a close approximation of 1-D and 2-D signals from their multiscale edges. For images, the reconstruction errors are below visual sensitivity. As an application, a compact image coding algorithm that selects important edges and compresses the image data by factors over 30 has been implemented.
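The core idea the abstract describes — that Canny-style edges at each scale are the local maxima of the modulus of a wavelet transform, where the wavelet is the derivative of a Gaussian — can be sketched in a few lines. This is an illustrative 1-D sketch, not the authors' algorithm; the scales and threshold are arbitrary choices:

```python
# Illustrative sketch (not the paper's code): 1-D multiscale edge
# detection as local maxima of |W f| at each scale, where the wavelet
# transform W f is the derivative of a Gaussian-smoothed signal -- so
# its modulus maxima coincide with multiscale Canny edges.
import numpy as np

def gaussian_kernel(sigma):
    """Sampled Gaussian, truncated at 4 sigma, normalized to unit sum."""
    radius = int(4 * sigma)
    x = np.arange(-radius, radius + 1)
    g = np.exp(-x**2 / (2 * sigma**2))
    return g / g.sum()

def wavelet_modulus_maxima(signal, sigmas=(1.0, 2.0, 4.0)):
    """Return {sigma: indices of local maxima of |d/dx (G_sigma * f)|}."""
    maxima = {}
    for s in sigmas:
        smoothed = np.convolve(signal, gaussian_kernel(s), mode="same")
        w = np.gradient(smoothed)        # wavelet coefficients at scale s
        m = np.abs(w)
        # strict-left local maxima of the modulus, above a small threshold
        peaks = [i for i in range(1, len(m) - 1)
                 if m[i] > m[i - 1] and m[i] >= m[i + 1]
                 and m[i] > 0.1 * m.max()]
        maxima[s] = peaks
    return maxima

# A step edge at index 50: a modulus maximum should sit near 50 at every scale.
f = np.zeros(100)
f[50:] = 1.0
edges = wavelet_modulus_maxima(f)
```

Tracking how the maxima's amplitudes decay as sigma grows is what lets the paper characterize the local regularity of the underlying structure.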

3,187 citations

Journal ArticleDOI
TL;DR: In this article, the authors provide a model that links a security's market liquidity and traders' funding liquidity, i.e., their availability of funds, to explain the empirically documented features that market liquidity (i) can suddenly dry up (i.e., is fragile), (ii) has commonality across securities, (iii) is related to volatility, and (iv) experiences “flight to liquidity” events.
Abstract: We provide a model that links a security’s market liquidity — i.e., the ease of trading it — and traders’ funding liquidity — i.e., their availability of funds. Traders provide market liquidity and their ability to do so depends on their funding, that is, their capital and the margins charged by their financiers. In times of crisis, reductions in market liquidity and funding liquidity are mutually reinforcing, leading to a liquidity spiral. The model explains the empirically documented features that market liquidity (i) can suddenly dry up (i.e. is fragile), (ii) has commonality across securities, (iii) is related to volatility, (iv) experiences “flight to liquidity” events, and (v) comoves with the market. Finally, the model shows how the Fed can improve current market liquidity by committing to improve funding in a potential future crisis.
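The mutually reinforcing loop the abstract describes — price drop raises margins, higher margins force sales, sales depress the price further — can be illustrated with a toy iteration. This is a stylized numeric illustration, not the paper's model; all parameters (margin rule, price impact, capital) are invented:

```python
# Stylized illustration (not the paper's model) of a liquidity spiral:
# an initial price shock raises margins, forcing a levered trader to
# sell; the sale's price impact raises margins again, and so on.

def liquidity_spiral(price=100.0, position=20.0, capital=200.0,
                     base_margin=0.1, impact=0.5, shock=5.0, rounds=10):
    """Iterate margin calls after an initial price shock. Per-share
    margin grows with the cumulative drawdown, a crude stand-in for
    volatility-sensitive margins."""
    p0 = price
    price -= shock                                  # initial shock
    history = [price]
    for _ in range(rounds):
        drawdown = max(0.0, (p0 - price) / p0)
        margin = (base_margin + drawdown) * price   # per-share margin
        required = margin * position
        if required <= capital or position <= 0:
            break                                   # funding is adequate
        shortfall = required - capital
        sold = min(position, shortfall / margin)    # forced sale
        position -= sold
        price -= impact * sold                      # price impact of the sale
        history.append(price)
    return history

path = liquidity_spiral()                           # monotonically falling prices
```

With these numbers the price falls well below the initial 5-point shock before the spiral damps out, which is the amplification mechanism the paper formalizes.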

3,166 citations

Journal ArticleDOI
TL;DR: In this paper, the authors estimate diversification's effect on firm value by imputing stand-alone values for individual business segments and comparing the sum of these stand-alone values to the firm's actual value; they find that overinvestment and cross-subsidization contribute to the value loss.
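The imputation idea can be sketched roughly as follows: value each segment at an industry median value-to-sales multiple, sum the imputed segment values, and compare the log ratio of actual to imputed value ("excess value"). The segment data and multiples below are invented for illustration, not the paper's:

```python
# Rough sketch of the imputation approach (illustrative numbers only):
# a negative excess value indicates the firm trades below the sum of
# its imputed stand-alone segment values -- a diversification discount.
import math

def excess_value(segments, industry_multiples, actual_value):
    """segments: list of (industry, sales); industry_multiples maps
    industry -> median value-to-sales multiple of focused firms."""
    imputed = sum(sales * industry_multiples[ind] for ind, sales in segments)
    return math.log(actual_value / imputed)

# Hypothetical two-segment firm trading below its imputed value:
multiples = {"chemicals": 1.2, "retail": 0.8}       # assumed industry medians
firm = [("chemicals", 500.0), ("retail", 300.0)]    # segment sales
ev = excess_value(firm, multiples, actual_value=700.0)
```

Here the imputed value is 500 × 1.2 + 300 × 0.8 = 840, so ev = ln(700/840) is negative, i.e. a discount in this made-up example.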

3,150 citations

Proceedings ArticleDOI
01 Jun 2018
TL;DR: The Multi-Genre Natural Language Inference corpus is introduced, a dataset designed for use in the development and evaluation of machine learning models for sentence understanding and shows that it represents a substantially more difficult task than does the Stanford NLI corpus.
Abstract: This paper introduces the Multi-Genre Natural Language Inference (MultiNLI) corpus, a dataset designed for use in the development and evaluation of machine learning models for sentence understanding. At 433k examples, this resource is one of the largest corpora available for natural language inference (a.k.a. recognizing textual entailment), improving upon available resources in both its coverage and difficulty. MultiNLI accomplishes this by offering data from ten distinct genres of written and spoken English, making it possible to evaluate systems on nearly the full complexity of the language, while supplying an explicit setting for evaluating cross-genre domain adaptation. In addition, an evaluation using existing machine learning models designed for the Stanford NLI corpus shows that it represents a substantially more difficult task than does that corpus, despite the two showing similar levels of inter-annotator agreement.
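The task format the corpus uses pairs a premise with a hypothesis under a three-way label (entailment, neutral, contradiction). The sketch below illustrates that format with invented examples (not drawn from the corpus) and a deliberately weak lexical-overlap heuristic, the kind of shallow baseline such corpora are designed to defeat:

```python
# Illustrative NLI items in the MultiNLI-style format; the examples
# are invented, not taken from the corpus.
examples = [
    {"premise": "A man is playing a guitar on stage.",
     "hypothesis": "A musician is performing.",
     "label": "entailment"},
    {"premise": "A man is playing a guitar on stage.",
     "hypothesis": "The stage is empty.",
     "label": "contradiction"},
    {"premise": "A man is playing a guitar on stage.",
     "hypothesis": "The concert is sold out.",
     "label": "neutral"},
]

def overlap_baseline(premise, hypothesis, threshold=0.5):
    """Weak lexical-overlap heuristic: predict 'entailment' when most
    hypothesis words appear in the premise, else 'neutral'. Real NLI
    models must do far better than word matching."""
    p = set(premise.lower().rstrip(".").split())
    h = set(hypothesis.lower().rstrip(".").split())
    return "entailment" if len(p & h) / len(h) >= threshold else "neutral"

preds = [overlap_baseline(ex["premise"], ex["hypothesis"]) for ex in examples]
```

The heuristic mislabels the contradiction item ("The stage is empty." shares "stage" and "is" with the premise), which is exactly the failure mode that makes lexically diverse, multi-genre evaluation data valuable.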

3,148 citations

Journal ArticleDOI
TL;DR: This paper reviews some of the recent developments in upstream difference schemes through a unified representation, in order to enable comparison between the various schemes.
Abstract: This paper reviews some of the recent developments in upstream difference schemes through a unified representation, in order to enable comparison between the various schemes. Special attention is given to the Godunov-type schemes that result from using an approximate solution of the Riemann problem. For schemes based on flux splitting, the approximate Riemann solution can be interpreted as a solution of the collisionless Boltzmann equation.
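For the simplest Godunov-type scheme the review covers, the exact Riemann solution is available in closed form: for linear advection with positive wave speed, the solution at each cell face is just the upwind (left) state, so the Godunov flux reduces to first-order upwind differencing. A minimal sketch, with grid size and CFL number chosen arbitrarily:

```python
# Illustrative sketch: Godunov's scheme for linear advection
# u_t + a u_x = 0 with a > 0 on a periodic grid. The exact Riemann
# problem at each face selects the upwind state, so the Godunov flux
# is simply a * u_left (first-order upwind).
import numpy as np

def upwind_step(u, a, dt, dx):
    """One conservative Godunov/upwind update (periodic, a > 0)."""
    flux = a * u                            # Riemann solution: left state
    return u - (dt / dx) * (flux - np.roll(flux, 1))

n, a = 200, 1.0
dx = 1.0 / n
dt = 0.5 * dx / a                           # CFL number 0.5
x = np.arange(n) * dx
u = np.where((x > 0.25) & (x < 0.5), 1.0, 0.0)   # square pulse
u0_sum = u.sum()
for _ in range(2 * n):                      # 2n steps of dt = dx/(2a): one period
    u = upwind_step(u, a, dt, dx)
```

Because the update is in conservation form and the scheme is monotone at this CFL number, the cell average sum is preserved exactly and no new extrema appear; the pulse is merely smeared by numerical diffusion, the classic trade-off that motivates the higher-order schemes the review compares.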

3,133 citations


Authors

Showing all 73237 results

Name                    H-index   Papers   Citations
Rob Knight                  201    1,061     253,207
Virginia M.-Y. Lee          194      993     148,820
Frank E. Speizer            193      636     135,891
Stephen V. Faraone          188    1,427     140,298
Eric R. Kandel              184      603     113,560
Andrei Shleifer             171      514     271,880
Eliezer Masliah             170      982     127,818
Roderick T. Bronson         169      679     107,702
Timothy A. Springer         167      669     122,421
Alvaro Pascual-Leone        165      969      98,251
Nora D. Volkow              165      958     107,463
Dennis R. Burton            164      683      90,959
Charles N. Serhan           158      728      84,810
Giacomo Bruno               158    1,687     124,368
Tomas Hökfelt               158    1,033      95,979
Network Information
Related Institutions (5)

University of Pennsylvania: 257.6K papers, 14.1M citations (98% related)
Columbia University: 224K papers, 12.8M citations (98% related)
Yale University: 220.6K papers, 12.8M citations (97% related)
Harvard University: 530.3K papers, 38.1M citations (97% related)
University of Washington: 305.5K papers, 17.7M citations (96% related)

Performance
Metrics
No. of papers from the Institution in previous years

Year    Papers
2023       245
2022     1,205
2021     8,761
2020     9,108
2019     8,417
2018     7,680