scispace - formally typeset

New York University

Education · New York, New York, United States

About: New York University is an education organization based in New York, New York, United States. It is known for research contributions in the topics: Population & Poison control. The organization has 72,380 authors who have published 165,545 publications, receiving 8,334,030 citations. The organization is also known as NYU and the University of the City of New York.


Papers
Proceedings Article
08 Dec 2014
TL;DR: In this article, two deep network stacks are employed: one makes a coarse global prediction based on the entire image, and another refines this prediction locally. The method achieves state-of-the-art results on both NYU Depth and KITTI.
Abstract: Predicting depth is an essential component in understanding the 3D geometry of a scene. While for stereo images local correspondence suffices for estimation, finding depth relations from a single image is less straightforward, requiring integration of both global and local information from various cues. Moreover, the task is inherently ambiguous, with a large source of uncertainty coming from the overall scale. In this paper, we present a new method that addresses this task by employing two deep network stacks: one that makes a coarse global prediction based on the entire image, and another that refines this prediction locally. We also apply a scale-invariant error to help measure depth relations rather than scale. By leveraging the raw datasets as large sources of training data, our method achieves state-of-the-art results on both NYU Depth and KITTI, and matches detailed depth boundaries without the need for superpixelation.
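The scale-invariant error mentioned in the abstract measures depth relations rather than absolute scale: it is a mean squared error in log space with the average log-difference subtracted out, so any global scaling of the prediction cancels. A minimal NumPy sketch of that formula (the function and variable names here are my own, not from the paper):

```python
import numpy as np

def scale_invariant_error(pred, target):
    """Scale-invariant error in log space (per Eigen et al., 2014).

    D(y, y*) = (1/n) * sum(d_i^2) - (1/n^2) * (sum(d_i))^2,
    where d_i = log(pred_i) - log(target_i).
    The second term subtracts the mean log-ratio, so multiplying
    every predicted depth by a constant leaves the error unchanged.
    """
    d = np.log(pred) - np.log(target)
    n = d.size
    return (d ** 2).sum() / n - (d.sum() ** 2) / n ** 2
```

Scaling the prediction by any constant factor adds the same offset to every d_i, which the subtracted mean term removes, leaving the error unchanged.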

2,994 citations

Journal ArticleDOI
TL;DR: Protein kinases that phosphorylate the alpha subunit of eukaryotic initiation factor 2 (eIF2alpha) are activated in stressed cells and negatively regulate protein synthesis, resulting in the induction of the downstream gene CHOP (GADD153).

2,988 citations

Journal ArticleDOI
TL;DR: The use of kappa implicitly assumes that all disagreements are equally serious; when they are not, the investigator may instead specify the relative seriousness of each kind of disagreement and employ weighted kappa.
Abstract: or weighted kappa (Spitzer, Cohen, Fleiss and Endicott, 1967; Cohen, 1968a). Kappa is the proportion of agreement corrected for chance, and scaled to vary from -1 to +1 so that a negative value indicates poorer than chance agreement, zero indicates exactly chance agreement, and a positive value indicates better than chance agreement. A value of unity indicates perfect agreement. The use of kappa implicitly assumes that all disagreements are equally serious. When the investigator can specify the relative seriousness of each kind of disagreement, he may employ weighted kappa, the proportion of weighted agreement corrected for chance. For measuring the reliability of quantitative scales, the product-moment and intraclass correlation coefficients are widely
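The definition in the abstract, agreement corrected for chance and scaled so that -1 is worse than chance, 0 is exactly chance, and +1 is perfect agreement, can be sketched for the unweighted case as follows (a minimal illustration; the function name is my own):

```python
from collections import Counter

def cohens_kappa(ratings_a, ratings_b):
    """Unweighted Cohen's kappa: kappa = (p_o - p_e) / (1 - p_e)."""
    n = len(ratings_a)
    # Observed agreement: fraction of items both raters labeled identically.
    p_o = sum(a == b for a, b in zip(ratings_a, ratings_b)) / n
    # Chance agreement: sum over labels of the product of the two
    # raters' marginal label frequencies.
    freq_a = Counter(ratings_a)
    freq_b = Counter(ratings_b)
    p_e = sum(freq_a[k] * freq_b.get(k, 0) for k in freq_a) / n ** 2
    return (p_o - p_e) / (1 - p_e)
```

For example, with ratings [0, 0, 1, 1] and [0, 0, 1, 0], the raters agree on 3 of 4 items (p_o = 0.75) while chance alone predicts p_e = 0.5, giving kappa = 0.5. Weighted kappa generalizes this by assigning each cell of the agreement table a weight reflecting the seriousness of that disagreement.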

2,986 citations

Posted Content
TL;DR: In this article, the authors introduce a novel visualization technique that gives insight into the function of intermediate feature layers and the operation of the classifier, and perform an ablation study to discover the performance contribution from different model layers.
Abstract: Large Convolutional Network models have recently demonstrated impressive classification performance on the ImageNet benchmark. However there is no clear understanding of why they perform so well, or how they might be improved. In this paper we address both issues. We introduce a novel visualization technique that gives insight into the function of intermediate feature layers and the operation of the classifier. We also perform an ablation study to discover the performance contribution from different model layers. This enables us to find model architectures that outperform Krizhevsky et al. on the ImageNet classification benchmark. We show our ImageNet model generalizes well to other datasets: when the softmax classifier is retrained, it convincingly beats the current state-of-the-art results on Caltech-101 and Caltech-256 datasets.

2,982 citations

Journal ArticleDOI
TL;DR: The credit channel theory of monetary policy transmission holds that informational frictions in credit markets worsen during tight-money periods; the resulting increase in the external finance premium enhances the effects of monetary policy on the real economy.
Abstract: The 'credit channel' theory of monetary policy transmission holds that informational frictions in credit markets worsen during tight- money periods. The resulting increase in the external finance premium--the difference in cost between internal and external funds-- enhances the effects of monetary policy on the real economy. We document the responses of GDP and its components to monetary policy shocks and describe how the credit channel helps explain the facts. We discuss two main components of this mechanism, the balance-sheet channel and the bank lending channel. We argue that forecasting exercises using credit aggregates are not valid tests of this theory.

2,977 citations


Authors

Showing all 73,237 results

Name | H-index | Papers | Citations
Rob Knight | 201 | 1,061 | 253,207
Virginia M.-Y. Lee | 194 | 993 | 148,820
Frank E. Speizer | 193 | 636 | 135,891
Stephen V. Faraone | 188 | 1,427 | 140,298
Eric R. Kandel | 184 | 603 | 113,560
Andrei Shleifer | 171 | 514 | 271,880
Eliezer Masliah | 170 | 982 | 127,818
Roderick T. Bronson | 169 | 679 | 107,702
Timothy A. Springer | 167 | 669 | 122,421
Alvaro Pascual-Leone | 165 | 969 | 98,251
Nora D. Volkow | 165 | 958 | 107,463
Dennis R. Burton | 164 | 683 | 90,959
Charles N. Serhan | 158 | 728 | 84,810
Giacomo Bruno | 158 | 1,687 | 124,368
Tomas Hökfelt | 158 | 1,033 | 95,979

Network Information
Related Institutions (5)
University of Pennsylvania: 257.6K papers, 14.1M citations (98% related)
Columbia University: 224K papers, 12.8M citations (98% related)
Yale University: 220.6K papers, 12.8M citations (97% related)
Harvard University: 530.3K papers, 38.1M citations (97% related)
University of Washington: 305.5K papers, 17.7M citations (96% related)

Performance Metrics

No. of papers from the Institution in previous years

Year | Papers
2023 | 245
2022 | 1,205
2021 | 8,761
2020 | 9,108
2019 | 8,417
2018 | 7,680