Institution

Cornell University

Education · Ithaca, New York, United States
About: Cornell University is an education organization based in Ithaca, New York, United States. It is known for its research contributions in the topics Population and Gene. The organization has 102,246 authors who have published 235,546 publications, receiving 12,283,673 citations. The organization is also known as Cornell and CUI.


Papers
Proceedings ArticleDOI
21 Jul 2017
TL;DR: This paper proposes DenseNet, which connects each layer to every other layer in a feed-forward fashion, alleviating the vanishing-gradient problem, strengthening feature propagation, encouraging feature reuse, and substantially reducing the number of parameters.
Abstract: Recent work has shown that convolutional networks can be substantially deeper, more accurate, and efficient to train if they contain shorter connections between layers close to the input and those close to the output. In this paper, we embrace this observation and introduce the Dense Convolutional Network (DenseNet), which connects each layer to every other layer in a feed-forward fashion. Whereas traditional convolutional networks with L layers have L connections—one between each layer and its subsequent layer—our network has L(L+1)/2 direct connections. For each layer, the feature-maps of all preceding layers are used as inputs, and its own feature-maps are used as inputs into all subsequent layers. DenseNets have several compelling advantages: they alleviate the vanishing-gradient problem, strengthen feature propagation, encourage feature reuse, and substantially reduce the number of parameters. We evaluate our proposed architecture on four highly competitive object recognition benchmark tasks (CIFAR-10, CIFAR-100, SVHN, and ImageNet). DenseNets obtain significant improvements over the state-of-the-art on most of them, whilst requiring less memory and computation to achieve high performance. Code and pre-trained models are available at https://github.com/liuzhuang13/DenseNet.
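To make the connectivity pattern concrete, here is a minimal PyTorch sketch of a dense block in the spirit of the abstract above: each layer takes the channel-wise concatenation of all preceding feature maps as input, and its output is appended for all subsequent layers. The layer count and growth rate are illustrative assumptions, not the paper's configuration; the authors' reference implementation is at the GitHub link above.

```python
# Minimal sketch of DenseNet-style dense connectivity in PyTorch.
# Hyperparameters (growth rate, number of layers) are illustrative only.
import torch
import torch.nn as nn

class DenseLayer(nn.Module):
    """BN -> ReLU -> 3x3 Conv, producing `growth_rate` new feature maps."""
    def __init__(self, in_channels: int, growth_rate: int):
        super().__init__()
        self.norm = nn.BatchNorm2d(in_channels)
        self.conv = nn.Conv2d(in_channels, growth_rate,
                              kernel_size=3, padding=1, bias=False)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.conv(torch.relu(self.norm(x)))

class DenseBlock(nn.Module):
    """Each layer receives the concatenated feature maps of all preceding layers."""
    def __init__(self, num_layers: int, in_channels: int, growth_rate: int):
        super().__init__()
        # Layer i sees the block input plus i earlier outputs of `growth_rate`
        # channels each, which is what yields the L(L+1)/2 direct connections.
        self.layers = nn.ModuleList(
            DenseLayer(in_channels + i * growth_rate, growth_rate)
            for i in range(num_layers)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            # Concatenate everything produced so far along the channel axis,
            # then append this layer's new feature maps.
            features.append(layer(torch.cat(features, dim=1)))
        return torch.cat(features, dim=1)

block = DenseBlock(num_layers=4, in_channels=16, growth_rate=12)
out = block(torch.randn(1, 16, 32, 32))
print(out.shape)  # torch.Size([1, 64, 32, 32]) -- 16 + 4 * 12 channels
```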

27,821 citations

Journal ArticleDOI
TL;DR: In addition to NDF, new and improved methods for total dietary fiber and nonstarch polysaccharides, including pectin and beta-glucans, are now available and are also of interest in rumen fermentation.

23,302 citations

Proceedings ArticleDOI
21 Jul 2017
TL;DR: This paper exploits the inherent multi-scale, pyramidal hierarchy of deep convolutional networks to construct feature pyramids with marginal extra cost and achieves state-of-the-art single-model results on the COCO detection benchmark without bells and whistles.
Abstract: Feature pyramids are a basic component in recognition systems for detecting objects at different scales. But pyramid representations have been avoided in recent object detectors that are based on deep convolutional networks, partially because they are slow to compute and memory intensive. In this paper, we exploit the inherent multi-scale, pyramidal hierarchy of deep convolutional networks to construct feature pyramids with marginal extra cost. A top-down architecture with lateral connections is developed for building high-level semantic feature maps at all scales. This architecture, called a Feature Pyramid Network (FPN), shows significant improvement as a generic feature extractor in several applications. Using a basic Faster R-CNN system, our method achieves state-of-the-art single-model results on the COCO detection benchmark without bells and whistles, surpassing all existing single-model entries including those from the COCO 2016 challenge winners. In addition, our method can run at 5 FPS on a GPU and thus is a practical and accurate solution to multi-scale object detection. Code will be made publicly available.
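The following is a minimal PyTorch sketch of the top-down pathway with lateral connections that the abstract describes. The backbone channel counts (256/512/1024) and the 256-channel output width are assumptions for illustration, not the exact Faster R-CNN configuration from the paper.

```python
# Minimal sketch of an FPN-style top-down pathway with lateral connections,
# assuming backbone feature maps C3, C4, C5 with illustrative channel counts.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SimpleFPN(nn.Module):
    def __init__(self, in_channels=(256, 512, 1024), out_channels=256):
        super().__init__()
        # 1x1 convs project each backbone level to a common channel width.
        self.lateral = nn.ModuleList(nn.Conv2d(c, out_channels, 1)
                                     for c in in_channels)
        # 3x3 convs smooth each merged map into an output feature map.
        self.output = nn.ModuleList(nn.Conv2d(out_channels, out_channels, 3, padding=1)
                                    for _ in in_channels)

    def forward(self, feats):
        # feats: backbone maps ordered fine-to-coarse, e.g. [C3, C4, C5].
        laterals = [lat(f) for lat, f in zip(self.lateral, feats)]
        # Top-down: upsample the coarser map and add it to the lateral below,
        # propagating high-level semantics to high-resolution levels.
        for i in range(len(laterals) - 2, -1, -1):
            laterals[i] = laterals[i] + F.interpolate(
                laterals[i + 1], size=laterals[i].shape[-2:], mode="nearest")
        return [conv(m) for conv, m in zip(self.output, laterals)]

fpn = SimpleFPN()
c3, c4, c5 = (torch.randn(1, 256, 64, 64),
              torch.randn(1, 512, 32, 32),
              torch.randn(1, 1024, 16, 16))
p3, p4, p5 = fpn([c3, c4, c5])
print(p3.shape, p4.shape, p5.shape)  # 256 channels at each level's resolution
```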

16,727 citations

Journal ArticleDOI
Abstract: Additional co-authors: TJ Heaton, AG Hogg, KA Hughen, KF Kaiser, B Kromer, SW Manning, RW Reimer, DA Richards, JR Southon, S Talamo, CSM Turney, J van der Plicht, CE Weyhenmeyer

13,605 citations

Journal ArticleDOI
TL;DR: The proposed algorithms are demonstrated to be highly effective at discovering community structure in both computer-generated and real-world network data, and can be used to shed light on the sometimes dauntingly complex structure of networked systems.
Abstract: We propose and study a set of algorithms for discovering community structure in networks: natural divisions of network nodes into densely connected subgroups. Our algorithms all share two definitive features: first, they involve iterative removal of edges from the network to split it into communities, the edges removed being identified using any one of a number of possible "betweenness" measures, and second, these measures are, crucially, recalculated after each removal. We also propose a measure for the strength of the community structure found by our algorithms, which gives us an objective metric for choosing the number of communities into which a network should be divided. We demonstrate that our algorithms are highly effective at discovering community structure in both computer-generated and real-world network data, and show how they can be used to shed light on the sometimes dauntingly complex structure of networked systems.
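As a sketch of the procedure described above, the following Python code (using networkx for the betweenness computation) repeatedly removes the edge with the highest shortest-path edge betweenness, recomputing betweenness after every removal, until the graph splits into a new component. The paper also studies other betweenness measures and proposes a modularity score for deciding how many communities to keep; this sketch covers only the edge-removal loop.

```python
# Sketch of the edge-betweenness community-splitting step, using networkx.
# Only the shortest-path betweenness variant is shown here.
import networkx as nx

def girvan_newman_step(G: nx.Graph) -> nx.Graph:
    """Remove highest-betweenness edges until a new component appears,
    recalculating betweenness after every removal (the crucial step)."""
    G = G.copy()
    start_components = nx.number_connected_components(G)
    while nx.number_connected_components(G) == start_components:
        betweenness = nx.edge_betweenness_centrality(G)
        edge = max(betweenness, key=betweenness.get)
        G.remove_edge(*edge)
    return G

G = nx.karate_club_graph()  # classic benchmark network
split = girvan_newman_step(G)
communities = list(nx.connected_components(split))
print(len(communities), [len(c) for c in communities])
```

Repeating this step on each resulting component yields a hierarchy of finer and finer divisions; the modularity measure from the paper is what tells you where in that hierarchy to stop.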

12,882 citations


Authors

Showing all 103,081 results

Name | H-index | Papers | Citations
Eric S. Lander | 301 | 826 | 525,976
David Miller | 203 | 2,573 | 204,840
Lewis C. Cantley | 196 | 748 | 169,037
Charles A. Dinarello | 190 | 1,058 | 139,668
Scott M. Grundy | 187 | 841 | 231,821
Paul G. Richardson | 183 | 1,533 | 155,912
Chris Sander | 178 | 713 | 233,287
David R. Williams | 178 | 2,034 | 138,789
David L. Kaplan | 177 | 1,944 | 146,082
Kari Alitalo | 174 | 817 | 114,231
Richard K. Wilson | 173 | 463 | 260,000
George F. Koob | 171 | 935 | 112,521
Avshalom Caspi | 170 | 524 | 113,583
Derek R. Lovley | 168 | 582 | 95,315
Stephen B. Baylin | 168 | 548 | 188,934
Network Information
Related Institutions (5)
University of Pennsylvania: 257.6K papers, 14.1M citations (96% related)
Stanford University: 320.3K papers, 21.8M citations (96% related)
University of Washington: 305.5K papers, 17.7M citations (96% related)
Columbia University: 224K papers, 12.8M citations (96% related)
Yale University: 220.6K papers, 12.8M citations (95% related)

Performance Metrics
No. of papers from the Institution in previous years
Year | Papers
2023 | 309
2022 | 1,363
2021 | 12,457
2020 | 12,139
2019 | 10,787
2018 | 9,905