scispace - formally typeset

Institution

University College London

Education · London, United Kingdom
About: University College London is an education organization based in London, United Kingdom. It is known for research contributions in the topics Population & Poison control. The organization has 81,105 authors who have published 210,603 publications receiving 9,868,552 citations. The organization is also known as: UCL & University College, London.


Papers
Report Series
Abstract: Estimation of the dynamic error components model is considered using two alternative linear estimators that are designed to improve the properties of the standard first-differenced GMM estimator. Both estimators require restrictions on the initial conditions process. Asymptotic efficiency comparisons and Monte Carlo simulations for the simple AR(1) model demonstrate the dramatic improvement in performance of the proposed estimators compared to the usual first-differenced GMM estimator, and compared to non-linear GMM. The importance of these results is illustrated in an application to the estimation of a labour demand model using company panel data.
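The baseline this abstract improves on can be illustrated with a minimal sketch: in a first-differenced AR(1) panel model, differencing removes the fixed effect, and the lagged level can instrument the differenced lag. The sketch below (NumPy; all parameter values are illustrative, not from the paper) implements the simplest such instrumental-variables estimator, the Anderson-Hsiao ratio that first-differenced GMM generalizes.

```python
import numpy as np

rng = np.random.default_rng(0)
N, T, alpha = 500, 6, 0.5  # panels, periods, true AR(1) coefficient

# Simulate y_it = alpha * y_i,t-1 + eta_i + eps_it with fixed effects eta_i.
eta = rng.normal(size=N)
y = np.zeros((N, T))
y[:, 0] = eta / (1 - alpha) + rng.normal(size=N)  # mean-stationary start
for t in range(1, T):
    y[:, t] = alpha * y[:, t - 1] + eta + rng.normal(size=N)

# First-difference to remove eta_i, then instrument the differenced lag
# dy_{t-1} with the level y_{t-2} (Anderson-Hsiao IV).
dy, dy_lag, z = [], [], []
for t in range(2, T):
    dy.append(y[:, t] - y[:, t - 1])
    dy_lag.append(y[:, t - 1] - y[:, t - 2])
    z.append(y[:, t - 2])
dy, dy_lag, z = map(np.concatenate, (dy, dy_lag, z))

alpha_iv = (z @ dy) / (z @ dy_lag)  # simple IV ratio estimator
print(f"true alpha = {alpha}, IV estimate = {alpha_iv:.3f}")
```

The estimators proposed in the paper (system GMM) add moment conditions derived from restrictions on the initial conditions, which is what delivers the efficiency gains the abstract reports.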

17,468 citations

Journal Article
TL;DR: A genetical mathematical model is described which allows for interactions between relatives on one another's fitness and a quantity is found which incorporates the maximizing property of Darwinian fitness, named “inclusive fitness”.
Abstract: A genetical mathematical model is described which allows for interactions between relatives on one another's fitness. Making use of Wright's Coefficient of Relationship as the measure of the proportion of replica genes in a relative, a quantity is found which incorporates the maximizing property of Darwinian fitness. This quantity is named “inclusive fitness”. Species following the model should tend to evolve behaviour such that each organism appears to be attempting to maximize its inclusive fitness. This implies a limited restraint on selfish competitive behaviour and possibility of limited self-sacrifices. Special cases of the model are used to show (a) that selection in the social situations newly covered tends to be slower than classical selection, (b) how in populations of rather non-dispersive organisms the model may apply to genes affecting dispersion, and (c) how it may apply approximately to competition between relatives, for example, within sibships. Some artificialities of the model are discussed.
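The "inclusive fitness" quantity described above is commonly summarized by Hamilton's rule: an altruistic act is favored by selection when r·b > c, where r is the coefficient of relatedness, b the fitness benefit to the recipient, and c the cost to the actor. A minimal sketch (the numeric values are illustrative, not from the paper):

```python
def altruism_favored(r, b, c):
    """Hamilton's rule: an altruistic act spreads when r*b > c,
    where r is relatedness, b the benefit to the recipient,
    and c the cost to the actor."""
    return r * b > c

# Full siblings share r = 1/2, so helping is favored only when the
# benefit to the sibling exceeds twice the cost to the actor.
print(altruism_favored(0.5, b=3.0, c=1.0))  # benefit 3 vs cost 1
print(altruism_favored(0.5, b=1.5, c=1.0))  # benefit 1.5 vs cost 1
```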

14,111 citations

Journal Article
22 Dec 2000, Science
TL;DR: Locally linear embedding (LLE) is introduced, an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs that learns the global structure of nonlinear manifolds.
Abstract: Many areas of science depend on exploratory data analysis and visualization. The need to analyze large amounts of multivariate data raises the fundamental problem of dimensionality reduction: how to discover compact representations of high-dimensional data. Here, we introduce locally linear embedding (LLE), an unsupervised learning algorithm that computes low-dimensional, neighborhood-preserving embeddings of high-dimensional inputs. Unlike clustering methods for local dimensionality reduction, LLE maps its inputs into a single global coordinate system of lower dimensionality, and its optimizations do not involve local minima. By exploiting the local symmetries of linear reconstructions, LLE is able to learn the global structure of nonlinear manifolds, such as those generated by images of faces or documents of text.
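The algorithm described above is available in scikit-learn. A minimal sketch on a synthetic Swiss-roll manifold (neighbor count and sample size are illustrative choices, not from the paper):

```python
from sklearn.datasets import make_swiss_roll
from sklearn.manifold import LocallyLinearEmbedding

# Sample points from a Swiss-roll manifold embedded in 3-D space.
X, _ = make_swiss_roll(n_samples=1000, random_state=0)

# LLE: reconstruct each point as a weighted combination of its k nearest
# neighbours, then find 2-D coordinates that preserve those weights.
lle = LocallyLinearEmbedding(n_neighbors=12, n_components=2, random_state=0)
Y = lle.fit_transform(X)

print(Y.shape)  # one 2-D embedding coordinate per input point
```

Because the reconstruction weights are found by solving linear least-squares problems and the embedding by an eigenvector computation, the optimization involves no local minima, as the abstract notes.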

13,822 citations

Proceedings Article
27 Jun 2016
Abstract: Convolutional networks are at the core of most state-of-the-art computer vision solutions for a wide variety of tasks. Since 2014, very deep convolutional networks started to become mainstream, yielding substantial gains in various benchmarks. Although increased model size and computational cost tend to translate to immediate quality gains for most tasks (as long as enough labeled data is provided for training), computational efficiency and low parameter count are still enabling factors for various use cases such as mobile vision and big-data scenarios. Here we explore ways to scale up networks that aim to utilize the added computation as efficiently as possible, by suitably factorized convolutions and aggressive regularization. We benchmark our methods on the ILSVRC 2012 classification challenge validation set and demonstrate substantial gains over the state of the art: 21.2% top-1 and 5.6% top-5 error for single-frame evaluation using a network with a computational cost of 5 billion multiply-adds per inference and fewer than 25 million parameters. With an ensemble of 4 models and multi-crop evaluation, we report 3.5% top-5 error and 17.3% top-1 error on the validation set and 3.6% top-5 error on the official test set.
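The parameter savings from the factorized convolutions mentioned above can be checked with simple counting: replacing a 5x5 convolution with two stacked 3x3 convolutions, or a 3x3 with a 1x3 followed by a 3x1, covers the same receptive field with fewer weights. A sketch (the channel count is illustrative, not from the paper):

```python
def conv_params(k_h, k_w, c_in, c_out):
    """Number of weights in a k_h x k_w convolution (biases ignored)."""
    return k_h * k_w * c_in * c_out

c = 256  # illustrative channel count

full_5x5 = conv_params(5, 5, c, c)                            # one 5x5 layer
two_3x3 = conv_params(3, 3, c, c) + conv_params(3, 3, c, c)   # stacked 3x3s
asym_3x3 = conv_params(1, 3, c, c) + conv_params(3, 1, c, c)  # 1x3 then 3x1

print(f"5x5:            {full_5x5:>9,} weights")
print(f"two 3x3:        {two_3x3:>9,} weights "
      f"({two_3x3 / full_5x5:.0%} of the 5x5)")
print(f"1x3 + 3x1:      {asym_3x3:>9,} weights "
      f"({asym_3x3 / conv_params(3, 3, c, c):.0%} of a single 3x3)")
```

The ratios are independent of the channel count: two 3x3s cost 18/25 of a 5x5, and the asymmetric pair costs 6/9 of a 3x3, which is where the efficiency headroom in the abstract comes from.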

12,684 citations

Journal Article
Claude Amsler, Michael Doser, Mario Antonelli, D. M. Asner, +173 more · Institutions (86)
TL;DR: This biennial Review summarizes much of particle physics, using data from previous editions.
Abstract: This biennial Review summarizes much of particle physics. Using data from previous editions, plus 2778 new measurements from 645 papers, we list, evaluate, and average measured properties of gauge bosons, leptons, quarks, mesons, and baryons. We also summarize searches for hypothetical particles such as Higgs bosons, heavy neutrinos, and supersymmetric particles. All the particle properties and search limits are listed in Summary Tables. We also give numerous tables, figures, formulae, and reviews of topics such as the Standard Model, particle detectors, probability, and statistics. Among the 108 reviews are many that are new or heavily revised, including those on the CKM quark-mixing matrix, V-ud & V-us, V-cb & V-ub, top quark, muon anomalous magnetic moment, extra dimensions, particle detectors, cosmic background radiation, dark matter, cosmological parameters, and big bang cosmology.

11,048 citations


Authors

Showing all 81,105 results

Name | H-index | Papers | Citations
Trevor W. Robbins | 231 | 1,137 | 164,437
George Davey Smith | 224 | 2,540 | 248,373
Karl J. Friston | 217 | 1,267 | 217,169
Robert J. Lefkowitz | 214 | 860 | 147,995
Cyrus Cooper | 204 | 1,869 | 206,782
David Miller | 203 | 2,573 | 204,840
Mark I. McCarthy | 200 | 1,028 | 187,898
André G. Uitterlinden | 199 | 1,229 | 156,747
Raymond J. Dolan | 196 | 919 | 138,540
Michael Marmot | 193 | 1,147 | 170,338
Nicholas G. Martin | 192 | 1,770 | 161,952
David R. Williams | 178 | 2,034 | 138,789
John Hardy | 177 | 1,178 | 171,694
James J. Heckman | 175 | 766 | 156,816
Kay-Tee Khaw | 174 | 1,389 | 138,782
Network Information
Related Institutions (5)
University of Cambridge

282.2K papers, 14.4M citations

97% related

University of Oxford

258.1K papers, 12.9M citations

97% related

University of Toronto

294.9K papers, 13.5M citations

95% related

Columbia University

224K papers, 12.8M citations

95% related

Harvard University

530.3K papers, 38.1M citations

95% related

Performance Metrics
No. of papers from the Institution in previous years
Year | Papers
2022 | 194
2021 | 15,383
2020 | 14,648
2019 | 12,642
2018 | 11,690
2017 | 11,550