Institution

University of Erlangen-Nuremberg

Education · Erlangen, Bayern, Germany

About: University of Erlangen-Nuremberg is an education organization based in Erlangen, Bayern, Germany. It is known for research contributions in the topics: Population & Immune system. The organization has 42,405 authors who have published 85,600 publications receiving 2,663,922 citations.


Papers
Journal ArticleDOI
TL;DR: An alternative implementation of random forests is proposed that provides unbiased variable selection in the individual classification trees, so that variable importance measures can be used reliably for variable selection even in situations where the potential predictor variables vary in their scale of measurement or their number of categories.
Abstract: Variable importance measures for random forests have been receiving increased attention as a means of variable selection in many classification tasks in bioinformatics and related scientific fields, for instance to select a subset of genetic markers relevant for the prediction of a certain disease. We show that random forest variable importance measures are a sensible means for variable selection in many applications, but are not reliable in situations where potential predictor variables vary in their scale of measurement or their number of categories. This is particularly important in genomics and computational biology, where predictors often include variables of different types, for example when predictors include both sequence data and continuous variables such as folding energy, or when amino acid sequence data show different numbers of categories. Simulation studies are presented illustrating that, when random forest variable importance measures are used with data of varying types, the results are misleading because suboptimal predictor variables may be artificially preferred in variable selection. The two mechanisms underlying this deficiency are biased variable selection in the individual classification trees used to build the random forest on the one hand, and effects induced by bootstrap sampling with replacement on the other. We propose to employ an alternative implementation of random forests that provides unbiased variable selection in the individual classification trees. When this method is applied using subsampling without replacement, the resulting variable importance measures can be used reliably for variable selection even in situations where the potential predictor variables vary in their scale of measurement or their number of categories.
The usage of both random forest algorithms and their variable importance measures in the R system for statistical computing is illustrated and documented thoroughly in an application re-analyzing data from a study on RNA editing. The suggested method can therefore be applied straightforwardly by scientists in bioinformatics research.
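The bias the abstract describes can be reproduced in a small simulation. The sketch below uses scikit-learn's standard Gini-based importance (the biased measure the paper criticizes), not the unbiased conditional-inference implementation the authors propose (available as `cforest` in the R party package); variable names and the 70% association strength are illustrative assumptions. Two predictors are pure noise, yet the one with 100 categories receives markedly more importance than the binary one.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(0)
n = 1000
# Binary response, independent of the "noise" predictors below.
y = rng.integers(0, 2, n)
# Informative binary predictor: agrees with y about 70% of the time.
x_informative = np.where(rng.random(n) < 0.7, y, 1 - y)
# Pure-noise predictors with different numbers of categories.
x_noise_2 = rng.integers(0, 2, n)      # 2 categories
x_noise_100 = rng.integers(0, 100, n)  # 100 categories
X = np.column_stack([x_informative, x_noise_2, x_noise_100])

rf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
imp = rf.feature_importances_  # Gini importance: biased toward many-category predictors
print(dict(zip(["informative", "noise_2cat", "noise_100cat"], imp.round(3))))
```

Despite both noise variables being unrelated to the response, the 100-category variable offers many more candidate split points, so it accumulates spurious impurity reductions, which is exactly the selection-bias mechanism the paper identifies.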

2,697 citations

Journal ArticleDOI
TL;DR: The Standardization and Terminology Committee (STC) of the International Society of Biomechanics proposes definitions of the joint coordinate system (JCS) for the ankle, hip, and spine, and suggests that adopting these standards will lead to better communication among researchers and clinicians.

2,650 citations

Journal ArticleDOI
TL;DR: The updated strategies for the diagnosis and exclusion of HFNEF are useful not only for individual patient management but also for patient recruitment in future clinical trials exploring therapies forHFNEF.
Abstract: Diastolic heart failure (DHF) currently accounts for more than 50% of all heart failure patients. DHF is also referred to as heart failure with normal left ventricular (LV) ejection fraction (HFNEF) to indicate that HFNEF could be a precursor of heart failure with reduced LVEF. Because of improved cardiac imaging and because of widespread clinical use of plasma levels of natriuretic peptides, diagnostic criteria for HFNEF needed to be updated. The diagnosis of HFNEF requires the following conditions to be satisfied: (i) signs or symptoms of heart failure; (ii) normal or mildly abnormal systolic LV function; (iii) evidence of diastolic LV dysfunction. Normal or mildly abnormal systolic LV function implies both an LVEF > 50% and an LV end-diastolic volume index (LVEDVI) < 97 mL/m². Diastolic LV dysfunction can be diagnosed invasively (LV end-diastolic pressure > 16 mmHg or mean pulmonary capillary wedge pressure > 12 mmHg) or non-invasively by tissue Doppler (TD) (E/E' > 15). If TD yields an E/E' ratio suggestive of diastolic LV dysfunction (15 > E/E' > 8), additional non-invasive investigations are required for diagnostic evidence of diastolic LV dysfunction. These can consist of blood flow Doppler of mitral valve or pulmonary veins, echo measures of LV mass index or left atrial volume index, electrocardiographic evidence of atrial fibrillation, or plasma levels of natriuretic peptides. If plasma levels of natriuretic peptides are elevated, diagnostic evidence of diastolic LV dysfunction also requires additional non-invasive investigations such as TD, blood flow Doppler of mitral valve or pulmonary veins, echo measures of LV mass index or left atrial volume index, or electrocardiographic evidence of atrial fibrillation. A similar strategy with focus on a high negative predictive value of successive investigations is proposed for the exclusion of HFNEF in patients with breathlessness and no signs of congestion.
The updated strategies for the diagnosis and exclusion of HFNEF are useful not only for individual patient management but also for patient recruitment in future clinical trials exploring therapies for HFNEF.
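The tissue Doppler branch of the diagnostic strategy is a simple threshold rule, which can be sketched as a small function. This is a hypothetical helper for illustration only, not a clinical tool; the function name, return strings, and the handling of boundary values are assumptions, while the numeric cut-offs (E/E' > 15 diagnostic, 15 > E/E' > 8 suggestive) come from the abstract.

```python
def hfnef_tissue_doppler_evidence(e_over_e_prime: float) -> str:
    """Classify a tissue Doppler E/E' ratio per the thresholds in the abstract.

    > 15       -> sufficient evidence of diastolic LV dysfunction
    8 to 15    -> suggestive; additional non-invasive tests required
    <= 8       -> no TD evidence of diastolic dysfunction
    (Boundary handling here is an illustrative choice.)
    """
    if e_over_e_prime > 15:
        return "diastolic dysfunction (diagnostic)"
    if e_over_e_prime > 8:
        return "suggestive; further non-invasive tests needed"
    return "no TD evidence"
```

In the intermediate band the abstract requires corroborating evidence (mitral or pulmonary vein Doppler, LV mass index, left atrial volume index, atrial fibrillation, or natriuretic peptides), which is why the middle branch returns a "further tests" result rather than a diagnosis.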

2,578 citations

Journal ArticleDOI
Frank Arute1, Kunal Arya1, Ryan Babbush1, Dave Bacon1, Joseph C. Bardin2, Joseph C. Bardin1, Rami Barends1, Rupak Biswas3, Sergio Boixo1, Fernando G. S. L. Brandão4, Fernando G. S. L. Brandão1, David A. Buell1, B. Burkett1, Yu Chen1, Zijun Chen1, Ben Chiaro5, Roberto Collins1, William Courtney1, Andrew Dunsworth1, Edward Farhi1, Brooks Foxen5, Brooks Foxen1, Austin G. Fowler1, Craig Gidney1, Marissa Giustina1, R. Graff1, Keith Guerin1, Steve Habegger1, Matthew P. Harrigan1, Michael J. Hartmann1, Michael J. Hartmann6, Alan Ho1, Markus R. Hoffmann1, Trent Huang1, Travis S. Humble7, Sergei V. Isakov1, Evan Jeffrey1, Zhang Jiang1, Dvir Kafri1, Kostyantyn Kechedzhi1, Julian Kelly1, Paul V. Klimov1, Sergey Knysh1, Alexander N. Korotkov8, Alexander N. Korotkov1, Fedor Kostritsa1, David Landhuis1, Mike Lindmark1, E. Lucero1, Dmitry I. Lyakh7, Salvatore Mandrà3, Jarrod R. McClean1, Matt McEwen5, Anthony Megrant1, Xiao Mi1, Kristel Michielsen9, Kristel Michielsen10, Masoud Mohseni1, Josh Mutus1, Ofer Naaman1, Matthew Neeley1, Charles Neill1, Murphy Yuezhen Niu1, Eric Ostby1, Andre Petukhov1, John Platt1, Chris Quintana1, Eleanor Rieffel3, Pedram Roushan1, Nicholas C. Rubin1, Daniel Sank1, Kevin J. Satzinger1, Vadim Smelyanskiy1, Kevin J. Sung1, Kevin J. Sung11, Matthew D. Trevithick1, Amit Vainsencher1, Benjamin Villalonga12, Benjamin Villalonga1, Theodore White1, Z. Jamie Yao1, Ping Yeh1, Adam Zalcman1, Hartmut Neven1, John M. Martinis5, John M. Martinis1 
24 Oct 2019-Nature
TL;DR: Quantum supremacy is demonstrated using a programmable superconducting processor known as Sycamore, taking approximately 200 seconds to sample one instance of a quantum circuit a million times, which would take a state-of-the-art supercomputer around ten thousand years to compute.
Abstract: The promise of quantum computers is that certain computational tasks might be executed exponentially faster on a quantum processor than on a classical processor [1]. A fundamental challenge is to build a high-fidelity processor capable of running quantum algorithms in an exponentially large computational space. Here we report the use of a processor with programmable superconducting qubits [2-7] to create quantum states on 53 qubits, corresponding to a computational state-space of dimension 2^53 (about 10^16). Measurements from repeated experiments sample the resulting probability distribution, which we verify using classical simulations. Our Sycamore processor takes about 200 seconds to sample one instance of a quantum circuit a million times; our benchmarks currently indicate that the equivalent task for a state-of-the-art classical supercomputer would take approximately 10,000 years. This dramatic increase in speed compared to all known classical algorithms is an experimental realization of quantum supremacy [8-14] for this specific computational task, heralding a much-anticipated computing paradigm.
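As a back-of-the-envelope illustration of why classical simulation becomes intractable (this is a toy brute-force statevector simulator, not the Sycamore experiment or its verification pipeline; circuit structure and parameters are invented), note that an n-qubit state has 2^n complex amplitudes, so 53 qubits already means roughly 9 × 10^15 numbers to store and update.

```python
import numpy as np

def random_circuit_sample(n_qubits: int, depth: int, shots: int, seed: int = 0):
    """Brute-force statevector simulation of a toy random circuit.

    Applies `depth` layers of random single-qubit unitaries, then samples
    bitstrings from the resulting probability distribution. Memory cost is
    2**n_qubits amplitudes, which is why this approach is hopeless at 53 qubits.
    """
    rng = np.random.default_rng(seed)
    dim = 2 ** n_qubits
    state = np.zeros(dim, dtype=complex)
    state[0] = 1.0  # start in |00...0>
    for _ in range(depth):
        for q in range(n_qubits):
            # Random 2x2 unitary via QR decomposition of a complex Gaussian matrix.
            a = rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))
            u, _ = np.linalg.qr(a)
            # Apply u to qubit q by reshaping the state vector around that axis.
            psi = state.reshape(2 ** q, 2, 2 ** (n_qubits - q - 1))
            state = np.einsum("ab,ibj->iaj", u, psi).reshape(dim)
    probs = np.abs(state) ** 2
    return rng.choice(dim, size=shots, p=probs / probs.sum())

print(f"53-qubit state-space dimension: 2**53 = {2**53:.3e}")
samples = random_circuit_sample(n_qubits=5, depth=4, shots=1000)
```

Five qubits sample instantly; each additional qubit doubles the memory and work, so the exponential wall, rather than any subtlety of the circuit, is what separates 5 qubits from 53.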

2,527 citations


Authors

Showing all 42,831 results

Name                     H-index   Papers   Citations
Hermann Brenner          151       1,765    145,655
Richard B. Devereux      144       962      116,403
Manfred Paulini          141       1,791    110,930
Daniel S. Berman         141       1,363    86,136
Peter Lang               140       1,136    98,592
Joseph Sodroski          138       542      77,070
Richard J. Johnson       137       880      72,201
Jun Lu                   135       1,526    99,767
Michael Schmitt          134       2,007    114,667
Jost B. Jonas            132       1,158    166,510
Andreas Mussgiller       127       1,059    73,778
Matthew J. Budoff        125       1,449    68,115
Stefan Funk              125       506      56,955
Markus F. Neurath        124       934      62,376
Jean-Marie Lehn          123       1,054    84,616
Network Information
Related Institutions (5)

Technische Universität München: 123.4K papers, 4M citations (96% related)
Ludwig Maximilian University of Munich: 161.5K papers, 5.7M citations (95% related)
Heidelberg University: 119.1K papers, 4.6M citations (94% related)
National University of Singapore: 165.4K papers, 5.4M citations (93% related)
University of Padua: 114.8K papers, 3.6M citations (93% related)
Performance Metrics
No. of papers from the institution in previous years

Year   Papers
2023   208
2022   660
2021   5,163
2020   4,911
2019   4,593
2018   4,374