Institution
University of Luxembourg
Education • Luxembourg, Luxembourg
About: University of Luxembourg is an educational organization based in Luxembourg, Luxembourg. It is known for its research contributions in the topics of Context (language use) and Computer science. The organization has 4744 authors who have published 22175 publications receiving 381824 citations.
Papers
TL;DR: This paper examines the role of contexts, relationships and dispositions in young people's citizenship learning in everyday life.
Abstract: Understanding young people's citizenship learning in everyday life: The role of contexts, relationships and dispositions
212 citations
TL;DR: It is predicted that systems approaches will empower the transition from conventional reactive medical practice to a more proactive P4 medicine focused on wellness, will reverse the escalating costs of drug development, and will have enormous social and economic benefits.
Abstract: Personalized medicine is a term for a revolution in medicine that envisions the individual patient as the central focus of healthcare in the future. The term "personalized medicine", however, fails to reflect the enormous dimensionality of this new medicine that will be predictive, preventive, personalized, and participatory, a vision of medicine we have termed P4 medicine. This reflects a paradigm change in how medicine will be practiced that is revolutionary rather than evolutionary. P4 medicine arises from the confluence of a systems approach to medicine and from the digitalization of medicine that creates the large data sets necessary to deal with the complexities of disease. We predict that systems approaches will empower the transition from conventional reactive medical practice to a more proactive P4 medicine focused on wellness, will reverse the escalating costs of drug development, and will have enormous social and economic benefits. Our vision for P4 medicine in 10 years is that each patient will be associated with a virtual data cloud of billions of data points and that we will have the information technology for healthcare to reduce this enormous data dimensionality to simple hypotheses about health and/or disease for each individual. These data will be multi-scale across all levels of biological organization and extremely heterogeneous in type; this enormous amount of data represents a striking signal-to-noise (S/N) challenge. The key to dealing with this S/N challenge is to take a "holistic systems approach" to disease, as we will discuss in this article.
211 citations
Affiliations: Institute for Systems Biology, University of Lyon, University of Luxembourg, Wellcome Trust Sanger Institute, Semmelweis University, Linköping University, Pfizer, International Association of Classification Societies, Austrian Academy of Sciences, Medical University of Vienna, Max Planck Society, University of Florida, KPMG, Leiden University, CERN, Utrecht University, European Bioinformatics Institute, Saarland University, Forschungszentrum Jülich, Leiden University Medical Center, Imperial College London, Vienna University of Technology, Association of the British Pharmaceutical Industry, RWTH Aachen University, University of Sheffield, University of Leeds, King's College London, National and Kapodistrian University of Athens, Fraunhofer Society, Janssen Pharmaceutica, University of Manchester, Curie Institute, Philips, Katholieke Universiteit Leuven, University of Valencia, Swiss Institute of Bioinformatics, Polaris Industries
TL;DR: Clinicians, researchers, and citizens need improved methods, tools, and training to generate, analyze, and query data effectively and contribute to creating the European Single Market for health, which will improve health and healthcare for all Europeans.
Abstract: Medicine and healthcare are undergoing profound changes. Whole-genome sequencing and high-resolution imaging technologies are key drivers of this rapid and crucial transformation. Technological innovation combined with automation and miniaturization has triggered an explosion in data production that will soon reach exabyte proportions. How are we going to deal with this exponential increase in data production? The potential of “big data” for improving health is enormous but, at the same time, we face a wide range of challenges to overcome urgently. Europe is very proud of its cultural diversity; however, exploitation of the data made available through advances in genomic medicine, imaging, and a wide range of mobile health applications or connected devices is hampered by numerous historical, technical, legal, and political barriers. European health systems and databases are diverse and fragmented. There is a lack of harmonization of data formats, processing, analysis, and data transfer, which leads to incompatibilities and lost opportunities. Legal frameworks for data sharing are evolving. Clinicians, researchers, and citizens need improved methods, tools, and training to generate, analyze, and query data effectively. Addressing these barriers will contribute to creating the European Single Market for health, which will improve health and healthcare for all Europeans.
211 citations
TL;DR: This work presents fastcore, a generic algorithm for reconstructing context-specific metabolic network models from global genome-wide metabolic network models such as Recon X, and shows that a minimal consistent reconstruction can be defined via a set of sparse modes of the global network.
Abstract: Systemic approaches to the study of a biological cell or tissue rely increasingly on the use of context-specific metabolic network models. The reconstruction of such a model from high-throughput data can routinely involve large numbers of tests under different conditions and extensive parameter tuning, which calls for fast algorithms. We present fastcore, a generic algorithm for reconstructing context-specific metabolic network models from global genome-wide metabolic network models such as Recon X. fastcore takes as input a core set of reactions that are known to be active in the context of interest (e.g., cell or tissue), and it searches for a flux consistent subnetwork of the global network that contains all reactions from the core set and a minimal set of additional reactions. Our key observation is that a minimal consistent reconstruction can be defined via a set of sparse modes of the global network, and fastcore iteratively computes such a set via a series of linear programs. Experiments on liver data demonstrate speedups of several orders of magnitude, and significantly more compact reconstructions, over a rival method. Given its simplicity and its excellent performance, fastcore can form the backbone of many future metabolic network reconstruction algorithms.
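The basic test that such reconstruction algorithms build on can be illustrated with a small sketch (this is not the authors' implementation): a reaction is flux-consistent if the network admits a steady-state flux distribution (S·v = 0 within the flux bounds) that pushes a nonzero flux through it, which can be checked with one linear program per direction. The toy network and the `eps` threshold below are hypothetical.

```python
# Hedged sketch of a flux-consistency check via linear programming,
# the building block that FASTCORE-style algorithms use repeatedly.
import numpy as np
from scipy.optimize import linprog

def is_flux_consistent(S, lb, ub, rxn, eps=1e-4):
    """Check whether reaction `rxn` can carry flux of magnitude >= eps
    at steady state (S v = 0, lb <= v <= ub)."""
    n = S.shape[1]
    c = np.zeros(n)
    c[rxn] = -1.0  # maximize v[rxn] == minimize -v[rxn]
    bounds = list(zip(lb, ub))
    res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]),
                  bounds=bounds, method="highs")
    if res.success and res.x[rxn] >= eps:
        return True
    # For reversible reactions, also try to carry flux in reverse.
    res = linprog(-c, A_eq=S, b_eq=np.zeros(S.shape[0]),
                  bounds=bounds, method="highs")
    return bool(res.success and res.x[rxn] <= -eps)

# Toy network. Metabolites: A, B, C. Reactions:
# R0: uptake of A, R1: A -> B, R2: export of B, R3: B -> C (dead end,
# since C is never consumed, so R3 can carry no steady-state flux).
S = np.array([
    [1, -1,  0,  0],   # A
    [0,  1, -1, -1],   # B
    [0,  0,  0,  1],   # C
])
lb = [0, 0, 0, 0]
ub = [10, 10, 10, 10]

print([is_flux_consistent(S, lb, ub, r) for r in range(4)])
# → [True, True, True, False]
```

A FASTCORE-like algorithm goes further: rather than testing reactions one at a time, it uses sparse LP formulations to find a minimal flux-consistent subnetwork containing a given core set.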
211 citations
TL;DR: This article reviews machine learning methods for molecular simulation, with particular focus on (deep) neural networks for the prediction of quantum-mechanical energies and forces, coarse-grained molecular dynamics, the extraction of free energy surfaces and kinetics, and generative network approaches to sample molecular equilibrium structures and compute thermodynamics.
Abstract: Machine learning (ML) is transforming all areas of science. The complex and time-consuming calculations in molecular simulations are particularly suitable for a machine learning revolution and have already been profoundly impacted by the application of existing ML methods. Here we review recent ML methods for molecular simulation, with particular focus on (deep) neural networks for the prediction of quantum-mechanical energies and forces, coarse-grained molecular dynamics, the extraction of free energy surfaces and kinetics and generative network approaches to sample molecular equilibrium structures and compute thermodynamics. To explain these methods and illustrate open methodological problems, we review some important principles of molecular physics and describe how they can be incorporated into machine learning structures. Finally, we identify and describe a list of open challenges for the interface between ML and molecular simulation.
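One physical principle central to ML force fields, mentioned above as incorporating molecular physics into learning structures, is that predicted forces should be the exact negative gradient of a learned energy surface, which guarantees energy conservation. A minimal sketch, with a polynomial fit standing in for a neural network and purely synthetic data:

```python
# Hedged toy example: "learn" a 1-D energy surface and derive forces
# as the exact negative gradient of the model (not the review's code).
import numpy as np

rng = np.random.default_rng(0)
# Synthetic reference data: a Morse-like potential sampled with noise.
r = np.linspace(0.8, 3.0, 60)
E_ref = (1 - np.exp(-(r - 1.2)))**2 + 0.02 * rng.normal(size=r.size)

# Fit a surrogate energy model (polynomial fit stands in for an NN).
coeffs = np.polyfit(r, E_ref, deg=6)
energy = np.poly1d(coeffs)
force = -energy.deriv()  # force = -dE/dr, analytic gradient of the model

# Consistency check: the analytic force matches the numerical
# derivative of the energy model, so the force field is conservative.
x, h = 1.7, 1e-6
fd = (energy(x + h) - energy(x - h)) / (2 * h)
print(np.isclose(force(x), -fd, atol=1e-5))  # → True
```

In real ML potentials the same idea is implemented with automatic differentiation: the network outputs a scalar energy, and forces come from backpropagating through it rather than from a separately trained force head.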
210 citations
Authors
Showing all 4893 results
| Name | H-index | Papers | Citations |
| --- | --- | --- | --- |
| Jun Wang | 166 | 1093 | 141621 |
| Leroy Hood | 158 | 853 | 128452 |
| Andreas Heinz | 108 | 1078 | 45002 |
| Philippe Dubois | 101 | 1098 | 48086 |
| John W. Berry | 97 | 351 | 52470 |
| Michael Müller | 91 | 333 | 26237 |
| Bart Preneel | 82 | 844 | 25572 |
| Bjorn Ottersten | 81 | 1058 | 28359 |
| Sander Kersten | 79 | 246 | 23985 |
| Alexandre Tkatchenko | 77 | 271 | 26863 |
| Rudi Balling | 75 | 238 | 19529 |
| Lionel C. Briand | 75 | 380 | 24519 |
| Min Wang | 72 | 716 | 19197 |
| Stephen H. Friend | 70 | 184 | 53422 |
| Ekhard K. H. Salje | 70 | 581 | 19938 |