Institution
University of Milan
Education • Milan, Italy
About: University of Milan is an education organization based in Milan, Italy. It is known for its research contributions in the topics of Population & Transplantation. The organization has 58,413 authors who have published 139,784 publications receiving 4,636,354 citations. The organization is also known as Università degli Studi di Milano and Statale.
Papers published on a yearly basis
Papers
Affiliations: University of Amsterdam, Nemours Foundation, University of Western Australia, Pierre-and-Marie-Curie University, Columbia University, University of Pennsylvania, Oslo University Hospital, University of Palermo, French Institute of Health and Medical Research, University of Gothenburg, University of Milan, University of Western Ontario, University College London, University Medical Center Groningen, University of Copenhagen, University of California, Los Angeles, Ludwig Maximilian University of Munich, University of the Witwatersrand, Imperial College London, University of São Paulo, Radboud University Nijmegen, Charité, Helsinki University Central Hospital, University of Helsinki, Sahlgrenska University Hospital
TL;DR: This consensus paper aims to improve awareness of the need for early detection and management of children with FH by recommending cascade screening of families using a combined phenotypic and genotypic strategy.
Abstract: Familial hypercholesterolaemia (FH) is a common genetic cause of premature coronary heart disease (CHD). Globally, one baby is born with FH every minute. If diagnosed and treated early in childhood, individuals with FH can have normal life expectancy. This consensus paper aims to improve awareness of the need for early detection and management of children with FH. Familial hypercholesterolaemia is diagnosed either on phenotypic criteria, i.e. an elevated low-density lipoprotein cholesterol (LDL-C) level plus a family history of elevated LDL-C, premature coronary artery disease and/or genetic diagnosis, or positive genetic testing. Childhood is the optimal period for discrimination between FH and non-FH using LDL-C screening. An LDL-C ≥5 mmol/L (190 mg/dL), or an LDL-C ≥4 mmol/L (160 mg/dL) with family history of premature CHD and/or high baseline cholesterol in one parent, make the phenotypic diagnosis. If a parent has a genetic defect, the LDL-C cut-off for the child is ≥3.5 mmol/L (130 mg/dL). We recommend cascade screening of families using a combined phenotypic and genotypic strategy. In children, testing is recommended from age 5 years, or earlier if homozygous FH is suspected. A healthy lifestyle and statin treatment (from age 8 to 10 years) are the cornerstones of management of heterozygous FH. Target LDL-C is <3.5 mmol/L (130 mg/dL) if >10 years, or ideally 50% reduction from baseline if 8–10 years, especially with very high LDL-C, elevated lipoprotein(a), a family history of premature CHD or other cardiovascular risk factors, balanced against the long-term risk of treatment side effects. Identifying FH early and optimally lowering LDL-C over the lifespan reduces cumulative LDL-C burden and offers health and socioeconomic benefits. To drive policy change for timely detection and management, we call for further studies in the young.
Increased awareness, early identification, and optimal treatment from childhood are critical to adding decades of healthy life for children and adolescents with FH.
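The phenotypic cut-offs quoted in the abstract amount to a simple decision rule. The sketch below is illustrative only (the function name and argument names are ours, not the paper's); LDL-C is in mmol/L, and the rule mirrors the three thresholds stated above:

```python
def fh_phenotypic_diagnosis(ldl_c_mmol_l,
                            family_history_premature_chd=False,
                            parent_high_cholesterol=False,
                            parent_genetic_defect=False):
    """Return True if the childhood phenotypic criteria for FH are met.

    Thresholds follow the consensus cut-offs quoted in the abstract;
    this helper is a hypothetical sketch, not code from the paper.
    """
    if parent_genetic_defect:
        # Cut-off when a parent carries a known genetic defect
        return ldl_c_mmol_l >= 3.5
    if ldl_c_mmol_l >= 5.0:
        # ≥5 mmol/L (190 mg/dL) alone makes the diagnosis
        return True
    if ldl_c_mmol_l >= 4.0 and (family_history_premature_chd
                                or parent_high_cholesterol):
        # ≥4 mmol/L (160 mg/dL) plus family history / parental cholesterol
        return True
    return False
```

For example, an LDL-C of 4.2 mmol/L only triggers the diagnosis when combined with a family history of premature CHD or a high-cholesterol parent.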
581 citations
TL;DR: In this paper, the electron density distribution in transition metal carbonyl clusters is analyzed using the quantum theory of atoms in molecules, together with experimental determinations of the electron density in metal carbonyl clusters.
581 citations
TL;DR: TMS findings represent the first direct characterization of the coarse electrophysiological properties of three associative areas of the human cerebral cortex and indicate that, in healthy subjects, each corticothalamic module is normally tuned to oscillate at a characteristic rate.
Abstract: The frequency tuning of a system can be directly determined by perturbing it and by observing the rate of the ensuing oscillations, the so called natural frequency. This approach is used, for example, in physics, in geology, and also when one tunes a musical instrument. In the present study, we employ transcranial magnetic stimulation (TMS) to directly perturb a set of selected corticothalamic modules (Brodmann areas 19, 7, and 6) and high-density electroencephalogram to measure their natural frequency. TMS consistently evoked dominant alpha-band oscillations (8-12 Hz) in the occipital cortex, beta-band oscillations (13-20 Hz) in the parietal cortex, and fast beta/gamma-band oscillations (21-50 Hz) in the frontal cortex. Each cortical area tended to preserve its own natural frequency also when indirectly engaged by TMS through brain connections and when stimulated at different intensities, indicating that the observed oscillations reflect local physiological mechanisms. These findings were reproducible across individuals and represent the first direct characterization of the coarse electrophysiological properties of three associative areas of the human cerebral cortex. Most importantly, they indicate that, in healthy subjects, each corticothalamic module is normally tuned to oscillate at a characteristic rate. The natural frequency can be directly measured in virtually any area of the cerebral cortex and may represent a straightforward and flexible way to probe the state of human thalamocortical circuits at the patient's bedside.
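The "perturb and measure" idea above can be sketched numerically: estimate the dominant frequency of an evoked response from its spectral peak, then map it to the band/area pairing reported in the study. Everything below is a toy illustration on a synthetic damped oscillation, not the authors' analysis pipeline:

```python
import numpy as np

# Band/area pairings as reported in the abstract
BANDS = {"alpha (occipital)": (8, 12),
         "beta (parietal)": (13, 20),
         "beta/gamma (frontal)": (21, 50)}

def natural_frequency(signal, fs):
    """Dominant frequency (Hz) of a perturbation response via the FFT peak."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = freqs >= 2.0            # ignore DC and slow drift
    return freqs[mask][np.argmax(spectrum[mask])]

def classify_band(f_hz):
    for name, (lo, hi) in BANDS.items():
        if lo <= f_hz <= hi:
            return name
    return "outside the reported bands"

# Synthetic damped 10 Hz response, mimicking a TMS-evoked occipital oscillation
fs = 500.0
t = np.arange(0, 1.0, 1.0 / fs)
response = np.exp(-3 * t) * np.sin(2 * np.pi * 10 * t)
f = natural_frequency(response, fs)
```

With a 1 s epoch at 500 Hz the spectral resolution is 1 Hz, so the 10 Hz ringing lands in the alpha band, consistent with the occipital result above.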
580 citations
TL;DR: This paper proves tight data-dependent bounds for the risk of this hypothesis in terms of an easily computable statistic M_n associated with the on-line performance of the ensemble, and obtains risk tail bounds for kernel perceptron algorithms in terms of the spectrum of the empirical kernel matrix.
Abstract: In this paper, it is shown how to extract a hypothesis with small risk from the ensemble of hypotheses generated by an arbitrary on-line learning algorithm run on an independent and identically distributed (i.i.d.) sample of data. Using a simple large deviation argument, we prove tight data-dependent bounds for the risk of this hypothesis in terms of an easily computable statistic M_n associated with the on-line performance of the ensemble. Via sharp pointwise bounds on M_n, we then obtain risk tail bounds for kernel perceptron algorithms in terms of the spectrum of the empirical kernel matrix. These bounds reveal that the linear hypotheses found via our approach achieve optimal tradeoffs between hinge loss and margin size over the class of all linear functions, an issue that was left open by previous results. A distinctive feature of our approach is that the key tools for our analysis come from the model of prediction of individual sequences; i.e., a model making no probabilistic assumptions on the source generating the data. In fact, these tools turn out to be so powerful that we only need very elementary statistical facts to obtain our final risk bounds.
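A minimal sketch of the setup: run a linear perceptron online over an i.i.d. sample, keep the ensemble of intermediate hypotheses (here summarized by their average), and record the fraction of online mistakes as the easily computable statistic. We use the 0-1 online loss and call the statistic `M_n` for concreteness; the paper's bounds are more general, so treat names and loss choice as our assumptions:

```python
import numpy as np

def online_perceptron_with_Mn(X, y):
    """Online perceptron over an i.i.d. sample.

    Returns the average of the ensemble of intermediate weight vectors
    (one per round) and M_n, the cumulative online 0-1 loss divided by n.
    A toy sketch of the setting described above, not the paper's code.
    """
    n, d = X.shape
    w = np.zeros(d)
    w_sum = np.zeros(d)
    mistakes = 0
    for i in range(n):
        if y[i] * (w @ X[i]) <= 0:    # online mistake on round i
            mistakes += 1
            w = w + y[i] * X[i]       # perceptron update
        w_sum += w                    # accumulate the ensemble average
    return w_sum / n, mistakes / n

# Linearly separable toy data: label = sign of a fixed linear function
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.sign(X[:, 0] + 0.3 * X[:, 1])
w_avg, M_n = online_perceptron_with_Mn(X, y)
```

On separable data the perceptron's mistake count is bounded, so M_n shrinks as n grows, which is exactly the regime where a data-dependent bound stated in terms of M_n is informative.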
580 citations
TL;DR: The article summarises the major issues regarding this multi-step process, focussing in particular on challenges in the extraction of radiomic features from data sets provided by computed tomography, positron emission tomography, and magnetic resonance imaging.
Abstract: Radiomics is an emerging translational field of research aiming to extract mineable high-dimensional data from clinical images. The radiomic process can be divided into distinct steps with definable inputs and outputs, such as image acquisition and reconstruction, image segmentation, features extraction and qualification, analysis, and model building. Each step needs careful evaluation for the construction of robust and reliable models to be transferred into clinical practice for the purposes of prognosis, non-invasive disease tracking, and evaluation of disease response to treatment. After the definition of texture parameters (shape features; first-, second-, and higher-order features), we briefly discuss the origin of the term radiomics and the methods for selecting the parameters useful for a radiomic approach, including cluster analysis, principal component analysis, random forest, neural network, linear/logistic regression, and other. Reproducibility and clinical value of parameters should be firstly tested with internal cross-validation and then validated on independent external cohorts. This article summarises the major issues regarding this multi-step process, focussing in particular on challenges of the extraction of radiomic features from data sets provided by computed tomography, positron emission tomography, and magnetic resonance imaging.
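The "features extraction" step above can be illustrated with a few first-order (histogram) features computed from a segmented region of interest. This is a minimal sketch with assumed choices (32 histogram bins, our own feature selection); real radiomic pipelines use standardised feature definitions and careful intensity discretisation:

```python
import numpy as np

def first_order_features(roi, bins=32):
    """A few first-order radiomic features of a segmented ROI.

    Illustrative sketch of the feature-extraction step only; the bin
    count and the feature set are our assumptions.
    """
    x = np.asarray(roi, dtype=float).ravel()
    hist, _ = np.histogram(x, bins=bins)
    p = hist / hist.sum()
    p = p[p > 0]                      # drop empty bins before the log
    return {
        "mean": x.mean(),
        "variance": x.var(),
        # Normalised third central moment (small epsilon avoids 0/0)
        "skewness": ((x - x.mean()) ** 3).mean() / (x.std() ** 3 + 1e-12),
        # Shannon entropy of the intensity histogram, in bits
        "entropy": -(p * np.log2(p)).sum(),
    }

# Synthetic ROI standing in for a segmented image patch
roi = np.random.default_rng(1).normal(loc=100, scale=15, size=(64, 64))
feats = first_order_features(roi)
```

Second- and higher-order features (e.g. texture matrices) build on the same segmented input but additionally encode the spatial arrangement of intensities, which is where reproducibility across scanners and reconstruction settings becomes the hard part.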
579 citations
Authors
Showing all 58902 results
Name | H-index | Papers | Citations |
---|---|---|---|
Yi Cui | 220 | 1015 | 199725 |
Peter J. Barnes | 194 | 1530 | 166618 |
Thomas C. Südhof | 191 | 653 | 118007 |
Charles A. Dinarello | 190 | 1058 | 139668 |
Alberto Mantovani | 183 | 1397 | 163826 |
John J.V. McMurray | 178 | 1389 | 184502 |
Giuseppe Remuzzi | 172 | 1226 | 160440 |
Russel J. Reiter | 169 | 1646 | 121010 |
Jean Louis Vincent | 161 | 1667 | 163721 |
Tobin J. Marks | 159 | 1621 | 111604 |
Tomas Hökfelt | 158 | 1033 | 95979 |
José Baselga | 156 | 707 | 122498 |
Naveed Sattar | 155 | 1326 | 116368 |
Silvia Franceschi | 155 | 1340 | 112504 |
Frederik Barkhof | 154 | 1449 | 104982 |