
Journal ArticleDOI
TL;DR: Improvements to Galaxy's core framework, user interface, tools, and training materials enable Galaxy to be used for analyzing tens of thousands of datasets, and >5500 tools are now available from the Galaxy ToolShed.
Abstract: Galaxy (homepage: https://galaxyproject.org, main public server: https://usegalaxy.org) is a web-based scientific analysis platform used by tens of thousands of scientists across the world to analyze large biomedical datasets such as those found in genomics, proteomics, metabolomics and imaging. Started in 2005, Galaxy continues to focus on three key challenges of data-driven biomedical science: making analyses accessible to all researchers, ensuring analyses are completely reproducible, and making it simple to communicate analyses so that they can be reused and extended. During the last two years, the Galaxy team and the open-source community around Galaxy have made substantial improvements to Galaxy's core framework, user interface, tools, and training materials. Framework and user interface improvements now enable Galaxy to be used for analyzing tens of thousands of datasets, and >5500 tools are now available from the Galaxy ToolShed. The Galaxy community has led an effort to create numerous high-quality tutorials focused on common types of genomic analyses. The Galaxy developer and user communities continue to grow and be integral to Galaxy's development. The number of Galaxy public servers, developers contributing to the Galaxy framework and its tools, and users of the main Galaxy server have all increased substantially.

2,601 citations


Posted Content
TL;DR: In this paper, a deconvolution network composed of deconvolution and unpooling layers is proposed to identify pixel-wise class labels and predict segmentation masks; the trained network is applied to each proposal in an input image, and the final semantic segmentation map is constructed by combining the results from all proposals.
Abstract: We propose a novel semantic segmentation algorithm by learning a deconvolution network. We learn the network on top of the convolutional layers adopted from VGG 16-layer net. The deconvolution network is composed of deconvolution and unpooling layers, which identify pixel-wise class labels and predict segmentation masks. We apply the trained network to each proposal in an input image, and construct the final semantic segmentation map by combining the results from all proposals in a simple manner. The proposed algorithm mitigates the limitations of the existing methods based on fully convolutional networks by integrating deep deconvolution network and proposal-wise prediction; our segmentation method typically identifies detailed structures and handles objects in multiple scales naturally. Our network demonstrates outstanding performance in PASCAL VOC 2012 dataset, and we achieve the best accuracy (72.5%) among the methods trained with no external data through ensemble with the fully convolutional network.
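
A minimal sketch of the decoder idea summarised above, assuming PyTorch: max-pooling indices from the encoder drive unpooling in the decoder, and transposed ("deconvolution") layers recover spatial resolution. The TinyDeconvNet name, layer widths, and input size are illustrative, not the paper's VGG-16-based architecture.

```python
# Illustrative unpooling + deconvolution segmentation head (not the authors' exact model).
import torch
import torch.nn as nn

class TinyDeconvNet(nn.Module):
    def __init__(self, num_classes=21):  # 21 = PASCAL VOC classes incl. background
        super().__init__()
        self.conv = nn.Sequential(nn.Conv2d(3, 64, 3, padding=1), nn.ReLU())
        self.pool = nn.MaxPool2d(2, stride=2, return_indices=True)
        self.unpool = nn.MaxUnpool2d(2, stride=2)
        self.deconv = nn.ConvTranspose2d(64, 64, 3, padding=1)
        self.score = nn.Conv2d(64, num_classes, 1)  # per-pixel class scores

    def forward(self, x):
        x = self.conv(x)
        x, idx = self.pool(x)           # remember where the maxima were
        x = self.unpool(x, idx)         # place activations back at those locations
        x = torch.relu(self.deconv(x))  # densify the sparse unpooled map
        return self.score(x)            # (N, num_classes, H, W) segmentation logits

logits = TinyDeconvNet()(torch.randn(1, 3, 64, 64))
print(logits.shape)  # torch.Size([1, 21, 64, 64])
```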

2,601 citations


Journal ArticleDOI
TL;DR: The authors' projection results provide concrete examples of how the distribution of causes of child deaths could look in 15-20 years, to inform priority setting in the post-2015 era.

2,600 citations


Journal ArticleDOI
06 Aug 2018
TL;DR: Noisy Intermediate-Scale Quantum (NISQ) technology will be available in the near future, and the 100-qubit quantum computer will not change the world right away - but it should be regarded as a significant step toward the more powerful quantum technologies of the future.
Abstract: Noisy Intermediate-Scale Quantum (NISQ) technology will be available in the near future. Quantum computers with 50-100 qubits may be able to perform tasks which surpass the capabilities of today's classical digital computers, but noise in quantum gates will limit the size of quantum circuits that can be executed reliably. NISQ devices will be useful tools for exploring many-body quantum physics, and may have other useful applications, but the 100-qubit quantum computer will not change the world right away --- we should regard it as a significant step toward the more powerful quantum technologies of the future. Quantum technologists should continue to strive for more accurate quantum gates and, eventually, fully fault-tolerant quantum computing.

2,598 citations


Journal ArticleDOI
TL;DR: The disease is mild in most people; in some (usually the elderly and those with comorbidities), it may progress to pneumonia, acute respiratory distress syndrome (ARDS) and multi-organ dysfunction; many people are asymptomatic.
Abstract: There is a new public health crisis threatening the world with the emergence and spread of the 2019 novel coronavirus (2019-nCoV), or the severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). The virus originated in bats and was transmitted to humans through as-yet-unknown intermediary animals in Wuhan, Hubei province, China in December 2019. There have been around 96,000 reported cases of coronavirus disease 2019 (COVID-19) and 3,300 reported deaths to date (05/03/2020). The disease is transmitted by inhalation of or contact with infected droplets, and the incubation period ranges from 2 to 14 days. The symptoms are usually fever, cough, sore throat, breathlessness, fatigue, and malaise, among others. The disease is mild in most people; in some (usually the elderly and those with comorbidities), it may progress to pneumonia, acute respiratory distress syndrome (ARDS) and multi-organ dysfunction. Many people are asymptomatic. The case fatality rate is estimated to range from 2 to 3%. Diagnosis is by demonstration of the virus in respiratory secretions by special molecular tests. Common laboratory findings include normal/low white cell counts with elevated C-reactive protein (CRP). The computerized tomographic chest scan is usually abnormal even in those with no symptoms or mild disease. Treatment is essentially supportive; the role of antiviral agents is yet to be established. Prevention entails home isolation of suspected cases and those with mild illnesses, and strict infection control measures at hospitals that include contact and droplet precautions. The virus spreads faster than its two predecessors, SARS-CoV and the Middle East respiratory syndrome coronavirus (MERS-CoV), but has a lower fatality rate. The global impact of this new epidemic is as yet uncertain.

2,594 citations


Journal ArticleDOI
TL;DR: This work introduces a comprehensive secure federated-learning framework, which includes horizontal federated learning, vertical federatedLearning, and federated transfer learning, and provides a comprehensive survey of existing works on this subject.
Abstract: Today’s artificial intelligence still faces two major challenges. One is that, in most industries, data exists in the form of isolated islands. The other is the strengthening of data privacy and security. We propose a possible solution to these challenges: secure federated learning. Beyond the federated-learning framework first proposed by Google in 2016, we introduce a comprehensive secure federated-learning framework, which includes horizontal federated learning, vertical federated learning, and federated transfer learning. We provide definitions, architectures, and applications for the federated-learning framework, and provide a comprehensive survey of existing works on this subject. In addition, we propose building data networks among organizations based on federated mechanisms as an effective solution to allowing knowledge to be shared without compromising user privacy.
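
A toy sketch of the horizontal federated learning setting described above, in the spirit of federated averaging (the Google-style framework the abstract builds on): each client fits a model on its own "data island" and only parameters, never raw data, are aggregated. All names, data, and hyperparameters below are illustrative.

```python
# Federated-averaging sketch: clients train locally, the server averages parameters.
import numpy as np

def local_update(w, X, y, lr=0.1, epochs=5):
    """A few steps of local linear-regression gradient descent on one client's private data."""
    for _ in range(epochs):
        grad = 2 * X.T @ (X @ w - y) / len(y)
        w = w - lr * grad
    return w

rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])
clients = []
for n in (50, 80, 30):                      # three clients with isolated data islands
    X = rng.normal(size=(n, 2))
    y = X @ true_w + rng.normal(scale=0.1, size=n)
    clients.append((X, y))

w_global = np.zeros(2)
for _ in range(20):                         # communication rounds
    local_ws = [local_update(w_global.copy(), X, y) for X, y in clients]
    sizes = np.array([len(y) for _, y in clients], dtype=float)
    w_global = np.average(local_ws, axis=0, weights=sizes)  # size-weighted average

print(w_global)  # approaches [2, -1] without any client sharing raw data
```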

2,593 citations


Journal ArticleDOI
TL;DR: A panel convened to update the prior position statements on the management of type 2 diabetes in adults; the update includes additional focus on lifestyle management, diabetes self-management education and support, and efforts targeting weight loss.
Abstract: The American Diabetes Association and the European Association for the Study of Diabetes have briefly updated their 2018 recommendations on management of hyperglycemia, based on important research findings from large cardiovascular outcomes trials published in 2019. Important changes include: 1) the decision to treat high-risk individuals with a glucagon-like peptide 1 (GLP-1) receptor agonist or sodium-glucose cotransporter 2 (SGLT2) inhibitor to reduce major adverse cardiovascular events (MACE), hospitalization for heart failure (hHF), cardiovascular death, or chronic kidney disease (CKD) progression should be considered independently of baseline HbA1c or individualized HbA1c target; 2) GLP-1 receptor agonists can also be considered in patients with type 2 diabetes without established cardiovascular disease (CVD) but with the presence of specific indicators of high risk; and 3) SGLT2 inhibitors are recommended in patients with type 2 diabetes and heart failure, particularly those with heart failure with reduced ejection fraction, to reduce hHF, MACE, and CVD death, as well as in patients with type 2 diabetes with CKD (estimated glomerular filtration rate 30 to ≤60 mL min⁻¹ [1.73 m]⁻² or urinary albumin-to-creatinine ratio >30 mg/g, particularly >300 mg/g) to prevent the progression of CKD, hHF, MACE, and cardiovascular death.

2,592 citations


Journal ArticleDOI
Kazunori Akiyama, Antxon Alberdi1, Walter Alef2, Keiichi Asada3 +403 more · Institutions (82)
TL;DR: In this article, the Event Horizon Telescope was used to reconstruct event-horizon-scale images of the supermassive black hole candidate in the center of the giant elliptical galaxy M87.
Abstract: When surrounded by a transparent emission region, black holes are expected to reveal a dark shadow caused by gravitational light bending and photon capture at the event horizon. To image and study this phenomenon, we have assembled the Event Horizon Telescope, a global very long baseline interferometry array observing at a wavelength of 1.3 mm. This allows us to reconstruct event-horizon-scale images of the supermassive black hole candidate in the center of the giant elliptical galaxy M87. We have resolved the central compact radio source as an asymmetric bright emission ring with a diameter of 42 ± 3 μas, which is circular and encompasses a central depression in brightness with a flux ratio ≳10:1. The emission ring is recovered using different calibration and imaging schemes, with its diameter and width remaining stable over four different observations carried out on different days. Overall, the observed image is consistent with expectations for the shadow of a Kerr black hole as predicted by general relativity. The asymmetry in brightness in the ring can be explained in terms of relativistic beaming of the emission from a plasma rotating close to the speed of light around a black hole. We compare our images to an extensive library of ray-traced general-relativistic magnetohydrodynamic simulations of black holes and derive a central mass of M = (6.5 ± 0.7) × 10^9 M_⊙. Our radio-wave observations thus provide powerful evidence for the presence of supermassive black holes in centers of galaxies and as the central engines of active galactic nuclei. They also present a new tool to explore gravity in its most extreme limit and on a mass scale that was so far not accessible.

2,589 citations


Posted Content
TL;DR: This position paper defines interpretability and describes when interpretability is needed (and when it is not), and suggests a taxonomy for rigorous evaluation and exposes open questions towards a more rigorous science of interpretable machine learning.
Abstract: As machine learning systems become ubiquitous, there has been a surge of interest in interpretable machine learning: systems that provide explanation for their outputs. These explanations are often used to qualitatively assess other criteria such as safety or non-discrimination. However, despite the interest in interpretability, there is very little consensus on what interpretable machine learning is and how it should be measured. In this position paper, we first define interpretability and describe when interpretability is needed (and when it is not). Next, we suggest a taxonomy for rigorous evaluation and expose open questions towards a more rigorous science of interpretable machine learning.

2,589 citations


Journal ArticleDOI
TL;DR: It is found that among laboratory-confirmed cases of COVID-19, patients with any comorbidity yielded poorer clinical outcomes than those without, and a greater number of comorbidities also correlated with poorer clinical outcomes.
Abstract: Background The coronavirus disease 2019 (Covid-19) outbreak is evolving rapidly worldwide. Objective To evaluate the risk of serious adverse outcomes in patients with coronavirus disease 2019 (Covid-19) by stratifying the comorbidity status. Methods We analysed the data from 1,590 laboratory-confirmed hospitalised patients from 575 hospitals in 31 provinces/autonomous regions/provincial municipalities across mainland China between December 11th, 2019 and January 31st, 2020. We analysed the composite endpoint, which consisted of admission to an intensive care unit, invasive ventilation, or death. The risk of reaching the composite endpoint was compared according to the presence and number of comorbidities. Results The mean age was 48.9 years, and 686 patients (42.7%) were female. Severe cases accounted for 16.0% of the study population. 131 (8.2%) patients reached the composite endpoint. 399 (25.1%) reported having at least one comorbidity. The most prevalent comorbidity was hypertension (16.9%), followed by diabetes (8.2%). 130 (8.2%) patients reported having two or more comorbidities. After adjusting for age and smoking status, COPD [hazard ratio (HR) 2.681, 95% confidence interval (95%CI) 1.424–5.048], diabetes (HR 1.59, 95%CI 1.03–2.45), hypertension (HR 1.58, 95%CI 1.07–2.32) and malignancy (HR 3.50, 95%CI 1.60–7.64) were risk factors for reaching the composite endpoint. The HR was 1.79 (95%CI 1.16–2.77) among patients with at least one comorbidity and 2.59 (95%CI 1.61–4.17) among patients with two or more comorbidities. Conclusion Among laboratory-confirmed cases of Covid-19, patients with any comorbidity had poorer clinical outcomes than those without, and a greater number of comorbidities correlated with poorer clinical outcomes.

2,587 citations


Proceedings ArticleDOI
27 Jun 2016
TL;DR: This work proposes an LSTM model which can learn general human movement and predict future trajectories, and which outperforms state-of-the-art methods on some of these datasets.
Abstract: Pedestrians follow different trajectories to avoid obstacles and accommodate fellow pedestrians. Any autonomous vehicle navigating such a scene should be able to foresee the future positions of pedestrians and accordingly adjust its path to avoid collisions. This problem of trajectory prediction can be viewed as a sequence generation task, where we are interested in predicting the future trajectory of people based on their past positions. Following the recent success of Recurrent Neural Network (RNN) models for sequence prediction tasks, we propose an LSTM model which can learn general human movement and predict their future trajectories. This is in contrast to traditional approaches which use hand-crafted functions such as Social forces. We demonstrate the performance of our method on several public datasets. Our model outperforms state-of-the-art methods on some of these datasets. We also analyze the trajectories predicted by our model to demonstrate the motion behaviour learned by our model.
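
A minimal sketch of the core idea, assuming PyTorch: an LSTM reads a sequence of past (x, y) positions and predicts the next position. The paper's "social" pooling across neighbouring pedestrians is omitted, and all sizes are illustrative.

```python
# LSTM that maps a sequence of observed positions to the next predicted position.
import torch
import torch.nn as nn

class TrajectoryLSTM(nn.Module):
    def __init__(self, hidden=32):
        super().__init__()
        self.lstm = nn.LSTM(input_size=2, hidden_size=hidden, batch_first=True)
        self.head = nn.Linear(hidden, 2)    # map final hidden state to (x, y)

    def forward(self, past):                # past: (batch, T_obs, 2)
        out, _ = self.lstm(past)
        return self.head(out[:, -1])        # predicted next position

model = TrajectoryLSTM()
past = torch.randn(4, 8, 2)                 # 4 pedestrians, 8 observed steps each
print(model(past).shape)                    # torch.Size([4, 2])
```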

Proceedings ArticleDOI
Xiaozhi Chen1, Huimin Ma1, Ji Wan2, Bo Li2, Xia Tian2 
21 Jul 2017
TL;DR: This paper proposes Multi-View 3D networks (MV3D), a sensory-fusion framework that takes both LIDAR point cloud and RGB images as input and predicts oriented 3D bounding boxes and designs a deep fusion scheme to combine region-wise features from multiple views and enable interactions between intermediate layers of different paths.
Abstract: This paper aims at high-accuracy 3D object detection in autonomous driving scenarios. We propose Multi-View 3D networks (MV3D), a sensory-fusion framework that takes both a LIDAR point cloud and RGB images as input and predicts oriented 3D bounding boxes. We encode the sparse 3D point cloud with a compact multi-view representation. The network is composed of two subnetworks: one for 3D object proposal generation and another for multi-view feature fusion. The proposal network generates 3D candidate boxes efficiently from the bird's-eye-view representation of the 3D point cloud. We design a deep fusion scheme to combine region-wise features from multiple views and enable interactions between intermediate layers of different paths. Experiments on the challenging KITTI benchmark show that our approach outperforms the state of the art by around 25% and 30% AP on the tasks of 3D localization and 3D detection. In addition, for 2D detection, our approach obtains 14.9% higher AP than the state of the art on the hard data among the LIDAR-based methods.
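
As one hedged illustration of the multi-view encoding, the sketch below rasterises a LIDAR point cloud into a bird's-eye-view height map (one of several feature maps such a representation typically uses). Grid extent, resolution, and the function name are illustrative, not the paper's configuration.

```python
# Project a LIDAR point cloud onto a bird's-eye-view grid, keeping max height per cell.
import numpy as np

def birds_eye_height_map(points, x_range=(0, 40), y_range=(-20, 20), res=0.5):
    """points: (N, 3) array of (x, y, z) LIDAR returns in metres."""
    nx = int((x_range[1] - x_range[0]) / res)
    ny = int((y_range[1] - y_range[0]) / res)
    grid = np.full((nx, ny), -np.inf)
    ix = ((points[:, 0] - x_range[0]) / res).astype(int)
    iy = ((points[:, 1] - y_range[0]) / res).astype(int)
    keep = (ix >= 0) & (ix < nx) & (iy >= 0) & (iy < ny)
    for x, y, z in zip(ix[keep], iy[keep], points[keep, 2]):
        grid[x, y] = max(grid[x, y], z)      # maximum height per cell
    grid[np.isinf(grid)] = 0.0               # empty cells
    return grid

cloud = np.random.uniform([0, -20, -2], [40, 20, 2], size=(5000, 3))
print(birds_eye_height_map(cloud).shape)     # (80, 80)
```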

Journal ArticleDOI
B. P. Abbott1, Richard J. Abbott1, T. D. Abbott2, Fausto Acernese3 +1062 more · Institutions (115)
TL;DR: The magnitude of modifications to the gravitational-wave dispersion relation is constrained, the graviton mass is bounded to m_g ≤ 7.7×10^{−23} eV/c², and null tests of general relativity are performed, finding that GW170104 is consistent with general relativity.
Abstract: We describe the observation of GW170104, a gravitational-wave signal produced by the coalescence of a pair of stellar-mass black holes. The signal was measured on January 4, 2017 at 10:11:58.6 UTC by the twin advanced detectors of the Laser Interferometer Gravitational-Wave Observatory during their second observing run, with a network signal-to-noise ratio of 13 and a false alarm rate less than 1 in 70,000 years. The inferred component black hole masses are 31.2^{+8.4}_{−6.0} M_⊙ and 19.4^{+5.3}_{−5.9} M_⊙ (at the 90% credible level). The black hole spins are best constrained through measurement of the effective inspiral spin parameter, a mass-weighted combination of the spin components perpendicular to the orbital plane, χ_eff = −0.12^{+0.21}_{−0.30}. This result implies that spin configurations with both component spins positively aligned with the orbital angular momentum are disfavored. The source luminosity distance is 880^{+450}_{−390} Mpc, corresponding to a redshift of z = 0.18^{+0.08}_{−0.07}. We constrain the magnitude of modifications to the gravitational-wave dispersion relation and perform null tests of general relativity. Assuming that gravitons are dispersed in vacuum like massive particles, we bound the graviton mass to m_g ≤ 7.7 × 10^{−23} eV/c². In all cases, we find that GW170104 is consistent with general relativity.
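
As a quick check on the quoted bound, the corresponding lower limit on the graviton Compton wavelength follows from λ_g = h c / (m_g c²); the constants below are standard values.

```python
# Convert the graviton-mass bound into a lower bound on the graviton Compton wavelength.
h_c_eV_m = 1.23984e-6        # Planck constant times c, in eV*m
m_g_c2_eV = 7.7e-23          # quoted bound on the graviton rest energy, in eV

lambda_g_m = h_c_eV_m / m_g_c2_eV
print(f"lambda_g >= {lambda_g_m:.2e} m (~{lambda_g_m / 1e3:.1e} km)")
# lambda_g >= 1.61e+16 m (~1.6e+13 km): a Compton wavelength of order 10^13 km,
# the scale probed by the modified-dispersion test described above.
```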

Journal ArticleDOI
TL;DR: In this paper, a Monte Carlo sampler (The Joker) is used to perform a search for companions to 96,231 red-giant stars observed in the APOGEE survey (DR14) with ≥3 spectroscopic epochs.
Abstract: Multi-epoch radial velocity measurements of stars can be used to identify stellar, sub-stellar, and planetary-mass companions. Even a small number of observation epochs can be informative about companions, though there can be multiple qualitatively different orbital solutions that fit the data. We have custom-built a Monte Carlo sampler (The Joker) that delivers reliable (and often highly multi-modal) posterior samplings for companion orbital parameters given sparse radial-velocity data. Here we use The Joker to perform a search for companions to 96,231 red-giant stars observed in the APOGEE survey (DR14) with ≥3 spectroscopic epochs. We select stars with probable companions by making a cut on our posterior belief about the amplitude of the stellar radial-velocity variation induced by the orbit. We provide (1) a catalog of 320 companions for which the stellar companion properties can be confidently determined, (2) a catalog of 4,898 stars that likely have companions, but would require more observations to uniquely determine the orbital properties, and (3) posterior samplings for the full orbital parameters for all stars in the parent sample. We show the characteristics of systems with confidently determined companion properties and highlight interesting systems with candidate compact object companions.
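
A schematic of the sampling idea, not The Joker itself: nonlinear parameters (here only the period of a circular orbit) are drawn from the prior, the linear velocity-amplitude terms are fit by least squares, and only samples consistent with the sparse data are kept. Epochs, values, and priors below are entirely made up.

```python
# Prior sampling over period + linear least squares for the amplitudes of a circular orbit.
import numpy as np

rng = np.random.default_rng(1)

# Four sparse radial-velocity epochs (days, km/s) generated from a 123-day orbit.
t_obs = np.array([0.0, 37.0, 114.0, 251.0])
rv_obs = 4.0 * np.sin(2 * np.pi * t_obs / 123.0 + 1.0)
sigma = 0.2                                      # per-epoch uncertainty, km/s

periods = 10 ** rng.uniform(1, 3, 20_000)        # log-uniform prior, 10-1000 days
good = []
for P in periods:
    # Design matrix for v(t) = A*sin(2*pi*t/P) + B*cos(2*pi*t/P)
    M = np.column_stack([np.sin(2 * np.pi * t_obs / P),
                         np.cos(2 * np.pi * t_obs / P)])
    coef, *_ = np.linalg.lstsq(M, rv_obs, rcond=None)
    chi2 = np.sum(((M @ coef - rv_obs) / sigma) ** 2)
    if chi2 < 4.0:                               # keep periods that fit the data
        good.append(P)

good = np.array(good)
print(f"{good.size} acceptable periods out of {periods.size} (often spread over aliases)")
print(np.percentile(good, [5, 50, 95]))
```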

Proceedings Article
01 Jan 2015
TL;DR: This paper extends the idea of a student network that could imitate the soft output of a larger teacher network or ensemble of networks, using not only the outputs but also the intermediate representations learned by the teacher as hints to improve the training process and final performance of the student.
Abstract: While depth tends to improve network performance, it also makes gradient-based training more difficult, since deeper networks tend to be more non-linear. The recently proposed knowledge distillation approach is aimed at obtaining small and fast-to-execute models, and it has shown that a student network can imitate the soft output of a larger teacher network or ensemble of networks. In this paper, we extend this idea to allow the training of a student that is deeper and thinner than the teacher, using not only the outputs but also the intermediate representations learned by the teacher as hints to improve the training process and final performance of the student. Because the student intermediate hidden layer will generally be smaller than the teacher's intermediate hidden layer, additional parameters are introduced to map the student hidden layer to the prediction of the teacher hidden layer. This allows one to train deeper students that can generalize better or run faster, a trade-off that is controlled by the chosen student capacity. For example, on CIFAR-10, a deep student network with almost 10.4 times fewer parameters outperforms a larger, state-of-the-art teacher network.
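
A small sketch of the "hint" mechanism described above, assuming PyTorch: a learned regressor maps the student's thinner intermediate representation to the teacher's, and an L2 loss on that mapping is added to the usual distillation objective. Feature widths and batch size are illustrative.

```python
# Hint loss: regress student features onto teacher features and penalise the L2 error.
import torch
import torch.nn as nn

teacher_hidden = torch.randn(32, 256)                       # teacher's intermediate features
student_hidden = torch.randn(32, 64, requires_grad=True)    # thinner student features

regressor = nn.Linear(64, 256)                              # student space -> teacher space
hint_loss = nn.functional.mse_loss(regressor(student_hidden), teacher_hidden)
hint_loss.backward()                                        # gradients flow into the student
print(float(hint_loss))
```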

Journal ArticleDOI
TL;DR: An overview of the key aspects of graphene and related materials, ranging from fundamental research challenges to a variety of applications in a large number of sectors, highlighting the steps necessary to take GRMs from a state of raw potential to a point where they might revolutionize multiple industries are provided.
Abstract: We present the science and technology roadmap for graphene, related two-dimensional crystals, and hybrid systems, targeting an evolution in technology, that might lead to impacts and benefits reaching into most areas of society. This roadmap was developed within the framework of the European Graphene Flagship and outlines the main targets and research areas as best understood at the start of this ambitious project. We provide an overview of the key aspects of graphene and related materials (GRMs), ranging from fundamental research challenges to a variety of applications in a large number of sectors, highlighting the steps necessary to take GRMs from a state of raw potential to a point where they might revolutionize multiple industries. We also define an extensive list of acronyms in an effort to standardize the nomenclature in this emerging field.

Journal ArticleDOI
TL;DR: In patients with type 2 diabetes at high cardiovascular risk, empagliflozin was associated with slower progression of kidney disease and lower rates of clinically relevant renal events than was placebo when added to standard care.
Abstract: Background: Diabetes confers an increased risk of adverse cardiovascular and renal events. In the EMPA-REG OUTCOME trial, empagliflozin, a sodium–glucose cotransporter 2 inhibitor, reduced the risk of major adverse cardiovascular events in patients with type 2 diabetes at high risk for cardiovascular events. We wanted to determine the long-term renal effects of empagliflozin, an analysis that was a prespecified component of the secondary microvascular outcome of that trial. Methods: We randomly assigned patients with type 2 diabetes and an estimated glomerular filtration rate of at least 30 ml per minute per 1.73 m2 of body-surface area to receive either empagliflozin (at a dose of 10 mg or 25 mg) or placebo once daily. Prespecified renal outcomes included incident or worsening nephropathy (progression to macroalbuminuria, doubling of the serum creatinine level, initiation of renal-replacement therapy, or death from renal disease) and incident albuminuria. Results: Incident or worsening nephropathy occurred in ...

Journal ArticleDOI
Ian G. McKeith, Bradley F. Boeve, Dennis W. Dickson, Glenda M. Halliday, John-Paul Taylor1, Daniel Weintraub2, Dag Aarsland1, Dag Aarsland3, James E. Galvin2, Johannes Attems4, Johannes Attems5, Clive Ballard4, Clive Ballard2, Ashley Bayston2, Ashley Bayston4, Thomas G. Beach6, Thomas G. Beach1, Frédéric Blanc7, Nicolaas Bohnen8, Nicolaas Bohnen9, Nicolaas Bohnen10, Laura Bonanni3, Laura Bonanni1, Jose Bras3, Jose Bras1, Patrik Brundin3, Patrik Brundin1, David J. Burn3, David J. Burn1, Alice Chen-Plotkin3, John E. Duda11, Omar M. A. El-Agnaf, Howard Feldman12, Tanis J. Ferman, Dominic Ffytche13, Hiroshige Fujishiro14, Douglas Galasko15, Jennifer G. Goldman16, Stephen N. Gomperts16, Neill R. Graff-Radford, Lawrence S. Honig17, Lawrence S. Honig18, Alex Iranzo19, Alex Iranzo20, Alex Iranzo21, Kejal Kantarci, Daniel I. Kaufer11, Walter Kukull22, Virginia M.Y. Lee23, James B. Leverenz17, James B. Leverenz18, Simon J.G. Lewis2, Carol F. Lippa18, Carol F. Lippa17, Angela Lunde3, M Masellis19, M Masellis21, M Masellis20, Eliezer Masliah, Pamela J. McLean, Brit Mollenhauer5, Brit Mollenhauer24, Thomas J. Montine25, Thomas J. Montine26, Emilio Moreno27, Emilio Moreno28, Emilio Moreno2, Etsuro Mori28, Etsuro Mori2, Etsuro Mori27, Melissa E. Murray, John T. O'Brien28, John T. O'Brien27, Sotoshi Orimo27, Sotoshi Orimo28, Ronald B. Postuma28, Ronald B. Postuma27, Shankar Ramaswamy28, Shankar Ramaswamy27, Owen A. Ross, David P. Salmon25, David P. Salmon26, Andrew B. Singleton26, Andrew B. Singleton25, Angela Taylor24, Angela Taylor5, Alan Thomas16, Pietro Tiraboschi, Jon B. Toledo, John Q. Trojanowski, Debby W. Tsuang10, Zuzana Walker8, Zuzana Walker25, Masahito Yamada26, Masahito Yamada9, Kenji Kosaka 
TL;DR: The Dementia with Lewy Bodies (DLB) Consortium has refined its recommendations about the clinical and pathologic diagnosis of DLB, updating the previous report, which has been in widespread use for the last decade.
Abstract: The Dementia with Lewy Bodies (DLB) Consortium has refined its recommendations about the clinical and pathologic diagnosis of DLB, updating the previous report, which has been in widespread use for the last decade. The revised DLB consensus criteria now distinguish clearly between clinical features and diagnostic biomarkers, and give guidance about optimal methods to establish and interpret these. Substantial new information has been incorporated about previously reported aspects of DLB, with increased diagnostic weighting given to REM sleep behavior disorder and 123iodine-metaiodobenzylguanidine (MIBG) myocardial scintigraphy. The diagnostic role of other neuroimaging, electrophysiologic, and laboratory investigations is also described. Minor modifications to pathologic methods and criteria are recommended to take account of Alzheimer disease neuropathologic change, to add previously omitted Lewy-related pathology categories, and to include assessments for substantia nigra neuronal loss. Recommendations about clinical management are largely based upon expert opinion since randomized controlled trials in DLB are few. Substantial progress has been made since the previous report in the detection and recognition of DLB as a common and important clinical disorder. During that period it has been incorporated into DSM-5, as major neurocognitive disorder with Lewy bodies. There remains a pressing need to understand the underlying neurobiology and pathophysiology of DLB, to develop and deliver clinical trials with both symptomatic and disease-modifying agents, and to help patients and carers worldwide to inform themselves about the disease, its prognosis, best available treatments, ongoing research, and how to get adequate support.

Journal ArticleDOI
TL;DR: The mRNA-1273 vaccine induced anti-SARS-CoV-2 immune responses in all participants, and no trial-limiting safety concerns were identified, which support further development of this vaccine.
Abstract: Background The severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) emerged in late 2019 and spread globally, prompting an international effort to accelerate development of a vaccine ...

Journal ArticleDOI
TL;DR: This briefer article should be read as an addendum to the previous full account on the management of hyperglycemia, which described the need to individualize both treatment targets and treatment strategies with an emphasis on patient-centered care and shared decision making.
Abstract: In 2012, the American Diabetes Association (ADA) and the European Association for the Study of Diabetes (EASD) published a position statement on the management of hyperglycemia in patients with type 2 diabetes (1,2). This was needed because of an increasing array of antihyperglycemic drugs and growing uncertainty regarding their proper selection and sequence. Because of a paucity of comparative effectiveness research on long-term treatment outcomes with many of these medications, the 2012 publication was less prescriptive than prior consensus reports. We previously described the need to individualize both treatment targets and treatment strategies, with an emphasis on patient-centered care and shared decision making, and this continues to be our position, although there are now more head-to-head trials that show slight variance between agents with regard to glucose-lowering effects. Nevertheless, these differences are often small and would be unlikely to reflect any definite differential effect in an individual patient. The ADA and EASD have requested an update to the position statement incorporating new data from recent clinical trials. Between June and September of 2014, the Writing Group reconvened, including one face-to-face meeting, to discuss the changes. An entirely new statement was felt to be unnecessary. Instead, the group focused on those areas where revisions were suggested by a changing evidence base. This briefer article should therefore be read as an addendum to the previous full account (1,2). Glucose control remains a major focus in the management of patients with type 2 diabetes. However, this should always be in the context of a comprehensive cardiovascular risk factor reduction program, to include smoking cessation and the adoption of other healthy lifestyle habits, blood pressure control, lipid management with priority to statin medications, and, in some circumstances, antiplatelet therapy. Studies have conclusively determined that reducing hyperglycemia decreases the onset and progression of …

Proceedings ArticleDOI
21 Jul 2017
TL;DR: The concept of end-to-end learning of optical flow is advanced and made to work really well, and faster variants are presented that allow optical flow computation at up to 140 fps with accuracy matching the original FlowNet.
Abstract: The FlowNet demonstrated that optical flow estimation can be cast as a learning problem. However, the state of the art with regard to the quality of the flow has still been defined by traditional methods. Particularly on small displacements and real-world data, FlowNet cannot compete with variational methods. In this paper, we advance the concept of end-to-end learning of optical flow and make it work really well. The large improvements in quality and speed are caused by three major contributions: first, we focus on the training data and show that the schedule of presenting data during training is very important. Second, we develop a stacked architecture that includes warping of the second image with intermediate optical flow. Third, we elaborate on small displacements by introducing a subnetwork specializing on small motions. FlowNet 2.0 is only marginally slower than the original FlowNet but decreases the estimation error by more than 50%. It performs on par with state-of-the-art methods, while running at interactive frame rates. Moreover, we present faster variants that allow optical flow computation at up to 140 fps with accuracy matching the original FlowNet.
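
A sketch of the warping ingredient mentioned in the abstract: the second image is resampled with the current flow estimate so that a later stage only has to refine a residual. It uses SciPy bilinear interpolation; image size and the constant flow are illustrative.

```python
# Warp the second image towards the first using an (intermediate) optical flow field.
import numpy as np
from scipy.ndimage import map_coordinates

def warp(img2, flow):
    """img2: (H, W) image; flow: (H, W, 2) with (dx, dy) from image 1 to image 2."""
    h, w = img2.shape
    ys, xs = np.mgrid[0:h, 0:w].astype(float)
    coords = [ys + flow[..., 1], xs + flow[..., 0]]   # sample img2 at x + flow
    return map_coordinates(img2, coords, order=1, mode='nearest')

img2 = np.random.rand(64, 64)
flow = np.ones((64, 64, 2)) * 1.5                      # constant 1.5-pixel shift
print(warp(img2, flow).shape)                          # (64, 64)
```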

Journal ArticleDOI
TL;DR: Catalogue entry for the textbook Robbins and Cotran Pathologic Basis of Disease.
Abstract: Robbins and Cotran Pathologic Basis of Disease (record from the Jundishapur Digital Library, Ahvaz).

Journal ArticleDOI
TL;DR: The FCV-19S, a seven-item scale, has robust psychometric properties and is reliable and valid in assessing fear of COVID-19 among the general population; it will also be useful in allaying COVID-19 fears among individuals.
Abstract: Background: The emergence of the COVID-19 and its consequences has led to fears, worries, and anxiety among individuals worldwide. The present study developed the Fear of COVID-19 Scale (FCV-19S) t ...

Journal ArticleDOI
TL;DR: Estimates of extinction rates reveal an exceptionally rapid loss of biodiversity over the last few centuries, indicating that a sixth mass extinction is already under way and a window of opportunity is rapidly closing.
Abstract: The oft-repeated claim that Earth’s biota is entering a sixth “mass extinction” depends on clearly demonstrating that current extinction rates are far above the “background” rates prevailing between the five previous mass extinctions. Earlier estimates of extinction rates have been criticized for using assumptions that might overestimate the severity of the extinction crisis. We assess, using extremely conservative assumptions, whether human activities are causing a mass extinction. First, we use a recent estimate of a background rate of 2 mammal extinctions per 10,000 species per 100 years (that is, 2 E/MSY), which is twice as high as widely used previous estimates. We then compare this rate with the current rate of mammal and vertebrate extinctions. The latter is conservatively low because listing a species as extinct requires meeting stringent criteria. Even under our assumptions, which would tend to minimize evidence of an incipient mass extinction, the average rate of vertebrate species loss over the last century is up to 100 times higher than the background rate. Under the 2 E/MSY background rate, the number of species that have gone extinct in the last century would have taken, depending on the vertebrate taxon, between 800 and 10,000 years to disappear. These estimates reveal an exceptionally rapid loss of biodiversity over the last few centuries, indicating that a sixth mass extinction is already under way. Averting a dramatic decay of biodiversity and the subsequent loss of ecosystem services is still possible through intensified conservation efforts, but that window of opportunity is rapidly closing.
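
The background-rate arithmetic in the abstract can be reproduced directly; the vertebrate species count below is illustrative and not a figure taken from the paper.

```python
# Expected extinctions per century at the 2 E/MSY background rate used above.
background_rate = 2 / 10_000 / 100        # extinctions per species per year (2 E/MSY)

n_species = 40_000                        # illustrative species count, not from the paper
expected_per_century = background_rate * n_species * 100
print(f"Expected background extinctions per century: {expected_per_century:.0f}")
# ~8 for this species count; the paper reports observed vertebrate losses up to ~100x
# the background rate, which is why a century of observed extinctions corresponds to
# 800-10,000 years at background levels, depending on the taxon.
```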

Journal ArticleDOI
TL;DR: Regorafenib is the only systemic treatment shown to provide survival benefit in HCC patients progressing on sorafenib treatment; future trials should explore combinations of regorafenib with other systemic agents and third-line treatments for patients who fail or who do not tolerate the sequence of sorafenib and regorafenib.

Journal ArticleDOI
TL;DR: A fast and accurate fully automatic method for brain tumor segmentation which is competitive both in terms of accuracy and speed compared to the state of the art, and introduces a novel cascaded architecture that allows the system to more accurately model local label dependencies.

Journal ArticleDOI
27 Nov 2015-Science
TL;DR: Comparison of melanoma growth in mice harboring distinct commensal microbiota revealed differences in spontaneous antitumor immunity, suggesting that manipulating the microbiota may modulate cancer immunotherapy.
Abstract: T cell infiltration of solid tumors is associated with favorable patient outcomes, yet the mechanisms underlying variable immune responses between individuals are not well understood. One possible modulator could be the intestinal microbiota. We compared melanoma growth in mice harboring distinct commensal microbiota and observed differences in spontaneous antitumor immunity, which were eliminated upon cohousing or after fecal transfer. Sequencing of the 16S ribosomal RNA identified Bifidobacterium as associated with the antitumor effects. Oral administration of Bifidobacterium alone improved tumor control to the same degree as programmed cell death protein 1 ligand 1 (PD-L1)–specific antibody therapy (checkpoint blockade), and combination treatment nearly abolished tumor outgrowth. Augmented dendritic cell function leading to enhanced CD8+ T cell priming and accumulation in the tumor microenvironment mediated the effect. Our data suggest that manipulating the microbiota may modulate cancer immunotherapy.

Proceedings ArticleDOI
18 Jun 2018
TL;DR: This work formulates this intuition as a non-parametric classification problem at the instance level, and uses noise-contrastive estimation to tackle the computational challenges imposed by the large number of instance classes.
Abstract: Neural net classifiers trained on data with annotated class labels can also capture apparent visual similarity among categories without being directed to do so. We study whether this observation can be extended beyond the conventional domain of supervised learning: Can we learn a good feature representation that captures apparent similarity among instances, instead of classes, by merely asking the feature to be discriminative of individual instances? We formulate this intuition as a non-parametric classification problem at the instance level, and use noise-contrastive estimation to tackle the computational challenges imposed by the large number of instance classes. Our experimental results demonstrate that, under unsupervised learning settings, our method surpasses the state of the art on ImageNet classification by a large margin. Our method is also remarkable for consistently improving test performance with more training data and better network architectures. By fine-tuning the learned feature, we further obtain competitive results for semi-supervised learning and object detection tasks. Our non-parametric model is highly compact: with 128 features per image, our method requires only 600 MB storage for a million images, enabling fast nearest neighbour retrieval at run time.
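
Two quick checks on the claims above: the memory footprint of 128-dimensional float32 features for a million images, and nearest-neighbour retrieval by cosine similarity on unit-normalised features. The small random "bank" below stands in for the paper's memory of instance features.

```python
# Storage arithmetic for the 128-d features, plus a toy nearest-neighbour lookup.
import numpy as np

n_images, dim = 1_000_000, 128
print(n_images * dim * 4 / 1e6, "MB at float32")   # 512.0 MB, of the order of the quoted 600 MB

bank = np.random.randn(10_000, dim).astype(np.float32)          # small stand-in feature bank
bank /= np.linalg.norm(bank, axis=1, keepdims=True)
query = bank[42] + 0.05 * np.random.randn(dim).astype(np.float32)
query /= np.linalg.norm(query)

scores = bank @ query                     # cosine similarity for unit-length vectors
print(scores.argsort()[-5:][::-1])        # top-5 neighbours; index 42 ranks first
```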

Journal ArticleDOI
02 Apr 2015-Nature
TL;DR: A terrestrial assemblage database of unprecedented geographic and taxonomic coverage is analysed to quantify local biodiversity responses to land use and related changes and shows that in the worst-affected habitats, pressures reduce within-sample species richness by an average of 76.5%, total abundance by 39.5% and rarefaction-based richness by 40.3%.
Abstract: Human activities, especially conversion and degradation of habitats, are causing global biodiversity declines. How local ecological assemblages are responding is less clear--a concern given their importance for many ecosystem functions and services. We analysed a terrestrial assemblage database of unprecedented geographic and taxonomic coverage to quantify local biodiversity responses to land use and related changes. Here we show that in the worst-affected habitats, these pressures reduce within-sample species richness by an average of 76.5%, total abundance by 39.5% and rarefaction-based richness by 40.3%. We estimate that, globally, these pressures have already slightly reduced average within-sample richness (by 13.6%), total abundance (10.7%) and rarefaction-based richness (8.1%), with changes showing marked spatial variation. Rapid further losses are predicted under a business-as-usual land-use scenario; within-sample richness is projected to fall by a further 3.4% globally by 2100, with losses concentrated in biodiverse but economically poor countries. Strong mitigation can deliver much more positive biodiversity changes (up to a 1.9% average increase) that are less strongly related to countries' socioeconomic status.

Journal ArticleDOI
TL;DR: The current status of the TCGA Research Network's structure, purpose, and achievements is discussed; a major goal of the project is to provide publicly available datasets to help improve diagnostic methods, treatment standards, and, ultimately, to prevent cancer.
Abstract: The Cancer Genome Atlas (TCGA) is a publicly funded project that aims to catalogue and discover major cancer-causing genomic alterations to create a comprehensive "atlas" of cancer genomic profiles. So far, TCGA researchers have analysed large cohorts of over 30 human tumour types through large-scale genome sequencing and integrated multi-dimensional analyses. Studies of individual cancer types, as well as comprehensive pan-cancer analyses, have extended current knowledge of tumorigenesis. A major goal of the project was to provide publicly available datasets to help improve diagnostic methods, treatment standards, and finally to prevent cancer. This review discusses the current status of the TCGA Research Network's structure, purpose, and achievements.