
Journal ArticleDOI
TL;DR: An overview of recent developments in TMP nanomaterials as catalysts for hydrogen generation with high activity and stability is presented, and specific strategies to further improve the catalytic efficiency and stability of TMPs by structural engineering are demonstrated.
Abstract: The urgent need for clean and renewable energy drives the exploration of effective strategies to produce molecular hydrogen. With the assistance of highly active non-noble-metal electrocatalysts, electrolysis of water is becoming a promising candidate for generating pure hydrogen with low cost and high efficiency. Very recently, transition metal phosphides (TMPs) have been proven to be high-performance catalysts with high activity, high stability, and nearly 100% Faradaic efficiency for electrochemical hydrogen evolution, not only in strongly acidic solutions but also in strongly alkaline and neutral media. In this tutorial review, an overview of recent developments in TMP nanomaterials as catalysts for hydrogen generation with high activity and stability is presented. The effects of phosphorus (P) on hydrogen evolution reaction (HER) activity and the synthetic methods of TMPs are briefly discussed. We then demonstrate specific strategies to further improve the catalytic efficiency and stability of TMPs by structural engineering. The use of TMPs as cocatalysts and catalysts in photochemical and photoelectrochemical water splitting is also discussed. Finally, some key challenges and issues that should not be ignored during the rapid development of TMPs are pointed out. These strategies and challenges are instructive for designing other high-performance non-noble-metal catalysts.

2,104 citations


Journal ArticleDOI
07 Oct 2016-Science
TL;DR: Nanoscale phase stabilization of CsPbI3 quantum dots (QDs) to low temperatures enables their use as the active component of efficient optoelectronic devices, and the formation of α-CsPbI3 QD films that are phase-stable for months in ambient air is described.
Abstract: We show nanoscale phase stabilization of CsPbI3 quantum dots (QDs) to low temperatures that can be used as the active component of efficient optoelectronic devices. CsPbI3 is an all-inorganic analog to the hybrid organic cation halide perovskites, but the cubic phase of bulk CsPbI3 (α-CsPbI3)—the variant with the desirable band gap—is only stable at high temperatures. We describe the formation of α-CsPbI3 QD films that are phase-stable for months in ambient air. The films exhibit long-range electronic transport and were used to fabricate colloidal perovskite QD photovoltaic cells with an open-circuit voltage of 1.23 volts and efficiency of 10.77%. These devices also function as light-emitting diodes with low turn-on voltage and tunable emission.

2,103 citations


Proceedings ArticleDOI
13 Jul 2018
TL;DR: A novel deep learning framework, Spatio-Temporal Graph Convolutional Networks (STGCN), is proposed to tackle the time-series prediction problem in the traffic domain.
Abstract: Timely and accurate traffic forecasting is crucial for urban traffic control and guidance. Due to the high nonlinearity and complexity of traffic flow, traditional methods cannot satisfy the requirements of mid- and long-term prediction tasks and often neglect spatial and temporal dependencies. In this paper, we propose a novel deep learning framework, Spatio-Temporal Graph Convolutional Networks (STGCN), to tackle the time-series prediction problem in the traffic domain. Instead of applying regular convolutional and recurrent units, we formulate the problem on graphs and build the model with complete convolutional structures, which enables much faster training with fewer parameters. Experiments show that our model STGCN effectively captures comprehensive spatio-temporal correlations through modeling multi-scale traffic networks and consistently outperforms state-of-the-art baselines on various real-world traffic datasets.
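The STGCN idea of alternating temporal and graph convolutions can be sketched in a few lines of NumPy. The block below is an illustrative toy, not the authors' implementation: the function names, the width-2 temporal kernel, and the single spatial weight matrix are all assumptions.

```python
import numpy as np

def normalized_adjacency(A):
    """Symmetric normalization with self-loops, D^{-1/2}(A+I)D^{-1/2},
    the standard propagation matrix for graph convolutions."""
    A_hat = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    return A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def st_block(X, A, Wt1, Wt2, Wg):
    """One toy spatio-temporal block on a series X of shape (T, N, C):
    T time steps, N sensor nodes, C channels per node.
    Temporal step: width-2 convolution along time, with ReLU.
    Spatial step: graph convolution A_norm @ H @ Wg at each time step."""
    A_norm = normalized_adjacency(A)
    H = np.maximum(X[:-1] @ Wt1 + X[1:] @ Wt2, 0.0)   # (T-1, N, C1)
    return np.einsum("ij,tjc->tic", A_norm, H @ Wg)    # (T-1, N, C2)
```

Each block shortens the time axis by one (no padding) and mixes information across neighboring sensors via the normalized adjacency, which is the sense in which the model is "complete convolutional" rather than recurrent.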

2,103 citations


Journal ArticleDOI
09 Jul 2015-Nature
TL;DR: GPC1+ crExos may serve as a potential non-invasive diagnostic and screening tool to detect early stages of pancreatic cancer to facilitate possible curative surgical therapy.
Abstract: Exosomes are lipid-bilayer-enclosed extracellular vesicles that contain proteins and nucleic acids. They are secreted by all cells and circulate in the blood. Specific detection and isolation of cancer-cell-derived exosomes in the circulation is currently lacking. Using mass spectrometry analyses, we identify a cell surface proteoglycan, glypican-1 (GPC1), specifically enriched on cancer-cell-derived exosomes. GPC1(+) circulating exosomes (crExos) were monitored and isolated using flow cytometry from the serum of patients and mice with cancer. GPC1(+) crExos were detected in the serum of patients with pancreatic cancer with absolute specificity and sensitivity, distinguishing healthy subjects and patients with a benign pancreatic disease from patients with early- and late-stage pancreatic cancer. Levels of GPC1(+) crExos correlate with tumour burden and the survival of pre- and post-surgical patients. GPC1(+) crExos from patients and from mice with spontaneous pancreatic tumours carry specific KRAS mutations, and reliably detect pancreatic intraepithelial lesions in mice despite negative signals by magnetic resonance imaging. GPC1(+) crExos may serve as a potential non-invasive diagnostic and screening tool to detect early stages of pancreatic cancer to facilitate possible curative surgical therapy.

2,102 citations


Journal ArticleDOI
TL;DR: Given the important role of oxidative stress in the pathogenesis of many clinical conditions and aging, antioxidant therapy could positively affect the natural history of several diseases, but further investigation is needed to evaluate the real efficacy of these therapeutic interventions.
Abstract: Reactive oxygen and nitrogen species (RONS) are produced by several endogenous and exogenous processes, and their negative effects are neutralized by antioxidant defenses. Oxidative stress results from an imbalance between RONS production and these antioxidant defenses. Aging is a process characterized by the progressive loss of tissue and organ function. The oxidative stress theory of aging is based on the hypothesis that age-associated functional losses are due to the accumulation of RONS-induced damage. At the same time, oxidative stress is involved in several age-related conditions (i.e., cardiovascular diseases [CVDs], chronic obstructive pulmonary disease, chronic kidney disease, neurodegenerative diseases, and cancer), including sarcopenia and frailty. Different types of oxidative stress biomarkers have been identified and may provide important information about the efficacy of a treatment, guiding the selection of the most effective drugs/dose regimens for patients and, if particularly relevant from a pathophysiological point of view, acting on a specific therapeutic target. Given the important role of oxidative stress in the pathogenesis of many clinical conditions and aging, antioxidant therapy could positively affect the natural history of several diseases, but further investigation is needed to evaluate the real efficacy of these therapeutic interventions. The purpose of this paper is to provide a review of the literature on this complex topic of ever-increasing interest.

2,101 citations


Journal ArticleDOI
21 Jun 2016-JAMA
TL;DR: It is concluded with high certainty that screening for colorectal cancer in average-risk, asymptomatic adults aged 50 to 75 years is of substantial net benefit.
Abstract: Importance Colorectal cancer is the second leading cause of cancer death in the United States. In 2016, an estimated 134 000 persons will be diagnosed with the disease, and about 49 000 will die from it. Colorectal cancer is most frequently diagnosed among adults aged 65 to 74 years; the median age at death from colorectal cancer is 73 years. Objective To update the 2008 US Preventive Services Task Force (USPSTF) recommendation on screening for colorectal cancer. Evidence Review The USPSTF reviewed the evidence on the effectiveness of screening with colonoscopy, flexible sigmoidoscopy, computed tomography colonography, the guaiac-based fecal occult blood test, the fecal immunochemical test, the multitargeted stool DNA test, and the methylated SEPT9 DNA test in reducing the incidence of and mortality from colorectal cancer or all-cause mortality; the harms of these screening tests; and the test performance characteristics of these tests for detecting adenomatous polyps, advanced adenomas based on size, or both, as well as colorectal cancer. The USPSTF also commissioned a comparative modeling study to provide information on optimal starting and stopping ages and screening intervals across the different available screening methods. Findings The USPSTF concludes with high certainty that screening for colorectal cancer in average-risk, asymptomatic adults aged 50 to 75 years is of substantial net benefit. Multiple screening strategies are available to choose from, with different levels of evidence to support their effectiveness, as well as unique advantages and limitations, although there are no empirical data to demonstrate that any of the reviewed strategies provide a greater net benefit. Screening for colorectal cancer is a substantially underused preventive health strategy in the United States. Conclusions and Recommendations The USPSTF recommends screening for colorectal cancer starting at age 50 years and continuing until age 75 years (A recommendation). 
The decision to screen for colorectal cancer in adults aged 76 to 85 years should be an individual one, taking into account the patient’s overall health and prior screening history (C recommendation).

2,100 citations


Proceedings ArticleDOI
21 Jul 2017
TL;DR: The ChestX-ray8 dataset contains 108,948 frontal-view X-ray images of 32,717 unique patients, with eight disease image labels text-mined from the associated radiological reports using natural language processing.
Abstract: The chest X-ray is one of the most commonly accessible radiological examinations for screening and diagnosis of many lung diseases. A tremendous number of X-ray imaging studies accompanied by radiological reports are accumulated and stored in many modern hospitals' Picture Archiving and Communication Systems (PACS). On the other hand, it is still an open question how this type of hospital-scale knowledge database containing invaluable imaging informatics (i.e., loosely labeled data) can be used to facilitate data-hungry deep learning paradigms in building truly large-scale, high-precision computer-aided diagnosis (CAD) systems. In this paper, we present a new chest X-ray database, namely ChestX-ray8, which comprises 108,948 frontal-view X-ray images of 32,717 unique patients with eight disease image labels (where each image can have multiple labels) text-mined from the associated radiological reports using natural language processing. Importantly, we demonstrate that these commonly occurring thoracic diseases can be detected and even spatially located via a unified weakly-supervised multi-label image classification and disease localization framework, which is validated using our proposed dataset. Although the initial quantitative results are promising, reading chest X-rays with deep convolutional neural networks (i.e., recognizing and locating common disease patterns trained with only image-level labels) remains a strenuous task for fully-automated high-precision CAD systems.
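The "each image can have multiple labels" setup amounts to multi-hot target vectors over a fixed disease vocabulary. A minimal sketch (the helper name and the three-label example vocabulary are illustrative, not taken from the paper's code):

```python
def multi_hot(image_labels, vocab):
    """Encode the text-mined disease labels for one image as a multi-hot
    vector over a fixed label vocabulary. Labels outside the vocabulary
    (e.g. a 'No Finding' tag) map to the all-zero vector."""
    index = {disease: i for i, disease in enumerate(vocab)}
    v = [0] * len(vocab)
    for label in image_labels:
        if label in index:
            v[index[label]] = 1
    return v
```

Vectors built this way are the targets for a per-class sigmoid/binary cross-entropy classifier, which is how multi-label chest X-ray classification is usually framed.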

2,100 citations


Journal ArticleDOI
TL;DR: A survey of 40 research efforts that employ deep learning techniques, applied to various agricultural and food production challenges indicates that deep learning provides high accuracy, outperforming existing commonly used image processing techniques.

2,100 citations


Journal ArticleDOI
TL;DR: This work generates primary data, creates bioinformatics tools and provides analysis to support the work of expert manual gene annotators and automated gene annotation pipelines to identify and characterise gene loci to the highest standard.
Abstract: The accurate identification and description of the genes in the human and mouse genomes is a fundamental requirement for high quality analysis of data informing both genome biology and clinical genomics. Over the last 15 years, the GENCODE consortium has been producing reference quality gene annotations to provide this foundational resource. The GENCODE consortium includes both experimental and computational biology groups who work together to improve and extend the GENCODE gene annotation. Specifically, we generate primary data, create bioinformatics tools and provide analysis to support the work of expert manual gene annotators and automated gene annotation pipelines. In addition, manual and computational annotation workflows use any and all publicly available data and analysis, along with the research literature to identify and characterise gene loci to the highest standard. GENCODE gene annotations are accessible via the Ensembl and UCSC Genome Browsers, the Ensembl FTP site, Ensembl Biomart, Ensembl Perl and REST APIs as well as https://www.gencodegenes.org.

2,095 citations


Journal ArticleDOI
TL;DR: TAVR was a noninferior alternative to surgery in patients with severe aortic stenosis at intermediate surgical risk, with a different pattern of adverse events associated with each procedure.
Abstract: Background Although transcatheter aortic-valve replacement (TAVR) is an accepted alternative to surgery in patients with severe aortic stenosis who are at high surgical risk, less is known about comparative outcomes among patients with aortic stenosis who are at intermediate surgical risk. Methods We evaluated the clinical outcomes in intermediate-risk patients with severe, symptomatic aortic stenosis in a randomized trial comparing TAVR (performed with the use of a self-expanding prosthesis) with surgical aortic-valve replacement. The primary end point was a composite of death from any cause or disabling stroke at 24 months in patients undergoing attempted aortic-valve replacement. We used Bayesian analytical methods (with a margin of 0.07) to evaluate the noninferiority of TAVR as compared with surgical valve replacement. Results A total of 1746 patients underwent randomization at 87 centers. Of these patients, 1660 underwent an attempted TAVR or surgical procedure. The mean (±SD) age of the patients was 7...

2,095 citations


Journal ArticleDOI
TL;DR: The challenges of using deep learning for remote-sensing data analysis are analyzed, recent advances are reviewed, and resources are provided that the authors hope will make deep learning in remote sensing seem ridiculously simple.
Abstract: Central to the looming paradigm shift toward data-intensive science, machine-learning techniques are becoming increasingly important. In particular, deep learning has proven to be both a major breakthrough and an extremely powerful tool in many fields. Shall we embrace deep learning as the key to everything? Or should we resist a black-box solution? These are controversial issues within the remote-sensing community. In this article, we analyze the challenges of using deep learning for remote-sensing data analysis, review recent advances, and provide resources we hope will make deep learning in remote sensing seem ridiculously simple. More importantly, we encourage remote-sensing scientists to bring their expertise into deep learning and use it as an implicit general model to tackle unprecedented, large-scale, influential challenges, such as climate change and urbanization.

Journal ArticleDOI
TL;DR: This review critically summarize the main challenges linked to lifelong learning for artificial learning systems and compare existing neural network approaches that alleviate, to different extents, catastrophic forgetting.

Book ChapterDOI
08 Oct 2016
TL;DR: A compact hourglass-shape CNN structure is proposed for faster and better image super-resolution, which can achieve real-time performance on a generic CPU while still maintaining good restoration quality.
Abstract: As a successful deep model applied in image super-resolution (SR), the Super-Resolution Convolutional Neural Network (SRCNN) [1, 2] has demonstrated superior performance to the previous hand-crafted models both in speed and restoration quality. However, the high computational cost still hinders it from practical usage that demands real-time performance (24 fps). In this paper, we aim at accelerating the current SRCNN, and propose a compact hourglass-shape CNN structure for faster and better SR. We re-design the SRCNN structure mainly in three aspects. First, we introduce a deconvolution layer at the end of the network, so the mapping is learned directly from the original low-resolution image (without interpolation) to the high-resolution one. Second, we reformulate the mapping layer by shrinking the input feature dimension before mapping and expanding back afterwards. Third, we adopt smaller filter sizes but more mapping layers. The proposed model achieves a speed-up of more than 40 times with even superior restoration quality. Further, we present the parameter settings that can achieve real-time performance on a generic CPU while still maintaining good performance. A corresponding transfer strategy is also proposed for fast training and testing across different upscaling factors.
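The parameter savings from the shrink-map-expand redesign can be made concrete by counting weights. The sketch below compares an hourglass channel schedule against a three-layer 9-5-5 baseline in the style of SRCNN; the channel counts and kernel sizes are illustrative assumptions, not necessarily the paper's exact configuration.

```python
def conv_params(c_in, c_out, k):
    """Number of parameters in a k x k convolution with bias."""
    return c_in * c_out * k * k + c_out

def hourglass_params(d=56, s=12, m=4, k_map=3):
    """Approximate parameter count for an hourglass SR network:
    feature extraction -> shrink -> m narrow mapping layers -> expand
    -> deconvolution. Shrinking to s channels means the mapping layers
    cost ~m*s^2*k^2 instead of ~m*d^2*k^2 weights."""
    total = conv_params(1, d, 5)           # feature extraction (luminance in)
    total += conv_params(d, s, 1)          # shrink: 1x1 conv reduces channels
    total += m * conv_params(s, s, k_map)  # narrow mapping layers
    total += conv_params(s, d, 1)          # expand back to d channels
    total += conv_params(d, 1, 9)          # deconvolution to high resolution
    return total
```

With these illustrative settings the hourglass network needs roughly a fifth of the weights of the 9-5-5 baseline, which is where the claimed speed-up (together with skipping input interpolation) comes from.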

Journal ArticleDOI
TL;DR: The incidence of community-acquired pneumonia requiring hospitalization was highest among the oldest adults and despite current diagnostic tests, no pathogen was detected in the majority of patients.
Abstract: Background Incidence estimates of hospitalizations for community-acquired pneumonia among children in the United States that are based on prospective data collection are limited. Updated estimates of pneumonia that has been confirmed radiographically and with the use of current laboratory diagnostic tests are needed. Methods We conducted active population-based surveillance for community-acquired pneumonia requiring hospitalization among children younger than 18 years of age in three hospitals in Memphis, Nashville, and Salt Lake City. We excluded children with recent hospitalization or severe immunosuppression. Blood and respiratory specimens were systematically collected for pathogen detection with the use of multiple methods. Chest radiographs were reviewed independently by study radiologists. Results From January 2010 through June 2012, we enrolled 2638 of 3803 eligible children (69%), 2358 of whom (89%) had radiographic evidence of pneumonia. The median age of the children was 2 years (interquartile ...

Posted Content
TL;DR: In this article, the problem of hallucinating a plausible color version of the photograph is addressed by posing it as a classification task and using class-rebalancing at training time to increase the diversity of colors in the result.
Abstract: Given a grayscale photograph as input, this paper attacks the problem of hallucinating a plausible color version of the photograph. This problem is clearly underconstrained, so previous approaches have either relied on significant user interaction or resulted in desaturated colorizations. We propose a fully automatic approach that produces vibrant and realistic colorizations. We embrace the underlying uncertainty of the problem by posing it as a classification task and use class-rebalancing at training time to increase the diversity of colors in the result. The system is implemented as a feed-forward pass in a CNN at test time and is trained on over a million color images. We evaluate our algorithm using a "colorization Turing test," asking human participants to choose between a generated and ground truth color image. Our method successfully fools humans on 32% of the trials, significantly higher than previous methods. Moreover, we show that colorization can be a powerful pretext task for self-supervised feature learning, acting as a cross-channel encoder. This approach results in state-of-the-art performance on several feature learning benchmarks.
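Class-rebalancing of this kind is typically implemented as per-class loss weights derived from the empirical color-class distribution. The sketch below is one plausible formulation; the blending parameter and the normalization convention are assumptions, not necessarily the paper's exact scheme.

```python
import numpy as np

def rebalancing_weights(class_freq, lam=0.5):
    """Per-class loss weights that boost rare classes: blend the empirical
    class distribution with a uniform one, take the reciprocal, and
    normalize so the expected weight under the empirical distribution is 1.
    lam=0 weights classes strictly by inverse frequency; lam=1 gives
    uniform (no rebalancing)."""
    p = class_freq / class_freq.sum()
    q = p.shape[0]
    w = 1.0 / ((1.0 - lam) * p + lam / q)
    return w / (p * w).sum()   # enforce E_p[w] = 1
```

Multiplying each pixel's cross-entropy term by the weight of its target color class pushes the model away from always predicting the desaturated, high-frequency colors.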

Journal ArticleDOI
TL;DR: The chimeric antigen receptor (CAR) T-cell therapy tisagenlecleucel targets and eliminates CD19-expressing B cells and showed efficacy against B-cell lymphomas in a single-center, phase 2a study.
Abstract: Background Patients with diffuse large B-cell lymphoma that is refractory to primary and second-line therapies or that has relapsed after stem-cell transplantation have a poor prognosis. The chimeric antigen receptor (CAR) T-cell therapy tisagenlecleucel targets and eliminates CD19-expressing B cells and showed efficacy against B-cell lymphomas in a single-center, phase 2a study. Methods We conducted an international, phase 2, pivotal study of centrally manufactured tisagenlecleucel involving adult patients with relapsed or refractory diffuse large B-cell lymphoma who were ineligible for or had disease progression after autologous hematopoietic stem-cell transplantation. The primary end point was the best overall response rate (i.e., the percentage of patients who had a complete or partial response), as judged by an independent review committee. Results A total of 93 patients received an infusion and were included in the evaluation of efficacy. The median time from infusion to data cutoff was 14 ...

Journal ArticleDOI
TL;DR: With a longer time after the onset of symptoms, CT findings were more frequent, including consolidation, bilateral and peripheral disease, greater total lung involvement, linear opacities, “crazy-paving” pattern and the “reverse halo” sign.
Abstract: In this retrospective study, chest CTs of 121 symptomatic patients infected with coronavirus disease-19 (COVID-19) from four centers in China from January 18, 2020 to February 2, 2020 were reviewed for common CT findings in relationship to the time between symptom onset and the initial CT scan (i.e. early, 0-2 days (36 patients), intermediate 3-5 days (33 patients), late 6-12 days (25 patients)). The hallmarks of COVID-19 infection on imaging were bilateral and peripheral ground-glass and consolidative pulmonary opacities. Notably, 20/36 (56%) of early patients had a normal CT. With a longer time after the onset of symptoms, CT findings were more frequent, including consolidation, bilateral and peripheral disease, greater total lung involvement, linear opacities, "crazy-paving" pattern and the "reverse halo" sign. Bilateral lung involvement was observed in 10/36 early patients (28%), 25/33 intermediate patients (76%), and 22/25 late patients (88%).

Journal ArticleDOI
TL;DR: antiSMASH 5 adds detection rules for clusters encoding the biosynthesis of acyl-amino acids, β-lactones, fungal RiPPs, RaS-RiPPs, polybrominated diphenyl ethers, C-nucleosides, PPY-like ketones and lipolanthines, and provides more detailed predictions for type II polyketide synthase-encoding gene clusters.
Abstract: Secondary metabolites produced by bacteria and fungi are an important source of antimicrobials and other bioactive compounds. In recent years, genome mining has seen broad applications in identifying and characterizing new compounds as well as in metabolic engineering. Since 2011, the 'antibiotics and secondary metabolite analysis shell-antiSMASH' (https://antismash.secondarymetabolites.org) has assisted researchers in this, both as a web server and a standalone tool. It has established itself as the most widely used tool for identifying and analysing biosynthetic gene clusters (BGCs) in bacterial and fungal genome sequences. Here, we present an entirely redesigned and extended version 5 of antiSMASH. antiSMASH 5 adds detection rules for clusters encoding the biosynthesis of acyl-amino acids, β-lactones, fungal RiPPs, RaS-RiPPs, polybrominated diphenyl ethers, C-nucleosides, PPY-like ketones and lipolanthines. For type II polyketide synthase-encoding gene clusters, antiSMASH 5 now offers more detailed predictions. The HTML output visualization has been redesigned to improve the navigation and visual representation of annotations. We have again improved the runtime of analysis steps, making it possible to deliver comprehensive annotations for bacterial genomes within a few minutes. A new output file in the standard JavaScript object notation (JSON) format is aimed at downstream tools that process antiSMASH results programmatically.

Journal ArticleDOI
Catherine O. Johnson, Minh Nguyen, Gregory A. Roth, Emma Nichols, and 269 more
TL;DR: The results presented here are the estimates of burden due to overall stroke and ischaemic and haemorrhagic stroke from GBD 2016, indicating that the burden of stroke is likely to remain high.
Abstract: Summary Background Stroke is a leading cause of mortality and disability worldwide and the economic costs of treatment and post-stroke care are substantial. The Global Burden of Diseases, Injuries, and Risk Factors Study (GBD) provides a systematic, comparable method of quantifying health loss by disease, age, sex, year, and location to provide information to health systems and policy makers on more than 300 causes of disease and injury, including stroke. The results presented here are the estimates of burden due to overall stroke and ischaemic and haemorrhagic stroke from GBD 2016. Methods We report estimates and corresponding uncertainty intervals (UIs), from 1990 to 2016, for incidence, prevalence, deaths, years of life lost (YLLs), years lived with disability (YLDs), and disability-adjusted life-years (DALYs). DALYs were generated by summing YLLs and YLDs. Cause-specific mortality was estimated using an ensemble modelling process with vital registration and verbal autopsy data as inputs. Non-fatal estimates were generated using Bayesian meta-regression incorporating data from registries, scientific literature, administrative records, and surveys. The Socio-demographic Index (SDI), a summary indicator generated using educational attainment, lagged distributed income, and total fertility rate, was used to group countries into quintiles. Findings In 2016, there were 5·5 million (95% UI 5·3 to 5·7) deaths and 116·4 million (111·4 to 121·4) DALYs due to stroke. The global age-standardised mortality rate decreased by 36·2% (−39·3 to −33·6) from 1990 to 2016, with decreases in all SDI quintiles. Over the same period, the global age-standardised DALY rate declined by 34·2% (−37·2 to −31·5), also with decreases in all SDI quintiles. There were 13·7 million (12·7 to 14·7) new stroke cases in 2016. Global age-standardised incidence declined by 8·1% (−10·7 to −5·5) from 1990 to 2016 and decreased in all SDI quintiles except the middle SDI group. 
There were 80·1 million (74·1 to 86·3) prevalent cases of stroke globally in 2016; 41·1 million (38·0 to 44·3) in women and 39·0 million (36·1 to 42·1) in men. Interpretation Although age-standardised mortality rates have decreased sharply from 1990 to 2016, the decrease in age-standardised incidence has been less steep, indicating that the burden of stroke is likely to remain high. Planned updates to future GBD iterations include generating separate estimates for subarachnoid haemorrhage and intracerebral haemorrhage, generating estimates of transient ischaemic attack, and including atrial fibrillation as a risk factor. Funding Bill & Melinda Gates Foundation
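The summary measures combine exactly as stated in the methods: DALYs are the sum of YLLs and YLDs, and trends are reported as percentage changes in age-standardised rates. A trivial sketch of that arithmetic (the numbers in the test are illustrative, not GBD estimates):

```python
def dalys(ylls, ylds):
    """Disability-adjusted life-years per cause: years of life lost
    plus years lived with disability, per the GBD definition."""
    return [yll + yld for yll, yld in zip(ylls, ylds)]

def percent_change(old_rate, new_rate):
    """Percentage change in an age-standardised rate between two years."""
    return 100.0 * (new_rate - old_rate) / old_rate
```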

Journal ArticleDOI
TL;DR: A new evidence-based, pharmacologic treatment guideline for rheumatoid arthritis (RA) is developed, covering traditional DMARDs, biologic agents, tofacitinib, and glucocorticoids in early and established RA.
Abstract: Objective To develop a new evidence-based, pharmacologic treatment guideline for rheumatoid arthritis (RA). Methods We conducted systematic reviews to synthesize the evidence for the benefits and harms of various treatment options. We used the Grading of Recommendations Assessment, Development and Evaluation (GRADE) methodology to rate the quality of evidence. We employed a group consensus process to grade the strength of recommendations (either strong or conditional). A strong recommendation indicates that clinicians are certain that the benefits of an intervention far outweigh the harms (or vice versa). A conditional recommendation denotes uncertainty over the balance of benefits and harms and/or more significant variability in patient values and preferences. Results The guideline covers the use of traditional disease-modifying antirheumatic drugs (DMARDs), biologic agents, tofacitinib, and glucocorticoids in early (<6 months) and established (≥6 months) RA. In addition, it provides recommendations on using a treat-to-target approach, tapering and discontinuing medications, and the use of biologic agents and DMARDs in patients with hepatitis, congestive heart failure, malignancy, and serious infections. The guideline addresses the use of vaccines in patients starting/receiving DMARDs or biologic agents, screening for tuberculosis in patients starting/receiving biologic agents or tofacitinib, and laboratory monitoring for traditional DMARDs. The guideline includes 74 recommendations: 23% are strong and 77% are conditional. Conclusion This RA guideline should serve as a tool for clinicians and patients (our two target audiences) for pharmacologic treatment decisions in commonly encountered clinical situations. These recommendations are not prescriptive, and the treatment decisions should be made by physicians and patients through a shared decision-making process taking into account patients’ values, preferences, and comorbidities. 
These recommendations should not be used to limit or deny access to therapies.

Journal ArticleDOI
TL;DR: The large majority of studies on the role of the microbiome in the pathogenesis of disease are correlative and preclinical; several have influenced clinical practice.
Abstract: The large majority of studies on the role of the microbiome in the pathogenesis of disease are correlative and preclinical; several have influenced clinical practice.

Proceedings ArticleDOI
21 Jul 2017
TL;DR: The surprising existence of universal perturbations reveals important geometric correlations among the high-dimensional decision boundary of classifiers and outlines potential security breaches with the existence of single directions in the input space that adversaries can possibly exploit to break a classifier on most natural images.
Abstract: Given a state-of-the-art deep neural network classifier, we show the existence of a universal (image-agnostic) and very small perturbation vector that causes natural images to be misclassified with high probability. We propose a systematic algorithm for computing universal perturbations, and show that state-of-the-art deep neural networks are highly vulnerable to such perturbations, albeit being quasi-imperceptible to the human eye. We further empirically analyze these universal perturbations and show, in particular, that they generalize very well across neural networks. The surprising existence of universal perturbations reveals important geometric correlations among the high-dimensional decision boundary of classifiers. It further outlines potential security breaches with the existence of single directions in the input space that adversaries can possibly exploit to break a classifier on most natural images.
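The iterative scheme described — aggregate a minimal per-image perturbation whenever an image is still correctly classified, then project the running perturbation onto a norm ball — can be illustrated on a toy *linear* classifier, where the minimal boundary-crossing step has a closed form. Everything below (the radius, the small overshoot factor, the data in the usage) is an illustrative assumption, not the paper's algorithm for deep networks.

```python
import numpy as np

def universal_perturbation(X, w, b, xi=5.0, max_iter=10):
    """Toy universal perturbation for the linear classifier sign(w.x + b).
    For each point whose label is unchanged by the current v, take the
    minimal L2 step that flips a linear classifier (with a 2% overshoot
    so the point lands strictly past the boundary), add it to v, and
    clip v back onto the ball ||v||_2 <= xi."""
    v = np.zeros(X.shape[1])
    for _ in range(max_iter):
        changed = False
        for x in X:
            if np.sign(w @ (x + v) + b) == np.sign(w @ x + b):
                r = -1.02 * ((w @ (x + v) + b) / (w @ w)) * w
                v = v + r
                n = np.linalg.norm(v)
                if n > xi:          # projection onto the L2 ball
                    v = v * (xi / n)
                changed = True
        if not changed:             # every point already fooled
            break
    return v
```

On data lying on one side of a linear boundary, a single such vector flips every point, mirroring the image-agnostic property the abstract describes.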

Journal ArticleDOI
TL;DR: The ImageJ project is used as a case study of how open‐source software fosters its suites of software tools, making multitudes of image‐analysis technology easily accessible to the scientific community.
Abstract: Technology in microscopy advances rapidly, enabling increasingly affordable, faster, and more precise quantitative biomedical imaging, which necessitates correspondingly more-advanced image processing and analysis techniques. A wide range of software is available-from commercial to academic, special-purpose to Swiss army knife, small to large-but a key characteristic of software that is suitable for scientific inquiry is its accessibility. Open-source software is ideal for scientific endeavors because it can be freely inspected, modified, and redistributed; in particular, the open-software platform ImageJ has had a huge impact on the life sciences, and continues to do so. From its inception, ImageJ has grown significantly due largely to being freely available and its vibrant and helpful user community. Scientists as diverse as interested hobbyists, technical assistants, students, scientific staff, and advanced biology researchers use ImageJ on a daily basis, and exchange knowledge via its dedicated mailing list. Uses of ImageJ range from data visualization and teaching to advanced image processing and statistical analysis. The software's extensibility continues to attract biologists at all career stages as well as computer scientists who wish to effectively implement specific image-processing algorithms. In this review, we use the ImageJ project as a case study of how open-source software fosters its suites of software tools, making multitudes of image-analysis technology easily accessible to the scientific community. We specifically explore what makes ImageJ so popular, how it impacts the life sciences, how it inspires other projects, and how it is self-influenced by coevolving projects within the ImageJ ecosystem.

Journal ArticleDOI
TL;DR: Guidelines summarize and evaluate available evidence with the aim of assisting health professionals in proposing the best management strategies for an individual patient with a given condition.
Abstract: Guidelines summarize and evaluate available evidence with the aim of assisting health professionals in proposing the best management strategies for an individual patient with a given condition. Guidelines and their recommendations should facilitate decision making of health professionals in their daily practice. However, the final decisions concerning an individual patient must be made by the responsible health professional(s) in consultation with the patient and caregiver as appropriate.

Proceedings ArticleDOI
20 Mar 2017
TL;DR: This paper explores domain randomization, a simple technique for training models on simulated images that transfer to real images by randomizing rendering in the simulator, and achieves the first successful transfer of a deep neural network trained only on simulated RGB images to the real world for the purpose of robotic control.
Abstract: Bridging the ‘reality gap’ that separates simulated robotics from experiments on hardware could accelerate robotic research through improved data availability. This paper explores domain randomization, a simple technique for training models on simulated images that transfer to real images by randomizing rendering in the simulator. With enough variability in the simulator, the real world may appear to the model as just another variation. We focus on the task of object localization, which is a stepping stone to general robotic manipulation skills. We find that it is possible to train a real-world object detector that is accurate to 1.5 cm and robust to distractors and partial occlusions using only data from a simulator with non-realistic random textures. To demonstrate the capabilities of our detectors, we show they can be used to perform grasping in a cluttered environment. To our knowledge, this is the first successful transfer of a deep neural network trained only on simulated RGB images (without pre-training on real images) to the real world for the purpose of robotic control.
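The core of domain randomization is sampling a fresh rendering configuration per training episode. The sketch below is a minimal illustration in that spirit; the parameter names, texture choices, and ranges are hypothetical, not the paper's actual configuration.

```python
import random

# Hypothetical randomization ranges: textures, lighting, camera pose, and
# distractor count vary every episode, so the simulated scene never looks
# the same twice.
TEXTURES = ["checker", "perlin_noise", "stripes", "flat_color"]

def sample_scene(rng):
    """Draw one randomized rendering configuration for the simulator."""
    return {
        "object_texture": rng.choice(TEXTURES),
        "table_texture": rng.choice(TEXTURES),
        "num_distractors": rng.randint(0, 10),     # clutter objects in view
        "light_intensity": rng.uniform(0.2, 2.0),  # arbitrary units
        "camera_jitter_cm": [rng.uniform(-5.0, 5.0) for _ in range(3)],
        "fov_deg": rng.uniform(35.0, 55.0),
    }

rng = random.Random(0)
scenes = [sample_scene(rng) for _ in range(1000)]
# A detector trained on renders of `scenes` would, with enough variability,
# treat a real image as "just another variation".
```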

Journal ArticleDOI
17 Jul 2019
TL;DR: AmoebaNet-A modifies the tournament-selection evolutionary algorithm by introducing an age property that favors younger genotypes, achieving state-of-the-art performance.
Abstract: The effort devoted to hand-crafting neural network image classifiers has motivated the use of architecture search to discover them automatically. Although evolutionary algorithms have been repeatedly applied to neural network topologies, the image classifiers thus discovered have remained inferior to human-crafted ones. Here, we evolve an image classifier— AmoebaNet-A—that surpasses hand-designs for the first time. To do this, we modify the tournament selection evolutionary algorithm by introducing an age property to favor the younger genotypes. Matching size, AmoebaNet-A has comparable accuracy to current state-of-the-art ImageNet models discovered with more complex architecture-search methods. Scaled to larger size, AmoebaNet-A sets a new state-of-the-art 83.9% top-1 / 96.6% top-5 ImageNet accuracy. In a controlled comparison against a well known reinforcement learning algorithm, we give evidence that evolution can obtain results faster with the same hardware, especially at the earlier stages of the search. This is relevant when fewer compute resources are available. Evolution is, thus, a simple method to effectively discover high-quality architectures.
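The paper's modification, often called regularized evolution, is small but concrete: in each cycle a tournament picks the fittest parent from a random sample, and the *oldest* individual (not the worst) is removed, which favors younger genotypes. The sketch below shows that loop on a toy search space; the bit-string "architecture" and ones-counting "fitness" are stand-ins for real architectures and validation accuracy.

```python
import collections
import random

rng = random.Random(0)

# Toy search space: an "architecture" is a bit-string; fitness counts ones
# (a stand-in for validation accuracy after training the architecture).
GENOME_LEN = 16
fitness = lambda g: sum(g)

def mutate(g):
    child = list(g)
    i = rng.randrange(len(child))
    child[i] ^= 1  # flip one "architectural choice"
    return child

def regularized_evolution(pop_size=20, sample_size=5, cycles=300):
    population = collections.deque()  # ordered oldest -> youngest
    for _ in range(pop_size):
        g = [rng.randint(0, 1) for _ in range(GENOME_LEN)]
        population.append((g, fitness(g)))
    best = max(population, key=lambda p: p[1])
    for _ in range(cycles):
        # Tournament selection: sample S individuals, breed from the fittest.
        sample = rng.sample(list(population), sample_size)
        parent = max(sample, key=lambda p: p[1])
        child = mutate(parent[0])
        population.append((child, fitness(child)))
        population.popleft()  # aging: remove the OLDEST, not the worst
        best = max(best, population[-1], key=lambda p: p[1])
    return best

best_genome, best_fit = regularized_evolution()
```

The `popleft()` line is the whole trick: because even a strong individual eventually dies of old age, lineages must keep re-discovering good traits through retraining-and-mutation, which regularizes the search.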

Journal ArticleDOI
TL;DR: An overview of Mercury 4.0, an analysis, design and prediction platform that acts as a hub for the entire Cambridge Structural Database software suite, is presented.
Abstract: The program Mercury, developed at the Cambridge Crystallographic Data Centre, was originally designed primarily as a crystal structure visualization tool. Over the years the fields and scientific communities of chemical crystallography and crystal engineering have developed to require more advanced structural analysis software. Mercury has evolved alongside these scientific communities and is now a powerful analysis, design and prediction platform which goes a lot further than simple structure visualization.

Journal ArticleDOI
TL;DR: The design of the hologram integrates a ground metal plane with a geometric metasurface that enhances the conversion efficiency between the two circular polarization states, leading to high diffraction efficiency without complicating the fabrication process.
Abstract: Using a metasurface comprising an array of nanorods with different orientations and a backreflector, a hologram image can be obtained in the visible and near-infrared with limited loss of light intensity.
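The "geometric metasurface" part of the design can be made concrete with a small numeric sketch: in a Pancharatnam-Berry (geometric-phase) element, a nanorod rotated by angle θ imparts a phase of 2θ on the converted circular-polarization component, so a target hologram phase profile maps directly to rod orientations. The 8×8 random target phase below is purely illustrative, not the paper's hologram.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical target phase profile of a hologram, one value per nanorod.
target_phase = rng.uniform(-np.pi, np.pi, size=(8, 8))

# Pancharatnam-Berry rule: imparted phase = 2 * rod orientation, so the
# required orientation is half the target phase (orientations only matter
# modulo pi, since a rod is symmetric under 180-degree rotation).
orientations = np.mod(target_phase, 2 * np.pi) / 2.0
realized_phase = 2.0 * orientations
```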

Journal ArticleDOI
TL;DR: Treatment with pembrolizumab plus axitinib resulted in significantly longer overall survival and progression‐free survival, as well as a higher objective response rate, than treatment with sunitinib among patients with previously untreated advanced renal‐cell carcinoma.
Abstract: Background The combination of pembrolizumab and axitinib showed antitumor activity in a phase 1b trial involving patients with previously untreated advanced renal-cell carcinoma. Whether pembrolizumab plus axitinib would result in better outcomes than sunitinib in such patients was unclear. Methods In an open-label, phase 3 trial, we randomly assigned 861 patients with previously untreated advanced clear-cell renal-cell carcinoma to receive pembrolizumab (200 mg) intravenously once every 3 weeks plus axitinib (5 mg) orally twice daily (432 patients) or sunitinib (50 mg) orally once daily for the first 4 weeks of each 6-week cycle (429 patients). The primary end points were overall survival and progression-free survival in the intention-to-treat population. The key secondary end point was the objective response rate. All reported results are from the protocol-specified first interim analysis. Results After a median follow-up of 12.8 months, the estimated percentage of patients who were alive at 12 months was 89.9% in the pembrolizumab-axitinib group and 78.3% in the sunitinib group (hazard ratio for death, 0.53; 95% confidence interval [CI], 0.38 to 0.74; P Conclusions Among patients with previously untreated advanced renal-cell carcinoma, treatment with pembrolizumab plus axitinib resulted in significantly longer overall survival and progression-free survival, as well as a higher objective response rate, than treatment with sunitinib. (Funded by Merck Sharp & Dohme; KEYNOTE-426 ClinicalTrials.gov number, NCT02853331.).

Journal ArticleDOI
TL;DR: In this article, the authors used the observed time delay of $(+1.74 \pm 0.05)\,$s between GRB 170817A and GW170817 to constrain the difference between the speed of gravity and the speed of light.
Abstract: On 2017 August 17, the gravitational-wave event GW170817 was observed by the Advanced LIGO and Virgo detectors, and the gamma-ray burst (GRB) GRB 170817A was observed independently by the Fermi Gamma-ray Burst Monitor, and the Anticoincidence Shield for the Spectrometer for the International Gamma-Ray Astrophysics Laboratory. The probability of the near-simultaneous temporal and spatial observation of GRB 170817A and GW170817 occurring by chance is $5.0\times 10^{-8}$. We therefore confirm binary neutron star mergers as a progenitor of short GRBs. The association of GW170817 and GRB 170817A provides new insight into fundamental physics and the origin of short gamma-ray bursts. We use the observed time delay of $(+1.74 \pm 0.05)\,$s between GRB 170817A and GW170817 to: (i) constrain the difference between the speed of gravity and the speed of light to be between $-3\times 10^{-15}$ and $+7\times 10^{-16}$ times the speed of light, (ii) place new bounds on the violation of Lorentz invariance, (iii) present a new test of the equivalence principle by constraining the Shapiro delay between gravitational and electromagnetic radiation. We also use the time delay to constrain the size and bulk Lorentz factor of the region emitting the gamma rays. GRB 170817A is the closest short GRB with a known distance, but is between 2 and 6 orders of magnitude less energetic than other bursts with measured redshift. A new generation of gamma-ray detectors, and subthreshold searches in existing detectors, will be essential to detect similar short bursts at greater distances. Finally, we predict a joint detection rate for the Fermi Gamma-ray Burst Monitor and the Advanced LIGO and Virgo detectors of 0.1--1.4 per year during the 2018-2019 observing run and 0.3--1.7 per year at design sensitivity.
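The headline speed-of-gravity bound follows from simple arithmetic, reproduced in the sketch below under the paper's stated conservative assumptions: a lower-bound source distance of 26 Mpc, and an intrinsic GRB emission delay between 0 and 10 s after the merger.

```python
# Fractional speed difference: dv/c ~ c * dt / D for dv much less than c.
C = 2.998e8              # speed of light, m/s
MPC = 3.0857e22          # one megaparsec, m
D = 26 * MPC             # conservative lower bound on the source distance

dt_observed = 1.74       # s, the GRB arrived this long after the GW signal
dt_intrinsic_max = 10.0  # s, assumed maximum delay of the GRB emission

# If photons and GWs left the source simultaneously, gravity was at most
# this much faster than light (fractionally):
upper = C * dt_observed / D
# If the GRB was emitted a full 10 s after the GWs, gravity was at most
# this much slower:
lower = -C * (dt_intrinsic_max - dt_observed) / D
```

Rounded, `upper` and `lower` reproduce the quoted interval of $-3\times 10^{-15}$ to $+7\times 10^{-16}$ times the speed of light.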