scispace - formally typeset

Journal ArticleDOI
TL;DR: Materials whose optical properties can be reconfigured are crucial for photonic applications such as optical memories. Phase-change materials offer such utility, and recent progress is reviewed here.
Abstract: Materials whose optical properties can be reconfigured are crucial for photonic applications such as optical memories. Phase-change materials offer such utility and here recent progress is reviewed. Phase-change materials (PCMs) provide a unique combination of properties. On transformation from the amorphous to crystalline state, their optical properties change drastically. Short optical or electrical pulses can be utilized to switch between these states, making PCMs attractive for photonic applications. We review recent developments in PCMs and evaluate the potential for all-photonic memories. Towards this goal, the progress and existing challenges to realize waveguides with stepwise adjustable transmission are presented. Colour-rendering and nanopixel displays form another interesting application. Finally, nanophotonic applications based on plasmonic nanostructures are introduced. They provide reconfigurable, non-volatile functionality enabling manipulation and control of light. Requirements and perspectives to successfully implement PCMs in emerging areas of photonics are discussed.

872 citations


Journal ArticleDOI
TL;DR: In this paper, the authors verify that the waveform of the signal from the merger of two stellar-mass black holes observed in the two Advanced LIGO detectors is consistent with the predictions of general relativity.
Abstract: On June 8, 2017 at 02:01:16.49 UTC, a gravitational-wave signal from the merger of two stellar-mass black holes was observed by the two Advanced LIGO detectors with a network signal-to-noise ratio of 13. This system is the lightest black hole binary so far observed, with component masses $12^{+7}_{-2}\,M_\odot$ and $7^{+2}_{-2}\,M_\odot$ (90% credible intervals). These lie in the range of measured black hole masses in low-mass X-ray binaries, thus allowing us to compare black holes detected through gravitational waves with electromagnetic observations. The source's luminosity distance is $340^{+140}_{-140}$ Mpc, corresponding to redshift $0.07^{+0.03}_{-0.03}$. We verify that the signal waveform is consistent with the predictions of general relativity.
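As a rough consistency check on the quoted numbers (not part of the paper's analysis), the low-redshift Hubble law z ≈ H0·d/c links the luminosity distance and the redshift; H0 = 70 km/s/Mpc is an assumed round value:

```python
# Low-redshift sanity check (assumed values, not the paper's method):
# the Hubble law z ~ H0 * d / c relates distance and redshift for nearby sources.
H0 = 70.0        # Hubble constant in km/s/Mpc (assumed round value)
c = 299_792.458  # speed of light in km/s
d = 340.0        # quoted luminosity distance in Mpc

z = H0 * d / c
print(round(z, 3))  # prints 0.079, consistent with the quoted 0.07 +/- 0.03
```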

872 citations


Journal ArticleDOI
TL;DR: The Aerial Image Dataset (AID), a large-scale dataset for aerial scene classification, is described to advance the state of the art in scene classification of remote sensing images; a performance analysis of typical approaches on AID is provided to serve as baseline results on this benchmark.
Abstract: Aerial scene classification, which aims to automatically label an aerial image with a specific semantic category, is a fundamental problem for understanding high-resolution remote sensing imagery. In recent years, it has become an active task in the remote sensing area, and numerous algorithms have been proposed for it, including many machine learning and data-driven approaches. However, the existing datasets for aerial scene classification, such as the UC-Merced dataset and WHU-RS19, are relatively small, and the results on them are already saturated. This largely limits the development of scene classification algorithms. This paper describes the Aerial Image Dataset (AID): a large-scale dataset for aerial scene classification. The goal of AID is to advance the state of the art in scene classification of remote sensing images. For creating AID, we collect and annotate more than ten thousand aerial scene images. In addition, a comprehensive review of existing aerial scene classification techniques as well as recent widely used deep learning methods is given. Finally, we provide a performance analysis of typical aerial scene classification and deep learning approaches on AID, which can serve as baseline results on this benchmark.

872 citations


Posted Content
TL;DR: This paper proposes convolutional neural network models for matching two sentences that require no prior knowledge of language and can hence be applied to matching tasks of different natures and in different languages; empirical studies demonstrate the efficacy of the proposed models and their superiority to competitor models.
Abstract: Semantic matching is of central importance to many natural language tasks \cite{bordes2014semantic,RetrievalQA}. A successful matching algorithm needs to adequately model the internal structures of language objects and the interaction between them. As a step toward this goal, we propose convolutional neural network models for matching two sentences, adapting the convolutional strategy from vision and speech. The proposed models not only nicely represent the hierarchical structures of sentences with their layer-by-layer composition and pooling, but also capture rich matching patterns at different levels. Our models are rather generic, requiring no prior knowledge of language, and can hence be applied to matching tasks of different natures and in different languages. The empirical study demonstrates the efficacy of the proposed models on a variety of matching tasks and their superiority to competitor models.

872 citations


Journal ArticleDOI
TL;DR: A range of mechanisms underpinning the resilience of ecosystem functions across three ecological scales is identified; biodiversity, encompassing variation from within species to across landscapes, may be crucial for the longer-term resilience of ecosystem functions and the services that they underpin.
Abstract: Accelerating rates of environmental change and the continued loss of global biodiversity threaten functions and services delivered by ecosystems. Much ecosystem monitoring and management is focused on the provision of ecosystem functions and services under current environmental conditions, yet this could lead to inappropriate management guidance and undervaluation of the importance of biodiversity. The maintenance of ecosystem functions and services under substantial predicted future environmental change (i.e., their ‘resilience’) is crucial. Here we identify a range of mechanisms underpinning the resilience of ecosystem functions across three ecological scales. Although potentially less important in the short term, biodiversity, encompassing variation from within species to across landscapes, may be crucial for the longer-term resilience of ecosystem functions and the services that they underpin.

871 citations


Journal ArticleDOI
TL;DR: In this article, a regression of the variable at date t on the four most recent values as of date t − h achieves all the objectives sought by users of the HP filter with none of its drawbacks.
Abstract: Here's why. (1) HP introduces spurious dynamic relations that have no basis in the underlying data-generating process. (2) Filtered values at the end of the sample are very different from those in the middle, and are also characterized by spurious dynamics. (3) A statistical formalization of the problem typically produces values for the smoothing parameter vastly at odds with common practice. (4) There's a better alternative. A regression of the variable at date t on the four most recent values as of date t − h achieves all the objectives sought by users of the HP filter with none of its drawbacks. JEL codes: C22, E32, E47
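The alternative described in point (4) can be sketched in a few lines of least squares; this is a minimal sketch assuming the paper's recommended horizon h = 8 and four lags for quarterly data (the function name is ours):

```python
import numpy as np

def hamilton_filter(y, h=8, p=4):
    """Regress y[t] on a constant and the p most recent values available
    at date t - h (i.e. y[t-h], ..., y[t-h-p+1]); the fitted values form
    the trend and the residuals form the cyclical component."""
    y = np.asarray(y, dtype=float)
    T = len(y)
    # Regressor matrix for targets t = h+p-1, ..., T-1
    X = np.column_stack(
        [np.ones(T - h - p + 1)]
        + [y[p - 1 - j : T - h - j] for j in range(p)]
    )
    Y = y[h + p - 1 :]
    beta, *_ = np.linalg.lstsq(X, Y, rcond=None)
    trend = X @ beta
    return trend, Y - trend
```

For a series with a purely deterministic linear trend the residuals are numerically zero, since the lagged values span the trend exactly; unlike the HP filter, the regression uses only information available at date t − h.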

871 citations


Posted Content
TL;DR: This article used a crowd-sourced hate speech lexicon to collect tweets containing hate speech keywords and trained a multi-class classifier to distinguish hate speech from other offensive language, finding that racist and homophobic tweets are more likely to be classified as hate speech but that sexist tweets are generally classified as offensive.
Abstract: A key challenge for automatic hate-speech detection on social media is the separation of hate speech from other instances of offensive language. Lexical detection methods tend to have low precision because they classify all messages containing particular terms as hate speech, and previous work using supervised learning has failed to distinguish between the two categories. We use a crowd-sourced hate speech lexicon to collect tweets containing hate speech keywords, and crowd-sourcing to label a sample of these tweets into three categories: those containing hate speech, those containing only offensive language, and those with neither. We train a multi-class classifier to distinguish between these categories. Close analysis of the predictions and the errors shows when we can reliably separate hate speech from other offensive language and when this differentiation is more difficult. We find that racist and homophobic tweets are more likely to be classified as hate speech, but that sexist tweets are generally classified as offensive. Tweets without explicit hate keywords are also more difficult to classify.
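The three-way classification setup can be illustrated with a toy sketch. The paper's actual features and model are richer than this; here a plain softmax (multinomial logistic) regression is fit on bag-of-words counts, and the example texts, vocabulary, and labels are invented for illustration only:

```python
import numpy as np

# Invented toy data: 0 = hate speech, 1 = only offensive, 2 = neither.
texts = [
    "i hate those people they should all disappear",  # hate
    "you are such an idiot",                          # offensive
    "what a lovely day outside",                      # neither
    "those people are subhuman trash",                # hate
    "this movie is stupid garbage",                   # offensive
    "the weather is nice today",                      # neither
]
labels = np.array([0, 1, 2, 0, 1, 2])

vocab = sorted({w for t in texts for w in t.split()})
index = {w: i for i, w in enumerate(vocab)}

def bow(text):
    """Bag-of-words count vector over the toy vocabulary."""
    v = np.zeros(len(vocab))
    for w in text.split():
        if w in index:
            v[index[w]] += 1.0
    return v

X = np.stack([bow(t) for t in texts])
Y = np.eye(3)[labels]  # one-hot targets

# Fit softmax regression with plain gradient descent.
W = np.zeros((X.shape[1], 3))
for _ in range(2000):
    logits = X @ W
    P = np.exp(logits - logits.max(axis=1, keepdims=True))
    P /= P.sum(axis=1, keepdims=True)
    W -= 0.1 * X.T @ (P - Y) / len(X)

def predict(text):
    """Return the most probable class index for a text."""
    return int(np.argmax(bow(text) @ W))
```

On this tiny separable training set the model recovers the training labels; the hard part the paper addresses is exactly what this sketch glosses over: tweets where offensive terms appear without hateful intent, or hate without explicit keywords.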

871 citations


Journal ArticleDOI
TL;DR: In this paper, a detection of the baryon acoustic oscillation (BAO) feature in the flux-correlation function of the Lyα forest of high-redshift quasars was reported, with a statistical significance of five standard deviations.
Abstract: We report a detection of the baryon acoustic oscillation (BAO) feature in the flux-correlation function of the Lyα forest of high-redshift quasars with a statistical significance of five standard deviations. The study uses 137,562 quasars in the redshift range 2.1 ≤ z ≤ 3.5 from the Data Release 11 (DR11) of the Baryon Oscillation Spectroscopic Survey (BOSS) of SDSS-III. This sample contains three times the number of quasars used in previous studies. The measured position of the BAO peak determines the angular distance, D_A(z = 2.34), and the expansion rate, H(z = 2.34), both on a scale set by the sound horizon at the drag epoch, r_d. We find D_A/r_d =

871 citations


Posted Content
TL;DR: It is shown that structure matters: incorporating knowledge about the locality of the input into the objective can significantly improve a representation's suitability for downstream tasks, and is an important step towards flexible formulations of representation-learning objectives for specific end-goals.
Abstract: In this work, we perform unsupervised learning of representations by maximizing mutual information between an input and the output of a deep neural network encoder. Importantly, we show that structure matters: incorporating knowledge about locality of the input to the objective can greatly influence a representation's suitability for downstream tasks. We further control characteristics of the representation by matching to a prior distribution adversarially. Our method, which we call Deep InfoMax (DIM), outperforms a number of popular unsupervised learning methods and competes with fully-supervised learning on several classification tasks. DIM opens new avenues for unsupervised learning of representations and is an important step towards flexible formulations of representation-learning objectives for specific end-goals.

871 citations


Journal ArticleDOI
TL;DR: The first update of BindingDB since 2007 is provided, focusing on new and unique features and highlighting directions of importance to the field as a whole.
Abstract: BindingDB, www.bindingdb.org, is a publicly accessible database of experimental protein-small molecule interaction data. Its collection of over a million data entries derives primarily from scientific articles and, increasingly, US patents. BindingDB provides many ways to browse and search for data of interest, including an advanced search tool, which can cross searches of multiple query types, including text, chemical structure, protein sequence and numerical affinities. The PDB and PubMed provide links to data in BindingDB, and vice versa; and BindingDB provides links to pathway information, the ZINC catalog of available compounds, and other resources. The BindingDB website offers specialized tools that take advantage of its large data collection, including ones to generate hypotheses for the protein targets bound by a bioactive compound, and for the compounds bound by a new protein of known sequence; and virtual compound screening by maximal chemical similarity, binary kernel discrimination, and support vector machine methods. Specialized data sets are also available, such as binding data for hundreds of congeneric series of ligands, drawn from BindingDB and organized for use in validating drug design methods. BindingDB offers several forms of programmatic access, and comes with extensive background material and documentation. Here, we provide the first update of BindingDB since 2007, focusing on new and unique features and highlighting directions of importance to the field as a whole.

871 citations


Journal ArticleDOI
TL;DR: This cross-sectional study reviews the medical records of 1524 patients with cancer treated at a single tertiary care hospital in Wuhan, China to evaluate the characteristics associated with transmission of the SARS-CoV-2 virus.
Abstract: This cross-sectional study reviews the medical records of 1524 patients with cancer treated at a single tertiary care hospital in Wuhan, China, to evaluate the characteristics associated with transmission of the SARS-CoV-2 virus.

Journal ArticleDOI
TL;DR: These results may underestimate the overall proportion of cancers attributable to modifiable factors, because the impact of all established risk factors could not be quantified, and many likely modifiable risk factors are not yet firmly established as causal.
Abstract: Contemporary information on the fraction of cancers that potentially could be prevented is useful for priority setting in cancer prevention and control. Herein, the authors estimate the proportion and number of invasive cancer cases and deaths, overall (excluding nonmelanoma skin cancers) and for 26 cancer types, in adults aged 30 years and older in the United States in 2014, that were attributable to major, potentially modifiable exposures (cigarette smoking; secondhand smoke; excess body weight; alcohol intake; consumption of red and processed meat; low consumption of fruits/vegetables, dietary fiber, and dietary calcium; physical inactivity; ultraviolet radiation; and 6 cancer-associated infections). The numbers of cancer cases were obtained from the Centers for Disease Control and Prevention (CDC) and the National Cancer Institute; the numbers of deaths were obtained from the CDC; risk factor prevalence estimates were obtained from nationally representative surveys; and associated relative risks of cancer were obtained from published, large-scale pooled analyses or meta-analyses. In the United States in 2014, an estimated 42.0% of all incident cancers (659,640 of 1,570,975 cancers, excluding nonmelanoma skin cancers) and 45.1% of cancer deaths (265,150 of 587,521 deaths) were attributable to evaluated risk factors. Cigarette smoking accounted for the highest proportion of cancer cases (19.0%; 298,970 cases) and deaths (28.8%; 169,180 deaths), followed by excess body weight (7.8% and 6.5%, respectively) and alcohol intake (5.6% and 4.0%, respectively). Lung cancer had the highest number of cancers (184,970 cases) and deaths (132,960 deaths) attributable to evaluated risk factors, followed by colorectal cancer (76,910 cases and 28,290 deaths).
These results, however, may underestimate the overall proportion of cancers attributable to modifiable factors, because the impact of all established risk factors could not be quantified, and many likely modifiable risk factors are not yet firmly established as causal. Nevertheless, these findings underscore the vast potential for reducing cancer morbidity and mortality through broad and equitable implementation of known preventive measures. CA Cancer J Clin 2018;68:31-54. © 2017 American Cancer Society.
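The attributable-fraction arithmetic behind estimates like these can be sketched with Levin's classic formula. The paper's actual methodology is more involved (multiple exposure levels, pooled relative risks), and the numbers below are illustrative, not from the paper:

```python
def attributable_fraction(prevalence: float, relative_risk: float) -> float:
    """Levin's population attributable fraction:
    PAF = p * (RR - 1) / (1 + p * (RR - 1))."""
    excess = prevalence * (relative_risk - 1.0)
    return excess / (1.0 + excess)

# Illustrative only: a risk factor with 20% prevalence and relative risk 2.0
paf = attributable_fraction(0.20, 2.0)  # 0.2 / 1.2, about 0.167
```

The formula makes the paper's logic concrete: attributable proportions rise with both the prevalence of a risk factor and the strength of its association with disease, which is why smoking dominates the estimates.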

Journal ArticleDOI
18 Sep 2015-Science
TL;DR: A proton-conducting cathode and simpler fabrication enable lower-temperature operation of methane-fueled ceramic fuel cells; a proton-, oxygen-ion-, and electron-hole-conducting PCFC-compatible cathode material is developed that greatly improves oxygen reduction reaction kinetics at intermediate to low temperatures.
Abstract: Because of the generally lower activation energy associated with proton conduction in oxides compared to oxygen ion conduction, protonic ceramic fuel cells (PCFCs) should be able to operate at lower temperatures than solid oxide fuel cells (250° to 550°C versus ≥600°C) on hydrogen and hydrocarbon fuels if fabrication challenges and suitable cathodes can be developed. We fabricated the complete sandwich structure of PCFCs directly from raw precursor oxides with only one moderate-temperature processing step through the use of sintering agents such as copper oxide. We also developed a proton-, oxygen-ion–, and electron-hole–conducting PCFC-compatible cathode material, BaCo0.4Fe0.4Zr0.1Y0.1O3-δ (BCFZY0.1), that greatly improved oxygen reduction reaction kinetics at intermediate to low temperatures. We demonstrated high performance from five different types of PCFC button cells without degradation after 1400 hours. Power densities as high as 455 milliwatts per square centimeter at 500°C on H2 and 142 milliwatts per square centimeter on CH4 were achieved, and operation was possible even at 350°C.

Journal ArticleDOI
TL;DR: This paper used newly collected data on patents issued to U.S. firms in the 1926 to 2010 period, combined with the stock market response to news about patents, to measure the economic importance of each innovation.
Abstract: We propose a new measure of the economic importance of each innovation. Our measure uses newly collected data on patents issued to U.S. firms in the 1926 to 2010 period, combined with the stock market response to news about patents. Our patent-level estimates of private economic value are positively related to the scientific value of these patents, as measured by the number of citations the patent receives in the future. Our new measure is associated with substantial growth, reallocation, and creative destruction, consistent with the predictions of Schumpeterian growth models. Aggregating our measure suggests that technological innovation accounts for significant medium-run fluctuations in aggregate economic growth and TFP. Our measure contains additional information relative to citation-weighted patent counts; the relation between our measure and firm growth is considerably stronger. Importantly, the degree of creative destruction that is associated with our measure is higher than previous estimates, confirming that it is a useful proxy for the private valuation of patents.

Journal ArticleDOI
18 Sep 2015-Science
TL;DR: In this article, an ultrathin invisibility skin cloak is proposed to cover a 3D arbitrarily shaped object by complete restoration of the phase of the reflected light at 730-nanometer wavelength.
Abstract: Metamaterial-based optical cloaks have thus far used volumetric distribution of the material properties to gradually bend light and thereby obscure the cloaked region. Hence, they are bulky and hard to scale up and, more critically, typical carpet cloaks introduce unnecessary phase shifts in the reflected light, making the cloaks detectable. Here, we demonstrate experimentally an ultrathin invisibility skin cloak wrapped over an object. This skin cloak conceals a three-dimensional arbitrarily shaped object by complete restoration of the phase of the reflected light at 730-nanometer wavelength. The skin cloak comprises a metasurface with distributed phase shifts rerouting light and rendering the object invisible. In contrast to bulky cloaks with volumetric index variation, our device is only 80 nanometers (about one-ninth of the wavelength) thick and potentially scalable for hiding macroscopic objects.


Journal ArticleDOI
TL;DR: The major methods to prepare nanoemulsions, theories to predict droplet size, physical conditions and chemical additives which affect droplet stability, and recent applications are summarized.
Abstract: Nanoemulsions are kinetically stable liquid-in-liquid dispersions with droplet sizes on the order of 100 nm. Their small size leads to useful properties such as high surface area per unit volume, robust stability, optically transparent appearance, and tunable rheology. Nanoemulsions are finding application in diverse areas such as drug delivery, food, cosmetics, pharmaceuticals, and material synthesis. Additionally, they serve as model systems to understand nanoscale colloidal dispersions. High and low energy methods are used to prepare nanoemulsions, including high pressure homogenization, ultrasonication, phase inversion temperature and emulsion inversion point, as well as recently developed approaches such as bubble bursting method. In this review article, we summarize the major methods to prepare nanoemulsions, theories to predict droplet size, physical conditions and chemical additives which affect droplet stability, and recent applications.

Journal ArticleDOI
TL;DR: The very reason such tasks produce robust and easily replicable experimental effects – low between-participant variability – makes their use as correlational tools problematic, and it is demonstrated that taking reliability estimates into account has the potential to qualitatively change theoretical conclusions.
Abstract: Individual differences in cognitive paradigms are increasingly employed to relate cognition to brain structure, chemistry, and function. However, such efforts are often unfruitful, even with the most well established tasks. Here we offer an explanation for failures in the application of robust cognitive paradigms to the study of individual differences. Experimental effects become well established – and thus those tasks become popular – when between-subject variability is low. However, low between-subject variability causes low reliability for individual differences, destroying replicable correlations with other factors and potentially undermining published conclusions drawn from correlational relationships. Though these statistical issues have a long history in psychology, they are widely overlooked in cognitive psychology and neuroscience today. In three studies, we assessed test-retest reliability of seven classic tasks: Eriksen Flanker, Stroop, stop-signal, go/no-go, Posner cueing, Navon, and Spatial-Numerical Association of Response Code (SNARC). Reliabilities ranged from 0 to .82, being surprisingly low for most tasks given their common use. As we predicted, this emerged from low variance between individuals rather than high measurement variance. In other words, the very reason such tasks produce robust and easily replicable experimental effects – low between-participant variability – makes their use as correlational tools problematic. We demonstrate that taking such reliability estimates into account has the potential to qualitatively change theoretical conclusions. The implications of our findings are that well-established approaches in experimental psychology and neuropsychology may not directly translate to the study of individual differences in brain structure, chemistry, and function, and alternative metrics may be required.

Proceedings ArticleDOI
01 May 2017
TL;DR: The authors design a hybrid convolutional neural network to integrate meta-data with text and show that this hybrid approach can improve a text-only deep learning model.
Abstract: Automatic fake news detection is a challenging problem in deception detection, and it has tremendous real-world political and social impacts. However, statistical approaches to combating fake news have been dramatically limited by the lack of labeled benchmark datasets. In this paper, we present LIAR: a new, publicly available dataset for fake news detection. We collected 12.8K manually labeled short statements spanning a decade in various contexts from PolitiFact.com, which provides a detailed analysis report and links to source documents for each case. This dataset can be used for fact-checking research as well. Notably, this new dataset is an order of magnitude larger than the previously largest public fake news datasets of similar type. Empirically, we investigate automatic fake news detection based on surface-level linguistic patterns. We have designed a novel, hybrid convolutional neural network to integrate meta-data with text. We show that this hybrid approach can improve a text-only deep learning model.

Journal ArticleDOI
TL;DR: ADT-free survival was longer with MDT than with surveillance alone for oligorecurrent PCa, suggesting that MDT should be explored further in phase III trials.
Abstract: Purpose: Retrospective studies suggest that metastasis-directed therapy (MDT) for oligorecurrent prostate cancer (PCa) improves progression-free survival. We aimed to assess the benefit of MDT in a randomized phase II trial. Patients and Methods: In this multicenter, randomized, phase II study, patients with asymptomatic PCa were eligible if they had had a biochemical recurrence after primary PCa treatment with curative intent, three or fewer extracranial metastatic lesions on choline positron emission tomography–computed tomography, and serum testosterone levels > 50 ng/mL. Patients were randomly assigned (1:1) to either surveillance or MDT of all detected lesions (surgery or stereotactic body radiotherapy). Surveillance was performed with prostate-specific antigen (PSA) follow-up every 3 months, with repeated imaging at PSA progression or clinical suspicion for progression. Random assignment was balanced dynamically on the basis of two factors: PSA doubling time (≤ 3 v > 3 months) and nodal versus non-nodal ...

Journal ArticleDOI
TL;DR: The disease burden caused by limited access to antimicrobials, the burden attributable to antimicrobial resistance, and the potential effect of vaccines in reducing the need for antimicrobials are assessed.

Book ChapterDOI
15 May 2017
TL;DR: The culture industry, the most inflexible style of all, proves to be the goal of the very liberalism that is criticized for its lack of style; its relentless unity bears witness to the emergent unity of politics.
Abstract: Culture is infecting everything with sameness. Film, radio, and magazines form a system. Each branch of culture is unanimous within itself and all are unanimous together. All mass culture under monopoly is identical, and the contours of its skeleton, the conceptual armature fabricated by monopoly, are beginning to stand out. The whole world is passed through the filter of the culture industry. For the present the technology of the culture industry confines itself to standardization and mass production and sacrifices what once distinguished the logic of the work from that of society. The relentless unity of the culture industry bears witness to the emergent unity of politics. The concept of a genuine style becomes transparent in the culture industry as the aesthetic equivalent of power. The culture industry, the most inflexible style of all, proves to be the goal of the very liberalism which is criticized for its lack of style.

Journal ArticleDOI
TL;DR: In this paper, the authors investigated the detection rate of 68Ga-PSMA PET/CT in patients with biochemical recurrence after radical prostatectomy and found that the detection rates were correlated with PSA level and PSA kinetics.
Abstract: The expression of prostate-specific membrane antigen (PSMA) is increased in prostate cancer. Recently, 68Ga-PSMA (Glu-NH-CO-NH-Lys-(Ahx)-[68Ga(HBED-CC)]) was developed as a PSMA ligand. The aim of this study was to investigate the detection rate of 68Ga-PSMA PET/CT in patients with biochemical recurrence after radical prostatectomy. Methods: Two hundred forty-eight of 393 patients were evaluable for a retrospective analysis. Median prostate-specific antigen (PSA) level was 1.99 ng/mL (range, 0.2–59.4 ng/mL). All patients underwent contrast-enhanced PET/CT after injection of 155 ± 27 MBq of 68Ga-PSMA ligand. The detection rates were correlated with PSA level and PSA kinetics. The influence of antihormonal treatment, primary Gleason score, and contribution of PET and morphologic imaging to the final diagnosis were assessed. Results: Two hundred twenty-two (89.5%) patients showed pathologic findings in 68Ga-PSMA ligand PET/CT. The detection rates were 96.8%, 93.0%, 72.7%, and 57.9% for PSA levels of ≥2, 1 to 6, 4–6, and

Proceedings ArticleDOI
07 Jun 2015
TL;DR: The authors introduce a neural network architecture with fully connected layers on top of CNNs responsible for feature extraction at three different scales, and propose a refinement method to enhance the spatial coherence of the resulting saliency maps.
Abstract: Visual saliency is a fundamental problem in both cognitive and computational sciences, including computer vision. In this paper, we discover that a high-quality visual saliency model can be learned from multiscale features extracted using deep convolutional neural networks (CNNs), which have had many successes in visual recognition tasks. For learning such saliency models, we introduce a neural network architecture, which has fully connected layers on top of CNNs responsible for feature extraction at three different scales. We then propose a refinement method to enhance the spatial coherence of our saliency results. Finally, aggregating multiple saliency maps computed for different levels of image segmentation can further boost the performance, yielding saliency maps better than those generated from a single segmentation. To promote further research and evaluation of visual saliency models, we also construct a new large database of 4447 challenging images and their pixelwise saliency annotations. Experimental results demonstrate that our proposed method is capable of achieving state-of-the-art performance on all public benchmarks, improving the F-Measure by 5.0% and 13.2% respectively on the MSRA-B dataset and our new dataset (HKU-IS), and lowering the mean absolute error by 5.7% and 35.1% respectively on these two datasets.

Journal ArticleDOI
TL;DR: The adverse-event profile of nintedanib observed in this trial was similar to that observed in patients with idiopathic pulmonary fibrosis; gastrointestinal adverse events, including diarrhea, were more common with nintedanib than with placebo.
Abstract: Background Interstitial lung disease (ILD) is a common manifestation of systemic sclerosis and a leading cause of systemic sclerosis-related death. Nintedanib, a tyrosine kinase inhibitor, has been shown to have antifibrotic and antiinflammatory effects in preclinical models of systemic sclerosis and ILD. Methods We conducted a randomized, double-blind, placebo-controlled trial to investigate the efficacy and safety of nintedanib in patients with ILD associated with systemic sclerosis. Patients who had systemic sclerosis with an onset of the first non-Raynaud's symptom within the past 7 years and a high-resolution computed tomographic scan that showed fibrosis affecting at least 10% of the lungs were randomly assigned, in a 1:1 ratio, to receive 150 mg of nintedanib, administered orally twice daily, or placebo. The primary end point was the annual rate of decline in forced vital capacity (FVC), assessed over a 52-week period. Key secondary end points were absolute changes from baseline in the modified Rodnan skin score and in the total score on the St. George's Respiratory Questionnaire (SGRQ) at week 52. Results A total of 576 patients received at least one dose of nintedanib or placebo; 51.9% had diffuse cutaneous systemic sclerosis, and 48.4% were receiving mycophenolate at baseline. In the primary end-point analysis, the adjusted annual rate of change in FVC was -52.4 ml per year in the nintedanib group and -93.3 ml per year in the placebo group (difference, 41.0 ml per year; 95% confidence interval [CI], 2.9 to 79.0; P = 0.04). Sensitivity analyses based on multiple imputation for missing data yielded P values for the primary end point ranging from 0.06 to 0.10. 
The change from baseline in the modified Rodnan skin score and the total score on the SGRQ at week 52 did not differ significantly between the trial groups, with differences of -0.21 (95% CI, -0.94 to 0.53; P = 0.58) and 1.69 (95% CI, -0.73 to 4.12 [not adjusted for multiple comparisons]), respectively. Diarrhea, the most common adverse event, was reported in 75.7% of the patients in the nintedanib group and in 31.6% of those in the placebo group. Conclusions Among patients with ILD associated with systemic sclerosis, the annual rate of decline in FVC was lower with nintedanib than with placebo; no clinical benefit of nintedanib was observed for other manifestations of systemic sclerosis. The adverse-event profile of nintedanib observed in this trial was similar to that observed in patients with idiopathic pulmonary fibrosis; gastrointestinal adverse events, including diarrhea, were more common with nintedanib than with placebo. (Funded by Boehringer Ingelheim; SENSCIS ClinicalTrials.gov number, NCT02597933.).

Journal ArticleDOI
TL;DR: Predictive biomarkers, mechanisms of resistance, hyperprogressors, treatment duration and treatment beyond progression, immune-related toxicities, and clinical trial design are key concepts in need of further consideration to optimize the anticancer potential of this class of immunotherapy.
Abstract: Early preclinical evidence provided the rationale for programmed cell death 1 (PD-1) and programmed death ligand 1 (PD-L1) blockade as a potential form of cancer immunotherapy given that activation of the PD-1/PD-L1 axis putatively served as a mechanism for tumor evasion of host tumor antigen-specific T-cell immunity. Early-phase studies investigating several humanized monoclonal IgG4 antibodies targeting PD-1 and PD-L1 in advanced solid tumors paved the way for the development of the first PD-1 inhibitors, nivolumab and pembrolizumab, approved by the Food and Drug Administration (FDA) in 2014. The number of FDA-approved agents of this class is rapidly enlarging, with indications for treatment spanning a spectrum of malignancies. The purpose of this review is to highlight the clinical development of PD-1 and PD-L1 inhibitors in cancer therapy to date. In particular, we focus on detailing the registration trials that have led to FDA-approved indications of anti-PD-1 and anti-PD-L1 therapies in cancer. As the number of PD-1/PD-L1 inhibitors continues to grow, predictive biomarkers, mechanisms of resistance, hyperprogressors, treatment duration and treatment beyond progression, immune-related toxicities, and clinical trial design are key concepts in need of further consideration to optimize the anticancer potential of this class of immunotherapy.

Journal ArticleDOI
TL;DR: The aging and growth of the population resulted in an increase in global cardiovascular deaths between 1990 and 2013, despite a decrease in age-specific death rates in most regions.
Abstract: Background Global deaths from cardiovascular disease are increasing as a result of population growth, the aging of populations, and epidemiologic changes in disease. Disentangling the effects of these three drivers on trends in mortality is important for planning the future of the health care system and benchmarking progress toward the reduction of cardiovascular disease. Methods We used mortality data from the Global Burden of Disease Study 2013, which includes data on 188 countries grouped into 21 world regions. We developed three counterfactual scenarios to represent the principal drivers of change in cardiovascular deaths (population growth alone, population growth and aging, and epidemiologic changes in disease) from 1990 to 2013. Secular trends and correlations with changes in national income were examined. Results Global deaths from cardiovascular disease increased by 41% between 1990 and 2013 despite a 39% decrease in age-specific death rates; this increase was driven by a 55% increase in mortality ...
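The three counterfactual scenarios can be made concrete with a toy decomposition. The numbers below are invented for illustration, and the scenario definitions are a plausible reading of the study design rather than the study's actual code:

```python
import numpy as np

# Hypothetical age-specific data for one region (illustrative values only):
# population by age band and CVD death rates per person-year, 1990 and 2013.
pop_1990  = np.array([4.0e6, 3.0e6, 1.0e6])   # young, middle, old
pop_2013  = np.array([4.6e6, 4.1e6, 1.9e6])
rate_1990 = np.array([1.0e-4, 1.0e-3, 1.0e-2])
rate_2013 = np.array([0.8e-4, 0.7e-3, 0.7e-2])

deaths_1990 = (pop_1990 * rate_1990).sum()
deaths_2013 = (pop_2013 * rate_2013).sum()

# Scenario 1 -- population growth alone: scale the 1990 population to the
# 2013 total while keeping the 1990 age structure and 1990 rates.
growth = pop_2013.sum() / pop_1990.sum()
deaths_growth = (pop_1990 * growth * rate_1990).sum()

# Scenario 2 -- growth plus aging: the 2013 population (size and age
# structure) combined with the 1990 age-specific rates.
deaths_growth_aging = (pop_2013 * rate_1990).sum()

# Scenario 3 -- epidemiologic change: the remainder, i.e. the effect of
# changing age-specific death rates under the 2013 population.
effect_growth = deaths_growth - deaths_1990
effect_aging  = deaths_growth_aging - deaths_growth
effect_epi    = deaths_2013 - deaths_growth_aging
```

The three effects sum exactly to the observed change in deaths; in the abstract's finding, the epidemiologic term is negative (rates fell) but is outweighed by growth and aging.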

Journal ArticleDOI
TL;DR: This Review describes current, widely used clinical repair techniques for resurfacing articular cartilage defects, along with a developmental pipeline of acellular and cellular regenerative products and techniques that could revolutionize joint care over the next decade by promoting the development of functional articular cartilage.
Abstract: Chondral and osteochondral lesions due to injury or other pathology commonly result in the development of osteoarthritis, eventually leading to progressive total joint destruction. Although current progress suggests that biologic agents can delay the advancement of deterioration, such drugs are incapable of promoting tissue restoration. The limited ability of articular cartilage to regenerate renders joint arthroplasty an unavoidable surgical intervention. This Review describes current, widely used clinical repair techniques for resurfacing articular cartilage defects; short-term and long-term clinical outcomes of these techniques are discussed. Also reviewed is a developmental pipeline of acellular and cellular regenerative products and techniques that could revolutionize joint care over the next decade by promoting the development of functional articular cartilage. Acellular products typically consist of collagen or hyaluronic-acid-based materials, whereas cellular techniques use either primary cells or stem cells, with or without scaffolds. Central to these efforts is the prominent role that tissue engineering has in translating biological technology into clinical products; therefore, concomitant regulatory processes are also discussed.

Journal ArticleDOI
TL;DR: The formulation of synthetic hydrolysates (SHs) is an important advancement for future multi-omics studies and for better understanding the mechanisms of fermentation inhibition in lignocellulosic hydrolysates; the SH formulated in this work was instrumental for defining the most important inhibitors in the ACH.
Abstract: The fermentation inhibition of yeast or bacteria by lignocellulose-derived degradation products, during hexose/pentose co-fermentation, is a major bottleneck for cost-effective lignocellulosic biorefineries. To engineer microbial strains for improved performance, it is critical to understand the mechanisms of inhibition that affect fermentative organisms in the presence of major components of a lignocellulosic hydrolysate. The development of a synthetic lignocellulosic hydrolysate (SH) media with a composition similar to the actual biomass hydrolysate will be an important advancement to facilitate these studies. In this work, we characterized the nutrients and plant-derived decomposition products present in AFEX™ pretreated corn stover hydrolysate (ACH). The SH was formulated based on the ACH composition and was further used to evaluate the inhibitory effects of various families of decomposition products during Saccharomyces cerevisiae 424A (LNH-ST) fermentation. The ACH contained high levels of nitrogenous compounds, notably amides, pyrazines, and imidazoles. In contrast, a relatively low content of furans and aromatic and aliphatic acids was found in the ACH. Though most families of decomposition products were inhibitory to xylose fermentation, the nitrogenous compounds, owing to their abundance, showed the most inhibition. Of these compounds, amides (products of the ammonolysis reaction) contributed the most to the reduction in fermentation performance. However, this result is associated with a concentration effect, as the corresponding carboxylic acids (products of hydrolysis) promoted greater inhibition when present at the same molar concentration as the amides. Due to its complexity, the formulated SH did not perfectly match the fermentation profile of the actual hydrolysate, especially the growth curve. However, the SH formulation was effective for studying the inhibitory effect of various compounds on yeast fermentation. 
The formulation of SHs is an important advancement for future multi-omics studies and for better understanding the mechanisms of fermentation inhibition in lignocellulosic hydrolysates. The SH formulated in this work was instrumental for defining the most important inhibitors in the ACH. Major AFEX decomposition products are less inhibitory to yeast fermentation than the products of dilute acid or steam explosion pretreatments; thus, ACH is readily fermentable by yeast without any detoxification.

Journal ArticleDOI
TL;DR: Pulmonary manifestation of COVID-19 infection is predominantly characterized by ground-glass opacification with occasional consolidation on CT, suggesting that CT is a more sensitive imaging modality for investigation.
Abstract: Chest radiographic and CT findings of 21 patients with confirmed COVID-19 are described along with a literature review of other publications describing the radiologic findings of this novel coronavirus ...