
Journal ArticleDOI
TL;DR: To examine whether the interdependence between oxidative stress and inflammation can explain the antioxidant paradox, the basic aspects of oxidative stress and inflammation and their relationship and dependence are discussed.
Abstract: Oxidative stress has been implicated in many chronic diseases. However, antioxidant trials are so far largely unsuccessful as a preventive or curative measure. Chronic low-grade inflammatory process, on the other hand, plays a central role in the pathogenesis of a number of chronic diseases. Oxidative stress and inflammation are closely related pathophysiological processes, one of which can be easily induced by another. Thus, both processes are simultaneously found in many pathological conditions. Therefore, the failure of antioxidant trials might result from failure to select appropriate agents that specifically target both inflammation and oxidative stress or failure to use both antioxidants and anti-inflammatory agents simultaneously or use of nonselective agents that block some of the oxidative and/or inflammatory pathways but exaggerate the others. To examine whether the interdependence between oxidative stress and inflammation can explain the antioxidant paradox, we discussed in the present review the basic aspects of oxidative stress and inflammation and their relationship and dependence.

685 citations


Journal ArticleDOI
TL;DR: This review summarizes recent developments in realizing band structures with geometrical and topological features in experiments on cold atomic gases, beginning with a summary of the key concepts of geometry and topology for Bloch bands.
Abstract: There have been significant recent advances in realizing band structures with geometrical and topological features in experiments on cold atomic gases. This review summarizes these developments, beginning with a summary of the key concepts of geometry and topology for Bloch bands. Descriptions are given of the different methods that have been used to generate these novel band structures for cold atoms and of the physical observables that have allowed their characterization. The focus is on the physical principles that underlie the different experimental approaches, providing a conceptual framework within which to view these developments. Also described is how specific experimental implementations can influence physical properties. Moving beyond single-particle effects, descriptions are given of the forms of interparticle interactions that emerge when atoms are subjected to these energy bands and of some of the many-body phases that may be sought in future experiments.

685 citations


Journal ArticleDOI
TL;DR: The mainstay of control of the coronavirus disease 2019 (Covid-19) pandemic is vaccination against severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), as mentioned in this paper.
Abstract: Background The mainstay of control of the coronavirus disease 2019 (Covid-19) pandemic is vaccination against severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2). Within a year, s...

685 citations


Journal ArticleDOI
TL;DR: In this paper, a detailed kinetic study of hydrogen adsorption and evolution on Pt(111) in a wide pH range is presented, highlighting the role of reorganization of interfacial water to accommodate charge transfer through the electric double layer, the energetics of which are controlled by how strongly water interacts with the interfacial field.
Abstract: Hydrogen evolution on platinum is a key reaction for electrocatalysis and sustainable energy storage, yet its pH-dependent kinetics are not fully understood. Here we present a detailed kinetic study of hydrogen adsorption and evolution on Pt(111) in a wide pH range. Electrochemical measurements show that hydrogen adsorption and hydrogen evolution are both slow in alkaline media, consistent with the observation of a shift in the rate-determining step for hydrogen evolution. Adding nickel to the Pt(111) surface lowers the barrier for hydrogen adsorption in alkaline solutions and thereby enhances the hydrogen evolution rate. We explain these observations with a model that highlights the role of the reorganization of interfacial water to accommodate charge transfer through the electric double layer, the energetics of which are controlled by how strongly water interacts with the interfacial field. The model is supported by laser-induced temperature-jump measurements. Our model sheds light on the origin of the slow kinetics for the hydrogen evolution reaction in alkaline media. Despite its role in electrocatalysis and hydrogen generation, a complete understanding of the hydrogen evolution reaction on platinum remains elusive. Here, a detailed kinetic study of hydrogen adsorption and evolution on Pt(111) highlights the role of interfacial water reorganization in the hydrogen adsorption step.

685 citations


Journal ArticleDOI
TL;DR: The redesign of the ribosomal RNA operon copy number database (rrnDB) brings a substantial increase in the number of genomes described, improved curation, mapping of genomes to both NCBI and RDP taxonomies, and refined tools for querying and analyzing these data.
Abstract: Microbiologists utilize ribosomal RNA genes as molecular markers of taxonomy in surveys of microbial communities. rRNA genes are often co-located as part of an rrn operon, and multiple copies of this operon are present in genomes across the microbial tree of life. rrn copy number variability provides valuable insight into microbial life history, but introduces systematic bias when measuring community composition in molecular surveys. Here we present an update to the ribosomal RNA operon copy number database (rrnDB), a publicly available, curated resource for copy number information for bacteria and archaea. The redesigned rrnDB (http://rrndb.umms.med.umich.edu/) brings a substantial increase in the number of genomes described, improved curation, mapping of genomes to both NCBI and RDP taxonomies, and refined tools for querying and analyzing these data. With these changes, the rrnDB is better positioned to remain a comprehensive resource under the torrent of microbial genome sequencing. The enhanced rrnDB will contribute to the analysis of molecular surveys and to research linking genomic characteristics to life history.
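
The copy-number bias the abstract describes can be illustrated with a toy correction: dividing raw 16S read counts by each taxon's rrn copy number before computing relative abundances. The taxa, read counts, and copy numbers below are invented for illustration only.

```python
# Hypothetical example of the bias rrnDB helps correct (all values invented):
# raw read counts are inflated for taxa with many rrn operon copies.
reads = {"taxon_A": 700, "taxon_B": 300}
copy_number = {"taxon_A": 7, "taxon_B": 3}  # as would be looked up in the rrnDB

cells = {t: reads[t] / copy_number[t] for t in reads}
total = sum(cells.values())
rel_abundance = {t: cells[t] / total for t in cells}
# raw reads suggest a 70/30 split, but both taxa have equal cell abundance
```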

685 citations


Posted Content
TL;DR: It is argued that it is often preferable to treat similarly risky people similarly, based on the most statistically accurate estimates of risk that one can produce, rather than requiring that algorithms satisfy popular mathematical formalizations of fairness.
Abstract: The nascent field of fair machine learning aims to ensure that decisions guided by algorithms are equitable. Over the last several years, three formal definitions of fairness have gained prominence: (1) anti-classification, meaning that protected attributes---like race, gender, and their proxies---are not explicitly used to make decisions; (2) classification parity, meaning that common measures of predictive performance (e.g., false positive and false negative rates) are equal across groups defined by the protected attributes; and (3) calibration, meaning that conditional on risk estimates, outcomes are independent of protected attributes. Here we show that all three of these fairness definitions suffer from significant statistical limitations. Requiring anti-classification or classification parity can, perversely, harm the very groups they were designed to protect; and calibration, though generally desirable, provides little guarantee that decisions are equitable. In contrast to these formal fairness criteria, we argue that it is often preferable to treat similarly risky people similarly, based on the most statistically accurate estimates of risk that one can produce. Such a strategy, while not universally applicable, often aligns well with policy objectives; notably, this strategy will typically violate both anti-classification and classification parity. In practice, it requires significant effort to construct suitable risk estimates. One must carefully define and measure the targets of prediction to avoid retrenching biases in the data. But, importantly, one cannot generally address these difficulties by requiring that algorithms satisfy popular mathematical formalizations of fairness. By highlighting these challenges in the foundation of fair machine learning, we hope to help researchers and practitioners productively advance the area.
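
Two of the formal criteria the abstract defines, classification parity and calibration, can be checked mechanically. The following sketch does so on synthetic risk scores; the data, group labels, threshold, and score bucket are all invented for illustration.

```python
# Toy check of classification parity (equal false positive rates) and
# calibration (outcome rates match risk scores within each group).
import numpy as np

rng = np.random.default_rng(0)
n = 10_000
group = rng.integers(0, 2, n)            # protected attribute (0 or 1)
risk = rng.uniform(0, 1, n)              # model's risk estimate
outcome = rng.uniform(0, 1, n) < risk    # outcomes consistent with the scores
decision = risk > 0.5                    # simple threshold rule

# Classification parity: false positive rate per group.
fpr = {g: decision[(group == g) & ~outcome].mean() for g in (0, 1)}

# Calibration: within a score bucket, outcome rates should match the score.
bucket = (risk >= 0.4) & (risk < 0.6)
rate = {g: outcome[bucket & (group == g)].mean() for g in (0, 1)}
```

Because the synthetic scores are calibrated by construction, both groups show similar false positive rates and bucketed outcome rates near 0.5; real-world risk estimates need not behave this way, which is the abstract's point.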

685 citations


Proceedings ArticleDOI
31 Mar 2017
TL;DR: This approach combines a search component based on bigram hashing and TF-IDF matching with a multi-layer recurrent neural network model trained to detect answers in Wikipedia paragraphs; experiments indicate that both modules are highly competitive with respect to existing counterparts.
Abstract: This paper proposes to tackle open-domain question answering using Wikipedia as the unique knowledge source: the answer to any factoid question is a text span in a Wikipedia article. This task of machine reading at scale combines the challenges of document retrieval (finding the relevant articles) with that of machine comprehension of text (identifying the answer spans from those articles). Our approach combines a search component based on bigram hashing and TF-IDF matching with a multi-layer recurrent neural network model trained to detect answers in Wikipedia paragraphs. Our experiments on multiple existing QA datasets indicate that (1) both modules are highly competitive with respect to existing counterparts and (2) multitask learning using distant supervision on their combination is an effective complete system on this challenging task.
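
The retrieval idea described above, hashing bigrams into a fixed vocabulary and ranking documents by TF-IDF, can be sketched in a few lines. This is a minimal toy illustration, not the authors' code: the bin count, documents, and scoring details are our own simplifications.

```python
# Toy sketch of bigram-hashing + TF-IDF retrieval (not the paper's code).
import math
import zlib
from collections import Counter

N_BINS = 2**10  # hashed bigram vocabulary; size chosen for illustration

def bigram_bins(text):
    toks = text.lower().split()
    grams = toks + [" ".join(p) for p in zip(toks, toks[1:])]
    # crc32 gives a deterministic hash (Python's hash() is randomized)
    return [zlib.crc32(g.encode()) % N_BINS for g in grams]

docs = [
    "the capital of france is paris",
    "machine reading at scale with wikipedia",
    "paris is a city in france",
]
doc_tf = [Counter(bigram_bins(d)) for d in docs]
df = Counter(b for tf in doc_tf for b in tf)
idf = {b: math.log(len(docs) / df[b]) for b in df}

def score(query, tf):
    # TF-IDF dot product between query and document in hashed-bigram space
    q = Counter(bigram_bins(query))
    return sum(q[b] * tf[b] * idf.get(b, 0.0) ** 2 for b in q)

query = "what is the capital of france"
best = max(range(len(docs)), key=lambda i: score(query, doc_tf[i]))
```

Here `best` selects the first document, which shares the most distinctive bigrams with the query.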

685 citations


Journal ArticleDOI
TL;DR: Specific monomer compositions give polymers that are robust and effective photocatalysts for the evolution of hydrogen from water in the presence of a sacrificial electron donor, without the apparent need for an added metal cocatalyst.
Abstract: Photocatalytic hydrogen production from water offers an abundant, clean fuel source, but it is challenging to produce photocatalysts that use the solar spectrum effectively. Many hydrogen-evolving photocatalysts are active in the ultraviolet range, but ultraviolet light accounts for only 3% of the energy available in the solar spectrum at ground level. Solid-state crystalline photocatalysts have light absorption profiles that are a discrete function of their crystalline phase and that are not always tunable. Here, we prepare a series of amorphous, microporous organic polymers with exquisite synthetic control over the optical gap in the range 1.94-2.95 eV. Specific monomer compositions give polymers that are robust and effective photocatalysts for the evolution of hydrogen from water in the presence of a sacrificial electron donor, without the apparent need for an added metal cocatalyst. Remarkably, unlike other organic systems, the best performing polymer is only photoactive under visible rather than ultraviolet irradiation.

685 citations


Journal ArticleDOI
TL;DR: Treatment with sofosbuvir-velpatasvir with or without ribavirin for 12 weeks and with sofosbuvir-velpatasvir for 24 weeks resulted in high rates of sustained virologic response in patients with HCV infection and decompensated cirrhosis.
Abstract: Background: As the population that is infected with the hepatitis C virus (HCV) ages, the number of patients with decompensated cirrhosis is expected to increase. Methods: We conducted a phase 3, open-label study involving both previously treated and previously untreated patients infected with HCV genotypes 1 through 6 who had decompensated cirrhosis (classified as Child–Pugh–Turcotte class B). Patients were randomly assigned in a 1:1:1 ratio to receive the nucleotide polymerase inhibitor sofosbuvir and the NS5A inhibitor velpatasvir once daily for 12 weeks, sofosbuvir–velpatasvir plus ribavirin for 12 weeks, or sofosbuvir–velpatasvir for 24 weeks. The primary end point was a sustained virologic response at 12 weeks after the end of therapy. Results: Of the 267 patients who received treatment, 78% had HCV genotype 1, 4% genotype 2, 15% genotype 3, 3% genotype 4, and less than 1% genotype 6; no patients had genotype 5. Overall rates of sustained virologic response were 83% (95% confidence interval [CI], 74 to 90...

685 citations


Journal ArticleDOI
TL;DR: Patients with myasthenia gravis should be classified into subgroups to help with therapeutic decisions and prognosis, and additional immunomodulatory drugs are emerging, but therapeutic decisions are hampered by the scarcity of controlled studies.
Abstract: Myasthenia gravis is an autoimmune disease that is characterised by muscle weakness and fatigue, is B-cell mediated, and is associated with antibodies directed against the acetylcholine receptor, muscle-specific kinase (MUSK), lipoprotein-related protein 4 (LRP4), or agrin in the postsynaptic membrane at the neuromuscular junction. Patients with myasthenia gravis should be classified into subgroups to help with therapeutic decisions and prognosis. Subgroups based on serum antibodies and clinical features include early-onset, late-onset, thymoma, MUSK, LRP4, antibody-negative, and ocular forms of myasthenia gravis. Agrin-associated myasthenia gravis might emerge as a new entity. The prognosis is good with optimum symptomatic, immunosuppressive, and supportive treatment. Pyridostigmine is the preferred symptomatic treatment, and for patients who do not adequately respond to symptomatic therapy, corticosteroids, azathioprine, and thymectomy are first-line immunosuppressive treatments. Additional immunomodulatory drugs are emerging, but therapeutic decisions are hampered by the scarcity of controlled studies. Long-term drug treatment is essential for most patients and must be tailored to the particular form of myasthenia gravis.

685 citations


Journal ArticleDOI
TL;DR: The transmission, symptomatology, and mortality of COVID‐19 as they relate to older adults, and possible treatments that are currently under investigation are discussed.
Abstract: Severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), a novel virus that causes COVID-19 infection, has recently emerged and caused a deadly pandemic. Studies have shown that this virus causes worse outcomes and a higher mortality rate in older adults and those with comorbidities such as hypertension, cardiovascular disease, diabetes, chronic respiratory disease, and chronic kidney disease (CKD). A significant percentage of older American adults have these diseases, putting them at a higher risk of infection. Additionally, many adults with hypertension, diabetes, and CKD are placed on angiotensin-converting enzyme (ACE) inhibitors and angiotensin II receptor blockers. Studies have shown that these medications upregulate the ACE-2 receptor, the very receptor that the SARS-CoV-2 virus uses to enter host cells. Although it has been hypothesized that this may cause a further increased risk of infection, more studies on the role of these medications in COVID-19 infections are necessary. In this review, we discuss the transmission, symptomatology, and mortality of COVID-19 as they relate to older adults, and possible treatments that are currently under investigation. J Am Geriatr Soc 68:926-929, 2020.

Journal ArticleDOI
TL;DR: The metabarcoding approach presented here is non-invasive, more efficient, more cost-effective and more sensitive than the traditional survey methods and has the potential to serve as an alternative tool for biodiversity monitoring that revolutionizes natural resource management and ecological studies of fish communities on larger spatial and temporal scales.
Abstract: We developed a set of universal PCR primers (MiFish-U/E) for metabarcoding environmental DNA (eDNA) from fishes. Primers were designed using aligned whole mitochondrial genome (mitogenome) sequence...

Journal ArticleDOI
TL;DR: This position paper outlines the Academy's, DC, and ACSM's stance on nutrition factors that have been determined to influence athletic performance and emerging trends in the field of sports nutrition.

Journal ArticleDOI
TL;DR: This article proposes a new investor sentiment index that is aligned with the purpose of predicting the aggregate stock market; by eliminating a common noise component in sentiment proxies, the new index has much greater predictive power than existing sentiment indices have both in and out of sample, and the predictability becomes both statistically and economically significant.
Abstract: We propose a new investor sentiment index that is aligned with the purpose of predicting the aggregate stock market. By eliminating a common noise component in sentiment proxies, the new index has much greater predictive power than existing sentiment indices have both in and out of sample, and the predictability becomes both statistically and economically significant. In addition, it outperforms well-recognized macroeconomic variables and can also predict cross-sectional stock returns sorted by industry, size, value, and momentum. The driving force of the predictive power appears to stem from investors' biased beliefs about future cash flows.

Journal ArticleDOI
01 Jan 2022-Preslia
TL;DR: A complete list of all alien taxa ever recorded in the flora of the Czech Republic is presented as an update of the original checklist published in 2002, with 44 taxa on the list that are reported in the present study for the first time as aliens introduced to the CzechRepublic or escaped from cultivation.
Abstract: A complete list of all alien taxa ever recorded in the flora of the Czech Republic is presented as an update of the original checklist published in 2002. New data accumulated in the last decade are incorporated and the listing and status of some taxa are reassessed based on improved knowledge. Alien flora of the Czech Republic consists of 1454 taxa listed with information on their taxonomic position, life history, geographic origin (or mode of origin, distinguishing anecophyte and hybrid), invasive status (casual; naturalized but not invasive; invasive), residence time status (archaeophyte vs neophyte), mode of introduction into the country (accidental, deliberate), and date of the first record. Additional information on species performance that was not part of the previous catalogue, i.e. on the width of species’ habitat niches, their dominance in invaded communities, and impact, is provided. The Czech alien flora consists of 350 (24.1%) archaeophytes and 1104 (75.9%) neophytes. The increase in the total number of taxa compared to the previous catalogue (1378) is due to the addition of 151 taxa and removal of 75 (39 archaeophytes and 36 neophytes), an important part of the latter being the reclassification of 41 taxa as native, mostly based on archaeobotanical evidence. The additions represent taxa newly recorded since 2002 and reported in the national literature; taxa resulting from investigation of sources omitted while preparing the previous catalogue; redetermination of previously reported taxa; reassessment of some taxa traditionally considered native for which the evidence suggests the opposite; and inclusion of intraspecific taxa previously not recognized in the flora. There are 44 taxa on the list that are reported in the present study for the first time as aliens introduced to the Czech Republic or escaped from cultivation.
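
The checklist arithmetic reported in the abstract (1378 taxa in the previous catalogue, 151 added, 75 removed, and the archaeophyte/neophyte split) can be verified directly:

```python
# Check that the reported totals and percentages are internally consistent.
previous, added, removed = 1378, 151, 75
total = previous + added - removed          # updated checklist size

archaeophytes, neophytes = 350, 1104
assert total == archaeophytes + neophytes   # split accounts for every taxon

share_archaeo = round(100 * archaeophytes / total, 1)  # 24.1
share_neo = round(100 * neophytes / total, 1)          # 75.9
```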

Journal ArticleDOI
Yujin Oh1, Sangjoon Park1, Jong Chul Ye1
TL;DR: Experimental results show that the proposed patch-based convolutional neural network approach achieves state-of-the-art performance and provides clinically interpretable saliency maps, which are useful for COVID-19 diagnosis and patient triage.
Abstract: Under the global pandemic of COVID-19, the use of artificial intelligence to analyze chest X-ray (CXR) image for COVID-19 diagnosis and patient triage is becoming important. Unfortunately, due to the emergent nature of the COVID-19 pandemic, a systematic collection of CXR data set for deep neural network training is difficult. To address this problem, here we propose a patch-based convolutional neural network approach with a relatively small number of trainable parameters for COVID-19 diagnosis. The proposed method is inspired by our statistical analysis of the potential imaging biomarkers of the CXR radiographs. Experimental results show that our method achieves state-of-the-art performance and provides clinically interpretable saliency maps, which are useful for COVID-19 diagnosis and patient triage.

Book
14 Jan 2016
TL;DR: This book provides a compact self-contained introduction to the theory and application of Bayesian statistical methods and ends with modern topics such as variable selection in regression, generalized linear mixed effects models, and semiparametric copula estimation.
Abstract: This book provides a compact self-contained introduction to the theory and application of Bayesian statistical methods. The book is accessible to readers having a basic familiarity with probability, yet allows more advanced readers to quickly grasp the principles underlying Bayesian theory and methods. The examples and computer code allow the reader to understand and implement basic Bayesian data analyses using standard statistical models and to extend the standard models to specialized data analysis situations. The book begins with fundamental notions such as probability, exchangeability and Bayes' rule, and ends with modern topics such as variable selection in regression, generalized linear mixed effects models, and semiparametric copula estimation. Numerous examples from the social, biological and physical sciences show how to implement these methodologies in practice. Monte Carlo summaries of posterior distributions play an important role in Bayesian data analysis. The open-source R statistical computing environment provides sufficient functionality to make Monte Carlo estimation very easy for a large number of statistical models and example R-code is provided throughout the text. Much of the example code can be run 'as is' in R, and essentially all of it can be run after downloading the relevant datasets from the companion website for this book.
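
The Monte Carlo posterior summaries the description mentions take only a few lines for a conjugate model. The book's examples use R; the following is an equivalent Python sketch for a Beta-Binomial model with invented data values.

```python
# Monte Carlo summary of a Beta-Binomial posterior (illustrative data):
# y successes in n trials, Beta(1, 1) prior, so the posterior is Beta(1+y, 1+n-y).
import numpy as np

rng = np.random.default_rng(42)
y, n = 7, 20                        # observed data (invented)
a_post, b_post = 1 + y, 1 + n - y   # Beta posterior parameters

theta = rng.beta(a_post, b_post, size=100_000)  # posterior draws
mean = theta.mean()                             # Monte Carlo posterior mean
lo, hi = np.quantile(theta, [0.025, 0.975])     # 95% credible interval
```

The Monte Carlo mean approximates the exact posterior mean (1+y)/(2+n) = 8/22, and the quantiles summarize posterior uncertainty without any closed-form integration.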

Journal ArticleDOI
20 Dec 2019-Science
TL;DR: The chemical environment of a functional group that is activated for defect passivation was systematically investigated with theophylline, caffeine, and theobromine and hydrogen-bond formation between N-H and I (iodine) assisted the primary C=O binding with the antisite Pb defect to maximize surface-defect binding.
Abstract: Surface trap–mediated nonradiative charge recombination is a major limit to achieving high-efficiency metal-halide perovskite photovoltaics. The ionic character of perovskite lattice has enabled molecular defect passivation approaches through interaction between functional groups and defects. However, a lack of in-depth understanding of how the molecular configuration influences the passivation effectiveness is a challenge to rational molecule design. Here, the chemical environment of a functional group that is activated for defect passivation was systematically investigated with theophylline, caffeine, and theobromine. When N-H and C=O were in an optimal configuration in the molecule, hydrogen-bond formation between N-H and I (iodine) assisted the primary C=O binding with the antisite Pb (lead) defect to maximize surface-defect binding. A stabilized power conversion efficiency of 22.6% of photovoltaic device was demonstrated with theophylline treatment.

Journal ArticleDOI
TL;DR: A taxonomy of contemporary IDS is presented, a comprehensive review of notable recent works, and an overview of the datasets commonly used for evaluation purposes are presented, and evasion techniques used by attackers to avoid detection are presented.
Abstract: Cyber-attacks are becoming more sophisticated and thereby presenting increasing challenges in accurately detecting intrusions. Failure to prevent the intrusions could degrade the credibility of security services, e.g. data confidentiality, integrity, and availability. Numerous intrusion detection methods have been proposed in the literature to tackle computer security threats, which can be broadly classified into Signature-based Intrusion Detection Systems (SIDS) and Anomaly-based Intrusion Detection Systems (AIDS). This survey paper presents a taxonomy of contemporary IDS, a comprehensive review of notable recent works, and an overview of the datasets commonly used for evaluation purposes. It also presents evasion techniques used by attackers to avoid detection and discusses future research challenges to counter such techniques so as to make computer systems more secure.

Posted Content
Saining Xie1, Chen Sun1, Jonathan Huang1, Zhuowen Tu1, Kevin Murphy1 
TL;DR: It is shown that it is possible to replace many of the 3D convolutions by low-cost 2D convolutions, suggesting that temporal representation learning on high-level “semantic” features is more useful.
Abstract: Despite the steady progress in video analysis led by the adoption of convolutional neural networks (CNNs), the relative improvement has been less drastic than that in 2D static image classification. Three main challenges exist: spatial (image) feature representation, temporal information representation, and model/computation complexity. It was recently shown by Carreira and Zisserman that 3D CNNs, inflated from 2D networks and pretrained on ImageNet, could be a promising way for spatial and temporal representation learning. However, as for model/computation complexity, 3D CNNs are much more expensive than 2D CNNs and prone to overfit. We seek a balance between speed and accuracy by building an effective and efficient video classification system through systematic exploration of critical network design choices. In particular, we show that it is possible to replace many of the 3D convolutions by low-cost 2D convolutions. Rather surprisingly, the best result (in both speed and accuracy) is achieved when replacing the 3D convolutions at the bottom of the network, suggesting that temporal representation learning on high-level semantic features is more useful. Our conclusion generalizes to datasets with very different properties. When combined with several other cost-effective designs including separable spatial/temporal convolution and feature gating, our system results in an effective video classification system that produces very competitive results on several action classification benchmarks (Kinetics, Something-something, UCF101 and HMDB), as well as two action detection (localization) benchmarks (JHMDB and UCF101-24).
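
The savings from the separable spatial/temporal convolutions mentioned above can be seen with a back-of-the-envelope parameter count. The channel and kernel sizes below are illustrative choices, not the paper's configuration.

```python
# Parameter counts: full 3D convolution vs separable spatial + temporal pair.
def conv3d_params(c_in, c_out, t, k):
    # full t x k x k 3D convolution kernel
    return c_in * c_out * t * k * k

def separable_params(c_in, c_out, t, k):
    # 1 x k x k spatial convolution followed by a t x 1 x 1 temporal one
    return c_in * c_out * k * k + c_out * c_out * t

full = conv3d_params(192, 256, t=3, k=3)   # 1,327,104 parameters
sep = separable_params(192, 256, t=3, k=3) #   638,976 parameters
```

For these sizes the separable pair uses under half the parameters of the full 3D kernel, which is one source of the speed/accuracy balance the abstract describes.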

Journal ArticleDOI
TL;DR: In this article, a review of the state of the art on geographical patterns of species range shifts under contemporary climate change for plants and animals across both terrestrial and marine ecosystems is presented.
Abstract: Poleward and upward shifts are the most frequent types of range shifts that have been reported in response to contemporary climate change. However, the number of reports documenting other types of range shifts – such as in east-west directions across longitudes or, even more unexpectedly, towards tropical latitudes and lower elevations – is increasing rapidly. Recent studies show that these range shifts may not be so unexpected once the local climate changes are accounted for. We here provide an updated synthesis of the fast-moving research on climate-related range shifts. By describing the current state of the art on geographical patterns of species range shifts under contemporary climate change for plants and animals across both terrestrial and marine ecosystems, we identified a number of research shortfalls. In addition to the recognised geographic shortfall in the tropics, we found taxonomic and methodological shortfalls with knowledge gaps regarding range shifts of prokaryotes, lowland range shifts of terrestrial plants, and bathymetric range shifts of marine plants. Based on this review, we provide a research agenda for filling these gaps. We outline a comprehensive framework for assessing multidimensional changes in species distributions, which should then be contrasted with expectations based on climate change indices, such as velocity measures accounting for complex local climate changes. Finally, we propose a unified classification of geographical patterns of species range shifts, arranged in a bi-dimensional space defined by species’ persistence and movement rates. Placing the observed and expected shifts into this bi-dimensional space should lead to more informed assessments of extinction risks.

Journal ArticleDOI
14 Dec 2018-Science
TL;DR: The resource and integrative analyses have uncovered genomic elements and networks in the brain, which in turn have provided insight into the molecular mechanisms underlying psychiatric disorders.
Abstract: Despite progress in defining genetic risk for psychiatric disorders, their molecular mechanisms remain elusive. Addressing this, the PsychENCODE Consortium has generated a comprehensive online resource for the adult brain across 1866 individuals. The PsychENCODE resource contains ~79,000 brain-active enhancers, sets of Hi-C linkages, and topologically associating domains; single-cell expression profiles for many cell types; expression quantitative-trait loci (QTLs); and further QTLs associated with chromatin, splicing, and cell-type proportions. Integration shows that varying cell-type proportions largely account for the cross-population variation in expression (with >88% reconstruction accuracy). It also allows building of a gene regulatory network, linking genome-wide association study variants to genes (e.g., 321 for schizophrenia). We embed this network into an interpretable deep-learning model, which improves disease prediction by ~6-fold versus polygenic risk scores and identifies key genes and pathways in psychiatric disorders.

Journal ArticleDOI
TL;DR: This work uses maximum likelihood inference to simultaneously detect recombination in bacterial genomes and account for it in phylogenetic reconstruction and finds evidence for recombination hotspots associated with mobile elements in Clostridium difficile ST6 and a previously undescribed 310kb chromosomal replacement in Staphylococcus aureus ST582.
Abstract: Recombination is an important evolutionary force in bacteria, but it remains challenging to reconstruct the imports that occurred in the ancestry of a genomic sample. Here we present ClonalFrameML, which uses maximum likelihood inference to simultaneously detect recombination in bacterial genomes and account for it in phylogenetic reconstruction. ClonalFrameML can analyse hundreds of genomes in a matter of hours, and we demonstrate its usefulness on simulated and real datasets. We find evidence for recombination hotspots associated with mobile elements in Clostridium difficile ST6 and a previously undescribed 310kb chromosomal replacement in Staphylococcus aureus ST582. ClonalFrameML is freely available at http://clonalframeml.googlecode.com/.

Journal ArticleDOI
TL;DR: In this paper, the authors examined the contribution of food prices and diet cost to socioeconomic inequalities in diet quality and found that foods of lower nutritional value and lower-quality diets generally cost less per calorie and tended to be selected by groups of lower socioeconomic status.
Abstract: Context: It is well established in the literature that healthier diets cost more than unhealthy diets. Objective: The aim of this review was to examine the contribution of food prices and diet cost to socioeconomic inequalities in diet quality. Data Sources: A systematic literature search of the PubMed, Google Scholar, and Web of Science databases was performed. Study Selection: Publications linking food prices, dietary quality, and socioeconomic status were selected. Data Extraction: Where possible, review conclusions were illustrated using a French national database of commonly consumed foods and their mean retail prices. Data Synthesis: Foods of lower nutritional value and lower-quality diets generally cost less per calorie and tended to be selected by groups of lower socioeconomic status. A number of nutrient-dense foods were available at low cost but were not always palatable or culturally acceptable to the low-income consumer. Acceptable healthier diets were uniformly associated with higher costs. Food budgets in poverty were insufficient to ensure optimum diets. Conclusions: Socioeconomic disparities in diet quality may be explained by the higher cost of healthy diets. Identifying food patterns that are nutrient rich, affordable, and appealing should be a priority to fight social inequalities in nutrition and health.

Proceedings Article
04 Dec 2017
TL;DR: Context vectors produced by a deep LSTM encoder from an attentional sequence-to-sequence model trained for machine translation improve performance over using only unsupervised word and character vectors on a wide variety of common NLP tasks.
Abstract: Computer vision has benefited from initializing multiple deep layers with weights pretrained on large supervised training sets like ImageNet. Natural language processing (NLP) typically sees initialization of only the lowest layer of deep models with pretrained word vectors. In this paper, we use a deep LSTM encoder from an attentional sequence-to-sequence model trained for machine translation (MT) to contextualize word vectors. We show that adding these context vectors (CoVe) improves performance over using only unsupervised word and character vectors on a wide variety of common NLP tasks: sentiment analysis (SST, IMDb), question classification (TREC), entailment (SNLI), and question answering (SQuAD). For fine-grained sentiment analysis and entailment, CoVe improves performance of our baseline models to the state of the art.
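The core move in the abstract, feeding a downstream task model the concatenation of static word vectors with the MT-trained encoder's per-token outputs, can be sketched as below. This is a minimal shape-level illustration, not the paper's implementation: the dimensions and the random stand-in for the MT-LSTM states are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 300-d static word vectors (e.g. GloVe) and a
# 600-d bidirectional MT-LSTM output per token (assumed, not from the paper).
GLOVE_DIM, COVE_DIM, SEQ_LEN = 300, 600, 5

glove = rng.normal(size=(SEQ_LEN, GLOVE_DIM))       # static embeddings per token
mt_lstm_out = rng.normal(size=(SEQ_LEN, COVE_DIM))  # stand-in for encoder states

# CoVe-style input to the task model: concatenate each token's static
# vector with its context vector from the pretrained encoder.
contextualized = np.concatenate([glove, mt_lstm_out], axis=-1)
print(contextualized.shape)  # (5, 900)
```

The downstream model then consumes the 900-d vectors in place of the 300-d static ones; everything upstream of the concatenation stays frozen.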

Journal ArticleDOI
TL;DR: In this article, the authors describe the main ideas, recent developments and progress in a broad spectrum of research investigating ML and AI in the quantum domain, and discuss the fundamental issue of quantum generalizations of learning and AI concepts.
Abstract: Quantum information technologies, on the one hand, and intelligent learning systems, on the other, are both emergent technologies that are likely to have a transformative impact on our society in the future. The respective underlying fields of basic research, quantum information on the one hand and machine learning (ML) and artificial intelligence (AI) on the other, have their own specific questions and challenges, which have hitherto been investigated largely independently. However, in a growing body of recent work, researchers have been probing the question of the extent to which these fields can indeed learn and benefit from each other. Quantum ML explores the interaction between quantum computing and ML, investigating how results and techniques from one field can be used to solve the problems of the other. Recently we have witnessed significant breakthroughs in both directions of influence. For instance, quantum computing is finding a vital application in providing speed-ups for ML problems, critical in our 'big data' world. Conversely, ML already permeates many cutting-edge technologies and may become instrumental in advanced quantum technologies. Aside from quantum speed-up in data analysis, or classical ML optimization used in quantum experiments, quantum enhancements have also been (theoretically) demonstrated for interactive learning tasks, highlighting the potential of quantum-enhanced learning agents. Finally, works exploring the use of AI for the very design of quantum experiments, and for performing parts of genuine research autonomously, have reported their first successes. Beyond the topics of mutual enhancement (exploring what ML/AI can do for quantum physics and vice versa), researchers have also broached the fundamental issue of quantum generalizations of learning and AI concepts. This deals with questions of the very meaning of learning and intelligence in a world that is fully described by quantum mechanics.
In this review, we describe the main ideas, recent developments and progress in a broad spectrum of research investigating ML and AI in the quantum domain.

Journal ArticleDOI
TL;DR: In this article, the authors make in-depth analyses of the various aspects of the biosorption technology, starting from the various biosorbents used to date and the various factors affecting the process.
Abstract: The biosorption process has been established as a characteristic of dead biomasses of both cellulosic and microbial origin to bind metal ion pollutants from aqueous suspension. The high effectiveness of this process even at low metal concentrations, its similarity to the ion exchange treatment process, and its position as a cheaper and greener alternative to conventional techniques have resulted in a mature biosorption technology. Yet its adoption for large-scale industrial wastewater treatment remains a distant reality. The purpose of this review is to make in-depth analyses of the various aspects of the biosorption technology, starting from the various biosorbents used to date and the various factors affecting the process. The design of better biosorbents for improving their physico-chemical features as well as enhancing their biosorption characteristics has been discussed. Better economic value of the biosorption technology is related to the repeated reuse of the biosorbent with minimum loss of efficiency. In this context, desorption of the metal pollutants as well as regeneration of the biosorbent has been discussed in detail. Various limiting factors, including the multi-mechanistic nature of the biosorption process, have been identified as contributing to its non-commercialization.

Proceedings ArticleDOI
01 Jun 2019
TL;DR: This work develops a new framework for incrementally learning a unified classifier, i.e., a classifier that treats both old and new classes uniformly, and incorporates three components, cosine normalization, less-forget constraint, and inter-class separation, to mitigate the adverse effects of the imbalance.
Abstract: Conventionally, deep neural networks are trained offline, relying on a large dataset prepared in advance. This paradigm is often challenged in real-world applications, e.g. online services that involve continuous streams of incoming data. Recently, incremental learning has received increasing attention, and is considered a promising solution to the practical challenges mentioned above. However, it has been observed that incremental learning is subject to a fundamental difficulty -- catastrophic forgetting, namely that adapting a model to new data often results in severe performance degradation on previous tasks or classes. Our study reveals that the imbalance between previous and new data is a crucial cause of this problem. In this work, we develop a new framework for incrementally learning a unified classifier, i.e., a classifier that treats both old and new classes uniformly. Specifically, we incorporate three components, cosine normalization, less-forget constraint, and inter-class separation, to mitigate the adverse effects of the imbalance. Experiments show that the proposed method can effectively rebalance the training process, thus obtaining superior performance compared to the existing methods. On CIFAR-100 and ImageNet, our method can reduce the classification errors by more than 6% and 13% respectively, under the incremental setting of 10 phases.
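Of the three components, cosine normalization is the simplest to illustrate: both the feature vector and each class weight vector are L2-normalized before the dot product, so logits become scaled cosine similarities and the (typically larger) magnitudes of new-class weights cannot dominate old classes. A minimal sketch, with the scale fixed as a constant where the paper learns it:

```python
import numpy as np

def cosine_logits(features, weights, eta=10.0):
    """Cosine-normalized classifier logits.

    features: (batch, d) feature vectors; weights: (classes, d) class vectors.
    Both are L2-normalized, so each logit is eta * cos(feature, class_weight).
    `eta` is a hypothetical fixed scale here (a learnable parameter in the paper).
    """
    f = features / np.linalg.norm(features, axis=-1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=-1, keepdims=True)
    return eta * f @ w.T

feats = np.array([[3.0, 4.0]])           # one sample, 2-d feature
w = np.array([[1.0, 0.0], [0.0, 2.0]])   # two classes with different weight norms
print(cosine_logits(feats, w))           # [[6. 8.]] -- weight magnitudes cancel out
```

Doubling either class's weight vector leaves the logits unchanged, which is exactly the rebalancing effect the abstract attributes to this component.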

Journal ArticleDOI
TL;DR: This work presents a review of the state of the art of information-theoretic feature selection methods, and describes a unifying theoretical framework which can retrofit successful heuristic criteria, indicating the approximations made by each method.
Abstract: In this work we present a review of the state of the art of information-theoretic feature selection methods. The concepts of feature relevance, redundancy and complementarity (synergy) are clearly defined, as well as the Markov blanket. The problem of optimal feature selection is defined. A unifying theoretical framework is described, which can retrofit successful heuristic criteria, indicating the approximations made by each method. A number of open problems in the field are presented.
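The simplest member of the heuristic family such a review covers ranks features by relevance alone, the mutual information I(X_i; Y), ignoring redundancy and complementarity. A minimal sketch for discrete data (the function names and toy data are illustrative, not from the paper):

```python
import numpy as np
from collections import Counter

def mutual_information(x, y):
    """Empirical mutual information (in nats) between two discrete sequences."""
    n = len(x)
    pxy = Counter(zip(x, y))
    px, py = Counter(x), Counter(y)
    mi = 0.0
    for (a, b), c in pxy.items():
        p_joint = c / n
        mi += p_joint * np.log(p_joint / ((px[a] / n) * (py[b] / n)))
    return mi

def mim_rank(features, target, k):
    """Rank columns of `features` by I(X_i; Y) and return the top-k indices
    (the relevance-only criterion; redundancy-aware methods refine this)."""
    scores = [mutual_information(col, target) for col in features.T]
    return sorted(range(len(scores)), key=lambda i: -scores[i])[:k]

# Toy data: column 0 copies the target exactly, column 1 is constant.
y = np.array([0, 1, 0, 1, 0, 1])
X = np.column_stack([y, np.zeros(6, dtype=int)])
print(mim_rank(X, y, k=1))  # [0]
```

Redundancy-aware criteria such as mRMR subtract a penalty for mutual information among the already-selected features; the unifying framework the abstract mentions shows which low-order approximations each such heuristic makes.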

Journal ArticleDOI
TL;DR: These guidelines were designed to provide pragmatic recommendations, based on the best available published evidence, about when platelet transfusion may be appropriate in adult patients, and to provide advice for adult patients who are candidates for platelet transfusion.
Abstract: Platelet transfusions are administered to prevent or treat bleeding in patients with quantitative or qualitative platelet disorders The AABB (formerly, the American Association of Blood Banks) dev