Showing papers by "Virginia Tech" published in 2015
••
TL;DR: The 1000 Genomes Project set out to provide a comprehensive description of common human genetic variation by applying whole-genome sequencing to a diverse set of individuals from multiple populations, and has reconstructed the genomes of 2,504 individuals from 26 populations using a combination of low-coverage whole-genome sequencing, deep exome sequencing, and dense microarray genotyping.
Abstract: The 1000 Genomes Project set out to provide a comprehensive description of common human genetic variation by applying whole-genome sequencing to a diverse set of individuals from multiple populations. Here we report completion of the project, having reconstructed the genomes of 2,504 individuals from 26 populations using a combination of low-coverage whole-genome sequencing, deep exome sequencing, and dense microarray genotyping. We characterized a broad spectrum of genetic variation, in total over 88 million variants (84.7 million single nucleotide polymorphisms (SNPs), 3.6 million short insertions/deletions (indels), and 60,000 structural variants), all phased onto high-quality haplotypes. This resource includes >99% of SNP variants with a frequency of >1% for a variety of ancestries. We describe the distribution of genetic variation across the global sample, and discuss the implications for common disease studies.
12,661 citations
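As a quick consistency check (not part of the paper itself), the variant-class counts quoted in the abstract can be summed to confirm the stated "over 88 million variants":

```python
# Variant-class counts as quoted in the 1000 Genomes abstract.
snps = 84_700_000    # single nucleotide polymorphisms
indels = 3_600_000   # short insertions/deletions
svs = 60_000         # structural variants

total = snps + indels + svs
print(f"{total:,}")  # 88,360,000 — consistent with "over 88 million"
assert total > 88_000_000
```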
••
07 Dec 2015
TL;DR: The task of free-form and open-ended Visual Question Answering (VQA) is proposed: given an image and a natural language question about the image, the task is to provide an accurate natural language answer.
Abstract: We propose the task of free-form and open-ended Visual Question Answering (VQA). Given an image and a natural language question about the image, the task is to provide an accurate natural language answer. Mirroring real-world scenarios, such as helping the visually impaired, both the questions and answers are open-ended. Visual questions selectively target different areas of an image, including background details and underlying context. As a result, a system that succeeds at VQA typically needs a more detailed understanding of the image and complex reasoning than a system producing generic image captions. Moreover, VQA is amenable to automatic evaluation, since many open-ended answers contain only a few words or a closed set of answers that can be provided in a multiple-choice format. We provide a dataset containing ~0.25M images, ~0.76M questions, and ~10M answers (www.visualqa.org), and discuss the information it provides. Numerous baselines for VQA are provided and compared with human performance.
3,513 citations
••
07 Jun 2015
TL;DR: A novel paradigm for evaluating image descriptions that uses human consensus is proposed, along with a new automated metric that captures human judgment of consensus better than existing metrics across sentences generated by various sources.
Abstract: Automatically describing an image with a sentence is a long-standing challenge in computer vision and natural language processing. Due to recent progress in object detection, attribute classification, action recognition, etc., there is renewed interest in this area. However, evaluating the quality of descriptions has proven to be challenging. We propose a novel paradigm for evaluating image descriptions that uses human consensus. This paradigm consists of three main parts: a new triplet-based method of collecting human annotations to measure consensus, a new automated metric that captures consensus, and two new datasets: PASCAL-50S and ABSTRACT-50S that contain 50 sentences describing each image. Our simple metric captures human judgment of consensus better than existing metrics across sentences generated by various sources. We also evaluate five state-of-the-art image description approaches using this new protocol and provide a benchmark for future comparisons. A version of CIDEr named CIDEr-D is available as a part of MS COCO evaluation server to enable systematic evaluation and benchmarking.
3,504 citations
••
TL;DR: The RAST tool kit (RASTtk) is presented: a modular version of RAST that enables researchers to build custom annotation pipelines, offering a choice of software for identifying and annotating genomic features as well as the ability to add custom features to an annotation job.
Abstract: The RAST (Rapid Annotation using Subsystem Technology) annotation engine was built in 2008 to annotate bacterial and archaeal genomes. It works by offering a standard software pipeline for identifying genomic features (i.e., protein-encoding genes and RNA) and annotating their functions. Recently, in order to make RAST a more useful research tool and to keep pace with advancements in bioinformatics, it has become desirable to build a version of RAST that is both customizable and extensible. In this paper, we describe the RAST tool kit (RASTtk), a modular version of RAST that enables researchers to build custom annotation pipelines. RASTtk offers a choice of software for identifying and annotating genomic features as well as the ability to add custom features to an annotation job. RASTtk also accommodates the batch submission of genomes and the ability to customize annotation protocols for batch submissions. This is the first major software restructuring of RAST since its inception.
1,666 citations
••
TL;DR: The modified CPI has enabled crowdsourcing capabilities, which allow users to suggest edits to any entry and permit researchers to upload new findings ranging from human and environmental exposure data to complete life cycle assessments.
Abstract: To document the marketing and distribution of nano-enabled products into the commercial marketplace, the Woodrow Wilson International Center for Scholars and the Project on Emerging Nanotechnologies created the Nanotechnology Consumer Products Inventory (CPI) in 2005. The objective of this present work is to redevelop the CPI by leading a research effort to increase the usefulness and reliability of this inventory. We created eight new descriptors for consumer products, including information pertaining to the nanomaterials contained in each product. The project was motivated by the recognition that a diverse group of stakeholders from academia, industry, and state/federal government had become highly dependent on the inventory as an important resource and bellwether of the pervasiveness of nanotechnology in society. We interviewed 68 nanotechnology experts to assess key information needs. Their answers guided inventory modifications by providing a clear conceptual framework best suited for user expectations. The revised inventory was released in October 2013. It currently lists 1814 consumer products from 622 companies in 32 countries. The Health and Fitness category contains the most products (762, or 42% of the total). Silver is the most frequently used nanomaterial (435 products, or 24%); however, 49% of the products (889) included in the CPI do not provide the composition of the nanomaterial used in them. About 29% of the CPI (528 products) contain nanomaterials suspended in a variety of liquid media and dermal contact is the most likely exposure scenario from their use. The majority (1288 products, or 71%) of the products do not present enough supporting information to corroborate the claim that nanomaterials are used.
The modified CPI has enabled crowdsourcing capabilities, which allow users to suggest edits to any entry and permits researchers to upload new findings ranging from human and environmental exposure data to complete life cycle assessments. There are inherent limitations to this type of database, but these modifications to the inventory addressed the majority of criticisms raised in published literature and in surveys of nanotechnology stakeholders and experts. The development of standardized methods and metrics for nanomaterial characterization and labelling in consumer products can lead to greater understanding between the key stakeholders in nanotechnology, especially consumers, researchers, regulators, and industry.
1,511 citations
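The percentages quoted in the abstract can be checked against the raw counts and the 1814-product total; a minimal sketch (the category labels below are shorthand, not the inventory's own names):

```python
# Verify that each quoted count rounds to the quoted percentage
# of the 1814-product total.
total_products = 1814
quoted = {
    # label: (count, quoted percent)
    "Health and Fitness":      (762, 42),
    "nano-silver":             (435, 24),
    "composition not given":   (889, 49),
    "liquid suspensions":      (528, 29),
    "claim unsupported":       (1288, 71),
}

for label, (count, pct) in quoted.items():
    computed = round(100 * count / total_products)
    assert computed == pct, (label, computed, pct)
print("all quoted percentages are consistent")
```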
••
Pacific Northwest National Laboratory1, University of Washington2, Radcliffe Institute for Advanced Study3, University of Wisconsin-Madison4, Eindhoven University of Technology5, University of Minnesota6, Lawrence Berkeley National Laboratory7, Northwestern University8, University of California, Berkeley9, University of Houston10, University of California, Davis11, University of Delaware12, Virginia Tech13, University of Leeds14, University of Konstanz15
TL;DR: The current understanding of crystallization by particle attachment (CPA) is described, some of the nonclassical thermodynamic and dynamic mechanisms known to give rise to experimentally observed pathways are examined, and the challenges to the understanding of these mechanisms are highlighted.
Abstract: Field and laboratory observations show that crystals commonly form by the addition and attachment of particles that range from multi-ion complexes to fully formed nanoparticles. The particles involved in these nonclassical pathways to crystallization are diverse, in contrast to classical models that consider only the addition of monomeric chemical species. We review progress toward understanding crystal growth by particle-attachment processes and show that multiple pathways result from the interplay of free-energy landscapes and reaction dynamics. Much remains unknown about the fundamental aspects, particularly the relationships between solution structure, interfacial forces, and particle motion. Developing a predictive description that connects molecular details to ensemble behavior will require revisiting long-standing interpretations of crystal formation in synthetic systems, biominerals, and patterns of mineralization in natural environments.
1,357 citations
••
TL;DR: Model reduction aims to reduce the computational burden by generating reduced models that are faster and cheaper to simulate, yet accurately represent the original large-scale system behavior. Model reduction of linear, nonparametric dynamical systems has reached a considerable level of maturity, as reflected by several survey papers and books.
Abstract: Numerical simulation of large-scale dynamical systems plays a fundamental role in studying a wide range of complex physical phenomena; however, the inherent large-scale nature of the models often leads to unmanageable demands on computational resources. Model reduction aims to reduce this computational burden by generating reduced models that are faster and cheaper to simulate, yet accurately represent the original large-scale system behavior. Model reduction of linear, nonparametric dynamical systems has reached a considerable level of maturity, as reflected by several survey papers and books. However, parametric model reduction has emerged only more recently as an important and vibrant research area, with several recent advances making a survey paper timely. Thus, this paper aims to provide a resource that draws together recent contributions in different communities to survey the state of the art in parametric model reduction methods. Parametric model reduction targets the broad class of problems for wh...
1,230 citations
••
TL;DR: The paper defines smart tourism, sheds light on current smart tourism trends, and then lays out its technological and business foundations.
Abstract: Smart tourism is a new buzzword applied to describe the increasing reliance of tourism destinations, their industries and their tourists on emerging forms of ICT that allow for massive amounts of data to be transformed into value propositions. However, it remains ill-defined as a concept, which hinders its theoretical development. The paper defines smart tourism, sheds light on current smart tourism trends, and then lays out its technological and business foundations. This is followed by a brief discussion on the prospects and drawbacks of smart tourism. The paper further draws attention to the great need for research to inform smart tourism development and management.
1,114 citations
••
TL;DR: A complete redesign of the query and reporting interface has been performed in the IEDB 3.0 release to improve how end users can access immune epitope information in an intuitive and biologically accurate manner.
Abstract: The IEDB, www.iedb.org, contains information on immune epitopes, the molecular targets of adaptive immune responses, curated from the published literature and submitted by National Institutes of Health funded epitope discovery efforts. From 2004 to 2012 the IEDB curation of journal articles published since 1960 has caught up to the present day, with >95% of relevant published literature manually curated, amounting to more than 15,000 journal articles and more than 704,000 experiments to date. The revised curation target since 2012 has been to make recent research findings quickly available in the IEDB and thereby ensure that it continues to be an up-to-date resource. Having gathered a comprehensive dataset in the IEDB, a complete redesign of the query and reporting interface has been performed in the IEDB 3.0 release to improve how end users can access this information in an intuitive and biologically accurate manner. We here present this most recent release of the IEDB and describe the user testing procedures as well as the use of external ontologies that have enabled it.
945 citations
••
TL;DR: It is suggested that the relative abundance of the clinical class 1 integron-integrase gene, intI1, is a good proxy for pollution because it is linked to genes conferring resistance to antibiotics, disinfectants and heavy metals.
Abstract: Around all human activity, there are zones of pollution with pesticides, heavy metals, pharmaceuticals, personal care products and the microorganisms associated with human waste streams and agriculture. This diversity of pollutants, whose concentration varies spatially and temporally, is a major challenge for monitoring. Here, we suggest that the relative abundance of the clinical class 1 integron-integrase gene, intI1, is a good proxy for pollution because: (1) intI1 is linked to genes conferring resistance to antibiotics, disinfectants and heavy metals; (2) it is found in a wide variety of pathogenic and nonpathogenic bacteria; (3) its abundance can change rapidly because its host cells can have rapid generation times and it can move between bacteria by horizontal gene transfer; and (4) a single DNA sequence variant of intI1 is now found on a wide diversity of xenogenetic elements, these being complex mosaic DNA elements fixed through the agency of human selection. Here we review the literature examining the relationship between anthropogenic impacts and the abundance of intI1, and outline an approach by which intI1 could serve as a proxy for anthropogenic pollution.
919 citations
••
University of Minnesota1, Leipzig University2, University College Dublin3, Centre national de la recherche scientifique4, University of Zurich5, University of Bayreuth6, Iowa State University7, Martin Luther University of Halle-Wittenberg8, University of Jena9, Swansea University10, United States Department of Agriculture11, Utrecht University12, University of Oxford13, University of Greifswald14, Sewanee: The University of the South15, University of Bern16, Technische Universität München17, Yokohama National University18, Columbia University19, University of Western Sydney20, Colorado State University21, University of California, Santa Barbara22, Virginia Tech23, Wageningen University and Research Centre24
TL;DR: Biodiversity mainly stabilizes ecosystem productivity, and productivity-dependent ecosystem services, by increasing resistance to climate events; biodiversity loss is therefore likely to decrease ecosystem stability, and restoration of biodiversity to increase it, mainly by changing the resistance of ecosystem productivity to climate events.
Abstract: It remains unclear whether biodiversity buffers ecosystems against climate extremes, which are becoming increasingly frequent worldwide [1]. Early results suggested that the ecosystem productivity of diverse grassland plant communities was more resistant, changing less during drought, and more resilient, recovering more quickly after drought, than that of depauperate communities [2]. However, subsequent experimental tests produced mixed results [3-13]. Here we use data from 46 experiments that manipulated grassland plant diversity to test whether biodiversity provides resistance during and resilience after climate events. We show that biodiversity increased ecosystem resistance for a broad range of climate events, including wet or dry, moderate or extreme, and brief or prolonged events. Across all studies and climate events, the productivity of low-diversity communities with one or two species changed by approximately 50% during climate events, whereas that of high-diversity communities with 16–32 species was more resistant, changing by only approximately 25%. By a year after each climate event, ecosystem productivity had often fully recovered, or overshot, normal levels of productivity in both high- and low-diversity communities, leading to no detectable dependence of ecosystem resilience on biodiversity. Our results suggest that biodiversity mainly stabilizes ecosystem productivity, and productivity-dependent ecosystem services, by increasing resistance to climate events. Anthropogenic environmental changes that drive biodiversity loss thus seem likely to decrease ecosystem stability [14], and restoration of biodiversity to increase it, mainly by changing the resistance of ecosystem productivity to climate events.
••
TL;DR: The branching fraction ratios R(D(*)) of B̄ → D(*)τ⁻ν̄τ relative to B̄ → D(*)ℓ⁻ν̄ℓ (where ℓ = e or μ) are measured using the full Belle data sample.
Abstract: We report a measurement of the branching fraction ratios R(D(*)) of B̄ → D(*)τ⁻ν̄τ relative to B̄ → D(*)ℓ⁻ν̄ℓ (where ℓ = e or μ) using the full Belle data sample of 772 × 10⁶ BB̄ pairs collected at the Υ(4S) resonance with the Belle detector at the KEKB asymmetric-energy e⁺e⁻ collider. The measured values are R(D) = 0.375 ± 0.064(stat) ± 0.026(syst) and R(D*) = 0.293 ± 0.038(stat) ± 0.015(syst). The analysis uses hadronic reconstruction of the tag-side B meson and purely leptonic τ decays. The results are consistent with earlier measurements and do not show a significant deviation from the standard model prediction.
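The paper reports statistical and systematic uncertainties separately; combining them in quadrature (a common convention, shown here purely as an illustration, not as the paper's own procedure) gives roughly ±0.069 for R(D) and ±0.041 for R(D*):

```python
import math

def total_unc(stat, syst):
    """Combine statistical and systematic uncertainties in quadrature."""
    return math.sqrt(stat**2 + syst**2)

# Central values and separate uncertainties as quoted in the abstract.
rd  = (0.375, total_unc(0.064, 0.026))  # R(D)
rds = (0.293, total_unc(0.038, 0.015))  # R(D*)

print(f"R(D)  = {rd[0]:.3f} ± {rd[1]:.3f}")   # ± 0.069
print(f"R(D*) = {rds[0]:.3f} ± {rds[1]:.3f}") # ± 0.041
```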
••
Norwich Research Park1, University of California, Riverside2, University of Florida3, West Bengal State University4, Mediterranea University of Reggio Calabria5, Agro ParisTech6, University of Nice Sophia Antipolis7, Deakin University8, National Research Council9, College of Horticulture10, University of Massachusetts Amherst11, James Hutton Institute12, University of Tennessee13, Agriculture and Agri-Food Canada14, Oregon State University15, Agricultural Research Service16, University of Alabama at Birmingham17, University of Warwick18, University of Worcester19, Utrecht University20, Virginia Tech21, University of Manitoba22, Cornell University23, International Potato Center24, Wageningen University and Research Centre25, Institut national de la recherche agronomique26, North Carolina State University27
TL;DR: A survey querying the community for their ranking of plant-pathogenic oomycete species based on scientific and economic importance received 263 votes from 62 scientists in 15 countries for a total of 33 species; the Top 10 species and their ranking are provided.
Abstract: Oomycetes form a deep lineage of eukaryotic organisms that includes a large number of plant pathogens which threaten natural and managed ecosystems. We undertook a survey to query the community for their ranking of plant-pathogenic oomycete species based on scientific and economic importance. In total, we received 263 votes from 62 scientists in 15 countries for a total of 33 species. The Top 10 species and their ranking are: (1) Phytophthora infestans; (2, tied) Hyaloperonospora arabidopsidis; (2, tied) Phytophthora ramorum; (4) Phytophthora sojae; (5) Phytophthora capsici; (6) Plasmopara viticola; (7) Phytophthora cinnamomi; (8, tied) Phytophthora parasitica; (8, tied) Pythium ultimum; and (10) Albugo candida. This article provides an introduction to these 10 taxa and a snapshot of current research. We hope that the list will serve as a benchmark for future trends in oomycete research.
••
TL;DR: Pervasive autosomal introgression between these human malaria vectors suggests that traits enhancing vectorial capacity may be gained through interspecific gene flow, including between nonsister species.
Abstract: Introgressive hybridization is now recognized as a widespread phenomenon, but its role in evolution remains contested. Here, we use newly available reference genome assemblies to investigate phylogenetic relationships and introgression in a medically important group of Afrotropical mosquito sibling species. We have identified the correct species branching order to resolve a contentious phylogeny and show that lineages leading to the principal vectors of human malaria were among the first to split. Pervasive autosomal introgression between these malaria vectors means that only a small fraction of the genome, mainly on the X chromosome, has not crossed species boundaries. Our results suggest that traits enhancing vectorial capacity may be gained through interspecific gene flow, including between nonsister species.
••
TL;DR: This chapter gives an overview and highlights recent advances in the understanding of the organization, regulation, and diversification of core and specialized terpenoid metabolic pathways, and addresses the most important functions of volatile and nonvolatile terpenoid specialized metabolites in plants.
Abstract: Terpenoids (isoprenoids) represent the largest and most diverse class of chemicals among the myriad compounds produced by plants. Plants employ terpenoid metabolites for a variety of basic functions in growth and development but use the majority of terpenoids for more specialized chemical interactions and protection in the abiotic and biotic environment. Traditionally, plant-based terpenoids have been used by humans in the food, pharmaceutical, and chemical industries, and more recently have been exploited in the development of biofuel products. Genomic resources and emerging tools in synthetic biology facilitate the metabolic engineering of high-value terpenoid products in plants and microbes. Moreover, the ecological importance of terpenoids has gained increased attention to develop strategies for sustainable pest control and abiotic stress protection. Together, these efforts require a continuous growth in knowledge of the complex metabolic and molecular regulatory networks in terpenoid biosynthesis. This chapter gives an overview and highlights recent advances in our understanding of the organization, regulation, and diversification of core and specialized terpenoid metabolic pathways, and addresses the most important functions of volatile and nonvolatile terpenoid specialized metabolites in plants.
••
TL;DR: This first comprehensive tutorial on the use of matching theory, a Nobel Prize winning framework, for resource management in wireless networks is developed and results show how matching theory can effectively improve the performance of resource allocation in all three applications discussed.
Abstract: The emergence of novel wireless networking paradigms such as small cell and cognitive radio networks has forever transformed the way in which wireless systems are operated. In particular, the need for self-organizing solutions to manage the scarce spectral resources has become a prevalent theme in many emerging wireless systems. In this article, the first comprehensive tutorial on the use of matching theory, a Nobel Prize winning framework, for resource management in wireless networks is developed. To cater for the unique features of emerging wireless networks, a novel, wireless-oriented classification of matching theory is proposed. Then the key solution concepts and algorithmic implementations of this framework are exposed. The developed concepts are applied in three important wireless networking areas in order to demonstrate the usefulness of this analytical tool. Results show how matching theory can effectively improve the performance of resource allocation in all three applications discussed.
••
Broad Institute1, Tehran University of Medical Sciences2, George Washington University3, European Bioinformatics Institute4, Sapienza University of Rome5, Temple University6, Tomsk State University7, University of Notre Dame8, Centre national de la recherche scientifique9, French Institute of Health and Medical Research10, Imperial College London11, James Cook University12, Massachusetts Institute of Technology13, Simon Fraser University14, University of California, Davis15, Institut de recherche pour le développement16, Kansas State University17, Foundation for Research & Technology – Hellas18, University of Perugia19, Virginia Tech20, University of Nevada, Las Vegas21, Baylor College of Medicine22, Boston College23, Harvard University24, University of Manchester25, University of California, San Francisco26, University of Cyprus27, National Health Laboratory Service28, University of Crete29, Kenya Medical Research Institute30, University of Arizona31, University of Pennsylvania32, Indian Council of Medical Research33, New Mexico State University34, Liverpool School of Tropical Medicine35, Vanderbilt University Medical Center36, Vanderbilt University37, Swiss Institute of Bioinformatics38, University of Geneva39, Texas A&M University40, Chiang Mai University41, Oswaldo Cruz Foundation42, Rio de Janeiro State University43, Indiana University44, University of Santiago de Compostela45, Wellcome Trust Sanger Institute46, Liverpool John Moores University47, University of Georgia48, Harvey Mudd College49, University of California, Irvine50, University of Groningen51, Centers for Disease Control and Prevention52, Biogen Idec53
TL;DR: To investigate the genomic basis of vectorial capacity and explore new avenues for vector control, the authors sequenced the genomes of 16 anopheline mosquito species from diverse locations spanning ~100 million years of evolution. Comparative analyses show faster rates of gene gain and loss, elevated gene shuffling on the X chromosome, and more intron losses, relative to Drosophila.
Abstract: Variation in vectorial capacity for human malaria among Anopheles mosquito species is determined by many factors, including behavior, immunity, and life history. To investigate the genomic basis of vectorial capacity and explore new avenues for vector control, we sequenced the genomes of 16 anopheline mosquito species from diverse locations spanning ~100 million years of evolution. Comparative analyses show faster rates of gene gain and loss, elevated gene shuffling on the X chromosome, and more intron losses, relative to Drosophila. Some determinants of vectorial capacity, such as chemosensory genes, do not show elevated turnover but instead diversify through protein-sequence changes. This dynamism of anopheline genes and genomes may contribute to their flexible capacity to take advantage of new ecological niches, including adapting to humans as primary hosts.
••
TL;DR: Novel selective PAD4 inhibitors binding a calcium-deficient form of the PAD4 enzyme have validated, for the first time, the critical enzymatic role of human and mouse PAD4 in both histone citrullination and neutrophil extracellular trap formation.
Abstract: PAD4 has been strongly implicated in the pathogenesis of autoimmune, cardiovascular and oncological diseases through clinical genetics and gene disruption in mice. New selective PAD4 inhibitors binding a calcium-deficient form of the PAD4 enzyme have validated the critical enzymatic role of human and mouse PAD4 in both histone citrullination and neutrophil extracellular trap formation for, to our knowledge, the first time. The therapeutic potential of PAD4 inhibitors can now be explored.
••
TL;DR: In this article, the authors discuss the nature of Internet use by American travelers and find that while traditional means of Internet use for travel planning appear to be widespread across all customer segments, higher-order Internet uses are now prevalent among some segments, particularly among travelers of Generation Y.
••
TL;DR: This re-analysis of trace-metal data is consistent with oxygenation continuing well into the Palaeozoic era, and suggests that subsurface water masses in mid-Proterozoic oceans were predominantly anoxic and ferruginous, but with a tendency towards euxinia (sulfide-bearing) that is not observed in the Neoproterozoic era.
Abstract: Sedimentary rocks deposited across the Proterozoic-Phanerozoic transition record extreme climate fluctuations, a potential rise in atmospheric oxygen or re-organization of the seafloor redox landscape, and the initial diversification of animals. It is widely assumed that the inferred redox change facilitated the observed trends in biodiversity. Establishing this palaeoenvironmental context, however, requires that changes in marine redox structure be tracked by means of geochemical proxies and translated into estimates of atmospheric oxygen. Iron-based proxies are among the most effective tools for tracking the redox chemistry of ancient oceans. These proxies are inherently local, but have global implications when analysed collectively and statistically. Here we analyse about 4,700 iron-speciation measurements from shales 2,300 to 360 million years old. Our statistical analyses suggest that subsurface water masses in mid-Proterozoic oceans were predominantly anoxic and ferruginous (depleted in dissolved oxygen and iron-bearing), but with a tendency towards euxinia (sulfide-bearing) that is not observed in the Neoproterozoic era. Analyses further indicate that early animals did not experience appreciable benthic sulfide stress. Finally, unlike proxies based on redox-sensitive trace-metal abundances, iron geochemical data do not show a statistically significant change in oxygen content through the Ediacaran and Cambrian periods, sharply constraining the magnitude of the end-Proterozoic oxygen increase. Indeed, this re-analysis of trace-metal data is consistent with oxygenation continuing well into the Palaeozoic era. Therefore, if changing redox conditions facilitated animal diversification, it did so through a limited rise in oxygen past critical functional and ecological thresholds, as is seen in modern oxygen minimum zone benthic animal communities.
••
TL;DR: The Sol Genomics Network is a web portal with genomic and phenotypic data and analysis tools for the Solanaceae family and close relatives; a new tool, the SGN VIGS tool, was recently implemented to improve Virus-Induced Gene Silencing (VIGS) constructs.
Abstract: The Sol Genomics Network (SGN, http://solgenomics.net) is a web portal with genomic and phenotypic data, and analysis tools for the Solanaceae family and close relatives. SGN hosts whole genome data for an increasing number of Solanaceae family members including tomato, potato, pepper, eggplant, tobacco and Nicotiana benthamiana. The database also stores loci and phenotype data, which researchers can upload and edit with user-friendly web interfaces. Tools such as BLAST, GBrowse and JBrowse for browsing genomes, expression and map data viewers, a locus community annotation system and QTL analysis tools are available. A new tool, the SGN VIGS tool, was recently implemented to improve Virus-Induced Gene Silencing (VIGS) constructs. With the growing genomic and phenotypic data in the database, SGN is now advancing to develop new web-based breeding tools and implement the code and database structure for other species or clade-specific databases.
••
TL;DR: The omics revolution is identifying many novel enzymes and paradigms for biomass deconstruction, but more emphasis on function is required, particularly for enzyme cocktails, in which LPMOs (lytic polysaccharide monooxygenases) may play an important role.
••
TL;DR: A framework that brings together prenatal, social/contextual, and neurobiological mechanisms to explain the intergenerational transmission of self-regulation is introduced, a framework that incorporates potential transactional processes between generations.
Abstract: This review examines mechanisms contributing to the intergenerational transmission of self-regulation. To provide an integrated account of how self-regulation is transmitted across generations, we draw from over 75 years of accumulated evidence, spanning case studies to experimental approaches, in literatures covering developmental, social, and clinical psychology, criminology, physiology, genetics, and human and animal neuroscience (among others). First, we present a taxonomy of what self-regulation is and then examine how it develops; these overviews guide the main foci of the review. Next, studies supporting an association between parent and child self-regulation are reviewed. Subsequently, literature that considers potential social mechanisms of transmission, specifically parenting behavior, interparental (i.e., marital) relationship behaviors, and broader rearing influences (e.g., household chaos), is considered. Finally, evidence that prenatal programming may be the starting point of the intergenerational transmission of self-regulation is covered, along with key findings from the behavioral and molecular genetics literatures. To integrate these literatures, we introduce the self-regulation intergenerational transmission model, a framework that brings together prenatal, social/contextual, and neurobiological mechanisms (spanning endocrine, neural, and genetic levels, including gene-environment interplay and epigenetic processes) to explain the intergenerational transmission of self-regulation. This model also incorporates potential transactional processes between generations (e.g., children's self-regulation and parent–child interaction dynamics that may affect parents' self-regulation) that further influence intergenerational processes. In pointing the way forward, we note key future directions and ways to address limitations in existing work throughout the review and in closing. We also conclude by noting several implications for intervention work.
••
TL;DR: In this paper, the authors provide a comprehensive discussion of the critical physical, chemical, and structural parameters, such as soft- and hard-segment structures and their molecular weights, polymer composition, solubility parameters, and competitive intermolecular interactions, that strongly affect the morphology and the bulk and surface properties of segmented thermoplastic polyurethanes.
••
TL;DR: This paper makes the first attempt to formally address the problem of authorized data deduplication, and shows that the proposed authorized duplicate check scheme incurs minimal overhead compared to normal operations.
Abstract: Data deduplication is one of the most important data compression techniques for eliminating duplicate copies of repeating data, and has been widely used in cloud storage to reduce the amount of storage space and save bandwidth. To protect the confidentiality of sensitive data while supporting deduplication, the convergent encryption technique has been proposed to encrypt the data before outsourcing. To better protect data security, this paper makes the first attempt to formally address the problem of authorized data deduplication. Different from traditional deduplication systems, the differential privileges of users are further considered in the duplicate check, besides the data itself. We also present several new deduplication constructions supporting authorized duplicate check in a hybrid cloud architecture. Security analysis demonstrates that our scheme is secure in terms of the definitions specified in the proposed security model. As a proof of concept, we implement a prototype of our proposed authorized duplicate check scheme and conduct testbed experiments using our prototype. We show that our proposed authorized duplicate check scheme incurs minimal overhead compared to normal operations.
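The convergent encryption building block mentioned above can be sketched with a stdlib-only toy. This is an illustrative sketch, not the paper's scheme: a real deployment would use AES with the derived key rather than the hash-based keystream here, and the paper's contribution additionally binds user privileges into the duplicate check, which is omitted:

```python
import hashlib

def convergent_key(plaintext: bytes) -> bytes:
    # The key is derived from the content itself, so identical files
    # encrypt identically regardless of who uploads them.
    return hashlib.sha256(plaintext).digest()

def keystream(key: bytes, length: int) -> bytes:
    # Toy counter-mode keystream built from SHA-256 (stand-in for AES).
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(plaintext: bytes):
    key = convergent_key(plaintext)
    ct = bytes(p ^ k for p, k in zip(plaintext, keystream(key, len(plaintext))))
    tag = hashlib.sha256(ct).digest()  # tag sent to the cloud for duplicate check
    return ct, tag

# Two independent uploads of the same file deduplicate to one ciphertext:
ct1, tag1 = encrypt(b"quarterly report")
ct2, tag2 = encrypt(b"quarterly report")
assert ct1 == ct2 and tag1 == tag2
```

Because equal plaintexts yield equal tags, the storage server can detect duplicates over ciphertexts without ever learning the plaintext or the key.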
••
Affiliations: Parthenope University of Naples, INAF, Max Planck Society, Spanish National Research Council, Aix-Marseille University, International Space Science Institute, European Space Agency, Uppsala University, Polish Academy of Sciences, Braunschweig University of Technology, University of Maryland, College Park, University of Padua, Paris Diderot University, Versailles Saint-Quentin-en-Yvelines University, European Space Research and Technology Centre, Selex ES, University of Trento, Virginia Tech, University of Florida, Open University, German Aerospace Center, National Central University, University of Kent, University of Granada, Centre national de la recherche scientifique, Instituto Nacional de Técnica Aeroespacial, University of Bern, Jet Propulsion Laboratory
TL;DR: In this article, the GIADA (Grain Impact Analyser and Dust Accumulator) experiment on the European Space Agency's Rosetta spacecraft orbiting comet 67P/Churyumov-Gerasimenko was used to detect 35 outflowing grains of mass 10⁻¹⁰ to 10⁻⁷ kilograms.
Abstract: Critical measurements for understanding accretion and the dust/gas ratio in the solar nebula, where planets were forming 4.5 billion years ago, are being obtained by the GIADA (Grain Impact Analyser and Dust Accumulator) experiment on the European Space Agency's Rosetta spacecraft orbiting comet 67P/Churyumov-Gerasimenko. Between 3.6 and 3.4 astronomical units inbound, GIADA and OSIRIS (Optical, Spectroscopic, and Infrared Remote Imaging System) detected 35 outflowing grains of mass 10⁻¹⁰ to 10⁻⁷ kilograms, and 48 grains of mass 10⁻⁵ to 10⁻² kilograms, respectively. Combined with gas data from the MIRO (Microwave Instrument for the Rosetta Orbiter) and ROSINA (Rosetta Orbiter Spectrometer for Ion and Neutral Analysis) instruments, we find a dust/gas mass ratio of 4 ± 2 averaged over the sunlit nucleus surface. A cloud of larger grains also encircles the nucleus in bound orbits from the previous perihelion. The largest orbiting clumps are meter-sized, confirming the dust/gas ratio of 3 inferred at perihelion from models of dust comae and trails.
••
30 Dec 2015
TL;DR: Black feminist thought, as discussed by the authors, is a collection of ideas, writings, and art that articulates a standpoint of, for, and by African American women in the United States.
Abstract: Black feminist thought is a term used to describe a collection of ideas, writings, and art that articulates a standpoint of, for, and by African American women in the United States. Black feminist thought positions African American women as a group that is socially situated in a unique status due to simultaneous interlocking processes of race, ethnicity, gender, class, and sexual orientation. This theoretical framework posits that the politics of location and the intrapersonal and interpersonal negotiation of intersecting social identities shape African American women's individual and collective consciousness, self-definitions, and actions. The history of black feminist thought as expressed through activism and seminal scholarship are presented.
Keywords: African American; feminism; gender; identity politics; nationalism
••
TL;DR: This paper introduces outsourced computation into IBE for the first time, proposing a revocable IBE scheme in the server-aided setting together with a second construction that is provably secure under the recently formalized Refereed Delegation of Computation model.
Abstract: Identity-Based Encryption (IBE), which simplifies public key and certificate management relative to a Public Key Infrastructure (PKI), is an important alternative to public key encryption. However, one of the main efficiency drawbacks of IBE is the overhead computation at the Private Key Generator (PKG) during user revocation. Efficient revocation has been well studied in the traditional PKI setting, but the cumbersome management of certificates is precisely the burden that IBE strives to alleviate. In this paper, aiming at tackling the critical issue of identity revocation, we introduce outsourced computation into IBE for the first time and propose a revocable IBE scheme in the server-aided setting. Our scheme offloads most of the key generation related operations during key-issuing and key-update processes to a Key Update Cloud Service Provider, leaving only a constant number of simple operations for the PKG and users to perform locally. This goal is achieved by utilizing a novel collusion-resistant technique: we employ a hybrid private key for each user, in which an AND gate is involved to connect and bind the identity component and the time component. Furthermore, we propose another construction which is provably secure under the recently formalized Refereed Delegation of Computation model. Finally, we provide extensive experimental results to demonstrate the efficiency of our proposed construction.
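The hybrid-key structure described above can be illustrated with a mock. This is not the paper's pairing-based construction; the HMAC-based components and all names here are hypothetical stand-ins that only show the structural idea: an identity component issued once by the PKG, a time component re-issued each period by the Key Update Cloud Service Provider (KU-CSP), and an "AND gate" that yields a usable key only when both are present:

```python
import hashlib
import hmac

# Hypothetical demo master secret; the real scheme uses a pairing-based
# master key at the PKG and an outsourcing key at the KU-CSP.
MASTER_SECRET = b"pkg-master-secret"

def identity_component(identity):
    """Issued once per user by the PKG."""
    return hmac.new(MASTER_SECRET, b"id|" + identity.encode(), hashlib.sha256).digest()

def time_component(identity, period, revoked):
    """Re-issued each time period by the KU-CSP; withheld after revocation."""
    if identity in revoked:
        return None
    msg = b"time|" + identity.encode() + b"|" + str(period).encode()
    return hmac.new(MASTER_SECRET, msg, hashlib.sha256).digest()

def derive_decryption_key(id_comp, t_comp):
    """The 'AND gate': a usable key exists only if both components do."""
    if t_comp is None:
        return None
    return hashlib.sha256(id_comp + t_comp).digest()

revoked = set()
alice = identity_component("alice")
assert derive_decryption_key(alice, time_component("alice", 1, revoked)) is not None
revoked.add("alice")  # revocation: the KU-CSP simply stops updating
assert derive_decryption_key(alice, time_component("alice", 2, revoked)) is None
```

The point of the split is that revocation costs the PKG nothing: the KU-CSP withholding the next time-component update is enough to disable a revoked user's key.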
••
01 Nov 2015
TL;DR: This paper aims to demonstrate the efforts towards in-situ applicability of EMMARM, which aims to provide real-time information about concrete mechanical properties such as E-modulus and compressive strength.
Abstract: United States. Air Force Office of Scientific Research (Computational Mathematics Grant FA9550-12-1-0420)
••
TL;DR: In this paper, the authors explored residents' perceived value of tourism development, life domain satisfaction (material/non-material), and overall quality of life in their community using a sample of residents from five different tourism destinations.