
Showing papers by "Louisiana State University" published in 2012


Journal ArticleDOI
TL;DR: These guidelines are presented for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes.
Abstract: In 2008 we published the first set of guidelines for standardizing research in autophagy. Since then, research on this topic has continued to accelerate, and many new scientists have entered the field. Our knowledge base and relevant new technologies have also been expanding. Accordingly, it is important to update these guidelines for monitoring autophagy in different organisms. Various reviews have described the range of assays that have been used for this purpose. Nevertheless, there continues to be confusion regarding acceptable methods to measure autophagy, especially in multicellular eukaryotes. A key point that needs to be emphasized is that there is a difference between measurements that monitor the numbers or volume of autophagic elements (e.g., autophagosomes or autolysosomes) at any stage of the autophagic process vs. those that measure flux through the autophagy pathway (i.e., the complete process); thus, a block in macroautophagy that results in autophagosome accumulation needs to be differentiated from stimuli that result in increased autophagic activity, defined as increased autophagy induction coupled with increased delivery to, and degradation within, lysosomes (in most higher eukaryotes and some protists such as Dictyostelium) or the vacuole (in plants and fungi). In other words, it is especially important that investigators new to the field understand that the appearance of more autophagosomes does not necessarily equate with more autophagy. In fact, in many cases, autophagosomes accumulate because of a block in trafficking to lysosomes without a concomitant change in autophagosome biogenesis, whereas an increase in autolysosomes may reflect a reduction in degradative activity. Here, we present a set of guidelines for the selection and interpretation of methods for use by investigators who aim to examine macroautophagy and related processes, as well as for reviewers who need to provide realistic and reasonable critiques of papers that are focused on these processes. These guidelines are not meant to be a formulaic set of rules, because the appropriate assays depend in part on the question being asked and the system being used. In addition, we emphasize that no individual assay is guaranteed to be the most appropriate one in every situation, and we strongly recommend the use of multiple assays to monitor autophagy. In these guidelines, we consider these various methods of assessing autophagy and what information can, or cannot, be obtained from them. Finally, by discussing the merits and limits of particular autophagy assays, we hope to encourage technical innovation in the field.
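
The distinction the guidelines draw between autophagosome numbers and autophagic flux is easiest to see numerically. Below is a toy calculation in the spirit of an LC3-II turnover assay, one of the flux measurements the guidelines discuss: densitometry readings with and without a lysosomal inhibitor (such as bafilomycin A1). All values are hypothetical; the point is only to show why a trafficking block can raise steady-state LC3-II while flux stays flat.

```python
# Toy autophagic-flux calculation from an LC3-II turnover assay.
# Values are hypothetical densitometry readings (LC3-II normalized to a
# loading control), measured with and without a lysosomal inhibitor.
conditions = {
    # condition: (LC3-II without inhibitor, LC3-II with inhibitor)
    "control":           (1.0, 2.0),
    "stimulus":          (1.5, 4.5),   # more autophagosomes AND more flux
    "trafficking block": (3.0, 3.2),   # more autophagosomes, little flux
}

for name, (basal, inhibited) in conditions.items():
    flux = inhibited - basal   # LC3-II that would have been degraded
    print(f"{name:18s} steady-state={basal:.1f}  flux={flux:.1f}")
```

The "trafficking block" row has the highest steady-state LC3-II yet nearly zero flux, which is exactly the scenario the guidelines warn against misreading as increased autophagy.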

4,316 citations


Journal ArticleDOI
TL;DR: Among the regions of the ribosomal cistron, the internal transcribed spacer (ITS) region has the highest probability of successful identification for the broadest range of fungi, with the most clearly defined barcode gap between inter- and intraspecific variation.
Abstract: Six DNA regions were evaluated as potential DNA barcodes for Fungi, the second largest kingdom of eukaryotic life, by a multinational, multilaboratory consortium. The region of the mitochondrial cytochrome c oxidase subunit 1 used as the animal barcode was excluded as a potential marker, because it is difficult to amplify in fungi, often includes large introns, and can be insufficiently variable. Three subunits from the nuclear ribosomal RNA cistron were compared together with regions of three representative protein-coding genes (largest subunit of RNA polymerase II, second largest subunit of RNA polymerase II, and minichromosome maintenance protein). Although the protein-coding gene regions often had a higher percent of correct identification compared with ribosomal markers, low PCR amplification and sequencing success eliminated them as candidates for a universal fungal barcode. Among the regions of the ribosomal cistron, the internal transcribed spacer (ITS) region has the highest probability of successful identification for the broadest range of fungi, with the most clearly defined barcode gap between inter- and intraspecific variation. The nuclear ribosomal large subunit, a popular phylogenetic marker in certain groups, had superior species resolution in some taxonomic groups, such as the early diverging lineages and the ascomycete yeasts, but was otherwise slightly inferior to the ITS. The nuclear ribosomal small subunit has poor species-level resolution in fungi. ITS will be formally proposed for adoption as the primary fungal barcode marker to the Consortium for the Barcode of Life, with the possibility that supplementary barcodes may be developed for particular narrowly circumscribed taxonomic groups.
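
The "barcode gap" criterion has a simple computational statement: identification is reliable when the smallest between-species distance exceeds the largest within-species distance. A minimal sketch follows, with made-up toy sequences and uncorrected p-distances standing in for a real ITS alignment.

```python
from itertools import combinations

# Hypothetical aligned sequences keyed by (species, specimen).
seqs = {
    ("sp_A", 1): "ACGTACGTAC", ("sp_A", 2): "ACGTACGTAT",
    ("sp_B", 1): "ACGAACGTCC", ("sp_B", 2): "ACGAACGACC",
}

def p_dist(a, b):
    """Proportion of differing sites between two equal-length sequences."""
    return sum(x != y for x, y in zip(a, b)) / len(a)

intra, inter = [], []
for (k1, s1), (k2, s2) in combinations(seqs.items(), 2):
    (intra if k1[0] == k2[0] else inter).append(p_dist(s1, s2))

# A barcode gap exists when the smallest between-species distance
# exceeds the largest within-species distance.
print("max intra:", max(intra), "min inter:", min(inter))
print("barcode gap:", min(inter) > max(intra))
```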

4,116 citations


Journal ArticleDOI
15 Mar 2012-Nature
TL;DR: Interactions between diverse synthetic, multivalent macromolecules (including multi-domain proteins and RNA) produce sharp liquid–liquid-demixing phase separations, generating micrometre-sized liquid droplets in aqueous solution.
Abstract: Cells are organized on length scales ranging from angstrom to micrometres. However, the mechanisms by which angstrom-scale molecular properties are translated to micrometre-scale macroscopic properties are not well understood. Here we show that interactions between diverse synthetic, multivalent macromolecules (including multi-domain proteins and RNA) produce sharp liquid-liquid-demixing phase separations, generating micrometre-sized liquid droplets in aqueous solution. This macroscopic transition corresponds to a molecular transition between small complexes and large, dynamic supramolecular polymers. The concentrations needed for phase transition are directly related to the valency of the interacting species. In the case of the actin-regulatory protein called neural Wiskott-Aldrich syndrome protein (N-WASP) interacting with its established biological partners NCK and phosphorylated nephrin, the phase transition corresponds to a sharp increase in activity towards an actin nucleation factor, the Arp2/3 complex. The transition is governed by the degree of phosphorylation of nephrin, explaining how this property of the system can be controlled to regulatory effect by kinases. The widespread occurrence of multivalent systems suggests that phase transitions may be used to spatially organize and biochemically regulate information throughout biology.

1,816 citations


Journal ArticleDOI
TL;DR: This review presents a critical analysis of the challenges and limitations of the most widely used fluorescent probes for detecting and measuring reactive oxygen and nitrogen species, and proposes guidelines to help present and future researchers with the optimal use of selected fluorescent probes and the interpretation of results.

1,423 citations


Journal ArticleDOI
18 Oct 2012-Nature
TL;DR: It is shown that nutrient levels commonly associated with coastal eutrophication increased above-ground leaf biomass, decreased the dense, below-ground biomass of bank-stabilizing roots, and increased microbial decomposition of organic matter, demonstrating that nutrient enrichment can be a driver of salt marsh loss.
Abstract: A nine-year whole-ecosystem experiment demonstrates that nutrient enrichment, a global problem in coastal ecosystems, can be a driver of salt-marsh loss. Salt marshes provide important ecosystem services such as storm protection for coastal cities, nutrient removal and carbon sequestration, but despite protective measures these ecosystems are in decline. Nine years of data from a whole-ecosystem nutrient-enrichment experiment now demonstrate that current levels of coastal nutrient loading can alter key salt-marsh-ecosystem properties, leading to the collapse of creek banks and, ultimately, the conversion of salt marsh into mudflat. The potential deterioration of coastal marshes owing to eutrophication adds another dimension to the challenge of managing nitrogen while meeting food-production demands in the twenty-first century. Salt marshes are highly productive coastal wetlands that provide important ecosystem services such as storm protection for coastal cities, nutrient removal and carbon sequestration. Despite protective measures, however, worldwide losses of these ecosystems have accelerated in recent decades [1]. Here we present data from a nine-year whole-ecosystem nutrient-enrichment experiment. Our study demonstrates that nutrient enrichment, a global problem for coastal ecosystems [2-4], can be a driver of salt marsh loss. We show that nutrient levels commonly associated with coastal eutrophication increased above-ground leaf biomass, decreased the dense, below-ground biomass of bank-stabilizing roots, and increased microbial decomposition of organic matter. Alterations in these key ecosystem properties reduced geomorphic stability, resulting in creek-bank collapse with significant areas of creek-bank marsh converted to unvegetated mud. This pattern of marsh loss parallels observations for anthropogenically nutrient-enriched marshes worldwide, with creek-edge and bay-edge marsh evolving into mudflats and wider creeks [5-7]. Our work suggests that current nutrient loading rates to many coastal ecosystems have overwhelmed the capacity of marshes to remove nitrogen without deleterious effects. Projected increases in nitrogen flux to the coast, related to increased fertilizer use required to feed an expanding human population, may rapidly result in a coastal landscape with less marsh, which would reduce the capacity of coastal regions to provide important ecological and economic services.

844 citations


Journal ArticleDOI
TL;DR: The status of global food security, i.e., the balance between the growing food demand of the world population and global agricultural output, combined with discrepancies between supply and demand at the regional, national, and local scales, is alarming.
Abstract: The status of global food security, i.e., the balance between the growing food demand of the world population and global agricultural output, combined with discrepancies between supply and demand at the regional, national, and local scales (Smil 2000; UN Department of Economic and Social Affairs 2011; Ingram 2011), is alarming. This imbalance is not new (Dyson 1999) but has dramatically worsened during recent decades, culminating in the 2008 food crisis. It is important to note that in mid-2011, food prices were back to the heights they had reached in the middle of the 2008 crisis (FAO 2011). Plant protection in general, and the protection of crops against plant diseases in particular, have an obvious role to play in meeting the growing demand for food quality and quantity (Strange and Scott 2005). Roughly, direct yield losses caused by pathogens, animals, and weeds are altogether responsible for losses ranging between 20 and 40% of global agricultural productivity (Teng and Krupa 1980; Teng 1987; Oerke et al. 1994; Oerke 2006). Crop losses due to pests and pathogens are direct as well as indirect; they have a number of facets, some with short-, and others with long-term consequences (Zadoks 1967). The phrase "losses between 20 and 40%" therefore inadequately reflects the true costs of crop losses to consumers, public health, societies, environments, economic fabrics and farmers. The components of food security include food availability (production, import, reserves), physical and economic access to food, and food utilisation (e.g., nutritive value, safety), as has recently been reviewed by Ingram (2011). Although crop losses caused by plant disease directly affect the first of these components, they also affect the others (e.g., the food utilisation component) directly or indirectly through the fabrics of trade, policies and societies (Zadoks 2008). Most of the agricultural research conducted in the 20th century focused on increasing crop productivity as the world population and its food needs grew (Evans 1998; Smil 2000; Nellemann et al. 2009). Plant protection then primarily focused on protecting crops from yield losses due to biological and non-biological causes. The problem remains as challenging today as in the 20th century, with additional complexity generated by the reduced room for manoeuvre available environmentally, economically, and socially (FAO 2011; Brown 2011). This results from shrinking natural resources that are available to agriculture: these include water, agricultural land, arable soil, biodiversity, the availability of non-renewable energy, human labour, and fertilizers (Smil 2000), and the deployment of some key inputs, such as high-quality seeds and planting material (Evans 1998). In addition to yield losses caused by diseases, these new elements of complexity also include post-harvest quality losses and the possible accumulation of toxins during and after harvest.

720 citations


Journal Article
TL;DR: The present data demonstrate an increase in CAM usage from 1990 through 2006 in all countries investigated, as well as differences between the general population and medical personnel.

Abstract: Background: The interest in complementary and alternative medicine (CAM) has increased during the past decade and the attitude of the general public is mainly positive, but the debate about the clinical effectiveness of these therapies remains controversial among many medical professionals. Methods: We conducted a systematic search of the existing literature utilizing different databases, including PubMed/Medline, PSYNDEX, and PsycLit, to research the use and acceptance of CAM among the general population and medical personnel. A special focus on CAM-referring literature was set by limiting the PubMed search to "Complementary Medicine" and adding two other search engines: CAMbase (www.cambase.de) and CAMRESEARCH (www.camresearch.net). These engines were used to reveal publications that at the time of the review were not indexed in PubMed. Results: A total of 16 papers met the scope criteria. Prevalence rates of CAM in each of the included studies were between 5% and 74.8%. We found a higher utilization of homeopathy and acupuncture in German-speaking countries. Excluding any form of spiritual prayer, the data demonstrate that chiropractic manipulation, herbal medicine, massage, and homeopathy were the therapies most commonly used by the general population. We identified sex, age, and education as predictors of CAM utilization: More users were women, middle aged, and more educated. The ailments most often associated with CAM utilization included back pain or pathology, depression, insomnia, severe headache or migraine, and stomach or intestinal illnesses. Medical students were the most critical toward CAM. Compared to students of other professions (i.e., nursing students: 44.7%, pharmacy students: 18.2%), medical students reported the least consultation with a CAM practitioner (10%). Conclusions: The present data demonstrate an increase of CAM usage from 1990 through 2006 in all countries investigated. We found geographical differences, as well as differences between the general population and medical personnel.

548 citations


Journal ArticleDOI
TL;DR: The Einstein Toolkit is a community-driven, freely accessible computational infrastructure intended for use in numerical relativity, relativistic astrophysics, and other applications; it combines a core set of components needed to simulate astrophysical objects such as black holes, compact objects, and collapsing stars.
Abstract: We describe the Einstein Toolkit, a community-driven, freely accessible computational infrastructure intended for use in numerical relativity, relativistic astrophysics, and other applications. The toolkit, developed by a collaboration involving researchers from multiple institutions around the world, combines a core set of components needed to simulate astrophysical objects such as black holes, compact objects, and collapsing stars, as well as a full suite of analysis tools. The Einstein Toolkit is currently based on the Cactus framework for high-performance computing and the Carpet adaptive mesh refinement driver. It implements spacetime evolution via the BSSN evolution system and general relativistic hydrodynamics in a finite-volume discretization. The toolkit is under continuous development and contains many new code components that have been publicly released for the first time and are described in this paper. We discuss the motivation behind the release of the toolkit, the philosophy underlying its development, and the goals of the project. A summary of the implemented numerical techniques is included, as are results of numerical tests covering a variety of sample astrophysical problems.
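
For readers unfamiliar with the finite-volume discretization mentioned above, the sketch below shows the idea on the simplest possible case: first-order upwind advection in 1D with periodic boundaries. This is only an illustration of the scheme family; the toolkit's actual GRHD solver evolves the relativistic Euler equations with high-resolution shock-capturing methods, far beyond this toy.

```python
import numpy as np

nx, a, cfl = 200, 1.0, 0.5
dx = 1.0 / nx
dt = cfl * dx / a                    # CFL-limited time step
x = (np.arange(nx) + 0.5) * dx
u = np.exp(-200 * (x - 0.3) ** 2)    # initial cell averages
mass0 = u.sum() * dx

# 100 first-order upwind steps: each cell is updated only through the
# difference of fluxes at its faces, the defining finite-volume property,
# so the total "mass" is conserved to machine precision.
for _ in range(100):
    flux = a * u                              # upwind face flux (a > 0)
    u -= dt / dx * (flux - np.roll(flux, 1))  # conservative update

print("mass conserved:", np.isclose(u.sum() * dx, mass0))
```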

479 citations


Journal ArticleDOI
TL;DR: The significantly reduced treatment time required to remove the Cr(VI) and the applicability in treating the solutions with low pH make MGNCs promising for the efficient removal of heavy metals from the wastewater.
Abstract: A facile thermodecomposition process to synthesize magnetic graphene nanocomposites (MGNCs) is reported. High-resolution transmission electron microscopy and energy filtered elemental mapping revealed a core@double-shell structure of the nanoparticles with crystalline iron as the core, iron oxide as the inner shell and amorphous Si–S–O compound as the outer shell. The MGNCs demonstrate an extremely fast Cr(VI) removal from the wastewater with a high removal efficiency and with an almost complete removal of Cr(VI) within 5 min. The adsorption kinetics follows the pseudo-second-order model and the novel MGNC adsorbent exhibits better Cr(VI) removal efficiency in solutions with low pH. The large saturation magnetization (96.3 emu/g) of the synthesized nanoparticles allows fast separation of the MGNCs from liquid suspension. By using a permanent magnet, the recycling process of both the MGNC adsorbents and the adsorbed Cr(VI) is more energetically and economically sustainable. The significantly reduced treatment time required to remove the Cr(VI) and the applicability in treating solutions with low pH make MGNCs promising for the efficient removal of heavy metals from wastewater.
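
The pseudo-second-order model referenced here is the Ho and McKay rate law dq/dt = k(qe − q)^2, whose linearized form t/q = 1/(k·qe^2) + t/qe lets qe and k be read off a straight-line fit. Below is a sketch with synthetic uptake data; the "true" constants are invented for illustration and are not the paper's measured values.

```python
import numpy as np

# Pseudo-second-order adsorption kinetics:
#   q(t) = qe^2 * k * t / (1 + qe * k * t),  linearized as
#   t/q(t) = 1/(k*qe^2) + t/qe.
qe_true, k_true = 3.2, 0.9                   # hypothetical mg/g, g/(mg*min)
t = np.array([0.5, 1, 2, 3, 5, 10, 20])      # sampling times in minutes
q = qe_true**2 * k_true * t / (1 + qe_true * k_true * t)
q *= 1 + 0.01 * np.random.default_rng(0).normal(size=t.size)  # 1% noise

# Linear regression of t/q against t recovers the model parameters.
slope, intercept = np.polyfit(t, t / q, 1)
qe_fit = 1 / slope
k_fit = 1 / (intercept * qe_fit**2)
print(f"qe ~ {qe_fit:.2f} mg/g, k ~ {k_fit:.2f} g/(mg*min)")
```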

476 citations


Journal ArticleDOI
TL;DR: This paper presents a Bayesian implementation of an evolutionary model-based method, the general mixed Yule-coalescent model (GMYC), which integrates over the parameters of the model and uncertainty in phylogenetic relationships using the output of widely available phylogenetic models and Markov-Chain Monte Carlo simulation in order to produce marginal probabilities of species identities.
Abstract: Species are considered the fundamental unit in many ecological and evolutionary analyses, yet accurate, complete, accessible taxonomic frameworks with which to identify them are often unavailable to researchers. In such cases DNA sequence-based species delimitation has been proposed as a means of estimating species boundaries for further analysis. Several methods have been proposed to accomplish this. Here we present a Bayesian implementation of an evolutionary model-based method, the general mixed Yule-coalescent model (GMYC). Our implementation integrates over the parameters of the model and uncertainty in phylogenetic relationships using the output of widely available phylogenetic models and Markov-Chain Monte Carlo (MCMC) simulation in order to produce marginal probabilities of species identities. We conducted simulations testing the effects of species evolutionary history, levels of intraspecific sampling and number of nucleotides sequenced. We also re-analyze the dataset used to introduce the original GMYC model. We found that the model results are improved with addition of DNA sequence and increased sampling, although these improvements have limits. The most important factor in the success of the model is the underlying phylogenetic history of the species under consideration. Recent and rapid divergences result in higher amounts of uncertainty in the model and eventually cause the model to fail to accurately assess uncertainty in species limits. Our results suggest that the GMYC model can be useful under a wide variety of circumstances, particularly in cases where divergences are deeper, or taxon sampling is incomplete, as in many studies of ecological communities, but that, in accordance with expectations from coalescent theory, rapid, recent radiations may yield inaccurate results. Our implementation differs from existing ones in two ways: it allows for the accounting for important sources of uncertainty in the model (phylogenetic and in parameters specific to the model) and in the specification of informative prior distributions that can increase the precision of the model. We have incorporated this model into a user-friendly R package available on the authors’ websites.
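
The GMYC machinery can be understood as a change-point problem: branching events deeper than a threshold occur at a slow, speciation-like rate, while shallower ones occur at a faster, coalescent-like rate, and MCMC integrates over the threshold (and rates) to yield marginal probabilities. The sketch below is a heavily simplified single-threshold toy with fabricated node ages; it is not the authors' actual likelihood or R package, only an illustration of marginalizing over parameter uncertainty.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical node ages (time before present) from an ultrametric tree:
# a few deep "speciation" divergences plus many shallow coalescences.
ages = np.sort(np.concatenate([
    rng.uniform(5.0, 10.0, size=8),
    rng.uniform(0.0, 0.5, size=40),
]))[::-1]                                      # oldest node first

def log_lik(params):
    """Toy change-point likelihood: waiting times between successive
    branching events follow rate r_old above the threshold age and
    r_young below it."""
    thr, r_old, r_young = params
    if r_old <= 0 or r_young <= 0:
        return -np.inf
    dts = ages[:-1] - ages[1:]                 # inter-event waiting times
    rates = np.where(ages[1:] > thr, r_old, r_young)
    return float(np.sum(np.log(rates) - rates * dts))

# Random-walk Metropolis over (threshold, r_old, r_young).
params = np.array([2.0, 1.0, 20.0])
ll = log_lik(params)
draws = []
for step in range(30000):
    prop = params + rng.normal(0, [0.3, 0.3, 3.0])
    ll_prop = log_lik(prop)
    if np.log(rng.random()) < ll_prop - ll:    # accept/reject
        params, ll = prop, ll_prop
    if step >= 10000:                          # discard burn-in
        draws.append(params[0])
draws = np.array(draws)

# Marginal probability that each node lies above the species threshold,
# i.e. represents a between-species divergence.
p_between = (ages[:, None] > draws[None, :]).mean(axis=1)
print(np.round(p_between, 2))
```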

448 citations


Journal ArticleDOI
TL;DR: This review surveys recent literature on the three named issues, with emphasis on methodological advancements and implications for public policy; topics include late-stage cancer diagnosis and optimization methods that can be used to improve the distribution and supply of health care providers.
Abstract: Despite spending more than any other nation on medical care per person, the United States ranks behind other industrialized nations in key health performance measures. A main cause is the deep disparities ...
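
As an illustration of the kind of optimization the review refers to, here is a toy greedy heuristic for the classic p-median location problem (choose p provider sites minimizing total travel distance from demand points). The data are random and the method is generic; it is not a technique taken from the review itself.

```python
import random

random.seed(0)
demand = [(random.random(), random.random()) for _ in range(60)]  # population points
sites = [(random.random(), random.random()) for _ in range(12)]   # candidate sites
p = 3                                                             # providers to open

def dist(a, b):
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def total_cost(open_sites):
    """Total distance from every demand point to its nearest open site."""
    return sum(min(dist(d, sites[j]) for j in open_sites) for d in demand)

chosen = []
while len(chosen) < p:                 # greedy: add the site that helps most
    candidates = set(range(len(sites))) - set(chosen)
    best = min(candidates, key=lambda j: total_cost(chosen + [j]))
    chosen.append(best)

print("sites:", sorted(chosen), "total distance:", round(total_cost(chosen), 2))
```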

Journal ArticleDOI
TL;DR: The critical issues that must be addressed if BNCT is to become a more widely established clinical modality for the treatment of those malignancies for which there currently are no good treatment options are summarized.
Abstract: Boron neutron capture therapy (BNCT) is a biochemically targeted radiotherapy based on the nuclear capture and fission reactions that occur when non-radioactive boron-10, which is a constituent of natural elemental boron, is irradiated with low energy thermal neutrons to yield high linear energy transfer alpha particles and recoiling lithium-7 nuclei. Clinical interest in BNCT has focused primarily on the treatment of high grade gliomas, recurrent cancers of the head and neck region and either primary or metastatic melanoma. Neutron sources for BNCT currently have been limited to specially modified nuclear reactors, which are (or, until the recent Japanese natural disaster, were) available in Japan, the United States, Finland and several other European countries, Argentina and Taiwan. Accelerators producing epithermal neutron beams also could be used for BNCT and these are being developed in several countries. It is anticipated that the first Japanese accelerator will be available for therapeutic use in 2013. The major hurdle for the design and synthesis of boron delivery agents has been the requirement for selective tumor targeting to achieve boron concentrations in the range of 20 μg/g. This would be sufficient to deliver therapeutic doses of radiation with minimal normal tissue toxicity. Two boron drugs have been used clinically: a dihydroxyboryl derivative of phenylalanine, referred to as boronophenylalanine or “BPA”, and sodium borocaptate or “BSH” (Na2B12H11SH). In this report we will provide an overview of other boron delivery agents that currently are under evaluation, neutron sources in use or under development for BNCT, clinical dosimetry, treatment planning, and finally a summary of previous and on-going clinical studies for high grade gliomas and recurrent tumors of the head and neck region. Promising results have been obtained with both groups of patients but these outcomes must be more rigorously evaluated in larger, possibly randomized clinical trials. Finally, we will summarize the critical issues that must be addressed if BNCT is to become a more widely established clinical modality for the treatment of those malignancies for which there currently are no good treatment options.
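
The 20 μg/g target can be connected to dose with back-of-envelope arithmetic: the absorbed dose from the boron component is the number of 10B(n,α)7Li captures per unit mass times the energy deposited per capture. The sketch below uses the well-known thermal capture cross-section of 10B (about 3840 barns) and roughly 2.3 MeV of locally deposited kinetic energy per capture; the neutron fluence is an assumed, purely illustrative value, not a clinical prescription.

```python
# Back-of-envelope boron dose from the 10B(n,alpha)7Li capture reaction.
N_A = 6.022e23              # Avogadro's number
sigma = 3840e-24            # thermal capture cross-section, cm^2 per 10B atom
E_dep = 2.3e6 * 1.602e-19   # ~2.3 MeV deposited locally per capture, in joules

boron_conc = 20e-6          # g of 10B per g of tissue (the 20 ug/g target)
fluence = 1e12              # thermal neutrons per cm^2 (assumed, illustrative)

atoms_per_kg = boron_conc * 1000 / 10.0 * N_A    # 10B atoms per kg of tissue
captures_per_kg = atoms_per_kg * sigma * fluence # reactions per kg
dose_Gy = captures_per_kg * E_dep                # J/kg = Gy

print(f"boron-capture dose ~ {dose_Gy:.2f} Gy")  # ~1.7 Gy for these inputs
```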

Journal ArticleDOI
TL;DR: This study outlines a phylogenomic approach using a novel class of phylogenetic markers derived from ultraconserved elements and flanking DNA, and shows that this class of marker is useful for recovering deep-level phylogeny in placental mammals.
Abstract: Phylogenomics offers the potential to fully resolve the Tree of Life, but increasing genomic coverage also reveals conflicting evolutionary histories among genes, demanding new analytical strategies for elucidating a single history of life. Here, we outline a phylogenomic approach using a novel class of phylogenetic markers derived from ultraconserved elements and flanking DNA. Using species-tree analysis that accounts for discord among hundreds of independent loci, we show that this class of marker is useful for recovering deep-level phylogeny in placental mammals. In broad outline, our phylogeny agrees with recent phylogenomic studies of mammals, including several formerly controversial relationships. Our results also inform two outstanding questions in placental mammal phylogeny involving rapid speciation, where species-tree methods are particularly needed. Contrary to most phylogenomic studies, our study supports a first-diverging placental mammal lineage that includes elephants and tenrecs (Afrotheria). The level of conflict among gene histories is consistent with this basal divergence occurring in or near a phylogenetic ‘‘anomaly zone’’ where a failure to account for coalescent stochasticity will mislead phylogenetic inference. Addressing a long-standing phylogenetic mystery, we find some support from a high genomic coverage data set for a traditional placement of bats (Chiroptera) sister to a clade containing Perissodactyla, Cetartiodactyla, and Carnivora, and not nested within the latter clade, as has been suggested recently, although other results were conflicting. One of the most remarkable findings of our study is that ultraconserved elements and their flanking DNA are a rich source of phylogenetic information with strong potential for application across Amniotes. [Supplemental material is available for this article.]

Journal ArticleDOI
TL;DR: This document represents an attempt of a working group of the American Academy of Allergy, Asthma & Immunology to provide further guidance and synthesis in this use of vaccination for diagnostic purposes in consideration of PIDD, as well as to identify key areas for further research.
Abstract: A major diagnostic intervention in the consideration of many patients suspected to have primary immunodeficiency diseases (PIDDs) is the application and interpretation of vaccination. Specifically, the antibody response to antigenic challenge with vaccines can provide substantive insight into the status of human immune function. There are numerous vaccines that are commonly used in healthy individuals, as well as others that are available for specialized applications. Both can potentially be used to facilitate consideration of PIDD. However, the application of vaccines and interpretation of antibody responses in this context are complex. These rely on consideration of numerous existing specific studies, interpolation of data from healthy populations, current diagnostic guidelines, and expert subspecialist practice. This document represents an attempt of a working group of the American Academy of Allergy, Asthma & Immunology to provide further guidance and synthesis in this use of vaccination for diagnostic purposes in consideration of PIDD, as well as to identify key areas for further research.

Journal ArticleDOI
Yu Liu, Fahui Wang, Yu Xiao, Song Gao
TL;DR: Using a seven-day taxi trajectory data set collected in Shanghai, this paper investigates the temporal variations of both pick-ups and drop-offs and their association with different land use features.

Journal ArticleDOI
TL;DR: The technical achievements of Google Earth, and the functionality of this first generation of virtual globes, are reviewed against the Gore vision, to prompt a reexamination of the initial vision of Digital Earth.
Abstract: A speech of then-Vice President Al Gore in 1998 created a vision for a Digital Earth, and played a role in stimulating the development of a first generation of virtual globes, typified by Google Earth, that achieved many but not all the elements of this vision. The technical achievements of Google Earth, and the functionality of this first generation of virtual globes, are reviewed against the Gore vision. Meanwhile, developments in technology continue, the era of “big data” has arrived, the general public is more and more engaged with technology through citizen science and crowd-sourcing, and advances have been made in our scientific understanding of the Earth system. However, although Google Earth stimulated progress in communicating the results of science, there continue to be substantial barriers in the public’s access to science. All these factors prompt a reexamination of the initial vision of Digital Earth, and a discussion of the major elements that should be part of a next generation.

Journal ArticleDOI
TL;DR: The presence of recently damaged and deceased corals beneath the path of a previously documented plume emanating from the Macondo well provides compelling evidence that the oil impacted deep-water ecosystems.
Abstract: To assess the potential impact of the Deepwater Horizon oil spill on offshore ecosystems, 11 sites hosting deep-water coral communities were examined 3 to 4 mo after the well was capped. Healthy coral communities were observed at all sites >20 km from the Macondo well, including seven sites previously visited in September 2009, where the corals and communities appeared unchanged. However, at one site 11 km southwest of the Macondo well, coral colonies presented widespread signs of stress, including varying degrees of tissue loss, sclerite enlargement, excess mucous production, bleached commensal ophiuroids, and covering by brown flocculent material (floc). On the basis of these criteria the level of impact to individual colonies was ranked from 0 (least impact) to 4 (greatest impact). Of the 43 corals imaged at that site, 46% exhibited evidence of impact on more than half of the colony, whereas nearly a quarter of all of the corals showed impact to >90% of the colony. Additionally, 53% of these corals’ ophiuroid associates displayed abnormal color and/or attachment posture. Analysis of hopanoid petroleum biomarkers isolated from the floc provides strong evidence that this material contained oil from the Macondo well. The presence of recently damaged and deceased corals beneath the path of a previously documented plume emanating from the Macondo well provides compelling evidence that the oil impacted deep-water ecosystems. Our findings underscore the unprecedented nature of the spill in terms of its magnitude, release at depth, and impact to deep-water ecosystems.

01 Jan 2012
TL;DR: The basic concepts of genetic integrity and the consequences of defects in DNA repair relevant to CRC are discussed, and epigenetic alterations, essential in CRC tumorigenesis, are reviewed alongside clinical information relevant to CRC.
Abstract: Colorectal cancer (CRC) is the third most common cancer in men and the second most common cancer in women worldwide. Both genetic and epigenetic alterations are common in CRC and are the driving force of tumorigenesis. The adenoma-carcinoma sequence, proposed in the 1980s, described the transformation of normal colorectal epithelium to an adenoma and ultimately to an invasive and metastatic tumor. Initial genetic changes start in an early adenoma and accumulate as it transforms to carcinoma. Chromosomal instability, microsatellite instability and CpG island methylator phenotype pathways are responsible for genetic instability in colorectal cancer. The chromosomal instability pathway consists of activation of proto-oncogenes (KRAS) and inactivation of at least three tumor suppressor genes, namely loss of APC, loss of p53, and loss of heterozygosity (LOH) of the long arm of chromosome 18. Mutations of the TGFBR and PIK3CA genes have also recently been described. Herein we briefly discuss the basic concepts of genetic integrity and the consequences of defects in DNA repair relevant to CRC. Epigenetic alterations, essential in CRC tumorigenesis, are also reviewed alongside clinical information relevant to CRC.

Journal ArticleDOI
TL;DR: The resulting phylogeny provides overwhelming support for the hypothesis that turtles evolved from a common ancestor of birds and crocodilians, rejecting the hypothesized relationship between turtles and lepidosaurs.
Abstract: We present the first genomic-scale analysis addressing the phylogenetic position of turtles, using over 1000 loci from representatives of all major reptile lineages including tuatara. Previously, studies of morphological traits positioned turtles either at the base of the reptile tree or with lizards, snakes and tuatara (lepidosaurs), whereas molecular analyses typically allied turtles with crocodiles and birds (archosaurs). A recent analysis of shared microRNA families found that turtles are more closely related to lepidosaurs. To test this hypothesis with data from many single-copy nuclear loci dispersed throughout the genome, we used sequence capture, high-throughput sequencing and published genomes to obtain sequences from 1145 ultraconserved elements (UCEs) and their variable flanking DNA. The resulting phylogeny provides overwhelming support for the hypothesis that turtles evolved from a common ancestor of birds and crocodilians, rejecting the hypothesized relationship between turtles and lepidosaurs.

Journal ArticleDOI
TL;DR: In this article, the authors used meta-analytical techniques to assess the extent to which job burnout and employee engagement are independent and useful constructs, finding that burnout and engagement dimensions exhibit a similar pattern of association with correlates and that controlling for burnout in meta-regression equations significantly reduced the effect sizes associated with engagement.

Journal ArticleDOI
12 Jan 2012-Nature
TL;DR: The central region of the supernova remnant SNR 0509−67.5 in the Large Magellanic Cloud contains no ex-companion star to a visual magnitude limit of 26.9 (an absolute magnitude of M_V = +8.4) within a region of radius 1.43 arcseconds, which rules out all published single-degenerate models for this supernova.
Abstract: The central region of the supernova remnant SNR 0509−67.5 in the Large Magellanic Cloud is shown to contain no ex-companion star, which suggests it was formed by an explosion resulting from the merger of two white dwarf stars. Type Ia supernovae are thought to result from a thermonuclear explosion of a white dwarf star, arising from either the merger of two white dwarfs (the 'double-degenerate' path), or by mass transfer from a companion star ('single-degenerate' path). The precise identity of the progenitor is still controversial. Bradley Schaefer and Ashley Pagnotta have examined high-resolution Hubble Space Telescope images of the central region of the supernova remnant SNR 0509-67.5 in the Large Magellanic Cloud, and find no sign of a point source. This lack of any ex-companion star rules out all published single-degenerate models for this supernova and points to a double-degenerate merger as the most likely origin. A type Ia supernova is thought to begin with the explosion of a white dwarf star [1]. The explosion could be triggered by the merger of two white dwarfs [2,3] (a 'double-degenerate' origin), or by mass transfer from a companion star [4,5] (the 'single-degenerate' path). The identity of the progenitor is still controversial; for example, a recent argument against the single-degenerate origin [6] has been widely rejected [7-11]. One way to distinguish between the double- and single-degenerate progenitors is to look at the centre of a known type Ia supernova remnant to see whether any former companion star is present [12,13]. A likely ex-companion star for the progenitor of the supernova observed by Tycho Brahe has been identified [14], but that claim is still controversial [15-18]. Here we report that the central region of the supernova remnant SNR 0509−67.5 (the site of a type Ia supernova 400 ± 50 years ago, based on its light echo [19,20]) in the Large Magellanic Cloud contains no ex-companion star to a visual magnitude limit of 26.9 (an absolute magnitude of M_V = +8.4) within a region of radius 1.43 arcseconds. (This corresponds to the 3σ maximum distance to which a companion could have been 'kicked' by the explosion.) This lack of any ex-companion star to deep limits rules out all published single-degenerate models for this supernova. The only remaining possibility is that the progenitor of this particular type Ia supernova was a double-degenerate system.
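
The quoted magnitude limit converts between apparent and absolute magnitude via the distance modulus m − M = 5 log10(d/10 pc). With the standard LMC distance modulus of about 18.5 (roughly 50 kpc), the paper's numbers are self-consistent, as this short check shows.

```python
# Distance-modulus check for the quoted magnitude limit.
mu_lmc = 18.5                 # standard LMC distance modulus (assumed value)
m_limit = 26.9                # quoted apparent visual magnitude limit

M_limit = m_limit - mu_lmc    # absolute magnitude limit
d_pc = 10 ** (mu_lmc / 5 + 1) # distance implied by the modulus

print(f"M_V limit = {M_limit:+.1f}")         # -> +8.4, matching the paper
print(f"LMC distance ~ {d_pc / 1e3:.0f} kpc")
```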

Journal ArticleDOI
TL;DR: This uniquely large pooled analysis confirms that current cigarette smoking is associated with a twofold increased risk of pancreatic cancer and that the risk increases with the number of cigarettes smoked and duration of smoking.

Journal ArticleDOI
TL;DR: The most recent proposed functional roles for these components, their structures (as deduced from biochemical and X-ray crystallographic studies) and the locations of their proposed binding domains within the Photosystem II complex are discussed.

Journal ArticleDOI
TL;DR: This review focuses on the key aspects of glutathione redox mechanisms associated with apoptotic signaling, including changes in cellular glutathione redox homeostasis through glutathione oxidation or GSH transport in relation to the initiation or propagation of the apoptotic cascade, and evidence for S-glutathiolation in protein modulation and apoptosis initiation.

Journal ArticleDOI
TL;DR: It is proposed that the degree of fragmentation or permeability of the geographical setting together with the intermediate dispersal model are crucial in reconciling previous, often contradictory findings regarding the relationship between dispersal and diversification.
Abstract: Dispersal can stimulate speciation by facilitating geographical expansion across barriers or inhibit speciation by maintaining gene flow among populations. Therefore, the relationship between dispersal ability and speciation rates can be positive or negative. Furthermore, an ‘intermediate dispersal’ model that combines positive and negative effects predicts a unimodal relationship between dispersal and diversification. Because both dispersal ability and speciation rates are difficult to quantify, empirical evidence for the relationship between dispersal and diversification remains scarce. Using a surrogate for flight performance and a species-level DNA-based phylogeny of a large South American bird radiation (the Furnariidae), we found that lineages with higher dispersal ability experienced lower speciation rates. We propose that the degree of fragmentation or permeability of the geographical setting together with the intermediate dispersal model are crucial in reconciling previous, often contradictory findings regarding the relationship between dispersal and diversification.

Journal ArticleDOI
TL;DR: It is suggested that heavily weathered crude oil from the spill imparts significant biological impacts in sensitive Louisiana marshes, some of which remain for over 2 mo following initial exposures.
Abstract: The biological consequences of the Deepwater Horizon oil spill are unknown, especially for resident organisms. Here, we report results from a field study tracking the effects of contaminating oil across space and time in resident killifish during the first 4 mo of the spill event. Remote sensing and analytical chemistry identified exposures, which were linked to effects in fish characterized by genome expression and associated gill immunohistochemistry, despite very low concentrations of hydrocarbons remaining in water and tissues. Divergence in genome expression coincides with contaminating oil and is consistent with genome responses that are predictive of exposure to hydrocarbon-like chemicals and indicative of physiological and reproductive impairment. Oil-contaminated waters are also associated with aberrant protein expression in gill tissues of larval and adult fish. These data suggest that heavily weathered crude oil from the spill imparts significant biological impacts in sensitive Louisiana marshes, some of which remain for over 2 mo following initial exposures.

Journal ArticleDOI
TL;DR: The findings provide a better understanding of the mechanism(s) involved in stem cell aging and regenerative potential, and this in turn may affect tissue repair that declines with aging.

Journal ArticleDOI
TL;DR: In this article, the effects of source materials on the microstructure and mechanical properties were studied by comparing two types of geopolymers synthesized from metakaolin, a non-waste material, and the admixture of two wastes, red mud and fly ash.

Journal ArticleDOI
TL;DR: It is argued that the vision of Digital Earth put forward by Vice-President Al Gore 13 years ago needs to be re-evaluated in the light of the many developments in the fields of information technology, data infrastructures and earth observation that have taken place since.
Abstract: This position paper is the outcome of a brainstorming workshop organised by the International Society for Digital Earth (ISDE) in Beijing in March 2011. It argues that the vision of Digital Earth (DE) put forward by Vice-President Al Gore 13 years ago needs to be re-evaluated in the light of the many developments in the fields of information technology, data infrastructures and earth observation that have taken place since. The paper identifies the main policy, scientific and societal drivers for the development of DE and illustrates the multi-faceted nature of a new vision of DE grounding it with a few examples of potential applications. Because no single organisation can on its own develop all the aspects of DE, it is essential to develop a series of collaborations at the global level to turn the vision outlined in this paper into reality.

Journal ArticleDOI
TL;DR: In this article, a multi-objective optimization model based on the harmony search (HS) algorithm is presented to minimize the life cycle cost (LCC) and carbon dioxide equivalent (CO2-eq) emissions of buildings.
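
For readers unfamiliar with harmony search, the sketch below is a generic implementation minimizing a weighted sum of two invented surrogate objectives standing in for LCC and CO2-eq emissions. The surrogate functions, the decision variables, the HMCR/PAR parameter values, and the weighted-sum scalarization are all illustrative assumptions, not the authors' formulation.

```python
import random

def lcc(x):   # hypothetical life-cycle-cost surrogate
    return (x[0] - 3) ** 2 + 0.5 * (x[1] - 1) ** 2 + 10

def co2(x):   # hypothetical CO2-eq emissions surrogate
    return (x[0] - 1) ** 2 + (x[1] - 4) ** 2 + 5

def fitness(x, w=0.5):
    return w * lcc(x) + (1 - w) * co2(x)   # weighted-sum scalarization

def harmony_search(bounds, hms=20, hmcr=0.9, par=0.3, bw=0.2, iters=5000):
    """Generic harmony search: improvise a new harmony each iteration and
    replace the worst member of harmony memory if the new one is better."""
    dim = len(bounds)
    memory = [[random.uniform(lo, hi) for lo, hi in bounds] for _ in range(hms)]
    scores = [fitness(x) for x in memory]
    for _ in range(iters):
        new = []
        for d, (lo, hi) in enumerate(bounds):
            if random.random() < hmcr:            # draw from harmony memory
                v = random.choice(memory)[d]
                if random.random() < par:         # pitch adjustment
                    v += bw * random.uniform(-1, 1)
            else:                                 # random re-initialization
                v = random.uniform(lo, hi)
            new.append(min(max(v, lo), hi))       # clamp to bounds
        s = fitness(new)
        worst = max(range(hms), key=scores.__getitem__)
        if s < scores[worst]:
            memory[worst], scores[worst] = new, s
    best = min(range(hms), key=scores.__getitem__)
    return memory[best], scores[best]

random.seed(0)
x, s = harmony_search([(0, 5), (0, 5)])
print("design:", [round(v, 2) for v in x], "LCC:", round(lcc(x), 2), "CO2:", round(co2(x), 2))
```

Changing the weight w traces out different trade-offs between the two objectives, which is the simplest (though not the only) way to handle the multi-objective aspect.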