
Showing papers in "PubMed Central in 2012"


Journal Article
TL;DR: The Human Microbiome Project has analysed the largest cohort and set of distinct, clinically relevant body habitats so far, finding the diversity and abundance of each habitat’s signature microbes to vary widely even among healthy subjects, with strong niche specialization both within and among individuals.
Abstract: Studies of the human microbiome have revealed that even healthy individuals differ remarkably in the microbes that occupy habitats such as the gut, skin and vagina. Much of this diversity remains unexplained, although diet, environment, host genetics and early microbial exposure have all been implicated. Accordingly, to characterize the ecology of human-associated microbial communities, the Human Microbiome Project has analysed the largest cohort and set of distinct, clinically relevant body habitats so far. We found the diversity and abundance of each habitat’s signature microbes to vary widely even among healthy subjects, with strong niche specialization both within and among individuals. The project encountered an estimated 81–99% of the genera, enzyme families and community configurations occupied by the healthy Western microbiome. Metagenomic carriage of metabolic pathways was stable among individuals despite variation in community structure, and ethnic/racial background proved to be one of the strongest associations of both pathways and microbes with clinical metadata. These results thus delineate the range of structural and functional configurations normal in the microbial communities of a healthy population, enabling future characterization of the epidemiology, ecology and translational applications of the human microbiome.

6,350 citations


Journal Article
TL;DR: In this paper, a split-spectrum amplitude-decorrelation angiography (SSADA) algorithm was proposed to improve the signal-to-noise ratio (SNR) of flow detection.
Abstract: Amplitude decorrelation measurement is sensitive to transverse flow and immune to phase noise in comparison to Doppler and other phase-based approaches. However, the high axial resolution of OCT makes it very sensitive to pulsatile bulk motion noise in the axial direction. To overcome this limitation, we developed split-spectrum amplitude-decorrelation angiography (SSADA) to improve the signal-to-noise ratio (SNR) of flow detection. The full OCT spectrum was split into several narrower bands. Inter-B-scan decorrelation was computed using the spectral bands separately and then averaged. The SSADA algorithm was tested on in vivo images of the human macula and optic nerve head. It significantly improved both the SNR for flow detection and the connectivity of the microvascular network when compared to other amplitude-decorrelation algorithms.
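For readers who want the gist of the algorithm, here is a minimal NumPy sketch of the split-spectrum decorrelation step described above. It assumes repeated B-scans stored as a complex array with the spectral (k) axis second, and uses simple non-overlapping rectangular bands rather than the overlapping spectral windows a real SSADA implementation would use; all names and shapes are illustrative, not the authors' code.

```python
import numpy as np

def ssada_decorrelation(spectra, n_bands=4):
    """Toy split-spectrum amplitude-decorrelation sketch.

    spectra: complex array of shape (n_bscans, n_k, n_x) -- repeated
    B-scans acquired at the same location. Returns a (depth x lateral)
    decorrelation map; high decorrelation indicates flow.
    """
    n_bscans, n_k, n_x = spectra.shape
    band_edges = np.linspace(0, n_k, n_bands + 1, dtype=int)
    per_band = []
    for i in range(n_bands):
        # Split: keep one spectral band, zero the rest, then transform
        # along k to depth to get the band-limited amplitude image.
        band = np.zeros_like(spectra)
        band[:, band_edges[i]:band_edges[i + 1], :] = \
            spectra[:, band_edges[i]:band_edges[i + 1], :]
        amp = np.abs(np.fft.fft(band, axis=1))
        a, b = amp[:-1], amp[1:]  # consecutive B-scan pairs
        d = 1.0 - (a * b) / (0.5 * (a**2 + b**2) + 1e-12)
        per_band.append(d.mean(axis=0))  # average over frame pairs
    # Averaging the per-band decorrelations is what boosts flow SNR.
    return np.mean(per_band, axis=0)
```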

1,151 citations


Journal Article
TL;DR: This review discusses fundamental principles that account for binding selectivity and highlights examples where appropriate narrow or broad selectivity has been attained through rational drug design.
Abstract: Appropriate tuning of binding selectivity is a primary objective in the discovery and optimization of a compound on the path toward developing a drug. The environment in which drugs act is complex, with many potential interaction partners. Proteins, DNA, RNA, lipids, sugars, metabolites, and other small molecules all have the potential to interact with a drug, and in many cases these unexpected interactions lead to undesired and often severe side effects. Conversely, the ability to interact with multiple targets or drug resistance mutants can be advantageous in certain contexts. Designing a drug with the appropriate balance of avoidance of undesirable targets (narrow selectivity) and coverage of one or more targets of interest (broad selectivity, also referred to as promiscuity) is a continual drug development challenge. In many cases this objective is attained through trial and error, but there are rational approaches that can guide the tuning of selectivity, and examples have been published that illustrate a number of generalizable strategies. In this review, we discuss fundamental principles that account for selectivity and highlight examples where selectivity has been attained through rational design. An understanding of the general principles that drive selectivity should allow for more efficient design of compounds with desirable selectivity profiles.1−3 Traditionally, drug design has been pursued with the primary objective of finding a compound that binds with high affinity to a target of interest.4 Recently, considerable effort has been expended measuring off-target interactions with partners such as ion channels (including the Kv11.1 potassium ion channel hERG),5,6 cytochrome P450s (CYPs),7,8 and other proteins that can lead to adverse side effects. Other considerations, such as family or subtype selectivity have gained considerable attention for targets with homologues that bind to the same or similar native substrates. A common example is the kinase family (i.e., phosphotransferases), for which each family member binds ATP in the process of transferring a phosphate group to a substrate.9 From a drug discovery perspective, the aim is to hit only one or a subset of kinases along the biochemical pathway of interest while avoiding other kinases for which inhibition may result in adverse effects.10 In practice, absolute selectivity for a single kinase may be unattainable, but modulating the selectivity profile can lead to improved drug properties and in many cases hitting multiple kinases can be beneficial.11 While it is most common to design away from interactions with undesirable proteins, in other cases it is desirable to hit a panel of targets.12,13 An example of this type of broad coverage involves designing a drug that is not sensitive to resistance mutations, which requires a molecule that binds to drug-resistant variants as well as to the wild-type target. This type of promiscuous, broad coverage is particularly important for rapidly mutating targets, such as those that occur in infectious disease (with HIV being a prototypical example) and cancer. This aspect of drug discovery is of growing importance, as witnessed by the evolution of resistance to existing anticancer14−16 and antimicrobial agents (antibiotics,17 antivirals,18 antifungals,19 and antimalarials20). Similarly, when multiple pathways are accessible for a given signaling cascade, it may be desirable to hit at least one member of each parallel pathway in order to successfully block the downstream signal. 
Recently, the idea of deliberately using promiscuous drugs has gained credence.11 However, this promiscuity must itself be selective for a given subset of targets, and nonspecific binding is always undesirable. In general, there is a fine balance in designing the appropriate level of narrow and broad selectivity, and one must determine the design criteria for selectivity based on the relevant biological processes. The importance of gaining selectivity has been appreciated for many years, and there are a number of experimental approaches to screen for off-target interactions.21−23 While performing an exhaustive selectivity screen against all possible interaction partners is still intractable, it is possible to construct selectivity screening panels that can be used to gain insights and find more selective compounds.21 Conceptually, the problem of designing for a particular selectivity profile is significantly more complex than designing for high affinity to a single target. This is true whether purely experimental approaches are being undertaken or whether computational analysis and design are involved. The underlying problem is challenging because it is necessary to evaluate energy differences for each ligand binding to a panel of targets and decoys rather than to a single desirable target. Computational methods are of limited accuracy when predicting affinities of individual complexes; these difficulties are compounded when multiple relative affinities are required to accurately design appropriate specificities. From a computational perspective, structure-based design methods typically are developed to yield low false-positive rates (i.e., to maximize the chance that predictions of tight binders are in fact tight binders) at the expense of higher false-negative rates (tight binders that are not predicted to be so by the computational method). Accurate selectivity prediction and design require reducing the false-negative rate without increasing the false-positive one. This is a difficult search problem and can require very fine sampling of conformational space, including protein and ligand intramolecular degrees of freedom, as well as intermolecular (“pose”) degrees of freedom. This problem becomes increasingly more difficult if the proteins and/or ligands have significant flexibility, as the size of the search space increases enormously. Essentially, designing for selectivity is significantly more complex than designing for affinity for two reasons: first, because of the multifactorial nature of the task and, second, because of the inherent difficulty of considering all modes of relaxation with sufficient accuracy, particularly when ligands bind decoy receptors. In this review we highlight some recent examples of successful approaches to achieving changes in selectivity. We present cases where the goal required narrowing the binding profile to one or a small number of targets and increasing the relative binding affinity to targets over decoys, and we present cases where the goal required broadening the binding profile to increase the number of targets bound and flattening the relative affinity across the panel of targets. We have deliberately elected to organize the discussion around a set of principles that have proven enabling in realizing selectivity goals. 
In very simple yet still useful terms, achieving broad selectivity involves recognizing and exploiting similarities in binding capabilities across a collection of targets, and narrow selectivity involves identifying and exploiting differences between targets and decoys. Most of the review examines five aspects of binding and complementarity that have proven useful handles that we have grouped together as structure-based approaches. These five features (shape, electrostatics, flexibility, hydration, and allostery) have been utilized because they differ, whether subtly or substantially, across sets of target and decoy molecules sufficiently to realize the affinity changes necessary for selectivity. The principles of exploiting the features listed above are schematically represented in Figure 1, and we will describe and discuss each in detail. The review continues by discussing other approaches that involve higher-level concepts beyond taking advantage of structural similarities and differences, although ultimately they can often be achieved through structure-based approaches. We describe a substrate-mimetic approach to developing broad inhibition across a population of rapidly mutating enzyme targets (called the substrate envelope hypothesis), and we also describe methods for leveraging differences in cellular environments to achieve selectivity goals. We have necessarily chosen a limited number of examples from the recent literature to review and illustrate the narrative that we have set forward. We apologize in advance for necessary omissions and any inadvertent oversights that kept us from including all of the truly wonderful advances in this field. We also note that reviews on related topics have appeared that will also be of use to the interested reader.24−28 Figure 1 (Selectivity Strategies): This cartoon illustrates six design strategies based on five principles (shape, electrostatics, flexibility, hydration, and allostery) that can be employed to gain binding selectivity for a given target: (A) optimization of ligand charges specifically for the target and against the decoy; (B) displacement of a high-energy water molecule in the target that is not present in the decoy; (C) binding to an allosteric pocket in the target that is not present in the decoy; (D) creating a clash with the decoy receptor but not the target receptor, where the decoy is unable to alleviate the clash by structural rearrangement; (E) binding to a receptor conformation that is accessible in the target but inaccessible in the decoy; (F) creating an interaction with the target receptor but not the decoy receptor, where the decoy is unable to form the interaction by structural rearrangement. Note that (D) and (F) are different manifestations of the same underlying principle (shape complementarity), with (D) decreasing binding to the decoy through the introduction of a clash and (F) increasing binding to the target through the introduction of a favorable contact.
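The narrow-versus-broad selectivity objectives discussed above can be made concrete with a toy scoring sketch. It assumes per-complex binding free energies (kcal/mol, more negative = tighter) are already available from some affinity-prediction method; the function names and the -8 kcal/mol coverage cutoff are illustrative assumptions, not anything taken from the review.

```python
import numpy as np

def selectivity_gap(dg_target, dg_decoys):
    """Narrow-selectivity objective: the gap (kcal/mol) between the
    predicted binding free energy for the target and the tightest decoy.
    Since more negative dG means tighter binding, a larger positive gap
    means better discrimination against decoys."""
    return min(dg_decoys) - dg_target

def broad_coverage(dg_panel, cutoff=-8.0):
    """Broad-selectivity objective: fraction of a panel (e.g. drug-
    resistant variants) predicted to bind tighter than a cutoff.
    The -8 kcal/mol threshold is purely illustrative."""
    return np.mean(np.asarray(dg_panel) <= cutoff)

# Example: a ligand binding the target at -10.2 kcal/mol vs three decoys.
print(selectivity_gap(-10.2, [-7.1, -6.4, -8.0]))  # gap of 2.2 kcal/mol
print(broad_coverage([-9.5, -8.3, -7.9, -10.1]))   # 0.75 of panel covered
```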

181 citations


Journal Article
TL;DR: The results provide important clues regarding the functional architecture of face processing, suggesting that the left hemisphere is involved in processing ‘low-level’ face semblance, and perhaps is a precursor to categorical ‘deep’ analyses on the right.
Abstract: Are visual face processing mechanisms the same in the left and right cerebral hemispheres? The possibility of such ‘duplicated processing’ seems puzzling in terms of neural resource usage, and we currently lack a precise characterization of the lateral differences in face processing. To address this need, we have undertaken a three-pronged approach. Using functional magnetic resonance imaging, we assessed cortical sensitivity to facial semblance, the modulatory effects of context and temporal response dynamics. Results on all three fronts revealed systematic hemispheric differences. We found that: (i) activation patterns in the left fusiform gyrus correlate with image-level face-semblance, while those in the right correlate with categorical face/non-face judgements. (ii) Context exerts significant excitatory/inhibitory influence in the left, but has limited effect on the right. (iii) Face-selectivity persists in the right even after activity on the left has returned to baseline. These results provide important clues regarding the functional architecture of face processing, suggesting that the left hemisphere is involved in processing ‘low-level’ face semblance, and perhaps is a precursor to categorical ‘deep’ analyses on the right.

118 citations


Journal Article
TL;DR: Enhanced tissue discrimination as well as quantitative measurements of sample properties were demonstrated using the additional contrast and information contained in the PS-OCT images.
Abstract: Polarization sensitive optical coherence tomography (PS-OCT) is a functional imaging method that provides additional contrast using the light polarizing properties of a sample. This manuscript describes PS-OCT based on ultrahigh speed swept source / Fourier domain OCT operating at 1050 nm at 100 kHz axial scan rates using single mode fiber optics and a multiplexing approach. Unlike previously reported PS-OCT multiplexing schemes, the method uses a passive polarization delay unit and does not require active polarization modulating devices. This advance decreases system cost and avoids complex synchronization requirements. The polarization delay unit was implemented in the sample beam path in order to simultaneously illuminate the sample with two different polarization states. The orthogonal polarization components for the depth-multiplexed signals from the two input states were detected using dual balanced detection. PS-OCT images were computed using Jones calculus. 3D PS-OCT imaging was performed in the human and rat retina. In addition to standard OCT images, PS-OCT images were generated using contrast from birefringence and depolarization. Enhanced tissue discrimination as well as quantitative measurements of sample properties were demonstrated using the additional contrast and information contained in the PS-OCT images.
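As a rough illustration of the Jones-calculus step, the sketch below extracts round-trip phase retardation as the phase difference between the eigenvalues of a measured 2x2 sample Jones matrix. How that matrix is assembled from the two depth-multiplexed input states, and the depolarization contrast also used in the paper, are not shown; this is a minimal sketch of one standard extraction, not the authors' implementation.

```python
import numpy as np

def jones_retardation(J):
    """Round-trip phase retardation from a 2x2 sample Jones matrix.

    J: complex ndarray of shape (..., 2, 2), one matrix per pixel,
    assembled from the two orthogonal detection channels measured for
    the two simultaneously launched input polarization states.
    The retardation is the phase difference between the eigenvalues
    of J (its eigenpolarizations). Returns radians in [0, pi].
    """
    eigvals = np.linalg.eigvals(J)  # shape (..., 2)
    delta = np.angle(eigvals[..., 0] * np.conj(eigvals[..., 1]))
    return np.abs(delta)

# Sanity check: a pure retarder with delta = 0.5 rad between its axes.
J = np.diag([np.exp(0.25j), np.exp(-0.25j)])
print(jones_retardation(J))  # ~0.5
```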

98 citations


Journal Article
TL;DR: Resources from a population of 242 healthy adults sampled at 15 or 18 body sites up to three times are presented, which have generated 5,177 microbial taxonomic profiles from 16S ribosomal RNA genes and over 3.5 terabases of metagenomic sequence so far.
Abstract: A variety of microbial communities and their genes (the microbiome) exist throughout the human body, with fundamental roles in human health and disease. The National Institutes of Health (NIH)-funded Human Microbiome Project Consortium has established a population-scale framework to develop metagenomic protocols, resulting in a broad range of quality-controlled resources and data including standardized methods for creating, processing and interpreting distinct types of high-throughput metagenomic data available to the scientific community. Here we present resources from a population of 242 healthy adults sampled at 15 or 18 body sites up to three times, which have generated 5,177 microbial taxonomic profiles from 16S ribosomal RNA genes and over 3.5 terabases of metagenomic sequence so far. In parallel, approximately 800 reference strains isolated from the human body have been sequenced. Collectively, these data represent the largest resource describing the abundance and variety of the human microbiome, while providing a framework for current and future studies.

56 citations


Journal Article
TL;DR: In this paper, the authors describe fabrication and testing of a nanochannel device that enhances measurement resolution by performing multiple measurements (>100) on single DNA molecules, which enabled length discrimination between a mixture of λ-DNA (48.5 kbp) and T7 DNA (39.9 kbp).
Abstract: Nanofluidic sensing elements have been the focus of recent experiments for numerous applications ranging from nucleic acid fragment sizing to single-molecule DNA sequencing. These applications critically rely on high measurement fidelity, and methods to increase resolution are required. Herein, we describe fabrication and testing of a nanochannel device that enhances measurement resolution by performing multiple measurements (>100) on single DNA molecules. The enhanced measurement resolution enabled length discrimination between a mixture of λ-DNA (48.5 kbp) and T7 DNA (39.9 kbp) molecules, which were detected as transient current changes during translocation of the molecules through the nanochannel. As long DNA molecules are difficult to resolve quickly and with high fidelity with conventional electrophoresis, this approach may yield potentially portable, direct electrical sizing of DNA fragments with high sensitivity and resolution.
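The resolution gain from repeated measurements follows from simple statistics: averaging n independent readings shrinks the standard error by sqrt(n). A toy simulation makes the point for the two fragment lengths above; the per-measurement noise level is an illustrative assumption, not a figure from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def classify_by_repeats(true_kbp, n_repeats=100, noise_kbp=20.0):
    """Simulate n_repeats noisy single-molecule length readings and
    classify by the mean. With a 20 kbp per-read standard deviation
    (assumed), one reading cannot separate 48.5 from 39.9 kbp, but
    100 reads give a ~2 kbp standard error, well under the 8.6 kbp gap."""
    readings = rng.normal(true_kbp, noise_kbp, size=n_repeats)
    est = readings.mean()
    midpoint = (48.5 + 39.9) / 2  # decision boundary between the species
    return "lambda-DNA (48.5 kbp)" if est > midpoint else "T7 DNA (39.9 kbp)"

print(classify_by_repeats(48.5))  # reliably lambda-DNA
print(classify_by_repeats(39.9))  # reliably T7 DNA
```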

21 citations


Journal Article
TL;DR: It is found that rate distributions are lognormal, with a mean and variance that depend on climatic conditions and substrate, suggesting that the intrinsic variability of decomposers, substrate and environment results in a predictable distribution of rates.
Abstract: Carbon removed from the atmosphere by photosynthesis is released back by respiration. Although some organic carbon is degraded quickly, older carbon persists; consequently carbon stocks are much larger than predicted by initial decomposition rates. This disparity can be traced to a wide range of first-order decay-rate constants, but the rate distributions and the mechanisms that determine them are unknown. Here, we pose and solve an inverse problem to find the rate distributions corresponding to the decomposition of plant matter throughout North America. We find that rate distributions are lognormal, with a mean and variance that depend on climatic conditions and substrate. Changes in temperature and precipitation scale all rates similarly, whereas the initial substrate composition sets the time scale of faster rates. These findings probably result from the interplay of stochastic processes and biochemical kinetics, suggesting that the intrinsic variability of decomposers, substrate and environment results in a predictable distribution of rates. Within this framework, turnover times increase exponentially with the kinetic heterogeneity of rates, thereby providing a theoretical expression for the persistence of recalcitrant organic carbon in the natural environment.
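The persistence argument can be illustrated numerically: if the first-order rate constants k are lognormally distributed, the remaining mass g(t) = E[exp(-kt)] decays far more slowly at long times than a single mean-rate model predicts. A Monte Carlo sketch with illustrative (not fitted) parameters:

```python
import numpy as np

def mass_remaining(t, mu=-1.0, sigma=1.5, n=100_000, seed=0):
    """Monte Carlo estimate of g(t) = E[exp(-k*t)], the fraction of
    initial plant matter remaining when decay constants k follow a
    lognormal distribution; mu and sigma parameterize ln k and are
    illustrative values, not values from the paper."""
    rng = np.random.default_rng(seed)
    k = rng.lognormal(mean=mu, sigma=sigma, size=n)
    return np.exp(-k * t).mean()

# A single "mean rate" model predicts exp(-E[k]*t); the slow tail of
# the lognormal makes the mixture retain far more mass at long times.
mean_k = np.exp(-1.0 + 1.5**2 / 2)  # E[k] for the lognormal above
for t in (1.0, 10.0, 100.0):
    print(t, mass_remaining(t), np.exp(-mean_k * t))
```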

7 citations


Journal Article
TL;DR: This work derives continuous estimates of cerebral venous oxygen saturation, cerebral oxygen extraction fraction, cerebral blood flow, and cerebral metabolic rate of oxygen from cerebral near-infrared spectroscopy and conventional ventilator signals, and these estimates compare very favorably to previously reported data obtained by non-continuous and invasive means from preterm infants in neonatal critical care.
Abstract: Oxidative stress during fetal development, delivery, or early postnatal life is a major cause of neuropathology, as both hypoxic and hyperoxic insults can significantly damage the developing brain. Despite the obvious need for reliable cerebral oxygenation monitoring, no technology currently exists to monitor cerebral oxygen metabolism continuously and noninvasively in infants at high risk for developing brain injury. Consequently, a rational approach to titrating oxygen supply to cerebral oxygen demand - and thus avoiding hyperoxic or hypoxic insults - is currently lacking. We present a promising method to close this crucial technology gap in the important case of neonates on conventional ventilators. By using cerebral near-infrared spectroscopy and signals from conventional ventilators, along with arterial oxygen saturation, we derive continuous (breath-by-breath) estimates of cerebral venous oxygen saturation, cerebral oxygen extraction fraction, cerebral blood flow, and cerebral metabolic rate of oxygen. The resultant estimates compare very favorably to previously reported data obtained by non-continuous and invasive means from preterm infants in neonatal critical care.
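The derived quantities are related by standard Fick-principle formulas. Below is a minimal sketch assuming the saturations and a cerebral blood flow estimate are already in hand (the paper's contribution is deriving the venous saturation and flow breath-by-breath from NIRS and ventilator signals, which is not reproduced here); the hemoglobin value and the neglect of dissolved oxygen are simplifying assumptions.

```python
def oxygen_metabolism(sao2, svo2, cbf, hgb_g_dl=15.0):
    """Fick-principle estimates of cerebral oxygen extraction and
    metabolism.

    sao2, svo2: arterial / cerebral venous O2 saturation (0-1)
    cbf: cerebral blood flow, mL blood / 100 g / min
    hgb_g_dl: hemoglobin concentration, g/dL (assumed, patient-specific)
    Uses the standard 1.34 mL O2 per g Hgb binding capacity and
    neglects dissolved O2 for simplicity.
    """
    oef = (sao2 - svo2) / sao2              # oxygen extraction fraction
    cao2 = 1.34 * hgb_g_dl * sao2 / 100.0   # arterial O2 content, mL O2/mL blood
    cmro2 = cbf * cao2 * oef                # mL O2 / 100 g / min
    return oef, cmro2

print(oxygen_metabolism(sao2=0.95, svo2=0.65, cbf=20.0))
```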

6 citations


Journal Article
TL;DR: Optical coherence tomography is now proven to be an effective noninvasive tool to evaluate the choroid and to detect choroidal changes in pathologic states and can be used as a parameter for diagnosis and follow-up.
Abstract: Background: A structurally and functionally normal choroidal vasculature is essential for retinal function. Therefore, a precise clinical understanding of choroidal morphology should be important for understanding many retinal and choroidal diseases. Methods: PUBMED (http://www.ncbi.nlm.nih.gov/sites/e

4 citations


Journal Article
TL;DR: In this paper, the role of cellular deformability in determining the circulatory characteristics of gametocytes was investigated; the results suggest that mature but not immature gametocytes circulate in the peripheral blood for uptake in the mosquito blood meal and transmission to another human host, thus ensuring long-term survival of the parasite.
Abstract: Gametocyte maturation in Plasmodium falciparum is a critical step in the transmission of malaria. While the majority of parasites proliferate asexually in red blood cells, a small fraction of parasites undergo sexual conversion and mature over 2 weeks to become competent for transmission to a mosquito vector. Immature gametocytes sequester in deep tissues while mature stages must be able to circulate, pass the spleen and present themselves to the mosquito vector in order to complete transmission. Sequestration of asexual red blood cell stage parasites has been investigated in great detail. These studies have demonstrated that induction of cytoadherence properties through specific receptor-ligand interactions coincides with a significant increase in host cell stiffness. In contrast, the adherence and biophysical properties of gametocyte-infected red blood cells have not been studied systematically. Utilizing a transgenic line for 3D live imaging, in vitro capillary assays and 3D finite element whole cell modelling, we studied the role of cellular deformability in determining the circulatory characteristics of gametocytes. Our analysis shows that the red blood cell deformability of immature gametocytes displays an overall decrease followed by rapid restoration in mature gametocytes. Intriguingly, simulations suggest that along with deformability variations, the morphological changes of the parasite may play an important role in tissue distribution in vivo. Taken together, we present a model which suggests that mature but not immature gametocytes circulate in the peripheral blood for uptake in the mosquito blood meal and transmission to another human host, thus ensuring long-term survival of the parasite.