scispace - formally typeset

Showing papers by "University College London" published in 2005


Journal ArticleDOI
TL;DR: The new HITRAN is greatly extended in terms of accuracy, spectral coverage, additional absorption phenomena, added line-shape formalisms, and validity; molecules, isotopologues, and perturbing gases have been added that address the issues of atmospheres beyond the Earth.
Abstract: This paper describes the contents of the 2016 edition of the HITRAN molecular spectroscopic compilation. The new edition replaces the previous HITRAN edition of 2012 and its updates during the intervening years. The HITRAN molecular absorption compilation is composed of five major components: the traditional line-by-line spectroscopic parameters required for high-resolution radiative-transfer codes, infrared absorption cross-sections for molecules not yet amenable to representation in a line-by-line form, collision-induced absorption data, aerosol indices of refraction, and general tables such as partition sums that apply globally to the data. The new HITRAN is greatly extended in terms of accuracy, spectral coverage, additional absorption phenomena, added line-shape formalisms, and validity. Moreover, molecules, isotopologues, and perturbing gases have been added that address the issues of atmospheres beyond the Earth. Of considerable note, experimental IR cross-sections for almost 300 additional molecules important in different areas of atmospheric science have been added to the database. The compilation can be accessed through www.hitran.org. Most of the HITRAN data have now been cast into an underlying relational database structure that offers many advantages over the long-standing sequential text-based structure. The new structure empowers the user in many ways. It enables the incorporation of an extended set of fundamental parameters per transition, sophisticated line-shape formalisms, easy user-defined output formats, and very convenient searching, filtering, and plotting of data. A powerful application programming interface making use of structured query language (SQL) features for higher-level applications of HITRAN is also provided.

7,638 citations


Journal ArticleDOI
TL;DR: The dementia with Lewy bodies (DLB) Consortium has revised criteria for the clinical and pathologic diagnosis of DLB incorporating new information about the core clinical features and suggesting improved methods to assess them as mentioned in this paper.
Abstract: The dementia with Lewy bodies (DLB) Consortium has revised criteria for the clinical and pathologic diagnosis of DLB incorporating new information about the core clinical features and suggesting improved methods to assess them. REM sleep behavior disorder, severe neuroleptic sensitivity, and reduced striatal dopamine transporter activity on functional neuroimaging are given greater diagnostic weighting as features suggestive of a DLB diagnosis. The 1-year rule distinguishing between DLB and Parkinson disease with dementia may be difficult to apply in clinical settings and in such cases the term most appropriate to each individual patient should be used. Generic terms such as Lewy body (LB) disease are often helpful. The authors propose a new scheme for the pathologic assessment of LBs and Lewy neurites (LN) using alpha-synuclein immunohistochemistry and semiquantitative grading of lesion density, with the pattern of regional involvement being more important than total LB count. The new criteria take into account both Lewy-related and Alzheimer disease (AD)-type pathology to allocate a probability that these are associated with the clinical DLB syndrome. Finally, the authors suggest patient management guidelines including the need for accurate diagnosis, a target symptom approach, and use of appropriate outcome measures. There is limited evidence about specific interventions but available data suggest only a partial response of motor symptoms to levodopa: severe sensitivity to typical and atypical antipsychotics in ∼50%, and improvements in attention, visual hallucinations, and sleep disorders with cholinesterase inhibitors.

4,258 citations


Journal ArticleDOI
TL;DR: A new, MATLAB based toolbox for the SPM2 software package is introduced which enables the integration of probabilistic cytoarchitectonic maps and results of functional imaging studies and an easy-to-use tool for the integrated analysis of functional and anatomical data in a common reference space.

3,911 citations


Journal ArticleDOI
TL;DR: Overall safety and tolerability was good with no change in the safety profile of pioglitazone identified; mortality rates from heart failure did not differ between groups.

3,899 citations


Journal ArticleDOI
TL;DR: A Commission on Social Determinants of Health is being launched that will review the evidence, raise societal debate, and recommend policies with the goal of improving the health of the world's most vulnerable people.

3,670 citations


Journal ArticleDOI
TL;DR: The aims of this article are to encompass many apparently unrelated anatomical, physiological and psychophysical attributes of the brain within a single theoretical perspective and to provide a principled way to understand many aspects of cortical organization and responses.
Abstract: This article concerns the nature of evoked brain responses and the principles underlying their generation. We start with the premise that the sensory brain has evolved to represent or infer the causes of changes in its sensory inputs. The problem of inference is well formulated in statistical terms. The statistical fundaments of inference may therefore afford important constraints on neuronal implementation. By formulating the original ideas of Helmholtz on perception, in terms of modern-day statistical theories, one arrives at a model of perceptual inference and learning that can explain a remarkable range of neurobiological facts. It turns out that the problems of inferring the causes of sensory input (perceptual inference) and learning the relationship between input and cause (perceptual learning) can be resolved using exactly the same principle. Specifically, both inference and learning rest on minimizing the brain’s free energy, as defined in statistical physics. Furthermore, inference and learning can proceed in a biologically plausible fashion. Cortical responses can be seen as the brain’s attempt to minimize the free energy induced by a stimulus and thereby encode the most likely cause of that stimulus. Similarly, learning emerges from changes in synaptic efficacy that minimize the free energy, averaged over all stimuli encountered. The underlying scheme rests on empirical Bayes and hierarchical models of how sensory input is caused. The use of hierarchical models enables the brain to construct prior expectations in a dynamic and context-sensitive fashion. This scheme provides a principled way to understand many aspects of cortical organization and responses. The aim of this article is to encompass many apparently unrelated anatomical, physiological and psychophysical attributes of the brain within a single theoretical perspective. 
In terms of cortical architectures, the theoretical treatment predicts that sensory cortex should be arranged hierarchically, that connections should be reciprocal and that forward and backward connections should show a functional asymmetry (forward connections are driving, whereas backward connections are both driving and modulatory). In terms of synaptic physiology, it predicts associative plasticity and, for dynamic models, spike-timing-dependent plasticity. In terms of electrophysiology, it accounts for classical and extra classical receptive field effects and long-latency or endogenous components of evoked cortical responses. It predicts the attenuation of responses encoding prediction error with perceptual learning and explains many phenomena such as repetition suppression, mismatch negativity (MMN) and the P300 in electroencephalography. In psychophysical terms, it accounts for the behavioural correlates of these physiological phenomena, for example, priming and global precedence. The final focus of this article is on perceptual learning as measured with the MMN and the implications for empirical studies of coupling among cortical areas using evoked sensory responses.

3,569 citations
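The free-energy account above can be illustrated with a toy example: under Gaussian assumptions, minimising free energy with respect to the inferred cause reduces to gradient descent on precision-weighted prediction errors. The generative model, learning rate and step count below are arbitrary illustrative choices, not the paper's implementation.

```python
# Toy predictive-coding sketch: infer the hidden cause mu of a single
# observation y by gradient descent on the Gaussian free energy
# F = (y - g(mu))^2 / (2*sigma_y^2) + (mu - prior_mu)^2 / (2*sigma_p^2).
def infer_cause(y, prior_mu=0.0, sigma_y=1.0, sigma_p=1.0,
                lr=0.05, steps=500):
    g = lambda v: 2.0 * v                       # linear generative model, g'(v) = 2
    mu = prior_mu
    for _ in range(steps):
        eps_y = (y - g(mu)) / sigma_y ** 2      # sensory prediction error
        eps_p = (mu - prior_mu) / sigma_p ** 2  # prior prediction error
        mu += lr * (2.0 * eps_y - eps_p)        # -dF/dmu for this model
    return mu
```

For a linear generative model this fixed point coincides with the exact Bayesian posterior mean, so the "cortical response" (the trajectory of the prediction errors) settles as the most likely cause is encoded.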


Journal ArticleDOI
TL;DR: For example, Frith as discussed by the authors showed that children with autism have a specific problem with theory-of-mind tasks, such as looking for the hidden chocolate in the cupboard.

3,256 citations


Journal ArticleDOI
20 Jan 2005-Neuron
TL;DR: A very rapid method of conditioning the human motor cortex using rTMS that produces a controllable, consistent, long-lasting, and powerful effect on motor cortex physiology and behavior after an application period of only 20-190 s is described.

3,211 citations


Journal ArticleDOI
TL;DR: This study details the 2009 recommendations of the NCCD on the use of cell death-related terminology including 'entosis', 'mitotic catastrophe', 'necrosis', 'necroptosis' and 'pyroptosis'.
Abstract: Different types of cell death are often defined by morphological criteria, without a clear reference to precise biochemical mechanisms. The Nomenclature Committee on Cell Death (NCCD) proposes unified criteria for the definition of cell death and of its different morphologies, while formulating several caveats against the misuse of words and concepts that slow down progress in the area of cell death research. Authors, reviewers and editors of scientific periodicals are invited to abandon expressions like 'percentage apoptosis' and to replace them with more accurate descriptions of the biochemical and cellular parameters that are actually measured. Moreover, at the present stage, it should be accepted that caspase-independent mechanisms can cooperate with (or substitute for) caspases in the execution of lethal signaling pathways and that 'autophagic cell death' is a type of cell death occurring together with (but not necessarily by) autophagic vacuolization. This study details the 2009 recommendations of the NCCD on the use of cell death-related terminology including 'entosis', 'mitotic catastrophe', 'necrosis', 'necroptosis' and 'pyroptosis'.

3,005 citations


Journal ArticleDOI
TL;DR: This review presents the best characterized of these biochemical pathways that control some of the most fundamental processes of cell biology common to all eukaryotes, including morphogenesis, polarity, movement, and cell division.
Abstract: Approximately one percent of the human genome encodes proteins that either regulate or are regulated by direct interaction with members of the Rho family of small GTPases. Through a series of complex biochemical networks, these highly conserved molecular switches control some of the most fundamental processes of cell biology common to all eukaryotes, including morphogenesis, polarity, movement, and cell division. In the first part of this review, we present the best characterized of these biochemical pathways; in the second part, we attempt to integrate these molecular details into a biological context.

2,876 citations


Journal ArticleDOI
TL;DR: A set of behaviour change domains agreed by a consensus of experts is available for use in implementation research and applications of this domain list will enhance understanding of the behaviour change processes inherent in implementation of evidence-based practice.
Abstract: Background: Evidence-based guidelines are often not implemented effectively with the result that best health outcomes are not achieved. This may be due to a lack of theoretical understanding of the processes involved in changing the behaviour of healthcare professionals. This paper reports the development of a consensus on a theoretical framework that could be used in implementation research. The objectives were to identify an agreed set of key theoretical constructs for use in (1) studying the implementation of evidence based practice and (2) developing strategies for effective implementation, and to communicate these constructs to an interdisciplinary audience. Methods: Six phases of work were conducted to develop a consensus: (1) identifying theoretical constructs; (2) simplifying into construct domains; (3) evaluating the importance of the construct domains; (4) interdisciplinary evaluation; (5) validating the domain list; and (6) piloting interview questions. The contributors were a “psychological theory” group (n = 18), a “health services research” group (n = 13), and a “health psychology” group (n = 30). Results: Twelve domains were identified to explain behaviour change: (1) knowledge, (2) skills, (3) social/professional role and identity, (4) beliefs about capabilities, (5) beliefs about consequences, (6) motivation and goals, (7) memory, attention and decision processes, (8) environmental context and resources, (9) social influences, (10) emotion regulation, (11) behavioural regulation, and (12) nature of the behaviour. Conclusions: A set of behaviour change domains agreed by a consensus of experts is available for use in implementation research. Applications of this domain list will enhance understanding of the behaviour change processes inherent in implementation of evidence-based practice and will also test the validity of these proposed domains.

Posted Content
TL;DR: In this paper, the authors present theoretical models for three variants of such markets: a monopoly platform, a model of competing platforms where each agent must choose to join a single platform, and a case of "competing bottlenecks", where one group wishes to join all platforms.
Abstract: There are many examples of markets involving two groups of agents who need to interact via 'platforms', and where one group's benefit from joining a platform depends on the number of agents from the other group who join the same platform. This paper presents theoretical models for three variants of such markets: a monopoly platform; a model of competing platforms where each agent must choose to join a single platform; and a model of 'competing bottlenecks', where one group wishes to join all platforms. The main determinants of equilibrium prices are (i) the relative sizes of the cross-group externalities, (ii) whether fees are levied on a lump-sum or per-transaction basis, and (iii) whether a group joins just one platform or joins all platforms.

Journal ArticleDOI
TL;DR: A model of research synthesis designed to work with complex social interventions or programmes, and which is based on the emerging ‘realist’ approach to evaluation is offered, to enable decision-makers to reach a deeper understanding of the intervention and how it can be made to work most effectively.
Abstract: Evidence-based policy is a dominant theme in contemporary public services but the practical realities and challenges involved in using evidence in policy-making are formidable. Part of the problem is one of complexity. In health services and other public services, we are dealing with complex social interventions which act on complex social systems--things like league tables, performance measures, regulation and inspection, or funding reforms. These are not 'magic bullets' which will always hit their target, but programmes whose effects are crucially dependent on context and implementation. Traditional methods of review focus on measuring and reporting on programme effectiveness, often find that the evidence is mixed or conflicting, and provide little or no clue as to why the intervention worked or did not work when applied in different contexts or circumstances, deployed by different stakeholders, or used for different purposes. This paper offers a model of research synthesis which is designed to work with complex social interventions or programmes, and which is based on the emerging 'realist' approach to evaluation. It provides an explanatory analysis aimed at discerning what works for whom, in what circumstances, in what respects and how. The first step is to make explicit the programme theory (or theories)--the underlying assumptions about how an intervention is meant to work and what impacts it is expected to have. We then look for empirical evidence to populate this theoretical framework, supporting, contradicting or modifying the programme theories as it goes. The results of the review combine theoretical understanding and empirical evidence, and focus on explaining the relationship between the context in which the intervention is applied, the mechanisms by which it works and the outcomes which are produced. The aim is to enable decision-makers to reach a deeper understanding of the intervention and how it can be made to work most effectively. 
Realist review does not provide simple answers to complex questions. It will not tell policy-makers or managers whether something works or not, but will provide the policy and practice community with the kind of rich, detailed and highly practical understanding of complex social interventions which is likely to be of much more use to them when planning and implementing programmes at a national, regional or local level.

Report SeriesDOI
TL;DR: In this article, the authors investigate the relationship between product market competition and innovation and find strong evidence of an inverted-U relationship using panel data, where competition discourages laggard firms from innovating but encourages neck-and-neck firms to innovate.
Abstract: This paper investigates the relationship between product market competition and innovation. We find strong evidence of an inverted-U relationship using panel data. We develop a model where competition discourages laggard firms from innovating but encourages neck-and-neck firms to innovate. Together with the effect of competition on the equilibrium industry structure, these generate an inverted-U. Two additional predictions of the model—that the average technological distance between leaders and followers increases with competition, and that the inverted-U is steeper when industries are more neck-and-neck—are both supported by the data.
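The inverted-U can be illustrated by fitting a quadratic in competition to innovation data; the synthetic data below are purely hypothetical and stand in for the paper's panel data.

```python
import numpy as np

# Hypothetical data in which innovation peaks at intermediate competition.
competition = np.linspace(0.0, 1.0, 50)
innovation = 4.0 * competition * (1.0 - competition)

# Fit innovation = a*c^2 + b*c + const; an inverted-U appears as a
# negative quadratic coefficient with an interior turning point.
a, b, const = np.polyfit(competition, innovation, 2)
peak = -b / (2.0 * a)  # competition level at maximum innovation
```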

Journal ArticleDOI
TL;DR: This work considers dual-action choice systems from a normative perspective, and suggests a Bayesian principle of arbitration between them according to uncertainty, so each controller is deployed when it should be most accurate.
Abstract: A broad range of neural and behavioral data suggests that the brain contains multiple systems for behavioral choice, including one associated with prefrontal cortex and another with dorsolateral striatum. However, such a surfeit of control raises an additional choice problem: how to arbitrate between the systems when they disagree. Here, we consider dual-action choice systems from a normative perspective, using the computational theory of reinforcement learning. We identify a key trade-off pitting computational simplicity against the flexible and statistically efficient use of experience. The trade-off is realized in a competition between the dorsolateral striatal and prefrontal systems. We suggest a Bayesian principle of arbitration between them according to uncertainty, so each controller is deployed when it should be most accurate. This provides a unifying account of a wealth of experimental evidence about the factors favoring dominance by either system.
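One simple way to realise uncertainty-based arbitration is reliability (inverse-variance) weighting of the two controllers' value estimates. This is a minimal sketch of the principle, not the paper's full Bayesian treatment.

```python
def arbitrate(value_mb, var_mb, value_mf, var_mf):
    """Weight the model-based (prefrontal) and model-free (dorsolateral
    striatal) value estimates by their reliabilities, so the controller
    that is currently more certain dominates the choice."""
    w_mb = (1.0 / var_mb) / (1.0 / var_mb + 1.0 / var_mf)
    return w_mb * value_mb + (1.0 - w_mb) * value_mf
```

Early in training the model-free system's cached values are uncertain, so the model-based estimate dominates; with experience the weighting reverses, matching the behavioural shift from goal-directed to habitual control.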

Journal ArticleDOI
TL;DR: A Bayes empirical Bayes (BEB) approach is developed that assigns a prior to the model parameters and integrates over their uncertainties; the results suggest that in small data sets the new BEB method does not generate false positives as the old NEB approach did.
Abstract: Codon-based substitution models have been widely used to identify amino acid sites under positive selection in comparative analysis of protein-coding DNA sequences. The nonsynonymous-synonymous substitution rate ratio (d(N)/d(S), denoted omega) is used as a measure of selective pressure at the protein level, with omega > 1 indicating positive selection. Statistical distributions are used to model the variation in omega among sites, allowing a subset of sites to have omega > 1 while the rest of the sequence may be under purifying selection with omega < 1. Current implementations, however, use the naive EB (NEB) approach and fail to account for sampling errors in maximum likelihood estimates of model parameters, such as the proportions and omega ratios for the site classes. In small data sets lacking information, this approach may lead to unreliable posterior probability calculations. In this paper, we develop a Bayes empirical Bayes (BEB) approach to the problem, which assigns a prior to the model parameters and integrates over their uncertainties. We compare the new and old methods on real and simulated data sets. The results suggest that in small data sets the new BEB method does not generate false positives as did the old NEB approach, while in large data sets it retains the good power of the NEB approach for inferring positively selected sites.
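The difference between NEB and BEB can be sketched with a toy two-class site model (the likelihood numbers below are invented for illustration): NEB plugs a point estimate of the class proportion into Bayes' theorem, while BEB averages the per-site posteriors over the parameter's uncertainty.

```python
import numpy as np

# Invented per-site likelihoods for a toy two-class model:
# class 0 = purifying (omega < 1), class 1 = positive selection (omega > 1).
# L[s, k] is the probability of the data at site s under class k.
L = np.array([[0.9, 0.1],
              [0.2, 0.8],
              [0.5, 0.5]])

def neb_posterior(L, p1):
    """Naive empirical Bayes: plug the (ML-estimated) proportion p1 of
    positively selected sites straight into Bayes' theorem."""
    joint = L * np.array([1.0 - p1, p1])
    return joint[:, 1] / joint.sum(axis=1)

def beb_posterior(L, grid=None):
    """Bayes empirical Bayes: put a (here uniform) prior on p1 and
    average the per-site posteriors over it, weighting each grid point
    by the marginal likelihood of the whole data set."""
    if grid is None:
        grid = np.linspace(0.01, 0.99, 99)
    post = np.zeros(L.shape[0])
    total = 0.0
    for p1 in grid:
        joint = L * np.array([1.0 - p1, p1])
        site_marg = joint.sum(axis=1)
        w = site_marg.prod()          # marginal likelihood at this p1
        post += w * joint[:, 1] / site_marg
        total += w
    return post / total
```

Because BEB integrates out the proportion rather than trusting a single estimate, it tends to moderate extreme posteriors when the data are sparse, which is the source of its better false-positive behaviour in small data sets.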

Journal ArticleDOI
TL;DR: This review considers the distinct roles of synaptic and extrasynaptic GABA receptor subtypes in the control of neuronal excitability in the adult mammalian brain.
Abstract: The proper functioning of the adult mammalian brain relies on the orchestrated regulation of neural activity by a diverse population of GABA (gamma-aminobutyric acid)-releasing neurons. Until recently, our appreciation of GABA-mediated inhibition focused predominantly on the GABA(A) (GABA type A) receptors located at synaptic contacts, which are activated in a transient or 'phasic' manner by GABA that is released from synaptic vesicles. However, there is growing evidence that low concentrations of ambient GABA can persistently activate certain subtypes of GABA(A) receptor, which are often remote from synapses, to generate a 'tonic' conductance. In this review, we consider the distinct roles of synaptic and extrasynaptic GABA receptor subtypes in the control of neuronal excitability.

Journal ArticleDOI
TL;DR: In this paper, a power-spectrum analysis of the final 2dF Galaxy Redshift Survey (2dFGRS) employing a direct Fourier method is presented, and the covariance matrix is determined using two different approaches to the construction of mock surveys, which are used to demonstrate that the input cosmological model can be correctly recovered.
Abstract: We present a power-spectrum analysis of the final 2dF Galaxy Redshift Survey (2dFGRS), employing a direct Fourier method. The sample used comprises 221 414 galaxies with measured redshifts. We investigate in detail the modelling of the sample selection, improving on previous treatments in a number of respects. A new angular mask is derived, based on revisions to the photometric calibration. The redshift selection function is determined by dividing the survey according to rest-frame colour, and deducing a self-consistent treatment of k-corrections and evolution for each population. The covariance matrix for the power-spectrum estimates is determined using two different approaches to the construction of mock surveys, which are used to demonstrate that the input cosmological model can be correctly recovered. We discuss in detail the possible differences between the galaxy and mass power spectra, and treat these using simulations, analytic models and a hybrid empirical approach. Based on these investigations, we are confident that the 2dFGRS power spectrum can be used to infer the matter content of the universe. On large scales, our estimated power spectrum shows evidence for the 'baryon oscillations' that are predicted in cold dark matter (CDM) models. Fitting to a CDM model, assuming a primordial n_s = 1 spectrum, h = 0.72 and negligible neutrino mass, the preferred

Journal ArticleDOI
TL;DR: This work surveys studies of natural interspecific hybridization in plants and a variety of animals to show that limited invasions of the genome are widespread, with potentially important consequences in evolutionary biology, speciation, biodiversity, and conservation.
Abstract: Hybridization between species is commonplace in plants, but is often seen as unnatural and unusual in animals. Here, I survey studies of natural interspecific hybridization in plants and a variety of animals. At least 25% of plant species and 10% of animal species, mostly the youngest species, are involved in hybridization and potential introgression with other species. Species in nature are often incompletely isolated for millions of years after their formation. Therefore, much evolution of eventual reproductive isolation can occur while nascent species are in gene-flow contact, in sympatry or parapatry, long after divergence begins. Although the relative importance of geographic isolation and gene flow in the origin of species is still unknown, many key processes involved in speciation, such as 'reinforcement' of post-mating isolation by the evolution of assortative mating, will have ample opportunity to occur in the presence of continuing gene flow. Today, DNA sequence data and other molecular methods are beginning to show that limited invasions of the genome are widespread, with potentially important consequences in evolutionary biology, speciation, biodiversity, and conservation.

Journal ArticleDOI
TL;DR: Induced head cooling is not protective in a mixed population of infants with neonatal encephalopathy, but it could safely improve survival without severe neurodevelopmental disability in infants with less severe aEEG changes.

Journal ArticleDOI
TL;DR: This review discusses the relative merits of different normalisation strategies, suggests a method of validation that will enable the measurement of biologically meaningful results and discusses the popular practice of measuring an internal reference or housekeeping gene.
Abstract: Real-time RT-PCR has become a common technique, no longer limited to specialist core facilities. It is in many cases the only method for measuring mRNA levels of in vivo low-copy-number targets of interest for which alternative assays either do not exist or lack the required sensitivity. Benefits of this procedure over conventional methods for measuring RNA include its sensitivity, large dynamic range, and the potential for high throughput, as well as accurate quantification. To achieve this, however, appropriate normalisation strategies are required to control for experimental error introduced during the multistage process required to extract and process the RNA. There are many strategies that can be chosen; these include normalisation to sample size, total RNA and the popular practice of measuring an internal reference or housekeeping gene. However, these methods are frequently applied without appropriate validation. In this review we discuss the relative merits of different normalisation strategies and suggest a method of validation that will enable the measurement of biologically meaningful results.
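As a concrete example of normalising to an internal reference gene, the widely used delta-delta-Ct method (one member of this family of strategies, not necessarily the one the review endorses) expresses target abundance relative to a housekeeping gene and a control sample:

```python
def fold_change_ddct(ct_target_sample, ct_ref_sample,
                     ct_target_control, ct_ref_control, efficiency=2.0):
    """Relative quantification by the delta-delta-Ct method: normalise
    the target's cycle threshold (Ct) to a reference (housekeeping) gene
    in both sample and control, then report the result as a fold change.
    efficiency=2.0 assumes perfect doubling of product per PCR cycle."""
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_control = ct_target_control - ct_ref_control
    dd_ct = d_ct_sample - d_ct_control
    return efficiency ** (-dd_ct)
```

The assumption of ideal amplification efficiency, and the assumption that the reference gene is stably expressed across conditions, are exactly the kinds of assumptions the review argues must be validated rather than taken on faith.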

Journal ArticleDOI
TL;DR: These findings provide a resolution to the long-standing early and late selection debate within a load theory of attention that accommodates behavioural and neuroimaging data within a framework that integrates attention research with executive function.

Journal ArticleDOI
TL;DR: The results show that this 'mirror system' integrates observed actions of others with an individual's personal motor repertoire, and suggest that the human brain understands actions by motor simulation.
Abstract: When we observe someone performing an action, do our brains simulate making that action? Acquired motor skills offer a unique way to test this question, since people differ widely in the actions they have learned to perform. We used functional magnetic resonance imaging to study differences in brain activity between watching an action that one has learned to do and an action that one has not, in order to assess whether the brain processes of action observation are modulated by the expertise and motor repertoire of the observer. Experts in classical ballet, experts in capoeira and inexpert control subjects viewed videos of ballet or capoeira actions. Comparing the brain activity when dancers watched their own dance style versus the other style therefore reveals the influence of motor expertise on action observation. We found greater bilateral activations in premotor cortex and intraparietal sulcus, right superior parietal lobe and left posterior superior temporal sulcus when expert dancers viewed movements that they had been trained to perform compared to movements they had not. Our results show that this 'mirror system' integrates observed actions of others with an individual's personal motor repertoire, and suggest that the human brain understands actions by motor simulation.

Proceedings Article
05 Dec 2005
TL;DR: It is shown that this new Gaussian process (GP) regression model can match full GP performance with small M, i.e. very sparse solutions, and it significantly outperforms other approaches in this regime.
Abstract: We present a new Gaussian process (GP) regression model whose covariance is parameterized by the locations of M pseudo-input points, which we learn by a gradient-based optimization. We take M ≪ N, where N is the number of real data points, and hence obtain a sparse regression method which has O(M²N) training cost and O(M²) prediction cost per test case. We also find hyperparameters of the covariance function in the same joint optimization. The method can be viewed as a Bayesian regression model with particular input dependent noise. The method turns out to be closely related to several other sparse GP approaches, and we discuss the relation in detail. We finally demonstrate its performance on some large data sets, and make a direct comparison to other sparse GP methods. We show that our method can match full GP performance with small M, i.e. very sparse solutions, and it significantly outperforms other approaches in this regime.
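The flavour of the O(M²N) computation can be sketched as follows. This is a simplified Nyström/DTC-style predictor built on M pseudo-inputs; the paper's SPGP additionally learns the pseudo-input locations by gradient-based optimization of the marginal likelihood and includes an input-dependent noise correction, both omitted here.

```python
import numpy as np

def rbf(A, B, lengthscale=1.0, variance=1.0):
    """Squared-exponential kernel between the rows of A and B."""
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return variance * np.exp(-0.5 * d2 / lengthscale ** 2)

def sparse_gp_predict(X, y, Xbar, Xstar, noise=0.1):
    """Predictive mean using M pseudo-inputs Xbar. Only M x M systems
    are solved, so training cost is O(M^2 N) instead of the O(N^3) of a
    full GP."""
    M = len(Xbar)
    Kmm = rbf(Xbar, Xbar) + 1e-6 * np.eye(M)  # jitter for stability
    Knm = rbf(X, Xbar)
    Ksm = rbf(Xstar, Xbar)
    A = Kmm + Knm.T @ Knm / noise ** 2        # M x M linear system
    return Ksm @ np.linalg.solve(A, Knm.T @ y) / noise ** 2
```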

Journal ArticleDOI
TL;DR: The biasing effects of mode of questionnaire administration have important implications for research methodology, the validity of the results of research, and for the soundness of public policy developed from evidence using questionnaire-based research.
Abstract: Background One of the main primary data collection instruments in social, health and epidemiological research is the survey questionnaire. Modes of data collection by questionnaire differ in several ways, including the method of contacting respondents, the medium of delivering the questionnaire to respondents, and the administration of the questions. These are likely to have different effects on the quality of the data collected. Methods This paper is based on a narrative review of systematic and non-systematic searches of the literature on the effects of mode of questionnaire administration on data quality. Results Within different modes of questionnaire administration, there were many documented potential, biasing influences on the responses obtained. These were greatest between different types of mode (e.g. self-administered versus interview modes), rather than within modes. It can be difficult to separate out the effects of the different influences, at different levels. Conclusions The biasing effects of mode of questionnaire administration has important implications for research methodology, the validity of the results of research, and for the soundness of public policy developed from evidence using questionnaire-based research. All users of questionnaires need to be aware of these potential effects on their data.

Journal ArticleDOI
TL;DR: Killing was previously believed to be accomplished by oxygen free radicals and other reactive oxygen species generated by the NADPH oxidase, and by oxidized halides produced by myeloperoxidase, but this is incorrect.
Abstract: Neutrophils provide the first line of defense of the innate immune system by phagocytosing, killing, and digesting bacteria and fungi. Killing was previously believed to be accomplished by oxygen free radicals and other reactive oxygen species generated by the NADPH oxidase, and by oxidized halides produced by myeloperoxidase. We now know this is incorrect. The oxidase pumps electrons into the phagocytic vacuole, thereby inducing a charge across the membrane that must be compensated. The movement of compensating ions produces conditions in the vacuole conducive to microbial killing and digestion by enzymes released into the vacuole from the cytoplasmic granules.

Journal ArticleDOI
TL;DR: A modified branch-site model is described and used to construct two LRTs, one of which is a direct test of positive selection on the lineages of interest and is recommended for use in real data analysis and the other appears robust against violations of model assumptions.
Abstract: Detecting positive Darwinian selection at the DNA sequence level has been a subject of considerable interest. However, positive selection is difficult to detect because it often operates episodically on a few amino acid sites, and the signal may be masked by negative selection. Several methods have been developed to test positive selection that acts on given branches (branch methods) or on a subset of sites (site methods). Recently, Yang, Z., and R. Nielsen (2002. Codon-substitution models for detecting molecular adaptation at individual sites along specific lineages. Mol. Biol. Evol. 19:908-917) developed likelihood ratio tests (LRTs) based on branch-site models to detect positive selection that affects a small number of sites along prespecified lineages. However, computer simulations suggested that the tests were sensitive to the model assumptions and were unable to distinguish between relaxation of selective constraint and positive selection (Zhang, J. 2004. Frequent false detection of positive selection by the likelihood method with branch-site models. Mol. Biol. Evol. 21:1332-1339). Here, we describe a modified branch-site model and use it to construct two LRTs, called branch-site tests 1 and 2. We applied the new tests to reanalyze several real data sets and used computer simulation to examine the performance of the two tests by examining their false-positive rate, power, and robustness. We found that test 1 was unable to distinguish relaxed constraint from positive selection affecting the lineages of interest, while test 2 had acceptable false-positive rates and appeared robust against violations of model assumptions. As test 2 is a direct test of positive selection on the lineages of interest, it is referred to as the branch-site test of positive selection and is recommended for use in real data analysis. The test appeared conservative overall, but exhibited better power in detecting positive selection than the branch-based test.
Bayes empirical Bayes identification of amino acid sites under positive selection along the foreground branches was found to be reliable, but lacked power.
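The LRT machinery behind such tests reduces to comparing twice the log-likelihood difference between the null and alternative models with a χ² distribution. A minimal sketch, assuming one degree of freedom and using the identity that the χ²₁ survival function equals erfc(√(x/2)); the log-likelihood values in the test are made up, not from any real alignment. For the branch-site test, the χ²₁ p-value is conservative, since the exact null distribution is a 50:50 mixture of a point mass at 0 and χ²₁:

```python
import math

def branch_site_lrt(lnL_null, lnL_alt):
    """Likelihood ratio test with df = 1.
    Returns the statistic 2*(lnL_alt - lnL_null), clipped at 0, and its
    chi-square (df = 1) p-value via sf(x) = erfc(sqrt(x/2))."""
    stat = max(0.0, 2.0 * (lnL_alt - lnL_null))
    p_value = math.erfc(math.sqrt(stat / 2.0))
    return stat, p_value
```

In practice the two log-likelihoods would come from fitting the null and alternative codon models (e.g. in a package such as PAML) to the same data and tree.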

Journal ArticleDOI
TL;DR: The Ultra-Violet/Optical Telescope (UVOT) as discussed by the authors is one of the three instruments flying aboard the Swift Gamma-ray Observatory, which is designed to capture the early (∼1 min) UV and optical photons from the afterglow of gamma-ray bursts in the 170-600 nm band and to make long-term observations of these afterglows.
Abstract: The Ultra-Violet/Optical Telescope (UVOT) is one of three instruments flying aboard the Swift Gamma-ray Observatory. It is designed to capture the early (∼1 min) UV and optical photons from the afterglow of gamma-ray bursts in the 170–600 nm band as well as to make long-term observations of these afterglows. This is accomplished through the use of UV and optical broadband filters and grisms. The UVOT has a modified Ritchey–Chretien design with micro-channel plate intensified charged-coupled device detectors that record the arrival time of individual photons and provide sub-arcsecond positioning of sources. We discuss some of the science to be pursued by the UVOT and the overall design of the instrument.

Journal ArticleDOI
TL;DR: The local cortical network structure can be viewed as a skeleton of stronger connections in a sea of weaker ones, likely to play an important role in network dynamics and should be investigated further.
Abstract: How different is local cortical circuitry from a random network? To answer this question, we probed synaptic connections with several hundred simultaneous quadruple whole-cell recordings from layer 5 pyramidal neurons in the rat visual cortex. Analysis of this dataset revealed several nonrandom features in synaptic connectivity. We confirmed previous reports that bidirectional connections are more common than expected in a random network. We found that several highly clustered three-neuron connectivity patterns are overrepresented, suggesting that connections tend to cluster together. We also analyzed synaptic connection strength as defined by the peak excitatory postsynaptic potential amplitude. We found that the distribution of synaptic connection strength differs significantly from the Poisson distribution and can be fitted by a lognormal distribution. Such a distribution has a heavier tail and implies that synaptic weight is concentrated among few synaptic connections. In addition, the strengths of synaptic connections sharing pre- or postsynaptic neurons are correlated, implying that strong connections are even more clustered than the weak ones. Therefore, the local cortical network structure can be viewed as a skeleton of stronger connections in a sea of weaker ones. Such a skeleton is likely to play an important role in network dynamics and should be investigated further.
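The lognormal analysis described above can be sketched on synthetic data: after a log transform, the maximum-likelihood lognormal parameters are simply the mean and standard deviation of the log-weights, and the heavy tail shows up as a disproportionate share of total synaptic weight carried by the strongest few connections. All numbers below (generating parameters, sample size) are illustrative, not values fitted in the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
# synthetic EPSP amplitudes drawn from a lognormal distribution
w = rng.lognormal(mean=-0.7, sigma=1.0, size=5000)

# maximum-likelihood lognormal fit = moments of the log-weights
mu_hat = np.log(w).mean()
sigma_hat = np.log(w).std()

# heavy tail: fraction of total weight carried by the strongest 10% of connections
top_share = np.sort(w)[-len(w) // 10 :].sum() / w.sum()
```

For a lognormal with σ = 1, the top decile of connections carries roughly 40% of the total weight, which is the sense in which "synaptic weight is concentrated among few synaptic connections."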

Journal ArticleDOI
03 Nov 2005-BMJ
TL;DR: Systematic reviews of complex evidence cannot rely solely on protocol-driven search strategies, and primary sources must be identified by “snowballing” or by personal knowledge or personal contacts.
Abstract: Objective To describe where papers come from in a systematic review of complex evidence. Method Audit of how the 495 primary sources for the review were originally identified. Results Only 30% of sources were obtained from the protocol defined at the outset of the study (that is, from the database and hand searches). Fifty one per cent were identified by “snowballing” (such as pursuing references of references), and 24% by personal knowledge or personal contacts. Conclusion Systematic reviews of complex evidence cannot rely solely on protocol-driven search strategies.