Showing papers by "University of Warwick" published in 2009


Journal ArticleDOI
TL;DR: mothur is used as a case study to trim, screen, and align sequences; calculate distances; assign sequences to operational taxonomic units; and describe the α and β diversity of eight marine samples previously characterized by pyrosequencing of 16S rRNA gene fragments.
Abstract: mothur aims to be a comprehensive software package that allows users to use a single piece of software to analyze community sequence data. It builds upon previous tools to provide a flexible and powerful software package for analyzing sequencing data. As a case study, we used mothur to trim, screen, and align sequences; calculate distances; assign sequences to operational taxonomic units; and describe the alpha and beta diversity of eight marine samples previously characterized by pyrosequencing of 16S rRNA gene fragments. This analysis of more than 222,000 sequences was completed in less than 2 h with a laptop computer.
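As an illustration of the α- and β-diversity summaries mentioned above, here is a minimal Python sketch (independent of mothur itself) that computes the Shannon index and Bray-Curtis dissimilarity from a small OTU count table; the sample names and counts are hypothetical, chosen only to show the calculations.

    # Minimal sketch: alpha (Shannon) and beta (Bray-Curtis) diversity from an
    # OTU count table. Sample names and counts are hypothetical.
    import math

    otu_table = {
        "sample_A": [120, 30, 5, 0, 45],
        "sample_B": [80, 60, 10, 2, 0],
    }

    def shannon(counts):
        """Shannon index H' = -sum(p_i * ln p_i) over OTUs with non-zero counts."""
        total = sum(counts)
        return -sum((c / total) * math.log(c / total) for c in counts if c > 0)

    def bray_curtis(x, y):
        """Bray-Curtis dissimilarity between two equal-length count vectors."""
        return sum(abs(a - b) for a, b in zip(x, y)) / sum(a + b for a, b in zip(x, y))

    for name, counts in otu_table.items():
        print(name, "Shannon H' =", round(shannon(counts), 3))

    print("Bray-Curtis(A, B) =",
          round(bray_curtis(otu_table["sample_A"], otu_table["sample_B"]), 3))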

17,350 citations


Journal ArticleDOI
TL;DR: A series of improvements to the spectroscopic reductions are described, including better flat fielding and improved wavelength calibration at the blue end, better processing of objects with extremely strong narrow emission lines, and an improved determination of stellar metallicities.
Abstract: This paper describes the Seventh Data Release of the Sloan Digital Sky Survey (SDSS), marking the completion of the original goals of the SDSS and the end of the phase known as SDSS-II. It includes 11,663 deg^2 of imaging data, with most of the ~2000 deg^2 increment over the previous data release lying in regions of low Galactic latitude. The catalog contains five-band photometry for 357 million distinct objects. The survey also includes repeat photometry on a 120° long, 2°.5 wide stripe along the celestial equator in the Southern Galactic Cap, with some regions covered by as many as 90 individual imaging runs. We include a co-addition of the best of these data, going roughly 2 mag fainter than the main survey over 250 deg^2. The survey has completed spectroscopy over 9380 deg^2; the spectroscopy is now complete over a large contiguous area of the Northern Galactic Cap, closing the gap that was present in previous data releases. There are over 1.6 million spectra in total, including 930,000 galaxies, 120,000 quasars, and 460,000 stars. The data release includes improved stellar photometry at low Galactic latitude. The astrometry has all been recalibrated with the second version of the USNO CCD Astrograph Catalog, reducing the rms statistical errors at the bright end to 45 milliarcseconds per coordinate. We further quantify a systematic error in bright galaxy photometry due to poor sky determination; this problem is less severe than previously reported for the majority of galaxies. Finally, we describe a series of improvements to the spectroscopic reductions, including better flat fielding and improved wavelength calibration at the blue end, better processing of objects with extremely strong narrow emission lines, and an improved determination of stellar metallicities.

5,665 citations


Journal ArticleDOI
TL;DR: The recent state of the art CAD technology for digitized histopathology is reviewed and the development and application of novel image analysis technology for a few specific histopathological related problems being pursued in the United States and Europe are described.
Abstract: Over the past decade, dramatic increases in computational power and improvement in image analysis algorithms have allowed the development of powerful computer-assisted analytical approaches to radiological data. With the recent advent of whole slide digital scanners, tissue histopathology slides can now be digitized and stored in digital image form. Consequently, digitized tissue histopathology has now become amenable to the application of computerized image analysis and machine learning techniques. Analogous to the role of computer-assisted diagnosis (CAD) algorithms in medical imaging to complement the opinion of a radiologist, CAD algorithms have begun to be developed for disease detection, diagnosis, and prognosis prediction to complement the opinion of the pathologist. In this paper, we review the recent state of the art CAD technology for digitized histopathology. This paper also briefly describes the development and application of novel image analysis technology for a few specific histopathology related problems being pursued in the United States and Europe.

1,644 citations


Journal ArticleDOI
TL;DR: In this paper, the effects of interventions designed to reduce the incidence of falls in elderly people (living in the community, or in institutional or hospital care) were assessed using the Cochrane Bone, Joint and Muscle Trauma Group Specialised Register.
Abstract: BACKGROUND: Approximately 30 per cent of people over 65 years of age and living in the community fall each year; the number is higher in institutions. Although less than one fall in 10 results in a fracture, a fifth of fall incidents require medical attention. OBJECTIVES: To assess the effects of interventions designed to reduce the incidence of falls in elderly people (living in the community, or in institutional or hospital care). SEARCH STRATEGY: We searched the Cochrane Bone, Joint and Muscle Trauma Group Specialised Register (January 2003), the Cochrane Central Register of Controlled Trials (The Cochrane Library, Issue 1, 2003), MEDLINE (1966 to February 2003), EMBASE (1988 to 2003 Week 19), CINAHL (1982 to April 2003), The National Research Register (Issue 2, 2003), Current Controlled Trials (www.controlled-trials.com, accessed 11 July 2003) and reference lists of articles. No language restrictions were applied. Further trials were identified by contact with researchers in the field. SELECTION CRITERIA: Randomised trials of interventions designed to minimise the effect of, or exposure to, risk factors for falling in elderly people. Main outcomes of interest were the number of fallers, or falls. Trials reporting only intermediate outcomes were excluded. DATA COLLECTION AND ANALYSIS: Two reviewers independently assessed trial quality and extracted data. Data were pooled using the fixed effect model where appropriate. MAIN RESULTS: Sixty-two trials involving 21,668 people were included. Interventions likely to be beneficial: multidisciplinary, multifactorial, health/environmental risk factor screening/intervention programmes in the community, both for an unselected population of older people (4 trials, 1651 participants, pooled RR 0.73, 95% CI 0.63 to 0.85) and for older people with a history of falling or selected because of known risk factors (5 trials, 1176 participants, pooled RR 0.86, 95% CI 0.76 to 0.98), and in residential care facilities (1 trial, 439 participants, cluster-adjusted incidence rate ratio 0.60, 95% CI 0.50 to 0.73); a programme of muscle strengthening and balance retraining, individually prescribed at home by a trained health professional (3 trials, 566 participants, pooled relative risk (RR) 0.80, 95% confidence interval (CI) 0.66 to 0.98); home hazard assessment and modification that is professionally prescribed for older people with a history of falling (3 trials, 374 participants, RR 0.66, 95% CI 0.54 to 0.81); withdrawal of psychotropic medication (1 trial, 93 participants, relative hazard 0.34, 95% CI 0.16 to 0.74); cardiac pacing for fallers with cardioinhibitory carotid sinus hypersensitivity (1 trial, 175 participants, WMD -5.20, 95% CI -9.40 to -1.00); and a 15-week Tai Chi group exercise intervention (1 trial, 200 participants, risk ratio 0.51, 95% CI 0.36 to 0.73). Interventions of unknown effectiveness: group-delivered exercise interventions (9 trials, 1387 participants); individual lower limb strength training (1 trial, 222 participants); nutritional supplementation (1 trial, 46 participants); vitamin D supplementation, with or without calcium (3 trials, 461 participants); home hazard modification in association with advice on optimising medication (1 trial, 658 participants), or in association with an education package on exercise and reducing fall risk (1 trial, 3182 participants); pharmacological therapy (raubasine-dihydroergocristine, 1 trial, 95 participants); interventions using a cognitive/behavioural approach alone (2 trials, 145 participants); home hazard modification for older people without a history of falling (1 trial, 530 participants); hormone replacement therapy (1 trial, 116 participants); and correction of visual deficiency (1 trial, 276 participants). Interventions unlikely to be beneficial: brisk walking in women with an upper limb fracture in the previous two years (1 trial, 165 participants). AUTHORS' CONCLUSIONS: Interventions to prevent falls that are likely to be effective are now available; less is known about their effectiveness in preventing fall-related injuries. Costs per fall prevented have been established for four of the interventions, and careful economic modelling in the context of the local healthcare system is important. Some potential interventions are of unknown effectiveness and further research is indicated.
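The pooled risk ratios quoted above come from fixed-effect (inverse-variance) meta-analysis. The sketch below shows the general calculation on invented study-level numbers (not figures from this review): log risk ratios are weighted by the inverse of their variance, which is recovered from the width of each 95% confidence interval.

    # Generic inverse-variance fixed-effect pooling of risk ratios.
    # The three (RR, lower CI, upper CI) triples are hypothetical examples.
    import math

    studies = [(0.75, 0.60, 0.94), (0.82, 0.65, 1.03), (0.70, 0.52, 0.94)]

    weights, log_rrs = [], []
    for rr, lo, hi in studies:
        se = (math.log(hi) - math.log(lo)) / (2 * 1.96)  # SE of log RR from the CI
        weights.append(1.0 / se ** 2)                    # inverse-variance weight
        log_rrs.append(math.log(rr))

    pooled_log = sum(w * l for w, l in zip(weights, log_rrs)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))

    print("Pooled RR = %.2f (95%% CI %.2f to %.2f)" % (
        math.exp(pooled_log),
        math.exp(pooled_log - 1.96 * pooled_se),
        math.exp(pooled_log + 1.96 * pooled_se)))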

1,496 citations


Journal ArticleDOI
Brian J. Haas1, Sophien Kamoun2, Sophien Kamoun3, Michael C. Zody4, Michael C. Zody1, Rays H. Y. Jiang1, Rays H. Y. Jiang5, Robert E. Handsaker1, Liliana M. Cano2, Manfred Grabherr1, Chinnappa D. Kodira6, Chinnappa D. Kodira1, Sylvain Raffaele2, Trudy Torto-Alalibo6, Trudy Torto-Alalibo3, Tolga O. Bozkurt2, Audrey M. V. Ah-Fong7, Lucia Alvarado1, Vicky L. Anderson8, Miles R. Armstrong9, Anna O. Avrova9, Laura Baxter10, Jim Beynon10, Petra C. Boevink9, Stephanie R. Bollmann11, Jorunn I. B. Bos3, Vincent Bulone12, Guohong Cai13, Cahid Cakir3, James C. Carrington14, Megan Chawner15, Lucio Conti16, Stefano Costanzo11, Richard Ewan16, Noah Fahlgren14, Michael A. Fischbach17, Johanna Fugelstad12, Eleanor M. Gilroy9, Sante Gnerre1, Pamela J. Green18, Laura J. Grenville-Briggs8, John Griffith15, Niklaus J. Grünwald11, Karolyn Horn15, Neil R. Horner8, Chia-Hui Hu19, Edgar Huitema3, Dong-Hoon Jeong18, Alexandra M. E. Jones2, Jonathan D. G. Jones2, Richard W. Jones11, Elinor K. Karlsson1, Sridhara G. Kunjeti20, Kurt Lamour21, Zhenyu Liu3, Li-Jun Ma1, Dan MacLean2, Marcus C. Chibucos22, Hayes McDonald23, Jessica McWalters15, Harold J. G. Meijer5, William Morgan24, Paul Morris25, Carol A. Munro8, Keith O'Neill1, Keith O'Neill6, Manuel D. Ospina-Giraldo15, Andrés Pinzón, Leighton Pritchard9, Bernard H Ramsahoye26, Qinghu Ren27, Silvia Restrepo, Sourav Roy7, Ari Sadanandom16, Alon Savidor28, Sebastian Schornack2, David C. Schwartz29, Ulrike Schumann8, Ben Schwessinger2, Lauren Seyer15, Ted Sharpe1, Cristina Silvar2, Jing Song3, David J. Studholme2, Sean M. Sykes1, Marco Thines2, Marco Thines30, Peter J. I. van de Vondervoort5, Vipaporn Phuntumart25, Stephan Wawra8, R. Weide5, Joe Win2, Carolyn A. Young3, Shiguo Zhou29, William E. Fry13, Blake C. Meyers18, Pieter van West8, Jean B. Ristaino19, Francine Govers5, Paul R. J. Birch31, Stephen C. Whisson9, Howard S. Judelson7, Chad Nusbaum1 
17 Sep 2009-Nature
TL;DR: The sequence of the P. infestans genome is reported, which at ∼240 megabases (Mb) is by far the largest and most complex genome sequenced so far in the chromalveolates and probably plays a crucial part in the rapid adaptability of the pathogen to host plants and underpins its evolutionary potential.
Abstract: Phytophthora infestans is the most destructive pathogen of potato and a model organism for the oomycetes, a distinct lineage of fungus-like eukaryotes that are related to organisms such as brown algae and diatoms. As the agent of the Irish potato famine in the mid-nineteenth century, P. infestans has had a tremendous effect on human history, resulting in famine and population displacement(1). To this day, it affects world agriculture by causing the most destructive disease of potato, the fourth largest food crop and a critical alternative to the major cereal crops for feeding the world's population(1). Current annual worldwide potato crop losses due to late blight are conservatively estimated at $6.7 billion(2). Management of this devastating pathogen is challenged by its remarkable speed of adaptation to control strategies such as genetically resistant cultivars(3,4). Here we report the sequence of the P. infestans genome, which at ~240 megabases (Mb) is by far the largest and most complex genome sequenced so far in the chromalveolates. Its expansion results from a proliferation of repetitive DNA accounting for ~74% of the genome. Comparison with two other Phytophthora genomes showed rapid turnover and extensive expansion of specific families of secreted disease effector proteins, including many genes that are induced during infection or are predicted to have activities that alter host physiology. These fast-evolving effector genes are localized to highly dynamic and expanded regions of the P. infestans genome. This probably plays a crucial part in the rapid adaptability of the pathogen to host plants and underpins its evolutionary potential.

1,341 citations


Journal ArticleDOI
Brian Yanny1, Constance M. Rockosi2, Heidi Jo Newberg3, Gillian R. Knapp4, Jennifer K. Adelman-McCarthy1, Bonnie Alcorn1, S. Allam1, Carlos Allende Prieto5, Carlos Allende Prieto6, Deokkeun An7, K. S. J. Anderson8, K. S. J. Anderson9, Scott F. Anderson10, Coryn A. L. Bailer-Jones11, Steve Bastian1, Timothy C. Beers12, Eric F. Bell11, Vasily Belokurov13, Dmitry Bizyaev9, Norm Blythe9, John J. Bochanski10, William N. Boroski1, Jarle Brinchmann14, J. Brinkmann9, Howard Brewington9, Larry N. Carey10, Kyle M. Cudworth15, Michael L. Evans10, Nick Evans13, Evalyn Gates15, Boris T. Gänsicke16, Bruce Gillespie9, G. F. Gilmore13, Ada Nebot Gomez-Moran, Eva K. Grebel17, Jim Greenwell10, James E. Gunn4, Cathy Jordan9, Wendell Jordan9, Paul Harding18, Hugh C. Harris, John S. Hendry1, Diana Holder9, Inese I. Ivans4, Željko Ivezić10, Sebastian Jester11, Jennifer A. Johnson7, Stephen M. Kent1, S. J. Kleinman9, Alexei Y. Kniazev11, Jurek Krzesinski9, Richard G. Kron15, Nikolay Kuropatkin1, Svetlana Lebedeva1, Young Sun Lee12, R. French Leger1, Sébastien Lépine19, Steve Levine, Huan Lin1, Dan Long9, Craig P. Loomis4, Robert H. Lupton4, O. Malanushenko9, Viktor Malanushenko9, Bruce Margon2, David Martínez-Delgado11, P. M. McGehee20, Dave Monet, Heather L. Morrison18, Jeffrey A. Munn, Eric H. Neilsen1, Atsuko Nitta9, John E. Norris21, Daniel Oravetz9, Russell Owen10, Nikhil Padmanabhan22, Kaike Pan9, R. S. Peterson1, Jeffrey R. Pier, Jared Platson1, Paola Re Fiorentin23, Paola Re Fiorentin11, Gordon T. Richards24, Hans-Walter Rix11, David J. Schlegel22, Donald P. Schneider25, Matthias R. Schreiber26, Axel Schwope, Valena C. Sibley1, Audrey Simmons9, Stephanie A. Snedden9, J. Allyn Smith27, Larry Stark10, Fritz Stauffer9, Matthias Steinmetz, Christopher Stoughton1, Mark SubbaRao28, Mark SubbaRao15, Alexander S. Szalay29, Paula Szkody10, Aniruddha R. Thakar29, Sivarani Thirupathi12, Douglas L. Tucker1, A. Uomoto30, Daniel E. Vanden Berk25, S. Vidrih17, Yogesh Wadadekar4, Yogesh Wadadekar31, S. Watters9, R. Wilhelm32, Rosemary F. G. Wyse29, Jean Yarger9, Daniel B. Zucker13 
TL;DR: The Sloan Extension for Galactic Understanding and Exploration (SEGUE) Survey as mentioned in this paper obtained approximately 240,000 moderate-resolution (R ≈ 1800) spectra from 3900 Å to 9000 Å of fainter Milky Way stars (14.0 < g < 20.3).
Abstract: The Sloan Extension for Galactic Understanding and Exploration (SEGUE) Survey obtained ≈240,000 moderate-resolution (R ≈ 1800) spectra from 3900 Å to 9000 Å of fainter Milky Way stars (14.0 < g < 20.3). For stars with signal-to-noise ratio > 10 per resolution element, stellar atmospheric parameters are estimated, including metallicity, surface gravity, and effective temperature. SEGUE obtained 3500 deg^2 of additional ugriz imaging (primarily at low Galactic latitudes), providing precise multicolor photometry (σ(g, r, i) ≈ 2%; σ(u, z) ≈ 3%) and astrometry (≈0.1 arcsec) for spectroscopic target selection. The stellar spectra, imaging data, and derived parameter catalogs for this survey are publicly available as part of Sloan Digital Sky Survey Data Release 7.

1,133 citations


Book
02 Dec 2009
TL;DR: Archer as discussed by the authors argues that people in their daily lives feel a genuine freedom of thought and belief, yet this is unavoidably constrained by cultural limitations, such as those imposed by the language spoken, the knowledge developed and the information available at any time.
Abstract: People are inescapably shaped by the culture in which they live, while culture itself is made and remade by people. Human beings in their daily lives feel a genuine freedom of thought and belief, yet this is unavoidably constrained by cultural limitations--such as those imposed by the language spoken, the knowledge developed and the information available at any time. In this book, Margaret Archer provides an analysis of the nature and stringency of cultural constraints, and the conditions and degrees of cultural freedom, and offers a radical new explanation of the tension between them. She suggests that the "problem of culture and agency" directly parallels the "problem of structure and agency," and that both problems can be solved by using the same analytical framework. She therefore paves the way toward the theoretical unification of the structural and cultural fields.

1,125 citations


Journal ArticleDOI
TL;DR: Climate change mitigation in transport should benefit public health substantially and policies to increase the acceptability, appeal, and safety of active urban travel, and discourage travel in private motor vehicles would provide larger health benefits than would policies that focus solely on lower-emission motor vehicles.

1,013 citations


Journal ArticleDOI
TL;DR: In this paper, the authors demonstrate a cleaning process that verifiably removes the contamination on the device structure and allows the intrinsic chemical responses of the graphene monolayer to be measured.
Abstract: Graphene is a two-dimensional material with extremely favorable chemical sensor properties. Conventional nanolithography typically leaves a resist residue on the graphene surface, whose impact on the sensor characteristics has not yet been determined. Here we show that the contamination layer chemically dopes the graphene, enhances carrier scattering, and acts as an absorbent layer that concentrates analyte molecules at the graphene surface, thereby enhancing the sensor response. We demonstrate a cleaning process that verifiably removes the contamination on the device structure and allows the intrinsic chemical responses of the graphene monolayer to be measured. These intrinsic responses are surprisingly small, even upon exposure to strong analytes such as ammonia vapor.

939 citations


Journal ArticleDOI
TL;DR: A consensus group comprising experts in pediatric and adult endocrinology, diabetes education, transplantation, metabolism, bariatric/metabolic surgery, and (for another perspective) hematology-oncology met in June 2009 to discuss these issues.
Abstract: The mission of the American Diabetes Association is “to prevent and cure diabetes and to improve the lives of all people affected by diabetes.” Increasingly, scientific and medical articles (1) and commentaries (2) about diabetes interventions use the terms “remission” and “cure” as possible outcomes. Several approved or experimental treatments for type 1 and type 2 diabetes (e.g., pancreas or islet transplants, immunomodulation, bariatric/metabolic surgery) are of curative intent or have been portrayed in the media as a possible cure. However, defining remission or cure of diabetes is not as straightforward as it may seem. Unlike “dichotomous” diseases such as many malignancies, diabetes is defined by hyperglycemia, which exists on a continuum and may be impacted over a short time frame by everyday treatment or events (medications, diet, activity, intercurrent illness). The distinction between successful treatment and cure is blurred in the case of diabetes. Presumably improved or normalized glycemia must be part of the definition of remission or cure. Glycemic measures below diagnostic cut points for diabetes can occur with ongoing medications (e.g., antihyperglycemic drugs, immunosuppressive medications after a transplant), major efforts at lifestyle change, a history of bariatric/metabolic surgery, or ongoing procedures (such as repeated replacements of endoluminal devices). Do we use the terms remission or cure for all patients with normal glycemic measures, regardless of how this is achieved? A consensus group comprised of experts in pediatric and adult endocrinology, diabetes education, transplantation, metabolism, bariatric/metabolic surgery, and (for another perspective) hematology-oncology met in June 2009 to discuss these issues. The group considered a wide variety of questions, including whether it is ever accurate to say that a chronic illness is cured; what the definitions of management, remission, or cure might be; whether goals of managing comorbid conditions revert to those of patients without diabetes if someone is …

880 citations


Journal ArticleDOI
TL;DR: It is proposed that the success of infants and nonhuman animals on some belief reasoning tasks may be best explained by a cognitively efficient but inflexible capacity for tracking belief-like states in humans.
Abstract: The lack of consensus on how to characterize humans' capacity for belief reasoning has been brought into sharp focus by recent research. Children fail critical tests of belief reasoning before 3 to 4 years of age (H. Wellman, D. Cross, & J. Watson, 2001; H. Wimmer & J. Perner, 1983), yet infants apparently pass false-belief tasks at 13 or 15 months (K. H. Onishi & R. Baillargeon, 2005; L. Surian, S. Caldi, & D. Sperber, 2007). Nonhuman animals also fail critical tests of belief reasoning but can show very complex social behavior (e.g., J. Call & M. Tomasello, 2005). Fluent social interaction in adult humans implies efficient processing of beliefs, yet direct tests suggest that belief reasoning is cognitively demanding, even for adults (e.g., I. A. Apperly, D. Samson, & G. W. Humphreys, 2009). The authors interpret these findings by drawing an analogy with the domain of number cognition, where similarly contrasting results have been observed. They propose that the success of infants and nonhuman animals on some belief reasoning tasks may be best explained by a cognitively efficient but inflexible capacity for tracking belief-like states. In humans, this capacity persists in parallel with a later-developing, more flexible but more cognitively demanding theory-of-mind ability.

Journal ArticleDOI
TL;DR: A scoring system is proposed for concomitantly appraising the methodological quality of qualitative, quantitative and mixed methods studies for systematic mixed studies reviews (SMSRs); this scoring system may also be used to appraise the quantitative and qualitative quality of mixed methods research.

Journal ArticleDOI
TL;DR: A short 7 item version of WEMWBS was found to satisfy the strict unidimensionality expectations of the Rasch model, and be largely free of bias, and is preferable to the full 14 item version at present for monitoring mental well-being in populations.
Abstract: Background The Warwick-Edinburgh Mental Well-Being Scale (WEMWBS) was developed to meet demand for instruments to measure mental well-being. It comprises 14 positively phrased Likert-style items and fulfils classic criteria for scale development. We report here the internal construct validity of WEMWBS from the perspective of the Rasch measurement model.
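To make the Rasch perspective concrete, the sketch below shows the dichotomous form of the model, in which the probability of endorsing an item depends only on the difference between person location and item location on a common logit scale; WEMWBS items are polytomous Likert items, so the published analysis uses a more general formulation, and the numbers here are hypothetical.

    # Dichotomous Rasch model: P(endorse) = exp(theta - b) / (1 + exp(theta - b)).
    # Person ability (theta) and item difficulties (b) below are hypothetical.
    import math

    def rasch_probability(theta, b):
        return 1.0 / (1.0 + math.exp(-(theta - b)))

    theta = 0.5                            # person location (logits)
    item_difficulties = [-1.0, 0.0, 1.2]   # item locations (logits)

    for b in item_difficulties:
        print("item b = %+.1f  P(endorse) = %.2f" % (b, rasch_probability(theta, b)))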

Journal ArticleDOI
TL;DR: In this article, a vocabulary and strategy for theorizing work and organizational practices is presented, based on the metaphorical movement of "zooming in" and "zooming out of" practice.
Abstract: This article contributes to re-specifying a number of the phenomena of interest to organizational studies in terms of patterns of socio-material practices and their effects. It does so by outlining a vocabulary and strategy that make up a framework for theorizing work and organizational practices. The vocabulary is based on a number of sensitizing concepts that connote practice as an open-ended, heterogeneous accomplishment which takes place within a specific horizon of sense and a set of concerns which the practice itself brings to bear. The strategy is based on the metaphorical movement of ‘zooming in’ and ‘zooming out of’ practice. The zooming in and out are obtained through switching theoretical lenses and re-positioning in the field, so that certain aspects of the practice are fore-grounded while others are bracketed. Building on the results of an extended study of telemedicine, the article discusses in detail the different elements of the framework and how it enhances our capacity to re-present practi...

Journal ArticleDOI
20 Nov 2009
TL;DR: GEM-CAP should be considered as one of the standard first-line options in locally advanced and metastatic pancreatic cancer, and the meta-analysis of published studies showed a significant survival benefit in favor of GEM-CAP.
Abstract: Purpose Both gemcitabine (GEM) and fluoropyrimidines are valuable treatment for advanced pancreatic cancer. This open-label study was designed to compare the overall survival (OS) of patients randomly assigned to GEM alone or GEM plus capecitabine (GEM-CAP). Patients and Methods Patients with previously untreated histologically or cytologically proven locally advanced or metastatic carcinoma of the pancreas with a performance status ≤ 2 were recruited. Patients were randomly assigned to GEM or GEM-CAP. The primary outcome measure was survival. Meta-analysis of published studies was also conducted. Results Between May 2002 and January 2005, 533 patients were randomly assigned to GEM (n = 266) and GEM-CAP (n = 267) arms. GEM-CAP significantly improved objective response rate (19.1% v 12.4%; P = .034) and progression-free survival (hazard ratio [HR], 0.78; 95% CI, 0.66 to 0.93; P = .004) and was associated with a trend toward improved OS (HR, 0.86; 95% CI, 0.72 to 1.02; P = .08) compared with GEM alone. This trend for OS benefit for GEM-CAP was consistent across different prognostic subgroups according to baseline stratification factors (stage and performance status) and remained after adjusting for these stratification factors (P = .077). Moreover, the meta-analysis of two additional studies involving 935 patients showed a significant survival benefit in favor of GEM-CAP (HR, 0.86; 95% CI, 0.75 to 0.98; P = .02) with no intertrial heterogeneity. Conclusion On the basis of our trial and the meta-analysis, GEM-CAP should be considered as one of the standard first-line options in locally advanced and metastatic pancreatic cancer.

Journal ArticleDOI
30 Nov 2009-PLOS ONE
TL;DR: A systematic survey of reporting, experimental design and statistical analysis in published biomedical research using laboratory animals identified a number of issues that need to be addressed in order to improve experimentalDesign and reporting in publications describing research using animals.
Abstract: For scientific, ethical and economic reasons, experiments involving animals should be appropriately designed, correctly analysed and transparently reported. This increases the scientific validity of the results, and maximises the knowledge gained from each experiment. A minimum amount of relevant information must be included in scientific publications to ensure that the methods and results of a study can be reviewed, analysed and repeated. Omitting essential information can raise scientific and ethical concerns. We report the findings of a systematic survey of reporting, experimental design and statistical analysis in published biomedical research using laboratory animals. Medline and EMBASE were searched for studies reporting research on live rats, mice and non-human primates carried out in UK and US publicly funded research establishments. Detailed information was collected from 271 publications, about the objective or hypothesis of the study, the number, sex, age and/or weight of animals used, and experimental and statistical methods. Only 59% of the studies stated the hypothesis or objective of the study and the number and characteristics of the animals used. Appropriate and efficient experimental design is a critical component of high-quality science. Most of the papers surveyed did not use randomisation (87%) or blinding (86%), to reduce bias in animal selection and outcome assessment. Only 70% of the publications that used statistical methods described their methods and presented the results with a measure of error or variability. This survey has identified a number of issues that need to be addressed in order to improve experimental design and reporting in publications describing research using animals. Scientific publication is a powerful and important source of information; the authors of scientific publications therefore have a responsibility to describe their methods and results comprehensively, accurately and transparently, and peer reviewers and journal editors share the responsibility to ensure that published studies fulfil these criteria.

Journal ArticleDOI
29 Oct 2009-Nature
TL;DR: In this paper, the authors reported that GRB 090423 lies at a redshift of z ≈ 8.2, implying that massive stars were being produced and dying as GRBs ~630 Myr after the Big Bang.
Abstract: Long-duration gamma-ray bursts (GRBs) are thought to result from the explosions of certain massive stars(1), and some are bright enough that they should be observable out to redshifts of z > 20 using current technology(2-4). Hitherto, the highest redshift measured for any object was z = 6.96, for a Lyman-alpha emitting galaxy(5). Here we report that GRB 090423 lies at a redshift of z ≈ 8.2, implying that massive stars were being produced and dying as GRBs ~630 Myr after the Big Bang. The burst also pinpoints the location of its host galaxy.
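The quoted ~630 Myr is the age of the Universe at z ≈ 8.2 under the cosmology adopted by the authors. A rough cross-check can be made with astropy; the exact figure depends on which cosmological parameters are assumed, so the sketch below simply compares two standard parameter sets.

    # Age of the Universe at z = 8.2 under two standard cosmologies.
    # The value depends on the adopted parameters; the paper quotes ~630 Myr.
    from astropy.cosmology import WMAP5, Planck18

    z = 8.2
    for cosmo in (WMAP5, Planck18):
        print(cosmo.name, "age at z = 8.2:", cosmo.age(z).to("Myr"))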

Journal ArticleDOI
18 Aug 2009-ACS Nano
TL;DR: Electron diffraction shows that on average the underlying carbon lattice maintains the order and lattice-spacings of graphene; a structure that is clearly resolved in 80 kV aberration-corrected atomic resolution TEM images.
Abstract: We report on the structural analysis of graphene oxide (GO) by transmission electron microscopy (TEM). Electron diffraction shows that on average the underlying carbon lattice maintains the order and lattice-spacings of graphene; a structure that is clearly resolved in 80 kV aberration-corrected atomic resolution TEM images. These results also reveal that single GO sheets are highly electron transparent and stable in the electron beam, and hence ideal support films for the study of nanoparticles and macromolecules by TEM. We demonstrate this through the structural analysis of physiological ferritin, an iron-storage protein.

Journal ArticleDOI
TL;DR: Although a broad range of programs for the prevention of child maltreatment exists, the effectiveness of most of these programs is unknown, and there are currently no known approaches to prevent emotional abuse or exposure to intimate-partner violence.

Journal ArticleDOI
TL;DR: There have now been two successive policy regimes since the Second World War that have temporarily succeeded in reconciling the uncertainties and instabilities of a capitalist economy with democracy's need for stability for people's lives and capitalism's own need for confident mass consumers.
Abstract: There have now been two successive policy regimes since the Second World War that have temporarily succeeded in reconciling the uncertainties and instabilities of a capitalist economy with democracy's need for stability for people's lives and capitalism's own need for confident mass consumers. The first of these was the system of public demand management generally known as Keynesianism. The second was not, as has often been thought, a neo-liberal turn to pure markets, but a system of markets alongside extensive housing and other debt among low- and medium-income people linked to unregulated derivatives markets. It was a form of privatised Keynesianism. This combination reconciled capitalism's problem, but in a way that eventually proved unsustainable. After its collapse there is debate over what will succeed it. Most likely is an attempt to re-create it on a basis of corporate social responsibility.

Journal ArticleDOI
TL;DR: This review puts the current knowledge of marine picocyanobacterial genomics into an environmental context and presents previously unpublished genomic information arising from extensive genomic comparisons in order to provide insights into the adaptations of these marine microbes to their environment and how they are reflected at the genomic level.
Abstract: Marine picocyanobacteria of the genera Prochlorococcus and Synechococcus numerically dominate the picophytoplankton of the world ocean, making a key contribution to global primary production. Prochlorococcus was isolated around 20 years ago and is probably the most abundant photosynthetic organism on Earth. The genus comprises specific ecotypes which are phylogenetically distinct and differ markedly in their photophysiology, allowing growth over a broad range of light and nutrient conditions within the 45 degrees N to 40 degrees S latitudinal belt that they occupy. Synechococcus and Prochlorococcus are closely related, together forming a discrete picophytoplankton clade, but are distinguishable by their possession of dissimilar light-harvesting apparatuses and differences in cell size and elemental composition. Synechococcus strains have a ubiquitous oceanic distribution compared to that of Prochlorococcus strains and are characterized by phylogenetically discrete lineages with a wide range of pigmentation. In this review, we put our current knowledge of marine picocyanobacterial genomics into an environmental context and present previously unpublished genomic information arising from extensive genomic comparisons in order to provide insights into the adaptations of these marine microbes to their environment and how they are reflected at the genomic level.

Journal ArticleDOI
TL;DR: The proposed simple guidelines for combining estimates after MI may lead to a wider and more appropriate use of MI in future prognostic modelling studies.
Abstract: Multiple imputation (MI) provides an effective approach to handle missing covariate data within prognostic modelling studies, as it can properly account for the missing data uncertainty. The multiply imputed datasets are each analysed using standard prognostic modelling techniques to obtain the estimates of interest. The estimates from each imputed dataset are then combined into one overall estimate and variance, incorporating both the within and between imputation variability. Rubin's rules for combining these multiply imputed estimates are based on asymptotic theory. The resulting combined estimates may be more accurate if the posterior distribution of the population parameter of interest is better approximated by the normal distribution. However, the normality assumption may not be appropriate for all the parameters of interest when analysing prognostic modelling studies, such as predicted survival probabilities and model performance measures. Guidelines for combining the estimates of interest when analysing prognostic modelling studies are provided. A literature review is performed to identify current practice for combining such estimates in prognostic modelling studies. Methods for combining all reported estimates after MI were not well reported in the current literature. Rubin's rules without applying any transformations were the standard approach used, when any method was stated. The proposed simple guidelines for combining estimates after MI may lead to a wider and more appropriate use of MI in future prognostic modelling studies.
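For reference, Rubin's rules for a single scalar parameter combine the m per-imputation estimates by averaging them and adding the within- and between-imputation variances. A minimal sketch with hypothetical estimates and standard errors:

    # Rubin's rules for pooling estimates from m multiply imputed datasets.
    # q: per-imputation point estimates; se: their standard errors (hypothetical).
    import math

    q = [1.82, 1.95, 1.78, 2.01, 1.88]
    se = [0.20, 0.22, 0.19, 0.23, 0.21]
    m = len(q)

    q_bar = sum(q) / m                                      # pooled point estimate
    within = sum(s ** 2 for s in se) / m                    # within-imputation variance
    between = sum((qi - q_bar) ** 2 for qi in q) / (m - 1)  # between-imputation variance
    total = within + (1 + 1 / m) * between                  # Rubin's total variance

    print("pooled estimate = %.3f, pooled SE = %.3f" % (q_bar, math.sqrt(total)))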

Journal ArticleDOI
TL;DR: This paper addresses the question of how new knowledge is created in organizations by focusing on direct social interaction and adopting a dialogical approach; several organizational examples are reinterpreted to illustrate these points.
Abstract: Despite several insightful empirical studies on how new knowledge is created in organizations, there is still no satisfactory answer to the question, how is new knowledge created in organizations? The purpose of this paper is to address this question by focusing on direct social interaction, adopting a dialogical approach. The following argument is advanced. From a dialogical perspective, new knowledge in organizations originates in the individual ability to draw new distinctions concerning a task at hand. New distinctions may be developed because practitioners experience their situations in terms of already constituted distinctions, which lend themselves to further articulation. Further articulation develops when organizational members engage in dialogical exchanges. When productive, dialogue leads to self-distanciation, namely, to individuals taking distance from their customary and unreflective ways of acting as practitioners. Dialogue is productive depending on the extent to which participants engage relationally with one another. When this happens, participants are more likely to actively take responsibility for both the joint tasks in which they are involved and for the relationships they have with others. Self-distanciation leads to new distinctions through three processes of conceptual change (conceptual combination, conceptual expansion, and conceptual reframing), which, when intersubjectively accepted, constitute new knowledge. Several organizational examples, as well as findings from organizational knowledge research, are reinterpreted to illustrate the above points.

Journal ArticleDOI
TL;DR: In this paper, the authors show that management innovation is a consequence of a firm's internal context and of the external search for new knowledge, and demonstrate a trade-off between context and search, in that there is a negative effect on management innovation associated with their joint occurrence.

Journal ArticleDOI
01 Apr 2009
TL;DR: The authors argue that critical management studies (CMS) should be conceptualized as a profoundly performative project, and suggest a range of tactics including affirming ambiguity, working with mysteries, applied communicative action, exploring heterotopias and engaging micro-emancipations.
Abstract: We argue that critical management studies (CMS) should be conceptualized as a profoundly performative project. The central task of CMS should be to actively and pragmatically intervene in specific debates about management and encourage progressive forms of management. This involves CMS becoming affirmative, caring, pragmatic, potential focused, and normative. To do this, we suggest a range of tactics including affirming ambiguity, working with mysteries, applied communicative action, exploring heterotopias and engaging micro-emancipations.

Journal ArticleDOI
TL;DR: The authors investigated whether drop-out in the Avon Longitudinal Study of Parents and Children (ALSPAC) was systematic or random, and if systematic, whether it had an impact on the prediction of disruptive behaviour disorders.
Abstract: Background Participant drop-out occurs in all longitudinal studies, and if systematic, may lead to selection biases and erroneous conclusions being drawn from a study. Aims We investigated whether drop out in the Avon Longitudinal Study of Parents And Children (ALSPAC) was systematic or random, and if systematic, whether it had an impact on the prediction of disruptive behaviour disorders. Method Teacher reports of disruptive behaviour among currently participating, previously participating and never participating children aged 8 years in the ALSPAC longitudinal study were collected. Data on family factors were obtained in pregnancy. Simulations were conducted to explain the impact of selective drop-out on the strength of prediction. Results Drop out from the ALSPAC cohort was systematic and children who dropped out were more likely to suffer from disruptive behaviour disorder. Systematic participant drop-out according to the family variables, however, did not alter the association between family factors obtained in pregnancy and disruptive behaviour disorder at 8 years of age. Conclusions Cohort studies are prone to selective drop-out and are likely to underestimate the prevalence of psychiatric disorder. This empirical study and the simulations confirm that the validity of regression models is only marginally affected despite range restrictions after selective drop-out.
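The following is a hedged sketch of the kind of simulation the abstract describes, not the authors' actual code: when children with the outcome are more likely to drop out, the observed prevalence falls, but a logistic-regression coefficient for a baseline risk factor changes comparatively little.

    # Illustrative simulation of selective drop-out (not the ALSPAC analysis).
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 50_000
    risk_factor = rng.normal(size=n)                      # baseline family risk factor
    p_disorder = 1 / (1 + np.exp(-(-2.5 + 0.8 * risk_factor)))
    disorder = rng.binomial(1, p_disorder)

    # Children with the disorder are assumed twice as likely to drop out.
    stayed = rng.binomial(1, np.where(disorder == 1, 0.4, 0.8)).astype(bool)

    def slope(y, x):
        return sm.Logit(y, sm.add_constant(x)).fit(disp=False).params[1]

    print("true prevalence:              %.3f" % disorder.mean())
    print("observed prevalence:          %.3f" % disorder[stayed].mean())
    print("coefficient, full cohort:     %.2f" % slope(disorder, risk_factor))
    print("coefficient, after drop-out:  %.2f" % slope(disorder[stayed], risk_factor[stayed]))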

Posted Content
TL;DR: In this paper, the authors provide evidence that happiness raises productivity in a piece-rate Niederle-Vesterlund task; a complementary Experiment 2, which studies major real-world unhappiness shocks - bereavement and family illness - is designed to check the robustness and lasting nature of this kind of effect.
Abstract: The paper provides evidence that happiness raises productivity. In Experiment 1, a randomized trial is designed. Some subjects have their happiness levels increased, while those in a control group do not. Treated subjects have 12% greater productivity in a paid piece-rate Niederle-Vesterlund task. They alter output but not the per-piece quality of their work. To check the robustness and lasting nature of this kind of effect, a complementary Experiment 2 is designed. In this, major real-world unhappiness shocks - bereavement and family illness - are studied. The findings from (real-life) Experiment 2 match those from (random-assignment) Experiment 1.

Journal ArticleDOI
10 Apr 2009-Science
TL;DR: Altering the stimulation intervals gave different patterns of NF-κB–dependent gene expression, which supports the idea that oscillation frequency has a functional role in nuclear factor κB regulation.
Abstract: The nuclear factor kappa B (NF-kappa B) transcription factor regulates cellular stress responses and the immune response to infection. NF-kappa B activation results in oscillations in nuclear NF-kappa B abundance. To define the function of these oscillations, we treated cells with repeated short pulses of tumor necrosis factor-alpha at various intervals to mimic pulsatile inflammatory signals. At all pulse intervals that were analyzed, we observed synchronous cycles of NF-kappa B nuclear translocation. Lower frequency stimulations gave repeated full-amplitude translocations, whereas higher frequency pulses gave reduced translocation, indicating a failure to reset. Deterministic and stochastic mathematical models predicted how negative feedback loops regulate both the resetting of the system and cellular heterogeneity. Altering the stimulation intervals gave different patterns of NF-kappa B-dependent gene expression, which supports the idea that oscillation frequency has a functional role.
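As a hedged illustration of the modelling idea, and not the authors' published NF-κB model, the sketch below uses a generic two-variable negative-feedback loop driven by square input pulses: with long pulse intervals the inhibitor decays and each response is close to full size, whereas short intervals leave residual inhibitor and later responses are damped.

    # Generic negative-feedback model with pulsed input (illustration only).
    import numpy as np
    from scipy.integrate import solve_ivp

    def pulsed_stimulus(t, interval, width=5.0):
        """Unit-height square pulses of the given width (min) and spacing (min)."""
        return 1.0 if (t % interval) < width else 0.0

    def rhs(t, y, interval):
        n, i = y                                 # n: active signal, i: inhibitor
        s = pulsed_stimulus(t, interval)
        dn = s / (1.0 + 5.0 * i) - 0.2 * n       # activation repressed by inhibitor
        di = 0.05 * n - 0.02 * i                 # inhibitor induced by the signal
        return [dn, di]

    for interval in (60.0, 200.0):               # pulses every 60 vs every 200 minutes
        sol = solve_ivp(rhs, (0.0, 600.0), [0.0, 0.0], args=(interval,), max_step=1.0)
        first_peak = sol.y[0][sol.t < 100].max()
        late_peak = sol.y[0][sol.t > 300].max()
        print("interval %3.0f min: first peak %.2f, late peak %.2f"
              % (interval, first_peak, late_peak))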

Journal ArticleDOI
TL;DR: This review illustrates notable recent progress in the field of medicinal bioinorganic chemistry as many new approaches to the design of innovative metal-based anticancer drugs are emerging.

Journal ArticleDOI
TL;DR: The latest advances in the understanding of the biochemistry, physiology and therapeutics of nitrate, nitrite and NO were discussed during a recent 2-day meeting at the Nobel Forum, Karolinska Institutet in Stockholm.
Abstract: Inorganic nitrate and nitrite from endogenous or dietary sources are metabolized in vivo to nitric oxide (NO) and other bioactive nitrogen oxides. The nitrate-nitrite-NO pathway is emerging as an important mediator of blood flow regulation, cell signaling, energetics and tissue responses to hypoxia. The latest advances in our understanding of the biochemistry, physiology and therapeutics of nitrate, nitrite and NO were discussed during a recent 2-day meeting at the Nobel Forum, Karolinska Institutet in Stockholm.