
Showing papers by "University of Calgary" published in 2008


Journal ArticleDOI
01 Jun 2008-Chest
TL;DR: This article discusses the prevention of venous thromboembolism (VTE) and is part of the Antithrombotic and Thrombolytic Therapy: American College of Chest Physicians Evidence-Based Clinical Practice Guidelines (8th Edition).

3,944 citations


Journal ArticleDOI
TL;DR: A new CG model for proteins as an extension of the MARTINI force field is developed and effectively reproduces peptide-lipid interactions and the partitioning of amino acids and peptides in lipid bilayers.
Abstract: Many biologically interesting phenomena occur on a time scale that is too long to be studied by atomistic simulations. These phenomena include the dynamics of large proteins and self-assembly of biological materials. Coarse-grained (CG) molecular modeling allows computer simulations to be run on length and time scales that are 2–3 orders of magnitude larger compared to atomistic simulations, providing a bridge between the atomistic and the mesoscopic scale. We developed a new CG model for proteins as an extension of the MARTINI force field. Here, we validate the model for its use in peptide-bilayer systems. In order to validate the model, we calculated the potential of mean force for each amino acid as a function of its distance from the center of a dioleoylphosphatidylcholine (DOPC) lipid bilayer. We then compared amino acid association constants, the partitioning of a series of model pentapeptides, the partitioning and orientation of WALP23 in DOPC lipid bilayers and a series of KALP peptides in dimyris...
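
For reference, a potential of mean force along the bilayer normal of this kind is commonly obtained from umbrella sampling or directly from the equilibrium density of the solute; a generic form of the relation, not necessarily the exact protocol of this paper, is:

W(z) = -k_B T \ln\frac{\rho(z)}{\rho_{\mathrm{bulk}}}, \qquad \Delta G_{\mathrm{water}\to\mathrm{bilayer}} = W(z_{\mathrm{bilayer}}) - W(z_{\mathrm{water}})

where \rho(z) is the density of the amino acid (or side-chain analogue) at distance z from the bilayer center and k_B T is the thermal energy; differences in W(z) give the transfer free energies that are compared with association constants and partitioning data.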

2,173 citations


Journal ArticleDOI
TL;DR: More rapid diagnostic testing of ESBL-producing bacteria and the possible modification of guidelines for community-onset bacteraemia associated with UTIs are required.
Abstract: The medical community relies on clinical expertise and published guidelines to assist physicians with choices in empirical therapy for system-based infectious syndromes, such as community-acquired pneumonia and urinary-tract infections (UTIs). From the late 1990s, multidrug-resistant Enterobacteriaceae (mostly Escherichia coli) that produce extended-spectrum beta lactamases (ESBLs), such as the CTX-M enzymes, have emerged within the community setting as an important cause of UTIs. Recent reports have also described ESBL-producing E coli as a cause of bloodstream infections associated with these community-onset UTIs. The carbapenems are widely regarded as the drugs of choice for the treatment of severe infections caused by ESBL-producing Enterobacteriaceae, although comparative clinical trials are scarce. Thus, more rapid diagnostic testing of ESBL-producing bacteria and the possible modification of guidelines for community-onset bacteraemia associated with UTIs are required.

1,811 citations


Journal ArticleDOI
TL;DR: This paper provides a conceptual model of the core phenomenon and key themes in event tourism studies as a framework for spurring theoretical advancement, identifying research gaps, and assisting professional practice.

1,802 citations


Journal ArticleDOI
TL;DR: Analysis of individual measures of personality and categories of SWB shows that different personality and SWB scales can be substantively different and that the relationship between the two is typically much larger than previous meta-analyses have indicated.
Abstract: Understanding subjective well-being (SWB) has historically been a core human endeavor and presently spans fields from management to mental health. Previous meta-analyses have indicated that personality traits are one of the best predictors. Still, these past results indicate only a moderate relationship, weaker than suggested by several lines of reasoning. This may be because of commensurability problems, whereby researchers have grouped together substantively disparate measures in their analyses. In this article, the authors review and address this problem directly, focusing on individual measures of personality (e.g., the Neuroticism-Extroversion-Openness Personality Inventory; P. T. Costa & R. R. McCrae, 1992) and categories of SWB (e.g., life satisfaction). In addition, the authors take a multivariate approach, assessing how much variance personality traits account for individually as well as together. Results indicate that different personality and SWB scales can be substantively different and that the relationship between the two is typically much larger (e.g., 4 times) than previous meta-analyses have indicated. Total SWB variance accounted for by personality can reach as high as 39% or 63% disattenuated. These results also speak to meta-analyses in general and the need to account for scale differences once a sufficient research base has been generated.
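
The "disattenuated" figure refers to the standard psychometric correction for measurement unreliability; as a reference formula (stated generically here, not quoted from the article), an observed correlation r_xy is corrected using the reliabilities r_xx and r_yy of the two scales:

\hat{\rho}_{xy} = \frac{r_{xy}}{\sqrt{r_{xx}\, r_{yy}}}

Squaring the corrected correlation (or corrected multiple correlation) gives the disattenuated proportion of SWB variance accounted for by personality.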

1,404 citations


Journal ArticleDOI
Jan Schipper, Janice Chanson, Federica Chiozza, Neil A. Cox, Michael R. Hoffmann, Vineet Katariya, John F. Lamoreux, Ana S. L. Rodrigues, Simon N. Stuart, Helen J. Temple, Jonathan E. M. Baillie, Luigi Boitani, Thomas E. Lacher, Russell A. Mittermeier, Andrew T. Smith, Daniel Absolon, John M. Aguiar, Giovanni Amori, Noura Bakkour, Ricardo Baldi, Richard J. Berridge, Jon Bielby, Patricia Ann Black, Julian Blanc, Thomas M. Brooks, James Burton, Thomas M. Butynski, Gianluca Catullo, Roselle Chapman, Zoe Cokeliss, Ben Collen, Jim Conroy, Justin Cooke, Gustavo A. B. da Fonseca, Andrew E. Derocher, Holly T. Dublin, J. W. Duckworth, Louise H. Emmons, Richard H. Emslie, Marco Festa-Bianchet, Matthew N. Foster, Sabrina Foster, David L. Garshelis, C. Cormack Gates, Mariano Gimenez-Dixon, Susana González, José F. González-Maya, Tatjana C. Good, Geoffrey Hammerson, Philip S. Hammond, D. C. D. Happold, Meredith Happold, John Hare, Richard B. Harris, Clare E. Hawkins, Mandy Haywood, Lawrence R. Heaney, Simon Hedges, Kristofer M. Helgen, Craig Hilton-Taylor, Syed Ainul Hussain, Nobuo Ishii, Thomas Jefferson, Richard K. B. Jenkins, Charlotte H. Johnston, Mark Keith, Jonathan Kingdon, David Knox, Kit M. Kovacs, Penny F. Langhammer, Kristin Leus, Rebecca L. Lewison, Gabriela Lichtenstein, Lloyd F. Lowry, Zoe Macavoy, Georgina M. Mace, David Mallon, Monica Masi, Meghan W. McKnight, Rodrigo A. Medellín, Patricia Medici, G. Mills, Patricia D. Moehlman, Sanjay Molur, Arturo Mora, Kristin Nowell, John F. Oates, Wanda Olech, William R.L. Oliver, Monik Oprea, Bruce D. Patterson, William F. Perrin, Beth Polidoro, Caroline M. Pollock, Abigail Powel, Yelizaveta Protas, Paul A. Racey, Jim Ragle, Pavithra Ramani, Galen B. Rathbun, Randall R. Reeves, Stephen B. Reilly, John E. Reynolds, Carlo Rondinini, Ruth Grace Rosell-Ambal, Monica Rulli, Anthony B. Rylands, Simona Savini, Cody J. Schank, Wes Sechrest, Caryn Self-Sullivan, Alan Shoemaker, Claudio Sillero-Zubiri, Naamal De Silva, David E. Smith, Chelmala Srinivasulu, P. J. Stephenson, Nico van Strien, Bibhab Kumar Talukdar, Barbara L. Taylor, Rob Timmins, Diego G. Tirira, Marcelo F. Tognelli, Katerina Tsytsulina, Liza M. Veiga, Jean-Christophe Vié, Elizabeth A. Williamson, Sarah A. Wyatt, Yan Xie, Bruce E. Young
Affiliations: Conservation International, International Union for Conservation of Nature and Natural Resources, Sapienza University of Rome, Texas A&M University, Instituto Superior Técnico, University of Cambridge, Zoological Society of London, Arizona State University, Columbia University, National Scientific and Technical Research Council, Wildlife Conservation Society, Imperial College London, National University of Tucumán, University of Tasmania, University of the Philippines Los Baños, Earthwatch Institute, University of Edinburgh, Drexel University, Universidade Federal de Minas Gerais, Global Environment Facility, University of Alberta, Smithsonian Institution, Université de Sherbrooke, University of Virginia, Minnesota Department of Natural Resources, University of Calgary, James Cook University, NatureServe, University of St Andrews, Australian National University, University of Montana, General Post Office, University of Otago, Field Museum of Natural History, Wildlife Institute of India, Tokyo Woman's Christian University, National Oceanic and Atmospheric Administration, University of Aberdeen, University of the Witwatersrand, University of Oxford, University Centre in Svalbard, Norwegian Polar Institute, Copenhagen Zoo, San Diego State University, University of Alaska Fairbanks, Manchester Metropolitan University, National Autonomous University of Mexico, University of Kent, City University of New York, Victoria University of Wellington, California Academy of Sciences, Mote Marine Laboratory, Osmania University, White Oak Conservation, Aaranyak, University of California, Davis, Museu Paraense Emílio Goeldi, University of Stirling
10 Oct 2008-Science
TL;DR: In this paper, the authors present a comprehensive assessment of the conservation status and distribution of the world's mammals, including marine mammals, using data collected by 1700+ experts, covering all 5487 species.
Abstract: Knowledge of mammalian diversity is still surprisingly disparate, both regionally and taxonomically. Here, we present a comprehensive assessment of the conservation status and distribution of the world's mammals. Data, compiled by 1700+ experts, cover all 5487 species, including marine mammals. Global macroecological patterns are very different for land and marine species but suggest common mechanisms driving diversity and endemism across systems. Compared with land species, threat levels are higher among marine mammals, driven by different processes (accidental mortality and pollution, rather than habitat loss), and are spatially distinct (peaking in northern oceans, rather than in Southeast Asia). Marine mammals are also disproportionately poorly known. These data are made freely available to support further scientific developments and conservation action.

1,383 citations


Journal ArticleDOI
TL;DR: This report reviews recommendations for sun exposure and vitamin D intake and possible caveats associated with these recommendations and also examines mechanisms whereby vitamin D synthesis and intake can be optimized.
Abstract: Given the recent spate of reports of vitamin D deficiency, there is a need to reexamine our understanding of natural and other sources of vitamin D, as well as mechanisms whereby vitamin D synthesis and intake can be optimized. This state-of-the-art report from the Drug and Therapeutics Committee of the Lawson Wilkins Pediatric Endocrine Society performs this task and also reviews recommendations for sun exposure and vitamin D intake, along with possible caveats associated with these recommendations. Pediatrics 2008;122:398–417

1,200 citations


Journal ArticleDOI
06 Mar 2008-Nature
TL;DR: It is shown that internalized adenoviral DNA induces maturation of pro-interleukin-1β in macrophages, which is dependent on NALP3 and ASC, components of the innate cytosolic molecular complex termed the inflammasome, which strengthens their central role in innate immunity.
Abstract: The innate immune system recognizes nucleic acids during infection and tissue damage. Whereas viral RNA is detected by endosomal toll-like receptors (TLR3, TLR7, TLR8) and cytoplasmic RIG-I and MDA5, endosomal TLR9 and cytoplasmic DAI bind DNA, resulting in the activation of nuclear factor-kappaB and interferon regulatory factor transcription factors. However, viruses also trigger pro-inflammatory responses, which remain poorly defined. Here we show that internalized adenoviral DNA induces maturation of pro-interleukin-1beta in macrophages, which is dependent on NALP3 and ASC, components of the innate cytosolic molecular complex termed the inflammasome. Correspondingly, NALP3- and ASC-deficient mice display reduced innate inflammatory responses to adenovirus particles. Inflammasome activation also occurs as a result of transfected cytosolic bacterial, viral and mammalian (host) DNA, but in this case sensing is dependent on ASC but not NALP3. The DNA-sensing pro-inflammatory pathway functions independently of TLRs and interferon regulatory factors. Thus, in addition to viral and bacterial components or danger signals in general, inflammasomes sense potentially dangerous cytoplasmic DNA, strengthening their central role in innate immunity.

949 citations


Journal ArticleDOI
TL;DR: In this article, the authors investigated the mechanisms controlling the bond formation among extruded polymer filaments in the fused deposition modeling (FDM) process and showed that the bonding phenomenon is thermally driven and ultimately determines the integrity and mechanical properties of the resultant prototypes.
Abstract: Purpose – The purpose of this paper is to investigate the mechanisms controlling the bond formation among extruded polymer filaments in the fused deposition modeling (FDM) process. The bonding phenomenon is thermally driven and ultimately determines the integrity and mechanical properties of the resultant prototypes.Design/methodology/approach – The bond quality was assessed through measuring and analyzing changes in the mesostructure and the degree of healing achieved at the interfaces between the adjoining polymer filaments. Experimental measurements of the temperature profiles were carried out for specimens produced under different processing conditions, and the effects on mesostructures and mechanical properties were observed. Parallel to the experimental work, predictions of the degree of bonding achieved during the filament deposition process were made based on the thermal analysis of extruded polymer filaments.Findings – Experimental results showed that the fabrication strategy, the envelope temper...
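
For background on the "degree of healing" referred to above, interfacial healing of thermoplastic filaments is often described with a reptation-based model; the generic non-isothermal form below is given only as context and is not necessarily the model used in this paper:

D_h(t) = \left[ \int_0^{t} \frac{dt'}{t_r\big(T(t')\big)} \right]^{1/4}

where T(t') is the interface temperature history and t_r(T) the temperature-dependent reptation (welding) time; D_h approaches 1 as the interface spends more time at temperatures where healing is effective, which is why the thermal analysis of the extruded filaments can predict bond quality.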

949 citations


Journal ArticleDOI
TL;DR: The results of the study suggest that the instrument is a valid, reliable, and efficient measure of the dimensions of social presence and cognitive presence, thereby providing additional support for the validity of the CoI as a framework for constructing effective online learning environments.
Abstract: This article reports on the multi-institutional development and validation of an instrument that attempts to operationalize Garrison, Anderson and Archer's Community of Inquiry (CoI) framework (2000). The results of the study suggest that the instrument is a valid, reliable, and efficient measure of the dimensions of social presence and cognitive presence, thereby providing additional support for the validity of the CoI as a framework for constructing effective online learning environments. While factor analysis supported the idea of teaching presence as a construct, it also suggested that the construct consisted of two factors: one related to course design and organization and the other related to instructor behavior during the course. The article concludes with a discussion of potential implications of further refinement of the CoI measures for researchers, designers, administrators, and instructors.

779 citations


Proceedings ArticleDOI
18 Aug 2008
TL;DR: A detailed characterization of Twitter, an application that allows users to send short messages, is presented, which identifies distinct classes of Twitter users and their behaviors, geographic growth patterns and current size of the network.
Abstract: Web 2.0 has brought about several new applications that have enabled arbitrary subsets of users to communicate with each other on a social basis. Such communication increasingly happens not just on Facebook and MySpace but on several smaller network applications such as Twitter and Dodgeball. We present a detailed characterization of Twitter, an application that allows users to send short messages. We gathered three datasets (covering nearly 100,000 users) including constrained crawls of the Twitter network using two different methodologies, and a sampled collection from the publicly available timeline. We identify distinct classes of Twitter users and their behaviors, geographic growth patterns and current size of the network, and compare crawl results obtained under rate limiting constraints.

Journal ArticleDOI
TL;DR: The theoretical basis for the Allan variance for modeling the inertial sensors' error terms and its implementation in modeling different grades of inertial sensor units are covered.
Abstract: It is well known that inertial navigation systems can provide high-accuracy position, velocity, and attitude information over short time periods. However, their accuracy rapidly degrades with time. The requirements for an accurate estimation of navigation information necessitate the modeling of the sensors' error components. Several variance techniques have been devised for stochastic modeling of the error of inertial sensors. They are basically very similar and primarily differ in that various signal-processing operations, by way of weighting functions, window functions, etc., are incorporated into the analysis algorithms in order to achieve a particular desired result for improving the model characterizations. The simplest is the Allan variance. The Allan variance is a method of representing the root mean square (RMS) random-drift error as a function of averaging time. It is simple to compute and relatively simple to interpret and understand. The Allan variance method can be used to determine the characteristics of the underlying random processes that give rise to the data noise. This technique can be used to characterize various types of error terms in the inertial-sensor data by performing certain operations on the entire length of data. In this paper, the Allan variance technique will be used in analyzing and modeling the error of the inertial sensors used in different grades of the inertial measurement units. By performing a simple operation on the entire length of data, a characteristic curve is obtained whose inspection provides a systematic characterization of various random errors contained in the inertial-sensor output data. Being a directly measurable quantity, the Allan variance can provide information on the types and magnitude of the various error terms. This paper covers both the theoretical basis for the Allan variance for modeling the inertial sensors' error terms and its implementation in modeling different grades of inertial sensors.
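
A minimal sketch of the non-overlapping Allan variance described above, in Python; the sensor data here is synthetic white noise and the variable names are illustrative:

import numpy as np

def allan_variance(y, fs, m_list):
    # Non-overlapping Allan variance of a uniformly sampled signal y (rate fs, Hz)
    # for a list of cluster sizes m; the averaging time is tau = m / fs.
    y = np.asarray(y, dtype=float)
    taus, avars = [], []
    for m in m_list:
        n_clusters = len(y) // m
        if n_clusters < 2:
            break
        # average the signal over consecutive clusters of m samples
        means = y[:n_clusters * m].reshape(n_clusters, m).mean(axis=1)
        # Allan variance: half the mean squared difference of successive cluster averages
        avars.append(0.5 * np.mean(np.diff(means) ** 2))
        taus.append(m / fs)
    return np.array(taus), np.array(avars)

# Synthetic example: white noise resembling angle/velocity random walk.
rng = np.random.default_rng(0)
signal = rng.normal(0.0, 0.01, size=100_000)
taus, avars = allan_variance(signal, fs=100.0, m_list=[1, 2, 4, 8, 16, 32, 64, 128, 256])
print(taus)
print(np.sqrt(avars))  # Allan deviation

On a log-log plot of Allan deviation versus averaging time, the slopes of the resulting characteristic curve identify the error terms (for example, a slope of about -1/2 in the random-walk region, near 0 for bias instability, and about +1/2 for rate random walk).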

Journal ArticleDOI
TL;DR: E. coli ST131 and ST405 and multidrug-resistant IncFII plasmids may determine the spread of the CTX-M-15 β-lactamase.
Abstract: We analyzed 43 CTX-M-15-producing Escherichia coli isolates and 6 plasmids encoding the blaCTX-M-15 gene from Canada, India, Kuwait, France, Switzerland, Portugal, and Spain. Most isolates belonged to phylogroups B2 (50%) and D (25%). An EC-B2 strain of clonal complex sequence type (ST) 131 was detected in all countries; other B2 isolates corresponded to ST28, ST405, ST354, and ST695 from specific areas. EC-D strains were clonally unrelated but isolates from 3 countries belonged to ST405. All CTX-M-15 plasmids corresponded to the IncFII group with overrepresentation of 3 HpaI-digested plasmid DNA profiles (A, B, and C; 85–120 kb, similarity ≥70%). Plasmid A was detected in EC-B2 strains (ST131, ST354, or ST405), plasmid C was detected in B2 and D strains, and plasmid B was confined to worldwide-disseminated ST131. Most plasmids contained blaOXA-1, aac(6')-Ib-cr, and blaTEM-1. Worldwide dissemination of CTX-M-15 seems to be determined by clonal complexes ST131 and ST405 and multidrug-resistant IncFII plasmids.

Journal ArticleDOI
TL;DR: The validity of International Classification of Diseases, 10th Revision (ICD-10) administrative hospital discharge data was generally similar to that of ICD-9-CM data, though validity differed between coding versions for some conditions.
Abstract: The World Health Organization adopted the first version of the International Classification of Diseases (ICD) in 1900 to internationally monitor and compare mortality statistics and causes of death. Since then, the classification has been revised periodically to accommodate new knowledge of disease and health. The sixth revision, published in 1949, was more radical than the previous five revisions because this edition made it possible to record information from patient charts to compile morbidity statistics. Subsequent revisions were made in 1958 (7th Edition), in 1968 (8th Edition), and in 1979 (9th Edition). The United States modified ICD-9 by specifying many categories and extending coding rubrics to describe the clinical picture in more detail. These modifications resulted in the publication of ICD-9 Clinical Modification (ICD-9-CM) in 1979 for coding diagnoses in patient charts (Commission on Professional and Hospital Activities 1986). The latest version, ICD-10, was introduced in 1992 (World Health Organization 1992). The major differences between the ICD-10 and ICD-9-CM coding systems are: (1) the tabular list in ICD-10 has 21 categories of disease compared with 19 categories in ICD-9-CM and the category of diseases of the nervous system and sense organs in ICD-9-CM is divided into three categories in ICD-10, including diseases of the nervous system, diseases of the eye and adnexa, and diseases of the ear and mastoid process; and (2) the codes in ICD-10 are alphanumeric while codes in ICD-9-CM are numeric. Each code in ICD-10 starts with a letter (i.e., A–Z), followed by two numeric digits, a decimal, and a digit (e.g., acute bronchiolitis due to respiratory syncytial virus is J21.0). In contrast, codes in ICD-9-CM begin with three digit numbers (i.e., 001–999), that are followed by a decimal and up to two digits (e.g., acute bronchiolitis due to respiratory syncytial virus is 466.11). Canada, Australia, Germany, and other countries have enhanced ICD-10 by adding more specific codes and released country-specific ICD-10 versions, such as ICD-10-Canada (ICD-10-CA; Canadian Institute for Health Information 2003). However, ICD-10-CA has maintained its comparability with ICD-10. The basic ICD-10 structure, scope, content, and definition of existing codes are not altered in ICD-10-CA. This means that none of the ICD-10 codes are relocated or deleted. ICD-10-CA mainly extends code character levels, from third and fourth levels of ICD-10 to fourth, fifth, or sixth character levels (e.g., from I15.0 for renovascular hypertension to I15.00 for benign renovascular hypertension and I15.01 for malignant renovascular hypertension). A few additions of third- and fourth-level codes were also included in ICD-10-CA in a manner consistent with the existing classification. All of these additional codes are indicated with red maple leaf symbols in ICD-10-CA coding manuals. To continuously study the health care system and investigate or monitor population health status with ICD-10 data, it is imperative to assess errors that could occur in the process of creating administrative data due to the introduction of the new coding system, ICD-10. We conducted this study to evaluate the validity of ICD-10 administrative hospital discharge data and to determine whether there were improvements in the validity compared with the validity of ICD-9-CM data. 
To achieve this aim, we reviewed randomly selected charts coded using ICD-10 at four Canadian teaching hospitals, determined the presence or absence of recorded conditions, and then separately recoded the same charts using ICD-9-CM. Then we assessed the agreement between originally coded ICD-10 administrative and chart review data, and the recoded ICD-9-CM administrative data and chart review data for recording the same conditions. This permitted us to compare the accuracy of ICD-10 data relative to the chart review data, with the accuracy of ICD-9-CM data relative to the chart review data for these conditions.
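
The structural difference between the two coding systems described above can be illustrated with simple pattern checks; the Python sketch below is purely illustrative and ignores cases such as ICD-9-CM E/V codes and longer country-specific extensions:

import re

# Simplified shape checks for the two code formats described above.
ICD10_RE = re.compile(r"^[A-Z]\d{2}(\.\d{1,2})?$")   # e.g., J21.0, I15.00
ICD9CM_RE = re.compile(r"^\d{3}(\.\d{1,2})?$")       # e.g., 466.11

def code_shape(code):
    if ICD10_RE.match(code):
        return "ICD-10-like"
    if ICD9CM_RE.match(code):
        return "ICD-9-CM-like"
    return "unrecognized"

print(code_shape("J21.0"))    # ICD-10-like
print(code_shape("466.11"))   # ICD-9-CM-like
print(code_shape("I15.00"))   # ICD-10-like (ICD-10-CA-style extension)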

Journal ArticleDOI
TL;DR: This article examined the effect of environmental regulations on trade flows and found that industries whose abatement costs increased most experienced the largest increases in net imports; for the average industry, the change in net imports ascribed to regulatory costs amounts to 10% of the total increase in trade volume over the period.
Abstract: We use theory and empirics to examine the effect of environmental regulations on trade flows. A simple model demonstrates how unobserved heterogeneity, endogeneity, and aggregation issues bias standard measurements of this relationship. A reduced-form estimate of the model, using data on U.S. regulations and trade with Canada and Mexico for 130 manufacturing industries from 1977 to 1986, indicates that industries whose abatement costs increased most experienced the largest increases in net imports. For the average industry, the change in net imports we ascribe to regulatory costs amounts to 10% of the total increase in trade volume over the period.
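
Schematically, and not as the authors' exact specification, a reduced-form estimate of this kind regresses the change in an industry's net imports on the change in its pollution abatement costs:

\Delta M_i = \beta\, \Delta C_i + \boldsymbol{\gamma}'\mathbf{x}_i + \varepsilon_i

where \Delta M_i is the change in net imports of industry i, \Delta C_i the change in its abatement costs, and \mathbf{x}_i a vector of controls; a positive estimated \beta corresponds to the finding summarized above.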

Journal ArticleDOI
10 Jan 2008-Nature
TL;DR: The data imply a common methanogenic biodegradation mechanism in subsurface degraded oil reservoirs, resulting in consistent patterns of hydrocarbon alteration, and the common association of dry gas with severely degraded oils observed worldwide.
Abstract: Biodegradation of crude oil in subsurface petroleum reservoirs has adversely affected the majority of the world's oil, making recovery and refining of that oil more costly. The prevalent occurrence of biodegradation in shallow subsurface petroleum reservoirs has been attributed to aerobic bacterial hydrocarbon degradation stimulated by surface recharge of oxygen-bearing meteoric waters. This hypothesis is empirically supported by the likelihood of encountering biodegraded oils at higher levels of degradation in reservoirs near the surface. More recent findings, however, suggest that anaerobic degradation processes dominate subsurface sedimentary environments, despite slow reaction kinetics and uncertainty as to the actual degradation pathways occurring in oil reservoirs. Here we use laboratory experiments in microcosms monitoring the hydrocarbon composition of degraded oils and generated gases, together with the carbon isotopic compositions of gas and oil samples taken at wellheads and a Rayleigh isotope fractionation box model, to elucidate the probable mechanisms of hydrocarbon degradation in reservoirs. We find that crude-oil hydrocarbon degradation under methanogenic conditions in the laboratory mimics the characteristic sequential removal of compound classes seen in reservoir-degraded petroleum. The initial preferential removal of n-alkanes generates close to stoichiometric amounts of methane, principally by hydrogenotrophic methanogenesis. Our data imply a common methanogenic biodegradation mechanism in subsurface degraded oil reservoirs, resulting in consistent patterns of hydrocarbon alteration, and the common association of dry gas with severely degraded oils observed worldwide. Energy recovery from oilfields in the form of methane, based on accelerating natural methanogenic biodegradation, may offer a route to economic production of difficult-to-recover energy from oilfields.
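
For context, the closed-system Rayleigh fractionation relation on which such box models are typically built is (generic form; the paper's specific parameters are not reproduced here):

\frac{R}{R_0} = f^{\,\alpha-1} \quad\Longleftrightarrow\quad \delta \approx \delta_0 + \varepsilon \ln f, \qquad \varepsilon \approx (\alpha - 1)\times 1000\ \text{per mil}

where R is the isotope ratio of the residual substrate, R_0 its initial value, f the fraction of substrate remaining, and \alpha the kinetic fractionation factor; progressive methanogenic degradation therefore leaves diagnostic carbon isotope shifts in the residual oil and the generated gases.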

Journal ArticleDOI
15 Aug 2008-Science
TL;DR: Results demonstrate that substorms are likely initiated by tail reconnection, and are reported on simultaneous measurements in the magnetotail at multiple distances, at the time of substorm onset.
Abstract: Magnetospheric substorms explosively release solar wind energy previously stored in Earth's magnetotail, encompassing the entire magnetosphere and producing spectacular auroral displays. It has been unclear whether a substorm is triggered by a disruption of the electrical current flowing across the near-Earth magnetotail, at approximately 10 R_E (R_E: Earth radius, or 6374 kilometers), or by the process of magnetic reconnection typically seen farther out in the magnetotail, at approximately 20 to 30 R_E. We report on simultaneous measurements in the magnetotail at multiple distances, at the time of substorm onset. Reconnection was observed at 20 R_E, at least 1.5 minutes before auroral intensification, at least 2 minutes before substorm expansion, and about 3 minutes before near-Earth current disruption. These results demonstrate that substorms are likely initiated by tail reconnection.

Journal ArticleDOI
TL;DR: It is shown that DNA is a multifaceted component of P. aeruginosa biofilms and the presence of extracellular DNA in the biofilm matrix contributes to cation gradients, genomic DNA release and inducible antibiotic resistance.
Abstract: Biofilms are surface-adhered bacterial communities encased in an extracellular matrix composed of DNA, bacterial polysaccharides and proteins, which are up to 1000-fold more antibiotic resistant than planktonic cultures. To date, extracellular DNA has been shown to function as a structural support to maintain Pseudomonas aeruginosa biofilm architecture. Here we show that DNA is a multifaceted component of P. aeruginosa biofilms. At physiologically relevant concentrations, extracellular DNA has antimicrobial activity, causing cell lysis by chelating cations that stabilize lipopolysaccharide (LPS) and the outer membrane (OM). DNA-mediated killing occurred within minutes, as a result of perturbation of both the outer and inner membrane (IM) and the release of cytoplasmic contents, including genomic DNA. Sub-inhibitory concentrations of DNA created a cation-limited environment that resulted in induction of the PhoPQ- and PmrAB-regulated cationic antimicrobial peptide resistance operon PA3552-PA3559 in P. aeruginosa. Furthermore, DNA-induced expression of this operon resulted in up to 2560-fold increased resistance to cationic antimicrobial peptides and 640-fold increased resistance to aminoglycosides, but had no effect on beta-lactam and fluoroquinolone resistance. Thus, the presence of extracellular DNA in the biofilm matrix contributes to cation gradients, genomic DNA release and inducible antibiotic resistance. DNA-rich environments, including biofilms and other infection sites like the CF lung, are likely the in vivo environments where extracellular pathogens such as P. aeruginosa encounter cation limitation.

Journal ArticleDOI
TL;DR: A comprehensive meta-analysis of the effects of cell phones on driving performance was performed, finding that the performance decrements observed in studies probably underestimate the real-world effects of mobile phone use by drivers in their own vehicles.

Proceedings ArticleDOI
09 Nov 2008
TL;DR: The CUEZILLA prototype measures the quality of new bug reports and recommends which elements should be added to improve it; the paper also makes several recommendations for better bug-tracking systems, which should focus on engaging bug reporters, better tool support, and improved handling of duplicate bugs.
Abstract: In software development, bug reports provide crucial information to developers. However, these reports widely differ in their quality. We conducted a survey among developers and users of APACHE, ECLIPSE, and MOZILLA to find out what makes a good bug report. The analysis of the 466 responses revealed an information mismatch between what developers need and what users supply. Most developers consider steps to reproduce, stack traces, and test cases as helpful, which are at the same time most difficult to provide for users. Such insight is helpful to design new bug tracking tools that guide users at collecting and providing more helpful information. Our CUEZILLA prototype is such a tool and measures the quality of new bug reports; it also recommends which elements should be added to improve the quality. We trained CUEZILLA on a sample of 289 bug reports, rated by developers as part of the survey. In our experiments, CUEZILLA was able to predict the quality of 31–48% of bug reports accurately.
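
A toy illustration in Python of the idea of scoring bug-report content follows; it simply checks for the elements named above (steps to reproduce, stack traces, test cases) and is not CUEZILLA itself, which was trained on developer-rated reports:

import re

# Toy quality heuristic: count which of the named elements appear in a report.
SIGNALS = {
    "steps to reproduce": re.compile(r"steps to reproduce|1\.\s.+\n2\.\s", re.IGNORECASE),
    "stack trace": re.compile(r"\bat\s+[\w.$]+\([\w.]+:\d+\)"),
    "test case": re.compile(r"\btest case\b|\bjunit\b", re.IGNORECASE),
}

def report_quality(text):
    found = [name for name, pattern in SIGNALS.items() if pattern.search(text)]
    return len(found) / len(SIGNALS), found

example = ("Crash on save.\n"
           "Steps to reproduce:\n1. Open a file\n2. Press save\n"
           "at org.example.Editor.save(Editor.java:42)")
score, found = report_quality(example)
print(score, found)   # roughly 0.67 plus the two matched elements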

Journal ArticleDOI
TL;DR: Better understanding of the mechanisms through which the stomach is able to resist injury in the presence of luminal irritants is helping to drive the development of safer anti-inflammatory drugs, and therapies to accelerate and improve the quality of ulcer healing.
Abstract: Except in rare cases, the stomach can withstand exposure to highly concentrated hydrochloric acid, refluxed bile salts, alcohol, and foodstuffs with a wide range of temperatures and osmolarity. This is attributed to a number of physiological responses by the mucosal lining to potentially harmful luminal agents, and to an ability to rapidly repair damage when it does occur. Since the discovery in 1971 that prostaglandin synthesis could be blocked by aspirin and other nonsteroidal anti-inflammatory drugs (NSAIDs), there has been great interest in the contribution of prostaglandins to gastric mucosal defense. Prostaglandins modulate virtually every aspect of mucosal defense, and the importance of this contribution is evident by the increased susceptibility of the stomach to injury following ingestion of an NSAID. With chronic ingestion of these drugs, the development of ulcers in the stomach is a significant clinical concern. Research over the past two decades has helped to identify some of the key events triggered by NSAIDs that contribute to ulcer formation and/or impair ulcer healing. Recent research has also highlighted the fact that the protective functions of prostaglandins in the stomach can be carried out by other mediators, in particular the gaseous mediators nitric oxide and hydrogen sulfide. Better understanding of the mechanisms through which the stomach is able to resist injury in the presence of luminal irritants is helping to drive the development of safer anti-inflammatory drugs, and therapies to accelerate and improve the quality of ulcer healing.

Journal ArticleDOI
TL;DR: It is shown that empirical evidence invalidates the chronosequence-based sequences inferred in these classic studies; evidence from studies that used non-chronosequence methods is reviewed to test the space-for-time substitution in four classic succession studies.
Abstract: Many introductory ecology textbooks illustrate succession, at least in part, by using certain classic studies (e.g. sand dunes, ponds/bogs, glacial till, and old fields) that substituted space for time (chronosequence) in determining the sequences of the succession. Despite past criticisms of this method, there is continued, often uncritical, use of chronosequences in current research on topics besides succession, including temporal changes in biodiversity, productivity, nutrient cycling, etc. To show the problem with chronosequence-based studies in general, we review evidence from studies that used non-chronosequence methods (such as long-term study of permanent plots, palynology, and stand reconstruction) to test the space-for-time substitution in four classic succession studies. In several cases, the tests have used the same locations and, in one case, the same plots as those in the original studies. We show that empirical evidence invalidates the chronosequence-based sequences inferred in these classic studies.

Proceedings ArticleDOI
06 Apr 2008
TL;DR: Current practice in Human Computer Interaction, as encouraged by educational institutes, academic review processes, and institutions with usability groups, advocates usability evaluation as a critical part of every design process.
Abstract: Current practice in Human Computer Interaction, as encouraged by educational institutes, academic review processes, and institutions with usability groups, advocates usability evaluation as a critical part of every design process. This is for good reason: usability evaluation has a significant role to play when conditions warrant it. Yet evaluation can be ineffective and even harmful if naively done 'by rule' rather than 'by thought'. If done during early stage design, it can mute creative ideas that do not conform to current interface norms. If done to test radical innovations, the many interface issues that would likely arise from an immature technology can quash what could have been an inspired vision. If done to validate an academic prototype, it may incorrectly suggest a design's scientific worthiness rather than offer a meaningful critique of how it would be adopted and used in everyday practice. If done without regard to how cultures adopt technology over time, then today's reluctant reactions by users will forestall tomorrow's eager acceptance. The choice of evaluation methodology, if any, must arise from and be appropriate for the actual problem or research question under consideration.

Journal ArticleDOI
TL;DR: Based on a review and analysis of the family business literature, this paper presents a preliminary model of the factors that prevent intra-family succession.
Abstract: Although research on management succession is a dominant topic in the family business literature, little systematic attention has been given to the factors that prevent intra-family succession from occurring. Based on a review and analysis of the literature, this article presents a preliminary model on the factors that prevent intra-family succession.

Proceedings ArticleDOI
10 May 2008
TL;DR: This paper proposes to use network analysis on dependency graphs of the entire system to identify central program units that are more likely to face defects, and finds that the recall for models built from network measures is 10 percentage points higher than for models built from complexity metrics.
Abstract: In software development, resources for quality assurance are limited by time and by cost. In order to allocate resources effectively, managers need to rely on their experience backed by code complexity metrics. But often dependencies exist between various pieces of code over which managers may have little knowledge. These dependencies can be construed as a low-level graph of the entire system. In this paper, we propose to use network analysis on these dependency graphs. This allows managers to identify central program units that are more likely to face defects. In our evaluation on Windows Server 2003, we found that the recall for models built from network measures is 10 percentage points higher than for models built from complexity metrics. In addition, network measures could identify 60% of the binaries that the Windows developers considered critical, twice as many as identified by complexity metrics.
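
A minimal Python sketch of the network-analysis idea, using networkx on a made-up dependency graph (the Windows Server 2003 data and the paper's full set of network measures are not reproduced here):

import networkx as nx

# Rank program units in a dependency graph by network centrality so that
# quality-assurance effort can focus on central units. The edge list is invented
# for illustration only.
deps = [("A.dll", "B.dll"), ("A.dll", "C.dll"), ("B.dll", "C.dll"),
        ("C.dll", "D.dll"), ("E.dll", "C.dll")]
G = nx.DiGraph(deps)

betweenness = nx.betweenness_centrality(G)   # units sitting on many dependency paths
in_degree = dict(G.in_degree())              # how many units depend on each unit

ranked = sorted(G.nodes, key=lambda n: (betweenness[n], in_degree[n]), reverse=True)
for unit in ranked:
    print(unit, round(betweenness[unit], 3), in_degree[unit])

Ranking units by such centrality measures, possibly alongside complexity metrics, is then used to prioritize defect-prone components for review and testing.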

Journal ArticleDOI
TL;DR: The application of genomics to the alkaloid field has accelerated the discovery of cDNAs encoding previously elusive biosynthetic enzymes, and technologies such as large-scale gene expression analyses and metabolic engineering approaches with transgenic plants have provided new insights into the regulatory architecture of alkaloid metabolism.
Abstract: Alkaloids represent a highly diverse group of compounds that are related only by the occurrence of a nitrogen atom in a heterocyclic ring. Plants are estimated to produce approximately 12,000 different alkaloids, which can be organized into groups according to their carbon skeletal structures. Alkaloid biosynthesis in plants involves many catalytic steps, catalyzed by enzymes that belong to a wide range of protein families. The characterization of novel alkaloid biosynthetic enzymes in terms of structural biochemistry, molecular and cell biology, and biotechnological applications has been the focus of research over the past several years. The application of genomics to the alkaloid field has accelerated the discovery of cDNAs encoding previously elusive biosynthetic enzymes. Other technologies, such as large-scale gene expression analyses and metabolic engineering approaches with transgenic plants, have provided new insights into the regulatory architecture of alkaloid metabolism.

Journal ArticleDOI
TL;DR: In RP evolving to definite SSc, microvascular damage is dynamic and sequential, while SSc-specific autoantibodies are associated with the course and type of capillary abnormalities.
Abstract: Objective To identify in patients with Raynaud's phenomenon (RP) independent markers that predict progression to definite systemic sclerosis (SSc) and to determine in patients with progression to SSc the type and sequence of microvascular damage and its relationship to SSc-specific autoantibodies. Methods Consecutive patients referred for evaluation of RP who had no definite connective tissue disease were evaluated for microvascular damage by nailfold capillary microscopy (NCM) and for anticentromere (anti–CENP-B), anti-Th/To, anti–topoisomerase I, and anti–RNA polymerase III (anti–RNAP III) autoantibodies by specific assays. Patients were studied prospectively. Results Of the 586 patients who were followed up for 3,197 person-years, 74 (12.6%) developed definite SSc. A characteristic sequence of microvascular damage was identified, starting with enlarged capillaries, followed by capillary loss, and then by capillary telangiectases. Definite SSc was diagnosed in close temporal relationship to capillary loss. Enlarged capillaries, capillary loss, and SSc-specific autoantibodies independently predicted definite SSc. Anti–CENP-B and anti-Th/To antibodies predicted enlarged capillaries; these autoantibodies and anti–RNAP III predicted capillary loss. Each autoantibody was associated with a distinct time course of microvascular damage. At followup, 79.5% of patients with 1 of these autoantibodies and abnormal findings on NCM at baseline had developed definite SSc. Patients with both baseline predictors were 60 times more likely to develop definite SSc. The data validated the proposed criteria for early SSc. Conclusion In RP evolving to definite SSc, microvascular damage is dynamic and sequential, while SSc-specific autoantibodies are associated with the course and type of capillary abnormalities. Abnormal findings on NCM at baseline together with an SSc-specific autoantibody indicate a very high probability of developing definite SSc, whereas their absence rules out this outcome.

Journal ArticleDOI
TL;DR: It is suggested that behavioral activation may be nearly as enduring as cognitive therapy and that both psychotherapies are less expensive and longer lasting alternatives to medication in the treatment of depression.
Abstract: This study followed treatment responders from a randomized controlled trial of adults with major depression. Patients treated with medication but withdrawn onto pill-placebo had more relapse through 1 year of follow-up compared to patients who received prior behavioral activation, prior cognitive therapy, or continued medication. Prior psychotherapy was also superior to medication withdrawal in the prevention of recurrence across the 2nd year of follow-up. Specific comparisons indicated that patients previously exposed to cognitive therapy were significantly less likely to relapse following treatment termination than patients withdrawn from medication, and patients previously exposed to behavioral activation did almost as well relative to patients withdrawn from medication, although the difference was not statistically significant. Differences between behavioral activation and cognitive therapy were small in magnitude and not statistically significant across the full 2-year follow-up, and each therapy was at least as efficacious as the continuation of medication. These findings suggest that behavioral activation may be nearly as enduring as cognitive therapy and that both psychotherapies are less expensive and longer lasting alternatives to medication in the treatment of depression.

Journal ArticleDOI
05 Jun 2008-BMJ
TL;DR: Alan Shiell, Penelope Hawe, and Lisa Gold explain why it is important to distinguish the two types of complexity.
Abstract: Although guidelines exist for evaluating complex interventions, they may be of little help in dealing with the multiple effects of interventions in complex systems such as hospitals. Alan Shiell, Penelope Hawe, and Lisa Gold explain why it is important to distinguish the two types of complexity.

Journal ArticleDOI
TL;DR: The relationship between the distribution in membranes, bulk partitioning to cyclohexane, and several amino acid hydrophobicity scales is discussed, and the results give detailed insight into the molecular basis of the preferred location and orientation of each side chain.