
Showing papers by "University of Ottawa" published in 2010


Journal ArticleDOI
TL;DR: A structured summary is provided including, as applicable, background, objectives, data sources, study eligibility criteria, participants, interventions, study appraisal and synthesis methods, results, limitations, conclusions and implications of key findings.

31,379 citations


Journal ArticleDOI
05 Aug 2010-Nature
TL;DR: The results identify several novel loci associated with plasma lipids that are also associated with CAD and provide the foundation to develop a broader biological understanding of lipoprotein metabolism and to identify new therapeutic opportunities for the prevention of CAD.
Abstract: Plasma concentrations of total cholesterol, low-density lipoprotein cholesterol, high-density lipoprotein cholesterol and triglycerides are among the most important risk factors for coronary artery disease (CAD) and are targets for therapeutic intervention. We screened the genome for common variants associated with plasma lipids in >100,000 individuals of European ancestry. Here we report 95 significantly associated loci (P < 5 × 10⁻⁸), with 59 showing genome-wide significant association with lipid traits for the first time. The newly reported associations include single nucleotide polymorphisms (SNPs) near known lipid regulators (for example, CYP7A1, NPC1L1 and SCARB1) as well as in scores of loci not previously implicated in lipoprotein metabolism. The 95 loci contribute not only to normal variation in lipid traits but also to extreme lipid phenotypes and have an impact on lipid traits in three non-European populations (East Asians, South Asians and African Americans). Our results identify several novel loci associated with plasma lipids that are also associated with CAD. Finally, we validated three of the novel genes-GALNT2, PPP1R3B and TTC39B-with experiments in mouse models. Taken together, our findings provide the foundation to develop a broader biological understanding of lipoprotein metabolism and to identify new therapeutic opportunities for the prevention of CAD.

3,469 citations


Journal ArticleDOI
TL;DR: This work proposes principles for deciding saturation in theory-based interview studies, and demonstrates these principles in two studies, based on the theory of planned behaviour, designed to identify three belief categories (Behavioural, Normative and Control).
Abstract: In interview studies, sample size is often justified by interviewing participants until reaching 'data saturation'. However, there is no agreed method of establishing this. We propose principles for deciding saturation in theory-based interview studies (where conceptual categories are pre-established by existing theory). First, specify a minimum sample size for initial analysis (initial analysis sample). Second, specify how many more interviews will be conducted without new ideas emerging (stopping criterion). We demonstrate these principles in two studies, based on the theory of planned behaviour, designed to identify three belief categories (Behavioural, Normative and Control), using an initial analysis sample of 10 and stopping criterion of 3. Study 1 (retrospective analysis of existing data) identified 84 shared beliefs of 14 general medical practitioners about managing patients with sore throat without prescribing antibiotics. The criterion for saturation was achieved for Normative beliefs but not for other beliefs or studywise saturation. In Study 2 (prospective analysis), 17 relatives of people with Paget's disease of the bone reported 44 shared beliefs about taking genetic testing. Studywise data saturation was achieved at interview 17. We propose specification of these principles for reporting data saturation in theory-based interview studies. The principles may be adaptable for other types of studies.
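
The stopping rule described above is simple enough to express directly. The sketch below is illustrative only (the function name and toy data are not from the paper): analyse a fixed initial batch of interviews, then declare saturation after a run of consecutive further interviews that contribute no new belief.

```python
# A minimal sketch (not the authors' code) of the two-parameter stopping rule:
# build an initial analysis sample, then stop once `stopping_criterion` consecutive
# interviews add no new idea.

def saturation_point(ideas_per_interview, initial_sample=10, stopping_criterion=3):
    """Return the 1-based interview index at which saturation is declared,
    or None if the criterion is never met."""
    seen = set()
    run_without_new = 0
    for i, ideas in enumerate(ideas_per_interview, start=1):
        new_ideas = set(ideas) - seen
        seen |= set(ideas)
        if i <= initial_sample:
            continue  # the initial analysis sample is not counted toward the run
        run_without_new = run_without_new + 1 if not new_ideas else 0
        if run_without_new >= stopping_criterion:
            return i
    return None

# Example: interviews 11-13 add nothing new, so saturation is declared at interview 13.
interviews = [{"a", "b"}, {"b", "c"}] + [{"a"}] * 11
print(saturation_point(interviews, initial_sample=10, stopping_criterion=3))
```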

2,248 citations


Journal ArticleDOI
Josée Dupuis1, Josée Dupuis2, Claudia Langenberg, Inga Prokopenko3 +336 more (82 institutions)
TL;DR: It is demonstrated that genetic studies of glycemic traits can identify type 2 diabetes risk loci, as well as loci containing gene variants that are associated with a modest elevation in glucose levels but are not associated with overt diabetes.
Abstract: Levels of circulating glucose are tightly regulated. To identify new loci influencing glycemic traits, we performed meta-analyses of 21 genome-wide association studies informative for fasting glucose, fasting insulin and indices of beta-cell function (HOMA-B) and insulin resistance (HOMA-IR) in up to 46,186 nondiabetic participants. Follow-up of 25 loci in up to 76,558 additional subjects identified 16 loci associated with fasting glucose and HOMA-B and two loci associated with fasting insulin and HOMA-IR. These include nine loci newly associated with fasting glucose (in or near ADCY5, MADD, ADRA2A, CRY2, FADS1, GLIS3, SLC2A2, PROX1 and C2CD4B) and one influencing fasting insulin and HOMA-IR (near IGF1). We also demonstrated association of ADCY5, PROX1, GCK, GCKR and DGKB-TMEM195 with type 2 diabetes. Within these loci, likely biological candidate genes influence signal transduction, cell proliferation, development, glucose-sensing and circadian regulation. Our results demonstrate that genetic studies of glycemic traits can identify type 2 diabetes risk loci, as well as loci containing gene variants that are associated with a modest elevation in glucose levels but are not associated with overt diabetes.
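
Meta-analyses like this one typically pool each SNP's per-study effect estimates by fixed-effect inverse-variance weighting. The sketch below illustrates that calculation only; it is not the consortium's pipeline (which additionally handles imputation quality, genomic control and so on), and the numbers are invented.

```python
# A minimal sketch of fixed-effect inverse-variance meta-analysis, the standard way to
# pool one SNP's per-study effect estimates across cohorts. Illustrative only.
import math

def fixed_effect_meta(betas, ses):
    """Combine per-study effect sizes `betas` with standard errors `ses`."""
    weights = [1.0 / se ** 2 for se in ses]
    beta = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    se = math.sqrt(1.0 / sum(weights))
    z = beta / se
    p = math.erfc(abs(z) / math.sqrt(2))  # two-sided p-value from the normal tail
    return beta, se, p

# Hypothetical per-study estimates of one SNP's effect on fasting glucose:
beta, se, p = fixed_effect_meta(betas=[0.030, 0.020, 0.040], ses=[0.010, 0.012, 0.015])
print(f"pooled beta={beta:.3f}, se={se:.3f}, p={p:.1e}")
# Loci are conventionally declared genome-wide significant at p < 5e-8.
```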

2,022 citations


Journal ArticleDOI
TL;DR: This Review surveys the latest efforts in which the reduction of irreversible fouling is attempted by the modification of the membrane surface.
Abstract: Fouling is the deposition of retained particles, colloids, macromolecules, salts, etc., at the membrane surface or inside the pore at the pore wall. Fouling reduces the membrane flux either temporarily or permanently. While the initial flux can be restored by washing the membrane or by applying back-pressures to the temporarily fouled membrane, it cannot be restored when the membrane becomes permanently fouled. The main focus of this Review is on the permanent flux decline. The fouling is caused by the interaction between the membrane surface and the foulants, which include inorganic, organic, and biological substances in many different forms. The foulants not only physically interact with the membrane surface but also chemically degrade the membrane material. For example, colloidal particles, such as natural organic matter (NOM), are considered as the main reason for membrane fouling, which could be controlled by the permeation hindrance and electric double layer repulsion. The formation of biofilms with extra-cellular polymeric substances (EPSs) and microbial cells matrix is the example of biofouling.1 Biofilms are developed by the microbial cell adhesion and subsequent colonization on the membrane surfaces through EPS, which may account for 50-90% of total organic carbon. The biofouling could be minimized by periodical washing with chemicals such as sodium hypochlorite solution, but it will result in the simultaneous degradation of the membrane material’s lifetime. It is a severe problem for membranes used in pressure-driven processes such as reverse osmosis (RO), nanofiltration (NF), ultrafiltration (UF), and microfiltration (MF) and also for other membrane processes, seriously hampering the applications of membrane processes. Hence, membrane fouling as well as its reduction has been a subject of many academic studies and industrial research and development efforts since the early 1960s when industrial membrane separation processes emerged. Selection of an appropriate membrane, pretreatment of the process fluid, adjustment of operating design, and conditions are all known to control fouling to some extent. On the other hand, development of absolutely nonfouling membranes seems extremely difficult, if not totally impossible. This Review surveys the latest efforts in which the reduction of irreversible fouling is attempted by the modification of the membrane surface. The separation process by membrane is essentially a surface phenomenon. More specifically, the skin layer or top surface layer plays the vital role. Therefore, it is a natural consequence to modify membrane surface for reducing the fouling. It is generally accepted that an increase in hydrophilicity offers better fouling resistance because protein and many other foulants are hydrophobic in nature. Most nanofiltration membranes are electrically charged, which significantly reduces the scale-formation. During the past decade, the emergence of atomic force microscopy (AFM) enabled us to study the effect of the surface roughness in nanoscale on the membrane fouling. It is believed that the membrane fouling with particulate substance is enhanced by an increase in the surface roughness. It is shown in this Review that all of the above concepts, except for the membrane surface charge, are based on correlation of data, which are, at best, valid within a limited range of surface property parameters.

1,812 citations


Journal ArticleDOI
TL;DR: The evolution of CBME from the outcomes movement in the 20th century to a renewed approach that, focused on accountability and curricular outcomes and organized around competencies, promotes greater learner-centredness and de-emphasizes time-based curricular design is described.
Abstract: Although competency-based medical education (CBME) has attracted renewed interest in recent years among educators and policy-makers in the health care professions, there is little agreement on many aspects of this paradigm. We convened a unique partnership – the International CBME Collaborators – to examine conceptual issues and current debates in CBME. We engaged in a multi-stage group process and held a consensus conference with the aim of reviewing the scholarly literature of competency-based medical education, identifying controversies in need of clarification, proposing definitions and concepts that could be useful to educators across many jurisdictions, and exploring future directions for this approach to preparing health professionals. In this paper, we describe the evolution of CBME from the outcomes movement in the 20th century to a renewed approach that, focused on accountability and curricular outcomes and organized around competencies, promotes greater learner-centredness and de-emphasizes time-based curricular design. In this paradigm, competence and related terms are redefined to emphasize their multi-dimensional, dynamic, developmental, and contextual nature. CBME therefore has significant implications for the planning of medical curricula and will have an important impact in reshaping the enterprise of medical education. We elaborate on this emerging CBME approach and its related concepts, and invite medical educators everywhere to enter into further dialogue about the promise and the potential perils of competency-based medical curricula for the 21st century.

1,683 citations



Journal ArticleDOI
TL;DR: Among patients with NYHA class II or III heart failure, a wide QRS complex, and left ventricular systolic dysfunction, the addition of CRT to an ICD reduced rates of death and hospitalization for heart failure.
Abstract: Background Cardiac-resynchronization therapy (CRT) benefits patients with left ventricular systolic dysfunction and a wide QRS complex. Most of these patients are candidates for an implantable cardioverter–defibrillator (ICD). We evaluated whether adding CRT to an ICD and optimal medical therapy might reduce mortality and morbidity among such patients. Methods We randomly assigned patients with New York Heart Association (NYHA) class II or III heart failure, a left ventricular ejection fraction of 30% or less, and an intrinsic QRS duration of 120 msec or more or a paced QRS duration of 200 msec or more to receive either an ICD alone or an ICD plus CRT. The primary outcome was death from any cause or hospitalization for heart failure. Results We followed 1798 patients for a mean of 40 months. The primary outcome occurred in 297 of 894 patients (33.2%) in the ICD–CRT group and 364 of 904 patients (40.3%) in the ICD group (hazard ratio in the ICD–CRT group, 0.75; 95% confidence interval [CI], 0.64 to 0.87; P<0.001). In the ICD–CRT group, 186 patients died, as compared with 236 in the ICD group (hazard ratio, 0.75; 95% CI, 0.62 to 0.91; P = 0.003), and 174 patients were hospitalized for heart failure, as compared with 236 in the ICD group (hazard ratio, 0.68; 95% CI, 0.56 to 0.83; P<0.001). However, at 30 days after device implantation, adverse events had occurred in 124 patients in the ICD–CRT group, as compared with 58 in the ICD group (P<0.001). Conclusions Among patients with NYHA class II or III heart failure, a wide QRS complex, and left ventricular systolic dysfunction, the addition of CRT to an ICD reduced rates of death and hospitalization for heart failure. This improvement was accompanied by more adverse events. (Funded by the Canadian Institutes of Health Research and Medtronic of Canada; ClinicalTrials.gov number, NCT00251251.)

1,554 citations


Journal ArticleDOI
TL;DR: A new toxicity testing paradigm firmly based on human biology would be implemented by transitioning from current expensive and lengthy in vivo testing with qualitative endpoints to in vitro toxicity pathway assays on human cells or cell lines using robotic high-throughput screening with mechanistic quantitative parameters.
Abstract: With the release of the landmark report Toxicity Testing in the 21st Century: A Vision and a Strategy, the U.S. National Academy of Sciences, in 2007, precipitated a major change in the way toxicity testing is conducted. It envisions increased efficiency in toxicity testing and decreased animal usage by transitioning from current expensive and lengthy in vivo testing with qualitative endpoints to in vitro toxicity pathway assays on human cells or cell lines using robotic high-throughput screening with mechanistic quantitative parameters. Risk assessment in the exposed human population would focus on avoiding significant perturbations in these toxicity pathways. Computational systems biology models would be implemented to determine the dose-response models of perturbations of pathway function. Extrapolation of in vitro results to in vivo human blood and tissue concentrations would be based on pharmacokinetic models for the given exposure condition. This practice would enhance human relevance of test results, and would cover several test agents, compared to traditional toxicological testing strategies. As all the tools that are necessary to implement the vision are currently available or in an advanced stage of development, the key prerequisites to achieving this paradigm shift are a commitment to change in the scientific community, which could be facilitated by a broad discussion of the vision, and obtaining necessary resources to enhance current knowledge of pathway perturbations and pathway assays in humans and to implement computational systems biology models. Implementation of these strategies would result in a new toxicity testing paradigm firmly based on human biology.

1,398 citations



Journal ArticleDOI
TL;DR: This survey aims at providing multimedia researchers with a state-of-the-art overview of fusion strategies, which are used for combining multiple modalities in order to accomplish various multimedia analysis tasks.
Abstract: This survey aims at providing multimedia researchers with a state-of-the-art overview of fusion strategies, which are used for combining multiple modalities in order to accomplish various multimedia analysis tasks. The existing literature on multimodal fusion research is presented through several classifications based on the fusion methodology and the level of fusion (feature, decision, and hybrid). The fusion methods are described from the perspective of the basic concept, advantages, weaknesses, and their usage in various analysis tasks as reported in the literature. Moreover, several distinctive issues that influence a multimodal fusion process such as, the use of correlation and independence, confidence level, contextual information, synchronization between different modalities, and the optimal modality selection are also highlighted. Finally, we present the open issues for further research in the area of multimodal fusion.
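
The survey's distinction between feature-level (early) and decision-level (late) fusion can be summarised in a few lines. The sketch below is a generic illustration, not code from the paper; the classifiers, modalities, and weights are placeholders.

```python
# A minimal sketch contrasting two of the fusion levels discussed in the survey:
# feature-level fusion (concatenate modality features, then classify) and
# decision-level fusion (classify each modality, then combine the scores).
import numpy as np

def feature_level_fusion(audio_feat, video_feat, classifier):
    """Early fusion: one classifier sees the concatenated feature vector."""
    fused = np.concatenate([audio_feat, video_feat])
    return classifier(fused)

def decision_level_fusion(audio_feat, video_feat, audio_clf, video_clf, weights=(0.5, 0.5)):
    """Late fusion: combine per-modality scores, here by a weighted sum."""
    scores = np.array([audio_clf(audio_feat), video_clf(video_feat)])
    return float(np.dot(weights, scores))

# Toy usage with a dummy "classifier" returning a confidence score in [0, 1]:
rng = np.random.default_rng(0)
audio, video = rng.random(8), rng.random(16)
clf = lambda x: float(1 / (1 + np.exp(-x.sum() + x.size / 2)))
print(feature_level_fusion(audio, video, clf))
print(decision_level_fusion(audio, video, clf, clf, weights=(0.6, 0.4)))
```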

Journal ArticleDOI
11 Nov 2010-Nature
TL;DR: It is shown that mice lacking Mecp2 from GABA-releasing neurons recapitulate numerous Rett syndrome and autistic features, including repetitive behaviours, and that subtle dysfunction of GABAergic neurons contributes to numerous neuropsychiatric phenotypes.
Abstract: Mutations in the X-linked MECP2 gene, which encodes the transcriptional regulator methyl-CpG-binding protein 2 (MeCP2), cause Rett syndrome and several neurodevelopmental disorders including cognitive disorders, autism, juvenile-onset schizophrenia and encephalopathy with early lethality. Rett syndrome is characterized by apparently normal early development followed by regression, motor abnormalities, seizures and features of autism, especially stereotyped behaviours. The mechanisms mediating these features are poorly understood. Here we show that mice lacking Mecp2 from GABA (γ-aminobutyric acid)-releasing neurons recapitulate numerous Rett syndrome and autistic features, including repetitive behaviours. Loss of MeCP2 from a subset of forebrain GABAergic neurons also recapitulates many features of Rett syndrome. MeCP2-deficient GABAergic neurons show reduced inhibitory quantal size, consistent with a presynaptic reduction in glutamic acid decarboxylase 1 (Gad1) and glutamic acid decarboxylase 2 (Gad2) levels, and GABA immunoreactivity. These data demonstrate that MeCP2 is critical for normal function of GABA-releasing neurons and that subtle dysfunction of GABAergic neurons contributes to numerous neuropsychiatric phenotypes.

Journal ArticleDOI
TL;DR: This update of the CONSORT statement improves the wording and clarity of the previous checklist and incorporates recommendations related to topics that have only recently received recognition, such as selective outcome reporting bias.
Abstract: Overwhelming evidence shows the quality of reporting of randomised controlled trials (RCTs) is not optimal. Without transparent reporting, readers cannot judge the reliability and validity of trial findings nor extract information for systematic reviews. Recent methodological analyses indicate that inadequate reporting and design are associated with biased estimates of treatment effects. Such systematic error is seriously damaging to RCTs, which are considered the gold standard for evaluating interventions because of their ability to minimise or avoid bias. A group of scientists and editors developed the CONSORT (Consolidated Standards of Reporting Trials) statement to improve the quality of reporting of RCTs. It was first published in 1996 and updated in 2001. The statement consists of a checklist and flow diagram that authors can use for reporting an RCT. Many leading medical journals and major international editorial groups have endorsed the CONSORT statement. The statement facilitates critical appraisal and interpretation of RCTs. During the 2001 CONSORT revision, it became clear that explanation and elaboration of the principles underlying the CONSORT statement would help investigators and others to write or appraise trial reports. A CONSORT explanation and elaboration article was published in 2001 alongside the 2001 version of the CONSORT statement. After an expert meeting in January 2007, the CONSORT statement has been further revised and is published as the CONSORT 2010 Statement. This update improves the wording and clarity of the previous checklist and incorporates recommendations related to topics that have only recently received recognition, such as selective outcome reporting bias. This explanatory and elaboration document intended to enhance the use, understanding, and dissemination of the CONSORT statement-has also been extensively revised. It presents the meaning and rationale for each new and updated checklist item providing examples of good reporting and, where possible, references to relevant empirical studies. Several examples of flow diagrams are included. The CONSORT 2010 Statement, this revised explanatory and elaboration document, and the associated website (www. consort-statement. org) should be helpful resources to improve reporting of randomised trials.

Journal ArticleDOI
TL;DR: The STandards for Reporting Interventions in Clinical Trials of Acupuncture (STRICTA) were published in five journals in 2001 and 2002 and were designed to improve reporting of acupuncture trials, particularly the interventions, thereby facilitating their interpretation and replication.
Abstract: The STandards for Reporting Interventions in Clinical Trials of Acupuncture (STRICTA) were published in five journals in 2001 and 2002. These guidelines, in the form of a checklist and explanations for use by authors and journal editors, were designed to improve reporting of acupuncture trials, particularly the interventions, thereby facilitating their interpretation and replication. Subsequent reviews of the application and impact of STRICTA have highlighted the value of STRICTA as well as scope for improvements and revision. To manage the revision process a collaboration between the STRICTA Group, the CONSORT Group, and the Chinese Cochrane Centre was developed in 2008. An expert panel with 47 participants was convened that provided electronic feedback on a revised draft of the checklist. At a subsequent face-to-face meeting in Freiburg, a group of 21 participants further revised the STRICTA checklist and planned dissemination. The new STRICTA checklist, which is an official extension of CONSORT, includes six items and 17 sub-items. These set out reporting guidelines for the acupuncture rationale, the details of needling, the treatment regimen, other components of treatment, the practitioner background, and the control or comparator interventions. In addition, and as part of this revision process, the explanations for each item have been elaborated, and examples of good reporting for each item are provided. In addition, the word “controlled” in STRICTA is replaced by “clinical,” to indicate that STRICTA is applicable to a broad range of clinical evaluation designs, including uncontrolled outcome studies and case reports. It is intended that the revised STRICTA, in conjunction with both the main CONSORT Statement and extension for nonpharmacologic treatment, will raise the quality of reporting of clinical trials of acupuncture.

Journal ArticleDOI
29 Oct 2010-Science
TL;DR: Crystallographic resolution of bound carbon dioxide in a porous solid validates methods of theoretically predicting binding behavior and bodes well for the theory-aided development of amine-based CO2 sorbents.
Abstract: Understanding the molecular details of CO2-sorbent interactions is critical for the design of better carbon-capture systems. Here we report crystallographic resolution of CO2 molecules and their binding domains in a metal-organic framework functionalized with amine groups. Accompanying computational studies that modeled the gas sorption isotherms, high heat of adsorption, and CO2 lattice positions showed high agreement on all three fronts. The modeling apportioned specific binding interactions for each CO2 molecule, including substantial cooperative binding effects among the guest molecules. The validation of the capacity of such simulations to accurately model molecular-scale binding bodes well for the theory-aided development of amine-based CO2 sorbents. The analysis shows that the combination of appropriate pore size, strongly interacting amine functional groups, and the cooperative binding of CO2 guest molecules is responsible for the low-pressure binding and large uptake of CO2 in this sorbent material.

Journal ArticleDOI
TL;DR: This paper found that hedonia and eudaimonia occupy both overlapping and distinct niches within a complete picture of well-being, and their combination may be associated with the greatest wellbeing.
Abstract: Hedonia (seeking pleasure and comfort) and eudaimonia (seeking to use and develop the best in oneself) are often seen as opposing pursuits, yet each may contribute to well-being in different ways. We conducted four studies (two correlational, one experience-sampling, and one intervention study) to determine outcomes associated with activities motivated by hedonic and eudaimonic aims. Overall, results indicated that: between persons (at the trait level) and within persons (at the momentary state level), hedonic pursuits related more to positive affect and carefreeness, while eudaimonic pursuits related more to meaning; between persons, eudaimonia related more to elevating experience (awe, inspiration, and sense of connection with a greater whole); within persons, hedonia related more negatively to negative affect; between and within persons, both pursuits related equally to vitality; and both pursuits showed some links with life satisfaction, though hedonia’s links were more frequent. People whose lives were high in both eudaimonia and hedonia had: higher degrees of most well-being variables than people whose lives were low in both pursuits (but did not differ in negative affect or carefreeness); higher positive affect and carefreeness than predominantly eudaimonic individuals; and higher meaning, elevating experience, and vitality than predominantly hedonic individuals. In the intervention study, hedonia produced more well-being benefits at short-term follow-up, while eudaimonia produced more at 3-month follow-up. The findings show that hedonia and eudaimonia occupy both overlapping and distinct niches within a complete picture of well-being, and their combination may be associated with the greatest well-being.

Journal ArticleDOI
TL;DR: In this paper, performance studies of mixed matrix membranes (MMMs) for gas separation were critically reviewed, and the materials selection and preparation techniques of MMMs were also discussed.

Journal ArticleDOI
TL;DR: An external-oxidant-free process to access the isoquinolone motif via cross-coupling/cyclization of benzhydroxamic acid with alkynes is described, pointing out the important involvement of a N-O bond as a tool for C-N bond formation and catalyst turnover.
Abstract: An external-oxidant-free process to access the isoquinolone motif via cross-coupling/cyclization of benzhydroxamic acid with alkynes is described. The reaction features a regioselective cleavage of a C-H bond on the benzhydroxamic acid coupling partner as well as a regioselective alkyne insertion. Mechanistic studies point out the important involvement of a N-O bond as a tool for C-N bond formation and catalyst turnover.

Journal ArticleDOI
Emek Demir1, Emek Demir2, Michael P. Cary2, Suzanne M. Paley3, Ken Fukuda, Christian Lemer4, Imre Vastrik, Guanming Wu5, Peter D'Eustachio6, Carl F. Schaefer7, Joanne S. Luciano, Frank Schacherer, Irma Martínez-Flores8, Zhenjun Hu9, Verónica Jiménez-Jacinto8, Geeta Joshi-Tope10, Kumaran Kandasamy11, Alejandra López-Fuentes8, Huaiyu Mi3, Elgar Pichler, Igor Rodchenkov12, Andrea Splendiani13, Andrea Splendiani14, Sasha Tkachev15, Jeremy Zucker16, Gopal R. Gopinath17, Harsha Rajasimha7, Harsha Rajasimha18, Ranjani Ramakrishnan19, Imran Shah20, Mustafa H Syed21, Nadia Anwar2, Özgün Babur1, Özgün Babur2, Michael L. Blinov22, Erik Brauner23, Dan Corwin, Sylva L. Donaldson12, Frank Gibbons23, Robert N. Goldberg24, Peter Hornbeck15, Augustin Luna7, Peter Murray-Rust25, Eric K. Neumann, Oliver Reubenacker22, Matthias Samwald26, Matthias Samwald27, Martijn P. van Iersel28, Sarala M. Wimalaratne29, Keith Allen30, Burk Braun, Michelle Whirl-Carrillo31, Kei-Hoi Cheung32, Kam D. Dahlquist33, Andrew Finney, Marc Gillespie34, Elizabeth M. Glass21, Li Gong31, Robin Haw5, Michael Honig35, Olivier Hubaut4, David W. Kane36, Shiva Krupa37, Martina Kutmon38, Julie Leonard30, Debbie Marks23, David Merberg39, Victoria Petri40, Alexander R. Pico41, Dean Ravenscroft42, Liya Ren10, Nigam H. Shah31, Margot Sunshine7, Rebecca Tang30, Ryan Whaley30, Stan Letovksy43, Kenneth H. Buetow7, Andrey Rzhetsky44, Vincent Schächter45, Bruno S. Sobral18, Ugur Dogrusoz1, Shannon K. McWeeney19, Mirit I. Aladjem7, Ewan Birney, Julio Collado-Vides8, Susumu Goto46, Michael Hucka47, Nicolas Le Novère, Natalia Maltsev21, Akhilesh Pandey11, Paul Thomas3, Edgar Wingender, Peter D. Karp3, Chris Sander2, Gary D. Bader12 
TL;DR: Millions of interactions, organized into thousands of pathways, from many organisms are available from a growing number of databases, and this large amount of pathway data in a computable form will support visualization, analysis and biological discovery.
Abstract: Biological Pathway Exchange (BioPAX) is a standard language to represent biological pathways at the molecular and cellular level and to facilitate the exchange of pathway data. The rapid growth of the volume of pathway data has spurred the development of databases and computational tools to aid interpretation; however, use of these data is hampered by the current fragmentation of pathway information across many databases with incompatible formats. BioPAX, which was created through a community process, solves this problem by making pathway data substantially easier to collect, index, interpret and share. BioPAX can represent metabolic and signaling pathways, molecular and genetic interactions and gene regulation networks. Using BioPAX, millions of interactions, organized into thousands of pathways, from many organisms are available from a growing number of databases. This large amount of pathway data in a computable form will support visualization, analysis and biological discovery.
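
Because BioPAX is defined in OWL, a BioPAX document is an ordinary RDF graph and can be read with generic RDF tooling. The sketch below is one illustrative way to list pathways using the rdflib library; the file name is hypothetical, the namespace and property names are quoted from memory, and dedicated BioPAX tools (such as the Java Paxtools library) are what is normally used in practice.

```python
# A minimal sketch of loading a BioPAX Level 3 document as an RDF graph and listing
# the pathways it declares. Illustrative only; "pathway.owl" is a hypothetical file.
from rdflib import Graph, Namespace
from rdflib.namespace import RDF

BP = Namespace("http://www.biopax.org/release/biopax-level3.owl#")

g = Graph()
g.parse("pathway.owl", format="xml")  # BioPAX Level 3 files are serialized as RDF/XML

# Print every pathway instance and its display name, where one is given.
for pathway in g.subjects(RDF.type, BP.Pathway):
    for name in g.objects(pathway, BP.displayName):
        print(pathway, name)
```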

Journal ArticleDOI
TL;DR: In this paper, the authors review research on individual, peer, and school contributions that may be critical factors for enhancing efforts to address bullying among students, with an emphasis on how bullying is defined and assessed and the subsequent implications for bullying prevention and intervention program evaluation.
Abstract: In this article, the authors review research on individual, peer, and school contributions that may be critical factors for enhancing efforts to address bullying among students. Methodological challenges are delineated, with an emphasis on how bullying is defined and assessed and the subsequent implications for bullying prevention and intervention program evaluation. The impact of school-based anti-bullying programs and the challenges currently facing educators and researchers in this area are discussed. The article concludes with a proposal for a broader, ecologically based model of school bullying based on the emerging literature.

Journal ArticleDOI
TL;DR: Cholesterol lowering with rosuvastatin 40 mg did not reduce the progression of AS in patients with mild to moderate AS; thus, statins should not be used for the sole purpose of reducing the progression of AS.
Abstract: Background— Aortic stenosis (AS) is an active process with similarities to atherosclerosis. The objective of this study was to assess the effect of cholesterol lowering with rosuvastatin on the progression of AS. Methods and Results— This was a randomized, double-blind, placebo-controlled trial in asymptomatic patients with mild to moderate AS and no clinical indications for cholesterol lowering. The patients were randomized to receive either placebo or rosuvastatin 40 mg daily. A total of 269 patients were randomized: 134 patients to rosuvastatin 40 mg daily and 135 patients to placebo. Annual echocardiograms were performed to assess AS progression, which was the primary outcome; the median follow-up was 3.5 years. The peak AS gradient increased in patients receiving rosuvastatin from a baseline of 40.8±11.1 to 57.8±22.7 mm Hg at the end of follow-up and in patients with placebo from 41.6±10.9 mm Hg at baseline to 54.8±19.8 mm Hg at the end of follow-up. The annualized increase in the peak AS gradient wa...

Journal ArticleDOI
TL;DR: Given the importance of assessment and evaluation for CBME, the medical education community will need more collaborative research to address several major challenges in assessment, including “best practices” in the context of systems and institutional culture and how best to train faculty to be better evaluators.
Abstract: Competency-based medical education (CBME), by definition, necessitates a robust and multifaceted assessment system. Assessment and the judgments or evaluations that arise from it are important at the level of the trainee, the program, and the public. When designing an assessment system for CBME, medical education leaders must attend to the context of the multiple settings where clinical training occurs. CBME further requires assessment processes that are more continuous and frequent, criterion-based, developmental, work-based where possible, use assessment methods and tools that meet minimum requirements for quality, use both quantitative and qualitative measures and methods, and involve the wisdom of group process in making judgments about trainee progress. Like all changes in medical education, CBME is a work in progress. Given the importance of assessment and evaluation for CBME, the medical education community will need more collaborative research to address several major challenges in assessment, including “best practices” in the context of systems and institutional culture and how best to train faculty to be better evaluators. Finally, we must remember that expertise, not competence, is the ultimate goal. CBME does not end with graduation from a training program, but should represent a career that includes ongoing assessment.

Journal ArticleDOI
TL;DR: Complex 1 and its dicationic analogue [Cp*Rh(MeCN)3][SbF6]2 2 have been employed in the formation of indoles via the oxidative annulation of acetanilides with internal alkynes, extending the reaction class to include the synthesis of pyrroles.
Abstract: Recently, the rhodium(III) complex [Cp*RhCl2]2 1 has provided exciting opportunities for the efficient synthesis of aromatic heterocycles based on a rhodium-catalyzed C-H bond functionalization event. In the present report, complex 1 and its dicationic analogue [Cp*Rh(MeCN)3][SbF6]2 2 have been employed in the formation of indoles via the oxidative annulation of acetanilides with internal alkynes. The optimized reaction conditions allow for molecular oxygen to be used as the terminal oxidant in this process, and the reaction may be carried out under mild temperatures (60 °C). These conditions have resulted in an expanded compatibility of the reaction to include a range of new internal alkynes bearing synthetically useful functional groups in moderate to excellent yields. The applicability of the method is exemplified in an efficient synthesis of paullone 3, a tetracyclic indole derivative with established biological activity. A mechanistic investigation of the reaction, employing deuterium labeling experiments and kinetic analysis, has provided insight into issues of reactivity for both coupling partners as well as aided in the development of conditions for improved regioselectivity with respect to meta-substituted acetanilides. This reaction class has also been extended to include the synthesis of pyrroles. Catalyst 2 efficiently couples substituted enamides with internal alkynes at room temperature to form trisubstituted pyrroles in good to excellent yields. The high functional group compatibility of this reaction enables the elaboration of the pyrrole products into a variety of differentially substituted pyrroles.

Journal ArticleDOI
TL;DR: In this paper, a distributed vibration sensor was developed by using heterodyne detection and signal processing of moving averaging and moving differential for the phase optical time domain reflectometry system for detecting pencil break vibration.
Abstract: We developed a distributed vibration sensor by using heterodyne detection and signal processing of moving averaging and moving differential for the phase optical time domain reflectometry system. The broadband acoustic frequency components generated by pencil-break vibration have been measured and identified in location by our distributed vibration sensor for the first time. Pencil break measurement is a standard technique to emulate the acoustic emission of cracks in concrete or steel bridges for early crack identification. The spatial resolution is 5 m and the highest frequency response is 1 kHz, which is limited by the trigger frequency of the data acquisition card. This new sensing system can be used for vibration detection of health monitoring of various civil structures as well as any dynamic monitoring requirement.
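
The two trace-processing steps named in the abstract, moving averaging followed by a moving differential, can be sketched as below. This is an illustration of the general idea on synthetic data, not the authors' implementation; the window size, lag, and simulated vibration are arbitrary choices.

```python
# A minimal sketch of moving averaging across consecutive phase-OTDR traces (to
# suppress noise) followed by a moving differential (to expose vibration-induced
# changes along the fibre). Synthetic data, illustrative parameters.
import numpy as np

def moving_average_traces(traces, window):
    """Average `window` consecutive traces at each fibre position
    (traces has shape [n_traces, n_positions])."""
    kernel = np.ones(window) / window
    return np.array([np.convolve(traces[:, j], kernel, mode="valid")
                     for j in range(traces.shape[1])]).T

def moving_differential(avg_traces, lag=1):
    """Difference averaged traces `lag` apart; a vibrating fibre section shows up
    as a large |difference| at that position."""
    return np.abs(avg_traces[lag:] - avg_traces[:-lag])

# Synthetic example: 200 traces x 1000 fibre positions, vibration near position 600.
rng = np.random.default_rng(1)
traces = rng.normal(0.0, 0.05, size=(200, 1000))
traces[:, 600] += 0.5 * np.sin(2 * np.pi * 0.02 * np.arange(200))

diff = moving_differential(moving_average_traces(traces, window=10))
print("estimated vibration position:", int(diff.mean(axis=0).argmax()))
```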

Journal ArticleDOI
Jason Z. Liu1, Federica Tozzi2, Dawn M. Waterworth2, Sreekumar G. Pillai2, Pierandrea Muglia2, Lefkos T. Middleton3, Wade H. Berrettini4, Christopher W. Knouff2, Xin Yuan2, Gérard Waeber5, Peter Vollenweider5, Martin Preisig5, Nicholas J. Wareham6, Jing Hua Zhao6, Ruth J. F. Loos6, Ins Barroso7, Kay-Tee Khaw8, Scott M. Grundy, Philip J. Barter9, Robert W. Mahley10, Antero Kesäniemi11, Ruth McPherson12, John B. Vincent13, John Strauss13, James L. Kennedy13, Anne Farmer14, Peter McGuffin14, Richard O. Day15, Keith Matthews15, Per Bakke16, Amund Gulsvik16, Susanne Lucae17, Marcus Ising17, T. Brueckl17, S. Horstmann17, H.-Erich Wichmann18, Rajesh Rawal, Norbert Dahmen19, Claudia Lamina20, Ozren Polasek21, Lina Zgaga22, Jennifer E. Huffman22, Susan Campbell22, Jaspal S. Kooner3, John C. Chambers3, Mary Susan Burnett23, Joseph M. Devaney23, Augusto D. Pichard23, Kenneth M. Kent23, Lowell F. Satler23, Joseph M. Lindsay23, Ron Waksman23, Stephen E. Epstein23, James F. Wilson22, Sarah H. Wild22, Harry Campbell22, Veronique Vitart22, Muredach P. Reilly4, Mingyao Li4, Liming Qu4, Robert L. Wilensky4, William H. Matthai4, Hakon Hakonarson4, Daniel J. Rader4, Andre Franke24, Michael Wittig24, Arne Schäfer24, Manuela Uda25, Antonio Terracciano26, Xiangjun Xiao27, Fabio Busonero25, Paul Scheet27, David Schlessinger26, David St Clair28, Dan Rujescu18, Gonçalo R. Abecasis29, Hans J. Grabe30, Alexander Teumer30, Henry Völzke30, Astrid Petersmann30, Ulrich John30, Igor Rudan31, Igor Rudan22, Caroline Hayward22, Alan F. Wright22, Ivana Kolcic21, Benjamin J. Wright32, John R. Thompson32, Anthony J. Balmforth33, Alistair S. Hall33, Nilesh J. Samani32, Carl A. Anderson7, Tariq Ahmad, Christopher G. Mathew34, Miles Parkes, Jack Satsangi22, Mark J. Caulfield35, Patricia B. Munroe35, Martin Farrall1, Anna F. Dominiczak36, Jane Worthington, Wendy Thomson, Steve Eyre, Anne Barton, Vincent Mooser2, Clyde Francks1, Clyde Francks2, Jonathan Marchini1 
TL;DR: The Oxford-GlaxoSmithKline study (Ox-GSK) performed a genome-wide meta-analysis of SNP association with smoking-related behavioral traits and found an effect on smoking quantity at a locus on 15q25 (P = 9.45 × 10⁻¹⁹) that includes CHRNA5, CHRNA3 and CHRNB4.
Abstract: Smoking is a leading global cause of disease and mortality. We established the Oxford-GlaxoSmithKline study (Ox-GSK) to perform a genome-wide meta-analysis of SNP association with smoking-related behavioral traits. Our final data set included 41,150 individuals drawn from 20 disease, population and control cohorts. Our analysis confirmed an effect on smoking quantity at a locus on 15q25 (P = 9.45 × 10⁻¹⁹) that includes CHRNA5, CHRNA3 and CHRNB4, three genes encoding neuronal nicotinic acetylcholine receptor subunits. We used data from the 1000 Genomes project to investigate the region using imputation, which allowed for analysis of virtually all common SNPs in the region and offered a fivefold increase in marker density over HapMap2 (ref. 2) as an imputation reference panel. Our fine-mapping approach identified a SNP showing the highest significance, rs55853698, located within the promoter region of CHRNA5. Conditional analysis also identified a secondary locus (rs6495308) in CHRNA3.

Journal ArticleDOI
TL;DR: CO2 adsorption-desorption cycles and the corresponding mechanisms over an amine-supported CO2 adsorbent (TRI-PE-MCM-41) under dry conditions and in the presence of moisture are studied.
Abstract: CO2 adsorption-desorption cycles and the corresponding mechanisms over an amine-supported CO2 adsorbent (TRI-PE-MCM-41) were studied under dry conditions and in the presence of moisture (dew point of 20 °C).

Journal ArticleDOI
TL;DR: A meta-analysis of 22 studies involving 27 plant species shows a significant reduction in the proportion of seeds outcrossed in response to anthropogenic habitat modifications, and whether reproductive assurance through selfing effectively compensates for reduced outcrossing is discussed.
Abstract: There is increasing evidence that human disturbance can negatively impact plant–pollinator interactions such as outcross pollination. We present a meta-analysis of 22 studies involving 27 plant species showing a significant reduction in the proportion of seeds outcrossed in response to anthropogenic habitat modifications. We discuss the evolutionary consequences of disturbance on plant mating systems, and in particular whether reproductive assurance through selfing effectively compensates for reduced outcrossing. The extent to which disturbance reduces pollinator versus mate availability could generate diverse selective forces on reproductive traits. Investigating how anthropogenic change influences plant mating will lead to new opportunities for better understanding of how mating systems evolve, as well as of the ecological and evolutionary consequences of human activities and how to mitigate them.

Journal ArticleDOI
TL;DR: In this paper, various techniques aimed at detecting potential outliers are reviewed and these techniques are subdivided into two classes, the ones regarding univariate data and those addressing multivariate data.
Abstract: Outliers are observations or measures that are suspicious because they are much smaller or much larger than the vast majority of the observations. These observations are problematic because they may not be caused by the mental process under scrutiny or may not reflect the ability under examination. The problem is that a few outliers are sometimes enough to distort the group results (by altering the mean performance, by increasing variability, etc.). In this paper, various techniques aimed at detecting potential outliers are reviewed. These techniques are subdivided into two classes, the ones regarding univariate data and those addressing multivariate data. Within these two classes, we consider the cases where the population distribution is known to be normal, the population is not normal but known, or the population is unknown. Recommendations will be put forward in each case.
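
As a concrete illustration of the kinds of techniques reviewed, the sketch below applies three common screens: a z-score rule when the population is assumed normal, a MAD-based robust rule otherwise, and the Mahalanobis distance for multivariate data. The thresholds are conventional choices for illustration, not the paper's recommendations.

```python
# A minimal sketch of three standard outlier screens: z-scores (assumed-normal data),
# a robust MAD-based rule, and Mahalanobis distance for multivariate data.
import numpy as np
from scipy.stats import chi2

def z_score_outliers(x, threshold=3.0):
    z = (x - x.mean()) / x.std(ddof=1)
    return np.abs(z) > threshold

def mad_outliers(x, threshold=3.5):
    med = np.median(x)
    mad = np.median(np.abs(x - med))
    robust_z = 0.6745 * (x - med) / mad  # 0.6745 makes MAD comparable to sigma
    return np.abs(robust_z) > threshold

def mahalanobis_outliers(X, alpha=0.975):
    mu = X.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(X, rowvar=False))
    d2 = np.einsum("ij,jk,ik->i", X - mu, cov_inv, X - mu)  # squared distances
    return d2 > chi2.ppf(alpha, df=X.shape[1])

# Toy data with one planted outlier in each case:
rng = np.random.default_rng(2)
x = np.append(rng.normal(0, 1, 99), 8.0)
X = np.vstack([rng.normal(0, 1, (99, 2)), [[6.0, -6.0]]])
print(np.where(z_score_outliers(x))[0], np.where(mad_outliers(x))[0])
print(np.where(mahalanobis_outliers(X))[0])
```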

Journal ArticleDOI
TL;DR: In this article, the role of value added (VA) as an indicator of intellectual capital (IC) and its impact on the firm's economic, financial and stock market performance is analyzed.
Abstract: Purpose – The purpose of this paper is to analyse the role of value added (VA) as an indicator of intellectual capital (IC), and its impact on the firm's economic, financial and stock market performance. Design/methodology/approach – The value added intellectual coefficient (VAIC™) method is used on 300 UK companies divided into three groups of industries: high-tech, traditional and services. Data required to calculate the VAIC™ are obtained from the "Value Added Scoreboard" provided by the UK Department of Trade and Industry (DTI). Empirical analysis is conducted using correlation and linear multiple regression analysis. Findings – The results show that companies' IC has a positive impact on economic and financial performance. However, the association between IC and stock market performance is only significant for high-tech industries. The results also indicate that capital employed remains a major determinant of financial and stock market performance although it has a negative impact on economic perform...
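
For readers unfamiliar with the method, the sketch below computes VAIC as it is commonly formulated in this literature (the sum of human capital, structural capital and capital employed efficiencies). The figures are invented, and the paper's own variable definitions, taken from the DTI Value Added Scoreboard, may differ in detail.

```python
# A minimal sketch of the value added intellectual coefficient (VAIC) as commonly
# formulated (Pulic's three efficiency components). Illustrative figures only.

def vaic(value_added, human_capital, capital_employed):
    """VAIC = human capital efficiency + structural capital efficiency
              + capital employed efficiency."""
    hce = value_added / human_capital                  # HCE = VA / HC
    sce = (value_added - human_capital) / value_added  # SCE = SC / VA, with SC = VA - HC
    cee = value_added / capital_employed               # CEE = VA / CE
    return hce + sce + cee

# Hypothetical firm: VA = 120, employee costs (HC) = 70, capital employed = 400 (same units).
print(round(vaic(120, 70, 400), 3))  # -> 2.431
```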

Journal ArticleDOI
TL;DR: There was poor justification of choice of intervention and use of theory in implementation research in the identified studies until at least 1998, suggesting greater use of explicit theory to understand barriers, design interventions, and explore mediating pathways and moderators is needed to advance the science of implementation research.
Abstract: There is growing interest in the use of cognitive, behavioural, and organisational theories in implementation research. However, the extent of use of theory in implementation research is uncertain. We conducted a systematic review of use of theory in 235 rigorous evaluations of guideline dissemination and implementation studies published between 1966 and 1998. Use of theory was classified according to type of use (explicitly theory based, some conceptual basis, and theoretical construct used) and stage of use (choice/design of intervention, process/mediators/moderators, and post hoc/explanation). Fifty-three of 235 studies (22.5%) were judged to have employed theories, including 14 studies that explicitly used theory. The majority of studies (n = 42) used only one theory; the maximum number of theories employed by any study was three. Twenty-five different theories were used. A small number of theories accounted for the majority of theory use including PRECEDE (Predisposing, Reinforcing, and Enabling Constructs in Educational Diagnosis and Evaluation), diffusion of innovations, information overload and social marketing (academic detailing). There was poor justification of choice of intervention and use of theory in implementation research in the identified studies until at least 1998. Future research should explicitly identify the justification for the interventions. Greater use of explicit theory to understand barriers, design interventions, and explore mediating pathways and moderators is needed to advance the science of implementation research.