Journal ArticleDOI

Research electronic data capture (REDCap)-A metadata-driven methodology and workflow process for providing translational research informatics support

TL;DR: Research electronic data capture (REDCap) is a novel workflow methodology and software solution designed for rapid development and deployment of electronic data capture tools to support clinical and translational research.
About: This article is published in Journal of Biomedical Informatics. The article was published on 2009-04-01 and is currently open access. It has received 29,988 citations to date. The article focuses on the topics: Translational research informatics & Metadata.
Citations
Journal ArticleDOI
TL;DR: The Research Electronic Data Capture (REDCap) data management platform was developed in 2004 to address an institutional need at Vanderbilt University, shared with a limited number of adopting sites beginning in 2006, and later expanded into a broader consortium sharing and support model.

8,712 citations

Journal ArticleDOI
TL;DR: Cardiac magnetic resonance imaging revealed cardiac involvement and ongoing myocardial inflammation in patients recently recovered from coronavirus disease 2019, independent of preexisting conditions, severity and overall course of the acute illness, and time from the original diagnosis.
Abstract:
Importance: Coronavirus disease 2019 (COVID-19) continues to cause considerable morbidity and mortality worldwide. Case reports of hospitalized patients suggest that COVID-19 prominently affects the cardiovascular system, but the overall impact remains unknown.
Objective: To evaluate the presence of myocardial injury in unselected patients recently recovered from COVID-19 illness.
Design, Setting, and Participants: In this prospective observational cohort study, 100 patients recently recovered from COVID-19 illness were identified from the University Hospital Frankfurt COVID-19 Registry between April and June 2020.
Exposure: Recent recovery from severe acute respiratory syndrome coronavirus 2 infection, as determined by reverse transcription–polymerase chain reaction on swab test of the upper respiratory tract.
Main Outcomes and Measures: Demographic characteristics, cardiac blood markers, and cardiovascular magnetic resonance (CMR) imaging were obtained. Comparisons were made with age-matched and sex-matched control groups of healthy volunteers (n = 50) and risk factor–matched patients (n = 57).
Results: Of the 100 included patients, 53 (53%) were male, and the mean (SD) age was 49 (14) years. The median (IQR) time interval between COVID-19 diagnosis and CMR was 71 (64-92) days. Of the 100 patients recently recovered from COVID-19, 67 (67%) recovered at home, while 33 (33%) required hospitalization. At the time of CMR, high-sensitivity troponin T (hsTnT) was detectable (greater than 3 pg/mL) in 71 patients recently recovered from COVID-19 (71%) and significantly elevated (greater than 13.9 pg/mL) in 5 patients (5%). Compared with healthy controls and risk factor–matched controls, patients recently recovered from COVID-19 had lower left ventricular ejection fraction, higher left ventricle volumes, and raised native T1 and T2. A total of 78 patients recently recovered from COVID-19 (78%) had abnormal CMR findings, including raised myocardial native T1 (n = 73), raised myocardial native T2 (n = 60), myocardial late gadolinium enhancement (n = 32), or pericardial enhancement (n = 22). There was a small but significant difference between patients who recovered at home vs in the hospital for native T1 mapping (median [IQR], 1119 [1092-1150] ms vs 1141 [1121-1175] ms; P = .008) and hsTnT (4.2 [3.0-5.9] pg/mL vs 6.3 [3.4-7.9] pg/mL; P = .002) but not for native T2 mapping. None of these measures were correlated with time from COVID-19 diagnosis (native T1: r = 0.07; P = .47; native T2: r = 0.14; P = .15; hsTnT: r = −0.07; P = .50). High-sensitivity troponin T was significantly correlated with native T1 mapping (r = 0.33; P …).
Conclusions and Relevance: In this study of a cohort of German patients recently recovered from COVID-19 infection, CMR revealed cardiac involvement in 78 patients (78%) and ongoing myocardial inflammation in 60 patients (60%), independent of preexisting conditions, severity and overall course of the acute illness, and time from the original diagnosis. These findings indicate the need for ongoing investigation of the long-term cardiovascular consequences of COVID-19.

1,576 citations


Additional excerpts

  • ...Age, mean (SD), y: 49 (14), 48 (16), 49 (13) (COVID-19, healthy control, and risk factor–matched groups, respectively)....


  • ...LVEDV index, mean (SD), mL/m2: 86 (13), 80 (11), 76 (14) (same group order), P <....


Journal ArticleDOI
TL;DR: This primer will equip both scientists and practitioners to understand the ontology and methodology of scale development and validation, thereby facilitating the advancement of our understanding of a range of health, social, and behavioral outcomes.
Abstract: Scale development and validation are critical to much of the work in the health, social, and behavioral sciences. However, the constellation of techniques required for scale development and evaluation can be onerous, jargon-filled, unfamiliar, and resource-intensive. Further, it is often not a part of graduate training. Therefore, our goal was to concisely review the process of scale development in as straightforward a manner as possible, both to facilitate the development of new, valid, and reliable scales, and to help improve existing ones. To do this, we have created a primer for best practices for scale development in measuring complex phenomena. This is not a systematic review, but rather the amalgamation of technical literature and lessons learned from our experiences spent creating or adapting a number of scales over the past several decades. We identified three phases that span nine steps. In the first phase, items are generated and the validity of their content is assessed. In the second phase, the scale is constructed. Steps in scale construction include pre-testing the questions, administering the survey, reducing the number of items, and understanding how many factors the scale captures. In the third phase, scale evaluation, the number of dimensions is tested, reliability is tested, and validity is assessed. We have also added examples of best practices to each step. In sum, this primer will equip both scientists and practitioners to understand the ontology and methodology of scale development and validation, thereby facilitating the advancement of our understanding of a range of health, social, and behavioral outcomes.
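The evaluation phase outlined above (testing dimensionality, reliability, and validity) can be illustrated with a short, generic sketch. The simulated responses, the use of Cronbach's alpha, and the eigenvalue-greater-than-one factor count below are illustrative assumptions, not the authors' prescribed procedure.

    # Illustrative sketch of the scale-evaluation phase: reliability via
    # Cronbach's alpha and a rough dimensionality check from the eigenvalues
    # of the item correlation matrix. Data and thresholds are assumptions.
    import numpy as np

    def cronbach_alpha(items: np.ndarray) -> float:
        """items: respondents x items matrix of numeric responses."""
        k = items.shape[1]
        item_vars = items.var(axis=0, ddof=1)
        total_var = items.sum(axis=1).var(ddof=1)
        return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

    rng = np.random.default_rng(0)
    latent = rng.normal(size=(200, 1))                         # one underlying construct
    responses = latent + rng.normal(scale=0.8, size=(200, 5))  # five noisy items

    print("Cronbach's alpha:", round(cronbach_alpha(responses), 2))

    eigvals = np.linalg.eigvalsh(np.corrcoef(responses, rowvar=False))
    print("Eigenvalues > 1 (rough factor count):", int((eigvals > 1).sum()))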

1,523 citations

Journal ArticleDOI
TL;DR: All 6 DSM-5 PTSD prevalence estimates were slightly lower than their DSM-IV counterparts, although only 2 of these differences were statistically significant; DSM-5 prevalence was higher among women than among men and increased with greater traumatic event exposure.
Abstract: Prevalence of posttraumatic stress disorder (PTSD) defined according to the American Psychiatric Association's Diagnostic and Statistical Manual fifth edition (DSM-5; 2013) and fourth edition (DSM-IV; 1994) was compared in a national sample of U.S. adults (N = 2,953) recruited from an online panel. Exposure to traumatic events, PTSD symptoms, and functional impairment were assessed online using a highly structured, self-administered survey. Traumatic event exposure using DSM-5 criteria was high (89.7%), and exposure to multiple traumatic event types was the norm. PTSD caseness was determined using Same Event (i.e., all symptom criteria met to the same event type) and Composite Event (i.e., symptom criteria met to a combination of event types) definitions. Lifetime, past-12-month, and past-6-month PTSD prevalence using the Same Event definition for DSM-5 was 8.3%, 4.7%, and 3.8%, respectively. All 6 DSM-5 prevalence estimates were slightly lower than their DSM-IV counterparts, although only 2 of these differences were statistically significant. DSM-5 PTSD prevalence was higher among women than among men, and prevalence increased with greater traumatic event exposure. Major reasons individuals met DSM-IV criteria but not DSM-5 criteria were the exclusion of nonaccidental, nonviolent deaths from Criterion A and the new requirement of at least 1 active avoidance symptom.
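The Same Event and Composite Event caseness definitions in the abstract above are, in effect, two scoring rules over symptom endorsements tied to event types. The toy sketch below illustrates the distinction; the symptom clusters and the one-symptom-per-cluster threshold are simplifications for illustration, not the survey's actual DSM-5 scoring algorithm.

    # Toy illustration of "Same Event" vs "Composite Event" PTSD caseness.
    # Real DSM-5 scoring requires specific symptom counts per cluster; the
    # per-cluster threshold of 1 used here is a simplification.
    from collections import defaultdict

    CLUSTERS = ("intrusion", "avoidance", "cognition_mood", "arousal")

    def meets_criteria(cluster_counts: dict) -> bool:
        # Simplified rule: at least one endorsed symptom in every cluster.
        return all(cluster_counts.get(c, 0) >= 1 for c in CLUSTERS)

    def caseness(endorsements: list[tuple[str, str]]) -> dict:
        """endorsements: (event_type, symptom_cluster) pairs."""
        per_event = defaultdict(lambda: defaultdict(int))
        combined = defaultdict(int)
        for event, cluster in endorsements:
            per_event[event][cluster] += 1
            combined[cluster] += 1
        return {
            "same_event": any(meets_criteria(c) for c in per_event.values()),
            "composite_event": meets_criteria(combined),
        }

    example = [("assault", "intrusion"), ("assault", "avoidance"),
               ("accident", "cognition_mood"), ("accident", "arousal")]
    print(caseness(example))  # {'same_event': False, 'composite_event': True}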

1,365 citations


Cites methods from "Research electronic data capture (R..."

  • ...The Research Electronic Data Capture platform (Harris et al., 2009) was used to program the National Stressful Events Survey questionnaire and to record all survey data....


Journal ArticleDOI
TL;DR: The authors quantified the case burden of TBI across World Health Organization regions and World Bank income groups to promote advocacy, understanding, and targeted intervention; study quality was higher in high-income countries (HICs) than in low- and middle-income countries (LMICs).
Abstract:
OBJECTIVE: Traumatic brain injury (TBI), the “silent epidemic,” contributes to worldwide death and disability more than any other traumatic insult. Yet, TBI incidence and distribution across regions and socioeconomic divides remain unknown. In an effort to promote advocacy, understanding, and targeted intervention, the authors sought to quantify the case burden of TBI across World Health Organization (WHO) regions and World Bank (WB) income groups.
METHODS: Open-source epidemiological data on road traffic injuries (RTIs) were used to model the incidence of TBI using literature-derived ratios. First, a systematic review on the proportion of RTIs resulting in TBI was conducted, and a meta-analysis of study-derived proportions was performed. Next, a separate systematic review identified primary source studies describing mechanisms of injury contributing to TBI, and an additional meta-analysis yielded a proportion of TBI that is secondary to the mechanism of RTI. Then, the incidence of RTI as published by the Global...
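One plausible reading of the two-ratio model described above is that RTI incidence is first scaled by the pooled proportion of RTIs resulting in TBI, and the result is then scaled up by the pooled proportion of TBI attributable to RTI. The sketch below illustrates that reading; the function, its parameter names, and the placeholder numbers are assumptions, not figures or code from the study.

    # Sketch of estimating TBI incidence from road traffic injury (RTI) data
    # using two literature-derived proportions. All numbers are placeholders.
    def tbi_incidence(rti_incidence_per_100k: float,
                      p_tbi_given_rti: float,
                      p_rti_among_tbi: float) -> float:
        """Estimate total TBI incidence per 100,000 population."""
        tbi_from_rti = rti_incidence_per_100k * p_tbi_given_rti
        return tbi_from_rti / p_rti_among_tbi  # scale up to all mechanisms

    # Placeholder inputs (not values from the paper):
    print(tbi_incidence(rti_incidence_per_100k=1000.0,
                        p_tbi_given_rti=0.3,
                        p_rti_among_tbi=0.4))  # -> 750.0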

1,353 citations

References
Journal ArticleDOI
TL;DR: A methodology for the development of WWW applications and a tool environment specifically tailored for the methodology, based upon models and techniques already used in the hypermedia, information systems, and software engineering fields, adapted and blended in an original mix.
Abstract: This paper describes a methodology for the development of WWW applications and a tool environment specifically tailored for the methodology. The methodology and the development environment are based upon models and techniques already used in the hypermedia, information systems, and software engineering fields, adapted and blended in an original mix. The foundation of the proposal is the conceptual design of WWW applications, using HDM-lite, a notation for the specification of structure, navigation, and presentation semantics. The conceptual schema is then translated into a “traditional” database schema, which describes both the organization of the content and the desired navigation and presentation features. The WWW pages can therefore be dynamically generated from the database content, following the navigation requests of the user. A CASE environment, called Autoweb System, offers a set of software tools which assist the design and the execution of a WWW application in all its different aspects. Real-life experiences of the use of the methodology and of the AutoWeb System in both the industrial and academic context are reported.
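The core pattern described above, content and navigation stored in a database with pages generated dynamically in response to navigation requests, can be sketched in a few lines. The table layout, template, and function below are invented for illustration; they are not HDM-lite notation or the Autoweb System's implementation.

    # Minimal sketch of model-driven page generation: content lives in a
    # database and a page is rendered on demand from a navigation request.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.executescript("""
    CREATE TABLE entity (id INTEGER PRIMARY KEY, title TEXT, body TEXT);
    CREATE TABLE link   (src INTEGER, dst INTEGER, label TEXT);
    INSERT INTO entity VALUES (1, 'Home', 'Welcome'), (2, 'About', 'Project details');
    INSERT INTO link VALUES (1, 2, 'About this project');
    """)

    def render_page(entity_id: int) -> str:
        title, body = db.execute(
            "SELECT title, body FROM entity WHERE id = ?", (entity_id,)).fetchone()
        links = db.execute(
            "SELECT dst, label FROM link WHERE src = ?", (entity_id,)).fetchall()
        nav = "".join(f'<a href="/page/{dst}">{label}</a>' for dst, label in links)
        return f"<h1>{title}</h1><p>{body}</p><nav>{nav}</nav>"

    print(render_page(1))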

209 citations


"Research electronic data capture (R..." refers background or methods in this paper

  • ...and international institutions collaborating on the project; and (5) strengths and limitations of the REDCap...


  • ...double-data entry options); (5) data attribution and audit capabilities; (6) protocol document storage and sharing; (7) central data storage and backups; (8) data export functions for common statistical packages; and (9) data import functions to facilitate bulk import of data from other systems....


  • ...consent forms, analysis code) or generated export files (SAS, SPSS, R, Stata, Excel); (4) a RIGHTS table containing specific researcher rights and expiration settings; and (5) a flat DATA table used to store all collected data for the study (one row per subject with all collected data fields stored in columns)....


Journal ArticleDOI
TL;DR: WebEAV is a generic framework for Web development that is intended to streamline the process of Web application development for databases having a significant EAV component and addresses some challenging user interface issues that arise when any complex system is created.
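For context on the EAV component mentioned above: an entity-attribute-value design stores each datum as an (entity, attribute, value) row instead of one column per attribute, which suits sparse and rapidly evolving clinical forms. The sketch below is a generic illustration of that storage pattern, not the WebEAV framework's schema or API.

    # Generic entity-attribute-value (EAV) storage sketch: each observation is
    # an (entity, attribute, value) row; a conventional "wide" record is
    # rebuilt by pivoting. Table and field names are illustrative only.
    import sqlite3

    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE eav (entity TEXT, attribute TEXT, value TEXT)")
    db.executemany("INSERT INTO eav VALUES (?, ?, ?)", [
        ("patient_001", "systolic_bp", "128"),
        ("patient_001", "heart_rate", "72"),
        ("patient_002", "systolic_bp", "141"),
    ])

    def wide_record(entity: str) -> dict:
        rows = db.execute(
            "SELECT attribute, value FROM eav WHERE entity = ?", (entity,)).fetchall()
        return dict(rows)

    print(wide_record("patient_001"))  # {'systolic_bp': '128', 'heart_rate': '72'}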

68 citations


Additional excerpts

  • ...We present: (1) a brief description of the REDCap metadata-driven software toolset; (2) detail concerning the capture and use of study-related metadata from scientific research teams; (3) measures of impact for REDCap; (4) details concerning a consortium network of domestic and international institutions collaborating on the project; and (5) strengths and limitations of the REDCap system....


  • ...consent forms, analysis code) or generated export files (SAS, SPSS, R, Stata, Excel); (4) a RIGHTS table containing specific researcher rights and expiration settings; and (5) a flat DATA table used to store all collected data for the study (one row per subject with all collected data fields stored in columns)....


  • ...The following key features were identified as critical components for supporting research projects: (1) collaborative access to data across academic departments and institutions; (2) user authentication and role-based security; (3) intuitive electronic case report forms (CRFs); (4) real-time data validation, integrity checks and other mechanisms for ensuring data quality (e....


  • ...teams; (3) measures of impact for REDCap; (4) details concerning a consortium network of domestic...

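The excerpts above describe a metadata-driven toolset in which form fields, display rules, and validation rules are themselves stored as data. The sketch below illustrates that general idea with an invented field-definition format and a validator derived from it; it is not REDCap's actual data dictionary format or code.

    # Illustration of the metadata-driven idea: case report form (CRF) fields
    # are described as data, and validation is derived from that metadata.
    # The field format and rules here are invented for this sketch.
    FIELDS = [
        {"name": "age",       "type": "integer", "min": 0,   "max": 120, "required": True},
        {"name": "weight_kg", "type": "float",   "min": 0.5, "max": 400, "required": False},
        {"name": "consented", "type": "yesno",   "required": True},
    ]

    def validate(record: dict) -> list[str]:
        """Return validation errors for one submitted record."""
        errors = []
        for f in FIELDS:
            value = record.get(f["name"])
            if value is None:
                if f["required"]:
                    errors.append(f["name"] + ": missing required field")
                continue
            if f["type"] in ("integer", "float"):
                if not f["min"] <= float(value) <= f["max"]:
                    errors.append(f["name"] + ": value outside allowed range")
            elif f["type"] == "yesno" and value not in ("yes", "no"):
                errors.append(f["name"] + ": expected 'yes' or 'no'")
        return errors

    print(validate({"age": 47, "consented": "yes"}))     # []
    print(validate({"age": 300, "consented": "maybe"}))  # two errors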

Journal ArticleDOI
TL;DR: The purpose of this pilot study was to compare the cost, accuracy, and efficiency of a web-enhanced handheld computer data collection system with those of the traditional paper-based data collection and management system, and to increase researchers' awareness and knowledge of these two data collection and management methods.

54 citations


"Research electronic data capture (R..." refers background in this paper

  • ...double-data entry options); (5) data attribution and audit capabilities; (6) protocol document storage and sharing; (7) central data storage and backups; (8) data export functions for common statistical packages; and (9) data import functions to facilitate bulk import of data from other systems....


Journal ArticleDOI
TL;DR: SQLGEN uses a microcomputer database as the client to store metadata in relational form, to transiently capture server data in tables, and to allow rapid application prototyping followed by porting to client-server mode with modest effort.

15 citations


"Research electronic data capture (R..." refers background or methods in this paper

  • ...The standard REDCap project requires five distinct tables stored in a single MySQL database: (1) a METADATA table containing all study data pertaining to database storage (data field types and naming used for automatic creation of separate data storage table) and CRF presentation (form names and security levels, field-specific display and validation rules); (2) a LOGGING table used to store all information about data changes and exports; (3) a DOCS table used to store uploaded (ex....


  • ...The following key features were identified as critical components for supporting research projects: (1) collaborative access to data across academic departments and institutions; (2) user authentication and role-based security; (3) intuitive electronic case report forms (CRFs); (4) real-time data validation, integrity checks and other mechanisms for ensuring data quality (e....


  • ...teams; (3) measures of impact for REDCap; (4) details concerning a consortium network of domestic...

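Combining the fragments quoted here and in the excerpts above (a METADATA table, a LOGGING table, a DOCS table, a RIGHTS table, and a flat DATA table with one row per subject), the described project layout can be sketched as below. The column names and types are assumptions for illustration; the paper specifies the tables' roles, not this DDL, and REDCap itself uses MySQL rather than the SQLite used here for a self-contained example.

    # Sketch of the five-table project layout described in the excerpts.
    # Column names/types are illustrative assumptions, not REDCap's schema.
    import sqlite3

    ddl = """
    CREATE TABLE metadata (field_name TEXT PRIMARY KEY, form_name TEXT,
                           field_type TEXT, validation_rule TEXT, security_level TEXT);
    CREATE TABLE logging  (ts TEXT, username TEXT, action TEXT, details TEXT);
    CREATE TABLE docs     (doc_id INTEGER PRIMARY KEY, filename TEXT, contents BLOB);
    CREATE TABLE rights   (username TEXT, permission TEXT, expires TEXT);
    -- Flat data table: one row per subject, one column per collected field.
    CREATE TABLE data     (subject_id TEXT PRIMARY KEY, age INTEGER, weight_kg REAL);
    """

    db = sqlite3.connect(":memory:")
    db.executescript(ddl)
    print([row[0] for row in
           db.execute("SELECT name FROM sqlite_master WHERE type = 'table'")])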

Journal ArticleDOI
TL;DR: The National Institutes of Health has grown from a one-room bacteriology laboratory in 1887 into a multibillion-dollar medical research enterprise and is pioneering a vision that will determine what the agency should be and should do in this new “biomedical century.”
Abstract: The National Institutes of Health (NIH) has grown from a one-room bacteriology laboratory in 1887 into a multibillion-dollar medical research enterprise. While the NIH has remained true to its original mission—pursuing scientific knowledge to improve people's health—almost everything else about it has changed. Today, NIH comprises 27 Institutes and Centers that sponsor medical research in areas ranging from cancer to diabetes and from genomics to alternative medicine. With the dramatic increases in scientific knowledge as well as the significant changes in diagnosis, treatment, prevention, translation, and delivery of care, researchers must continue to meet the challenges of tomorrow by adopting new strategies today. The NIH is pioneering a vision that will determine what the agency should be and should do in this new “biomedical century.” We are focusing our research energies in three specific areas: exploiting new pathways of discovery; encouraging the formation of interdisciplinary teams of the future; and reengineering the clinical research enterprise. In the first area, we know that scientific research is cost-effective because it saves lives and money in both the short and the long run. Witness our successes in working with AIDS, coronary heart disease, and cancer. We also know that health care costs are continuing to spiral upward, partly due to the aging population which is placing more and more demands on the system. To keep up with the projected increases in costs, the research enterprise must not only accelerate the pace of discovery, but also apply its research results in a timely manner. I believe this is one of the greatest challenges we face in this biomedical century. In the second area, discoveries in human biology are occurring at unprecedented speed, presenting opportunities in medicine that we could only dream about just a few years ago. But these opportunities require a shift in how medical research is conducted and funded. While investigator-initiated projects are the mainstay of NIH-supported research, increasingly these projects involve larger multidisciplinary teams of scientists. These teams might include specialists in disciplines outside biology, such as computer science, imaging, chemistry, mathematics, and informatics. Such multidisciplinary science teams are the wave of the future. I believe that we need to break down the walls that exist between scientific disciplines, inside and outside NIH. We need to foster the growth of interdisciplinary teams in order to maximize the enormous potential of research to improve our lives. For its part, NIH is working to meet this challenge by filling leadership positions at its Institutes and Centers with outstanding scientists who have experience with this new paradigm. In the third area, we need to reengineer the national clinical research enterprise in order to most effectively translate our discoveries into clinical reality. The list of initiatives to undertake is long, but necessary. It includes supporting multidisciplinary clinical research training career paths, introducing innovations in trial design, stimulating translational research, building clinical resources like tissue banks, developing large clinical research networks, and reducing regulatory hurdles. We must also explore a standard clinical research informatics strategy, which will permit the formation of Nation-wide “communities” of clinical researchers made up of academic researchers, qualified community physicians, and patient groups. 
Alongside our dedicated efforts to pioneer a new vision for NIH is the continuing need to attract the best and brightest researchers. We in the scientific community realize the need to cultivate and nurture minority talent if scientific research is to remain a viable enterprise. Currently, we are experiencing difficulty in retaining researchers from the minority communities, especially through the doctoral levels. We must identify and attract students with potential early on and not let them “fall through the cracks” of academia. We need to consider novel approaches to encourage minority students to remain in mainstream science. Providing students with knowledgeable and supportive mentors and role models could serve as an enticement. NIH has been a leader in supporting minority students throughout their education, with numerous programs and policies in place to increase the number of minorities in research careers. As we take on the challenge of conducting scientific research in the 21st century by pursuing the NIH vision, it is absolutely critical that we find ways to recruit and retain more women and minorities in scientific careers.

7 citations


"Research electronic data capture (R..." refers background or methods in this paper

  • ...The standard REDCap project requires five distinct tables stored in a single MySQL database: (1) a METADATA table containing all study data pertaining to database storage (data field types and naming used for automatic creation of separate data storage table) and CRF presentation (form names and security levels, field-specific display and validation rules); (2) a LOGGING table used to store all information about data changes and exports; (3) a DOCS table used to store uploaded (ex....


  • ...We present: (1) a brief description of the REDCap metadata-driven software...


  • ...The following key features were identified as critical components for supporting research projects: (1) collaborative access to data across academic departments and institutions; (2) user authentication and role-based security; (3) intuitive electronic case report forms (CRFs); (4) real-time data validation, integrity checks and other mechanisms for ensuring data quality (e....
