
Showing papers on "Electronic data capture published in 2007"


Journal ArticleDOI
TL;DR: In this paper, the authors discuss the barriers and potential solutions that can help the clinical trial industry achieve more widespread electronic data capture (EDC) use and the resulting improvement in operating efficiencies.

81 citations


Journal ArticleDOI
TL;DR: The study demonstrates that electronic patient data can be reused for prospective multicenter clinical research and patient care, and shows a need for further development of therapeutic-area standards that can facilitate researcher use of healthcare data.

76 citations


Journal ArticleDOI
TL;DR: The authors describe how interactive voice response systems (IVRS) work, the advantages and pitfalls of IVRS, and some of the lessons they have learned from administering a number of clinical studies using IVRS, so that readers can gain a better understanding of this technology and optimize its use in clinical research.

59 citations


Journal ArticleDOI
TL;DR: A system built in a research environment at the National Eye Institute (NEI), National Institutes of Health (NIH), based on the commercially available EMR from Next-Gen, Inc. is summarized.
Abstract: The electronic medical record (EMR), sometimes called electronic health record (EHR) or electronic patient record (EPR), has become one of the most important new technologies in healthcare. Electronic storage and exchange of health-care information has been of interest worldwide for years, but recent reports on medical error rates and national mandates for conversion from handwritten documents have heightened its importance. There have been many published EMR implementations in many different subspecialties. Most focus on improvements in efficiency, patient experience, and care. The Medical Records Institute has published the results of their survey of EHR usage,1 and it demonstrates that usage has been predominately in the ambulatory care sector, with a focus on clinical workflow and quality of care. Few published articles have focused on systems designed for use in ophthalmology, despite a history of the use of such systems dating back to Alcon’s IVY system in the 1990s (Alcon, Ltd., Fort Worth, TX). Published reports from users in the field of ophthalmology have all focused on custom systems, because no commercial product was suitable for their purposes.2 However, commercial systems are widely used and DeBry3 has provided a good overview of the systems available as well as the considerations that are important for individual practices. The focus of these reports is on cost savings, time savings, and quality of care. Concurrent with the increased use of EMR for clinical care, there has been an increase in the use of electronic systems for capture of data in clinical research and clinical trials.4 Many trials still require researchers to enter data manually on both a paper clinical record and a paper case report form (CRF), from which it is transcribed into an electronic database. With this process data are transcribed twice, once from the encounter note to the CRF, and again into the database, providing multiple potential sources for error. 
Databases tend to be developed independently for each study, making cross-study research difficult and therefore infrequent. Retrospective studies require review of the paper record and recompiling the data of interest. As clinical trials have become more complicated, expensive, regulated, and increasingly monitored, clinical research organizations (CROs) and research coordinating centers have expanded and evolved, especially in terms of data collection and monitoring.5 With the advent of the internet, CROs and coordinating centers have moved to electronic data capture (EDC),6,7 which enables data entry for clinical studies to be performed on-site in real time by the researcher. This eliminates one transcription step, but still keeps data in discrete databases and provides no method for retrospective studies. With the increase in EMR implementation, there has been a subsequent increase in interest in using these systems for clinical research and clinical trials. As researchers have had to enter data into two electronic systems, EMR and EDC, speculation on the feasibility of merging these two technologies has begun.8,9 Bleicher8 and the document from the eClinical Forum (eCF)9 discuss some of the different ways that these two systems can be interfaced. In this report, we summarize a system we have built in a research environment at the National Eye Institute (NEI), National Institutes of Health (NIH). Both the eCF and Bleicher8,9 discuss the need for using the investigator as the single point of data entry and the need for electronic data transfer to a data-coordinating center to eliminate transcription errors. The data collection system would have to be highly customizable, to allow for the inclusion of elements for specific clinical trials. In addition, with all the data collected in a single system, it would allow for retrospective studies including data from all patients. We chose to build such a system based on the commercially available EMR from Next-Gen, Inc. (Horsham, PA; formerly MicroMed). In use at the NEI since 2002, the system has become the universal method of clinical research data collection in the outpatient clinic and currently contains data from almost 30,000 patient visits. Although the system has provided developmental challenges, we have implemented it as both a record of patient care and a platform for prospective and retrospective studies.

42 citations


Journal ArticleDOI
TL;DR: A computer-based automated menu-driven system with 658 data fields was developed for a cohort study of women aged 65 years or older, diagnosed with invasive histologically confirmed primary breast cancer, at 6 Cancer Research Network sites; an innovative modified version of the EDC permitted an automated evaluation of inter-rater and intra-rater reliability across the six data collection sites.
Abstract: The choice between paper data collection methods and electronic data collection (EDC) methods has become a key question for clinical researchers. There remains a need to examine potential benefits, efficiencies, and innovations associated with an EDC system in a multi-center medical record review study. A computer-based automated menu-driven system with 658 data fields was developed for a cohort study of women aged 65 years or older, diagnosed with invasive histologically confirmed primary breast cancer (N = 1859), at 6 Cancer Research Network sites. Medical record review with direct data entry into the EDC system was implemented. An inter-rater and intra-rater reliability (IRR) system was developed using a modified version of the EDC. Automation of EDC accelerated the flow of study information and resulted in an efficient data collection process. Data collection time was reduced by approximately four months compared to the project schedule, and funded time available for manuscript preparation increased by 12 months. In addition, an innovative modified version of the EDC permitted an automated evaluation of inter-rater and intra-rater reliability across six data collection sites. Automated EDC is a powerful tool for research efficiency and innovation, especially when multiple data collection sites are involved.
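The automated inter-rater reliability evaluation described above reduces, at its core, to an agreement statistic computed over paired abstractions of the same records. A minimal sketch using Cohen's kappa (the field values and rater data below are hypothetical, not taken from the study):

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters coding the same sequence of fields."""
    assert len(rater_a) == len(rater_b)
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    # Chance agreement: product of marginal frequencies per category
    expected = sum(freq_a[c] * freq_b[c] for c in freq_a) / (n * n)
    return (observed - expected) / (1 - expected)

# Two abstractors coding the same ten medical-record fields (hypothetical data)
a = ["yes", "yes", "no", "no", "yes", "no", "yes", "yes", "no", "yes"]
b = ["yes", "yes", "no", "yes", "yes", "no", "yes", "no", "no", "yes"]
print(round(cohens_kappa(a, b), 3))   # 0.583
```

In a multi-site study, the same statistic would be computed per site and per field to flag abstractors or forms with low agreement.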

40 citations


Journal ArticleDOI
20 Feb 2007-Trials
TL;DR: A study comparing the accuracy of handheld computer data capture with more traditional paper-based methods found unacceptably high error rates with handheld computers.
Abstract: The use of handheld computers in medicine has increased in the last decade, and they are now used in a variety of clinical settings. There is an underlying assumption that electronic data capture is more accurate than paper-based methods, but this assumption has rarely been tested. This report documents a study to compare the accuracy of handheld computer data capture versus more traditional paper-based methods. Clinical nurses involved in a randomised controlled trial collected patient information on a handheld computer in parallel with a paper-based data form. Both sets of data were entered into an Access database, and the handheld computer data were compared to the paper-based data for discrepancies. The error rate from the handheld computers was 67.5 errors per 1000 fields, compared to the accepted error rate of 10 per 10,000 fields for paper-based double data entry. Error rates were highest in fields containing a default value. While popular with staff, handheld computers produced unacceptably high error rates. Training and ongoing monitoring are needed if handheld computers are to be used for clinical data collection.
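The error-rate comparison in this study amounts to a field-by-field reconciliation of two parallel datasets. A minimal sketch of that computation (the record layout and values are hypothetical):

```python
def error_rate_per_1000(records_a, records_b):
    """Compare two parallel record lists field by field; errors per 1000 fields."""
    fields = 0
    discrepancies = 0
    for rec_a, rec_b in zip(records_a, records_b):
        for key in rec_a:
            fields += 1
            if rec_a[key] != rec_b.get(key):
                discrepancies += 1
    return 1000 * discrepancies / fields

# Hypothetical parallel entries: handheld vs. paper transcription
handheld = [{"hr": 72, "bp": "120/80", "temp": 36.8},
            {"hr": 80, "bp": "130/85", "temp": 37.1}]
paper    = [{"hr": 72, "bp": "120/80", "temp": 36.8},
            {"hr": 88, "bp": "130/85", "temp": 37.1}]
print(error_rate_per_1000(handheld, paper))   # 1 discrepancy in 6 fields
```

Against a gold standard, the same routine would yield the per-1000-field rates the study reports; without one, it only measures disagreement between the two capture methods.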

37 citations


Journal ArticleDOI
TL;DR: Usability, the ultimate goal of recording and managing patient data, requires, besides technical considerations, appropriate methodology for medical data management, especially if data are intended to be used for multiple purposes, e.g. for patient care, quality management, and clinical research.
Abstract: Objectives: To summarize background, challenges, objectives, and methods for the usability of patient data, in particular with respect to their multiple use, and to point out how to teach medical data management. Methods: Analyzing the literature, providing an example based on Simpson's paradox, and summarizing research and education in the field of medical data management, or health information management (in German: Medizinische Dokumentation). Results: For the multiple use of patient data, three main categories of use can be identified: patient-oriented (or casuistic) analysis, patient-group reporting, and analysis for clinical studies. A so-called documentation protocol, related to study plans in clinical trials, supports the multiple use of data from the electronic health record in order to obtain valid, interpretable results. Lectures on medical data management may contain modules on introduction, basic concepts of clinical data management and coding systems, important medical coding systems (e.g. ICD, SNOMED, TNM, UMLS), typical medical documentation systems (e.g. on patient records, clinical and epidemiological registers), utilization of clinical data management systems, planning of medical coding systems and of clinical data management systems, hospital information systems and the electronic patient record, and data management in clinical studies. Conclusion: Usability, the ultimate goal of recording and managing patient data, requires, besides technical considerations, appropriate methodology for medical data management, especially if data are intended to be used for multiple purposes, e.g. for patient care, quality management, and clinical research. Medical data management should be taught in health and biomedical informatics programs.
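The Simpson's-paradox example the authors use to motivate careful data management can be shown numerically. This sketch uses the widely cited kidney-stone treatment counts, which are purely illustrative here: treatment A beats treatment B within each subgroup, yet B looks better when the data are pooled.

```python
def rate(successes, total):
    """Success proportion for a (successes, total) pair."""
    return successes / total

# Illustrative two-subgroup data (classic kidney-stone counts):
# subgroup 1: A 81/87 vs B 234/270 ; subgroup 2: A 192/263 vs B 55/80
a1, a2 = (81, 87), (192, 263)
b1, b2 = (234, 270), (55, 80)

assert rate(*a1) > rate(*b1)   # A better in subgroup 1
assert rate(*a2) > rate(*b2)   # A better in subgroup 2

# Pooled across subgroups, the ordering reverses: the paradox
a_pooled = (a1[0] + a2[0], a1[1] + a2[1])   # (273, 350)
b_pooled = (b1[0] + b2[0], b1[1] + b2[1])   # (289, 350)
assert rate(*a_pooled) < rate(*b_pooled)
```

This is exactly why the documentation-protocol idea matters: without recording the stratifying variable (here, the subgroup), a pooled reuse of patient data can support the wrong conclusion.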

34 citations


Proceedings ArticleDOI
22 Oct 2007
TL;DR: The paper develops an application that allows trial chairmen to design their trial, and especially the required data management system, with comprehensive metadata according to their needs, integrating a clinical trial ontology into the design process; the resulting data management systems are integrated into a European biomedical Grid for cancer research.
Abstract: Data management in post-genomic clinical trials is the process of collecting and validating clinical and genomic data with the goal to answer research questions and to preserve it for future scientific investigation. Comprehensive metadata describing the semantics of the data are needed to leverage it for further research like cross-trial analysis. Current clinical trial management systems mostly lack sufficient metadata and are not semantically interoperable. This paper outlines our approach to develop an application that allows trial chairmen to design their trial and especially the required data management system with comprehensive metadata according to their needs, integrating a clinical trial ontology into the design process. To demonstrate the built-in interoperability of data management systems developed in this way, we integrate these applications into a European biomedical Grid for cancer research in a way that the research data collected in the data management systems can be seamlessly analyzed and mined by researchers.

29 citations


Journal ArticleDOI
01 Mar 2007
TL;DR: This paper presents a Web-based EDC system designed to support a small-scale research-oriented clinical trial for establishing standard operating procedures (SOP) for electrochemotherapy with a new medical device, named Cliniporator, and shares the authors' experience in exploiting Internet and Web technologies to improve clinical trial data collection, follow-up, and data analysis.
Abstract: Many branches of the healthcare industry are being influenced by information and communication technology (ICT). Clinical trials are not an exception. Despite this fact, more than 75% of clinical trials data are still collected on paper records. Recent ICT advances, such as the broad acceptance of Internet technology, which is rapidly improving electronic data collection (EDC) tools, may soon reduce this percentage of "paper" supported clinical trials. In this paper, we present our Web-based EDC system designed to support a small-scale research-oriented clinical trial for establishing standard operating procedures (SOP) for electrochemotherapy with a new medical device, named Cliniporator. The definition of the SOP can only be based on a comprehensive analysis of collected data and the results of the clinical trial. Therefore, it is necessary to record treatment efficiency and, in this respect, to carefully follow and collect treatment parameters. We thus established a central database and a Web application for filling the database with data submitted by users from distant medical centers across Europe. We also enabled transmission of data stored on the local Cliniporator medical devices to the central database, as well as submission of tumor images and marking of tumor nodules on an interactive human map developed in Macromedia Flash. We provided users with dynamically generated basic statistics and, several times during the data collection process, performed statistical data analysis. In order to assure high quality of data in the database, we included several mechanisms: automatic data validation, digital signatures, a form completeness notification system, e-mail alerting of completed forms, and "check tables." After 13 months of using the system, we performed a simple usability evaluation by asking users to answer a questionnaire, and here we present the results. With this paper, we try to share our experience and encourage others to exploit Internet and Web technologies to improve clinical trials data collection, follow-up, and data analysis.
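Of the quality mechanisms this abstract lists, automatic data validation is the most mechanical: each field gets a requiredness flag and a logic check applied at entry time. A minimal sketch, in which the field names, ranges, and rules are hypothetical, not those of the Cliniporator system:

```python
# Hypothetical validation rules keyed by field name:
# (required?, check function, error message for a failed check)
RULES = {
    "patient_id":    (True,  lambda v: bool(v.strip()),          "patient_id must not be blank"),
    "tumor_size_mm": (True,  lambda v: 0 < float(v) < 500,       "tumor size out of range"),
    "voltage_v":     (False, lambda v: 100 <= float(v) <= 1500,  "voltage out of device range"),
}

def validate(form):
    """Return a list of validation errors; an empty list means the form passes."""
    errors = []
    for field, (required, check, message) in RULES.items():
        value = form.get(field, "")
        if not value:
            if required:
                errors.append(f"{field}: required field is missing")
            continue
        try:
            ok = check(value)
        except ValueError:          # e.g. non-numeric text in a numeric field
            ok = False
        if not ok:
            errors.append(f"{field}: {message}")
    return errors

print(validate({"patient_id": "EU-0042", "tumor_size_mm": "12"}))   # []
print(validate({"patient_id": "", "tumor_size_mm": "9000", "voltage_v": "50"}))
```

A completeness-notification system like the one described is then just a scheduled scan reporting every stored form for which validate() is non-empty.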

29 citations


Patent
21 Mar 2007
TL;DR: In this paper, a method and a system for capturing drug administration data in a medical environment is presented, where patient and drug identifiers are inputted into a handheld device comprising a scanner, a processor and a data storage device.
Abstract: A method and a system for capturing drug administration data in a medical environment. Patient and drug identifiers are inputted into a handheld device comprising a scanner, a processor and a data storage device. The patient identifiers and the drug identifiers are transmitted to a server. The patient and drug identifiers are correlated to show patient, doctor, hospital and geographic drug administration correlations.

28 citations


Journal ArticleDOI
TL;DR: A remote data entry (RDE) module was successfully integrated within a Web-based telemedicine system in a German multi-centric research network for a rare disease called Epidermolysis Bullosa.
Abstract: Objectives: Today’s increasing specialization of medicine necessitates the integration of telematic platforms for cross-institutional cooperation. In order to leverage the strengths of each cooperating institution a centralized unified storage using shared electronic patient records (EPRs) as well as secured remote data entry capabilities for supporting collaborative clinical research and care is essential. The objective of this project was to develop and introduce into routine use such a shared remote data entry (RDE) platform for the German multicentric Epidermolysis Bullosa (EB) network. Methods: An existing telematic application was extended by a remote data entry (RDE) module enabling the storage of structured data and pedigrees. HL7 Clinical Document Architecture (CDA) was chosen as a suitable standardized storage format that provides flexibility and future interoperability. Flexible data entry forms adaptable to distinct medical domains were implemented using XML-based technologies. Results: A flexible and comprehensive EPR/RDE platform was successfully implemented in the German EB network. A set of specific data entry forms was created to fully cover the network’s documentation needs. The platform has been in productive use since 2005. Conclusions: Standardized documentation by using HL7 CDA to store the medical research data as an EPR can contribute to high interoperability and an easier integration of heterogeneous health care information systems. Existing XML technologies offer a high degree of flexibility and adaptability to distinct medical domains. The ongoing development of standards (e.g. HL7 CDA R2) and interfaces (CDA/CDISC bridge) could further improve semantic and syntactic interoperability.

Proceedings ArticleDOI
08 Mar 2007
TL;DR: A Data Grid architecture for imaging-based clinical trials is presented and a prototype has been implemented in the Image Processing and Informatics Laboratory at the University of Southern California to test and evaluate performance in storing trial images and analysis results for a clinical trial.
Abstract: Clinical trials play a crucial role in testing new drugs or devices in modern medicine. Medical imaging has also become an important tool in clinical trials because images provide a unique and fast diagnosis with visual observation and quantitative assessment. A typical imaging-based clinical trial consists of: 1) A well-defined rigorous clinical trial protocol, 2) a radiology core that has a quality control mechanism, a biostatistics component, and a server for storing and distributing data and analysis results; and 3) many field sites that generate and send image studies to the radiology core. As the number of clinical trials increases, it becomes a challenge for a radiology core servicing multiple trials to have a server robust enough to administrate and quickly distribute information to participating radiologists/clinicians worldwide. The Data Grid can satisfy the aforementioned requirements of imaging based clinical trials. In this paper, we present a Data Grid architecture for imaging-based clinical trials. A Data Grid prototype has been implemented in the Image Processing and Informatics (IPI) Laboratory at the University of Southern California to test and evaluate performance in storing trial images and analysis results for a clinical trial. The implementation methodology and evaluation protocol of the Data Grid are presented.

01 Jan 2007
TL;DR: An ontology-based system for the design and integration of clinical trial data management in a convenient and flexible way and the usability of the system is evaluated using the test case of a specific pilot trial on medical devices.
Abstract: We developed an ontology-based system for the design and integration of clinical trial data management in a convenient and flexible way. A reference ontology serves as basis for both, the generation of clinical trial databases and the integration of data from various data sources into this database. We evaluated the usability of the system using the test case of a specific pilot trial on medical devices.

Book ChapterDOI
01 Jan 2007
TL;DR: Data management includes the entire spectrum from data collection and entry to data analysis and reporting; the central themes remain conducting the trial according to good clinical practices, carrying out the study according to protocol, and treatment and assessment of the participant according to protocol.
Abstract: Publisher Summary Data management includes the entire spectrum from data collection and entry to data analysis and reporting. The data collected in a clinical trial constitute an accounting of the trial. Rules and guidelines that govern research include the Code of Federal Regulations, the Good Clinical Practices (GCPs) guidelines from the International Conference on Harmonisation, state laws, sponsor standard operating procedures (SOPs), and institutional SOPs. The GCPs are an international ethical and scientific quality standard for clinical trial conduct. A trial conducted under good clinical practices is the basis for demonstrating that the trial was conducted according to protocol. Even as automated systems are employed to facilitate clinical trial data management, the central themes remain: the trial conducted according to good clinical practices, carrying out the study according to protocol, and treatment and assessment of the participant according to protocol. These issues should always be raised and affirmed to assure the integrity of the research and the protection of the safety of the participant. Plans for data management should be set up early during the development phase of a clinical trial. Included in the plan are the appropriate mix of research personnel and resources such as staff time, workspace, computer equipment, and secure storage facilities for both paper and electronic equipment.

Journal Article
TL;DR: The successful implementation and evaluation of the Data Grid for imaging-based clinical trials provide three major benefits: an understanding of the methodology for using data grid technology and high speed networks in clinical trials; an establishment of the performance benchmarks of Data Grid over high speed networks; and a Data Grid test bed for performing worldwide imaging-based clinical trials.
Abstract: Clinical trials play a crucial role in testing new drugs or devices in modern clinical practice. Medical imaging has become an important tool in clinical trials because images provide a unique and fast diagnosis with visual observation and quantitative assessment. A typical imaging-based clinical trial consists of: (1) A well-defined rigorous clinical trial protocol, (2) a radiology core that has a quality control mechanism, a biostatistics component, and a server for storing and distributing data and analysis results; and (3) many field sites that generate and send clinical trial image studies to the radiology core. With the ever-increasing number of clinical trials, it becomes a great challenge for a radiology core which handles multiple clinical trials to have a server robust enough to administrate multiple trials as well as satisfy the requirements to quickly distribute information to participating radiologists/clinicians worldwide to assess trials' results. The Data Grid, a grid computing technology, can satisfy the aforementioned requirements of imaging-based clinical trials. In this paper, we present a Data Grid architecture for worldwide imaging-based clinical trials. The Data Grid testbed has been set up at three international sites: the Image Processing and Informatics (IPI) Laboratory at the University of Southern California, USA; the Hong Kong Polytechnic University; and InCor (Heart Institute) at Sao Paulo, Brazil. The three chosen sites are connected with high speed international networks including the Internet2, the HARNET (Hong Kong Academic and Research Network), and the Brazilian National Research and Education Network (RNP2). The concept, design and implementation of the Data Grid are presented. The open-source grid computing software Globus Toolkit 4.0 and DICOM technology were used to implement the DICOM-compliant Data Grid. This paper also describes results of using the Data Grid in imaging-based clinical trials for fault-tolerant image data backup. The successful implementation and evaluation of the Data Grid for imaging-based clinical trials provide three major benefits: (1) an understanding of the methodology for using data grid technology and high speed networks in clinical trials; (2) an establishment of the performance benchmarks of Data Grid over high speed networks; and (3) a Data Grid test bed for performing worldwide imaging-based clinical trials.

Journal ArticleDOI
TL;DR: An XML-based flexible clinical trials data management framework in a .NET environment that can be used for efficient design and deployment of eCRFs to collate data and analyze information from multi-site clinical trials.
Abstract: Clinical trials involve multi-site heterogeneous data generation with complex data input formats and forms. The data should be captured and queried in an integrated fashion to facilitate further analysis. Electronic case report forms (eCRF) are gaining popularity since they allow capture of clinical information in a rapid manner. We have designed and developed an XML-based flexible clinical trials data management framework in a .NET environment that can be used for efficient design and deployment of eCRFs to collate data and analyze information from multi-site clinical trials. The main components of our system include an XML form designer, a patient registration eForm, reusable eForms, multiple-visit data capture, and consolidated reports. A unique id is used for tracking the trial, the site of occurrence, the patient, and the year of recruitment. Availability: http://www.rfdn.org/bioinfo/CTMS/ctms.html.
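A unique id that encodes trial, site, patient, and recruitment year is typically a structured composite key that can be both generated and parsed. A minimal sketch; the ID layout below is a hypothetical format, not the one the framework actually uses:

```python
import re

# Hypothetical ID layout: TRIAL-Ssite-YEAR-PATIENT, e.g. "HTN01-S03-2007-0421"
ID_PATTERN = re.compile(
    r"^(?P<trial>[A-Z0-9]+)-S(?P<site>\d{2})-(?P<year>\d{4})-(?P<patient>\d{4})$"
)

def make_id(trial, site, year, patient):
    """Build a composite subject id from its four components."""
    return f"{trial}-S{site:02d}-{year}-{patient:04d}"

def parse_id(uid):
    """Split a composite id back into (trial, site, year, patient)."""
    m = ID_PATTERN.match(uid)
    if not m:
        raise ValueError(f"malformed id: {uid}")
    d = m.groupdict()
    return d["trial"], int(d["site"]), int(d["year"]), int(d["patient"])

uid = make_id("HTN01", 3, 2007, 421)
print(uid)              # HTN01-S03-2007-0421
print(parse_id(uid))    # ('HTN01', 3, 2007, 421)
```

Keeping the format strict (fixed-width, validated by one regex) is what makes such ids safe to use as join keys when consolidating multi-site, multi-visit data.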

Patent
12 Mar 2007
TL;DR: In this paper, a software method and/or system is provided which may extract clinical data directly from an electronic medical record, which may be used to encourage patient compliance with his prescribed medication.
Abstract: A software method and/or system is provided which may extract clinical data directly from an electronic medical record. This data may be used to encourage patient compliance with his prescribed medication. The data may also be used for various market or clinical research purposes.

Patent
Jason Parker1
14 Feb 2007
TL;DR: In this article, a data management process is provided that manages ECG data collection, review, and reporting in an efficient and secure manner by utilizing biometric checks, quality control measures, and workflow control systems that route data based on a variety of factors.
Abstract: According to various embodiments of the present invention, a data management process is provided that manages ECG data collection, review, and reporting in an efficient and secure manner by utilizing biometric checks, quality control measures, and workflow control systems that route data based on a variety of factors to efficiently complete the measurement and analysis process.


Journal Article
Gao Wei-min1
TL;DR: Electronic data capture may realize cost savings and improvements in efficiency and data quality, as well as address some problems in paper-based data capture (PDC) clinical trials.
Abstract: Electronic data capture (EDC) is a technique for collecting clinical trial data in such a way that they are delivered to the sponsor in electronic form instead of on paper. Web-enabled EDC will become the fundamental mode of clinical trial data capture in the future. Source data for EDC include paper source data and electronic source data. Source data verification (SDV) of the former consists of evaluating the conformity of the data presented in the e-CRF with the source data. Although electronic source data eliminate the need for SDV, strict logic checks at the time of data entry and investigation of audit trails are needed to confirm the authenticity of the data. EDC may realize cost savings and improvements in efficiency and data quality, as well as address some problems in PDC-based clinical trials.
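The audit trail this abstract calls for is, structurally, just a record of every field change: who made it, when, and what the old and new values were. A minimal sketch (the class, field names, and user ids are hypothetical, illustrating the idea rather than any real EDC product):

```python
from datetime import datetime, timezone

class AuditedForm:
    """e-CRF field store that keeps an audit trail of every change (sketch)."""

    def __init__(self):
        self._data = {}
        self._trail = []   # entries: (timestamp, user, field, old_value, new_value)

    def set_field(self, user, field, value):
        old = self._data.get(field)
        self._data[field] = value
        self._trail.append((datetime.now(timezone.utc), user, field, old, value))

    def history(self, field):
        """Full change history of one field, oldest first."""
        return [(t, u, old, new) for t, u, f, old, new in self._trail if f == field]

form = AuditedForm()
form.set_field("nurse01", "systolic_bp", 120)
form.set_field("monitor02", "systolic_bp", 122)   # correction found during SDV
for _, user, old, new in form.history("systolic_bp"):
    print(user, old, "->", new)
```

An auditor investigating authenticity, as the abstract suggests, would look for patterns in such trails, e.g. bulk edits by one user long after the visit date.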

Journal ArticleDOI
TL;DR: Since the inception of the Internet and the World Wide Web, these standards have revolutionised the world in a way comparable to the industrial revolution, but this activity and technological innovation has, in some ways, not been fully utilised within the clinical trials industry, where the adoption of technology has been markedly slow.
Abstract: Since the inception of the Internet and the World Wide Web in the 1970s and 80s, the world has changed. Websites and e-mail have become central to the way enterprises conduct their business; indeed, it would be hard to imagine any company not using electronic communications to run their day-to-day operations. VeriSign Inc. reported that, as a conservative estimate, there are 2.25 billion e-mails sent every day,[1] while a 2003 report from the International Data Corporation put the number at approximately 31 billion.[2] Companies like Amazon and Google exist only because of the web. Amazon.com was formed in 1994, launched its first international site in October 1998 and in 2005 made a gross profit of $US2039 million on a turnover of $US8490 million.[3] Google, founded in 1998, was handling 4.1 billion searches in July 2007 from within the US alone.[4] These two companies illustrate the impact of the web and the underlying technology that allows the web to function, technology that must surely be a consideration for the future of clinical trials.
Unsurprisingly, powering websites such as Amazon and Google are standards. The standards that are now maintained by the World Wide Web Consortium (W3C) – the Hypertext Transfer Protocol (HTTP), the Transmission Control Protocol (TCP) and the Internet Protocol (IP) – sit at the very core of the web and came together in the early 1990s to form what we understand today as the Internet and the World Wide Web. Combined with the standards for e-mail, Post Office Protocol 3 (POP3) and Simple Mail Transfer Protocol (SMTP), the way these standards have revolutionised the world is comparable to the industrial revolution. However, all this activity and technological innovation has, in some ways, not been fully utilised within the clinical trials industry, where the adoption of technology has been markedly slow. In 2002, 2003 and, most recently, in 2004, the Clinical Data Interchange Standards Consortium (CDISC) conducted surveys to ascertain the uptake of technology within clinical trials. In particular, the 2004 survey tried to establish how far technology had been adopted at sites and the use of various forms of technology. The survey results indicated that only 30% of trials run by pharmaceutical companies and 11% of those run by contract research organisations (CROs) were employing electronic data capture (EDC) technology in 2004, even though the first systems used for the capture of clinical trial data appeared around 20 years ago (see figure 1).[5-7] While these early systems were not web systems, the CDISC research from 2004 indicates that 66% of trials were still using paper. It is evident that the clinical trials world has not embraced technology to its fullest extent; however, as shown in the figure for electronic case report form (eCRF) and electronic patient reported outcome (ePRO) systems, adoption is on the rise.
1. Why Not in Clinical Trials? The financial sector is often used as an example of the successful adoption of technology, with Automated Teller Machines (ATMs) cited as an example of successful implementation of both standards and technology. As we started to move more freely around the world, the financial services market recognised the advantages of being able to put a bank card into an ATM anywhere.

Book ChapterDOI
01 Jan 2007
TL;DR: Two models presented here form the basis for developing electronic data collection software to support clinical trials: the first defines use cases for the actors using the system, and the second models the structure of the electronic Case Report Form (eCRF) document.
Abstract: In this paper, some results of modeling a web-based clinical trial electronic data collection process are presented. Clinical trial data collection is usually a paper-based process. Here we propose a new process of electronic data collection adjusted for the utilization of web-based technology. The two models that we present here are the basis for the development of electronic data collection software to support clinical trials. The first defines use cases of the actors using the system. The second is a model of the structure of the electronic Case Report Form (eCRF) document. Descriptions of these two models at a high level of abstraction are presented using the standard UML (Unified Modeling Language) version 2.0.
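A structural model of an eCRF document like the one this paper describes in UML is usually a small hierarchy: a form contains sections, which contain typed data items. A minimal sketch of such a hierarchy; the class and field names are hypothetical, not the paper's actual model:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Item:
    """A single typed data point on the form."""
    name: str
    datatype: str            # e.g. "int", "float", "date", "text"
    value: object = None

@dataclass
class Section:
    """A titled group of items, e.g. one block of the paper CRF."""
    title: str
    items: List[Item] = field(default_factory=list)

@dataclass
class ECRF:
    """One electronic Case Report Form instance for one patient visit."""
    trial: str
    patient_id: str
    visit: int
    sections: List[Section] = field(default_factory=list)

    def item_count(self):
        return sum(len(s.items) for s in self.sections)

crf = ECRF("TRIAL-X", "P-0001", 1, [
    Section("Vitals", [Item("heart_rate", "int", 72), Item("weight_kg", "int", 81)]),
    Section("Labs", [Item("hemoglobin", "float", 13.5)]),
])
print(crf.item_count())   # 3
```

The same hierarchy maps directly onto a UML class diagram (ECRF 1..* Section 1..* Item), which is essentially what the paper's second model expresses.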