Open Access · Journal Article · DOI

Defining and measuring completeness of electronic health records for secondary use

TL;DR: It is found that, under any definition, the number of complete records in the clinical database is far lower than the nominal total, and it is concluded that the concept of completeness in EHRs is contextual.
About
This article is published in the Journal of Biomedical Informatics. The article was published on 2013-10-01 and is currently open access. It has received 290 citations to date. The article focuses on the topics: Data quality & Completeness (statistics).


Citations
Journal Article · DOI

Deep Patient: An Unsupervised Representation to Predict the Future of Patients from the Electronic Health Records

TL;DR: The findings indicate that deep learning applied to EHRs can derive patient representations that offer improved clinical predictions, and could provide a machine learning framework for augmenting clinical decision systems.
Journal Article · DOI

The National COVID Cohort Collaborative (N3C): Rationale, Design, Infrastructure, and Deployment.

Melissa A. Haendel, +57 more
TL;DR: The N3C has demonstrated that a multisite collaborative learning health network can overcome barriers to rapidly build a scalable infrastructure incorporating multiorganizational clinical data for COVID-19 analytics.
Posted Content

DeepCare: A Deep Dynamic Memory Model for Predictive Medicine

TL;DR: DeepCare is an end-to-end deep dynamic neural network that reads medical records, stores previous illness history, infers current illness states, and predicts future medical outcomes.
Journal Article · DOI

"Big data" and the electronic health record.

TL;DR: Reviewing the literature of the past three years, this work focuses on "big data" in the context of EHR systems and reports on examples of how secondary use of data has been put into practice.
References
Journal Article · DOI

Missing data: Our view of the state of the art.

TL;DR: Two general approaches that come highly recommended, maximum likelihood (ML) and Bayesian multiple imputation (MI), are presented, along with newer developments that may eventually extend the ML and MI methods that currently represent the state of the art.
Journal Article · DOI

Inference and missing data

Donald B. Rubin
01 Dec 1976
TL;DR: In this article, it was shown that ignoring the process that causes missing data when making sampling-distribution inferences about the parameter of the data, θ, is generally appropriate if and only if the missing data are missing at random and the observed data are observed at random, in which case such inferences are generally conditional on the observed pattern of missing data.
Journal Article · DOI

Beyond accuracy: what data quality means to data consumers

TL;DR: Using this framework, IS managers were able to better understand and meet their data consumers' data quality needs and this research provides a basis for future studies that measure data quality along the dimensions of this framework.
Book

Juran's Quality Control Handbook

TL;DR: The third edition of the book has been updated to give managers the know-how they need to manage for quality through the next decade.