
Showing papers in "International Journal of Bio-medical Computing in 1994"


Journal Article
TL;DR: The concept of such rapid, high-temperature heating of tissue in a two-dimensional finite element numerical model is investigated and demonstrates the feasibility of interstitial radiofrequency delivery of a therapeutic heat dose to a 1 cm3 tumor during a 60-s period.
Abstract: Hyperthermia is a promising adjuvant cancer treatment modality. However, unresolved engineering problems with the production and regulation of temperature distributions within tissues in vivo have frustrated repeated efforts to implement clinical hyperthermia protocols. A major technical problem with hyperthermia production in vivo is the cooling effect caused by circulating blood in larger vessels. Larger blood vessels, when located in heated tumors, can prevent achievement of sufficiently high temperatures, resulting in loss of therapeutic effect. One possible way of circumventing this problem is the delivery of a critical heat dose during a short-term, high-temperature treatment episode to minimize cooling from blood flow. We investigated the concept of such rapid, high-temperature heating of tissue in a two-dimensional finite element numerical model. The model demonstrates the feasibility of interstitial radiofrequency delivery of a therapeutic heat dose, equivalent to 30 min at 43 degrees C, to a 1 cm3 tumor during a 60-s period. The model assumes circulation of cooling fluid through hollow electrodes. A post processor has been designed to display a 3-D image of the temperature distribution, electric field, and thermal dose delivered to a unit volume within the heated tissue.

298 citations
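The "therapeutic heat dose, equivalent to 30 min at 43 degrees C" quoted above is conventionally computed as cumulative equivalent minutes at 43 °C (CEM43). A minimal sketch of that conversion, assuming the standard Sapareto-Dewey breakpoint constants (R = 0.5 above 43 °C, 0.25 below) rather than anything from the paper itself:

```python
def cem43(temps_c, dt_s):
    """Cumulative equivalent minutes at 43 degrees C for a sampled
    temperature history; temps_c holds one reading every dt_s seconds."""
    total_min = 0.0
    for t in temps_c:
        r = 0.5 if t >= 43.0 else 0.25   # standard breakpoint constants
        total_min += (dt_s / 60.0) * r ** (43.0 - t)
    return total_min

# Holding tissue at a constant 51 degrees C for 60 s delivers
# one minute scaled by 0.5**(43 - 51), i.e. 256 equivalent minutes.
print(cem43([51.0] * 60, 1.0))
```

Under this dose model a 60-s episode at roughly 48 °C or above already exceeds the 30-equivalent-minute target, which is the arithmetic behind the short-duration, high-temperature strategy the abstract describes.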


Journal ArticleDOI
TL;DR: Critical areas which deserve immediate attention include: improvement of pen-based technology, development of knowledge-based techniques that support contextual presentation, and development of new strategies and metrics to evaluate user interfaces.
Abstract: Lack of good user interfaces has been a major impediment to the acceptance and routine use of health-care professional workstations. Health-care providers, and the environment in which they practice, place strenuous demands on the interface. User interfaces must be designed with greater consideration of the requirements, cognitive capabilities, and limitations of the end-user. The challenge of gaining better acceptance and achieving widespread use of clinical information systems will be accentuated as the variety and complexity of multi-media presentation increases. Better understanding of issues related to cognitive processes involved in human-computer interactions is needed in order to design interfaces that are more intuitive and more acceptable to health-care professionals. Critical areas which deserve immediate attention include: improvement of pen-based technology, development of knowledge-based techniques that support contextual presentation, and development of new strategies and metrics to evaluate user interfaces. Only with deliberate attention to the user interface, can we improve the ways in which information technology contributes to the efficiency and effectiveness of health-care providers.

94 citations


Journal ArticleDOI
TL;DR: The first component of the National Institutes of Health to participate in the HPCC, the National Library of Medicine, recently issued a solicitation for proposals to address a range of issues, from privacy to 'testbed' networks, 'virtual reality,' and more.
Abstract: The High Performance Computing and Communications Program (HPCC) is a multiagency federal initiative under the leadership of the White House Office of Science and Technology Policy, established by the High Performance Computing Act of 1991. It has been assigned a critical role in supporting the international collaboration essential to science and to health care. Goals of the HPCC are to extend USA leadership in high performance computing and networking technologies; to improve technology transfer for economic competitiveness, education, and national security; and to provide a key part of the foundation for the National Information Infrastructure. The first component of the National Institutes of Health to participate in the HPCC, the National Library of Medicine (NLM), recently issued a solicitation for proposals to address a range of issues, from privacy to 'testbed' networks, 'virtual reality,' and more. These efforts will build upon the NLM's extensive outreach program and other initiatives, including the Unified Medical Language System (UMLS), MEDLARS, and Grateful Med. New Internet search tools are emerging, such as Gopher and 'Knowbots'. Medicine will succeed in developing future intelligent agents to assist in utilizing computer networks. Our ability to serve patients is so often restricted by lack of information and knowledge at the time and place of medical decision-making. The new technologies, properly employed, will also greatly enhance our ability to serve the patient.

51 citations


Journal ArticleDOI
TL;DR: It is shown that D correlates with the structural complexity of the individual cell contour, and D is interpreted as a statistical measure of the sample's fractal dimension.
Abstract: Many biological objects appear to have self-similar structures which can be characterized by their fractal dimension D . However, applications of the concept of fractal geometry are rather scarce in cell and tissue biology. Here we adapt and critically analyse 3 methods of digital image analysis to measure D of cellular profiles. As prototype examples we investigate in detail 2 samples of cells: (i) human T-lymphocytes from normal donors, and (ii) hairy leukemic cells. It is shown that D correlates with the structural complexity of the individual cell contour. The calculated D values for cells out of the same cell line scatter around a mean value D=1.15 for T-lymphocytes (S.D. = 0.03) and D=1.34 for hairy leukemic cells (S.D. = 0.04). Consequently, we interpret D as a statistical measure of the sample's fractal dimension.

49 citations
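The abstract does not specify which 3 digital image analysis methods were adapted; box counting is the most common way to estimate a contour's fractal dimension D, and a minimal sketch (a hypothetical illustration, not the authors' implementation) looks like:

```python
import numpy as np

def box_counting_dimension(points, eps_list):
    """Estimate the fractal dimension D of a 2-D contour by box counting.
    points: (N, 2) array of contour coordinates scaled into [0, 1)^2."""
    counts = []
    for eps in eps_list:
        # Count the distinct grid boxes of side eps that the contour touches.
        boxes = set(map(tuple, np.floor(points / eps).astype(int)))
        counts.append(len(boxes))
    # D is the slope of log N(eps) against log (1/eps).
    slope, _intercept = np.polyfit(np.log(1.0 / np.asarray(eps_list)),
                                   np.log(counts), 1)
    return slope

# A straight segment is one-dimensional, so D should come out near 1;
# the paper's cell contours scatter around 1.15 (T-lymphocytes)
# and 1.34 (hairy leukemic cells).
seg = np.column_stack([np.linspace(0, 0.999, 2000)] * 2)
print(box_counting_dimension(seg, [1/4, 1/8, 1/16, 1/32]))
```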


Journal ArticleDOI
TL;DR: The fundamental foundations and relevant clinical issues in adaptive control of drug dosage regimens for patients are examined, including the selection of individualized therapeutic goals for each patient.
Abstract: In this paper we examine several of the fundamental foundations and relevant clinical issues in adaptive control of drug dosage regimens for patients. Truly individualized therapy with drugs having narrow margins of safety first requires a practical pharmacokinetic/dynamic model of the behavior of a drug. Past experience with a drug is stored in the form of a population model. Next, using the information in such a model and its relationship to the incidence of adverse reactions, a specific, explicit therapeutic goal must be selected by the responsible clinician, based on the patient's need for the drug and the risk of adverse reactions felt to be justified by each patient's need, small, moderate, or great. Individualized drug therapy thus begins with the selection of individualized therapeutic goals (low, moderate, or high) for each patient. Using subsequent feedback from the patient's serum drug levels, and using Bayesian fitting, the model is then linked to each patient as a patient-specific model. Control of the model by the dosage regimen increasingly controls the patient, to better obtain the desired explicit therapeutic goals. This process is essentially similar to that of a flight control or missile guidance system.

46 citations
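The feedback loop described above (population prior, measured serum levels, patient-specific model) is at its core MAP Bayesian estimation. The sketch below is a deliberately reduced analogue, not the paper's method: a one-compartment IV-bolus model, a single clearance parameter, and a grid search standing in for a real optimizer; all names and values are illustrative.

```python
import math

def map_clearance(levels, times, dose, v, pop_mean_cl, pop_sd_cl, assay_sd):
    """MAP Bayesian estimate of clearance (L/h) for a one-compartment
    IV-bolus model with known volume v. A simplified illustration only."""
    best_cl, best_obj = None, float("inf")
    for i in range(1, 400):
        cl = i * 0.05                       # candidate clearance on a grid
        k = cl / v                          # elimination rate constant
        # Negative log posterior = prior penalty + weighted residuals.
        obj = ((cl - pop_mean_cl) / pop_sd_cl) ** 2
        for c_obs, t in zip(levels, times):
            c_pred = (dose / v) * math.exp(-k * t)
            obj += ((c_obs - c_pred) / assay_sd) ** 2
        if obj < best_obj:
            best_cl, best_obj = cl, obj
    return best_cl
```

The prior term pulls the estimate toward the population model when few serum levels are available, and the data term takes over as feedback accumulates, which is exactly the behavior the abstract describes.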


Journal ArticleDOI
TL;DR: The power as well as the limitations of each approach regarding handling of covariates are illustrated and compared using the same data set which concerns the pharmacokinetics of gentamicin in neonates.
Abstract: Population models were developed to analyze processes, described by parametric models, from measurements obtained in a sample of individuals. In order to analyze the sources of interindividual variability, covariates may be incorporated in the population analysis. The exploratory analyses and the two-stage approaches which use standard non-linear regression techniques are simple tools to select meaningful covariates. The global population approaches may be divided into two classes within which the covariates are handled differently: the parametric and the non-parametric methods. The power as well as the limitations of each approach regarding handling of covariates are illustrated and compared using the same data set which concerns the pharmacokinetics of gentamicin in neonates. With parametric approaches a second-stage model between structural parameters and covariates has to be defined. In the non-parametric method the joint distribution of parameters and covariates is estimated without parametric assumptions; however, it is assumed that covariates are observed with some error and parameters involved in functional relationships are not estimated. The important results concerning gentamicin in neonates were found by the two methods.

45 citations


Journal ArticleDOI
TL;DR: The convergence of the decomposition method is proved and a new method for solving identification problems and practical formulae for Adomian's polynomials are presented.
Abstract: We present practical formulae for the calculus of Adomian's polynomials. We also prove the convergence of the decomposition method. Furthermore, we propose a new method for solving identification problems.

40 citations


Journal ArticleDOI
TL;DR: The data suggest that new therapeutic regimens are needed for neutropenic patients receiving intensive therapy for non-Hodgkin malignant lymphoma, Hodgkin disease, myeloma or acute leukemia, followed by an autologous bone marrow transplantation in 6 cases.
Abstract: Vancomycin (V) is widely used in neutropenic patients, though its kinetics are poorly documented in this type of patient. In the present study, ten patients were included: all of them received intensive therapy for non-Hodgkin malignant lymphoma, Hodgkin disease, myeloma or acute leukemia, followed by an autologous bone marrow transplantation in 6 cases. All patients were neutropenic (< 100/mm3). The pharmacokinetic study was done at the first V administration: 1000 mg V were injected as a 1-h infusion. Plasma V concentrations were measured by an enzyme immunoassay (EMIT, Syva, France). V maximal and minimal concentrations were 61.3 +/- 38.6 micrograms/ml and 1.69 +/- 0.77 microgram/ml, respectively. Total V clearance was 158 +/- 51 ml/min, with a creatinine clearance of 141.2 +/- 36.2 ml/min on test day. V plasma kinetics can be described by a biexponential model, with the following parameters: [table: see text] These data show a 3-fold increase of the initial volume of distribution and a shortened (3-fold) T1/2 beta, compared with values obtained in normal subjects. Because the bactericidal effect is time-dependent, there can be a risk of insufficient antibiotic effect throughout the day. Our data suggest that new therapeutic regimens are needed for these patients.

39 citations
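The biexponential model mentioned above has the form C(t) = A·exp(-alpha·t) + B·exp(-beta·t); the actual parameter table is elided in this abstract ("[table: see text]"), so the values in the sketch below are purely illustrative.

```python
import math

def biexponential_conc(t, a, alpha, b, beta):
    """Two-compartment plasma concentration:
    C(t) = A*exp(-alpha*t) + B*exp(-beta*t).
    Parameter values passed in here are illustrative, not the paper's."""
    return a * math.exp(-alpha * t) + b * math.exp(-beta * t)

# The terminal half-life follows from the slow exponent: t1/2(beta) = ln 2 / beta.
print(biexponential_conc(1.0, 40.0, 1.2, 20.0, 0.1))
print(math.log(2) / 0.1)  # half-life for an illustrative beta of 0.1 per hour
```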


Journal ArticleDOI
TL;DR: It is shown that the optimal open-loop stochastic control with linear control/state constraints can be solved exactly and efficiently as a quadratic program, providing a simple and flexible method for computing open- loop feedback designs of drug dosage regimens.
Abstract: This paper presents a general stochastic control framework for determining drug dosage regimens where the sample times, dosing times, desired goals, etc., occur at different times and in an asynchronous fashion. In the special case of multiple models with linear dynamics and quadratic cost (MMLQ), it is shown that the optimal open-loop stochastic control with linear control/state constraints can be solved exactly and efficiently as a quadratic program. This provides a simple and flexible method for computing open-loop feedback designs of drug dosage regimens. An implementation of the MMLQ adaptive control approach is demonstrated on a Lidocaine infusion process. For this example, the resulting MMLQ regimen is more effective than the MAP Bayesian regimen at reducing interpatient variability and keeping patients in the therapeutic range.

39 citations


Journal ArticleDOI
TL;DR: In the belief that the existence of a variety of standards is an absolute necessity for health care professional workstations to work, this paper provides a detailed overview of the standards efforts of a number of groups.
Abstract: In the belief that the existence of a variety of standards is an absolute necessity for health care professional workstations to work, this paper provides a detailed overview of the standards efforts of a number of groups. According to the International Standards Organization (ISO) Reference Model, workstations require a full level of standards from the physical level through and beyond the applications level. Rapidly changing technology challenges acceptance of standards at the lower levels. Current recommendations include fiberoptic media using certain protocols. Other standards in these lower levels also have support. At the applications level, data messaging standards are being developed by six groups. The consensus standards body for the United States is coordinating the efforts of these groups in order to produce a harmonized effort, and is coordinating the effort with Europe for an international effort. Work on the development of the full set of standards necessary for workstation implementations is lagging. Accelerating the process is mandatory if we are to achieve the necessary seamless interoperability required by workstations for ubiquitous intelligent communications between the workstations and the sources of data.

31 citations


Journal ArticleDOI
TL;DR: This article describes the key components of PACS and typical user environments are analysed and the requirements on the performance of the elements of a PACS are defined.
Abstract: The term ‘Picture Archiving and Communication Systems’ (PACS) applies to networks of digital image modalities, image workstations and mass image stores connected among each other by image data communication structures and controlled by appropriate image and data management. Predominantly, PACS are intended for application in the medical imaging domain, particularly in hospitals, where, by completely replacing the currently used films, they are expected to lead to ‘filmless radiology’. The development of PACS is still one of the challenging tasks in the computer engineering field, because the enormous amounts of digital image data produced in medical diagnostics require the introduction of novel architectures and technologies. This article describes the key components of PACS. Typical user environments are analysed and the requirements on the performance of the elements of a PACS are defined. The bottlenecks of current technologies are evaluated and examples of advanced approaches to PACS networks, archive modules and image workstations are given.

Journal ArticleDOI
TL;DR: It was found that both nets performed well on the simulated data, but not on the actual EEG data, and the reasons for the failure of both nets are discussed.
Abstract: The feasibility of using a multi-layer perceptron and Elman's recurrent network for the detection of specific waveforms (K-complexes) in electroencephalograms (EEGs), regardless of their location in the signal segment, is explored. Experiments with simulated and actual EEG data were performed. In case of the perceptron, the input consisted of the magnitude and/or phase values obtained from 10-s signal intervals, whereas the recurrent net operated on the digitized data samples directly. It was found that both nets performed well on the simulated data, but not on the actual EEG data. The reasons for the failure of both nets are discussed.

Journal ArticleDOI
TL;DR: The linear prediction (LP) method has been used for analysis and presentation of spectral array data for the better visualisation of background EEG activity and the biological significance of Fourier method and the LP method in respect to the microstructure of neuronal events in the generation of EEG is discussed.
Abstract: The EEG time series has been subjected to various formalisms of analysis to extract meaningful information regarding the underlying neural events. In this paper the linear prediction (LP) method has been used for analysis and presentation of spectral array data for the better visualisation of background EEG activity. It has also been used for signal generation, efficient data storage and transmission of EEG. The LP method is compared with the standard Fourier method of compressed spectral array (CSA) of the multichannel EEG data. The autocorrelation autoregressive (AR) technique is used for obtaining the LP coefficients with a model order of 15. While the Fourier method reduces the data only by half, the LP method just requires the storage of the signal variance and LP coefficients. The signal generated using white Gaussian noise as the input to the LP filter has a high correlation coefficient of 0.97 with that of the original signal, thus making LP a useful tool for storage and transmission of EEG. The biological significance of the Fourier method and the LP method in respect to the microstructure of neuronal events in the generation of EEG is discussed.
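The autocorrelation route to the LP coefficients is normally implemented with the Levinson-Durbin recursion. The sketch below is a generic implementation under that assumption, not the authors' code; the paper uses model order 15, while the demo in the test uses a low order for clarity.

```python
import numpy as np

def lp_coefficients(x, order):
    """Linear-prediction (AR) coefficients by the autocorrelation method,
    via the Levinson-Durbin recursion. Returns (taps, residual_variance),
    where x[n] is predicted as sum(taps[j] * x[n - 1 - j])."""
    n = len(x)
    r = np.array([np.dot(x[: n - k], x[k:]) for k in range(order + 1)]) / n
    a = np.zeros(order + 1)
    a[0] = 1.0
    err = r[0]
    for i in range(1, order + 1):
        acc = r[i] + np.dot(a[1:i], r[i - 1:0:-1])
        k = -acc / err                  # reflection coefficient
        a[1:i] += k * a[i - 1:0:-1]     # update the earlier coefficients
        a[i] = k
        err *= 1.0 - k * k              # prediction error shrinks each step
    return -a[1:], err
```

Storing only the taps and the residual variance, then resynthesising the signal by driving the LP filter with white Gaussian noise, is exactly the storage-and-transmission scheme the abstract describes.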

Journal ArticleDOI
TL;DR: A preliminary exploration of the problem of defining and meeting users' needs, and outlines the three main axes of the work: user-centred design, medical concept and medical record models, and user interfaces and architectures.
Abstract: A preliminary exploration of the problem of defining and meeting users' needs, this paper draws the bulk of its content from experiences in the PEN&PAD project in the UK. This program has been researching and developing prototype clinical workstations for direct use in patient care by health care professionals, chiefly doctors. Focusing more on general issues rather than the specific functional requirements embodied in the PEN&PAD prototypes, the paper begins with a brief summary of the goals of PEN&PAD and then outlines the three main axes of the work: user-centred design, medical concept and medical record models, and user interfaces and architectures. The remainder of the paper then concentrates on the first of these topics — working with users to define functional requirements.

Journal Article
TL;DR: Although most health professionals would still believe that confidentiality is the main issue, it appears that data integrity and availability are as important in the context of the 'paperless' electronic record.
Abstract: Systems which process patient health data of any kind are considered to be medical information systems. Some data can be categorised as non-personal, non-identifiable, or non-patient-based, such as knowledge bases. Others are considered highly sensitive because of the 'need to know' to deliver health care to the patient. Access is justifiable not only for doctors and nurses but also, for specific purposes, for administrative personnel and public health organisations. Of special concern are registers on sexually transmitted diseases, mental health and genetic diseases. In future, patients might gain more autonomy and also have access to some parts of their own record. Telematics allows them to update a data base and to consult a knowledge base. Clearly, physicians in charge of the cases have a responsibility that has been recognised by law in all Western countries. Access to patients' data should take this responsibility into account. Although most health professionals would still believe that confidentiality is the main issue, it appears that data integrity and availability are just as important in the context of the 'paperless' electronic record. Information should be complete and correct, and accessible only to authorized persons. The health care environment is characterised by the open nature of clinics, which leaves them vulnerable to theft, damage and unauthorised access. Disclosure of information may affect the patient's social standing as well as their general health. The health professions lack sufficiently well-defined organisational structure, culture and perceptions to support security.

Journal ArticleDOI
TL;DR: A menu-driven PC program (TXJN2) implementing the JN procedure, a generalization of the analysis of covariance (ANCOVA) which does not make the assumption that the regression coefficients for the regression of X on the covariates, Z1 and Z2, are equal in the groups being compared.
Abstract: The Johnson-Neyman (JN) procedure, as originally formulated (Stat Res Mem, 1 (1936) 57-93), applies to a situation in which measurements on 1 dependent (response) variable, X, and 2 independent (predictor) variables, Z1 and Z2, are available for the members of 2 groups. The expected value of X is assumed to be a linear function of Z1 and Z2, but not necessarily the same function for both groups. The JN technique is used to obtain a set of values for the Z variables for which one would reject, at a specified level of significance alpha (e.g., alpha = 0.05), the hypothesis that the 2 groups have the same expected X values. This set of values, or 'region of significance,' may then be plotted to obtain a convenient description of those values of Z1 and Z2 for which the 2 groups differ. The technique can thus be described as a generalization of the analysis of covariance (ANCOVA) which does not make the assumption that the regression coefficients for the regression of X on the covariates, Z1 and Z2, are equal in the groups being compared. In this paper we describe, illustrate and make available a menu-driven PC program (TXJN2) implementing the JN procedure.

Journal ArticleDOI
TL;DR: This paper outlines the hierarchical model framework and describes how the required computations can be carried out in a straightforward manner by a Markov chain Monte Carlo technique known as Gibbs sampling, even when models involve mean-variance relationships and outliers.
Abstract: Compartmental models are widely used to model the profile of drug concentrations versus time from administration in an individual subject. Observed concentrations are then modelled as noisy departures from the underlying profile, the latter characterised for each individual by a small number of ‘individual parameters’. When a population of individuals is studied, inter-individual variation is modelled by assuming that the individual profile parameters are drawn from a population distribution, the latter characterised by ‘population parameters’ describing, in effect, a mean population profile and individual variation around it. From a Bayesian statistical perspective, such models fit exactly into the so-called hierarchical modelling framework, which provides a coherent basis for individual and population inferences and prediction, as well as for decision-making (for example, the design of dosage regimens). This paper outlines the hierarchical model framework and describes how the required computations can be carried out in a straightforward manner by a Markov chain Monte Carlo technique known as Gibbs sampling, even when models involve mean-variance relationships and outliers.
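The hierarchical structure described above (individual parameters drawn from a population distribution) can be Gibbs-sampled even in a toy setting. The sketch below is a deliberately reduced normal-normal analogue with both variances fixed at 1, not a pharmacokinetic model; a real population sampler would also draw the variances and handle mean-variance relationships and outliers, as the paper discusses.

```python
import numpy as np

def gibbs_population_mean(y_groups, n_iter=2000, seed=0):
    """Gibbs sampler for a toy hierarchical model:
    y[i][j] ~ N(theta_i, 1) and theta_i ~ N(mu, 1), flat prior on mu.
    Each sweep alternates conjugate draws of the individual-level
    means and the population mean."""
    rng = np.random.default_rng(seed)
    k = len(y_groups)
    mu, thetas = 0.0, np.zeros(k)
    mu_draws = np.empty(n_iter)
    for it in range(n_iter):
        # Conjugate update for each individual mean given mu.
        for i, yi in enumerate(y_groups):
            n = len(yi)
            prec = n + 1.0                 # n/sigma2 + 1/tau2, both set to 1
            mean = (np.sum(yi) + mu) / prec
            thetas[i] = rng.normal(mean, prec ** -0.5)
        # Update for the population mean given the individual means.
        mu = rng.normal(thetas.mean(), (1.0 / k) ** 0.5)
        mu_draws[it] = mu
    return mu_draws
```

After discarding burn-in draws, the retained samples of mu (and of the thetas) approximate the joint posterior, which is what supports the individual and population inferences the abstract refers to.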

Journal ArticleDOI
TL;DR: The recommendations of this report have been accepted by the Mayo Foundation leadership resulting in the generation of a master plan and the creation of the governance structure for implementation at Mayo.
Abstract: On 1 January 1993, the Electronic Medical Record Task Force at Mayo Clinic published its report. Charged by the Mayo Foundation to define the Electronic Medical Record (EMR) for Mayo, the task force mapped the goals, strategies and time-lines for implementation of the EMR in that institution. The task force was composed predominantly of caregivers (physicians and nurses) with assistance from members of Mayo's information systems and administrative departments. The focus of the effort was care of the patient with the consensus belief that the EMR will improve that process and, if designed robustly, will serve the other information needs of claims, research, education and practice management. The recommendations of this report have been accepted by the Mayo Foundation leadership resulting in the generation of a master plan and the creation of the governance structure for implementation at Mayo. This paper abstracts key portions of the report.

Journal ArticleDOI
TL;DR: Some of the choices, both current and future, that are available to address the needs of controlled medical vocabularies for representing data and knowledge in clinical workstations are defined and some of the implications of those choices are explored.
Abstract: The representation of patient information for use in clinical workstations is a complex problem. Ideally, it should be addressed in a way that allows multiple uses of the data, including simple manual review, sharing and pooling across institutions, and as input to knowledge-based decision support systems. To a great extent, this means coding information with controlled medical vocabularies, but it does not mean that all information must be codable before workstations are feasible. This paper defines some of the choices, both current and future, that are available to address the needs of controlled medical vocabularies for representing data and knowledge in clinical workstations and explores some of the implications of those choices.

Journal Article
TL;DR: This work proposes a mathematical model for the thalamic gateway to the cortex and hints at possible theoretical explanations for clinical facts like counterirritation, acupuncture analgesia and variations in the sensibility of somatosensory perception.
Abstract: This work proposes a mathematical model for the thalamic gateway to the cortex. In this model, the ionic currents considered and the structural details are in accordance with the biomedical experimental data. To validate the model, three series of simulations were performed in different levels of complexity. First, some experiments show that the model captures the electrophysiological properties of a single thalamic cell, that is, the relay and burst modes of operation. Second, a complete neural network representing the thalamic gateway to the cortex is assembled and the influences of the cortical projections over the thalamus are analysed. Some interesting results about how the cortex opens and closes the thalamic gate, and the relation of this control policy with the phenomenon of attention, are shown. Finally, a third set of simulations establishes some mechanisms of interaction between neighboring thalamic regions, especially a form of somatosensory competition. The paper also hints at possible theoretical explanations for clinical facts like counterirritation, acupuncture analgesia and variations in the sensibility of somatosensory perception. The model seems to be an interesting and new way of understanding the thalamocortical interactions.

Journal ArticleDOI
TL;DR: A prototype physician's workstation (PWS) is developed that provides integrated access to patient information and uses embedded domain knowledge to enhance the presentation of clinical information to the physician.
Abstract: Patient care is an information-intensive activity, yet physicians have few tools to effectively access and manage patient data. We studied physicians' information needs in an outpatient clinic, and developed a prototype physician's workstation (PWS) to address those needs. The PWS provides integrated access to patient information and uses embedded domain knowledge to enhance the presentation of clinical information to the physician. All the applications in the PWS share a common patient context, defined by the state of the internal patient model — semantic integration. Relevant data are presented together and higher-order alerts are generated by combining notable events with relevant data from the patient context. Semantic integration allows us to present and to operate on all patient data in a given patient's context, significantly enhancing the effectiveness with which information is presented to the physician.

Journal ArticleDOI
TL;DR: The accuracy of the PKS package is reliable for the choice of optimal dosage regimen for amikacin and theophylline and it is demonstrated that no statistically significant bias was observed.
Abstract: Abbott Laboratories has developed a new software package (Abbottbase pharmacokinetic system, or PKS package) that employs the principles of pharmacokinetics to assist clinical pharmacologists and clinicians in designing dosage regimens. This software, which runs on IBM PC compatibles, allows Bayesian estimation of individual pharmacokinetic parameters. The aim of the present study was to validate this new system in routine clinical practice for amikacin (40 intensive care unit patients) and theophylline (20 patients). By using the program one or more times during the treatment (50 cases for amikacin and 46 cases for theophylline), dosing recommendations were obtained in real time for all patients. The predictive performance (bias and precision) was assessed by comparing predicted drug concentrations with those measured 24-48 h after dosage recommendation. In the case of amikacin, precision and bias were computed separately for peak and trough levels. In all cases, no statistically significant bias was observed. Finally, our results demonstrate the accuracy of the program in predicting drug levels for amikacin and theophylline. Consequently, the PKS package is reliable for the choice of an optimal dosage regimen for amikacin and theophylline.
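"Predictive performance (bias and precision)" is conventionally measured, following Sheiner and Beal, as the mean prediction error and the root mean squared prediction error; a minimal sketch:

```python
import math

def predictive_performance(predicted, measured):
    """Bias = mean prediction error; precision = root mean squared error."""
    errors = [p - m for p, m in zip(predicted, measured)]
    bias = sum(errors) / len(errors)
    rmse = math.sqrt(sum(e * e for e in errors) / len(errors))
    return bias, rmse

# Two predictions that miss by +1 and -1 are unbiased but imprecise:
print(predictive_performance([10.0, 12.0], [9.0, 13.0]))  # → (0.0, 1.0)
```

A bias near zero with a small RMSE, as reported in the study, is what justifies using such a program for dosage individualization.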

Journal ArticleDOI
TL;DR: A new approach to synovial joint lubrication is presented using Bingham fluid as lubricant between the approaching porous cartilagenous surfaces, presumed that the thickness of the core formed, due to thickly concentrated hyaluronic acid molecules, increases as the surfaces come closer.
Abstract: A new approach to synovial joint lubrication is presented using Bingham fluid as lubricant between the approaching porous cartilagenous surfaces. It is presumed that the thickness of the core formed, due to thickly concentrated hyaluronic acid molecules, increases as the surfaces come closer. This is due to the withdrawal of the base fluid through the boosted lubrication mechanism, leading to the formation of lubricating gel. This gel ultimately acts as boundary lubricant which prevents cartilage-to-cartilage contacts very briefly during a gait cycle. For most of the gait cycle, fluid film lubrication persists, and this fluid supports greater loads which are due to the development of increased pressures as compared with viscous lubricants. Thus, normal joints possess an in-built mechanism to support greater loads, at the load-bearing joints, with less friction and wear.

Journal ArticleDOI
TL;DR: An architecture is described that facilitates integration of existing databases and applications without modifying them and allows for a growth-path towards application of the open system paradigm in medicine.
Abstract: An architecture is described that facilitates integration of existing databases and applications without modifying them. By means of this architecture, data from different sources dispersed in a network can be combined and directly used in existing applications or applications that have been developed specially for integration. This feature of combining data from different sources into one workstation is viewed as the enabling technology on which computer-based patient records can be built. The abstraction of computer-, network- and application-specific details is completely dealt with by the integration architecture. This integration architecture has been developed with extendibility and flexibility in mind, and allows for a growth-path towards application of the open system paradigm in medicine.

Journal ArticleDOI
Li D. Xu1
TL;DR: An integrated DSS for AIP is presented that is integrated with a database and achieves its efficiency by incorporating various algorithms and models to support AIP decision processes.
Abstract: In recent years, the importance of information systems has been identified as a vital issue to continuing success in AIDS intervention and prevention (AIP). The advances in information technology have resulted in integrative information systems including decision support systems (DSS). The concept of DSS for AIP was created at the intersection of two trends. The first trend was a growing belief that AIP information systems are successful in automating operations in AIP programs. The second was a continuing improvement in modeling and software development in the AIP area. This paper presents an integrated DSS for AIP. The system is integrated with a database and achieves its efficiency by incorporating various algorithms and models to support AIP decision processes. The application examples include screening AIDS-risky behaviors, evaluating educational interventions, and scheduling AIP sessions. The implementation results present evidence of the usefulness of the system in AIP.

Journal ArticleDOI
TL;DR: This work analyzed data of patients receiving amikacin, using three methods of population analysis: the First Order (FO) method implemented in NONMEM, and the non-parametric methods NPML and NPEM.
Abstract: We analyzed data of patients receiving amikacin, using three methods of population analysis: the First Order (FO) method implemented in NONMEM, and the non-parametric (NP) methods NPML and NPEM. SAS software and FORTRAN were used, respectively, to make NONMEM and NPML interfaces with the USC ∗ PACK patient data files (1-45 days of therapy). We estimated amikacin population pharmacokinetic parameters in general medicine and elderly patients. Various models and parametrizations were used: a one-compartment model (clearance (CL) or elimination rate constant (Kel) / distribution volume (VOL)) using covariates (estimated creatinine clearance (eCCr) and weight (W)): CL = Cs ∗ eCCr + Ci, Kel = …
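A minimal sketch of the covariate-based one-compartment model described above. The coefficient values (`c_s`, `c_i`) and the patient values are hypothetical placeholders for illustration, not estimates from the study:

```python
# Sketch of a one-compartment amikacin model with clearance as a linear
# function of estimated creatinine clearance: CL = Cs * eCCr + Ci.
# Coefficients below are hypothetical, not the study's population estimates.
import math

def clearance(eccr, c_s=0.05, c_i=0.3):
    """Clearance CL (L/h) from estimated creatinine clearance eCCr (ml/min)."""
    return c_s * eccr + c_i

def concentration(dose_mg, vol_l, kel, t_h):
    """Serum concentration (mg/L) at time t_h after an IV bolus:
    C(t) = (dose / VOL) * exp(-Kel * t)."""
    return (dose_mg / vol_l) * math.exp(-kel * t_h)

# Hypothetical patient: eCCr = 60 ml/min, VOL = 15 L, 500 mg bolus.
cl = clearance(60.0)      # L/h
kel = cl / 15.0           # Kel = CL / VOL
c2h = concentration(500.0, 15.0, kel, 2.0)  # concentration 2 h post-dose
```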

Journal ArticleDOI
TL;DR: The predictive performance of two widely used software programs and two different population parameter sets was assessed with respect to the prediction of amikacin serum concentrations in intensive care unit (ICU) patients; the results show that the differences between predicted and measured concentrations were unbiased when the population parameters used were adequate.
Abstract: Many dosing methods (nomogram, pharmacokinetic methods, Bayesian methods) can be used for the individualization of amikacin dosing. Among these methods, it is now well known that the Bayesian method provides a rapid and accurate means for individualizing dosage requirements for patients with diverse pharmacokinetic profiles. However, one problem has not been fully resolved. Should we use population-based parameters reflecting the patient population being monitored, or should we use general population parameters? The aim of this study was to answer this question using two widely used software programs (USC ∗ PACK PC and Abbott PKS system) and two different population parameter sets. Predictive performance of these methods was assessed with respect to the prediction of amikacin serum concentrations in intensive care unit (ICU) patients. Our results show that the differences between predicted and measured concentrations were unbiased when the population parameters used were adequate. Precision values were comparable with previously reported values. The predictive performance of the two tested software programs is very comparable in ICU patients. In addition, we demonstrated that performance can be enhanced by using population-based parameters which reflect the patient population being monitored. It is therefore advisable for each user to properly characterize each particular patient population.
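The bias and precision metrics implied above can be sketched as mean prediction error (ME) and root mean squared error (RMSE), the standard measures of predictive performance; the concentration values below are illustrative, not study data:

```python
# Sketch of predictive-performance metrics for predicted vs. measured
# serum concentrations: bias as mean prediction error (ME), precision
# as root mean squared error (RMSE). Values are hypothetical.
import math

def bias(pred, meas):
    """Mean prediction error; values near zero indicate an unbiased method."""
    errors = [p - m for p, m in zip(pred, meas)]
    return sum(errors) / len(errors)

def precision(pred, meas):
    """Root mean squared prediction error (smaller is more precise)."""
    sq = [(p - m) ** 2 for p, m in zip(pred, meas)]
    return math.sqrt(sum(sq) / len(sq))

predicted = [22.1, 18.4, 25.0, 20.3]  # mg/L, hypothetical predictions
measured = [21.5, 19.0, 24.2, 21.1]   # mg/L, hypothetical measurements

me = bias(predicted, measured)
rmse = precision(predicted, measured)
```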

Journal ArticleDOI
TL;DR: Using transporting metaphors for HPW software emphasizes commonality and de-emphasizes diversity, and 3D Rooms, Gopher and Genes are familiar and transporting metaphors to be exploited for HPWs.
Abstract: The problem encountered by health care professionals and software developers has been a lack of demonstrable visions (prototypes) for Computer-based Patient Record (CPR) and Clinical Information System (CIS) applications. This deficiency has resulted in a quest for and consideration of models, metaphors, and mind maps for the Healthcare Professional Workstation (HPW) — the access mechanism for the CPR and the CIS. The familiar physician desktop and traditional paper-based metaphors are not adequate for all aspects of clinical information processes. In the clinical care environment, the flowsheet is a transporting metaphor because many different applications and tasks can be ‘transported’ into the flowsheet. 3D Rooms, Gopher and Genes are familiar and transporting metaphors to be exploited for HPWs. Using transporting metaphors for HPW software emphasizes commonality and de-emphasizes diversity. Each model and metaphor has an associated mind map. Only the mental model, mental metaphor or mind map for HPW software is important. Metaphors communicate real-world analogies, and communication is at the core of what defines usability. A mind map facilitates communication by building a model in the user's mind. The barriers to HPWs are not technical; they are related to economics, ownership of patient information, liability and information standards.

Journal ArticleDOI
TL;DR: A mathematical model has been developed to investigate the influence of externally imposed periodic body acceleration on blood flow in aorta and arteriole and an exact analytical solution of coupled differential equations, describing the flow of a particle-fluid suspension, is obtained.
Abstract: A mathematical model has been developed to investigate the influence of externally imposed periodic body acceleration on blood flow in the aorta and an arteriole. The rheological properties of blood have been represented by regarding the blood as a two-phase Newtonian fluid, that is, a suspension of cells in plasma. An exact analytical solution of the coupled differential equations describing the flow of a particle-fluid suspension is obtained using the Laplace transform technique. The effects of body acceleration and blood cell concentration on velocity, flow rate, acceleration and shear rates are computed and displayed graphically.
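The forcing terms in models of this kind are typically written as a pulsatile pressure gradient plus a periodic body acceleration; the forms below are the standard ones from the body-acceleration literature, with generic symbols rather than values from this paper:

```latex
% Pulsatile pressure gradient driving the flow
% (A_0: steady part, A_1: oscillatory amplitude, \omega_p: heart-rate frequency):
-\frac{\partial p}{\partial z} = A_0 + A_1 \cos(\omega_p t)

% Externally imposed periodic body acceleration
% (a_0: amplitude, \omega_b: body-acceleration frequency, \phi: phase lead):
G(t) = a_0 \cos(\omega_b t + \phi)
```

Both terms enter the momentum equations for plasma and cell phases, which are then solved exactly in the Laplace domain.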

Journal ArticleDOI
TL;DR: The group on 'Sharing and Communication of Health Care Information' addressed the issues raised above and unanimously recommends a number of steps that will improve the sharing of information.
Abstract: Sharing and communicating information is a fundamental task in modern medicine. The health care system of the western world is based on teamwork of professionals who participate in the care of patients. Exchange of information (not just data) requires the communicating parties to agree on a communication channel, an exchange protocol, and a common language. The language includes an alphabet, words, phrases, and symbols that express and assign meaning, understood by all. The most common forms of communication are the spoken word and the paper-based patient record. Computers and communication systems improve the sharing of health care information by overcoming the limitations imposed by the dimensions of time and location. However, natural language is still too complex and too ambiguous for current computing devices to handle the complex interactions between health care professional and patients. A simpler 'language' is needed that uses domain specific vocabularies (and/or codes), well-defined exchange protocols for data, information, knowledge, and, in the future, perhaps even wisdom. This simpler 'language' is expected to handle most of the routine information exchange but not eliminate natural language. It is essential that health care information systems preserve and incorporate natural language expressions and integrate them with structured vocabularies. Today, agreeing on standard data exchange protocols and domain specific vocabularies and codes is our greatest challenge. However, standards alone are not sufficient. Acceptance of the standards by the health care professionals, verifications in clinical environments, and implementation agreements by the medical informatics industry are essential. The group on 'Sharing and Communication of Health Care Information' addressed the issues raised above and unanimously recommends a number of steps that will improve the sharing of information. 
In addition, specific recommendations are offered to governments, health care institutions, and to developers of health care information systems.