
Showing papers by "Lionel Tarassenko published in 2009"


Journal ArticleDOI
TL;DR: Although the concept of telehealth monitoring is unfamiliar to most patients and practice nurses, the technology improved the support available for T2D patients commencing insulin treatment.
Abstract: Background Initiating and adjusting insulin treatment for people with type 2 diabetes (T2D) requires frequent clinician contacts, both face-to-face and by telephone. We explored the use of a telehealth system to offer additional support to these patients. Methods Twenty-three patients with uncontrolled T2D were recruited from nine general practices to assess the feasibility and acceptability of telehealth monitoring and support for insulin initiation and adjustment. The intervention included a standard algorithm for self-titration of insulin dose, a Bluetooth-enabled glucose meter linked to a mobile phone, an integrated diary to record insulin dose, feedback of charted blood glucose data and telehealth nurse review with telephone follow-up. Additional contact with patients was initiated when no readings were transmitted for >3 days or when persistent hyper- or hypoglycaemia was identified. Responses of patients and clinicians to the system were assessed informally. Results The mean (SD) patient age was 58 years (12) and 78% were male. The mean (SD) diabetes duration was 6.4 years (4.5), HbA1c at baseline was 9.5% (2.2), and the decrease in HbA1c at three months was 0.52% (0.91) with an insulin dose increase of 9 units (26). A mean (SD) of 160 (93) blood glucose readings was transmitted per patient in these three months. Practice nurses and general practitioners (GPs) viewed the technology as having the potential to improve patient care. Most patients were able to use the equipment with training and welcomed review of their blood glucose readings by a telehealth nurse. Conclusions Although the concept of telehealth monitoring is unfamiliar to most patients and practice nurses, the technology improved the support available for T2D patients commencing insulin treatment.
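The abstract describes two escalation rules for nurse contact: no transmitted readings for more than three days, or persistent hyper-/hypoglycaemia. A minimal sketch of that decision logic follows; the glucose thresholds and the run length counted as "persistent" are hypothetical illustrations, not the trial's actual titration algorithm.

```python
from datetime import date

# Hypothetical limits for illustration; the trial's actual glycaemic
# thresholds and titration algorithm are not given in the abstract.
HYPER_MMOL_L = 15.0
HYPO_MMOL_L = 4.0
MAX_GAP_DAYS = 3
PERSISTENT_RUN = 3  # consecutive out-of-range readings counted as "persistent"

def needs_contact(readings, today):
    """Decide whether the telehealth nurse should contact the patient.

    readings: list of (date, glucose_mmol_l) tuples, oldest first.
    Returns a reason string, or None if no contact is needed.
    """
    if not readings:
        return "no readings transmitted"
    last_date = readings[-1][0]
    if (today - last_date).days > MAX_GAP_DAYS:
        return "no readings for > 3 days"
    run = 0
    for _, glucose in readings:
        if glucose > HYPER_MMOL_L or glucose < HYPO_MMOL_L:
            run += 1
            if run >= PERSISTENT_RUN:
                return "persistent hyper- or hypoglycaemia"
        else:
            run = 0
    return None
```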

52 citations


Proceedings Article
01 Jan 2009
TL;DR: Visensia is a real-time, continuous vital sign acquisition system, using data fusion in order to predict patient deterioration, and an algorithm for deriving the respiration rate of the patient from the ECG signal is developed.
Abstract: Early detection of deterioration in hospital patients followed by intervention and stabilization can prevent adverse events such as a cardiac arrest, unscheduled admission to ICU, or death. Patients at step-down units of hospitals tend to have their vital signs checked by nursing staff at 4-hourly intervals. If an abnormality develops in the period between nurse observations, it is likely to lead to an adverse event (which may have been preventable). Visensia is a real-time, continuous vital sign acquisition system, using data fusion in order to predict patient deterioration. Validation trials have shown that the system successfully provides early warning of adverse events, such as cardiac arrests. We tested the system on lower acuity, ambulatory patients in a hospital ward with the vital signs being collected using telemetry. In order to optimize processing, we have developed an algorithm for deriving the respiration rate of the patient from the ECG signal.

28 citations


Journal ArticleDOI
TL;DR: The CYMPLA trial is a pragmatic randomised controlled trial of a mobile phone-based structured intervention to achieve asthma control in patients with persistent asthma.

19 citations


Journal ArticleDOI
TL;DR: Reliable, automated QT analysis would allow the use of all the ECG data recorded during continuous Holter monitoring, rather than just intermittent 10‐second ECGs.
Abstract: Background: Reliable, automated QT analysis would allow the use of all the ECG data recorded during continuous Holter monitoring, rather than just intermittent 10-second ECGs. Methods: BioQT is an automated ECG analysis system based on a Hidden Markov Model, which is trained to segment ECG signals using a database of thousands of annotated waveforms. Each sample of the ECG signal is encoded by its wavelet transform coefficients. BioQT also produces a confidence measure which can be used to identify unreliable segmentations. The automatic generation of templates based on shape descriptors allows an entire 24 hours of QT data to be rapidly reviewed by a human expert, after which the template annotations can automatically be applied to all beats in the recording. Results: The BioQT software has been used to show that drug-related perturbation of the T wave is greater in subjects receiving sotalol than in those receiving moxifloxacin. Chronological dissociation of T-wave morphology changes from the QT prolonging effect of the drug was observed with sotalol. In a definitive QT study, the percentage increase of standard deviation of QTc for the standard manual method with respect to that obtained with BioQT analysis was shown to be 44% and 30% for the placebo and moxifloxacin treatments, respectively. Conclusions: BioQT provides fully automated analysis, with confidence values for self-checking, on very large data sets such as Holter recordings. Automatic templating and expert reannotation of a small number of templates lead to a reduction in the sample size requirements for definitive QT studies.
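BioQT segments the ECG with a Hidden Markov Model over wavelet-encoded samples and attaches a confidence measure to each segmentation. As an illustration of the decoding step only, here is a minimal Viterbi decoder for a toy left-to-right HMM; the states, transition/emission probabilities, and the binary amplitude "observations" are invented for the example (BioQT's actual features are wavelet transform coefficients), and the mean log-likelihood stands in for its confidence measure.

```python
import math

# Toy parameters, not BioQT's: three waveform states and a two-symbol
# observation alphabet (0 = low amplitude, 1 = high amplitude).
STATES = ["baseline", "QRS", "T"]
TRANS = [[0.90, 0.10, 0.00],   # TRANS[i][j]: P(state j | state i)
         [0.00, 0.80, 0.20],
         [0.05, 0.00, 0.95]]
EMIT = [[0.95, 0.05],          # EMIT[i][o]: P(observation o | state i)
        [0.20, 0.80],
        [0.60, 0.40]]
START = [1.0, 0.0, 0.0]

def viterbi(obs):
    """Return (most likely state path, mean per-sample log-likelihood).

    The mean log-likelihood acts as a crude confidence measure: unusual
    waveforms decode with low likelihood and can be flagged for review.
    """
    n = len(STATES)
    score = [math.log(START[i]) + math.log(EMIT[i][obs[0]])
             if START[i] > 0 else float("-inf")
             for i in range(n)]
    back = []
    for o in obs[1:]:
        prev = score
        score, ptrs = [], []
        for j in range(n):
            cand = [(prev[i] + (math.log(TRANS[i][j]) if TRANS[i][j] > 0
                                else float("-inf")), i)
                    for i in range(n)]
            best, arg = max(cand)
            score.append(best + math.log(EMIT[j][o]))
            ptrs.append(arg)
        back.append(ptrs)
    path = [max(range(n), key=lambda j: score[j])]
    for ptrs in reversed(back):
        path.append(ptrs[path[-1]])
    path.reverse()
    return [STATES[i] for i in path], max(score) / len(obs)
```

For the observation sequence `[0, 0, 1, 1, 0, 0]` the decoder labels the high-amplitude run as QRS followed by a T-wave state, which is the per-sample state assignment a segmentation system of this kind produces.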

19 citations


Journal ArticleDOI
01 May 2009
TL;DR: This article demonstrates how to establish empirically based models of normality that are guided by engineering knowledge and utilize key features normally used by expert engineers by considering asset-specific models that adapt the threshold of alerting in accordance with the observed normal running of the plant.
Abstract: The provision of TotalCare® styled service offerings by original equipment manufacturer (OEM) suppliers of high-integrity assets is intended to provide improved levels of system availability to the operator. A key element of such service offerings is the ability to minimize unplanned equipment downtime, and the utilization of advanced diagnostic and prognostic monitoring tools is a significant component in achieving this. Monitoring methods founded on novelty detection technologies are now a well-established condition monitoring technique. This approach is particularly appropriate for monitoring high-integrity plant, where fault conditions arise with extremely low levels of probability. The approach described in this article is to establish empirically based models of normality that are guided by engineering knowledge and utilize key features normally used by expert engineers. However, rather than consider generic modelling approaches, it is proposed that application of models that adapt their sensitivity to the operation of individual assets offers greater prognostic efficiency. This article demonstrates how this can be achieved by considering asset-specific models that adapt the threshold of alerting in accordance with the observed normal running of the plant.
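The core idea above — an alerting threshold that adapts to the observed normal running of an individual asset rather than a fleet-wide limit — can be sketched with a running estimate of a monitored feature's mean and variance (Welford's online algorithm). The mean-plus-K-sigma alert level is an illustrative choice, not the article's actual model of normality.

```python
K_SIGMA = 4.0  # hypothetical alerting margin in standard deviations

class AdaptiveThreshold:
    """Asset-specific alert level learned from confirmed normal running."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations from the mean

    def update(self, x):
        """Fold one reading from confirmed normal running into the model
        (Welford's numerically stable online mean/variance update)."""
        self.n += 1
        delta = x - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (x - self.mean)

    def threshold(self):
        if self.n < 2:
            return float("inf")  # not enough evidence to alert yet
        std = (self.m2 / (self.n - 1)) ** 0.5
        return self.mean + K_SIGMA * std

    def is_alert(self, x):
        return x > self.threshold()
```

Two assets with different vibration baselines each end up with their own threshold, so a level that is routine for one engine can still raise an alert on another.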

18 citations


01 Jan 2009
TL;DR: In this paper, a probabilistic approach is taken that employs extreme value theory to determine the boundaries of normal behaviour in a principled manner, and a novel visualisation technique is presented to highlight significant spectral content that would otherwise be too low in magnitude to see in a standard plot of spectral power.
Abstract: This paper presents a principled method for detecting the ‘abnormal’ content in vibration spectra obtained from rotating machinery. We illustrate the use of the method in detecting abnormalities in jet engine vibration spectra corresponding to unforeseen engine events. We take a novelty detection approach, in which a model of normality is constructed from the typically large numbers of examples of ‘normal’ behaviour that exist when monitoring jet engines. Abnormal spectral content is then detected by comparing new vibration spectra to the model of normality. The use of novelty detection allows us to take an engine-specific approach to modelling, in which the engine under test becomes its own model rather than relying on a model that is generic to a large population of engines. A probabilistic approach is taken that employs extreme value theory to determine the boundaries of normal behaviour in a principled manner. We also describe a novel visualisation technique that highlights significant spectral content that would otherwise be too low in magnitude to see in a standard plot of spectral power. 1. Introduction Vibration spectra obtained from rotating systems (such as gas turbine engines, combustion engines or machining tools) are characterised by peaks in spectral power at the fundamental frequency of rotation, and smaller peaks at harmonics of that fundamental frequency. In jet engine terminology, these peaks are conventionally called tracked orders. Methods exist [1,2,3] for the principled analysis of information pertaining to these tracked orders, such that precursors of system failure can be identified and preventative maintenance action taken. However, many modes of failure manifest themselves as changes in vibration spectra that are not related to the energy of the tracked orders. An example of this is the failure of engine bearings, which are small ball bearings enclosed within fixed cages such that they may rotate freely.
These are used to form load-bearing contacts between the various rotating engine shafts and they maintain the position of the shafts relative to one another. Damage to the surfaces of these bearings may result in previously unobserved vibration energy at high frequencies, significantly removed from the narrow frequency bands of the tracked orders observed under normal conditions. A failure of the cages in which the bearings are mounted can result in constant peaks in spectral energy at previously unseen multiples of the fundamental tracked orders [4,5]. The latter could be described as novel tracked orders (NTOs), because they are peaks in vibration energy within narrow frequency bands and are thus tracked orders, but occur at frequencies for which tracked orders are not observed under normal conditions. This paper describes a method for identifying NTOs and other abnormal content in spectral data, allowing the identification of modes of failure that methods based on the modelling of tracked orders cannot detect. Principled methods are used for modelling the time-series of spectral data observed under normal conditions. The goal is to learn an engine-specific model of normality online in order to provide sensitive novelty detection without the need for tuning heuristic parameters. A model of normality is introduced in Section 2 and in Section 3 the principled methods for identifying which components of a vibration spectrum are significant with respect to background noise are discussed. We use these models to transform the problem into probability space in Section 4, describe how to perform novelty detection in Section 5 and present results from jet engine vibration data in Section 6. Throughout this paper, absolute values of vibration amplitude and frequency are not provided for purposes of commercial confidence, and units of measurement have been omitted from some Figures.
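The extreme-value reasoning above can be sketched in a few lines: if each spectral bin is scored against a per-bin model of normal (log-)power, then because the monitor reacts to the *maximum* score over many bins, the distribution of that maximum, F_max(t) = Phi(t)^m under independence, gives the abnormality probability in a principled way. The Gaussian per-bin model and independence assumption here are illustrative simplifications, not the paper's exact noise model.

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def novelty_score(spectrum, mean, std):
    """p-value of the largest per-bin z-score under the model of normality.

    spectrum, mean, std: equal-length lists of per-bin (log-)power values
    and per-bin statistics learned from 'normal' training spectra.
    Returns P(max z-score over m bins >= observed max | normal):
    a small value indicates abnormal spectral content.
    """
    z_max = max((s - mu) / sd for s, mu, sd in zip(spectrum, mean, std))
    m = len(spectrum)
    # Extreme value argument: F_max(t) = Phi(t)^m for m independent bins.
    return 1.0 - phi(z_max) ** m
```

Note that thresholding each bin separately at, say, 3 sigma would false-alarm constantly on a spectrum with thousands of bins; the Phi(t)^m correction is exactly what keeps the per-spectrum false alarm rate controlled.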

17 citations


Proceedings ArticleDOI
13 Nov 2009
TL;DR: Visensia, as described in this paper, is a real-time, continuous vital sign acquisition system that uses data fusion to predict patient deterioration, helping to prevent adverse events such as cardiac arrest, unscheduled admission to ICU, or death.
Abstract: Early detection of deterioration in hospital patients followed by intervention and stabilization can prevent adverse events such as a cardiac arrest, unscheduled admission to ICU, or death. Patients at step-down units of hospitals tend to have their vital signs checked by nursing staff at 4-hourly intervals. If an abnormality develops in the period between nurse observations, it is likely to lead to an adverse event (which may have been preventable). Visensia is a real-time, continuous vital sign acquisition system, using data fusion in order to predict patient deterioration. Validation trials have shown that the system successfully provides early warning of adverse events, such as cardiac arrests. We tested the system on lower acuity, ambulatory patients in a hospital ward with the vital signs being collected using telemetry. In order to optimize processing, we have developed an algorithm for deriving the respiration rate of the patient from the ECG signal.

15 citations


Proceedings ArticleDOI
30 Oct 2009
TL;DR: In this paper, an analytical approach is proposed to obtain closed-form solutions for the extreme value distributions of multivariate Gaussian distributions, with an application to vital-sign monitoring.
Abstract: Extreme Value Theory (EVT) describes the distribution of data considered extreme with respect to some generative distribution, effectively modelling the tails of that distribution. In novelty detection, we wish to determine if data are “normal” with respect to some model of normality. If that model consists of generative distributions, then EVT is appropriate for describing the behaviour of extrema generated from the model, and can be used to separate “normal” areas from “abnormal” areas of feature space in a principled manner. In a companion paper, we show that existing work in the use of EVT for novelty detection does not accurately describe the extrema of multimodal, multivariate distributions and propose a numerical method for overcoming such problems. In this paper, we introduce an analytical approach to obtain closed-form solutions for the extreme value distributions of multivariate Gaussian distributions and present an application to vital-sign monitoring.
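The paper's multivariate closed-form result is not reproduced in this abstract, but the construction it generalises is the classical univariate one, which may help fix ideas: for i.i.d. standard normal data the distribution of the maximum is exactly a power of the Gaussian CDF, and converges to a Gumbel law with known normalising constants.

```latex
% Univariate illustration of the extreme value construction
% (the paper's multivariate Gaussian result is not reproduced here).
\[
  F_{\max}(x) \;=\; \Pr\!\Big(\max_{1 \le i \le n} X_i \le x\Big)
  \;=\; \Phi(x)^n ,
  \qquad X_i \sim \mathcal{N}(0,1) \ \text{i.i.d.}
\]
As $n \to \infty$, $F_{\max}$ converges to the Gumbel form
\[
  F_{\max}(x) \;\approx\;
  \exp\!\left[-\exp\!\left(-\frac{x - c_n}{d_n}\right)\right],
  \qquad
  d_n = \frac{1}{\sqrt{2\ln n}}, \quad
  c_n = \sqrt{2\ln n} \;-\; \frac{\ln\ln n + \ln 4\pi}{2\sqrt{2\ln n}} .
\]
```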

14 citations


01 Jan 2009
TL;DR: Analysis of data obtained from in-service engines can identify engine deterioration and guide preventative maintenance; such techniques can also identify precursors of engine events to avoid loss of engine service.
Abstract: Current practice in the operation and maintenance of an aircraft fleet requires analysis of data obtained from in-service engines in order to identify engine deterioration and provide preventative maintenance. Typically large quantities of engine vibration and performance data are available from various engine-mounted sensors. The analysis of such data requires techniques for modelling these multivariate data allowing fleet specialists to establish profiles of engine behaviour under different operating conditions. Additionally, such techniques can be used to identify precursors of engine events to avoid loss of engine service.

14 citations


Proceedings ArticleDOI
01 Dec 2009
TL;DR: The spectral fusion technique is found to correctly estimate respiratory rate 90% of the time for non-ambulatory data and 86% of the time for ambulatory data, with root mean square errors of 0.92 and 1.40 breaths per minute, respectively.
Abstract: A new method for extracting respiratory signals from the electrocardiogram (ECG) is proposed. The method performs AR spectral analysis on heart rate variability and beat morphology information extracted from the ECG and identifies the closest matched frequencies which then provide an estimate of the respiration frequency. Fusing frequency information from different sources reliably rejects noise and movement-induced artefact and is promising for application to ambulatory hospital data. The performance of the method was validated on two databases of simultaneously recorded ECG and reference respiration signals. The spectral fusion technique is found to correctly estimate respiratory rate 90% of the time in the case of non-ambulatory data and 86% of the time in the case of ambulatory data with a root mean square error of 0.92 and 1.40 breaths per minute, respectively.
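The fusion step described above — identify the closest matched frequencies across the candidate spectral peaks from the two ECG-derived sources, and reject the estimate when they disagree — can be sketched as follows. The AR spectral estimation itself is omitted, and the 0.1 Hz agreement tolerance is a hypothetical choice rather than the paper's parameter.

```python
TOLERANCE_HZ = 0.1  # hypothetical agreement tolerance between sources

def fuse_respiratory_peaks(peaks_a, peaks_b):
    """Fuse candidate respiratory peaks from two ECG-derived spectra.

    peaks_a, peaks_b: candidate peak frequencies (Hz), e.g. from the
    heart rate variability spectrum and the beat morphology spectrum.
    Returns the fused respiratory rate in breaths per minute, or None
    when the sources disagree (e.g. one corrupted by movement artefact).
    """
    best = None
    for fa in peaks_a:
        for fb in peaks_b:
            gap = abs(fa - fb)
            if best is None or gap < best[0]:
                best = (gap, (fa + fb) / 2.0)
    if best is None or best[0] > TOLERANCE_HZ:
        return None  # no agreement between sources: reject the estimate
    return best[1] * 60.0  # Hz -> breaths per minute
```

Requiring agreement between independently derived spectra is what gives the method its robustness: noise tends to produce spurious peaks in one source but not at the same frequency in both.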

14 citations


Proceedings ArticleDOI
30 Oct 2009
TL;DR: In this paper, the authors use Extreme Value Theory (EVT) to describe the distribution of data considered extreme with respect to some generative distribution, effectively modelling the tails of that distribution.
Abstract: Extreme Value Theory (EVT) describes the distribution of data considered extreme with respect to some generative distribution, effectively modelling the tails of that distribution. In novelty detection, or one-class classification, we wish to determine if data are “normal” with respect to some model of normality. If that model consists of generative distributions, then EVT is appropriate for describing the behaviour of extremes generated from the model, and can be used to determine the location of decision boundaries that separate “normal” areas of data space from “abnormal” areas in a principled manner. This paper introduces existing work in the use of EVT for novelty detection, shows that existing work does not accurately describe the extrema of multivariate, multimodal generative distributions, and proposes a novel method for overcoming such problems. The method is numerical, and provides optimal solutions for generative multivariate, multimodal distributions of arbitrary complexity. In a companion paper, we present analytical closed-form solutions which are currently limited to unimodal, multivariate generative distributions.
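The paper's point is that closed forms for unimodal Gaussians do not transfer to multimodal generative models, so the extreme value distribution must be obtained numerically. A Monte Carlo sketch of that idea for a bimodal Gaussian mixture follows; the mixture parameters, the |x| novelty statistic, and the 1% level are all illustrative, and this is a brute-force simulation rather than the paper's proposed numerical method.

```python
import math
import random

random.seed(0)  # deterministic sketch

def sample_mixture():
    """One draw from the bimodal model 0.5*N(-2, 0.5^2) + 0.5*N(+2, 0.5^2)."""
    mu = -2.0 if random.random() < 0.5 else 2.0
    return random.gauss(mu, 0.5)

def extreme_threshold(n, trials=2000, quantile=0.99):
    """Monte Carlo estimate of the extreme value threshold.

    Simulates `trials` batches of n normal observations, records the
    maximum |x| in each batch, and returns the `quantile` of those maxima:
    values beyond it are 'abnormal' at the (1 - quantile) level for a
    batch of n observations from the normal model.
    """
    maxima = sorted(max(abs(sample_mixture()) for _ in range(n))
                    for _ in range(trials))
    return maxima[int(quantile * trials)]
```

A single-Gaussian closed form fitted to this mixture would badly misplace the boundary, since the probability mass sits around ±2 rather than the origin; the simulated extreme value distribution accounts for the bimodality automatically.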

01 Jan 2009
TL;DR: A clinical trial currently being undertaken in which post-operative cancer patients are monitored in bed for the first day of their recovery period, and then monitored using telemetry for the remainder of their stay in hospital, during which they may be ambulatory.
Abstract: Large numbers of preventable deaths occur in hospitals each year, due to adverse events such as cardiac arrest and unplanned admission into Intensive Care Units (ICUs) from other hospital wards. The majority of these patients exhibit physiological deterioration in their vital signs prior to onset of the adverse event, which can be detected by condition monitoring. This paper describes a multivariate, multimodal approach to condition monitoring that may be performed in real-time, which has been previously shown to provide early warning of patient deterioration, while generating a small number of false alarms. We describe a clinical trial currently being undertaken in which post-operative cancer patients are monitored in bed for the first day of their recovery period, and then monitored using telemetry for the remainder of their stay in hospital, during which they may be ambulatory. The use of such telemetry requires monitoring techniques that are robust in the presence of signal artefact introduced by patient movement. We also motivate the use of a patient-specific approach to condition monitoring for improved identification of physiological deterioration.

Proceedings ArticleDOI
06 Oct 2009
TL;DR: In this article, the authors describe two principled methods of setting a decision boundary based on extreme value statistics: (i) a numerical method that produces an "optimal" solution, and (ii) an analytical approximation in closed form.
Abstract: Novelty detection, one-class classification, or outlier detection, is typically employed for analysing signals when few examples of “abnormal” data are available, such that a multiclass approach cannot be taken. Multivariate, multimodal density estimation can be used to construct a model of the distribution of normal data. However, setting a decision boundary such that test data can be classified “normal” or “abnormal” with respect to the model of normality is typically performed using heuristic methods, such as thresholding the unconditional data density, p(x). This paper describes two principled methods of setting a decision boundary based on extreme value statistics: (i) a numerical method that produces an “optimal” solution, and (ii) an analytical approximation in closed form. We compare the performance of both approaches using large datasets from biomedical patient monitoring and jet engine health monitoring, and conclude that the analytical approach performs novelty detection as successfully as the “optimal” numerical approach, both of which outperform the conventional method.
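Why the heuristic density threshold is outperformed can be seen in one dimension: with a fixed boundary, the chance that at least one of n genuinely normal observations crosses it grows with n, so the false alarm rate depends on how long you watch. Extreme value statistics instead choose the boundary so that P(any of n normal samples crosses it) equals a target rate. The sketch below is a univariate standard-normal illustration of that principle, not the paper's multivariate, multimodal method.

```python
import math

def phi(z):
    """Standard normal CDF."""
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def false_alarm_rate(z, n):
    """P(any of n i.i.d. N(0,1) samples falls outside [-z, z])."""
    p_inside = phi(z) - phi(-z)
    return 1.0 - p_inside ** n

def evt_boundary(n, alpha=0.01, lo=0.0, hi=10.0):
    """Bisect for the boundary z whose n-sample false alarm rate is alpha.

    false_alarm_rate is monotone decreasing in z, so simple bisection
    maintains rate(lo) > alpha >= rate(hi) and converges to the target.
    """
    for _ in range(60):
        mid = (lo + hi) / 2.0
        if false_alarm_rate(mid, n) > alpha:
            lo = mid
        else:
            hi = mid
    return hi
```

For a single observation the 1% boundary is the familiar two-sided critical value 2.576; for a run of 100 observations the principled boundary moves out to roughly 3.9, which is exactly the widening a fixed density threshold fails to provide.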


Patent
09 Oct 2009
TL;DR: In this paper, multiple measured parameters are reduced to a single dimensional value based on the distance between the current state and normal states of the system, using a Parzen Windows probability function.
Abstract: A method of obtaining a consistent evaluation of the state of a system that has been monitored by measurement of multiple parameters of that system. The multiple parameters are used to calculate a single dimensional value based on the distance between the current state and normal states of the system using a Parzen Windows probability function. Consistent single dimensional values regardless of the dimensionality of the original data set can be obtained by finding a relationship between the single dimensional value and the probability of status of the system. Different relationships are obtained for different dimensionalities of data sets. Sensor malfunction can also be detected by testing the probability of the state implied by measuring all of the available parameters against the probability of the state implied by ignoring different individual ones of the parameters. A significant disparity in the two probabilities indicates possible sensor malfunction.
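The patent's two ideas — collapse a multi-parameter state to a single value via a Parzen windows (Gaussian kernel) density estimate over stored normal states, and flag a possible sensor fault when ignoring one parameter makes the state look much more normal — can be sketched as below. The kernel width and the disparity ratio are illustrative choices, and comparing densities across dimensionalities this directly glosses over the patent's dimensionality-dependent calibration step.

```python
import math

SIGMA = 1.0              # hypothetical Parzen kernel width
DISPARITY_RATIO = 100.0  # hypothetical "significant disparity" factor

def parzen_density(x, normal_states, dims=None):
    """Parzen windows (Gaussian kernel) estimate of p(x) from normal states.

    `dims` restricts the evaluation to a subset of parameter indices,
    which is how we 'ignore' a suspect sensor below.
    """
    dims = list(range(len(x)) if dims is None else dims)
    d = len(dims)
    norm = (2.0 * math.pi * SIGMA ** 2) ** (d / 2.0)
    total = 0.0
    for s in normal_states:
        sq = sum((x[i] - s[i]) ** 2 for i in dims)
        total += math.exp(-sq / (2.0 * SIGMA ** 2)) / norm
    return total / len(normal_states)

def suspect_sensors(x, normal_states):
    """Indices whose exclusion raises the estimated density by a large factor."""
    p_all = parzen_density(x, normal_states)
    suspects = []
    for i in range(len(x)):
        dims = [j for j in range(len(x)) if j != i]
        if parzen_density(x, normal_states, dims) > DISPARITY_RATIO * p_all:
            suspects.append(i)
    return suspects
```

With normal states clustered near the origin, a reading like (0.0, 0.0, 8.0) is wildly improbable under the full model but perfectly normal once the third parameter is ignored, so that sensor is flagged.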
