
Showing papers by "Fondazione Bruno Kessler" published in 1998


Journal ArticleDOI
TL;DR: Three automatic approaches to ventricular repolarisation duration measurement are compared on computer-generated and real ECG signals, in relation to their reliability in the presence of the most common electrocardiographic artefacts, to assess the amount of real and artefactual variability.
Abstract: Three automatic approaches to ventricular repolarisation duration measurement (R-Tapex, R-T(end threshold) and R-T(end fitting) methods) are compared on computer-generated and real ECG signals, in relation to their reliability in the presence of the most common electrocardiographic artefacts (i.e. additive broadband noise and additive and multiplicative periodical disturbances). Simulations permit the evaluation of the amount of R-T beat-to-beat variability induced by the artefacts. The R-T(end threshold) method performs better than the R-T(end fitting) one, and, hence, the latter should be used with caution when R-T(end) variability is addressed. Whereas the R-Tapex method is more robust with regard to broadband noise than the R-T(end threshold) one, the reverse situation is observed in the presence of periodical amplitude modulations. A high level of broadband noise does not prevent the detection of the central frequency of underlying R-T periodical changes. Comparison between the power spectra of the beat-to-beat R-T variability series obtained from three orthogonal ECG leads (X, Y, Z) is used to assess the amount of real and artefactual variability in 13 normal subjects at rest. The R-Tapex series displays rhythms at high frequency (HF) with a percentage power on the Z lead (57.1 +/- 4.9) greater than that on the X and Y leads (41.9 +/- 4.6 and 46.1 +/- 4.9, respectively), probably because of respiratory-related artefacts affecting the Z lead more markedly. More uniform HF power distributions over the X, Y, Z leads are observed in the R-T(end threshold) series (31.8 +/- 3.8, 39.2 +/- 4.1 and 35.1 +/- 4.2, respectively), thus suggesting a lower sensitivity of the R-T(end threshold) measure to respiratory-related artefacts.
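The threshold approach to T-end detection described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name, the search window, and the threshold fraction `thr_frac` are all hypothetical choices, and a real detector would also handle baseline wander and biphasic T waves.

```python
import numpy as np

def rt_end_threshold(ecg, r_idx, fs, thr_frac=0.3):
    """Estimate the R-T(end) interval with a threshold method (illustrative).

    After the T-wave apex, T-end is taken as the first sample where the
    absolute slope of the descending T limb drops below thr_frac times
    its maximum absolute slope.  thr_frac is a hypothetical parameter,
    not a value from the paper.
    """
    # Search window for the T wave: roughly 100-450 ms after the R peak.
    lo, hi = r_idx + int(0.10 * fs), r_idx + int(0.45 * fs)
    seg = ecg[lo:hi]
    t_apex = lo + int(np.argmax(np.abs(seg)))      # T-wave apex
    tail = ecg[t_apex:hi]
    slope = np.abs(np.diff(tail))                  # slopes of descending limb
    thr = thr_frac * slope.max()
    steepest = int(np.argmax(slope))               # steepest point of the limb
    below = np.nonzero(slope < thr)[0]
    after = below[below > steepest]                # first sub-threshold slope
    t_end = t_apex + (int(after[0]) if after.size else len(slope))
    return (t_end - r_idx) / fs                    # R-T(end) in seconds

# Synthetic beat: a Gaussian T wave peaking 300 ms after an R peak at sample 0
# (the QRS complex is omitted for brevity).
fs = 1000
t = np.arange(int(0.6 * fs)) / fs
ecg = np.exp(-((t - 0.30) ** 2) / (2 * 0.04 ** 2))
rt = rt_end_threshold(ecg, r_idx=0, fs=fs)
print(round(rt, 3))   # a value somewhat past the apex, i.e. > 0.3 s
```

The fitting variant would instead fit a template to the whole descending limb; as the abstract notes, the threshold variant proved the more reliable of the two for beat-to-beat R-T(end) variability.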

86 citations


Proceedings Article
01 Jan 1998
TL;DR: In this paper, the authors tackle the lack of a comprehensive description of DFOL by providing a systematic account of a completely revised and extended version of the logic, together with a sound and complete axiomatisation of a general form of bridge rules based on Natural Deduction.
Abstract: Distributed First Order Logic (DFOL) has been introduced more than ten years ago with the purpose of formalising distributed knowledge-based systems, where knowledge about heterogeneous domains is scattered into a set of interconnected modules. DFOL formalises the knowledge contained in each module by means of first-order theories, and the interconnections between modules by means of special inference rules called bridge rules. Despite their restricted form in the original DFOL formulation, bridge rules have influenced several works in the areas of heterogeneous knowledge integration, modular knowledge representation, and schema/ontology matching. This, in turn, has fostered extensions and modifications of the original DFOL that have never been systematically described and published. This paper tackles the lack of a comprehensive description of DFOL by providing a systematic account of a completely revised and extended version of the logic, together with a sound and complete axiomatisation of a general form of bridge rules based on Natural Deduction. The resulting DFOL framework is then proposed as a clear formal tool for the representation of and reasoning about distributed knowledge and bridge rules.
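The idea of bridge rules interconnecting modules can be illustrated with a toy propagation loop. This is a deliberate simplification under assumed names: DFOL bridge rules relate first-order formulae across modules and come with a Natural Deduction calculus, whereas the sketch below treats modules as plain sets of propositional facts and a rule `(i, phi, j, psi)` as "if module i has phi, module j may conclude psi".

```python
def saturate(modules, bridge_rules):
    """Apply bridge rules until no module gains a new fact (a naive fixpoint).

    modules: dict mapping module name -> set of facts (plain strings here).
    bridge_rules: list of (src_module, premise, dst_module, conclusion).
    This is an illustrative simplification of DFOL, not its semantics.
    """
    changed = True
    while changed:
        changed = False
        for src, phi, dst, psi in bridge_rules:
            if phi in modules[src] and psi not in modules[dst]:
                modules[dst].add(psi)
                changed = True
    return modules

# Two interconnected modules with heterogeneous vocabularies (invented names).
modules = {"weather": {"raining"}, "traffic": set()}
rules = [
    ("weather", "raining", "traffic", "slow_roads"),
    ("traffic", "slow_roads", "traffic", "expect_delays"),
]
result = saturate(modules, rules)
print(sorted(result["traffic"]))
```

The inner loop mirrors what a bridge rule licenses: a conclusion in one module justified by a premise established in another, which is exactly the kind of cross-module inference the paper axiomatises in its Natural Deduction setting.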

6 citations