ECG Data Compression for Telecardiology
Abstract
The electrical signal generated by the heart and acquired from the body surface is known as
the electrocardiogram (ECG). It is used to assess the condition of the heart and to diagnose
malfunction at an early stage, so that corrective action can be taken before any major
irreversible failure occurs. Continuous ECG recording is required for critical cardiac
patients, persons under cardiac surveillance, ambulatory patients, and for the creation of
ECG databases. The recorded data becomes so voluminous that it is practically impossible to
handle without compression. The importance of data compression is further increased by the
fact that the number of cardiac patients is rising all over the world while the number of
cardiologists available to provide the required healthcare is not keeping pace, especially in
remote and rural areas. One way to overcome this problem is to transmit the ECG, along with
other vital statistics of a patient, over the internet to a cardiologist for expert advice.
For one day’s continuous recording, the amount of multichannel ECG data exceeds several
gigabytes. Moreover, if this data is to be transmitted over a telephone line or a slow digital
communication network, the transmission time goes beyond the limits of human patience.
Compressing the data is the only practical solution. The main goal of any compression
technique is to achieve maximum reduction in data volume while preserving the significant
signal morphology on reconstruction.
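The trade-off between data-volume reduction and morphology preservation can be illustrated with a minimal transform-coding sketch. This is a single-level Haar wavelet with hard thresholding of small detail coefficients, not the SPIHT coder used in this work; the sample values and the threshold are arbitrary illustrative choices:

```python
def haar_forward(x):
    """One-level Haar transform: approximation and detail coefficients."""
    a = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    d = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(len(x) // 2)]
    return a, d

def haar_inverse(a, d):
    """Exact inverse of haar_forward: x0 = a + d, x1 = a - d."""
    x = []
    for ai, di in zip(a, d):
        x.extend([ai + di, ai - di])
    return x

# Toy "ECG-like" samples (illustrative only, not real ECG data).
x = [0.1, 0.2, 1.5, -0.7, 0.05, 0.1, 0.0, -0.05]
a, d = haar_forward(x)
# Discarding detail coefficients below a threshold leaves fewer nonzero
# values to encode -- the source of the volume reduction -- at the cost
# of a small reconstruction error.
d_q = [di if abs(di) > 0.1 else 0.0 for di in d]
recon = haar_inverse(a, d_q)
```

Wavelet coders such as SPIHT exploit the same idea, but order and transmit the coefficients bit-plane by bit-plane rather than applying a single fixed threshold.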
Our work begins with a literature survey of the techniques used for ECG signal compression,
and identifies the wavelet-based method of Set Partitioning In Hierarchical Trees
(SPIHT) as superior to the other techniques reported so far. We propose two additional
steps in the SPIHT algorithm, namely “blank-fire removal” and “polishing”. These
additional steps increase the compression ratio and reduce the percentage root-mean-square
difference (PRD), while retaining all features of the existing SPIHT algorithm. The
performance of the existing SPIHT algorithm has been compared with that of the modified
algorithm on the same database, i.e. ECG signals from the MIT-BIH Arrhythmia Database
(mitdb, sampling rate 360 samples per second), with the same wavelet filters and the same
distortion measure. Out of a total of 84 signals tested, 59 (70.2%) showed an improvement in
compression ratio in the range of 3.24%–20.22%, averaging 7.95%, while the remaining
25 signals (29.8%) maintained the same compression ratio as the existing SPIHT algorithm.
Not a single case showed deterioration in compression ratio. Improvement in distortion was
found in all 84 signals (100%), ranging from 3.26% to 19.23% and averaging a 9.99%
reduction in PRD.
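The two figures of merit quoted above are conventionally defined as the compression ratio, CR = (bits in the original) / (bits in the compressed stream), and the percentage root-mean-square difference, PRD = 100·√(Σ(xᵢ − x̂ᵢ)² / Σxᵢ²). A minimal sketch of both (the numeric inputs below are illustrative, not results from this work):

```python
import math

def compression_ratio(original_bits, compressed_bits):
    """CR: how many times smaller the compressed stream is."""
    return original_bits / compressed_bits

def prd(orig, recon):
    """Percentage root-mean-square difference between the original
    and reconstructed signals."""
    num = sum((o - r) ** 2 for o, r in zip(orig, recon))
    den = sum(o ** 2 for o in orig)
    return 100.0 * math.sqrt(num / den)

# Illustrative values only: one hour of one mitdb channel at 360 Hz
# with 11-bit samples, against an assumed 2 Mbit compressed size.
cr = compression_ratio(11 * 3600 * 360, 2_000_000)
err = prd([1.0, 2.0, 3.0], [1.0, 2.1, 2.9])
```

A higher CR means stronger compression; a lower PRD means a more faithful reconstruction, which is why an improvement appears as an increase in the former and a decrease in the latter.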
To handle lengthy ECG records, a set of executable files has been developed in a C++
environment. Data downloaded from the Long-Term ST Database (ltstdb) has been used to
test these executables. Some peculiar programming problems were encountered while
encoding the compressed bit streams as ASCII characters. The ASCII character with decimal
value 13 is used as part of the end-of-line mark in MS-DOS, Windows, and various network
standards, while the characters with decimal values 26 and 255 are used to mark
end-of-text and end-of-file (EOF) respectively; all three disrupt the normal file-reading
process. These characters are identified in the bit stream itself and are subsequently
modified to circumvent the problem. The concepts of “fragmentation” and “looping” are used
to handle long records.
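The thesis does not spell out the exact substitution scheme here, but the general technique for keeping such reserved byte values out of a stream is byte stuffing: each problematic byte is replaced by an escape marker followed by a transformed copy, and the decoder reverses the substitution. A sketch under those assumptions (the escape byte 0x7D and the XOR-0x20 transform are illustrative conventions, not the thesis's actual choices):

```python
ESC = 0x7D                 # assumed escape marker, itself also stuffed
SPECIAL = {13, 26, 255}    # CR, end-of-text, EOF -- the values cited above

def stuff(data: bytes) -> bytes:
    """Replace each reserved byte with ESC followed by (byte XOR 0x20),
    so the output never contains 13, 26, or 255."""
    out = bytearray()
    for b in data:
        if b in SPECIAL or b == ESC:
            out += bytes([ESC, b ^ 0x20])
        else:
            out.append(b)
    return bytes(out)

def unstuff(data: bytes) -> bytes:
    """Exact inverse of stuff()."""
    out = bytearray()
    it = iter(data)
    for b in it:
        out.append(next(it) ^ 0x20 if b == ESC else b)
    return bytes(out)
```

The escape byte must be stuffed as well, otherwise a literal 0x7D in the data would be misread as a marker during decoding.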
The number of refinement passes determines the extent to which a signal can be safely
compressed using the SPIHT algorithm. With every refinement pass, the distortion in the
reconstructed signal decreases, but the compression ratio also decreases. A criterion for
the number of refinement passes is therefore required to achieve an optimal compression
ratio while retaining all clinically significant morphological features. A study carried
out to find this number revealed that, for the same number of refinement passes, different
signals give vastly different PRD values; recommending a fixed number of refinement passes
is therefore not possible. However, if the presence of “blank-fire” in the compressed
stream is taken into account, a criterion can be evolved. We recommend five refinement
passes in the absence of “blank-fire”, and six in its presence. This has resulted in
greatly improved consistency in the distortion. The proposed refinement criterion has been
validated by visual inspection of the signals by a physician, as well as by statistical
analysis over a larger database. Applying the improved SPIHT algorithm with the proposed
refinement criterion to 42 sets of two-lead ECG signals, the original and reconstructed
signals were presented to a physician in random order, in sets of 12–15 pairs. The
physician’s interpretations were the same for all 42 reconstructed signals (100%) as for
the originals.
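The proposed criterion reduces to a simple rule, sketched below; detecting blank-fire itself is part of the modified coder and is not reproduced here:

```python
def refinement_passes(blank_fire_present: bool) -> int:
    """Recommended number of SPIHT refinement passes per the proposed
    criterion: five without blank-fire, six when it is present (the
    extra pass compensates for the bits recovered by blank-fire removal)."""
    return 6 if blank_fire_present else 5
```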
We have also proposed a new root-mean-square distortion measure called the Dynamically
Derived Percentage Root-mean-square Difference (DDPRD), whose numerical value is
proportional to the distortion measured by the existing parameters. If, subsequent to
reconstruction, the original signal is scaled by any factor, its baseline is shifted in
either direction, or a baseline wander with zero dc value is introduced into it, the
distortion value reported by DDPRD remains unaffected. While all existing parameters fail
to give the same numerical value when the ECG test signal is subjected to scaling, baseline
shift, or baseline wander, DDPRD is immune to these changes in the original signal.
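The exact derivation of DDPRD is given in the thesis; the sensitivity it corrects can be illustrated with the conventional PRD and its well-known mean-removed variant, which is immune to a pure baseline shift but, unlike DDPRD, still not to scaling or wander (signals below are toy values for illustration):

```python
import math

def prd(orig, recon):
    """Conventional PRD: sensitive to any baseline shift."""
    num = sum((o - r) ** 2 for o, r in zip(orig, recon))
    den = sum(o ** 2 for o in orig)
    return 100.0 * math.sqrt(num / den)

def prd_mean_removed(orig, recon):
    """PRD computed after subtracting each signal's mean, so a constant
    baseline shift of either signal leaves the value unchanged."""
    mo = sum(orig) / len(orig)
    mr = sum(recon) / len(recon)
    num = sum(((o - mo) - (r - mr)) ** 2 for o, r in zip(orig, recon))
    den = sum((o - mo) ** 2 for o in orig)
    return 100.0 * math.sqrt(num / den)

x = [0.0, 1.0, 0.5, -0.5]           # toy "original"
y = [0.0, 0.9, 0.6, -0.5]           # toy "reconstruction"
y_shift = [v + 2.0 for v in y]      # same reconstruction, baseline shifted
# prd(x, y) differs from prd(x, y_shift); the mean-removed value does not.
```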
Taking valuable suggestions from a panel of reputed physicians, a user-friendly,
interactive, and dedicated telecardiology website has been designed. It uses HTML for the
front-end GUI, MySQL for maintaining the related database, and PHP as the interface
programming language. Password-protected login to the website is available for three roles:
manager, client, and expert. The site has built-in security features against unauthorized
intrusion. Jobs are prioritized according to the client’s requirement, and the transfer of
credits from the client’s account to the expert’s account depends on the priority, i.e.
higher-priority jobs cost more. Ensuring the online availability of experts and handling
fiscal transactions from clients and to experts are the responsibility of the manager, who
also registers new clients and experts after verifying their credentials. The website has
been designed keeping in mind the practical requirements and convenience of both client and
expert, with the objective of bridging the gap between them. Round-the-clock availability
of authentic experts in the required area of specialization, such as cardiac medicine and
cardiothoracic surgery, the facility to compress and decompress long records, job
prioritization instead of a single general queue, enhanced security features, safe fiscal
transactions through a manager, and a platform for settling disputes, if any, are features
of the developed website that are not available on general chat sites.
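The job-prioritization behaviour described above can be sketched with a priority queue; the role names, priority levels, and credit amounts here are assumptions for illustration, not the site's actual schema:

```python
import heapq
import itertools

class JobQueue:
    """Jobs with a higher priority are served first; ties go to the
    earlier submission (FIFO within a priority level)."""

    def __init__(self):
        self._heap = []
        self._counter = itertools.count()  # tie-breaker for equal priorities

    def submit(self, client, priority, credits):
        # Negate the priority because heapq implements a min-heap.
        heapq.heappush(self._heap,
                       (-priority, next(self._counter), client, credits))

    def next_job(self):
        """Pop the highest-priority pending job for the next free expert."""
        _, _, client, credits = heapq.heappop(self._heap)
        return client, credits

q = JobQueue()
q.submit("client_a", priority=1, credits=10)  # routine consultation
q.submit("client_b", priority=3, credits=30)  # urgent: higher credit transfer
```

The higher credit amount on the urgent job mirrors the "more for high-priority jobs" rule; the urgent job is also dispatched first even though it was submitted later.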
It may be concluded that the present work contributes significantly to the area of ECG
data compression for telecardiology, offering an opportunity to deliver better healthcare
services to the entire population of the world.
Description
Doctor of Philosophy
