Automatic Detection of Heartbeats in Heart Sound Signals Using Deep Convolutional Neural Networks
DOI:
DOI:
https://doi.org/10.5755/j01.eie.25.3.23680

Keywords:
Artificial neural networks, Machine learning, Segmentation

Abstract
The analysis of non-stationary signals commonly includes a segmentation step that divides the signal into smaller time series, which can be regarded as stationary and are therefore easier to process. Most segmentation methods rely on complex filtering, transformation, and feature-extraction techniques combined with various classifiers; in the field of biomedical signals in particular, these methods perform poorly and tend to fail on signals acquired in highly variable environments. To address these problems, we designed a new method for segmenting heart sound signals with deep convolutional neural networks that works in a straightforward, automatic manner and requires no complex pre-processing. The proposed method was evaluated on a set of heartbeat sound clips collected by non-experts using mobile devices in highly variable environments with excessive background noise. The results show that the proposed method outperforms methods that exploit domain knowledge for signal analysis. Based on these encouraging experimental results, we believe the proposed method is a solid basis for the further development of the automatic segmentation of highly variable signals using deep neural networks.
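To illustrate the general idea of convolutional segmentation described in the abstract, the sketch below builds a tiny two-layer 1-D convolutional model in plain NumPy that maps a raw sound signal to per-frame heartbeat probabilities. The architecture, kernel sizes, strides, and the class name `TinyBeatSegmenter` are all illustrative assumptions for this page, not the authors' actual network; a trained model of this kind would threshold the output probabilities to mark heartbeat locations.

```python
import numpy as np

def conv1d(x, w, b, stride=1):
    """Valid 1-D convolution: x is (L, C_in), w is (K, C_in, C_out), b is (C_out)."""
    K, C_in, C_out = w.shape
    L_out = (len(x) - K) // stride + 1
    y = np.empty((L_out, C_out))
    for i in range(L_out):
        patch = x[i * stride : i * stride + K]                  # (K, C_in) window
        y[i] = np.tensordot(patch, w, axes=([0, 1], [0, 1])) + b
    return y

def relu(x):
    return np.maximum(x, 0.0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class TinyBeatSegmenter:
    """Illustrative two-layer 1-D CNN: raw signal in, per-frame heartbeat probability out.

    Not the paper's architecture -- a minimal sketch with randomly initialised weights.
    """
    def __init__(self, rng):
        self.w1 = rng.standard_normal((15, 1, 8)) * 0.1   # kernel 15, 8 filters
        self.b1 = np.zeros(8)
        self.w2 = rng.standard_normal((9, 8, 1)) * 0.1    # kernel 9, 1 output map
        self.b2 = np.zeros(1)

    def forward(self, signal):
        x = signal[:, None]                               # (L, 1): mono audio
        h = relu(conv1d(x, self.w1, self.b1, stride=2))   # stride 2 downsamples in time
        return sigmoid(conv1d(h, self.w2, self.b2))[:, 0] # probability per output frame
```

A usage example: feeding a 400-sample signal through the model yields one probability per downsampled frame; no hand-crafted filtering or feature extraction is applied to the input, mirroring the "no complex pre-processing" claim of the abstract.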
License
The copyright for the paper in this journal is retained by the author(s) with the first publication right granted to the journal. The authors agree to the Creative Commons Attribution 4.0 (CC BY 4.0) agreement under which the paper in the Journal is licensed.
By virtue of their appearance in this open access journal, papers are free to use in educational and other non-commercial settings with proper attribution and an acknowledgement of the initial publication in the journal.
Funding data

Javna Agencija za Raziskovalno Dejavnost RS (Slovenian Research Agency)
Grant number: P2-0057