Measuring Heart Rate With an RGB Camera For Real-Time General Health Monitoring

Bachelor Thesis (2025)
Author(s)

V. Pechi (TU Delft - Electrical Engineering, Mathematics and Computer Science)

Contributor(s)

Jorge Martinez – Mentor (TU Delft - Multimedia Computing)

Kianoush Rassels – Mentor (TU Delft - Bio-Electronics)

Christoph Lofi – Graduation committee member (TU Delft - Web Information Systems)

Faculty
Electrical Engineering, Mathematics and Computer Science
Publication Year
2025
Language
English
Graduation Date
31-01-2025
Awarding Institution
Delft University of Technology
Project
CSE3000 Research Project; Measuring Heart and Respiratory Rate with a Camera
Programme
Computer Science and Engineering
Reuse Rights

Other than for strictly personal use, it is not permitted to download, forward or distribute the text or part of it, without the consent of the author(s) and/or copyright holder(s), unless the work is under an open content license such as Creative Commons.

Abstract

Heart rate (HR) is a critical indicator of an individual’s health and a key metric for detecting potential cardiac issues. This paper explores a method for real-time heart rate measurement from RGB camera footage, aimed at general health monitoring. The proposed method uses a convolutional neural network (CNN) to generate a 3D mesh of the subject’s facial features. The movement of the mesh points over time is used to compute a signal that captures the small pulsatile movements corresponding to the mechanical motion of blood being pumped through the arteries. This signal is filtered, and motion sources are separated using principal component analysis (PCA). The most periodic component is assumed to correspond to the heartbeat, and its dominant frequency is used to estimate the heart rate. The proposed method is tested on the ECG-Fitness dataset, which features challenging environmental conditions such as significant subject motion and dim lighting. Experimental results demonstrate the method’s suitability for real-time applications, though further enhancements are needed to improve robustness under difficult environmental conditions.
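The pipeline described in the abstract (landmark trajectories → band-pass filter → PCA → periodicity-based component selection → spectral HR estimate) can be sketched as follows. This is a minimal illustration, not the thesis's actual code: it assumes the facial-mesh trajectories have already been extracted (e.g. one vertical position per mesh point per frame), and `estimate_hr` is a hypothetical helper name. The HR band limits and the spectral-peakedness score are illustrative assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def estimate_hr(traces, fs):
    """Estimate heart rate (bpm) from facial-mesh point trajectories.

    traces: array of shape (T, N) with the vertical position of N mesh
            points over T video frames; fs: frame rate in Hz.
    """
    # Band-pass to a plausible HR band (0.75-4 Hz, i.e. 45-240 bpm).
    b, a = butter(3, [0.75, 4.0], btype="band", fs=fs)
    filtered = filtfilt(b, a, traces - traces.mean(axis=0), axis=0)

    # PCA via SVD to separate motion sources (pulse vs. other movement).
    _, _, vt = np.linalg.svd(filtered, full_matrices=False)
    comps = filtered @ vt.T  # projections onto the principal axes

    # Score each component by spectral peakedness as a periodicity proxy:
    # a near-periodic signal concentrates its power in one frequency bin.
    spectra = np.abs(np.fft.rfft(comps, axis=0)) ** 2
    freqs = np.fft.rfftfreq(comps.shape[0], d=1.0 / fs)
    scores = spectra.max(axis=0) / spectra.sum(axis=0)

    # Take the most periodic component; its dominant frequency gives the HR.
    best = comps[:, np.argmax(scores)]
    peak_hz = freqs[np.argmax(np.abs(np.fft.rfft(best)) ** 2)]
    return peak_hz * 60.0
```

On synthetic trajectories containing a shared 1.2 Hz oscillation plus noise, such a sketch recovers a rate near 72 bpm; real footage adds head motion and lighting changes that the thesis addresses with the full pipeline.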

Files

Main.py.pdf
(pdf | 0.159 MB)
License info not available