Understanding rPPG

03/01/2025

How cameras can estimate your vital signs, explained simply.

Introduction to rPPG

Remote Photoplethysmography (rPPG) is an exciting technology that allows us to measure vital signs like heart rate and respiratory rate using just a camera. But how is this possible? In this article, we will explore the fundamentals of rPPG, from its biological basis to modern neural network-based methods. By the end, you'll appreciate the science and engineering that make this groundbreaking method accessible.

At Rouast Labs, our offerings VitalLens and VitalLens API provide access to this powerful technology. Whether you're an individual looking to monitor your wellness or a developer integrating health insights into your applications, both tools are designed to be accessible, user-friendly, and reliable for a variety of use cases.

The Biological Basis of rPPG

At the core of rPPG lies the fascinating interplay between biology and light. The human cardiovascular system continuously pumps blood throughout the body, and with each heartbeat, the blood volume in the skin changes slightly. These changes influence how light interacts with the skin, giving rise to a unique opportunity for remote sensing [1].

When light shines on the skin, two primary phenomena occur:

1. Absorption: Blood absorbs more light than surrounding tissues, leading to measurable variations in the reflected light. This is because hemoglobin, the molecule responsible for carrying oxygen in the blood, has specific light absorption properties.

2. Reflection: As blood volume fluctuates with each heartbeat, the amount of light reflected from the skin also changes. These subtle fluctuations are imperceptible to the human eye but can be detected by a camera.

[Figure: Light interaction with the skin and blood vessels.]

rPPG is not the first approach to use this principle. Traditional Photoplethysmography (PPG), commonly used in finger clip sensors or smartwatches, relies on the same biological foundation. A small light source, such as an LED, shines through or onto the skin, and a sensor detects changes in the intensity of light caused by blood volume changes. The key difference is that rPPG extends this concept to remote measurement using a camera instead of direct contact.

Comparing PPG and rPPG

While PPG and rPPG share the same biological basis, their methods of operation differ:

Aspect       | PPG (Finger Clip Sensor)                          | rPPG (Remote Camera-Based)
Light Source | Integrated LED illuminates the skin directly.     | Ambient or natural light is used.
Sensor       | Dedicated photodetector senses transmitted light. | Camera captures reflected light.
Contact      | Requires direct contact with the skin.            | Operates without physical contact.
Applications | Wearable devices, clinical settings.              | Remote monitoring, telemedicine.
[Figure: Comparison of PPG (contact-based) and rPPG (remote).]

By leveraging cameras and computational algorithms, rPPG brings the benefits of traditional PPG to a new realm of non-contact applications, expanding its accessibility and use cases significantly.

How rPPG Works

The process of extracting vital signs from video is intricate and involves several carefully designed steps [2]:

1. Video Capture: A standard camera is used to record a video of the subject, focusing on regions such as the face where blood volume changes are most consistently visible. Modern devices with high-resolution cameras allow for capturing even the tiniest variations in reflected light.

2. Region Selection: Once the video is captured, a specific region of interest (ROI) is identified. Typically, this involves selecting the facial area, as it provides a stable and unobstructed view of the skin. Advanced computer vision algorithms can automatically detect and track the ROI across frames, even if the subject moves.

3. Signal Extraction: Pixel intensity changes over time within the selected ROI are analyzed. Specialized signal processing techniques filter out noise caused by motion, lighting changes, or other environmental factors. This step is critical for isolating the pulse and respiratory signals embedded in the video data.

4. Vital Sign Estimation: The extracted signals are then processed to compute heart rate and respiratory rate. Depending on the system, this may involve applying mathematical models or leveraging machine learning algorithms to enhance accuracy and robustness.

Each stage contributes to the overall reliability and precision of the rPPG system, ensuring it can perform effectively across various environments and conditions.
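The stages above can be sketched in a few lines of Python. This is a deliberately minimal illustration, not a production pipeline: it assumes the video has already been decoded into a NumPy array of an already-cropped skin ROI, skips face tracking and noise suppression entirely, and estimates only heart rate via a simple frequency analysis.

```python
import numpy as np

def estimate_heart_rate(frames, fps):
    """Toy rPPG pipeline. `frames` is a (T, H, W, 3) RGB array of an
    already-cropped skin ROI; returns heart rate in beats per minute."""
    # Signal extraction: spatially average the green channel per frame
    # (green carries the strongest pulsatile component).
    signal = frames[:, :, :, 1].reshape(frames.shape[0], -1).mean(axis=1)

    # Remove the DC offset so slow lighting level doesn't dominate.
    signal = signal - signal.mean()

    # Vital sign estimation: find the dominant frequency in the
    # plausible heart-rate band (0.7-4 Hz, i.e. 42-240 bpm).
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fps)
    power = np.abs(np.fft.rfft(signal)) ** 2
    band = (freqs >= 0.7) & (freqs <= 4.0)
    peak_freq = freqs[band][np.argmax(power[band])]
    return 60.0 * peak_freq
```

A real system would replace each of these steps with something more robust, but the skeleton (ROI averaging, detrending, frequency analysis) is the same.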

From Handcrafted Algorithms to Neural Networks

The Evolution of rPPG Methods

In its infancy, rPPG relied on handcrafted algorithms to derive physiological signals. These methods used predefined mathematical rules and signal processing techniques to extract features from video data. Examples of such handcrafted algorithms are G [1], CHROM [3], and POS [4].
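As an illustration of the handcrafted approach, here is a simplified sketch of the POS idea [4]: spatially averaged RGB values are temporally normalized, then projected onto a plane orthogonal to the skin-tone direction. The published algorithm operates on overlapping sliding windows; this single-window version is for intuition only.

```python
import numpy as np

def pos_pulse(rgb_means):
    """Simplified sketch of POS applied to a (T, 3) array of spatially
    averaged RGB values from a skin ROI. Returns a pulse signal."""
    # Temporal normalization: divide each channel by its mean.
    c = rgb_means / rgb_means.mean(axis=0)
    # Project onto the plane orthogonal to the skin-tone direction.
    s1 = c[:, 1] - c[:, 2]                 # G - B
    s2 = c[:, 1] + c[:, 2] - 2 * c[:, 0]   # G + B - 2R
    # Alpha-tuning combines the two projections into one pulse signal.
    alpha = s1.std() / s2.std()
    pulse = s1 + alpha * s2
    return pulse - pulse.mean()
```

Note that the projection weights are fixed in advance by a model of skin reflectance, not learned from data; this is what makes the method "handcrafted".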

While effective in controlled laboratory settings, these early methods were prone to errors when applied in real-world scenarios. Lighting variations, head movements, and differences in skin tones posed significant challenges, often leading to unreliable results.

The Neural Network Revolution

The advent of deep learning has revolutionized rPPG, allowing systems to learn directly from data rather than relying on predefined rules [5]. At Rouast Labs, we use neural networks for rPPG. Here's how it works:

Training Data: The neural network is trained on an extensive in-house dataset comprising videos paired with gold-standard physiological signals such as PPG, ECG, and respiration data. This dataset includes diverse conditions, skin tones, settings, and motion scenarios, ensuring broad applicability.

Feature Extraction: Instead of manually specifying features, the neural network automatically learns patterns in the video that correlate with vital signs. This enables it to adapt to variations in lighting, motion, and other environmental factors.

Cross-Dataset Testing: To ensure generalizability and robustness, we test our models on external datasets, such as Vital Videos [6]. This approach validates the performance of our neural network in diverse and unseen scenarios, ensuring it performs reliably beyond the training data.

This transition from handcrafted algorithms to data-driven neural networks has greatly enhanced the accuracy, robustness, and scalability of rPPG systems, paving the way for widespread adoption.
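A real neural network learns rich spatio-temporal features from raw video, which is far beyond a short snippet. As a toy stand-in for the data-driven idea, the sketch below fits a color-projection vector to training data by least squares instead of fixing it by hand as CHROM and POS do; the function names and setup are illustrative only, and bear no relation to any production architecture.

```python
import numpy as np

def fit_projection(train_rgb, train_pulse):
    """Learn a 3-element color projection from paired training data
    (RGB traces plus a gold-standard pulse signal) by least squares."""
    c = train_rgb - train_rgb.mean(axis=0)   # zero-mean (T, 3) RGB traces
    w, *_ = np.linalg.lstsq(c, train_pulse, rcond=None)
    return w

def apply_projection(rgb, w):
    """Apply the learned projection to new, unseen RGB traces."""
    c = rgb - rgb.mean(axis=0)
    return c @ w
```

Even this linear model captures the key shift: the mapping from color to pulse comes from data, and evaluating it on held-out recordings mirrors the cross-dataset testing described above.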

[Figure: A neural network learning rPPG signals from data.]

Challenges in rPPG

Despite its advancements, rPPG faces several challenges that require ongoing innovation:

Environmental Factors: Variations in ambient lighting, shadows, and reflections can introduce significant noise into the signal.

Motion Artifacts: Movements such as head tilts, facial expressions, or body shifts can distort the extracted signal.

Device Variability: Differences in camera quality, resolution, and frame rate can impact the performance of rPPG systems.

At Rouast Labs, we continually refine our models and algorithms to address these challenges, delivering reliable rPPG solutions for diverse use cases.

Conclusion

Remote Photoplethysmography (rPPG) demonstrates the incredible potential of combining biological insights with cutting-edge technology. By capturing subtle variations in light reflection, rPPG provides a non-invasive, contactless way to measure vital signs like heart rate and respiratory rate.

At Rouast Labs, we are committed to making this technology accessible to everyone. Our tools VitalLens and VitalLens API empower individuals and developers alike to explore the possibilities of rPPG. Whether you're curious about your own wellness or building innovative applications, we invite you to try them out for free today.

References

[1] W. Verkruysse, L. Svaasand, J. Nelson, "Remote plethysmographic imaging using ambient light," Optics Express, vol. 16(26), pp. 21434-21445, 2008.
[2] P. Rouast, M. Adam, R. Chiong, D. Cornforth, E. Lux, "Remote heart rate measurement using low-cost RGB face video: A technical literature review," Frontiers of Computer Science, vol. 12(2), pp. 858-872, 2018.
[3] G. de Haan, V. Jeanne, "Robust pulse rate from chrominance-based rPPG," IEEE Transactions on Biomedical Engineering, vol. 60(10), pp. 2878-2886, 2013.
[4] W. Wang, A. den Brinker, S. Stuijk, G. de Haan, "Algorithmic Principles of Remote PPG," IEEE Transactions on Biomedical Engineering, vol. 64(7), pp. 1479-1491, 2017.
[5] W. Chen, D. McDuff, "DeepPhys: Video-Based Physiological Measurement Using Convolutional Attention Networks," in Proc. ECCV, 2018.
[6] P. Toye, "Vital Videos: A public dataset of videos with PPG and BP ground truths," arXiv preprint arXiv:2306.11891, 2023.