Dr Philipp Rouast

My research focuses on human-centered applications of deep learning and computer vision, especially in the health domain.

Publications

VitalLens: Take a Vital Selfie

Philipp V. Rouast

arXiv preprint arXiv:2312.06892 (2023)

This report introduces VitalLens, an app that estimates vital signs such as heart rate and respiration rate from selfie video in real time. VitalLens uses a computer vision model trained on a diverse dataset of video and physiological sensor data. We benchmark performance on several diverse datasets, including VV-Medium, ...

PDF arXiv Google Scholar ResearchGate
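
VitalLens itself relies on a learned computer vision model, but the rPPG idea it builds on can be illustrated in a few lines: extract a pulse-like trace from the video (for example, the mean green value of the face region per frame) and read off its dominant frequency. The sketch below is only that illustration, with made-up trace names and band limits, not the app's actual pipeline:

import numpy as np

def dominant_frequency_bpm(trace, fps, low_hz, high_hz):
    """Return the dominant frequency of a trace within [low_hz, high_hz], in cycles per minute."""
    trace = np.asarray(trace, dtype=float) - np.mean(trace)
    freqs = np.fft.rfftfreq(len(trace), d=1.0 / fps)
    power = np.abs(np.fft.rfft(trace)) ** 2
    band = (freqs >= low_hz) & (freqs <= high_hz)
    return 60.0 * freqs[band][np.argmax(power[band])]

# Hypothetical traces: pulse_trace could be the mean green value of the face
# region per frame, breath_trace a per-frame measure of chest/shoulder motion.
# heart_rate = dominant_frequency_bpm(pulse_trace, fps=30, low_hz=0.7, high_hz=4.0)   # ~42-240 bpm
# resp_rate  = dominant_frequency_bpm(breath_trace, fps=30, low_hz=0.1, high_hz=0.7)  # ~6-42 breaths/min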

Single-stage intake gesture detection using CTC loss and extended prefix beam search

Philipp V. Rouast and Marc T. P. Adam

IEEE Journal of Biomedical and Health Informatics 25 (7), 2733–2743 (2021)

Accurate detection of individual intake gestures is a key step towards automatic dietary monitoring. Both inertial sensor data of wrist movements and video data depicting the upper body have been used for this purpose. The most advanced methods to date use a two-stage approach, in which (i) frame-level intake probabilities ...

PDF arXiv Google Scholar IEEE Xplore PubMed ResearchGate
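
The single-stage idea here is to train a network directly on unsegmented frame sequences against sparse gesture labels, which is the setting CTC loss is designed for. A minimal sketch of wiring up a CTC loss in PyTorch for this kind of weakly aligned labelling (the toy shapes, class count and linear frame model are my own placeholders; the paper's architecture and its extended prefix beam search decoder are not shown):

import torch
import torch.nn as nn

# Illustrative setup: 2 classes (blank=0, intake gesture=1); a batch of videos
# represented as per-frame features; the model emits frame-level log-probabilities.
T, N, C, F = 128, 4, 2, 64                  # frames, batch size, classes, feature dim
frame_features = torch.randn(T, N, F)

model = nn.Sequential(nn.Linear(F, C), nn.LogSoftmax(dim=-1))
log_probs = model(frame_features)           # shape (T, N, C), as expected by CTCLoss

# Sparse targets: each sequence is just the list of gesture labels it contains,
# without any frame-level alignment (here: 3 intake gestures per video).
targets = torch.ones(N, 3, dtype=torch.long)
input_lengths = torch.full((N,), T, dtype=torch.long)
target_lengths = torch.full((N,), 3, dtype=torch.long)

ctc = nn.CTCLoss(blank=0)
loss = ctc(log_probs, targets, input_lengths, target_lengths)
loss.backward()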

OREBA: A Dataset for Objectively Recognizing Eating Behaviour and Associated Intake

Philipp V. Rouast, Hamid Heydarian, Marc T. P. Adam, and Megan E. Rollo

IEEE Access 8, 181955–181963 (2020)

Automatic detection of intake gestures is a key element of automatic dietary monitoring. Several types of sensors, including inertial measurement units (IMU) and video cameras, have been used for this purpose. The common machine learning approaches make use of the labelled sensor data to automatically learn how to make detections. ...

PDF arXiv Google Scholar IEEE Xplore ResearchGate
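
A dataset like this is typically consumed by slicing the synchronised, labelled sensor streams into fixed-length training windows. The sketch below shows that generic preprocessing step (array layouts, window size and the majority-label rule are illustrative assumptions, not the OREBA file format or the paper's protocol):

import numpy as np

def make_windows(imu, labels, window=128, stride=64):
    """Slice a labelled IMU recording into fixed-length training windows.

    imu:    (num_samples, num_channels) accelerometer/gyroscope readings
    labels: (num_samples,) frame-level annotation, e.g. 1 during an intake
            gesture and 0 otherwise.
    """
    X, y = [], []
    for start in range(0, len(imu) - window + 1, stride):
        segment = imu[start:start + window]
        segment_labels = labels[start:start + window]
        X.append(segment)
        # Label the window positive if most of it overlaps an intake gesture
        y.append(int(segment_labels.mean() > 0.5))
    return np.stack(X), np.array(y)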

Learning deep representations for video-based intake gesture detection

Philipp V. Rouast and Marc T. P. Adam

IEEE Journal of Biomedical and Health Informatics 24 (6), 1727–1737 (2020)

Automatic detection of individual intake gestures during eating occasions has the potential to improve dietary monitoring and support dietary recommendations. Existing studies typically make use of on-body solutions such as inertial and audio sensors, while video is used as ground truth. Intake gesture detection directly based on video has rarely ...

PDF arXiv Google Scholar IEEE Xplore PubMed ResearchGate
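
The models studied in this paper map short video clips to frame-level intake probabilities. As a rough sketch of that kind of clip classifier (a toy 3D CNN with arbitrary layer sizes, not one of the architectures evaluated in the paper):

import torch
import torch.nn as nn

class TinyClipClassifier(nn.Module):
    """Toy 3D CNN mapping a short video clip to an intake-gesture probability."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv3d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool3d(2),
            nn.Conv3d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),
        )
        self.head = nn.Linear(32, 1)

    def forward(self, clip):
        # clip: (batch, channels=3, frames, height, width)
        x = self.features(clip).flatten(1)
        return torch.sigmoid(self.head(x))

# Example: probability of an intake gesture for a 16-frame, 64x64 clip
prob = TinyClipClassifier()(torch.randn(1, 3, 16, 64, 64))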

Deep Learning for Human Affect Recognition: Insights and New Developments

Philipp V. Rouast, Marc T. P. Adam, and Raymond Chiong

IEEE Transactions on Affective Computing 12 (2), 524–543 (2021)

Automatic human affect recognition is a key step towards more natural human-computer interaction. Recent trends include recognition in the wild using a fusion of audiovisual and physiological sensors, a challenging setting for conventional machine learning algorithms. Since 2010, novel deep learning algorithms have been applied increasingly in this field. ...

PDF arXiv Google Scholar IEEE Xplore ResearchGate
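
One recurring theme in this review is combining audiovisual and physiological signals. A minimal sketch of feature-level fusion, where each modality is encoded separately and the encodings are concatenated before classification (the dimensions and the number of affect classes are placeholders, not from the paper):

import torch
import torch.nn as nn

class FeatureFusionAffect(nn.Module):
    """Toy feature-level fusion: encode each modality, concatenate, classify."""
    def __init__(self, video_dim=512, audio_dim=128, physio_dim=32, num_classes=7):
        super().__init__()
        self.video_enc = nn.Linear(video_dim, 64)
        self.audio_enc = nn.Linear(audio_dim, 64)
        self.physio_enc = nn.Linear(physio_dim, 64)
        self.classifier = nn.Linear(3 * 64, num_classes)

    def forward(self, video, audio, physio):
        fused = torch.cat([
            torch.relu(self.video_enc(video)),
            torch.relu(self.audio_enc(audio)),
            torch.relu(self.physio_enc(physio)),
        ], dim=-1)
        return self.classifier(fused)   # logits over affect classes

logits = FeatureFusionAffect()(torch.randn(1, 512), torch.randn(1, 128), torch.randn(1, 32))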

Remote heart rate measurement using low-cost RGB face video: a technical literature review

Philipp V. Rouast, Marc T. P. Adam, Raymond Chiong, David Cornforth, and Eva Lux

Frontiers of Computer Science 12 (5), 858–872 (2018)

Remote photoplethysmography (rPPG) allows remote measurement of the heart rate using low-cost RGB imaging equipment. In this study, we review the development of the field of rPPG since its emergence in 2008. We also classify existing rPPG approaches and derive a framework that provides an overview of modular steps. ...

PDF Google Scholar ResearchGate SpringerLink
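
The framework in this review breaks rPPG methods into modular steps, roughly: choosing a region of interest, extracting a raw signal, filtering it, and estimating the heart rate. The sketch below mirrors that structure in a deliberately simplified form (the ROI handling, filter order and band limits are illustrative choices, not the paper's exact framework):

import numpy as np
from scipy.signal import butter, filtfilt

def extract_raw_signal(frames, roi):
    """Spatially average the green channel inside a fixed face ROI per frame."""
    top, bottom, left, right = roi
    return np.array([f[top:bottom, left:right, 1].mean() for f in frames])

def filter_signal(raw, fps, low=0.7, high=4.0):
    """Detrend and band-pass filter to the plausible heart-rate band."""
    b, a = butter(3, [low, high], btype="band", fs=fps)
    return filtfilt(b, a, raw - raw.mean())

def estimate_hr(filtered, fps):
    """Pick the dominant frequency and convert it to beats per minute."""
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    power = np.abs(np.fft.rfft(filtered)) ** 2
    return 60.0 * freqs[np.argmax(power)]

# frames: sequence of HxWx3 RGB video frames; roi: (top, bottom, left, right)
# hr_bpm = estimate_hr(filter_signal(extract_raw_signal(frames, roi), fps=30), fps=30)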