FitByte: Automatic Diet Monitoring in Unconstrained Situations Using Multimodal Sensing on Eyeglasses

Abstract

In an attempt to help users reach their health goals and practitioners understand the relationship between diet and disease, researchers have proposed many wearable systems to automatically monitor food consumption. When people consume food, they bring the food close to their mouth, take a sip or bite, chew, and then swallow. Most diet monitoring approaches focus on only one of these aspects of food intake, but this narrow reliance requires high precision and often fails in the noisy and unconstrained situations common in a person's daily life. In this paper, we introduce FitByte, a multimodal sensing approach on a pair of eyeglasses that tracks all phases of food intake. FitByte contains a set of inertial and optical sensors that allow it to reliably detect food intake events in noisy environments. It also has an on-board camera that opportunistically captures visuals of the food as the user consumes it. We evaluated the system in two studies with 23 participants under decreasing environmental constraints. On average, FitByte achieved an 89% F1-score in detecting eating and drinking episodes.
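The abstract describes fusing inertial evidence (e.g., chewing motion) with optical evidence (e.g., hand-to-mouth gestures), and reports an F1-score for detection. Below is a minimal, hypothetical Python sketch of such a fusion rule; the function names, the 0.6 threshold, and the ±1-window gesture dilation are illustrative assumptions, not the FitByte pipeline, and `f1_score` simply restates the definition of the reported metric (F1 = 2PR / (P + R)).

```python
import numpy as np

def detect_intake_windows(chew_scores, gesture_flags, chew_thresh=0.6):
    """Mark a time window as food intake when chewing evidence is strong
    and a hand-to-mouth gesture occurred in a nearby window.
    (Illustrative rule only, not FitByte's actual detector.)"""
    chewing = np.asarray(chew_scores) > chew_thresh
    gestures = np.asarray(gesture_flags, dtype=bool)
    # Dilate gesture flags by +/- one window so a bite slightly before
    # chewing still counts as supporting evidence.
    # (np.roll wraps at the array edges; acceptable for a sketch.)
    nearby = gestures | np.roll(gestures, 1) | np.roll(gestures, -1)
    return chewing & nearby

def f1_score(pred, truth):
    """Window-level F1 = 2PR / (P + R), the metric reported above."""
    pred, truth = np.asarray(pred, bool), np.asarray(truth, bool)
    tp = np.sum(pred & truth)
    p = tp / max(np.sum(pred), 1)   # precision
    r = tp / max(np.sum(truth), 1)  # recall
    return 2 * p * r / max(p + r, 1e-9)
```

For example, `detect_intake_windows([0.1, 0.8, 0.9], [0, 1, 0])` flags the second and third windows, since the gesture in window 2 supports the adjacent chewing evidence.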

Keywords
Eating Detection
Drinking Detection
Diet Monitoring
Health Sensing
Activity Recognition
Wearable Computing
Earables
Ubiquitous Computing
Authors
Abdelkareem Bedri
Carnegie Mellon University, Pittsburgh, PA, USA
Diana Li
Carnegie Mellon University, Pittsburgh, PA, USA
Rushil Khurana
Carnegie Mellon University, Pittsburgh, PA, USA
Kunal Bhuwalka
Carnegie Mellon University, Pittsburgh, PA, USA
Mayank Goel
Carnegie Mellon University, Pittsburgh, PA, USA
DOI

10.1145/3313831.3376869

Paper URL

https://doi.org/10.1145/3313831.3376869

Conference: CHI 2020

The ACM CHI Conference on Human Factors in Computing Systems (https://chi2020.acm.org/)

Session: Use your head & run

Paper session
314 LANA'I
5 presentations
2020-04-30 01:00:00 – 02:15:00