Authors: Pedro Herruzo [Universitat Politècnica de Catalunya], Laura Portell, Alberto Soto [Universitat de Barcelona], Beatriz Remeseiro [Universidad de Oviedo]

Abstract: The objective description of lifestyle patterns from egocentric images captured by wearable cameras is considered the next step in health-tracking applications, alongside heartbeats, steps walked, or calories burned, which are measured mainly by electronic wristbands. In this context, the classification of eating, socializing, and sedentary lifestyle patterns has been addressed in previous research, considering the following classes for each pattern, respectively: 1) no food, food but not eating, and eating; 2) not socializing, and socializing; and 3) not sedentary, and sedentary. This previous approach considers all possible combinations of classes among the three patterns, thus solving a multi-class classification problem with 12 classes. Given the nature of the problem, we propose to address the classification of these three lifestyle patterns from a multi-task perspective, employing a general framework based on the Inception-V3 convolutional neural network. The feature maps extracted from this network are used to perform the final lifestyle pattern categorization. It is worth noting that our proposed framework offers visual explanations of the results to meet the European Union standards on explainability, thus increasing both transparency and confidence in the obtained results. Furthermore, this work introduces a web-based lifelogging tool for periodic visual summarization of lifestyle patterns. Our method has been tested on a dataset composed of more than 45,000 egocentric images, and the experimental results confirm the reliability of the selected method, which outperforms the state of the art in terms of accuracy and F1-score.
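
As a rough illustration of the multi-task setup described in the abstract, the sketch below attaches one classification head per lifestyle pattern (3 classes for eating, 2 for socializing, 2 for sedentary) to a shared Inception-V3 backbone. This is a minimal sketch assuming a TensorFlow/Keras implementation; the head names, input size, optimizer, and loss weighting are illustrative assumptions, not the authors' actual configuration.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model
from tensorflow.keras.applications import InceptionV3

# Shared Inception-V3 backbone (ImageNet weights, classifier removed,
# global average pooling yields a 2048-d feature vector per image).
backbone = InceptionV3(include_top=False, weights="imagenet",
                       pooling="avg", input_shape=(299, 299, 3))
features = backbone.output

# One softmax head per lifestyle pattern (head names are hypothetical).
eating = layers.Dense(3, activation="softmax", name="eating")(features)       # no food / food but not eating / eating
social = layers.Dense(2, activation="softmax", name="socializing")(features)  # not socializing / socializing
sedentary = layers.Dense(2, activation="softmax", name="sedentary")(features) # not sedentary / sedentary

model = Model(inputs=backbone.input,
              outputs=[eating, social, sedentary])

# Joint training: each head contributes its own cross-entropy loss.
model.compile(optimizer="adam",
              loss={"eating": "sparse_categorical_crossentropy",
                    "socializing": "sparse_categorical_crossentropy",
                    "sedentary": "sparse_categorical_crossentropy"},
              metrics=["accuracy"])
```

In this formulation the three patterns share all convolutional features and differ only in their output heads, which is what distinguishes the multi-task view from the earlier 12-class flat classification.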


DOI: https://dx.doi.org/10.5244/C.33.348
Comments: Presented at BMVC 2019: Workshop on Applications of Egocentric Vision (EgoApp), Cardiff, UK.
Paper (PDF): EgoApp2019_2.pdf