Human activity recognition using wearable accelerometers can enable in-situ detection of physical activities to support novel human-computer interfaces and interventions. However, developing valid algorithms that use accelerometer data to detect everyday activities often requires large training datasets, precisely labeled with the start and end times of the activities of interest. Acquiring such annotated data is challenging and time-consuming. Applied games, such as human computation games (HCGs), have been used to annotate images, sounds, and videos to support advances in machine learning by harnessing the collective effort of "non-expert game players." However, their potential to annotate accelerometer data has not been formally explored. In this paper, we present two proof-of-concept, web-based HCGs aimed at enabling game players to annotate accelerometer data. Using results from pilot studies with Amazon Mechanical Turk players, we discuss key challenges, opportunities, and, more generally, the potential of using applied videogames to annotate raw accelerometer data in support of activity recognition research.
Keywords: Applied games; accelerometers; activity recognition; crowdsourcing; data annotation; human computation.