Sensor annotation game

Using human-computation games to annotate accelerometer data

Duration: 

Jan 2018 – Dec 2018

Brief:

The robustness of activity recognition algorithms depends on how well their training datasets are prepared. A key part of this preparation is obtaining high-quality annotations (labels) for the entire dataset. These annotations are typically added manually by researchers, which works for small datasets but becomes increasingly cumbersome as datasets grow.

To address this annotation problem, we designed “Mobots”, a human-computation (i.e., crowdsourcing) game for annotating large accelerometer datasets (e.g., the NHANES and UK Biobank datasets). In Mobots, players are shown snippets of accelerometer data and match them against lab-based ground-truth data (as shown below). This way, we can gather annotations on a very large dataset with the quality of lab-based ground truth.
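To give a sense of how player matches can become dataset labels, here is a minimal C# sketch of one possible aggregation step: each player's snippet-to-ground-truth match is treated as a vote, and a snippet is labeled only when enough players agree. The type and method names (SnippetMatch, AggregateLabels) and the thresholds are illustrative assumptions, not the actual Mobots implementation.

```csharp
// Hypothetical sketch: turning crowd matches into per-snippet labels by majority vote.
// Names and thresholds are illustrative; they are not taken from the Mobots codebase.
using System;
using System.Collections.Generic;
using System.Linq;

public class SnippetMatch
{
    public string SnippetId;     // ID of the free-living accelerometer snippet
    public string MatchedLabel;  // label of the lab-based ground-truth clip the player chose
}

public static class AnnotationAggregator
{
    // Keep a label only if at least `minVotes` players matched the snippet and the
    // winning label holds a clear majority (`minAgreement`) of that snippet's votes.
    public static Dictionary<string, string> AggregateLabels(
        IEnumerable<SnippetMatch> matches, int minVotes = 3, double minAgreement = 0.6)
    {
        var labels = new Dictionary<string, string>();
        foreach (var snippet in matches.GroupBy(m => m.SnippetId))
        {
            var topLabel = snippet.GroupBy(m => m.MatchedLabel)
                                  .OrderByDescending(g => g.Count())
                                  .First();
            int totalVotes = snippet.Count();
            if (topLabel.Count() >= minVotes &&
                (double)topLabel.Count() / totalVotes >= minAgreement)
            {
                labels[snippet.Key] = topLabel.Key;
            }
        }
        return labels;
    }
}
```

Snippets that fail the agreement threshold would simply remain unlabeled and could be re-queued for more players, which is one common way crowdsourcing pipelines trade coverage for label quality.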

You can also watch our CHI PLAY presentation for more details.

Technology Used: 

Unity, AWS, C#, Actigraph, Actilife, R, PHP

Responsibility: 

Game Design, Research Design, Data Pre-processing, Level and Economy Design, Game Testing and Debugging

Collaborators:

Seth Cooper (Assistant Professor, Northeastern University) and Josh A. Miller (Ph.D. Student, Northeastern University)

Acknowledgment:

The research reported in this publication was supported by the National Institute of Biomedical Imaging and Bioengineering of the National Institutes of Health (NIH) under award number UH2EB024407. The content is solely the responsibility of the authors and does not necessarily represent the official views of the NIH. The work was also supported by NU-TECH AWS credits award from Northeastern University, Boston, MA.