Sensor annotation game

Using human-computation games to annotate accelerometer data

Duration: 

Jan 2018 – Present

Brief:

Robust activity recognition algorithms depend on how well their training datasets have been prepared. A key part of preparing such a dataset is obtaining high-quality annotations (labels) for all of it. As datasets grow, however, annotating them becomes increasingly cumbersome for a research team; in practice, this labeling is usually done manually by researchers on small datasets.

 

To address this problem, we have designed “Mobots” – a human-computation (i.e., crowdsourcing) game for annotating large accelerometer datasets (e.g., the NHANES and UK Biobank datasets). In Mobots, players are shown snippets of accelerometer data, which they match against lab-based ground-truth data. This way, we can gather annotations on a very large dataset with the quality of lab-based ground truth. To access the current prototype of the game, please click here. We are carrying out iterative tests on Amazon Mechanical Turk to improve the game.
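To make the annotation pipeline concrete, here is a minimal C# sketch of the two underlying steps: cutting a raw accelerometer stream into fixed-length snippets for players to label, and aggregating several players' matches into one label per snippet. This is an illustrative sketch, not the actual Mobots code; the function names (Segment, MajorityLabel), the window size, and the majority-vote aggregation rule are assumptions for the example.

```csharp
// Hypothetical sketch: snippet extraction and crowd-label aggregation.
using System;
using System.Collections.Generic;
using System.Linq;

public static class AnnotationSketch
{
    // Split a signal (one magnitude value per sample, for brevity) into
    // non-overlapping windows of `windowSize` samples. Each window becomes
    // one snippet shown to players in the game.
    public static List<double[]> Segment(double[] signal, int windowSize)
    {
        var snippets = new List<double[]>();
        for (int start = 0; start + windowSize <= signal.Length; start += windowSize)
        {
            snippets.Add(signal.Skip(start).Take(windowSize).ToArray());
        }
        return snippets;
    }

    // Combine the labels several players assigned to the same snippet.
    // Majority vote is just one possible aggregation rule.
    public static string MajorityLabel(IEnumerable<string> playerLabels)
    {
        return playerLabels
            .GroupBy(label => label)
            .OrderByDescending(group => group.Count())
            .First()
            .Key;
    }

    public static void Main()
    {
        var signal = new double[300];          // placeholder stream, e.g. 10 s at 30 Hz
        var snippets = Segment(signal, 100);   // 100-sample windows (assumed size)
        Console.WriteLine($"Snippets: {snippets.Count}");

        var votes = new[] { "walking", "walking", "sitting" };
        Console.WriteLine($"Aggregated label: {MajorityLabel(votes)}");
    }
}
```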

Technology Used: 

Unity, AWS, C#, Actigraph, Actilife, R, PHP

Role: 

Game Design, Research Design, Data Pre-processing, Level and Economy Design, Game Testing and Debugging

Skills Developed: 

Game Design, Expert Interviews, Unity Programming, Database Design and Management, Data Analysis and Wrangling, Project Management

Collaborators:

Prof. Seth Cooper (Asst. Prof., Northeastern University), Prof. Dinesh John (Asst. Prof., Northeastern University), Binod Thapa-Chhetry (Ph.D. Student, Northeastern University), Josh A. Miller (Ph.D. Student, Northeastern University)

Acknowledgment:

This project is made possible by funding from the NIH BD2K (Big Data to Knowledge) program. Details soon.