BehaviourMod

BehaviourMod is an Android library composed of three components: an activity recognition system, a gesture recognition system and an orientation change detector.

The activity recognition system processes data recorded by the sensors embedded in an Android device and classifies the current user activity into one of a set of predefined classes, including walking and bicycling.

The system is designed to detect activities regardless of the orientation and carrying context of the device (it works with the device in a pocket, in hand or in a bag). GPS speed can also be used to increase the detection accuracy, especially when the user is carrying the device in a bag.
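
As a rough sketch of how such a sensor pipeline can be fed on Android, the snippet below registers an accelerometer listener and subscribes to GPS speed updates. The class name, the use of the acceleration magnitude for orientation independence and the classifier.addSample hook are assumptions for illustration, not BehaviourMod's actual API.

```java
import android.content.Context;
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;
import android.location.Location;
import android.location.LocationListener;
import android.location.LocationManager;
import android.os.Bundle;

public class ActivitySensorFeed implements SensorEventListener, LocationListener {

    private final SensorManager sensorManager;
    private final LocationManager locationManager;
    private float lastGpsSpeed; // m/s, an optional extra classification feature

    public ActivitySensorFeed(Context context) {
        sensorManager = (SensorManager) context.getSystemService(Context.SENSOR_SERVICE);
        locationManager = (LocationManager) context.getSystemService(Context.LOCATION_SERVICE);
    }

    public void start() {
        // Accelerometer samples feed the activity classifier.
        Sensor accelerometer = sensorManager.getDefaultSensor(Sensor.TYPE_ACCELEROMETER);
        sensorManager.registerListener(this, accelerometer, SensorManager.SENSOR_DELAY_GAME);
        // GPS speed is optional; requires the ACCESS_FINE_LOCATION permission.
        locationManager.requestLocationUpdates(LocationManager.GPS_PROVIDER, 1000, 0, this);
    }

    @Override
    public void onSensorChanged(SensorEvent event) {
        // Classifying on the acceleration magnitude rather than the raw x/y/z
        // axes is one common way to stay independent of device orientation
        // (an assumption about the approach, not a detail from BehaviourMod).
        float x = event.values[0], y = event.values[1], z = event.values[2];
        double magnitude = Math.sqrt(x * x + y * y + z * z);
        // classifier.addSample(event.timestamp, magnitude, lastGpsSpeed); // hypothetical hook
    }

    @Override
    public void onLocationChanged(Location location) {
        if (location.hasSpeed()) {
            lastGpsSpeed = location.getSpeed();
        }
    }

    @Override public void onAccuracyChanged(Sensor sensor, int accuracy) { }
    @Override public void onStatusChanged(String provider, int status, Bundle extras) { }
    @Override public void onProviderEnabled(String provider) { }
    @Override public void onProviderDisabled(String provider) { }
}
```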

The predicted activity is provided at two time scales: a short one (2 seconds) and a medium one (30 seconds). The larger time scale is used to filter out dubious results and thus improve accuracy. The following screenshot shows an example of activity detection at both time scales, with the recognized activity plotted according to a colour code.

Screen capture from BehaviourMod
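
As one plausible reading of this filtering step, the sketch below smooths the 2-second predictions into a 30-second estimate with a majority vote. The ActivitySmoother class, the enum values and the window size are assumptions for illustration, not BehaviourMod's published API.

```java
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.EnumMap;
import java.util.Map;

public class ActivitySmoother {

    // Hypothetical activity classes; the library's real set may differ.
    public enum ActivityClass { WALKING, BICYCLING, IDLE }

    // 30 s window / 2 s predictions = 15 short-scale results per medium-scale result.
    private static final int WINDOW = 15;
    private final Deque<ActivityClass> recent = new ArrayDeque<>();

    /** Adds a 2-second prediction and returns the smoothed 30-second estimate. */
    public ActivityClass addShortScalePrediction(ActivityClass predicted) {
        recent.addLast(predicted);
        if (recent.size() > WINDOW) {
            recent.removeFirst();
        }
        return majority();
    }

    private ActivityClass majority() {
        Map<ActivityClass, Integer> counts = new EnumMap<>(ActivityClass.class);
        for (ActivityClass a : recent) {
            counts.merge(a, 1, Integer::sum);
        }
        ActivityClass best = null;
        int bestCount = -1;
        for (Map.Entry<ActivityClass, Integer> e : counts.entrySet()) {
            if (e.getValue() > bestCount) {
                best = e.getKey();
                bestCount = e.getValue();
            }
        }
        return best;
    }
}
```

A majority vote over fifteen 2-second predictions is one simple way to realize the described filtering; the library may well use a different scheme.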

We evaluated the activity recognition system in a user study. The results showed an overall detection accuracy of 96.7% for walking activities and 87.5% for the bicycling activity.

The gesture recognition system is still under development; its goal is to detect gestures that make interactions with mobile devices easier, more intuitive and more fun. In particular, gestures can be used to trigger specific tasks that would otherwise require navigating through several menus. Preliminary tests have shown that our system achieves very high recognition accuracy for pointing gestures, which could be used in a city guide application, for instance, to request additional information about a building or a monument.
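
As a sketch of how an application might consume such gestures, the hypothetical listener below reacts to a recognized pointing gesture. Both type names and the callback signature are invented for illustration; they are not BehaviourMod's actual API.

```java
// Hypothetical callback interface for pointing-gesture events.
public interface PointingGestureListener {
    /**
     * Called when a pointing gesture is recognized.
     *
     * @param azimuthDegrees direction the device is pointing, relative to north
     */
    void onPointingGesture(float azimuthDegrees);
}

// Example: a city guide could map the pointing direction to a nearby landmark.
class CityGuide implements PointingGestureListener {
    @Override
    public void onPointingGesture(float azimuthDegrees) {
        // lookUpLandmark(currentLocation, azimuthDegrees); // hypothetical helper
    }
}
```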

The orientation change detector was developed to detect specific behaviour patterns in future studies. In particular, we aim to detect whether the user seems lost or stressed by combining the detected activity with orientation changes over a time interval. This work is still in progress. For now, we are evaluating several orientation sources: the compass, the rotation vector and the GPS bearing. In the screenshot shown above, the lower part of the graph represents the orientation changes detected at the same time as the activity. To evaluate the different sensors, we reconstructed the path walked by a user on a map; the results are shown in the next screenshot. We observed that the GPS bearing was the most accurate, followed closely by the other sensors.

Screen capture from BehaviourMod
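
To make the rotation vector option concrete, the sketch below converts it to a compass azimuth with the standard Android sensor API and flags orientation changes above a threshold. The class name, the 45° threshold and the callback hook are assumptions for illustration, not BehaviourMod's actual implementation.

```java
import android.hardware.Sensor;
import android.hardware.SensorEvent;
import android.hardware.SensorEventListener;
import android.hardware.SensorManager;

public class OrientationChangeDetector implements SensorEventListener {

    private static final float CHANGE_THRESHOLD_DEG = 45f; // assumed value
    private final float[] rotationMatrix = new float[9];
    private final float[] orientation = new float[3];
    private Float lastAzimuthDeg;

    @Override
    public void onSensorChanged(SensorEvent event) {
        if (event.sensor.getType() != Sensor.TYPE_ROTATION_VECTOR) {
            return;
        }
        // Standard Android conversion: rotation vector -> rotation matrix -> azimuth.
        SensorManager.getRotationMatrixFromVector(rotationMatrix, event.values);
        SensorManager.getOrientation(rotationMatrix, orientation);
        float azimuthDeg = (float) Math.toDegrees(orientation[0]); // [-180, 180]

        if (lastAzimuthDeg == null) {
            lastAzimuthDeg = azimuthDeg;
            return;
        }
        // Smallest angle between the current and last reported headings.
        float delta = Math.abs(azimuthDeg - lastAzimuthDeg);
        if (delta > 180f) {
            delta = 360f - delta;
        }
        if (delta > CHANGE_THRESHOLD_DEG) {
            // onOrientationChange(delta); // hypothetical callback
            lastAzimuthDeg = azimuthDeg;
        }
    }

    @Override
    public void onAccuracyChanged(Sensor sensor, int accuracy) { }
}
```

The same threshold logic could be reused to compare the other sources under evaluation, by swapping in the compass azimuth or the GPS bearing as the heading input.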