Essential readings (activity recognition)

If you are interested in research on activity recognition, I have compiled a list of essential papers to get you started. The list focuses on sensor-based activity recognition. It is by no means exhaustive, but it gives an indication of the research taking place.

Paper [1] is the main review paper for sensor-based activity recognition. It covers the differences between the two sensing approaches and discusses the advantages and drawbacks of the major modeling and recognition approaches in sensor-based activity recognition: data-driven and knowledge-driven approaches. Review paper [2] focuses on wearable sensor-based activity recognition.

Signal segmentation is the technique of dividing a long signal into smaller segments for processing, and it has a direct impact on the quality of feature extraction and on classification accuracy. Previous studies on activity recognition have utilized various fixed window sizes for signal segmentation, with the window size typically selected empirically based on past experiments and hardware limitations for a specific type of activity. Paper [3] investigated the impact of window size on recognition accuracy; in its experiments, three sets of features were classified by four different machine learning techniques. Paper [4] is another good study of the impact of window size, with experiments classifying static and dynamic physical activities using windows of varying lengths.
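To make the fixed-window segmentation idea concrete, here is a minimal Python sketch. The window size, overlap, and features below are illustrative choices of mine, not values taken from papers [3] or [4]:

```python
import numpy as np

def segment_signal(signal, window_size, overlap=0.5):
    """Split a 1-D signal into fixed-size windows with fractional overlap."""
    step = max(1, int(window_size * (1 - overlap)))
    return [signal[i:i + window_size]
            for i in range(0, len(signal) - window_size + 1, step)]

def extract_features(window):
    # Simple time-domain features often used in activity recognition.
    return [np.mean(window), np.std(window), np.min(window), np.max(window)]

# Example: 10 s of synthetic accelerometer data at 50 Hz,
# segmented into 2 s windows with 50% overlap.
fs = 50
signal = np.sin(np.linspace(0, 20 * np.pi, 10 * fs))
windows = segment_signal(signal, window_size=2 * fs, overlap=0.5)
features = np.array([extract_features(w) for w in windows])
print(features.shape)  # (9, 4): 9 windows, 4 features each
```

The resulting feature matrix is what would then be fed to a classifier; varying `window_size` and `overlap` is exactly the experimental knob these papers study.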

Activity recognition systems are developed by learning signal patterns for a given sensor placement and orientation, and such systems will likely fail if the sensors are displaced. Papers [5-6] study the effects of sensor displacement in activity recognition using wearable sensors.

Papers [7-8] provide an overview of modeling and recognition approaches for vision-based activity recognition.

[1] L. Chen, J. Hoey, C. D. Nugent, D. J. Cook, and Z. Yu, “Sensor-Based Activity Recognition,” IEEE Transactions on Systems, Man, and Cybernetics, Part C: Applications and Reviews, vol. 42, no. 6, pp. 790–808, Nov. 2012. http://dx.doi.org/10.1109/TSMCC.2012.2198883
[2] O. D. Lara and M. A. Labrador, “A Survey on Human Activity Recognition using Wearable Sensors,” IEEE Communications Surveys & Tutorials, vol. 15, no. 3, pp. 1192–1209, Third Quarter 2013. http://dx.doi.org/10.1109/SURV.2012.110112.00192
[3] O. Banos, J.-M. Galvez, M. Damas, H. Pomares, and I. Rojas, “Window Size Impact in Human Activity Recognition,” Sensors, vol. 14, no. 4, pp. 6474–6499, Apr. 2014. http://dx.doi.org/10.3390/s140406474
[4] B. Fida, I. Bernabucci, D. Bibbo, S. Conforto, and M. Schmid, “Varying behavior of different window sizes on the classification of static and dynamic physical activities from a single accelerometer,” Medical Engineering & Physics, vol. 37, no. 7, pp. 705–711, Jul. 2015. http://dx.doi.org/10.1016/j.medengphy.2015.04.005
[5] K. Kunze and P. Lukowicz, “Sensor Placement Variations in Wearable Activity Recognition,” IEEE Pervasive Computing, vol. 13, no. 4, pp. 32–41, Oct. 2014. http://dx.doi.org/10.1109/MPRV.2014.73
[6] O. Banos, M. A. Toth, M. Damas, H. Pomares, and I. Rojas, “Dealing with the Effects of Sensor Displacement in Wearable Activity Recognition,” Sensors, vol. 14, no. 6, pp. 9995–10023, Jun. 2014. http://dx.doi.org/10.3390/s140609995
[7] J. K. Aggarwal and M. S. Ryoo, “Human Activity Analysis: A Review,” ACM Computing Surveys, vol. 43, no. 3, pp. 16:1–16:43, Apr. 2011. http://dx.doi.org/10.1145/1922649.1922653
[8] D. Weinland, R. Ronfard, and E. Boyer, “A survey of vision-based methods for action representation, segmentation and recognition,” Computer Vision and Image Understanding, vol. 115, no. 2, pp. 224–241, Feb. 2011. http://dx.doi.org/10.1016/j.cviu.2010.10.002
