Ontology-based sensor fusion for activity recognition

Context-aware activity recognition systems deal with heterogeneous sensors that provide data at different sampling rates and in different output forms. Wearable sensors such as accelerometers and gyroscopes deliver raw data at high rates in real time, and this data must be interpreted before it is useful to the application. Ambient sensors such as temperature, humidity, or object-interaction sensors, by contrast, produce data at a much lower rate and require little or no further interpretation. To formally represent this contextual information, an ontology-based sensor fusion approach for activity recognition was developed. The ontological modelling method harnesses the best of both wearable and ambient sensing to achieve a robust and comprehensive activity recognition system by exploiting contextual information from both the user and the environment. Interested readers are referred to [1] for more details.
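To make the idea concrete, here is a minimal sketch of combining a wearable-derived motion primitive with an ambient object-interaction observation as RDF triples, using Python's rdflib. The namespace, class names (WearableObservation, AmbientObservation), properties, and the fusion rule are illustrative assumptions for this sketch, not the ontology defined in [1].

```python
# Sketch: fusing wearable and ambient context via an RDF graph.
# Classes, properties, and the fusion rule below are hypothetical.
from rdflib import Graph, Namespace, Literal, RDF

ACT = Namespace("http://example.org/activity#")
g = Graph()
g.bind("act", ACT)

# Wearable-derived context: a motion primitive classified from accelerometer data.
g.add((ACT.obs1, RDF.type, ACT.WearableObservation))
g.add((ACT.obs1, ACT.detectedPrimitive, Literal("standing")))

# Ambient-derived context: an object-interaction sensor on the kettle fired.
g.add((ACT.obs2, RDF.type, ACT.AmbientObservation))
g.add((ACT.obs2, ACT.objectUsed, Literal("kettle")))
g.add((ACT.obs2, ACT.location, Literal("kitchen")))

# A simple fusion rule: retrieve both kinds of observation with SPARQL,
# then combine the motion primitive with the object/location context
# to infer a composite activity.
query = """
SELECT ?primitive ?object ?location WHERE {
    ?w a act:WearableObservation ; act:detectedPrimitive ?primitive .
    ?a a act:AmbientObservation ; act:objectUsed ?object ; act:location ?location .
}
"""
for primitive, obj, location in g.query(query, initNs={"act": ACT}):
    if str(primitive) == "standing" and str(obj) == "kettle":
        print(f"Inferred activity: preparing a hot drink in the {location}")
```

In a full system the fusion rule would live in the ontology itself (e.g. as class axioms or rules evaluated by a reasoner) rather than in application code; the query above only illustrates how low-rate ambient context and high-rate wearable inferences meet in one shared model.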

[1] M. H. M. Noor, Z. Salcic, and K. I.-K. Wang, “Ontology-based sensor fusion activity recognition,” J Ambient Intell Human Comput, pp. 1–15, Jan. 2018.
