Language Model Co-occurrence Linking for Interleaved Activity Discovery
Citation:
Rogers, E., Kelleher, J.D. and Ross, R.J., Language Model Co-occurrence Linking for Interleaved Activity Discovery, In Proceedings of the 34th International ECMS Conference on Modelling and Simulation, ECMS 2020, pp. 183-189
Abstract:
As ubiquitous computer and sensor systems become abundant, the potential for automatic identification and tracking of human behaviours becomes all the more evident. Annotating complex human behaviour datasets to achieve ground truth for supervised training can, however, be extremely labour-intensive and error-prone. One possible solution to this problem is activity discovery: the identification of activities in an unlabelled dataset by means of an unsupervised algorithm.
This paper presents a novel approach to activity discovery that utilises deep learning based language production models to construct a hierarchical, tree-like structure over a sequential vector of sensor events. Our approach differs from previous work in that it explicitly aims to deal with interleaving (switching back and forth between activities) in a principled manner, by utilising the long-term memory capabilities of a recurrent neural network cell. We present our approach and test it on a realistic dataset to evaluate its performance. Our results show the viability of the approach and that it shows promise for further investigation. We believe this is a useful direction to consider in accounting for the continually changing nature of behaviours.
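To give a concrete flavour of the co-occurrence linking idea described in the abstract, the toy sketch below greedily merges adjacent sensor events (or subtrees) whose representative events co-occur most often, building a binary tree over the event sequence. This is an illustrative simplification and not the paper's method: the paper scores links with a recurrent language model's predictions, whereas this sketch substitutes raw windowed co-occurrence counts, and the event names are invented for the example.

```python
from collections import Counter

def cooccurrence_counts(events, window=3):
    """Count ordered pairs of sensor events occurring within `window` steps."""
    counts = Counter()
    for i, a in enumerate(events):
        for b in events[i + 1 : i + 1 + window]:
            counts[(a, b)] += 1
            counts[(b, a)] += 1  # treat co-occurrence as symmetric
    return counts

def link_tree(events, window=3):
    """Greedily merge the adjacent pair with the strongest co-occurrence,
    yielding a hierarchical tree (nested tuples) over the sequence.

    NOTE: a stand-in linking score; the paper instead uses an RNN-based
    language model to judge how strongly neighbouring events belong together.
    """
    counts = cooccurrence_counts(events, window)

    def head(node):
        # Representative (leftmost) leaf event of a subtree.
        while isinstance(node, tuple):
            node = node[0]
        return node

    nodes = list(events)
    while len(nodes) > 1:
        best = max(
            range(len(nodes) - 1),
            key=lambda i: counts[(head(nodes[i]), head(nodes[i + 1]))],
        )
        nodes[best : best + 2] = [(nodes[best], nodes[best + 1])]
    return nodes[0]

# Interleaved toy trace: tea-making events mixed with TV-watching events.
events = ["kettle", "cup", "tv", "kettle", "cup", "sofa", "tv", "sofa"]
tree = link_tree(events)
print(tree)
```

Because only adjacent nodes are merged, the leaves of the resulting tree preserve the original temporal order of the sensor events; interleaving is handled by how subtrees group, not by reordering.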
Author's Homepage:
http://people.tcd.ie/kellehjd
Author: Kelleher, John
Type of material:
Journal Article
Series/Report no:
Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics);12081 LNCS;
Availability:
Full text available
DOI:
http://dx.doi.org/10.1007/978-3-030-45778-5_6