Please use this identifier to cite or link to this item: http://buratest.brunel.ac.uk/handle/2438/8221
Title: Fast human activity recognition based on structure and motion
Authors: Hu, J
Boulgouris, NV
Keywords: Activity;Recognition;Surveillance
Issue Date: 2011
Publisher: Elsevier
Citation: Pattern Recognition Letters, 32(14), 1814 - 1821, 2011
Abstract: We present a method for the recognition of human activities. The proposed approach is based on the construction of a set of templates for each activity and on direct measurements of the motion in each activity. Templates are designed to capture the structural and motion information that is most discriminative among activities, while the direct motion measurements capture the amount of translational motion in each activity. The two feature types are fused at the recognition stage. Recognition is achieved in two steps by calculating the similarity between the templates and motion features of the test and reference activities. The proposed methodology is experimentally assessed and is shown to yield excellent performance.
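The fusion scheme described in the abstract can be illustrated with a minimal sketch. This is not the paper's exact formulation: the Euclidean template distance, the scalar motion measurement, the weighted-sum fusion, and the `alpha` parameter are all assumptions made for illustration only.

```python
import numpy as np

def fused_similarity(test_template, ref_template, test_motion, ref_motion, alpha=0.5):
    # Distance between activity templates (structural/motion information);
    # Euclidean distance is an assumption, not necessarily the paper's measure.
    d_template = np.linalg.norm(test_template - ref_template)
    # Distance between direct translational-motion measurements (assumed scalar).
    d_motion = abs(test_motion - ref_motion)
    # Weighted-sum fusion of the two cues; alpha is a hypothetical parameter.
    return alpha * d_template + (1 - alpha) * d_motion

def recognize(test_template, test_motion, references):
    # references: dict mapping activity label -> (template, motion) pair.
    # The test activity is assigned the label of the closest reference.
    scores = {label: fused_similarity(test_template, t, test_motion, m)
              for label, (t, m) in references.items()}
    return min(scores, key=scores.get)  # smallest fused distance wins

# Toy example with two reference activities (made-up data).
refs = {
    "walk": (np.array([1.0, 0.0]), 1.0),
    "jump": (np.array([0.0, 1.0]), 0.2),
}
print(recognize(np.array([0.9, 0.1]), 0.9, refs))  # -> walk
```

In this sketch the two-step character of the method is reduced to a single fused score; the paper's actual pipeline compares template and motion features in separate stages.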
Description: This is the post-print version of the final paper published in Pattern Recognition Letters. The published article is available from the link below. Changes resulting from the publishing process, such as peer review, editing, corrections, structural formatting, and other quality control mechanisms may not be reflected in this document. Changes may have been made to this work since it was submitted for publication. Copyright @ 2011 Elsevier B.V.
URI: http://www.sciencedirect.com/science/article/pii/S0167865511002261
http://bura.brunel.ac.uk/handle/2438/8221
DOI: http://dx.doi.org/10.1016/j.patrec.2011.07.013
ISSN: 0167-8655
Appears in Collections:
Electronic and Computer Engineering Publications
Dept of Electronic and Computer Engineering Research Papers

Files in This Item:
File          Description   Size       Format
Fulltext.pdf                293.48 kB  Adobe PDF  View/Open
Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.