Please use this identifier to cite or link to this item: http://buratest.brunel.ac.uk/handle/2438/12891
Title: Gesture recognition using low-cost devices: Techniques, applications, perspectives
Other Titles: Riconoscimento di gesti mediante dispositivi a basso costo: tecniche, applicazioni, prospettive
Authors: Gentile, V
Sorce, S
Malizia, A
Gentile, A
Keywords: Gesture recognition;Kinect-like devices;Human-computer interaction;Touchless interaction
Issue Date: 2016
Publisher: AICA - Associazione Italiana per l'Informatica ed il Calcolo Automatico
Citation: Mondo Digitale, 15(63): pp. 161 - 169, (2016)
Abstract: In recent decades, we have witnessed the rise of so-called Kinect-like devices, which rely on a set of low-cost sensors to acquire RGB and depth data of a scene. The high accessibility of such devices, mainly in terms of cost, has driven their adoption as a fundamental tool for gesture recognition in a large number of applications, both commercial and research-related. In this paper, we first discuss some of the general principles adopted by most of the main gesture recognition techniques described in the literature. We then present some application fields in which Kinect-like devices and gesture recognition algorithms have been used, ranging from educational and recreational examples to more complex scientific fields (e.g. domotics, robotics, and biomedical engineering). In two annexes, we list and briefly compare the main features of the Kinect-like devices available on the market, and we describe one of the most popular algorithms for skeletal tracking, which is the basis for gesture recognition.
URI: http://mondodigitale.aicanet.net/2016-2/articoli/02_Riconoscimento_di_gesti_mediante_dispositivi_a_basso_costo.pdf
http://mondodigitale.aicanet.net/ultimo/index.xml
http://bura.brunel.ac.uk/handle/2438/12891
ISSN: 1720-898X
Appears in Collections:Dept of Computer Science Research Papers

Files in This Item:
File          Description  Size      Format
Fulltext.pdf               15.95 MB  Adobe PDF


Items in BURA are protected by copyright, with all rights reserved, unless otherwise indicated.