|Title:||A real-time framework for vision based human robot interaction|
|Subjects:||Image Colour Analysis; Image Motion Analysis|
|Citation:||Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, Beijing, China, 9-15 October 2006, 5831-5836|
|Abstract:||Interactive mobile robots are an active area of research. This paper presents a framework for designing a real-time, vision-based hand-body dynamic gesture recognition system for such robots. The framework works under real-world lighting conditions with complex backgrounds and can handle intermittent motion of the camera. We present a novel way of building the motion history image (MHI) and the motion energy image (MEI). We propose a robust combination of motion and color cues, and we call the resulting image the motion color image (MCI). The input signal is captured with a monocular color camera; vision is the only feedback sensor used. It is assumed that the gesturer wears clothing that differs slightly from the background. Gestures are first learned offline and then matched in real time to the temporal data generated online. We tested the system on a gesture database of 11 hand-body gestures and recorded recognition accuracy of up to 90%. We have partially implemented and tested the system on Sony's Aibo robot dog using Sony's Remote Framework (RFW) SDK.|
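The abstract builds on motion history and motion energy images. The paper's own construction (and its MCI combination) is novel and not detailed here; the sketch below shows only the standard MHI/MEI update rule due to Bobick and Davis, in plain NumPy, with `tau`, `update_mhi`, and `motion_energy` being illustrative names, not the authors' API.

```python
import numpy as np

def update_mhi(mhi, motion_mask, tau=30):
    """One timestep of the classical MHI update: pixels where motion
    was detected are set to tau; all other pixels decay by 1 toward 0."""
    mhi = mhi.copy()
    mhi[motion_mask] = tau
    mhi[~motion_mask] = np.maximum(mhi[~motion_mask] - 1, 0)
    return mhi

def motion_energy(mhi):
    """The MEI is the binary support of the MHI: where any recent motion
    occurred within the last tau frames."""
    return mhi > 0

# Toy example: a 4x4 frame in which only the top-left pixel moved.
mhi = np.zeros((4, 4), dtype=np.int32)
mask = np.zeros((4, 4), dtype=bool)
mask[0, 0] = True
mhi = update_mhi(mhi, mask, tau=30)
```

In a full pipeline, `motion_mask` would come from frame differencing or background subtraction, and the paper combines the resulting motion cue with a color cue to form the MCI.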
|Appears in Collections:||Proceedings papers|