Published on Thu Aug 16 2018

Egocentric Gesture Recognition for Head-Mounted AR devices

Tejo Chalasani, Jan Ondrej, Aljosa Smolic

Gestures are a natural extension from the real world to augmented reality. Finding discriminating features relevant to gestures and hands in ego-view is the primary challenge. We propose a data-driven, end-to-end deep learning approach to address the problem of gesture recognition.

Abstract

Natural interaction with virtual objects in AR/VR environments makes for a smooth user experience. Gestures are a natural extension from the real world to augmented space to achieve these interactions. Finding discriminating spatio-temporal features relevant to gestures and hands in ego-view is the primary challenge in recognising egocentric gestures. In this work we propose a data-driven, end-to-end deep learning approach to the problem of egocentric gesture recognition, which combines an ego-hand encoder network that finds ego-hand features with a recurrent neural network that discerns temporally discriminating features. Since deep learning networks are data intensive, we propose a novel data augmentation technique using green-screen capture to alleviate the problem of ground-truth annotation. In addition, we publish a dataset of 10 gestures performed in a natural fashion in front of a green screen for training, and the same 10 gestures performed in different natural scenes without a green screen for validation. We also present the results of our network's performance in comparison to the state of the art on the AirGest dataset.
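The pipeline the abstract describes, a per-frame ego-hand encoder whose features are folded through a recurrent network before classification, can be sketched conceptually. This is not the authors' architecture: the weights here are random stand-ins, the frame size, feature dimensions, and function names are all illustrative assumptions, and the encoder is reduced to a single projection so the data flow stays visible.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative dimensions (NOT from the paper): per-frame feature size,
# recurrent hidden size, and the 10 gesture classes the dataset contains.
FEAT_DIM, HID_DIM, N_CLASSES = 64, 32, 10

# Random stand-ins for learned weights.
W_enc = rng.standard_normal((FEAT_DIM, 3 * 8 * 8)) * 0.1  # "ego-hand encoder"
W_in = rng.standard_normal((HID_DIM, FEAT_DIM)) * 0.1     # RNN input weights
W_h = rng.standard_normal((HID_DIM, HID_DIM)) * 0.1       # RNN recurrent weights
W_out = rng.standard_normal((N_CLASSES, HID_DIM)) * 0.1   # classifier weights

def encode_frame(frame):
    """Map one toy ego-view frame (3x8x8) to a spatial feature vector."""
    return np.tanh(W_enc @ frame.ravel())

def recognise_gesture(frames):
    """Encode each frame, accumulate features through a vanilla RNN cell,
    and classify the gesture from the final hidden state."""
    h = np.zeros(HID_DIM)
    for frame in frames:
        h = np.tanh(W_in @ encode_frame(frame) + W_h @ h)
    return int(np.argmax(W_out @ h))

video = rng.standard_normal((16, 3, 8, 8))  # 16 toy frames standing in for a clip
pred = recognise_gesture(video)
print(pred)  # a class index in [0, 10)
```

The point of the sketch is the factoring: spatial features are extracted per frame, and only the recurrent stage sees time, which is how the encoder and the temporal model can be trained end to end.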

Mon Nov 04 2019
Computer Vision
Synthetic Video Generation for Robust Hand Gesture Recognition in Augmented Reality Applications
Hand gestures are a natural means of interaction in Augmented Reality and Virtual Reality (AR/VR) applications. There has been an increased focus on removing the dependence of accurate hand gesture recognition on expensive proprietary devices. Most such solutions rely on either multi-modal sensor data or deep neural networks.
Thu Jun 11 2020
Machine Learning
A Deep Learning Framework for Recognizing both Static and Dynamic Gestures
This feature makes it suitable for inexpensive human-robot interactions in social or industrial settings. We employ a pose-driven spatial attention strategy, which guides our proposed Static and Dynamic gestures Network - StaDNet.
Mon Apr 20 2020
Machine Learning
CatNet: Class Incremental 3D ConvNets for Lifelong Egocentric Gesture Recognition
Egocentric gestures are the most natural form of communication for humans to interact with wearable devices such as VR/AR helmets and glasses. Traditional deep learning methods require storing all previous class samples in the system and training the model again from scratch.
Tue Mar 03 2020
Machine Learning
3D dynamic hand gestures recognition using the Leap Motion sensor and convolutional neural networks
A method for the recognition of a set of non-static gestures acquired through the Leap Motion sensor. The acquired gesture information is converted into color images, where the variation of hand joint positions during the gesture is projected on a plane.
Wed Dec 13 2017
Computer Vision
Real-time Egocentric Gesture Recognition on Mobile Head Mounted Displays
Mobile virtual reality (VR) head mounted displays (HMD) have become popular among consumers in recent years. In this work, we demonstrate real-time egocentric hand gesture detection and localization on mobile HMDs.
Thu Jan 16 2020
Machine Learning
LE-HGR: A Lightweight and Efficient RGB-based Online Gesture Recognition Network for Embedded AR Devices
Online hand gesture recognition (HGR) techniques are essential in augmented reality (AR) applications for enabling natural human-to-computer interaction and communication. In recent years, the consumer market for low-cost AR devices has been rapidly growing, while the technology maturity in this domain is still limited.