Hand gestures are a natural and intuitive component of human-to-human communication, exhibiting great lexical variety. It stands to reason that such a user input mechanism offers many benefits, including seamless interaction, intuitive control, and robustness to physical constraints and to ambient electrical, light, and sound interference. However, while humans readily decode the semantic and logical information encoded in hand gestures, leveraging this communication channel in human-robot interfaces remains a challenge. Recent data-driven deep learning approaches show promise in uncovering abstract and complex relationships that manual, rule-based classification schemes fail to capture. Such an approach is well suited to hand gesture recognition but requires large amounts of data. This data can be collected physically through user experiments, but that process is onerous and tedious; a streamlined approach with less overhead is sought.
This work presents a novel method of synthetic hand gesture dataset generation that leverages modern gaming engines. Furthermore, preliminary results indicate that the dataset, despite being synthetic and requiring no physical data collection, is both accurate and rich enough to train a real-world hand gesture classifier that operates in real-time.
Testing network classification performance across separate training dataset variations (kinematic, morphological, rotation, translation) characterizes the efficacy of the proposed methods and provides insights for improving the performance of behavior learned from synthetic data. These results have implications for using purely synthetic data to train real-world classifiers in other application domains.
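To make the rotation and translation variations concrete, the sketch below shows one illustrative way such perturbations could be applied to a rendered sample before training. This is a hypothetical helper, not the authors' pipeline: it assumes a synthetic image stored as a NumPy array, restricts rotation to 90-degree steps, and implements translation as a circular pixel shift.

```python
import numpy as np

def augment_sample(img, rng):
    """Apply a random rotation (90-degree steps) and a random translation
    (circular pixel shift) to one synthetic sample. Hypothetical helper
    for illustration only -- not the authors' actual pipeline."""
    k = rng.integers(0, 4)                 # rotate by 0, 90, 180, or 270 degrees
    out = np.rot90(img, k)
    dy, dx = rng.integers(-4, 5, size=2)   # shift by up to 4 px on each axis
    out = np.roll(out, shift=(dy, dx), axis=(0, 1))
    return out

rng = np.random.default_rng(0)
sample = rng.random((64, 64))              # stand-in for a rendered gesture image
augmented = augment_sample(sample, rng)
print(augmented.shape)                     # shape is preserved: (64, 64)
```

Varying which of these perturbations appear in the training set, as the evaluation above does, isolates how much each contributes to the classifier's ability to generalize from synthetic to real imagery.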
Affiliated Students: Kyle Lindgren
K. Lindgren, N. Kalavakonda, D. Caballero, K. Huang, and B. Hannaford, “Learned Hand Gesture Classification through Synthetically Generated Training Samples,” 2018 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS), Madrid, 2018.