First-Person View Hand Parameter Estimation Based on Fully Convolutional Neural Network

En Te Chou, Yun Chih Guo, Ya Hui Tang, Pei Yung Hsiao, Li Chen Fu*

*Corresponding author for this work

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

In this paper, we propose a real-time framework that not only estimates the locations of hands within an RGB image but also simultaneously estimates their corresponding 3D joint coordinates and determines whether each hand is a left or right hand. Most recent methods for hand pose analysis from monocular images focus only on the 3D coordinates of hand joints, which does not give users or applications the full picture. Moreover, to meet the demands of applications such as virtual reality or augmented reality, a first-person viewpoint hand pose dataset is needed to train our proposed CNN. Thus, we collect a synthetic RGB dataset captured from an egocentric view with the help of Unity, a 3D engine. The synthetic dataset is composed of hands with various postures, skin colors, and sizes. For each hand within an image, we provide 21 joint annotations, including 3D coordinates, 2D locations, and the corresponding hand side (left or right).
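As a minimal sketch (not taken from the paper), the per-hand annotation described in the abstract could be organized as follows; the field names and array layout are assumptions for illustration, while the stated facts are the 21 joints per hand, their 3D coordinates, their 2D image locations, and the left/right hand-side label.

# Hypothetical per-hand annotation record for a synthetic egocentric hand dataset.
# Only the 21-joint count, 3D/2D coordinates, and hand-side label come from the abstract;
# everything else (names, shapes, dtypes) is an illustrative assumption.
from dataclasses import dataclass
import numpy as np

NUM_JOINTS = 21  # per-hand joint count stated in the abstract

@dataclass
class HandAnnotation:
    joints_3d: np.ndarray  # shape (21, 3): 3D joint coordinates
    joints_2d: np.ndarray  # shape (21, 2): pixel locations of the same joints in the RGB image
    hand_side: str         # "left" or "right"

    def __post_init__(self):
        assert self.joints_3d.shape == (NUM_JOINTS, 3)
        assert self.joints_2d.shape == (NUM_JOINTS, 2)
        assert self.hand_side in ("left", "right")

# Example: one synthetic right hand with placeholder coordinates.
example = HandAnnotation(
    joints_3d=np.zeros((NUM_JOINTS, 3), dtype=np.float32),
    joints_2d=np.zeros((NUM_JOINTS, 2), dtype=np.float32),
    hand_side="right",
)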

Original language: English
Title of host publication: Pattern Recognition - 5th Asian Conference, ACPR 2019, Revised Selected Papers
Editors: Shivakumara Palaiahnakote, Gabriella Sanniti di Baja, Liang Wang, Wei Qi Yan
Publisher: Springer
Pages: 224-237
Number of pages: 14
ISBN (Print): 9783030412982
DOIs
Publication status: Published - 2020
Externally published: Yes
Event: 5th Asian Conference on Pattern Recognition, ACPR 2019 - Auckland, New Zealand
Duration: 26 Nov 2019 - 29 Nov 2019

Publication series

Name: Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)
Volume: 12047 LNCS
ISSN (Print): 0302-9743
ISSN (Electronic): 1611-3349

Conference

Conference: 5th Asian Conference on Pattern Recognition, ACPR 2019
Country/Territory: New Zealand
City: Auckland
Period: 26/11/19 - 29/11/19

Bibliographical note

Publisher Copyright:
© 2020, Springer Nature Switzerland AG.
