2019 Vol.2 Oct. No.5 |
---|
|
Reference: [1] Mitra S, Acharya T. Gesture Recognition: A Survey[J]. IEEE Transactions on Systems Man & Cybernetics Part C, 2007, 37(3):311-324. [2] Pei J, Hu H H, Tao L, et al. Head gesture recognition for hands‐free control of an intelligent wheelchair[J]. Industrial Robot An International Journal, 2007, 34(1):60-68. [3] Wu J, Jian C. Bayesian Co-Boosting for Multi-modal Gesture Recognition[J]. Journal of Machine Learning Research, 2017, 15(1):3013-3036. [4] Bakheet S, Alhamadi A. Hand Gesture Recognition Using Optimized Local Gabor Features[J]. Journal of Computational & Theoretical Nanoscience, 2017, 14(3):1380-1389. [5] Qi J, Jiang G, Li G, et al. Surface EMG hand gesture recognition system based on PCA and GRNN[J]. Neural Computing and Applications, 2019, (4): 1-9. [6] Li J, Huai H, Gao J, et al. Spatial-temporal dynamic hand gesture recognition via hybrid deep learning model[J]. Journal on Multimodal User Interfaces, 2019, (8): 1-9. [7] Latorrecarmona P, Pla F. 3D human gesture recognition using integral imaging[J]. Spienewsroom, 2018, 12(1):15-20. [8] Lu Z, Xiang C, Qiang L, et al. A Hand Gesture Recognition Framework and Wearable Gesture-Based Interaction Prototype for Mobile Devices[J]. IEEE Transactions on Human-Machine Systems, 2017, 44(2):293-299. [9] Jebril N A, Al-Zoubi H R, Al-Haija Q A. Recognition of Handwritten Arabic Characters using Histograms of Oriented Gradient (HOG)[J]. Pattern Recognition & Image Analysis, 2018, 28(2):321-345. [10] Judd P, Albericio J, Moshovos A. Stripes: Bit-Serial Deep Neural Network Computing[J]. IEEE Computer Architecture Letters, 2017, 16(1):80-83. [11] Brooks J L, Zoumpoulaki A, Bowman H. Data‐driven region‐of‐interest selection without inflating Type I error rate[J]. Psychophysiology, 2017, 54(1):100-113. [12] Chen C, Liang J, Zhao H, et al. Frame difference energy image for gait recognition with incomplete silhouettes[J]. Pattern Recognition Letters, 2009, 30(11):977-984. [13] Kaur B, Joshi G, Vig R. Indian sign language recognition using Krawtchouk moment-based local features[J]. Journal of Photographic Science, 2017, 65(3):171-179.
|
Tsuruta Institute of Medical Information Technology
Address:[502,5-47-6], Tsuyama, Tsukuba, Saitama, Japan TEL:008148-28809 fax:008148-28808 Japan,Email:jpciams@hotmail.com,2019-09-16