Journal of Guangdong University of Technology, 2018, Vol. 35, Issue (03): 107-112. doi: 10.12052/gdutxb.170138


Baton-like Attitude Tracking Based on Mixed Reality Interaction

Wu Zhi-min1, He Han-wu2, Wu Yue-ming1   

1. School of Electromechanical Engineering, Guangdong University of Technology, Guangzhou 510006, China;
2. Guangdong Polytechnic of Industry and Commerce, Guangzhou 510510, China
Received: 2017-09-26; Online: 2018-05-09; Published: 2018-05-24

Abstract: To address the limited naturalness, intuitiveness, and accuracy of interaction in current mixed reality systems, a new interaction method based on a baton-like device is proposed. The key to its implementation is measuring and tracking the baton's attitude accurately and in real time. The focus is therefore a vision-based method that tracks a two-color marker to measure the baton's attitude. The specific idea is to perform fast contour extraction on the color marker, fit the contour with a minimum-area rectangle to obtain the pose of the marker's projection in the image plane, and finally apply the principle of binocular vision to compute the baton's three-dimensional pose. To verify the validity of the proposed method, experiments are carried out to analyze the real-time performance and accuracy of the algorithm. The results show that both the real-time performance and the accuracy of the proposed method meet the interaction requirements.
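As a rough illustration of the pipeline described in the abstract (color-marker contour extraction, minimum-area-rectangle fitting, binocular triangulation), a minimal sketch in Python with OpenCV might look as follows. This is not the authors' implementation: the HSV color range, the helper names marker_endpoints_2d and baton_pose_3d, and the calibrated stereo projection matrices P_left and P_right are all assumptions, and a single color range stands in for the paper's marker.

```python
# Illustrative sketch only, assuming OpenCV, a calibrated stereo pair
# (projection matrices P_left, P_right), and a marker whose HSV range is known.
import cv2
import numpy as np

HSV_LO = np.array([100, 120, 60])   # assumed lower HSV bound of the marker color
HSV_HI = np.array([130, 255, 255])  # assumed upper HSV bound of the marker color

def marker_endpoints_2d(frame_bgr):
    """Extract the marker contour, fit a minimum-area rectangle, and return the
    midpoints of the rectangle's two short sides (the baton axis in the image plane)."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, HSV_LO, HSV_HI)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    box = cv2.boxPoints(cv2.minAreaRect(largest))   # 4 corners of the fitted rectangle
    # Pair the corners into the two short sides and take their midpoints.
    if np.linalg.norm(box[0] - box[1]) < np.linalg.norm(box[1] - box[2]):
        ends = [(box[0] + box[1]) / 2, (box[2] + box[3]) / 2]
    else:
        ends = [(box[1] + box[2]) / 2, (box[3] + box[0]) / 2]
    return np.array(ends, dtype=np.float32)

def baton_pose_3d(left_bgr, right_bgr, P_left, P_right):
    """Triangulate the two 2D endpoints from both views and return the baton's
    3D endpoints and unit direction vector (its attitude). Assumes the endpoints
    are listed in a consistent order in both views."""
    pts_l = marker_endpoints_2d(left_bgr)
    pts_r = marker_endpoints_2d(right_bgr)
    if pts_l is None or pts_r is None:
        return None
    homog = cv2.triangulatePoints(P_left, P_right, pts_l.T, pts_r.T)  # 4 x 2
    xyz = (homog[:3] / homog[3]).T                                    # two 3D endpoints
    axis = xyz[1] - xyz[0]
    return xyz, axis / np.linalg.norm(axis)
```

In practice the two endpoints would need to be matched consistently between the left and right views (for example, by the marker's two colors) before triangulation.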

Key words: mixed reality, human-computer interaction, attitude calculation

CLC Number: TP391