广东工业大学学报 (Journal of Guangdong University of Technology), 2022, Vol. 39, Issue (03): 83-88. doi: 10.12052/gdutxb.210082


基于Kinect的人体−皮影交互体验装置设计

张紫然1, 宋文芳1, 张祖耀2   

  1. 广东工业大学 艺术与设计学院, 广东 广州 510090;
    2. 浙江理工大学 艺术与设计学院, 浙江 杭州 310000
  • 收稿日期:2021-05-26 出版日期:2022-05-10 发布日期:2022-05-19
  • Corresponding author: SONG Wen-fang (b. 1983), female, associate professor; her research focuses on intelligent wearable product design. E-mail: kaffy@163.com
  • About the first author: ZHANG Zi-ran (b. 1998), female, master's degree candidate; her research focuses on product design
  • Funding: Humanities and Social Sciences Research General Project of the Ministry of Education (502210196); Guangdong Provincial College Students' Innovation Training Program (212210280)

A Design of the Human-Shadow Play Interactive Experience Device Based on Kinect

Zhang Zi-ran1, Song Wen-fang1, Zhang Zu-yao2   

  1. School of Art and Design, Guangdong University of Technology, Guangzhou 510090, China;
    2. School of Art and Design, Zhejiang Sci-Tech University, Hangzhou 310000, China
  • Received:2021-05-26 Online:2022-05-10 Published:2022-05-19

摘要: 皮影戏作为我国民间传统的艺术形式,具有重要的文化传承价值。然而在现代化发展道路上,皮影戏深受现代新娱乐方式的冲击,呈现演出市场疲软、后继无人、萎缩甚至消亡的局面,因此实现其现代化创新发展是目前急需探索的重要课题。基于此,设计了一种基于Kinect的人体−皮影交互体验装置,该装置中的皮影表演可由人体肢体动作控制,互动性强且操作简单。该装置通过Kinect捕捉人体动作,发出指令到安装于皮影关节处的舵机,由舵机控制皮影模仿人体运动。最后以人体左臂运动为例,测试了左臂运动角度与舵机运动角度,并计算了舵机运动的准确度。结果发现皮影左臂处的舵机跟随人体左臂运动的准确度为96.6%~99.8%,证明其具有较高的准确性和灵敏性。该装置具有较强的趣味性,有利于人们更好地理解和传承皮影文化。

关键词: 皮影, Kinect, 舵机, 交互体验

Abstract: Shadow play is a traditional Chinese folk art with great cultural heritage value. However, under the impact of modern forms of entertainment, it faces a weak performance market, a shortage of successors, and continuing shrinkage or even extinction, so its modernization and innovative development has become an urgent topic. To address this, a Kinect-based human-shadow play interactive experience device is designed, in which the puppet's performance is controlled by the user's body movements, making the device highly interactive and easy to operate. Specifically, the device captures human motion with a Kinect sensor and sends commands to the steering engines (servos) mounted at the puppet's joints, which drive the puppet to imitate the human movements. Taking left-arm motion as an example, the rotation angles of the human left arm and of the corresponding steering engine are measured, and the tracking accuracy of the steering engine is calculated. The results show that the steering engine at the puppet's left arm follows the human left arm with an accuracy of 96.6% to 99.8%, indicating high accuracy and sensitivity. The device is engaging and helps people better understand and inherit shadow play culture.

Key words: shadow play, Kinect, steering engine, interaction experience
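
The abstracts describe the control chain only at a high level: Kinect skeleton capture, joint-angle extraction, and commands sent to the steering engines at the puppet's joints. The short Python sketch below is only an illustration of that idea under stated assumptions, not the authors' implementation; the joint coordinates, the serial port name and the ASCII command format are invented for the example.

# Minimal sketch (assumption, not the authors' code): map a Kinect-style
# left-elbow angle onto a hobby-servo ("steering engine") command.
import math
import serial  # pyserial is assumed as the link to the servo controller

def joint_angle(a, b, c):
    """Return the angle (degrees) at joint b formed by 3-D points a-b-c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(max(-1.0, min(1.0, dot / (n1 * n2)))))

def to_servo(angle_deg, lo=0, hi=180):
    """Clamp a body-joint angle into the servo's mechanical travel."""
    return int(max(lo, min(hi, angle_deg)))

# Hypothetical skeleton frame: left shoulder, elbow and wrist positions (m).
shoulder, elbow, wrist = (0.20, 0.40, 2.0), (0.20, 0.15, 2.0), (0.45, 0.15, 2.0)
cmd_angle = to_servo(joint_angle(shoulder, elbow, wrist))  # 90 degrees here

# Assumed serial transport and command format; the real device may differ.
with serial.Serial("/dev/ttyUSB0", 9600, timeout=1) as link:
    link.write(f"LARM:{cmd_angle}\n".encode())

Clamping to 0-180 degrees reflects the typical travel of a hobby servo; whether the device maps the body angle one-to-one or rescales it is not stated in the abstract.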

中图分类号 (CLC number): TB472

参考文献 (References):
[1] 钟贤权. 当代皮影艺术的生存困境与现代创新[J]. 中华文化论坛, 2013(5): 131-135.
ZHONG X Q. The survival dilemma and modern innovation of contemporary shadow puppets art [J]. Journal of Chinese Culture, 2013(5): 131-135.
[2] 辛雨璇, 邹墨馨. 皮影艺术的数字化保护与传承思路探索[J]. 戏剧之家, 2020(23): 48-50.
XIN Y X, ZOU M X. Exploring the idea of digital preservation and inheritance of shadow puppets art [J]. Home Drama, 2020(23): 48-50.
[3] 高星, 苏宇伦, 王泽宇. 华县皮影面临失传困境[J]. 人民周刊, 2018(12): 90-91.
GAO X, SU Y L, WANG Z Y. Huaxian shadow puppets faces the dilemma of loss [J]. People's Weekly, 2018(12): 90-91.
[4] 张海超. 影随展动寓教于乐——巡展中的唐山皮影表演[J]. 文物鉴定与鉴赏, 2019(4): 133.
ZHANG H C. Shadow moves with the exhibition, educating and entertaining: Tangshan shadow play performance in the touring exhibition [J]. Identification and Appreciation to Cultural Relics, 2019(4): 133.
[5] 赵双柱, 包亚飞, 潘思凡, 等. 基于AR技术的非遗文化的保护与开发研究——以甘肃环县道情皮影戏为例[J]. 兰州文理学院学报(自然科学版), 2017, 31(6): 89-92.
ZHAO S Z, BAO Y F, PAN S F, et al. On the protection and development of intangible cultural heritage by using AR—a case study of Daoqing shadow play in Huan county, Gansu [J]. Journal of Lanzhou University of Arts and Science (Natural Sciences), 2017, 31(6): 89-92.
[6] 刘虹弦. 江汉平原皮影戏艺术特征及数字化研究[J]. 武汉纺织大学学报, 2020, 33(5): 3-7.
LIU H X. Research on the artistic characteristics and digitalization of shadow play in Jianghan plain [J]. Journal of Wuhan Textile University, 2020, 33(5): 3-7.
[7] 洪诗莹. 网络传播视域下的山东泰山皮影手机应用设计研究[D]. 济南: 山东大学, 2019.
HONG S Y. Research on the mobile application design of Shandong Taishan shadow play from the perspective of network communication [D]. Jinan: Shandong University, 2019.
[8] TIAN Y Z, WANG G P, LI L, et al. A universal self-adaption workspace mapping method for human-robot interaction using Kinect sensor data [J]. IEEE Sensors Journal, 2020, 20(14): 7918-7928.
[9] LIU F L, ZENG W, YUAN C Z, et al. Kinect-based hand gesture recognition using trajectory information, hand motion dynamics and neural networks [J]. Artificial Intelligence Review, 2019, 52(1): 563-583.
[10] ASHWINI K, AMUTHA R. Compressive sensing-based recognition of human upper limb motions with Kinect skeletal data [J]. Multimedia Tools and Applications, 2021, 80: 1-19.
[11] 曾碧, 林展鹏, 邓杰航. 自主移动机器人走廊识别算法研究与改进[J]. 广东工业大学学报, 2016, 33(5): 9-14.
ZENG B, LIN Z P, DENG J H. Algorithm research on recognition and improvement for corridor of autonomous mobile robot [J]. Journal of Guangdong University of Technology, 2016, 33(5): 9-14.
[12] 高金潇, 陈亦楠, 李福浩. 基于Kinect的动作识别跟踪的机械臂平台[J]. 科技资讯, 2019, 17(15): 1-4.
GAO J X, CHEN Y N, LI F H. Kinect-based robotic arm platform for motion recognition tracking [J]. Science & Technology Information, 2019, 17(15): 1-4.
[13] 徐军, 孟月霞, 王天伦, 等. 基于Kinect的仿人机器人控制系统[J]. 传感器与微系统, 2017, 36(9): 97-100.
XU J, MENG Y X, WANG T L, et al. Control system for humanoid robot based on Kinect [J]. Transducer and Microsystem Technologies, 2017, 36(9): 97-100.
[14] 胡敦利, 柯浩然, 张维. 基于Kinect和ROS的骨骼轨迹人体姿态识别研究[J]. 高技术通讯, 2020, 30(2): 177-184.
HU D L, KE H R, ZHANG W. Research on human body attitude recognition based on Kinect and ROS [J]. Chinese High Technology Letters, 2020, 30(2): 177-184.
[15] 吴智敏, 何汉武, 吴悦明. 基于混合现实交互的指挥棒位姿跟踪[J]. 广东工业大学学报, 2018, 35(3): 107-112.
WU Z M, HE H W, WU Y M. Baton-like attitude tracking based on mixed reality interaction [J]. Journal of Guangdong University of Technology, 2018, 35(3): 107-112.
[16] 王刚, 孙太任, 丁胜培. 动态受限机械臂的局部加权学习控制[J]. 系统仿真学报, 2019, 31(4): 733-739.
WANG G, SUN T R, DING S P. Locally weighted learning control for dynamic restricted manipulators [J]. Journal of System Simulation, 2019, 31(4): 733-739.
[17] 李琪, 王向东, 李华. 基于双Kinect传感器的三维人体姿态跟踪方法[J]. 系统仿真学报, 2020, 32(8): 1446-1454.
LI Q, WANG X D, LI H. 3D human pose tracking approach based on double Kinect sensors [J]. Journal of System Simulation, 2020, 32(8): 1446-1454.
[18] HUANG X F, SUN S Q, ZHANG K J, et al. A method of shadow puppet figure modeling and animation [J]. Frontiers of Information Technology & Electronic Engineering, 2015, 5: 367-379.
[19] SHU J, HAMANO F, ANGUS J. Application of extended Kalman filter for improving the accuracy and smoothness of Kinect skeleton-joint estimates [J]. Journal of Engineering Mathematics, 2014, 88: 161-175.
[20] CHIN L, EU K S, TAY T T, et al. A posture recognition model dedicated for differentiating between proper and improper sitting posture with Kinect sensor[C]// 2019 IEEE International Symposium on Haptic, Audio and Visual Environments and Games (HAVE). Subang Jaya: IEEE, 2019: 1-5.
[21] LI P. Research on robot boxing movement simulation based on Kinect sensor [J]. EURASIP Journal on Wireless Communications and Networking, 2020, 147(1): 2-15.