Non-Contact Gesture Interaction Method Based on Cursor Model in Immersive Medical Visualization
  • Authors: Lei Jinshu; Wang Song; Zhu Dong; Wu Yadong
  • Affiliations: School of Computer Science and Technology, Southwest University of Science and Technology; Sichuan Civil-Military Integration Institute, Southwest University of Science and Technology
  • Keywords: immersive medical visualization; non-contact gesture interaction; cursor model; HTC VIVE+Leap Motion
  • Journal: Journal of Computer-Aided Design & Computer Graphics (计算机辅助设计与图形学学报), abbreviation JSJF
  • Publication date: 2019-02-15
  • Year: 2019; Volume: 31; Issue: 02; Pages: 28-37 (10 pages)
  • Record number: JSJF201902004
  • CN: 11-2925/TP
  • Language: Chinese
  • Funding: National Key R&D Program of China (2016QY04W0801); National Defense Basic Research Program (JCKY2017404C004); Science & Technology Department of Sichuan Province projects (2017TJPT0200, 2017KZ0023, 2017GZ0186); Postgraduate Innovation Fund of Southwest University of Science and Technology (17ycx051); Longshan Talent Program of Southwest University of Science and Technology (18lzx409, 18lzxt13); Science and Technology Innovation Team Support Program of the Sichuan Provincial Department of Education (18zd1102); Open Fund of the Sichuan Civil-Military Integration Institute (300015, 18sxb030); Sichuan Science and Technology Innovation Seedling Project (2018034)
Abstract
        In a highly immersive virtual environment, the user's entire visual field is covered by the rendered visualization, so traditional interaction devices such as the mouse, keyboard, and touch screen cannot be applied directly, which greatly reduces the efficiency and accuracy of analyzing and understanding 3D medical data. To enable interactive manipulation of 3D medical data in immersive virtual environments, this paper proposes a non-contact gesture interaction method based on a cursor model. First, the cursor model is used to determine the gesture state quickly, and four gesture actions are defined to realize four medical visualization operations: translation, scaling, rotation, and sectioning; a spring model is then applied to alleviate gesture jitter. Finally, an immersive medical visualization system is built with HTC VIVE+Leap Motion, six analysis tasks are defined, and the method is compared against mouse and handheld-controller interaction to verify its effectiveness in terms of training time, completion time, operation difficulty, and user feedback.
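The abstract's "spring model" for suppressing gesture jitter can be read as attaching the rendered cursor to the raw tracked hand position by a damped spring, so high-frequency tracking noise is absorbed while deliberate motion passes through. The following is a minimal sketch of that idea; the class name, spring constants, and semi-implicit Euler integration are illustrative assumptions, not the authors' implementation.

```python
class SpringSmoother:
    """Damped-spring filter for a noisy 3D tracked position."""

    def __init__(self, stiffness=120.0, damping=22.0):
        self.k = stiffness   # spring constant pulling cursor toward the raw hand position
        self.c = damping     # damping coefficient suppressing oscillation
        self.pos = None      # smoothed cursor position (3-vector)
        self.vel = [0.0, 0.0, 0.0]

    def update(self, raw_pos, dt):
        """Advance the spring one frame toward the raw tracked position."""
        if self.pos is None:
            # First frame: snap directly to the tracked position.
            self.pos = list(raw_pos)
            return tuple(self.pos)
        for i in range(3):
            # a = k * (target - x) - c * v : damped spring toward raw_pos
            accel = self.k * (raw_pos[i] - self.pos[i]) - self.c * self.vel[i]
            self.vel[i] += accel * dt       # semi-implicit Euler step
            self.pos[i] += self.vel[i] * dt
        return tuple(self.pos)
```

With stiffness 120 and damping 22 the spring is close to critically damped, so the cursor settles on a new hand position without overshoot; the rendered gestures then drive the four operations (translation, scaling, rotation, sectioning) from the smoothed position rather than the raw Leap Motion frame.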
