Review of depth perception in virtual and real fusion environment
2021, Vol. 26, Issue 6, Pages 1503-1520
Print publication date: 2021-06-16
Accepted: 2021-03-22
DOI: 10.11834/jig.210027
Jiamin Ping, Yue Liu, Dongdong Weng. Review of depth perception in virtual and real fusion environment[J]. Journal of Image and Graphics, 2021,26(6):1503-1520.
Mixed reality systems can present virtual-real fusion scenes in which virtual information is superimposed on the real environment in real time, and they have broad application prospects in fields such as education and training, heritage preservation, military simulation, equipment manufacturing, surgical medicine, and exhibitions. A mixed reality system first uses calibration data to build a virtual camera model, then renders virtual content in real time according to the head tracking results and the position of the virtual camera, and superimposes the content on the real environment; the user perceives depth information through the graphical cues drawn in the fusion scene and the features of the virtual objects. However, the visual laws and perception theories available to guide the rendering of virtual-real fusion scenes are scarce, the absolute depth information that graphical cues can provide is missing, and the rendering dimensions and feature indicators of virtual objects are insufficient. This paper analyzes the visual laws relevant to the rendering of virtual-real fusion scenes and, from the perspective of user perception, surveys the drawing of graphical cues and the rendering of virtual objects in such scenes, and finally discusses the trends and focal points of future research on depth perception in virtual-real fusion scenes.
Mixed reality systems provide virtual-real fusion environments in which virtual objects are superimposed on the real world in real time. Such systems have been widely applied in education and training, heritage preservation, military simulation, equipment manufacturing, surgery, and exhibitions. A mixed reality system first uses calibration data to build a virtual camera model, then renders virtual content in real time according to the head tracking data and the position of the virtual camera, and finally superimposes the virtual content on the real environment. The user perceives the depth of a virtual object by integrating the graphical cues drawn in the scene with the rendered features of the object itself. When a user observes a virtual-real fusion scene presented by a mixed reality system, three processes take place. 1) Different sources of distance information are converted into separate distance signals; the key factor here is how the scene is presented by the rendering technology, and the user judges distance from the inherent characteristics of the virtual object. 2) The user registers the other visual stimulus variables in the scene and combines the separate distance signals into a final distance signal; the key factor here is the depth cues provided in the fusion scene, which the user needs in order to locate the object. 3) The user determines the distance relations among the objects in the scene and converts the final distance signal into the corresponding indicated distance; the key factor here is the visual behavior of the human eye when viewing the virtual-real scene.
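To make the rendering pipeline concrete, the following is a minimal sketch assuming a pinhole virtual camera; the function names, intrinsic values, and pose convention are illustrative assumptions, not details given in the paper.

```python
# Minimal sketch of the pipeline described above: calibration data -> virtual
# camera model -> per-frame head pose -> projection for the virtual overlay.
# All names, intrinsic values, and the pose convention are illustrative
# assumptions, not details taken from the paper.
import numpy as np

def make_intrinsics(fx, fy, cx, cy):
    """Virtual camera intrinsic matrix K, obtained from calibration data."""
    return np.array([[fx, 0.0, cx],
                     [0.0, fy, cy],
                     [0.0, 0.0, 1.0]])

def project(K, R, t, point_world):
    """Project one virtual 3D point into image coordinates for this frame.

    R, t: world-to-camera rotation and translation reported by the head
    tracker. The returned pixel is where the compositor overlays the
    virtual content on the real view; the returned z is its rendered depth.
    """
    p_cam = R @ point_world + t        # world -> camera coordinates
    u, v, w = K @ p_cam                # camera -> homogeneous image coords
    return np.array([u / w, v / w]), p_cam[2]

# Example frame: identity head pose, a virtual point 2 m in front of the eye.
K = make_intrinsics(fx=800.0, fy=800.0, cx=640.0, cy=360.0)
pixel, depth = project(K, np.eye(3), np.zeros(3), np.array([0.0, 0.0, 2.0]))
print(pixel, depth)  # -> [640. 360.] 2.0 (image centre, 2 m rendered depth)
```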
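Stage 2 of the model, combining separate distance signals into a final distance signal, is often formalized as reliability-weighted averaging (a "weak fusion" rule); the sketch below follows that convention as an assumption, and the cue names and weights are hypothetical.

```python
# Minimal sketch of stage 2 above: separate per-cue distance signals are
# combined into a single final distance signal. The reliability-weighted
# average and the cue names/weights are assumptions for illustration,
# not the paper's model.
import numpy as np

def combine_distance_signals(signals, reliabilities):
    """Fuse per-cue distance estimates (metres) by reliability weighting."""
    w = np.asarray(reliabilities, dtype=float)
    w = w / w.sum()                    # normalize weights to sum to 1
    return float(np.dot(w, signals))   # weighted average = final signal

# Example: estimates from occlusion, cast shadow, and linear perspective.
final = combine_distance_signals([2.1, 1.8, 2.4], [0.5, 0.2, 0.3])
print(final)  # -> 2.13
```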
Three problems nevertheless remain: the visual principles and perception theories that could guide the rendering of virtual-real fusion scenes are scarce; the absolute depth information that graphical cues can provide is missing; and the rendering dimensions and feature indicators of virtual objects are insufficient.

First, studies on the visual laws and perception theories applicable to virtual-real scene rendering are limited. The visual model and perceptual behavior of the human eye when viewing virtual-real fusion scenes should be studied to form effective application guidance, so that visual laws can be applied in the design and development of such scenes, the accuracy of depth perception can be increased, and the rendering quality, interaction efficiency, and user experience of mixed reality applications can be improved. Second, the absolute depth information provided by graphical cues in the fusion scene is missing. Graphical cues that convey effective absolute depth information should be generated, the characteristics of different graphical cues should be extracted, and their effects on depth perception should be quantified; this helps users perceive the depth of the target object, improves user performance, and provides a basis for scene rendering. Third, the rendering dimensions and feature indicators of virtual objects in fusion scenes are insufficient. Reasonable parameter indicators and effective rendering methods should be studied, interaction models among different features should be built, and the role of each rendering characteristic in depth perception should be clarified, so as to identify the characteristics that dominate the rendering of virtual objects and provide a basis for rendering the fusion scene.

This review first analyzes the visual principles underlying the rendering of virtual-real fusion environments, then surveys the drawing of graphical cues and the rendering of virtual objects in such scenes, and finally discusses research trends in depth perception in virtual-real fusion scenes. When viewing virtual-real scenes, humans perceive the objects in the scene through the visual system; the visual-function factors related to the perception mechanism and the guiding effect of visual laws on depth perception should be studied to optimize scene rendering. With the development and application of perception technology in mixed reality, many researchers have in recent years studied ground contact theory, the anisotropy of human visual perception, and the distribution of gaze points in depth perception. Both the background environment and the virtual objects in a fusion scene can provide depth cues; most existing studies add various depth cues to the scene and explore, through experiments, the relationship between the added depth information and depth perception. With the rapid development of computer graphics, an increasing number of graphics techniques have been applied to the construction of virtual-real fusion scenes to strengthen the depth cues of virtual objects, including linear perspective, graphical techniques that indicate position information, and techniques that create X-ray vision. The virtual objects presented by a mixed reality system are an important component of the fusion environment; to study the role of their inherent characteristics in depth perception, researchers have experimentally quantified the effects of virtual objects' size, color, brightness, transparency, texture, and surface lighting. These rendering-based characteristics have their roots in 17th-century painting techniques, yet they differ from traditional pictorial depth cues.
Keywords: real and virtual fusion environment; scene rendering; depth perception; mixed reality; visual law; depth cues; perceptual matching