Abstract
When walking through space, both dynamic visual information (optic flow) and body-based information (proprioceptive and vestibular) jointly specify the magnitude of distance travelled. While recent evidence has demonstrated the extent to which each of these cues can be used independently, less is known about how they are integrated when simultaneously present. Many studies have shown that sensory information is integrated using a weighted linear sum, yet little is known about whether this holds true for the integration of visual and body-based cues for travelled distance perception. In this study, using virtual reality technology, participants first travelled a predefined distance and subsequently matched this distance by adjusting an egocentric, in-depth target. The visual stimulus consisted of a long hallway and was presented in stereo via a head-mounted display. Body-based cues were provided either by walking in a fully tracked free-walking space (Exp. 1) or by being passively moved in a wheelchair (Exp. 2). Travelled distances were specified either through optic flow alone, body-based cues alone, or both cues combined. In the combined condition, visually specified distances were either congruent (1.0×) or incongruent (0.7× or 1.4×) with distances specified by body-based cues. Responses reflected a consistent combined effect of both visual and body-based information, with an overall higher influence of body-based cues when walking and a higher influence of visual cues during passive movement. Comparing the results of Experiments 1 and 2 shows that both proprioceptive and vestibular cues contribute to travelled distance estimates during walking. The observed results were well described by a basic linear weighting model.
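The weighted linear sum mentioned in the abstract can be sketched as follows. This is a minimal illustration of reliability-based cue weighting in general (weights proportional to each cue's inverse variance, summing to 1), not the fitted model or data from the study; all distances and variances below are invented for illustration.

```python
def combined_estimate(d_visual, d_body, var_visual, var_body):
    """Weighted linear sum of two distance cues.

    Under reliability-based integration, each cue's weight is its
    relative reliability (inverse variance), and the weights sum to 1.
    """
    w_visual = (1 / var_visual) / (1 / var_visual + 1 / var_body)
    w_body = 1 - w_visual
    return w_visual * d_visual + w_body * d_body

# Hypothetical incongruent trial: vision specifies 0.7x the body-based
# distance (7 m vs 10 m). With body-based cues twice as reliable
# (half the variance), the combined estimate lies closer to the
# body-based value: (1/3)*7 + (2/3)*10 = 9.0.
print(combined_estimate(d_visual=7.0, d_body=10.0,
                        var_visual=2.0, var_body=1.0))
```

Qualitatively, this captures the reported pattern: when body-based cues dominate (walking), estimates shift toward the body-specified distance; when visual cues dominate (passive movement), they shift toward the visually specified one.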
Acknowledgments
We would like to thank Hong-Jin Sun for many helpful discussions and input on the experimental design. We also thank Betty Mohler and Michael Weyel for their technical assistance and informative input, Brian Oliver for his assistance with the experimental set-up and pilot testing, Ilja Frissen for comments on earlier versions of the manuscript, and Simon Musall for his invaluable assistance in collecting the data. The research was supported by funding from the Max Planck Society and by the World Class University (WCU) program funded by the Ministry of Education, Science and Technology through the National Research Foundation of Korea (R31-10008).
Cite this article
Campos, J.L., Butler, J.S. & Bülthoff, H.H. Multisensory integration in the estimation of walked distances. Exp Brain Res 218, 551–565 (2012). https://doi.org/10.1007/s00221-012-3048-1