
William B. Thompson, thompson@cs.utah.edu, School of Computing, University of Utah, Salt Lake City, UT 84112

Peter Willemsen, willemsn@cs.utah.edu, School of Computing, University of Utah, Salt Lake City, UT 84112

Amy A. Gooch¹

amygooch@northwestern.edu, School of Computing, University of Utah, Salt Lake City, UT 84112

Sarah H. Creem-Regehr, sarahcreem@psych.utah.edu, Psychology Department, University of Utah, Salt Lake City, UT 84112

Jack M. Loomis, loomis@psych.ucsb.edu, Department of Psychology, University of California at Santa Barbara, Santa Barbara, CA 93106

Andrew C. Beall, beall@psych.ucsb.edu, Department of Psychology, University of California at Santa Barbara, Santa Barbara, CA 93106

Presence, Vol. 13, No. 5, October 2004, 560–571

© 2004 by the Massachusetts Institute of Technology

Does the Quality of the Computer Graphics Matter when Judging Distances in Visually Immersive Environments?

Abstract

In the real world, people are quite accurate in judging distances to locations in the environment, at least for targets resting on the ground plane and distances out to about 20 m. Distance judgments in visually immersive environments are much less accurate. Several studies have now shown that in visually immersive environments the world appears significantly smaller than intended. This study investigates whether or not the compression in apparent distances is the result of the low-quality computer graphics utilized in previous investigations. Visually directed triangulated walking was used to assess distance judgments in the real world and in three virtual environments with graphical renderings of varying quality.

1 Introduction

The utility of visually immersive interfaces for applications such as simulation, education, and training is in part a function of how accurately such interfaces convey a sense of the simulated world to a user. In order for a user to act in a virtual world as if present in the physical world being simulated, he or she must perceive spatial relations the same way they would be perceived if the user were actually in the physical world. Subjectively, current-generation virtual worlds often appear smaller than their intended size, impacting a user’s ability to accurately interact with the simulation and the potential to transfer the spatial knowledge back to the real world.

Controlled experiments done by several different research groups are starting to provide objective evidence for this effect. Distance judgments to targets presented in visually immersive displays are often significantly compressed. There has been much speculation about the cause of this effect. Limited field of view (FOV), the difficulties in accurately presenting binocular stereo using devices such as head-mounted displays (HMDs), errors in accommodation, and limits on sharpness and resolution have all been suggested as potentially contributing to the misperception of distance (Rolland, Gibson, & Arierly, 1995; Ellis & Menges, 1997; Witmer & Sadowski, 1998). Loomis and Knapp (2003) hypothesize that distance judgments are compressed in visually immersive environments because “the rendering of the scenes is lacking subtle

¹Present address: Department of Computer Science, Northwestern University, Evanston, IL 60201.


but important visual cues (e.g., natural texture, highlights). If this hypothesis is correct, it means that photorealistic rendering of the surfaces and objects in a simulated environment is likely to produce more accurate perception of distance.”

This paper explores the conjecture that image quality affects distance judgments in virtual environments. We start with a discussion of what is meant by a “distance judgment” and point out that different types of distance judgments likely depend on distinctly different visual cues. We next discuss how to experimentally determine perceptual judgments of one type of perceived distance. This is followed by the presentation of experimental results comparing distance judgments in the real world with judgments based on graphical renderings of varying quality, showing that quality of graphics has little effect on the accuracy of distance judgments. We end with a discussion contributing to the speculation on why distances are incorrectly perceived in visually immersive displays.

2 Background

2.1 Visual Cues for Distance

Visual perception of distance can be defined in multiple ways. It is often categorized by the frame of reference used. Egocentric distances are measured from the observer to individual locations in the environment. Exocentric distances are measured between two points in the environment. The distinction is important for two reasons. First of all, the errors associated with the perception of egocentric and exocentric distances are different. Although people perceive egocentric distances accurately when distance cues are abundant, they make large systematic errors in perceiving an exocentric interval under the same viewing conditions. Recent research by Foley, Ribeiro-Filho, and Da Silva (2004) and by Loomis, Philbeck, and Zahorik (2002) demystifies these paradoxical results. Secondly, some depth cues, such as shadows, can provide information about exocentric distances but not egocentric distances.

Another distinction between types of distance perception is also critical. Distance perception can involve absolute, relative, or ordinal judgments. Absolute distances are specified in terms of some standard that need not be in the visual field (e.g., “two meters” or “five eye-heights”). Relative distances are specified in terms of comparisons with other visually determined distances (e.g., “location A is twice as far away as location B”). Relative distances can be thought of as absolute distances that have been subjected to an unknown but fixed scaling transformation. Ordinal distances are a special case of relative distances in which it is possible only to determine the depth ordering between two locations, but not the magnitude of the difference.

Finally, distance from the observer affects the nature and accuracy of distance perception. Cutting and Vishton (1995) divide distances into three zones: personal space, which extends slightly beyond an arm’s reach from the observer; action space, within which we can rapidly locomote, extending from the boundaries of personal space to approximately 30 m from the observer; and vista space, beyond 30 m from the observer.

The study reported on below deals with absolute egocentric distance judgments in action space, which are particularly relevant to many virtual environment applications. A computational analysis shows that only a few visual cues provide information about such distances (Table 1). Accommodation and binocular disparity are not effective beyond a few meters. Absolute motion parallax has the potential to provide information about absolute egocentric distance if the velocity of the observer is utilized for scaling, but this appears to be a weak distance cue for people (Beall, Loomis, Philbeck, & Fikes, 1995). Within action space, the related cues of linear perspective, height in the field, and horizon ratio are relative-depth cues that have the potential for providing absolute depth to objects resting on a ground plane when combined with information about the observer’s eye height above the ground plane (Wraga, 1999). These cues can be exploited in visually immersive interfaces if the rendering geometry is correct and both observer and object are in contact with a ground plane having adequate perspective cues. Familiar size—which involves exploiting the relationship between the assumed physical size of an object, the distance of the object from the observer, and the retinal size of the image of the object—can also serve as an absolute-depth cue. It is reasonable to assume that the effectiveness of the familiar-size cue depends at least in part on the realism of the imagery being viewed, though we are not aware of definitive studies addressing this issue. In the experiment described below, we vary the quality of immersively viewed imagery while fixing the information available from perspective cues in order to determine whether image quality affects absolute egocentric depth judgments.
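As a concrete illustration of how eye height scales these ground-plane cues (our restatement, not a formula given in the paper): a target on the ground seen at an angle of declination \(\theta\) below the horizon by an observer with eye height \(h\) lies at the absolute egocentric distance

d = \frac{h}{\tan\theta},

so, for example, with h = 1.6 m and θ = 9°, the indicated distance is roughly 10 m.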

2.2 Experimentally Estimating Judgments of Absolute Egocentric Distance

It is quite difficult to determine the distance to a target that is “seen” by an observer. This is particularly true for absolute-distance judgments, since methods involving just-noticeable differences, reference standards, and equal-interval tasks all involve relative distance. Verbal reporting can be used (e.g., “How many meters away is location A?”), but verbal reports tend to be noisy and are subject to a variety of biases that are difficult to control. An alternative for evaluating the perception of distance is to have subjects perform some task in which the actions taken are dependent on the perceived distance to visible objects (Loomis, Da Silva, Fujita, & Fukusima, 1992; Rieser, 1999). Such approaches have the additional advantage of being particularly appropriate for evaluating the effectiveness of interactive virtual environments.

Walking to or toward previously viewed targets has been used extensively to evaluate judgments of absolute egocentric distance. In one form of this task, subjects first look at a target and then walk to the target while blindfolded. They are told to stop at the target location, and the distance between their starting and stopping points is presumed to be an indication of the originally perceived distance (Thomson, 1983; Rieser, Ashmead, Talor, & Youngquist, 1990). A second form of this task involves looking at a target, walking while blindfolded in an oblique direction from the original line of sight to the target, and then pointing toward or walking toward the (now unseen) target (Fukusima, Loomis, & Da Silva, 1997). The presumed perceived distance is determined based on the original starting point and the intersection of the original line of sight with the final indicated direction (Figure 1). Triangulated walking or pointing can be used to evaluate perception of larger distances than can easily be tested using direct walking, and has a theoretical advantage over direct walking in that it is less likely to involve some specialized visual-action coupling not related to more generally useful distance perception.

Table 1. Common Visual Cues for Absolute (a), Relative (r), and Ordinal (o) Depth

Cue                                                    a   r   o   Requirements for absolute depth
Accommodation                                          x           Very limited range
Binocular convergence                                  x   x   x   Limited range
Binocular disparity                                    –   x   x   Limited range
Linear perspective, height in picture, horizon ratio   x   x   x   Requires viewpoint height
Familiar size                                          x   x   x
Relative size                                          –   x   x   Subject to errors
Aerial perspective                                     –   x   x   Adaptation to local conditions
Absolute motion parallax                               x   x       Requires viewpoint velocity
Relative motion parallax                               –   –   x
Texture gradients                                      –   x   –
Shading                                                –   x   –
Occlusion                                              –   –   x


High accuracy in distance estimation has been observed in visually directed action experiments across many studies.

2.3 Prior Studies of Distance Judgments in Visually Immersive Environments

In the last few years, a number of research groups have addressed the issue of space perception in visually immersive environments. This work has been motivated by a desire to explore new techniques both for probing human vision (Loomis, Blascovich, & Beall, 1999) and for quantifying operator performance in virtual environments (Lampton, McDonald, Singer, & Bliss, 1995). Table 2 summarizes the results of four previous studies of absolute egocentric distance judgments over action space in visually immersive environments, along with some of the results discussed further in section 4. In each of these studies, some form of directed action was used to evaluate distance judgments in both real and computer-generated scenes. All involved indoor environments and targets situated on a level ground plane. The first study used a Fakespace Labs BOOM2C display with 1280 × 1024 resolution. The second and third studies used a Virtual Research FS5 HMD with 800 × 480 resolution. The final two studies used an nVision HiRes HMD with 1280 × 1024 resolution.

One of the striking results from these studies is that distance judgments in virtual environments were consistently underestimated compared with judgments in the real world. Most of the results in the CG column of Table 2 were based on imagery comparable to that shown in Figure 2b. One potential explanation for this compression of virtual space is that the quality of the imagery is too poor to generate an effective familiar-size effect. The experiment described below is aimed at exploring this conjecture.

3 Method

In order to investigate the degree to which image quality affects egocentric-distance judgments in virtual environments, we compared distance judgments in the real world with distance judgments in virtual environments utilizing three very distinct styles of graphical rendering: 360° high-resolution panoramic images, intentionally low-quality texture-mapped computer graphics, and wireframe renderings (Figure 2). We probed subjects’ perceptions of distance using a directed-action task in which subjects indirectly walked without vision toward a previously viewed target. A between-subjects design was used in which a given subject viewed trials at three different distances in one of four different environments. Care was taken to make the tasks similar in both the real and virtual environments, and to make the scale and general layout of all four environments equivalent.

3.1 Participants

Forty-eight college-age students participated in this study, with six male and six female subjects in each condition. Subjects either received course credit for participating or were volunteers.

Figure 1. Triangulated walking task. Subjects start walking in an oblique direction from the direction of a previously viewed target. On directions from the experimenter, they turn and take several steps toward where they perceived the previously viewed target to be.


All subjects were given a stereogram eye test and had normal or corrected-to-normal vision. Interpupillary distances ranged from 5.1 cm to 7.7 cm, with an average of 6.19 cm.

3.2 Materials

In the real-world condition, subjects viewed a foam-core circular disk approximately 37 cm in diameter, placed on the ground at distances of 5 m, 10 m, and 15 m. The experiment was performed in the lobby of an engineering classroom building. Subject positions relevant to computing apparent distance (Figure 1) were determined by measuring foot positions on the floor.

In the three virtual-world conditions, imagery was presented using an nVision Datavisor HiRes HMD with interlaced 1280 × 1024 resolution, full field-sequential color, and a 42° horizontal field of view. The angular resolution of the HMD was on the order of 2 arc minutes per pixel. The nVision has user-adjustable focus. The display was configured with 100% stereo overlap between the two eyes. Head tracking was done using an InterSense IS600 Mark 2 tracker. This tracker uses a mix of inertial, gravitational, and acoustic technologies to provide state-of-the-art accuracy and latency. Only tracker rotation was used to update the viewpoint. While translational tracker positions were recorded, the results reported in section 4 were based on measured foot position on the floor, in order to be consistent with the real-world condition. All computer-generated environments were rendered on an SGI Onyx2 R12000 with two IR2 rendering pipelines. One rendering pipeline was used for each eye to provide stereopsis.
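A quick arithmetic check is consistent with the quoted angular resolution, assuming the 42° horizontal field of view is spread uniformly across the 1280 horizontal pixels (our assumption about how the figure was derived):

```python
# Sketch: angular resolution implied by a 42 degree horizontal FOV spread
# across 1280 horizontal pixels (our assumption, not stated in the paper).
fov_deg = 42.0
horizontal_pixels = 1280
arcmin_per_pixel = fov_deg * 60.0 / horizontal_pixels
print(f"{arcmin_per_pixel:.2f} arc minutes per pixel")  # about 1.97
```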

Multiple sets of panorama images were produced for different target distances and eye heights, based on photographs acquired by swinging a camera around a fixed axis located in the same position as the viewpoint for the real-world condition. Targets were placed in the same locations as for the real-world condition. To provide stereo viewing, two sets of images were taken for each panorama, with the camera offset laterally 3.25 cm from the axis of rotation. The two sets of photographs were digitized onto a PhotoCD and then mosaicked into two cylindrical images using the Panorama Factory software package. Each cylindrical image was texture-mapped onto a set of polygons forming a cylindrical configuration, providing the ability to generate views over a 360° by 100° portion of the optical sphere. Rendering update rates were no less than 40 frames per second in each eye. The result was a compelling sense of being able to look around in the virtual environment, though no motion parallax was available and the stereo geometry was slightly incorrect. To control for subjects’ eye heights, multiple panorama image pairs were produced for eye heights spaced at 5 cm intervals, and the set nearest to a given subject’s eye height was used for that subject’s trials. Practical concerns relating to the manner in which the original images were captured precluded a similar control for interpupillary distance.
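The mapping of a cylindrical image onto polygons can be sketched roughly as follows. This is only an illustration of the general approach under assumed parameters (the paper does not report the cylinder radius or tessellation); it generates a ring of quads whose texture coordinates wrap the 360° image horizontally and span the 100° vertical extent.

```python
import math

def panorama_cylinder(radius=10.0, v_fov_deg=100.0, n_slices=72):
    """Vertices and texture coordinates for a cylindrical panorama screen.
    Illustrative sketch only; the actual radius and tessellation used in the
    study are not reported."""
    half_height = radius * math.tan(math.radians(v_fov_deg / 2.0))
    vertices, texcoords = [], []
    for i in range(n_slices + 1):
        angle = 2.0 * math.pi * i / n_slices
        x, z = radius * math.cos(angle), radius * math.sin(angle)
        u = i / n_slices                       # wraps the full 360 degree image
        vertices += [(x, -half_height, z), (x, +half_height, z)]
        texcoords += [(u, 0.0), (u, 1.0)]
    return vertices, texcoords                 # render as a strip, once per eye
```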

Table 2. Distance Judgments Based on Viewing Imagery Generated by Computer Graphics (CG) and Using Visually Immersive Displays

Study                                 Distance (m)   Real (%)   CG (%)   Task
Witmer & Sadowski (1998)              4.6–32         92         85       Treadmill walking
Knapp (1999)                          5–15           100        42       Triangulated walking
Durgin, Fox, Lewis, & Walley (2002)   2–8                       65       Direct walking
Willemsen & Gooch (2002)              2–5            100        81       Direct walking
Conditions 1 and 2, this study        5–15           95         44       Triangulated walking

Note: Distances are compressed relative to comparable judgments based on viewing real-world environments. The percentages indicate the overall ratio of perceived distance to actual distance.
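For example, reading the percentages as gains: a CG value of 44% means that, averaged over trials, a target actually 10 m away was indicated at roughly 4.4 m, while the corresponding real-world value of 95% would put the same judgment at about 9.5 m.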


The second virtual-environment condition involved a computer graphics rendering of the same classroom building lobby. The scale of the model was the same as the actual building lobby, but the geometric detail was intentionally kept quite simple. Stereotypical tiled texture maps were used. Simple point-source lighting was used, with no shadows or other global illumination effects. Targets were rendered as red disks, with the size and position corresponding to what was used for the real-world condition. Rendering update rates were no less than 30 frames per second in each eye.

The wireframe virtual-environment condition was constructed by rendering feature edges of the model used in the second virtual-environment condition. Our software used an OpenGL silhouette drawing algorithm (Raskar & Cohen, 1999) to generate the feature edges. The frame rates for this environment were no less than 40 frames per second.

Figure 2. Sample imagery for conditions 2, 3, and 4. (a) Section of panorama image showing target. (b) Example of low-quality computer graphics image showing target. The viewpoint is the same as for Figure 2a. (c) Example of wireframe computer graphics image showing target. The viewpoint is the same as for Figure 2a.


The wireframe rendering produced scenes that resemble black-on-white sketches of the classroom building lobby. The target was rendered with feature edges as well, with size and position the same as for the previous conditions.
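For readers unfamiliar with feature-edge rendering, the sketch below shows one common way to find silhouette edges in object space: an edge is a silhouette if the two triangles sharing it face opposite directions with respect to the viewpoint. This is a simplified illustration only; the study's implementation used the image-precision OpenGL method of Raskar and Cohen (1999), not this object-space test, and a full feature-edge renderer would also draw boundary and crease edges.

```python
import numpy as np

def silhouette_edges(vertices, triangles, eye):
    """Return edges shared by one front-facing and one back-facing triangle.
    vertices: (V, 3) floats; triangles: (T, 3) vertex indices; eye: (3,) viewpoint.
    Object-space sketch, not the image-precision method used in the paper."""
    vertices = np.asarray(vertices, dtype=float)
    triangles = np.asarray(triangles, dtype=int)
    # Per-triangle facing sign relative to the eye position.
    a, b, c = (vertices[triangles[:, k]] for k in range(3))
    normals = np.cross(b - a, c - a)
    facing = np.einsum('ij,ij->i', normals, eye - a) > 0.0
    # Collect the facing flags of the (up to two) triangles adjacent to each edge.
    edge_faces = {}
    for t, tri in enumerate(triangles):
        for i in range(3):
            edge = tuple(sorted((tri[i], tri[(i + 1) % 3])))
            edge_faces.setdefault(edge, []).append(facing[t])
    return [e for e, flags in edge_faces.items()
            if len(flags) == 2 and flags[0] != flags[1]]
```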

For both the texture-mapped and wireframe computer graphics conditions, eye heights were rendered based on the subjects’ actual eye heights. Interpupillary distances for stereo rendering were fixed at 6.5 cm, consistent with the panorama images.

3.3 Procedure

Subjects were first provided with written instructions that described the triangulated walking task, and then given a demonstration of the task in a space both smaller than and different from the actual experiment spatial layout. For all conditions, both real and virtual, subjects were instructed to obtain a good image of the target and their local surroundings while first facing the target. Subjects were told that a “good image” is obtained if, after closing their eyes, they would still be able to “see” the environment and, most importantly, the target. Subjects were allowed to rotate their head about their neck, but were instructed not to move their head from side to side or back and forth. This was done to minimize motion-parallax cues in the real-world condition, so as to make it as comparable as possible to the virtual-world conditions.

Once a good image was achieved, subjects were instructed to physically turn their bodies approximately 70° to the right, to face a junction of two walls in the environment. After subjects turned, they were instructed to turn their head back toward the target to obtain a final view and reaffirm their mental image of the environment. Then subjects either blindfolded themselves (real-world condition) or closed their eyes while the HMD screen was cleared to black (virtual-world conditions). Subjects were then directed to walk purposefully and decisively in the direction their body was facing. After walking approximately 2.5 m, an experimenter would give the verbal command “turn,” signaling the subject to turn toward the target and stop walking when they felt they were facing the target. Subjects were instructed to perform this turn as if they were turning a corner in a real hallway, to make the movement as natural as possible. At this point the subject’s position was marked, and they were directed to “Take two steps in the direction of the target.” Again the subject’s position was marked and recorded. The subject was then led, without vision, to the starting location by an experimenter. In all conditions, the apparent location of the target was assumed to lie at the intersection of the line of sight to the (visible) target from the initial vantage point and a line corresponding to the subject’s trajectory on the final walk toward the presumed target location (Figure 1).
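The intersection computation described here amounts to a small 2-D line-intersection problem on the floor plane. The sketch below is our illustration of that computation (the point names are ours, not from the paper); it recovers the indicated target location, and hence the judged distance, from the measured floor positions.

```python
import numpy as np

def judged_distance(start, target, turn_point, step_point):
    """Triangulated-walking estimate of perceived target distance (see Figure 1).
    start:      subject's initial viewing position (2-D floor coordinates)
    target:     actual target position, defining the original line of sight
    turn_point: position where the subject stopped and turned toward the target
    step_point: position after the final two steps toward the target
    Illustrative sketch; variable names are ours."""
    start, target = np.asarray(start, float), np.asarray(target, float)
    turn_point, step_point = np.asarray(turn_point, float), np.asarray(step_point, float)
    sight_dir = target - start             # direction of the original line of sight
    walk_dir = step_point - turn_point     # direction of the final walk
    # Solve start + t * sight_dir == turn_point + s * walk_dir for (t, s).
    t, _ = np.linalg.solve(np.column_stack([sight_dir, -walk_dir]), turn_point - start)
    indicated = start + t * sight_dir      # apparent target location
    return np.linalg.norm(indicated - start)

# Example: a subject who turns and heads back too soon indicates a compressed distance.
print(judged_distance(start=(0, 0), target=(10, 0),
                      turn_point=(0.9, -2.3), step_point=(1.9, -1.7)))  # ~4.7 m
```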

The user’s own body is seldom rendered in immersive virtual environments. This is a potential problem when investigating absolute egocentric distance judgments, since eye height is an important scaling factor that could conceivably be affected by looking down at the user’s feet and the floor on which she or he is standing. Rendering avatar feet may not be sufficient, since it is difficult to achieve a high degree of realism. We controlled for this potential problem by having users wear a circular collar in both the real-world and virtual-world conditions (Figure 3). The collar had the effect of occluding users’ view of the floor out to about 2 m, hiding the area around their feet in all four tested conditions.
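The roughly 2 m occlusion radius follows from simple similar-triangle geometry; the body and collar dimensions below are assumed for illustration and are not reported in the paper.

```python
# Assumed, for illustration only: eye height 1.6 m, collar worn 0.2 m below
# the eyes, collar extending 0.25 m outward from the viewing axis.
eye_height, drop_to_collar, collar_radius = 1.6, 0.2, 0.25
# A sight ray grazing the collar edge drops `drop_to_collar` over `collar_radius`,
# so over the full eye height it reaches the floor at:
occluded_floor_radius = collar_radius * eye_height / drop_to_collar
print(f"floor hidden out to about {occluded_floor_radius:.1f} m")  # ~2.0 m
```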

Prior to the experiment trials, subjects practiced blind walking for five minutes. During this practice, subjects walked blindfolded in a hallway and responded to verbal commands to start and stop walking. The training is helpful in building trust between the experimenter and the subject (Rieser, 1999) but, more importantly, accustoms the subject to walking blind.

Figure 3. Viewing collar to hide viewer’s body and floor close to standing position.


During both the training session and the actual experiment, subjects wore headphones fed by an external microphone to help limit the effects of sound localization in the environment. A remote microphone worn by the experimenter allowed subjects to hear instructions. After the training session, subjects were led, still blindfolded, either to our laboratory or to the real lobby. This last step was performed to help ensure that the subject’s movement during the experiment would not be inhibited by a priori knowledge of the location of the walls in our lab. The sound-masking headphones remained on during this time. For the virtual-world conditions, when subjects arrived in the laboratory the HMD was placed on their head while their eyes remained closed. Once it was on, subjects were allowed to open their eyes and adjust the fit and focus of the HMD, after which the orientation of the virtual world was aligned with the natural resting position of the HMD on the subject.

4 Results

Figures 4–7 show the average judgments for each of the four conditions: real world, high-quality panorama images, low-quality texture-mapped computer graphics, and wireframe. Error bars indicate one standard error above and below the mean. The intersection computation used to compute apparent distance (Figure 1) results in asymmetric variability around the mean, since a turn of a given angle too far to the right produced an overshoot in distance larger than the undershoot in distance produced by an equal turn too far to the left.

Figure 4. Distance judgments: real world.
Figure 5. Distance judgments: panorama images.

Figure 6. Distance judgments: low-quality computer graphics.


An arctangent transform was applied to the data to reduce this effect. Averages, error estimates, and measures of statistical significance were calculated in the transform space. The inverse transform was then applied to the calculated averages and errors in order to allow presentation of the final results in terms of judged distance.
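One plausible form of such a transform is sketched below: distances are converted to angles before averaging and the mean is converted back, which tames the long right tail produced by the intersection computation. The normalizing length (here the approximately 2.5 m oblique leg) is our assumption; the paper does not spell out the exact transform.

```python
import numpy as np

def arctan_mean(judged_distances, ref_length=2.5):
    """Average distance judgments in arctangent space, then back-transform.
    ref_length is an assumed normalizing length; the exact form of the
    transform is not reported in the paper."""
    angles = np.arctan(np.asarray(judged_distances, float) / ref_length)
    return ref_length * np.tan(angles.mean())

# A long right tail pulls the ordinary mean up more than the arctangent mean.
trials = [4.0, 4.5, 5.0, 9.0]
print(np.mean(trials), arctan_mean(trials))
```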

Figure 8 allows easy comparisons between results for all four conditions. The experiment confirmed previous studies showing that, for standing observers viewing ground-level targets in the action-space range, distance judgments in the real world were near veridical, while distance judgments based on computer graphics were significantly compressed. The surprising result was that the amount of compression was nearly the same for all three graphical displays. That is, distance judgments were almost unaffected by the quality of the imagery presented to subjects.

A 4 (environment) × 3 (distance) × 2 (sex) repeated-measures ANOVA, with distance as a within-subject variable and environment and sex as between-subject variables, was performed on the transformed average distance judgments and indicated a significant effect of environment, F(3, 40) = 10.77, p < .001. Collapsed across distance, Scheffé post hoc comparisons showed that distance judgments in the real world were greater than those given in each of the other environments (p < .01) and that performance in the other three environments did not differ (p > .48 for all comparisons). Although the means at 10 m or 15 m suggest differences between the virtual conditions, post hoc univariate ANOVAs (with three environmental conditions) at each distance indicated that these differences were negligible (p > .4 for the effect of environment). The ANOVA also indicated an effect of distance, F(2, 80) = 183.84, p < .001. Judged distance increased as a function of physical distance for all environments. In all, the analyses demonstrated that perceived distance was significantly more accurate in the real world compared to the virtual environments, and that distance judgments in the virtual environments did not vary much from each other.
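For readers who want to reproduce this style of analysis, the sketch below runs a simplified mixed-design ANOVA (one between-subject factor, one within-subject factor) on synthetic placeholder data using the pingouin package. The actual model in the paper also included sex as a between-subject factor and was run on arctangent-transformed judgments; all column names and data here are hypothetical.

```python
import numpy as np
import pandas as pd
import pingouin as pg

# Synthetic placeholder data (NOT the study's data): 48 subjects, 4 environments
# between subjects, 3 target distances within subjects.
rng = np.random.default_rng(0)
rows = []
for subject in range(48):
    environment = ["real", "panorama", "low_quality_cg", "wireframe"][subject % 4]
    gain = 0.95 if environment == "real" else 0.45
    for distance in (5, 10, 15):
        rows.append({"subject": subject, "environment": environment,
                     "distance": distance,
                     "judged": gain * distance + rng.normal(0, 0.5)})
df = pd.DataFrame(rows)

aov = pg.mixed_anova(data=df, dv="judged", within="distance",
                     subject="subject", between="environment")
print(aov.round(3))
```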

5 Discussion

The results presented above are a strong indicator that compressed absolute egocentric distance judgments in visually immersive environments are not caused by a lack of realistic graphics rendering.

Figure 7. Distance judgments: wireframe graphics.
Figure 8. Distance judgments: comparison of all conditions.


The phenomenal experience of realism in the panoramic environment is best expressed by the comments of several subjects: when looking into a glass window in the rendered display, they asked, “Why can’t I see my reflection in the glass?” Despite this subjective experience, judgments based on wireframe renderings were as good as judgments based on actual images presented with the same display system. In all virtual environments there was a large compression of egocentric distance. As a result, absolute egocentric distance judgments in virtual environments are not likely to be aided by photorealistic improvements in computer graphics, such as better texturing and illumination. From a theoretical standpoint, this suggests that familiar size may be a relatively minor contributor to the sort of distance judgments that were investigated, though it is important to note that all four conditions involved hallway-like scaling and geometry. The similarity between judged distances to targets on the floor in the three types of virtual displays is consistent with the hypothesis that the declination of visual angle to targets dominates egocentric distance perception (Ooi, Wu, & He, 2001). However, this does not explain the large differences observed between distance judgments in the real and virtual conditions.

The present experiment used a methodology that involved a stationary viewer and an action-based judgment task to address specific questions about judgments of distance in visually immersive environments. Our intent was to determine whether observers would judge egocentric distance in the simulated environment in a similar manner as in the real world, without the experience of active exploration. Thus, we restricted the observer’s movement while viewing the environments. Previous visual-motor adaptation studies (Rieser, Pick, Ashmead, & Garing, 1995; Pick, Rieser, Wagner, & Garing, 1999) have demonstrated that active observers will quickly adapt to a new mapping between visual input and their own movements, leading to the result of modified motor output that corresponds to the visual world (recalibration). We might predict that allowing active exploration of the virtual environments would lead to a similar adaptation and recalibration effect, so that observers would learn to walk and turn an accurate distance to virtual targets. While this prediction addresses an important question, it is a different question than the one presently asked in this paper. Our goal was to test whether egocentric distance judgments would replicate the accurate performance demonstrated in the real world, not whether these judgments could become accurate after interacting within a compressed perception of the world. Future studies should consider both the extent of veridical perception in visually immersive environments and the role of actions in making immersive environments useful despite a potential lack of veridical perception.

What might explain the compression of absolute egocentric distance judgments, if not image quality? We suggest several possibilities, but no solid evidence supporting any of the potential explanations has yet been published. While the realism of the panorama images used in this study far exceeded any of the computer graphics employed in distance-judgment experiments by other investigators, resolution and apparent sharpness were still limited compared to natural viewing of the real world. This may have influenced a familiar-size effect, or may have degraded the sense of presence while wearing the HMD. Dixon, Wraga, Proffitt, and Williams (2000) found that visual immersion was needed for eye height to appropriately scale linear perspective cues. Perhaps a full sense of presence, not only visual immersion, is needed for distance judgments to be comparable to what is seen in the real world. Limited field of view is often suggested as a cause of distorted spatial vision in HMDs, but Knapp and Loomis (in press) found that limiting FOV did not affect real-world egocentric distance judgments, at least if the observer was free to move his or her head to visually explore the environment. Motion parallax was not present in our virtual display conditions, but motion parallax appears to be a rather weak absolute-distance cue (Beall et al., 1995). In addition, subjects performed veridically in our real-world condition with, at most, very limited translational head motion. Focus and stereo convergence are not well controlled in HMDs (Rolland et al., 1995; Wann, Rushton, & Mon-Williams, 1995), and incorrect accommodation cues are known to affect distance judgments (Andersen, Saidpour, & Braunstein, 1998; Bingham, Bradley, Bailey, & Vinner, 2001).


It seems unlikely, however, that accommodation and convergence would have an effect this large at the distances we were investigating. Finally, there may be some sort of ergonomic effect associated with wearing an HMD (Lackner & DiZio, 1989).

Future research that manipulates factors other than the image quality, such as FOV, stereo, and physical effects of the HMD, is needed to begin to answer these questions. A sense of presence is more difficult to define and manipulate, but is likely to be an important component in accurate distance perception in virtual environments.

Acknowledgments

This material is based upon work supported by the National Science Foundation under grants 9623614, 0080999, and 0121084. Thanks to Alias Wavefront for their donation of Maya Complete, which was used in this project.

References

Andersen, G. J., Saidpour, A., & Braunstein, M. L. (1998). Effects of collimation on perceived layout in 3-D scenes. Perception, 27, 1305–1315.

Beall, A. C., Loomis, J. M., Philbeck, J. M., & Fikes, T. J. (1995). Absolute motion parallax weakly determines visual scale in real and virtual environments. In Proceedings of the SPIE-The International Society for Optical Engineering, 2411, 288–297.

Bingham, G. P., Bradley, A., Bailey, M., & Vinner, R. (2001). Accommodation, occlusion, and disparity matching are used to guide reaching: A comparison of actual versus virtual environments. Journal of Experimental Psychology: Human Perception and Performance, 27, 1314–1334.

Cutting, J. E., & Vishton, P. M. (1995). Perceiving layout and knowing distance: The integration, relative potency, and contextual use of different information about depth. In W. Epstein & S. Rogers (Eds.), Perception of Space and Motion (pp. 69–117). New York: Academic.

Dixon, M. W., Wraga, M., Proffitt, D. R., & Williams, G. C. (2000). Eye height scaling of absolute size in immersive and nonimmersive displays. Journal of Experimental Psychology: Human Perception and Performance, 26(2), 582–593.

Durgin, F. H., Fox, L. F., Lewis, J., & Walley, K. A. (2002). Perceptuomotor adaptation: More than meets the eye. Paper presented at the forty-third annual meeting of the Psychonomic Society, Kansas City, MO.

Ellis, S. R., & Menges, B. M. (1997). Judgments of the distance to nearby virtual objects: Interaction of viewing conditions and accommodative demand. Presence: Teleoperators and Virtual Environments, 6, 452–462.

Foley, J. M., Ribeiro-Filho, N. P., & Da Silva, J. A. (2004). Visual perception of extent and the geometry of visual space. Vision Research, 44, 147–156.

Fukusima, S. S., Loomis, J. M., & Da Silva, J. A. (1997). Visual perception of egocentric distance as assessed by triangulation. Journal of Experimental Psychology: Human Perception and Performance, 23(1), 86–100.

Knapp, J. M. (1999). The visual perception of egocentric distance in virtual environments. Unpublished doctoral dissertation, University of California at Santa Barbara.

Knapp, J. M., & Loomis, J. M. (in press). Limited field of view of head-mounted displays is not the cause of distance underestimation in virtual environments. Presence: Teleoperators and Virtual Environments.

Lackner, J. R., & DiZio, P. (1989). Altered sensory-motor control of the head as an etiological factor in space-motion sickness. Perceptual and Motor Skills, 68, 784–786.

Lampton, D. R., McDonald, D. P., Singer, M., & Bliss, J. (1995). Distance estimation in virtual environments. In Proceedings of the Human Factors and Ergonomics Society 39th Annual Meeting, 1268–1272.

Loomis, J. M., Blascovich, J. J., & Beall, A. C. (1999). Immersive virtual environment technology as a basic research tool in psychology. Behavior Research Methods, Instruments, and Computers, 31(4), 557–564.

Loomis, J. M., Da Silva, J. A., Fujita, N., & Fukusima, S. S. (1992). Visual space perception and visually directed action. Journal of Experimental Psychology: Human Perception and Performance, 18, 906–921.

Loomis, J. M., & Knapp, J. M. (2003). Visual perception of egocentric distance in real and virtual environments. In L. Hettinger & M. Haas (Eds.), Virtual and adaptive environments (pp. 21–46). Hillsdale, NJ: Erlbaum.

Loomis, J. M., Philbeck, J. W., & Zahorik, P. (2002). Dissociation between location and shape in visual space. Journal of Experimental Psychology: Human Perception and Performance, 28, 1202–1212.

Ooi, T. L., Wu, B., & He, Z. J. (2001). Distance determination by the angular declination below the horizon. Nature, 414, 197–200.

Pick, H. L., Jr., Rieser, J. J., Wagner, D., & Garing, A. E. (1999). The recalibration of rotational locomotion. Journal of Experimental Psychology: Human Perception and Performance, 25(5), 1179–1188.

Raskar, R., & Cohen, M. (1999). Image precision silhouette edges. In Proceedings of the ACM Symposium on Interactive 3D Graphics, 135–140.

Rieser, J. J. (1999). Dynamic spatial orientation and the coupling of representation and action. In R. G. Golledge (Ed.), Wayfinding behavior: Cognitive mapping and other spatial processes (pp. 168–190). Baltimore, MD: Johns Hopkins University Press.

Rieser, J. J., Ashmead, D. H., Talor, C. R., & Youngquist, G. A. (1990). Visual perception and the guidance of locomotion without vision to previously seen targets. Perception, 19, 675–689.

Rieser, J. J., Pick, H. L., Jr., Ashmead, D., & Garing, A. (1995). Calibration of human locomotion and models of perceptual-motor organization. Journal of Experimental Psychology: Human Perception and Performance, 21, 480–497.

Rolland, J. P., Gibson, W., & Arierly, D. (1995). Towards quantifying depth and size perception as a function of viewing distance. Presence: Teleoperators and Virtual Environments, 4, 24–49.

Thomson, J. A. (1983). Is continuous visual monitoring necessary in visually guided locomotion? Journal of Experimental Psychology: Human Perception and Performance, 9(3), 427–443.

Wann, J. P., Rushton, S., & Mon-Williams, M. (1995). Natural problems for stereoscopic depth perception in virtual environments. Vision Research, 35(19), 2731–2736.

Willemsen, P., & Gooch, A. (2002). Perceived egocentric distances in real, image-based, and traditional virtual environments. In Proceedings of IEEE Virtual Reality Conference, 89–90.

Witmer, B., & Sadowski, W., Jr. (1998). Nonvisually guided locomotion to a previously viewed target in real and virtual environments. Human Factors, 40, 478–488.

Wraga, M. (1999). Using eye height in different postures to scale the heights of objects. Journal of Experimental Psychology: Human Perception and Performance, 25(2), 518–530.

Thompson et al 571

Page 2: Santa Barbara Department of Psychology, Santa Barbara ... et al Presence.… · Santa Barbara Santa Barbara, CA 93106 Andrew C. Beall beall@psych.ucsb.edu Department of Psychology,

but important visual cues (eg natural texture high-lights) If this hypothesis is correct it means thatphotorealistic rendering of the surfaces and objects in asimulated environment is likely to produce more accu-rate perception of distancerdquo

This paper explores the conjecture that image qualityaffects distance judgments in virtual environments Westart with a discussion of what is meant by a ldquodistancejudgmentrdquo and point out that different types of distancejudgments likely depend on distinctly different visualcues We next discuss how to experimentally determineperceptual judgments of one type of perceived distanceThis is followed by the presentation of experimentalresults comparing distance judgments in the real worldwith judgments based on graphical renderings of vary-ing quality showing that quality of graphics has littleeffect on the accuracy of distance judgments We endwith a discussion contributing to the speculation onwhy distances are incorrectly perceived in visually im-mersive displays

2 Background

21 Visual Cues for Distance

Visual perception of distance can be defined inmultiple ways It is often categorized by the frame ofreference used Egocentric distances are measured fromthe observer to individual locations in the environmentExocentric distances are measured between two points inthe environment The distinction is important for tworeasons First of all the errors associated with the per-ception of egocentric and exocentric distances are differ-ent Although people perceive egocentric distances ac-curately when distance cues are abundant they makelarge systematic errors in perceiving an exocentric inter-val under the same viewing conditions Recent researchby Foley Ribeiro-Filho and Da Silva (2004) and byLoomis Philbeck and Zahorik (2002) demystifies theseparadoxical results Secondly some depth cues such asshadows can provide information about exocentric dis-tances but not egocentric distances

Another distinction between types of distance percep-tion is also critical Distance perception can involve abso-

lute relative or ordinal judgments Absolute distancesare specified in terms of some standard that need not bein the visual field (eg ldquotwo metersrdquo or ldquofive eye-heightsrdquo) Relative distances are specified in terms ofcomparisons with other visually determined distances(eg ldquolocation A is twice as far away as location Brdquo)Relative distances can be thought of as absolute dis-tances that have been subjected to an unknown butfixed scaling transformation Ordinal distances are a spe-cial case of relative distances in which it is possible onlyto determine the depth ordering between two locationsbut not the magnitude of the difference

Finally distance from the observer affects the natureand accuracy of distance perception Cutting and Vish-ton (1995) divide distances into three zones personalspace which extends slightly beyond an armrsquos reachfrom the observer action space within which we canrapidly locomote extending from the boundaries ofpersonal space to approximately 30 m from the ob-server and vista space beyond 30 m from the observer

The study reported on below deals with absolute ego-centric distance judgments in action space which areparticularly relevant to many virtual environment appli-cations A computational analysis shows that only a fewvisual cues provide information about such distances(Table 1) Accommodation and binocular disparity arenot effective beyond a few meters Absolute-motionparallax has the potential to provide information aboutabsolute egocentric distance if the velocity of the ob-server is utilized for scaling but this appears to be aweak distance cue for people (Beall Loomis Philbeckamp Fikes 1995) Within action space the related cues oflinear perspective height in the field and horizon ratioare relative-depth cues that have the potential for pro-viding absolute depth to objects resting on a groundplane when combined with information about the ob-serverrsquos eye height above the ground plane (Wraga1999) These cues can be exploited in visually immer-sive interfaces if the rendering geometry is correct andboth observer and object are in contact with a groundplane having adequate perspective cues Familiar sizemdashwhich involves exploiting the relationship between theassumed physical size of an object the distance of theobject from the observer and the retinal size of the im-

Thompson et al 561

age of the objectmdashcan also serve as an absolute-depthcue It is reasonable to assume that the effectiveness ofthe familiar-size cue depends at least in part of the real-ism of the imagery being viewed though we are notaware of definitive studies addressing this issue In the ex-periment described below we vary the quality of immer-sively viewed imagery while fixing the information availablefrom perspective cues in order to determine whether im-age quality affects absolute egocentric depth judgments

22 Experimentally EstimatingJudgments of Absolute EgocentricDistance

It is quite difficult to determine the distance to atarget that is ldquoseenrdquo by an observer This is particularlytrue for absolute-distance judgments since methodsinvolving just-noticeable-differences reference stan-dards and equal-interval tasks all involve relative dis-tance Verbal reporting can be used (eg ldquoHow manymeters away is location Ardquo) but verbal reports tend tobe noisy and are subject to a variety of biases that aredifficult to control An alternative for evaluating the per-ception of distance is to have subjects perform sometask in which the actions taken are dependent on theperceived distance to visible objects (Loomis Da Silva

Fujita amp Fukusima 1992 Rieser 1999) Such ap-proaches have the additional advantage of being particu-larly appropriate for evaluating the effectiveness of inter-active virtual environments

Walking to or toward previously viewed targets hasbeen used extensively to evaluate judgments of absoluteegocentric distance In one form of this task subjectsfirst look at a target and then walk to the target whileblindfolded They are told to stop at the target locationand the distance between their starting and stoppingpoints is presumed to be an indication of the originallyperceived distance (Thomson 1983 Rieser AshmeadTalor amp Youngquist 1990) A second form of this taskinvolves looking at a target walking while blindfoldedin an oblique direction from the original line of sight tothe target and then pointing toward or walking towardthe (now unseen) target (Fukusima Loomis amp DaSilva 1997) The presumed perceived distance is deter-mined based on the original starting point and the in-tersection of the original line of sight with the final indi-cated direction (Figure 1) Triangulated walking orpointing can be used to evaluate perception of largerdistances than can easily be tested using direct walkingand has a theoretical advantage over direct walking inthat it is less likely to involve some specialized visual-action coupling not related to more generally useful

Table 1 Common Visual Cues for Absolute (a) Relative (r) and Ordinal (o) Depth

Cue a r o Requirements for absolute depth

Accommodation x Very limited rangeBinocular convergence x x x Limited rangeBinocular disparity ndash x x Limited rangeLinear perspective height in picture horizon ratio x x x Requires viewpoint heightFamiliar size x x xRelative size ndash x x Subject to errorsAerial perspective ndash x x Adaptation to local conditionsAbsolute motion parallax x x Requires viewpoint velocityRelative motion parallax ndash ndash xTexture gradients ndash x ndashShading ndash x ndashOcclusion ndash ndash x

562 PRESENCE VOLUME 13 NUMBER 5

distance perception High accuracy in distance estima-tion has been observed in visually directed action experi-ments across many studies

23 Prior Studies of DistanceJudgments in Visually ImmersiveEnvironments

In the last few years a number of research groupshave addressed the issue of space perception in visuallyimmersive environments This work has been motivatedby a desire to explore new techniques both for probinghuman vision (Loomis Blascovich amp Beall 1999) andfor quantifying operator performance in virtual environ-ments (Lampton McDonald Singer amp Bliss 1995)Table 2 summarizes the results of four previous studiesof absolute egocentric distance judgments over actionspace in visually immersive environments along withsome of the results discussed further in section 4 Ineach of these studies some form of directed action wasused to evaluate distance judgments in both real andcomputer generated scenes All involved indoor envi-

ronments and targets situated on a level ground planeThe first study used a Fakespace Labs BOOM2C displaywith 1280 1024 resolution The second and thirdstudies used a Virtual Research FS5 HMD with 800

480 resolution The final two studies used an nVisionHiRes HMD with 1280 1024 resolution

One of the striking results from these studies is thatdistance judgments in virtual environments were consis-tently underestimated compared with judgments in thereal world Most of the results in the CG column of Ta-ble 2 were based on imagery comparable to that shownin Figure 2b One potential explanation for this com-pression of virtual space is that the quality of the imag-ery is too poor to generate an effective familiar-size ef-fect The experiment described below is aimed atexploring this conjecture

3 Method

In order to investigate the degree to which imagequality affects egocentric-distance judgments in virtualenvironments we compared distance judgments in thereal world with distance judgments in virtual environ-ments utilizing three very distinct styles of graphicalrendering 360deg high-resolution panoramic images in-tentionally low-quality texture-mapped computer graphicsand wireframe renderings (Figure 2) We probed sub-jectsrsquo perceptions of distance using a directed-actiontask in which subjects indirectly walked without visiontoward a previously viewed target A between-subjectsdesign was used in which a given subject viewed trialsat three different distances in one of four different envi-ronments Care was taken to make the tasks similar inboth the real and virtual environments and to make thescale and general layout of all four environments equiva-lent

31 Participants

Forty-eight college-age students participated inthis study with six male and six female subjects in eachcondition Subjects either received course credit for par-ticipating or were volunteers All subjects were given a

Figure 1 Triangulated walking task Subjects start walking in an

oblique direction from the direction of a previously viewed target On

directions from the experimenter they turn and take several steps

toward where they perceived the previously viewed target to be

Thompson et al 563

stereogram eye test and had normal or corrected-to-normal vision Interpupillary distances ranged from 51cm to 77 cm with an average of 619 cm

32 Materials

In the real-world condition subjects viewed afoam-core circular disk approximately 37 cm in diameterand placed on the ground at distances of 5 m 10 mand 15 m The experiment was performed in the lobbyof an engineering classroom building Subject positionsrelevant to computing apparent distance (Figure 1) weredetermined by measuring foot positions on the floor

In the three virtual-world conditions imagery waspresented using an nVision Datavisor HiRes HMD withinterlaced 1280 1024 resolution full field-sequentialcolor and a 42deg horizontal field of view The angularresolution of the HMD was on the order of 2 arc min-utes per pixel The nVision has user-adjustable focusThe display was configured with 100 stereo overlapbetween the two eyes Head tracking was done using anInterSense IS600 Mark 2 tracker This tracker uses amix of inertial gravitational and acoustic technologiesto provide state-of-the art accuracy and latency Onlytracker rotation was used to update the viewpointWhile translational tracker positions were recorded theresults reported in section 4 were based on measuredfoot position on the floor in order to be consistent withthe real-world condition All computer-generated envi-

ronments were rendered on an SGI Onyx2 R12000with two IR2 rendering pipelines One rendering pipe-line was used for each eye to provide stereopsis

Multiple sets of panorama images were produced fordifferent target distances and eye heights based on pho-tographs acquired by swinging a camera around a fixedaxis located in the same position as the viewpoint forthe real-world condition Targets were placed in thesame locations as for the real-world condition To pro-vide stereo viewing two sets of images were taken foreach panorama with the camera offset laterally 325cm from the axis of rotation The two sets of photo-graphs were digitized onto a PhotoCD and then mosa-icked into two cylindrical images using the PanoramaFactory software package Each cylindrical image wastexture-mapped onto a set of polygons forming a cylin-drical configuration providing the ability to generateviews over a 360deg by 100deg portion of the optical sphereRendering update rates were no less than 40 frames persecond in each eye The result was a compelling sense ofbeing able to look around in the virtual environmentthough no motion parallax was available and the stereogeometry was slightly incorrect To control for subjectsrsquoeye heights multiple-panorama image pairs were pro-duced for eye heights spaced at 5 cm intervals and theset nearest to a given subjectrsquos eye height was used forthat subjectrsquos trials Practical concerns relating to themanner in which the original images were captured pre-cluded a similar control for interpupillary distance

Table 2 Distance Judgments Based on Viewing Imagery Generated by Computer Graphics (CG) and Using VisuallyImmersive Displays

Study Distance (m) Real () CG () Task

Witmer amp Sadowski (1998) 46ndash32 92 85 Treadmill walkingKnapp (1999) 5ndash15 100 42 Triangulated walkingDurgin Fox Lewis amp Walley

(2002) 2ndash8 65 Direct walkingWillemsen amp Gooch (2002) 2ndash5 100 81 Direct walkingConditions 1 and 2 this study 5ndash15 95 44 Triangulated walking

Note Distances are compressed relative to comparable judgments based on viewing real-world environments Thepercentages indicate the overall ratio of perceived distance to actual distance

564 PRESENCE VOLUME 13 NUMBER 5

The second virtual-environment condition involved acomputer graphics rendering of the same classroombuilding lobby The scale of the model was the same asthe actual building lobby but the geometric detail wasintentionally kept quite simple Stereotypical tiled tex-ture maps were used Simple point-source lighting wasused with no shadows or other global illumination ef-fects Targets were rendered as red disks with the sizeand position corresponding to what was used for the

real-world condition Rendering update rates were noless than 30 frames per second in each eye

The wireframe virtual environment condition wasconstructed by rendering feature edges of the modelused in the second virtual-environment condition Oursoftware used an OpenGL silhouette drawing algorithm(Raskar amp Cohen 1999) to generate the feature edgesThe frame rates for this environment were no less than40 frames per second The wireframe rendering pro-

Figure 2 Sample imagery for conditions 2 3 and 4 (a) Section of panorama image showing target (b) Example of low-quality computer

graphics image showing target The viewpoint is the same as for Figure 2a (c) Example of wireframe computer graphics image showing target

The viewpoint is the same as for Figure 2a

Thompson et al 565

duced scenes that resemble black-on-white sketches ofthe classroom building lobby The target was renderedwith feature edges as well with size and position thesame as for the previous conditions

For both the texture-mapped and wireframe com-puter graphics conditions eye heights were renderedbased on the subjectsrsquo actual eye heights Interpupillarydistances for stereo rendering were fixed at 65 cm con-sistent with the panorama images

33 Procedure

Subjects were first provided with written instruc-tions that described the triangulated walking task andthen given a demonstration of the task in a space bothsmaller and different from the actual experiment spatiallayout For all conditions both real and virtual subjectswere instructed to obtain a good image of the targetand their local surroundings while first facing the targetSubjects were told that a ldquogood imagerdquo is obtained ifafter closing their eyes they would still be able to ldquoseerdquothe environment and most importantly the targetSubjects were allowed to rotate their head about theirneck but were instructed not to move their head fromside to side or back and forth This was done to mini-mize motion-parallax cues in the real-world conditionso as to make it as comparable as possible to the virtual-world conditions

Once a good image was achieved subjects were in-structed to physically turn their bodies approximately70deg to the right to face a junction of two walls in theenvironment After subjects turned they were in-structed to turn their head back toward the target toobtain a final view and reaffirm their mental image ofthe environment Then subjects either blindfoldedthemselves (real-world condition) or closed their eyeswhile the HMD screen was cleared to black (virtual-world conditions) Subjects were then directed to walkpurposefully and decisively in the direction their bodywas facing After walking approximately 25 m an ex-perimenter would give the verbal command ldquoturnrdquo sig-naling the subject to turn toward the target and stopwalking when they felt they were facing the target Sub-jects were instructed to perform this turn as if they were

turning a corner in a real hallway to make the move-ment as natural as possible At this point the subjectrsquosposition was marked and they were directed to ldquoTaketwo steps in the direction of the targetrdquo Again the sub-jectrsquos position was marked and recorded The subjectwas then led without vision to the starting location byan experimenter In all conditions the apparent locationof the target was assumed to lie at the intersection ofthe line of sight to the (visible) target from the initialvantage point and a line corresponding to the subjectrsquostrajectory on the final walk toward the presumed targetlocation (Figure 1)

The user's own body is seldom rendered in immersive virtual environments. This is a potential problem when investigating absolute egocentric distance judgments, since eye height is an important scaling factor that could conceivably be affected by looking down at the user's feet and the floor on which she or he is standing. Rendering avatar feet may not be sufficient, since it is difficult to achieve a high degree of realism. We controlled for this potential problem by having users wear a circular collar in both the real-world and virtual-world conditions (Figure 3). The collar had the effect of occluding users' view of the floor out to about 2 m, hiding the area around their feet in all four tested conditions.
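The roughly 2 m figure follows from similar triangles, as sketched below; the eye height, collar radius, and eye-to-collar drop used here are assumed round numbers for illustration, not measurements reported in the paper.

```latex
% A sight line grazing the collar's outer edge falls a vertical distance c
% over a horizontal run r, so it reaches the floor (a total drop of h) at
% horizontal distance d. With assumed values h = 1.6 m (eye height),
% r = 0.35 m (collar radius), and c = 0.3 m (eye-to-collar drop):
% d = 0.35 * 1.6 / 0.3, or about 1.9 m, consistent with the reported ~2 m.
d \;=\; \frac{r\,h}{c}
```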

Prior to the experiment trials, subjects practiced blind walking for five minutes. During this practice, subjects walked blindfolded in a hallway and responded to verbal commands to start and stop walking. The training is helpful in building trust between the experimenter and the subject (Rieser, 1999) but, more importantly, accustoms the subject to walking blind. During both the training session and the actual experiment, subjects wore headphones fed by an external microphone to help limit the effects of sound localization in the environment. A remote microphone worn by the experimenter allowed subjects to hear instructions. After the training session, subjects were led, still blindfolded, either to our laboratory or to the real lobby. This last step was performed to help ensure that the subject's movement during the experiment would not be inhibited by a priori knowledge of the location of the walls in our lab. The sound-masking headphones remained on during this time. For the virtual-world conditions, when subjects arrived in the laboratory the HMD was placed on their head while their eyes remained closed. Once on, subjects were allowed to open their eyes and adjust the fit and focus of the HMD, after which the orientation of the virtual world was aligned with the natural resting position of the HMD on the subject.

Figure 3. Viewing collar to hide viewer's body and floor close to standing position.

4 Results

Figures 4–7 show the average judgments for each of the four conditions: real world, high-quality panorama images, low-quality texture-mapped computer graphics, and wireframe. Error bars indicate one standard error above and below the mean. The intersection computation used to compute apparent distance (Figure 1) results in asymmetric variability around the mean, since a turn of a given angle too far to the right produced an overshoot in distance larger than the undershoot in distance produced by a turn of the same angle too far to the left. An arctangent transform was applied to the data to reduce this effect. Averages, error estimates, and measures of statistical significance were calculated in the transform space. The inverse transform was then applied to the calculated averages and errors in order to allow presentation of the final results in terms of judged distance.

Figure 4. Distance judgments: Real world. Figure 5. Distance judgments: Panorama images. Figure 6. Distance judgments: Low-quality computer graphics.
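A minimal sketch of the transform-and-average step described above is given below, assuming the transform maps each judged distance d to an angle arctan(d/b) for some fixed baseline length b (for instance, the length of the first walked leg). The paper does not spell out the exact parameterization, so treat b and the function names as assumptions rather than a reconstruction of the actual analysis.

```python
import numpy as np

def summarize_judgments(distances_m, baseline_m=2.5):
    """Mean and standard error computed in arctangent space, then mapped back.

    distances_m: iterable of judged distances (m) for one condition and target distance.
    baseline_m:  assumed scale of the transform; 2.5 m (the walked leg) is a guess,
                 not a value reported in the paper.
    """
    theta = np.arctan(np.asarray(distances_m, float) / baseline_m)
    mean_theta = theta.mean()
    se_theta = theta.std(ddof=1) / np.sqrt(len(theta))

    to_distance = lambda t: baseline_m * np.tan(t)   # inverse transform
    return (to_distance(mean_theta),
            to_distance(mean_theta - se_theta),      # lower error bound
            to_distance(mean_theta + se_theta))      # upper error bound

# Example with made-up judgments around a 10 m target:
print(summarize_judgments([8.2, 9.5, 12.0, 7.4, 10.8]))
```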

Figure 8 allows easy comparisons between results for all four conditions. The experiment confirmed previous studies showing that, for standing observers viewing ground-level targets in the action-space range, distance judgments in the real world were near veridical, while distance judgments based on computer graphics were significantly compressed. The surprising result was that the amount of compression was nearly the same for all three graphical displays. That is, distance judgments were almost unaffected by the quality of the imagery presented to subjects.

A 4 (environment) × 3 (distance) × 2 (sex) repeated-measures ANOVA, with distance as a within-subject variable and environment and sex as between-subject variables, was performed on the transformed average distance judgments and indicated a significant effect of environment, F(3, 40) = 10.77, p < .001. Collapsed across distance, Scheffé post hoc comparisons showed that distance judgments in the real world were greater than those given in each of the other environments (p < .01) and that performance in the other three environments did not differ (p > .48 for all comparisons). Although the means at 10 m or 15 m suggest differences between the virtual conditions, post hoc univariate ANOVAs (with three environmental conditions) at each distance indicated that these differences were negligible (p > .4 for the effect of environment). The ANOVA also indicated an effect of distance, F(2, 80) = 183.84, p < .001. Judged distance increased as a function of physical distance for all environments. In all, the analyses demonstrated that perceived distance was significantly more accurate in the real world compared to the virtual environments and that distance judgments in the virtual environments did not vary much from each other.
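As an illustration of the per-distance follow-up tests, the sketch below runs a one-way ANOVA across the three virtual conditions at a single target distance using scipy; the arrays of transformed judgments are placeholders, not the study's data.

```python
from scipy.stats import f_oneway

# Transformed distance judgments at one target distance (e.g., 10 m),
# one value per subject in each virtual condition. Values are made up.
panorama  = [1.10, 1.22, 1.05, 1.18, 1.09, 1.15, 1.20, 1.07, 1.12, 1.16, 1.08, 1.19]
textured  = [1.12, 1.09, 1.21, 1.06, 1.17, 1.11, 1.14, 1.08, 1.19, 1.10, 1.13, 1.07]
wireframe = [1.08, 1.16, 1.11, 1.20, 1.05, 1.13, 1.18, 1.09, 1.15, 1.07, 1.12, 1.21]

f_stat, p_value = f_oneway(panorama, textured, wireframe)
print(f"F(2, 33) = {f_stat:.2f}, p = {p_value:.3f}")
```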

5 Discussion

The results presented above are a strong indicator that compressed absolute egocentric distance judgments in visually immersive environments are not caused by a lack of realistic graphics rendering. The phenomenal experience of realism in the panoramic environment is best expressed by the comments of several subjects: when looking into a glass window in the rendered display, they commented, "Why can't I see my reflection in the glass?" Despite this subjective experience, judgments based on wireframe renderings were as good as judgments based on actual images presented with the same display system. In all virtual environments there was a large compression of egocentric distance. As a result, absolute egocentric distance judgments in virtual environments are not likely to be aided by photorealistic improvements in computer graphics such as better texturing and illumination. From a theoretical standpoint, this suggests that familiar size may be a relatively minor contributor to the sort of distance judgments that were investigated, though it is important to note that all four conditions involved hallway-like scaling and geometry. The similarity between judged distances to targets on the floor in the three types of virtual displays is consistent with the hypothesis that the declination of visual angle to targets dominates egocentric distance perception (Ooi, Wu, & He, 2001). However, this does not explain the large differences observed between distance judgments in the real and virtual conditions.

Figure 7. Distance judgments: Wireframe graphics. Figure 8. Distance judgments: Comparison of all conditions.
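The angular-declination hypothesis mentioned above can be made concrete with a single relation: for a target on the floor viewed from eye height h at an angle θ below the horizon, the indicated egocentric distance is determined by that declination alone. The LaTeX fragment below states the relation; the symbols are ours, introduced only to summarize the account of Ooi, Wu, and He (2001).

```latex
% Target on the ground plane, eye height h, declination angle \theta below the horizon:
d = \frac{h}{\tan\theta}
% A systematic bias in perceived declination (or in the scaling eye height h)
% therefore compresses or expands d independently of rendering quality.
```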

The present experiment used a methodology that involved a stationary viewer and an action-based judgment task to address specific questions about judgments of distance in visually immersive environments. Our intent was to determine whether observers would judge egocentric distance in the simulated environment in a similar manner as in the real world, without the experience of active exploration. Thus we restricted the observer's movement while viewing the environments. Previous visual-motor adaptation studies (Rieser, Pick, Ashmead, & Garing, 1995; Pick, Rieser, Wagner, & Garing, 1999) have demonstrated that active observers will quickly adapt to a new mapping between visual input and their own movements, leading to the result of modified motor output that corresponds to the visual world (recalibration). We might predict that allowing active exploration of the virtual environments would lead to a similar adaptation and recalibration effect, so that observers would learn to walk and turn an accurate distance to virtual targets. While this prediction addresses an important question, it is a different question than the one presently asked in this paper. Our goal was to test whether egocentric distance judgments would replicate the accurate performance demonstrated in the real world, not whether these judgments could become accurate after interacting within a compressed perception of the world. Future studies should consider both the extent of veridical perception in visually immersive environments and the role of actions in making immersive environments useful despite a potential lack of veridical perception.

What might explain the compression of absolute egocentric distance judgments, if not image quality? We suggest several possibilities, but no solid evidence supporting any of the potential explanations has yet been published. While the realism of the panorama images used in this study far exceeded any of the computer graphics employed in distance-judgment experiments by other investigators, resolution and apparent sharpness were still limited compared to natural viewing of the real world. This may have influenced a familiar-size effect or may have degraded the sense of presence while wearing the HMD. Dixon, Wraga, Proffitt, and Williams (2000) found that visual immersion was needed for eye height to appropriately scale linear perspective cues. Perhaps a full sense of presence, not only visual immersion, is needed for distance judgments to be comparable to what is seen in the real world. Limited field of view is often suggested as a cause of distorted spatial vision in HMDs, but Knapp and Loomis (in press) found that limiting FOV did not affect real-world egocentric distance judgments, at least if the observer was free to move his or her head to visually explore the environment. Motion parallax was not present in our virtual display conditions, but motion parallax appears to be a rather weak absolute-distance cue (Beall et al., 1995). In addition, subjects performed veridically in our real-world condition with at most very limited translational head motion. Focus and stereo convergence are not well controlled in HMDs (Rolland et al., 1995; Wann, Rushton, & Mon-Williams, 1995), and incorrect accommodation cues are known to affect distance judgments (Andersen, Saidpour, & Braunstein, 1998; Bingham, Bradley, Bailey, & Vinner, 2001). It seems unlikely, however, that accommodation and convergence would have an effect this large at the distances we were investigating. Finally, there may be some sort of ergonomic effect associated with wearing an HMD (Lackner & DiZio, 1989).

Future research that manipulates factors other than image quality, such as FOV, stereo, and physical effects of the HMD, is needed to begin to answer these questions. A sense of presence is more difficult to define and manipulate, but is likely to be an important component in accurate distance perception in virtual environments.

Acknowledgments

This material is based upon work supported by the National Science Foundation under grants 9623614, 0080999, and 0121084. Thanks to Alias Wavefront for their donation of Maya Complete, which was used in this project.

References

Andersen, G. J., Saidpour, A., & Braunstein, M. L. (1998). Effects of collimation on perceived layout in 3-D scenes. Perception, 27, 1305–1315.

Beall, A. C., Loomis, J. M., Philbeck, J. M., & Fikes, T. J. (1995). Absolute motion parallax weakly determines visual scale in real and virtual environments. In Proceedings of the SPIE-The International Society for Optical Engineering, 2411, 288–297.

Bingham, G. P., Bradley, A., Bailey, M., & Vinner, R. (2001). Accommodation, occlusion, and disparity matching are used to guide reaching: A comparison of actual versus virtual environments. Journal of Experimental Psychology: Human Perception and Performance, 27, 1314–1334.

Cutting, J. E., & Vishton, P. M. (1995). Perceiving layout and knowing distance: The integration, relative potency, and contextual use of different information about depth. In W. Epstein & S. Rogers (Eds.), Perception of space and motion (pp. 69–117). New York: Academic.

Dixon, M. W., Wraga, M., Proffitt, D. R., & Williams, G. C. (2000). Eye height scaling of absolute size in immersive and nonimmersive displays. Journal of Experimental Psychology: Human Perception and Performance, 26(2), 582–593.

Durgin, F. H., Fox, L. F., Lewis, J., & Walley, K. A. (2002). Perceptuomotor adaptation: More than meets the eye. Paper presented at the forty-third annual meeting of the Psychonomic Society, Kansas City, MO.

Ellis, S. R., & Menges, B. M. (1997). Judgments of the distance to nearby virtual objects: Interaction of viewing conditions and accommodative demand. Presence: Teleoperators and Virtual Environments, 6, 452–462.

Foley, J. M., Ribeiro-Filho, N. P., & Da Silva, J. A. (2004). Visual perception of extent and the geometry of visual space. Vision Research, 44, 147–156.

Fukusima, S. S., Loomis, J. M., & Da Silva, J. A. (1997). Visual perception of egocentric distance as assessed by triangulation. Journal of Experimental Psychology: Human Perception and Performance, 23(1), 86–100.

Knapp, J. M. (1999). The visual perception of egocentric distance in virtual environments. Unpublished doctoral dissertation, University of California at Santa Barbara.

Knapp, J. M., & Loomis, J. M. (in press). Limited field of view of head-mounted displays is not the cause of distance underestimation in virtual environments. Presence: Teleoperators and Virtual Environments.

Lackner, J. R., & DiZio, P. (1989). Altered sensory-motor control of the head as an etiological factor in space-motion sickness. Perceptual and Motor Skills, 68, 784–786.

Lampton, D. R., McDonald, D. P., Singer, M., & Bliss, J. (1995). Distance estimation in virtual environments. In Proceedings of the Human Factors and Ergonomics Society 39th Annual Meeting, 1268–1272.

Loomis, J. M., Blascovich, J. J., & Beall, A. C. (1999). Immersive virtual environment technology as a basic research tool in psychology. Behavior Research Methods, Instruments, and Computers, 31(4), 557–564.

Loomis, J. M., Da Silva, J. A., Fujita, N., & Fukusima, S. S. (1992). Visual space perception and visually directed action. Journal of Experimental Psychology: Human Perception and Performance, 18, 906–921.

Loomis, J. M., & Knapp, J. M. (2003). Visual perception of egocentric distance in real and virtual environments. In L. Hettinger & M. Haas (Eds.), Virtual and adaptive environments (pp. 21–46). Hillsdale, NJ: Erlbaum.

Loomis, J. M., Philbeck, J. W., & Zahorik, P. (2002). Dissociation between location and shape in visual space. Journal of Experimental Psychology: Human Perception and Performance, 28, 1202–1212.

Ooi, T. L., Wu, B., & He, Z. J. (2001). Distance determination by the angular declination below the horizon. Nature, 414, 197–200.

Pick, H. L., Jr., Rieser, J. J., Wagner, D., & Garing, A. E. (1999). The recalibration of rotational locomotion. Journal of Experimental Psychology: Human Perception and Performance, 25(5), 1179–1188.

Raskar, R., & Cohen, M. (1999). Image precision silhouette edges. In Proceedings of the ACM Symposium on Interactive 3D Graphics, 135–140.

Rieser, J. J. (1999). Dynamic spatial orientation and the coupling of representation and action. In R. G. Golledge (Ed.), Wayfinding behavior: Cognitive mapping and other spatial processes (pp. 168–190). Baltimore, MD: Johns Hopkins University Press.

Rieser, J. J., Ashmead, D. H., Talor, C. R., & Youngquist, G. A. (1990). Visual perception and the guidance of locomotion without vision to previously seen targets. Perception, 19, 675–689.

Rieser, J. J., Pick, H. L., Jr., Ashmead, D., & Garing, A. (1995). Calibration of human locomotion and models of perceptual-motor organization. Journal of Experimental Psychology: Human Perception and Performance, 21, 480–497.

Rolland, J. P., Gibson, W., & Arierly, D. (1995). Towards quantifying depth and size perception as a function of viewing distance. Presence: Teleoperators and Virtual Environments, 4, 24–49.

Thomson, J. A. (1983). Is continuous visual monitoring necessary in visually guided locomotion? Journal of Experimental Psychology: Human Perception and Performance, 9(3), 427–443.

Wann, J. P., Rushton, S., & Mon-Williams, M. (1995). Natural problems for stereoscopic depth perception in virtual environments. Vision Research, 35(19), 2731–2736.

Willemsen, P., & Gooch, A. (2002). Perceived egocentric distances in real, image-based, and traditional virtual environments. In Proceedings of IEEE Virtual Reality Conference, 89–90.

Witmer, B., & Sadowski, W., Jr. (1998). Nonvisually guided locomotion to a previously viewed target in real and virtual environments. Human Factors, 40, 478–488.

Wraga, M. (1999). Using eye height in different postures to scale the heights of objects. Journal of Experimental Psychology: Human Perception and Performance, 25(2), 518–530.



real-world condition Rendering update rates were noless than 30 frames per second in each eye

The wireframe virtual environment condition wasconstructed by rendering feature edges of the modelused in the second virtual-environment condition Oursoftware used an OpenGL silhouette drawing algorithm(Raskar amp Cohen 1999) to generate the feature edgesThe frame rates for this environment were no less than40 frames per second The wireframe rendering pro-

Figure 2 Sample imagery for conditions 2 3 and 4 (a) Section of panorama image showing target (b) Example of low-quality computer

graphics image showing target The viewpoint is the same as for Figure 2a (c) Example of wireframe computer graphics image showing target

The viewpoint is the same as for Figure 2a

Thompson et al 565

duced scenes that resemble black-on-white sketches ofthe classroom building lobby The target was renderedwith feature edges as well with size and position thesame as for the previous conditions

For both the texture-mapped and wireframe com-puter graphics conditions eye heights were renderedbased on the subjectsrsquo actual eye heights Interpupillarydistances for stereo rendering were fixed at 65 cm con-sistent with the panorama images

33 Procedure

Subjects were first provided with written instruc-tions that described the triangulated walking task andthen given a demonstration of the task in a space bothsmaller and different from the actual experiment spatiallayout For all conditions both real and virtual subjectswere instructed to obtain a good image of the targetand their local surroundings while first facing the targetSubjects were told that a ldquogood imagerdquo is obtained ifafter closing their eyes they would still be able to ldquoseerdquothe environment and most importantly the targetSubjects were allowed to rotate their head about theirneck but were instructed not to move their head fromside to side or back and forth This was done to mini-mize motion-parallax cues in the real-world conditionso as to make it as comparable as possible to the virtual-world conditions

Once a good image was achieved subjects were in-structed to physically turn their bodies approximately70deg to the right to face a junction of two walls in theenvironment After subjects turned they were in-structed to turn their head back toward the target toobtain a final view and reaffirm their mental image ofthe environment Then subjects either blindfoldedthemselves (real-world condition) or closed their eyeswhile the HMD screen was cleared to black (virtual-world conditions) Subjects were then directed to walkpurposefully and decisively in the direction their bodywas facing After walking approximately 25 m an ex-perimenter would give the verbal command ldquoturnrdquo sig-naling the subject to turn toward the target and stopwalking when they felt they were facing the target Sub-jects were instructed to perform this turn as if they were

turning a corner in a real hallway to make the move-ment as natural as possible At this point the subjectrsquosposition was marked and they were directed to ldquoTaketwo steps in the direction of the targetrdquo Again the sub-jectrsquos position was marked and recorded The subjectwas then led without vision to the starting location byan experimenter In all conditions the apparent locationof the target was assumed to lie at the intersection ofthe line of sight to the (visible) target from the initialvantage point and a line corresponding to the subjectrsquostrajectory on the final walk toward the presumed targetlocation (Figure 1)

The userrsquos own body is seldom rendered in immersivevirtual environments This is a potential problem wheninvestigating absolute egocentric distance judgmentssince eye height is an important scaling factor that couldconceivably be affected by looking down at the userrsquosfeet and the floor on which she or he is standing Ren-dering avatar feet may not be sufficient since it is diffi-cult to achieve a high degree of realism We controlledfor this potential problem by having users wear a circu-lar collar in both the real-world and virtual-world condi-tions (Figure 3) The collar had the effect of occludingusersrsquo view of the floor out to about 2 m hiding thearea around their feet in all four tested conditions

Prior to the experiment trials subjects practiced blindwalking for five minutes During this practice subjectswalked blindfolded in a hallway and responded to verbalcommands to start and stop walking The training ishelpful in building trust between the experimenter andthe subject (Rieser 1999) but more importantly accus-

Figure 3 Viewing collar to hide viewerrsquos body and floor close to

standing position

566 PRESENCE VOLUME 13 NUMBER 5

toms the subject to walking blind During both thetraining session and the actual experiment subjectswore headphones fed by an external microphone to helplimit the effects of sound localization in the environ-ment A remote microphone worn by the experimenterallowed subjects to hear instructions After the trainingsession subjects were led still blindfolded either to ourlaboratory or to the real lobby This last step was per-formed to help ensure that the subjectrsquos movement dur-ing the experiment would not be inhibited by a prioriknowledge of the location of the walls in our lab Thesound-masking headphones remained on during thistime For the virtual-world conditions when subjectsarrived in the laboratory the HMD was placed on theirhead while their eyes remained closed Once on sub-jects were allowed to open their eyes and adjust the fitand focus of the HMD after which the orientation ofthe virtual world was aligned with the the natural rest-ing position of the HMD on the subject

4 Results

Figures 4ndash7 show the average judgments for eachof the four conditions real world high-quality pan-

orama images low-quality texture-mapped computergraphics and wireframe Error bars indicate one stan-dard error above and below the mean The intersectioncomputation used to compute apparent distance (Figure1) results in asymmetric variability around the meansince a turn of deg too far to the right produced an over-shoot in distance larger than the undershoot in distance

Figure 4 Distance judgments Real world Figure 5 Distance judgments Panorama images

Figure 6 Distance judgments Low-quality computer graphics

Thompson et al 567

produced by a turn of deg too far to the left An arctan-gent transform was applied to the data to reduce thiseffect Averages error estimates and measures of statis-tical significance were calculated in the transform spaceThe inverse transform was then applied to the calculatedaverages and errors in order to allow presentation of thefinal results in terms of judged distance

Figure 8 allows easy comparisons between results forall four conditions The experiment confirmed previousstudies showing that for standing observers viewingground-level targets in action-space range distancejudgments in the real world were near veridical (real-world) while distance judgments based on computergraphics were significantly compressed The surprisingresult was that the amount of compression was nearlythe same for all three graphical displays That is dis-tance judgments were almost unaffected by the qualityof the imagery presented to subjects

A 4 (environment) 3 (distance) 2 (sex) repeated-measures ANOVA with distance as a within-subjectvariable and environment and sex as between-subjectvariables was performed on the transformed averagedistance judgments and indicated a significant effect ofenvironment F(3 40) 1077 p 001 Collapsedacross distance Scheffe post hoc comparisons showed

that distance judgments in the real world were greaterthan those given in each of the other environments (p

01) and that performance in the other three environ-ments did not differ (p 48 for all comparisons) Al-though the means at 10 m or 15 m suggest differencesbetween the virtual conditions post hoc univariateANOVAs (with three environmental conditions) at eachdistance indicated that these differences were negligible(p 4 for the effect of environment) The ANOVAalso indicated an effect of distance F(2 80) 18384p 001 Judged distance increased as a function ofphysical distance for all environments In all the analy-ses demonstrated that perceived distance was signifi-cantly more accurate in the real world compared to thevirtual environments and that distance judgments in thevirtual environments did not vary much from eachother

5 Discussion

The results presented above are a strong indicatorthat compressed absolute egocentric distance judgmentsin visually immersive environments are not caused by alack of realistic graphics rendering The phenomenal

Figure 7 Distance judgments Wireframe graphics Figure 8 Distance judgments Comparison of all conditions

568 PRESENCE VOLUME 13 NUMBER 5

experience of realism in the panoramic environment isbest expressed by the comments of several subjectsWhen looking into a glass window in the rendered dis-play they commented ldquoWhy canrsquot I see my reflection inthe glassrdquo Despite this subjective experience judg-ments based on wireframe renderings were as good asjudgments based on actual images presented with thesame display system In all virtual environments therewas a large compression of egocentric distance As aresult absolute egocentric distance judgments in virtualenvironments are not likely to be aided by photorealisticimprovements in computer graphics such as better tex-turing and illumination From a theoretical standpointthis suggests that familiar size may be a relatively minorcontributor to the sort of distance judgments that wereinvestigated though it is important to note that all fourconditions involved hallway-like scaling and geometryThe similarity between judged distances to targets onthe floor in the three types of virtual displays is consis-tent with the hypothesis that the declination of visualangle to targets dominates distance egocentric percep-tion (Ooi Wu amp He 2001) However this does notexplain the large differences observed between distancejudgments in the real and virtual conditions

The present experiment used a methodology that in-volved a stationary viewer and an action-based judg-ment task to address specific questions about judgmentsof distance in visually immersive environments Our in-tent was to determine whether observers would judgeegocentric distance in the simulated environment in asimilar manner as in the real world without the experi-ence of active exploration Thus we restricted the ob-serverrsquos movement while viewing the environmentsPrevious visual-motor adaptation studies (Rieser PickAshmead amp Garing 1995 Pick Rieser Wagner ampGaring 1999) have demonstrated that active observerswill quickly adapt to a new mapping between visual in-put and their own movements leading to the result ofmodified motor output that corresponds to the visualworld (recalibration) We might predict that allowingactive exploration of the virtual environments wouldlead to a similar adaptation and recalibration effect sothat observers would learn to walk and turn an accuratedistance to virtual targets While this prediction ad-

dresses an important question it is a different questionthan the one presently asked in this paper Our goal wasto test whether egocentric distance judgments wouldreplicate the accurate performance demonstrated in thereal world not whether these judgments could becomeaccurate after interacting within a compressed percep-tion of the world Future studies should consider boththe extent of veridical perception in visually immersiveenvironments and the role of actions in making immer-sive environments useful despite a potential lack ofveridical perception

What might explain the compression of absolute ego-centric distance judgments if not image quality Wesuggest several possibilities but no solid evidence sup-porting any of the potential explanations has yet beenpublished While the realism of the panorama imagesused in this study far exceeded any of the computergraphics employed in distance-judgment experiments byother investigators resolution and apparent sharpnesswere still limited compared to natural viewing of thereal world This may have influenced a familiar-size ef-fect or may have degraded the sense of presence whilewearing the HMD Dixon Wraga Proffitt and Wil-liams (2000) found that visual immersion was neededfor eye height to appropriately scale linear perspectivecues Perhaps a full sense of presence not only visualimmersion is needed for distance judgments to be com-parable to what is seen in the real world Limited fieldof view is often suggested as a cause of distorted spatialvision in HMDs but Knapp and Loomis (in press)found that limiting FOV did not affect real-world ego-centric distance judgments at least if the observer wasfree to move his or her head to visually explore the envi-ronment Motion parallax was not present in our virtualdisplay conditions but motion parallax appears to be arather weak absolute-distance cue (Beall et al 1995)In addition subjects performed veridically in our real-word condition with at most very limited translationalhead motion Focus and stereo convergence are not wellcontrolled in HMDs (Rolland et al 1995 WannRushton amp Mon-Williams 1995) and incorrect accom-modation cues are known to affect distance judgments(Andersen Saidpour amp Braunstein 1998 BinghamBradley Bailey amp Vinner 2001) It seems unlikely

Thompson et al 569

however that accommodation and convergence wouldhave an effect this large at the distances we were investi-gating Finally there may be some sort of ergonomiceffect associated with wearing an HMD (Lackner amp Di-Zio 1989)

Future research that manipulates factors other thanthe image quality such as FOV stereo and physicaleffects of the HMD is needed to begin to answer thesequestions A sense of presence is more difficult to defineand manipulate but is likely to be an important compo-nent in accurate distance perception in virtual environ-ments

Acknowledgments

This material is based upon work supported by the NationalScience Foundation under grants 9623614 0080999 and0121084 Thanks to Alias Wavefront for their donation ofMaya Complete which was used in this project

References

Andersen G J Saidpour A amp Braunstein M L (1998)Effects of collimation on perceived layout in 3-D scenesPerception 27 1305ndash1315

Beall A C Loomis J M Philbeck J M amp Fikes T J(1995) Absolute motion parallax weakly determines visualscale in real and virtual environments In Proceedings of theSPIE-The International Society for Optical Engineering2411 288ndash297

Bingham G P Bradley A Bailey M amp Vinner R (2001)Accomodation occlusion and disparity matching are usedto guide reaching A comparison of actual versus virtual en-vironments Journal of Experimental Psychology HumanPerception and Performance 27 1314ndash1334

Cutting J E amp Vishton P M (1995) Perceiving layoutand knowing distance The integration relative potency andcontextual use of different information about depth In WEpstein amp S Rogers (Eds) Perception of Space and Motion(pp 69ndash117) New York Academic

Dixon M W Wraga M Proffitt D R amp Williams G C(2000) Eye height scaling of absolute size in immersive and

nonimmersive displays Journal of Experimental PsychologyHuman Perception and Performance 26(2) 582ndash593

Durgin F H Fox L F Lewis J amp Walley K A (2002)Perceptuomotor adaptation More than meets the eye Pa-per presented at the forty-third annual meeting of the Psy-chonomic Society Kansas City MO

Ellis S R amp Menges B M (1997) Judgments of the dis-tance to nearby virtual objects Interaction of viewing con-ditions and accommodative demand Presence Teleoperatorsand Virtual Environments 6 452ndash462

Foley J M Ribeiro-Filho N P amp Da Silva J A (2004)Visual perception of extent and the geometry of visualspace Vision Research 44 147ndash156

Fukusima S S Loomis J M amp Da Silva J A (1997) Vi-sual perception of egocentric distance as assessed by trian-gulation Journal of Experimental Psychology Human Per-ception and Performance 23(1) 86ndash100

Knapp J M (1999) The visual perception of egocentric dis-tance in virtual environments Unpublished doctoral disser-tation University of California at Santa Barbara

Knapp J M amp Loomis J M (in press) Limited field ofview of head-mounted displays is not the cause of distanceunderestimation in virtual environments Presence Teleop-erators and Virtual Environments

Lackner J R amp DiZio P (1989) Altered sensory-motorcontrol of the head as an etiological factor in space-motionsickness Perceptual and Motor Skills 68 784ndash786

Lampton D R McDonald D P Singer M amp Bliss J(1995) Distance estimation in virtual environments In Pro-ceedings of the Human Factors and Ergonomics Society 39thAnnual Meeting 1268ndash1272

Loomis J M Blascovich J J amp Beall A C (1999) Im-mersive virtual environment technology as a basic researchtool in psychology Behavior Research Methods Instrumentsand Computers 31(4) 557ndash564

Loomis J M Da Silva J A Fujita N amp Fukusima S S(1992) Visual space perception and visually directed actionJournal of Experimental Psychology Human Perception andPerformance 18 906ndash921

Loomis J M amp Knapp J M (2003) Visual perception ofegocentric distance in real and virtual environments In LHettinger amp M Haas (Eds) Virtual and adaptive environ-ments (pp 21ndash 46) Hillsdale NJ Erlbaum

Loomis J M Philbeck J W amp Zahorik P (2002) Disso-ciation between location and shape in visual space Journal

570 PRESENCE VOLUME 13 NUMBER 5

of Experimental Psychology Human Perception and Perfor-mance 28 1202ndash1212

Ooi T L Wu B amp He Z J (2001) Distance determina-tion by the angular declination below the horizon Nature414 197ndash200

Pick H L Jr Rieser J J Wagner D amp Garing A E(1999) The recalibration of rotational locomotion Journalof Experimental Psychology Human Perception and Perfor-mance 25(5) 1179ndash1188

Raskar R amp Cohen M (1999) Image precision silhouetteedges In Proceedings of the ACM Symposium on Interactive3D Graphics 135ndash140

Rieser J J (1999) Dynamic spatial orientation and the cou-pling of representation and action In R G Golledge (Ed)Wayfinding behavior Cognitive mapping and other spatialprocesses (pp 168ndash190) Baltimore MD Johns HopkinsUniversity Press

Rieser J J Ashmead D H Talor C R amp YoungquistG A (1990) Visual perception and the guidance of loco-motion without vision to previously seen targets Perception19 675ndash689

Rieser J J Pick H L Jr Ashmead D amp Garing A(1995) Calibration of human locomotion and models ofperceptual-motor organization Journal of Experimental

Psychology Human Perception and Performance 21 480ndash497

Rolland J P Gibson W amp Arierly D (1995) Towardsquantifying depth and size perception as a function of view-ing distance Presence Teleoperators and Virtual Environ-ments 4 24ndash49

Thomson J A (1983) Is continuous visual monitoring nec-essary in visually guided locomotion Journal of Experimen-tal Psychology Human Perception and Performance 9(3)427ndash443

Wann J P Rushton S amp Mon-Williams M (1995) Natu-ral problems for stereoscopic depth perception in virtualenvironments Vision Research 35(19) 2731ndash2736

Willemsen P amp Gooch A (2002) Perceived egocentric dis-tances in real image-based and traditional virtual environ-ments In Proceedings of IEEE Virtual Reality Conference89ndash90

Witmer B amp Sadowski W Jr (1998) Nonvisually guidedlocomotion to a previously viewed target in real and virtualenvironments Human Factors 40 478ndash488

Wraga M (1999) Using eye height in different postures toscale the heights of objects Journal of Experimental Psychol-ogy Human Perception and Performance 25(2) 518ndash530

Thompson et al 571

Page 4: Santa Barbara Department of Psychology, Santa Barbara ... et al Presence.… · Santa Barbara Santa Barbara, CA 93106 Andrew C. Beall beall@psych.ucsb.edu Department of Psychology,

distance perception High accuracy in distance estima-tion has been observed in visually directed action experi-ments across many studies

23 Prior Studies of DistanceJudgments in Visually ImmersiveEnvironments

In the last few years a number of research groupshave addressed the issue of space perception in visuallyimmersive environments This work has been motivatedby a desire to explore new techniques both for probinghuman vision (Loomis Blascovich amp Beall 1999) andfor quantifying operator performance in virtual environ-ments (Lampton McDonald Singer amp Bliss 1995)Table 2 summarizes the results of four previous studiesof absolute egocentric distance judgments over actionspace in visually immersive environments along withsome of the results discussed further in section 4 Ineach of these studies some form of directed action wasused to evaluate distance judgments in both real andcomputer generated scenes All involved indoor envi-

ronments and targets situated on a level ground planeThe first study used a Fakespace Labs BOOM2C displaywith 1280 1024 resolution The second and thirdstudies used a Virtual Research FS5 HMD with 800

480 resolution The final two studies used an nVisionHiRes HMD with 1280 1024 resolution

One of the striking results from these studies is thatdistance judgments in virtual environments were consis-tently underestimated compared with judgments in thereal world Most of the results in the CG column of Ta-ble 2 were based on imagery comparable to that shownin Figure 2b One potential explanation for this com-pression of virtual space is that the quality of the imag-ery is too poor to generate an effective familiar-size ef-fect The experiment described below is aimed atexploring this conjecture

3 Method

In order to investigate the degree to which imagequality affects egocentric-distance judgments in virtualenvironments we compared distance judgments in thereal world with distance judgments in virtual environ-ments utilizing three very distinct styles of graphicalrendering 360deg high-resolution panoramic images in-tentionally low-quality texture-mapped computer graphicsand wireframe renderings (Figure 2) We probed sub-jectsrsquo perceptions of distance using a directed-actiontask in which subjects indirectly walked without visiontoward a previously viewed target A between-subjectsdesign was used in which a given subject viewed trialsat three different distances in one of four different envi-ronments Care was taken to make the tasks similar inboth the real and virtual environments and to make thescale and general layout of all four environments equiva-lent

31 Participants

Forty-eight college-age students participated inthis study with six male and six female subjects in eachcondition Subjects either received course credit for par-ticipating or were volunteers All subjects were given a

Figure 1 Triangulated walking task Subjects start walking in an

oblique direction from the direction of a previously viewed target On

directions from the experimenter they turn and take several steps

toward where they perceived the previously viewed target to be

Thompson et al 563

stereogram eye test and had normal or corrected-to-normal vision Interpupillary distances ranged from 51cm to 77 cm with an average of 619 cm

32 Materials

In the real-world condition subjects viewed afoam-core circular disk approximately 37 cm in diameterand placed on the ground at distances of 5 m 10 mand 15 m The experiment was performed in the lobbyof an engineering classroom building Subject positionsrelevant to computing apparent distance (Figure 1) weredetermined by measuring foot positions on the floor

In the three virtual-world conditions imagery waspresented using an nVision Datavisor HiRes HMD withinterlaced 1280 1024 resolution full field-sequentialcolor and a 42deg horizontal field of view The angularresolution of the HMD was on the order of 2 arc min-utes per pixel The nVision has user-adjustable focusThe display was configured with 100 stereo overlapbetween the two eyes Head tracking was done using anInterSense IS600 Mark 2 tracker This tracker uses amix of inertial gravitational and acoustic technologiesto provide state-of-the art accuracy and latency Onlytracker rotation was used to update the viewpointWhile translational tracker positions were recorded theresults reported in section 4 were based on measuredfoot position on the floor in order to be consistent withthe real-world condition All computer-generated envi-

ronments were rendered on an SGI Onyx2 R12000with two IR2 rendering pipelines One rendering pipe-line was used for each eye to provide stereopsis

Multiple sets of panorama images were produced fordifferent target distances and eye heights based on pho-tographs acquired by swinging a camera around a fixedaxis located in the same position as the viewpoint forthe real-world condition Targets were placed in thesame locations as for the real-world condition To pro-vide stereo viewing two sets of images were taken foreach panorama with the camera offset laterally 325cm from the axis of rotation The two sets of photo-graphs were digitized onto a PhotoCD and then mosa-icked into two cylindrical images using the PanoramaFactory software package Each cylindrical image wastexture-mapped onto a set of polygons forming a cylin-drical configuration providing the ability to generateviews over a 360deg by 100deg portion of the optical sphereRendering update rates were no less than 40 frames persecond in each eye The result was a compelling sense ofbeing able to look around in the virtual environmentthough no motion parallax was available and the stereogeometry was slightly incorrect To control for subjectsrsquoeye heights multiple-panorama image pairs were pro-duced for eye heights spaced at 5 cm intervals and theset nearest to a given subjectrsquos eye height was used forthat subjectrsquos trials Practical concerns relating to themanner in which the original images were captured pre-cluded a similar control for interpupillary distance

Table 2 Distance Judgments Based on Viewing Imagery Generated by Computer Graphics (CG) and Using VisuallyImmersive Displays

Study Distance (m) Real () CG () Task

Witmer amp Sadowski (1998) 46ndash32 92 85 Treadmill walkingKnapp (1999) 5ndash15 100 42 Triangulated walkingDurgin Fox Lewis amp Walley

(2002) 2ndash8 65 Direct walkingWillemsen amp Gooch (2002) 2ndash5 100 81 Direct walkingConditions 1 and 2 this study 5ndash15 95 44 Triangulated walking

Note Distances are compressed relative to comparable judgments based on viewing real-world environments Thepercentages indicate the overall ratio of perceived distance to actual distance

564 PRESENCE VOLUME 13 NUMBER 5

The second virtual-environment condition involved acomputer graphics rendering of the same classroombuilding lobby The scale of the model was the same asthe actual building lobby but the geometric detail wasintentionally kept quite simple Stereotypical tiled tex-ture maps were used Simple point-source lighting wasused with no shadows or other global illumination ef-fects Targets were rendered as red disks with the sizeand position corresponding to what was used for the

real-world condition Rendering update rates were noless than 30 frames per second in each eye

The wireframe virtual environment condition wasconstructed by rendering feature edges of the modelused in the second virtual-environment condition Oursoftware used an OpenGL silhouette drawing algorithm(Raskar amp Cohen 1999) to generate the feature edgesThe frame rates for this environment were no less than40 frames per second The wireframe rendering pro-

Figure 2 Sample imagery for conditions 2 3 and 4 (a) Section of panorama image showing target (b) Example of low-quality computer

graphics image showing target The viewpoint is the same as for Figure 2a (c) Example of wireframe computer graphics image showing target

The viewpoint is the same as for Figure 2a

Thompson et al 565

duced scenes that resemble black-on-white sketches ofthe classroom building lobby The target was renderedwith feature edges as well with size and position thesame as for the previous conditions

For both the texture-mapped and wireframe com-puter graphics conditions eye heights were renderedbased on the subjectsrsquo actual eye heights Interpupillarydistances for stereo rendering were fixed at 65 cm con-sistent with the panorama images

33 Procedure

Subjects were first provided with written instruc-tions that described the triangulated walking task andthen given a demonstration of the task in a space bothsmaller and different from the actual experiment spatiallayout For all conditions both real and virtual subjectswere instructed to obtain a good image of the targetand their local surroundings while first facing the targetSubjects were told that a ldquogood imagerdquo is obtained ifafter closing their eyes they would still be able to ldquoseerdquothe environment and most importantly the targetSubjects were allowed to rotate their head about theirneck but were instructed not to move their head fromside to side or back and forth This was done to mini-mize motion-parallax cues in the real-world conditionso as to make it as comparable as possible to the virtual-world conditions

Once a good image was achieved, subjects were instructed to physically turn their bodies approximately 70° to the right, to face a junction of two walls in the environment. After subjects turned, they were instructed to turn their head back toward the target to obtain a final view and reaffirm their mental image of the environment. Then subjects either blindfolded themselves (real-world condition) or closed their eyes while the HMD screen was cleared to black (virtual-world conditions). Subjects were then directed to walk purposefully and decisively in the direction their body was facing. After walking approximately 2.5 m, an experimenter would give the verbal command "turn," signaling the subject to turn toward the target and stop walking when they felt they were facing the target. Subjects were instructed to perform this turn as if they were turning a corner in a real hallway, to make the movement as natural as possible. At this point the subject's position was marked, and they were directed to "Take two steps in the direction of the target." Again the subject's position was marked and recorded. The subject was then led, without vision, to the starting location by an experimenter. In all conditions, the apparent location of the target was assumed to lie at the intersection of the line of sight to the (visible) target from the initial vantage point and a line corresponding to the subject's trajectory on the final walk toward the presumed target location (Figure 1).
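In two dimensions, that intersection reduces to solving a small linear system. The sketch below is a minimal illustration under assumed coordinates; the judged_distance helper and the sample trial values are hypothetical, not taken from the paper.

import numpy as np

def judged_distance(target_bearing, turn_point, facing_point):
    # Intersect the initial line of sight (origin plus t * target_bearing)
    # with the line defined by the subject's final heading (from turn_point
    # toward facing_point).  The returned t is the indicated egocentric
    # distance along the original line of sight.
    heading = np.asarray(facing_point, float) - np.asarray(turn_point, float)
    A = np.column_stack([np.asarray(target_bearing, float), -heading])
    t, _ = np.linalg.solve(A, np.asarray(turn_point, float))
    return t

# Hypothetical trial: target straight ahead (+y), a 70-degree turn to the
# right, a 2.5 m walk, then two steps back toward where the target appeared.
bearing = np.array([0.0, 1.0])
turn = 2.5 * np.array([np.sin(np.radians(70)), np.cos(np.radians(70))])
steps = turn + np.array([-np.sin(np.radians(30)), np.cos(np.radians(30))])
print(round(judged_distance(bearing, turn, steps), 2))   # about 4.9 m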

The user's own body is seldom rendered in immersive virtual environments. This is a potential problem when investigating absolute egocentric distance judgments, since eye height is an important scaling factor that could conceivably be affected by looking down at the user's feet and the floor on which she or he is standing. Rendering avatar feet may not be sufficient, since it is difficult to achieve a high degree of realism. We controlled for this potential problem by having users wear a circular collar in both the real-world and virtual-world conditions (Figure 3). The collar had the effect of occluding users' view of the floor out to about 2 m, hiding the area around their feet in all four tested conditions.

Figure 3. Viewing collar to hide the viewer's body and the floor close to the standing position.

Prior to the experiment trials, subjects practiced blind walking for five minutes. During this practice, subjects walked blindfolded in a hallway and responded to verbal commands to start and stop walking. The training is helpful in building trust between the experimenter and the subject (Rieser, 1999) but, more importantly, accustoms the subject to walking blind. During both the training session and the actual experiment, subjects wore headphones fed by an external microphone to help limit the effects of sound localization in the environment. A remote microphone worn by the experimenter allowed subjects to hear instructions. After the training session, subjects were led, still blindfolded, either to our laboratory or to the real lobby. This last step was performed to help ensure that the subject's movement during the experiment would not be inhibited by a priori knowledge of the location of the walls in our lab. The sound-masking headphones remained on during this time. For the virtual-world conditions, when subjects arrived in the laboratory the HMD was placed on their head while their eyes remained closed. Once it was on, subjects were allowed to open their eyes and adjust the fit and focus of the HMD, after which the orientation of the virtual world was aligned with the natural resting position of the HMD on the subject.

4 Results

Figures 4–7 show the average judgments for each of the four conditions: real world, high-quality panorama images, low-quality texture-mapped computer graphics, and wireframe. Error bars indicate one standard error above and below the mean. The intersection computation used to compute apparent distance (Figure 1) results in asymmetric variability around the mean, since a turn of a given angle too far to the right produced an overshoot in distance larger than the undershoot in distance produced by a turn of the same angle too far to the left. An arctangent transform was applied to the data to reduce this effect. Averages, error estimates, and measures of statistical significance were calculated in the transform space. The inverse transform was then applied to the calculated averages and errors in order to allow presentation of the final results in terms of judged distance.

Figure 4. Distance judgments: Real world.
Figure 5. Distance judgments: Panorama images.
Figure 6. Distance judgments: Low-quality computer graphics.
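The paper does not spell out the exact form of the arctangent transform, but the idea can be sketched as follows: convert each triangulated distance to an angle relative to a fixed baseline, average in the transform space, and transform back. The function below is one plausible reading under that assumption; the 2.5 m baseline and the sample data are illustrative only, not values from the experiment.

import numpy as np

def transformed_mean(judged_distances, baseline=2.5):
    # Map each judged distance to an angle (arctangent of distance over a
    # fixed baseline), average in the transform space, then invert.  This
    # damps the asymmetry in which a turn that overshoots the heading
    # inflates raw distances more than an equal undershoot deflates them.
    angles = np.arctan2(np.asarray(judged_distances, float), baseline)
    return baseline * np.tan(angles.mean())

print(round(transformed_mean([3.8, 4.6, 5.9, 7.5]), 2))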

Figure 8 allows easy comparisons between results for all four conditions. The experiment confirmed previous studies showing that, for standing observers viewing ground-level targets in the action-space range, distance judgments in the real world were near veridical, while distance judgments based on computer graphics were significantly compressed. The surprising result was that the amount of compression was nearly the same for all three graphical displays. That is, distance judgments were almost unaffected by the quality of the imagery presented to subjects.

A 4 (environment) × 3 (distance) × 2 (sex) repeated-measures ANOVA, with distance as a within-subject variable and environment and sex as between-subject variables, was performed on the transformed average distance judgments and indicated a significant effect of environment, F(3, 40) = 10.77, p < .001. Collapsed across distance, Scheffé post hoc comparisons showed that distance judgments in the real world were greater than those given in each of the other environments (p < .01) and that performance in the other three environments did not differ (p > .48 for all comparisons). Although the means at 10 m or 15 m suggest differences between the virtual conditions, post hoc univariate ANOVAs (with three environmental conditions) at each distance indicated that these differences were negligible (p > .4 for the effect of environment). The ANOVA also indicated an effect of distance, F(2, 80) = 183.84, p < .001. Judged distance increased as a function of physical distance for all environments. In all, the analyses demonstrated that perceived distance was significantly more accurate in the real world compared to the virtual environments, and that distance judgments in the virtual environments did not vary much from each other.

5 Discussion

The results presented above are a strong indicator that compressed absolute egocentric distance judgments in visually immersive environments are not caused by a lack of realistic graphics rendering. The phenomenal experience of realism in the panoramic environment is best expressed by the comments of several subjects: when looking into a glass window in the rendered display, they commented, "Why can't I see my reflection in the glass?" Despite this subjective experience, judgments based on wireframe renderings were as good as judgments based on actual images presented with the same display system. In all virtual environments there was a large compression of egocentric distance. As a result, absolute egocentric distance judgments in virtual environments are not likely to be aided by photorealistic improvements in computer graphics, such as better texturing and illumination. From a theoretical standpoint, this suggests that familiar size may be a relatively minor contributor to the sort of distance judgments that were investigated, though it is important to note that all four conditions involved hallway-like scaling and geometry. The similarity between judged distances to targets on the floor in the three types of virtual displays is consistent with the hypothesis that the declination of visual angle to targets dominates egocentric distance perception (Ooi, Wu, & He, 2001). However, this does not explain the large differences observed between distance judgments in the real and virtual conditions.

Figure 7. Distance judgments: Wireframe graphics.
Figure 8. Distance judgments: Comparison of all conditions.
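The angular-declination account mentioned above has a simple geometric core: for a target resting on a level floor, the implied distance follows from eye height and the angle of the target below the horizon. The snippet below states that standard relation; it is not code from the paper, and the example values are hypothetical.

import numpy as np

def distance_from_declination(eye_height_m, declination_deg):
    # Ground-plane distance implied by the angular declination below the
    # horizon, d = h / tan(theta), assuming a level floor and a target
    # resting on it.
    return eye_height_m / np.tan(np.radians(declination_deg))

# A target seen 9 degrees below the horizon from a 1.6 m eye height lies
# roughly 10 m away; misregistering the declination shifts the implied
# distance accordingly.
print(round(distance_from_declination(1.6, 9.0), 1))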

The present experiment used a methodology that involved a stationary viewer and an action-based judgment task to address specific questions about judgments of distance in visually immersive environments. Our intent was to determine whether observers would judge egocentric distance in the simulated environment in a similar manner as in the real world, without the experience of active exploration. Thus, we restricted the observer's movement while viewing the environments. Previous visual-motor adaptation studies (Rieser, Pick, Ashmead, & Garing, 1995; Pick, Rieser, Wagner, & Garing, 1999) have demonstrated that active observers will quickly adapt to a new mapping between visual input and their own movements, leading to the result of modified motor output that corresponds to the visual world (recalibration). We might predict that allowing active exploration of the virtual environments would lead to a similar adaptation and recalibration effect, so that observers would learn to walk and turn an accurate distance to virtual targets. While this prediction addresses an important question, it is a different question than the one presently asked in this paper. Our goal was to test whether egocentric distance judgments would replicate the accurate performance demonstrated in the real world, not whether these judgments could become accurate after interacting within a compressed perception of the world. Future studies should consider both the extent of veridical perception in visually immersive environments and the role of actions in making immersive environments useful despite a potential lack of veridical perception.

What might explain the compression of absolute egocentric distance judgments, if not image quality? We suggest several possibilities, but no solid evidence supporting any of the potential explanations has yet been published. While the realism of the panorama images used in this study far exceeded any of the computer graphics employed in distance-judgment experiments by other investigators, resolution and apparent sharpness were still limited compared to natural viewing of the real world. This may have influenced a familiar-size effect, or may have degraded the sense of presence while wearing the HMD. Dixon, Wraga, Proffitt, and Williams (2000) found that visual immersion was needed for eye height to appropriately scale linear perspective cues. Perhaps a full sense of presence, not only visual immersion, is needed for distance judgments to be comparable to what is seen in the real world. Limited field of view is often suggested as a cause of distorted spatial vision in HMDs, but Knapp and Loomis (in press) found that limiting FOV did not affect real-world egocentric distance judgments, at least if the observer was free to move his or her head to visually explore the environment. Motion parallax was not present in our virtual display conditions, but motion parallax appears to be a rather weak absolute-distance cue (Beall et al., 1995). In addition, subjects performed veridically in our real-world condition with at most very limited translational head motion. Focus and stereo convergence are not well controlled in HMDs (Rolland et al., 1995; Wann, Rushton, & Mon-Williams, 1995), and incorrect accommodation cues are known to affect distance judgments (Andersen, Saidpour, & Braunstein, 1998; Bingham, Bradley, Bailey, & Vinner, 2001). It seems unlikely, however, that accommodation and convergence would have an effect this large at the distances we were investigating. Finally, there may be some sort of ergonomic effect associated with wearing an HMD (Lackner & DiZio, 1989).

Future research that manipulates factors other than the image quality, such as FOV, stereo, and physical effects of the HMD, is needed to begin to answer these questions. A sense of presence is more difficult to define and manipulate, but is likely to be an important component in accurate distance perception in virtual environments.

Acknowledgments

This material is based upon work supported by the National Science Foundation under grants 9623614, 0080999, and 0121084. Thanks to Alias Wavefront for their donation of Maya Complete, which was used in this project.

References

Andersen, G. J., Saidpour, A., & Braunstein, M. L. (1998). Effects of collimation on perceived layout in 3-D scenes. Perception, 27, 1305–1315.

Beall, A. C., Loomis, J. M., Philbeck, J. M., & Fikes, T. J. (1995). Absolute motion parallax weakly determines visual scale in real and virtual environments. In Proceedings of the SPIE—The International Society for Optical Engineering, 2411, 288–297.

Bingham, G. P., Bradley, A., Bailey, M., & Vinner, R. (2001). Accomodation, occlusion, and disparity matching are used to guide reaching: A comparison of actual versus virtual environments. Journal of Experimental Psychology: Human Perception and Performance, 27, 1314–1334.

Cutting, J. E., & Vishton, P. M. (1995). Perceiving layout and knowing distance: The integration, relative potency, and contextual use of different information about depth. In W. Epstein & S. Rogers (Eds.), Perception of space and motion (pp. 69–117). New York: Academic Press.

Dixon, M. W., Wraga, M., Proffitt, D. R., & Williams, G. C. (2000). Eye height scaling of absolute size in immersive and nonimmersive displays. Journal of Experimental Psychology: Human Perception and Performance, 26(2), 582–593.

Durgin, F. H., Fox, L. F., Lewis, J., & Walley, K. A. (2002). Perceptuomotor adaptation: More than meets the eye. Paper presented at the forty-third annual meeting of the Psychonomic Society, Kansas City, MO.

Ellis, S. R., & Menges, B. M. (1997). Judgments of the distance to nearby virtual objects: Interaction of viewing conditions and accommodative demand. Presence: Teleoperators and Virtual Environments, 6, 452–462.

Foley, J. M., Ribeiro-Filho, N. P., & Da Silva, J. A. (2004). Visual perception of extent and the geometry of visual space. Vision Research, 44, 147–156.

Fukusima, S. S., Loomis, J. M., & Da Silva, J. A. (1997). Visual perception of egocentric distance as assessed by triangulation. Journal of Experimental Psychology: Human Perception and Performance, 23(1), 86–100.

Knapp, J. M. (1999). The visual perception of egocentric distance in virtual environments. Unpublished doctoral dissertation, University of California at Santa Barbara.

Knapp, J. M., & Loomis, J. M. (in press). Limited field of view of head-mounted displays is not the cause of distance underestimation in virtual environments. Presence: Teleoperators and Virtual Environments.

Lackner, J. R., & DiZio, P. (1989). Altered sensory-motor control of the head as an etiological factor in space-motion sickness. Perceptual and Motor Skills, 68, 784–786.

Lampton, D. R., McDonald, D. P., Singer, M., & Bliss, J. (1995). Distance estimation in virtual environments. In Proceedings of the Human Factors and Ergonomics Society 39th Annual Meeting, 1268–1272.

Loomis, J. M., Blascovich, J. J., & Beall, A. C. (1999). Immersive virtual environment technology as a basic research tool in psychology. Behavior Research Methods, Instruments, and Computers, 31(4), 557–564.

Loomis, J. M., Da Silva, J. A., Fujita, N., & Fukusima, S. S. (1992). Visual space perception and visually directed action. Journal of Experimental Psychology: Human Perception and Performance, 18, 906–921.

Loomis, J. M., & Knapp, J. M. (2003). Visual perception of egocentric distance in real and virtual environments. In L. Hettinger & M. Haas (Eds.), Virtual and adaptive environments (pp. 21–46). Hillsdale, NJ: Erlbaum.

Loomis, J. M., Philbeck, J. W., & Zahorik, P. (2002). Dissociation between location and shape in visual space. Journal of Experimental Psychology: Human Perception and Performance, 28, 1202–1212.

Ooi, T. L., Wu, B., & He, Z. J. (2001). Distance determination by the angular declination below the horizon. Nature, 414, 197–200.

Pick, H. L., Jr., Rieser, J. J., Wagner, D., & Garing, A. E. (1999). The recalibration of rotational locomotion. Journal of Experimental Psychology: Human Perception and Performance, 25(5), 1179–1188.

Raskar, R., & Cohen, M. (1999). Image precision silhouette edges. In Proceedings of the ACM Symposium on Interactive 3D Graphics, 135–140.

Rieser, J. J. (1999). Dynamic spatial orientation and the coupling of representation and action. In R. G. Golledge (Ed.), Wayfinding behavior: Cognitive mapping and other spatial processes (pp. 168–190). Baltimore, MD: Johns Hopkins University Press.

Rieser, J. J., Ashmead, D. H., Talor, C. R., & Youngquist, G. A. (1990). Visual perception and the guidance of locomotion without vision to previously seen targets. Perception, 19, 675–689.

Rieser, J. J., Pick, H. L., Jr., Ashmead, D., & Garing, A. (1995). Calibration of human locomotion and models of perceptual-motor organization. Journal of Experimental Psychology: Human Perception and Performance, 21, 480–497.

Rolland, J. P., Gibson, W., & Arierly, D. (1995). Towards quantifying depth and size perception as a function of viewing distance. Presence: Teleoperators and Virtual Environments, 4, 24–49.

Thomson, J. A. (1983). Is continuous visual monitoring necessary in visually guided locomotion? Journal of Experimental Psychology: Human Perception and Performance, 9(3), 427–443.

Wann, J. P., Rushton, S., & Mon-Williams, M. (1995). Natural problems for stereoscopic depth perception in virtual environments. Vision Research, 35(19), 2731–2736.

Willemsen, P., & Gooch, A. (2002). Perceived egocentric distances in real, image-based, and traditional virtual environments. In Proceedings of IEEE Virtual Reality Conference, 89–90.

Witmer, B., & Sadowski, W., Jr. (1998). Nonvisually guided locomotion to a previously viewed target in real and virtual environments. Human Factors, 40, 478–488.

Wraga, M. (1999). Using eye height in different postures to scale the heights of objects. Journal of Experimental Psychology: Human Perception and Performance, 25(2), 518–530.



The second virtual-environment condition involved acomputer graphics rendering of the same classroombuilding lobby The scale of the model was the same asthe actual building lobby but the geometric detail wasintentionally kept quite simple Stereotypical tiled tex-ture maps were used Simple point-source lighting wasused with no shadows or other global illumination ef-fects Targets were rendered as red disks with the sizeand position corresponding to what was used for the

real-world condition Rendering update rates were noless than 30 frames per second in each eye

The wireframe virtual environment condition wasconstructed by rendering feature edges of the modelused in the second virtual-environment condition Oursoftware used an OpenGL silhouette drawing algorithm(Raskar amp Cohen 1999) to generate the feature edgesThe frame rates for this environment were no less than40 frames per second The wireframe rendering pro-

Figure 2 Sample imagery for conditions 2 3 and 4 (a) Section of panorama image showing target (b) Example of low-quality computer

graphics image showing target The viewpoint is the same as for Figure 2a (c) Example of wireframe computer graphics image showing target

The viewpoint is the same as for Figure 2a

Thompson et al 565

duced scenes that resemble black-on-white sketches ofthe classroom building lobby The target was renderedwith feature edges as well with size and position thesame as for the previous conditions

For both the texture-mapped and wireframe com-puter graphics conditions eye heights were renderedbased on the subjectsrsquo actual eye heights Interpupillarydistances for stereo rendering were fixed at 65 cm con-sistent with the panorama images

33 Procedure

Subjects were first provided with written instruc-tions that described the triangulated walking task andthen given a demonstration of the task in a space bothsmaller and different from the actual experiment spatiallayout For all conditions both real and virtual subjectswere instructed to obtain a good image of the targetand their local surroundings while first facing the targetSubjects were told that a ldquogood imagerdquo is obtained ifafter closing their eyes they would still be able to ldquoseerdquothe environment and most importantly the targetSubjects were allowed to rotate their head about theirneck but were instructed not to move their head fromside to side or back and forth This was done to mini-mize motion-parallax cues in the real-world conditionso as to make it as comparable as possible to the virtual-world conditions

Once a good image was achieved subjects were in-structed to physically turn their bodies approximately70deg to the right to face a junction of two walls in theenvironment After subjects turned they were in-structed to turn their head back toward the target toobtain a final view and reaffirm their mental image ofthe environment Then subjects either blindfoldedthemselves (real-world condition) or closed their eyeswhile the HMD screen was cleared to black (virtual-world conditions) Subjects were then directed to walkpurposefully and decisively in the direction their bodywas facing After walking approximately 25 m an ex-perimenter would give the verbal command ldquoturnrdquo sig-naling the subject to turn toward the target and stopwalking when they felt they were facing the target Sub-jects were instructed to perform this turn as if they were

turning a corner in a real hallway to make the move-ment as natural as possible At this point the subjectrsquosposition was marked and they were directed to ldquoTaketwo steps in the direction of the targetrdquo Again the sub-jectrsquos position was marked and recorded The subjectwas then led without vision to the starting location byan experimenter In all conditions the apparent locationof the target was assumed to lie at the intersection ofthe line of sight to the (visible) target from the initialvantage point and a line corresponding to the subjectrsquostrajectory on the final walk toward the presumed targetlocation (Figure 1)

The userrsquos own body is seldom rendered in immersivevirtual environments This is a potential problem wheninvestigating absolute egocentric distance judgmentssince eye height is an important scaling factor that couldconceivably be affected by looking down at the userrsquosfeet and the floor on which she or he is standing Ren-dering avatar feet may not be sufficient since it is diffi-cult to achieve a high degree of realism We controlledfor this potential problem by having users wear a circu-lar collar in both the real-world and virtual-world condi-tions (Figure 3) The collar had the effect of occludingusersrsquo view of the floor out to about 2 m hiding thearea around their feet in all four tested conditions

Prior to the experiment trials subjects practiced blindwalking for five minutes During this practice subjectswalked blindfolded in a hallway and responded to verbalcommands to start and stop walking The training ishelpful in building trust between the experimenter andthe subject (Rieser 1999) but more importantly accus-

Figure 3 Viewing collar to hide viewerrsquos body and floor close to

standing position

566 PRESENCE VOLUME 13 NUMBER 5

toms the subject to walking blind During both thetraining session and the actual experiment subjectswore headphones fed by an external microphone to helplimit the effects of sound localization in the environ-ment A remote microphone worn by the experimenterallowed subjects to hear instructions After the trainingsession subjects were led still blindfolded either to ourlaboratory or to the real lobby This last step was per-formed to help ensure that the subjectrsquos movement dur-ing the experiment would not be inhibited by a prioriknowledge of the location of the walls in our lab Thesound-masking headphones remained on during thistime For the virtual-world conditions when subjectsarrived in the laboratory the HMD was placed on theirhead while their eyes remained closed Once on sub-jects were allowed to open their eyes and adjust the fitand focus of the HMD after which the orientation ofthe virtual world was aligned with the the natural rest-ing position of the HMD on the subject

4 Results

Figures 4ndash7 show the average judgments for eachof the four conditions real world high-quality pan-

orama images low-quality texture-mapped computergraphics and wireframe Error bars indicate one stan-dard error above and below the mean The intersectioncomputation used to compute apparent distance (Figure1) results in asymmetric variability around the meansince a turn of deg too far to the right produced an over-shoot in distance larger than the undershoot in distance

Figure 4 Distance judgments Real world Figure 5 Distance judgments Panorama images

Figure 6 Distance judgments Low-quality computer graphics

Thompson et al 567

produced by a turn of deg too far to the left An arctan-gent transform was applied to the data to reduce thiseffect Averages error estimates and measures of statis-tical significance were calculated in the transform spaceThe inverse transform was then applied to the calculatedaverages and errors in order to allow presentation of thefinal results in terms of judged distance

Figure 8 allows easy comparisons between results forall four conditions The experiment confirmed previousstudies showing that for standing observers viewingground-level targets in action-space range distancejudgments in the real world were near veridical (real-world) while distance judgments based on computergraphics were significantly compressed The surprisingresult was that the amount of compression was nearlythe same for all three graphical displays That is dis-tance judgments were almost unaffected by the qualityof the imagery presented to subjects

A 4 (environment) 3 (distance) 2 (sex) repeated-measures ANOVA with distance as a within-subjectvariable and environment and sex as between-subjectvariables was performed on the transformed averagedistance judgments and indicated a significant effect ofenvironment F(3 40) 1077 p 001 Collapsedacross distance Scheffe post hoc comparisons showed

that distance judgments in the real world were greaterthan those given in each of the other environments (p

01) and that performance in the other three environ-ments did not differ (p 48 for all comparisons) Al-though the means at 10 m or 15 m suggest differencesbetween the virtual conditions post hoc univariateANOVAs (with three environmental conditions) at eachdistance indicated that these differences were negligible(p 4 for the effect of environment) The ANOVAalso indicated an effect of distance F(2 80) 18384p 001 Judged distance increased as a function ofphysical distance for all environments In all the analy-ses demonstrated that perceived distance was signifi-cantly more accurate in the real world compared to thevirtual environments and that distance judgments in thevirtual environments did not vary much from eachother

5 Discussion

The results presented above are a strong indicatorthat compressed absolute egocentric distance judgmentsin visually immersive environments are not caused by alack of realistic graphics rendering The phenomenal

Figure 7 Distance judgments Wireframe graphics Figure 8 Distance judgments Comparison of all conditions

568 PRESENCE VOLUME 13 NUMBER 5

experience of realism in the panoramic environment isbest expressed by the comments of several subjectsWhen looking into a glass window in the rendered dis-play they commented ldquoWhy canrsquot I see my reflection inthe glassrdquo Despite this subjective experience judg-ments based on wireframe renderings were as good asjudgments based on actual images presented with thesame display system In all virtual environments therewas a large compression of egocentric distance As aresult absolute egocentric distance judgments in virtualenvironments are not likely to be aided by photorealisticimprovements in computer graphics such as better tex-turing and illumination From a theoretical standpointthis suggests that familiar size may be a relatively minorcontributor to the sort of distance judgments that wereinvestigated though it is important to note that all fourconditions involved hallway-like scaling and geometryThe similarity between judged distances to targets onthe floor in the three types of virtual displays is consis-tent with the hypothesis that the declination of visualangle to targets dominates distance egocentric percep-tion (Ooi Wu amp He 2001) However this does notexplain the large differences observed between distancejudgments in the real and virtual conditions

The present experiment used a methodology that in-volved a stationary viewer and an action-based judg-ment task to address specific questions about judgmentsof distance in visually immersive environments Our in-tent was to determine whether observers would judgeegocentric distance in the simulated environment in asimilar manner as in the real world without the experi-ence of active exploration Thus we restricted the ob-serverrsquos movement while viewing the environmentsPrevious visual-motor adaptation studies (Rieser PickAshmead amp Garing 1995 Pick Rieser Wagner ampGaring 1999) have demonstrated that active observerswill quickly adapt to a new mapping between visual in-put and their own movements leading to the result ofmodified motor output that corresponds to the visualworld (recalibration) We might predict that allowingactive exploration of the virtual environments wouldlead to a similar adaptation and recalibration effect sothat observers would learn to walk and turn an accuratedistance to virtual targets While this prediction ad-

dresses an important question it is a different questionthan the one presently asked in this paper Our goal wasto test whether egocentric distance judgments wouldreplicate the accurate performance demonstrated in thereal world not whether these judgments could becomeaccurate after interacting within a compressed percep-tion of the world Future studies should consider boththe extent of veridical perception in visually immersiveenvironments and the role of actions in making immer-sive environments useful despite a potential lack ofveridical perception

What might explain the compression of absolute ego-centric distance judgments if not image quality Wesuggest several possibilities but no solid evidence sup-porting any of the potential explanations has yet beenpublished While the realism of the panorama imagesused in this study far exceeded any of the computergraphics employed in distance-judgment experiments byother investigators resolution and apparent sharpnesswere still limited compared to natural viewing of thereal world This may have influenced a familiar-size ef-fect or may have degraded the sense of presence whilewearing the HMD Dixon Wraga Proffitt and Wil-liams (2000) found that visual immersion was neededfor eye height to appropriately scale linear perspectivecues Perhaps a full sense of presence not only visualimmersion is needed for distance judgments to be com-parable to what is seen in the real world Limited fieldof view is often suggested as a cause of distorted spatialvision in HMDs but Knapp and Loomis (in press)found that limiting FOV did not affect real-world ego-centric distance judgments at least if the observer wasfree to move his or her head to visually explore the envi-ronment Motion parallax was not present in our virtualdisplay conditions but motion parallax appears to be arather weak absolute-distance cue (Beall et al 1995)In addition subjects performed veridically in our real-word condition with at most very limited translationalhead motion Focus and stereo convergence are not wellcontrolled in HMDs (Rolland et al 1995 WannRushton amp Mon-Williams 1995) and incorrect accom-modation cues are known to affect distance judgments(Andersen Saidpour amp Braunstein 1998 BinghamBradley Bailey amp Vinner 2001) It seems unlikely

Thompson et al 569

however that accommodation and convergence wouldhave an effect this large at the distances we were investi-gating Finally there may be some sort of ergonomiceffect associated with wearing an HMD (Lackner amp Di-Zio 1989)

Future research that manipulates factors other thanthe image quality such as FOV stereo and physicaleffects of the HMD is needed to begin to answer thesequestions A sense of presence is more difficult to defineand manipulate but is likely to be an important compo-nent in accurate distance perception in virtual environ-ments

Acknowledgments

This material is based upon work supported by the NationalScience Foundation under grants 9623614 0080999 and0121084 Thanks to Alias Wavefront for their donation ofMaya Complete which was used in this project

References

Andersen G J Saidpour A amp Braunstein M L (1998)Effects of collimation on perceived layout in 3-D scenesPerception 27 1305ndash1315

Beall A C Loomis J M Philbeck J M amp Fikes T J(1995) Absolute motion parallax weakly determines visualscale in real and virtual environments In Proceedings of theSPIE-The International Society for Optical Engineering2411 288ndash297

Bingham G P Bradley A Bailey M amp Vinner R (2001)Accomodation occlusion and disparity matching are usedto guide reaching A comparison of actual versus virtual en-vironments Journal of Experimental Psychology HumanPerception and Performance 27 1314ndash1334

Cutting J E amp Vishton P M (1995) Perceiving layoutand knowing distance The integration relative potency andcontextual use of different information about depth In WEpstein amp S Rogers (Eds) Perception of Space and Motion(pp 69ndash117) New York Academic

Dixon M W Wraga M Proffitt D R amp Williams G C(2000) Eye height scaling of absolute size in immersive and

nonimmersive displays Journal of Experimental PsychologyHuman Perception and Performance 26(2) 582ndash593

Durgin F H Fox L F Lewis J amp Walley K A (2002)Perceptuomotor adaptation More than meets the eye Pa-per presented at the forty-third annual meeting of the Psy-chonomic Society Kansas City MO

Ellis S R amp Menges B M (1997) Judgments of the dis-tance to nearby virtual objects Interaction of viewing con-ditions and accommodative demand Presence Teleoperatorsand Virtual Environments 6 452ndash462

Foley J M Ribeiro-Filho N P amp Da Silva J A (2004)Visual perception of extent and the geometry of visualspace Vision Research 44 147ndash156

Fukusima S S Loomis J M amp Da Silva J A (1997) Vi-sual perception of egocentric distance as assessed by trian-gulation Journal of Experimental Psychology Human Per-ception and Performance 23(1) 86ndash100

Knapp J M (1999) The visual perception of egocentric dis-tance in virtual environments Unpublished doctoral disser-tation University of California at Santa Barbara

Knapp J M amp Loomis J M (in press) Limited field ofview of head-mounted displays is not the cause of distanceunderestimation in virtual environments Presence Teleop-erators and Virtual Environments

Lackner J R amp DiZio P (1989) Altered sensory-motorcontrol of the head as an etiological factor in space-motionsickness Perceptual and Motor Skills 68 784ndash786

Lampton D R McDonald D P Singer M amp Bliss J(1995) Distance estimation in virtual environments In Pro-ceedings of the Human Factors and Ergonomics Society 39thAnnual Meeting 1268ndash1272

Loomis J M Blascovich J J amp Beall A C (1999) Im-mersive virtual environment technology as a basic researchtool in psychology Behavior Research Methods Instrumentsand Computers 31(4) 557ndash564

Loomis J M Da Silva J A Fujita N amp Fukusima S S(1992) Visual space perception and visually directed actionJournal of Experimental Psychology Human Perception andPerformance 18 906ndash921

Loomis J M amp Knapp J M (2003) Visual perception ofegocentric distance in real and virtual environments In LHettinger amp M Haas (Eds) Virtual and adaptive environ-ments (pp 21ndash 46) Hillsdale NJ Erlbaum

Loomis J M Philbeck J W amp Zahorik P (2002) Disso-ciation between location and shape in visual space Journal

570 PRESENCE VOLUME 13 NUMBER 5

of Experimental Psychology Human Perception and Perfor-mance 28 1202ndash1212

Ooi T L Wu B amp He Z J (2001) Distance determina-tion by the angular declination below the horizon Nature414 197ndash200

Pick H L Jr Rieser J J Wagner D amp Garing A E(1999) The recalibration of rotational locomotion Journalof Experimental Psychology Human Perception and Perfor-mance 25(5) 1179ndash1188

Raskar R amp Cohen M (1999) Image precision silhouetteedges In Proceedings of the ACM Symposium on Interactive3D Graphics 135ndash140

Rieser J J (1999) Dynamic spatial orientation and the cou-pling of representation and action In R G Golledge (Ed)Wayfinding behavior Cognitive mapping and other spatialprocesses (pp 168ndash190) Baltimore MD Johns HopkinsUniversity Press

Rieser J J Ashmead D H Talor C R amp YoungquistG A (1990) Visual perception and the guidance of loco-motion without vision to previously seen targets Perception19 675ndash689

Rieser J J Pick H L Jr Ashmead D amp Garing A(1995) Calibration of human locomotion and models ofperceptual-motor organization Journal of Experimental

Psychology Human Perception and Performance 21 480ndash497

Rolland J P Gibson W amp Arierly D (1995) Towardsquantifying depth and size perception as a function of view-ing distance Presence Teleoperators and Virtual Environ-ments 4 24ndash49

Thomson J A (1983) Is continuous visual monitoring nec-essary in visually guided locomotion Journal of Experimen-tal Psychology Human Perception and Performance 9(3)427ndash443

Wann J P Rushton S amp Mon-Williams M (1995) Natu-ral problems for stereoscopic depth perception in virtualenvironments Vision Research 35(19) 2731ndash2736

Willemsen P amp Gooch A (2002) Perceived egocentric dis-tances in real image-based and traditional virtual environ-ments In Proceedings of IEEE Virtual Reality Conference89ndash90

Witmer B amp Sadowski W Jr (1998) Nonvisually guidedlocomotion to a previously viewed target in real and virtualenvironments Human Factors 40 478ndash488

Wraga M (1999) Using eye height in different postures toscale the heights of objects Journal of Experimental Psychol-ogy Human Perception and Performance 25(2) 518ndash530

Thompson et al 571

Page 6: Santa Barbara Department of Psychology, Santa Barbara ... et al Presence.… · Santa Barbara Santa Barbara, CA 93106 Andrew C. Beall beall@psych.ucsb.edu Department of Psychology,

The second virtual-environment condition involved acomputer graphics rendering of the same classroombuilding lobby The scale of the model was the same asthe actual building lobby but the geometric detail wasintentionally kept quite simple Stereotypical tiled tex-ture maps were used Simple point-source lighting wasused with no shadows or other global illumination ef-fects Targets were rendered as red disks with the sizeand position corresponding to what was used for the

real-world condition Rendering update rates were noless than 30 frames per second in each eye

The wireframe virtual environment condition wasconstructed by rendering feature edges of the modelused in the second virtual-environment condition Oursoftware used an OpenGL silhouette drawing algorithm(Raskar amp Cohen 1999) to generate the feature edgesThe frame rates for this environment were no less than40 frames per second The wireframe rendering pro-

Figure 2 Sample imagery for conditions 2 3 and 4 (a) Section of panorama image showing target (b) Example of low-quality computer

graphics image showing target The viewpoint is the same as for Figure 2a (c) Example of wireframe computer graphics image showing target

The viewpoint is the same as for Figure 2a

Thompson et al 565

duced scenes that resemble black-on-white sketches ofthe classroom building lobby The target was renderedwith feature edges as well with size and position thesame as for the previous conditions

For both the texture-mapped and wireframe com-puter graphics conditions eye heights were renderedbased on the subjectsrsquo actual eye heights Interpupillarydistances for stereo rendering were fixed at 65 cm con-sistent with the panorama images

33 Procedure

Subjects were first provided with written instruc-tions that described the triangulated walking task andthen given a demonstration of the task in a space bothsmaller and different from the actual experiment spatiallayout For all conditions both real and virtual subjectswere instructed to obtain a good image of the targetand their local surroundings while first facing the targetSubjects were told that a ldquogood imagerdquo is obtained ifafter closing their eyes they would still be able to ldquoseerdquothe environment and most importantly the targetSubjects were allowed to rotate their head about theirneck but were instructed not to move their head fromside to side or back and forth This was done to mini-mize motion-parallax cues in the real-world conditionso as to make it as comparable as possible to the virtual-world conditions

Once a good image was achieved subjects were in-structed to physically turn their bodies approximately70deg to the right to face a junction of two walls in theenvironment After subjects turned they were in-structed to turn their head back toward the target toobtain a final view and reaffirm their mental image ofthe environment Then subjects either blindfoldedthemselves (real-world condition) or closed their eyeswhile the HMD screen was cleared to black (virtual-world conditions) Subjects were then directed to walkpurposefully and decisively in the direction their bodywas facing After walking approximately 25 m an ex-perimenter would give the verbal command ldquoturnrdquo sig-naling the subject to turn toward the target and stopwalking when they felt they were facing the target Sub-jects were instructed to perform this turn as if they were

turning a corner in a real hallway to make the move-ment as natural as possible At this point the subjectrsquosposition was marked and they were directed to ldquoTaketwo steps in the direction of the targetrdquo Again the sub-jectrsquos position was marked and recorded The subjectwas then led without vision to the starting location byan experimenter In all conditions the apparent locationof the target was assumed to lie at the intersection ofthe line of sight to the (visible) target from the initialvantage point and a line corresponding to the subjectrsquostrajectory on the final walk toward the presumed targetlocation (Figure 1)

The userrsquos own body is seldom rendered in immersivevirtual environments This is a potential problem wheninvestigating absolute egocentric distance judgmentssince eye height is an important scaling factor that couldconceivably be affected by looking down at the userrsquosfeet and the floor on which she or he is standing Ren-dering avatar feet may not be sufficient since it is diffi-cult to achieve a high degree of realism We controlledfor this potential problem by having users wear a circu-lar collar in both the real-world and virtual-world condi-tions (Figure 3) The collar had the effect of occludingusersrsquo view of the floor out to about 2 m hiding thearea around their feet in all four tested conditions

Prior to the experiment trials subjects practiced blindwalking for five minutes During this practice subjectswalked blindfolded in a hallway and responded to verbalcommands to start and stop walking The training ishelpful in building trust between the experimenter andthe subject (Rieser 1999) but more importantly accus-

Figure 3 Viewing collar to hide viewerrsquos body and floor close to

standing position

566 PRESENCE VOLUME 13 NUMBER 5

toms the subject to walking blind During both thetraining session and the actual experiment subjectswore headphones fed by an external microphone to helplimit the effects of sound localization in the environ-ment A remote microphone worn by the experimenterallowed subjects to hear instructions After the trainingsession subjects were led still blindfolded either to ourlaboratory or to the real lobby This last step was per-formed to help ensure that the subjectrsquos movement dur-ing the experiment would not be inhibited by a prioriknowledge of the location of the walls in our lab Thesound-masking headphones remained on during thistime For the virtual-world conditions when subjectsarrived in the laboratory the HMD was placed on theirhead while their eyes remained closed Once on sub-jects were allowed to open their eyes and adjust the fitand focus of the HMD after which the orientation ofthe virtual world was aligned with the the natural rest-ing position of the HMD on the subject

4 Results

Figures 4ndash7 show the average judgments for eachof the four conditions real world high-quality pan-

orama images low-quality texture-mapped computergraphics and wireframe Error bars indicate one stan-dard error above and below the mean The intersectioncomputation used to compute apparent distance (Figure1) results in asymmetric variability around the meansince a turn of deg too far to the right produced an over-shoot in distance larger than the undershoot in distance

Figure 4 Distance judgments Real world Figure 5 Distance judgments Panorama images

Figure 6 Distance judgments Low-quality computer graphics

Thompson et al 567

produced by a turn of deg too far to the left An arctan-gent transform was applied to the data to reduce thiseffect Averages error estimates and measures of statis-tical significance were calculated in the transform spaceThe inverse transform was then applied to the calculatedaverages and errors in order to allow presentation of thefinal results in terms of judged distance

Figure 8 allows easy comparisons between results forall four conditions The experiment confirmed previousstudies showing that for standing observers viewingground-level targets in action-space range distancejudgments in the real world were near veridical (real-world) while distance judgments based on computergraphics were significantly compressed The surprisingresult was that the amount of compression was nearlythe same for all three graphical displays That is dis-tance judgments were almost unaffected by the qualityof the imagery presented to subjects

A 4 (environment) 3 (distance) 2 (sex) repeated-measures ANOVA with distance as a within-subjectvariable and environment and sex as between-subjectvariables was performed on the transformed averagedistance judgments and indicated a significant effect ofenvironment F(3 40) 1077 p 001 Collapsedacross distance Scheffe post hoc comparisons showed

that distance judgments in the real world were greaterthan those given in each of the other environments (p

01) and that performance in the other three environ-ments did not differ (p 48 for all comparisons) Al-though the means at 10 m or 15 m suggest differencesbetween the virtual conditions post hoc univariateANOVAs (with three environmental conditions) at eachdistance indicated that these differences were negligible(p 4 for the effect of environment) The ANOVAalso indicated an effect of distance F(2 80) 18384p 001 Judged distance increased as a function ofphysical distance for all environments In all the analy-ses demonstrated that perceived distance was signifi-cantly more accurate in the real world compared to thevirtual environments and that distance judgments in thevirtual environments did not vary much from eachother

5 Discussion

The results presented above are a strong indicatorthat compressed absolute egocentric distance judgmentsin visually immersive environments are not caused by alack of realistic graphics rendering The phenomenal

Figure 7 Distance judgments Wireframe graphics Figure 8 Distance judgments Comparison of all conditions

568 PRESENCE VOLUME 13 NUMBER 5

experience of realism in the panoramic environment isbest expressed by the comments of several subjectsWhen looking into a glass window in the rendered dis-play they commented ldquoWhy canrsquot I see my reflection inthe glassrdquo Despite this subjective experience judg-ments based on wireframe renderings were as good asjudgments based on actual images presented with thesame display system In all virtual environments therewas a large compression of egocentric distance As aresult absolute egocentric distance judgments in virtualenvironments are not likely to be aided by photorealisticimprovements in computer graphics such as better tex-turing and illumination From a theoretical standpointthis suggests that familiar size may be a relatively minorcontributor to the sort of distance judgments that wereinvestigated though it is important to note that all fourconditions involved hallway-like scaling and geometryThe similarity between judged distances to targets onthe floor in the three types of virtual displays is consis-tent with the hypothesis that the declination of visualangle to targets dominates distance egocentric percep-tion (Ooi Wu amp He 2001) However this does notexplain the large differences observed between distancejudgments in the real and virtual conditions

The present experiment used a methodology that in-volved a stationary viewer and an action-based judg-ment task to address specific questions about judgmentsof distance in visually immersive environments Our in-tent was to determine whether observers would judgeegocentric distance in the simulated environment in asimilar manner as in the real world without the experi-ence of active exploration Thus we restricted the ob-serverrsquos movement while viewing the environmentsPrevious visual-motor adaptation studies (Rieser PickAshmead amp Garing 1995 Pick Rieser Wagner ampGaring 1999) have demonstrated that active observerswill quickly adapt to a new mapping between visual in-put and their own movements leading to the result ofmodified motor output that corresponds to the visualworld (recalibration) We might predict that allowingactive exploration of the virtual environments wouldlead to a similar adaptation and recalibration effect sothat observers would learn to walk and turn an accuratedistance to virtual targets While this prediction ad-

dresses an important question it is a different questionthan the one presently asked in this paper Our goal wasto test whether egocentric distance judgments wouldreplicate the accurate performance demonstrated in thereal world not whether these judgments could becomeaccurate after interacting within a compressed percep-tion of the world Future studies should consider boththe extent of veridical perception in visually immersiveenvironments and the role of actions in making immer-sive environments useful despite a potential lack ofveridical perception

What might explain the compression of absolute ego-centric distance judgments if not image quality Wesuggest several possibilities but no solid evidence sup-porting any of the potential explanations has yet beenpublished While the realism of the panorama imagesused in this study far exceeded any of the computergraphics employed in distance-judgment experiments byother investigators resolution and apparent sharpnesswere still limited compared to natural viewing of thereal world This may have influenced a familiar-size ef-fect or may have degraded the sense of presence whilewearing the HMD Dixon Wraga Proffitt and Wil-liams (2000) found that visual immersion was neededfor eye height to appropriately scale linear perspectivecues Perhaps a full sense of presence not only visualimmersion is needed for distance judgments to be com-parable to what is seen in the real world Limited fieldof view is often suggested as a cause of distorted spatialvision in HMDs but Knapp and Loomis (in press)found that limiting FOV did not affect real-world ego-centric distance judgments at least if the observer wasfree to move his or her head to visually explore the envi-ronment Motion parallax was not present in our virtualdisplay conditions but motion parallax appears to be arather weak absolute-distance cue (Beall et al 1995)In addition subjects performed veridically in our real-word condition with at most very limited translationalhead motion Focus and stereo convergence are not wellcontrolled in HMDs (Rolland et al 1995 WannRushton amp Mon-Williams 1995) and incorrect accom-modation cues are known to affect distance judgments(Andersen Saidpour amp Braunstein 1998 BinghamBradley Bailey amp Vinner 2001) It seems unlikely

Thompson et al 569

however that accommodation and convergence wouldhave an effect this large at the distances we were investi-gating Finally there may be some sort of ergonomiceffect associated with wearing an HMD (Lackner amp Di-Zio 1989)

Future research that manipulates factors other thanthe image quality such as FOV stereo and physicaleffects of the HMD is needed to begin to answer thesequestions A sense of presence is more difficult to defineand manipulate but is likely to be an important compo-nent in accurate distance perception in virtual environ-ments

Acknowledgments

This material is based upon work supported by the NationalScience Foundation under grants 9623614 0080999 and0121084 Thanks to Alias Wavefront for their donation ofMaya Complete which was used in this project

References

Andersen, G. J., Saidpour, A., & Braunstein, M. L. (1998). Effects of collimation on perceived layout in 3-D scenes. Perception, 27, 1305–1315.

Beall, A. C., Loomis, J. M., Philbeck, J. M., & Fikes, T. J. (1995). Absolute motion parallax weakly determines visual scale in real and virtual environments. In Proceedings of the SPIE-The International Society for Optical Engineering, 2411, 288–297.

Bingham, G. P., Bradley, A., Bailey, M., & Vinner, R. (2001). Accomodation, occlusion, and disparity matching are used to guide reaching: A comparison of actual versus virtual environments. Journal of Experimental Psychology: Human Perception and Performance, 27, 1314–1334.

Cutting, J. E., & Vishton, P. M. (1995). Perceiving layout and knowing distance: The integration, relative potency, and contextual use of different information about depth. In W. Epstein & S. Rogers (Eds.), Perception of Space and Motion (pp. 69–117). New York: Academic.

Dixon, M. W., Wraga, M., Proffitt, D. R., & Williams, G. C. (2000). Eye height scaling of absolute size in immersive and nonimmersive displays. Journal of Experimental Psychology: Human Perception and Performance, 26(2), 582–593.

Durgin, F. H., Fox, L. F., Lewis, J., & Walley, K. A. (2002). Perceptuomotor adaptation: More than meets the eye. Paper presented at the forty-third annual meeting of the Psychonomic Society, Kansas City, MO.

Ellis, S. R., & Menges, B. M. (1997). Judgments of the distance to nearby virtual objects: Interaction of viewing conditions and accommodative demand. Presence: Teleoperators and Virtual Environments, 6, 452–462.

Foley, J. M., Ribeiro-Filho, N. P., & Da Silva, J. A. (2004). Visual perception of extent and the geometry of visual space. Vision Research, 44, 147–156.

Fukusima, S. S., Loomis, J. M., & Da Silva, J. A. (1997). Visual perception of egocentric distance as assessed by triangulation. Journal of Experimental Psychology: Human Perception and Performance, 23(1), 86–100.

Knapp, J. M. (1999). The visual perception of egocentric distance in virtual environments. Unpublished doctoral dissertation, University of California at Santa Barbara.

Knapp, J. M., & Loomis, J. M. (in press). Limited field of view of head-mounted displays is not the cause of distance underestimation in virtual environments. Presence: Teleoperators and Virtual Environments.

Lackner, J. R., & DiZio, P. (1989). Altered sensory-motor control of the head as an etiological factor in space-motion sickness. Perceptual and Motor Skills, 68, 784–786.

Lampton, D. R., McDonald, D. P., Singer, M., & Bliss, J. (1995). Distance estimation in virtual environments. In Proceedings of the Human Factors and Ergonomics Society 39th Annual Meeting, 1268–1272.

Loomis, J. M., Blascovich, J. J., & Beall, A. C. (1999). Immersive virtual environment technology as a basic research tool in psychology. Behavior Research Methods, Instruments, & Computers, 31(4), 557–564.

Loomis, J. M., Da Silva, J. A., Fujita, N., & Fukusima, S. S. (1992). Visual space perception and visually directed action. Journal of Experimental Psychology: Human Perception and Performance, 18, 906–921.

Loomis, J. M., & Knapp, J. M. (2003). Visual perception of egocentric distance in real and virtual environments. In L. Hettinger & M. Haas (Eds.), Virtual and adaptive environments (pp. 21–46). Hillsdale, NJ: Erlbaum.

Loomis, J. M., Philbeck, J. W., & Zahorik, P. (2002). Dissociation between location and shape in visual space. Journal of Experimental Psychology: Human Perception and Performance, 28, 1202–1212.

Ooi, T. L., Wu, B., & He, Z. J. (2001). Distance determination by the angular declination below the horizon. Nature, 414, 197–200.

Pick, H. L., Jr., Rieser, J. J., Wagner, D., & Garing, A. E. (1999). The recalibration of rotational locomotion. Journal of Experimental Psychology: Human Perception and Performance, 25(5), 1179–1188.

Raskar, R., & Cohen, M. (1999). Image precision silhouette edges. In Proceedings of the ACM Symposium on Interactive 3D Graphics, 135–140.

Rieser, J. J. (1999). Dynamic spatial orientation and the coupling of representation and action. In R. G. Golledge (Ed.), Wayfinding behavior: Cognitive mapping and other spatial processes (pp. 168–190). Baltimore, MD: Johns Hopkins University Press.

Rieser, J. J., Ashmead, D. H., Talor, C. R., & Youngquist, G. A. (1990). Visual perception and the guidance of locomotion without vision to previously seen targets. Perception, 19, 675–689.

Rieser, J. J., Pick, H. L., Jr., Ashmead, D., & Garing, A. (1995). Calibration of human locomotion and models of perceptual-motor organization. Journal of Experimental Psychology: Human Perception and Performance, 21, 480–497.

Rolland, J. P., Gibson, W., & Arierly, D. (1995). Towards quantifying depth and size perception as a function of viewing distance. Presence: Teleoperators and Virtual Environments, 4, 24–49.

Thomson, J. A. (1983). Is continuous visual monitoring necessary in visually guided locomotion? Journal of Experimental Psychology: Human Perception and Performance, 9(3), 427–443.

Wann, J. P., Rushton, S., & Mon-Williams, M. (1995). Natural problems for stereoscopic depth perception in virtual environments. Vision Research, 35(19), 2731–2736.

Willemsen, P., & Gooch, A. (2002). Perceived egocentric distances in real, image-based, and traditional virtual environments. In Proceedings of IEEE Virtual Reality Conference, 89–90.

Witmer, B., & Sadowski, W., Jr. (1998). Nonvisually guided locomotion to a previously viewed target in real and virtual environments. Human Factors, 40, 478–488.

Wraga, M. (1999). Using eye height in different postures to scale the heights of objects. Journal of Experimental Psychology: Human Perception and Performance, 25(2), 518–530.

dresses an important question it is a different questionthan the one presently asked in this paper Our goal wasto test whether egocentric distance judgments wouldreplicate the accurate performance demonstrated in thereal world not whether these judgments could becomeaccurate after interacting within a compressed percep-tion of the world Future studies should consider boththe extent of veridical perception in visually immersiveenvironments and the role of actions in making immer-sive environments useful despite a potential lack ofveridical perception

What might explain the compression of absolute ego-centric distance judgments if not image quality Wesuggest several possibilities but no solid evidence sup-porting any of the potential explanations has yet beenpublished While the realism of the panorama imagesused in this study far exceeded any of the computergraphics employed in distance-judgment experiments byother investigators resolution and apparent sharpnesswere still limited compared to natural viewing of thereal world This may have influenced a familiar-size ef-fect or may have degraded the sense of presence whilewearing the HMD Dixon Wraga Proffitt and Wil-liams (2000) found that visual immersion was neededfor eye height to appropriately scale linear perspectivecues Perhaps a full sense of presence not only visualimmersion is needed for distance judgments to be com-parable to what is seen in the real world Limited fieldof view is often suggested as a cause of distorted spatialvision in HMDs but Knapp and Loomis (in press)found that limiting FOV did not affect real-world ego-centric distance judgments at least if the observer wasfree to move his or her head to visually explore the envi-ronment Motion parallax was not present in our virtualdisplay conditions but motion parallax appears to be arather weak absolute-distance cue (Beall et al 1995)In addition subjects performed veridically in our real-word condition with at most very limited translationalhead motion Focus and stereo convergence are not wellcontrolled in HMDs (Rolland et al 1995 WannRushton amp Mon-Williams 1995) and incorrect accom-modation cues are known to affect distance judgments(Andersen Saidpour amp Braunstein 1998 BinghamBradley Bailey amp Vinner 2001) It seems unlikely

Thompson et al 569

however that accommodation and convergence wouldhave an effect this large at the distances we were investi-gating Finally there may be some sort of ergonomiceffect associated with wearing an HMD (Lackner amp Di-Zio 1989)

Future research that manipulates factors other thanthe image quality such as FOV stereo and physicaleffects of the HMD is needed to begin to answer thesequestions A sense of presence is more difficult to defineand manipulate but is likely to be an important compo-nent in accurate distance perception in virtual environ-ments

Acknowledgments

This material is based upon work supported by the NationalScience Foundation under grants 9623614 0080999 and0121084 Thanks to Alias Wavefront for their donation ofMaya Complete which was used in this project

References

Andersen G J Saidpour A amp Braunstein M L (1998)Effects of collimation on perceived layout in 3-D scenesPerception 27 1305ndash1315

Beall A C Loomis J M Philbeck J M amp Fikes T J(1995) Absolute motion parallax weakly determines visualscale in real and virtual environments In Proceedings of theSPIE-The International Society for Optical Engineering2411 288ndash297

Bingham G P Bradley A Bailey M amp Vinner R (2001)Accomodation occlusion and disparity matching are usedto guide reaching A comparison of actual versus virtual en-vironments Journal of Experimental Psychology HumanPerception and Performance 27 1314ndash1334

Cutting J E amp Vishton P M (1995) Perceiving layoutand knowing distance The integration relative potency andcontextual use of different information about depth In WEpstein amp S Rogers (Eds) Perception of Space and Motion(pp 69ndash117) New York Academic

Dixon M W Wraga M Proffitt D R amp Williams G C(2000) Eye height scaling of absolute size in immersive and

nonimmersive displays Journal of Experimental PsychologyHuman Perception and Performance 26(2) 582ndash593

Durgin F H Fox L F Lewis J amp Walley K A (2002)Perceptuomotor adaptation More than meets the eye Pa-per presented at the forty-third annual meeting of the Psy-chonomic Society Kansas City MO

Ellis S R amp Menges B M (1997) Judgments of the dis-tance to nearby virtual objects Interaction of viewing con-ditions and accommodative demand Presence Teleoperatorsand Virtual Environments 6 452ndash462

Foley J M Ribeiro-Filho N P amp Da Silva J A (2004)Visual perception of extent and the geometry of visualspace Vision Research 44 147ndash156

Fukusima S S Loomis J M amp Da Silva J A (1997) Vi-sual perception of egocentric distance as assessed by trian-gulation Journal of Experimental Psychology Human Per-ception and Performance 23(1) 86ndash100

Knapp J M (1999) The visual perception of egocentric dis-tance in virtual environments Unpublished doctoral disser-tation University of California at Santa Barbara

Knapp J M amp Loomis J M (in press) Limited field ofview of head-mounted displays is not the cause of distanceunderestimation in virtual environments Presence Teleop-erators and Virtual Environments

Lackner J R amp DiZio P (1989) Altered sensory-motorcontrol of the head as an etiological factor in space-motionsickness Perceptual and Motor Skills 68 784ndash786

Lampton D R McDonald D P Singer M amp Bliss J(1995) Distance estimation in virtual environments In Pro-ceedings of the Human Factors and Ergonomics Society 39thAnnual Meeting 1268ndash1272

Loomis J M Blascovich J J amp Beall A C (1999) Im-mersive virtual environment technology as a basic researchtool in psychology Behavior Research Methods Instrumentsand Computers 31(4) 557ndash564

Loomis J M Da Silva J A Fujita N amp Fukusima S S(1992) Visual space perception and visually directed actionJournal of Experimental Psychology Human Perception andPerformance 18 906ndash921

Loomis J M amp Knapp J M (2003) Visual perception ofegocentric distance in real and virtual environments In LHettinger amp M Haas (Eds) Virtual and adaptive environ-ments (pp 21ndash 46) Hillsdale NJ Erlbaum

Loomis J M Philbeck J W amp Zahorik P (2002) Disso-ciation between location and shape in visual space Journal

570 PRESENCE VOLUME 13 NUMBER 5

of Experimental Psychology Human Perception and Perfor-mance 28 1202ndash1212

Ooi T L Wu B amp He Z J (2001) Distance determina-tion by the angular declination below the horizon Nature414 197ndash200

Pick H L Jr Rieser J J Wagner D amp Garing A E(1999) The recalibration of rotational locomotion Journalof Experimental Psychology Human Perception and Perfor-mance 25(5) 1179ndash1188

Raskar R amp Cohen M (1999) Image precision silhouetteedges In Proceedings of the ACM Symposium on Interactive3D Graphics 135ndash140

Rieser J J (1999) Dynamic spatial orientation and the cou-pling of representation and action In R G Golledge (Ed)Wayfinding behavior Cognitive mapping and other spatialprocesses (pp 168ndash190) Baltimore MD Johns HopkinsUniversity Press

Rieser J J Ashmead D H Talor C R amp YoungquistG A (1990) Visual perception and the guidance of loco-motion without vision to previously seen targets Perception19 675ndash689

Rieser J J Pick H L Jr Ashmead D amp Garing A(1995) Calibration of human locomotion and models ofperceptual-motor organization Journal of Experimental

Psychology Human Perception and Performance 21 480ndash497

Rolland J P Gibson W amp Arierly D (1995) Towardsquantifying depth and size perception as a function of view-ing distance Presence Teleoperators and Virtual Environ-ments 4 24ndash49

Thomson J A (1983) Is continuous visual monitoring nec-essary in visually guided locomotion Journal of Experimen-tal Psychology Human Perception and Performance 9(3)427ndash443

Wann J P Rushton S amp Mon-Williams M (1995) Natu-ral problems for stereoscopic depth perception in virtualenvironments Vision Research 35(19) 2731ndash2736

Willemsen P amp Gooch A (2002) Perceived egocentric dis-tances in real image-based and traditional virtual environ-ments In Proceedings of IEEE Virtual Reality Conference89ndash90

Witmer B amp Sadowski W Jr (1998) Nonvisually guidedlocomotion to a previously viewed target in real and virtualenvironments Human Factors 40 478ndash488

Wraga M (1999) Using eye height in different postures toscale the heights of objects Journal of Experimental Psychol-ogy Human Perception and Performance 25(2) 518ndash530

Thompson et al 571

Page 8: Santa Barbara Department of Psychology, Santa Barbara ... et al Presence.… · Santa Barbara Santa Barbara, CA 93106 Andrew C. Beall beall@psych.ucsb.edu Department of Psychology,

toms the subject to walking blind During both thetraining session and the actual experiment subjectswore headphones fed by an external microphone to helplimit the effects of sound localization in the environ-ment A remote microphone worn by the experimenterallowed subjects to hear instructions After the trainingsession subjects were led still blindfolded either to ourlaboratory or to the real lobby This last step was per-formed to help ensure that the subjectrsquos movement dur-ing the experiment would not be inhibited by a prioriknowledge of the location of the walls in our lab Thesound-masking headphones remained on during thistime For the virtual-world conditions when subjectsarrived in the laboratory the HMD was placed on theirhead while their eyes remained closed Once on sub-jects were allowed to open their eyes and adjust the fitand focus of the HMD after which the orientation ofthe virtual world was aligned with the the natural rest-ing position of the HMD on the subject

4 Results

Figures 4ndash7 show the average judgments for eachof the four conditions real world high-quality pan-

orama images low-quality texture-mapped computergraphics and wireframe Error bars indicate one stan-dard error above and below the mean The intersectioncomputation used to compute apparent distance (Figure1) results in asymmetric variability around the meansince a turn of deg too far to the right produced an over-shoot in distance larger than the undershoot in distance

Figure 4 Distance judgments Real world Figure 5 Distance judgments Panorama images

Figure 6 Distance judgments Low-quality computer graphics

Thompson et al 567

produced by a turn of deg too far to the left An arctan-gent transform was applied to the data to reduce thiseffect Averages error estimates and measures of statis-tical significance were calculated in the transform spaceThe inverse transform was then applied to the calculatedaverages and errors in order to allow presentation of thefinal results in terms of judged distance

Figure 8 allows easy comparisons between results forall four conditions The experiment confirmed previousstudies showing that for standing observers viewingground-level targets in action-space range distancejudgments in the real world were near veridical (real-world) while distance judgments based on computergraphics were significantly compressed The surprisingresult was that the amount of compression was nearlythe same for all three graphical displays That is dis-tance judgments were almost unaffected by the qualityof the imagery presented to subjects

A 4 (environment) 3 (distance) 2 (sex) repeated-measures ANOVA with distance as a within-subjectvariable and environment and sex as between-subjectvariables was performed on the transformed averagedistance judgments and indicated a significant effect ofenvironment F(3 40) 1077 p 001 Collapsedacross distance Scheffe post hoc comparisons showed

that distance judgments in the real world were greaterthan those given in each of the other environments (p

01) and that performance in the other three environ-ments did not differ (p 48 for all comparisons) Al-though the means at 10 m or 15 m suggest differencesbetween the virtual conditions post hoc univariateANOVAs (with three environmental conditions) at eachdistance indicated that these differences were negligible(p 4 for the effect of environment) The ANOVAalso indicated an effect of distance F(2 80) 18384p 001 Judged distance increased as a function ofphysical distance for all environments In all the analy-ses demonstrated that perceived distance was signifi-cantly more accurate in the real world compared to thevirtual environments and that distance judgments in thevirtual environments did not vary much from eachother

5 Discussion

The results presented above are a strong indicatorthat compressed absolute egocentric distance judgmentsin visually immersive environments are not caused by alack of realistic graphics rendering The phenomenal

Figure 7 Distance judgments Wireframe graphics Figure 8 Distance judgments Comparison of all conditions

568 PRESENCE VOLUME 13 NUMBER 5

experience of realism in the panoramic environment isbest expressed by the comments of several subjectsWhen looking into a glass window in the rendered dis-play they commented ldquoWhy canrsquot I see my reflection inthe glassrdquo Despite this subjective experience judg-ments based on wireframe renderings were as good asjudgments based on actual images presented with thesame display system In all virtual environments therewas a large compression of egocentric distance As aresult absolute egocentric distance judgments in virtualenvironments are not likely to be aided by photorealisticimprovements in computer graphics such as better tex-turing and illumination From a theoretical standpointthis suggests that familiar size may be a relatively minorcontributor to the sort of distance judgments that wereinvestigated though it is important to note that all fourconditions involved hallway-like scaling and geometryThe similarity between judged distances to targets onthe floor in the three types of virtual displays is consis-tent with the hypothesis that the declination of visualangle to targets dominates distance egocentric percep-tion (Ooi Wu amp He 2001) However this does notexplain the large differences observed between distancejudgments in the real and virtual conditions

The present experiment used a methodology that in-volved a stationary viewer and an action-based judg-ment task to address specific questions about judgmentsof distance in visually immersive environments Our in-tent was to determine whether observers would judgeegocentric distance in the simulated environment in asimilar manner as in the real world without the experi-ence of active exploration Thus we restricted the ob-serverrsquos movement while viewing the environmentsPrevious visual-motor adaptation studies (Rieser PickAshmead amp Garing 1995 Pick Rieser Wagner ampGaring 1999) have demonstrated that active observerswill quickly adapt to a new mapping between visual in-put and their own movements leading to the result ofmodified motor output that corresponds to the visualworld (recalibration) We might predict that allowingactive exploration of the virtual environments wouldlead to a similar adaptation and recalibration effect sothat observers would learn to walk and turn an accuratedistance to virtual targets While this prediction ad-

dresses an important question it is a different questionthan the one presently asked in this paper Our goal wasto test whether egocentric distance judgments wouldreplicate the accurate performance demonstrated in thereal world not whether these judgments could becomeaccurate after interacting within a compressed percep-tion of the world Future studies should consider boththe extent of veridical perception in visually immersiveenvironments and the role of actions in making immer-sive environments useful despite a potential lack ofveridical perception

What might explain the compression of absolute ego-centric distance judgments if not image quality Wesuggest several possibilities but no solid evidence sup-porting any of the potential explanations has yet beenpublished While the realism of the panorama imagesused in this study far exceeded any of the computergraphics employed in distance-judgment experiments byother investigators resolution and apparent sharpnesswere still limited compared to natural viewing of thereal world This may have influenced a familiar-size ef-fect or may have degraded the sense of presence whilewearing the HMD Dixon Wraga Proffitt and Wil-liams (2000) found that visual immersion was neededfor eye height to appropriately scale linear perspectivecues Perhaps a full sense of presence not only visualimmersion is needed for distance judgments to be com-parable to what is seen in the real world Limited fieldof view is often suggested as a cause of distorted spatialvision in HMDs but Knapp and Loomis (in press)found that limiting FOV did not affect real-world ego-centric distance judgments at least if the observer wasfree to move his or her head to visually explore the envi-ronment Motion parallax was not present in our virtualdisplay conditions but motion parallax appears to be arather weak absolute-distance cue (Beall et al 1995)In addition subjects performed veridically in our real-word condition with at most very limited translationalhead motion Focus and stereo convergence are not wellcontrolled in HMDs (Rolland et al 1995 WannRushton amp Mon-Williams 1995) and incorrect accom-modation cues are known to affect distance judgments(Andersen Saidpour amp Braunstein 1998 BinghamBradley Bailey amp Vinner 2001) It seems unlikely

Thompson et al 569

however that accommodation and convergence wouldhave an effect this large at the distances we were investi-gating Finally there may be some sort of ergonomiceffect associated with wearing an HMD (Lackner amp Di-Zio 1989)

Future research that manipulates factors other thanthe image quality such as FOV stereo and physicaleffects of the HMD is needed to begin to answer thesequestions A sense of presence is more difficult to defineand manipulate but is likely to be an important compo-nent in accurate distance perception in virtual environ-ments

Acknowledgments

This material is based upon work supported by the NationalScience Foundation under grants 9623614 0080999 and0121084 Thanks to Alias Wavefront for their donation ofMaya Complete which was used in this project

References

Andersen G J Saidpour A amp Braunstein M L (1998)Effects of collimation on perceived layout in 3-D scenesPerception 27 1305ndash1315

Beall A C Loomis J M Philbeck J M amp Fikes T J(1995) Absolute motion parallax weakly determines visualscale in real and virtual environments In Proceedings of theSPIE-The International Society for Optical Engineering2411 288ndash297

Bingham G P Bradley A Bailey M amp Vinner R (2001)Accomodation occlusion and disparity matching are usedto guide reaching A comparison of actual versus virtual en-vironments Journal of Experimental Psychology HumanPerception and Performance 27 1314ndash1334

Cutting J E amp Vishton P M (1995) Perceiving layoutand knowing distance The integration relative potency andcontextual use of different information about depth In WEpstein amp S Rogers (Eds) Perception of Space and Motion(pp 69ndash117) New York Academic

Dixon M W Wraga M Proffitt D R amp Williams G C(2000) Eye height scaling of absolute size in immersive and

nonimmersive displays Journal of Experimental PsychologyHuman Perception and Performance 26(2) 582ndash593

Durgin F H Fox L F Lewis J amp Walley K A (2002)Perceptuomotor adaptation More than meets the eye Pa-per presented at the forty-third annual meeting of the Psy-chonomic Society Kansas City MO

Ellis S R amp Menges B M (1997) Judgments of the dis-tance to nearby virtual objects Interaction of viewing con-ditions and accommodative demand Presence Teleoperatorsand Virtual Environments 6 452ndash462

Foley J M Ribeiro-Filho N P amp Da Silva J A (2004)Visual perception of extent and the geometry of visualspace Vision Research 44 147ndash156

Fukusima S S Loomis J M amp Da Silva J A (1997) Vi-sual perception of egocentric distance as assessed by trian-gulation Journal of Experimental Psychology Human Per-ception and Performance 23(1) 86ndash100

Knapp J M (1999) The visual perception of egocentric dis-tance in virtual environments Unpublished doctoral disser-tation University of California at Santa Barbara

Knapp J M amp Loomis J M (in press) Limited field ofview of head-mounted displays is not the cause of distanceunderestimation in virtual environments Presence Teleop-erators and Virtual Environments

Lackner J R amp DiZio P (1989) Altered sensory-motorcontrol of the head as an etiological factor in space-motionsickness Perceptual and Motor Skills 68 784ndash786

Lampton D R McDonald D P Singer M amp Bliss J(1995) Distance estimation in virtual environments In Pro-ceedings of the Human Factors and Ergonomics Society 39thAnnual Meeting 1268ndash1272

Loomis J M Blascovich J J amp Beall A C (1999) Im-mersive virtual environment technology as a basic researchtool in psychology Behavior Research Methods Instrumentsand Computers 31(4) 557ndash564

Loomis J M Da Silva J A Fujita N amp Fukusima S S(1992) Visual space perception and visually directed actionJournal of Experimental Psychology Human Perception andPerformance 18 906ndash921

Loomis J M amp Knapp J M (2003) Visual perception ofegocentric distance in real and virtual environments In LHettinger amp M Haas (Eds) Virtual and adaptive environ-ments (pp 21ndash 46) Hillsdale NJ Erlbaum

Loomis J M Philbeck J W amp Zahorik P (2002) Disso-ciation between location and shape in visual space Journal

570 PRESENCE VOLUME 13 NUMBER 5

of Experimental Psychology Human Perception and Perfor-mance 28 1202ndash1212

Ooi T L Wu B amp He Z J (2001) Distance determina-tion by the angular declination below the horizon Nature414 197ndash200

Pick H L Jr Rieser J J Wagner D amp Garing A E(1999) The recalibration of rotational locomotion Journalof Experimental Psychology Human Perception and Perfor-mance 25(5) 1179ndash1188

Raskar R amp Cohen M (1999) Image precision silhouetteedges In Proceedings of the ACM Symposium on Interactive3D Graphics 135ndash140

Rieser J J (1999) Dynamic spatial orientation and the cou-pling of representation and action In R G Golledge (Ed)Wayfinding behavior Cognitive mapping and other spatialprocesses (pp 168ndash190) Baltimore MD Johns HopkinsUniversity Press

Rieser J J Ashmead D H Talor C R amp YoungquistG A (1990) Visual perception and the guidance of loco-motion without vision to previously seen targets Perception19 675ndash689

Rieser J J Pick H L Jr Ashmead D amp Garing A(1995) Calibration of human locomotion and models ofperceptual-motor organization Journal of Experimental

Psychology Human Perception and Performance 21 480ndash497

Rolland J P Gibson W amp Arierly D (1995) Towardsquantifying depth and size perception as a function of view-ing distance Presence Teleoperators and Virtual Environ-ments 4 24ndash49

Thomson J A (1983) Is continuous visual monitoring nec-essary in visually guided locomotion Journal of Experimen-tal Psychology Human Perception and Performance 9(3)427ndash443

Wann J P Rushton S amp Mon-Williams M (1995) Natu-ral problems for stereoscopic depth perception in virtualenvironments Vision Research 35(19) 2731ndash2736

Willemsen P amp Gooch A (2002) Perceived egocentric dis-tances in real image-based and traditional virtual environ-ments In Proceedings of IEEE Virtual Reality Conference89ndash90

Witmer B amp Sadowski W Jr (1998) Nonvisually guidedlocomotion to a previously viewed target in real and virtualenvironments Human Factors 40 478ndash488

Wraga M (1999) Using eye height in different postures toscale the heights of objects Journal of Experimental Psychol-ogy Human Perception and Performance 25(2) 518ndash530

Thompson et al 571

Page 9: Santa Barbara Department of Psychology, Santa Barbara ... et al Presence.… · Santa Barbara Santa Barbara, CA 93106 Andrew C. Beall beall@psych.ucsb.edu Department of Psychology,

produced by a turn of deg too far to the left An arctan-gent transform was applied to the data to reduce thiseffect Averages error estimates and measures of statis-tical significance were calculated in the transform spaceThe inverse transform was then applied to the calculatedaverages and errors in order to allow presentation of thefinal results in terms of judged distance

Figure 8 allows easy comparisons between results forall four conditions The experiment confirmed previousstudies showing that for standing observers viewingground-level targets in action-space range distancejudgments in the real world were near veridical (real-world) while distance judgments based on computergraphics were significantly compressed The surprisingresult was that the amount of compression was nearlythe same for all three graphical displays That is dis-tance judgments were almost unaffected by the qualityof the imagery presented to subjects

A 4 (environment) 3 (distance) 2 (sex) repeated-measures ANOVA with distance as a within-subjectvariable and environment and sex as between-subjectvariables was performed on the transformed averagedistance judgments and indicated a significant effect ofenvironment F(3 40) 1077 p 001 Collapsedacross distance Scheffe post hoc comparisons showed

that distance judgments in the real world were greaterthan those given in each of the other environments (p

01) and that performance in the other three environ-ments did not differ (p 48 for all comparisons) Al-though the means at 10 m or 15 m suggest differencesbetween the virtual conditions post hoc univariateANOVAs (with three environmental conditions) at eachdistance indicated that these differences were negligible(p 4 for the effect of environment) The ANOVAalso indicated an effect of distance F(2 80) 18384p 001 Judged distance increased as a function ofphysical distance for all environments In all the analy-ses demonstrated that perceived distance was signifi-cantly more accurate in the real world compared to thevirtual environments and that distance judgments in thevirtual environments did not vary much from eachother

5 Discussion

The results presented above are a strong indicatorthat compressed absolute egocentric distance judgmentsin visually immersive environments are not caused by alack of realistic graphics rendering The phenomenal

Figure 7 Distance judgments Wireframe graphics Figure 8 Distance judgments Comparison of all conditions

568 PRESENCE VOLUME 13 NUMBER 5

experience of realism in the panoramic environment isbest expressed by the comments of several subjectsWhen looking into a glass window in the rendered dis-play they commented ldquoWhy canrsquot I see my reflection inthe glassrdquo Despite this subjective experience judg-ments based on wireframe renderings were as good asjudgments based on actual images presented with thesame display system In all virtual environments therewas a large compression of egocentric distance As aresult absolute egocentric distance judgments in virtualenvironments are not likely to be aided by photorealisticimprovements in computer graphics such as better tex-turing and illumination From a theoretical standpointthis suggests that familiar size may be a relatively minorcontributor to the sort of distance judgments that wereinvestigated though it is important to note that all fourconditions involved hallway-like scaling and geometryThe similarity between judged distances to targets onthe floor in the three types of virtual displays is consis-tent with the hypothesis that the declination of visualangle to targets dominates distance egocentric percep-tion (Ooi Wu amp He 2001) However this does notexplain the large differences observed between distancejudgments in the real and virtual conditions

The present experiment used a methodology that in-volved a stationary viewer and an action-based judg-ment task to address specific questions about judgmentsof distance in visually immersive environments Our in-tent was to determine whether observers would judgeegocentric distance in the simulated environment in asimilar manner as in the real world without the experi-ence of active exploration Thus we restricted the ob-serverrsquos movement while viewing the environmentsPrevious visual-motor adaptation studies (Rieser PickAshmead amp Garing 1995 Pick Rieser Wagner ampGaring 1999) have demonstrated that active observerswill quickly adapt to a new mapping between visual in-put and their own movements leading to the result ofmodified motor output that corresponds to the visualworld (recalibration) We might predict that allowingactive exploration of the virtual environments wouldlead to a similar adaptation and recalibration effect sothat observers would learn to walk and turn an accuratedistance to virtual targets While this prediction ad-

dresses an important question it is a different questionthan the one presently asked in this paper Our goal wasto test whether egocentric distance judgments wouldreplicate the accurate performance demonstrated in thereal world not whether these judgments could becomeaccurate after interacting within a compressed percep-tion of the world Future studies should consider boththe extent of veridical perception in visually immersiveenvironments and the role of actions in making immer-sive environments useful despite a potential lack ofveridical perception

What might explain the compression of absolute ego-centric distance judgments if not image quality Wesuggest several possibilities but no solid evidence sup-porting any of the potential explanations has yet beenpublished While the realism of the panorama imagesused in this study far exceeded any of the computergraphics employed in distance-judgment experiments byother investigators resolution and apparent sharpnesswere still limited compared to natural viewing of thereal world This may have influenced a familiar-size ef-fect or may have degraded the sense of presence whilewearing the HMD Dixon Wraga Proffitt and Wil-liams (2000) found that visual immersion was neededfor eye height to appropriately scale linear perspectivecues Perhaps a full sense of presence not only visualimmersion is needed for distance judgments to be com-parable to what is seen in the real world Limited fieldof view is often suggested as a cause of distorted spatialvision in HMDs but Knapp and Loomis (in press)found that limiting FOV did not affect real-world ego-centric distance judgments at least if the observer wasfree to move his or her head to visually explore the envi-ronment Motion parallax was not present in our virtualdisplay conditions but motion parallax appears to be arather weak absolute-distance cue (Beall et al 1995)In addition subjects performed veridically in our real-word condition with at most very limited translationalhead motion Focus and stereo convergence are not wellcontrolled in HMDs (Rolland et al 1995 WannRushton amp Mon-Williams 1995) and incorrect accom-modation cues are known to affect distance judgments(Andersen Saidpour amp Braunstein 1998 BinghamBradley Bailey amp Vinner 2001) It seems unlikely

Thompson et al 569

however that accommodation and convergence wouldhave an effect this large at the distances we were investi-gating Finally there may be some sort of ergonomiceffect associated with wearing an HMD (Lackner amp Di-Zio 1989)

Future research that manipulates factors other thanthe image quality such as FOV stereo and physicaleffects of the HMD is needed to begin to answer thesequestions A sense of presence is more difficult to defineand manipulate but is likely to be an important compo-nent in accurate distance perception in virtual environ-ments

Acknowledgments

This material is based upon work supported by the NationalScience Foundation under grants 9623614 0080999 and0121084 Thanks to Alias Wavefront for their donation ofMaya Complete which was used in this project

References

Andersen G J Saidpour A amp Braunstein M L (1998)Effects of collimation on perceived layout in 3-D scenesPerception 27 1305ndash1315

Beall A C Loomis J M Philbeck J M amp Fikes T J(1995) Absolute motion parallax weakly determines visualscale in real and virtual environments In Proceedings of theSPIE-The International Society for Optical Engineering2411 288ndash297

Bingham G P Bradley A Bailey M amp Vinner R (2001)Accomodation occlusion and disparity matching are usedto guide reaching A comparison of actual versus virtual en-vironments Journal of Experimental Psychology HumanPerception and Performance 27 1314ndash1334

Cutting J E amp Vishton P M (1995) Perceiving layoutand knowing distance The integration relative potency andcontextual use of different information about depth In WEpstein amp S Rogers (Eds) Perception of Space and Motion(pp 69ndash117) New York Academic

Dixon M W Wraga M Proffitt D R amp Williams G C(2000) Eye height scaling of absolute size in immersive and

nonimmersive displays Journal of Experimental PsychologyHuman Perception and Performance 26(2) 582ndash593

Durgin F H Fox L F Lewis J amp Walley K A (2002)Perceptuomotor adaptation More than meets the eye Pa-per presented at the forty-third annual meeting of the Psy-chonomic Society Kansas City MO

Ellis S R amp Menges B M (1997) Judgments of the dis-tance to nearby virtual objects Interaction of viewing con-ditions and accommodative demand Presence Teleoperatorsand Virtual Environments 6 452ndash462

Foley J M Ribeiro-Filho N P amp Da Silva J A (2004)Visual perception of extent and the geometry of visualspace Vision Research 44 147ndash156

Fukusima S S Loomis J M amp Da Silva J A (1997) Vi-sual perception of egocentric distance as assessed by trian-gulation Journal of Experimental Psychology Human Per-ception and Performance 23(1) 86ndash100

Knapp J M (1999) The visual perception of egocentric dis-tance in virtual environments Unpublished doctoral disser-tation University of California at Santa Barbara

Knapp J M amp Loomis J M (in press) Limited field ofview of head-mounted displays is not the cause of distanceunderestimation in virtual environments Presence Teleop-erators and Virtual Environments

Lackner J R amp DiZio P (1989) Altered sensory-motorcontrol of the head as an etiological factor in space-motionsickness Perceptual and Motor Skills 68 784ndash786

Lampton D R McDonald D P Singer M amp Bliss J(1995) Distance estimation in virtual environments In Pro-ceedings of the Human Factors and Ergonomics Society 39thAnnual Meeting 1268ndash1272

Loomis J M Blascovich J J amp Beall A C (1999) Im-mersive virtual environment technology as a basic researchtool in psychology Behavior Research Methods Instrumentsand Computers 31(4) 557ndash564

Loomis J M Da Silva J A Fujita N amp Fukusima S S(1992) Visual space perception and visually directed actionJournal of Experimental Psychology Human Perception andPerformance 18 906ndash921

Loomis J M amp Knapp J M (2003) Visual perception ofegocentric distance in real and virtual environments In LHettinger amp M Haas (Eds) Virtual and adaptive environ-ments (pp 21ndash 46) Hillsdale NJ Erlbaum

Loomis J M Philbeck J W amp Zahorik P (2002) Disso-ciation between location and shape in visual space Journal

570 PRESENCE VOLUME 13 NUMBER 5

of Experimental Psychology Human Perception and Perfor-mance 28 1202ndash1212

Ooi T L Wu B amp He Z J (2001) Distance determina-tion by the angular declination below the horizon Nature414 197ndash200

Pick H L Jr Rieser J J Wagner D amp Garing A E(1999) The recalibration of rotational locomotion Journalof Experimental Psychology Human Perception and Perfor-mance 25(5) 1179ndash1188

Raskar R amp Cohen M (1999) Image precision silhouetteedges In Proceedings of the ACM Symposium on Interactive3D Graphics 135ndash140

Rieser J J (1999) Dynamic spatial orientation and the cou-pling of representation and action In R G Golledge (Ed)Wayfinding behavior Cognitive mapping and other spatialprocesses (pp 168ndash190) Baltimore MD Johns HopkinsUniversity Press

Rieser J J Ashmead D H Talor C R amp YoungquistG A (1990) Visual perception and the guidance of loco-motion without vision to previously seen targets Perception19 675ndash689

Rieser J J Pick H L Jr Ashmead D amp Garing A(1995) Calibration of human locomotion and models ofperceptual-motor organization Journal of Experimental

Psychology Human Perception and Performance 21 480ndash497

Rolland J P Gibson W amp Arierly D (1995) Towardsquantifying depth and size perception as a function of view-ing distance Presence Teleoperators and Virtual Environ-ments 4 24ndash49

Thomson J A (1983) Is continuous visual monitoring nec-essary in visually guided locomotion Journal of Experimen-tal Psychology Human Perception and Performance 9(3)427ndash443

Wann J P Rushton S amp Mon-Williams M (1995) Natu-ral problems for stereoscopic depth perception in virtualenvironments Vision Research 35(19) 2731ndash2736

Willemsen P amp Gooch A (2002) Perceived egocentric dis-tances in real image-based and traditional virtual environ-ments In Proceedings of IEEE Virtual Reality Conference89ndash90

Witmer B amp Sadowski W Jr (1998) Nonvisually guidedlocomotion to a previously viewed target in real and virtualenvironments Human Factors 40 478ndash488

Wraga M (1999) Using eye height in different postures toscale the heights of objects Journal of Experimental Psychol-ogy Human Perception and Performance 25(2) 518ndash530

Thompson et al 571

Page 10: Santa Barbara Department of Psychology, Santa Barbara ... et al Presence.… · Santa Barbara Santa Barbara, CA 93106 Andrew C. Beall beall@psych.ucsb.edu Department of Psychology,

experience of realism in the panoramic environment isbest expressed by the comments of several subjectsWhen looking into a glass window in the rendered dis-play they commented ldquoWhy canrsquot I see my reflection inthe glassrdquo Despite this subjective experience judg-ments based on wireframe renderings were as good asjudgments based on actual images presented with thesame display system In all virtual environments therewas a large compression of egocentric distance As aresult absolute egocentric distance judgments in virtualenvironments are not likely to be aided by photorealisticimprovements in computer graphics such as better tex-turing and illumination From a theoretical standpointthis suggests that familiar size may be a relatively minorcontributor to the sort of distance judgments that wereinvestigated though it is important to note that all fourconditions involved hallway-like scaling and geometryThe similarity between judged distances to targets onthe floor in the three types of virtual displays is consis-tent with the hypothesis that the declination of visualangle to targets dominates distance egocentric percep-tion (Ooi Wu amp He 2001) However this does notexplain the large differences observed between distancejudgments in the real and virtual conditions

The present experiment used a methodology that in-volved a stationary viewer and an action-based judg-ment task to address specific questions about judgmentsof distance in visually immersive environments Our in-tent was to determine whether observers would judgeegocentric distance in the simulated environment in asimilar manner as in the real world without the experi-ence of active exploration Thus we restricted the ob-serverrsquos movement while viewing the environmentsPrevious visual-motor adaptation studies (Rieser PickAshmead amp Garing 1995 Pick Rieser Wagner ampGaring 1999) have demonstrated that active observerswill quickly adapt to a new mapping between visual in-put and their own movements leading to the result ofmodified motor output that corresponds to the visualworld (recalibration) We might predict that allowingactive exploration of the virtual environments wouldlead to a similar adaptation and recalibration effect sothat observers would learn to walk and turn an accuratedistance to virtual targets While this prediction ad-

dresses an important question it is a different questionthan the one presently asked in this paper Our goal wasto test whether egocentric distance judgments wouldreplicate the accurate performance demonstrated in thereal world not whether these judgments could becomeaccurate after interacting within a compressed percep-tion of the world Future studies should consider boththe extent of veridical perception in visually immersiveenvironments and the role of actions in making immer-sive environments useful despite a potential lack ofveridical perception

What might explain the compression of absolute egocentric distance judgments, if not image quality? We suggest several possibilities, but no solid evidence supporting any of these potential explanations has yet been published. While the realism of the panoramic images used in this study far exceeded any of the computer graphics employed in distance-judgment experiments by other investigators, resolution and apparent sharpness were still limited compared to natural viewing of the real world. This may have influenced a familiar-size effect, or may have degraded the sense of presence while wearing the HMD. Dixon, Wraga, Proffitt, and Williams (2000) found that visual immersion was needed for eye height to appropriately scale linear perspective cues. Perhaps a full sense of presence, not only visual immersion, is needed for distance judgments to be comparable to those made in the real world. Limited field of view is often suggested as a cause of distorted spatial vision in HMDs, but Knapp and Loomis (in press) found that limiting FOV did not affect real-world egocentric distance judgments, at least if the observer was free to move his or her head to visually explore the environment. Motion parallax was not present in our virtual display conditions, but motion parallax appears to be a rather weak absolute-distance cue (Beall et al., 1995). In addition, subjects performed veridically in our real-world condition with at most very limited translational head motion. Focus and stereo convergence are not well controlled in HMDs (Rolland et al., 1995; Wann, Rushton, & Mon-Williams, 1995), and incorrect accommodation cues are known to affect distance judgments (Andersen, Saidpour, & Braunstein, 1998; Bingham, Bradley, Bailey, & Vinner, 2001). It seems unlikely, however, that accommodation and convergence would have an effect this large at the distances we were investigating. Finally, there may be some sort of ergonomic effect associated with wearing an HMD (Lackner & DiZio, 1989).
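Returning to the accommodation and convergence point, one way to see why a large effect seems implausible at the distances in question (a back-of-the-envelope calculation added here; the interocular separation and example distances are illustrative values, not measurements from this study) is to note how flat the vergence demand becomes beyond a few meters. For interocular separation $i$ and target distance $d$, the vergence angle is approximately

$$\theta = 2\arctan\!\left(\frac{i}{2d}\right) \approx \frac{i}{d},$$

so with $i \approx 0.065$ m the demand is only about $0.75^\circ$ at 5 m and about $0.25^\circ$ at 15 m. A cue that changes by less than a degree over such a range is a weak candidate for producing the large compression observed.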

Future research that manipulates factors other than image quality, such as FOV, stereo, and physical effects of the HMD, is needed to begin to answer these questions. A sense of presence is more difficult to define and manipulate, but is likely to be an important component in accurate distance perception in virtual environments.

Acknowledgments

This material is based upon work supported by the National Science Foundation under grants 9623614, 0080999, and 0121084. Thanks to Alias Wavefront for their donation of Maya Complete, which was used in this project.

References

Andersen, G. J., Saidpour, A., & Braunstein, M. L. (1998). Effects of collimation on perceived layout in 3-D scenes. Perception, 27, 1305–1315.

Beall, A. C., Loomis, J. M., Philbeck, J. M., & Fikes, T. J. (1995). Absolute motion parallax weakly determines visual scale in real and virtual environments. Proceedings of the SPIE-The International Society for Optical Engineering, 2411, 288–297.

Bingham, G. P., Bradley, A., Bailey, M., & Vinner, R. (2001). Accommodation, occlusion, and disparity matching are used to guide reaching: A comparison of actual versus virtual environments. Journal of Experimental Psychology: Human Perception and Performance, 27, 1314–1334.

Cutting, J. E., & Vishton, P. M. (1995). Perceiving layout and knowing distance: The integration, relative potency, and contextual use of different information about depth. In W. Epstein & S. Rogers (Eds.), Perception of space and motion (pp. 69–117). New York: Academic.

Dixon, M. W., Wraga, M., Proffitt, D. R., & Williams, G. C. (2000). Eye height scaling of absolute size in immersive and nonimmersive displays. Journal of Experimental Psychology: Human Perception and Performance, 26(2), 582–593.

Durgin, F. H., Fox, L. F., Lewis, J., & Walley, K. A. (2002). Perceptuomotor adaptation: More than meets the eye. Paper presented at the forty-third annual meeting of the Psychonomic Society, Kansas City, MO.

Ellis, S. R., & Menges, B. M. (1997). Judgments of the distance to nearby virtual objects: Interaction of viewing conditions and accommodative demand. Presence: Teleoperators and Virtual Environments, 6, 452–462.

Foley, J. M., Ribeiro-Filho, N. P., & Da Silva, J. A. (2004). Visual perception of extent and the geometry of visual space. Vision Research, 44, 147–156.

Fukusima, S. S., Loomis, J. M., & Da Silva, J. A. (1997). Visual perception of egocentric distance as assessed by triangulation. Journal of Experimental Psychology: Human Perception and Performance, 23(1), 86–100.

Knapp, J. M. (1999). The visual perception of egocentric distance in virtual environments. Unpublished doctoral dissertation, University of California at Santa Barbara.

Knapp, J. M., & Loomis, J. M. (in press). Limited field of view of head-mounted displays is not the cause of distance underestimation in virtual environments. Presence: Teleoperators and Virtual Environments.

Lackner, J. R., & DiZio, P. (1989). Altered sensory-motor control of the head as an etiological factor in space-motion sickness. Perceptual and Motor Skills, 68, 784–786.

Lampton, D. R., McDonald, D. P., Singer, M., & Bliss, J. (1995). Distance estimation in virtual environments. Proceedings of the Human Factors and Ergonomics Society 39th Annual Meeting, 1268–1272.

Loomis, J. M., Blascovich, J. J., & Beall, A. C. (1999). Immersive virtual environment technology as a basic research tool in psychology. Behavior Research Methods, Instruments, and Computers, 31(4), 557–564.

Loomis, J. M., Da Silva, J. A., Fujita, N., & Fukusima, S. S. (1992). Visual space perception and visually directed action. Journal of Experimental Psychology: Human Perception and Performance, 18, 906–921.

Loomis, J. M., & Knapp, J. M. (2003). Visual perception of egocentric distance in real and virtual environments. In L. Hettinger & M. Haas (Eds.), Virtual and adaptive environments (pp. 21–46). Hillsdale, NJ: Erlbaum.

Loomis, J. M., Philbeck, J. W., & Zahorik, P. (2002). Dissociation between location and shape in visual space. Journal of Experimental Psychology: Human Perception and Performance, 28, 1202–1212.

Ooi, T. L., Wu, B., & He, Z. J. (2001). Distance determination by the angular declination below the horizon. Nature, 414, 197–200.

Pick, H. L., Jr., Rieser, J. J., Wagner, D., & Garing, A. E. (1999). The recalibration of rotational locomotion. Journal of Experimental Psychology: Human Perception and Performance, 25(5), 1179–1188.

Raskar, R., & Cohen, M. (1999). Image precision silhouette edges. Proceedings of the ACM Symposium on Interactive 3D Graphics, 135–140.

Rieser, J. J. (1999). Dynamic spatial orientation and the coupling of representation and action. In R. G. Golledge (Ed.), Wayfinding behavior: Cognitive mapping and other spatial processes (pp. 168–190). Baltimore, MD: Johns Hopkins University Press.

Rieser, J. J., Ashmead, D. H., Talor, C. R., & Youngquist, G. A. (1990). Visual perception and the guidance of locomotion without vision to previously seen targets. Perception, 19, 675–689.

Rieser, J. J., Pick, H. L., Jr., Ashmead, D., & Garing, A. (1995). Calibration of human locomotion and models of perceptual-motor organization. Journal of Experimental Psychology: Human Perception and Performance, 21, 480–497.

Rolland, J. P., Gibson, W., & Arierly, D. (1995). Towards quantifying depth and size perception as a function of viewing distance. Presence: Teleoperators and Virtual Environments, 4, 24–49.

Thomson, J. A. (1983). Is continuous visual monitoring necessary in visually guided locomotion? Journal of Experimental Psychology: Human Perception and Performance, 9(3), 427–443.

Wann, J. P., Rushton, S., & Mon-Williams, M. (1995). Natural problems for stereoscopic depth perception in virtual environments. Vision Research, 35(19), 2731–2736.

Willemsen, P., & Gooch, A. (2002). Perceived egocentric distances in real, image-based, and traditional virtual environments. Proceedings of IEEE Virtual Reality Conference, 89–90.

Witmer, B., & Sadowski, W., Jr. (1998). Nonvisually guided locomotion to a previously viewed target in real and virtual environments. Human Factors, 40, 478–488.

Wraga, M. (1999). Using eye height in different postures to scale the heights of objects. Journal of Experimental Psychology: Human Perception and Performance, 25(2), 518–530.
