
Dynamic Spatially Augmented 3D Painting

Deepak Bandyopadhyay    Ramesh Raskar    Andrei State    Henry Fuchs

Department of Computer Science, The University of North Carolina at Chapel Hill

Abstract

We present techniques for creating a Spatially Augmented Reality for moving polygonal objects in an indoor environment, and a system that uses these techniques for painting in 3D on these objects in real time. By tracking the object and projecting on its different surfaces with two or more projectors, our system creates a dynamic spatially augmented version of the object that is registered with the real object as it moves. Along with modification of the lighting and material properties, our application allows real-time 3D painting on the surface of the real object. Even though extensive user studies for this application have not yet been done, the application is new and compelling enough even at the proof-of-concept stage to warrant further exploration into this area to solve all of the problems encountered, particularly the problem of real-time blending of the projected images in the areas of projector overlap.

Keywords: Applications, 3D Painting, Spatially Augmented Reality, Projector, Tracking, Human-Computer Interface

Additional Keywords: Augmented Reality, Dynamic, Virtual Reality

1 Introduction

Spatially augmented reality ([13]) is a technique in which we introduce computer-generated imagery on top of real-world objects, letting multiple independent viewers see an augmented version of their surroundings in stereo without the need for head tracking or head-mounted displays.

The Shader Lamps technique ([11]) shows a way of altering the color, lighting, diffuse and specular properties of static objects by projecting on them using image-based illumination. What if these objects could be allowed to move in arbitrary ways, retaining their current state of shading as they did so? For the purposes of interactive shading of the objects, a painting interface seems ideal. So in such a system I'd be able to draw something on an object's faces, and then pass it to you to examine and add to it. This was the vision that guided the starting of this work. It was accomplished as a course project in the UNC course on Exploring Virtual Worlds offered in the spring of 2000.

The rest of this document is organized as follows. Section 2 details the goals and objectives that we started out with. Section 3 deals with the actual proof-of-concept implementation, the issues that arose and the decisions made in the development of this implementation. Section 4 summarizes the initial reactions of users of the paint system, along with a visual sample of what the system is currently capable of. Section 5 talks about some of the issues that currently need to be addressed and how we plan to deal with them in the short to medium term, and Section 6 concludes with a summary and an indication of the exciting challenges and future possibilities for research in this area.

2 Objectives

Initially our technological objective was to create an application for painting on a moving object in a virtual environment using projected imagery. This was to be accomplished in two distinct stages:

1. Painting on a static real object using a hand-held tracker as a paintbrush, and a set of projectors as a display device for projecting color patterns onto the faces of the object.

2. Letting the object move, and the color patterns stay registered on its faces and move with it. This would create a truly dynamic environment.

In the current system, we take these objectives further by fusing the two and making moving and painting work together simultaneously, so that one could paint on a moving object and have the paint appear in the right place.

3 Implementation

3.1 System details

The modeling and rendering code in our system runs on a single pipe of an SGI InfiniteReality2 [10] engine, though it is fully portable to a PC platform, which we have demonstrated. The display is on Sony LCD projectors. The tracker we use currently has drivers only for the SGI platform, so on the PC version of the system we are investigating the use of alternate trackers and even camera-based tracking.

3.2 Issues

Some of the major problems that were dealt with were as follows:

- Suitability of any real object for being projected and painted on.

- Setup and Calibration of the projectors to display in world coordinates.

- Modeling of the real object into 3D geometry and texture coordinates.

- Display and Registration of the projections onto the object.

- Tracking and Updates of the real object's position and orientation as the user moves it around.

- Lighting and Material properties of the object, and how to modify them with image-based illumination.

- 3D Painting with brush functions and mapping from 3D points to texture coordinates.

- Rendering, the actual projection step from two projectors, updated simultaneously for all-around viewing of the object.

3.3 Choice of Object

The ideal object must be dull (not specular) so that we may project any arbitrary illumination onto it. It must be neutral (not dark or colored) so that we can get a wider range of colors by projection. It should be lightweight (for easy manipulation) and readily available. The proof of concept can be shown even with an object of small polygon count, though the system will work with medium-sized polygon count objects as well. Symmetry in the object is to be avoided, though, so that the registration of a particular feature onto a particular polygon is discernible, as all polygons do not look alike.

We chose a cuboid (made from a cardboard box) as it satisfies all the above properties and is sufficiently simple to model as a first object without sacrificing any of the power of the underlying techniques. So for example we can move it around arbitrarily and it looks unique at each orientation; and we can paint on its faces, or around the corners and edges so that the colors are shared between faces.

3.4 Tracking and Coordinates

We chose the area used for the Ultrasound Augmented Reality experiment setup for setting up our system, as it provides a large, relatively flat surface along with an optical tracker (the Flashpoint 5000 from Image Guided Technologies Inc., Denver, Colorado) that has multiple sensors with active LEDs that can be tracked simultaneously.

We attached a sensor (the rig) with three LEDs onto the object being tracked, and the current position and orientation of the sensor in tracker coordinates was computed by the tracker driver routines from the positions of the LEDs. Another sensor (the pointer) with two LEDs served as the paintbrush; the position of its tip was computed by extending the straight line joining the two LEDs forward by a fixed distance. This sensor had only five degrees of freedom, as it could accurately compute the position of the tip and the pitch and roll components of the sensor's orientation, but not the twist component of rotation about its longitudinal axis. This was not perceived as a major problem due to the way the brush was modeled (see Section 3.9.2).

Another simplification we make is that the world coordinate frame is the same as the tracker coordinate frame, whereas the object coordinates are specified in the coordinate frame of a sensor that is rigidly attached to the object. This allows us to read the matrix that gives the sensor position and orientation in tracker coordinates, multiply it with each point on the object (a constant in object coordinates) and directly get the coordinates at which the point is to be rendered.
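The object-to-world transform described above can be sketched as follows; a minimal pure-Python illustration, assuming the tracker driver hands back the rig sensor's pose as a row-major 4x4 matrix in tracker (= world) coordinates (the function names are ours, not the system's):

```python
def mat_vec4(m, v):
    """Multiply a row-major 4x4 matrix by a homogeneous 4-vector."""
    return [sum(m[i][j] * v[j] for j in range(4)) for i in range(4)]

def object_to_world(sensor_pose, p_obj):
    """Map a point, constant in object coordinates, into tracker/world
    coordinates using the rigidly attached sensor's current pose."""
    x, y, z, w = mat_vec4(sensor_pose, [p_obj[0], p_obj[1], p_obj[2], 1.0])
    return [x / w, y / w, z / w]

# Example: a pose that merely translates the object by (1, 2, 3).
pose = [[1, 0, 0, 1],
        [0, 1, 0, 2],
        [0, 0, 1, 3],
        [0, 0, 0, 1]]
corner = object_to_world(pose, [0.0, 0.0, 0.0])  # -> [1.0, 2.0, 3.0]
```

Every model vertex is pushed through this transform each frame before rendering, which is what keeps the projected imagery riding along with the moving object.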

3.5 Projector Setup and Calibration

Two projectors are mounted in the ceiling facing each other. The projectors tilt downwards, and their field of view and zoom are set to cover the tracking area and have a large overlap within the tracking area, to ensure that all faces of the object are illuminated by at least one projector while it is being tracked.

Each projector needs to be calibrated so that it can draw correctly in world coordinates. For a static world, finding the internal parameters and the rigid transformation between the coordinate system of the world and the projector is enough. Raskar et al. in [15] solve this problem by taking a set of fiducials with known 3D locations on the physical object, finding the corresponding projector pixels that illuminate them, and solving for the 3x4 projection matrix up to a constant scale factor, which is then decomposed to yield the internal and external parameters of the projector.

We modify this method by adding the stipulation that the calibration points span the overlap of the projection and tracking volumes, and are roughly uniformly distributed across this intersection volume. This ensures that the calibration is good no matter where and in which orientation we place the object.

Now the tracked object can be drawn directly by concatenating the internal and the external parameters of the projector. The rendering process uses the same internal and external parameters, so that the projected images are registered with the physical objects.
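Concatenating the parameters amounts to forming the 3x4 matrix P = K [R | t] and pushing tracked world points through it. A hedged sketch (the matrix layout and names are our assumptions for illustration, not the system's actual code):

```python
def compose_projection(K, R, t):
    """Concatenate internal (3x3 K) and external (R, t) parameters into a
    3x4 projection matrix P = K [R | t] (row-major nested lists)."""
    Rt = [R[i] + [t[i]] for i in range(3)]  # 3x4 [R | t]
    return [[sum(K[i][k] * Rt[k][j] for k in range(3)) for j in range(4)]
            for i in range(3)]

def project(P, X):
    """Map a world point X to projector pixel coordinates (u, v)."""
    Xh = [X[0], X[1], X[2], 1.0]
    x = [sum(P[i][j] * Xh[j] for j in range(4)) for i in range(3)]
    return (x[0] / x[2], x[1] / x[2])

# Example: identity orientation, projector 5 units along +z from the origin.
K = [[800, 0, 512], [0, 800, 384], [0, 0, 1]]
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0, 0, 5]
P = compose_projection(K, R, t)
project(P, [0, 0, 0])  # -> (512.0, 384.0): the world origin hits the image center
```

Because the same P is used both for calibration verification and for rendering, any world point the tracker reports lands on the projector pixel that physically illuminates it.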

3.6 Registration

First, for the case of one projector, registration means that we can draw with some precision in world coordinates. Static registration is improved by improving the calibration, modeling the object more accurately, and tracking more accurately. Dynamic temporal registration (for the image to stay registered as the object moves) needs accurate as well as fast tracking, along with some prediction to offset the effect of tracker latency.

Our application is more sensitive to tracking latency than conventional VR applications, as the rendered scene is observed together with the real environment. In this situation, tracker latency and error can translate into dynamic and static registration error, respectively. Dynamic registration error causes the moving user to perceive shearing in the rendered scene, and these factors destroy the illusion of presence.

Future versions of the system may use an optical, ceiling tracker such as the 3rdTech HiBall [18, 19]. This tracker samples over one thousand readings per second, and has a high static tracking accuracy (low noise) as well as a low enough latency. Together with techniques for prediction [1], this should solve some problems with latency for the brush. For tracking the object(s) being painted on, the short-term solution lies with multiple sensors connected to the same tracker coordinate frame, one for each object and preferably wireless. For the present Flashpoint LED-based optical system, this would mean defining a custom tracker for each object by planting three wired LEDs in it at "good" locations so that in most positions and orientations they are visible and the object can be tracked. To extend this for tracking in all orientations, these LEDs can be planted all around the object so that at least three are always visible, and any three visible ones are used to track the object with six degrees of freedom.

3.7 Modeling

The modeling system used is essentially the same as used with Shader Lamps [15], with a couple of modifications. The vertex list, instead of being hardcoded in arrays, is stored in a file. This file is the output of a program that uses a tracker to measure points on the object. A similar thing is done with the edge list, using a surface reconstructor [9] to create the mesh connectivity.

For each triangle we store a pointer to a texture map object instead of just an OpenGL texture ID. This is because we need to store each texture in an array also, and modify that array during the painting process, at which point the GL texture object is updated with the new values in the array. The texture coordinates are stored per vertex in the model file. Right now texture coordinates are obtained manually, but for any very complex example they will have to be automatically generated. One way is to use a cylinder or sphere mapping and blow out all triangles from a central axis or point to the surface of a cylinder or sphere respectively, and then map coordinates on the cylinder or sphere onto a rectangular map. Special care has to be taken that no triangles share the same region of a texture map, or else painting in one triangle will cause inadvertent and absurd changes in the texture of another. This constraint can be incorporated into a texture coordinate optimization stage ([16], [7]) that follows the generation stage.
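The cylinder-mapping idea could look like the following sketch (the axis choice, normalization, and names are illustrative assumptions):

```python
import math

def cylinder_uv(p, z_min, z_max):
    """Blow a vertex out from the z axis: u comes from the angle around
    the axis, v from the normalized height along it."""
    u = (math.atan2(p[1], p[0]) + math.pi) / (2.0 * math.pi)  # angle -> [0, 1]
    v = (p[2] - z_min) / (z_max - z_min)                      # height -> [0, 1]
    return (u, v)

# A vertex on the +x side, halfway up a unit-height object.
cylinder_uv((1.0, 0.0, 0.5), 0.0, 1.0)  # -> (0.5, 0.5)
```

A generated mapping like this still needs the overlap check described above (no two triangles sharing a map region), which is the job of the subsequent optimization stage.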

3.8 Rendering

The shaded model is currently rendered from two projectors on two channels of one graphics pipe of an SGI InfiniteReality2 engine [10]. The projector setup is with both facing each other and slightly tilted downwards, to best cover the tracking region and also minimize the overlap surface area, so that wherever the object is moved within the region, in most orientations it gets full coverage on most of its surfaces with double-intensity artifacts on as few surfaces as possible. The window setup is with two viewports, each one mapped by screen placement to feed a separate 1024x768 display channel switched to a projector. The whole window is refreshed at the same time by swapping buffers, so that the viewports are updated synchronously.

On one graphics pipe we are able to achieve nearly 60 fps of rendering speed, with one or more tracker updates being read per frame time. The latency of 4-5 frames between the tracker and the projector display is a significant factor leading to a break in presence when the object or the brush is moved at moderate to high speeds - the shaded version lags behind the real one, as shown in Figure 3. For static or slow-moving objects, however, the shaded object is dead on in regions where the calibration sample points were taken - thus we stress the importance of calibrating with sample points throughout the projector overlap region and beyond. The latency would improve with the use of a more advanced tracking technology and also some prediction of future tracker readings using a Kalman filter-based method.

Since the user is not head-tracked in our prototype system, there is no view-dependent (specular) lighting as part of the shader lamp vocabulary demonstrated. The diffuse shading that is there looks correct from all user viewpoints and allows multiple simultaneous users, following the paradigm of spatially augmented reality first introduced by [14] and subsequently applied to a table-top environment by [12]. View-dependent effects can easily be integrated, though, for a single tracked user as described in previous work [13].

3.9 Interactive 3D Painting

Once shader lamps had been extended to moving surfaces, the next step was to provide interaction that lets one modify the attributes of the shader lamps (and hence of the object) in real time. One could preprocess these shading attributes for a demonstration of shader lamps with any given model, but ultimately we would like to be able to create these demos on the fly by modifying the shading attributes interactively. This is an interaction process similar to that of painting the model surface, so we implemented the first application of painting for spatially augmented reality in the form of shader lamps. While it currently paints color directly and modifies lighting on the fly with a tracked light attached to the brush, it can be extended to 'paint' a new material and apply a displacement or other filter to the existing color or material shading.

The painting scheme used in this system is based on the one used by Gregory et al. [4] in the inTouch system for haptic 3D painting. We maintain one or more texture maps for the model, with all the triangle vertices having texture coordinates into one of these maps. The following are the steps in the painting process:

3.9.1 Contact Determination by Proximity Detection

Right now we compute an axis-aligned bounding box for the model and then check the transformed position of the brush head against the bounding box, and if outside we stop. This saves us wasted checks for triangles to paint most of the time. When the check succeeds, we find the perpendicular distance between the brush head center and the plane of every triangle within the bounding box, and for those within an admittance threshold radius (a brush parameter), we make sure the perpendicular dropped from the brush center falls within the triangle or up to an outside triangle margin (normally a constant times the brush radius) outside its edges. This helps to make sure that the blob of paint is not clipped to the triangle within which it lies but stretches across to neighboring triangles. We collect a list of triangles that have met both criteria for sending to the next stage.
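The two-stage test above (bounding-box early-out, then per-triangle plane distance against the admittance threshold) might be sketched like this; the triangle record layout and names are hypothetical, and the outside-triangle margin test is omitted for brevity:

```python
def aabb_contains(p, lo, hi):
    """Cheap early-out: is the brush head inside the model's axis-aligned box?"""
    return all(lo[i] <= p[i] <= hi[i] for i in range(3))

def plane_distance(p, v0, n):
    """Perpendicular distance from p to the plane through v0 with unit normal n."""
    return abs(sum((p[i] - v0[i]) * n[i] for i in range(3)))

def triangles_to_paint(brush, lo, hi, triangles, threshold):
    """Collect triangles whose supporting plane lies within the admittance
    threshold of the brush head center."""
    if not aabb_contains(brush, lo, hi):
        return []  # brush nowhere near the model: skip all per-triangle work
    return [t for t in triangles
            if plane_distance(brush, t["v0"], t["normal"]) <= threshold]
```

As the text notes, a hierarchical structure (AABB/OBB-tree, octree) would replace the single box with a tree of culling volumes for larger models.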

Note that this process could be improved vastly with an oriented bounding box for the model, or some other form of space subdivision, most effectively hierarchical subdivision as in an AABB- or OBB-tree or an octree [3, 2, 8, 5]. The proximity test between a point and a polygonal surface is best handled as a sequential test within a region culled using bounding boxes. However, for a more realistic brush geometry a proximity package such as PQP [6] may be used.

3.9.2 Brush Function

This function, used in blending the brush color with the color of a point, provides the factor α to be used for the blend as a function of 3D point position, given the position of the brush. We assume a spherical geometry for the brush head and a corresponding spherical distribution of the function around the brush center c, with the simple formula for BF at point x:

BF(x) = 1 − (r/R)²

where r is the distance from c to x, and R is the constant "brush radius".

This is always between 0 and 1 for r < R. Thus for this BF to work properly, the brush admittance threshold must be less than or equal to the brush radius. When it is equal, a very fine film of paint is deposited on triangles at the fringe of the threshold. This, when coupled with a larger than normal brush radius, leads to what we call the airbrush mode. When the threshold is markedly less than the brush radius, however, deposition will be immediate, strong and bold. We use this with a smaller brush radius and a threshold set to the radius of a physical sphere attached to the brush head to simulate the haptic or contact painting mode.
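As a minimal sketch of the brush function and its interaction with the admittance threshold: we assume a quadratic falloff BF = 1 − (r/R)², one spherically symmetric choice that stays between 0 and 1 for r < R; the exact falloff and the names here are illustrative, not the system's actual code.

```python
import math

def brush_function(c, x, R, threshold):
    """Blend factor at point x for a spherical brush centered at c.
    Assumed falloff: 1 at the center, 0 at distance R; points beyond the
    admittance threshold (threshold <= R) receive no paint at all."""
    r = math.dist(c, x)
    if r > threshold or r >= R:
        return 0.0
    return 1.0 - (r / R) ** 2

# threshold == R: only a faint film reaches points near the fringe ("airbrush")
faint = brush_function((0, 0, 0), (0.95, 0, 0), R=1.0, threshold=1.0)
# threshold << R: deposition within reach is strong and bold ("contact" painting)
bold = brush_function((0, 0, 0), (0.2, 0, 0), R=1.0, threshold=0.25)
```

With these numbers `faint` comes out well under 0.1 while `bold` is above 0.9, matching the airbrush-versus-contact behavior described above.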

3.9.3 Painting as Triangle Rasterization

One by one, the triangles chosen to paint are scan-converted or rasterized using a special routine taken from [4]. The routine steps through 2D points on the triangle's texture map and corresponding 3D points interpolated from the 3D positions of its vertices.
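The interpolation step can be sketched by solving for a texel's barycentric coordinates in texture space and weighting the 3D vertex positions; the actual routine comes from [4], and the names and data layout below are ours:

```python
def uv_to_3d(uv, tri_uv, tri_xyz):
    """Given a 2D texel position inside a triangle's texture-space footprint,
    solve for its barycentric coordinates in (u, v) and interpolate the
    triangle's 3D vertex positions with the same weights."""
    (u0, v0), (u1, v1), (u2, v2) = tri_uv
    d = (v1 - v2) * (u0 - u2) + (u2 - u1) * (v0 - v2)  # signed area determinant
    a = ((v1 - v2) * (uv[0] - u2) + (u2 - u1) * (uv[1] - v2)) / d
    b = ((v2 - v0) * (uv[0] - u2) + (u0 - u2) * (uv[1] - v2)) / d
    c = 1.0 - a - b
    return tuple(a * p0 + b * p1 + c * p2
                 for p0, p1, p2 in zip(*tri_xyz))

# Midpoint of the edge between the 2nd and 3rd texture vertices maps to the
# midpoint of the corresponding 3D edge.
tri_uv = ((0.0, 0.0), (1.0, 0.0), (0.0, 1.0))
tri_xyz = ((0.0, 0.0, 0.0), (2.0, 0.0, 0.0), (0.0, 2.0, 0.0))
uv_to_3d((0.5, 0.5), tri_uv, tri_xyz)  # -> (1.0, 1.0, 0.0)
```

Evaluating the brush function at each interpolated 3D point then yields the blend factor for the corresponding texel.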

3.9.4 Texture Map Modification

The brush function evaluated at the 3D point is used to calculate the new value in the texture map at its corresponding position. The formula used is a simple alpha blending:

tex(u, v) = BF(x) · brush_color + (1 − BF(x)) · tex(u, v)

where (u, v) is the texture map position corresponding to the 3D point x.
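In code, the texel update is a one-liner per channel; a sketch assuming RGB tuples in [0, 1], with the brush function's value supplying the alpha (names are illustrative):

```python
def blend_texel(old_rgb, brush_rgb, alpha):
    """Alpha-blend the brush color into a texel; alpha is the brush
    function evaluated at the texel's corresponding 3D point."""
    return tuple(alpha * b + (1.0 - alpha) * o
                 for o, b in zip(old_rgb, brush_rgb))

# Quarter-strength white brush over a black texel.
blend_texel((0.0, 0.0, 0.0), (1.0, 1.0, 1.0), 0.25)  # -> (0.25, 0.25, 0.25)
```

After the modified texels are written back, the GL texture object is refreshed from the array, so the repainted region shows up in the next projected frame.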

4 Initial User Reactions

Since the preliminary proof-of-concept implementation of our system has just been completed, systematic user study data is not available. The working system was demonstrated to leading graphics researchers and students, as well as to a media crew with no previous exposure to projector-based graphics research. It seems to hold universal appeal due to its creation of art on real objects in an augmented setting, while also demonstrating a step beyond the power of shader lamps in a direction that may be critical to achieving the goal of true interactivity and scalability.

To the question of how this compared with other painting programs - 2D image-based, 3D model-based or fully immersive (virtual) - the response was that this is certainly different from those applications and has a whole new range of applications encompassing those visualized for shader lamps. One user reinforced our feeling that the feedback of real paint-based and digital artists would be valuable. Accordingly, in some future extensions we plan to study what these "power users" think about the system. Advanced technical users loved the artistic capabilities, but could see that tracking technology and its latency bounds were the significant bottleneck in performance, as well as the main source of breaks in AR presence for the system.

5 Future Work

There are a number of unsolved problems with the current system and the whole paradigm of painting for SAR, the most significant of which we list here:

- Tracking: The latency problem will not go away soon, as the display, the tracker and the projector each add one or two frames of latency. Rather than go back to the case of not moving the object (static shader lamps) or moving it in a slow, restricted manner (as in bolting it to a turntable or a mechanical arm) to reduce latency, the preferred solution is to fine-tune the tracking code and add prediction [1] to it to offset most of the effects of latency.

- Scalability: In order to extend the range of operation of the system to cover the large room-sized environments we visualize it will be used in, we need more projectors and higher tracking range. For more projectors, the current solution scales readily (note though that the blending problem that we haven't solved is made more complex by the need to scale to n projectors). Projector placement to cover the space efficiently has been investigated [17]. Extending the tracking range is trickier, as is setting up the projectors and tracker to get the highest possible volume of intersection between the projection volume and the tracking volume.

- Projection: Dynamic blending in real time is a problem not yet solved. The static case was solved in [15] by computing intensity weights. However, in our system with the surfaces moving around, it is hard to solve for these weights in real time unless one can drastically minimize the area of overlap by partitioning the set of surfaces between the projectors in some way. Thus we ignore them, and this shows up as double intensity in some regions and imperfect overlap between the images of the projectors when both are contributing to a surface at an oblique angle.

- Applications: There is a need to develop and demonstrate applications for this technology which clearly show its superiority to the existing ways of painting on real objects, as well as on models inside the computer with no real object there. Creating an AR painting space is a technological milestone, but commercial applications are needed to drive its development. Wide-area teleconferencing using the paint system on real objects is an obvious choice; so are various medical, cosmetic (painting on human skin), artistic and architectural applications.

6 Conclusions

We have presented a new system for 3D painting and projection on most real objects as they move. All that is required is that there be two projectors facing each other, a tracker attached to the object (and in the paintbrush), that the projectors both be calibrated to the tracker's frame of reference, and that the object's geometry and texture coordinates be pre-acquired. The system is demonstrated with a cuboidal object, and this object is good enough to test the registration of vertices and edges in the real object and the projected texture. The same framework will work with an arbitrary number of overlapping projectors and an arbitrarily complex polygonal object (with the restriction of it being locally convex). The performance for high polygon count models can be accelerated using hierarchical collision detection techniques to pick the surfaces to paint. Some outstanding issues are dynamic blending for projector overlap, increasing tracker range, reducing latency to tolerable levels and developing compelling applications.

References

[1] R. Azuma and G. Bishop. Improving static and dynamic registration in an optical see-through HMD. In SIGGRAPH 94, pages 197-204, July 1994.

[2] S. Gottschalk. Collision Queries Using Oriented Bounding Boxes. PhD thesis, Department of Computer Science, University of N. Carolina, Chapel Hill, 2000.

[3] S. Gottschalk, M. Lin, and D. Manocha. OBB-tree: A hierarchical structure for rapid interference detection. In SIGGRAPH '96 Conference Proceedings, pages 171-180, 1996.

[4] Arthur Gregory, Stephen Ehmann, and Ming C. Lin. inTouch: Interactive multiresolution modeling and 3D painting with a haptic interface. In ACM Symposium on Interactive 3D Graphics (I3D'99) Conference Proceedings, March 1999.

[5] Arthur Gregory, Ming C. Lin, Stefan Gottschalk, and Russell Taylor. A framework for fast and accurate collision detection for haptic interaction. In IEEE Virtual Reality 1998 Conference Proceedings, 1998.

[6] Eric Larsen, Stefan Gottschalk, Ming C. Lin, and Dinesh Manocha. Fast proximity queries with swept sphere volumes. Technical Report TR-99-018, Department of Computer Science, University of N. Carolina, Chapel Hill, 1998.

[7] S. R. Marschner. Inverse Rendering for Computer Graphics. PhD thesis, Department of Computer Science, Cornell University, 1998.

[8] D. Meagher. Geometric modeling using octree encoding. In Computer Graphics and Image Processing, volume 19, pages 129-147, June 1982.

[9] Gopi Meenakshisundaram and Shankar Krishnan. Surface reconstruction using lower-dimensional Delaunay triangulation. In Eurographics 2000 Conference Proceedings, June 2000.

[10] J. S. Montrym, D. R. Baum, D. L. Dignam, and C. J. Migdal. InfiniteReality: A real-time graphics system. In SIGGRAPH 97, pages 293-302, August 1997.

[11] R. Raskar. Projector Based 3D Graphics. PhD thesis, Department of Computer Science, University of N. Carolina, Chapel Hill, 2000.

[12] R. Raskar, G. Welch, and W.-C. Chen. Table-top spatially augmented reality: Bringing physical models to life with projected imagery. In Second International Workshop on Augmented Reality (IWAR'99), pages 11-20, November 1999.

[13] R. Raskar, G. Welch, M. Cutts, A. Lake, L. Stesin, and H. Fuchs. The office of the future: A unified approach to image-based modeling and spatially immersive displays. In SIGGRAPH '98 Conference Proceedings, pages 179-188, 1998.

[14] R. Raskar, G. Welch, and H. Fuchs. Spatially augmented reality. In First IEEE Workshop on Augmented Reality (IWAR'98), pages 11-20, November 1998.

[15] Ramesh Raskar, Kok-Lim Low, Deepak Bandyopadhyay, and Greg Welch. Shader lamps: Animating real objects with image-based illumination. In Eurographics Rendering Workshop 2001, August 2001.

[16] P.-P. J. Sloan, D. M. Weinstein, and J. D. Brederson. Importance driven texture coordinate optimization. In Eurographics 1998 Conference Proceedings, volume 17, June 1998.

[17] Wolfgang Stürzlinger. Imaging all visible surfaces. In Graphics Interface '99 Conference Proceedings, pages 115-122, June 1999.

[18] G. Welch and G. Bishop. SCAAT: Incremental tracking with incomplete information. In SIGGRAPH 97, pages 333-344, August 1997.

[19] G. Welch, G. Bishop, L. Vicci, S. Brumback, and K. Keller. The HiBall tracker: High-performance wide-area tracking for virtual and augmented environments. In Symposium on VRST, December 1999.


Figure 1: Schematic diagram of the setup. Two projectors (Proj1, Proj2) face each other and, projecting downwards, cover the volume where tracking is active.

Figure 2: Users demonstrating their creations with the 3D paint system.

Figure 3: Demonstration of the latency in the tracking pipeline. Notice the blurring and trail artifacts.

Figure 4: Completed system (projectors off) with a tracker placed on the object.

Figure 5: Projector calibration.

Figure 6: Verification of calibration: drawing markers at known positions in the real world using the calibration matrices. When the calibration is a little off, this method can be used to detect this and roughly recalibrate by moving the projector until the images all snap to the correct locations.