
Towards Intuitive Exploration Tools for Data Visualization in VR

Gerwin de Haan, Michal Koutek, Frits H. Post

Faculty of Information Technology and Systems, Delft University of Technology

Mekelweg 4, 2628 CD, Delft, The Netherlands
Tel. +31-15 278 2528

{g.dehaan, m.koutek, f.h.post}@its.tudelft.nl

ABSTRACT
In this paper we present a basic set of intuitive exploration tools for data visualization in a Virtual Environment on the Responsive Workbench. First, we introduce the Plexipad, a transparent acrylic panel that allows two-handed interaction in combination with a stylus. After a description of various interaction scenarios with these two devices, we present a basic set of interaction tools that support the user in the process of exploring volumetric datasets. Besides the interaction tools for navigation and selection, we present tools that are closely coupled with probing tools. These interactive probing tools are used as input for complex visualization tools and for performing virtual measurements. We illustrate the use of our tools in two applications from different research areas which use volumetric and particle data.

Categories and Subject Descriptors
H.5.2 [Information Interfaces and Presentation]: User Interfaces – Interaction styles.

General Terms
Algorithms, Design, Human Factors

Keywords
Virtual Reality, Visualization, Data Exploration, Two-handed Interaction, User Interface

1. INTRODUCTION
Scientific visualization is widely used as a means of interpreting and understanding data and models originating from many scientific research areas. The use of graphical data representations exploits the capabilities of the human visual system in recognizing complex patterns and features in the data. Researchers use visualization systems as a tool to explore and interpret available data from their experiments, measurements or simulations.

Thus the system must provide its users with tools to perform the data exploration task effectively. The interactive nature of this task implies that the man-machine interface plays a prominent role in designing effective visualization applications.

Virtual Reality has the potential to enhance this two-way communication between the researcher and the visualization application. In contrast to desktop-based 3D visualization applications, the use of stereoscopic, (semi-)immersive displays and spatial interaction devices provides the user with a 3D experience of datasets, allowing quicker exploration. Although VR can provide us with an intensified view of datasets and more intuitive ways of interacting with the environment, some problems concerning data exploration tasks that already existed in desktop visualization applications have not been solved. These problems include maintaining high interactivity of the visualization system and the Virtual Environment (VE), intuitive navigation in the visual representation of the data, and intuitive control of visualization tools and their parameters.

Figure 1. Two-handed exploration tools in data visualization

The datasets from simulations and measurements in various research areas are multi-modal and multidimensional, growing rapidly in size and dimensionality. Advanced visualization techniques such as direct volume rendering or iso-surface extraction are computationally intensive, and their performance degrades dramatically as the dataset size grows. In addition to these performance implications, the interactive control of the input parameters (e.g. selecting an iso-value or defining a transfer function) to achieve useful graphical representations of the data is a complex task.



Instead of concentrating on the acceleration of visualization techniques, we focus on the intuitiveness and interactivity of the exploration process. We use intuitive interaction scenarios to support interest-driven exploration. Intuitive navigation and the use of simple and fast interactive visualization tools provide a useful approach to the effective exploration of volumetric data.

We will first review some related work in the field of two-handed interaction techniques in visualization applications in VEs. Then we present the Plexipad, a transparent acrylic panel, which forms the basis for our two-handed interaction tools on the Responsive Workbench. We define appropriate interaction scenarios using both the Plexipad and the stylus. Based on these interaction scenarios, we describe our set of implemented exploration tools and their characteristics. We focus our description on navigation and probing tools. After an overview of implementation details, example visualization applications from different research fields demonstrate the use of our tools in practice. Finally, we present our conclusions and some areas for future work.

2. RELATED WORK
A good overview of the various challenges of scientific visualization in VR is given in [9]. One of the main challenges described in this report is to "make interaction comfortable, fast, and effective". In our paper we concentrate on this topic, focusing on the intuitiveness of the interaction techniques involved in the exploration and visualization of data. Based on promising results of working with both input devices simultaneously, we have focused on the use of two-handed interaction techniques.

Numerous studies have shown that Guiard's framework [4] is useful as a guideline for designing a two-handed interface. His findings on the distribution of labor between the two hands in everyday activities not only proved useful for 2D computer drawing interaction schemes [15,16], but were also applicable to the study of two-handed interface scenarios in VEs [1,2,3,12,13]. The following types of interaction tasks can be distinguished:

- One-handed task: only one hand performs a task.
- Double one-handed task: each hand performs a separate one-handed task.
- Two-handed task: both hands co-operate to perform a single task.

The division of tasks between the hands in the case of two-handed tasks can be either symmetric or asymmetric. A two-handed task is symmetric when both hands perform identical actions. In an asymmetric task, the most common form of two-handed task, each hand performs an individual action, involving complex co-ordination between the hands. The dominant hand is the preferred hand for precise movements such as writing (for most people the right hand); the non-dominant hand provides guidance and support. For asymmetric two-handed tasks, Guiard described the following principles:

- Dominant to non-dominant reference: the motion of the dominant hand finds its spatial reference in the results of the motion of the non-dominant hand.

- Asymmetric scales: the right and left hand are involved in different motions. The motions of the non-dominant hand tend to be of lower frequency and higher spatial amplitude. In other words, the non-dominant hand is responsible for the infrequent large motions, while the dominant hand makes more movements in a smaller area.

- Non-dominant precedence: the movement of the non-dominant hand precedes that of the dominant hand. The dominant hand waits for the non-dominant hand to initiate and set the spatial reference before engaging in action.

Most of the work in the field of two-handed interaction interfaces concentrates on object manipulation or assembly tasks [2]. In addition, most reports on two-handed interaction in VR deal with two identical input devices, one for each hand, like wands or gloves. Similar approaches that use a hand-held panel and a stylus are described in [3,12,13]. The use of a transparent panel on a projection-based table has been reported in [1], though there the panel is not actively used in a visualization process. Alternative approaches to the improvement of man-machine interaction use other input modalities, like speech and gesture recognition. An example of such a multi-modal interface in a visualization application is described in [11].

We use the two-handed task principles to match the interaction scenarios for the tools in our system. These scenarios are reflected in various interaction tools for navigation and probing.

3. INTERACTION METHODS
The Responsive Workbench (RWB) provides a laboratory-table VE. The user stands in the real world and looks down into the virtual world. Instead of bringing the user into the virtual world, the virtual world is brought to the user [10]. The virtual workspace is usually within reach of the user's hands, but can also be extended under the projection surface. Most of our tools are therefore designed to work both directly and remotely. The projection surface of the RWB provides passive haptic feedback and is suitable for the placement of 2D/3D interaction widgets.

3.1 Input Devices
In our VR set-up we use the following two input devices, which are tracked by electromagnetic trackers with six degrees of freedom (DOF): position and orientation (Figure 2).

Figure 2. The stylus, the Plexipad and their two-handed use

Stylus: a pen-shaped input device with a single button. The pen can define a point in space (zero-dimensional or 0D). The shape of the pen defines a directional reference axis. The pen can be used to intuitively specify a line in 3D space, extending its actions from 0D to 1D. This function is often used for ray-casting selection.

Plexipad: a lightweight transparent acrylic panel (300 x 300 x 2 mm) on which the tracker is mounted. The pad can be used to intuitively position and orient a 2D plane. It defines a 2D reference plane in 3D space. In contrast to a similar prop presented in [1], the tracker is mounted under a foam handle at the edge of the panel. The handle allows a firm and comfortable palm grip, reducing fatigue in the fingers. With the tracker mounted close to the wrist, the inconvenience of the tracker cable (obstruction of the view, weight on the panel) is reduced to a minimum.

In our two-handed interaction set-up, the dominant hand holds the stylus while the non-dominant hand holds the Plexipad. The stylus and Plexipad are interchangeable, which allows both right-handed and left-handed persons to operate the tools.

3.2 Interaction Scenarios
Based on our input devices and Guiard's principles of two-handed tasks, we have derived the following interaction scenarios:

One-handed interaction: Either the stylus or the Plexipad is actively used to interact with the environment. The stylus is suitable for direct (0D) or ray-casting (1D) selection and manipulation, whereas the Plexipad allows direct control (positioning and orientation) of objects which are virtually attached to the Plexipad.

Double one-handed interaction: The stylus and the Plexipad are used to perform unrelated one-handed tasks. The Plexipad and stylus each have their own separate functionality; a direct coupling between the tools is absent. This scenario allows a combination of stylus-based and Plexipad-based one-handed interaction scenarios. Although a direct relation between the two interaction tasks is absent, there will usually be a higher-level goal that is pursued.

Symmetric two-handed interaction: Both hands perform identical tasks. This type of interaction task is not likely to be used in our scenarios, considering our use of two distinct input devices. Systems that use two identical input devices like wands or gloves do support symmetric two-handed interaction tools [2].

Asymmetric two-handed interaction: The Plexipad sets the 2D reference plane for the stylus. This combination of the two tools exploits the familiarity of the "pen and pad" metaphor. In addition, the pad provides tactile feedback for the stylus' movements. The combination of the plane-shaped panel and a similarly shaped virtual object proves to be very intuitive: it feels as if you are holding the virtual tool in your hand. We distinguish the following three scenarios for this asymmetric two-handed interaction:

- The Plexipad serves as an object container. The Plexipad is used as a container for 2D and 3D virtual objects or interaction widgets, which can be manipulated or operated by the stylus.

- The Plexipad constrains the stylus. The actions of the stylus are projected onto the reference plane defined by the Plexipad. This actively constrains the 3D actions of the stylus to a 2D plane (Figure 3, left; see also the sketch after this list).

- The Plexipad and stylus are used for a complex interaction task. Here the Plexipad and stylus form a pair of input devices that control a single complex interaction task. Examples of this interaction scenario are the selection of a region of interest (Figure 3, right), a 3D lasso selection tool, a modeling tool or a cutting tool.
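To make the second scenario concrete, the following minimal sketch (our own illustration, not code from the paper) projects the tracked stylus tip onto the reference plane of the Plexipad. The Vec3 type and the plane representation (a point on the pad plus its unit normal) are assumptions.

struct Vec3 {
    float x, y, z;
    Vec3 operator-(const Vec3& v) const { return {x - v.x, y - v.y, z - v.z}; }
    Vec3 operator*(float s)       const { return {x * s, y * s, z * s}; }
    float dot(const Vec3& v)      const { return x * v.x + y * v.y + z * v.z; }
};

// Project the stylus tip onto the Plexipad plane:
//   p' = p - ((p - o) . n) n,  with o a point on the pad and n its unit normal.
Vec3 constrainToPad(const Vec3& stylusTip, const Vec3& padOrigin, const Vec3& padNormal)
{
    float dist = (stylusTip - padOrigin).dot(padNormal);  // signed distance to the plane
    return stylusTip - padNormal * dist;                  // constrained 2D position
}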

Figure 3. Asymmetric two-handed interaction: constrained actions (left) and complex actions (right)

The Plexipad as a passive object container is widely used and is well suited for hand-held 2D or 3D menus and object snapping. Although we use the Plexipad for this purpose too, we concentrate on more dynamic asymmetric two-handed interaction scenarios. The second scenario uses the Plexipad as a pure 2D reference plane for stylus interaction. In the third scenario, however, the Plexipad actively participates in the interaction. This complex interaction has mainly been used with two 0D/1D input devices such as pens or gloves [2]. The 2D plane shape of the Plexipad can be exploited to create more expressive interaction tools. We use the concept of holding two dimensions in one hand (the Plexipad) while the other hand (the stylus) controls the third dimension.

In the following section we will describe how these interaction scenarios are reflected in the design of our interaction tools for navigation and probing.

4. INTERACTION TOOLS
We have implemented a VE for visualization and exploration of data on the RWB (Figure 4). Besides a conventional 3D GUI, the user can interact with various visualization tools and the data space, which can contain both volumetric and object data. The data space is represented as a virtual object, using an outline to indicate the spatial boundaries.

Figure 4. Overview of the VE for data exploration and visualization

In the description of the exploration tools we will use the case of Molecular Dynamics (MD), our testbed during development, to illustrate their functionality. MD is used to study the properties and behaviour of complex particle systems. This application is a good example of combining volumetric information with object information (particles). The various illustrations in this paper show the study of a solid electrolyte (sodium beta-alumina), which consists of a crystal molecular structure with layers of sodium ions. In the application section the case of MD will be described in more detail.

A typical exploration process begins with a quick spatial and temporal scan of the volumetric data for interesting information. If an interesting region or phenomenon has been found in the data, the attention is focused on this aspect of the data. The user then tries to gain detailed insight into this data by inspecting or probing the various data values in the neighbourhood, using visualization techniques or measurement tools to explore the data. In our approach we provide interaction tools to support these steps of the exploration process. These can be divided into two main categories: navigation (positioning, orienting, cropping and zooming of the data space) and probing (localized visualization and measurement of the data).

4.1 Navigation
In our VR concept we do not see navigation as flying through a VE. Instead we use the laboratory table metaphor, where the position of the VE with respect to the physical table is fixed. Navigation consists of manipulation actions on the data space object which contains all data. Common direct or ray-cast manipulation tools can be used to position and orient the data space object. For other types of navigation we have developed the following interest-driven tools.

Zoom Tool
The zoom tool allows the user to take a closer look at a point in the data. In naive implementations (e.g. using a slider to adjust the zoom factor), the user has to pay attention to both the manipulation of the slider and the size of the object. Moreover, the object will often scale from its origin, thus effectively moving the point of interest to another position. We observed users repeatedly zooming and repositioning in an effort to get a good view of the point of interest in the object. Our "magnifying glass" metaphor is interest-driven: click at a point of interest, pull back to zoom in, push away to zoom out.

Figure 5. Zoom tool

After activating the zoom tool, the user can specify the point of interest in the VE (Figure 5, stylus position 1). The zooming will occur around this focal point, so that this point remains stationary. The line from this point to the eyes defines the normal of the reference plane. The user can move the stylus away from the initial position to adjust the zoom factor. If the stylus is moved towards or away from the eyes, the zoom factor increases or decreases, respectively. This factor is determined by the distance d, the perpendicular distance from the stylus (position 2) to the reference plane. To prevent jerky zooming with small movements while still allowing large zooming with large movements, we use a non-linear scaling function. Ray-cast zooming can be used to zoom in on distant points (Figure 6).

Figure 6. Ray-cast zoom tool

Users have found the zoom tool with its "magnifying glass" metaphor very intuitive, easy to learn and useful. It has significantly decreased the time and effort needed to obtain a more detailed view of a point in space.
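As an illustration of such a non-linear scaling function, the following sketch (our own; the paper does not specify the exact function) maps the perpendicular distance d to a zoom factor. The exponential form and the gain constant are assumptions.

#include <cmath>

// d > 0: stylus moved towards the eyes (zoom in);
// d < 0: stylus moved away from the eyes (zoom out).
float zoomFactor(float d)
{
    const float gain = 2.0f;        // sensitivity, tuned per set-up (assumption)
    return std::exp(gain * d);      // near 1 for small d, strong zoom for large d
}

// The data space is then scaled around the focal point f, which keeps f
// stationary:  p' = f + zoomFactor(d) * (p - f)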

Mini System Tool
Using the zoom tool, the user can become fully immersed in the data, thus losing orientation. The mini system provides a global context of the data space in the form of a small model (Figure 7). To have good navigation control in the data space, we coupled the orientation of the mini system to the data space object. If the user rotates the data space object, the mini system is rotated accordingly. Likewise, if the user rotates the mini system, the data space object is rotated as well. As a result, the system and its miniature are always aligned. The data space object is rotated around a point of interest, for which we selected the center of the projection screen. We decided not to share positioning information between the data space and its small version: a small repositioning of the mini system would cause a much greater repositioning of the data space object, thereby confusing the user.

Figure 7. Mini System tool

An advantage of not using the location information of the system is that the mini system can be placed anywhere in the VE. For example, the user can use the mini system to rotate the data space object and then move it out of sight, e.g. to the side of the screen.
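A minimal sketch (our own illustration) of this design choice: the data space and its miniature share one rotation state, while their positions remain independent. The Quat type and the update interface are assumptions.

struct Quat { float w, x, y, z; };   // orientation shared by both objects

struct MiniSystemLink {
    Quat sharedRotation;             // single source of truth for orientation

    // Called when the user rotates either the data space or the mini system;
    // both objects apply sharedRotation each frame, so they stay aligned.
    void onRotate(const Quat& q) { sharedRotation = q; }

    // Positions are deliberately not linked: moving the miniature aside
    // (e.g. to the edge of the screen) never displaces the data space.
};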


Region of Interest Tool
The region of interest (ROI) interaction tool allows users to select an arbitrarily oriented 3D box in the VE. The user clicks with the stylus to define a point in space that forms one extent of the box. The box is created using this click point, taken relative to the Plexipad, and the new stylus position as the other extent of the box. While holding the button, the user can move both the stylus and the Plexipad to dynamically adjust the size and position of the box. In contrast to other two-handed region selections, the Plexipad also defines the orientation of the base plane of the box. This enables the user not only to reposition the box but also to adjust its orientation dynamically. The complex asymmetric interaction between the two hands reduces the number of actions, allowing easy creation of axis-aligned boxes as well as arbitrarily oriented boxes in 3D space in a single movement.

Figure 8. Region of Interest tool

In our MD application we use the box to define the volumetric region in which we want to display particles (Figure 8). The technique can also be used for object selection, volume probing or 3D modeling. The Plexipad can control the 2D base of an object while the position of the stylus defines another parameter such as height or extent.
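A minimal sketch (our own illustration) of deriving the oriented ROI box: the Plexipad pose supplies the orientation of the base plane, while the click point and the current stylus tip supply two opposite corners, expressed in the pad's local frame. The types and the frame convention are assumptions.

#include <algorithm>

struct Vec3 { float x, y, z; };

struct OrientedBox {
    Vec3 lo, hi;   // opposite corners in the Plexipad's local co-ordinate frame
    // The box is rendered with the pad's current rotation applied, so
    // tilting the pad re-orients the whole selection in one movement.
};

// Re-evaluated every frame while the stylus button is held, so the box can
// be resized, moved and re-oriented in a single continuous movement.
OrientedBox updateRoi(const Vec3& clickInPadFrame, const Vec3& stylusInPadFrame)
{
    OrientedBox b;
    b.lo = { std::min(clickInPadFrame.x, stylusInPadFrame.x),
             std::min(clickInPadFrame.y, stylusInPadFrame.y),
             std::min(clickInPadFrame.z, stylusInPadFrame.z) };
    b.hi = { std::max(clickInPadFrame.x, stylusInPadFrame.x),
             std::max(clickInPadFrame.y, stylusInPadFrame.y),
             std::max(clickInPadFrame.z, stylusInPadFrame.z) };
    return b;
}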

4.2 Probing Tools
The probing tools, which allow the user to inspect the data, work in synergy with the navigation tools. Probing in a dataset is a necessary input for various visualization and measurement tools. 3D probes like point probes (0D), line probes (1D), plane probes (2D) and volume probes (3D) are therefore essential in the data exploration process. Without appropriate feedback it is difficult to freely probe the data in 3D space. We have found that the passive haptic feedback provided by the Plexipad is an excellent support during 3D probing. We take advantage of the two-handed input scenarios defined earlier, using the Plexipad as a 2D reference plane. This reference plane forms the basis for the creation and positioning of probes.

Plane Probing
The Plexipad allows the user to freely navigate a 2D plane through the VE. We exploit this intuitive interaction by attaching a plane-shaped probing tool directly to the Plexipad. This tool consists of a grid of point probes, which perform a trilinear interpolation of the data values in the volumetric data. We directly visualize these data values on the plane-shaped probe by using a textured quad surface. As a result, the user directly controls the position and orientation of the probing tool, slicing through the volumetric data. We call this tool the direct data slicer (Figure 9).
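A minimal sketch (our own illustration) of the trilinear interpolation each point probe performs on a regularly structured grid; the Volume layout and accessors are assumptions, and bounds checks are omitted.

#include <cmath>
#include <vector>

struct Volume {
    int nx, ny, nz;
    std::vector<float> v;                        // nx*ny*nz scalar values
    float at(int i, int j, int k) const { return v[(k * ny + j) * nx + i]; }
};

// Sample the volume at a continuous grid position (x, y, z).
float trilinear(const Volume& vol, float x, float y, float z)
{
    int i = (int)std::floor(x), j = (int)std::floor(y), k = (int)std::floor(z);
    float fx = x - i, fy = y - j, fz = z - k;    // fractional offsets in the cell

    // Interpolate along x, then y, then z between the 8 cell corners.
    float c00 = vol.at(i,j,k)    * (1-fx) + vol.at(i+1,j,k)    * fx;
    float c10 = vol.at(i,j+1,k)  * (1-fx) + vol.at(i+1,j+1,k)  * fx;
    float c01 = vol.at(i,j,k+1)  * (1-fx) + vol.at(i+1,j,k+1)  * fx;
    float c11 = vol.at(i,j+1,k+1)* (1-fx) + vol.at(i+1,j+1,k+1)* fx;
    float c0  = c00 * (1-fy) + c10 * fy;
    float c1  = c01 * (1-fy) + c11 * fy;
    return c0 * (1-fz) + c1 * fz;
}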

The user holds the direct data slicer in the non-dominant hand and can quickly probe through the volume to get an overview of the data values inside. Concurrently, the dominant hand can use the stylus for manipulation, zooming of the data, and the selection of new tools.

Figure 9. Direct data slicer: a user slices through an atomic density field of a solid electrolyte.

The direct data slicer forms a two-dimensional reference plane for other tools. It not only provides a reference for asymmetric two-handed interaction and passive haptic feedback, but also a visual reference plane in a volumetric dataset. This combined feedback assists in a more accurate and intuitive selection of points and lines in 3D space by using the stylus directly on the Plexipad. This assisted or constrained selection can be effectively used to define the input for visualization or measurement tools. The advantage of the plane probe is that it can also serve as a reference plane for other probing tools.

Point Probing
The point probe allows the user to request the data value(s) at a point in 3D space using the stylus. In addition to freehand probing, the direct data slicer can provide excellent feedback for constrained probing: the user clicks with the stylus on a point of interest on the Plexipad (a 0D selection on a 2D slice). An example of using a point probe is the click iso-surface tool, see Figure 10. This tool uses the value of an interactively positioned point probe as the iso-value from which an iso-surface is created instantly. The iso-surface is generated by the marching cubes module in VTK and transferred to the VE using vtkActorToPF.
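A minimal sketch (our own illustration; the paper names only VTK's marching cubes module and vtkActorToPF) of such a click iso-surface pipeline, written against the VTK 4.x-era API that matches the paper's time frame. The helper function itself is an assumption.

#include <vtkMarchingCubes.h>
#include <vtkPolyDataMapper.h>
#include <vtkActor.h>
#include <vtkImageData.h>

// Build an iso-surface actor from the volume, using the probed data value
// as the iso-value (hypothetical helper, not from the paper).
vtkActor* buildClickIsoSurface(vtkImageData* volume, double probedValue)
{
    vtkMarchingCubes* mc = vtkMarchingCubes::New();
    mc->SetInput(volume);             // VTK 4.x-style pipeline connection
    mc->SetValue(0, probedValue);     // the clicked probe value becomes the iso-value

    vtkPolyDataMapper* mapper = vtkPolyDataMapper::New();
    mapper->SetInput(mc->GetOutput());

    vtkActor* actor = vtkActor::New();
    actor->SetMapper(mapper);
    return actor;  // subsequently converted to Performer geometry via vtkActorToPF
}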

Figure 10. Point Probing: the data value at the selected point is used as the input for an iso-surface visualization tool.


Line Probing
Line probing provides a way of selecting a line in 3D space, which can be used for data probing applications. Again, the direct data slicer can be used to create a 2D reference plane in 3D space, allowing the user to draw a line on the Plexipad (a 1D selection on a 2D slice). An example of line probing is using the probed data values along the selected line to define the data range of a color mapper. The color mapper defines the color-coding of the values probed by the direct data slicer. The line probe can be used to select a line of interest (LOI), after which the color mapper is calibrated to reveal small variations in the selected profile (Figure 11). This profile can also be used to select an appropriate iso-value for an iso-surface.
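A minimal sketch (our own illustration) of calibrating the color mapper to the data range along a selected line of interest. It reuses the Volume type and trilinear() from the Plane Probing sketch; the sampling count is an assumption.

#include <algorithm>

struct Volume;   // from the Plane Probing sketch above
float trilinear(const Volume& vol, float x, float y, float z);

struct Vec3  { float x, y, z; };
struct Range { float lo, hi; };

// Sample the volume along the LOI from a to b and return the value range,
// which is then mapped onto the full color scale of the color mapper.
Range calibrateAlongLine(const Volume& vol, Vec3 a, Vec3 b, int samples = 64)
{
    Range r{ 1e30f, -1e30f };
    for (int s = 0; s < samples; ++s) {
        float t = s / float(samples - 1);        // parameter along the line
        float v = trilinear(vol, a.x + t * (b.x - a.x),
                                 a.y + t * (b.y - a.y),
                                 a.z + t * (b.z - a.z));
        r.lo = std::min(r.lo, v);
        r.hi = std::max(r.hi, v);
    }
    return r;  // a narrow range reveals small variations in the profile
}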

Figure 11. Line Probing: The user selects a line of interest tocalibrate the gray-scale color mapper in a medical CT-scan.

Sub-Volume Probing
The principle of the oriented 3D box described for the ROI tool can be used for probing sub-volumes. One can crop or cut a selected volume of the data, while data slicers probe and visualize the data on the inside faces of the 3D box. The box selection can also be used for direct volume rendering in the given region. It is important that the user can interactively select and adjust the box while visualizing the data at the same time.

After the initial exploration process, the presented probing tools can be used for interactive quantitative measurements by presenting the probe results in a classic 2D or 3D graph. For example, the data values along a line of interest can be presented in a 2D graph to visualize the data profile along this line. Another example is to display the results of a point or line probe over a period of time. These dynamic graphs can be placed anywhere in the VE and stored for further analysis. We expect that this data representation will be appreciated by researchers for its analytical character.

5. IMPLEMENTATION
Our visualization applications run in parallel on a four-processor SGI Onyx2 and are implemented using the RWB-Library [6], which is based on Iris Performer and OpenGL.

The RWB-Library and Simulator provide an environment for the development of VR applications on the Responsive Workbench. It uses a multiprocessing scheme and shared memory. It provides basic and advanced interaction functions for object selection and manipulation and for navigation through the VE. It contains a basic set of 3D widgets, like icons, buttons, sliders, 3D text, menus, etc. It greatly assists in building the scene graph hierarchy through the use of generic rwb-objects with callback events, invoked when objects are selected, manipulated or released. The RWB-Library makes the co-ordinate transformations of Iris Performer easier. Each input tracking sensor has its local co-ordinate system given by a pfDCS (Performer Dynamic Co-ordinate System). Virtual (rwb-) objects can easily be connected with these DCSs, following the motion of the tracker sensors. We also integrated VTK as a visualization engine in our visualization framework.

The RWB-Library uses a default rwb-interactor class, which can select and manipulate objects directly with the stylus or using ray-casting for distant objects. It checks for bounding volume or ray-casting selections, providing the intersection and interaction points (stylus/ray with objects). This class works on the principle of an interaction state automaton, driven by interaction events. Rwb-objects can be selected and de-selected, picked (first button click), manipulated and released, each time invoking their event callback function. This way the desired behaviour of virtual objects can easily be implemented.

Using the generic rwb-interactor class we can easily define new interaction tools, like the zoom tool, the ROI tool or the Spring Manipulator [5]. We have implemented two interactor schemes (a sketch of the first scheme follows this list):

- The user extension of the rwb-interactor first executes the default rwb-interactor to perform the intersection checks and to obtain a new interaction state and interaction points; then the user interactor function implements the desired behaviour (examples: ZOOM, ROI, Click-ISO, Probing Tools).

- The complete user rwb-interactor overloads the functionality of the default interactor, processes the input sensors itself (via RWB-Lib) and implements the desired functionality (e.g. Spring Manipulators, Virtual Particle Steering).
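A minimal sketch (our own illustration; the RWB-Library API is not public, so all names here are assumptions) of the first scheme: a user interactor that runs the default interactor for intersection tests and state handling, then adds tool-specific behaviour on top.

struct InteractionState { bool picked; float point[3]; };

class RwbInteractor {                      // stand-in for the default interactor
public:
    virtual ~RwbInteractor() = default;
    virtual InteractionState update() {    // intersection checks, event automaton
        return InteractionState{false, {0.0f, 0.0f, 0.0f}};
    }
};

class ZoomInteractor : public RwbInteractor {
public:
    InteractionState update() override {
        // 1. Let the default interactor do selection and state transitions.
        InteractionState s = RwbInteractor::update();
        // 2. Implement the tool behaviour on top of the obtained state,
        //    e.g. set the focal point and start adjusting the zoom factor.
        if (s.picked) { /* tool-specific behaviour */ }
        return s;
    }
};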

The activation of the user rwb-interactors can be initiated by clicking on a widget button. The deactivation of an rwb-interactor can be controlled by the tool itself or by activating another interactor. For example, the zoom tool is automatically deactivated after a zoom action, and control is returned to the default rwb-interactor. The design of the interactor determines the way the tool works. The event-based interactors, the straightforward co-ordinate transformations between the Plexipad and the stylus, and the simulator environment, all provided by the RWB-Library, allow rapid development of new (two-handed) interaction tools.

6. APPLICATIONS
We will show the interaction scenarios described above in two applications. Visualization and steering of MD simulations have been an important inspiration for our research, and within the scope of MD visualization research we have developed most of the presented interaction techniques. To prove their wider application domain, we have also applied them in other case studies.

6.1 Particle Steering in Molecular Dynamics
In our VR lab we have developed the MolDRIVE system, a VR system for visualization and steering of real-time, remotely running MD simulations [5,7]. As described earlier, MD simulations are used to study the properties and behaviour of particle systems. In the MolDRIVE system we work with particle data (positions, force and velocity vectors) and volumetric data. The volumetric data consists of regularly structured grids of scalar data (e.g. kinetic energy, potential energy, particle density) and vector data (e.g. force fields). As we are dealing with a real-time simulation, the VE content is updated each time we receive new data. We have used double-buffer data management so that the update of visualization tools is not disturbed when the simulation delivers new data.
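A minimal sketch (our own illustration) of such double-buffer data management: the simulation receiver fills a back buffer while the visualization reads the front buffer, and the buffers are swapped only when a complete time-step has arrived. The names and the frame layout are assumptions, and a production version would need explicit synchronization between the receiver and the renderer.

#include <atomic>
#include <vector>

struct Frame { std::vector<float> particles, volume; };

class DoubleBuffer {
    Frame buffers[2];
    std::atomic<int> front{0};                // index the renderer reads from
public:
    Frame& backBuffer()        { return buffers[1 - front.load()]; }  // receiver writes here
    const Frame& frontBuffer() { return buffers[front.load()]; }      // renderer reads here
    void publish()             { front.store(1 - front.load()); }     // after a full time-step
};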

Growing interest in the ability to steer particles in running simulations has led us to implement particle steering tools. The most reliable steering is provided by the Spring Force Manipulator [5], which has been derived from the Spring Manipulation Tools and provides visual force feedback during manipulation.

The task of particle steering can be effectively assisted by the direct data slicer. A higher-level interaction is achieved when, for example, the stylus is used for pulling a particle while the Plexipad is used to gain information on the potential energy around the particle (Figure 12). In MD simulations the particles usually prefer to move from higher to lower potential. Using the direct data slicer, the user can see what the most efficient trajectory for the particle would be, and then use the steering tool (Spring Manipulator) to drag the particle in that direction. The interaction of the non-dominant hand itself does not influence the reference frame of the stylus; instead, the visualization tool in the non-dominant hand presents valuable information that allows the user to redirect the steering actions of the dominant hand.

Figure 12. Molecular Dynamics: The direct data slicer is used to display potential energy around an individual atom during particle steering with the stylus.

6.2 Visualization of Cloud Simulations
Atmospheric simulations are usually very complex and computationally intensive, and they produce large time-dependent datasets. In this case study we are dealing with data originating from a Large Eddy Simulation (LES) of cumulus clouds. Scientists study atmospheric boundary layers to get a better understanding of the turbulent dynamics and the behaviour of clouds (cumulus, stratocumulus). Turbulent convective motions are very important since they are responsible for the vertical transport of heat, moisture and pollutants. The presence of clouds in the boundary layer makes the dynamics even richer, but forms an additional complication due to the phase changes (condensation/evaporation) [8]. The LES produces a large dataset (20 GB) with grid dimensions of 128x128x80 and 600 time-steps, which represents one hour of simulated clouds in an area of 6x6x3 km. The key quantities are momentum, pressure, temperature and moisture. It is a real challenge to be able to interactively visualize and browse through such a large time-dependent dataset. The exploring user is searching for an interesting cloud with a complete life cycle inside the simulated time-steps. The spatial relations and the simulated physical properties around the selected cloud have to be explored in detail.

Figure 13. Atmospheric visualization: the direct vector slicershows flow momentum around clouds (iso-surfaces)

The presented two-handed interaction scenarios have also been applied in this case study. The exploration process begins with a quick search through the dataset. The Plexipad as object container holds our time-control widget, while the stylus can be used to operate it, as well as to navigate in the visualization of the cloud field (see color section, Figure 14). The direct data slicer can be attached to the Plexipad as well, enabling highly interactive exploration of the data (Figure 13). The stylus can be used to probe the data on the surface of the direct data slicer. While using the direct data slicer, the stylus can also operate the rest of the VE, using the 3D GUI to change, for example, the visualized data, or to adjust the color mapper. A color mapper widget can also be attached to the data slicer, enabling color mapping adjustments without changing the view context. The LOI tool can be used to calibrate the color mapper on the selected data range.

7. CONCLUSIONS AND FUTURE WORK
We presented intuitive interaction scenarios for the Plexipad and the stylus, based on well-founded two-handed interaction paradigms. The described interaction scenarios provide a solid approach to the development of intuitive navigation and probing tools. The use of two-handed scenarios proved valuable for the development of volumetric visualization tools in projection-based, semi-immersive VEs like the Responsive Workbench. The complex two-handed interaction scenario, holding the two dimensions of the Plexipad in one hand and controlling the third dimension with the stylus in the other, allows the creation of complex but intuitive interaction tools. This two-handed synergy between the stylus and the Plexipad allowed intuitive exploration and probing of volumetric datasets.

The combination of the zoom tool, the region of interest tool and the mini system allows a flexible way of navigating through the data and focusing on interesting regions while maintaining context. The presented direct data slicer can be used to quickly probe large datasets and scan for interesting phenomena in the 3D data space, even during navigation. The tactile and visual feedback provided by this tool also provides a spatial reference plane for various 3D probing tools, allowing a more accurate placement of measurement tools. Moreover, the presented interaction tools are not computationally intensive, and their performance is independent of the data size.

Successful employment of the presented interaction concepts requires accurate tracking. The registration of the Plexipad with its virtual counterpart has to be as accurate as possible to achieve good tactile feedback, and thereby a more intuitive interaction experience, when using the stylus on the Plexipad. As we are using electromagnetic trackers, we had to implement a special calibration scheme to deal with tracking errors, especially in the orientation of the Plexipad. Current trends suggest optical tracking as a good alternative.

We have presented the interaction tools and this new way of exploring data in 3D to the scientists whose data we visualize, and they very quickly became familiar with the exploration interface. When asked what is intuitive about this concept, they pointed to exactly the pen and notepad, which form a natural interaction pair. Our conclusion at this point is that intuitive data visualization and exploration tools for VR should relate to the real tools and interaction paradigms that scientists use in the real world when making their observations. Although we give them a slightly different look and functionality, people have no problems using them. Another aspect is that people are used to working in the 3D of the real world. When performing tasks like drawing, construction or measuring, they usually search for a supporting plane or a reference; measuring freely in the air is difficult. Passive haptic feedback and constraints are therefore significant. In our solution they are provided by the transparent Plexipad, which is augmented with virtual objects and tools with a straightforward meaning.

Currently we employ the interaction techniques in various visualization applications, some of which were shown as examples in this paper. We are working on extending our toolset with intuitive measurement tools. In this framework we will be able to control various visualization and measurement tools to explore and analyze large multi-modal and multidimensional datasets. The use of our probing tools in combination with complex time-critical visualization tools will allow us to explore and analyze large and complex datasets more effectively.

8. ACKNOWLEDGMENTS
We would like to thank J. van Hees, J. den Hertog, Dr. A.F. Bakker and Dr. H.J.J. Jonker for their valuable contributions to the development and testing of the various interaction techniques.

9. REFERENCES
[1] D. Schmalstieg, L.M. Encarnação, Z. Szalavári, "Using Transparent Props for Interaction with the Virtual Table", Proc. ACM Symposium on Interactive 3D Graphics '99, pp. 147-154

[2] L.D. Cutler, B. Fröhlich, P. Hanrahan, "Two-Handed Direct Manipulation on the Responsive Workbench", Proc. ACM Symposium on Interactive 3D Graphics '97, pp. 107-114

[3] Z. Szalavári, M. Gervautz, "The Personal Interaction Panel - a Two-Handed Interface for Augmented Reality", Computer Graphics Forum (Proc. Eurographics '97), 16(3), pp. 335-346

[4] Y. Guiard, "Asymmetric Division of Labor in Human Skilled Bimanual Action: The Kinematic Chain as a Model", Journal of Motor Behavior, 19(4):486-517, 1987

[5] M. Koutek, J. van Hees, F.H. Post, A.F. Bakker, "Virtual Spring Manipulators for Particle Steering in Molecular Dynamics on the Responsive Workbench", Proc. Eurographics Virtual Environments 2002, Barcelona, pp. 53-62

[6] M. Koutek, F.H. Post, "The Responsive Workbench Simulator: a Tool for Application Development and Analysis", Proc. WSCG 2002, Pilsen, Czech Republic, pp. 235-244

[7] J. van Hees, J. den Hertog, "MolDRIVE: a System for Remote Interactive MD Simulations on a Virtual Reality Responsive Workbench", M.Sc. Thesis, TU Delft, March 2002

[8] A.P. Siebesma, H.J.J. Jonker, "Anomalous Scaling of Cumulus Cloud Boundaries", Physical Review Letters, 85(1), pp. 214-217

[9] A. van Dam, A.S. Forsberg, D.H. Laidlaw, J. LaViola, Jr., R.M. Simpson, "Immersive VR for Scientific Visualization: A Progress Report", IEEE Computer Graphics & Applications, Nov/Dec 2000, pp. 26-52

[10] R. van de Pol, W. Ribarsky, L. Hodges, F.H. Post, "Interaction Techniques on the Virtual Workbench", Proc. Eurographics Virtual Environments '99 Workshop, Springer, Vienna, pp. 157-167

[11] J. LaViola, "MSVT: A Virtual Reality-Based Multimodal Scientific Visualization Tool", Proc. Third IASTED International Conference on Computer Graphics and Imaging, pp. 1-7, Nov. 2000

[12] R. Lindeman, J. Sibert, J. Hahn, "Hand-Held Windows: Towards Effective 2D Interaction in Immersive Virtual Environments", Proc. IEEE Virtual Reality '99, pp. 205-212

[13] S. Coquillart, G. Wesche, "The Virtual Palette and the Virtual Remote Control Panel: A Device and an Interaction Paradigm for the Responsive Workbench", Proc. IEEE Virtual Reality '99, pp. 213-216

[14] K. Hinckley, R. Pausch, J.C. Goble, N.F. Kassell, "Passive Real-World Interface Props for Neurosurgical Visualization", Proc. ACM CHI '94, pp. 452-458

[15] P. Kabbash, W. Buxton, A. Sellen, "Two-Handed Input in a Compound Task", Proc. ACM CHI '94, pp. 417-423

[16] E.A. Bier, M.C. Stone, K. Pier, W. Buxton, T. DeRose, "Toolglass and Magic Lenses: The See-Through Interface", Proc. SIGGRAPH '93, pp. 73-80