We are planning a number of experiments to see whether people can directly experience a four-dimensional space. We plan to use the PHANToM to give people the experience of moving objects around in 4D while feeling those objects push back. It would be interesting to use two of these devices simultaneously, so that the user can pick up objects between two fingers or hands.
One tricky thing here will be to design an experiment that provides a "reasonable" mapping between an intuitive sense of a four-dimensional space and the three-dimensional space in which the PHANToM (as well as the user) actually exists.
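One candidate mapping, sketched below purely as an illustration (the object, force model, and control scheme are all our assumptions, not an implemented design), lets the stylus supply x, y, z while a separate control supplies w: the device then renders the 3D cross-section of a 4D object at the current w as a simple spring force.

```python
import math

def hypersphere_force(p3, w, radius=1.0, stiffness=200.0):
    """Force at 3D stylus position p3 when probing the w = const slice
    of a 4D hypersphere (hypothetical mapping: stylus gives x, y, z;
    a separate control gives w). Returns a 3D force vector."""
    x, y, z = p3
    d = math.sqrt(x * x + y * y + z * z + w * w)  # 4D distance from center
    if d >= radius or d == 0.0:
        return (0.0, 0.0, 0.0)                    # outside the object: no force
    depth = radius - d                            # 4D penetration depth
    # Push back along the 3D part of the 4D surface normal.
    return (stiffness * depth * x / d,
            stiffness * depth * y / d,
            stiffness * depth * z / d)
```

As w is swept toward the hypersphere's radius, the felt cross-section shrinks and finally vanishes, which is exactly the kind of behavior the experiments would probe.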
We can build on earlier work in procedural volume-filling textures, first published as Hypertexture (Ken Perlin and Eric Hoffert, Computer Graphics [Proceedings of ACM SIGGRAPH], Vol. 23, No. 3, 1989).
There is a natural extension of this work to volume-filling haptic textures. Such textures will enable users to directly "feel" properties associated with different regions of space. These have natural markets in medical, financial, and physics simulation packages.
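One way such a haptic texture might work, sketched here under our own assumptions (a simple hash-based lattice noise stands in for the gradient noise used in hypertexture; the force law is hypothetical), is to render the gradient of a procedural scalar field as a force, so that regions of space acquire a felt "grain":

```python
import math

def lattice_noise(x, y, z):
    """Smooth scalar field in [0, 1]: trilinear interpolation of hashed
    lattice values (a simple stand-in for Perlin-style noise)."""
    def h(i, j, k):
        # hypothetical integer hash -> pseudo-random value in [0, 1]
        n = (i * 374761393 + j * 668265263 + k * 1103515245) & 0xFFFFFFFF
        n = ((n ^ (n >> 13)) * 1274126177) & 0xFFFFFFFF
        return (n & 0xFFFF) / 65535.0
    i, j, k = math.floor(x), math.floor(y), math.floor(z)
    u, v, w = x - i, y - j, z - k
    def s(t): return t * t * (3 - 2 * t)          # smoothstep fade
    u, v, w = s(u), s(v), s(w)
    def lerp(a, b, t): return a + (b - a) * t
    return lerp(
        lerp(lerp(h(i, j, k),     h(i + 1, j, k),     u),
             lerp(h(i, j + 1, k), h(i + 1, j + 1, k), u), v),
        lerp(lerp(h(i, j, k + 1),     h(i + 1, j, k + 1),     u),
             lerp(h(i, j + 1, k + 1), h(i + 1, j + 1, k + 1), u), v), w)

def texture_force(p, stiffness=5.0, eps=1e-3):
    """Haptic force from the texture field: push the stylus down the
    noise gradient (central differences), so high-noise regions of
    space feel like resistance."""
    x, y, z = p
    gx = (lattice_noise(x + eps, y, z) - lattice_noise(x - eps, y, z)) / (2 * eps)
    gy = (lattice_noise(x, y + eps, z) - lattice_noise(x, y - eps, z)) / (2 * eps)
    gz = (lattice_noise(x, y, z + eps) - lattice_noise(x, y, z - eps)) / (2 * eps)
    return (-stiffness * gx, -stiffness * gy, -stiffness * gz)
```

In a real system the field would be evaluated inside the device's high-rate servo loop; the point of the sketch is only that any procedural volume texture with a computable gradient yields a volume-filling force field for free.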
Ken Perlin and Luiz Velho developed infinite-resolution painting systems that allow the embedding of procedurally generated textures. This was first published as Live Paint: Painting with Procedural Multiscale Textures (Ken Perlin and Luiz Velho, Computer Graphics, Vol. 28, No. 3).
Haptic extensions of this research will allow users to haptically interact with multiple levels of detail of textured surfaces. In the course of this, we will establish techniques to exploit the advantages of wavelet representations for haptic modeling and feedback.
Aaron Hertzmann's recent work on naturalistic painted textures was presented as a technical paper at the SIGGRAPH conference in July 1998.
Our haptic research plan for this work is to extend it to allow users to directly feel various styles of painted surfaces. This will allow viewers of virtual oil paintings to experience true immersion, which will greatly enhance the physical immediacy of such media.
Professor Arie Kaufman at the State University of New York at Stony Brook has built a highly successful research system that combines his volumetric rendering work with haptic feedback to create a virtual sculpting system. The user directly manipulates a volumetric representation, using a PHANToM as a sculpting device.
We propose to extend this research, so that materials to be sculpted can have different textural qualities. This will greatly enhance the sense of realism of the interaction, and should allow for finer manipulation.
Our Improv project is in the area of responsive interactive animated characters. This work has been showcased at the SIGGRAPH Electronic Theatre, at major exhibits in the SIGGRAPH Digital Bayou, and in a number of key technical papers, including Improv: A System for Scripting Interactive Actors in Virtual Worlds (Ken Perlin and Athomas Goldberg, Computer Graphics, Vol. 29, No. 3).
We are continuing a concerted focus on making such characters more physically present in the environment of the user. For example, we are creating autostereoscopic display environments, so that responsive animated characters appear not as figures on a screen, but rather as embodied characters that seem to be in the physical space of the viewer.
We plan to use haptic feedback to allow users to physically interact with these characters, directly engaging them through touch. We believe that this will powerfully enhance the sense of believability of interactive animation.
Denis Zorin has recently joined our department's faculty and our laboratory. Professor Zorin has published landmark work in multiresolution surface representation. His surface techniques were showcased in the recent Academy Award winning Pixar animation Geri's Game.
The PHANToM is an extremely valuable device for modeling. The interactive multiresolution modeling system that Professor Zorin has developed would greatly benefit from haptic input. The modeling system was designed primarily with conceptual 3D design and animation in mind; it allows its user to manipulate highly complex geometric models, obtained, for example, from a laser scanner. Modification of such models can be performed in a much more natural way when haptic input is available. Developers can create a variety of haptic sculpting tools to perform such actions as creating grooves of varying profile, pinching, and smoothing. Haptic input can also be used to provide feedback when painting and drawing on surfaces.
In the future, Professor Zorin plans to integrate fast algorithms for dynamic simulation and collision detection into the system. In combination with haptic input and a VR system such as the Responsive Workbench, this will allow the creation of a prototype virtual sculpting system in which the manipulation of a virtual object closely resembles the manipulation of a real object. A key goal of this work will be the creation of a standard software component to efficiently represent the force signature of a multiresolution mesh at differing levels of detail.
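To make the idea of such a component concrete, here is a hypothetical sketch (the class name, the 1D stiffness profile, and the speed-based level selector are all our assumptions, not the planned design): coarser levels are built by wavelet-style averaging of adjacent samples, and fast stylus motion selects a coarser level, since fine detail cannot be rendered faithfully at speed.

```python
class ForceSignature:
    """Hypothetical multiresolution force representation: one 1D stiffness
    profile per level of detail, with coarser levels built by averaging
    adjacent samples (wavelet-style low-pass)."""

    def __init__(self, base_profile):
        # base_profile: finest-level stiffness samples (length a power of two)
        self.levels = [list(base_profile)]
        while len(self.levels[-1]) > 2:
            prev = self.levels[-1]
            self.levels.append([(prev[i] + prev[i + 1]) / 2.0
                                for i in range(0, len(prev) - 1, 2)])

    def force(self, t, speed):
        """Stiffness at parameter t in [0, 1]; higher stylus speed picks a
        coarser level (the speed-to-level rule here is an assumption)."""
        level = min(int(speed), len(self.levels) - 1)
        samples = self.levels[level]
        i = min(int(t * (len(samples) - 1)), len(samples) - 2)
        frac = t * (len(samples) - 1) - i
        return samples[i] * (1 - frac) + samples[i + 1] * frac
```

A slow-moving stylus would feel every bump of the finest profile, while a fast sweep over the same surface would feel only its averaged, low-frequency shape.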
These modeling tools can also serve as a basis for an animation system: in this case, a different spectrum of tools with haptic feedback can be used to specify motion of the characters, much as one can move a marionette; haptic feedback is crucial for the subtle adjustments required in character animation.
There may also be opportunities to use haptic feedback in financial information visualization and training. For example, an analyst could literally feel the indentation of a highlighted trend within a volumetric visualization of a financial forecast.