Cristian Luciano, Pat Banerjee, Lucian Florea, Greg Dawe
Electronic Visualization Laboratory, Industrial Virtual Reality Institute
University of Illinois at Chicago
842 West Taylor St., Chicago, IL 60607
{clucia1, banerjee, lflore11}@uic.edu; dawe@evl.uic.edu
Abstract
ImmersiveTouch™ is the next generation of augmented virtual reality technology: the first system to integrate a haptic device with a head- and hand-tracking system and a high-resolution, high-pixel-density stereoscopic display. Its ergonomic design provides a comfortable working volume within the space of a standard desktop. The haptic device is collocated with the 3D graphics, giving the user a more realistic and natural means to manipulate and modify 3D data in real time. This high-performance, multisensory computer interface enables straightforward development of medical, dental, engineering, and scientific virtual reality simulation and training applications that engage several senses: auditory, visual, tactile, and kinesthetic.
1. Introduction
ImmersiveTouch™ [1, 2] is a new haptics-based, high-resolution augmented virtual reality system that provides an efficient way to display and manipulate three-dimensional data for training and simulation purposes. It is a complete hardware and software solution (Figure 1). The hardware integrates 3D stereo visualization, force feedback, head and hand tracking, and 3D audio. The software provides a unified API (Application Programming Interface) that handles volume processing, graphics rendering, haptics rendering, 3D audio feedback, and interactive menus and buttons. This paper describes the design process of both the hardware and the software of the ImmersiveTouch™ prototype. The following section explains the problems of current virtual reality systems and how they motivated the design of this system. The hardware constraints considered to achieve the
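To make the role of such a unified API concrete, the following C++ fragment is a minimal, hypothetical sketch of a facade that coordinates the subsystems named above (volume processing, graphics rendering, haptics rendering, audio). The class and function names (Simulation, loadVolume, startHaptics, renderFrame) are illustrative assumptions, not the actual ImmersiveTouch™ interface; the only design point taken from the paper is that one API hides several concurrently running rendering layers from the application developer.

```cpp
// Hypothetical sketch of a unified simulation API: one facade object that
// owns the volume data, the graphics loop, and a separate high-rate haptics
// loop. All names are illustrative; they do not come from ImmersiveTouch(TM).
#include <atomic>
#include <chrono>
#include <iostream>
#include <string>
#include <thread>

class Simulation {
public:
    // Load and preprocess volumetric data (e.g., a CT data set).
    void loadVolume(const std::string& file) {
        std::cout << "preprocessing volume: " << file << "\n";
    }

    // Haptic force rendering typically needs a much higher update rate
    // (~1 kHz) than graphics (~60 Hz), so it runs on its own thread here.
    void startHaptics() {
        hapticsRunning = true;
        hapticsThread = std::thread([this] {
            while (hapticsRunning) {
                // collision detection + force response against the volume
                std::this_thread::sleep_for(std::chrono::milliseconds(1));
            }
        });
    }

    // One graphics frame: read head/hand trackers, update the stereo view so
    // the graphics stay collocated with the haptic stylus, draw, play audio.
    void renderFrame() {
        // updateHeadTracking(); drawStereoScene(); updateAudio();
    }

    void stop() {
        hapticsRunning = false;
        if (hapticsThread.joinable()) hapticsThread.join();
    }

private:
    std::thread hapticsThread;
    std::atomic<bool> hapticsRunning{false};
};

int main() {
    Simulation sim;
    sim.loadVolume("skull.vtk");   // file name is illustrative only
    sim.startHaptics();
    for (int frame = 0; frame < 180; ++frame) {   // ~3 s of frames at ~60 Hz
        sim.renderFrame();
        std::this_thread::sleep_for(std::chrono::milliseconds(16));
    }
    sim.stop();
    return 0;
}
```

The split into a slow graphics loop and a fast haptics loop mirrors the general practice of haptics toolkits such as GHOST, which run the force servo loop independently of the display refresh.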
References

1. Accommodation/Convergence conflict, http://vresources.jump-gate.com/articles/vre_articles/stereo/sterean2.html
2. Ascension Technologies Corp., pciBIRD API, http://www.ascension-tech.com/products/pcibird.php
3. Creative, OpenAL, http://www.openal.org/
4. Cruz-Neira, C., Sandin, D., DeFanti, T., Kenyon, R., and Hart, J.C., The CAVE: Audio Visual Experience Automatic Virtual Environment, Communications of the ACM, Vol. 35, No. 6, 1992, pp. 65-72.
5. Czernuszenko, M., Pape, D., Sandin, D., DeFanti, T., Dawe, G., and Brown, M., The ImmersaDesk and Infinity Wall Projection-Based Virtual Reality Displays, Computer Graphics, 1997.
6. Czernuszenko, M., Sandin, D., and DeFanti, T., Line of Sight Method for Tracker Calibration in Projection-Based VR Systems, Proceedings of the 2nd International Immersive Projection Technology Workshop, Ames, IA, 1998.
7. Fast Light ToolKit, http://www.fltk.org/
8. Johnson, A., Sandin, D., Dawe, G., DeFanti, T., Pape, D., Qiu, Z., Thongrong, S., and Plepys, D., Developing the PARIS: Using the CAVE to Prototype a New VR Display, Proceedings of IPT 2000: Immersive Projection Technology Workshop, Ames, IA, 2000.
9. Kitware Inc., Visualization ToolKit 4.5, http://www.vtk.org/
10. LaserAid, SpaceGrips, http://www.spacegrips.com/spacegrips.htm
11. Reachin Display, http://www.reachin.se/
12. SensAble Technologies, GHOST 4.0, http://www.sensable.com/
13. SenseGraphics 3D-MIW, http://www.sensegraphics.se/3DMIW.pdf
14. Stereographics theory, http://astronomy.swin.edu.au/~pbourke/stereographics/vpac/theory.html
15. Systems in Motion, Coin 2.3, http://www.coin3d.org/
16. The Visible Human Project, http://www.nlm.nih.gov/research/visible/visible_human.html
17. Pape, D., and Sandin, D., Transparently Supporting a Wide Range of VR and Stereoscopic Display Devices, Proceedings of SPIE, Stereoscopic Displays and Virtual Reality Systems VI (The Engineering Reality of Virtual Reality 1999), Vol. 3639, San Jose, CA, 1999.
18. VRCO, CAVELib™ and Trackd®, http://www.vrco.com/
19. Zwern, A., How to Select the Right Head-Mounted Display, Meckler's VR World, 1995, http://www.genreality.com/howtochoose.html