July 16th, 2008 - August 16th, 2008
CHICAGO - The National Science Foundation (NSF) recently awarded a $450,000 Major Research Instrumentation grant to the University of Illinois at Chicago’s Electronic Visualization Laboratory (EVL) to build OmegaTable, a modular multi-sensory touch tabletop for interactive 2D and 3D visual data exploration.
This powerful virtual-reality device will enable scientific communities to view, share, and interact with large-scale 2D and 3D data simultaneously, and will enable computer scientists to study the integration of multi-sensory touch and gestural interaction techniques for seamlessly manipulating both 2D and 3D data.
“Integrated visualization instruments with powerful computing capabilities are becoming important in domain science because scientists have access to more and more types of electronic data,” says EVL director Jason Leigh. “These displays are the new microscopes and telescopes, enabling researchers to magnify and zoom in on interesting phenomena in today’s digital world.”
EVL’s first interactive tabletop display prototype, LambdaTable, was the buzz of Supercomputing 2007, a major annual forum for high-performance computing and networking, held last November in Reno, Nevada. There, the EVL team used LambdaTable to magnify, resize, and rearrange ultra-high-resolution 2D scientific imagery, from rat brain mitochondria to space nebulae, in real time using infrared (IR)-light-tracked pucks.
LambdaTable is a 24-megapixel tabletop display system built from six LCD panels tiled in a 2x3 configuration, each with a resolution twice that of high-definition television. OmegaTable will have a resolution of at least 24 million pixels and will be able to display 2D and autostereoscopic 3D content simultaneously.
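The pixel arithmetic above can be verified with a quick calculation; note that the 2560x1600 per-panel resolution used below is an assumption (a common "twice HD" LCD of that era), not a figure stated in this article:

```python
# Sanity-check the article's pixel arithmetic.
# NOTE: the 2560x1600 per-panel resolution is an assumption, not stated above.
hd_pixels = 1920 * 1080            # one HDTV (1080p) frame
panel_pixels = 2560 * 1600         # assumed per-LCD resolution
total_pixels = 6 * panel_pixels    # six panels in a 2x3 tiling

print(round(panel_pixels / hd_pixels, 2))  # ~1.98, i.e. roughly twice HD
print(total_pixels)                        # 24576000, i.e. ~24 megapixels
```

Under that assumption, each panel is about twice an HDTV frame, and six of them together reach the quoted 24-million-pixel total.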
For years, EVL has been at the forefront of real-time, tracked autostereoscopic technology with its Varrier research, which lets viewers see stereoscopic 3D content without wearing special glasses. By incorporating gestural interaction, the OmegaTable will finally allow users to experience virtual reality unencumbered by special glasses, hand-held controls, or gloves.
Interaction is crucial for meaningful data interpretation, enabling research scientists to explore “what if” scenarios, or to explain complex scientific concepts to public audiences within a museum context. The Science Museum of Minnesota is building a LambdaTable as a traveling exhibit that will allow the public to interact with a virtual watershed application. It will be added to the “Water: H2O = Life” exhibit currently traveling across North America when it passes through Minnesota in spring 2009.
Eight leading domain-science research and education institutions have already expressed interest in testing and adopting OmegaTable, including the Science Museum of Minnesota; the National Center for Microscopy and Imaging Research at the University of California, San Diego; and the Pacific Rim Applications and Grid Middleware Assembly, which promotes large-scale collaborations requiring advanced cyberinfrastructure.
“We’re pleased to have this critical NSF funding to construct this next-generation device, and a scientific community eager to work with us,” says Leigh. “As is the goal of all of our visualization display technologies, we expect that the OmegaTable will transform science team workflows by providing new and more intuitive ways of seeing and interacting with information.”
LambdaTable is a tiled LCD tabletop display connected to high-bandwidth optical networks that supports interactive group-based visualization of high-resolution data.
LambdaTable’s camera-tracking system is a scalable multi-camera computer vision architecture designed to track input from many simultaneous users interacting with a variety of interface devices. The display itself is a tiled array of six LCD panels driven by a seven-node cluster (including one node for tracking), with a combined resolution of 24 megapixels.