Optimizing Stereo Video Formats for Projection Based Virtual Reality

Author and contact person:
Gary Lindahl
Iowa Center for Emerging Manufacturing Technology
Iowa State University
2062 Black Engineering Building
Ames, IA 50011-2161
(515) 294-3092
(515) 294-5530 fax
lindahl@icemt.iastate.edu

Additional authors:
Tom DeFanti
Electronic Visualization Laboratory
University of Illinois at Chicago
tom@uic.edu

Dan Sandin
Electronic Visualization Laboratory
University of Illinois at Chicago
dan@uic.edu

Greg Dawe
Electronic Visualization Laboratory
University of Illinois at Chicago
dawe@eecs.uic.edu

Maxine Brown
Electronic Visualization Laboratory
University of Illinois at Chicago
maxine@uic.edu

Submission category: Technical

Submission summary:

When designing projection-based virtual reality devices, the integration of LCD shutter glasses, high-end graphics computers, and large-scale video projectors can be difficult. We describe four common problems and their solutions.

1.0 Introduction
Many of today's high-end, projection-based virtual reality (VR) systems generate frame-interleaved video for stereoscopic equipment. Integrating LCD shutter glasses, high-end graphics computers, and large-scale projectors is difficult. This paper identifies problems discovered as part of our VR efforts at the Electronic Visualization Laboratory (EVL), and offers solutions.

2.0 Common Problems
There are four problems commonly encountered when integrating the SGI InfiniteReality(tm) (IR) graphics system [2], the Electrohome Marquee 8500 projector, and StereoGraphics CrystalEyes eyewear: phosphor decay lag (ghosting colors), stereo separation (left and right eye phase lock), video genlock (video vertical phase lock), and video interference and reflection.

2.1 Phosphor Decay Lag (Ghosting Colors).
A CRT projection system is made up of red, green, and blue phosphor tubes; the light from the three tubes combines to create full-color images. The commonly supplied green phosphor tube has a slow decay rate that degrades stereo separation: the green component of a full-color image persists longer than the red and blue components, so as images are updated the viewer still sees the green component of the previous image. This artifact is called "ghosting." The red tube also has a slow decay rate, but the effect is acceptable to most viewers. To fix the lag problem, order the projector with a fast-decay P43 green phosphor tube.
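As a rough illustration of why the decay rate matters, the short C program below models phosphor persistence as a simple exponential decay and compares the residual glow one field later for a hypothetical slow green phosphor and a hypothetical fast-decay P43 tube. The time constants and the 120 Hz field rate are illustrative placeholders, not measured values for any particular projector.

/* Illustrative estimate of residual phosphor glow one field later.
 * Assumes simple exponential decay I(t) = I0 * exp(-t / tau); the
 * time constants are hypothetical, not measured tube data. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double field_time_ms = 1000.0 / 120.0;  /* 120 Hz field rate */
    const double tau_slow_ms   = 4.0;             /* hypothetical slow green phosphor */
    const double tau_p43_ms    = 0.4;             /* hypothetical fast-decay P43 */

    double ghost_slow = exp(-field_time_ms / tau_slow_ms);
    double ghost_p43  = exp(-field_time_ms / tau_p43_ms);

    printf("Residual glow after one field (%.2f ms):\n", field_time_ms);
    printf("  slow green phosphor: %.3f%%\n", ghost_slow * 100.0);
    printf("  P43 green phosphor:  %.6f%%\n", ghost_p43 * 100.0);
    return 0;
}

With these placeholder values the slow phosphor is still emitting roughly a tenth of its peak brightness when the other eye's image is drawn, which is exactly the green ghost described above.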

2.2 Stereo Separation (Left and Right Eye Phase Lock).
Stereo separation, the ability to synchronize stereo images on a single screen, was a problem only recently corrected with the introduction of the IR. To see the problem, display the pattern shown in Figure 1(A) in stereo. Viewed through CrystalEyes glasses, the defect illustrated in Figure 1(B) appears at the top edge of the video frame, where the left- and right-eye images are swapped. SGI corrected this in software by adding pixel lines to the vertical back porch of the video frame, which positions the swapped left/right region above the visible frame. A disadvantage of this solution is that the signal bandwidth can exceed the projector's display bandwidth, and the number of pixels that can be viewed on the screen decreases; those pixels now fall in the hidden portion of the video frame.

Figure 1. Video stereo display with (A) correct separation, (B) incorrect separation, and (C) stereo separation drifts.
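The bandwidth cost of padding the vertical back porch follows from simple timing arithmetic: the visible resolution and refresh rate stay the same, so every added blank line raises the line rate and pixel clock the projector must accept. The numbers in the sketch below (a 1280x1024 stereo format at a 96 Hz field rate, with hypothetical blanking and padding counts) are illustrative only, not the actual IR format tables.

/* Illustrative timing arithmetic: padding the vertical back porch
 * raises the pixel clock for the same visible resolution and refresh
 * rate.  All counts here are hypothetical. */
#include <stdio.h>

int main(void)
{
    const double field_rate_hz   = 96.0;        /* stereo field rate */
    const int    visible_lines   = 1024;
    const int    blank_lines     = 40;          /* hypothetical original blanking */
    const int    added_bp_lines  = 20;          /* hypothetical back-porch padding */
    const int    pixels_per_line = 1280 + 408;  /* visible + hypothetical h-blanking */

    int total_before = visible_lines + blank_lines;
    int total_after  = total_before + added_bp_lines;

    double clock_before_mhz = field_rate_hz * total_before * pixels_per_line / 1e6;
    double clock_after_mhz  = field_rate_hz * total_after  * pixels_per_line / 1e6;

    printf("Pixel clock before padding: %.1f MHz\n", clock_before_mhz);
    printf("Pixel clock after padding:  %.1f MHz\n", clock_after_mhz);
    return 0;
}

If the padded format exceeds what the projector can display, the extra pixels simply fall into the blanked portion of the frame, which is the drawback noted above.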

To recover the off-screen pixels, SGI added a function to the Video Format Compiler (VFC) [2] that adjusts the position of the stereo sync pulse relative to the vertical retrace of the video frame. The code in Figure 2 keeps all pixels visible and uses this VFC option to adjust the stereo separation.
 

Line Based {
    int linesHigh = [user defined];
    int linesLow = [user defined];
    signal "FIELD" initial state = low;
    Transition Lines Range "FIELD" = linesHigh to linesHigh high at BeginTime;
    Transition Lines Range "FIELD" = linesLow to linesLow low at BeginTime;
}

Figure 2. The source code needed to change the stereo sync pulse location.

2.3 Video Genlock (Video Vertical Phase Lock).
Video genlock synchronizes the display of images from multiple video sources onto multiple screens. The CAVE [1] uses multiple video sources, one for each wall, and these sources must be genlocked: each video source locks to the master video source that supplies the stereo sync output. Without locking to the master source, a dark horizontal bar appears and the stereo separation drifts, as shown in Figure 1(C), when viewed through the CrystalEyes glasses. The SGI hardware solution is to connect the genlock sync input to the horizontal sync output of the master video source. The software solution is to configure the video output format of each slave source so that it genlocks to the TTL or nominal video sync levels of the master video output. The software fix is usually the harder of the two, since it requires developing a software-defined rule set that is compatible with the design of the video hardware.
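The consequence of running without genlock can be estimated from the beat frequency of the two vertical rates: any mismatch, however small, makes the dark bar and the stereo phase drift through the image once per beat period. The rates in the sketch below are hypothetical, chosen only to show the scale of the effect.

/* Illustrative beat-frequency calculation for two unlocked video
 * sources.  A small mismatch in vertical rate makes the dark bar
 * (and the stereo phase) drift through the frame once per beat
 * period.  Both rates are hypothetical. */
#include <stdio.h>
#include <math.h>

int main(void)
{
    const double master_hz = 96.000;   /* master source vertical rate */
    const double slave_hz  = 96.002;   /* hypothetical unlocked slave rate */

    double beat_hz = fabs(master_hz - slave_hz);

    printf("Vertical rate mismatch: %.3f Hz\n", beat_hz);
    printf("Drift period: %.0f seconds per full cycle\n", 1.0 / beat_hz);
    return 0;
}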

2.4 Video Interference and Reflection.
Ripple-like defects (noise, or "interference," in the projector) or smearing (an echo-like "reflection" in which images fade and overlap) may appear in the video image frame. The ripple-like defect occurs when the projector does not have enough retrace time to stabilize and scan the video frame cleanly. The VFC lets us add pixels to the video format definition: adding pixels to the horizontal front porch gives the projector more time to retrace and stabilize. Smearing is corrected by using high-bandwidth video cable or, when a longer cable run is required, a video extender over fiber.
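How much breathing room the projector gains is easy to estimate: each pixel added to the horizontal front porch extends the blanking interval by one pixel-clock period. The pixel clock and padding count in the sketch below are hypothetical values, not the actual IR or Marquee 8500 specifications.

/* Illustrative estimate of the extra horizontal retrace time gained
 * by adding pixels to the front porch.  The pixel clock and pixel
 * count are hypothetical, not actual hardware specifications. */
#include <stdio.h>

int main(void)
{
    const double pixel_clock_mhz = 170.0;  /* hypothetical pixel clock */
    const int    added_fp_pixels = 32;     /* hypothetical front-porch padding */

    double pixel_period_ns  = 1000.0 / pixel_clock_mhz;
    double extra_retrace_ns = added_fp_pixels * pixel_period_ns;

    printf("Each added pixel adds %.2f ns of blanking\n", pixel_period_ns);
    printf("%d added front-porch pixels give the projector %.0f ns more time\n",
           added_fp_pixels, extra_retrace_ns);
    printf("to retrace and stabilize\n");
    return 0;
}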

3.0 Conclusion
EVL works with equipment manufacturers to help identify and correct problems such as these as newer generations of computer and display technologies become available for use with projection-based VR systems like the CAVE.

4.0 References
[1] C. Cruz-Neira, D.J. Sandin, T.A. DeFanti, "Surround-Screen Projection Based Virtual Reality: The Design and Implementation of the CAVE," Proceedings of ACM SIGGRAPH 93, pp. 135-142.

[2] J.S. Montrym, D.R. Baum, D.L. Dignam, and C.J. Migdal, "InfiniteReality: A Real-Time Graphics System," Proceedings of ACM SIGGRAPH 97, pp. 293-302.

5.0 Acknowledgments
EVL receives major funding from the National Science Foundation (NSF), the Defense Advanced Research Projects Agency, and the US Department of Energy; specifically NSF awards CDA-9303433, CDA-9512272, NCR-9712283, CDA-9720351, and the NSF ASC PACI program. We also acknowledge Silicon Graphics, Inc. and Advanced Network and Services for their support. CAVE and ImmersaDesk are trademarks of the Board of Trustees of the University of Illinois.