TeraVision was designed as a hardware-assisted solution for distributing real-time video for graphics-rich scientific visualization and collaborative applications. The system is best described as a PowerPoint projector with a gigabit network card: users can walk up to a TeraVision box and stream the video output of their laptops, workstations, or even graphics clusters simply by plugging the VGA/DVI output of the rendering machine into the TeraVision box. The sending TeraVision box is built around a PC with a high-resolution video capture card and a gigabit network adapter.
The receiving end, however, does not require any special hardware to display the streamed video, as long as it has a comparable network adapter and a decent graphics card. The receiver can be a laptop, a workstation-style display machine, or even a tiled display. The software is open-source and available for Linux and Windows.
Figure 1: The system is capable of taking inputs from various kinds of hardware and can stream the video, uncompressed (or compressed) to a variety of displays.
Figure 2: A user streams her 1600x1200 laptop screen to EVL’s 6400 x 3072 pixel tiled display at 12 fps.
Figure 3: In this picture, three remote TeraVision servers are streaming high-resolution video to three different sections of the tiled display. The video streams are independent of one another, but the screens within each stream are synchronized.
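The per-stream screen synchronization shown in Figure 3 is commonly achieved on tiled displays with a frame-swap barrier: every display node renders its next frame, then waits until all nodes are ready before swapping buffers, so the tiles update as one coherent surface. The sketch below illustrates the idea with Python threads standing in for display nodes; the structure and names are hypothetical, not TeraVision's actual implementation.

```python
import threading

NUM_NODES = 3                         # display nodes driving one stream's tiles
swap_barrier = threading.Barrier(NUM_NODES)
swapped = []                          # (frame, node) pairs, in swap order

def display_node(node_id, frames):
    for frame in frames:
        # ... decode and render `frame` into the node's back buffer ...
        swap_barrier.wait()           # block until every node has this frame ready
        swapped.append((frame, node_id))  # all nodes swap together

threads = [threading.Thread(target=display_node, args=(i, [0, 1, 2]))
           for i in range(NUM_NODES)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# The barrier guarantees no node starts frame N+1 before all finish frame N.
frames_in_order = [f for f, _ in swapped]
print(frames_in_order == sorted(frames_in_order))  # True
```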
1.2 Advantages of the hardware-assisted approach
The advantages of such a hardware-assisted approach are significant for compute-intensive applications.
- A 1600x1200 desktop at 20 frames per second produces a video stream of roughly 920 Mbps. Compressing this data in real time at near-lossless quality is beyond the reach of modern PCs, so the video must be transmitted uncompressed. Streaming at this rate places a considerable load on the CPU; if the machine responsible for rendering also has to stream the video, this quickly becomes a problem, especially for CPU-intensive graphics applications. TeraVision offloads network streaming to a dedicated machine that acts as a video server for the display clients.
- If an application has to stream its own video, the software must be designed for it. By capturing the video output of the rendering machine with a capture card, TeraVision allows that output to be streamed without any changes to the source machines or applications.
- TeraVision provides a framework for network protocol developers to try different protocols for streaming over LFNs (Long Fat Networks). Currently it supports TCP, UDP, Multicast and EVL’s LambdaStream protocol for wide-area streaming.
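The 920 Mbps figure above follows directly from frame geometry: width × height × bits per pixel × frame rate. A quick check of the numbers quoted in this document (assuming 24-bit color):

```python
def raw_stream_mbps(width, height, fps, bits_per_pixel=24):
    """Bandwidth of an uncompressed video stream, in megabits per second."""
    return width * height * bits_per_pixel * fps / 1e6

print(raw_stream_mbps(1600, 1200, 20))  # 921.6 -> the ~920 Mbps desktop case
print(raw_stream_mbps(1600, 1200, 12))  # 552.96 -> at the demonstrated 12 fps
```

At the 12-15 fps rates reported later in this document, the same desktop still needs over half a gigabit per second, which is why gigabit network adapters are a baseline requirement.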
Version 3.0 of the system is currently under development; the following features are supported. Readers are encouraged to consult the recent TeraVision paper submitted to Cluster 2004 for a better understanding of the system's capabilities.
- Integration with SAGE (the Scalable Adaptive Graphics Environment) will allow users to stream the output of TeraVision boxes to SAGE displays, enabling collaborative sessions within the SAGE framework.
- All TeraVision installations register with a central 'TV Station' which allows users around the world to 'tune' into TeraVision channels for collaboration or as passive viewers.
- Multicasting capabilities allow multiple sites to collaborate simultaneously. The system was recently tested over a wide-area gigabit network at SC Global 2004 at speeds up to 500 Mbps.
- Optical multicasting capability (if properly configured)
- An improved user interface to help users configure and debug the system easily. Users can 'tune' into TeraVision channels on the network, or start their own channel and let other users receive the broadcast video.
- Later releases of this version will also incorporate real-time audio streaming, allowing users to conduct HDTV-quality video conferencing between multiple sites.
- Beta-level system integration with the Access Grid interface
- The system has been tested streaming desktops at 1600x1200 at 12-15 fps and HDTV (1080i) at 24 fps.
- Network protocols supported include TCP, UDP, Multicast and LambdaStream.
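To give a feel for what the reliable (TCP) path involves, the sketch below frames each captured image with a small fixed header so the receiver knows how many bytes make up the frame. The 12-byte header layout (sequence number, width, height) is invented for illustration; it is not TeraVision's actual wire format.

```python
import socket
import struct
import threading

# Hypothetical 12-byte header: frame sequence number, width, height (pixels).
HEADER = struct.Struct("!III")

def send_frame(sock, seq, width, height, pixels):
    sock.sendall(HEADER.pack(seq, width, height) + pixels)

def recv_exact(sock, n):
    """TCP is a byte stream, so loop until exactly n bytes arrive."""
    buf = b""
    while len(buf) < n:
        chunk = sock.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("stream closed mid-frame")
        buf += chunk
    return buf

def recv_frame(sock):
    seq, w, h = HEADER.unpack(recv_exact(sock, HEADER.size))
    return seq, w, h, recv_exact(sock, w * h * 3)  # 24-bit RGB payload

# Loopback demo: stream one tiny 4x2 "frame" to ourselves.
server = socket.socket()
server.bind(("127.0.0.1", 0))
server.listen(1)
port = server.getsockname()[1]

def client():
    c = socket.create_connection(("127.0.0.1", port))
    send_frame(c, 0, 4, 2, bytes(4 * 2 * 3))
    c.close()

t = threading.Thread(target=client)
t.start()
conn, _ = server.accept()
seq, w, h, payload = recv_frame(conn)
t.join()
conn.close()
server.close()
print(seq, w, h, len(payload))  # 0 4 2 24
```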
Future releases will incorporate the following features:
- Control plane security. Currently, all control plane information is sent over unencrypted TCP sockets, which is a potential security hazard and must be addressed before the system is widely deployed.
- Multicasting tiled displays. The current system can only multicast a single video stream from one computer to a single client at each remote site.
- Incorporating real-time extensions provided by the operating system to further improve the synchronization of the system running across machines.
- Improved performance monitoring and logging capability.
- Compression over unreliable protocols. Special care must be taken when sending compressed data over unreliable channels such as UDP. The current implementation has only one compression module, which works over TCP.
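One common way to make compression safe over an unreliable channel like UDP is to compress every frame independently, so a lost datagram costs at most one frame rather than desynchronizing the decoder for the rest of the stream. The sketch below illustrates the idea with zlib; this is a generic technique, not a description of TeraVision's actual compression module.

```python
import zlib

def compress_frame(frame):
    # A fresh zlib stream per frame: decompression never depends on state
    # carried over from earlier (possibly lost) frames.
    return zlib.compress(frame)

def decompress_frame(packet):
    return zlib.decompress(packet)

frames = [bytes([i]) * 1000 for i in range(5)]
packets = [compress_frame(f) for f in frames]

# Simulate UDP loss: the datagram carrying frame 2 never arrives.
received = [p for i, p in enumerate(packets) if i != 2]

# Every surviving frame still decodes correctly on its own.
decoded = [decompress_frame(p) for p in received]
print([d[:1] for d in decoded])  # [b'\x00', b'\x01', b'\x03', b'\x04']
```

In practice each packet would also carry a sequence number (as in the TCP framing sketch earlier) so the receiver can detect which frames were dropped.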