SENSEI-Panama

EVL researchers Jillian Aurisano (lead) and James Hwang explore SENSEI-Panama in EVL's CAVE2™ - L. Long, EVL/UIC

Developers: James Hwang, Jillian Aurisano, Oliver San-Juan, Ishta Bhagat

URL: https://sites.google.com/site/senseipanama/

Funding: NSF Award #CNS-1456638

SENSEI (Sensor Environment Imaging) Instrument is a National Science Foundation award to construct a reconfigurable, ultra-high-resolution, spherical (4π steradian), photometric, radiometric, and photogrammetric real-time data-acquisition, sensor-based camera system capable of capturing 3D stereo and still images for viewing in collaboration-enabled, nationally networked virtual-reality systems.

As SENSEI researchers develop the SENSEI camera system, collaborators are exploring and refining techniques for displaying data captured by a variety of sensors, as well as investigating human-computer interaction modalities for stereoscopic virtual-reality applications and environments incorporating complex layers of sensor data.

SENSEI-Panama (SENSor Environment Imaging-Panama) is an environmental visualization approach developed at the Electronic Visualization Laboratory (EVL) to investigate the movement patterns of newly isolated species in the anthropological sciences. Once researchers leave the field, they can no longer visually determine what animals see in their environment; however, their instruments produce large amounts of data through the use of GPS collars, accelerometers, LIDAR, aerial photography, and terrain maps.

SENSEI-Panama considers how to visualize this kind of data in an environment like CAVE2™ and combine it with the collected animal movement data. The data is transformed into a virtual map that enables researchers at the University of California, Davis to easily track the movements of animals through their island habitat and study their behavior.
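As a rough illustration of the kind of transformation involved, the sketch below projects GPS-collar fixes (latitude/longitude) onto a flat local coordinate system in meters, the sort of step needed before animal tracks can be overlaid on a terrain map. This is a minimal, hypothetical example using a simple equirectangular approximation; the coordinates, function names, and projection choice are assumptions for illustration, not the project's actual pipeline.

```python
import math

EARTH_RADIUS_M = 6371000.0  # mean Earth radius in meters

def latlon_to_local_xy(lat, lon, lat0, lon0):
    """Project a GPS fix onto a flat local map using an
    equirectangular approximation, returning (x, y) offsets in
    meters east/north of the reference point (lat0, lon0).
    Adequate for small study areas such as a single island."""
    x = EARTH_RADIUS_M * math.radians(lon - lon0) * math.cos(math.radians(lat0))
    y = EARTH_RADIUS_M * math.radians(lat - lat0)
    return x, y

# Hypothetical sequence of collar fixes (lat, lon); the first fix
# serves as the local map origin.
track = [(9.1521, -79.8465), (9.1524, -79.8461), (9.1529, -79.8458)]
lat0, lon0 = track[0]
path = [latlon_to_local_xy(lat, lon, lat0, lon0) for lat, lon in track]
```

The resulting `path` is a list of (x, y) points in meters that can be drawn directly over a georeferenced terrain layer, with each successive point tracing the animal's movement.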

Email: jillian.aurisano@gmail.com

Date: November 1, 2015 - Ongoing
