NSF Funds EVL & Collaborators to Develop a Methodology for Creating Intelligent Digital Humans

Participants: Andrew Johnson, Jason Leigh, Luc Renambot, Maxine Brown, Thomas A. DeFanti, Steve Jones and Gordon Carlson (UIC / Communication), Avelino Gonzalez and Ronald DeMara (University of Central Florida / UCF)

Institutions: University of Illinois at Chicago and University of Central Florida

Chicago, IL and Orlando, FL

The National Science Foundation (NSF) has awarded a three-year collaborative grant for a project titled “Towards Life-like Computer Interfaces that Learn,” which combines computer animation, virtual reality, artificial intelligence and natural language processing to develop a methodology for creating intelligent digital humans that can process human voice and gesture input and respond appropriately with similarly natural voice and gestures. The technology can be used in a variety of applications, such as creating 3D archival recordings of important historical figures, virtual-reality learning environments, and intelligent characters for next-generation video games.

UIC’s research team will develop the graphics component of the project and build a new state-of-the-art motion-capture studio to support project requirements. This technology will also be used as part of the UIC Computer Science Department’s Computer Animation, Virtual Reality, and Video Game Programming curriculum. Project members at UCF will work on the database and natural language processing components.

Email: spiff@uic.edu

Date: February 14, 2007
Image: Realistic 3D Facial Models Created Using FaceGen
