NSF Funds EVL & Collaborators to Develop a Methodology for Creating Intelligent Digital Humans
 

participants: Maxine Brown, Thomas A. DeFanti, Andrew Johnson, Jason Leigh, Luc Renambot, Steve Jones and Gordon Carlson (UIC / Communication), Avelino Gonzalez and Ron DeMara (University of Central Florida / UCF)

institutions: University of Illinois at Chicago and University of Central Florida

location: Chicago, IL; Orlando, FL

The National Science Foundation (NSF) has awarded a three-year collaborative project, "Towards Life-like Computer Interfaces that Learn," which combines computer animation, virtual reality, artificial intelligence and natural language processing to develop a methodology for creating intelligent digital humans that can process human voice and gesture input and respond with similarly natural voice and gestures. The technology can be used in a variety of applications, such as 3D archival recordings of important historical figures, virtual reality learning environments, and intelligent characters for next-generation video games.

UIC's research team will develop the graphics component of the project and build a new state-of-the-art motion-capture studio to support project requirements. This technology will also be used as part of the UIC Computer Science Department's Computer Animation, Virtual Reality, and Video Game Programming curriculum. Project members at UCF will work on the database and natural language processing components.

start date: 02/14/2007
end date: 02/14/2007

contact:

Realistic 3D Facial Models Created Using FaceGen
image provided by FaceGen www.facegen.com
 

related document:
no related document available

 
 
related projects:
Towards Lifelike Computer Interfaces that Learn
related info:
no associated info
 
related categories:
applications
software
visualization
animation
multimedia