Virtual Reality for Virtual Eternity
participants: Maxine Brown, Thomas A. DeFanti, Andrew Johnson, Sangyoon Lee, Jason Leigh, Luc Renambot, Steve Jones and Gordon Carlson (UIC / Communication), Avelino Gonzalez and Ron DeMara (University of Central Florida / UCF)
institutions: National Science Foundation (NSF), University of Central Florida
location: Chicago, IL
UIC News Release
March 12, 2007
CONTACT: Paul Francuch, (312) 996-3457, francuch@uic.edu
Imagine having a discussion with Isaac Newton or Albert Einstein on the nature of the universe, where their 3-D, life-sized representations looked you in the eye, examined your body language, considered the vocal nuances and phraseology of your questions, then answered in a way so real you would swear the images were alive.
This was an opening scene from an episode of the TV show “Star Trek” almost a decade and a half ago. A new research project between the University of Illinois at Chicago and the University of Central Florida in Orlando may soon make such imaginary conversations a reality.
Technologies from computer games, animation and artificial intelligence provide the elements to make this happen. The National Science Foundation has awarded a half-million-dollar, three-year grant to UIC and UCF researchers to bring those elements together and create the methodology for making such virtual figures commonplace.
UIC will focus on the computer graphics and interaction, while UCF will concentrate on artificial intelligence and natural language processing software.
“The goal is to combine artificial intelligence with the latest advanced graphics and video game-type technology to enable us to create historical archives of people beyond what can be achieved using traditional technologies such as text, audio and video footage,” said Jason Leigh, associate professor of computer science and director of UIC’s Electronic Visualization Laboratory. Leigh is UIC’s lead principal investigator.
EVL will build a state-of-the-art motion-capture studio to digitize the image and movement of real people, who will go on to live a virtual eternity in virtual reality. Knowledge will be archived into databases. Voices will be analyzed to create synthesized but natural-sounding “virtual” voices. Mannerisms will be studied and used in creating the 3-D virtual forms, known technically as avatars.
start date: 03/12/2007
end date: 03/25/2007
contact: laura@evl.uic.edu
User Interacts with Virtual Alex’s Avatar
image provided by S. Lee, EVL
Towards Lifelike Computer Interfaces that Learn