June 7th, 2010
Categories: Applications, Multimedia, Software, User Groups, VR
The June 7, 2010 issue of “New Scientist” magazine has an article by Linda Geddes on current research to create lifelike digital representations, or avatars, of people. The ultimate goal is to create a personalized, conscious avatar embodied in a robot, effectively enabling you, or some semblance of you, to achieve immortality. The article features several projects working to make representations of the human face more lifelike and empathetic in order to aid interaction with digital avatars. Among them is UIC’s Electronic Visualization Laboratory, which has been collaborating with the University of Central Florida in Orlando since 2007 on Project LifeLike, an effort to create a realistic avatar of Alexander Schwarzkopf, a former program director of the US National Science Foundation.
The article reports that UIC has learned that how an avatar looks may matter less than how it behaves. “It might be how they cock their head when they speak or how they arch an eyebrow,” explains EVL team member Steve Jones of the UIC Communication department, or how they show empathy when listening to you tell a sad story. In fact, Project LifeLike researchers are integrating a camera into their digital Schwarzkopf so that it can pick up visual cues from people’s body language and adapt its behavior accordingly.
Read the entire New Scientist article
About Project LifeLike
“Project LifeLike” is a collaboration between the Electronic Visualization Laboratory (EVL) and Communication Department at the University of Illinois at Chicago (UIC) and the Intelligent Systems Laboratory (ISL) at the University of Central Florida (UCF), and receives major funding from the National Science Foundation (NSF). UIC and UCF are developing a lifelike avatar of Dr. Alex Schwarzkopf, a recently retired NSF program officer who founded and directed the NSF Industry / University Cooperative Research Center (I/UCRC) program. Even if it were possible to document his wealth of knowledge in volumes of text, quickly accessing that information would be cumbersome. Instead, NSF wanted to preserve his legacy by developing a natural-looking computer representation of Alex (an avatar) as a portal to an intelligent decision support system called AskAlex. This system has the AlexAvatar intelligently respond to user questions about I/UCRC via spoken language, with realistic inflection and visual expressions. Project website: www.evl.uic.edu/cavern/lifelike/index.php
EDITOR’S NOTE: Project LifeLike’s Alex avatar won first place in the 3rd annual UIC Image of Research 2010 contest.