The High-Tech Future of Body Language

participants: Gordon Carlson, Andrew Johnson, Sangyoon Lee, Jason Leigh, Luc Renambot, and collaborators at the University of Central Florida (UCF) Intelligent Systems Laboratory and Computer Architecture Laboratory

location: Washington Post

The October 22, 2009 edition of the WASHINGTON POST has an article by Carol Kinsey Goman, "The High-Tech Future of Body Language," which discusses the importance of non-verbal cues in those "face-to-face" meetings mediated by a screen in the middle. Think video teleconferencing! To impress upon readers how non-verbal cues remain significant -- if not more so -- in our digital future, the author describes five technology and research projects. One of the advancements profiled is "Project LifeLike," a collaboration between the Intelligent Systems Laboratory (ISL) at the University of Central Florida (UCF) and the Electronic Visualization Laboratory (EVL) at the University of Illinois at Chicago (UIC). The article references the May 18th article "LifeLike - EVL Lab Sets Out To Simulate Your (Evil) Twin" on the website Scientific Blogging - Science 2.0.

Project LifeLike receives major funding from the National Science Foundation, award #0703916, to UIC and UCF.

start date: 10/22/2009
end date: 10/22/2009

contact:

Alex Schwarzkopf Avatar Head & Photo

image provided by S. Lee, EVL

related document:
no related document available

related projects:
Towards Lifelike Computer Interfaces that Learn
related info:
no associated papers
 
related categories:
applications
software
tele-immersion
human factors
multimedia
vr
government
user groups