Articulate: a Semi-automated System for Translating Natural Language Queries into Meaningful Visualizations
 

developers: Yiwen Sun

Thesis advisory committee: Jason Leigh (Director, EVL), Andrew Johnson (Associate Professor, CS/EVL), Barbara Di Eugenio (Associate Professor, CS), Luc Renambot (Research Assistant Professor, CS/EVL)

While many visualization tools exist that offer sophisticated functions for charting complex data, they still expect users to possess a high degree of expertise in wielding the tools to create an effective visualization. Articulate is a semi-automated visual analytic system that is guided by a conversational user interface to allow users to verbally describe and then manipulate what they want to see. Natural language processing and machine learning methods are used to translate the imprecise sentences into explicit expressions, and then a heuristic graph generation algorithm is applied to create a suitable visualization. The goal is to relieve the user of the burden of having to learn a complex user-interface in order to craft a visualization.
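The pipeline described above, translating an imprecise sentence into an explicit specification and then heuristically choosing a chart, can be sketched in simplified form. The function names, keyword lists, and rules below are invented for illustration and are not Articulate's actual implementation, which uses fuller natural language processing and machine learning methods:

```python
# Hypothetical sketch of the two-stage pipeline (all names and rules are
# illustrative, not Articulate's real code): a query is reduced to a set of
# analytic-task features, then heuristic rules map the task to a chart type.

def extract_features(query):
    """Crude stand-in for the NLP/ML step: lowercase keyword spotting."""
    keywords = {
        "trend": ["over time", "trend", "change"],
        "relationship": ["correlate", "relationship", "against"],
        "comparison": ["compare", "versus", "difference"],
        "distribution": ["distribution", "spread", "histogram"],
    }
    q = query.lower()
    return {task for task, words in keywords.items()
            if any(w in q for w in words)}

def suggest_chart(query):
    """Stand-in for the heuristic graph-generation step."""
    rules = {
        "trend": "line chart",
        "relationship": "scatter plot",
        "comparison": "bar chart",
        "distribution": "histogram",
    }
    # Apply rules in a fixed priority order; fall back to a plain table
    # when no analytic intent is recognized in the query.
    for task in ("trend", "relationship", "comparison", "distribution"):
        if task in extract_features(query):
            return rules[task]
    return "table"

print(suggest_chart("show me the trend of temperature over time"))  # line chart
```

A real system would replace the keyword spotting with parsing and learned classifiers, but the overall shape, features in, explicit chart specification out, is the part the sketch is meant to convey.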

start date: 03/01/2009
end date: ongoing

Screenshot of Articulate system
image provided by Y. Sun, EVL
movie size: 5.61 MB
related projects:
Towards Lifelike Computer Interfaces that Learn
related info:
2 associated paper(s)
3 associated event(s)
1 associated movie(s)
related categories:
applications
software
visualization
human factors
MS/PhD thesis