Context-aware dynamic presentation synthesis for exploratory multimodal environments

Harini Sridharan, Ankur Mani, Hari Sundaram, Jennifer Brungart, David Birchfield

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

2 Scopus citations

Abstract

In this paper, we develop a novel real-time, interactive, automatic multimodal exploratory environment that dynamically adapts the media presented to the user context. This paper makes two key contributions: (a) the development of a multimodal user-context model and (b) modeling the dynamics of the presentation to maximize coherence. We develop a novel user-context model comprising interests, media history, interaction behavior, and tasks, which evolves with each specific interaction. We also develop novel metrics between media elements and the user context. The presentation environment dynamically adapts to the current user context. We develop an optimal media selection and display framework that maximizes coherence while constrained by the user context, user goals, and the structure of the knowledge in the exploratory environment. The experimental results indicate that the system performs well, and that user-context models significantly improve presentation coherence.
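The abstract describes selecting media so as to maximize presentation coherence subject to a user-context model. The paper's actual context model, metrics, and optimization are not given here; the following is a minimal illustrative sketch under assumed definitions, where "coherence" is a blend of a topic-affinity metric against the user's interests and overlap with the previously shown element, and selection is a simple greedy stand-in for the constrained optimization.

```python
from dataclasses import dataclass, field

# Hypothetical sketch only: the context model fields, the affinity and
# coherence metrics, and the weight alpha are illustrative assumptions,
# not the paper's actual formulation.

@dataclass
class UserContext:
    interests: dict                               # topic -> interest weight
    history: list = field(default_factory=list)   # ids of media already shown

def context_affinity(media, ctx):
    """Average interest weight over a media element's topic tags."""
    return sum(ctx.interests.get(t, 0.0) for t in media["topics"]) \
        / max(len(media["topics"]), 1)

def transition_coherence(media, last):
    """Fraction of topics shared with the previously shown element."""
    if last is None:
        return 1.0
    shared = set(media["topics"]) & set(last["topics"])
    return len(shared) / max(len(media["topics"]), 1)

def select_next(candidates, ctx, last=None, alpha=0.6):
    """Greedily pick the unseen element maximizing a blend of
    context affinity and transition coherence."""
    unseen = [m for m in candidates if m["id"] not in ctx.history]
    best = max(unseen,
               key=lambda m: alpha * context_affinity(m, ctx)
                             + (1 - alpha) * transition_coherence(m, last))
    ctx.history.append(best["id"])
    return best
```

In this toy version, the evolving part of the context model is just the media history, which constrains repeat selections; a fuller model would also update the interest weights from interaction behavior.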

Original language: English (US)
Title of host publication: IEEE International Conference on Multimedia and Expo, ICME 2005
Pages: 1014-1017
Number of pages: 4
DOIs
State: Published - 2005
Event: IEEE International Conference on Multimedia and Expo, ICME 2005 - Amsterdam, Netherlands
Duration: Jul 6 2005 - Jul 8 2005

Publication series

Name: IEEE International Conference on Multimedia and Expo, ICME 2005
Volume: 2005

Other

Other: IEEE International Conference on Multimedia and Expo, ICME 2005
Country: Netherlands
City: Amsterdam
Period: 7/6/05 - 7/8/05

