A Multi-modal Dialog System for a Mobile Robot

Ioannis Toptsis, Shuyin Li, Britta Wrede and Gernot A. Fink
Proc. Int. Conf. on Spoken Language Processing, pages 273-276, Jeju, Korea, 2004.

Abstract

A challenging domain for dialog systems is communication with robotic assistants. In contrast to the classical use of spoken language for information retrieval, a mobile robot must handle multi-modal dialogs and the dynamic interaction of the robot system with its environment. In this paper we present the dialog system developed for BIRON, the Bielefeld Robot Companion. The system handles multi-modal dialogs by augmenting semantic interpretation structures derived from speech with hypotheses for additional modalities, such as speech-accompanying gestures. The architecture of the system is modular, with the dialog manager as the central component. To make the dialog manager aware of the dynamic behavior of the robot itself, the possible states of the robot control system are integrated into the dialog model. For flexible use and easy configuration, both the communication between the individual modules and the declarative specification of the dialog model are encoded in XML. We present example interactions with BIRON from the "home-tour" scenario defined within the COGNIRON project.
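
To give a rough idea of the XML encoding mentioned above, the sketch below shows what an inter-module message might look like: a semantic interpretation derived from speech, augmented with a hypothesis from an additional modality (a pointing gesture) and the current state of the robot control system. All element names, attributes, and values here are hypothetical illustrations and are not taken from the paper.

    <!-- Hypothetical multi-modal interpretation message (illustrative only,
         not the actual format used in the BIRON dialog system) -->
    <utterance speaker="user" time="12.34">
      <!-- Semantic interpretation structure derived from speech -->
      <speech confidence="0.87">
        <intent type="object-reference"/>
        <object name="cup" determiner="this"/>
      </speech>
      <!-- Hypothesis for an additional modality: a speech-accompanying gesture -->
      <gesture type="pointing" confidence="0.72" target-region="table-left"/>
      <!-- State of the robot control system, integrated into the dialog model -->
      <robot-state value="following-person"/>
    </utterance>
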