Exploring Synergies Between Gaze and Speech Interaction: A First Prototype
Diogo Vieira, João Dinis Freitas, Cengiz Acartürk, António Teixeira, Luís Sousa, Samuel Silva, Sara Candeias and Miguel Sales Dias
The 17th International ACM SIGACCESS Conference on Computers and Accessibility - Posters and Demos (ASSETS 2015)
Lisbon, Portugal, October 26-28, 2015
Gaze information has the potential to bring great benefits to Human-Computer Interaction (HCI), particularly when combined with speech. Gaze can provide information about user intention as a secondary modality, or it can serve as the main modality for impaired users. In this paper we describe a multimodal HCI system prototype that supports speech, gaze, and the combination of both. The system has been developed for Active Assisted Living scenarios. Examples of use include checking the news module of the PaeLife Personal Assistant by means of gaze, speech, or both simultaneously.