At the invitation of Prof. Rukzio, Dr. Ken Pfeuffer of Bundeswehr University Munich presents research results on Future Eye-based and Multimodal User Interfaces.
Abstract: User interfaces define how we interact with computers, and they are primarily controlled by hand. With advances in input devices, motion tracking, and natural interaction, future interfaces can integrate and fuse multiple user inputs to enhance existing ways of interaction and create new ones. Information from advanced modalities such as eye gaze is linked to human cognitive processes, allowing systems to better understand the user's intent and act before the user does, ultimately making interaction easier. Our research explores new, potential multimodal systems to approach questions such as: 1) what natural ways are there to fuse multiple input signals into a more powerful UI, 2) what are the advantages and drawbacks of such a system compared to traditional manual interaction, and 3) how can complex multimodal systems be presented and designed in a way that is intuitive for users? To investigate these questions, we explore input theory, design prototypes, and assess their usability in studies. Our insights indicate how interfaces can be advanced in the background, facilitating users by observing their natural behaviour, and in the foreground, where explicit user actions are enhanced by intelligent systems interpreting multimodal signals.
Further information about the colloquium is available on request.