Adaptive explanation generation in HRI


Suitable for:
Master's project

Description:

Successful human-machine interaction requires users to have a sufficient understanding of the technical system. We study this problem for the case of interacting with an autonomous social companion robot that is supposed to behave proactively (i.e., initiate actions by itself) in order to assist the user and to maintain its own functioning in the long term. In this project you will explore how a robot can and should explain its actions in ways that are understandable and acceptable to its user. In particular, you will develop and test algorithms for generating explanations and adapting them to the specific user or the situational context. These algorithms will be integrated into an existing dialogue manager within a social robot architecture that drives the robot's actions and thus provides the information to be explained.
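
To give a purely illustrative impression of the kind of algorithm this could involve, the sketch below adapts the verbosity of an explanation for a proactive action to a simple user model. All names (RobotAction, UserModel, generate_explanation) and the adaptation rule are hypothetical assumptions for illustration; they are not part of the existing dialogue manager or robot architecture.

    # Minimal, hypothetical sketch: adapt explanation detail to a simple user model.
    # All classes, fields, and thresholds are illustrative assumptions.
    from dataclasses import dataclass

    @dataclass
    class RobotAction:
        name: str     # e.g. "approach_user"
        goal: str     # underlying goal, e.g. "check on the user's wellbeing"
        trigger: str  # what prompted the action, e.g. "no interaction for a while"

    @dataclass
    class UserModel:
        familiarity: float   # 0.0 (novice) .. 1.0 (expert), updated over interactions
        prefers_brief: bool  # stated or learned preference for short explanations

    def generate_explanation(action: RobotAction, user: UserModel) -> str:
        """Return an explanation of a proactive action, adapted to the user."""
        if user.prefers_brief or user.familiarity > 0.7:
            # Experienced users or those preferring brevity only hear the goal.
            return f"I am doing this to {action.goal}."
        # Less familiar users also hear what triggered the action.
        return (f"I decided to {action.name.replace('_', ' ')} because "
                f"{action.trigger}, and my goal is to {action.goal}.")

    if __name__ == "__main__":
        action = RobotAction("approach_user", "check on your wellbeing",
                             "we have not talked for a while")
        print(generate_explanation(action, UserModel(familiarity=0.2, prefers_brief=False)))
        print(generate_explanation(action, UserModel(familiarity=0.9, prefers_brief=True)))

In the actual project, such an adaptation rule would of course be driven by the information available in the dialogue manager and the robot architecture rather than hand-coded thresholds.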

Requirements:

  • Interest in and basic knowledge of human-robot interaction and language/dialogue
  • Solid programming skills (Python)
  • Ability to develop and integrate software within a given, advanced framework

Interested students can also conduct empirical HRI studies to collect data or to evaluate the approach they develop.


Contact:
Sonja Stange