Current Projects



Computational cognitive modeling of the predictive active self in situated action (COMPAS)

The COMPAS project seeks to develop the first computational cognitive model of the execution and control of situated action in an embodied cognitive architecture that allows for (a) detailed explanation of mechanisms and processes underlying the sense of agency and the active self; (b) simulation of situated action with predictions about the subjectively perceived sense […]

More

Satisficing mentalizing — Learning models of Theories of Mind for behavior understanding

2016-2019

This project explores different strategies for equipping artificial systems with a Theory of Mind, i.e., the ability to infer hidden mental states of other agents from their observable behavior. We are developing models that can learn from past encounters with agents to form models capable of performing such mentalizing in a satisficing way, […]

More

Lively and trustworthy social robots (VIVA)

The VIVA project aims to build a mobile social robot that produces lively and socially-appropriate behavior in interaction. We are in charge of developing an embodied communication architecture that controls and mediates the robot’s responsive behavior. In addition, we endow the robot with abilities for cohesive spoken dialogue over long-term interactions.

More

Mental models in collaborative interactive reinforcement learning

2018-2020

This project is part of the Research Cluster CINEMENTAS (“Collaborative Intelligence Based on Mental Models of Assistive Systems”) funded by a major international technology company. We investigate the role of mental models in interactive reinforcement learning, in which learning is seen as a dynamic collaborative process in which trainer and learner together try to […]

More

Development of iconic gesture-speech integration in preschool children (EcoGest)

This project aims to provide a detailed account of the development of iconic gesturing and its integration with speech in different communicative genres. We will study preschool children at 4 to 5 years of age to investigate their speech-accompanying iconic gesture use and to develop a computational cognitive model of its development. We apply qualitative and […]

More

Child-Robot Communication and Collaboration (BabyRobot)

BabyRobot aims to develop robots that can support the development of socio-affective, communication, and collaboration skills in both typically developing children and children on the autism spectrum. The goal of the project is to enable robots to share attention, establish common ground, and form joint goals with children. We contribute research on nonverbal gesture-based interaction in relation to shared attention, interpersonal alignment, and grounding. Specifically, we develop modules for the recognition and synthesis of expressive social signals and gestures on social robot platforms.

More

Second Language Tutoring Using Social Robots (L2TOR)

L2TOR (pronounced ‘el tutor’) aims to design a child-friendly tutor robot that can support the teaching of a second language (L2) to preschool children by interacting with them in their social and referential world. The project focuses on teaching English to native speakers of Dutch, German, and Turkish, and teaching Dutch and German as L2 to immigrant children. We contribute research on interaction management tasks based on decision-theoretic dialogue modeling, probabilistic mental state attribution, and incremental grounding.

More

Social Motorics

2015-2018

This project explores the role of perceptual and motor processes in live social interaction. We are developing a probabilistic model of how prediction-based action observation and mentalizing (i.e., Theory of Mind) contribute and work together during dynamically unfolding gestural interactions.

More