Evolutionary & Hybrid AI

a VUB AI Lab research team


We aim to build truly intelligent systems that are able to interact with and reason about their native environment in order to solve an open-ended set of tasks. Our systems are deeply inspired by evolutionary principles such as self-organisation, selection and emergent functionality, and are therefore adaptive by design. We adopt a hybrid approach that integrates symbolic and subsymbolic AI techniques, combining their strengths to achieve general, accurate and interpretable models. We focus in particular on tasks that require human language-like communication, involving advanced perception, reasoning and learning skills. We investigate fundamental research questions that are tightly connected to real-world problems.

Featured projects

A selection of exciting research projects that showcase our expertise and provide insight into our long-term research program.
  • Computational Construction Grammar: A Practical Introduction

    We are working on an introductory textbook that explains, step by step, how construction grammars can be computationally implemented. The book is designed both as a textbook for courses on construction grammar or computational linguistics and as an indispensable resource for construction grammar researchers who would like to run their analyses on corpora.

  • Grounded Concept Learning

    This project investigates how autonomous agents can distill meaningful concepts and words from continuous streams of perceptual data. The concepts are constructed through task-based communicative interactions and reflect properties of the world that are relevant for the task. The agents either learn the conceptual system of an existing natural language, or construct their own conceptual system and language.

  • Semantic Frame Extractor

    The semantic frame extractor combines dependency parsing with computational construction grammar to extract semantic frames from text corpora. We have applied this newly developed method to a corpus of English newspaper articles in order to study the causal relations expressed in the climate change debate.

  • Visual Question Answering

    In this project, we apply our language technologies to the problem of visual question answering. The task requires understanding a natural language question, reasoning about its relation to a given image, and answering the question. We adopt a hybrid and modular approach that parses the question into a procedural semantic representation and executes it on the image.

  • Visual Dialog

    We extend the insights from our visual question answering approach to the visual dialog task, in which the goal is to answer a series of questions that often refer to each other. To keep track of what has been said, we develop techniques to store and represent the dialog history in a transparent way. The history is integrated into an executable semantic representation that, executed together with the image, yields the answer.
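The procedural-semantics idea behind our visual question answering work can be illustrated with a small sketch. The scene format and the primitives `filter_attr`, `count` and `query` below are hypothetical stand-ins, not our actual implementation; the point is only that a parsed question becomes a sequence of primitive operations executed against a symbolic scene representation:

```python
# Hypothetical sketch: executing a procedural semantic representation
# on a symbolic scene. Primitive names and the scene format are invented
# for illustration purposes.

SCENE = [
    {"shape": "cube",   "color": "red",  "size": "large"},
    {"shape": "sphere", "color": "blue", "size": "small"},
    {"shape": "cube",   "color": "blue", "size": "small"},
]

def filter_attr(objects, attr, value):
    """Keep only the objects whose attribute matches the given value."""
    return [o for o in objects if o[attr] == value]

def count(objects):
    """Return the number of objects in the current set."""
    return len(objects)

def query(objects, attr):
    """Return the attribute value of a unique referent."""
    assert len(objects) == 1, "question presupposes a unique referent"
    return objects[0][attr]

def execute(program, scene):
    """Run a list of (primitive, arguments) steps from left to right."""
    result = scene
    for op, args in program:
        result = op(result, *args)
    return result

# "How many blue objects are there?"
program = [(filter_attr, ("color", "blue")), (count, ())]
print(execute(program, SCENE))  # -> 2
```

Because each primitive consumes the output of the previous one, the executed program is fully transparent: the intermediate object sets can be inspected at every step.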

Selected Publications

These publications are representative of the research carried out by the EHAI team. An exhaustive list can be found on the team members' research pages.
A Practical Guide to Studying Emergent Communication through Grounded Language Games
Jens Nevens, Paul Van Eecke and Katrien Beuls
Proceedings of AISB '19, Falmouth, United Kingdom, 2019.

Computational Construction Grammar for Visual Question Answering
Jens Nevens, Paul Van Eecke and Katrien Beuls
Linguistics Vanguard, 2019.

Exploring the Creative Potential of Computational Construction Grammar
Paul Van Eecke and Katrien Beuls
Zeitschrift für Anglistik und Amerikanistik 66(3), 2018, 341-355.

Meta-Layer Problem Solving for Computational Construction Grammar
Paul Van Eecke and Katrien Beuls
2017 AAAI Spring Symposium Series, 2017.


Web demonstrations

The following web demonstrations offer more insight into the techniques and methods we develop.

Visual Question Answering (VQA)

A hybrid AI system for VQA that makes use of computational construction grammar and executable meaning representations.

View demo ›

Grounded Colour Naming Game

Demonstration of the Babel software toolkit with a grounded colour naming game experiment using the new robot interface package.

View demo ›
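The dynamics of a naming game can be sketched in a few lines. This is a hypothetical minimal version, not the Babel toolkit's actual implementation: two agents pair invented words with colour prototypes (RGB triples) and align their lexicons through repeated communicative interactions.

```python
import random

# Minimal naming-game sketch (invented for illustration, not Babel's API).
# A speaker names a topic colour with the word whose prototype lies closest;
# the hearer adopts the word and shifts its prototype toward the topic.

def dist(a, b):
    """Squared Euclidean distance between two colour prototypes."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

class Agent:
    def __init__(self):
        self.lexicon = {}  # word -> RGB prototype

    def name(self, colour):
        """Speaker: use the closest word, or invent one if the lexicon is empty."""
        if not self.lexicon:
            word = "w%d" % random.randrange(10000)
            self.lexicon[word] = colour
            return word
        return min(self.lexicon, key=lambda w: dist(self.lexicon[w], colour))

    def interpret(self, word, colour):
        """Hearer: adopt an unknown word, or move its prototype toward the topic."""
        if word not in self.lexicon:
            self.lexicon[word] = colour
        else:
            old = self.lexicon[word]
            self.lexicon[word] = tuple(p + 0.1 * (c - p) for p, c in zip(old, colour))

def play_game(speaker, hearer):
    """One interaction: pick a random topic colour, name it, interpret it."""
    topic = tuple(random.random() for _ in range(3))
    hearer.interpret(speaker.name(topic), topic)

a, b = Agent(), Agent()
for _ in range(100):
    speaker, hearer = random.sample([a, b], 2)
    play_game(speaker, hearer)
print(len(a.lexicon), len(b.lexicon))
```

After repeated games the agents share a vocabulary whose prototypes have converged on the colours they were used for, which is the self-organising dynamic the demo visualises on a real robot interface.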

Semantic Parsing for VQA

A computational construction grammar for mapping between natural language questions and their executable meaning representations.

View demo ›


Software

The EHAI team co-develops a number of software tools that are available to the research community.


Team

  • Katrien Beuls

    Principal Investigator

  • Paul Van Eecke

    Postdoctoral researcher and lecturer

  • Tom Willaert

    Postdoctoral researcher

  • Jens Nevens

    PhD researcher and teaching assistant

  • Lara Verheyen

    PhD researcher

  • Jeroen Van Soest


Interested in collaborating with us?
We're always looking for talented AI researchers to join our interdisciplinary team.
Get in touch ›