I am an Associate Professor in the Computational Linguistics Group at the University of Groningen.

I am passionate about the statistical modeling of human languages, particularly in a multilingual context. My long-term goal is to design robust language processing algorithms that can adapt to the large variety of linguistic phenomena observed around the world.
Among other things, I work towards improving the quality of Machine Translation for challenging language pairs, and making state-of-the-art NLP models more interpretable.
As a cross-disciplinary research enthusiast, I'm interested in enhancing research on human language processing or language evolution with computational modeling tools.
Last but not least, I enjoy observing, interacting with, and finding daily inspiration in my two daughters and their trilingual minds in the making.

My research was funded by a Veni grant from the Dutch Research Council (NWO) from 2016 to 2021. Currently, I am involved in two national-consortium projects, both funded by NWO's NWA-ORC initiative: Interpreting deep learning models for language, speech & music (InDeep) and Low Resource Chat-based Conversational Intelligence (LESSEN). I also supervise two China Scholarship Council (CSC)-funded PhD students working on the simulation of human patterns of language learning and change.

Interested in my work? Also see my Research page.


News

**********************************************************************************
  ###########      I AM LOOKING FOR A PhD CANDIDATE!      ###########
**********************************************************************************
Find out more and APPLY HERE!

Project aim: Improving language modeling for (low-resource) morphologically rich languages, taking inspiration from child language acquisition insights. Funded by my NWO-VIDI grant. Start date: September 2024.

**********************************************************************************
  • [Dec 2023]   We have received an Outstanding Paper Award at EMNLP'23 and a Best Data Award at the GenBench Workshop for our paper "Cross-Lingual Consistency of Factual Knowledge in Multilingual Language Models" with Jirui Qi and Raquel Fernández.
  • [Nov 2023]   I'm serving as Senior Area Chair for NAACL 2024, area: Interpretability and Analysis of Models for NLP.
  • [Oct 2023]   Paper accepted at EMNLP: "Cross-Lingual Consistency of Factual Knowledge in Multilingual Language Models", with Jirui Qi and Raquel Fernández.
  • [Jun 2023]   I have been awarded a Vidi grant! The project aims to improve language modeling for (low-resource) morphologically rich languages, taking inspiration from child language acquisition insights. This personal grant from the Dutch Research Council (NWO) will fund my research for the next 5 years. I'll be hiring a postdoc and a PhD student soon!
  • [Jun 2023]   Paper accepted at TACL: "Communication Drives the Emergence of Language Universals in Neural Agents: Evidence from the Word-order/Case-marking Trade-off", with Yuchen Lian and Tessa Verhoef. I'm super proud of my collaboration with Tessa, which started many years ago with the goal of bringing together actual language evolution expertise and actual NLP expertise. It took us years to understand what we wanted to do (and how!), but we're finally in full swing with two well-defined PhD projects. In this TACL work, we introduce an artificial learning framework (NeLLCom) that can be used to simulate human patterns of language learning and change with neural network learners (also to be presented at ESSLLI's workshop on Internal and External Pressures Shaping Language).

  • Disclaimer for prospective students: I regularly receive emails from external students interested in my supervision. I do my best to reply to each of them, but I don't always manage. If I have paid research positions (PhD, postdoc), I'll always post them here, so no post => no position. As for research internships: I don't take students from external universities due to an already high supervision load!