I am an Associate Professor in the Computational Linguistics Group at the University of Groningen.

I am passionate about the statistical modeling of human languages, particularly in a multilingual context. My long-term goal is to design robust language processing algorithms that can adapt to the large variety of linguistic phenomena observed around the world.
Among other things, I work on improving the quality of Machine Translation for challenging language pairs and on making state-of-the-art NLP models more interpretable.
As a cross-disciplinary research enthusiast, I'm interested in enhancing research on human language processing and language evolution with computational modeling tools.
Last but not least, I enjoy observing, interacting with, and finding daily inspiration in my two daughters and their trilingual minds in the making.

My research was funded by a Veni grant from the Dutch Research Council (NWO) from 2016 to 2021. Currently, I am involved in two national-consortium projects, both funded by NWO's NWA-ORC initiative: Interpreting deep learning models for language, speech & music (InDeep) and Low Resource Chat-based Conversational Intelligence (LESSEN). I also supervise two China Scholarship Council (CSC)-funded PhD students working on the simulation of human patterns of language learning and change. I have just started an NWO Vidi grant to improve language modeling for (low-resource) morphologically rich languages, taking inspiration from insights into child language acquisition.

Interested in my work? See also my Research page.


  • [Mar 2024]   Paper accepted at NAACL: "Encoding of lexical tone in self-supervised models of spoken language", with Gaofei Shen, Michaela Watkins, Afra Alishahi, Grzegorz Chrupała.
  • [Feb 2024]   I'm serving as Senior Area Chair for NAACL 2024 (Interpretability and Analysis of Models for NLP) and as a member of the EACL 2024 Best Paper Committee.
  • [Jan 2024]   Paper accepted at ICLR: "Quantifying the Plausibility of Context Reliance in Neural Machine Translation", with Gabriele Sarti, Grzegorz Chrupała, Malvina Nissim.
  • [Jan 2024]   Paper accepted at TACL: "Are Character-level Translations Worth the Wait? Comparing ByT5 and mT5 for Machine Translation", with Lukas Edman, Gabriele Sarti, Antonio Toral, Gertjan van Noord.
  • [Dec 2023]   We received an Outstanding Paper Award at EMNLP'23 and a Best Data Award at the GenBench Workshop for our paper "Cross-Lingual Consistency of Factual Knowledge in Multilingual Language Models", with Jirui Qi and Raquel Fernández.

  • Disclaimer for prospective students: I regularly receive emails from external students interested in my supervision. I do my best to reply to each of them, but don't always manage. If I have paid research positions (PhD, postdoc), I will always post them here, so no post means no position. As for research internships: I don't take students from external universities due to an already high supervision load!