

Department of Computational Linguistics, Text Technologies


In the MUTAMUR project (Multitask Learning with Multilingual Resources for Better Natural Language Understanding), we investigate methods for knowledge sharing and transfer between machine learning models in natural language processing.

Modern machine learning models in natural language processing require large amounts of training data to reach high quality. While this training data is task-specific, various tasks are related both in terms of machine learning algorithms and language representations. MUTAMUR investigates new machine learning methods to exploit this relationship and to develop better natural language processing systems for tasks and languages with small amounts of task-specific training data.
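Knowledge sharing of this kind is often realised through hard parameter sharing: multiple tasks use one shared encoder while keeping task-specific output heads, so training signal from a high-resource task also shapes the representation used by a low-resource task. A minimal sketch of this idea (class names, dimensions, and the task pairing are illustrative, not taken from the project):

```python
import numpy as np

rng = np.random.default_rng(0)

class SharedEncoder:
    """One encoder whose parameters are shared across all tasks."""
    def __init__(self, d_in, d_hid):
        self.W = rng.normal(0, 0.1, (d_in, d_hid))
    def forward(self, x):
        return np.tanh(x @ self.W)

class TaskHead:
    """A small task-specific output layer on top of the shared encoder."""
    def __init__(self, d_hid, d_out):
        self.V = rng.normal(0, 0.1, (d_hid, d_out))
    def forward(self, h):
        return h @ self.V

encoder = SharedEncoder(4, 8)
head_a = TaskHead(8, 2)   # e.g. a high-resource task
head_b = TaskHead(8, 3)   # e.g. a low-resource task

x = rng.normal(size=(5, 4))
# Both tasks consume the same shared representation; gradients from
# either task would update encoder.W, which is how transfer happens.
h = encoder.forward(x)
out_a = head_a.forward(h)
out_b = head_b.forward(h)
print(out_a.shape, out_b.shape)  # (5, 2) (5, 3)
```

In practice the shared component would be a large pretrained network and the heads would be trained jointly, but the parameter-sharing structure is the same.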

The project has so far led to more than 40 scientific publications, including new datasets to study cross-lingual transfer [1] [2], multilingual language and translation models that support low-resource languages [1] [2], new methods for multilingual [1] [2] [3] and multimodal learning [1] [2], covering tasks such as speech translation [1] [2] and sign language translation [1], and studies that analyse and provide insights into the inner workings of NLP models [1] [2] [3] [4] [5] [6].


Project Head:

Rico Sennrich



Project members:

Sina Ahmadi


Previous project members:

Chantal Amrhein

Duygu Ataman

Denis Emelin

Samuel Läubli

Alireza Mohammadshahi

Farhad Nooralahzadeh

Proyag Pal

Annette Rios

Phillip Ströbel

Jannis Vamvas