About Me
I am a third-year PhD student at the Language Technology Lab, University of Cambridge, supervised by Professor Anna Korhonen.
My research tackles multimodal and multilingual problems in Natural Language Processing, with a core interest in how large models can be controlled and adapted efficiently. This focus on efficiency and transfer learning began early in my NLP journey, when I contributed to AdapterHub, a widely used framework for parameter-efficient model training.
Building on my interest in Parameter-Efficient Fine-Tuning, my current PhD work explores these ideas at the representation level. I investigate techniques like representation steering and the alignment of individually trained modules to guide model behavior without costly retraining.
Publications
ReCoVeR the Target Language: Language Steering without Sacrificing Task Performance
Hannah Sterz, Fabian David Schmidt, Goran Glavaš, and Ivan Vulić
DARE: Diverse Visual Question Answering with Robustness Evaluation
Hannah Sterz, Jonas Pfeiffer, and Ivan Vulić
M2QA: Multi-domain Multilingual Question Answering
Leon Engländer, Hannah Sterz, Clifton A. Poth, Jonas Pfeiffer, Ilia Kuznetsov, and Iryna Gurevych
Scaling Sparse Fine-Tuning to Large Language Models
Alan Ansell, Ivan Vulić, Hannah Sterz, Anna Korhonen, and Edoardo M. Ponti
Adapters: A Unified Library for Parameter-Efficient and Modular Transfer Learning
Clifton Poth, Hannah Sterz, Indraneil Paul, Sukannya Purkayastha, Leon Engländer, Timo Imhof, Ivan Vulić, Sebastian Ruder, Iryna Gurevych, and Jonas Pfeiffer
UKP-SQUARE: An Online Platform for Question Answering Research
Tim Baumgärtner, Kexin Wang, Rachneet Sachdeva, Gregor Geigle, Max Eichler, Clifton Poth, Hannah Sterz, Haritz Puerto, Leonardo F. R. Ribeiro, Jonas Pfeiffer, Nils Reimers, Gözde Şahin, and Iryna Gurevych
