Sophia Althammer
Researcher at Vienna University of Technology
Publications - 19
Citations - 190
Sophia Althammer is an academic researcher at Vienna University of Technology. She has contributed to research on topics including computer science and language models. She has an h-index of 2 and has co-authored 11 publications receiving 37 citations. Her previous affiliations include Siemens.
Papers
Posted Content
Improving Efficient Neural Ranking Models with Cross-Architecture Knowledge Distillation.
TL;DR: This work proposes a cross-architecture training procedure with a margin focused loss (Margin-MSE), that adapts knowledge distillation to the varying score output distributions of different BERT and non-BERT ranking architectures, and shows that across evaluated architectures it significantly improves their effectiveness without compromising their efficiency.
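The core idea of the Margin-MSE loss can be sketched in a few lines: rather than forcing the student to match the teacher's raw scores (whose scale differs across BERT and non-BERT architectures), the student matches the teacher's *margin* between a relevant and a non-relevant passage. A minimal illustration with made-up toy scores, not the authors' implementation:

```python
# Margin-MSE distillation sketch: the student learns the teacher's
# score margin (pos - neg), which is robust to the differing score
# distributions of different ranking architectures.

def margin_mse_loss(student_pos, student_neg, teacher_pos, teacher_neg):
    """Mean squared error between student and teacher score margins."""
    n = len(student_pos)
    total = 0.0
    for sp, sn, tp, tn in zip(student_pos, student_neg,
                              teacher_pos, teacher_neg):
        student_margin = sp - sn
        teacher_margin = tp - tn
        total += (student_margin - teacher_margin) ** 2
    return total / n

# Toy batch: teacher (e.g. a BERT cross-encoder) vs. student scores.
loss = margin_mse_loss(
    student_pos=[2.0, 1.5], student_neg=[0.5, 1.0],
    teacher_pos=[8.0, 6.0], teacher_neg=[6.4, 5.6],
)
```

Note that the teacher's scores live on a much larger scale than the student's; only the margins need to agree, which is exactly what makes the loss usable across architectures.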
Proceedings ArticleDOI
Introducing Neural Bag of Whole-Words with ColBERTer: Contextualized Late Interactions using Enhanced Reduction
TL;DR: This work proposes ColBERTer, a neural retrieval model using contextualized late interaction (ColBERT) with enhanced reduction that dramatically lowers ColBERT's storage requirements while simultaneously improving the interpretability of its token-matching scores.
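The late-interaction (MaxSim) mechanism that ColBERTer builds on can be shown with toy 2-d token embeddings (illustrative values, not the model's real vectors): each query token is matched against its best-scoring document token, and the per-token maxima are summed into the relevance score.

```python
# ColBERT-style late interaction: score(q, d) =
#   sum over query tokens of max over doc tokens of (dot product).

def maxsim_score(query_vecs, doc_vecs):
    """Sum, over query tokens, of the max dot product with any doc token."""
    score = 0.0
    for q in query_vecs:
        best = max(sum(qi * di for qi, di in zip(q, d)) for d in doc_vecs)
        score += best
    return score

query = [[1.0, 0.0], [0.0, 1.0]]               # two query token embeddings
doc = [[0.9, 0.1], [0.2, 0.8], [0.5, 0.5]]     # three doc token embeddings
score = maxsim_score(query, doc)               # 0.9 + 0.8
```

Because every document token embedding must be stored for this matching, reducing the number and size of stored vectors (ColBERTer's "enhanced reduction") directly shrinks the index.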
Book ChapterDOI
Mitigating the Position Bias of Transformer Models in Passage Re-ranking
TL;DR: This article proposes a debiasing method for passage re-ranking and shows that, by mitigating the position bias, Transformer-based re-ranking models are equally effective on biased and debiased datasets, as well as more effective in a transfer-learning setting between two differently biased datasets.
Book ChapterDOI
PARM: A Paragraph Aggregation Retrieval Model for Dense Document-to-Document Retrieval
TL;DR: In this paper, a paragraph aggregation retrieval model (PARM) is proposed that combines the advantages of rank-based aggregation and topical aggregation based on dense embeddings for dense document-to-document retrieval.
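The aggregation step can be sketched with plain reciprocal-rank fusion (hypothetical document IDs; PARM's own vector-based variant additionally weights by dense embeddings, which is omitted here): each paragraph of the query document retrieves its own ranked candidate list, and the lists are fused into one document-level ranking.

```python
# Paragraph-level rank aggregation in the spirit of PARM: merge the
# per-paragraph retrieval results via reciprocal-rank fusion.

def reciprocal_rank_fusion(rankings, k=60):
    """rankings: one ranked doc-id list per query paragraph."""
    scores = {}
    for ranking in rankings:
        for rank, doc_id in enumerate(ranking, start=1):
            scores[doc_id] = scores.get(doc_id, 0.0) + 1.0 / (k + rank)
    return sorted(scores, key=scores.get, reverse=True)

# Ranked candidates retrieved by two paragraphs of the query document.
per_paragraph = [["docA", "docB", "docC"], ["docB", "docD", "docA"]]
fused = reciprocal_rank_fusion(per_paragraph)  # docB ranks first
```

Documents retrieved by several query paragraphs accumulate score from each list, so broadly relevant documents rise to the top of the fused ranking.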
Book ChapterDOI
Cross-Domain Retrieval in the Legal and Patent Domains: A Reproducibility Study
TL;DR: In this paper, the BERT-PLI model is used to study the cross-domain transfer of retrieval models for domain-specific search; the results show that transferring BERT at the paragraph level leads to comparable results in both domains, as well as first promising results for cross-domain transfer at the document level.