Michael Sejr Schlichtkrull
Researcher at University of Cambridge
Publications - 25
Citations - 4464
Michael Sejr Schlichtkrull is an academic researcher from the University of Cambridge. The author has contributed to research in topics including computer science and question answering. The author has an h-index of 10 and has co-authored 22 publications receiving 2,638 citations. Previous affiliations of Michael Sejr Schlichtkrull include the University of Copenhagen and Facebook.
Papers
Book Chapter
Modeling Relational Data with Graph Convolutional Networks
Michael Sejr Schlichtkrull, Thomas Kipf, Peter Bloem, Rianne van den Berg, Ivan Titov, Max Welling +7 more
TL;DR: It is shown that factorization models for link prediction such as DistMult can be significantly improved through the use of an R-GCN encoder model to accumulate evidence over multiple inference steps in the graph, demonstrating a large improvement of 29.8% on FB15k-237 over a decoder-only baseline.
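As a rough illustration of the encoder-decoder setup the TL;DR describes, the toy numpy sketch below pairs one R-GCN-style message-passing layer (relation-specific weight matrices plus a self-loop) with a DistMult decoder that scores candidate triples. This is a minimal sketch of the general idea only, not the authors' implementation; the graph, dimensions, and initialization are made up for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy knowledge graph: 4 entities, 2 relation types, edges as (subject, relation, object).
num_entities, num_relations, dim = 4, 2, 8
edges = [(0, 0, 1), (1, 0, 2), (2, 1, 3), (3, 1, 0)]

# Initial entity features, one weight matrix per relation, and a self-loop matrix.
H = rng.normal(size=(num_entities, dim))
W_rel = rng.normal(size=(num_relations, dim, dim)) / np.sqrt(dim)
W_self = rng.normal(size=(dim, dim)) / np.sqrt(dim)

def rgcn_layer(H, edges):
    """One R-GCN message-passing step: accumulate relation-specific
    transforms of neighbours, normalize by in-degree, add a
    self-loop transform, then apply ReLU."""
    out = H @ W_self
    msgs = np.zeros_like(H)
    counts = np.zeros(num_entities)
    for s, r, o in edges:
        msgs[o] += H[s] @ W_rel[r]
        counts[o] += 1
    out += msgs / np.maximum(counts, 1)[:, None]
    return np.maximum(out, 0.0)  # ReLU

# DistMult decoder: score(s, r, o) = sum_d e_s[d] * r_diag[d] * e_o[d]
R_diag = rng.normal(size=(num_relations, dim))

def distmult_score(E, s, r, o):
    return float(np.sum(E[s] * R_diag[r] * E[o]))

E = rgcn_layer(H, edges)            # encoder: R-GCN entity embeddings
score = distmult_score(E, 0, 0, 1)  # decoder: score a candidate triple
```

In the paper this encoder is trained jointly with the decoder so that the embeddings accumulate evidence over multiple relational hops before scoring, which is what lifts DistMult's link-prediction performance.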
Posted Content
Modeling Relational Data with Graph Convolutional Networks
Michael Sejr Schlichtkrull, Thomas Kipf, Peter Bloem, Rianne van den Berg, Ivan Titov, Max Welling +7 more
TL;DR: Relational Graph Convolutional Networks (R-GCNs) as discussed by the authors are related to a recent class of neural networks operating on graphs, and are developed specifically to deal with the highly multi-relational data characteristic of realistic knowledge bases.
Posted Content
Interpreting Graph Neural Networks for NLP With Differentiable Edge Masking
TL;DR: This work introduces a post-hoc method for interpreting the predictions of GNNs which identifies unnecessary edges and uses this technique as an attribution method to analyze GNN models for two tasks -- question answering and semantic role labeling -- providing insights into the information flow in these models.
Posted Content
How do Decisions Emerge across Layers in Neural Models? Interpretation with Differentiable Masking
TL;DR: Differentiable Masking relies on learning sparse stochastic gates to completely mask out subsets of the input while maintaining end-to-end differentiability, and is used to study BERT models on sentiment classification and question answering.
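The "sparse stochastic gates" in the two masking papers above are commonly implemented with the stretched-and-rectified Hard Concrete distribution (Louizos et al., 2018), which lets a gate take the value exactly 0 while remaining differentiable in expectation. The numpy sketch below is a hedged illustration of that mechanism, not the authors' code; the token count, hidden size, and `log_alpha` values are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

def hard_concrete_gate(log_alpha, beta=0.5, gamma=-0.1, zeta=1.1):
    """Sample gates in [0, 1] from a stretched Hard Concrete distribution:
    a binary-concrete sample is stretched to (gamma, zeta) and clipped,
    so exact 0s and 1s occur with non-zero probability."""
    u = rng.uniform(1e-6, 1 - 1e-6, size=np.shape(log_alpha))
    s = 1.0 / (1.0 + np.exp(-(np.log(u) - np.log(1 - u) + log_alpha) / beta))
    s_bar = s * (zeta - gamma) + gamma   # stretch to (gamma, zeta)
    return np.clip(s_bar, 0.0, 1.0)     # rectify: mass at exactly 0 and 1

def expected_l0(log_alpha, beta=0.5, gamma=-0.1, zeta=1.1):
    """Expected number of non-zero gates: the sparsity penalty
    that pushes the model to mask out as much input as possible."""
    return np.sum(1.0 / (1.0 + np.exp(-(log_alpha - beta * np.log(-gamma / zeta)))))

# Hypothetical use: gate the hidden states of 5 input tokens.
hidden = rng.normal(size=(5, 16))
log_alpha = np.array([3.0, -3.0, 3.0, -3.0, 3.0])  # learned per-token parameters
gates = hard_concrete_gate(log_alpha)
masked = hidden * gates[:, None]  # gated inputs fed back through the model
```

Training then trades off task performance on the masked input against the expected L0 penalty, so the surviving (non-zero) gates mark the inputs or edges the model's decision actually depends on.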
Posted Content
NeurIPS 2020 EfficientQA Competition: Systems, Analyses and Lessons Learned
Sewon Min, Jordan Boyd-Graber, Chris Alberti, Danqi Chen, Eunsol Choi, Michael Collins, Kelvin Guu, Hannaneh Hajishirzi, Kenton Lee, Jennimaria Palomaki, Colin Raffel, Adam Roberts, Tom Kwiatkowski, Patrick S. H. Lewis, Yuxiang Wu, Heinrich Küttler, Linqing Liu, Pasquale Minervini, Pontus Stenetorp, Sebastian Riedel, Sohee Yang, Minjoon Seo, Gautier Izacard, Fabio Petroni, Lucas Hosseini, Nicola De Cao, Edouard Grave, Ikuya Yamada, Sonse Shimaoka, Masatoshi Suzuki, Shumpei Miyawaki, Shun Sato, Ryo Takahashi, Jun Suzuki, Martin Fajcik, Martin Docekal, Karel Ondrej, Pavel Smrz, Hao Cheng, Yelong Shen, Xiaodong Liu, Pengcheng He, Weizhu Chen, Jianfeng Gao, Barlas Oguz, Xilun Chen, Vladimir Karpukhin, Stan Peshterliev, Dmytro Okhonko, Michael Sejr Schlichtkrull, Sonal Gupta, Yashar Mehdad, Wen-tau Yih +52 more
TL;DR: The EfficientQA competition, as mentioned in this paper, focused on open-domain question answering (QA), where systems take natural language questions as input and return natural language answers; the aim of the competition was to build systems that can predict correct answers while also satisfying strict on-disk memory budgets.