Esma Balkir
Researcher at Queen Mary University of London
Publications - 12
Citations - 159
Esma Balkir is an academic researcher at Queen Mary University of London. The author has contributed to research topics including computer science and distributional semantics, has an h-index of 5, and has co-authored 5 publications receiving 120 citations.
Papers
Journal ArticleDOI
Sentence entailment in compositional distributional semantics
TL;DR: In this article, the authors show that entropy-based distances between vectors and density matrices are good candidates for measuring word-level entailment, and prove that these distances extend compositionally from words to phrases and sentences.
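As a rough illustration of the entropy-based idea summarized above, the sketch below compares two words represented as density matrices using the von Neumann relative entropy. The toy vectors, the mixture weights, and the specific "dog"/"animal" example are illustrative assumptions, not the paper's construction; the intuition is that a narrow term (hyponym) should sit at a small relative-entropy distance from a broad term (hypernym), but not vice versa.

```python
import numpy as np

def density_matrix(vectors, weights):
    """Build a unit-trace, positive semi-definite density matrix as a
    weighted mixture of outer products of normalized context vectors."""
    rho = sum(w * np.outer(v, v) / np.dot(v, v)
              for w, v in zip(weights, vectors))
    return rho / np.trace(rho)

def relative_entropy(rho, sigma, eps=1e-12):
    """Von Neumann relative entropy S(rho || sigma); a small value
    suggests rho's support lies within sigma's."""
    def logm(m):
        # Matrix logarithm via eigendecomposition, clipping tiny
        # eigenvalues to avoid log(0).
        vals, vecs = np.linalg.eigh(m)
        vals = np.clip(vals, eps, None)
        return vecs @ np.diag(np.log(vals)) @ vecs.T
    return float(np.trace(rho @ (logm(rho) - logm(sigma))))

# Toy example (assumed data): "dog" concentrated on one context
# dimension, "animal" spread over several, so dog should be closer
# to animal than the reverse.
dog    = density_matrix([np.array([1.0, 0.0, 0.0])], [1.0])
animal = density_matrix([np.array([1.0, 0.0, 0.0]),
                         np.array([0.0, 1.0, 0.0]),
                         np.array([0.0, 0.0, 1.0])], [0.4, 0.3, 0.3])
assert relative_entropy(dog, animal) < relative_entropy(animal, dog)
```

The asymmetry of relative entropy is what makes it usable as an entailment measure rather than a mere similarity score.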
Book ChapterDOI
Distributional Sentence Entailment Using Density Matrices
TL;DR: The categorical compositional distributional model of Coecke et al. is expanded by extending the representations of words from points in meaning space to density operators, which are probability distributions on the subspaces of the space.
Posted Content
Distributional Sentence Entailment Using Density Matrices
TL;DR: In this paper, the authors extend the categorical compositional distributional model of Coecke et al. to capture entailment relations, broadening the representations of words from points in meaning space to density operators, which are probability distributions on the subspaces of the space.
Proceedings ArticleDOI
Necessity and Sufficiency for Explaining Text Classifiers: A Case Study in Hate Speech Detection
TL;DR: A novel feature attribution method for explaining text classifiers is presented, and it is shown that different values of necessity and sufficiency for identity terms correspond to different kinds of false positive errors, exposing sources of classifier bias against marginalized groups.