
Yuhao Zhang

Researcher at Stanford University

Publications: 40
Citations: 4126

Yuhao Zhang is an academic researcher from Stanford University. The author has contributed to research in topics including computer science and relation extraction. The author has an h-index of 15 and has co-authored 28 publications receiving 2183 citations.

Papers
Proceedings ArticleDOI

Stanza: A Python Natural Language Processing Toolkit for Many Human Languages

TL;DR: This work introduces Stanza, an open-source Python natural language processing toolkit supporting 66 human languages that features a language-agnostic fully neural pipeline for text analysis, including tokenization, multi-word token expansion, lemmatization, part-of-speech and morphological feature tagging, dependency parsing, and named entity recognition.
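A minimal usage sketch of the toolkit, assuming a recent Stanza release (the default processor set and model names may differ by version):

```python
import stanza

# Download the English models and build a default pipeline
# (tokenization, multi-word token expansion, POS, lemma, depparse, NER).
stanza.download("en")
nlp = stanza.Pipeline("en")

doc = nlp("Yuhao Zhang is a researcher at Stanford University.")
for sentence in doc.sentences:
    for word in sentence.words:
        # surface form, universal POS tag, lemma, head index, dependency relation
        print(word.text, word.upos, word.lemma, word.head, word.deprel)
for ent in doc.ents:
    # named entities with their types
    print(ent.text, ent.type)
```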
Proceedings ArticleDOI

Graph Convolution over Pruned Dependency Trees Improves Relation Extraction.

TL;DR: This work proposes an extension of graph convolutional networks tailored for relation extraction, which pools information over arbitrary dependency structures efficiently in parallel, and applies a novel pruning strategy that keeps only the words immediately around the shortest path between the two entities among which a relation might hold.
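A minimal PyTorch sketch of the core idea (a graph convolution layer over a dependency-tree adjacency matrix); the layer sizes, normalization, and the path-based pruning used to build `adj` are illustrative, not the authors' released implementation:

```python
import torch
import torch.nn as nn


class DependencyGCNLayer(nn.Module):
    """One graph convolution layer over a (pruned) dependency tree."""

    def __init__(self, dim):
        super().__init__()
        self.linear = nn.Linear(dim, dim)

    def forward(self, x, adj):
        # x:   (batch, seq_len, dim) token representations
        # adj: (batch, seq_len, seq_len) adjacency of the pruned dependency tree,
        #      symmetrized and with self-loops added; pruning keeps only words
        #      close to the shortest path between the two entities
        degree = adj.sum(dim=2, keepdim=True).clamp(min=1)
        h = self.linear(adj.bmm(x)) / degree  # aggregate neighbors, project, normalize
        return torch.relu(h)
```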
Proceedings ArticleDOI

Position-aware Attention and Supervised Data Improve Slot Filling

TL;DR: This work proposes an effective new model that combines an LSTM sequence model with a form of entity position-aware attention better suited to relation extraction, and builds TACRED, a large supervised relation extraction dataset obtained via crowdsourcing and targeted toward TAC KBP relations.
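A simplified PyTorch sketch of the position-aware attention component (the published model also conditions the attention scores on a sentence summary vector; names and sizes here are illustrative):

```python
import torch
import torch.nn as nn


class PositionAwareAttention(nn.Module):
    """Attend over LSTM outputs using the hidden states together with each
    token's relative position to the subject and object entities."""

    def __init__(self, hidden_dim, pos_dim, max_dist=50):
        super().__init__()
        # embeddings of the (clipped) relative distance to each entity
        self.subj_pos_emb = nn.Embedding(2 * max_dist + 1, pos_dim)
        self.obj_pos_emb = nn.Embedding(2 * max_dist + 1, pos_dim)
        self.score = nn.Linear(hidden_dim + 2 * pos_dim, 1)

    def forward(self, lstm_out, subj_pos, obj_pos):
        # lstm_out: (batch, seq_len, hidden_dim)
        # subj_pos, obj_pos: (batch, seq_len) clipped relative positions, shifted to be >= 0
        feats = torch.cat(
            [lstm_out, self.subj_pos_emb(subj_pos), self.obj_pos_emb(obj_pos)], dim=2
        )
        weights = torch.softmax(self.score(feats).squeeze(2), dim=1)
        # weighted sum of hidden states -> a single relation representation
        return torch.bmm(weights.unsqueeze(1), lstm_out).squeeze(1)
```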
Journal Article

Contrastive Learning of Medical Visual Representations from Paired Images and Text

TL;DR: This work proposes an alternative unsupervised strategy to learn medical visual representations directly from the naturally occurring pairing of images and textual data, and shows that this method leads to image representations that considerably outperform strong baselines in most settings.
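A rough sketch of the bidirectional image-text contrastive objective described above, assuming the image and text encoders have already produced projected embeddings (temperature and loss weighting are illustrative):

```python
import torch
import torch.nn.functional as F


def paired_contrastive_loss(img_emb, txt_emb, temperature=0.1):
    """Symmetric InfoNCE-style loss over a batch of paired image/text embeddings."""
    img = F.normalize(img_emb, dim=1)  # (batch, dim)
    txt = F.normalize(txt_emb, dim=1)  # (batch, dim)
    logits = img @ txt.t() / temperature  # cosine similarity of every image/text pair
    targets = torch.arange(img.size(0), device=img.device)
    loss_i2t = F.cross_entropy(logits, targets)      # match each image to its report
    loss_t2i = F.cross_entropy(logits.t(), targets)  # and each report to its image
    return 0.5 * (loss_i2t + loss_t2i)
```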
Proceedings ArticleDOI

Universal Dependency Parsing from Scratch

TL;DR: The authors propose a complete neural pipeline system that takes raw text as input and performs all tasks required by the CoNLL 2018 shared task, from tokenization and sentence segmentation to POS tagging and dependency parsing.
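A minimal sketch using the stanfordnlp package that accompanied this system (later superseded by Stanza); the download call and processor names may differ across versions:

```python
import stanfordnlp

# Fetch the English UD models and assemble the full pipeline:
# tokenization/sentence split, multi-word expansion, POS, lemma, dependency parse.
stanfordnlp.download("en")
nlp = stanfordnlp.Pipeline(processors="tokenize,mwt,pos,lemma,depparse", lang="en")

doc = nlp("The prospects for Britain's orderly withdrawal from the European Union "
          "on March 29 have receded further.")
doc.sentences[0].print_dependencies()  # (word, head index, dependency relation) triples
```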