Siliang Tang

Researcher at Zhejiang University

Publications: 119
Citations: 1581

Siliang Tang is an academic researcher from Zhejiang University. The author has contributed to research in topics: Computer science & Context (language use). The author has an h-index of 17 and has co-authored 112 publications receiving 1017 citations. Previous affiliations of Siliang Tang include Maynooth University.

Papers
Journal Article

Sparse Multi-Modal Hashing

TL;DR: Experimental results show that SM2H outperforms other methods in terms of mAP and Percentage on two real-world datasets.
Journal Article

Rethinking the Bottom-Up Framework for Query-Based Video Localization

TL;DR: It is argued that the performance of the bottom-up framework is severely underestimated due to unreasonable designs in both the backbone and the head network, and a novel bottom-up model is designed: Graph-FPN with Dense Predictions (GDP).
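As a rough illustration of the "dense predictions" idea mentioned above (and not GDP's actual implementation), the sketch below scores every frame position of a video for being a segment start or end, conditioned on a query feature; the module names, shapes, and fusion scheme are all assumptions.

```python
# Hypothetical dense-prediction head for query-based video localization:
# every frame position emits start/end boundary scores, rather than
# scoring a sparse set of proposal anchors. Illustrative only; not GDP.
import torch
import torch.nn as nn

class DensePredictionHead(nn.Module):
    def __init__(self, dim: int = 256):
        super().__init__()
        # Fuse each frame feature with the (broadcast) query feature.
        self.fuse = nn.Linear(2 * dim, dim)
        # Per-frame logits for being a segment start or end.
        self.boundary = nn.Linear(dim, 2)

    def forward(self, frames: torch.Tensor, query: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, dim); query: (batch, dim)
        q = query.unsqueeze(1).expand(-1, frames.size(1), -1)
        h = torch.relu(self.fuse(torch.cat([frames, q], dim=-1)))
        return self.boundary(h)  # (batch, time, 2) start/end scores

scores = DensePredictionHead()(torch.randn(2, 100, 256), torch.randn(2, 256))
```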
Journal Article

MRFN: Multi-Receptive-Field Network for Fast and Accurate Single Image Super-Resolution

TL;DR: This paper proposes a new solution, the Multi-Receptive-Field Network (MRFN), which outperforms existing single-image super-resolution (SISR) solutions in three different aspects and achieves more accurate reconstruction than most state-of-the-art methods with significantly lower complexity.
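The multi-receptive-field idea can be sketched as parallel convolution branches with different kernel sizes whose outputs are fused; the branch count, kernel sizes, and 1x1 fusion below are illustrative assumptions, not the exact MRFN design.

```python
# Minimal sketch of a multi-receptive-field block: parallel conv branches
# with different kernel sizes see different spatial extents, and their
# outputs are concatenated and fused back to the input width.
import torch
import torch.nn as nn

class MultiReceptiveFieldBlock(nn.Module):
    def __init__(self, channels: int = 64):
        super().__init__()
        # Three branches with 3x3, 5x5, and 7x7 receptive fields (assumed).
        self.branches = nn.ModuleList(
            nn.Conv2d(channels, channels, k, padding=k // 2) for k in (3, 5, 7)
        )
        # 1x1 convolution fuses the concatenated branch outputs.
        self.fuse = nn.Conv2d(3 * channels, channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = torch.cat([torch.relu(b(x)) for b in self.branches], dim=1)
        return x + self.fuse(feats)  # residual connection

out = MultiReceptiveFieldBlock()(torch.randn(1, 64, 32, 32))
```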
Journal Article

Cascaded Deep Networks With Multiple Receptive Fields for Infrared Image Super-Resolution

TL;DR: A cascaded architecture of deep neural networks with multiple receptive fields is presented to increase the spatial resolution of infrared images by a large scale factor; it achieves improved reconstruction accuracy with significantly fewer parameters.
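A hedged sketch of the cascading idea: a large scale factor is reached by chaining x2 upsampling stages, each a small CNN that refines a bilinearly upsampled input. The stage internals here are assumptions; the paper's stages additionally employ multiple receptive fields.

```python
# Illustrative cascaded super-resolution pipeline: three x2 stages give
# an overall x8 scale factor. Stage design is a simplifying assumption.
import torch
import torch.nn as nn

class UpsampleStage(nn.Module):
    def __init__(self, channels: int = 1):
        super().__init__()
        self.up = nn.Upsample(scale_factor=2, mode="bilinear", align_corners=False)
        self.refine = nn.Sequential(
            nn.Conv2d(channels, 32, 3, padding=1), nn.ReLU(),
            nn.Conv2d(32, channels, 3, padding=1),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.up(x)
        return x + self.refine(x)  # residual refinement at this scale

# Cascade three x2 stages for an overall x8 super-resolution.
cascade = nn.Sequential(*[UpsampleStage() for _ in range(3)])
sr = cascade(torch.randn(1, 1, 16, 16))  # -> (1, 1, 128, 128)
```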
Proceedings Article

A low rank structural large margin method for cross-modal ranking

TL;DR: A general cross-modal ranking algorithm, Latent Semantic Cross-Modal Ranking (LSCMR), optimizes a listwise ranking loss with a low-rank embedding and shows significant improvements over state-of-the-art methods.
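For illustration only: the sketch below pairs a low-rank bilinear embedding with a ListNet-style softmax listwise loss. LSCMR's actual objective is a structural large-margin listwise loss, so the softmax variant here is a stand-in, and all dimensions are assumptions.

```python
# Low-rank cross-modal embedding plus a listwise loss. Both modalities
# are projected into a shared low-dimensional (rank-32) space, and a
# ListNet-style top-one loss compares predicted and ground-truth lists.
import torch
import torch.nn as nn
import torch.nn.functional as F

class LowRankRanker(nn.Module):
    def __init__(self, dim_img: int = 512, dim_txt: int = 300, rank: int = 32):
        super().__init__()
        # Low-rank factorization of the bilinear similarity matrix.
        self.img_proj = nn.Linear(dim_img, rank, bias=False)
        self.txt_proj = nn.Linear(dim_txt, rank, bias=False)

    def forward(self, imgs: torch.Tensor, txts: torch.Tensor) -> torch.Tensor:
        # Similarities of each query image against a list of candidate texts.
        return self.img_proj(imgs) @ self.txt_proj(txts).T  # (n_img, n_txt)

def listwise_loss(scores: torch.Tensor, relevance: torch.Tensor) -> torch.Tensor:
    # ListNet top-one loss: cross-entropy between the softmax of the
    # ground-truth relevance list and the softmax of predicted scores.
    return -(F.softmax(relevance, -1) * F.log_softmax(scores, -1)).sum(-1).mean()

model = LowRankRanker()
scores = model(torch.randn(4, 512), torch.randn(10, 300))
loss = listwise_loss(scores, torch.randn(4, 10))
```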