
Jinguo Zhu

Researcher at Xi'an Jiaotong University

Publications: 10
Citations: 150

Jinguo Zhu is an academic researcher from Xi'an Jiaotong University. The author has contributed to research topics including Deep learning and Mutual information, has an h-index of 3, and has co-authored 8 publications receiving 15 citations.

Papers
Proceedings ArticleDOI

Layerwise Optimization by Gradient Decomposition for Continual Learning

TL;DR: This work decomposes the gradient of an old task into a part shared by all old tasks and a part specific to that task, and performs optimization on the gradients of each layer separately rather than on the concatenation of all gradients as in previous works.
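The decomposition idea can be illustrated with a toy sketch (not the authors' implementation; all names here are hypothetical): an old task's per-layer gradient is split into its projection onto a shared direction, taken here as the mean of the old-task gradients, plus a task-specific residual.

```python
# Hypothetical sketch of gradient decomposition for continual learning:
# split an old task's gradient into a component along a shared direction
# and a task-specific residual, computed per layer.

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def decompose(grad, shared_dir):
    """Split `grad` into its projection onto `shared_dir` plus a residual."""
    scale = dot(grad, shared_dir) / dot(shared_dir, shared_dir)
    shared = [scale * s for s in shared_dir]
    specific = [g - s for g, s in zip(grad, shared)]
    return shared, specific

# Per-layer gradients from two old tasks (toy 3-parameter layer).
g1 = [1.0, 2.0, 0.0]
g2 = [1.0, 0.0, 2.0]
# Shared direction: the mean of the old-task gradients.
mean_g = [(a + b) / 2 for a, b in zip(g1, g2)]

shared1, specific1 = decompose(g1, mean_g)
# The two parts reconstruct the original gradient exactly,
# and the residual is orthogonal to the shared direction.
assert all(abs(s + p - g) < 1e-9 for s, p, g in zip(shared1, specific1, g1))
assert abs(dot(specific1, mean_g)) < 1e-9
```

Operating per layer, as the TL;DR notes, means each layer's gradient is decomposed and optimized in its own (much smaller) space rather than in one concatenated parameter vector.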
Posted Content

Complementary Relation Contrastive Distillation

TL;DR: This paper proposes a novel knowledge distillation method, namely Complementary Relation Contrastive Distillation (CRCD), to transfer structural knowledge from the teacher to the student; it estimates the mutual relation in an anchor-based way and distills the anchor-student relation under the supervision of its corresponding anchor-teacher relation.
Proceedings ArticleDOI

Complementary Relation Contrastive Distillation

TL;DR: This work proposes Complementary Relation Contrastive Distillation (CRCD) to transfer structural knowledge from the teacher to the student, estimating the mutual relation in an anchor-based way and distilling the anchor-student relation under the supervision of its corresponding anchor-teacher relation.
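A toy illustration of the anchor-based relation idea (not the authors' implementation; the relation function and loss here are simplified assumptions): the student's relation to an anchor feature is pushed toward the teacher's relation to the same anchor.

```python
import math

def cosine(u, v):
    """Cosine similarity, used here as a stand-in relation function."""
    nu = math.sqrt(sum(a * a for a in u))
    nv = math.sqrt(sum(b * b for b in v))
    return sum(a * b for a, b in zip(u, v)) / (nu * nv)

def relation_loss(anchor, student_feat, teacher_feat):
    """Squared gap between the anchor-student and anchor-teacher relations."""
    return (cosine(anchor, student_feat) - cosine(anchor, teacher_feat)) ** 2

anchor = [1.0, 0.0]
teacher = [1.0, 1.0]
aligned_student = [2.0, 2.0]     # same direction as the teacher feature
misaligned_student = [0.0, 1.0]  # orthogonal to the anchor

# A student matching the teacher's relation to the anchor incurs ~zero loss;
# a mismatched student is penalized.
assert relation_loss(anchor, aligned_student, teacher) < 1e-9
assert relation_loss(anchor, misaligned_student, teacher) > 0.1
```

Distilling relations to anchors, rather than matching features directly, is what lets the method transfer structure between teacher and student even when their feature spaces differ.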
Journal ArticleDOI

A Deep Learning Method to Detect Foreign Objects for Inspecting Power Transmission Lines

TL;DR: A deep learning method to detect invading foreign objects for power transmission line inspection is proposed; it is based on a regression strategy with oriented bounding boxes to accurately predict the spatial location, orientation angle, and category of foreign objects in cluttered backgrounds.
Proceedings ArticleDOI

Uni-Perceiver-MoE: Learning Sparse Generalist Models with Conditional MoEs

TL;DR: By incorporating the proposed Conditional MoEs, the recently proposed generalist model Uni-Perceiver can effectively mitigate interference across tasks and modalities, achieving state-of-the-art results on a series of downstream tasks via prompt tuning on 1% of downstream data.
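A minimal sketch of the routing idea behind MoE layers (all names hypothetical; this is top-1 gating in miniature, not the paper's Conditional MoE): a gate scores each expert for a given token and dispatches the token to the best-scoring expert, so different inputs can use different parameters and interfere less.

```python
# Toy top-1 mixture-of-experts routing.

def gate_scores(token, gate_weights):
    """One score per expert: dot product of the token with a gate row."""
    return [sum(t * w for t, w in zip(token, row)) for row in gate_weights]

def route(token, experts, gate_weights):
    """Send the token to the single highest-scoring expert."""
    scores = gate_scores(token, gate_weights)
    best = max(range(len(experts)), key=lambda i: scores[i])
    return best, experts[best](token)

# Two toy "experts": one doubles the input, one negates it.
experts = [lambda t: [2 * x for x in t], lambda t: [-x for x in t]]
# Gate rows favour expert 0 for positive inputs, expert 1 for negative.
gate_w = [[1.0, 1.0], [-1.0, -1.0]]

idx, out = route([1.0, 2.0], experts, gate_w)
assert idx == 0 and out == [2.0, 4.0]
idx, out = route([-1.0, -2.0], experts, gate_w)
assert idx == 1 and out == [1.0, 2.0]
```

In a conditional variant, the gate's decision can additionally depend on task or modality information rather than on the token alone, which is how such models reduce cross-task interference.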