scispace - formally typeset
C

Chang Xu

Researcher at University of Sydney

Publications -  467
Citations -  13012

Chang Xu is an academic researcher from University of Sydney. The author has contributed to research in topics: Computer science & Chemistry. The author has an hindex of 42, co-authored 260 publications receiving 7189 citations. Previous affiliations of Chang Xu include University of Melbourne & Information Technology University.

Papers
More filters
Journal ArticleDOI

A compact, CW mid-infrared intra-cavity Nd:Lu0.5Y0.5VO4∖KTA-OPO at 3.5 μm

TL;DR: In this paper, a continuous-wave (CW) KTA (KTiOAsO4)-OPO with a compact linear cavity utilizing an LD pumped mixed crystal Nd:Lu 0.5Y0.5VO4 laser as the pump source was designed to reduce the OPO's threshold.
Posted Content

Full-Stack Filters to Build Minimum Viable CNNs.

TL;DR: Experiments demonstrate that the proposed method is able to construct minimum viable convolution networks of comparable performance and conduct theoretical analysis on the memory cost and an efficient implementation is introduced for the convolution of the proposed filters.
Journal ArticleDOI

TDFL: Truth Discovery Based Byzantine Robust Federated Learning

TL;DR: This is the first study that uses truth discovery to defend against poisoning attacks and the first scheme which can achieve strong robustness under multiple kinds of attacks launched by high proportion attackers without root datasets.
Journal ArticleDOI

Research on the degradation of ancient Longquan celadons in the Dalian Island shipwreck

TL;DR: In this paper , the degradation mechanism of the ancient porcelain at marine environment was investigated, and the results showed that these celadons can be divided into two types: transparent glazes and matt-opaque glazes.
Posted Content

Targeted Poisoning Attacks on Black-Box Neural Machine Translation.

TL;DR: It is shown that targeted attacks on black-box NMT systems are feasible, based on poisoning a small fraction of their parallel training data, and this attack can be realised practically via targeted corruption of web documents crawled to form the system’s training data.