Liang Ding
Researcher at National University of Defense Technology
Publications - 170
Citations - 2334
Liang Ding is an academic researcher from National University of Defense Technology. The author has contributed to research in topics: Thin film & Computer science. The author has an h-index of 22, having co-authored 152 publications that received 1782 citations. Previous affiliations of Liang Ding include University of Texas–Pan American & Nanyang Technological University.
Papers
Journal ArticleDOI
Human MicroRNA Target Prediction via Multi-Hypotheses Learning.
TL;DR: An algorithm is introduced that aims not only to predict microRNA targets accurately but also to assist in vivo experiments in understanding the mechanisms of targeting; it outperforms state-of-the-art data-driven approaches such as deep learning models, machine learning algorithms, and rule-based methods.
Proceedings ArticleDOI
High speed energy-efficient germanium electro-absorption modulator featuring monolithic integration with germanium p-i-n photodetector
Andy Eu-Jin Lim,Tsung-Yang Liow,Qing Fang,Ning Duan,Liang Ding,Mingbin Yu,Guo-Qiang Lo,Dim-Lee Kwong +7 more
TL;DR: In this article, a novel evanescent-coupled germanium electro-absorption modulator is presented, with a small active area of 16 µm2 giving an extinction ratio of ∼ dB over the 1580-1610 nm wavelength range.
Posted Content
Polynomial kernels collapse the W-hierarchy.
TL;DR: This work establishes a close relationship between polynomial (and exponential) kernelizability and the existence of sub-exponential time algorithms for a spectrum of circuit satisfiability problems in FPT.
Proceedings ArticleDOI
The study of adhesive performance within backside-via revealing
Hongyu Li,Liang Ding,G. Q. Lo +2 more
TL;DR: In this paper, the backside dielectric deposition at specific pattern was found to cause the delamination in TSV wafers after temporary bonding, which is a critical process within 3D integration.
Posted Content
Understanding and Improving Lexical Choice in Non-Autoregressive Translation
TL;DR: The authors propose exposing the raw data to NAT models to restore the useful information of low-frequency words that is missed in the distilled data, and introduce an extra Kullback-Leibler divergence term derived by comparing the lexical choice of the NAT model with that embedded in the raw data.