Zhen Li
Researcher at Wuhan University
Publications: 3347
Citations: 95191
Zhen Li is an academic researcher at Wuhan University who has contributed to research in the topics of Medicine and Computer science. The author has an h-index of 127, has co-authored 1712 publications, and has received 71351 citations. Previous affiliations of Zhen Li include Tsinghua University and the Hong Kong University of Science and Technology.
Papers
Journal Article
New second-order nonlinear optical (NLO) hyperbranched polymers containing isolation chromophore moieties derived from one-pot “A2 + B4” approach via Suzuki coupling reaction
TL;DR: In this article, a facile synthetic route was designed to prepare two new hyperbranched polymers HP1 and HP2 useful as second-order nonlinear optical (NLO) materials through a one-pot "A2 + B4" approach via a simple Suzuki coupling reaction.
Journal Article
E2F1 suppresses Wnt/β-catenin activity through transactivation of β-catenin interacting protein ICAT.
TL;DR: Evidence is provided that β-catenin interacting protein 1 (CTNNBIP1), also known as ICAT, functions as a crucial node mediating the cross talk between E2F1 and β-catenin signaling. It is shown that ICAT is a direct transcriptional target of E2F1, and that activation of ICAT by E2F1 is required for E2F1 to inhibit β-catenin activity.
Journal Article
Inhomogeneous Doping of Perovskite Materials by Dopants from Hole-Transport Layer
Chuanxiao Xiao,Fei Zhang,Zhen Li,Steven P. Harvey,Xihan Chen,Kang Wang,Chun-Sheng Jiang,Kai Zhu,Mowafak Al-Jassim +8 more
TL;DR: In this paper, the authors show that Li diffusion increases the surface potential of the perovskite film, revealing the hidden doping effects of dopants from the organic hole-transport layer (HTL).
Journal Article
Influences of Conjugation Extent on the Aggregation‐Induced Emission Quantum Efficiency in Silole Derivatives: A Computational Study
TL;DR: It is found that the solid-state fluorescence quantum efficiency first increases sharply with the degree of π-conjugation of the 2,5-substituents, then levels off, and finally starts to decrease slightly.
Proceedings Article
ET-BERT: A Contextualized Datagram Representation with Pre-training Transformers for Encrypted Traffic Classification
TL;DR: This paper proposes a new traffic representation model, Encrypted Traffic Bidirectional Encoder Representations from Transformer (ET-BERT), which pre-trains deep contextualized datagram-level representations from large-scale unlabeled data and achieves state-of-the-art performance across five encrypted traffic classification tasks.
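The core preprocessing idea behind a datagram-level representation like ET-BERT's is to turn raw packet bytes into BERT-style tokens before feeding them to a Transformer encoder. The sketch below illustrates one such tokenization using overlapping hex byte bi-grams; it is a minimal, hypothetical illustration, not the authors' implementation, and the function name `datagram_to_tokens` is an assumption.

```python
def datagram_to_tokens(payload: bytes, max_tokens: int = 128) -> list[str]:
    """Illustrative sketch (not ET-BERT's actual code): encode a raw
    datagram payload as overlapping byte bi-gram tokens, i.e. each token
    is the hex of two adjacent bytes, truncated to max_tokens."""
    hex_bytes = [f"{b:02x}" for b in payload]
    # Pair each byte with its successor: b"\x16\x03" -> token "1603".
    tokens = [hex_bytes[i] + hex_bytes[i + 1] for i in range(len(hex_bytes) - 1)]
    return tokens[:max_tokens]

# Example: the first bytes of a TLS record header.
print(datagram_to_tokens(b"\x16\x03\x01\x00\xf4"))
# -> ['1603', '0301', '0100', '00f4']
```

The resulting token sequence plays the role that word pieces play in text BERT: it can be vocabulary-indexed, masked, and pre-trained on large volumes of unlabeled traffic before fine-tuning on a labeled classification task.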