Lei Wang

Researcher at Chinese Academy of Sciences

Publications - 437
Citations - 7,401

Lei Wang is an academic researcher from the Chinese Academy of Sciences. The author has contributed to research on topics including big data and catalysis, has an h-index of 38, and has co-authored 386 publications receiving 5,547 citations. Previous affiliations of Lei Wang include Huaibei Normal University and Xinyang Normal University.

Papers
Proceedings Article

BigDataBench: A big data benchmark suite from internet services

TL;DR: The big data benchmark suite BigDataBench not only covers broad application scenarios but also includes diverse and representative data sets; the authors comprehensively characterize the 19 big data workloads included in BigDataBench under varying data inputs.
Journal Article

RIC-seq for global in situ profiling of RNA–RNA spatial interactions

TL;DR: RNA in situ conformation sequencing (RIC-seq) enables the generation of three-dimensional RNA–RNA interaction maps in cells, shedding light on the spatial interactions and regulatory functions of RNA.
Journal Article

Transcriptional regulation of strigolactone signalling in Arabidopsis

TL;DR: Many of the molecular targets of strigolactones (plant hormones involved in development and in interactions with symbiotic and parasitic organisms) are uncovered, revealing how strigolactones function and pointing to an intriguing self-regulatory role for a downstream transcription factor.
Journal Article

Merging Photoredox with Palladium Catalysis: Decarboxylative ortho-Acylation of Acetanilides with α-Oxocarboxylic Acids under Mild Reaction Conditions.

TL;DR: A room-temperature decarboxylative ortho-acylation of acetanilides with α-oxocarboxylic acids has been developed via a novel dual catalytic system combining Eosin Y with palladium; the reaction shows a broad substrate scope and good functional group tolerance.
Book Chapter

Cosine Normalization: Using Cosine Similarity Instead of Dot Product in Neural Networks

TL;DR: Cosine normalization, as discussed by the authors, uses cosine similarity instead of the dot product in neural networks, which reduces the variance of the neuron's pre-activation and decreases the internal covariate shift of the network.
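
The mechanism is simple enough to illustrate directly. Below is a minimal NumPy sketch (not the paper's implementation; the function names are hypothetical) contrasting the ordinary dot-product pre-activation w·x with the cosine-normalized version w·x / (|w||x|), whose output is bounded to [-1, 1] regardless of input or weight scale, which is what tames the pre-activation variance.

```python
import numpy as np

def dot_preactivation(w, x):
    # Ordinary neuron pre-activation: unbounded and scale-dependent.
    return np.dot(w, x)

def cosine_preactivation(w, x, eps=1e-8):
    # Cosine normalization: divide the dot product by both vector norms,
    # bounding the result to [-1, 1] regardless of |w| or |x|.
    return np.dot(w, x) / (np.linalg.norm(w) * np.linalg.norm(x) + eps)

rng = np.random.default_rng(0)
w = rng.normal(size=128)
x = 100.0 * rng.normal(size=128)   # deliberately large-magnitude input

print(dot_preactivation(w, x))     # large value, grows with the input scale
print(cosine_preactivation(w, x))  # stays within [-1, 1]
```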