Author

Jun Lu

Bio: Jun Lu is an academic researcher from the Chinese Academy of Sciences. The author has contributed to research in the topics of Medicine & Materials science. The author has an h-index of 135, has co-authored 1526 publications, and has received 99767 citations. Previous affiliations of Jun Lu include Drexel University & Argonne National Laboratory.


Papers
Proceedings ArticleDOI
26 Jul 2022
TL;DR: This paper proposes an embedding adaptive-update strategy that avoids feature drift: old knowledge is maintained by a hyper-class representation, while category embeddings are adaptively updated with a class-attention scheme to incorporate new classes learned in individual sessions, thereby addressing IFSS problems.
Abstract: Incremental few-shot semantic segmentation (IFSS) aims to incrementally expand a model's capacity to segment new classes of images supervised by only a few samples. However, features learned on old classes can drift significantly, causing catastrophic forgetting. Moreover, the few samples available for pixel-level segmentation on new classes lead to notorious overfitting in each learning session. In this paper, we explicitly represent class-based knowledge for semantic segmentation as a category embedding and a hyper-class embedding, where the former describes exclusive semantic properties and the latter expresses hyper-class knowledge as class-shared semantic properties. To solve IFSS problems, we present EHNet, an Embedding adaptive-update and Hyper-class representation Network, built on two ideas. First, we propose an embedding adaptive-update strategy that avoids feature drift: it maintains old knowledge through the hyper-class representation and adaptively updates category embeddings with a class-attention scheme to incorporate new classes learned in individual sessions. Second, to resist the overfitting caused by few training samples, a hyper-class embedding is learned by clustering all category embeddings for initialization and is aligned with the category embedding of the new class for enhancement, so that learned knowledge assists in learning new knowledge and performance depends less on training-data scale. Together, these two designs represent classes with sufficient semantics and limited bias, enabling the network to perform segmentation tasks that require high semantic dependence. Experiments on the PASCAL-5i and COCO datasets show that EHNet achieves new state-of-the-art performance with remarkable advantages.
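The clustering-based initialization described above can be sketched in a few lines. This is a minimal k-means sketch under assumptions, not the paper's code; the function name `init_hyper_class` and the toy embeddings are illustrative:

```python
import numpy as np

def init_hyper_class(category_embs, n_hyper, n_iters=50, seed=0):
    """Initialize hyper-class embeddings by clustering category embeddings.

    A minimal k-means sketch (illustrative, not EHNet's implementation):
    each hyper-class embedding starts as the centroid of a cluster of
    category embeddings, capturing class-shared semantics.
    """
    rng = np.random.default_rng(seed)
    centroids = category_embs[rng.choice(len(category_embs), n_hyper, replace=False)]
    for _ in range(n_iters):
        # Assign each category embedding to its nearest centroid.
        dists = np.linalg.norm(category_embs[:, None] - centroids[None], axis=-1)
        labels = dists.argmin(axis=1)
        # Move each centroid to the mean of its assigned embeddings.
        for k in range(n_hyper):
            if (labels == k).any():
                centroids[k] = category_embs[labels == k].mean(axis=0)
    return centroids

# Four toy 2-D category embeddings forming two obvious groups.
embs = np.array([[0.0, 0.0], [0.1, 0.0], [5.0, 5.0], [5.1, 5.0]])
hyper = init_hyper_class(embs, n_hyper=2)
print(hyper.shape)  # (2, 2): one hyper-class embedding per cluster
```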

19 citations

Journal ArticleDOI
TL;DR: In this paper, the sedimentary sequence of this sea area classified by color identification is shown to correspond well with the oxygen isotope stratigraphy, indicating that paleoclimatic changes drove the rise and fall of sea level, the opening and closing of strait passages, and upwelling activity; these in turn changed the oxidation-reduction conditions of the deep water and the dynamic environment of the sea area, producing the different seabed sediment features.
Abstract: Core NS-93-5 was taken from the gentle slope terrace of the southern South China Sea (SCS), which has preserved a steady depositional record of the normal marine environment since the late Quaternary. A high-resolution sedimentary sequence and oxygen isotope stratigraphy covering nearly 200 ka has been established for the southern SCS. Comparative analysis with the GISP2 ice core reveals a depositional record of Dansgaard-Oeschger (D/O) events 1-21 and Heinrich events H1-H6 in the southern SCS, reflecting rapid climate change on short time scales since the last interglacial stage; this indicates a paleoclimatic teleconnection between the southern SCS and the Arctic over the last 200 ka, as well as the instability of the Western Pacific Warm Pool. This note shows that the sedimentary sequence of this sea area classified by color identification corresponds well with that classified by oxygen isotope stratigraphy. The color of the sediment changes with the climate, generally lagging it by 1-1.5 ka. It is suggested that paleoclimatic changes drove the rise and fall of sea level, the opening and closing of strait passages, and upwelling activity, which changed the oxidation-reduction conditions of the deep water and the dynamic environment of this sea area, producing the different seabed sediment features. In addition, a volcanic ash layer about 17 cm thick has been found at the oxygen isotope stage 4/5 transition, which is related to the Toba volcanic eruption.

19 citations

Journal ArticleDOI
TL;DR: In this article, a tensile biaxial stress was shown to promote the stabilization of wurtzite ScxAl1-xN/AlN superlattices even for the highest investigated Sc concentration, x = 0.375.

19 citations

Journal ArticleDOI
TL;DR: In this article, a nitrogen-doped porous carbon sheet is prepared by in situ polymerization of pyrrole on both sides of graphene oxide; the polypyrrole layers are transformed into N-doped porous carbon layers during the subsequent carbonization, forming a sandwich structure.
Abstract: A nitrogen (N)-doped porous carbon sheet is prepared by in situ polymerization of pyrrole on both sides of graphene oxide; the polypyrrole layers are then transformed into N-doped porous carbon layers during the subsequent carbonization, and a sandwich structure is formed. Such a sheet-like structure possesses a high specific surface area and, more importantly, guarantees sufficient utilization of the N-doped active porous sites. The internal graphene layer acts as an excellent electron pathway, while the external thin, porous carbon layer helps to decrease the ion diffusion resistance during electrochemical reactions. As a result, this sandwich structure exhibits prominent catalytic activity toward the oxygen reduction reaction in alkaline media, as evidenced by a more positive onset potential, a larger diffusion-limited current, and better durability and poison tolerance than commercial Pt/C. This study shows a novel method of using graphene to template the traditional poro...

18 citations

Journal ArticleDOI
TL;DR: In this paper, a molecular classification of knee osteoarthritis (KOA) is proposed, based on the temporal alteration of representative molecules that can be detected in body fluids, including synovial fluid, urine, and blood.
Abstract: Knee osteoarthritis (KOA) is the most common form of joint degeneration, with increasing prevalence and incidence in recent decades. KOA is a molecular disorder characterized by the interplay of numerous molecules, a considerable number of which can be detected in body fluids, including synovial fluid, urine, and blood. However, the current diagnosis and treatment of KOA rely mainly on clinical and imaging manifestations, neglecting its molecular pathophysiology. The mismatch between participants' molecular characteristics and drug therapeutic mechanisms might explain the failure of some disease-modifying drugs in clinical trials. Hence, according to the temporal alteration of representative molecules, we propose a novel molecular classification of KOA divided into pre-KOA, early KOA, progressive KOA, and end-stage KOA. Progressive KOA is further divided into four subtypes, cartilage degradation-driven, bone remodeling-driven, inflammation-driven, and pain-driven, based on the major pathophysiology in patient clusters. Multiple clinical findings on representative molecules investigated in recent years are reviewed and categorized. This molecular classification allows for the prediction of high-risk KOA individuals, the diagnosis of early KOA patients, the assessment of therapeutic efficacy, and, in particular, the selection of homogeneous patients who may benefit most from the appropriate therapeutic agents.

18 citations


Cited by
Journal ArticleDOI
04 Mar 2011-Cell
TL;DR: Recognition of the widespread applicability of these concepts will increasingly affect the development of new means to treat human cancer.

51,099 citations

Journal ArticleDOI
TL;DR: The Gene Set Enrichment Analysis (GSEA) method as discussed by the authors focuses on gene sets, that is, groups of genes that share common biological function, chromosomal location, or regulation.
Abstract: Although genomewide RNA expression analysis has become a routine tool in biomedical research, extracting biological insight from such information remains a major challenge. Here, we describe a powerful analytical method called Gene Set Enrichment Analysis (GSEA) for interpreting gene expression data. The method derives its power by focusing on gene sets, that is, groups of genes that share common biological function, chromosomal location, or regulation. We demonstrate how GSEA yields insights into several cancer-related data sets, including leukemia and lung cancer. Notably, where single-gene analysis finds little similarity between two independent studies of patient survival in lung cancer, GSEA reveals many biological pathways in common. The GSEA method is embodied in a freely available software package, together with an initial database of 1,325 biologically defined gene sets.
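The core idea of focusing on gene sets rather than single genes can be sketched as a running-sum enrichment statistic over a ranked gene list. This is a simplified, unweighted Kolmogorov-Smirnov-style sketch, not the GSEA software's weighted statistic; the gene names and the function `enrichment_score` are illustrative:

```python
def enrichment_score(ranked_genes, gene_set):
    """Simplified GSEA-style enrichment score (unweighted KS statistic).

    Walk down the ranked gene list; step the running sum up when a gene
    belongs to the set, down otherwise, and return the maximum deviation
    from zero. A large deviation means set members cluster near one end
    of the ranking, i.e. the set is enriched.
    """
    hits = set(gene_set) & set(ranked_genes)
    if not hits:
        return 0.0
    hit_step = 1.0 / len(hits)                          # step up on a hit
    miss_step = 1.0 / (len(ranked_genes) - len(hits))   # step down on a miss
    running, best = 0.0, 0.0
    for gene in ranked_genes:
        running += hit_step if gene in hits else -miss_step
        if abs(running) > abs(best):
            best = running
    return best

# Genes ranked by correlation with a phenotype (illustrative names),
# scored against a hypothetical two-gene pathway.
ranked = ["TP53", "MYC", "EGFR", "BRCA1", "GAPDH", "ACTB"]
pathway = {"TP53", "BRCA1"}
print(enrichment_score(ranked, pathway))  # 0.5
```

In the published method the hit steps are weighted by each gene's correlation with the phenotype and significance is assessed by permutation, but the running-sum structure is the same.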

34,830 citations

Journal ArticleDOI


08 Dec 2001-BMJ
TL;DR: There is, I think, something ethereal about i, the square root of minus one: it seemed an odd beast, an intruder hovering on the edge of reality.
Abstract: There is, I think, something ethereal about i —the square root of minus one. I remember first hearing about it at school. It seemed an odd beast at that time—an intruder hovering on the edge of reality. Usually familiarity dulls this sense of the bizarre, but in the case of i it was the reverse: over the years the sense of its surreal nature intensified. It seemed that it was impossible to write mathematics that described the real world in …

33,785 citations

01 May 1993
TL;DR: Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems.
Abstract: Three parallel algorithms for classical molecular dynamics are presented. The first assigns each processor a fixed subset of atoms; the second assigns each a fixed subset of inter-atomic forces to compute; the third assigns each a fixed spatial region. The algorithms are suitable for molecular dynamics models that can be difficult to parallelize efficiently: those with short-range forces where the neighbors of each atom change rapidly. They can be implemented on any distributed-memory parallel machine that allows message-passing of data between independently executing processors. The algorithms are tested on a standard Lennard-Jones benchmark problem for system sizes ranging from 500 to 100,000,000 atoms on several parallel supercomputers: the nCUBE 2, Intel iPSC/860 and Paragon, and Cray T3D. Comparing the results to the fastest reported vectorized Cray Y-MP and C90 algorithm shows that the current generation of parallel machines is competitive with conventional vector supercomputers even for small problems. For large problems, the spatial algorithm achieves parallel efficiencies of 90%, and a 1840-node Intel Paragon performs up to 165 times faster than a single Cray C90 processor. Trade-offs between the three algorithms and guidelines for adapting them to more complex molecular dynamics simulations are also discussed.
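The first and third decompositions described above can be sketched as simple assignment functions. This is an illustrative sketch, not the paper's implementation; the round-robin assignment, 1-D cell binning, and function names are assumptions:

```python
def atom_decomposition(n_atoms, n_procs):
    """Assign each processor a fixed subset of atoms (round-robin).

    In an atom-decomposition MD code, processor p computes forces on its
    own atoms every timestep regardless of where they move in space, so
    positions must be exchanged all-to-all.
    """
    return {p: list(range(p, n_atoms, n_procs)) for p in range(n_procs)}

def spatial_decomposition(positions, box_len, n_cells):
    """Assign atoms to processors by spatial region (1-D cells here).

    Each processor owns one cell of the simulation box; atoms migrate
    between processors as they move, but with short-range forces each
    processor only communicates with neighboring cells, which is why
    the spatial algorithm scales best for large problems.
    """
    cell_len = box_len / n_cells
    owner = {c: [] for c in range(n_cells)}
    for i, x in enumerate(positions):
        owner[min(int(x / cell_len), n_cells - 1)].append(i)
    return owner

# 8 atoms split across 2 processors, then 4 atoms binned spatially.
print(atom_decomposition(8, 2))                         # {0: [0, 2, 4, 6], 1: [1, 3, 5, 7]}
print(spatial_decomposition([0.1, 0.9, 0.4, 0.6], 1.0, 2))  # {0: [0, 2], 1: [1, 3]}
```

The force-decomposition variant (the paper's second algorithm) instead partitions the pairwise force matrix among processors; it is omitted here for brevity.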

29,323 citations