Institution

Dalian University of Technology

Education · Dalian, China
About: Dalian University of Technology is an education organization based in Dalian, China. It is known for its research contributions in the topics of Catalysis & Finite element method. The organization has 60890 authors who have published 71921 publications, receiving 1188356 citations. The organization is also known as: Dàlián Lǐgōng Dàxué.


Papers
Journal ArticleDOI
TL;DR: This critical review focuses on fluorescent or colorimetric sensors for thiols, organized by the distinct mechanisms between sensors and thiols, including Michael addition, cyclization with aldehyde, cleavage of sulfonamide and sulfonate ester by thiols, metal complexes-oxidation-reduction, metal complexes-displace coordination, nano-particles and others.
Abstract: Due to the biological importance of thiols, such as cysteine, homocysteine and glutathione, the development of optical probes for thiols has been an active research area in recent years. This critical review focuses on fluorescent or colorimetric sensors for thiols, organized by the distinct mechanisms between sensors and thiols, including Michael addition, cyclization with aldehyde, cleavage of sulfonamide and sulfonate ester by thiols, cleavage of selenium–nitrogen bond by thiols, cleavage of disulfide by thiols, metal complexes-oxidation–reduction, metal complexes-displace coordination, nano-particles and others (110 references).

1,395 citations

Proceedings ArticleDOI
14 Jun 2020
TL;DR: The Efficient Channel Attention (ECA) module as discussed by the authors adopts a local cross-channel interaction strategy without dimensionality reduction, which can be efficiently implemented via 1D convolution; the module involves only a handful of parameters while bringing a clear performance gain.
Abstract: Recently, the channel attention mechanism has demonstrated great potential for improving the performance of deep convolutional neural networks (CNNs). However, most existing methods are devoted to developing more sophisticated attention modules for better performance, which inevitably increases model complexity. To overcome the paradox of the performance-complexity trade-off, this paper proposes an Efficient Channel Attention (ECA) module, which involves only a handful of parameters while bringing a clear performance gain. By dissecting the channel attention module in SENet, we empirically show that avoiding dimensionality reduction is important for learning channel attention, and that appropriate cross-channel interaction can preserve performance while significantly decreasing model complexity. Therefore, we propose a local cross-channel interaction strategy without dimensionality reduction, which can be efficiently implemented via 1D convolution. Furthermore, we develop a method to adaptively select the kernel size of the 1D convolution, which determines the coverage of local cross-channel interaction. The proposed ECA module is both efficient and effective; e.g., against a ResNet50 backbone, our module adds only 80 parameters versus 24.37M and 4.7e-4 GFLOPs versus 3.86 GFLOPs, while boosting Top-1 accuracy by more than 2%. We extensively evaluate our ECA module on image classification, object detection and instance segmentation with ResNet and MobileNetV2 backbones. The experimental results show our module is more efficient while performing favorably against its counterparts.

1,378 citations
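As a rough illustration of the mechanism described in the abstract above, the sketch below implements ECA-style channel attention in PyTorch: global average pooling followed by a single 1D convolution across channels, with an adaptively chosen odd kernel size. The constants gamma=2 and b=1 in the kernel-size rule follow the published ECA-Net paper and should be read as assumptions rather than details stated in this listing.

```python
# Minimal sketch of an ECA-style channel attention block (PyTorch).
# Assumption: kernel-size rule k = |log2(C)/gamma + b/gamma|_odd with
# gamma=2, b=1, as in the published ECA-Net paper.
import math
import torch
import torch.nn as nn


class ECALayer(nn.Module):
    def __init__(self, channels: int, gamma: int = 2, b: int = 1):
        super().__init__()
        # Adaptive, odd kernel size: more channels -> wider local
        # cross-channel interaction range.
        t = int(abs(math.log2(channels) / gamma + b / gamma))
        k = t if t % 2 == 1 else t + 1
        self.avg_pool = nn.AdaptiveAvgPool2d(1)
        # A single 1D convolution over the channel axis replaces the FC
        # bottleneck of SENet, avoiding dimensionality reduction.
        self.conv = nn.Conv1d(1, 1, kernel_size=k, padding=k // 2, bias=False)
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (B, C, H, W) feature map from a convolutional backbone.
        y = self.avg_pool(x)                      # (B, C, 1, 1)
        y = y.squeeze(-1).transpose(-1, -2)       # (B, 1, C)
        y = self.conv(y)                          # local cross-channel mixing
        y = self.sigmoid(y).transpose(-1, -2).unsqueeze(-1)  # (B, C, 1, 1)
        return x * y                              # reweight input channels


# Example: attn = ECALayer(256); out = attn(torch.randn(2, 256, 32, 32))
```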

Proceedings ArticleDOI
16 Jun 2012
TL;DR: A simple yet robust tracking method based on a structural local sparse appearance model that exploits both partial and spatial information of the target via a novel alignment-pooling method, and employs a template update strategy combining incremental subspace learning and sparse representation.
Abstract: Sparse representation has been applied to visual tracking by finding the best candidate with minimal reconstruction error using target templates. However, most sparse representation based trackers consider only the holistic representation and do not make full use of the sparse coefficients to discriminate between the target and the background, and hence are more likely to fail when a similar object or occlusion appears in the scene. In this paper we develop a simple yet robust tracking method based on the structural local sparse appearance model. This representation exploits both partial information and spatial information of the target based on a novel alignment-pooling method. The similarity obtained by pooling across the local patches helps not only locate the target more accurately but also handle occlusion. In addition, we employ a template update strategy which combines incremental subspace learning and sparse representation. This strategy adapts the template to appearance changes of the target with a lower risk of drifting, and also reduces the influence of occluded target templates. Both qualitative and quantitative evaluations on challenging benchmark image sequences demonstrate that the proposed tracking algorithm performs favorably against several state-of-the-art methods.

1,305 citations
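The alignment-pooling idea in the abstract above can be sketched roughly as follows. The patch grid, the sparse solver (scikit-learn's generic SparseCoder), and the pooling details are simplifying assumptions for illustration, not the authors' exact implementation.

```python
# Rough sketch of alignment-pooling for a structural local sparse
# appearance model: grayscale templates and candidate of equal size are
# cut into patches at fixed grid positions, each candidate patch is
# sparse-coded against the dictionary of all template patches, and the
# coefficients aligned to the same position are pooled into a similarity.
import numpy as np
from sklearn.decomposition import SparseCoder


def extract_patches(image, grid=(4, 4), patch=8):
    """Sample local patches at fixed grid positions (row-major order)."""
    h, w = image.shape
    ys = np.linspace(0, h - patch, grid[0]).astype(int)
    xs = np.linspace(0, w - patch, grid[1]).astype(int)
    return np.array([image[y:y + patch, x:x + patch].ravel()
                     for y in ys for x in xs])       # (P, patch*patch)


def candidate_similarity(candidate, templates, alpha=0.01):
    """Score one candidate region against a set of target templates."""
    P = extract_patches(templates[0]).shape[0]       # patches per template
    # Dictionary rows: patch i of template t sits at row t*P + i.
    dictionary = np.vstack([extract_patches(t) for t in templates])
    dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True) + 1e-8

    cand = extract_patches(candidate)
    cand /= np.linalg.norm(cand, axis=1, keepdims=True) + 1e-8

    coder = SparseCoder(dictionary=dictionary,
                        transform_algorithm="lasso_lars",
                        transform_alpha=alpha)
    codes = coder.transform(cand)                    # (P, T*P)

    # Alignment-pooling: for patch position i, keep only coefficients of
    # atoms from the same position i across templates; occluded patches
    # contribute little to the pooled similarity.
    T = len(templates)
    aligned = np.array([codes[i, [t * P + i for t in range(T)]].sum()
                        for i in range(P)])
    return aligned.sum()                             # overall similarity
```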

Journal ArticleDOI
TL;DR: Wang et al. as mentioned in this paper evaluated and described green supply chain management (GSCM) drivers, practices and performance among various Chinese manufacturing organizations; based on a literature review, four propositions are put forward.
Abstract: Purpose – Green supply chain management (GSCM) has emerged as a key approach for enterprises seeking to become environmentally sustainable. This paper aims to evaluate and describe GSCM drivers, practices and performance among various Chinese manufacturing organizations.Design/methodology/approach – Based on a literature review, four propositions are put forward. An empirical study using survey research was completed. The survey questionnaire was designed with 54 items using literature and industry expert input. An exploratory factor analysis was conducted to derive groupings of GSCM pressures, practice and performance from the survey data which included 314 responses. A categorical and descriptive nature of the results is then presented with an evaluation and comparative analysis with previous research findings.Findings – Chinese enterprises have increased their environmental awareness due to regulatory, competitive, and marketing pressures and drivers. However, this awareness has not been translated int...

1,287 citations
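As a generic illustration of the exploratory-factor-analysis step described in the abstract above, the sketch below groups survey items into factors. Only the 54-item and 314-response figures come from the abstract; the five-factor count, the random placeholder data, and the use of scikit-learn's FactorAnalysis are assumptions standing in for whichever EFA procedure the authors used.

```python
# Generic exploratory factor analysis on Likert-scale survey responses,
# grouping items into candidate GSCM pressure/practice/performance factors.
import numpy as np
from sklearn.decomposition import FactorAnalysis
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
# Placeholder 1-5 Likert data: 314 respondents x 54 questionnaire items.
responses = rng.integers(1, 6, size=(314, 54)).astype(float)

# Standardize items, then extract a fixed number of latent factors.
X = StandardScaler().fit_transform(responses)
fa = FactorAnalysis(n_components=5, rotation="varimax", random_state=0)
fa.fit(X)

# Items with |loading| above a cutoff form a grouping
# (e.g. a "regulatory pressure" or "internal practice" factor).
loadings = fa.components_.T                  # shape: (54 items, 5 factors)
for f in range(loadings.shape[1]):
    items = np.where(np.abs(loadings[:, f]) > 0.4)[0]
    print(f"Factor {f + 1}: items {items.tolist()}")
```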

Proceedings ArticleDOI
17 Dec 2018
TL;DR: In this paper, a deep mutual learning (DML) strategy is proposed in which, instead of one-way knowledge transfer from a static pre-defined teacher to a student network, an ensemble of students learns collaboratively and teaches each other throughout the training process.
Abstract: Model distillation is an effective and widely used technique to transfer knowledge from a teacher to a student network. The typical application is to transfer from a powerful large network or ensemble to a small network, in order to meet the low-memory or fast execution requirements. In this paper, we present a deep mutual learning (DML) strategy. Different from the one-way transfer between a static pre-defined teacher and a student in model distillation, with DML, an ensemble of students learn collaboratively and teach each other throughout the training process. Our experiments show that a variety of network architectures benefit from mutual learning and achieve compelling results on both category and instance recognition tasks. Surprisingly, it is revealed that no prior powerful teacher network is necessary - mutual learning of a collection of simple student networks works, and moreover outperforms distillation from a more powerful yet static teacher.

1,286 citations
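A minimal sketch of one DML training step is shown below, under the simplifying assumption that all students in the cohort are updated from a single shared forward pass (the published method alternates updates and recomputes peer predictions); cohort size, loss weighting and optimizer choices are illustrative.

```python
# One deep mutual learning (DML) update (PyTorch): each student minimizes
# cross-entropy on the labels plus a KL "mimicry" term toward its peers'
# predictions; no pre-trained teacher is involved.
import torch.nn.functional as F


def dml_step(students, optimizers, x, y):
    """One mutual-learning update for a cohort of >= 2 student networks."""
    logits = [net(x) for net in students]

    losses = []
    for i in range(len(students)):
        ce = F.cross_entropy(logits[i], y)
        # Mimicry loss: KL divergence toward each peer's (detached) softmax.
        kl = sum(
            F.kl_div(F.log_softmax(logits[i], dim=1),
                     F.softmax(logits[j].detach(), dim=1),
                     reduction="batchmean")
            for j in range(len(students)) if j != i
        ) / (len(students) - 1)
        losses.append(ce + kl)

    # All losses use predictions from the same forward pass, so each
    # student's backward pass only touches its own graph.
    for opt in optimizers:
        opt.zero_grad()
    for loss in losses:
        loss.backward()
    for opt in optimizers:
        opt.step()
    return [loss.item() for loss in losses]
```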


Authors


Name                 H-index   Papers   Citations
Yang Yang            171       2644     153049
Yury Gogotsi         171       956      144520
Hui Li               135       2982     105903
Michael I. Posner    134       414      104201
Anders Hagfeldt      129       600      79912
Jian Zhou            128       3007     91402
Chao Zhang           127       3119     84711
Bin Wang             126       2226     74364
Chi Lin              125       1313     102710
Tao Zhang            123       2772     83866
Bo Wang              119       2905     84863
Zhenyu Zhang         118       1167     64887
Liang Cheng          116       1779     65520
Anthony G. Fane      112       565      40904
Xuelong Li           110       1044     46648
Network Information
Related Institutions (5)
Tsinghua University
200.5K papers, 4.5M citations

95% related

University of Science and Technology of China
101K papers, 2.4M citations

95% related

Zhejiang University
183.2K papers, 3.4M citations

93% related

Chinese Academy of Sciences
634.8K papers, 14.8M citations

93% related

Shanghai Jiao Tong University
184.6K papers, 3.4M citations

92% related

Performance
Metrics
No. of papers from the Institution in previous years
Year    Papers
2023    167
2022    838
2021    6,974
2020    6,457
2019    6,261
2018    5,375