Jun Wang
Researcher at Xihua University
Publications - 134
Citations - 3350
Jun Wang is an academic researcher from Xihua University. The author has contributed to research in topics including computer science and membrane computing. The author has an h-index of 28 and has co-authored 113 publications receiving 2,293 citations. Previous affiliations of Jun Wang include Huazhong University of Science and Technology.
Papers
Journal ArticleDOI
Audio watermarking scheme robust against desynchronization attacks based on kernel clustering
Hong Peng, Jun Wang, Zulin Zhang +2 more
TL;DR: An adaptive audio watermarking scheme based on the kernel fuzzy c-means (KFCM) clustering algorithm is proposed; it is robust against common signal-processing and desynchronization attacks.
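The TL;DR names the kernel fuzzy c-means (KFCM) algorithm. Below is a minimal NumPy sketch of textbook KFCM with a Gaussian kernel; the update rules, the deterministic initialization, and all parameter values are standard/assumed details for illustration, not taken from this paper's watermarking scheme.

```python
import numpy as np

def kfcm(X, c=2, m=2.0, sigma=1.0, iters=100, tol=1e-5):
    """Textbook kernel fuzzy c-means with a Gaussian kernel (illustrative sketch)."""
    n = X.shape[0]
    # simple deterministic initialization: centers spread across the samples
    V = X[np.linspace(0, n - 1, c).astype(int)].astype(float)
    U = np.zeros((c, n))
    for _ in range(iters):
        # Gaussian kernel between every center (rows) and every sample (cols)
        d2 = ((X[None, :, :] - V[:, None, :]) ** 2).sum(axis=2)   # (c, n)
        K = np.exp(-d2 / sigma ** 2)
        # membership update: u_ik proportional to (1 - K_ik)^(-1/(m-1))
        dist = np.clip(1.0 - K, 1e-12, None)                      # guard div-by-zero
        inv = dist ** (-1.0 / (m - 1.0))
        U_new = inv / inv.sum(axis=0, keepdims=True)
        # center update: kernel-weighted fuzzy mean of the samples
        w = (U_new ** m) * K                                      # (c, n)
        V = (w @ X) / w.sum(axis=1, keepdims=True)
        if np.abs(U_new - U).max() < tol:
            U = U_new
            break
        U = U_new
    return U, V
```

On two well-separated Gaussian blobs, `U.argmax(axis=0)` recovers the blob grouping; the kernel mapping is what lets KFCM separate clusters that plain fuzzy c-means handles poorly.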
Book ChapterDOI
Asynchronous extended spiking neural P systems with astrocytes
TL;DR: It is proved that asynchronous spiking neural P systems with astrocytes are universal (when using extended rules), and the construction is uniform in the sense that the form of the modules used is independent of the simulated register machine.
Journal ArticleDOI
Novel LMI-based stability and stabilization analysis on impulsive switched system with time delays
TL;DR: A novel Razumikhin function is constructed first; then, based on the LMI approach and optimization techniques, stability and stabilization conditions with good properties are derived and extended to the case with perturbations.
Journal ArticleDOI
Fuzzy variable structure control for uncertain systems with disturbance
TL;DR: In this paper, fuzzy variable structure control for uncertain systems with disturbance is introduced: a fuzzy system estimates the disturbance, and a switching control compensates for the approximation error; the approach is simple to design and effective in attenuating control chattering.
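The chattering attenuation mentioned above is a classic concern in variable structure (sliding-mode) control. The sketch below simulates a generic boundary-layer saturation on a disturbed double integrator; it illustrates the chattering-attenuation idea only, and the plant, gains, and saturation design are assumptions, not this paper's fuzzy scheme.

```python
import numpy as np

def sat(s, phi):
    """Boundary-layer saturation: a smooth stand-in for sign(s) that
    attenuates control chattering near the sliding surface."""
    return np.clip(s / phi, -1.0, 1.0)

def simulate(T=10.0, dt=1e-3, lam=2.0, k=3.0, phi=0.05):
    """Sliding-mode control of x'' = u + d(t) with bounded disturbance d."""
    x, v = 1.0, 0.0
    t = 0.0
    while t < T:
        d = 0.5 * np.sin(2 * np.pi * t)       # bounded disturbance, |d| <= 0.5
        s = v + lam * x                        # sliding variable s = x' + lam*x
        u = -lam * v - k * sat(s, phi)         # equivalent control + switching term
        v += (u + d) * dt                      # Euler integration
        x += v * dt
        t += dt
    return x, v
```

Because the switching gain `k` exceeds the disturbance bound, the state reaches a small neighborhood of the surface `s = 0` and then decays toward the origin; replacing `sat` with a hard `sign` would regulate equally well but at the cost of high-frequency chattering in `u`.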
Journal ArticleDOI
ConvSNP: a deep learning model embedded with SNP-like neurons
TL;DR: Based on SNP-like neurons, a new class of deep learning models, called ConvSNP models, is developed by referring to the structures of existing convolutional neural networks (CNNs); five models are created, each with a generalized linear function.