Author

Jeng-Shyang Pan

Bio: Jeng-Shyang Pan is an academic researcher from Shandong University of Science and Technology. The author has contributed to research in topics: Digital watermarking & Watermark. The author has an h-index of 50 and has co-authored 789 publications receiving 11,645 citations. Previous affiliations of Jeng-Shyang Pan include National Kaohsiung Normal University & Technical University of Ostrava.


Papers
Journal ArticleDOI
TL;DR: In this article, an improved Gannet Optimization Algorithm (GOA), named the Parallel Compact Gannet Optimization Algorithm (PCGOA), is proposed to solve five engineering optimization problems, and a performance study on the CEC2013 benchmark demonstrates the advantages of the new method in various aspects.
Abstract: The Gannet Optimization Algorithm (GOA) has good performance, but there is still room for improvement in memory consumption and convergence. In this paper, an improved Gannet Optimization Algorithm is proposed to solve five engineering optimization problems. The compact strategy enables the GOA to save a large amount of memory, and the parallel communication strategy allows the algorithm to avoid falling into local optimal solutions. We improve the GOA through the combination of parallel strategy and compact strategy, and we name the improved algorithm Parallel Compact Gannet Optimization Algorithm (PCGOA). The performance study of the PCGOA on the CEC2013 benchmark demonstrates the advantages of our new method in various aspects. Finally, the results of the PCGOA on solving five engineering optimization problems show that the improved algorithm can find the global optimal solution more accurately.
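
The abstract combines a compact strategy (replacing the population with a small probabilistic model to save memory) with a parallel strategy (groups that periodically exchange their best solutions). As a rough illustration of how these two ideas fit together, here is a minimal Python sketch on a toy objective; the update rules, parameters, and the `sphere` objective are illustrative assumptions, not the PCGOA update equations from the paper.

```python
import numpy as np

def sphere(x):
    """Toy objective; a CEC2013-style benchmark would be plugged in here."""
    return float(np.sum(x ** 2))

def compact_epoch(objective, mu, sigma, best, bounds, iters, rng):
    """One epoch of a 'compact' searcher: the population is replaced by a
    per-dimension sampling distribution (mean mu, spread sigma), which is
    what saves memory compared with storing a full population."""
    lo, hi = bounds
    best_x, best_f = best
    for _ in range(iters):
        cand = np.clip(rng.normal(mu, sigma), lo, hi)    # sample one virtual individual
        f = objective(cand)
        if f < best_f:
            best_x, best_f = cand.copy(), f
            mu = np.clip(mu + 0.1 * (cand - mu), lo, hi)   # move toward the improvement
        else:
            mu = np.clip(mu - 0.02 * (cand - mu), lo, hi)  # move away from the poor sample
        sigma = np.maximum(0.97 * sigma, 1e-6)             # gradually focus the search
    return mu, sigma, (best_x, best_f)

def parallel_compact(objective, dim=10, bounds=(-100.0, 100.0),
                     groups=4, rounds=10, iters_per_round=200, seed=0):
    """Parallel strategy: several compact searchers run side by side and
    periodically communicate the best solution found so far, which helps
    the search escape local optima."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    states = []
    for _ in range(groups):
        mu = rng.uniform(lo, hi, dim)
        sigma = np.full(dim, (hi - lo) / 2.0)
        states.append((mu, sigma, (mu.copy(), objective(mu))))
    for _ in range(rounds):
        states = [compact_epoch(objective, mu, sigma, best, bounds, iters_per_round, rng)
                  for mu, sigma, best in states]
        gbest_x, gbest_f = min((s[2] for s in states), key=lambda b: b[1])
        # communication step: nudge every group's distribution toward the global best
        states = [(0.5 * (mu + gbest_x), sigma, best) for mu, sigma, best in states]
    return gbest_x, gbest_f

if __name__ == "__main__":
    x_best, f_best = parallel_compact(sphere)
    print("best objective value found:", f_best)
```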

4 citations

Journal ArticleDOI
01 May 2022-Entropy
TL;DR: A novel deep architecture generation model based on Aquila optimization (AO) and a genetic algorithm (GA) is proposed, with a new encoding strategy that allows the evolutionary computing algorithm to be combined with CNNs; experimental results show that the proposed model performs well in terms of search accuracy and time.
Abstract: Manually designing a convolutional neural network (CNN) is an important deep learning approach for solving the problem of image classification. However, most existing CNN structure designs consume a significant amount of time and computing resources. Over the years, the demand for neural architecture search (NAS) methods has been on the rise. Therefore, we propose a novel deep architecture generation model based on Aquila optimization (AO) and a genetic algorithm (GA). The main contributions of this paper are as follows. Firstly, a new encoding strategy representing the CNN structure is proposed, so that the evolutionary computing algorithm can be combined with CNNs. Secondly, a new location-update mechanism is proposed, which cleverly incorporates three typical GA operators into the model so that it can find the optimal solution in the limited search space. Thirdly, the proposed method can handle variable-length CNN structures by adding skip connections. Fourthly, combining traditional CNN layers with residual blocks and introducing a grouping strategy provides greater possibilities for searching for the optimal CNN structure. We evaluate the model on two notable datasets, MNIST and CIFAR-10. The experimental results show that the proposed model achieves good results in terms of search accuracy and time.
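
To make the encoding idea concrete, the sketch below shows one way a variable-length CNN structure (conv blocks, residual blocks, pooling, optional skip connections) could be represented as a genome and manipulated with typical GA operators (crossover, mutation). The gene fields, block types, and operator details are illustrative assumptions rather than the paper's exact encoding, and no network is actually trained here.

```python
import random

# Each gene describes one block; a genome is a variable-length list of genes.
CONV_CHOICES = [16, 32, 64, 128]           # output channels for a conv/residual block
BLOCK_TYPES = ["conv", "residual", "pool"]

def random_gene(rng):
    t = rng.choice(BLOCK_TYPES)
    if t == "pool":
        return {"type": "pool"}
    return {"type": t, "channels": rng.choice(CONV_CHOICES), "skip": rng.random() < 0.3}

def random_genome(rng, min_len=3, max_len=10):
    return [random_gene(rng) for _ in range(rng.randint(min_len, max_len))]

def crossover(a, b, rng):
    """One-point crossover that works on variable-length genomes."""
    ca, cb = rng.randint(1, len(a)), rng.randint(1, len(b))
    return a[:ca] + b[cb:], b[:cb] + a[ca:]

def mutate(genome, rng, p=0.2):
    """Typical GA operators: change, insert, or delete a block."""
    g = [dict(x) for x in genome]
    if rng.random() < p:
        g[rng.randrange(len(g))] = random_gene(rng)
    if rng.random() < p:
        g.insert(rng.randrange(len(g) + 1), random_gene(rng))
    if rng.random() < p and len(g) > 2:
        del g[rng.randrange(len(g))]
    return g

def describe(genome):
    """Human-readable view of the architecture a genome encodes."""
    parts = []
    for gene in genome:
        if gene["type"] == "pool":
            parts.append("pool")
        else:
            tag = f'{gene["type"]}{gene["channels"]}'
            parts.append(tag + "+skip" if gene.get("skip") else tag)
    return " -> ".join(parts)

if __name__ == "__main__":
    rng = random.Random(42)
    parent1, parent2 = random_genome(rng), random_genome(rng)
    child, _ = crossover(parent1, parent2, rng)
    child = mutate(child, rng)
    print(describe(child))
```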

4 citations

Proceedings ArticleDOI
18 Nov 2015
TL;DR: An improved JPS algorithm is proposed that not only tackles the congestion problem but also solves it more efficiently; experimental results show that it outperforms the other algorithms.
Abstract: Path planning is one of the most studied problems in fields such as robotics, unmanned aerial vehicles (UAVs), and vehicle navigation. Most path-planning algorithms generate candidate paths on a grid graph and then apply them to problems such as classical graph route searching. The A-star (A*) algorithm, Hierarchical Path-Finding A-star (HPA*), and Jump Point Search (JPS) are studied in this paper to compare their maze-searching capacity and their efficiency on different search maps. We also propose an improved JPS algorithm for symmetric grid graphs and compare the algorithms' search time and efficiency. Experiments on the same benchmarks show that our algorithm not only tackles the congestion problem but also solves it more efficiently; the results validate the improved JPS algorithm and show that it outperforms the other algorithms.
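
For readers unfamiliar with the baseline being compared, the following minimal sketch implements plain A* on a 4-connected grid; JPS keeps the same f = g + h framework but prunes symmetric paths by expanding only "jump points", which is not shown here. The grid layout and helper names are illustrative assumptions.

```python
import heapq
import itertools

def astar(grid, start, goal):
    """Plain A* on a 4-connected grid (0 = free, 1 = blocked), as a baseline."""
    rows, cols = len(grid), len(grid[0])

    def h(p):  # Manhattan distance: admissible on a 4-connected grid
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])

    tie = itertools.count()                      # breaks heap ties without comparing nodes
    open_heap = [(h(start), next(tie), 0, start, None)]
    came_from, g_cost = {}, {start: 0}
    while open_heap:
        _, _, g, node, parent = heapq.heappop(open_heap)
        if node in came_from:                    # already expanded with a better cost
            continue
        came_from[node] = parent
        if node == goal:                         # reconstruct the path back to start
            path = []
            while node is not None:
                path.append(node)
                node = came_from[node]
            return path[::-1]
        r, c = node
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get((nr, nc), float("inf")):
                    g_cost[(nr, nc)] = ng
                    heapq.heappush(open_heap, (ng + h((nr, nc)), next(tie), ng, (nr, nc), node))
    return None  # goal unreachable

if __name__ == "__main__":
    grid = [[0, 0, 0, 0],
            [1, 1, 0, 1],
            [0, 0, 0, 0],
            [0, 1, 1, 0]]
    print(astar(grid, (0, 0), (3, 3)))
```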

4 citations

Proceedings ArticleDOI
05 Sep 2007
TL;DR: Experimental results show that the proposed watermarking method can resist not only cropping attacks but also common signal-processing attacks such as JPEG compression, Gaussian noise, and filtering.
Abstract: In this paper, we propose a parity-modulation-based digital image watermarking scheme in the DWT domain, focusing on resisting cropping attacks. The watermark is embedded into the LL3 subband coefficients of the DWT by the parity modulation method. Experimental results show that the proposed watermarking method can resist not only cropping attacks but also common signal-processing attacks such as JPEG compression, Gaussian noise, and filtering.
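
A rough sketch of parity-modulation embedding in a DWT subband is given below, assuming the PyWavelets package and a Haar wavelet; the step size, the use of the level-3 approximation subband, and the helper names are illustrative assumptions rather than the exact parameters of the scheme above.

```python
import numpy as np
import pywt  # PyWavelets, assumed to be installed

def embed_parity(image, bits, level=3, step=8.0):
    """Illustrative parity-modulation embedding in the level-3 approximation
    subband of a Haar DWT: each selected coefficient is quantized so that the
    parity of its quantization index carries one watermark bit."""
    coeffs = pywt.wavedec2(image.astype(float), "haar", level=level)
    ll = coeffs[0].copy()
    flat = ll.ravel()
    for i, bit in enumerate(bits):
        q = int(np.floor(flat[i] / step))
        if q % 2 != bit:            # force the index parity to match the bit
            q += 1
        flat[i] = (q + 0.5) * step  # re-centre the coefficient inside its cell
    coeffs[0] = flat.reshape(ll.shape)
    return pywt.waverec2(coeffs, "haar")

def extract_parity(watermarked, n_bits, level=3, step=8.0):
    """Blind extraction: read back the parity of each quantization index."""
    coeffs = pywt.wavedec2(watermarked.astype(float), "haar", level=level)
    flat = coeffs[0].ravel()
    return [int(np.floor(flat[i] / step)) % 2 for i in range(n_bits)]

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    cover = rng.integers(0, 256, size=(64, 64)).astype(float)  # stand-in cover image
    bits = [1, 0, 1, 1, 0, 0, 1, 0]
    marked = embed_parity(cover, bits)
    print(extract_parity(marked, len(bits)))  # expected to reproduce the embedded bits
```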

4 citations

Journal ArticleDOI
TL;DR: This paper proposes a review spammer detection approach that combines TrustRank and Anti-TrustRank propagation algorithms to identify review spammers; experiments on two datasets show that the method significantly outperforms existing baselines and is able to find more abnormal spamming activities.
Abstract: Review spammer detection is an important task in social media sentiment analysis. Previous works employ reviewer behaviors such as text similarities, duplications, and rating patterns to identify suspicious spammers. However, there are still other kinds of abnormal spamming activities that cannot be detected by the available techniques. This paper proposes a review spammer detection approach combining the TrustRank and Anti-TrustRank propagation algorithms to identify review spammers. Firstly, a two-layer heterogeneous review relation graph is constructed to capture the relationships among reviewers and products. Secondly, a TrustRank-based propagation model and an Anti-TrustRank-based propagation model are established to calculate each reviewer's trustiness value and anti-trustiness value, respectively. Finally, review spammers are detected according to a comprehensive trustiness value that combines both the trustiness and anti-trustiness values. Experimental results on two datasets show that our method significantly outperforms the existing baselines and is able to find more abnormal spamming activities.
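
The following toy sketch illustrates the seeded-propagation idea behind TrustRank and Anti-TrustRank on a small reviewer-product graph: trust scores flow from trusted seed reviewers, anti-trust scores flow from known spammers, and the two are combined. The damping factor, iteration count, and graph construction are illustrative assumptions, not the paper's propagation model.

```python
import numpy as np

def propagate(adj, seed_mask, alpha=0.85, iters=50):
    """Seeded score propagation in the spirit of TrustRank / Anti-TrustRank:
    scores flow along the edges of the reviewer-product graph, with a
    teleport back to the seed set on every step."""
    out_deg = adj.sum(axis=1, keepdims=True)
    transition = np.divide(adj, out_deg, out=np.zeros_like(adj), where=out_deg > 0)
    seed = seed_mask / seed_mask.sum()
    score = seed.copy()
    for _ in range(iters):
        score = alpha * transition.T @ score + (1 - alpha) * seed
    return score

if __name__ == "__main__":
    # Tiny two-layer graph: nodes 0-2 are reviewers, nodes 3-4 are products.
    # An undirected reviewer-product edge means "wrote a review for", so trust
    # can flow from a reviewer to a product and on to that product's other reviewers.
    adj = np.zeros((5, 5))
    for reviewer, product in [(0, 3), (1, 3), (1, 4), (2, 4)]:
        adj[reviewer, product] = adj[product, reviewer] = 1.0

    trusted_seed = np.array([1.0, 0.0, 0.0, 0.0, 0.0])  # reviewer 0 is trusted
    spammer_seed = np.array([0.0, 0.0, 1.0, 0.0, 0.0])  # reviewer 2 is a known spammer
    trust = propagate(adj, trusted_seed)
    anti = propagate(adj, spammer_seed)
    # A comprehensive score: trusted reviewers score high, spammer-like ones low.
    print("combined reviewer scores:", (trust - anti)[:3])
```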

4 citations


Cited by
Journal ArticleDOI
TL;DR: It is proved the convergence of a recursive mean shift procedure to the nearest stationary point of the underlying density function and, thus, its utility in detecting the modes of the density.
Abstract: A general non-parametric technique is proposed for the analysis of a complex multimodal feature space and to delineate arbitrarily shaped clusters in it. The basic computational module of the technique is an old pattern recognition procedure: the mean shift. For discrete data, we prove the convergence of a recursive mean shift procedure to the nearest stationary point of the underlying density function and, thus, its utility in detecting the modes of the density. The relation of the mean shift procedure to the Nadaraya-Watson estimator from kernel regression and to the robust M-estimators of location is also established. Algorithms for two low-level vision tasks - discontinuity-preserving smoothing and image segmentation - are described as applications. In these algorithms, the only user-set parameter is the resolution of the analysis, and either gray-level or color images are accepted as input. Extensive experimental results illustrate their excellent performance.
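
As a concrete illustration of the recursive mean shift procedure described above, the sketch below repeatedly moves a query point to the kernel-weighted mean of the data until it stops moving, i.e., until it reaches a mode of the density estimate. The Gaussian kernel, bandwidth, and synthetic data are assumptions made for the example.

```python
import numpy as np

def mean_shift_point(x, data, bandwidth=1.0, iters=100, tol=1e-6):
    """Recursive mean-shift step: repeatedly move x to the kernel-weighted
    mean of the data; the iteration converges to a nearby stationary point
    (mode) of the kernel density estimate."""
    for _ in range(iters):
        sq_dist = np.sum((data - x) ** 2, axis=1)
        w = np.exp(-sq_dist / (2 * bandwidth ** 2))         # Gaussian kernel weights
        x_new = (w[:, None] * data).sum(axis=0) / w.sum()   # weighted mean of neighbours
        if np.linalg.norm(x_new - x) < tol:                 # stationary point reached
            return x_new
        x = x_new
    return x

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = np.vstack([rng.normal(0.0, 0.5, size=(100, 2)),   # cluster around (0, 0)
                      rng.normal(5.0, 0.5, size=(100, 2))])  # cluster around (5, 5)
    print(mean_shift_point(np.array([4.0, 4.5]), data))   # should land near (5, 5)
    print(mean_shift_point(np.array([0.5, -0.3]), data))  # should land near (0, 0)
```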

11,727 citations

Book
24 Oct 2001
TL;DR: Digital Watermarking covers the crucial research findings in the field: it explains the principles underlying digital watermarking technologies, describes the requirements that have given rise to them, and discusses the diverse ends to which these technologies are being applied.
Abstract: Digital watermarking is a key ingredient to copyright protection. It provides a solution to illegal copying of digital material and has many other useful applications such as broadcast monitoring and the recording of electronic transactions. Now, for the first time, there is a book that focuses exclusively on this exciting technology. Digital Watermarking covers the crucial research findings in the field: it explains the principles underlying digital watermarking technologies, describes the requirements that have given rise to them, and discusses the diverse ends to which these technologies are being applied. As a result, additional groundwork is laid for future developments in this field, helping the reader understand and anticipate new approaches and applications.

2,849 citations

Proceedings Article
01 Jan 1999

2,010 citations

Posted Content
TL;DR: This paper defines and explores proofs of retrievability (PORs), a POR scheme that enables an archive or back-up service to produce a concise proof that a user can retrieve a target file F, that is, that the archive retains and reliably transmits file data sufficient for the user to recover F in its entirety.
Abstract: In this paper, we define and explore proofs of retrievability (PORs). A POR scheme enables an archive or back-up service (prover) to produce a concise proof that a user (verifier) can retrieve a target file F, that is, that the archive retains and reliably transmits file data sufficient for the user to recover F in its entirety. A POR may be viewed as a kind of cryptographic proof of knowledge (POK), but one specially designed to handle a large file (or bitstring) F. We explore POR protocols here in which the communication costs, number of memory accesses for the prover, and storage requirements of the user (verifier) are small parameters essentially independent of the length of F. In addition to proposing new, practical POR constructions, we explore implementation considerations and optimizations that bear on previously explored, related schemes. In a POR, unlike a POK, neither the prover nor the verifier need actually have knowledge of F. PORs give rise to a new and unusual security definition whose formulation is another contribution of our work. We view PORs as an important tool for semi-trusted online archives. Existing cryptographic techniques help users ensure the privacy and integrity of files they retrieve. It is also natural, however, for users to want to verify that archives do not delete or modify files prior to retrieval. The goal of a POR is to accomplish these checks without users having to download the files themselves. A POR can also provide quality-of-service guarantees, i.e., show that a file is retrievable within a certain time bound.
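
As a much-simplified illustration of the spot-checking idea behind PORs, the sketch below has the verifier keep MACs of a few randomly chosen blocks and later challenge the archive for exactly those blocks; the actual constructions in the paper (e.g., sentinels and error-correcting encoding) are more involved. Block size, sample count, and helper names are assumptions made for this toy example.

```python
import hashlib
import hmac
import os
import random

BLOCK = 4096  # illustrative block size

def split_blocks(data, size=BLOCK):
    return [data[i:i + size] for i in range(0, len(data), size)]

def precompute_tags(file_bytes, key, sample=8, seed=7):
    """Verifier-side setup: keep MACs of a few randomly chosen blocks so that
    later challenges are cheap and essentially independent of the file length
    (a simplified stand-in for the paper's constructions)."""
    blocks = split_blocks(file_bytes)
    rng = random.Random(seed)
    indices = rng.sample(range(len(blocks)), min(sample, len(blocks)))
    return {i: hmac.new(key, blocks[i], hashlib.sha256).digest() for i in indices}

def prove(stored_bytes, challenge_indices):
    """Prover (archive) answers a challenge by returning the requested blocks."""
    blocks = split_blocks(stored_bytes)
    return {i: blocks[i] for i in challenge_indices}

def verify(response, tags, key):
    """Verifier checks each returned block against its stored MAC."""
    return all(hmac.compare_digest(hmac.new(key, blk, hashlib.sha256).digest(), tags[i])
               for i, blk in response.items())

if __name__ == "__main__":
    key = os.urandom(32)
    original = os.urandom(64 * BLOCK)        # the file F handed to the archive
    tags = precompute_tags(original, key)
    challenge = sorted(tags)                 # challenge exactly the spot-checked blocks
    print("intact copy passes:", verify(prove(original, challenge), tags, key))
    corrupted = bytearray(original)
    corrupted[challenge[0] * BLOCK] ^= 0xFF  # flip one bit in a checked block
    print("corrupted copy passes:", verify(prove(bytes(corrupted), challenge), tags, key))
```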

1,783 citations