Author

Jeng-Shyang Pan

Bio: Jeng-Shyang Pan is an academic researcher from Shandong University of Science and Technology. The author has contributed to research in topics: Digital watermarking & Watermark. The author has an h-index of 50 and has co-authored 789 publications receiving 11,645 citations. Previous affiliations of Jeng-Shyang Pan include National Kaohsiung Normal University & Technical University of Ostrava.


Papers
Book
13 Sep 2013
TL;DR: This volume constitutes the proceedings of the 8th International Conference on Hybrid Artificial Intelligent Systems, HAIS 2013, held in Salamanca, Spain, in September 2013.
Abstract: This volume constitutes the proceedings of the 8th International Conference on Hybrid Artificial Intelligent Systems, HAIS 2013, held in Salamanca, Spain, in September 2013. The 68 papers published in this volume were carefully reviewed and selected from 218 submissions. They are organized in topical sessions on Agents and Multi-Agent Systems; HAIS Applications; Classification and Cluster Analysis; Data Mining and Knowledge Discovery; Video and Image Analysis; Bio-inspired Models and Evolutionary Computation; Learning Algorithms; Systems, Man, and Cybernetics; Hybrid Intelligent Systems for Data Mining and Applications; and Metaheuristics for Combinatorial Optimization and Modelling Complex Systems.

1 citation

Book Chapter
06 Aug 2007
TL;DR: A novel kernel optimization method based on the maximum margin criterion is proposed, which resolves a limitation of Xiong's work, where the optimal solution must be found by an iterative update algorithm owing to the singularity of the matrix involved.
Abstract: A novel criterion, namely the Maximum Margin Criterion (MMC), is proposed for learning a data-dependent kernel for classification. Different kernels create different geometrical structures of the data in the feature space and lead to different class discrimination, so the choice of kernel greatly influences the performance of kernel learning, and optimizing the kernel is an effective way to improve classification performance. In this paper, we propose a novel kernel optimization method based on the maximum margin criterion, which resolves a limitation of Xiong's work [1], where the optimal solution must be found by an iterative update algorithm owing to the singularity of the matrix involved. Our method obtains a unique optimal solution by solving a single eigenvalue problem, enhancing performance while reducing time consumption. Experimental results show that the proposed algorithm gives better performance and lower time consumption than Xiong's work.
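To make the one-shot eigenvalue formulation concrete, here is a minimal Python sketch under stated assumptions: it selects expansion coefficients for a data-dependent kernel by maximizing a between-/within-class scatter ratio through a single generalized eigenproblem. The RBF base kernel, the scatter surrogates, and the `anchors` expansion set are illustrative choices, not the paper's exact construction.

```python
# Illustrative sketch: kernel optimization via one generalized eigenproblem.
import numpy as np
from scipy.linalg import eigh

def rbf_kernel(X, Y, gamma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def optimize_kernel_weights(X, y, anchors, gamma=1.0):
    """Pick coefficients alpha for a data-dependent kernel
    k(x, z) = q(x) q(z) k0(x, z), with q(x) = sum_i alpha_i k1(x, a_i),
    by maximizing alpha^T M alpha / alpha^T N alpha, where M and N are
    between-/within-class scatter surrogates (hypothetical stand-ins)."""
    K1 = rbf_kernel(X, anchors, gamma)        # n x m empirical feature map
    mu = K1.mean(0)
    m = K1.shape[1]
    M = np.zeros((m, m))                      # between-class scatter
    N = np.zeros((m, m))                      # within-class scatter
    for c in np.unique(y):
        Kc = K1[y == c]
        mc = Kc.mean(0)
        M += len(Kc) * np.outer(mc - mu, mc - mu)
        N += (Kc - mc).T @ (Kc - mc)
    # Unique maximizer from a single decomposition: M a = lam (N + eps I) a
    w, V = eigh(M, N + 1e-6 * np.eye(m))
    return V[:, -1]                           # eigenvector of the largest eigenvalue
```

The point of this formulation is that `eigh` returns the maximizer in one decomposition, in contrast to the iterative update loop that the abstract attributes to Xiong's method.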

1 citation

Book Chapter
07 Sep 2011
TL;DR: A particle swarm optimization with feasibility-based rules is proposed to find optimal values of the continuous variables after the MPSO algorithm finishes each independent run, in order to obtain consistent optimal results for mixed-variable optimization problems.
Abstract: A double particle swarm optimization (DPSO), in which the MPSO proposed by Sun et al. [1] is used as the global search algorithm and a PSO with feasibility-based rules is used for local search, is proposed in this paper to solve mixed-variable optimization problems. MPSO handles the non-continuous variables very well; however, imprecise values of the continuous variables lead to inconsistent results across runs. A particle swarm optimization with feasibility-based rules is therefore applied to find optimal values of the continuous variables after each independent run of MPSO, in order to obtain consistent optimal results for mixed-variable optimization problems. The performance of DPSO is evaluated on two real-world mixed-variable optimization problems, and it is found to be highly competitive with other existing algorithms.
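As an illustration of the feasibility-based rules used for the local search, here is a minimal PSO sketch in Python. The comparison rule (feasible beats infeasible; among infeasible, smaller violation wins; among feasible, smaller objective wins) follows the common Deb-style convention, and `objective`, `violation`, and the bounds are placeholders, not the paper's benchmark problems.

```python
# Sketch: PSO refining continuous variables under feasibility-based rules.
import numpy as np

def better(f1, v1, f2, v2):
    """Feasibility-based comparison: feasible beats infeasible; among
    infeasible, smaller constraint violation wins; among feasible,
    smaller objective wins."""
    if v1 == 0 and v2 == 0:
        return f1 < f2
    if v1 == 0 or v2 == 0:
        return v1 == 0
    return v1 < v2

def pso_refine(objective, violation, lo, hi, n=30, iters=200,
               w=0.7, c1=1.5, c2=1.5):
    rng = np.random.default_rng(0)
    dim = len(lo)
    x = rng.uniform(lo, hi, (n, dim))
    v = np.zeros_like(x)
    pbest, pf = x.copy(), np.array([objective(p) for p in x])
    pv = np.array([violation(p) for p in x])
    g = min(range(n), key=lambda i: (pv[i], pf[i]))  # best by (violation, objective)
    for _ in range(iters):
        r1, r2 = rng.random((2, n, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (pbest[g] - x)
        x = np.clip(x + v, lo, hi)
        for i in range(n):
            fi, vi = objective(x[i]), violation(x[i])
            if better(fi, vi, pf[i], pv[i]):
                pbest[i], pf[i], pv[i] = x[i], fi, vi
                if better(fi, vi, pf[g], pv[g]):
                    g = i
    return pbest[g], pf[g]
```

For example, `pso_refine(lambda x: (x**2).sum(), lambda x: max(0.0, 1.0 - x.sum()), np.zeros(3), np.ones(3))` minimizes a sphere subject to a hypothetical constraint sum(x) >= 1 expressed as a violation amount.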

1 citation

Journal Article
TL;DR: In this article, an enhanced parallel salp swarm algorithm based on the Taguchi method (PTSSA) is proposed, in which the initial population is uniformly split into several subgroups that exchange information after a fixed number of iterations, which speeds up convergence.
Abstract: Salp swarm algorithm (SSA) is an excellent meta-heuristic algorithm, which has been widely used in the engineering field. However, there is still room for improvement in terms of convergence rate and solution accuracy. Therefore, this paper proposes an enhanced parallel salp swarm algorithm based on the Taguchi method (PTSSA). The parallel trick is to split the initial population uniformly into several subgroups and then exchange information among the subgroups after a fixed number of iterations, which speeds up the convergence. Communication strategies are an important component of parallelism techniques. The Taguchi method is widely used in the industry for optimizing product and process conditions. In this paper, the Taguchi method is adopted into the parallelization technique as a novel communication strategy, which improves the robustness and accuracy of the solution. The proposed algorithm was also tested under the CEC2013 test suite. Experimental results show that PTSSA is more competitive than some common algorithms. In addition, PTSSA is applied to optimize the operation of a heatless combined cooling-power system. Simulation results show that the optimized operation provided by PTSSA is more stable and efficient in terms of operating cost reduction.
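The following is an illustrative Python sketch of the two ideas the abstract describes: a population split into subgroups that evolve independently, and a Taguchi-style crossover used as the communication strategy when subgroups exchange information. The random 0/1 design stands in for a proper orthogonal array, the follower update is a simplified salp-style move, and `sphere` is a toy objective; none of these are claimed to match the paper's exact PTSSA components.

```python
# Sketch: parallel subgroup loop with Taguchi-style communication.
import numpy as np

rng = np.random.default_rng(1)

def sphere(x):                                    # toy objective
    return float((x ** 2).sum())

def taguchi_cross(a, b, f):
    """Two-level Taguchi-style crossover: run trials mixing dimensions of a
    and b per a random 0/1 design, then keep, per dimension, the parent
    whose trials averaged the better objective."""
    dim = len(a)
    design = rng.integers(0, 2, (2 * dim, dim))   # stand-in for an orthogonal array
    trials = np.where(design == 0, a, b)
    scores = np.array([f(t) for t in trials])
    child = np.empty(dim)
    for j in range(dim):
        mean0 = scores[design[:, j] == 0].mean()
        mean1 = scores[design[:, j] == 1].mean()
        child[j] = a[j] if mean0 < mean1 else b[j]
    return child

def ptssa_like(f, dim=10, groups=4, per=10, iters=200, exchange_every=20):
    pops = [rng.uniform(-5, 5, (per, dim)) for _ in range(groups)]
    for t in range(1, iters + 1):
        for g in range(groups):
            best = min(pops[g], key=f)            # subgroup leader
            # simplified salp-style follower move toward the leader
            pops[g] = pops[g] + rng.uniform(0, 1, (per, dim)) * (best - pops[g])
        if t % exchange_every == 0:               # communication step
            leaders = [min(p, key=f) for p in pops]
            for g in range(groups):
                mate = leaders[(g + 1) % groups]  # ring topology (an assumption)
                pops[g][0] = taguchi_cross(leaders[g], mate, f)
    return min((min(p, key=f) for p in pops), key=f)
```

The design choice the abstract emphasizes is the communication step: instead of simply copying a migrant between subgroups, the Taguchi-style mixing screens dimension-level contributions from both parents before injecting the child.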

1 citation


Cited by
Journal Article
TL;DR: The convergence of a recursive mean shift procedure to the nearest stationary point of the underlying density function is proved, and thus its utility in detecting the modes of the density is established.
Abstract: A general non-parametric technique is proposed for the analysis of a complex multimodal feature space and for delineating arbitrarily shaped clusters in it. The basic computational module of the technique is an old pattern recognition procedure: the mean shift. For discrete data, we prove the convergence of a recursive mean shift procedure to the nearest stationary point of the underlying density function and, thus, its utility in detecting the modes of the density. The relation of the mean shift procedure to the Nadaraya-Watson estimator from kernel regression and to the robust M-estimators of location is also established. Algorithms for two low-level vision tasks - discontinuity-preserving smoothing and image segmentation - are described as applications. In these algorithms, the only user-set parameter is the resolution of the analysis, and either gray-level or color images are accepted as input. Extensive experimental results illustrate their excellent performance.
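A minimal sketch of the recursive procedure the abstract proves convergent, assuming a flat (uniform) kernel for brevity: each point is repeatedly shifted to the mean of its neighbors within a bandwidth until the shift vanishes, at which point it has reached a stationary point (mode) of the density estimate.

```python
# Sketch: mean shift mode seeking with a flat kernel.
import numpy as np

def mean_shift(X, bandwidth=1.0, tol=1e-4, max_iter=300):
    modes = X.astype(float).copy()
    for i in range(len(modes)):
        x = modes[i]
        for _ in range(max_iter):
            d = np.linalg.norm(X - x, axis=1)
            neighbors = X[d <= bandwidth]
            shifted = neighbors.mean(axis=0)       # the mean shift step
            if np.linalg.norm(shifted - x) < tol:  # converged to a mode
                break
            x = shifted
        modes[i] = x
    return modes  # cluster labels follow by merging nearby modes
```

With a Gaussian kernel the update becomes a weighted mean of all points; the flat kernel keeps the sketch short while preserving the bandwidth-as-resolution role noted in the abstract.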

11,727 citations

Book
24 Oct 2001
TL;DR: Digital Watermarking covers the crucial research findings in the field and explains the principles underlying digital watermarking technologies, describes the requirements that have given rise to them, and discusses the diverse ends to which these technologies are being applied.
Abstract: Digital watermarking is a key ingredient to copyright protection. It provides a solution to illegal copying of digital material and has many other useful applications such as broadcast monitoring and the recording of electronic transactions. Now, for the first time, there is a book that focuses exclusively on this exciting technology. Digital Watermarking covers the crucial research findings in the field: it explains the principles underlying digital watermarking technologies, describes the requirements that have given rise to them, and discusses the diverse ends to which these technologies are being applied. As a result, additional groundwork is laid for future developments in this field, helping the reader understand and anticipate new approaches and applications.

2,849 citations

Proceedings Article
01 Jan 1999

2,010 citations

Posted Content
TL;DR: This paper defines and explores proofs of retrievability (PORs): a POR scheme enables an archive or back-up service to produce a concise proof that a user can retrieve a target file F, that is, that the archive retains and reliably transmits file data sufficient for the user to recover F in its entirety.
Abstract: In this paper, we define and explore proofs of retrievability (PORs). A POR scheme enables an archive or back-up service (prover) to produce a concise proof that a user (verifier) can retrieve a target file F, that is, that the archive retains and reliably transmits file data sufficient for the user to recover F in its entirety. A POR may be viewed as a kind of cryptographic proof of knowledge (POK), but one specially designed to handle a large file (or bitstring) F. We explore POR protocols here in which the communication costs, number of memory accesses for the prover, and storage requirements of the user (verifier) are small parameters essentially independent of the length of F. In addition to proposing new, practical POR constructions, we explore implementation considerations and optimizations that bear on previously explored, related schemes. In a POR, unlike a POK, neither the prover nor the verifier need actually have knowledge of F. PORs give rise to a new and unusual security definition whose formulation is another contribution of our work. We view PORs as an important tool for semi-trusted online archives. Existing cryptographic techniques help users ensure the privacy and integrity of files they retrieve. It is also natural, however, for users to want to verify that archives do not delete or modify files prior to retrieval. The goal of a POR is to accomplish these checks without users having to download the files themselves. A POR can also provide quality-of-service guarantees, i.e., show that a file is retrievable within a certain time bound.
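To illustrate the general flavor of such checks, here is a toy MAC-based spot-checking sketch in Python: the verifier remembers MACs of a few randomly chosen blocks at setup and later challenges the archive to return exactly those blocks. This is an illustration of the verify-without-downloading idea, not the paper's POR construction, whose formal extraction guarantees rest on a different design; the block size and check count are arbitrary assumptions.

```python
# Toy sketch: spot-check audit of a remotely stored file.
import hmac, hashlib, secrets

BLOCK = 4096  # assumed block size

def setup(file_bytes, key, n_checks=16):
    """Verifier-side: remember (index, MAC) pairs for random blocks
    before handing the file to the archive."""
    blocks = [file_bytes[i:i + BLOCK] for i in range(0, len(file_bytes), BLOCK)]
    checks = []
    for _ in range(n_checks):
        i = secrets.randbelow(len(blocks))
        tag = hmac.new(key, str(i).encode() + blocks[i], hashlib.sha256).digest()
        checks.append((i, tag))
    return checks

def audit(fetch_block, key, checks):
    """Challenge the archive: re-MAC each returned block and compare.
    `fetch_block(i)` is the archive's response to challenge index i."""
    for i, tag in checks:
        block = fetch_block(i)
        expect = hmac.new(key, str(i).encode() + block, hashlib.sha256).digest()
        if not hmac.compare_digest(tag, expect):
            return False
    return True
```

A passing audit here is only probabilistic evidence that most blocks survive; the paper's constructions go further, combining challenges with encoding of the file so that passing audits imply F is recoverable in its entirety.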

1,783 citations