SciSpace (formerly Typeset)
Author

Jeng-Shyang Pan

Bio: Jeng-Shyang Pan is an academic researcher from Shandong University of Science and Technology. The author has contributed to research in topics: Digital watermarking & Watermark. The author has an h-index of 50 and has co-authored 789 publications receiving 11,645 citations. Previous affiliations of Jeng-Shyang Pan include National Kaohsiung Normal University & Technical University of Ostrava.


Papers
Journal ArticleDOI
TL;DR: In this article, the authors propose a general software-hardware co-design scheme for intelligent optimization algorithms, in which the initialization module and fitness module of the algorithm are deployed on the Advanced RISC Machine (ARM) side for execution to increase the flexibility of the program.
Book ChapterDOI
03 Jun 2014
TL;DR: A novel wavelet-tree-vector-based watermarked Multiple Description Scalar Quantization coding framework is proposed, and its good performance in the experiments demonstrates its efficiency.
Abstract: Watermarked Multiple Description Coding techniques form one branch of covert communication techniques. This paper proposes a novel wavelet-tree-vector-based watermarked Multiple Description Scalar Quantization coding framework. The wavelet orientational tree is used as the tree vector in the coding process, and the overlap of the different orientational information is used to introduce redundancy. Good performance in the experiments demonstrates the framework's efficiency.
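The wavelet-tree and watermarking components are specific to the paper, but the multiple description scalar quantization (MDSQ) splitting that the framework builds on can be illustrated with a toy sketch. The scalar quantizer step size and the simple diagonal index assignment below are illustrative choices, not the paper's construction:

```python
import numpy as np

def mdsq_encode(x, step=1.0):
    """Toy multiple description scalar quantizer (illustrative only).

    The sample is quantized to a central index k, and k is then split into two
    side indices along the diagonal of an index-assignment matrix, so either
    description alone yields a coarse reconstruction while both together
    recover the central quantizer output exactly.
    """
    k = int(np.round(x / step))
    i = (k + 1) // 2          # description 1
    j = k // 2                # description 2
    return i, j

def mdsq_decode(i=None, j=None, step=1.0):
    """Central decoder when both descriptions arrive, side decoders otherwise."""
    if i is not None and j is not None:
        return (i + j) * step             # central decoder: exact inverse
    if i is not None:
        return (2 * i - 0.5) * step       # side decoder 1: coarse estimate
    if j is not None:
        return (2 * j + 0.5) * step       # side decoder 2: coarse estimate
    raise ValueError("at least one description is required")
```

In the paper the quantized objects are wavelet orientational tree vectors rather than scalars, and the overlap between orientational trees supplies the redundancy; the sketch only shows the basic description-splitting mechanism.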
Proceedings Article
01 Jan 1999
TL;DR: A novel channel distortion measure is proposed by computing the expected channel distortion using the Beta distribution function, and all codebook index assignment algorithms can be optimized based on this distortion measure.
Abstract: Vector quantization is very efficient for data compression of speech and images. Channel distortions are introduced by channel noise. Assigning suitable indices to codevectors can reduce the distortion due to an imperfect channel. Several codebook index assignment algorithms have been proposed. Unfortunately, no algorithm is always better than the others for every bit error rate, because these algorithms operate under the assumption of a fixed channel bit error rate, which is not realistic. In this paper, a novel channel distortion measure is proposed by computing the expected channel distortion using the Beta distribution function. All codebook index assignment algorithms can be optimized based on this distortion measure. In addition, a fuzzy channel-optimized vector quantization scheme for codebook design and index assignment is also derived in this paper.
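As a rough illustration of the idea, the expected channel distortion can be obtained by averaging the fixed-rate channel distortion over a Beta-distributed bit error rate rather than assuming one fixed rate. The sketch below assumes independent bit errors, squared-error distortion, and illustrative Beta parameters; the function names and toy codebook are not from the paper:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import beta

def hamming(i, j):
    """Hamming distance between two binary channel indices."""
    return bin(i ^ j).count("1")

def channel_distortion(codebook, probs, assignment, p, bits):
    """Average squared-error distortion caused by bit errors at a fixed bit
    error rate p, for a given codevector-to-index assignment."""
    n = len(codebook)
    d = 0.0
    for k in range(n):                      # transmitted codevector
        for m in range(n):                  # decoded codevector
            h = hamming(assignment[k], assignment[m])
            trans = (p ** h) * ((1 - p) ** (bits - h))
            d += probs[k] * trans * np.sum((codebook[k] - codebook[m]) ** 2)
    return d

def expected_channel_distortion(codebook, probs, assignment, bits, a=2.0, b=20.0):
    """Expected channel distortion when the bit error rate p follows a
    Beta(a, b) distribution instead of being fixed in advance."""
    integrand = lambda p: channel_distortion(codebook, probs, assignment, p, bits) * beta.pdf(p, a, b)
    value, _ = quad(integrand, 0.0, 1.0)
    return value

# toy 2-bit codebook: any index assignment (a permutation of 0..3) can now be
# scored by its expected distortion rather than its distortion at one fixed p
codebook = np.array([[0.0], [1.0], [2.0], [3.0]])
probs = np.array([0.4, 0.3, 0.2, 0.1])
print(expected_channel_distortion(codebook, probs, np.array([0, 1, 2, 3]), bits=2))
```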
Book ChapterDOI
24 Oct 2016
TL;DR: This work introduces a high-level design for a behavioral analysis system based on action sequences, presented in terms of process modeling, and proposes process decomposition as a core step towards the design and implementation of a behavior analyzer system.
Abstract: Modeling human actions in a format that computer systems can understand is a goal of behavior analysis systems. This work introduces a high-level design for a behavioral analysis system based on action sequences. The design is presented in terms of process modeling: system processes are described by a set of multi-level data flow diagrams (DFDs) that represent the decomposition of all processing components required in such a system and the data flows among them. The system is designed to receive structured information about human behaviors and actions and to produce insights, predictions, and classifications of personal and behavioral characteristics. The proposed process decomposition is introduced as a core step towards the design and implementation of a behavior analyzer system.
Journal ArticleDOI
TL;DR: In this article, a parallel opposition-based Gaining-Sharing Knowledge-based algorithm (POGSK) is proposed to optimize resource scheduling in the Internet of Vehicles (IoV).
Abstract: Heuristic optimization algorithms have proven to be powerful in solving nonlinear and complex optimization problems; therefore, many effective optimization algorithms have been applied to solve optimization problems in real-world scenarios. This paper presents a modification of the recently proposed Gaining–Sharing Knowledge (GSK)-based algorithm and applies it to optimize resource scheduling in the Internet of Vehicles (IoV). The GSK algorithm simulates different phases of human life in gaining and sharing knowledge, and is mainly divided into a junior phase and a senior phase. An individual starts in the junior phase in all dimensions and gradually moves into the senior phase as it interacts with the surrounding environment. The main idea used to improve the GSK algorithm is to divide the initial population into different groups, each searching independently and communicating according to two main strategies. Opposition-based learning is introduced to correct the direction of convergence and improve the speed of convergence. This paper proposes an improved algorithm, named the parallel opposition-based Gaining–Sharing Knowledge-based algorithm (POGSK). The improved algorithm is tested against the original algorithm and several classical algorithms on the CEC2017 test suite. The results show that the improved algorithm significantly improves the performance of the original algorithm. When POGSK is applied to optimize resource scheduling in the IoV, the results also show that it is more competitive.
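The GSK junior/senior-phase update itself is not reproduced in the abstract, but the two modifications it describes, splitting the population into independently searching groups that periodically communicate, and opposition-based learning, can be sketched around a generic population-based optimizer. The per-group update rule, group count, and communication interval below are placeholders, not POGSK's actual settings:

```python
import numpy as np

def opposition(candidates, lower, upper):
    """Opposition-based learning: reflect each candidate about the centre of
    the search range; the caller keeps whichever of the pair is fitter."""
    return lower + upper - candidates

def pogsk_style_search(fitness, dim, pop_size=40, groups=4, iters=200,
                       lower=-100.0, upper=100.0, comm_every=20, rng=None):
    """Illustrative parallel, opposition-assisted population search.

    The per-group update is a plain random move toward the group best; in the
    actual POGSK it would be the GSK junior/senior-phase knowledge update.
    """
    rng = np.random.default_rng(rng)
    pop = rng.uniform(lower, upper, (pop_size, dim))
    fit = np.apply_along_axis(fitness, 1, pop)
    group_idx = np.array_split(np.arange(pop_size), groups)

    for t in range(iters):
        for idx in group_idx:
            best = pop[idx[np.argmin(fit[idx])]]
            # placeholder per-group update: move toward the group best with noise
            trial = pop[idx] + rng.uniform(0, 1, (len(idx), dim)) * (best - pop[idx])
            trial += rng.normal(0.0, 0.1, (len(idx), dim))
            trial = np.clip(trial, lower, upper)

            # opposition-based learning: also evaluate the reflected candidates
            cand = np.vstack([trial, opposition(trial, lower, upper)])
            cand_fit = np.apply_along_axis(fitness, 1, cand)

            # greedy selection against the current group members
            for k, i in enumerate(idx):
                j = k if cand_fit[k] <= cand_fit[k + len(idx)] else k + len(idx)
                if cand_fit[j] < fit[i]:
                    pop[i], fit[i] = cand[j], cand_fit[j]

        # communication strategy: periodically inject the global best into each group
        if (t + 1) % comm_every == 0:
            g = np.argmin(fit)
            for idx in group_idx:
                worst = idx[np.argmax(fit[idx])]
                pop[worst], fit[worst] = pop[g], fit[g]

    g = np.argmin(fit)
    return pop[g], fit[g]

# usage: minimize the 10-dimensional sphere function
best_x, best_f = pogsk_style_search(lambda v: float(np.sum(v ** 2)), dim=10, rng=0)
```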

Cited by
Journal ArticleDOI
TL;DR: The convergence of a recursive mean shift procedure to the nearest stationary point of the underlying density function is proved, and thus its utility in detecting the modes of the density is established.
Abstract: A general non-parametric technique is proposed for the analysis of a complex multimodal feature space and to delineate arbitrarily shaped clusters in it. The basic computational module of the technique is an old pattern recognition procedure: the mean shift. For discrete data, we prove the convergence of a recursive mean shift procedure to the nearest stationary point of the underlying density function and, thus, its utility in detecting the modes of the density. The relation of the mean shift procedure to the Nadaraya-Watson estimator from kernel regression and to the robust M-estimators of location is also established. Algorithms for two low-level vision tasks, discontinuity-preserving smoothing and image segmentation, are described as applications. In these algorithms, the only user-set parameter is the resolution of the analysis, and either gray-level or color images are accepted as input. Extensive experimental results illustrate their excellent performance.
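For reference, the recursive procedure the abstract refers to is easy to sketch: each iteration moves a point to the kernel-weighted mean of the data, which follows the density gradient to a nearby mode. The Gaussian kernel, bandwidth, and mode-merging tolerance below are illustrative choices, not the paper's exact formulation:

```python
import numpy as np

def mean_shift_mode(x, data, bandwidth=1.0, tol=1e-5, max_iter=500):
    """Run the recursive mean shift procedure from a starting point x.

    Each step replaces x with the Gaussian-kernel weighted mean of the data,
    moving x uphill on the kernel density estimate until it reaches a
    stationary point (a mode of the estimated density).
    """
    x = np.asarray(x, dtype=float)
    for _ in range(max_iter):
        diff = data - x
        w = np.exp(-0.5 * np.sum(diff ** 2, axis=1) / bandwidth ** 2)
        shifted = (w[:, None] * data).sum(axis=0) / w.sum()
        if np.linalg.norm(shifted - x) < tol:
            break
        x = shifted
    return x

def mean_shift_cluster(data, bandwidth=1.0, merge_tol=1e-2):
    """Cluster by mode seeking: points converging to nearby modes share a label."""
    modes = np.array([mean_shift_mode(p, data, bandwidth) for p in data])
    labels, centers = -np.ones(len(data), dtype=int), []
    for i, m in enumerate(modes):
        for j, c in enumerate(centers):
            if np.linalg.norm(m - c) < merge_tol:
                labels[i] = j
                break
        else:
            centers.append(m)
            labels[i] = len(centers) - 1
    return labels, np.array(centers)

# usage: two well-separated 2-D blobs collapse to two modes
rng = np.random.default_rng(0)
pts = np.vstack([rng.normal(0, 0.3, (50, 2)), rng.normal(3, 0.3, (50, 2))])
labels, centers = mean_shift_cluster(pts, bandwidth=0.5, merge_tol=0.5)
```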

11,727 citations

Book
24 Oct 2001
TL;DR: Digital Watermarking covers the crucial research findings in the field and explains the principles underlying digital watermarking technologies, describes the requirements that have given rise to them, and discusses the diverse ends to which these technologies are being applied.
Abstract: Digital watermarking is a key ingredient to copyright protection. It provides a solution to illegal copying of digital material and has many other useful applications such as broadcast monitoring and the recording of electronic transactions. Now, for the first time, there is a book that focuses exclusively on this exciting technology. Digital Watermarking covers the crucial research findings in the field: it explains the principles underlying digital watermarking technologies, describes the requirements that have given rise to them, and discusses the diverse ends to which these technologies are being applied. As a result, additional groundwork is laid for future developments in this field, helping the reader understand and anticipate new approaches and applications.

2,849 citations

Proceedings Article
01 Jan 1999

2,010 citations

Posted Content
TL;DR: This paper defines and explores proofs of retrievability (PORs); a POR scheme enables an archive or back-up service to produce a concise proof that a user can retrieve a target file F, that is, that the archive retains and reliably transmits file data sufficient for the user to recover F in its entirety.
Abstract: In this paper, we define and explore proofs of retrievability (PORs). A POR scheme enables an archive or back-up service (prover) to produce a concise proof that a user (verifier) can retrieve a target file F, that is, that the archive retains and reliably transmits file data sufficient for the user to recover F in its entirety. A POR may be viewed as a kind of cryptographic proof of knowledge (POK), but one specially designed to handle a large file (or bitstring) F. We explore POR protocols here in which the communication costs, number of memory accesses for the prover, and storage requirements of the user (verifier) are small parameters essentially independent of the length of F. In addition to proposing new, practical POR constructions, we explore implementation considerations and optimizations that bear on previously explored, related schemes. In a POR, unlike a POK, neither the prover nor the verifier need actually have knowledge of F. PORs give rise to a new and unusual security definition whose formulation is another contribution of our work. We view PORs as an important tool for semi-trusted online archives. Existing cryptographic techniques help users ensure the privacy and integrity of files they retrieve. It is also natural, however, for users to want to verify that archives do not delete or modify files prior to retrieval. The goal of a POR is to accomplish these checks without users having to download the files themselves. A POR can also provide quality-of-service guarantees, i.e., show that a file is retrievable within a certain time bound.
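As a rough illustration of the challenge-response flow, the sketch below follows a sentinel-style audit: the user hides keyed sentinel blocks at pseudorandom positions before upload and later spot-checks a few of them. The encryption of the file and the error-correcting code that a full POR applies are omitted, and all names and parameters are illustrative rather than the paper's construction:

```python
import hashlib
import secrets

def _prf(key: bytes, label: bytes, modulus: int) -> int:
    """Keyed pseudorandom integer in [0, modulus) (sketch; a real scheme uses a PRF/PRP)."""
    return int.from_bytes(hashlib.sha256(key + label).digest(), "big") % modulus

def _sentinel(key: bytes, s: int) -> bytes:
    """Keyed sentinel block number s."""
    return hashlib.sha256(key + b"sentinel" + s.to_bytes(4, "big")).digest()

def encode(key: bytes, blocks: list[bytes], num_sentinels: int) -> list[bytes]:
    """File the user uploads: original blocks with keyed sentinels inserted at
    pseudorandom positions. (A full POR also encrypts the file and applies an
    error-correcting code so sentinels are indistinguishable from data and
    small deletions are recoverable; both steps are omitted here.)"""
    encoded = list(blocks)
    for s in range(num_sentinels):
        pos = _prf(key, b"pos" + s.to_bytes(4, "big"), len(encoded) + 1)
        encoded.insert(pos, _sentinel(key, s))
    return encoded

def sentinel_positions(key: bytes, num_blocks: int, num_sentinels: int) -> list[int]:
    """Verifier recomputes, from the key alone, where each sentinel ended up
    by replaying the insertion order used in encode()."""
    positions: list[int] = []
    length = num_blocks
    for s in range(num_sentinels):
        pos = _prf(key, b"pos" + s.to_bytes(4, "big"), length + 1)
        positions = [p + 1 if p >= pos else p for p in positions]  # earlier sentinels shift right
        positions.append(pos)
        length += 1
    return positions

def audit(key: bytes, archive: list[bytes], num_blocks: int,
          num_sentinels: int, queries: int = 8) -> bool:
    """One challenge-response round: spot-check a few randomly chosen sentinels."""
    positions = sentinel_positions(key, num_blocks, num_sentinels)
    for _ in range(queries):
        s = secrets.randbelow(num_sentinels)
        if archive[positions[s]] != _sentinel(key, s):
            return False
    return True

# usage: store 100 random blocks with 20 sentinels, then run one audit round
key = secrets.token_bytes(16)
blocks = [secrets.token_bytes(32) for _ in range(100)]
stored = encode(key, blocks, num_sentinels=20)
assert audit(key, stored, num_blocks=100, num_sentinels=20)
```

Because the verifier only ever asks for a small random subset of positions, the communication and verification costs stay essentially independent of the file length, which is the property the abstract emphasizes.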

1,783 citations