Author

Jeng-Shyang Pan

Bio: Jeng-Shyang Pan is an academic researcher from Shandong University of Science and Technology. The author has contributed to research in topics such as digital watermarking and watermarks. The author has an h-index of 50 and has co-authored 789 publications receiving 11,645 citations. Previous affiliations of Jeng-Shyang Pan include National Kaohsiung Normal University and the Technical University of Ostrava.


Papers
Book ChapterDOI
07 Nov 2016
TL;DR: This paper shows that Peng et al.'s certificateless searchable public key encryption scheme is vulnerable to a malicious PKG attack and an off-line keyword guessing attack.
Abstract: Searchable public key encryption is a cryptographic mechanism that provides an efficient way to search over encrypted keywords. In this paper, we show that Peng et al.'s certificateless searchable public key encryption scheme suffers from a malicious PKG attack and an off-line keyword guessing attack. In the first attack, a malicious PKG can obtain part of an authorized receiver's private key. In the second attack, the malicious PKG can guess the keyword underlying the authorized receiver's trapdoor using that part of the receiver's private key.
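To make the attack surface concrete, the following is a minimal Python sketch of how a generic off-line keyword guessing attack proceeds against a searchable public key encryption (PEKS) scheme. The callables peks_encrypt and peks_test are hypothetical stand-ins for the scheme's Encrypt and Test algorithms; this illustrates only the generic attack shape, not Peng et al.'s concrete construction or the malicious-PKG key extraction itself.

```python
def offline_keyword_guess(public_key, captured_trapdoor, keyword_dictionary,
                          peks_encrypt, peks_test):
    """Enumerate candidate keywords and test each against a captured trapdoor.

    The attack applies whenever the attacker can (a) obtain a trapdoor and
    (b) run the public Test algorithm itself -- the situation a malicious PKG
    holding part of the receiver's private key is in. peks_encrypt and
    peks_test are hypothetical scheme algorithms supplied by the caller.
    """
    for candidate in keyword_dictionary:
        # Encrypt the guessed keyword under the receiver's public key.
        ciphertext = peks_encrypt(public_key, candidate)
        # If Test accepts, the trapdoor was generated for this keyword.
        if peks_test(public_key, ciphertext, captured_trapdoor):
            return candidate  # keyword recovered
    return None  # keyword not in the dictionary
```

Since keywords are typically drawn from a small dictionary, this exhaustive test is feasible off-line, which is exactly why leaking even part of the receiver's private key is damaging.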

18 citations

Book ChapterDOI
01 Jan 2015
TL;DR: The algorithm mainly focuses on the diversity of the fish's locations rather than on the velocity with which a fish swims from its current location to a better one; the results show that ETFA has a faster convergence rate with excellent accuracy.
Abstract: The Ebb-Tide-Fish Algorithm (ETFA) is a simple but powerful optimization algorithm over continuous search spaces, inspired by the foraging behavior of fish in the ebb tide: a fascinating creature that caught the author's attention on walks along the beach and came to mind while looking for ways to improve existing optimization algorithms. The algorithm mainly focuses on the diversity of the fish's locations rather than on the velocity with which a fish swims from its current location to a better one. The paper formulates this foraging behavior and presents the detailed model. The performance of ETFA on a testbed of four functions is compared with several well-known published methods, and the final results show that ETFA has a faster convergence rate with excellent accuracy.
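The abstract does not spell out the update equations, so the sketch below is only a hedged illustration of a velocity-free, location-diversity-driven population search in that spirit; the function names, step sizes, and greedy acceptance rule are illustrative assumptions, not the published ETFA model.

```python
import random

def sphere(x):
    """Example objective: the sphere function, minimised at the origin."""
    return sum(v * v for v in x)

def location_diversity_search(objective, dim=4, n_fish=20, iters=300,
                              bounds=(-5.0, 5.0), seed=1):
    """Velocity-free population search: each fish proposes a new location
    built from its own position, a randomly chosen peer, and a small random
    perturbation, and keeps the move only if the objective improves."""
    rng = random.Random(seed)
    lo, hi = bounds
    fish = [[rng.uniform(lo, hi) for _ in range(dim)] for _ in range(n_fish)]
    fit = [objective(f) for f in fish]

    for _ in range(iters):
        for i in range(n_fish):
            peer = fish[rng.randrange(n_fish)]   # random peer keeps locations diverse
            candidate = [
                min(hi, max(lo, x + rng.random() * (p - x) + rng.gauss(0, 0.05)))
                for x, p in zip(fish[i], peer)
            ]
            c_fit = objective(candidate)
            if c_fit < fit[i]:                   # greedy acceptance, no velocity term
                fish[i], fit[i] = candidate, c_fit

    best = min(range(n_fish), key=fit.__getitem__)
    return fish[best], fit[best]

if __name__ == "__main__":
    location, value = location_diversity_search(sphere)
    print(location, value)
```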

17 citations

Proceedings ArticleDOI
18 Jul 2012
TL;DR: A new statistical ECG algorithm that applies the idea of matching Reduced Binary Patterns is proposed for timely and accurate human identity recognition; it requires neither waveform complex information nor de-noising pre-processing in advance.
Abstract: In this paper, a new statistical ECG algorithm, which applies the idea of matching Reduced Binary Patterns, is proposed for timely and accurate human identity recognition. In contrast to previous research, the proposed design requires neither waveform complex information nor de-noising pre-processing in advance. The algorithm is tested on the public MIT-BIH arrhythmia and normal sinus rhythm databases. The experimental results confirm that the proposed scheme achieves high accuracy, low complexity, and fast processing for ECG identification.
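"Reduced Binary Pattern" is not defined in the abstract; a plausible reading, offered here purely as an assumption, is to encode the rise/fall of consecutive ECG samples as bits, histogram the resulting k-bit words, and match subjects by histogram similarity, which indeed needs no denoising or waveform delineation. The Python sketch below follows that assumed reading.

```python
from collections import Counter

def reduced_binary_pattern(samples, word_len=8):
    """Turn a raw ECG sample sequence into a histogram of k-bit binary words.

    Each bit records whether the signal rose (1) or fell/stayed (0) between
    consecutive samples, so no fiducial-point detection or denoising is used.
    This is an assumed reading of 'Reduced Binary Pattern', not necessarily
    the exact encoding in the paper.
    """
    bits = [1 if b > a else 0 for a, b in zip(samples, samples[1:])]
    words = [tuple(bits[i:i + word_len]) for i in range(len(bits) - word_len + 1)]
    return Counter(words)

def similarity(hist_a, hist_b):
    """Histogram-intersection similarity between two normalised pattern histograms."""
    total_a = sum(hist_a.values()) or 1
    total_b = sum(hist_b.values()) or 1
    keys = set(hist_a) | set(hist_b)
    return sum(min(hist_a[k] / total_a, hist_b[k] / total_b) for k in keys)

# Usage: enroll one reference recording per subject, then identify a probe
# recording as the subject whose stored histogram is most similar to it.
```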

17 citations


Cited by
Journal ArticleDOI
TL;DR: The convergence of a recursive mean shift procedure to the nearest stationary point of the underlying density function is proved, establishing its utility in detecting the modes of the density.
Abstract: A general non-parametric technique is proposed for the analysis of a complex multimodal feature space and for delineating arbitrarily shaped clusters in it. The basic computational module of the technique is an old pattern recognition procedure: the mean shift. For discrete data, we prove the convergence of a recursive mean shift procedure to the nearest stationary point of the underlying density function and, thus, its utility in detecting the modes of the density. The relation of the mean shift procedure to the Nadaraya-Watson estimator from kernel regression and to the robust M-estimators of location is also established. Algorithms for two low-level vision tasks, discontinuity-preserving smoothing and image segmentation, are described as applications. In these algorithms, the only user-set parameter is the resolution of the analysis, and either gray-level or color images are accepted as input. Extensive experimental results illustrate their excellent performance.
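As a reference point, the core mean shift iteration can be written in a few lines: move the current estimate to the kernel-weighted mean of the data and repeat until the shift vanishes, i.e. until a stationary point (mode) of the kernel density estimate is reached. This minimal Python sketch uses a Gaussian kernel and omits the image-specific machinery (joint spatial-range domain, bandwidth selection) described in the paper.

```python
import math

def mean_shift_mode(points, start, bandwidth, tol=1e-6, max_iter=500):
    """Minimal mean shift: repeatedly move the estimate to the Gaussian-weighted
    mean of the sample points until it stops moving, i.e. until it reaches a
    stationary point (mode) of the kernel density estimate."""
    x = list(start)
    for _ in range(max_iter):
        weights, weighted_sum = [], [0.0] * len(x)
        for p in points:
            d2 = sum((pi - xi) ** 2 for pi, xi in zip(p, x))
            w = math.exp(-d2 / (2 * bandwidth ** 2))   # Gaussian kernel weight
            weights.append(w)
            for k, pi in enumerate(p):
                weighted_sum[k] += w * pi
        total = sum(weights)
        new_x = [s / total for s in weighted_sum]
        if sum((a - b) ** 2 for a, b in zip(new_x, x)) < tol ** 2:
            break                                      # shift has vanished: at a mode
        x = new_x
    return x

# Usage: start the procedure from each data point and merge nearby endpoints
# to obtain the set of modes, which then define the clusters.
```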

11,727 citations

Book
24 Oct 2001
TL;DR: Digital Watermarking covers the crucial research findings in the field and explains the principles underlying digital watermarking technologies, describes the requirements that have given rise to them, and discusses the diverse ends to which these technologies are being applied.
Abstract: Digital watermarking is a key ingredient in copyright protection. It provides a solution to illegal copying of digital material and has many other useful applications such as broadcast monitoring and the recording of electronic transactions. Now, for the first time, there is a book that focuses exclusively on this exciting technology. Digital Watermarking covers the crucial research findings in the field: it explains the principles underlying digital watermarking technologies, describes the requirements that have given rise to them, and discusses the diverse ends to which these technologies are being applied. As a result, additional groundwork is laid for future developments in this field, helping the reader understand and anticipate new approaches and applications.

2,849 citations

Proceedings Article
01 Jan 1999

2,010 citations

Posted Content
TL;DR: This paper defines and explores proofs of retrievability (PORs); a POR scheme enables an archive or back-up service to produce a concise proof that a user can retrieve a target file F, that is, that the archive retains and reliably transmits file data sufficient for the user to recover F in its entirety.
Abstract: In this paper, we define and explore proofs of retrievability (PORs). A POR scheme enables an archive or back-up service (prover) to produce a concise proof that a user (verifier) can retrieve a target file F, that is, that the archive retains and reliably transmits file data sufficient for the user to recover F in its entirety. A POR may be viewed as a kind of cryptographic proof of knowledge (POK), but one specially designed to handle a large file (or bitstring) F. We explore POR protocols here in which the communication costs, number of memory accesses for the prover, and storage requirements of the user (verifier) are small parameters essentially independent of the length of F. In addition to proposing new, practical POR constructions, we explore implementation considerations and optimizations that bear on previously explored, related schemes. In a POR, unlike a POK, neither the prover nor the verifier need actually have knowledge of F. PORs give rise to a new and unusual security definition whose formulation is another contribution of our work. We view PORs as an important tool for semi-trusted online archives. Existing cryptographic techniques help users ensure the privacy and integrity of files they retrieve. It is also natural, however, for users to want to verify that archives do not delete or modify files prior to retrieval. The goal of a POR is to accomplish these checks without users having to download the files themselves. A POR can also provide quality-of-service guarantees, i.e., show that a file is retrievable within a certain time bound.
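To ground the idea, below is a deliberately simplified spot-checking sketch in Python: at setup the verifier stores HMAC tags for a few randomly chosen file blocks, and later challenges the prover to return exactly those blocks. POR constructions in the paper's sense add erasure coding and hidden sentinels to obtain true extraction guarantees with even less verifier state; the block size, tag count, and file name used here are illustrative assumptions.

```python
import hmac
import hashlib
import random

BLOCK = 4096  # bytes per file block (illustrative choice)

def setup(file_bytes, key, n_checks=64, seed=None):
    """Verifier-side setup: pick random block indices and store their HMAC tags.
    The stored state is small and essentially independent of the file length."""
    rng = random.Random(seed)
    n_blocks = (len(file_bytes) + BLOCK - 1) // BLOCK
    indices = rng.sample(range(n_blocks), min(n_checks, n_blocks))
    return {
        i: hmac.new(key, file_bytes[i * BLOCK:(i + 1) * BLOCK], hashlib.sha256).digest()
        for i in indices
    }

def prove(file_bytes, challenge_indices):
    """Prover returns the challenged blocks (constant work per challenge)."""
    return {i: file_bytes[i * BLOCK:(i + 1) * BLOCK] for i in challenge_indices}

def verify(response, tags, key):
    """Verifier recomputes the stored tags over the returned blocks."""
    return all(
        hmac.compare_digest(hmac.new(key, block, hashlib.sha256).digest(), tags[i])
        for i, block in response.items()
    )

# Usage sketch (file name is hypothetical):
# key = b"\x00" * 32  # use a real random key in practice
# data = open("backup.bin", "rb").read()
# tags = setup(data, key)
# ok = verify(prove(data, tags.keys()), tags, key)
```

An archive that has deleted or corrupted a noticeable fraction of the file fails such random spot checks with high probability, which is the intuition the full POR definition makes precise.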

1,783 citations