scispace - formally typeset
Author

Jeng-Shyang Pan

Bio: Jeng-Shyang Pan is an academic researcher from Shandong University of Science and Technology. The author has contributed to research on topics including digital watermarking. The author has an h-index of 50 and has co-authored 789 publications receiving 11,645 citations. Previous affiliations of Jeng-Shyang Pan include National Kaohsiung Normal University and the Technical University of Ostrava.


Papers
Proceedings ArticleDOI
13 Dec 2010
TL;DR: The paper summarizes the theory and methods proposed in the past decade and their characteristics based on the relevant literature, classifying the assessment methods as quality-indicator-based approaches and statistical-test approaches.
Abstract: Various randomized search heuristics have been proposed for multiobjective optimization problems. We need to evaluate and compare the performance of these optimizers in order to make good use of them. This paper reviews the theory and methods proposed in the past decade and summarizes their characteristics based on the relevant literature. We do not list and analyze many earlier methods, because the relevant literature has already done so. We look at the methods from a classification perspective: the assessment methods are classified here as quality-indicator-based approaches and statistical-test approaches. Quality indicators are further classified as Pareto-dominance compliant or non-compliant, and as unary or binary. This classification makes it possible to choose suitable assessment measures in practice.
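To make the indicator classification concrete, here is a minimal sketch (function name and setup hypothetical, not from the paper) of one widely used Pareto-dominance-compliant unary quality indicator, the hypervolume, for a two-objective minimization problem:

```python
def hypervolume_2d(front, ref):
    """Area dominated by `front` and bounded by the reference point `ref`.

    `front` is a list of (f1, f2) objective vectors (minimization);
    a larger hypervolume means a better approximation set.
    """
    # Keep only non-dominated points, sorted by the first objective.
    pts = sorted(set(front))
    nd = []
    best_f2 = float("inf")
    for f1, f2 in pts:
        if f2 < best_f2:          # strictly improves the second objective
            nd.append((f1, f2))
            best_f2 = f2
    # Sum the rectangular slices between consecutive points and `ref`.
    hv = 0.0
    prev_f2 = ref[1]
    for f1, f2 in nd:
        hv += (ref[0] - f1) * (prev_f2 - f2)
        prev_f2 = f2
    return hv
```

Because the hypervolume only grows when the approximation set moves toward, or spreads along, the true Pareto front, it is dominance compliant; binary indicators such as the epsilon indicator instead compare two approximation sets directly.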

5 citations

Proceedings ArticleDOI
26 Sep 2010
TL;DR: A novel feature extraction algorithm, named two-dimensional exponential discriminant analysis (2DEDA), is proposed in this paper, which achieves a higher recognition rate and lower computational complexity than EDA.
Abstract: A novel feature extraction algorithm, named two-dimensional exponential discriminant analysis (2DEDA), is proposed in this paper. 2DEDA is a generalization of exponential discriminant analysis (EDA) that operates directly on image matrices. Compared with EDA, 2DEDA therefore achieves a higher recognition rate and lower computational complexity. Experimental results demonstrate the advantages of 2DEDA.
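A rough sketch of how such a method can operate directly on image matrices, assuming 2DLDA-style image scatter matrices combined with the matrix-exponential idea of EDA (all names and details below are illustrative, not the authors' implementation):

```python
import numpy as np

def sym_expm(S):
    """Matrix exponential of a symmetric matrix via its eigendecomposition."""
    lam, V = np.linalg.eigh(S)
    return (V * np.exp(lam)) @ V.T

def two_d_eda(images, labels, k):
    """images: array (N, m, n); returns an n x k projection matrix W.

    Image rows are projected as Y = A @ W, so images are never vectorized;
    the source credits this matrix form for the lower cost compared with EDA.
    """
    images = np.asarray(images, dtype=float)
    N, m, n = images.shape
    mean_all = images.mean(axis=0)
    Gb = np.zeros((n, n))   # image-based between-class scatter
    Gw = np.zeros((n, n))   # image-based within-class scatter
    for c in set(labels):
        idx = [i for i, y in enumerate(labels) if y == c]
        mean_c = images[idx].mean(axis=0)
        d = mean_c - mean_all
        Gb += len(idx) * d.T @ d
        for i in idx:
            e = images[i] - mean_c
            Gw += e.T @ e
    Gb /= N
    Gw /= N
    # EDA trick: exponentiating keeps exp(Gw) full-rank even when Gw is
    # singular (the small-sample-size case in face recognition).
    vals, vecs = np.linalg.eig(np.linalg.solve(sym_expm(Gw), sym_expm(Gb)))
    order = np.argsort(-vals.real)
    return vecs[:, order[:k]].real
```

A feature matrix for image `A` is then `A @ W`, with `k` chosen much smaller than `n`.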

5 citations

Book ChapterDOI
10 Nov 2010
TL;DR: An integrated approach featuring automatic thresholding is developed and presented, and the experimental results indicate that the proposed method greatly improves the visual perceptibility as compared with previous approaches.
Abstract: Due to the inherently low contrast of Electronic Portal Images (EPI), their perceptual quality falls short of most physicians' expectations. Effective post-processing methods are therefore essential to enhance the visual quality of EPI; however, only limited effort has been devoted to this issue in the past decade. To address this problem, an integrated approach featuring automatic thresholding is developed and presented in this article. First, Gray-Level Grouping (GLG) is applied to improve the global contrast of the whole image. Second, Adaptive Image Contrast Enhancement (AICE) is used to refine the local contrast within a neighborhood. Finally, a simple spatial filter is employed to reduce noise. The experimental results indicate that the proposed method greatly improves visual perceptibility compared with previous approaches.
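The global-then-local-then-denoise pipeline can be sketched with simple stand-ins for each stage (hypothetical: plain histogram equalization in place of GLG, a local-gain step in place of AICE, and a 3x3 median filter as the spatial filter):

```python
import numpy as np

def enhance(img):
    """img: uint8 array (H, W). Returns the enhanced uint8 image."""
    # Stage 1 (global contrast): histogram equalization over the whole image.
    hist = np.bincount(img.ravel(), minlength=256)
    cdf = hist.cumsum() / img.size
    img = (cdf[img] * 255).astype(np.uint8)
    # Stage 2 (local contrast): amplify deviation from the 3x3 neighborhood mean.
    pad = np.pad(img.astype(float), 1, mode="edge")
    win = np.stack([pad[y:y + img.shape[0], x:x + img.shape[1]]
                    for y in range(3) for x in range(3)])
    local_mean = win.mean(axis=0)
    img = np.clip(local_mean + 1.5 * (img - local_mean), 0, 255)
    # Stage 3 (noise reduction): 3x3 median filter.
    pad = np.pad(img, 1, mode="edge")
    win = np.stack([pad[y:y + img.shape[0], x:x + img.shape[1]]
                    for y in range(3) for x in range(3)])
    return np.median(win, axis=0).astype(np.uint8)
```

Each stage can be swapped independently, which is the point of an integrated pipeline: the global step fixes the dynamic range, the local step recovers neighborhood detail, and the filter suppresses the noise the first two stages amplify.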

5 citations

01 Jan 2020
TL;DR: The experimental findings reveal that the suggested scheme is stable, more robust, and improves embedding efficiency without altering the embedding rate.
Abstract: This study suggests an efficiency-optimization method based on the genetic algorithm (GA) for automated watermarking. In the block coding of the automated watermarking algorithm, adding a hidden message is an important element in the successful implementation of watermarking. The coding matrix specifies the output and embedding rate of the hidden information. The coding matrix is optimized by adjusting the GA to assess the main message. The results obtained by the proposed scheme are compared with methods from the previous literature; the experimental findings reveal that the suggested scheme is stable, more robust, and improves embedding efficiency without altering the embedding rate.
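As an illustration of the underlying idea (the construction below is a toy, not the paper's scheme): matrix embedding hides K message bits in N cover bits via a binary coding matrix, and a small genetic algorithm can search for a matrix that minimizes the average number of flipped cover bits, i.e., maximizes embedding efficiency at the fixed rate K/N:

```python
import itertools
import random

K, N = 2, 5   # embed K message bits into N cover bits

def min_flips(H, cover, msg):
    """Fewest cover-bit flips e such that H @ (cover ^ e) = msg (mod 2)."""
    best = N
    for e in itertools.product((0, 1), repeat=N):
        x = [c ^ b for c, b in zip(cover, e)]
        syn = tuple(sum(h * v for h, v in zip(row, x)) % 2 for row in H)
        if syn == tuple(msg):
            best = min(best, sum(e))
    return best

def fitness(H, trials=40):
    """Negative mean flips over fixed random cover/message pairs (higher is better)."""
    rng = random.Random(1)   # fixed draws so scores are comparable across matrices
    total = sum(min_flips(H,
                          [rng.randint(0, 1) for _ in range(N)],
                          [rng.randint(0, 1) for _ in range(K)])
                for _ in range(trials))
    return -total / trials

def evolve(pop=12, gens=10, seed=0):
    """Evolve a K x N coding matrix via selection plus single-bit mutation."""
    rng = random.Random(seed)
    Hs = [[[rng.randint(0, 1) for _ in range(N)] for _ in range(K)]
          for _ in range(pop)]
    for _ in range(gens):
        Hs.sort(key=fitness, reverse=True)
        Hs = Hs[:pop // 2]                    # selection: keep the fitter half
        for H in list(Hs):                    # mutation: flip one random entry
            child = [row[:] for row in H]
            child[rng.randrange(K)][rng.randrange(N)] ^= 1
            Hs.append(child)
    return max(Hs, key=fitness)
```

Note that the embedding rate K/N is fixed by the matrix shape, so the GA can only improve efficiency (fewer flips per embedded bit), matching the claim that efficiency improves without altering the rate.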

5 citations

Book ChapterDOI
17 Jun 2013
TL;DR: A hierarchical gradient diffusion algorithm is proposed to solve the transmission problem and the sensor nodes' loading problem by adding several relay nodes and arranging the sensor nodes' routing paths to reduce the data-packet transmission loss rate.
Abstract: In this paper, a hierarchical gradient diffusion algorithm is proposed to solve the transmission problem and the sensor nodes' loading problem by adding several relay nodes and arranging the sensor nodes' routing paths. The proposed hierarchical gradient diffusion aims to balance the sensor nodes' transmission load, extend the sensor nodes' lifetime, and reduce the data-packet transmission loss rate. According to the experimental results, the proposed algorithm not only reduces power consumption by about 12% but also decreases the data loss rate by 85.5% and increases the number of active nodes by about 51.7%.
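A simplified sketch of the gradient-routing idea (structure hypothetical, not the paper's protocol): each node stores its hop distance to the sink as a gradient, and a packet is forwarded to the least-loaded neighbor with a smaller gradient, which spreads traffic across relays:

```python
from collections import deque

def build_gradients(adj, sink):
    """BFS hop counts from the sink over the adjacency dict `adj`."""
    grad = {sink: 0}
    q = deque([sink])
    while q:
        u = q.popleft()
        for v in adj[u]:
            if v not in grad:
                grad[v] = grad[u] + 1
                q.append(v)
    return grad

def route(adj, grad, load, src, sink):
    """Forward from src to sink, preferring low-load downhill neighbors."""
    path = [src]
    node = src
    while node != sink:
        downhill = [v for v in adj[node] if grad[v] < grad[node]]
        node = min(downhill, key=lambda v: load[v])  # balance relay load
        load[node] += 1                              # account for relaying cost
        path.append(node)
    return path
```

Because the load counter persists across packets, consecutive packets from the same source fan out over different relays, which is the load-balancing behavior the abstract describes.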

5 citations


Cited by
Journal ArticleDOI
TL;DR: It is proved the convergence of a recursive mean shift procedure to the nearest stationary point of the underlying density function and, thus, its utility in detecting the modes of the density.
Abstract: A general non-parametric technique is proposed for the analysis of a complex multimodal feature space and to delineate arbitrarily shaped clusters in it. The basic computational module of the technique is an old pattern recognition procedure: the mean shift. For discrete data, we prove the convergence of a recursive mean shift procedure to the nearest stationary point of the underlying density function and, thus, its utility in detecting the modes of the density. The relation of the mean shift procedure to the Nadaraya-Watson estimator from kernel regression and to the robust M-estimators of location is also established. Algorithms for two low-level vision tasks, discontinuity-preserving smoothing and image segmentation, are described as applications. In these algorithms, the only user-set parameter is the resolution of the analysis, and either gray-level or color images are accepted as input. Extensive experimental results illustrate their excellent performance.
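The recursive procedure whose convergence is proved can be sketched in a few lines (a minimal flat-kernel version; the paper treats general kernels):

```python
import numpy as np

def mean_shift(point, data, h=1.0, tol=1e-6, max_iter=200):
    """Shift `point` to a stationary point (mode) of the data's density."""
    x = np.asarray(point, dtype=float)
    data = np.asarray(data, dtype=float)
    for _ in range(max_iter):
        near = data[np.linalg.norm(data - x, axis=1) <= h]  # window of radius h
        if near.size == 0:            # empty window: nothing to average
            break
        shifted = near.mean(axis=0)   # the mean-shift step
        if np.linalg.norm(shifted - x) < tol:
            break
        x = shifted
    return x
```

The bandwidth `h` is the "resolution of the analysis" the abstract refers to: it is the only parameter the user sets, and it determines which nearby points vote for the local mean.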

11,727 citations

Book
24 Oct 2001
TL;DR: Digital Watermarking covers the crucial research findings in the field and explains the principles underlying digital watermarking technologies, describes the requirements that have given rise to them, and discusses the diverse ends to which these technologies are being applied.
Abstract: Digital watermarking is a key ingredient of copyright protection. It provides a solution to the illegal copying of digital material and has many other useful applications, such as broadcast monitoring and the recording of electronic transactions. Now, for the first time, there is a book that focuses exclusively on this exciting technology. Digital Watermarking covers the crucial research findings in the field: it explains the principles underlying digital watermarking technologies, describes the requirements that have given rise to them, and discusses the diverse ends to which these technologies are being applied. As a result, additional groundwork is laid for future developments in this field, helping the reader understand and anticipate new approaches and applications.
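To illustrate the basic embed/extract principle behind such technologies, here is a toy least-significant-bit (LSB) watermark (illustrative only; practical schemes covered in the literature work in transform domains and must survive compression and attacks):

```python
def embed(pixels, bits):
    """Hide one watermark bit in the LSB of each pixel value."""
    return [(p & ~1) | b for p, b in zip(pixels, bits)]

def extract(pixels, n):
    """Recover the first n watermark bits from the marked pixels."""
    return [p & 1 for p in pixels[:n]]
```

The trade-off every scheme balances is already visible here: the mark is imperceptible (each pixel changes by at most 1), but it is also fragile, which is why robust schemes spread the watermark redundantly across the signal.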

2,849 citations

Proceedings Article
01 Jan 1999

2,010 citations

Posted Content
TL;DR: This paper defines and explores proofs of retrievability (PORs), a POR scheme that enables an archive or back-up service to produce a concise proof that a user can retrieve a target file F, that is, that the archive retains and reliably transmits file data sufficient for the user to recover F in its entirety.
Abstract: In this paper, we define and explore proofs of retrievability (PORs). A POR scheme enables an archive or back-up service (prover) to produce a concise proof that a user (verifier) can retrieve a target file F, that is, that the archive retains and reliably transmits file data sufficient for the user to recover F in its entirety. A POR may be viewed as a kind of cryptographic proof of knowledge (POK), but one specially designed to handle a large file (or bitstring) F. We explore POR protocols here in which the communication costs, number of memory accesses for the prover, and storage requirements of the user (verifier) are small parameters essentially independent of the length of F. In addition to proposing new, practical POR constructions, we explore implementation considerations and optimizations that bear on previously explored, related schemes. In a POR, unlike a POK, neither the prover nor the verifier need actually have knowledge of F. PORs give rise to a new and unusual security definition whose formulation is another contribution of our work. We view PORs as an important tool for semi-trusted online archives. Existing cryptographic techniques help users ensure the privacy and integrity of files they retrieve. It is also natural, however, for users to want to verify that archives do not delete or modify files prior to retrieval. The goal of a POR is to accomplish these checks without users having to download the files themselves. A POR can also provide quality-of-service guarantees, i.e., show that a file is retrievable within a certain time bound.
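A toy spot-checking sketch conveys the flavor of the idea (this construction is illustrative and far weaker than the paper's PORs): the user MACs each file block before upload and later challenges a few random block indices, so the audit cost is independent of the file length:

```python
import hashlib
import hmac

BLOCK = 64   # block size in bytes (arbitrary for the sketch)

def preprocess(data, key):
    """Split the file into blocks and MAC each one; the tags stay with the user."""
    blocks = [data[i:i + BLOCK] for i in range(0, len(data), BLOCK)]
    tags = [hmac.new(key, str(i).encode() + b, hashlib.sha256).digest()
            for i, b in enumerate(blocks)]   # bind each tag to its position i
    return blocks, tags

def audit(server_blocks, tags, key, challenges):
    """Verify the MACs on a few challenged blocks, not on the whole file."""
    return all(
        hmac.compare_digest(
            tags[i],
            hmac.new(key, str(i).encode() + server_blocks[i],
                     hashlib.sha256).digest())
        for i in challenges)
```

A plain spot check only catches large-scale deletion with high probability; the paper's constructions add further machinery (such as hidden check values and error-correcting encoding of F) so that passing audits implies the entire file is recoverable.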

1,783 citations