
Showing papers by "Jeng-Shyang Pan published in 2012"


Journal ArticleDOI
TL;DR: The experimental results show that the proposed EPCSO method can provide the optimum recovered aircraft schedule in a very short time and requires less computational time than the existing PSO-based methods.
Abstract: In this paper, we present an enhanced parallel cat swarm optimization (EPCSO) method for solving numerical optimization problems. The parallel cat swarm optimization (PCSO) method is an optimization algorithm designed to solve numerical optimization problems under the conditions of a small population size and a small number of iterations. The Taguchi method is widely used in industry for optimizing product and process conditions. By adopting the Taguchi method into the tracing mode process of the PCSO method, we propose the EPCSO method with better accuracy and less computational time. In this paper, five test functions are used to evaluate the accuracy of the proposed EPCSO method. The experimental results show that the proposed EPCSO method achieves higher accuracies than the existing PSO-based methods and requires less computational time than the PCSO method. We also apply the proposed method to solve the aircraft schedule recovery problem. The experimental results show that the proposed EPCSO method can provide the optimum recovered aircraft schedule in a very short time, obtaining the same recovery schedule, with the same total delay time, the same number of delayed flights and the same number of long-delay flights, as the Liu, Chen, and Chou method (2009).

148 citations
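For context on the entry above, the following is a minimal sketch of the tracing-mode update used in cat swarm optimization, which PCSO and EPCSO build on. The Taguchi orthogonal-array candidate generation and the parallel exchange between groups described in the abstract are not modelled; the constant c and the sphere test function are illustrative assumptions only.

```python
import numpy as np

def tracing_mode_step(positions, velocities, best_position, c=2.0, rng=None):
    """One tracing-mode update for cats in tracing mode (CSO-style).

    positions, velocities: (n_cats, dim) arrays; best_position: (dim,) array.
    The Taguchi-based candidate selection of EPCSO is not modelled here.
    """
    rng = np.random.default_rng() if rng is None else rng
    r = rng.random(positions.shape)
    velocities = velocities + r * c * (best_position - positions)
    return positions + velocities, velocities

# Tiny usage example on a sphere function (illustrative only).
def sphere(x):
    return float(np.sum(x ** 2))

rng = np.random.default_rng(0)
pos = rng.uniform(-5, 5, size=(10, 3))
vel = np.zeros_like(pos)
best = pos[np.argmin([sphere(p) for p in pos])]
for _ in range(50):
    pos, vel = tracing_mode_step(pos, vel, best, rng=rng)
    best = min(list(pos) + [best], key=sphere)
print("best value:", sphere(best))
```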


Journal ArticleDOI
TL;DR: Experimental results illustrate that the proposed scheme outperforms ZM/PZM based schemes in terms of embedding capacity and watermark robustness and is also robust to both geometric and signal processing based attacks.

105 citations


Proceedings Article
01 Oct 2012
TL;DR: A comprehensive survey on face recognition covering practical applications, sensory inputs, methods, and application conditions, with face recognition methods reviewed from the viewpoints of signal processing and machine learning.
Abstract: Face recognition has been widely researched and applied in many areas, and many surveys of face recognition have been published. Different from previous surveys, which take a single viewpoint of application, method or condition, this paper presents a comprehensive survey of face recognition from the perspectives of practical applications, sensory inputs, methods, and application conditions. For sensory inputs, we review image-based, video-based, 3D-based and hyperspectral-image-based face recognition, and we survey face recognition methods from the viewpoints of signal processing and machine learning, such as kernel learning and manifold learning methods. Moreover, we discuss face recognition from a single training sample and under variable poses. The prominent algorithms are described and critically analyzed, and relevant issues such as data collection, the influence of the small sample size, and system evaluation are discussed.

22 citations


Proceedings ArticleDOI
25 Aug 2012
TL;DR: A new approach is proposed for hand gesture recognition, accomplished by dominant-point based hand finger counting after skin color extraction, by which the hand gesture contour is obtained.
Abstract: In this paper, a new approach is proposed for hand gesture recognition, accomplished by dominant-point based hand finger counting with skin color extraction. Skin color detection is used as a preprocessing segmentation step, by which the hand gesture contour is extracted. After hand segmentation is done, a dominant-point based algorithm counts the hand fingers in this hand gesture control system. The performance is evaluated and compared at the end, and the hand gesture control system can then be implemented efficiently with fuzzy logic.

22 citations
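For the hand gesture entry above, the following is a minimal sketch of the skin-color preprocessing step only. The Cb/Cr thresholds are commonly used values from the skin-detection literature, not parameters taken from the paper; the dominant-point finger counting and the fuzzy control stage are not shown.

```python
import numpy as np

def skin_mask_ycbcr(rgb, cb_range=(77, 127), cr_range=(133, 173)):
    """Return a boolean skin mask from an RGB uint8 image.

    The Cb/Cr ranges are typical literature values, not values reported
    in the paper.
    """
    rgb = rgb.astype(np.float32)
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5 * b
    cr = 128 + 0.5 * r - 0.418688 * g - 0.081312 * b
    return ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1]))

# The hand contour could then be traced on this mask before
# dominant-point based finger counting.
```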


Proceedings ArticleDOI
18 Jul 2012
TL;DR: A new statistical ECG algorithm based on matching the Reduced Binary Pattern is proposed for timely and accurate human identity recognition; it requires neither waveform complex information nor de-noising pre-processing in advance.
Abstract: In this paper, a new statistical ECG algorithm, which applies the idea of matching the Reduced Binary Pattern, is proposed for timely and accurate human identity recognition. Compared with previous research, the proposed design requires neither waveform complex information nor de-noising pre-processing in advance. Our algorithm is tested on the public MIT-BIH arrhythmia and normal sinus rhythm databases. The experimental results confirm that the proposed scheme achieves high accuracy, low complexity, and fast processing for ECG identification.

17 citations
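The abstract above does not define the Reduced Binary Pattern, so the sketch below is only one plausible reading: encode the rise/fall of consecutive ECG samples as bits, group them into fixed-length words, and compare the pattern histograms of two recordings. The word length and the L1 histogram distance are assumptions, not the paper's definitions.

```python
import numpy as np

def rbp_histogram(ecg, word_len=8):
    """Histogram of binary rise/fall patterns over fixed-length words.

    Illustrative guess at a 'reduced binary pattern' feature, not the
    exact definition used in the paper.
    """
    bits = (np.diff(ecg) > 0).astype(np.int64)
    usable = (len(bits) // word_len) * word_len
    words = bits[:usable].reshape(-1, word_len)
    codes = words.dot(1 << np.arange(word_len - 1, -1, -1))
    hist = np.bincount(codes, minlength=2 ** word_len).astype(np.float64)
    return hist / hist.sum()

def rbp_distance(ecg_a, ecg_b, word_len=8):
    """Simple L1 distance between pattern histograms; smaller = more similar."""
    return float(np.abs(rbp_histogram(ecg_a, word_len) -
                        rbp_histogram(ecg_b, word_len)).sum())
```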


Journal ArticleDOI
TL;DR: A novel classifier based on feature lines, called the nearest feature centre (NFC) classifier, is proposed for face recognition, and the experimental results show that NFC outperforms NFL and the other classifiers based on feature lines.
Abstract: A novel classifier based on feature lines, called the nearest feature centre (NFC) classifier, is proposed for face recognition. NFC uses a new metric, called the feature centre metric, which is different from the classical feature line metric of the nearest feature line (NFL) classifier. The experimental results on the ORL face database show that NFC outperforms NFL and the other classifiers based on feature lines.

17 citations
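For reference alongside the NFC entry above, the sketch below shows the classical nearest feature line (NFL) distance and classification rule that the abstract compares against. The feature centre metric of NFC itself is not specified in the abstract and is therefore not reproduced here.

```python
import numpy as np

def feature_line_distance(q, x1, x2, eps=1e-12):
    """Distance from query q to the feature line through prototypes x1, x2."""
    d = x2 - x1
    t = np.dot(q - x1, d) / (np.dot(d, d) + eps)
    projection = x1 + t * d
    return float(np.linalg.norm(q - projection))

def nfl_classify(q, class_prototypes):
    """Classify q by the nearest feature line over all prototype pairs per class.

    class_prototypes: dict mapping label -> list of 1-D feature vectors.
    The NFC 'feature centre metric' of the paper is not reproduced here.
    """
    best_label, best_dist = None, np.inf
    for label, protos in class_prototypes.items():
        for i in range(len(protos)):
            for j in range(i + 1, len(protos)):
                dist = feature_line_distance(q, protos[i], protos[j])
                if dist < best_dist:
                    best_label, best_dist = label, dist
    return best_label
```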


Journal ArticleDOI
TL;DR: A novel image classification algorithm named Adaptively Weighted Sub-directional Two-Dimensional Linear Discriminant Analysis (AWS2DLDA) can extract the directional features of images in the frequency domain, and it is applied to face recognition.

15 citations


Proceedings ArticleDOI
04 Jun 2012
TL;DR: A novel algorithm named Swapped Huffman code Table (SHT), which combines compression and encryption based on Huffman coding, is proposed in this paper, and the results show its feasibility and efficiency.
Abstract: Data compression has been playing an important role in the area of data transmission. Many great contributions have been made in this area, such as Huffman coding, the LZW algorithm, run-length coding, and so on. These methods, however, focus only on data compression. On the other hand, it is very important to encrypt data to protect it against malicious theft and attack during transmission. A novel algorithm named Swapped Huffman code Table (SHT), which combines compression and encryption based on Huffman coding, is proposed in this paper. The algorithm is an enhanced Huffman coding scheme with encryption for wireless data broadcasting systems. An application example and evaluation results show its feasibility and efficiency.

15 citations
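The abstract above does not detail how the SHT table is constructed, so the following is only an illustrative guess at joint compression and encryption: build a standard Huffman table, then shuffle codewords among symbols whose codes have the same length, keyed by a shared secret. Swapping within equal lengths keeps the code prefix-free and leaves the compressed size unchanged; the construction and the key handling here are assumptions, not the paper's scheme.

```python
import heapq
import random
from collections import Counter

def huffman_codes(data):
    """Standard Huffman code table for the symbols in `data`."""
    freq = Counter(data)
    if len(freq) == 1:                      # degenerate single-symbol case
        return {next(iter(freq)): "0"}
    heap = [[count, i, {sym: ""}] for i, (sym, count) in enumerate(freq.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)
        hi = heapq.heappop(heap)
        lo[2] = {s: "0" + c for s, c in lo[2].items()}
        hi[2] = {s: "1" + c for s, c in hi[2].items()}
        counter += 1
        heapq.heappush(heap, [lo[0] + hi[0], counter, {**lo[2], **hi[2]}])
    return heap[0][2]

def swapped_table(codes, key):
    """Shuffle codewords among symbols with equal code length, keyed by `key`.

    Illustrative only: the actual SHT construction is not described in the
    abstract.
    """
    rng = random.Random(key)
    by_len = {}
    for sym, code in codes.items():
        by_len.setdefault(len(code), []).append(sym)
    table = {}
    for length, syms in by_len.items():
        words = [codes[s] for s in syms]
        rng.shuffle(words)
        table.update(dict(zip(syms, words)))
    return table

text = "wireless data broadcasting"
table = swapped_table(huffman_codes(text), key="shared-secret")
encoded = "".join(table[ch] for ch in text)
```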


Book ChapterDOI
15 Oct 2012
TL;DR: A novel feature extraction algorithm based on nearest feature line that can extract the local discriminant features of the samples using two discriminant power criterions to adaptively determine the parameter.
Abstract: A novel feature extraction algorithm based on nearest feature line is proposed in this paper. The proposed algorithm can extract the local discriminant features of the samples. The performance of the proposed algorithm is directly associated with the parameter, so we use two discriminant power criterions to adaptively determine the parameter. Some experiments are implemented to evaluate the proposed algorithm and the experimental results demonstrate the efficiency of the proposed algorithm.

11 citations


Proceedings ArticleDOI
07 Jul 2012
TL;DR: The use of a wavelet-based quantization watermarking scheme on the ECG signal for patient protection is shown to be adequate and robust under network transfer of ECG data.
Abstract: In this article, we use a self-synchronized watermark technology [7] to protect the electrocardiogram (ECG) signal. A Haar wavelet transform with 7-level decomposition is adopted to transform the ECG signal, and the synchronization code, combined with the watermark, is quantization-embedded in the low-frequency sub-band of level 7. The signal-to-noise ratio (SNR) between the embedded ECG and the original one is greater than 30, so the difference between the two ECG signals is very small and negligible in general. To test robustness under network transfer of ECG data, a white-noise attack with various strengths is simulated; the bit error rate remains quite small unless the noise is very strong. This study confirms that the wavelet-based quantization watermarking scheme on the ECG signal is adequate for patient protection.

11 citations
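For the ECG watermarking entry above, the sketch below shows quantization-based embedding and blind extraction in the level-7 Haar approximation band, assuming the PyWavelets library. The self-synchronization code of the cited scheme is not modelled, and the step size delta is an illustrative value rather than one reported in the paper.

```python
import numpy as np
import pywt  # PyWavelets

def embed_bits(ecg, bits, delta=0.02, level=7):
    """Quantization-based embedding of `bits` into the level-`level` Haar
    approximation coefficients.  `delta` is illustrative, and no
    synchronization code is added (unlike the cited scheme)."""
    coeffs = pywt.wavedec(ecg, "haar", level=level)
    approx = coeffs[0].copy()
    assert len(bits) <= len(approx), "watermark longer than host band"
    for i, bit in enumerate(bits):
        dither = delta / 4 if bit else -delta / 4
        approx[i] = delta * np.round((approx[i] - dither) / delta) + dither
    coeffs[0] = approx
    return pywt.waverec(coeffs, "haar")[: len(ecg)]

def extract_bits(watermarked, n_bits, delta=0.02, level=7):
    """Blind extraction: pick the dither lattice closest to each coefficient."""
    approx = pywt.wavedec(watermarked, "haar", level=level)[0]
    bits = []
    for c in approx[:n_bits]:
        errors = []
        for bit in (0, 1):
            dither = delta / 4 if bit else -delta / 4
            errors.append(abs(c - (delta * np.round((c - dither) / delta) + dither)))
        bits.append(int(np.argmin(errors)))
    return bits
```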


Journal ArticleDOI
TL;DR: The main objective of this Special Issue is to provide the readers with a focused set of peer-reviewed articles to reflect recent advances in state-of-the-art algorithms as well as application aspects of swarm intelligence.

Book
27 Feb 2012
TL;DR: The three-volume set LNAI 7196, LNAI 7197 and LNAI 7198 constitutes the refereed proceedings of the 4th Asian Conference on Intelligent Information and Database Systems, ACIIDS 2012, held in Kaohsiung, Taiwan, in March 2012.
Abstract: The three-volume set LNAI 7196, LNAI 7197 and LNAI 7198 constitutes the refereed proceedings of the 4th Asian Conference on Intelligent Information and Database Systems, ACIIDS 2012, held in Kaohsiung, Taiwan, in March 2012. The 161 revised papers presented were carefully reviewed and selected from more than 472 submissions. The papers cover the following topics: intelligent database systems, data warehouses and data mining, natural language processing and computational linguistics, semantic Web, social networks and recommendation systems, collaborative systems and applications, e-business and e-commerce systems, e-learning systems, information modeling and requirements engineering, information retrieval systems, intelligent agents and multi-agent systems, intelligent information systems, intelligent internet systems, intelligent optimization techniques, object-relational DBMS, ontologies and knowledge sharing, semi-structured and XML database systems, unified modeling language and unified processes, Web services and semantic Web, and computer networks and communication systems.

BookDOI
28 Jul 2012
TL;DR: A sample of recent advances in information hiding techniques and their applications includes image data hiding scheme based on vector quantization and image graph coloring and the copyright protection system for Android platform.
Abstract: This research book presents a sample of recent advances in information hiding techniques and their applications. It includes: an image data hiding scheme based on vector quantization and image graph coloring; a copyright protection system for the Android platform; reversible data hiding; ICA-based image and video watermarking; content-based invariant image watermarking; single bitmap block truncation coding of color images using cat swarm optimization; genetic-based wavelet packet watermarking for copyright protection; lossless text steganography in compression coding; a fast and low-distortion capacity synchronized acoustic-to-acoustic steganography scheme; and video watermarking with shot detection.

Proceedings ArticleDOI
25 Aug 2012
TL;DR: This paper presents a hybrid socio-rational secret sharing scheme, which is based on a belief learning model, which allows the player to learn the probabilities of the other players' likely action, upon which the player can responsively choose his next counter action with the expected probabilities.
Abstract: In this paper, we present a hybrid socio-rational secret sharing scheme, which is based on a belief learning model. Using this model, the player can learn the probabilities of the other players' likely action, upon which the player can responsively choose his next counter action(s) with the expected probabilities. Finally, we exploit the application of our proposed scheme in secured cloud storage.

01 Jan 2012
TL;DR: The RICET algorithm extends the traditional data aggregation algorithm to detect composite events, and this algorithm can eliminate redundant transmission and save power consumption, thereby extending the lifetime of the entire wireless sensor network.
Abstract: In this paper, a Reduce Identical Composite Event Transmission (RICET) algorithm is proposed to solve the problem of detecting composite events in wireless sensor networks. The RICET algorithm extends the traditional data aggregation algorithm to detect composite events, and this algorithm can eliminate redundant transmission and save power consumption, thereby extending the lifetime of the entire wireless sensor network. According to the experimental results, the proposed algorithm not only reduces power consumption by approximately 64.78% and 62.67%, but it also enhances the sensor node's lifetime by up to 8.97 times compared with some traditional algorithms.
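The RICET suppression rule itself is not spelled out in the abstract above, so the following is only a minimal sketch of the general idea: a node does not retransmit a composite event that is identical to the one it last reported within a freshness window. The event representation and the 60-second window are assumptions, not values from the paper.

```python
import time

class CompositeEventFilter:
    """Suppress retransmission of identical composite events (illustrative).

    A composite event is represented here as a frozenset of (sensor, predicate)
    pairs; the concrete event model and the freshness window are assumptions.
    """
    def __init__(self, freshness_s=60.0):
        self.freshness_s = freshness_s
        self.last_event = None
        self.last_sent = -float("inf")

    def should_transmit(self, event, now=None):
        now = time.monotonic() if now is None else now
        identical = (event == self.last_event and
                     now - self.last_sent < self.freshness_s)
        if identical:
            return False          # redundant: drop to save radio energy
        self.last_event, self.last_sent = event, now
        return True

node = CompositeEventFilter()
e = frozenset({("temp", "high"), ("smoke", "detected")})
print(node.should_transmit(e, now=0.0))   # True  -> transmit
print(node.should_transmit(e, now=5.0))   # False -> suppressed
```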

Proceedings ArticleDOI
07 Jul 2012
TL;DR: This paper proposes a cloud intrusion detection system with a new statistical waveform-based classification that records network connections over a period of time to form a waveform and then computes the suspicious characteristics of the waveform.
Abstract: In recent years, many approaches have been proposed for intrusion detection. In this paper, we propose a cloud intrusion detection system with a new statistical waveform-based classification. It records network connections over a period of time to form a waveform and then computes the suspicious characteristics of the waveform. It classifies the intrusion with these selected waveform features. In our evaluation, the DARPA Intrusion Detection Data Sets were used, and the preliminary results confirmed that our approach is feasible.
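The abstract above does not say which waveform features are used, so the sketch below only illustrates the general pipeline: bin connection timestamps into per-second counts to form a waveform, then compute a few simple descriptors. The window, bin size, and chosen features are assumptions for illustration.

```python
import numpy as np

def connection_waveform(timestamps, window_s=60, bin_s=1.0):
    """Per-bin connection counts over the last `window_s` seconds."""
    timestamps = np.asarray(timestamps, dtype=float)
    edges = np.arange(0.0, window_s + bin_s, bin_s)
    counts, _ = np.histogram(timestamps, bins=edges)
    return counts

def waveform_features(counts):
    """A few simple descriptors; the paper's actual 'suspicious
    characteristics' are not specified in the abstract."""
    counts = counts.astype(float)
    return {
        "mean": counts.mean(),
        "std": counts.std(),
        "peak": counts.max(),
        "burstiness": counts.max() / (counts.mean() + 1e-9),
    }

ts = np.concatenate([np.random.uniform(0, 60, 100),     # background traffic
                     np.random.uniform(30, 31, 400)])    # 1-second burst
print(waveform_features(connection_waveform(ts)))
```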

Proceedings ArticleDOI
25 Aug 2012
TL;DR: The proposed strategy is to first divide such an image into many sub-images and then detect the blurred sub-image by the gradient distribution and the maximum of the cepstrum.
Abstract: This paper presents a real-time restoration method for linear local motion-blurred images. For an image in which only the fast-moving object is blurred but the background is clear, the proposed strategy is to first divide the image into many sub-images and then detect the blurred sub-image by the gradient distribution and the maximum of the cepstrum. For a blurred sub-image, the blur direction and blur length are estimated to calculate the parameters of the point spread function (PSF), and the Lucy-Richardson deconvolution algorithm is employed to restore the blurred sub-image. On many artificial and real blurred images, the experimental results show that the proposed approach is more accurate and robust than other methods.
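As a companion to the entry above, the sketch below shows only the final restoration step, assuming scikit-image and SciPy: a straight-line motion PSF fed to Richardson-Lucy deconvolution, plus a variance-of-Laplacian score as a simple stand-in for blurred sub-image detection. The cepstrum-based direction/length estimation of the paper is not reproduced, and the PSF length and iteration count are illustrative.

```python
import numpy as np
from scipy.ndimage import laplace
from skimage.restoration import richardson_lucy

def motion_psf(length, angle_deg=0.0):
    """Simple straight-line motion PSF; only horizontal/vertical handled here."""
    psf = np.zeros((length, length))
    psf[length // 2, :] = 1.0
    if angle_deg == 90.0:
        psf = psf.T
    return psf / psf.sum()

def blurriness(gray):
    """Variance of the Laplacian: low values suggest a blurred sub-image.
    (A stand-in for the gradient-distribution / cepstrum test of the paper.)"""
    return float(laplace(gray.astype(float)).var())

def restore_subimage(gray, blur_length=9, iterations=30):
    """Richardson-Lucy restoration of one blurred sub-image (values in [0, 1])."""
    psf = motion_psf(blur_length)
    return richardson_lucy(gray, psf, iterations)
```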

Proceedings ArticleDOI
01 Aug 2012
TL;DR: This paper proposes a refined, efficient identity-based threshold proxy signature scheme for the cloud environment that is not only practical but also requires fewer rounds of communication and less operation cost than existing schemes.
Abstract: Threshold proxy signatures are more practical, flexible and secure than traditional proxy signature and multi-proxy signature schemes. However, the majority of existing identity-based threshold proxy signature schemes are still inefficient and impractical. Cloud services allow users to create, store, edit, and read electronic documents on the Internet conveniently, while access control mechanisms enable communications between user terminals and the cloud to be executed securely and in a trusted manner. In this paper, we propose a refined, efficient identity-based threshold proxy signature scheme for the cloud environment. Our approach is not only practical but also requires fewer rounds of communication and less operation cost than existing schemes. We also present an implementation of our approach in the cloud environment.

Proceedings ArticleDOI
07 Jul 2012
TL;DR: A new intrusion detection system architecture is proposed that can be used for cloud intrusion detection; it uses inaccurate hashing to obtain fast packet inspection and applies a new stateful rule DFA to track the session inspection process.
Abstract: This paper proposes a new intrusion detection system architecture, which can be used for cloud intrusion detection. The new architecture uses inaccurate hashing to obtain fast packet inspection and applies a new stateful rule DFA to track the session inspection process. A preliminary analysis suggests that the new architecture should be feasible in real applications.

Proceedings ArticleDOI
07 Jul 2012
TL;DR: A novel fragile watermarking algorithm is proposed based on the Fractional Fourier Transform and a blocking method, and good performance in experiments shows its effectiveness for tamper detection.
Abstract: A novel fragile watermarking algorithm based on the Fractional Fourier Transform (FRFT) and a blocking method is proposed in this paper. A chaotic sequence is converted into a binary 0/1 sequence to be used as the watermark information. The blocking method of the fragile watermarking algorithm is used to generate the initial value of the chaotic sequence. Good performance in experiments shows the algorithm's effectiveness for tamper detection.
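For the fragile watermarking entry above, the sketch below only illustrates turning a chaotic orbit into a binary watermark and seeding it from an image block. The use of a logistic map, the block-mean seeding rule, and the threshold at 0.5 are assumptions; the FRFT-domain embedding itself is not shown.

```python
import numpy as np

def chaotic_bits(x0, n, r=3.99):
    """Binary watermark bits from a logistic-map orbit (threshold at 0.5)."""
    bits = np.empty(n, dtype=np.uint8)
    x = float(x0)
    for i in range(n):
        x = r * x * (1.0 - x)
        bits[i] = 1 if x > 0.5 else 0
    return bits

def block_seed(block):
    """Derive the chaotic initial value from an image block.

    An assumption: the abstract only says the blocking method generates
    the initial value, without giving the rule.
    """
    m = float(np.mean(block)) / 255.0
    return min(max(m, 1e-3), 1.0 - 1e-3)   # keep the orbit away from fixed points

block = np.random.randint(0, 256, size=(8, 8))
watermark = chaotic_bits(block_seed(block), n=64)
```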


Proceedings ArticleDOI
07 Jul 2012
TL;DR: A new pattern matching algorithm for cloud systems is proposed: it first performs inexact matching to filter out non-attack information and then performs exact matching to obtain the final attack information.
Abstract: With the development of cloud computing, its security issues have received more and more attention. There is a great demand for examining the content of data or packets in order to improve cloud security. In this paper, we propose a new pattern matching algorithm for cloud systems. It first performs inexact matching to filter out non-attack information and then performs exact matching to obtain the final attack information. Our specific algorithm, named Bit-Reduced DFA, is shown to be feasible through a preliminary evaluation.

Journal ArticleDOI
TL;DR: An adaptive Fast TCP is proposed to move data faster between data centers of cloud services; it adapts the operational parameter of Fast TCP connections to improve the throughput and responsiveness of TCP connections.
Abstract: Cloud services are a trend of promising services on the Internet. Data centers with high performance are the key to realizing real-time services for worldwide users. For users travelling globally, data mobility between cloud data centers is essential to offer quality-ensured cloud services. However, existing transmission protocols such as TCP do not support high-speed, high-throughput transmission on long-distance networks; legacy TCP lacks the ability to efficiently transport data between distant data centers. In this paper, an adaptive Fast TCP is proposed to move data faster between data centers of cloud services. The proposed approach adapts the operational parameter of Fast TCP connections to improve the throughput and responsiveness of TCP connections. Numeric results show that the throughput is improved by 2.5 times and that the rapid response of flow control eliminates packet loss. The better data mobility contributes to the service quality of travelling users in clouds.
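For the entry above, the sketch below shows the well-known FAST TCP congestion-window update together with a simple proportional rule for adapting the parameter alpha. The abstract only states that an operational parameter is adapted; the adaptation rule, its bounds, and the numeric values here are assumptions, not the paper's mechanism.

```python
def fast_tcp_window(w, base_rtt, rtt, alpha, gamma=0.5):
    """Standard FAST TCP congestion-window update.

    w: current window (packets); alpha: target number of queued packets.
    """
    return min(2.0 * w, (1.0 - gamma) * w + gamma * (base_rtt / rtt * w + alpha))

def adapt_alpha(alpha, throughput, target_throughput, step=1.0,
                alpha_min=10.0, alpha_max=1000.0):
    """Illustrative adaptation of alpha; not the rule used in the paper."""
    if throughput < target_throughput:
        alpha = min(alpha + step, alpha_max)
    else:
        alpha = max(alpha - step, alpha_min)
    return alpha

# One round of updates on a long-distance path (illustrative numbers).
w, alpha = 100.0, 200.0
w = fast_tcp_window(w, base_rtt=0.1, rtt=0.12, alpha=alpha)
alpha = adapt_alpha(alpha, throughput=w / 0.12, target_throughput=1000.0)
```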

Proceedings ArticleDOI
01 Aug 2012
TL;DR: A new statistical automaton is proposed for traffic classification: applications are marked by multiple signatures during a flow training process and are then classified when their statistical results are reached.
Abstract: Traffic classification is crucial in many network and cloud applications, ranging from QoS enforcement and network monitoring to security and firewalls. In recent years, classification with deep packet inspection (DPI) has relied on exact matching against existing policy semantics. However, if the policy semantics change, the DPI classifier is no longer a workable traffic classifier. We propose a new statistical automaton for traffic classification. Applications are marked by multiple signatures during a flow training process, and they are then classified when their statistical results are reached. In the experiment, we evaluate the proposed method on five applications, which shows that our idea is feasible for network and cloud traffic classification.

Book ChapterDOI
19 Mar 2012
TL;DR: In the system developed in this study, the Received Signal Strength Indicator (RSSI) value of low-power active RFID (Radio Frequency Identification) is used for movement detection, and ZigBee wireless transmission technology is adopted for the reference nodes used in positioning detection.
Abstract: A real-time location system is not a brand-new technology. The most typical approach uses the global positioning system (GPS). However, GPS can only be used outdoors; it cannot work indoors or in environments with obstacles. Therefore, the development of indoor positioning technology is quite important. The system developed in this study makes use of the Received Signal Strength Indicator (RSSI) value of low-power active RFID (Radio Frequency Identification) for movement detection. In addition, it adopts ZigBee wireless transmission technology for the reference nodes used in positioning detection. Information gathered by the reference points is delivered to the server through the Internet, and all positioning information is computed by the server. The positioning algorithm uses the average values of the signals in its operation. The advantage is that comparing many averaged values across nearby nodes locates the closest reference node for the mobile device and reduces the multi-path interference caused by environmental factors. Positioning results can be accessed through a networked computer or a mobile device with WiFi functionality. The experimental results are more stable than those of other positioning algorithms, and the installation of the system is more convenient.
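For the positioning entry above, the sketch below illustrates the averaging idea described in the abstract: keep a moving average of recent RSSI samples per reference node and report the node with the strongest average as the closest one. The window size and the node identifiers are illustrative assumptions.

```python
from collections import defaultdict, deque

class NearestNodeLocator:
    """Average recent RSSI per reference node and report the strongest one.

    The averaging window (10 samples) is an illustrative choice, not a value
    taken from the paper.
    """
    def __init__(self, window=10):
        self.samples = defaultdict(lambda: deque(maxlen=window))

    def report(self, node_id, rssi_dbm):
        self.samples[node_id].append(rssi_dbm)

    def nearest_node(self):
        averages = {node: sum(vals) / len(vals)
                    for node, vals in self.samples.items() if vals}
        return max(averages, key=averages.get) if averages else None

locator = NearestNodeLocator()
for rssi in (-62, -60, -65):
    locator.report("ref-A", rssi)
for rssi in (-75, -71, -70):
    locator.report("ref-B", rssi)
print(locator.nearest_node())   # ref-A (strongest averaged RSSI)
```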

Book ChapterDOI
28 Nov 2012
TL;DR: The proposed algorithm combines neighborhood discriminant nearest feature line analysis and fractional cosine transform to extract the local discriminant features of the samples to form a new discriminant power criterion.
Abstract: A novel subspace learning algorithm based on nearest feature line in time-frequency domain is proposed in this paper. The proposed algorithm combines neighborhood discriminant nearest feature line analysis and fractional cosine transform to extract the local discriminant features of the samples. A new discriminant power criterion based on nearest feature line is also presented in this paper. Some experiments are implemented to evaluate the proposed algorithm and the experimental results demonstrate the efficiency of the proposed algorithm.

Proceedings ArticleDOI
07 Jul 2012
TL;DR: A novel subspace algorithm entitled Simplified Nearest feature Line Space (SNFLS) is proposed based on Nearest Feature Line (NFL) and has the same discriminant power as NFLS.
Abstract: In this paper, a novel subspace algorithm entitled Simplified Nearest Feature Line Space (SNFLS) is proposed based on the Nearest Feature Line (NFL). NFL space (NFLS) is a subspace learning method with desirable discriminant power. However, the computational complexity of NFLS is too high because all the feature lines are used. In the SNFLS algorithm, only some feature lines are chosen for learning. SNFLS has the same discriminant power as NFLS while having a lower computational complexity. Experimental results confirm its efficiency.

Proceedings ArticleDOI
16 Nov 2012
TL;DR: The experimental results demonstrate the efficiency of the proposed AWNFSA, a Nearest Feature Space (NFS) based subspace learning approach that evaluates the effect of the two scatters on classification by choosing their weights adaptively.
Abstract: In this paper, a new feature extraction algorithm named Adaptive Weighted Nearest Feature Space Analysis (AWNFSA) is proposed. AWNFSA is a Nearest Feature Space (NFS) based subspace learning approach. In the Discriminant Nearest Feature Space Analysis (DNFSA) algorithm based on NFS, the result may be misclassified when the between-class scatter is very large or the within-class scatter is very small. Different from DNFSA, AWNFSA evaluates the effect of the two scatters on classification by choosing their weights adaptively. The proposed AWNFSA is applied to image classification on the ORL face database. The experimental results demonstrate the efficiency of the proposed AWNFSA.

Book ChapterDOI
28 Nov 2012
TL;DR: Numeric results show that the proposed EM benefits the discovery of optimal solutions in a large solution space, the approach to optimal solutions is more stable and faster, and the search guidance derived from the chromosome similarity is critical to the improvements of optimal solution discovery.
Abstract: In this paper, a new genetic algorithm with elite mutation is proposed for optimization problems. The proposed elite mutation scheme (EM) improves traditional genetic algorithms with a better ability to locate and approach optimal solutions quickly, even for huge data sets. The proposed EM selects elite chromosomes and mutates according to the similarity between the elite chromosomes and the selected chromosomes. The designed similarity effectively guides the search toward optimal solutions in fewer generations. The proposed EM is applied to optimize the cruise area of mobile sinks in hierarchical wireless sensor networks (WSNs). Numeric results show that (1) the proposed EM benefits the discovery of optimal solutions in a large solution space; (2) the approach to optimal solutions is more stable and faster; and (3) the search guidance derived from the chromosome similarity is critical to the improvements of optimal solution discovery. Besides, the minimization of the cruise area is shown to have the advantages of energy saving, time saving and reliable data collection in WSNs.
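The similarity-guided mutation rule is only outlined in the abstract above, so the sketch below is an illustrative reading under explicit assumptions: binary chromosomes, Hamming similarity to the best elite, and a mutation rate that increases for chromosomes very similar to the elite so that near-duplicates are perturbed more strongly. The paper's exact rule may differ.

```python
import numpy as np

def hamming_similarity(a, b):
    """Fraction of matching genes between two binary chromosomes."""
    return float(np.mean(a == b))

def elite_mutation(population, fitness, elite_frac=0.1, base_rate=0.01,
                   rng=None):
    """Elite-similarity-guided mutation (illustrative sketch, not the paper's rule).

    population: (n, genes) array of 0/1 values; fitness is maximised.
    """
    rng = np.random.default_rng() if rng is None else rng
    pop = population.copy()
    order = np.argsort(fitness)[::-1]                 # best first
    n_elite = max(1, int(elite_frac * len(pop)))
    elite = pop[order[0]]
    for idx in order[n_elite:]:                       # keep elites unchanged
        rate = base_rate + base_rate * hamming_similarity(pop[idx], elite)
        flips = rng.random(pop.shape[1]) < rate
        pop[idx] = np.where(flips, 1 - pop[idx], pop[idx])
    return pop
```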

Book ChapterDOI
19 Mar 2012
TL;DR: Two novel image feature extraction algorithms based on directional filter banks and the nearest feature line are proposed, named Single Directional Feature Line Discriminant Analysis (SD-NFDA) and Multiple Directional Feature Discriminant Line Analysis (MD-NFDA).
Abstract: In this paper, two novel image feature extraction algorithms based on directional filter banks and the nearest feature line are proposed, named Single Directional Feature Line Discriminant Analysis (SD-NFDA) and Multiple Directional Feature Discriminant Line Analysis (MD-NFDA). SD-NFDA and MD-NFDA extract not only the statistical features of the samples but also their directional features. SD-NFDA and MD-NFDA achieve a higher average recognition rate with less running time than other nearest feature line based feature extraction algorithms. Experimental results confirm the advantages of SD-NFDA and MD-NFDA.