
Showing papers by "Jeng-Shyang Pan published in 2015"


Journal ArticleDOI
TL;DR: This paper begins with an overview of WSNs and the deployment problem (DP), followed by discussions of metaheuristics and how to use them to solve the DP, and gives a comprehensive comparison of metaheuristics for the DP.

66 citations


Journal ArticleDOI
TL;DR: A new scheme for predetermining optimized routing paths based on the enhanced parallel cat swarm optimization (EPCSO) is proposed; this is the first work in which the EPCSO is employed to provide a routing scheme for a WSN.
Abstract: The wireless sensor network (WSN) is composed of a set of sensor nodes and is well suited to large-scale deployment in the environment for a variety of applications. Recent advances in WSNs have led to many new protocols designed specifically to reduce the power consumption of sensor nodes. In this paper, a new scheme for predetermining optimized routing paths is proposed based on the enhanced parallel cat swarm optimization (EPCSO). This is the first work in which the EPCSO is employed to provide a routing scheme for a WSN. The experimental results indicate that the EPCSO is capable of generating a set of predetermined paths and of selecting a balanced path for every sensor node to forward the packets of interest. In addition, a scheme for deploying the sensor nodes based on their payload and their distance to the sink node is presented to extend the life cycle of the WSN. A simulation is given, and the results obtained by the EPCSO are compared with AODV, the LD method based on ACO, and the LD method based on CSO. The simulation results indicate that the proposed method reduces power consumption by more than 35% on average.
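As a rough illustration of the path objective described above (balancing hop cost against the remaining energy of relay nodes), the following sketch scores candidate hops with an energy-penalized cost and extracts a path with Dijkstra's algorithm. It is a minimal stand-in for the EPCSO-generated predetermined paths, not the EPCSO itself; the toy graph, the energy values, and the weight `alpha` are invented for illustration.

```python
import heapq

def energy_aware_path(adj, energy, src, sink, alpha=0.5):
    """Dijkstra over a WSN graph with an energy-penalized edge cost.

    adj:    {node: [(neighbor, link_cost), ...]}
    energy: {node: remaining energy in [0, 1]}
    Edge cost grows as the relay's remaining energy shrinks, so paths
    avoid draining nearly exhausted nodes (a crude balance objective).
    """
    dist = {src: 0.0}
    prev = {}
    heap = [(0.0, src)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == sink:
            break
        if d > dist.get(u, float("inf")):
            continue
        for v, c in adj[u]:
            # penalize hops through low-energy relays
            cost = c * (1.0 + alpha * (1.0 - energy[v]))
            nd = d + cost
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                prev[v] = u
                heapq.heappush(heap, (nd, v))
    path, node = [], sink
    while node != src:
        path.append(node)
        node = prev[node]
    return [src] + path[::-1]

# toy 4-node network: node 2 is nearly drained, so the path detours via 1
adj = {0: [(1, 1.0), (2, 1.0)], 1: [(3, 1.0)], 2: [(3, 1.0)], 3: []}
energy = {0: 1.0, 1: 0.9, 2: 0.05, 3: 1.0}
print(energy_aware_path(adj, energy, 0, 3, alpha=2.0))  # -> [0, 1, 3]
```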

53 citations


Journal ArticleDOI
TL;DR: Numerical experiments show that this steganography algorithm provides a novel data embedding domain and high information security, while high perceived quality of the stego-image can be guaranteed.
Abstract: A new image steganography algorithm combining compressive sensing with subsampling is proposed, which can hide a secret message in an innovative embedding domain. Since natural images tend to be compressible in a transform domain, the characteristics of compressive sensing (CS), namely dimensionality reduction and random projection, are utilized to insert the secret message into the compressive sensing transform domain of the sparse image, and the measurement matrix, which is generated using a secret key, is shared between sender and receiver. The stego-image is then reconstructed approximately via a Total Variation (TV) minimization algorithm. By adopting different transform coefficients in the sub-images obtained by subsampling, high perceived quality of the stego-image can be guaranteed. The Bit Correction Rate (BCR) between the original secret message and the extracted message is used to measure the accuracy of the method. Numerical experiments show that this steganography algorithm provides a novel data embedding domain and high information security.
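A minimal sketch of the measurement-domain idea: a key-seeded random matrix projects the signal (a stand-in for the vectorized sparse image) to CS measurements, and secret bits are embedded in selected measurements by quantization index modulation. The TV-minimization reconstruction of the stego-image is omitted, and the QIM step size `delta` and the embedding positions are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def cs_embed_extract(x, bits, key=42, delta=0.8):
    """Embed bits into compressive-sensing measurements via QIM.

    x:    1-D signal (stand-in for a vectorized sparse image)
    bits: list of 0/1 secret bits, one per reused measurement
    The sender and receiver share `key`, so both can regenerate the
    same Gaussian measurement matrix Phi.
    """
    rng = np.random.default_rng(key)
    m = max(len(bits), len(x) // 4)          # number of measurements
    phi = rng.normal(size=(m, len(x))) / np.sqrt(m)
    y = phi @ x                               # CS measurements

    # QIM embed: snap measurement i onto the lattice coset for bits[i]
    y_stego = y.copy()
    for i, b in enumerate(bits):
        y_stego[i] = delta * np.round((y[i] - b * delta / 2) / delta) + b * delta / 2

    # extraction (receiver side): the nearest coset decides the bit
    extracted = [int(abs(yi / delta - np.round(yi / delta)) > 0.25)
                 for yi in y_stego[:len(bits)]]
    return y_stego, extracted

x = np.random.default_rng(0).normal(size=64)
_, out = cs_embed_extract(x, [1, 0, 1, 1, 0])
print(out)  # -> [1, 0, 1, 1, 0]
```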

44 citations


Journal ArticleDOI
TL;DR: Two improved classifiers based on the nearest feature line (NFL) are proposed for image classification; the neighborhood feature line segment-I (NFLS-I) and NFLS-II classifiers outperform the original NFL and several other improved NFL classifiers for object recognition, hand posture recognition, and face recognition.
Abstract: In this paper, two improved classifiers based on the nearest feature line (NFL) are proposed for image classification, where the neighborhood feature line segment-I (NFLS-I) classifier uses the neighborhood of the prototypes to select better-fitted feature lines (FLs) and the neighborhood feature line segment-II (NFLS-II) classifier utilizes the neighborhood of the query sample to choose more likely FLs. With this better selection of FLs, the two classifiers both improve recognition performance and reduce the computational burden. A large number of experiments on the COIL-100 object database, the Yale face database, the FEI face database, the AR face database, and the Jochen Triesch static hand posture database are performed to evaluate the two proposed classifiers. The experimental results demonstrate that the proposed NFLS-I and NFLS-II classifiers outperform the original NFL and several other improved NFL classifiers for object recognition, hand posture recognition, and face recognition.
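For context, the core NFL primitive classifies a query by its distance to the line through each pair of same-class prototypes. Below is a minimal sketch of that primitive only; the NFLS-I/II neighborhood selection rules, which are the paper's contribution, are not reproduced, and the toy data is invented.

```python
import numpy as np
from itertools import combinations

def nfl_classify(query, prototypes, labels):
    """Nearest-feature-line classification.

    For every pair of prototypes sharing a class, project the query
    onto the (full, unbounded) line through them and record the
    projection distance; the class owning the closest feature line wins.
    """
    best_label, best_dist = None, np.inf
    for i, j in combinations(range(len(prototypes)), 2):
        if labels[i] != labels[j]:
            continue
        a, b = prototypes[i], prototypes[j]
        d = b - a
        t = np.dot(query - a, d) / np.dot(d, d)   # position along the line
        foot = a + t * d                          # projection point
        dist = np.linalg.norm(query - foot)
        if dist < best_dist:
            best_label, best_dist = labels[i], dist
    return best_label

X = np.array([[0.0, 0.0], [1.0, 0.0], [0.0, 2.0], [1.0, 2.0]])
y = ["cat", "cat", "dog", "dog"]
print(nfl_classify(np.array([0.5, 0.4]), X, y))  # -> "cat"
```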

26 citations


Book ChapterDOI
01 Jan 2015
TL;DR: The algorithm proposed in this paper can enhance system throughput and schedule tasks to virtual machines (VMs) more efficiently, so that the finishing time of all tasks in the system is lower than that of other algorithms.
Abstract: The rapid development of cloud computing and the Internet makes load balancing techniques more significant than ever. A good scheduling algorithm is the key to solving load balancing problems; it must not only balance the load but also meet users' needs. An optimal load balancing algorithm is proposed in this paper. The proposed algorithm can enhance system throughput and schedule tasks to virtual machines (VMs) more efficiently, so that the finishing time of all tasks in the system is lower than that of other algorithms. The simulation tool is CloudSim.
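The abstract does not spell out the scheduling rule, so the sketch below shows the standard greedy baseline such algorithms build on: assigning each task to the VM that would finish it earliest, tracking per-VM ready times. Task lengths and VM speeds are invented for illustration, and this is not the chapter's exact algorithm.

```python
def schedule_min_finish(tasks, vm_speeds):
    """Greedy earliest-finish-time scheduling.

    tasks:     list of task lengths (e.g., million instructions)
    vm_speeds: list of VM speeds (instructions per second)
    Returns per-task assignments and the overall makespan.
    """
    ready = [0.0] * len(vm_speeds)              # when each VM becomes free
    plan = []
    for length in sorted(tasks, reverse=True):  # longest tasks first
        finish = [ready[i] + length / s for i, s in enumerate(vm_speeds)]
        vm = min(range(len(vm_speeds)), key=finish.__getitem__)
        ready[vm] = finish[vm]
        plan.append((length, vm))
    return plan, max(ready)

plan, makespan = schedule_min_finish([40, 10, 30, 20, 25], [1.0, 2.0])
print(plan)       # (task length, assigned VM) pairs
print(makespan)   # overall completion time
```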

23 citations


Book ChapterDOI
01 Jan 2015
TL;DR: The algorithm focuses on the diversity of the fish's locations rather than on the velocity with which a fish swims from its current location to a better one; the results show that ETFA has a faster convergence rate with excellent accuracy.
Abstract: The Ebb-Tide-Fish Algorithm (ETFA) is a simple but powerful optimization algorithm over continuous search spaces, inspired by the foraging behavior of fish at ebb tide. This fascinating creature often caught the author's attention on walks along the beach and came to mind while the author was studying how to improve existing optimization algorithms. The algorithm focuses on the diversity of the fish's locations rather than on the velocity with which a fish swims from its current location to a better one. The paper gives a formulation of the foraging behavior of the fish, together with a detailed model. The performance of ETFA on a testbed of four functions is compared with several well-known published methods. The final results show that ETFA has a faster convergence rate with excellent accuracy.
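The chapter's exact update rule is not quoted here, so the sketch below captures only the stated principle of position diversity without velocity terms: each "fish" resamples its location around its own spot or the best-known spot instead of integrating a velocity as particle swarms do. The population size, step schedule, foraging probability, and test function are all illustrative assumptions.

```python
import numpy as np

def etfa_like_minimize(f, dim, n_fish=20, iters=200, bound=5.0, seed=1):
    """Velocity-free, position-diversity search in the spirit of ETFA.

    Each fish either forages locally around its own position or relocates
    near the best-known position with a random perturbation; no velocity
    state is kept, so diversity comes purely from the sampled positions.
    """
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-bound, bound, size=(n_fish, dim))
    fit = np.apply_along_axis(f, 1, pos)
    ibest = int(fit.argmin())
    best, fbest = pos[ibest].copy(), float(fit[ibest])
    for t in range(iters):
        step = bound * (1 - t / iters)              # shrinking forage radius
        for i in range(n_fish):
            if rng.random() < 0.5:                  # local foraging
                cand = pos[i] + rng.normal(scale=step, size=dim)
            else:                                   # jump near the best spot
                cand = best + rng.normal(scale=0.1 * step, size=dim)
            fc = f(cand)
            if fc < fit[i]:
                pos[i], fit[i] = cand, fc
                if fc < fbest:
                    best, fbest = cand.copy(), fc
    return best, fbest

sphere = lambda x: float(np.sum(x * x))
print(etfa_like_minimize(sphere, dim=5))   # best point found near the origin
```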

17 citations


Journal ArticleDOI
12 Jun 2015-PLOS ONE
TL;DR: A hiding method based on evolutionary multi-objective optimization (EMO) is proposed, which performs the hiding task by selectively inserting items into the database to decrease the confidence of sensitive rules below specified thresholds.
Abstract: During business collaboration, partners may benefit from sharing data. People may use data mining tools to discover useful relationships in shared data. However, some relationships are sensitive to the data owners, who may wish to conceal them before sharing. In this paper, we address this problem in the form of association rule hiding. A hiding method based on evolutionary multi-objective optimization (EMO) is proposed, which performs the hiding task by selectively inserting items into the database to decrease the confidence of sensitive rules below specified thresholds. The side effects generated during the hiding process are taken as optimization goals to be minimized. HypE, a recently proposed EMO algorithm, is utilized to identify promising transactions for modification so as to minimize side effects. Results on real datasets demonstrate that the proposed method can effectively perform sanitization with less damage to the non-sensitive knowledge in most cases.
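To make the mechanism concrete: the confidence of a rule X -> Y is supp(X u Y) / supp(X), so inserting X's items into transactions that support neither the antecedent nor the consequent raises supp(X) without raising supp(X u Y), pushing the confidence down. A minimal sketch of that arithmetic follows; note the transaction selection here is naive first-found order, whereas the paper uses HypE to pick transactions that minimize side effects.

```python
def confidence(db, antecedent, consequent):
    """conf(X -> Y) = supp(X u Y) / supp(X) over a list-of-sets database."""
    sx = sum(1 for t in db if antecedent <= t)
    sxy = sum(1 for t in db if antecedent | consequent <= t)
    return sxy / sx if sx else 0.0

def hide_rule_by_insertion(db, antecedent, consequent, threshold):
    """Insert X's items into transactions until conf(X -> Y) < threshold."""
    for t in db:
        if confidence(db, antecedent, consequent) < threshold:
            break
        if not (antecedent <= t or consequent <= t):
            t |= antecedent          # raises supp(X) but not supp(X u Y)
    return db

db = [{"a", "b"}, {"a", "b"}, {"a"}, {"c"}, {"c", "d"}]
X, Y = {"a"}, {"b"}
print(confidence(db, X, Y))                  # 0.667 before hiding
hide_rule_by_insertion(db, X, Y, threshold=0.5)
print(confidence(db, X, Y))                  # 0.4 after insertion
```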

16 citations


Journal ArticleDOI
TL;DR: A novel dual watermarking algorithm based on the Fractional Fourier Transform (FRFT) and digital watermarking techniques is proposed in this paper; it shows good performance in both robustness and fragility in the experiments.
Abstract: A novel dual watermarking algorithm based on the Fractional Fourier Transform (FRFT) and digital watermarking techniques is proposed in this paper. The 0/1 sequence is mapped into two different random sequences to realize the robust watermarking process. The gray relational analysis method, the easy blocking method, and the hierarchical embedding method are used here. Good performance in both robustness and fragility in the experiments shows the efficiency of the proposed method. Future research directions are mentioned in the conclusion.
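Mapping a 0/1 watermark onto two key-seeded random sequences is the classic spread-spectrum construction, sketched below in the raw signal domain for brevity (the paper embeds in the FRFT domain, which is omitted here). The sequence length, embedding strength, and correlation detector are illustrative assumptions.

```python
import numpy as np

def make_sequences(key, length):
    """Two shared +/-1 pseudo-random carriers, one per watermark symbol."""
    rng = np.random.default_rng(key)
    s0 = rng.choice([-1.0, 1.0], size=length)
    s1 = rng.choice([-1.0, 1.0], size=length)
    return s0, s1

def embed(host, bits, key, strength=0.8):
    """Add the carrier for each bit onto consecutive host segments."""
    seg = len(host) // len(bits)
    s0, s1 = make_sequences(key, seg)
    out = host.astype(float).copy()
    for i, b in enumerate(bits):
        out[i * seg:(i + 1) * seg] += strength * (s1 if b else s0)
    return out

def extract(marked, n_bits, key):
    """Correlate each segment with both carriers; the larger one wins."""
    seg = len(marked) // n_bits
    s0, s1 = make_sequences(key, seg)
    bits = []
    for i in range(n_bits):
        block = marked[i * seg:(i + 1) * seg]
        bits.append(int(np.dot(block, s1) > np.dot(block, s0)))
    return bits

host = np.random.default_rng(0).normal(size=400)
marked = embed(host, [1, 0, 1, 1], key=7)
print(extract(marked, 4, key=7))  # -> [1, 0, 1, 1]
```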

12 citations



Book ChapterDOI
01 Jan 2015
TL;DR: This paper presents a complex no-reference image quality assessment (NR IQA) algorithm that mainly consists of two steps: Gabor filters are used to obtain feature images with different frequencies and orientations, from which the energy and entropy features of each sub-image are extracted.
Abstract: With the development of computer vision, there has been an increasing need for objective quality measurement techniques that can predict image quality automatically. In this paper, we present a complex no-reference image quality assessment (NR IQA) algorithm that mainly consists of two steps. The first step uses Gabor filters to obtain feature images with different frequencies and orientations, from which the energy and entropy features of each sub-image are extracted. The second step uses linear least squares to obtain the parameters for IQA. We conduct experiments on the LIVE IQA database to verify our method. The experimental results show that the proposed method is highly competitive with state-of-the-art full-reference (FR) and NR algorithms.
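A minimal sketch of the two-step pipeline under stated assumptions: a small bank of hand-built Gabor kernels filters the image, energy and histogram entropy are computed per response image, and a linear model is fit to quality scores with least squares. The kernel parameters, feature set, and toy training data are illustrative, not the paper's exact configuration.

```python
import numpy as np
from scipy.signal import convolve2d

def gabor_kernel(theta, lam=8.0, sigma=4.0, size=15):
    """Real Gabor kernel at orientation theta and wavelength lam."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    return np.exp(-(xr**2 + yr**2) / (2 * sigma**2)) * np.cos(2 * np.pi * xr / lam)

def gabor_features(img, n_orient=4):
    """Energy and histogram entropy of each Gabor response image."""
    feats = []
    for k in range(n_orient):
        resp = convolve2d(img, gabor_kernel(np.pi * k / n_orient), mode="same")
        energy = float(np.mean(resp**2))
        hist, _ = np.histogram(resp, bins=32, density=True)
        p = hist[hist > 0]
        p = p / p.sum()
        entropy = float(-np.sum(p * np.log2(p)))
        feats.extend([energy, entropy])
    return np.array(feats)

# toy regression: fit quality scores from features via linear least squares
rng = np.random.default_rng(0)
imgs = [rng.normal(size=(64, 64)) * s for s in (0.5, 1.0, 1.5, 2.0)]
scores = np.array([80.0, 60.0, 40.0, 20.0])          # pretend subjective scores
X = np.stack([gabor_features(im) for im in imgs])
X = np.hstack([X, np.ones((len(X), 1))])             # bias term
w, *_ = np.linalg.lstsq(X, scores, rcond=None)
print(X @ w)   # fitted quality predictions for the toy set
```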

11 citations


Book ChapterDOI
01 Jan 2015
TL;DR: An innovative algorithm of driving behavior analysis based on AdaBoost with a variety of driving operation and traffic information to monitor driver’s driving operation behavior, including steering wheel angle, brake force, and throttle position is proposed.
Abstract: With the increase in the number of private cars as well as the non-professional drivers, the current traffic environment is in urgent need of driving assist equipment to timely reminder and to rectify the incorrect driving behavior. In order to meet this requirement, this paper proposes an innovative algorithm of driving behavior analysis based on AdaBoost with a variety of driving operation and traffic information. The proposed driving behavior analysis algorithm will mainly monitor driver’s driving operation behavior, including steering wheel angle, brake force, and throttle position. To increase the accuracy of driving behavior analysis, the proposed algorithm also takes road conditions into account. The proposed will make use of AdaBoost to create a driving behavior classification model in various different road conditions, and then could determine whether the current driving behavior belongs to safe driving. Experimental results show the correctness of the proposed driving behavior analysis algorithm can achieve average 80% accuracy in various driving simulations. The proposed algorithm has the potential of applying to real-world driver assistance system.
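A minimal sketch of the classification step, assuming scikit-learn's AdaBoost implementation and synthetic feature vectors of (steering wheel angle, brake force, throttle position, road condition). The labeling rule that generates the toy data is invented purely to make the example runnable; it is not the paper's ground truth.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 2000

# synthetic samples: steering angle (deg), brake force, throttle, road code
X = np.column_stack([
    rng.normal(0, 30, n),        # steering wheel angle
    rng.uniform(0, 1, n),        # brake force
    rng.uniform(0, 1, n),        # throttle position
    rng.integers(0, 3, n),       # road condition (0=dry, 1=wet, 2=icy)
])

# toy ground truth: hard steering plus full throttle on bad roads is unsafe
unsafe = (np.abs(X[:, 0]) > 45) & (X[:, 2] > 0.7) & (X[:, 3] > 0)
y = unsafe.astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = AdaBoostClassifier(n_estimators=100, random_state=0).fit(X_tr, y_tr)
print(f"test accuracy: {clf.score(X_te, y_te):.3f}")
```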

Journal ArticleDOI
TL;DR: This study proposes a new method that hides sensitive rules by removing items from a database to reduce the support or confidence of sensitive rules below specified thresholds, choosing suitable candidates for modification so as to reduce the side effects and the degree of data distortion.
Abstract: Association rule mining is a powerful data mining tool that can discover unknown patterns in large volumes of data. However, people often face the risk of disclosing sensitive information when data is shared with different organizations, as association rule mining techniques may be improperly used to find sensitive patterns that the owner is unwilling to disclose. One of the great challenges in association rule mining is how to protect the confidentiality of sensitive patterns when data is released. Association rule hiding refers to sanitizing a database so that certain sensitive association rules cannot be mined from the released database. In this study, we propose a new method that hides sensitive rules by removing items from the database to reduce the support or confidence of sensitive rules below specified thresholds. Based on the information about positive border rules and negative border rules contained in transactions, the proposed method chooses suitable candidates for modification so as to reduce the side effects and the degree of data distortion. Comparative experiments on real and synthetic datasets demonstrate that the proposed method can hide sensitive rules with far fewer side effects and database modifications.

Keywords: association rule hiding; side effects; border rules
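This method is the deletion-side dual of the insertion approach sketched earlier: removing the consequent's items from transactions that support the full rule lowers supp(X u Y), and with it both support and confidence. A minimal sketch follows; the border-based candidate selection that is the paper's actual contribution is replaced here by naive first-found order.

```python
def support(db, itemset):
    """Fraction of transactions containing the itemset."""
    return sum(1 for t in db if itemset <= t) / len(db)

def hide_rule_by_deletion(db, antecedent, consequent, min_conf):
    """Remove Y's items from supporting transactions until conf(X -> Y) < min_conf."""
    for t in db:
        sx = sum(1 for u in db if antecedent <= u)
        sxy = sum(1 for u in db if antecedent | consequent <= u)
        if sx == 0 or sxy / sx < min_conf:
            break
        if (antecedent | consequent) <= t:
            t -= consequent          # lowers supp(X u Y) only
    return db

db = [{"a", "b"}, {"a", "b"}, {"a", "b"}, {"a"}, {"c"}]
hide_rule_by_deletion(db, {"a"}, {"b"}, min_conf=0.5)
print(support(db, {"a", "b"}))   # 0.2: supp(a,b) after sanitization
```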

Journal ArticleDOI
TL;DR: A novel reversible watermarking scheme based on a local smoothness estimator and a multi-step embedding strategy is proposed that can obtain high embedding capacity while maintaining good visual quality; the experimental results demonstrate that the proposed method is effective.
Abstract: A novel reversible watermarking (RW) scheme based on a local smoothness estimator and a multi-step embedding strategy is proposed in this paper. All the pixels are divided into four equal parts, and the watermark embedding process is correspondingly separated into four independent steps, each embedding watermark information into its corresponding image part. In each step, for each to-be-embedded pixel, a local smoothness estimator, defined as the variance of all its neighbors, is used to estimate its local smoothness. An obvious advantage of this estimator is that it accurately identifies pixels in smooth regions; accurate identification in turn reduces embedding distortion. At a low embedding rate (ER), the modifications induced by difference expansion (DE) are applied only to pixels located in smooth regions, so the proposed method obtains high embedding capacity while maintaining good visual quality. As the ER increases, adaptive embedding is employed: for each to-be-embedded pixel, 1 or 2 bits are embedded adaptively according to the strength of the relationship among its surrounding pixels. The experimental results demonstrate that the proposed method is effective.
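A minimal sketch of the two primitives named above, on a 1-D toy signal: the smoothness test (variance of a pixel's neighbors) gates where embedding happens, and difference expansion doubles the prediction error and appends the bit. Overflow handling, the four-part multi-step schedule, and 2-bit adaptive embedding are omitted; the threshold and data are illustrative.

```python
import numpy as np

def embed_de(pixels, bits, var_threshold=4.0):
    """Difference-expansion embedding gated by neighbor variance (1-D toy).

    For sample i, the 'neighbors' are pixels[i-1] and pixels[i+1]; if their
    variance is below var_threshold the region counts as smooth, and one
    bit is embedded by expanding the prediction error.
    """
    out = pixels.astype(int).copy()
    positions, k = [], 0
    for i in range(1, len(pixels) - 1, 2):      # step 2 keeps neighbors unmodified
        if k >= len(bits):
            break
        nb = pixels[[i - 1, i + 1]]
        if np.var(nb) < var_threshold:          # smooth-region test
            pred = int(nb.mean())
            err = int(pixels[i]) - pred
            out[i] = pred + 2 * err + bits[k]   # expanded error carries the bit
            positions.append(i)
            k += 1
    return out, positions

def extract_de(marked, positions):
    """Recover bits and restore the original pixels (inverse of embed_de)."""
    restored = marked.copy()
    bits = []
    for i in positions:
        pred = int(marked[[i - 1, i + 1]].mean())
        d = int(marked[i]) - pred
        bits.append(d & 1)                      # LSB of the expanded error
        restored[i] = pred + (d >> 1)           # undo the expansion
    return bits, restored

px = np.array([100, 101, 100, 102, 150, 100, 101, 99, 100])
marked, pos = embed_de(px, [1, 0])
bits, restored = extract_de(marked, pos)
print(bits, np.array_equal(restored, px))      # -> [1, 0] True
```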

Proceedings ArticleDOI
18 Nov 2015
TL;DR: This paper presents an improved A* algorithm for dynamic urban traffic that gives an optimal path in a real-time traffic environment and solves the congestion problem more efficiently.
Abstract: With the development of the car industry, the number of vehicles traveling on the roads is ever increasing, leading to traffic congestion, which causes air pollution, driver frustration, and meaningless fuel consumption. An excellent navigation algorithm is urgently needed, and some solutions have been put forward to tackle the problem. However, existing studies are mostly devised for static networks and are not effective when applied in real dynamic environments. This paper presents an improved A* algorithm for dynamic urban traffic that gives an optimal path in a real-time traffic environment. This algorithm not only tackles the congestion problem but also solves it more efficiently.
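The simplest way to make A* traffic-aware is to let each edge's cost depend on the arrival time at its tail node, so congestion that builds up along the way is priced into the search. A sketch of that idea follows; the road layout, travel-time function, and heuristic are all invented for illustration, and the paper's specific improvements are not reproduced.

```python
import heapq

def a_star_dynamic(edges, travel_time, h, src, dst, t0=0.0):
    """A* where edge cost depends on the departure time at the tail node.

    edges:       {node: [successor, ...]}
    travel_time: f(u, v, t) -> minutes to traverse (u, v) when leaving u at t
    h:           admissible lower bound on remaining minutes to dst
    """
    g = {src: t0}
    prev = {}
    open_set = [(h(src), src)]
    while open_set:
        _, u = heapq.heappop(open_set)
        if u == dst:
            path = [u]
            while u != src:
                u = prev[u]
                path.append(u)
            return path[::-1], g[dst] - t0
        for v in edges[u]:
            t_arr = g[u] + travel_time(u, v, g[u])
            if t_arr < g.get(v, float("inf")):
                g[v], prev[v] = t_arr, u
                heapq.heappush(open_set, (t_arr + h(v), v))
    return None, float("inf")

# toy network; edge B->D jams after t=3 (rush hour on that street)
edges = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
def travel_time(u, v, t):
    return 10.0 if (u, v) == ("B", "D") and t >= 3 else 4.0

path, minutes = a_star_dynamic(edges, travel_time, lambda n: 0.0, "A", "D")
print(path, minutes)   # -> ['A', 'C', 'D'] 8.0: congestion pushes the route via C
```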

Journal Article
TL;DR: In this article, a hierarchical gradient diffusion algorithm is proposed to solve the transmission problem and the sensor nodes' loading problem by adding several relay nodes and arranging the sensor nodes' routing paths.
Abstract: In this paper, a hierarchical gradient diffusion algorithm is proposed to solve the transmission problem and the sensor nodes' loading problem by adding several relay nodes and arranging each sensor node's routing path. The proposed hierarchical gradient diffusion aims to balance the sensor nodes' transmission loading, enhance the sensor nodes' lifetime, and reduce the data packet transmission loss rate. According to the experimental results, the proposed algorithm not only reduces power consumption by about 12% but also decreases the data loss rate by 85.5% and increases the number of active nodes by about 51.7%.
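In gradient diffusion, every node first learns its hop distance (gradient) to the sink and then forwards packets to a neighbor with a smaller gradient. Below is a minimal sketch of that gradient-building step via breadth-first search; the hierarchy of added relay nodes and the load balancing that are the paper's contribution are not modeled, and the toy topology is invented.

```python
from collections import deque

def build_gradients(adj, sink):
    """Hop-count gradient to the sink for every reachable node (BFS)."""
    grad = {sink: 0}
    queue = deque([sink])
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in grad:
                grad[v] = grad[u] + 1
                queue.append(v)
    return grad

def next_hop(adj, grad, node):
    """Forward toward any neighbor strictly closer to the sink."""
    return min((v for v in adj[node] if grad[v] < grad[node]),
               key=grad.get, default=None)

adj = {  # undirected toy WSN: sink 'S' plus five sensor nodes
    "S": ["a", "b"], "a": ["S", "c"], "b": ["S", "c", "d"],
    "c": ["a", "b", "e"], "d": ["b"], "e": ["c"],
}
grad = build_gradients(adj, "S")
print(grad)                      # hop distance of each node to the sink
print(next_hop(adj, grad, "e"))  # 'e' relays via 'c'
```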

Journal ArticleDOI
TL;DR: A novel bilinear discriminant feature line analysis (BDFLA) is proposed for image feature extraction that aims to minimise the within-class scatter and maximise the between-class scatter based on a two-dimensional NFL.
Abstract: A novel bilinear discriminant feature line analysis (BDFLA) is proposed for image feature extraction. The nearest feature line (NFL) is a powerful classifier, and some NFL-based subspace algorithms were introduced recently. In most of the classical NFL-based subspace learning approaches, the input samples are vectors, so for image classification tasks the image samples must first be transformed into vectors. This process induces high computational complexity and may also lead to loss of the geometric structure of the samples. The proposed BDFLA is a matrix-based algorithm. It aims to minimise the within-class scatter and maximise the between-class scatter based on a two-dimensional (2D) NFL. Experimental results on two image databases confirm its effectiveness.

Journal ArticleDOI
TL;DR: This study first integrates an entropy-based embedding technique and the SNR into an optimization problem; the results verify a better SNR and strong robustness against most signal processing operations and attacks.
Abstract: This study presents an optimization-based audio watermarking method using an entropy-based watermarking technique in the wavelet domain. In general, the performance of a watermarking system is measured in terms of signal-to-noise ratio (SNR) and bit error rate (BER); however, there is a tradeoff between them, which poses a challenge in the watermarking field. To overcome this challenge, this study first integrates the entropy-based embedding technique and the SNR into an optimization problem. Because of the uncertain service environment of audio media, an optimization algorithm with low hardware requirements is needed to solve this problem. Since both the performance index and the constraint are nonlinear, compact particle swarm optimization (cPSO), which suits embedded systems, can handle this well. In addition, the hidden information can be extracted without knowledge of the original audio. In the experiments, the performance of the proposed method is tested, and the results verify a better SNR and strong robustness against most signal processing operations and attacks.
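The point of a compact optimizer is the low memory footprint mentioned above: instead of storing a swarm, only a per-dimension probabilistic model is kept. The sketch below follows the compact-optimization family (a real-valued cGA-style model update inside a winner/loser competition); the audio watermarking objective is replaced by a sphere function, and the virtual population size and bounds are illustrative assumptions.

```python
import numpy as np

def cpso_minimize(f, dim, iters=3000, seed=0):
    """Compact PSO-style optimizer: a probabilistic model replaces the swarm.

    Only a per-dimension mean/std pair is kept (the 'virtual population');
    each iteration samples one candidate, lets it compete with the
    incumbent best, and nudges the model toward the winner. Memory is
    O(dim), which is the point for embedded targets.
    """
    rng = np.random.default_rng(seed)
    mu = np.zeros(dim)          # model mean
    sigma = np.full(dim, 3.0)   # model spread (shrinks as search converges)
    best = rng.normal(mu, sigma)
    fbest = f(best)
    lr = 1.0 / 50.0             # 1 / virtual population size
    for _ in range(iters):
        cand = rng.normal(mu, sigma)
        fc = f(cand)
        winner, loser = (cand, best) if fc < fbest else (best, cand)
        if fc < fbest:
            best, fbest = cand, fc
        # move the model toward the winner, away from the loser (cGA-style)
        old_mu = mu.copy()
        mu = mu + lr * (winner - loser)
        sigma = np.sqrt(np.abs(sigma**2 + old_mu**2 - mu**2
                               + lr * (winner**2 - loser**2)))
        sigma = np.maximum(sigma, 1e-3)   # numerical floor
    return best, fbest

sphere = lambda x: float(np.sum(x * x))
x, fx = cpso_minimize(sphere, dim=4)
print(x, fx)   # best point found, near the origin
```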

Journal ArticleDOI
01 Nov 2015-Optik
TL;DR: The segmentation procedure includes two stages: the first extracts the texture features of each block based on Gabor filters, and the second classifies the texture features for segmentation based on a kernel self-optimization Fisher classifier.



Book ChapterDOI
26 Aug 2015
TL;DR: A new load balancing algorithm based on swarm intelligence is proposed that can enhance system throughput while scheduling tasks to VMs properly and achieve higher resource utilization.
Abstract: A good scheduling algorithm is the key to a load balancing system in which the system's load meets users' requirements. Here, a new load balancing algorithm based on swarm intelligence is proposed that can enhance system throughput while scheduling tasks to VMs properly. Task completion times are compared with those of some classical algorithms, and the results show that the proposed algorithm meets users' requirements and achieves higher resource utilization. The algorithm is well suited to large-area networks, which are simulated with CloudSim.

Book ChapterDOI
01 Jan 2015
TL;DR: A framework named DSD (Dynamic SQLIA Detection) is defined, and a concrete detection mechanism based on DSD is proposed to detect SQLIAs using parse trees; experiments demonstrate that the mechanism has higher accuracy and lower false positive and false negative rates.
Abstract: With the development of network technology, database-driven web applications (apps) provide flexible, convenient, and varied services for users, who can send requests to these web apps through a browser over the Internet to obtain services such as e-commerce, entertainment, and financial services. Although web environments have several advantages, they face various security threats. Among these threats, the SQL injection attack (SQLIA) is one of the most serious. SQLIA is a code injection attack that exploits security vulnerabilities in source code to attack databases; it allows attackers to bypass authentication, access private information, modify data, and even destroy databases. Since much of the sensitive and confidential data stored in databases must be kept private and secure, a mechanism to detect SQLIAs in web environments is necessary. In this paper, we define a framework named DSD (Dynamic SQLIA Detection) to counter SQLIAs in web environments. A concrete detection mechanism based on DSD is then proposed to detect SQLIAs using parse trees. The experimental results demonstrate that our mechanism has higher accuracy and lower false positive and false negative rates.
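The parse-tree idea can be approximated with a much cruder token-level check: build the query once with a benign placeholder and once with the real input, and flag an attack whenever the two token-type sequences differ, since injected quotes and keywords change the query's structure. A sketch under that simplification follows; the regex tokenizer stands in for a real SQL parser and is not the paper's mechanism.

```python
import re

# crude SQL tokenizer: string literal, number, identifier, or operator/punct
_TOKEN = re.compile(
    r"'(?:[^']|'')*'"      # string literal
    r"|\d+"                # number
    r"|[A-Za-z_]\w*"       # identifier / keyword
    r"|[^\sA-Za-z0-9']"    # single operator or punctuation char
)

def token_types(sql):
    """Sequence of coarse token types for a query string."""
    types = []
    for tok in _TOKEN.findall(sql):
        if tok.startswith("'"):
            types.append("STR")
        elif tok[0].isdigit():
            types.append("NUM")
        elif tok[0].isalpha() or tok[0] == "_":
            types.append("ID")
        else:
            types.append("OP")
    return types

def is_injection(build_query, user_input):
    """Compare query structure with a benign placeholder vs. the real input."""
    expected = token_types(build_query("x"))
    actual = token_types(build_query(user_input))
    return actual != expected

# a vulnerable query builder that naively concatenates user input
build = lambda name: f"SELECT * FROM users WHERE name = '{name}'"
print(is_injection(build, "alice"))            # False: same structure
print(is_injection(build, "x' OR '1'='1"))     # True: extra tokens appear
```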

Proceedings ArticleDOI
18 Nov 2015
TL;DR: An improved JPS algorithm is proposed that not only tackles the path-planning problem but also solves it more efficiently; experimental results show it outperforms the other algorithms.
Abstract: Path planning is one of the most studied problems in robotics, unmanned aerial vehicles (UAVs), vehicle navigation, and similar fields. Most path planning algorithms produce possible paths on a grid graph and then apply them to problems such as classical graph route searching. The A-star (A*) algorithm, Hierarchical Path-Finding A-star (HPA*), and Jump Point Search (JPS) algorithms are studied in this paper to compare their maze-searching capacity and their efficiency on different search maps. We also propose an improved JPS algorithm for symmetric grid graphs and compare the algorithms' search times and efficiency. Experiments conducted on the same benchmarks show that our algorithm not only tackles the problem but also solves it more efficiently; the results validate the improved JPS algorithm and show that it outperforms the other algorithms.

Journal ArticleDOI
TL;DR: A symmetric TMVP (STMVP) decomposition is presented, and Gaussian normal basis (GNB) multiplication based on the proposed architecture can be realized with reduced space complexity.
Abstract: Toeplitz matrix-vector product (TMVP) decomposition is one of the high-precision multiplication algorithms. A symmetric TMVP (STMVP) decomposition is presented, and theoretical analysis shows that the space complexity of the proposed STMVP scheme is lower than that of the traditional TMVP approach. Gaussian normal basis (GNB) multiplication based on the proposed architecture can be realized with reduced space complexity.
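For background, here is the standard two-way TMVP split that such schemes refine: an n x n Toeplitz matrix is determined by its 2n - 1 diagonals, and the product T.V can be computed from three half-size TMVPs instead of four (a Karatsuba-style saving). The sketch works over GF(2), where addition is XOR; the paper's symmetric STMVP refinement is not reproduced.

```python
import numpy as np

def tmvp2(t, v):
    """Two-way recursive Toeplitz matrix-vector product over GF(2).

    t: length 2n-1 vector of diagonals, with T[i, j] = t[i - j + n - 1]
    v: length n vector, n a power of two
    Splitting T into blocks [[T1, T0], [T2, T1]] gives the product from
    three half-size TMVPs: P0=(T0+T1)V1, P1=T1(V0+V1), P2=(T1+T2)V0,
    upper = P0 + P1, lower = P1 + P2 (the duplicated terms cancel mod 2).
    """
    n = len(v)
    if n == 1:
        return (t * v) % 2
    m = n // 2
    t0, t1, t2 = t[:n - 1], t[m:n + m - 1], t[n:]   # block diagonal vectors
    v0, v1 = v[:m], v[m:]
    p0 = tmvp2((t0 + t1) % 2, v1)
    p1 = tmvp2(t1, (v0 + v1) % 2)
    p2 = tmvp2((t1 + t2) % 2, v0)
    return np.concatenate([(p0 + p1) % 2, (p1 + p2) % 2])

# verify against a dense Toeplitz multiply for n = 8
rng = np.random.default_rng(0)
n = 8
t = rng.integers(0, 2, 2 * n - 1)
v = rng.integers(0, 2, n)
T = np.array([[t[i - j + n - 1] for j in range(n)] for i in range(n)])
print(tmvp2(t, v))
print((T @ v) % 2)   # identical output
```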

Journal ArticleDOI
01 Dec 2015-Optik
TL;DR: Experimental results prove that the proposed L1-norm plus L2-norm sparse parameter (L1L2-SP) classifier achieves a better recognition rate than the SRC classifier, the SFR classifier, and several other classifiers.

Proceedings ArticleDOI
24 May 2015
TL;DR: A new modified SPB multiplication for an arbitrary irreducible pentanomial is presented, and the proposed multiplication scheme forms a TMVP formula.
Abstract: The Toeplitz matrix-vector product (TMVP) approach is a special case of the Karatsuba algorithm for designing subquadratic multipliers in GF(2^m). In binary extension fields, the shifted polynomial basis (SPB) is a variable basis representation and has been widely studied. SPB multiplication using a coordinate transformation technique can be transformed into TMVP formulas; however, this approach applies only to fields constructed by trinomials or a special class of pentanomials. For this reason, we present a new modified SPB multiplication for an arbitrary irreducible pentanomial, and the proposed multiplication scheme forms a TMVP formula.

Book ChapterDOI
01 Jan 2015
TL;DR: This paper proposes an improved online load balancing algorithm suitable for cloud computing and uses CloudSim as a simulator to verify its power and efficiency.
Abstract: Since the concept of cloud computing was proposed in 2006, it has drawn considerable attention from both industry and academia. Several technologies, such as virtualization, form the basis of cloud computing, while others act as system improvement strategies. Among them, load balancing is indispensable and extremely important for improving system performance and maintaining the user experience. In this paper, we focus on the load balancing algorithms commonly used in cloud computing and, through our analysis, propose an improved online load balancing algorithm. We present several experimental results to show its power and efficiency, using CloudSim as a simulator to verify our approach.

Book ChapterDOI
26 Aug 2015
TL;DR: An efficient algorithm is proposed to minimize side effects in the sanitization process for hiding sensitive high-utility itemsets, and three similarity measurements are designed as new standards for PPUM.
Abstract: High-utility itemset mining (HUIM) considers both quantity and profit factors to measure whether an item or itemset is a profitable product. With the rapid growth of security considerations, privacy-preserving utility mining (PPUM) has become a critical issue in HUIM. In this paper, an efficient algorithm is proposed to minimize side effects in the sanitization process for hiding sensitive high-utility itemsets. Three similarity measurements are also designed as new standards for PPUM. Experiments are conducted to show the performance of the designed algorithm in terms of the general side effects used in PPDM and the newly defined measurements for PPUM.

01 Jan 2015
TL;DR: The evolved bat algorithm is improved by replacing the fixed value determined by the medium with a cosine function, since the familiar trigonometric signal that exists in the natural environment is the sine/cosine signal.
Abstract: The diversity created during the searching process in swarm intelligence algorithms plays an important part in the exploration ability. The searching result can be further improved if the algorithm gives consideration to both exploitation and exploration. In this paper, the evolved bat algorithm is improved by replacing the fixed value, which is determined by the medium, with a cosine function. The familiar trigonometric signal that exists in the natural environment is the sine/cosine signal, so we adopt the cosine signal in our design to improve the searching capacity of the evolved bat algorithm. To verify the performance and the searching accuracy of our proposed strategy, three test functions with known global optima are used in the experiments. Moreover, every test function is tested with four different dimensionalities: 10, 30, 50, and 100 dimensions. The experimental results indicate that our proposed strategy improves the searching accuracy of the evolved bat algorithm by about 28.098%, 48.779%, 45.945%, and 48.81% on average for the respective dimensional environments.
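A minimal sketch of the stated modification under assumed update details: in the plain evolved bat algorithm each bat steps toward the best-known position with a step scaled by a fixed medium-dependent constant, and the improvement replaces that constant with a cosine of the iteration counter so the step size oscillates between exploitation and exploration. The population size, cosine frequency, and test function are illustrative assumptions, not the paper's exact settings.

```python
import numpy as np

def cosine_eba_minimize(f, dim, n_bats=20, iters=500, bound=10.0, seed=3):
    """Evolved-bat-style search with a cosine-modulated step scale."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-bound, bound, size=(n_bats, dim))
    fit = np.apply_along_axis(f, 1, pos)
    ibest = int(fit.argmin())
    best, fbest = pos[ibest].copy(), float(fit[ibest])
    for t in range(iters):
        scale = abs(np.cos(0.1 * t))        # replaces the fixed medium constant
        for i in range(n_bats):
            # per-dimension random step toward (or reflected about) the best bat
            cand = pos[i] + scale * rng.uniform(-1, 1, dim) * (best - pos[i])
            fc = f(cand)
            if fc < fit[i]:
                pos[i], fit[i] = cand, fc
                if fc < fbest:
                    best, fbest = cand.copy(), fc
    return best, fbest

sphere = lambda x: float(np.sum(x * x))
print(cosine_eba_minimize(sphere, dim=10))   # best solution found
```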

Journal ArticleDOI
01 Nov 2015-Optik
TL;DR: A novel classifier based on linear regression classification (LRC), called the global linear regression coefficient (GLRC) classifier, is proposed for recognition; it achieves a better recognition rate than the LRC classifier, the sparse representation-based classification (SRC) classifier, the collaborative representation-based classifier, the two-phase test sample sparse representation (TPTSSR) classifier, and others.