
Showing papers in "Informatica (lithuanian Academy of Sciences) in 2012"


Journal ArticleDOI
TL;DR: Multi-Objective Optimization takes care of different objectives with the objectives keeping their own units, and forms a guarantee for a solution that is as non-subjective as possible.
Abstract: Multi-Objective Optimization takes care of different objectives, with each objective keeping its own units. The internal mechanical solution of a Ratio System, producing dimensionless numbers, is preferred. The ratio system creates the opportunity to use a second approach: a Reference Point Theory, which uses the ratios of the ratio system. This overall theory is called MOORA (Multi-Objective Optimization by Ratio Analysis). The results are still more convincing if a Full Multiplicative Form is added, forming MULTIMOORA. The control by three different approaches forms a guarantee for a solution that is as non-subjective as possible. MULTIMOORA, tested for robustness, showed positive results.
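The ratio-system step described above can be sketched as follows; this is a minimal illustration of the standard MOORA normalization, with the data and criterion labels invented for the example:

```python
import math

def moora_ratio_system(matrix, benefit):
    """Rank alternatives with the MOORA ratio system.

    matrix: rows are alternatives, columns are criteria (raw units).
    benefit: True for criteria to maximize, False for criteria to minimize.
    """
    n_crit = len(matrix[0])
    # Dimensionless ratios: divide each entry by the Euclidean norm of its column.
    norms = [math.sqrt(sum(row[j] ** 2 for row in matrix)) for j in range(n_crit)]
    scores = []
    for row in matrix:
        ratios = [row[j] / norms[j] for j in range(n_crit)]
        # Add beneficial criteria, subtract non-beneficial ones.
        scores.append(sum(r if benefit[j] else -r for j, r in enumerate(ratios)))
    return scores

# Two alternatives, two criteria (maximize quality, minimize cost) -- invented data.
scores = moora_ratio_system([[8.0, 100.0], [6.0, 60.0]], [True, False])
best = max(range(len(scores)), key=scores.__getitem__)
```

The Reference Point and Full Multiplicative parts of MULTIMOORA would rank the same ratio matrix by other rules; only the ratio system is shown here.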

194 citations


Journal ArticleDOI
TL;DR: By extending the ratio system part of the MOORA method, an algorithm is presented to determine the most preferable alternative among all possible alternatives when performance ratings are given as intervals.
Abstract: In some cases of using multi-criteria decision making methods for solving real-world problems, the ratings of alternatives cannot be determined precisely, and that is why they are expressed in the form of intervals. Therefore, the aim of this paper is to extend the MOORA method for solving decision making problems with interval data. By extending the ratio system part of the MOORA method, an algorithm is presented to determine the most preferable alternative among all possible alternatives when performance ratings are given as intervals. Finally, an example is presented to illustrate the proposed procedure.

71 citations


Journal ArticleDOI
TL;DR: In this paper, the authors extend fuzzy MULTIMOORA with linguistic reasoning and group decision-making, and propose a new method consisting of three parts, namely the fuzzy Ratio System, the fuzzy Utopian Reference Point, and the fuzzy Full Multiplicative Form, offering a robust comparison of alternatives against multiple objectives.
Abstract: This paper aims to extend fuzzy MULTIMOORA with linguistic reasoning and group decision-making (MULTIMOORA-FG). The new method consists of three parts, namely the fuzzy Ratio System, the fuzzy Utopian Reference Point, and the fuzzy Full Multiplicative Form, offering a robust comparison of alternatives against multiple objectives. In addition, MULTIMOORA-FG is designed to deal with triangular fuzzy numbers which, in turn, can resemble linguistic variables. MULTIMOORA-FG is a proper instrument for linguistic reasoning in a fuzzy environment. In our study an application of personnel selection illustrates the group decision-making procedure according to MULTIMOORA-FG. Given the uncertainties peculiar to personnel selection, the application of multi-objective decision making (MODM) is required in this area. Fuzzy MULTIMOORA enables aggregation of the subjective assessments of the decision-makers and thus offers an opportunity to perform a more robust personnel selection. The committee decided to consider eight qualitative characteristics expressed in linguistic variables. A numerical example exhibited possibilities for improvement of human resources management, or any other business decision-making, by applying MULTIMOORA-FG.

60 citations


Journal Article
TL;DR: This survey discusses some of the core concepts used in object tracking and presents a comprehensive survey of past efforts to address this problem.
Abstract: There is a broad range of applications of visual object tracking that motivate the interests of researchers worldwide. These include video surveillance to detect suspicious activity, sport video analysis to extract highlights, traffic monitoring to analyse traffic flow, and human-computer interfaces to assist visually challenged people. In general, the processing framework of object tracking in dynamic scenes includes the following stages: segmentation and modelling of the interesting moving object, predicting the possible location of the candidate object in each frame, and localization of the object in each frame, generally through a similarity measure in feature space. However, tracking an object in a complex environment is a challenging task. This survey discusses some of the core concepts used in object tracking and presents a comprehensive survey of past efforts to address this problem. We have also explored the wavelet domain and found that it has great potential in object tracking, as it provides a rich and robust representation of an object. Povzetek (translated from Slovenian): A survey of visual object tracking methods is presented.

56 citations


Journal Article
TL;DR: The existing techniques were compared along with their collected empirical evidence to determine whether any particular approach was superior to others; the review yielded that almost half of the techniques for regression test prioritization are independent of their implementation language.
Abstract: The purpose of regression testing is to validate the modified software and detect whether the unmodified code is adversely affected. Regression testing is primarily a maintenance activity. The main motivation behind this systematic review is to provide a ground for advancement of research in the field of regression test prioritization. The existing techniques were compared, along with their collected empirical evidence, to determine whether any particular approach was superior to the others. 65 papers reporting 50 experiments and 15 case studies were identified. A total of 106 techniques were evaluated for regression test prioritization. Also, a rigorous analysis of the techniques was performed by comparing them in terms of various measures such as size of study, type of study, approach, input method, tool, and metrics. Encouragingly, the systematic literature review (SLR) yielded that almost half of the techniques for regression test prioritization are independent of their implementation language. On the other hand, future research should focus on bridging the large gaps found in the usage of various tools and artifacts. A preliminary literature survey conducted during the course of this research indicated that, to the best of our knowledge, no systematic review has been published so far on the topic of regression test prioritization.

56 citations


Journal ArticleDOI
TL;DR: Analytic network process (ANP) is a technique to solve multi-criteria decision-making problems in which the criteria affect each other and have nonlinear correlation.
Abstract: Nowadays, most of the products and services companies require are provided by other organisations. Outsourcing, as a new approach, has a significant role in the management literature. A supplier must be selected by executives when the organization decides to acquire a product or service from another organization. Concerning supplier selection, managers should consider more than one factor or criterion, and these may be inconsistent and contradictory. Therefore, supplier selection is a multi-criteria decision-making issue. Analytic network process (ANP) is a technique for solving multi-criteria decision-making problems in which the criteria affect each other and have nonlinear correlation. In this study, the goal is to use ANP to select a supplier in group decision-making.
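As a rough illustration of the ANP machinery referred to above, the sketch below raises a small column-stochastic supermatrix to a high power; the 2-element network and its weights are invented for the example, and a real ANP study would build the supermatrix from pairwise-comparison judgments:

```python
def limit_supermatrix(W, iters=200):
    """Raise a column-stochastic supermatrix to a high power; in ANP the
    columns of the limit matrix give the global priorities of the elements."""
    n = len(W)

    def matmul(A, B):
        return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
                for i in range(n)]

    M = W
    for _ in range(iters):
        M = matmul(M, W)
    return M

# Toy 2-element network with mutual influence (each column sums to 1).
W = [[0.6, 0.3],
     [0.4, 0.7]]
L = limit_supermatrix(W)
priorities = [L[i][0] for i in range(2)]  # any column of the converged limit matrix
```

For this toy matrix the limit columns converge to the stationary vector (3/7, 4/7), i.e., the second element receives the higher global priority.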

43 citations


Journal ArticleDOI
TL;DR: This paper employs Tseng and Tsai's revocable concept to propose a new RIBE without random oracles to provide full security, and demonstrates that the proposed RIBE is semantically secure against adaptive-ID attacks in the standard model.
Abstract: Revocation problem is a critical issue for key management of public key systems. Any certificate-based or identity (ID)-based public key systems must provide a revocation method to revoke misbehaving/compromised users from the public key systems. In the past, there was little work on studying the revocation problem of ID-based public key systems. Most recently, Tseng and Tsai presented a novel ID-based public key system with efficient revocation using a public channel, and proposed a practical revocable ID-based encryption (called RIBE). They proved that the proposed RIBE is semantically secure in the random oracle model. Although the ID-based encryption schemes based on the random oracle model can offer better performance, the resulting schemes could be insecure when random oracles are instantiated with concrete hash functions. In this paper, we employ Tseng and Tsai's revocable concept to propose a new RIBE without random oracles to provide full security. We demonstrate that the proposed RIBE is semantically secure against adaptive-ID attacks in the standard model.

31 citations


Journal Article
TL;DR: The main result is that the predictive performance of the two approaches is not significantly different; however, in terms of total model size and induction times, ensembles that exploit the output structure are significantly more efficient.
Abstract: While ensembles have been used for structured output learning, the literature lacks an extensive study of different strategies to construct ensembles in this context. In this work, we fill this gap by presenting a thorough empirical comparison of ensembles that predict the complete output structure at once, versus a combination of ensembles that each predicts a single component of the structure. We present results in two structured output learning tasks, using predictive clustering trees as base learners. The main result is that the predictive performance of the two approaches is not significantly different. However, in terms of total model size and induction times, ensembles that exploit the output structure are significantly more efficient.

27 citations


Journal Article
TL;DR: It is shown that the Minkowski distance gives better results than the Euclidean distance and can give very good results using less time; the approach is applied to detect anomalous activity in the network, using detectors generated by a genetic algorithm.
Abstract: Computer security is an issue that will always be under investigation, as intruders never stop finding ways to access data and network resources. Researchers try to find functions and approaches that would increase the chances of detecting attacks and at the same time be less expensive in terms of time and space. In this paper, an approach is applied to detect anomalous activity in the network, using detectors generated by a genetic algorithm. The Minkowski distance function is tested against the Euclidean distance for the detection process. It is shown that the Minkowski distance gives better results than the Euclidean distance and can give very good results using less time. It gives an overall average detection rate of 81.74%, against 77.44% with the Euclidean distance. In addition, formal concept analysis was applied to the data set containing only the selected features and used to visualize correlation between highly effective features. Povzetek (translated from Slovenian): A security method based on an artificial immune system is presented.
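A minimal sketch of the distance comparison described above is shown below; the negative-selection-style detector, the order p=3, the sample points, and the threshold are all assumptions for illustration, not the paper's actual detector parameters:

```python
def minkowski(x, y, p):
    """Minkowski distance; p=2 gives the Euclidean distance, p=1 the Manhattan."""
    return sum(abs(a - b) ** p for a, b in zip(x, y)) ** (1.0 / p)

# A record is flagged as anomalous when it lies farther than a threshold
# from every "self" (normal) sample -- a negative-selection-style check.
def is_anomalous(record, self_samples, threshold, p=3):
    return all(minkowski(record, s, p) > threshold for s in self_samples)

normal = [(0.1, 0.2), (0.15, 0.25)]      # invented normalized traffic features
near = is_anomalous((0.12, 0.22), normal, threshold=0.5)   # close to normal
far = is_anomalous((0.9, 0.95), normal, threshold=0.5)     # far from normal
```

Switching `p` lets the same detector be evaluated under Euclidean (p=2) or higher-order Minkowski distances, which is the comparison the paper performs.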

27 citations


Journal ArticleDOI
TL;DR: This paper introduces supporting and rejecting notions to describe attribute-objective relationships, leading to an evaluation model in terms of two measures or indices (selectability and rejectability) for each alternative in the framework of satisficing game theory.
Abstract: Three main approaches presently dominate the preference derivation or evaluation process in decision analysis (selecting, ranking or sorting options, alternatives, actions or decisions): the value-type approach (a value function or a utility measure is derived for each alternative to represent its adequacy with the decision goal); outranking methods (pairwise comparisons of alternatives are carried out under each attribute or criterion to derive a pre-order on the set of alternatives); and the decision rules approach (a set of decision rules is derived by a learning process from a decision table with possibly missing data). All these approaches assume a single decision objective to satisfy and all alternatives characterized by a common set of attributes or criteria. In this paper we adopt an approach that highlights the bipolar nature of attributes with regard to objectives, which we consider inherent to any decision analysis problem. We therefore introduce supporting and rejecting notions to describe attribute-objective relationships, leading to an evaluation model in terms of two measures or indices (selectability and rejectability) for each alternative in the framework of satisficing game theory. The supporting or rejecting degree of an attribute with regard to an objective is assessed using known techniques such as the analytic hierarchy process (AHP). This model allows alternatives to be characterized by heterogeneous attributes and accommodates incomparability between alternatives in terms of Pareto-equilibria.

24 citations


Journal Article
TL;DR: The applications of Graph Theory algorithms to determine paths, trees and connected dominating sets for simulating and analyzing respectively unicast (single-path and multi-path), multicast and broadcast communication in mobile ad hoc networks (MANETs).
Abstract: Various simulators (e.g., ns-2 and GloMoSim) are available to implement and study the behavior of routing protocols for mobile ad hoc networks (MANETs). But students and investigators who are new to this area often get perplexed by the complexity of these simulators and lose focus in designing and analyzing the characteristics of the network and the protocol. Most of their time is spent learning the existing code modules of the simulator and the logical flow between the different code modules. The purpose of this paper is to illustrate the applications of Graph Theory algorithms to study, analyze and simulate the behavior of routing protocols for MANETs. Specifically, we focus on the applications of Graph Theory algorithms to determine paths, trees and connected dominating sets for simulating and analyzing, respectively, unicast (single-path and multi-path), multicast and broadcast communication in MANETs. We discuss (i) Dijkstra's shortest path algorithm and its modifications for finding stable paths and bottleneck paths; (ii) Prim's minimum spanning tree algorithm and its modification for finding all-pairs smallest and largest bottleneck paths; (iii) the minimum Steiner tree algorithm to connect a source node to all the receivers of a multicast group; (iv) a node-degree-based algorithm to construct an approximate minimum connected dominating set (CDS) for sending information from one node to all other nodes in the network; and (v) algorithms to find a sequence of link-disjoint, node-disjoint and zone-disjoint multi-path routes in MANETs. Povzetek (translated from Slovenian): The paper describes algorithms for mobile networks.
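Of the algorithms listed, Dijkstra's shortest path algorithm is the easiest to sketch; the toy topology below is invented, and in a MANET setting the hop weights could be replaced by link-stability or bottleneck costs along the lines the paper discusses:

```python
import heapq

def dijkstra(graph, source):
    """Single-source shortest paths; graph maps node -> {neighbor: weight}."""
    dist = {source: 0}
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry, a shorter path was already found
        for v, w in graph.get(u, {}).items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# Small invented ad hoc topology with directed weighted links.
g = {"A": {"B": 1, "C": 4}, "B": {"C": 2, "D": 5}, "C": {"D": 1}, "D": {}}
dist = dijkstra(g, "A")
```

Running this on the toy graph gives distances A:0, B:1, C:3 (via B), D:4 (via B, C).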

Journal Article
TL;DR: The experiments revealed that the best retrieval performance is obtained after removal of in-code comments and applying a combined weighting scheme based on term frequencies, normalized term frequencies, and a cosine-based document normalization.
Abstract: Latent Semantic Analysis (LSA) is an intelligent information retrieval technique that uses mathematical algorithms for analyzing large corpora of text and revealing the underlying semantic information of documents. LSA is a highly parameterized statistical method, and its effectiveness is driven by the setting of its parameters, which are adjusted based on the task to which it is applied. This paper discusses and evaluates the importance of parameterization for LSA-based similarity detection of source-code documents, and the applicability of LSA as a technique for source-code plagiarism detection when its parameters are appropriately tuned. The parameters involve preprocessing techniques, weighting approaches, and parameter tweaking inherent to LSA processing, in particular the choice of dimensions for the step of reducing the original post-SVD matrix. The experiments revealed that the best retrieval performance is obtained after removal of in-code comments (Java comment blocks) and applying a combined weighting scheme based on term frequencies, normalized term frequencies, and a cosine-based document normalization. Furthermore, the use of similarity thresholds (instead of mere rankings) requires the use of a higher number of dimensions. Povzetek (translated from Slovenian): The paper analyzes the LSA method, particularly with regard to source-code plagiarism.
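The weighting and similarity step (without the SVD dimensionality reduction that full LSA adds on top) can be sketched as follows; the raw term-frequency weighting and the token streams are simplifications for illustration:

```python
import math
from collections import Counter

def cosine_sim(doc_a, doc_b):
    """Cosine similarity of two token lists under raw term-frequency weighting."""
    ta, tb = Counter(doc_a), Counter(doc_b)
    vocab = set(ta) | set(tb)
    dot = sum(ta[t] * tb[t] for t in vocab)
    na = math.sqrt(sum(v * v for v in ta.values()))
    nb = math.sqrt(sum(v * v for v in tb.values()))
    return dot / (na * nb) if na and nb else 0.0

# Two toy source fragments tokenized with comments already stripped,
# as the best-performing preprocessing in the paper suggests.
a = "for i in range n sum += x i".split()
b = "for j in range n total += x j".split()
sim = cosine_sim(a, b)
```

Only the renamed identifiers differ between the two fragments, so the similarity is high but below 1; full LSA would additionally project both vectors into a reduced SVD space before comparing them.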

Journal ArticleDOI
TL;DR: This paper develops a new method for 2-tuple linguistic multiple attribute decision making, namely the 2-Tuple linguistic generalized ordered weighted averaging distance (2LGOWAD) operator, an extension of the OWA operator that utilizes generalized means, distance measures and uncertain information represented as 2- tuple linguistic variables.
Abstract: In this paper we develop a new method for 2-tuple linguistic multiple attribute decision making, namely the 2-tuple linguistic generalized ordered weighted averaging distance (2LGOWAD) operator. This operator is an extension of the OWA operator that utilizes generalized means, distance measures and uncertain information represented as 2-tuple linguistic variables. By using 2LGOWAD, it is possible to obtain a wide range of 2-tuple linguistic aggregation distance operators such as the 2-tuple linguistic maximum distance, the 2-tuple linguistic minimum distance, the 2-tuple linguistic normalized Hamming distance (2LNHD), the 2-tuple linguistic weighted Hamming distance (2LWHD), the 2-tuple linguistic normalized Euclidean distance (2LNED), the 2-tuple linguistic weighted Euclidean distance (2LWED), the 2-tuple linguistic ordered weighted averaging distance (2LOWAD) operator and the 2-tuple linguistic Euclidean ordered weighted averaging distance (2LEOWAD) operator. We study some of its main properties, and we further generalize the 2LGOWAD operator using quasi-arithmetic means. The result is the Quasi-2LOWAD operator. Finally we present an application of the developed operators to decision-making regarding the selection of investment strategies.
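A sketch of the generalized OWA distance idea on plain numbers is shown below; real 2LGOWAD operates on 2-tuple linguistic values, so treating the inputs as ordinary reals is a simplification for illustration:

```python
def gowad(x, y, weights, lam=2.0):
    """Generalized OWA distance: the individual distances are sorted in
    descending order before weighting, then a generalized mean is taken."""
    diffs = sorted((abs(a - b) for a, b in zip(x, y)), reverse=True)
    return sum(w * d ** lam for w, d in zip(weights, diffs)) ** (1.0 / lam)

# With equal weights, lam=1 recovers the normalized Hamming distance
# and lam=2 the normalized Euclidean distance, mirroring the special
# cases (2LNHD, 2LNED) listed in the abstract.
x, y = [3, 5, 2], [1, 4, 4]
w = [1 / 3] * 3
d_hamming = gowad(x, y, w, lam=1.0)
d_euclid = gowad(x, y, w, lam=2.0)
```

Choosing non-uniform, position-based weights then yields the weighted and ordered-weighted members of the same family.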

Journal ArticleDOI
TL;DR: An analytic form of the general backlash characteristic description is proposed, based on appropriate switching and internal functions; general backlash identification can then be solved as a quasi-linear problem using an iterative parameter estimation method with internal variable estimation.
Abstract: The notion of general backlash is introduced, where, instead of the straight lines determining the upward and downward parts of the backlash characteristic, general curves are considered. An analytic form of the general backlash characteristic description is proposed, based on appropriate switching and internal functions. Consequently, this multi-valued mapping is represented by one difference equation. All the parameters in the equation describing this hard nonlinearity are separated; hence general backlash identification can be solved as a quasi-linear problem using an iterative parameter estimation method with internal variable estimation. The identification of cascaded systems consisting of a general input backlash followed by a linear dynamic system is also presented. Simulation studies of general backlash identification and of cascaded systems with general input backlash are included.

Journal ArticleDOI
TL;DR: In the random oracle model and under related mathematical hard problems, it is proved that the proposed protocol is a secure AGKE protocol with identification of malicious participants, and is also secure against insider attacks.
Abstract: An authenticated group key exchange (AGKE) protocol allows participants to construct a common key and provide secure group communications in cooperative and distributed applications. Recently, Choi et al. proposed an identity (ID)-based authenticated group key exchange (IDAGKE) protocol from bilinear pairings. However, their protocol suffered from an insider colluding attack because it did not address the security issue of withstanding insider attacks. Withstanding insider attacks means that a protocol can detect whether malicious participants exist in the group key exchange. Nevertheless, an AGKE protocol resistant to insider attacks is still unable to determine who the malicious participants are. In this paper, we propose an ID-based AGKE protocol with identification of malicious participants. In our protocol, we use a confirmed computation property to identify malicious participants. Certainly, it is also secure against insider attacks. In the random oracle model and under related mathematical hard problems, we prove that the proposed protocol is a secure AGKE protocol with identification of malicious participants.

Journal Article
TL;DR: Multiple Attribute Decision Making Method Based on the Trapezoid Fuzzy Linguistic Hybrid Harmonic Averaging Operator is presented.
Abstract: Multiple Attribute Decision Making Method Based on the Trapezoid Fuzzy Linguistic Hybrid Harmonic Averaging Operator

Journal Article
TL;DR: The Multiobjective Optimization algorithm for discovering Comfortable Driving Strategies (MOCDS) is obtained, which finds more comfortable driving strategies than MODS, while not significantly deteriorating their traveling time and fuel consumption.
Abstract: Driving a vehicle along a route consists of control actions applied to the vehicle by taking into account the vehicle and route states. Control actions are usually selected by optimizing the traveling time and the fuel consumption. However, the resulting vehicle behavior can be uncomfortable for the driver/passengers. The comfort is measured as the change of acceleration, i.e., jerk. To obtain more comfortable driving strategies, we introduce comfort as an objective to the Multiobjective Optimization algorithm for discovering Driving Strategies (MODS), thus obtaining the Multiobjective Optimization algorithm for discovering Comfortable Driving Strategies (MOCDS). The two algorithms are compared on a real-world route. The results show that MOCDS finds more comfortable driving strategies than MODS, while not significantly deteriorating their traveling time and fuel consumption. The most significant improvement in comfort is achieved on driving strategies with low fuel consumption, which are highly uncomfortable and therefore have the most room for improvement. On the other hand, the driving strategies found by MODS with short traveling time are already comfortable and therefore cannot be additionally improved.
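Jerk, the comfort measure mentioned above, can be computed from sampled acceleration by a finite difference; the sample profiles and the mean-absolute-jerk objective below are assumptions for illustration, not the exact objective used in MOCDS:

```python
def jerk_profile(accel, dt):
    """Jerk is the time derivative of acceleration; here a finite difference
    over uniformly sampled acceleration values (m/s^2) with time step dt (s)."""
    return [(a1 - a0) / dt for a0, a1 in zip(accel, accel[1:])]

def discomfort(accel, dt):
    """One possible comfort objective: mean absolute jerk along the drive."""
    j = jerk_profile(accel, dt)
    return sum(abs(v) for v in j) / len(j)

# Invented acceleration traces: a smooth ramp vs. abrupt throttle/brake changes.
smooth = [0.0, 0.5, 1.0, 1.0, 0.5]
harsh = [0.0, 2.0, -1.0, 2.0, 0.0]
```

With these traces, `discomfort(harsh, 1.0)` exceeds `discomfort(smooth, 1.0)`, which is the kind of difference the comfort objective lets the optimizer penalize.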

Journal Article
TL;DR: Usage of Holt-Winters Model and Multilayer Perceptron in Network Traffic Modelling and Anomaly Detection is demonstrated.
Abstract: Usage of Holt-Winters Model and Multilayer Perceptron in Network Traffic Modelling and Anomaly Detection

Journal Article
TL;DR: A new ART based on two-point partitioning is presented; the ART-TPP algorithm shows a positive improvement over two other well-known algorithms, ART-RP and ART-BP.
Abstract: Test data generation is a key issue in the field of software testing. The adaptive random testing (ART) method was proposed by Chen et al. to improve the fault-revealing ability of random testing. In this paper, we are mainly concerned with partitioning-based adaptive random testing and present a new ART based on two-point partitioning. In the new algorithm, the current max-area region is partitioned by the midpoint of two points instead of by a single point. The first point is randomly generated, and the second point is picked from the candidate set according to the farthest-distance criterion. In order to compare our algorithm with two other well-known algorithms, experiments for the two-dimensional case were performed. The results show that our ART-TPP algorithm yields a positive improvement over the other two, i.e., ART-RP and ART-BP. Moreover, the appropriate size of the candidate set is determined to be 2 or 3 based on our sensitivity analysis.
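The farthest-distance candidate selection that ART relies on can be sketched as below; this is the generic fixed-size-candidate-set flavor of ART, not the paper's exact two-point partitioning scheme:

```python
import random

def art_next_test(executed, candidates):
    """Pick, from the candidate set, the point farthest from its nearest
    previously executed test (fixed-size-candidate-set ART)."""
    def nearest_sq_dist(c):
        return min((c[0] - e[0]) ** 2 + (c[1] - e[1]) ** 2 for e in executed)
    return max(candidates, key=nearest_sq_dist)

random.seed(0)  # fixed seed so the example is reproducible
executed = [(0.5, 0.5)]                     # tests already run
candidates = [(random.random(), random.random()) for _ in range(3)]
nxt = art_next_test(executed, candidates)   # the most "spread out" candidate
```

Spreading test cases this way is what gives ART its improved fault-revealing ability over purely random testing; ART-TPP additionally uses the chosen point to split the current max-area region at a midpoint.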

Journal ArticleDOI
Ganesh R. Naik
TL;DR: Simulation improves the results of the separated sources by 7 dB to 8 dB, and also confirms that the separated sources always have super-Gaussian characteristics irrespective of the PDF of the signals or mixtures.
Abstract: Conventional Blind Source Separation (BSS) algorithms separate the sources assuming the number of sources equals the number of observations. BSS algorithms have been developed based on the assumption that all sources have non-Gaussian distributions. In most instances, these algorithms separate speech signals with super-Gaussian distributions. However, in real-world examples there exist speech signals which are sub-Gaussian. In this paper, a novel method is proposed to measure the separation quality of both super-Gaussian and sub-Gaussian distributions. This study measures the impact of the Probability Distribution Function (PDF) of the signals on the outcomes of both sub- and super-Gaussian distributions. This paper also reports a study of the impact of the mixing environment on the source separation. Simulation improves the results of the separated sources by 7 dB to 8 dB, and also confirms that the separated sources always have super-Gaussian characteristics irrespective of the PDF of the signals or mixtures.
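Super- versus sub-Gaussianity is conventionally checked via excess kurtosis; the sketch below uses invented Laplacian-like (super-Gaussian) and uniform (sub-Gaussian) samples rather than the paper's speech signals:

```python
import random

def excess_kurtosis(xs):
    """Sample excess kurtosis: > 0 suggests a super-Gaussian (peaky) signal,
    < 0 a sub-Gaussian (flat) one; a Gaussian has excess kurtosis 0."""
    n = len(xs)
    mean = sum(xs) / n
    m2 = sum((x - mean) ** 2 for x in xs) / n
    m4 = sum((x - mean) ** 4 for x in xs) / n
    return m4 / (m2 ** 2) - 3.0

random.seed(1)
# Symmetrized exponential draws approximate a Laplacian (super-Gaussian) source;
# uniform draws give a flat (sub-Gaussian) source.
peaky = [random.expovariate(1.0) * random.choice((-1, 1)) for _ in range(10000)]
flat = [random.uniform(-1, 1) for _ in range(10000)]
```

A Laplacian has excess kurtosis 3 and a uniform distribution -1.2, so the two samples land on opposite sides of zero, which is the distinction the paper's separation-quality study turns on.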

Journal ArticleDOI
Jonas Mockus
TL;DR: An extended model, USEGM, is proposed, which applies the Nash Equilibrium to strategies that define buying-selling margins and bank haircuts dynamically, and enables simulation of market illiquidity, an important feature of the present financial crisis.
Abstract: A simple Stock Exchange Game Model (SEGM) was introduced in Mockus (2003) to simulate the behavior of several stockholders using fixed buying-selling margins at a fixed bank yield. In this paper, an extended model, USEGM, is proposed. The advantage of USEGM is the application of the Nash Equilibrium (NE) to strategies that define buying-selling margins and bank haircuts dynamically. This enables us to simulate market illiquidity, which is an important feature of the present financial crisis (Allen, 2008). In addition, USEGM includes transaction costs to reflect reality better. To represent users who prefer linear utility functions, USEGM adds the AR-ABS(p) autoregressive model, minimizing absolute values, to the traditional AR(p) model minimizing least-square deviations. A formal application of NE to simulate the behavior of market participants is a new feature of these models. In the well-known works Allen (2008), Brunnermeier (2009), and Brunnermeier and Yogo (2009), equilibrium ideas were applied to supply-demand balance concepts, as usual. The objective of USEGM is not forecasting, but simulation of financial time series that are affected by the predictions of the participants. The "virtual" stock exchange can help in testing the assumption of rational investor behavior against the recent theories that explain financial markets by irrational responses of major market participants (Krugman, 2000, 2008, 2009). The model helps compare expected profits of different prediction models using virtual and historical data. The model has been compared with eighteen actual financial time series, and the results were found to be close in many cases.

Journal Article
TL;DR: A biomolecular implementation of the push-down automaton using DNA molecules is proposed; the idea of this improved implementation was inspired by Cavaliere et al. (2005).
Abstract: In this paper we propose a biomolecular implementation of the push-down automaton (one of the theoretical models of a computing device with unbounded memory) using DNA molecules. The idea of this improved implementation was inspired by Cavaliere et al. (2005).

Journal ArticleDOI
TL;DR: This work proposes an identity-based key-insulated signcryption (IBKISC) scheme which, compared with Sign-then-Encrypt (StE) and Encrypt-then-Sign (EtS) compositions using IBKIE and IBKIS in the standard model, is the fastest and has the shortest ciphertext size.
Abstract: Key-insulated cryptography is an important technique to protect private keys in identity-based (IB) cryptosystems. Despite the flurry of recent results on IB key-insulated encryption (IBKIE) and signature (IBKIS), a problem regarding the security and efficiency of combining IBKIE and IBKIS into a joint IB key-insulated signature/encryption scheme with a common set of parameters and keys remains open. To address this question, we propose an identity-based key-insulated signcryption (IBKISC) scheme. Compared with the Sign-then-Encrypt (StE) and Encrypt-then-Sign (EtS) compositions using IBKIE and IBKIS in the standard model, our proposed IBKISC scheme is the fastest and has the shortest ciphertext size.

Journal Article
TL;DR: This paper proposes a novel approach to similarity-based approximate reasoning in an interval-valued fuzzy environment and uses two examples of shipbuilding processing to illustrate and validate the effectiveness of the proposed schema.
Abstract: This paper proposes a novel approach to similarity-based approximate reasoning in an interval-valued fuzzy environment. In a rule-based system, an 'if ... then ...' rule can be translated into an interval-valued fuzzy relation by suitable implication operations. The similarity grade between a case and the antecedent of a rule is computed and used to modify the relation. A consequent is derived from the well-known projection operation over the modified relation. The inference mechanism is appropriate because the techniques of the conventional Compositional Rule of Inference are incorporated into the existing similarity-based inference. Two examples of shipbuilding processing are utilized to illustrate and validate the effectiveness of the proposed schema. Povzetek (translated from Slovenian): The article discusses similarity-based reasoning methods in fuzzy logic.

Journal Article
TL;DR: A model for local Destination Management Systems (DMS) of heritage destinations is proposed which could help to achieve a more globally responsible paradigm for the tourism industry and facilitate the management of destinations and the coordination of local suppliers.
Abstract: This paper suggests that many of the negative effects of globalization and inadequate tourism growth can be compensated for by the use of intelligent ICT solutions in the development of heritage tourism. The encounter between cultural tourism and information and communication technologies represents an opportunity to preserve national culture, create partnerships and enhance destination value in the information society. To this aim, we propose a model for local Destination Management Systems (DMS) of heritage destinations, which could help to achieve a more globally responsible paradigm for the tourism industry and facilitate the management of destinations and the coordination of local suppliers. DMSs provide interactive demonstrations of local amenities and attractions and enable consumers to build their own itinerary based on their interests and requirements. All the stakeholders within the destination are linked with each other in order to create collaborative action and genuine, sustained growth in heritage tourism. An example is given of the Croatian World Heritage Sites and their status on the Web.

Journal ArticleDOI
TL;DR: Through a comprehensive computational study, it is shown that the proposed algorithm provides the best results; with fine-tuned parameters it finds the global minimum with high probability.
Abstract: Multidimensional scaling with city-block distances is considered in this paper. The technique requires optimization of an objective function which has many local minima and can be non-differentiable at minimum points. This study is aimed at developing a fast and effective global optimization algorithm spanning the whole search domain and providing good solutions. A multimodal evolutionary algorithm is used for global optimization to prevent stagnation at bad local optima. The piecewise quadratic structure of the least-squares objective function with city-block distances has been exploited for local improvement. The proposed algorithm has been compared with other algorithms described in the literature. Through a comprehensive computational study, it is shown that the proposed algorithm provides the best results. With fine-tuned parameters, the algorithm finds the global minimum with high probability.
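The objective being optimized can be sketched as the least-squares stress with city-block embedding distances; the points and dissimilarities below are invented, and the absolute values in the city-block distance are what make the function non-differentiable at some minimum points:

```python
def stress_cityblock(points, deltas):
    """Least-squares MDS objective with city-block (L1) embedding distances:
    sum over pairs of (delta_ij - d1(x_i, x_j))^2."""
    s = 0.0
    n = len(points)
    for i in range(n):
        for j in range(i + 1, n):
            d = sum(abs(a - b) for a, b in zip(points[i], points[j]))
            s += (deltas[i][j] - d) ** 2
    return s

# An invented configuration that reproduces its dissimilarities exactly.
pts = [(0.0, 0.0), (1.0, 0.0), (1.0, 1.0)]
deltas = [[0, 1, 2], [1, 0, 1], [2, 1, 0]]
zero_stress = stress_cityblock(pts, deltas)
```

A global optimizer such as the paper's multimodal evolutionary algorithm searches over `points` to drive this stress toward its global minimum.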

Journal ArticleDOI
TL;DR: It is shown that the improved Hwang et al.'s ElGamal-like scheme is still insecure against chosen-plaintext attacks whether the system is operated in the quadratic residue modulus or not.
Abstract: Hwang et al. proposed an ElGamal-like scheme for encrypting large messages, which is more efficient than its predecessor in terms of computational complexity and the amount of data transformation. They declared that the resulting scheme is semantically secure against chosen-plaintext attacks under the assumption that the decision Diffie–Hellman problem is intractable. Later, Wang et al. pointed out that the security level of Hwang et al.'s ElGamal-like scheme is not equivalent to that of the original ElGamal scheme and brings about the disadvantage of possible unsuccessful decryption. At the same time, they proposed an improvement on Hwang et al.'s ElGamal-like scheme to repair the weakness and reduce the probability of unsuccessful decryption. However, in this paper, we show that their improved scheme is still insecure against chosen-plaintext attacks whether the system is operated in the quadratic residue modulus or not. Furthermore, we propose a new ElGamal-like scheme to withstand adaptive chosen-ciphertext attacks. The security of the proposed scheme is based solely on the decision Diffie–Hellman problem in the random oracle model.
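For orientation, a minimal textbook ElGamal round (the baseline all the schemes above modify) can be sketched as follows. The parameters are toy-sized and insecure, and this omits the message-encoding and subgroup restrictions on which the semantic-security arguments in the abstract depend.

```python
import random

# Toy ElGamal over Z_p*; p and g are far too small for real use and
# serve only to illustrate the structure of the scheme.
p = 467   # small prime modulus (toy value)
g = 2     # group element used as generator (toy value)

def keygen():
    x = random.randrange(1, p - 1)       # private key
    return x, pow(g, x, p)               # (private, public) pair

def encrypt(y, m):
    """Encrypt 1 <= m < p under public key y."""
    k = random.randrange(1, p - 1)       # fresh ephemeral randomness
    return pow(g, k, p), (m * pow(y, k, p)) % p

def decrypt(x, c1, c2):
    s = pow(c1, x, p)                    # shared secret g^(k*x)
    return (c2 * pow(s, p - 2, p)) % p   # divide by s (Fermat inverse)

x, y = keygen()
c1, c2 = encrypt(y, 123)
print(decrypt(x, c1, c2))  # → 123
```

Under the DDH assumption, semantic security of this baseline requires messages to be encoded into the prime-order subgroup (e.g. quadratic residues); handling arbitrary large messages efficiently without losing that guarantee is precisely the difficulty the abstract's schemes address.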

Journal Article
TL;DR: In this article, the authors take a rate distortion theory perspective on the dynamics and symmetries of genetic coding mechanisms, treating "the frozen accident" as an evolutionary adaptation.
Abstract: 'The Frozen Accident' as an Evolutionary Adaptation: A Rate Distortion Theory Perspective on the Dynamics and Symmetries of Genetic Coding Mechanisms

Journal ArticleDOI
TL;DR: A piecewise uniform quantizer for input samples with discrete amplitudes from a Laplacian source is designed and analyzed, and its forward adaptation is carried out, proving that the discrete model is more appropriate for image quantization than the continual model.
Abstract: In this paper, a piecewise uniform quantizer for input samples with discrete amplitudes from a Laplacian source is designed and analyzed, and its forward adaptation is carried out. This type of quantizer is very often used in practice for the compression and coding of already quantized signals. It is shown that the design and the adaptation of quantizers for discrete input samples differ from the design and the adaptation of quantizers for continual input samples. A weighting function for PSQNR (peak signal-to-quantization-noise ratio), obtained from the probability density function of the variance of standard test images, is introduced. Experiments are carried out by applying these quantizers to the compression of grayscale images. The experimental results match the theoretical results very well, confirming the theory. The adaptive piecewise uniform quantizer designed for discrete input samples gives a 9 to 20 dB higher PSQNR than the fixed piecewise uniform quantizer designed for discrete input samples. It is also shown that the adaptive piecewise uniform quantizer designed for discrete input samples gives a 1.46 to 3.45 dB higher PSQNR than the adaptive piecewise uniform quantizer designed for continual input samples, which proves that the discrete model is more appropriate for image quantization than the continual model.
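As a rough illustration of the PSQNR figure of merit used above, the following sketches a plain uniform scalar quantizer applied to Laplacian-distributed samples in the 8-bit image range. It is not the paper's adaptive piecewise design, and all parameter values are illustrative.

```python
import numpy as np

def uniform_quantize(x, levels, lo, hi):
    """Uniform scalar quantizer on [lo, hi] with the given number of
    levels; reconstructs each sample to its cell midpoint."""
    step = (hi - lo) / levels
    idx = np.clip(np.floor((x - lo) / step), 0, levels - 1)
    return lo + (idx + 0.5) * step

def psqnr(x, xq, peak=255.0):
    """Peak signal-to-quantization-noise ratio in dB."""
    mse = np.mean((x - xq) ** 2)
    return 10.0 * np.log10(peak ** 2 / mse)

# Laplacian samples clipped to the 8-bit grayscale range (illustrative
# mean and scale, not taken from the paper).
rng = np.random.default_rng(0)
x = rng.laplace(loc=128.0, scale=20.0, size=10_000).clip(0, 255)
xq = uniform_quantize(x, levels=32, lo=0.0, hi=255.0)
print(round(psqnr(x, xq), 1))
```

For a fixed uniform quantizer the granular noise is roughly step²/12, so PSQNR depends strongly on how well the quantizer range matches the signal variance; adapting the quantizer to the measured variance, as the abstract describes, is what recovers the reported dB gains.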

Journal Article
TL;DR: This work addresses the issue of maintaining users’ privacy when using third-party recommender services and introduces a framework for Private Recommender Service (PRS) based on Enhanced Middleware for Collaborative Privacy (EMCP) running at user side, which executes a two-stage concealment process on the extracted data to augment the recommendation’s accuracy and privacy.
Abstract: IPTV service providers are starting to realize the significant value of recommender services in attracting and satisfying customers, as these services offer added value, e.g. by delivering suitable personalized content according to customers' personal interests in a seamless way, increasing content sales and gaining competitive advantage over competitors. However, current implementations of recommender services are mostly centralized and collect data from multiple users covering personal preferences about the different contents they watched or purchased. These profiles are stored at third-party providers that might operate under data privacy laws of legal jurisdictions different from those where the service is consumed. From a privacy perspective, such services have so far been based either on a trusted third-party model or on some generalization model. In this work, we address the issue of maintaining users' privacy when using third-party recommender services and introduce a framework for a Private Recommender Service (PRS) based on Enhanced Middleware for Collaborative Privacy (EMCP) running at the user side. In our framework, the PRS uses Platform for Privacy Preferences (P3P) policies for specifying its data usage practices, while EMCP allows users to use the P3P policy exchange language (APPEL) for specifying their privacy preferences for the data extracted from their profiles. Moreover, EMCP executes a two-stage concealment process on the extracted data, which utilizes a trust mechanism to augment the recommendations' accuracy and privacy. In this way, users have complete control over the privacy level of their profiles and can submit their preferences in an obfuscated form without revealing any information about their data; the further computation of recommendations proceeds over the obfuscated data using a secure multi-party computation protocol. We also provide an IPTV network scenario and experimental results. Our results and analysis show that the two-stage concealment process not only protects users' privacy but also maintains recommendation accuracy.