
Showing papers in "Informatica (Lithuanian Academy of Sciences) in 2007"


Journal Article
TL;DR: The goal of supervised learning is to build a concise model of the distribution of class labels in terms of predictor features, and the resulting classifier is then used to assign class labels to the testing instances where the values of the predictor features are known, but the value of the class label is unknown.
Abstract: The goal of supervised learning is to build a concise model of the distribution of class labels in terms of predictor features. The resulting classifier is then used to assign class labels to the testing instances where the values of the predictor features are known, but the value of the class label is unknown. This paper describes various supervised machine learning classification techniques. Of course, a single chapter cannot be a complete review of all supervised machine learning classification algorithms (also known as induction classification algorithms), yet we hope that the references cited will cover the major theoretical issues, guiding the researcher in interesting research directions and suggesting possible bias combinations that have yet to be explored.
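As a minimal illustration of the train-then-predict workflow the abstract describes, the sketch below fits a classifier on labeled training data and assigns class labels to held-out instances. The dataset and the decision-tree learner are arbitrary choices for the example, not taken from the paper.

```python
# Minimal sketch of supervised classification: learn a model of class labels
# from predictor features, then label unseen instances. Dataset and classifier
# are illustrative choices only.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

clf = DecisionTreeClassifier(max_depth=3, random_state=0)
clf.fit(X_train, y_train)            # build the model from labeled data
y_pred = clf.predict(X_test)         # assign labels where they are unknown
print("held-out accuracy:", accuracy_score(y_test, y_pred))
```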

2,535 citations


Journal Article
TL;DR: The performance of several stopping criteria that react adaptively to the state of an optimization run is evaluated for a Particle Swarm Optimization algorithm in this work.
Abstract: When using optimization algorithms the goal is usually clear: the global optimum should be found. However, in general it is not clear when this goal is achieved, especially if real-world problems are optimized for which no knowledge about the global optimum is available. Therefore, it is not easy to decide when the execution of an optimization algorithm should be terminated. Although different mechanisms can be used for detecting an appropriate time to end an optimization run, only two of them are frequently used in the literature. Unfortunately, both methods have disadvantages, particularly for the optimization of real-world problems. Because practical applications usually involve computationally expensive objective functions, the point at which an optimization algorithm is terminated matters especially for them. Therefore, the performance of several stopping criteria that react adaptively to the state of an optimization run is evaluated for a Particle Swarm Optimization algorithm in this work. The examination is done on the basis of a constrained single-objective power allocation problem. Suggestions from former work concerning stopping criteria for unconstrained optimization are verified and comparisons with results for Differential Evolution are made. Summary: Stopping criteria for particle swarm optimization are evaluated and the results are compared with those of the differential evolution algorithm.
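One adaptive stopping criterion of the kind evaluated in the paper is improvement-based: terminate when the best objective value has not improved noticeably for a number of consecutive iterations. The sketch below wires such a criterion into a bare-bones PSO; the test function, thresholds, and swarm parameters are illustrative assumptions, not the paper's setup.

```python
# Sketch of one adaptive stopping criterion for PSO: terminate when the best
# objective value has improved by less than `tol` for `patience` consecutive
# iterations. Function, bounds, and all parameters are illustrative only.
import numpy as np

def sphere(x):
    return float(np.sum(x ** 2))

def pso(f, dim=5, n_particles=20, max_iter=2000, tol=1e-8, patience=50, seed=0):
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-5, 5, (n_particles, dim))
    vel = np.zeros_like(pos)
    pbest, pbest_val = pos.copy(), np.array([f(p) for p in pos])
    g = np.argmin(pbest_val)
    gbest, gbest_val = pbest[g].copy(), pbest_val[g]
    stall = 0
    for it in range(max_iter):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        vals = np.array([f(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        g = np.argmin(pbest_val)
        # adaptive stopping: track how long the global best has stagnated
        if gbest_val - pbest_val[g] < tol:
            stall += 1
        else:
            stall = 0
        gbest, gbest_val = pbest[g].copy(), pbest_val[g]
        if stall >= patience:
            return gbest, gbest_val, it + 1
    return gbest, gbest_val, max_iter

best, val, iters = pso(sphere)
print(f"stopped after {iters} iterations, best value {val:.2e}")
```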

120 citations


Journal Article
TL;DR: An Efficient Chaos-Based Feedback Stream Cipher (ECBFSC) for Image Encryption and Decryption
Abstract: An Efficient Chaos-Based Feedback Stream Cipher (ECBFSC) for Image Encryption and Decryption

115 citations


Journal Article
TL;DR: This paper summarizes most of the techniques used to filter spam by analyzing the email content and recommends several approaches to improving the quality of email filtering.
Abstract: Fast, cheap and efficient, the Internet is nowadays incontestably the communication medium of choice for personal, business and academic purposes. Unfortunately, the Internet does not show only this beautiful face: malicious activities take advantage of the same fast, cheap and efficient medium. In the last decade, Internet worms took the spotlight; in recent years, spam has been invading one of the most used Internet services, email. This paper summarizes most of the techniques used to filter spam by analyzing the email content. Summary: The article surveys methods for filtering electronic mail.
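A canonical example of the content-based filtering techniques surveyed here is a bag-of-words naive Bayes classifier. The sketch below shows that baseline on an invented toy corpus; it is an illustration of the general idea, not a method from the paper.

```python
# One canonical content-based spam filter: bag-of-words features fed to a
# naive Bayes classifier. The tiny corpus below is invented for illustration.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

train_msgs = [
    "cheap meds buy now limited offer",
    "win a free prize click here",
    "meeting agenda for tomorrow attached",
    "please review the draft of the paper",
]
train_labels = ["spam", "spam", "ham", "ham"]

filter_ = make_pipeline(CountVectorizer(), MultinomialNB())
filter_.fit(train_msgs, train_labels)

print(filter_.predict(["free offer click now", "draft agenda for the meeting"]))
# expected: ['spam' 'ham']
```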

60 citations


Journal Article
TL;DR: This paper gives a review of the current state of the art in the development of robust and intelligent surveillance systems, going beyond the traditional vision-based framework to a more advanced multi-modal framework.
Abstract: This paper gives a review of the current state of the art in the development of robust and intelligent surveillance systems, going beyond the traditional vision-based framework to a more advanced multi-modal framework. The goal of an automated surveillance system is to assist the human operator in scene analysis and event classification by automatically detecting objects and analyzing their behavior using computer vision, pattern recognition and signal processing techniques. This review addresses several advancements made in these fields, while bringing out the fact that realizing a practical end-to-end surveillance system still remains a difficult task due to several challenges faced in real-world scenarios. With the advancement in sensor and computing technology, it is now economically and technically feasible to adopt multi-camera and multi-modal frameworks to meet the need for efficient surveillance systems in a wide range of security applications, such as guarding communities and important buildings, traffic surveillance in cities, and military applications. Therefore, our review includes a significant discussion of multi-modal data fusion approaches for robust operation. Finally, we conclude with a discussion of possible future research directions. Summary: Modern robust methods of intelligent surveillance are described.

52 citations


Journal ArticleDOI
TL;DR: It is not necessary to solve the WEP at the group presentation level, hence there are no restrictions on the group complexity in this sense; the construction of an irreducible representation of the group is required.
Abstract: A key agreement protocol based on the presentation and representation levels of an infinite non-commutative group is proposed. Two simultaneous problems at the group representation level are used: the conjugator search problem (CSP) and a modified discrete logarithm problem (DLP). The modified DLP in our approach is a matrix DLP and is different from that used in other publications. The construction of the algorithm does not allow cryptanalysis by replacing the existing CSP solution with a decomposition problem (DP) solution. The group presentation level serves for the construction of two commuting subgroups and of an invertible matrix image of the group word. The group representation level allows reliable disguising of the factors in the initial word. The word equivalence problem (WEP) solution is transferred from the group presentation level to the group representation level. Hence it is not necessary to solve the WEP at the group presentation level, and there are no restrictions on the group complexity in this sense. The construction of an irreducible representation of the group is required. The presented protocol is a modernization of the protocol presented in (Sakalauskas et al., 2005).

47 citations


Journal ArticleDOI
TL;DR: In this article, an identity-based strong designated verifier signature (IBSDVS) scheme using bilinear pairings is proposed and proved secure against existential forgery under adaptively chosen message and identity attack in the random oracle model.
Abstract: We propose an Identity Based Strong Designated Verifier Signature (IBSDVS) scheme using bilinear pairings. Designated verifier signatures find application in e-voting, auctions and calls for tenders. We prove that the scheme is secure against existential forgery under adaptively chosen message and identity attack in the random oracle model. We also show that the problem of delegatability does not exist in our scheme. Part of this work was carried out when the author was affiliated with the Secure Technology Lab., IDRBT, Hyderabad, India.

44 citations


Journal ArticleDOI
TL;DR: Computational Trust in Web Content Quality: A Comparative Evaluation on the Wikipedia Project shows that trust in web content quality is higher on Wikipedia than on other search engines.
Abstract: Computational Trust in Web Content Quality: A Comparative Evaluation on the Wikipedia Project

44 citations


Journal ArticleDOI
TL;DR: This paper presents an efficient identity-based key exchange protocol based on the difficulty of computing a discrete logarithm problem that provides implicit key authentication as well as the desired security attributes of an authenticated key exchange protocol.
Abstract: A key exchange (or agreement) protocol is designed to allow two entities to establish a session key for encrypting the communication data over an open network. In 1990, Gunther proposed an identity-based key exchange protocol based on the difficulty of computing a discrete logarithm problem. Afterwards, several improved protocols were proposed to reduce the number of communication steps and the communication cost required by Gunther's protocol. This paper presents an efficient identity-based key exchange protocol based on the difficulty of computing a discrete logarithm problem. Compared with the previously proposed protocols, it has better performance in terms of computational cost and communication steps. The proposed key exchange protocol provides implicit key authentication as well as the desired security attributes of an authenticated key exchange protocol.
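To illustrate the discrete-logarithm assumption these key exchange protocols rest on, the sketch below runs a textbook Diffie-Hellman exchange: each party raises the other's public value to its own secret exponent and both arrive at the same session key. It is not the identity-based protocol of the paper, and the toy parameters are insecure by design.

```python
# Textbook Diffie-Hellman over Z_p*, shown only to illustrate the discrete-
# logarithm assumption underlying identity-based key exchange. This is NOT the
# paper's protocol; the parameters are toy values, and a standardized
# 2048-bit group would be used in practice.
import secrets

p = 2**127 - 1          # a known Mersenne prime; far too small for real use
g = 3

a = secrets.randbelow(p - 2) + 1      # Alice's secret exponent
b = secrets.randbelow(p - 2) + 1      # Bob's secret exponent

A = pow(g, a, p)        # Alice sends A to Bob
B = pow(g, b, p)        # Bob sends B to Alice

k_alice = pow(B, a, p)  # both sides compute g^(a*b) mod p
k_bob = pow(A, b, p)
assert k_alice == k_bob  # shared session key material
```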

42 citations


Journal ArticleDOI
TL;DR: This paper presents an approach to mining information relating people, places, organizations and events extracted from Wikipedia and linking them on a time scale and illustrates the proposed approach on 1.7 million Wikipedia articles.
Abstract: This paper presents an approach to mining information relating people, places, organizations and events extracted from Wikipedia and linking them on a time scale. The approach consists of two phases: (1) identifying relevant pages - categorizing the articles as containing people, places or organizations; (2) generating a timeline - linking named entities and extracting events and their time frame. We illustrate the proposed approach on 1.7 million Wikipedia articles. Summary: Methods for mining information from Wikipedia and arranging the results in ...

38 citations


Journal ArticleDOI
TL;DR: The proposed method is a very simple way of making an intelligent guess at the starting points for the iterative Lloyd-Max algorithm, which can be determined from the values of the compandor's parameters.
Abstract: In this paper an exact and complete analysis of the Lloyd-Max algorithm and its initialization is carried out. An effective method for initialization of the Lloyd-Max algorithm of optimal scalar quantization for a Laplacian source is proposed. The proposed method is a very simple way of making an intelligent guess at the starting points for the iterative Lloyd-Max algorithm. Namely, the initial values for the iterative Lloyd-Max algorithm can be determined from the values of the compandor's parameters. It is demonstrated that by following this logic the proposed method provides rapid convergence of the Lloyd-Max algorithm.
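For reference, the sketch below runs a sample-based Lloyd iteration for a Laplacian source, alternating the nearest-neighbour and centroid conditions. The initial levels are taken from quantiles of the source, a compandor-style guess; this simplification is an assumption for the example and not the paper's exact initialization rule.

```python
# Sample-based Lloyd(-Max) iteration for an N-level scalar quantizer of a
# Laplacian source. Initial representation levels are taken from quantiles of
# the source (a compandor-style guess); an illustrative simplification, not
# the paper's exact initialization.
import numpy as np

rng = np.random.default_rng(0)
x = rng.laplace(loc=0.0, scale=1.0, size=200_000)   # Laplacian samples
N = 8                                                # number of quantizer levels

# initial guess: evenly spaced quantiles of the source distribution
levels = np.quantile(x, (np.arange(N) + 0.5) / N)

for _ in range(100):
    # nearest-neighbour condition: thresholds midway between adjacent levels
    thresholds = (levels[:-1] + levels[1:]) / 2
    cells = np.digitize(x, thresholds)
    # centroid condition: each level becomes the mean of its cell
    new_levels = np.array([x[cells == i].mean() for i in range(N)])
    if np.max(np.abs(new_levels - levels)) < 1e-6:
        break
    levels = new_levels

thresholds = (levels[:-1] + levels[1:]) / 2
distortion = np.mean((x - levels[np.digitize(x, thresholds)]) ** 2)
print("levels:", np.round(levels, 3), " MSE:", round(float(distortion), 4))
```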

Journal Article
TL;DR: The paper illustrates how webshops can be extended by a fuzzy classification model, which allows webshop administrators to improve customer equity, launch loyalty programs, automate mass customization and personalization issues, and refine marketing campaigns to maximize the real value of the customers.
Abstract: Building and maintaining customer loyalty are important issues in electronic business. By providing customer services, sharing cost benefits with online customers, and rewarding the most valued customers, customer loyalty and customer equity can be improved. With conventional marketing programs, groups or segments of customers are typically constituted according to a small number of attributes. Although corresponding data values may be similar for two customers, they may fall into different classes and be treated differently. With the proposed fuzzy classification model, however, customers with similar behavior and qualifying attributes have similar membership functions and therefore similar customer values. The paper illustrates how webshops can be extended by a fuzzy classification model. This allows webshop administrators to improve customer equity, launch loyalty programs, automate mass customization and personalization issues, and refine marketing campaigns to maximize the real value of the customers. Summary: A model for determining the loyalty of online customers is developed.
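The core of the fuzzy classification idea is that a customer near a segment boundary receives graded memberships in adjacent classes instead of a hard assignment. The minimal sketch below illustrates this with a single invented attribute (annual turnover) and invented breakpoints.

```python
# Sketch of fuzzy customer classification: instead of a sharp segment boundary,
# a customer's annual turnover yields graded memberships in two adjacent
# classes. Attribute, breakpoints, and class names are invented for illustration.
def membership_high_value(turnover, low=500.0, high=1500.0):
    """Degree (0..1) to which a customer counts as 'high value'."""
    if turnover <= low:
        return 0.0
    if turnover >= high:
        return 1.0
    return (turnover - low) / (high - low)      # linear ramp between breakpoints

for turnover in (400, 900, 1400, 2000):
    mu_high = membership_high_value(turnover)
    mu_low = 1.0 - mu_high
    print(f"turnover {turnover:5.0f}: high-value {mu_high:.2f}, standard {mu_low:.2f}")
```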

Journal Article
TL;DR: The potential of text mining for discovering implicit knowledge in biomedical literature is investigated, focusing on articles from the PubMed Central database, and a concrete example is presented of such constructed knowledge about the substance calcineurin and its potential relations with other already published indications of autism.
Abstract: In this paper we investigate the potential of text mining for discovering implicit knowledge in biomedical literature. Based on Swanson's suggestion for hypothesis generation, we tried to identify potential contributions to a better understanding of autism, focusing on articles from the PubMed Central database. First, we used them for ontology construction in order to obtain an improved insight into the domain structure. Next, we extracted a few rare terms that could potentially lead to new knowledge discovery for the explanation of the autism phenomenon. We present a concrete example of such constructed knowledge about the substance calcineurin and its potential relations with other already published indications of autism. Summary: The paper describes the application of text mining methods to medical articles in the field of autism.

Journal ArticleDOI
TL;DR: This paper proposes a reversible data hiding method for error diffused halftone images that employs statistical features of pixel block patterns to embed data, and utilizes HVS characteristics to reduce the introduced visual distortion.
Abstract: This paper proposes a reversible data hiding method for error diffused halftone images. It employs statistical features of pixel block patterns to embed data, and utilizes the HVS characteristics to reduce the introduced visual distortion. The watermarked halftone image can be perfectly recovered if it is intact; only a secret key is required. The method is suitable for applications where the content accuracy of the original halftone image must be guaranteed, and it is easily extended to the field of halftone image authentication.

Journal ArticleDOI
TL;DR: The paper proposes to consider a family of similar business processes as a generic process and to represent knowledge about generic processes in a domain independent way when developing enterprise-wide information systems (IS).
Abstract: Business process engineering is an important part of advanced enterprise engineering. One of the still open issues is how to reuse ontological knowledge about business processes in enterprise system design. The paper proposes to consider a family of similar business processes as a generic process and to represent knowledge about generic processes in a domain-independent way. It describes the main scheme for reuse of such domain-independent knowledge when developing enterprise-wide information systems (IS). The main attention is paid to the process configuration problem. In order to solve this problem, a configurator (human being or machine) must find a set of components that fit together to satisfy the problem specification. An approach based on Description Logics is proposed for this aim. The main contribution of the paper is the proposed process configuration technique.

Journal Article
TL;DR: An entropy-driven parameter control approach for exploring and exploiting evolutionary algorithms and four kinds of entropy to express diversity and to control the entropy- driven approach are discussed.
Abstract: Every evolutionary algorithm needs to address two important facets: exploration and exploitation of a search space. Evolutionary search must combine exploration of new regions of the space with exploitation of the potential solutions already identified. Balancing exploration with exploitation must be done intelligently. This paper introduces an entropy-driven parameter control approach for exploring and exploiting evolutionary algorithms. Entropy represents the amount of disorder of the population, where an increase in entropy represents an increase in diversity. Four kinds of entropy to express diversity and to control the entropy-driven approach are discussed. The experimental results on a unimodal function, a multimodal function with many local minima, and a multimodal function with only a few local minima show that the entropy-driven approach achieves a good and explicit balance between exploration and exploitation.
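One way to realize such entropy-driven control, sketched below under assumed details, is to measure diversity as the Shannon entropy of a fitness histogram and raise the mutation rate when entropy drops. The binning and the adaptation rule are illustrative assumptions, not necessarily among the four entropy measures studied in the paper.

```python
# Sketch of entropy-driven parameter control: measure population diversity as
# the Shannon entropy of a fitness histogram and raise the mutation rate when
# entropy (diversity) falls. Bin count and adaptation rule are assumptions.
import numpy as np

def fitness_entropy(fitness, bins=10):
    hist, _ = np.histogram(fitness, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log(p)).sum())        # Shannon entropy in nats

def adapt_mutation_rate(rate, entropy, max_entropy, floor=0.01, ceil=0.5):
    """Increase exploration when diversity is low, reduce it when high."""
    diversity = entropy / max_entropy if max_entropy > 0 else 0.0
    return float(np.clip(rate * (1.5 - diversity), floor, ceil))

rng = np.random.default_rng(1)
fitness = rng.normal(0.0, 1.0, size=100)         # stand-in population fitness
H = fitness_entropy(fitness)
print("entropy:", round(H, 3), "new mutation rate:",
      adapt_mutation_rate(0.05, H, max_entropy=np.log(10)))
```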

Journal ArticleDOI
TL;DR: This paper formally defines the security model for the non-interactive ID-based deniable authentication protocol and presents a new efficient ID-based deniable authentication protocol based on the RSA assumption, using techniques from provable security to analyze the security of the proposed protocol.
Abstract: A deniable authentication protocol is a cryptographic authentication protocol that enables a designated receiver to identify the source of a given message without being able to prove the identity of the sender to a third party. Therefore, it can be applied to some particular situations in electronic commerce. In this paper, we formally define the security model for the non-interactive ID-based deniable authentication protocol and present a new efficient ID-based deniable authentication protocol based on the RSA assumption. Moreover, we use techniques from provable security to analyze the security of our proposed protocol.

Journal ArticleDOI
TL;DR: The processing steps from image acquisition to e-learning documentation of the Aghios Achilleios basilica, twin lakes Prespes, Northern Greece, through its 3-D geometric CAAD (Computer-Aided Architectural Design) model and semantic description are presented, and emphasis is placed on introducing and documenting the new term e- learning documentation.
Abstract: The innovations and improvements in digital imaging sensors and scanners, computer modeling, haptic equipment and e-learning technology, as well as the availability of many powerful graphics PCs and workstations, make haptic-based rendering methods for e-learning documentation with 3-D modeling functionality feasible. E-learning documentation is a new term in computing, engineering and architecture, related to digital documentation with e-learning functionality, and introduced to the literature for the first time in this paper. In particular, for historical living systems (architectures, monuments, cultural heritage sites), such a methodology must be able to derive pictorial, geometric, spatial, topological, learning and semantic information from the target architectural object (historical living system), in such a way that it can be directly used for e-learning purposes regarding the history, the architecture, the structure and the temporal (time-based) 3-D geometry of the projected historical living system. A practical project is used to demonstrate the functionality and the performance of the proposed methodology. In particular, the processing steps from image acquisition to e-learning documentation of the Aghios Achilleios basilica, twin lakes Prespes, Northern Greece, through its 3-D geometric CAAD (Computer-Aided Architectural Design) model and semantic description are presented. Also, emphasis is placed on introducing and documenting the new term e-learning documentation. Finally, for learning purposes related to 3-D modeling accuracy evaluation, a comparison test of two image-based approaches is carried out and discussed.

Journal Article
TL;DR: In this article, a model of game-based learning is proposed and applied both to the playing of computer games and to learning within the classroom environment. The authors also suggest that educational games can be used for the transfer of knowledge to domains outside the world of games, and highlight several case studies in health and medicine.
Abstract: This paper details a model of game-based learning and suggests how this can be applied to both the playing of computer games and learning within the classroom environment. The authors document the results from a university-level course, created in a role-playing form, for designing educational games, and highlight the students' attitudes and beliefs regarding game design as a career. They also suggest that educational games can be used successfully for the transfer of knowledge to domains outside the world of computer games, and highlight several case studies in the area of health and medicine.

Journal ArticleDOI
TL;DR: Computational results reported reveal that the proposed heuristic determines solutions proven to lie within 92-99% of optimality for a number of realistic test problems.
Abstract: This paper is concerned with an employee scheduling problem involving multiple shifts and work centers, where employees belong to a hierarchy of categories having downward substitutability. An employee at a higher category may perform the duties of an employee at a lower category, but not vice versa. However, a higher category employee receives a higher compensation than a lower category employee. For a given work center, the demand for each category during a given shift is fixed for the weekdays, and may differ from that on weekends. Two objectives need to be achieved: the first is to find a minimum-cost workforce mix of categories of employees that is needed to satisfy specified demand requirements, and the second is to assign the selected employees to shifts and work centers taking into consideration their preferences for shifts, work centers, and off-days. A mixed-integer programming model is initially developed for the problem, based on which a specialized scheduling heuristic is subsequently developed. Computational results reported reveal that the proposed heuristic determines solutions proven to lie within 92-99% of optimality for a number of realistic test problems.
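A toy version of the first objective, the minimum-cost workforce mix with downward substitutability, can be written as a small integer program. The sketch below uses the PuLP modelling library for one shift and one work center; the categories, costs, and demands are invented, and the formulation is a simplified illustration rather than the paper's model.

```python
# Toy version of the minimum-cost workforce-mix subproblem with downward
# substitutability: category 1 (highest) may cover demand of categories below
# it, but not vice versa. One shift, one work center; all data invented.
# Requires the PuLP package (pip install pulp).
import pulp

categories = [1, 2, 3]                  # 1 = highest category
cost = {1: 30, 2: 22, 3: 15}            # compensation per employee
demand = {1: 2, 2: 4, 3: 6}             # required headcount per category

prob = pulp.LpProblem("workforce_mix", pulp.LpMinimize)
# x[c, d]: employees of category c assigned to cover category-d demand (d >= c)
x = {(c, d): pulp.LpVariable(f"x_{c}_{d}", lowBound=0, cat="Integer")
     for c in categories for d in categories if d >= c}

prob += pulp.lpSum(cost[c] * x[c, d] for (c, d) in x)           # total cost
for d in categories:                                            # meet each demand
    prob += pulp.lpSum(x[c, d] for c in categories if c <= d) >= demand[d]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for (c, d), var in sorted(x.items()):
    if var.value() and var.value() > 0:
        print(f"category {c} covering category-{d} demand: {int(var.value())}")
print("total cost:", pulp.value(prob.objective))
```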

Journal ArticleDOI
TL;DR: This work proposes a novel and integrated approach for the semi-automated extraction of an ontology-based semantic web from data-intensive web applications, and thus makes the web content machine-understandable.
Abstract: The advance of the Web has significantly and rapidly changed the way information is organized, shared and distributed. The next generation of the web, the semantic web, seeks to make information more usable by machines by introducing a more rigorous structure based on ontologies. In this context we propose a novel and integrated approach for the semi-automated extraction of an ontology-based semantic web from data-intensive web applications and thus make the web content machine-understandable. Our approach is based on the idea that semantics can be extracted by applying a reverse engineering technique to the structures and the instances of HTML forms, which are the most convenient interface for communicating with relational databases in current data-intensive web applications. This semantics is exploited to produce, over several steps, a personalised ontology.
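The reverse-engineering step can be illustrated by reading the field names of an HTML form as candidate attributes of the underlying relational schema, and hence candidate ontology properties. The sketch below does this with BeautifulSoup on an invented form; it covers only this first, structural part of the approach.

```python
# Sketch of the reverse-engineering idea: the fields of an HTML form hint at
# attributes of the underlying relational schema, which become candidate
# ontology properties. The HTML snippet and naming are invented.
# Requires BeautifulSoup (pip install beautifulsoup4).
from bs4 import BeautifulSoup

html = """
<form action="/search_books">
  <input type="text" name="title">
  <input type="text" name="author_name">
  <select name="publication_year"><option>2007</option></select>
  <input type="submit" value="Search">
</form>
"""

soup = BeautifulSoup(html, "html.parser")
candidates = []
for form in soup.find_all("form"):
    for field in form.find_all(["input", "select", "textarea"]):
        name = field.get("name")
        if name and field.get("type") != "submit":
            candidates.append(name)

print("candidate ontology properties:", candidates)
# -> ['title', 'author_name', 'publication_year']
```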

Journal Article
TL;DR: Preliminary Numerical Experiments in Multiobjective Optimization of a Metallurgical Production Process shows promising results in terms of precision and efficiency in the production process.
Abstract: Preliminary Numerical Experiments in Multiobjective Optimization of a Metallurgical Production Process

Journal Article
TL;DR: A flexible modular system is designed, based on the integration of arbitrary access sensors and an arbitrary number of stand-alone modules, and tested with four independent modules (expert-defined rules, micro learning, macro learning and visual learning).
Abstract: Access control is an important security issue, in particular because of terrorist threats. Access points are increasingly becoming equipped with advanced input sensors, often based on biometrics, and with advanced intelligent methods that learn from experience. We have designed a flexible modular system based on the integration of arbitrary access sensors and an arbitrary number of stand-alone modules. The system was tested with four sensors (a door sensor, an identity card reader, a fingerprint reader and a camera) and four independent modules (expert-defined rules, micro learning, macro learning and visual learning). Preliminary tests of the designed prototype are encouraging. Summary: The article describes the integration of intelligent methods into an access control system.

Journal Article
TL;DR: This paper proposes a unique replica placement technique using the concepts of a supergame, and derives a resource allocation mechanism which acts as a platform at the subgame level for the agents to compete, guaranteeing the entire system to be in a continuous self-evolving and self-repairing mode.
Abstract: Replicating data over geographically dispersed web servers reduces network traffic, server load, and, more importantly, the user-perceived access delays. This paper proposes a unique replica placement technique using the concepts of a supergame. The supergame allows the agents who represent the data objects to continuously compete for the limited available server memory space, so as to acquire the rights to place data objects at the servers. At any given instance in time, the supergame is represented by a game which is a collection of subgames, played concurrently at each server in the system. We derive a resource allocation mechanism which acts as a platform at the subgame level for the agents to compete. This approach allows us to transparently monitor the actions of the agents, who in a noncooperative environment strategically place the data objects to reduce the user access time (latency), which in turn adds reliability and fault tolerance to the system. We show that this mechanism exhibits Nash equilibrium at the subgame level, which in turn conforms to game and supergame Nash equilibria, respectively, guaranteeing the entire system to be in a continuous self-evolving and self-repairing mode. The mechanism is extensively evaluated against some well-known algorithms, such as greedy, branch and bound, game theoretical auctions and genetic algorithms. The experimental results reveal that the mechanism provides excellent solution quality, while maintaining fast execution time. Summary: A method for replicating web pages is described.

Journal ArticleDOI
TL;DR: The experimental investigation has shown that the modification outperforms the original algorithm in visualization quality and computational cost, and the conditions under which the efficiency of relative MDS exceeds that of standard MDS are estimated.
Abstract: In this paper, the relative multidimensional scaling method is investigated. This method is designed to visualize large multidimensional data. The method encompasses application of multidimensional scaling (MDS) to a so-called basic vector set and further mapping of the remaining vectors from the analyzed data set. In the original algorithm of relative MDS, the visualization process is divided into three steps: the set of basis vectors is constructed using the k-means clustering method; this set is projected onto the plane using the MDS algorithm; the set of remaining data is visualized using the relative mapping algorithm. We propose a modification which differs from the original algorithm in the strategy of selecting the basis vectors. The experimental investigation has shown that the modification outperforms the original algorithm in visualization quality and computational cost. The conditions under which the efficiency of relative MDS exceeds that of standard MDS are estimated.
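The three-step pipeline can be sketched as follows, with illustrative data and parameters: k-means centroids serve as the basis set, the basis is projected with MDS, and each remaining vector is placed by minimizing its stress against the fixed basis positions. This is a generic reconstruction of relative MDS, not the authors' implementation or their modified basis-selection strategy.

```python
# Sketch of the relative MDS pipeline: (1) pick basis vectors, here k-means
# centroids as in the original algorithm; (2) project the basis with MDS;
# (3) place each remaining vector by minimizing its stress against the fixed
# basis positions. Data and parameters are illustrative only.
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.distance import cdist
from sklearn.cluster import KMeans
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))                       # high-dimensional data

# 1) basis set: k-means centroids of the data
basis = KMeans(n_clusters=20, n_init=10, random_state=0).fit(X).cluster_centers_

# 2) MDS projection of the basis onto the plane
B2 = MDS(n_components=2, dissimilarity="euclidean", random_state=0).fit_transform(basis)

# 3) relative mapping: place every remaining vector so that its 2-D distances
#    to the fixed basis images match its original distances to the basis
def place(v):
    d = cdist(v[None, :], basis)[0]                  # target distances
    stress = lambda p: np.sum((np.linalg.norm(B2 - p, axis=1) - d) ** 2)
    return minimize(stress, x0=B2.mean(axis=0), method="Nelder-Mead").x

Y = np.array([place(v) for v in X])
print("2-D embedding shape:", Y.shape)
```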

Journal Article
TL;DR: This is the first attempt in Macedonia at prediction of the concentrations of any air parameters in the ambient air by using the modelling techniques of support vector machines (SVM) and radial basis function neural networks (RBF NN).
Abstract: In this paper we present results from the prediction of data for ozone (O3) concentrations in ambient air by using the modelling techniques of support vector machines (SVM) and radial basis function neural networks (RBF NN). The predictions are performed for two short periods of time: for 24 hours and for one week, in August and in December 2005, in Skopje, Macedonia. The built SVM models use different kinds of kernels, polynomial and Gaussian, and the best values of the free parameters of the SVM kernels are chosen by examining a range of values for each of the free parameters. This is the first attempt in Macedonia at prediction of the concentrations of any air parameters in the ambient air. Summary: An analysis of ozone levels in Macedonia using machine learning methods is given.
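The SVM part of the setup, fitting regressors with polynomial and Gaussian kernels and scanning a grid of values for the free parameters, can be sketched as below. The synthetic features stand in for the meteorological inputs and the target for the O3 concentration; none of the real data or tuned values from the study are reproduced.

```python
# Sketch of the modelling setup described: support vector regression with
# polynomial and Gaussian (RBF) kernels, free parameters chosen by scanning a
# grid of values. Synthetic stand-in data only.
import numpy as np
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVR

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 4))                       # stand-in meteorological features
y = 30 + 5 * X[:, 0] - 3 * X[:, 1] ** 2 + rng.normal(0, 2, 400)   # stand-in O3 level

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

param_grid = [
    {"svr__kernel": ["rbf"], "svr__C": [1, 10, 100], "svr__gamma": [0.01, 0.1, 1]},
    {"svr__kernel": ["poly"], "svr__C": [1, 10, 100], "svr__degree": [2, 3]},
]
search = GridSearchCV(make_pipeline(StandardScaler(), SVR()), param_grid, cv=5)
search.fit(X_tr, y_tr)

print("best kernel/parameters:", search.best_params_)
print("test R^2:", round(search.best_estimator_.score(X_te, y_te), 3))
```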

Journal ArticleDOI
TL;DR: Mobile Location-Based Gaming as Driver for Location-Based Services (LBS) - Exemplified by Mobile Hunters.
Abstract: Mobile Location-Based Gaming as Driver for Location-Based Services (LBS) - Exemplified by Mobile Hunters

Journal ArticleDOI
TL;DR: This paper analyzes the security of YS-like password authentication schemes and shows that they still suffer from forgery attacks; a new scheme based on the concept of message authentication is proposed to foil the forgery attack.
Abstract: Recently, several articles have been proposed based on Yang and Shieh's password authentication schemes (YS for short), with the following features: (1) a user can choose a password freely; (2) the server does not need to maintain a password table; (3) there is no need to involve a trusted third party. Although several variants of the YS-like schemes claimed to address the forgery attacks, this paper analyzes their security and shows that they still suffer from forgery attacks. Furthermore, a new scheme based on the concept of message authentication is proposed to foil the forgery attack.
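The "concept of message authentication" that the proposed fix builds on can be shown in its simplest form: a keyed MAC that the verifier recomputes and compares in constant time. The snippet below uses the Python standard library to illustrate that generic concept only; it is not the scheme proposed in the paper.

```python
# Generic message authentication with a keyed MAC (HMAC): the verifier
# recomputes the tag and compares it in constant time. Illustrative only;
# NOT the authentication scheme proposed in the paper.
import hmac, hashlib, secrets

key = secrets.token_bytes(32)                     # secret shared in advance
message = b"user=alice&action=login&nonce=12345"

tag = hmac.new(key, message, hashlib.sha256).hexdigest()      # sender side

def verify(key, message, tag):
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)                 # constant-time check

print(verify(key, message, tag))                              # True
print(verify(key, b"user=alice&action=admin", tag))           # False: forgery detected
```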

Journal ArticleDOI
TL;DR: This paper presents a formal definition of ID-based concurrent signatures which redresses the flaw of Chow et al.'s definition.
Abstract: The notion of concurrent signatures was introduced by Chen, Kudla and Paterson in their seminal paper at Eurocrypt 2004. In concurrent signature schemes, two entities can produce two signatures that are not binding until an extra piece of information (namely the keystone) is released by one of the parties. Upon release of the keystone, both signatures become binding to their true signers concurrently. At ICICS 2005, two identity-based perfect concurrent signature schemes were proposed by Chow and Susilo. In this paper, we show that these two schemes are unfair, in that the initial signer can cheat the matching signer. We present a formal definition of ID-based concurrent signatures which redresses the flaw of Chow et al.'s definition, and then propose two simple but significant improvements to fix our attacks.

Journal Article
TL;DR: This work proposes an alternative qualitative model for evaluating researchers in Slovenia that belongs to the paradigm of hierarchical multi-attribute models and has been developed after a literature survey on existing models in foreign countries.
Abstract: The evaluation of research work is an essential element of the scientific enterprise. In general, the evaluation of researchers and their work is highly dependent on the social and economic condition of the country in which the researchers work. The most commonly used form of evaluation is based on peer review. In Slovenia, a quantitative model for evaluating researchers has been developed and used by the Slovenian Research Agency, which has been criticized by the public. In order to alleviate some of the problems with this model and motivate further discussion on this issue, we propose an alternative qualitative model. The model belongs to the paradigm of hierarchical multi-attribute models and has been developed after a literature survey on existing models in foreign countries.