
Showing papers in "Informatica (Lithuanian Academy of Sciences) in 2005"


Journal Article
TL;DR: This paper summarizes the performance of four machine learning paradigms applied to modeling the severity of injury that occurred during traffic accidents, and considers neural networks trained using hybrid learning approaches, support vector machines, decision trees and a concurrent hybrid model involving decision trees and neural networks.
Abstract: Engineers and researchers in the automobile industry have tried to design and build safer automobiles, but traffic accidents are unavoidable. Patterns involved in dangerous crashes could be detected if we develop accurate prediction models capable of automatically classifying the injury severity of various traffic accidents. These behavioral and roadway accident patterns can be useful for developing traffic safety control policies. We believe that to obtain the greatest possible accident reduction effects with limited budgetary resources, it is important that measures be based on scientific and objective surveys of the causes of accidents and the severity of injuries. This paper summarizes the performance of four machine learning paradigms applied to modeling the severity of injury that occurred during traffic accidents. We considered neural networks trained using hybrid learning approaches, support vector machines, decision trees and a concurrent hybrid model involving decision trees and neural networks. The experimental results reveal that, among the machine learning paradigms considered, the hybrid decision tree-neural network approach outperformed the individual approaches. Povzetek (translated): Four machine learning approaches are applied to investigate injury patterns in traffic accidents.

149 citations


Journal Article
TL;DR: This paper describes an approach to visualization of text document collection based on methods from linear algebra, the system implementing it and results of using the system on several datasets.
Abstract: Visualization is commonly used in data analysis to help the user get an initial idea about the raw data, as well as a visual representation of the regularities obtained in the analysis. In a similar way, when we talk about automated text processing and the data consists of text documents, visualization of a text document corpus can be very useful. From the automated text processing point of view, natural language is very redundant in the sense that many different words share a common or similar meaning. For a computer this can be hard to understand without some background knowledge. We describe an approach to visualization of a text document collection based on methods from linear algebra. We apply Latent Semantic Indexing (LSI) as a technique that helps in extracting some of the background knowledge from a corpus of text documents. This can also be viewed as extraction of hidden semantic concepts from text documents. In this way visualization can be very helpful in data analysis, for instance for finding the main topics that appear in larger sets of documents. Extraction of main concepts from documents using techniques such as LSI can make the results of visualizations more useful. For example, given a set of descriptions of European research projects (6FP), one can find the main areas that these projects cover, including the semantic web, e-learning, security, etc. In this paper we describe a method for visualization of a document corpus based on LSI, the system implementing it, and give results of using the system on several datasets.
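The LSI step described above can be sketched with a plain truncated SVD. The tiny term-document matrix and the choice of two latent dimensions below are invented for illustration; they are not the system or data from the paper:

```python
import numpy as np

# Toy term-document matrix: rows are terms, columns are documents.
# A real system would build this from the corpus (with TF-IDF weighting).
A = np.array([
    [2.0, 1.0, 0.0, 0.0],   # "semantic"
    [1.0, 2.0, 0.0, 0.0],   # "web"
    [0.0, 0.0, 3.0, 1.0],   # "learning"
    [0.0, 0.0, 1.0, 3.0],   # "course"
])

# LSI: a truncated SVD keeps only the k strongest latent concepts.
U, s, Vt = np.linalg.svd(A, full_matrices=False)
k = 2
doc_coords = (np.diag(s[:k]) @ Vt[:k]).T   # one k-dim point per document

def cos(a, b):
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Documents on the same latent topic land close together in concept
# space, which is what makes a 2-D plot of doc_coords useful.
sim_same = cos(doc_coords[0], doc_coords[1])   # same topic
sim_diff = cos(doc_coords[0], doc_coords[2])   # different topics
```

With k = 2 each document becomes a point in the plane, so the whole corpus can be plotted directly.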

140 citations


Journal Article
TL;DR: Angelo Susi, Anna Perini and John MylopoulosITC-irst, Via Sommarive, 18, I-38050 Trento-Povo, ItalyE-mail: susi@itC.it, perini@itc.itKeywords: Agent Oriented Software Engineering Methodology, Metamodel
Abstract: Angelo Susi, Anna Perini and John MylopoulosITC-irst, Via Sommarive, 18, I-38050 Trento-Povo, ItalyE-mail: susi@itc.it, perini@itc.it, jm@cs.toronto.eduPaolo GiorginiDepartment of Information and Communication TechnologyUniversity of Trento, via Sommarive 14, I-38050 Trento-Povo, ItalyE-mail: paolo.giorgini@dit.unitn.itKeywords: Agent Oriented Software Engineering Methodology, MetamodelReceived: May 9, 2005

137 citations


Journal Article
TL;DR: An overview of AML is presented, the scope of the language, its structure and extensibility mechanisms are discussed, and the core AML modeling constructs and mechanisms are introduced and demonstrated by examples.
Abstract: The Agent Modeling Language (AML) is a semi-formal visual modeling language for specifying, modeling and documenting systems that incorporate features drawn from multi-agent systems theory It is specified as an extension to UML 20 in accordance with major OMG modeling frameworks (MDA, MOF, UML, and OCL) The ultimate objective of AML is to provide software engineers with a ready-to-use, complete and highly expressive modeling language suitable for the development of commercial software solutions based on multi-agent technologies This paper presents an overview of AML The scope of the language, its structure and extensibility mechanisms are discussed, and the core AML modeling constructs and mechanisms are introduced and demonstrated by examples

97 citations


Journal Article
TL;DR: This paper proposes both a progressive vision scheme and pheromone heuristics for the standard ant-clustering algorithm, together with a cooling schedule that improves its convergence properties.
Abstract: Among the many bio-inspired techniques, ant-based clustering algorithms have received special attention from the community over the past few years for two main reasons. First, they are particularly suitable to perform exploratory data analysis and, second, they still require much investigation to improve performance, stability, convergence, and other key features that would make such algorithms mature tools for diverse applications. Under this perspective, this paper proposes both a progressive vision scheme and pheromone heuristics for the standard ant-clustering algorithm, together with a cooling schedule that improves its convergence properties. The proposed algorithm is evaluated in a number of well-known benchmark data sets, as well as in a real-world bioinformatics dataset. The achieved results are compared to those obtained by the standard ant clustering algorithm, showing that significant improvements are obtained by means of the proposed modifications. As an additional contribution, this work also provides a brief review of ant-based clustering algorithms. Povzetek (translated): The article describes an improved clustering algorithm based on the ant colony approach.
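The standard ant-clustering algorithm that the paper builds on uses Lumer-Faieta-style pick/drop probabilities, and a cooling schedule like the one proposed damps random activity over time. The constants and the geometric schedule below are illustrative assumptions, not the paper's exact heuristics:

```python
# Generic Lumer-Faieta-style pick/drop probabilities for ant clustering.
# K_PICK, K_DROP and the cooling factor are invented illustrative values.
K_PICK, K_DROP = 0.1, 0.15

def p_pick(f):
    """Probability that an unladen ant picks up an item whose local
    neighborhood similarity is f (low similarity -> likely pick-up)."""
    return (K_PICK / (K_PICK + f)) ** 2

def p_drop(f):
    """Probability that a laden ant drops its item at a site with
    similarity f (high similarity -> likely drop)."""
    return (f / (K_DROP + f)) ** 2

def cooled(p, temperature):
    # A cooling schedule scales down random pick/drop activity so that
    # clusters stabilize late in the run (improved convergence).
    return p * temperature

temperature = 1.0
for step in range(100):
    temperature *= 0.97     # geometric cooling, an illustrative choice

pick_low = p_pick(0.05)    # dissimilar neighborhood: often picked up
pick_high = p_pick(0.9)    # similar neighborhood: rarely picked up
drop_low = p_drop(0.05)
drop_high = p_drop(0.9)
late_pick = cooled(pick_low, temperature)   # damped late in the run
```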

86 citations


Journal ArticleDOI
TL;DR: A method that takes advantage of the relationship between fuzzy set theory and matrix game theory can be offered for multicriteria decision-making when selecting variants of water supply systems.
Abstract: When handling engineering problems associated with optimal alternative selection, a researcher often deals with insufficiently accurate data. The alternatives are usually assessed by applying several different criteria. A method that takes advantage of the relationship between fuzzy set theory and matrix game theory can be offered for multicriteria decision-making. Practical investigations of selecting variants of water supply systems have already been discussed.

81 citations


Journal Article
TL;DR: A color image quantization algorithm based on Particle Swarm Optimization (PSO) is developed and generally results in a significant improvement of image quality compared to other well-known approaches.
Abstract: A color image quantization algorithm based on Particle Swarm Optimization (PSO) is developed in this paper. PSO is a population-based optimization algorithm modeled after the simulation of the social behavior of bird flocks and follows similar steps as evolutionary algorithms to find near-optimal solutions. The proposed algorithm randomly initializes each particle in the swarm to contain K centroids (i.e. color triplets). The K-means clustering algorithm is then applied to each particle at a user-specified probability to refine the chosen centroids. Each pixel is then assigned to the cluster with the closest centroid. The PSO is then applied to refine the centroids obtained from the K-means algorithm. The proposed algorithm is then applied to commonly used images. It is shown from the conducted experiments that the proposed algorithm generally results in a significant improvement of image quality compared to other well-known approaches. The influence of different values of the algorithm control parameters is studied. Furthermore, the performance of different versions of PSO is also investigated. Povzetek (translated): An evolutionary algorithm based on bird flocking is used for color image processing.
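The K-means refinement step applied to each particle can be sketched as follows. The random "image", the palette size K, and the omission of the PSO velocity/position update are simplifications for illustration:

```python
import numpy as np

# One K-means refinement pass as applied to a PSO particle (a candidate
# palette of K color triplets). Toy data; the PSO update itself is omitted.
rng = np.random.default_rng(0)
pixels = rng.integers(0, 256, size=(500, 3)).astype(float)  # N RGB pixels
K = 4                                                       # palette size

def assign(pixels, centroids):
    # Each pixel goes to the cluster with the closest centroid.
    d = np.linalg.norm(pixels[:, None, :] - centroids[None, :, :], axis=2)
    return d.argmin(axis=1)

def kmeans_step(pixels, centroids):
    labels = assign(pixels, centroids)
    new = centroids.copy()
    for k in range(K):
        members = pixels[labels == k]
        if len(members):
            new[k] = members.mean(axis=0)   # move centroid to cluster mean
    return new

def quantization_mse(pixels, centroids):
    labels = assign(pixels, centroids)
    return float(np.mean(np.sum((pixels - centroids[labels]) ** 2, axis=1)))

particle = rng.integers(0, 256, size=(K, 3)).astype(float)
before = quantization_mse(pixels, particle)
particle = kmeans_step(pixels, particle)
after = quantization_mse(pixels, particle)   # never worse than `before`
```

The refinement pass is monotone: reassigning pixels and recomputing means cannot increase the quantization error, which is why it is useful for polishing the centroids a particle proposes.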

78 citations


Journal Article
TL;DR: A large-scale comparison on 27 standard benchmark datasets with other state-of-the-art algorithms and ensembles is performed using the simple Bayesian algorithm as base learner, and the proposed technique was more accurate in most cases.
Abstract: The ensembles of simple Bayesian classifiers have traditionally not been a focus of research. The reason is that simple Bayes is an extremely stable learning algorithm, and most ensemble techniques, such as bagging, are mainly variance reduction techniques and thus cannot benefit from its integration. However, simple Bayes can be effectively used in ensemble techniques that also perform bias reduction, such as LogitBoost. LogitBoost, however, requires a regression algorithm as base learner. For this reason, we slightly modify the simple Bayesian classifier so that it can run as a regression method. Finally, we performed a large-scale comparison on 27 standard benchmark datasets with other state-of-the-art algorithms and ensembles using the simple Bayesian algorithm as base learner, and the proposed technique was more accurate in most cases. Povzetek (translated): The simple Bayesian classifier is used in a variant of the LogitBoost algorithm.
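One common way to let a classifier such as naive Bayes act as the regression base learner LogitBoost needs is to discretize the numeric target and predict a probability-weighted mean of bin centers. The sketch below illustrates that general idea with invented toy data; it is not necessarily the paper's exact modification:

```python
from collections import defaultdict

# Naive Bayes repurposed as a regressor: learn P(bin | x) over a
# discretized target, then predict the probability-weighted mean of
# the bin centers. Toy data and binning are illustrative assumptions.

def fit_nb(X, bins):
    counts = defaultdict(lambda: defaultdict(lambda: defaultdict(int)))
    priors = defaultdict(int)
    for x, b in zip(X, bins):
        priors[b] += 1
        for j, v in enumerate(x):
            counts[b][j][v] += 1
    return priors, counts

def predict(x, priors, counts, centers):
    n = sum(priors.values())
    post = {}
    for b, pb in priors.items():
        p = pb / n
        for j, v in enumerate(x):
            p *= (counts[b][j][v] + 1) / (pb + 2)   # Laplace smoothing
        post[b] = p
    z = sum(post.values())
    # Continuous output: expectation of bin centers under the posterior.
    return sum(centers[b] * p / z for b, p in post.items())

X = [(0,), (0,), (1,), (1,)]
y = [0.1, 0.2, 0.9, 1.0]
bins = [0, 0, 1, 1]               # y discretized into two bins
centers = {0: 0.15, 1: 0.95}      # mean of y within each bin
priors, counts = fit_nb(X, bins)
pred = predict((1,), priors, counts, centers)
```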

69 citations


Journal Article
TL;DR: A comprehensive view of the links between computational intelligence and data mining is given, and results illustrate that CI based tools can be applied in a synergistic manner through the nine steps of knowledge discovery.
Abstract: Janos Abonyi and Balazs Feil, University of Veszprem, Department of Process Engineering, P.O. Box 158, H-8201 Veszprem, Hungary, abonyij@fmt.vein.hu, www.fmt.vein.hu/softcomp. Ajith Abraham, School of Computer Science and Engineering, Chung-Ang University, Seoul, S. Korea, ajith.abraham@ieee.org, http://ajith.softcomputing.net. Keywords: KDD, Computational Intelligence, Soft Computing, Fuzzy Classifier System, Rule Base Reduction, Visualization. Received: December 20, 2004. This paper aims to give a comprehensive view of the links between computational intelligence and data mining. Further, a case study is given in which the extracted knowledge is represented by fuzzy rule-based expert systems obtained by soft computing based data mining algorithms. It is recognized that both model performance and interpretability are of major importance, and effort is required to keep the resulting rule bases small and comprehensible. Therefore, CI technique based data mining algorithms have been developed for feature selection, feature extraction, model optimization and model reduction (rule base simplification). Application of these techniques is illustrated using the Wine data classification problem. The results illustrate that CI based tools can be applied in a synergistic manner through the nine steps of knowledge discovery.

57 citations


Journal Article
TL;DR: This paper presents recent trends in modelling agents and multi-agent systems, and then reviews the different activities in the agent development process: from analysis and design to implementation, verification and, finally, testing.
Abstract: Carole Bernon, IRIT - University Paul Sabatier, 118 Route de Narbonne, 31062 Toulouse, France. E-mail: bernon@irit.fr, http://www.irit.fr/SMAC. Massimo Cossentino, ICAR-CNR, National Research Council, Viale delle Scienze, ed. 11, 90128 Palermo, Italy. E-mail: cossentino@pa.icar.cnr.it, http://www.pa.icar.cnr.it/~cossentino. Juan Pavon, Fac. Informatica, Universidad Complutense Madrid, Ciudad Universitaria s/n, 28040 Madrid, Spain. E-mail: jpavon@sip.ucm.es, http://grasia.fdi.ucm.es/jpavon. Keywords: Agent Oriented Software Engineering (AOSE), Agent oriented methodologies, Multi-Agent Systems. Received: June 31, 2005. The agent oriented approach is making great strides towards its (not yet reached) maturity; from a software engineering point of view, it is today positively used for the analysis and design of complex systems. In this paper, which is related to the activity of the AgentLink AOSE TFG (Agent Oriented Software Engineering Technical Forum Group), we provide a perspective on current research trends in this area, with specific attention to results coming from European groups. We start with a discussion of what agents are, especially from the perspective of the software engineer. We present recent trends in modelling agents and multi-agent systems, and then we review the different activities in the agent development process: from analysis and design to implementation, verification and, finally, testing. Povzetek (translated): A summary of European AOSE research is given.

47 citations


Journal ArticleDOI
TL;DR: The resulting framework for Enterprise modelling and Knowledge-based IS engineering - Enterprise meta-model (EMM) - is developed and presented in this paper.
Abstract: The paper deals with Knowledge-based Information Systems (IS) engineering. The Enterprise management functions, processes and their interactions are considered as the major components of the domain knowledge. This is the peculiarity of this approach to Enterprise modelling for IS engineering. The resulting framework for Enterprise modelling and Knowledge-based IS engineering - Enterprise meta-model (EMM) - is developed and presented in this paper. The architecture of the advanced CASE system is also discussed in this paper.

Journal Article
TL;DR: This paper proposes an alternative fuzzy clustering algorithm, Simulated Annealing Fuzzy Clustering (SAFC), that improves and extends the ideas present in VFC-SA and demonstrates that the SAFC algorithm is able to find better clusters than the other two methods.
Abstract: Classification is an important research area in cancer diagnosis. Fuzzy C-means (FCM) is one of the most widely used fuzzy clustering algorithms in real world applications. However, two major limitations exist in this method. The first is that a predefined number of clusters must be given in advance. The second is that the FCM technique can get stuck in sub-optimal solutions. In order to overcome these two limitations, Bandyopadhyay proposed a Variable String Length Simulated Annealing (VFC-SA) algorithm. Nevertheless, when this algorithm was implemented, it was found that sub-optimal solutions were still obtained in certain circumstances. In this paper, we propose an alternative fuzzy clustering algorithm, Simulated Annealing Fuzzy Clustering (SAFC), that improves and extends the ideas present in VFC-SA. The data from seven oral cancer patients' tissue samples, obtained through Fourier Transform Infrared Spectroscopy (FTIR), were clustered using FCM, VFC-SA and the proposed SAFC algorithm. Experimental results are provided and comparisons are made to illustrate that the SAFC algorithm is able to find better clusters than the other two methods.
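The fuzzy membership update at the core of FCM (and hence of VFC-SA and SAFC) can be written compactly. The 1-D toy data, the two centers, and the fuzzifier m = 2 below are illustrative choices:

```python
import numpy as np

# Standard FCM membership update: u[i, k] = 1 / sum_j (d_ik/d_ij)^(2/(m-1)).
# Toy 1-D data with two obvious groups; real FTIR spectra are high-dimensional.
m = 2.0                                        # fuzzifier
data = np.array([[0.0], [0.1], [5.0], [5.1]])
centers = np.array([[0.05], [5.05]])

d = np.linalg.norm(data[:, None, :] - centers[None, :, :], axis=2)
d = np.maximum(d, 1e-12)                       # guard against zero distance
ratio = (d[:, :, None] / d[:, None, :]) ** (2.0 / (m - 1.0))
u = 1.0 / ratio.sum(axis=2)                    # u[i, k]: membership of point i

row_sums = u.sum(axis=1)                       # each row sums to one
```

Points near a center receive membership close to 1 in that cluster; VFC-SA and SAFC wrap updates like this in a simulated-annealing search over the number and position of centers.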

Journal Article
TL;DR: Integrating Multi-Objective Genetic Algorithm and Validity Analysis for Locating and Ranking Alternative Clustering.
Abstract: Integrating Multi-Objective Genetic Algorithm and Validity Analysis for Locating and Ranking Alternative Clustering

Journal ArticleDOI
TL;DR: An improvement based on Tzeng's protocol is proposed that achieves forward secrecy; the proposed protocol withstands passive attacks and is secure against impersonation attacks.
Abstract: Recently, Tzeng proposed a provably secure and fault-tolerant conference-key agreement protocol. It requires only a constant number of rounds to establish a conference key among all honest participants. This article will show that Tzeng's protocol does not offer forward secrecy. We say that a conference-key agreement protocol offers forward secrecy if compromise of the long-term secret key of any participant does not result in the compromise of previously established conference keys. This property is important and has been included in most key agreement protocols and standards. In this paper, an improvement based on Tzeng's protocol is proposed, and it achieves forward secrecy. Under the Diffie-Hellman decision problem assumption and the random oracle model, we show that the proposed protocol can withstand passive attacks and is secure against impersonation attacks. The improved protocol requires a constant number of rounds to compute a conference key, and it provides fault-tolerance.
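Forward secrecy of this kind is conventionally obtained by deriving the session key from ephemeral Diffie-Hellman exponents that are erased after the session, while long-term keys are used only for authentication. The sketch below shows that mechanism with toy parameters (the 32-bit prime is far too small for real use); it is not Tzeng's protocol or the improved protocol itself:

```python
import hashlib
import secrets

# Ephemeral Diffie-Hellman: the session key depends only on one-time
# exponents a, b. Once those are deleted, leaking a long-term signing
# key reveals nothing about past keys (forward secrecy).
p = 0xFFFFFFFB   # a prime, but FAR too small for real use; demo only
g = 5

a = secrets.randbelow(p - 2) + 1     # Alice's ephemeral secret
b = secrets.randbelow(p - 2) + 1     # Bob's ephemeral secret
A = pow(g, a, p)                     # exchanged in the clear (and signed
B = pow(g, b, p)                     # with the long-term keys in practice)

# Both sides derive the same key from the shared secret g^(ab) mod p.
k_alice = hashlib.sha256(str(pow(B, a, p)).encode()).hexdigest()
k_bob = hashlib.sha256(str(pow(A, b, p)).encode()).hexdigest()
```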

Journal ArticleDOI
TL;DR: The fuzzy curve technique, introduced by Lin and Cunningham (1995), is extended to an advantageous fuzzy surface technique; the latter is used to quickly build a coarse model of the system from a subset of the initial candidate inputs.
Abstract: The problem of system input selection, dubbed in the literature the Type I Structure Identification problem, is addressed in this paper using an effective novel method. More specifically, the fuzzy curve technique, introduced by Lin and Cunningham (1995), is extended to an advantageous fuzzy surface technique; the latter is used to quickly build a coarse model of the system from a subset of the initial candidate inputs. A simple genetic algorithm, enhanced with a local search operator, is used to find an optimal subset of necessary and sufficient inputs by considering more than one input jointly. Extensive simulation results on both artificial and real world data have demonstrated comparatively the advantages of the proposed method.

Journal Article
TL;DR: A novel approach for performing similarity search in time series data is introduced, based on the intuition that similar time sequences will have similar variations in their slopes and, consequently, in their time weighted slopes.
Abstract: Similarity search in time series data is an area of active interest in the data mining community. In this paper we introduce a novel approach for performing similarity search in time series data. This technique is based on the intuition that similar time sequences will have similar variations in their slopes and, consequently, in their time weighted slopes. The proposed technique is capable of handling variable length queries and also works irrespective of different baselines and scaling factors. Povzetek (translated): A new method for time series data mining using similarity search is described.
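The slope intuition can be sketched directly. The normalization step below, which provides the invariance to scaling factors, is an illustrative choice rather than the paper's exact formulation:

```python
# Slope-based similarity for time series: similar sequences have similar
# slope patterns regardless of baseline offset or uniform scaling.

def slopes(ts):
    """Consecutive slopes of a series of (time, value) pairs."""
    return [(v2 - v1) / (t2 - t1) for (t1, v1), (t2, v2) in zip(ts, ts[1:])]

def normalize(xs):
    # Scale-normalize the slope pattern (an illustrative choice).
    mx = max(abs(x) for x in xs) or 1.0
    return [x / mx for x in xs]

def slope_distance(ts_a, ts_b):
    sa, sb = normalize(slopes(ts_a)), normalize(slopes(ts_b))
    return sum((x - y) ** 2 for x, y in zip(sa, sb)) ** 0.5

base = [(0, 1.0), (1, 3.0), (2, 2.0), (3, 4.0)]
shifted = [(t, v + 10.0) for t, v in base]      # different baseline
scaled = [(t, v * 3.0) for t, v in base]        # different scaling factor
other = [(0, 4.0), (1, 1.0), (2, 5.0), (3, 0.0)]  # genuinely different shape

d_shift = slope_distance(base, shifted)   # 0: slopes ignore baseline
d_scale = slope_distance(base, scaled)    # 0: normalization ignores scale
d_other = slope_distance(base, other)     # clearly positive
```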

Journal Article
TL;DR: For a long time, the role of the environment has been underestimated in multiagent systems research, and originating from research on behavior-based agents and situated multi agent systems, the importan ...
Abstract: For a long time, the role of the environment has been underestimated in multiagent systems research. Originating from research on behavior-based agents and situated multiagent systems, the importan ...

Journal Article
TL;DR: An FPGA-Based Parallel Distributed Arithmetic Implementation of the 1-D Discrete Wavelet Transform.
Abstract: An FPGA-Based Parallel Distributed Arithmetic Implementation of the 1-D Discrete Wavelet Transform

Journal Article
TL;DR: This paper challenges the implicit hypothesis in current MAS research that agents are related to only one environment capturing all the different aspects of the application domain, by enabling multiple occurrences of the agent-environment relationship.
Abstract: Within the multi-agent systems (MASs) paradigm, the concept of the environment plays a central role. In fact, autonomous agents only exist when they are deployed in an environment. Still, there is an implicit hypothesis in current trends in MASs that the agents are related to only one environment that captures all the different aspects of the application domain. In this paper we challenge this implicit hypothesis by enabling multiple occurrences of the agent-environment relationship. This brings clarity and modularity to the design and implementation of complex MASs, since each environment targets a specific aspect of the application. Thanks to the proposed characterization of the agent-environment relationship, the agents are still offered a unified view of all the environments.

Journal ArticleDOI
TL;DR: The resulting detection performance of the proposed method using empirical mode decomposition was better than with the discrete wavelet transform, yielding 70% correctly identified "healthy subject" cases and 82%, 97% and 100% correctly identified "cataract" cases in the incipient, immature and mature cataract subject groups, respectively.
Abstract: This paper presents a new approach to automatic human cataract detection based on ultrasound signal processing. Two signal decomposition techniques, empirical mode decomposition and the discrete wavelet transform, are used in the presented method, and their performance when applied to this specific ultrasound signal is compared. The described method includes ultrasonic signal decomposition, to enhance signal specific features and increase the signal to noise ratio, followed by decision rules based on adaptive thresholding. The resulting detection performance of the proposed method using empirical mode decomposition was better than with the discrete wavelet transform, and resulted in 70% correctly identified "healthy subject" cases and 82%, 97% and 100% correctly identified "cataract" cases in the incipient, immature and mature cataract subject groups, respectively. A discussion is given of the reasons for the differing results and of the differences between the two signal decomposition techniques used.

Journal Article
TL;DR: A general neural network model is proposed that resembles the interactions between glucose concentration levels and the amount of insulin injected in the bodies of diabetics.
Abstract: In this work we extend our investigations of a general neural network model that resembles the interactions between glucose concentration levels and the amount of insulin injected in the bodies of diabetics. We use real data for 70 different diabetic patients and build our model on it. Two types of neural networks (NNs) are used in building the model; the first is a multilayer feed forward neural network trained with the Levenberg-Marquardt (LM) algorithm, the other is based on Polynomial Networks (PNs). We compare the two models based on their performance. The design stages mainly consist of training, testing, and validation. A linear regression between the output of the multilayer feed forward neural network trained by the LM algorithm (abbreviated LM NN) and the actual outputs shows that the LM NN is the better model. The PNs have proved to be good static "mappers", but their performance degrades when used to model a dynamical system. The LM NN based model still proved that it can potentially be used to build a theoretical general regulator controller for insulin injections and, hence, can give an idea of the types and amounts of insulin required for patients. Povzetek (translated): Based on data on 70 patients, a neural model of the relationship between insulin and glucose is developed.

Journal ArticleDOI
TL;DR: A modernization of the signature scheme published in (Sakalauskas, 2004) is presented, which differs from the prototype in its structure and uses more general algebraic systems.
Abstract: A modernization of the signature scheme published in (Sakalauskas, 2004) is presented. This scheme differs from the prototype in its structure and uses more general algebraic systems. It has higher security and a shorter key length, and is also more computationally effective. The newly introduced algebraic structures, a semiring and a semimodule, are mutually compatible algebraic systems. The semiring is a set of operators acting in a semimodule as endomorphisms. It is postulated that the action operation has a one-way function (OWF) property. The compatibility of the two algebraic structures means that the action operation has the right and left distributivity property with respect to the additive operations defined in the semimodule and the semiring. Two other essential OWFs are defined. The latter are based on known constructions and have greater complexity than other recognized hard problems, such as the conjugator search problem in noncommutative groups.

Journal ArticleDOI
TL;DR: A decision support system, TRSS (Traffic Regulation Support System), a supervision environment for the regulation of urban transportation systems based on the regulation operator's decision-making process, is presented.
Abstract: Offering high quality services, when users are increasingly demanding and competition ever harder, is now a major problem that transportation companies face. Ensuring regular traffic thus requires identifying the randomly occurring disturbances that affect the transportation system and eliminating or reducing their impact on the traffic. This paper presents a decision support system, TRSS (Traffic Regulation Support System). TRSS is a supervision environment for the regulation of urban (tram and bus) transportation systems, based on the regulation operator's decision-making process. It provides the operator with the information he needs to identify disturbances and evaluate potential corrective actions to be carried out, according to the regulation strategy he has selected. The first part of the paper presents the decision model we work with. The second part deals with the functional model used in the decision support system. Decision support for transportation and the characteristics of a DSS for a transportation system are described in the third part. In the fourth part, we present the components of the TRSS decision-making supervision tool. In the fifth part, we present the evaluation criteria, and the sixth part is devoted to the presentation of the results.

Journal ArticleDOI
TL;DR: This paper considers realization-independent testing and the impact of circuit realization on fault coverage, investigating two fault models used for test generation for circuits described at the system description level.
Abstract: The design complexity of systems on a chip drives the need to reuse legacy or intellectual property cores, whose gate-level implementation details are unavailable. In this paper we consider realization-independent testing and the impact of circuit realization on fault coverage. We investigated two fault models (the input-output pin pair fault and the input-input-output pin triplet fault) that are used in test generation for circuits described at the system description level. Test generation on the system-level model is preferable if the effort and duration of the test supplement activities are less than the effort and duration of test generation on the gate-level model. The test set for the black-box model is larger than the test set for a particular realization of the circuit. However, large test sets for the black-box model can be compacted by analysis not only of the stuck-at faults, but also of various defects of the particular realization.

Journal ArticleDOI
TL;DR: A hierarchical decision making framework for the evaluation and improvement/redesign of composite systems is described, based on Hierarchical Morphological Multicriteria Design and the corresponding morphological clique problem, which realize a "partitioning/synthesis macroheuristic".
Abstract: The article describes a hierarchical decision making framework for the evaluation and improvement/redesign of composite systems. The framework is based on Hierarchical Morphological Multicriteria Design (HMMD) and the corresponding morphological clique problem, which realize a "partitioning/synthesis macroheuristic". The system evaluation process consists in the hierarchical integration of expert judgment (as ordinal estimates), via a method of integration tables or the above-mentioned morphological approach. As a result, an ordinal multi-state classification is realized. The system improvement/redesign process is examined as the selection and planning of redesign operations while taking into account operation attributes (e.g., required resources, effectiveness) and binary relations (equivalence, complementarity, precedence) on the operation sets. For modeling the system improvement process, several combinatorial optimization models are used (knapsack problem, multiple choice problem, etc.), including HMMD. The suggested approach is illustrated by a realistic numerical example for a two-floor building. This applied problem is examined from the viewpoint of earthquake engineering.
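One of the combinatorial models mentioned for the improvement phase, the knapsack problem, can be sketched with a standard 0/1 dynamic program; the operation costs and effectiveness values below are invented for illustration:

```python
# Selecting redesign operations under a resource budget as a 0/1
# knapsack: maximize total effectiveness with total required resources
# no greater than the budget. Operation list is an illustrative toy.
ops = [(3, 4), (2, 3), (4, 5), (1, 1)]   # (required resource, effectiveness)
budget = 6

dp = [0] * (budget + 1)   # dp[c] = best effectiveness using resource c
for cost, eff in ops:
    # Iterate budget downwards so each operation is used at most once.
    for c in range(budget, cost - 1, -1):
        dp[c] = max(dp[c], dp[c - cost] + eff)

best = dp[budget]
```

Binary relations such as precedence between operations, which the paper also handles, would require extending this basic model.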

Journal ArticleDOI
TL;DR: Publicly available C and C++ packages for interval arithmetic are investigated and experimentally compared, with suggestions as to which packages are preferable and when.
Abstract: In this paper publicly available C and C++ packages for interval arithmetic are investigated and experimentally compared. The results of the comparison suggest which packages are preferable and when.
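What such interval-arithmetic packages provide can be sketched with a minimal interval type; note that real packages also control outward rounding of the floating-point endpoints, which this sketch ignores:

```python
# A minimal interval type: the result of an operation encloses every
# possible result of applying it to points of the operands.

class Interval:
    def __init__(self, lo, hi):
        self.lo, self.hi = lo, hi

    def __add__(self, other):
        return Interval(self.lo + other.lo, self.hi + other.hi)

    def __mul__(self, other):
        # Endpoint products: the extremes bound every x*y in the operands.
        ps = [self.lo * other.lo, self.lo * other.hi,
              self.hi * other.lo, self.hi * other.hi]
        return Interval(min(ps), max(ps))

    def __contains__(self, x):
        return self.lo <= x <= self.hi

    def __repr__(self):
        return f"[{self.lo}, {self.hi}]"

a = Interval(1.0, 2.0)
b = Interval(-3.0, 0.5)
s = a + b        # [-2.0, 2.5]
p = a * b        # [-6.0, 1.0]
```

The packages compared in the paper differ mainly in how they implement exactly this enclosure guarantee efficiently and with correct rounding.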

Journal Article
TL;DR: Clustering Algorithms in Process Monitoring and Control Application to Continuous Digesters.
Abstract: Clustering Algorithms in Process Monitoring and Control Application to Continuous Digesters

Journal ArticleDOI
TL;DR: Coordination Artifacts: A Unifying Abstraction for Engineering Environment-Mediated Coordination in MAS.
Abstract: Coordination Artifacts: A Unifying Abstraction for Engineering Environment-Mediated Coordination in MAS

Journal ArticleDOI
TL;DR: A novel and fast fingerprint identification technique using ANNs is presented, aiming to find a configuration of the ANN system that provides good generalization ability and sufficient discrimination capability at the same time.
Abstract: A novel and fast fingerprint identification technique using ANNs is presented in this paper. The proposed method uses a novel clustering algorithm to detect similar feature groups from multiple template images generated from the same finger and to create the cluster core set. The proposed feature extraction scheme is based on reducing the information content to the required minimum and defining which parts of the image are crucial and which are omitted. In the new method, a quick response was achieved by manipulating the search order inside the experimental databases. The novelty of our proposed method is to find a configuration of the ANN system that provides good generalization ability and sufficient discrimination capability at the same time. To achieve that goal, we have introduced a new supervised recurrent neural network. The experimental results demonstrate that this similarity search approach with ANNs is suitable for one-to-many matching of fingerprints on large databases. Povzetek (translated): Fast fingerprint recognition is realized with ANNs and clustering.

Journal ArticleDOI
TL;DR: Although the question of how to perform quality size measurements in object-oriented projects remains unanswered, the paper gives valuable information on the topic, supported by mathematics.
Abstract: Software size is an important attribute in software project planning. Several methods for software size estimation are available; most of them are based on function points. Albrecht introduced function points as a technologically independent method with its own software abstraction layer. However, it is difficult to apply original abstraction elements to current technologies. Therefore researchers introduced additional rules and mappings for object-based solutions. In this paper several mapping strategies are discussed and compared. Based on the similarities in compared mappings, a common mapping strategy is then defined. This mapping is then tested on the reference application portfolio containing five applications. The aim of the test scenario is to evaluate the impact of the diverse detail levels in the class diagrams on software size measurement. Although the question of how to perform quality size measurements in object-oriented projects remains unanswered, the paper gives valuable information on the topic, supported by mathematics.