
Showing papers in "Information Technology and Control in 2009"


Journal ArticleDOI
TL;DR: This paper focuses on pixel-level analysis and segmentation of smoke-colored pixels for automated forest fire detection and suggests that peak performance is a distinctive feature of the algorithm itself rather than of the algorithm-color space combination.
Abstract: This paper focuses on pixel-level analysis and segmentation of smoke-colored pixels for automated forest fire detection. Variations in smoke color tones, environmental illumination, atmospheric conditions and the low quality of images of wide outdoor areas make smoke detection a complex task. In order to find an efficient combination of a color space and a pixel-level smoke segmentation algorithm, several color space transformations are evaluated by measuring the separability between the smoke and non-smoke pixel classes. However, exhaustive evaluation of the histogram-based smoke segmentation algorithms in different color spaces suggests that peak performance is a distinctive feature of the algorithm itself rather than of the algorithm-color space combination.
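
For illustration, class separability between smoke and non-smoke pixels in a single color channel can be scored with a simple Fisher-style criterion. The sketch below assumes two pre-labelled pixel samples and is not the authors' exact evaluation procedure; the channel values are invented.

```python
import numpy as np

def fisher_separability(smoke_vals, non_smoke_vals):
    """Fisher discriminant ratio for one color channel:
    (difference of class means)^2 / (sum of class variances).
    Higher values suggest the channel separates the classes better."""
    m1, m2 = np.mean(smoke_vals), np.mean(non_smoke_vals)
    v1, v2 = np.var(smoke_vals), np.var(non_smoke_vals)
    return (m1 - m2) ** 2 / (v1 + v2 + 1e-12)

# Hypothetical per-channel values sampled from labelled training images.
rng = np.random.default_rng(0)
smoke_cr = rng.normal(120, 8, 500)       # e.g. Cr channel of smoke pixels
non_smoke_cr = rng.normal(140, 15, 500)  # Cr channel of background pixels
print(fisher_separability(smoke_cr, non_smoke_cr))
```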

79 citations


Journal ArticleDOI
TL;DR: This work proposes to use the Z notation to formalise the authors' previously introduced ontology-based semi-formal method for the development of application domain rules, making such rules an important and integral part of each application domain.
Abstract: Ontologies are nowadays widely used in the development of modern information systems (IS), since they are suitable for representing application domain knowledge. However, some aspects of ontology-based IS still require further development. We propose a formal method for transforming ontology axioms into application domain rules, making them an important and integral part of each application domain, used to constrain or direct different aspects of business. Such rules can be consecutively transformed into an executable form and implemented in the software system of an IS. We propose to use the Z notation to formalise the authors' previously introduced ontology-based semi-formal method for the development of application domain rules.

30 citations


Journal ArticleDOI
TL;DR: An improved protocol is proposed to remedy the attacks on Yi et al.'s scheme; with it, protection against these attacks can be assured and the security of key distribution on the mobile network is enhanced.
Abstract: An optimized certificate-based protocol for mobile networks with authentication and security has been proposed by Yi et al. This protocol allows efficient computation and requires less storage in the mobile device. As a result, less power is consumed in the mobile device. However, in 1999, 2002, and 2003, Martin et al., Wong, and Laih et al., respectively, showed that Yi et al.'s scheme is vulnerable to certain attacks, but did not remedy them. In this paper, we propose an improved protocol to remedy these attacks. With the new protocol, protection against the attacks can be assured, and the security of key distribution on the mobile network is enhanced as well.

17 citations


Journal Article
TL;DR: Estimating the probability of failure due to a rare and abnormal situation may require dealing with information that is incomplete and involves uncertainties; two sources of information are applied to this estimation: a small-size statistical sample and a fragility function.
Abstract: Estimating the probability of failure due to a rare and abnormal situation may face the need to deal with information which is incomplete and involves uncertainties. Two sources of information are applied to this estimation: a small-size statistical sample and a fragility function. This function is used to express aleatory and epistemic uncertainties related to the potential failure. The failure probability is estimated by carrying out Bayesian inference. Bayesian prior and posterior distributions are applied to express the epistemic uncertainty in the failure probability. The central problem of probability estimation is formulated as Bayesian updating with imprecise data. Such data are represented by a set of continuous epistemic probability distributions of fragility function values related to elements of the small-size sample. The Bayesian updating with the set of continuous distributions is carried out by discretising these distributions. The discretisation yields a new sample used for updating. This sample consists of fragility function values which have equal epistemic weights. Several aspects of the numerical implementation of the discretisation and subsequent updating are discussed and illustrated by two examples.
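
A minimal sketch of the discretisation step described above: each continuous epistemic distribution of fragility values is replaced by m equal-weight quantile points, and the pooled points form the updating sample. The epistemic distributions, the Beta prior and the soft-update rule below are illustrative assumptions, not the paper's exact scheme.

```python
import numpy as np
from scipy import stats

def discretise(dist, m):
    """Replace a continuous epistemic distribution by m equal-weight points
    (quantiles at the midpoints of m equal-probability intervals)."""
    probs = (np.arange(m) + 0.5) / m
    return dist.ppf(probs)

# Hypothetical epistemic distributions of fragility-function values,
# one per element of the small-size sample (values live in [0, 1]).
epistemic = [stats.beta(2, 18), stats.beta(3, 12), stats.beta(1.5, 20)]
sample = np.concatenate([discretise(d, m=10) for d in epistemic])

# Illustrative Bayesian update: treat each discretised fragility value y as a
# "soft" failure observation and update a Beta prior on the failure probability.
a0, b0 = 1.0, 1.0                      # non-informative Beta prior
a_post = a0 + sample.sum()
b_post = b0 + (1.0 - sample).sum()
print("posterior mean failure probability:", a_post / (a_post + b_post))
```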

13 citations


Journal ArticleDOI
Anca Ion
TL;DR: An automated method for generating image annotations based on their visual features is described; the semantic rules are represented in Prolog and can be shared and modified depending on updates in the respective domain.
Abstract: In this paper we describe an automated method for generating image annotations based on their visual features. The semantic rules map combinations of visual characteristics (colour, texture, shape, position, etc.) to semantic concepts and capture an expert's meaning and understanding of a domain, namely, which visual primitives are definitive for the semantic concepts of an image category. These rules are represented in Prolog and can be shared and modified depending on updates in the respective domain.
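
The paper's rules are written in Prolog; a rough Python analogue of such a rule base, with invented feature predicates and concepts, might look as follows.

```python
# Each rule maps a set of required visual primitives to a semantic concept.
# Feature names and concepts below are invented for illustration only.
RULES = [
    ({"colour:blue", "texture:smooth", "position:top"}, "sky"),
    ({"colour:green", "texture:grainy", "position:bottom"}, "grass"),
    ({"colour:grey", "shape:elongated", "position:middle"}, "road"),
]

def annotate(region_features):
    """Return all concepts whose required primitives are present in a region."""
    return [concept for required, concept in RULES
            if required <= region_features]

print(annotate({"colour:blue", "texture:smooth", "position:top", "size:large"}))
# -> ['sky']
```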

13 citations


Journal ArticleDOI
TL;DR: The complexity of Web component generators that were developed and used for generating Web component instances incorporated into real portal settings is estimated using Kolmogorov complexity measures and cyclomatic complexity.
Abstract: We consider a methodology for the development and application of a class of generators that are externally parameterized tools enabling the generation of Web component instances on demand, depending on the context of use. Such generators are generalized entities of conventional Web components, which indeed are lower-level generators for the portal domain. We use one-stage heterogeneous metaprogramming techniques for implementing the externally parameterized metaprograms as a specification of the generators. Our first contribution is a systemized process for creating the externally parameterized metaprograms for building Web domain generators. The process describes the logical linking into a coherent structure of the following entities: the semantic model for change, the program generator model, the Web component instance model, and the given metalanguages. Our second contribution is the complexity estimation of the Web component generators that were developed and used for generating Web component instances to incorporate them into real portal settings. The complexity is estimated using Kolmogorov complexity measures and cyclomatic complexity. We also analyze specific features and characteristics of the developed generators.
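
Of the two measures mentioned, cyclomatic complexity has a simple closed form, M = E - N + 2P for a control-flow graph with E edges, N nodes and P connected components; a toy computation is shown below (the graph counts are invented).

```python
def cyclomatic_complexity(edges, nodes, components=1):
    """McCabe's cyclomatic complexity: M = E - N + 2P."""
    return edges - nodes + 2 * components

# Hypothetical control-flow graph of a small generated Web component:
# 9 edges, 8 nodes, a single connected component -> M = 3.
print(cyclomatic_complexity(edges=9, nodes=8))
```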

11 citations


Journal ArticleDOI
TL;DR: The resulting combined self-adaptive swarm optimization and genetic algorithm efficiently auto-configures the control parameters of the GA, which leads to excellent quality solutions, especially for the real-life-like (structured) QAP instances.
Abstract: In this paper, some further experiments with the genetic algorithm (GA) for the quadratic assignment problem (QAP) are described. We propose to use a particle-swarm-optimization-based approach for tuning the values of the parameters of the genetic algorithm for solving the QAP. The resulting combined self-adaptive swarm optimization and genetic algorithm efficiently auto-configures the control parameters of the GA, which leads to excellent quality solutions, especially for the real-life-like (structured) QAP instances.
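
A minimal sketch of the tuning idea: a small particle swarm searches over GA control parameters (here crossover and mutation probabilities), scoring each particle by the solution quality the GA achieves with those settings. The function run_ga is a hypothetical stand-in for the authors' GA on a QAP instance, and all constants are assumptions.

```python
import numpy as np

def run_ga(crossover_p, mutation_p):
    """Hypothetical stand-in: runs the GA on a QAP instance with the given
    parameters and returns the best assignment cost (lower is better).
    Here we fake a smooth response with an optimum near (0.8, 0.05)."""
    return (crossover_p - 0.8) ** 2 + 50 * (mutation_p - 0.05) ** 2

rng = np.random.default_rng(1)
n_particles, n_iter = 10, 30
lo, hi = np.array([0.5, 0.001]), np.array([0.99, 0.2])    # parameter bounds
x = rng.uniform(lo, hi, size=(n_particles, 2))             # particle positions
v = np.zeros_like(x)                                        # particle velocities
p_best = x.copy()
p_cost = np.array([run_ga(*xi) for xi in x])
g_best = p_best[p_cost.argmin()].copy()

for _ in range(n_iter):
    r1, r2 = rng.random((2, n_particles, 1))
    v = 0.7 * v + 1.5 * r1 * (p_best - x) + 1.5 * r2 * (g_best - x)
    x = np.clip(x + v, lo, hi)
    cost = np.array([run_ga(*xi) for xi in x])
    improved = cost < p_cost
    p_best[improved], p_cost[improved] = x[improved], cost[improved]
    g_best = p_best[p_cost.argmin()].copy()

print("tuned crossover and mutation probabilities:", g_best)
```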

11 citations


Journal ArticleDOI
TL;DR: This paper presents a method developed by the authors that enables reuse of OCL transformations, their adaptation to various data storage environments, and their evolution by applying attributes and graph merging algorithms.
Abstract: In this paper, we examine OCL-to-code transformations, which are dedicated to modern model and metamodel repositories that face requirements to perform search, validation and transformation of models usually stored in external data storages, e.g. RDBMS. The diversity and fast changes of data storage technologies make the development of such transformations a real challenge. This paper presents a method developed by the authors that enables reuse of OCL transformations, their adaptation to various data storage environments, and their evolution by applying attributes and graph merging algorithms.

8 citations


Journal ArticleDOI
TL;DR: In this paper, pH control problems in fed-batch biochemical processes are analyzed, and an adaptive pH control system based on a gain scheduling approach is proposed, achieving a significant increase in control quality as compared to a standard PI control system.
Abstract: In the paper, pH control problems in fed-batch biochemical processes are analyzed. The process mathematical model was based on first principles and identified using available experimental data. An adaptive pH control system based on a gain scheduling approach was proposed. A significant increase in control quality as compared to a standard PI control system was achieved.
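
The gain scheduling idea can be illustrated with a PI controller whose gains are interpolated from the measured pH, reflecting the strongly varying slope of the titration curve; the scheduling table and process values below are invented and do not reproduce the paper's identified model.

```python
import numpy as np

# Hypothetical scheduling table: controller gains chosen per pH region,
# smaller gains where the titration curve is steepest (near neutrality).
PH_POINTS = np.array([3.0, 5.0, 7.0, 9.0, 11.0])
KP_POINTS = np.array([2.0, 1.0, 0.2, 1.0, 2.0])
TI_POINTS = np.array([60.0, 45.0, 30.0, 45.0, 60.0])   # integral times, s

def scheduled_pi(ph_measured, error, integral, dt):
    """One step of a gain-scheduled PI law: gains interpolated from measured pH."""
    kp = np.interp(ph_measured, PH_POINTS, KP_POINTS)
    ti = np.interp(ph_measured, PH_POINTS, TI_POINTS)
    integral += error * dt
    u = kp * (error + integral / ti)      # base/acid feed-rate adjustment
    return u, integral

u, integ = scheduled_pi(ph_measured=6.2, error=0.8, integral=0.0, dt=1.0)
print(u)
```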

7 citations


Journal ArticleDOI
TL;DR: The presented technique generates test cases from UML communication and state machine diagrams, which allows testing correct class integration in object-oriented software.
Abstract: The aim of this paper is to describe a systematic way to construct tests from a formal software specification for validating a system implementation. In order to achieve this goal, the specification could be extended to create UML states that directly address those aspects of the system we wish to test. The presented technique generates test cases from UML communication and state machine diagrams, which allows testing correct class integration in object-oriented software. UML state machine diagrams provide a good basis for test generation in a form that can be easily manipulated. The concept of the technique and an example model are presented.

6 citations


Journal ArticleDOI
TL;DR: Parameter identification for problems where losses arising from overestimation and underestimation differ and can be described by an asymmetrical polynomial function is investigated in this paper, using the Bayes decision rule, which allows minimizing potential losses.
Abstract: Parameter identification for problems where losses arising from overestimation and underestimation are different and can be described by an asymmetrical polynomial function is investigated in this paper. The Bayes decision rule, which allows minimizing potential losses, is used. The calculation algorithms are based on the nonparametric methodology of statistical kernel estimators, which frees the method from dependence on the distribution type. Three basic cases are considered in detail: a linear, a quadratic, and finally a general concept for a higher-degree polynomial, with the cubic case described in detail as an example. For each of them, the final result constitutes a numerical procedure enabling effective calculation of the optimal value of the parameter in question, presented in a complete form which demands neither detailed knowledge of the theoretical aspects nor laborious research from the user. Although the above method was investigated from the point of view of automatic control problems, it is universal in character and can be applied to a wide range of tasks, also outside the realm of engineering.
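
For the linear case the idea can be sketched directly: estimate the parameter's density with a kernel estimator and pick the value that minimizes the expected asymmetric loss (weight a for overestimation, b for underestimation). The sample, loss weights and grid below are assumptions, not the paper's procedures.

```python
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(2)
sample = rng.normal(10.0, 2.0, 40)     # hypothetical observations of the parameter
kde = gaussian_kde(sample)              # nonparametric kernel density estimator

a, b = 3.0, 1.0                         # overestimation is three times as costly

grid = np.linspace(sample.min() - 5, sample.max() + 5, 2000)
density = kde(grid)
dx = grid[1] - grid[0]

def expected_loss(estimate):
    over = np.maximum(estimate - grid, 0.0)    # loss where the true value is below
    under = np.maximum(grid - estimate, 0.0)   # loss where the true value is above
    return np.sum((a * over + b * under) * density) * dx

losses = [expected_loss(e) for e in grid]
best = grid[int(np.argmin(losses))]
print("loss-optimal estimate:", best)   # lies below the mean since a > b
```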

Journal ArticleDOI
TL;DR: A new approach for functional delay test enrichment is described, which enriches the test patterns using fault simulation and is fast because it does not require test generation.
Abstract: The testing phase is becoming the most crucial part of the overall design process, which delays the time-to-market of digital devices. In order to reduce the complexity of test generation and to decrease the time-to-market, one needs to begin test design at higher levels of abstraction. In this paper a new approach for functional delay test enrichment is described. The test enrichment procedure does not increase the test size and is fast because it does not require test generation. The described approach enriches the test patterns using fault simulation. The performed experiments demonstrate the effectiveness of the proposed approach.

Journal ArticleDOI
TL;DR: The design and implementation of adaptive fuzzy controllers for the control of a coupled level and pressure process are described and the results of the experiments prove the practical relevance of the design strategy of an adaptive fuzzy controller.
Abstract: Due to the high complexity of chemical processes and their control systems, adaptive controllers are frequently applied in practice. The present paper describes the design and implementation of adaptive fuzzy controllers for the control of a coupled level and pressure process. Expert knowledge is applied to form an adaptation mechanism which tunes the fuzzy controller based on process data. The results of the experiments on the physical plant prove the practical relevance of the design strategy of an adaptive fuzzy controller.

Journal ArticleDOI
TL;DR: The response surface methodology based process optimization procedure, including design of factorial experiments, development of a statistical model and estimation of the optimum response point, is simulated in order to investigate the influence of experimental errors and of the experimental design area on the optimization accuracy.
Abstract: The response surface methodology based process optimization procedure, including design of factorial experiments, development of a statistical model and estimation of the optimum response point, is simulated in order to investigate the influence of experimental errors and of the experimental design area on the optimization accuracy. The investigated practical problem is optimization of nutrient media composition for the cultivation of a microorganism culture. The optimization results under a priori estimated experimental errors and various factor variation ranges in the factorial experiments are investigated, and the relationships between the factor variation ranges and the confidence intervals of the optimum point estimates are determined.
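
The core of such a procedure, fitting a second-order response surface to factorial-design observations and locating its stationary point, can be sketched as below; the design, response and noise level are invented for two factors and are not the paper's experimental setup.

```python
import numpy as np

# Hypothetical 3x3 factorial design in coded units for two medium components.
x1, x2 = np.meshgrid([-1.0, 0.0, 1.0], [-1.0, 0.0, 1.0])
x1, x2 = x1.ravel(), x2.ravel()
rng = np.random.default_rng(3)
# "Measured" response with an optimum near (0.3, -0.2) plus experimental error.
y = 10 - (x1 - 0.3) ** 2 - (x2 + 0.2) ** 2 + rng.normal(0, 0.05, x1.size)

# Second-order model: y = b0 + b1*x1 + b2*x2 + b11*x1^2 + b22*x2^2 + b12*x1*x2.
X = np.column_stack([np.ones_like(x1), x1, x2, x1**2, x2**2, x1 * x2])
b, *_ = np.linalg.lstsq(X, y, rcond=None)

# Stationary point: solve the gradient system of the fitted quadratic.
A = np.array([[2 * b[3], b[5]], [b[5], 2 * b[4]]])
opt = np.linalg.solve(A, -b[1:3])
print("estimated optimum (coded units):", opt)
```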

Journal ArticleDOI
TL;DR: To make an already existing English Talking Head speech animation engine talk Lithuanian, some modifications to the animation script were made.
Abstract: Visual speech animation plays an important role in human-computer interaction. To make an already existing English Talking Head speech animation engine talk Lithuanian, some modifications to the animation script were made. For this adaptation, the relation between the English and Lithuanian languages was explored. To determine it, 30 three-dimensional Lithuanian visemes were modeled using the calibration of two orthogonal pictures of a phoneme. Using the visual similarity of different English and Lithuanian phonemes, a "Lithuanian phoneme to English viseme mapping table" was defined and used for Lithuanian speech animation.

Journal ArticleDOI
TL;DR: In this paper, the moments of the maximum of a finite number of random values are analyzed and the largest part of the analysis is focused on extremes of dependent normal values, where the moments are expressed through moments of independent values.
Abstract: In this paper the moments of the maximum of a finite number of random values are analyzed. The largest part of the analysis is focused on extremes of dependent normal values. For the case of the normal distribution, the moments of the maximum of dependent values are expressed through the moments of independent values.
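
For the bivariate case, the classical result (often attributed to Clark) gives the first moment of the maximum in closed form; with Phi and phi the standard normal distribution function and density, it reads:

```latex
\mathbb{E}\!\left[\max(X_1, X_2)\right]
  = \mu_1 \, \Phi\!\left(\frac{\mu_1 - \mu_2}{\theta}\right)
  + \mu_2 \, \Phi\!\left(\frac{\mu_2 - \mu_1}{\theta}\right)
  + \theta \, \varphi\!\left(\frac{\mu_1 - \mu_2}{\theta}\right),
\qquad
\theta = \sqrt{\sigma_1^2 + \sigma_2^2 - 2\rho\sigma_1\sigma_2}.
```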

Journal ArticleDOI
TL;DR: The paper presents the results of model-based optimization of a fed-batch culture of Enterobacter aerogenes 17 E13 for maximization of biomass productivity through stochastic-search-based parametric optimization of a flexible time function approximating the feed-rate time profile.
Abstract: The paper presents the results of model-based optimization of a fed-batch culture of Enterobacter aerogenes 17 E13 for maximization of biomass productivity. A mathematical model for solving the problem is developed using mass-balance conditions for cell biomass and the limiting substrate, relevant kinetic relationships for modeling the specific rates of biochemical transformations, and experimental data from fed-batch cultivation processes realized under various cultivation conditions. The optimization procedure relies on stochastic-search-based parametric optimization of a flexible time function approximating the feed-rate time profile.

Journal ArticleDOI
TL;DR: In this article, a minimum variance control (MVC) approach is presented for a closed-loop discrete-time linear time-invariant (LTI) system when the parameters of the dynamic system, as well as those of the controller, are not known and ought to be estimated.
Abstract: The aim of the given paper is the development of a minimum variance control (MVC) approach for a closed-loop discrete-time linear time-invariant (LTI) system when the parameters of the dynamic system, as well as those of the controller, are not known and ought to be estimated. The parametric identification of the open-loop LTI system and the determination of the coefficients of the MV controller are performed in each current operation by processing observations in the case of additive output noise with contaminating outliers uniformly spread in it. A robust recursive technique, based on the S-algorithm with a version of Schweppe's GM-estimator, is applied here to calculate estimates of the parameters of an LTI system with one time-varying coefficient in the numerator of the system transfer function. The recursive parameter estimates are then used in each current iteration to determine the unknown parameters of the MV controller. Afterwards, the current value of the control signal is found in each operation and is used to generate the output of the system as well. The results of numerical simulation by computer are presented and discussed.
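
The indirect (self-tuning) structure described above can be illustrated on a first-order system: the parameters of y(t+1) = a*y(t) + b*u(t) + e(t+1) are estimated recursively and the minimum variance law u(t) = -(a_hat/b_hat)*y(t) is applied at every step. Ordinary recursive least squares is used here in place of the paper's robust S-algorithm/GM-estimator, and the plant is invented.

```python
import numpy as np

rng = np.random.default_rng(4)
a_true, b_true = 0.8, 0.5            # invented plant: y(t+1) = a*y(t) + b*u(t) + e
theta = np.array([0.0, 0.1])         # parameter estimates [a_hat, b_hat]
P = np.eye(2) * 100.0                # RLS covariance matrix
y, u = 0.0, 0.0
outputs = []

for t in range(300):
    y_next = a_true * y + b_true * u + rng.normal(0, 0.1)
    phi = np.array([y, u])           # regressor for the prediction of y_next
    # Recursive least squares update of [a_hat, b_hat].
    k = P @ phi / (1.0 + phi @ P @ phi)
    theta = theta + k * (y_next - phi @ theta)
    P = P - np.outer(k, phi @ P)
    y = y_next
    # Minimum variance law: cancel the predictable part of the next output.
    a_hat, b_hat = theta
    u = -(a_hat / b_hat) * y if abs(b_hat) > 1e-3 else 0.0
    outputs.append(y)

print("estimates:", theta, " closed-loop output variance:", np.var(outputs[100:]))
```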

Journal ArticleDOI
TL;DR: There is no universal algorithm that could help solve any problem, as each data set has its own optimal feature subset suitable for the classification algorithm, and methodological recommendations for reaching a possibly optimal solution are given to support clinical decision making.
Abstract: This paper analyzes various problems that appear while performing data mining. The issues of data quality are discussed. The main focus is set on feature selection and its influence on classification results. Feature selection, or discovery of an optimal feature set, is the process of removing features from the data set that are not useful in decision making and leaving the most useful ones. The influence of feature selection is analyzed for different classification algorithms. They are applied to two data sets of different constitution to solve three problems of the medical domain. The presented results show that there is no universal algorithm which could help solve any problem, and that each data set has its own optimal feature (sub)set suitable for the classification algorithm. Methodological recommendations for reaching a possibly optimal solution are given to support clinical decision making.
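
As a concrete illustration of the feature-selection effect, the sketch below compares cross-validated accuracy of one classifier with and without a univariate filter; the data set and classifier are stand-ins, not those used in the paper.

```python
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline

X, y = load_breast_cancer(return_X_y=True)     # stand-in medical data set

full_model = GaussianNB()                       # all 30 features
selected_model = make_pipeline(SelectKBest(f_classif, k=10), GaussianNB())

print("all features    :", cross_val_score(full_model, X, y, cv=5).mean())
print("10 best features:", cross_val_score(selected_model, X, y, cv=5).mean())
```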

Journal ArticleDOI
TL;DR: A new wavelet-based approach to implementing a locally progressive digital signal coding idea is proposed; the approach exploits both newly developed fast procedures for evaluating the discrete wavelet transform for a particular selected ROI in the digital signal and zero-tree-based encoders with an improved zero-tree analysis scheme.
Abstract: Progressive digital signal encoding and subsequent transmission refer to signal compression techniques that allow both the reconstruction of the original signal without loss of any detail and the construction of signal approximations (estimates) with an accuracy level depending on the amount of data available. Locally progressive encoding and transmission can be achieved by first transmitting a "rough" estimate of the original signal and then sending further details related to one or another selected block (region of interest, ROI) of the signal. In this paper, we propose a new wavelet-based approach to implementing a locally progressive digital signal coding idea. The proposed approach exploits both the newly developed fast procedures for evaluating the discrete wavelet (Haar, LeGall) transform for a particular selected ROI in the digital signal and the zero-tree-based encoders with an improved zero-tree analysis scheme.
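
One level of the Haar transform mentioned above is simple enough to sketch directly; applying it only to a selected block of samples is the essence of ROI-oriented processing. The signal and ROI below are invented, and this is not the paper's fast ROI procedure.

```python
import numpy as np

def haar_level(signal):
    """One level of the orthonormal Haar transform: averages and details."""
    s = np.asarray(signal, dtype=float)
    approx = (s[0::2] + s[1::2]) / np.sqrt(2.0)
    detail = (s[0::2] - s[1::2]) / np.sqrt(2.0)
    return approx, detail

def haar_level_inverse(approx, detail):
    """Exact reconstruction of the even/odd samples from one Haar level."""
    s = np.empty(2 * approx.size)
    s[0::2] = (approx + detail) / np.sqrt(2.0)
    s[1::2] = (approx - detail) / np.sqrt(2.0)
    return s

signal = np.arange(16, dtype=float)        # invented signal
roi = slice(4, 12)                         # region of interest (even-aligned)
approx, detail = haar_level(signal[roi])   # refine only the selected block
print(np.allclose(haar_level_inverse(approx, detail), signal[roi]))  # True
```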

Journal ArticleDOI
TL;DR: The introduced ADT permits describing structural changes in the hierarchical dynamic PLA (dynPLA) model, and the Z language is used to formalize the specification of the abstract data type.
Abstract: This paper presents the definition of an abstract data type (ADT) in the dynamic Piece-Linear Aggregate (PLA) model. The introduced ADT permits describing structural changes in the hierarchical dynamic PLA (dynPLA). In order to formalize the specification of the abstract data type, the Z language is used. The application of the ADT in the specification of dynPLA is demonstrated by an example: a transaction processing system.

Journal ArticleDOI
TL;DR: The paper presents the structure of the executive and control program with respect to the synchronization of the two most important events, measuring and communication, in a Microprocessor Measuring Station.
Abstract: The Microprocessor Measuring Station (MMS) is a type of programmable logic controller fully developed and designed at the Mining and Metallurgy Institute Bor, Serbia, for monitoring and process control, mainly in copper production plants. It is based on the MC68HC11 microcontroller. The paper presents the structure of the executive and control program with respect to the synchronization of the two most important events: measuring and communication. The results of practical implementation are also included.

Journal ArticleDOI
TL;DR: A method is presented that evaluates the quality of the delay test according to the covered paths of the circuit and constructs the paths, which could be used as the input to the path delay test generator.
Abstract: The quality of delay testing focused on small delay defects is not known when the transition fault model is used. The paper presents a method that evaluates the quality of a delay test according to the covered paths of the circuit and constructs the paths, which can be used as input to a path delay test generator. All the constructed paths are testable. The complexity of the circuit has no direct impact on the path construction. The path construction is based on information provided by the TetraMAX transition fault simulator. The transition fault simulator produces a text file that contains complete information on the propagation of the transitions along the lines of the circuit. The experimental results demonstrate the ability to assess the quality of a delay test according to the covered paths.

Journal ArticleDOI
TL;DR: The proposed index, which uses a Generalized Hash Tree (GHT), expedites projection and selection operations on encrypted medical XML records stored in WORM storage.
Abstract: Write-Once-Read-Many (WORM) storage devices may be used to store critical medical data to prevent them from easy modification. In this paper, we propose a novel indexing structure for encrypted medical XML documents stored in WORM storage. The proposed index, which uses a Generalized Hash Tree (GHT), expedites projection and selection operations on encrypted medical XML records stored in WORM storage. We implemented and tested the proposed system.

Journal ArticleDOI
TL;DR: A new system is introduced in which association rule mining over data streams and association rule hiding for traditional databases are merged; it can be applied to both raw data and template-guided XML data.
Abstract: Association rule mining is used in various applications. Information systems may also need to take privacy issues into account when releasing data to outside parties. Due to recent advances, releasing data to other parties may be done in a streaming fashion. In this paper, we introduce a new system in which association rule mining over data streams and association rule hiding for traditional databases are merged. The stream association rule hiding algorithm presented can be applied to both raw data and template-guided XML data. The algorithms presented are implemented and tested.

Journal ArticleDOI
TL;DR: In this article, a method based on the Region Connection Calculus theory is proposed for the recognition of semiconductor elements, and the experimental results of the system are presented, as well as its performance.
Abstract: Self-formation processes of semiconductor elements are discussed. A method for the recognition of self-formed semiconductor elements is described. The method, based on Region Connection Calculus theory, is proposed. A system for the recognition of semiconductor elements is developed. The experimental results of the recognition of semiconductor elements are presented.