
Showing papers in "Artificial Intelligence Review in 2013"


Journal ArticleDOI
TL;DR: Basic decision tree issues and current research points are described, guiding the researcher in interesting research directions and suggesting possible bias combinations that have yet to be explored.
Abstract: Decision tree techniques have been widely used to build classification models as such models closely resemble human reasoning and are easy to understand. This paper describes basic decision tree issues and current research points. Of course, a single article cannot be a complete review of all algorithms (also known as induction of classification trees), yet we hope that the references cited will cover the major theoretical issues, guiding the researcher in interesting research directions and suggesting possible bias combinations that have yet to be explored.

694 citations


Journal ArticleDOI
TL;DR: Empirical results reveal that the problem solving success of the CK algorithm is very close to that of the DE algorithm, and that the run-time complexity and the number of function evaluations the DE algorithm requires to reach the global minimizer are generally smaller than those of the comparison algorithms.

Abstract: In this paper, the algorithmic concepts of the Cuckoo-search (CK), Particle swarm optimization (PSO), Differential evolution (DE) and Artificial bee colony (ABC) algorithms have been analyzed. The numerical optimization problem solving successes of the mentioned algorithms have also been compared statistically by testing over 50 different benchmark functions. Empirical results reveal that the problem solving success of the CK algorithm is very close to that of the DE algorithm. The run-time complexity and the number of function evaluations required by the DE algorithm to reach the global minimizer are generally smaller than those of the comparison algorithms. The performances of the CK and PSO algorithms are statistically closer to the performance of the DE algorithm than the ABC algorithm. The CK and DE algorithms supply more robust and precise results than the PSO and ABC algorithms.

656 citations
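The comparison above evaluates population-based optimizers on benchmark functions. As a concrete illustration of one of the compared methods, the sketch below shows a minimal DE/rand/1/bin loop on the sphere benchmark in Python; the population size, scale factor F and crossover rate CR are illustrative defaults, not the settings used in the paper.

```python
# Minimal sketch of differential evolution (DE/rand/1/bin) on the sphere
# benchmark: illustrative of the kind of comparison in the paper, not the
# authors' exact experimental setup.
import numpy as np

def sphere(x):
    """Sphere benchmark: global minimum 0 at the origin."""
    return float(np.sum(x ** 2))

def differential_evolution(f, dim=10, pop_size=40, F=0.5, CR=0.9,
                           max_gens=200, bounds=(-5.0, 5.0), seed=0):
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fitness = np.array([f(ind) for ind in pop])
    for _ in range(max_gens):
        for i in range(pop_size):
            # DE/rand/1 mutation: three distinct individuals other than i
            a, b, c = rng.choice([j for j in range(pop_size) if j != i],
                                 size=3, replace=False)
            mutant = pop[a] + F * (pop[b] - pop[c])
            # Binomial crossover
            cross = rng.random(dim) < CR
            cross[rng.integers(dim)] = True   # ensure at least one gene copied
            trial = np.clip(np.where(cross, mutant, pop[i]), lo, hi)
            f_trial = f(trial)
            if f_trial <= fitness[i]:         # greedy selection
                pop[i], fitness[i] = trial, f_trial
    best = int(np.argmin(fitness))
    return pop[best], fitness[best]

best_x, best_f = differential_evolution(sphere)
print(f"best objective after 200 generations: {best_f:.3e}")
```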


Journal ArticleDOI
TL;DR: An objective definition of trust is provided, based on Castelfranchi's idea that trust implies a decision to rely on someone, and the survey illustrates the proliferation in the past few years of models that follow a more cognitive approach.
Abstract: In open environments, agents depend on reputation and trust mechanisms to evaluate the behavior of potential partners. The scientific research in this field has increased considerably, and reputation and trust mechanisms have already come to be considered key elements in the design of multi-agent systems. In this paper we provide a survey that, far from being exhaustive, intends to show the most representative models that currently exist in the literature. For this enterprise we consider several dimensions of analysis that appeared in three existing surveys, and provide new dimensions that are complementary to the existing ones and that have not been treated directly. Moreover, besides showing the original classification that each of the surveys provides, we also classify models that were not taken into account by the original surveys. The paper illustrates the proliferation in the past few years of models that follow a more cognitive approach, in which the representation of trust and reputation as mental attitudes is as important as the final values of trust and reputation. Furthermore, we provide an objective definition of trust, based on Castelfranchi's idea that trust implies a decision to rely on someone.

314 citations


Journal ArticleDOI
TL;DR: The advantages and disadvantages of using EAs to optimize ANNs are explained and the basic theories and algorithms for optimizing the weights, optimizing the network architecture and optimizing the learning rules are provided.
Abstract: This paper reviews the use of evolutionary algorithms (EAs) to optimize artificial neural networks (ANNs). First, we briefly introduce the basic principles of artificial neural networks and evolutionary algorithms and, by analyzing the advantages and disadvantages of EAs and ANNs, explain the advantages of using EAs to optimize ANNs. We then provide a brief survey on the basic theories and algorithms for optimizing the weights, optimizing the network architecture and optimizing the learning rules, and discuss recent research from these three aspects. Finally, we speculate on new trends in the development of this area.

281 citations
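To make the weight-optimization case concrete, the following sketch evolves the weights of a tiny 2-2-1 feed-forward network on the XOR task with a simple (mu + lambda) strategy. The network size, mutation scale and generation count are assumptions for illustration; the survey covers weight, architecture and learning-rule evolution far more generally.

```python
# Minimal sketch of using an evolutionary algorithm to optimise the weights of
# a tiny feed-forward network (XOR task). A simple (mu + lambda) strategy is
# used; it illustrates weight optimisation only, not architecture or
# learning-rule evolution.
import numpy as np

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 0], dtype=float)

def forward(weights, x):
    """2-2-1 network; `weights` is a flat vector of 9 parameters."""
    W1 = weights[:4].reshape(2, 2)
    b1 = weights[4:6]
    W2 = weights[6:8]
    b2 = weights[8]
    h = np.tanh(x @ W1 + b1)
    return 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

def mse(weights):
    preds = np.array([forward(weights, x) for x in X])
    return float(np.mean((preds - y) ** 2))

rng = np.random.default_rng(1)
mu, lam, sigma = 10, 40, 0.3
parents = rng.normal(0.0, 1.0, size=(mu, 9))
for gen in range(300):
    # Each offspring mutates a randomly chosen parent with Gaussian noise.
    offspring = parents[rng.integers(mu, size=lam)] + \
                rng.normal(0.0, sigma, size=(lam, 9))
    pool = np.vstack([parents, offspring])
    fitness = np.array([mse(w) for w in pool])
    parents = pool[np.argsort(fitness)[:mu]]      # keep the mu best

best = parents[0]
print("predictions:", np.round([forward(best, x) for x in X], 2))
```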


Journal ArticleDOI
TL;DR: This work focuses on university timetabling and introduces hard and soft constraints as well as most currently used objective functions in university course timetabling.
Abstract: University course timetabling is one of the most important administrative activities that take place in all academic institutions. In this work, we go over the main points of recent papers on the timetabling problem. We concentrate on university timetabling and introduce hard and soft constraints as well as the most currently used objective functions. We also discuss some solution methods that have been applied by researchers. Finally, we raise more questions to be explored in future studies. We hope these directions lead to new research that covers all aspects of the problem and results in high-quality timetables.

56 citations
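A common way to encode hard and soft constraints, as introduced above, is a weighted penalty objective in which any hard violation dominates all soft ones. The sketch below is a toy example of such an objective; the constraint set, weights and data structures are hypothetical rather than taken from any surveyed paper.

```python
# Sketch of a weighted penalty objective of the kind used in university course
# timetabling: hard-constraint violations dominate, soft-constraint violations
# add smaller penalties. The constraint set and weights are illustrative.
from collections import defaultdict

HARD_WEIGHT = 1000     # any hard violation should outweigh all soft ones
SOFT_WEIGHT = 1

def timetable_cost(assignments, room_capacity, class_size, unavailable):
    """assignments: {course: (room, timeslot, teacher)}."""
    hard, soft = 0, 0
    used = defaultdict(list)                    # (room, slot) -> courses
    teacher_slots = defaultdict(list)           # teacher -> slots
    for course, (room, slot, teacher) in assignments.items():
        used[(room, slot)].append(course)
        teacher_slots[teacher].append(slot)
        if class_size[course] > room_capacity[room]:
            hard += 1                           # room too small (hard)
        if slot in unavailable.get(teacher, ()):
            soft += 1                           # undesired slot (soft)
    # Hard: no two courses share a room in the same timeslot.
    hard += sum(len(c) - 1 for c in used.values() if len(c) > 1)
    # Hard: a teacher cannot teach two courses in the same timeslot.
    for slots in teacher_slots.values():
        hard += len(slots) - len(set(slots))
    return HARD_WEIGHT * hard + SOFT_WEIGHT * soft

cost = timetable_cost(
    assignments={"AI": ("R1", 9, "Ada"), "DB": ("R1", 9, "Bob")},
    room_capacity={"R1": 60},
    class_size={"AI": 40, "DB": 80},
    unavailable={"Ada": {9}},
)
print(cost)   # one room clash + one oversized class (hard) + one soft violation
```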


Journal ArticleDOI
TL;DR: A comparative view of systems performing service composition in Ambient Intelligence Environments revealing similarities and differences, while providing additional information is presented.
Abstract: This article presents a comparative review of systems performing service composition in Ambient Intelligence Environments. Such environments should comply with ubiquitous or pervasive computing guidelines by sensing the user's needs or wishes and offering intuitive human-computer interaction and a comfortable non-intrusive experience. To achieve this goal, service orientation is widely used and tightly linked with AmI systems. Some of these employ Web Service technology, which involves well-defined web technologies and standards that facilitate interoperable machine-to-machine interaction. Other systems consider services of different technologies (e.g. UPnP, OSGi) or, more generally, treat them as abstractions of various actions. Service operations are sometimes implemented as software-based functions or as actions over hardware equipment (e.g. UPnP players). However, a single service satisfies only an atomic user need, so services need to be composed (i.e. combined) in order to provide the complex tasks that are usually requested. Since manual service composition is obviously a hassle for the user, ambient systems strive to automate this process by applying various methods. The approaches adopted in recent years vary widely in many aspects, such as domain of application, modeling of services, composition method, knowledge representation and interfaces. This work presents a comparative view of these approaches, revealing similarities and differences while providing additional information.

53 citations


Journal ArticleDOI
TL;DR: Comparing different approaches based on neural networks and fuzzy systems that have been implemented in different CAD designs, the authors found that the greatest improvement in CAD systems was achieved with a combination of fuzzy logic and artificial neural networks in the form of the FALCON-AART complementary learning fuzzy neural network (CLFNN).
Abstract: Breast cancer is the leading type of cancer diagnosed in women. For years, human limitations in interpreting thermograms posed a considerable challenge, but with the introduction of computer-assisted detection/diagnosis (CAD), this problem has been addressed. This review paper compares different approaches based on neural networks and fuzzy systems which have been implemented in different CAD designs. The greatest improvement in CAD systems was achieved with a combination of fuzzy logic and artificial neural networks in the form of the FALCON-AART complementary learning fuzzy neural network (CLFNN). With a CAD design based on FALCON-AART, it was possible to achieve an overall accuracy of nearly 90%. This confirms that CAD systems are indeed a valuable addition to the efforts for the diagnosis of breast cancer. The lower cost and high performance of new infrared systems combined with accurate CAD designs can promote the use of thermography in many breast cancer centres worldwide.

53 citations


Journal ArticleDOI
TL;DR: In this paper, the authors present a method for supporting the design of intelligent socio-technical systems, placing emphasis on different levels of formal characterization, with equal attention to both the analysis of concepts in a formal calculus independent of computational concerns, and the representation of concept in a machine-processable form, fully cognizant of implementation issues.
Abstract: The design of intelligent socio-technical systems calls for careful examination of relevant social and organizational concepts. We present a method for supporting this design process, placing emphasis on different levels of formal characterization, with equal attention to both the analysis of concepts in a formal calculus independent of computational concerns, and the representation of concepts in a machine-processable form, fully cognizant of implementation issues--a step in the method we refer to as principled operationalization. There are many tools (i.e. formal languages) that can be used to support the design method; we define and discuss criteria for evaluating such tools. We believe that, were the method proposed to be adopted, it would enhance the state-of-the-art in the systematic design and engineering of socio-technical systems, respecting the fundamentally interdisciplinary nature of those tasks, in both their theoretical and practical dimensions.

50 citations


Journal ArticleDOI
TL;DR: Each image spamming trick is described separately, and by examining the methods used by researchers to combat them, a classification is drawn into three groups: header-based, content-based, and text-based.
Abstract: Many techniques have been proposed to combat the upsurge in image-based spam. All the proposed techniques have the same target: trying to prevent image spam from entering our inboxes. Image spammers evade filters with different tricks, and each of them needs to be analyzed to determine what capability the filters need in order to overcome the tricks and keep spammers from filling our inboxes. Different tricks give rise to different techniques. This work surveys the image spam phenomenon from all sides, covering definitions, image spam tricks, anti-image-spam techniques, data sets, etc. We describe each image spamming trick separately, and by examining the methods used by researchers to combat them, we draw a classification into three groups: header-based, content-based, and text-based. Finally, we discuss the data sets that researchers use in the experimental evaluation of their articles to show the accuracy of their ideas.

43 citations


Journal ArticleDOI
TL;DR: A new model of Smart Consumer Load Balancing is put forward, where consumers actively participate in the balancing of demand with supply by forming groups that agree on a joint demand profile to be contracted in the market with the mediation of an aggregator.
Abstract: The basis of an efficient functioning of a power grid is an accurate balancing of the electricity demand of all the consumers at any instant with supply. Nowadays, this task involves only the grid operator and retail electricity providers. One of the facets of the Smart Grid vision is that consumers may have a more active role in the problem of balancing demand with supply. With the deployment of intelligent information and communication technologies in domestic environments, homes are becoming smarter and able to play a more active role in the management of energy. We use the term Smart Consumer Load Balancing to refer to algorithms that are run by energy management systems of homes in order to optimise the electricity consumption, to minimise costs and/or meet supply constraints. In this work, we analyse different approaches to Smart Consumer Load Balancing based on (distributed) artificial intelligence. We also put forward a new model of Smart Consumer Load Balancing, where consumers actively participate in the balancing of demand with supply by forming groups that agree on a joint demand profile to be contracted in the market with the mediation of an aggregator. We specify the business model as well as the optimisation model for load balancing, showing the economic benefits for the consumers in a realistic scenario based on the Spanish electricity market.

39 citations
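As a rough illustration of the joint-demand-profile idea, the toy sketch below greedily schedules flexible appliance loads from several consumers into the cheapest hours while keeping the group's total demand under an agreed per-hour cap. Prices, the cap and the loads are invented, and the paper's actual business and optimisation models are considerably richer.

```python
# Toy sketch of group load balancing: flexible appliance loads from several
# consumers are greedily scheduled into the cheapest hours while respecting a
# jointly agreed per-hour cap (the contracted demand profile). Prices, caps
# and loads are made up for illustration.
prices = [0.30, 0.28, 0.25, 0.10, 0.08, 0.09, 0.12, 0.20]   # EUR/kWh per hour
cap_kw = 6.0                         # agreed joint demand cap in every hour
flexible_loads = [                   # (consumer, appliance kW, hours needed)
    ("home-A", 2.0, 2),
    ("home-B", 3.0, 1),
    ("home-C", 1.5, 3),
]

schedule = {h: [] for h in range(len(prices))}
headroom = {h: cap_kw for h in range(len(prices))}
# Cheapest hours first.
cheap_hours = [h for _, h in sorted((p, h) for h, p in enumerate(prices))]

for consumer, kw, hours_needed in flexible_loads:
    placed = 0
    for h in cheap_hours:
        if placed == hours_needed:
            break
        if headroom[h] >= kw:            # respect the contracted profile
            schedule[h].append((consumer, kw))
            headroom[h] -= kw
            placed += 1

cost = sum(prices[h] * kw for h, items in schedule.items()
           for _, kw in items)
print({h: items for h, items in schedule.items() if items})
print(f"total cost: {cost:.2f} EUR")
```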


Journal ArticleDOI
TL;DR: A survey on approaches which are based on Hidden Markov Models (HMM) for hand posture and gesture recognition for HCI applications is provided.
Abstract: Human hand recognition plays an important role in a wide range of applications ranging from sign language translators, gesture recognition, augmented reality, surveillance and medical image processing to various Human Computer Interaction (HCI) domains. Human hand is a complex articulated object consisting of many connected parts and joints. Therefore, for applications that involve HCI one can find many challenges to establish a system with high detection and recognition accuracy for hand posture and/or gesture. Hand posture is defined as a static hand configuration without any movement involved. Meanwhile, hand gesture is a sequence of hand postures connected by continuous motions. During the past decades, many approaches have been presented for hand posture and/or gesture recognition. In this paper, we provide a survey on approaches which are based on Hidden Markov Models (HMM) for hand posture and gesture recognition for HCI applications.
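A minimal version of HMM-based recognition works as follows: each gesture class gets its own HMM over quantised posture symbols, an observed posture sequence is scored with the forward algorithm under every model, and the class with the highest likelihood wins. The sketch below implements this with hand-written toy parameters; the transition and emission matrices are assumptions, not trained models from the surveyed literature.

```python
# Minimal sketch of HMM-based gesture classification: each gesture class has a
# discrete HMM, an observed sequence of quantised hand postures is scored with
# the forward algorithm, and the highest-likelihood class wins. All model
# parameters here are made up for illustration.
import numpy as np

def forward_loglik(obs, start, trans, emit):
    """Log-likelihood of a discrete observation sequence under an HMM."""
    alpha = start * emit[:, obs[0]]
    loglik = np.log(alpha.sum())
    alpha = alpha / alpha.sum()                 # rescale to avoid underflow
    for o in obs[1:]:
        alpha = (alpha @ trans) * emit[:, o]
        loglik += np.log(alpha.sum())
        alpha = alpha / alpha.sum()
    return loglik

# Two toy gesture models over 2 hidden states and 3 posture symbols
# (0 = open hand, 1 = fist, 2 = pointing).
wave = dict(start=np.array([0.9, 0.1]),
            trans=np.array([[0.7, 0.3], [0.3, 0.7]]),
            emit=np.array([[0.8, 0.1, 0.1], [0.1, 0.8, 0.1]]))
point = dict(start=np.array([0.5, 0.5]),
             trans=np.array([[0.6, 0.4], [0.2, 0.8]]),
             emit=np.array([[0.2, 0.1, 0.7], [0.1, 0.2, 0.7]]))

models = {"wave": wave, "point": point}
observed = [0, 1, 0, 1, 0]            # quantised posture sequence

scores = {name: forward_loglik(observed, **m) for name, m in models.items()}
print(max(scores, key=scores.get), scores)
```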

Journal ArticleDOI
TL;DR: A novel incremental decision tree algorithm based on rough set theory that can avoid the high time complexity of the traditional incremental methods for rebuilding decision trees too many times and can provide competitive solutions to incremental learning.
Abstract: As we know, learning in the real world is interactive, incremental and dynamic in multiple dimensions, where new data can appear at any time, from anywhere and of any type. Therefore, incremental learning is of growing importance in real-world data mining scenarios. Decision trees, due to their characteristics, have been widely used for incremental learning. In this paper, we propose a novel incremental decision tree algorithm based on rough set theory. To improve the computational efficiency of our algorithm, when a new instance arrives, according to the given decision tree adaptation strategies, the algorithm will only modify an existing leaf node in the currently active decision tree or add a new leaf node to the tree, which avoids the high time complexity of traditional incremental methods that rebuild decision trees too many times. Moreover, a rough set based attribute reduction method is used to filter out the redundant attributes from the original set of attributes, and we adopt two basic notions of rough sets, significance of attributes and dependency of attributes, as the heuristic information for the selection of splitting attributes. Finally, we apply the proposed algorithm to intrusion detection. The experimental results demonstrate that our algorithm can provide competitive solutions to incremental learning.
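For readers unfamiliar with the rough-set heuristics mentioned above, the sketch below computes the dependency degree gamma(C, D), i.e. the fraction of objects whose condition-attribute equivalence class maps to a single decision value, on a toy intrusion-detection-style table. The table and attribute names are invented; the paper embeds this kind of measure in its splitting-attribute selection.

```python
# Sketch of the rough-set dependency degree gamma(C, D): the fraction of
# objects whose condition-attribute equivalence class is consistent with a
# single decision value (the positive region). Demonstrated on a toy table;
# the paper uses this kind of measure as a heuristic for split attributes.
from collections import defaultdict

def dependency(rows, condition_attrs, decision_attr):
    """gamma(C, D) = |POS_C(D)| / |U| for a list of dict-shaped rows."""
    blocks = defaultdict(list)                 # equivalence classes under C
    for row in rows:
        key = tuple(row[a] for a in condition_attrs)
        blocks[key].append(row)
    positive = 0
    for block in blocks.values():
        decisions = {row[decision_attr] for row in block}
        if len(decisions) == 1:                # block is consistent
            positive += len(block)
    return positive / len(rows)

toy = [
    {"protocol": "tcp", "flag": "SF",  "attack": "no"},
    {"protocol": "tcp", "flag": "SF",  "attack": "no"},
    {"protocol": "tcp", "flag": "REJ", "attack": "yes"},
    {"protocol": "udp", "flag": "SF",  "attack": "yes"},
    {"protocol": "udp", "flag": "SF",  "attack": "no"},
]

for attrs in (["protocol"], ["flag"], ["protocol", "flag"]):
    print(attrs, dependency(toy, attrs, "attack"))
```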

Journal ArticleDOI
TL;DR: A novel measure based on the Lesk algorithm and Vector Space Model to calculate semantic relatedness is proposed and it is shown that the combination of knowledge-based methods is superior to the most frequent sense heuristic and significantly reduces the difference between knowledge- based and supervised methods.
Abstract: Word sense disambiguation (WSD) is a difficult problem in Computational Linguistics, mostly because of the use of a fixed sense inventory and the deep level of granularity. This paper formulates WSD as a variant of the traveling salesman problem (TSP) to maximize the overall semantic relatedness of the context to be disambiguated. Ant colony optimization, a robust nature-inspired algorithm, was used in a reinforcement learning manner to solve the formulated TSP. We propose a novel measure based on the Lesk algorithm and Vector Space Model to calculate semantic relatedness. Our approach to WSD is comparable to state-of-the-art knowledge-based and unsupervised methods for benchmark datasets. In addition, we show that the combination of knowledge-based methods is superior to the most frequent sense heuristic and significantly reduces the difference between knowledge-based and supervised methods. The proposed approach could be customized for other lexical disambiguation tasks, such as Lexical Substitution or Word Domain Disambiguation.
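As a simplified illustration of gloss-based relatedness, the sketch below represents each sense by the bag of words of its gloss and scores it against the context by cosine similarity in a vector space model. The glosses and the context are made up, and the paper's proposed measure combines Lesk overlap with the VSM in a more elaborate way than this.

```python
# Sketch of a Lesk-style relatedness measure in a vector space model: each
# sense is represented by the bag of words of its gloss and relatedness is the
# cosine of the two gloss vectors. Glosses here are made up.
import math
import re
from collections import Counter

def gloss_vector(gloss):
    return Counter(re.findall(r"[a-z]+", gloss.lower()))

def cosine(u, v):
    common = set(u) & set(v)
    dot = sum(u[w] * v[w] for w in common)
    norm = math.sqrt(sum(c * c for c in u.values())) * \
           math.sqrt(sum(c * c for c in v.values()))
    return dot / norm if norm else 0.0

senses_bank = {
    "bank#1": "sloping land beside a body of water such as a river",
    "bank#2": "a financial institution that accepts deposits and lends money",
}
context_gloss = "the fisherman sat on the river side watching the water"

ctx = gloss_vector(context_gloss)
scores = {s: cosine(gloss_vector(g), ctx) for s, g in senses_bank.items()}
print(max(scores, key=scores.get), scores)
```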

Journal ArticleDOI
TL;DR: Numerical results show that, when associated with the exponential crossover, the employment of multiple scale factors is not systematically beneficial and in some cases even detrimental to the performance of the algorithm.
Abstract: This paper studies the use of multiple scale factor values within distributed Differential Evolution structures employing the so-called exponential crossover. Four different scale factor schemes are proposed, tested, compared and analyzed. Two schemes simply employ multiple scale factor values and two also include an update logic during the evolution. The four schemes have been integrated for comparison within three recently proposed distributed Differential Evolution structures and tested on a variety of test problems. The results are then compared to those of a previous study where the so-called binomial crossover was employed. Numerical results show that, when associated with the exponential crossover, the employment of multiple scale factors is not systematically beneficial and in some cases even detrimental to the performance of the algorithm. The exponential crossover accentuates the exploitative character of Differential Evolution, which cannot always be counterbalanced by the increase in the explorative aspect of the algorithm introduced by the employment of multiple scale factor values.
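The exponential crossover at the centre of this study copies a contiguous run of genes from the mutant into the target, starting at a random index and continuing while successive random draws stay below CR, so the copied block length is geometrically distributed. A minimal sketch of just that operator is given below; the surrounding distributed DE structures and scale-factor schemes are omitted.

```python
# Sketch of the exponential crossover operator: starting from a random index,
# genes are copied from the mutant into the target for as long as successive
# draws stay below CR, so the copied block is contiguous (modulo wrap-around)
# and its length is geometrically distributed. The DE loop itself is omitted.
import numpy as np

def exponential_crossover(target, mutant, CR, rng):
    dim = len(target)
    trial = target.copy()
    j = rng.integers(dim)          # random starting gene
    copied = 0
    while True:
        trial[j] = mutant[j]
        copied += 1
        j = (j + 1) % dim
        if copied == dim or rng.random() >= CR:
            break
    return trial

rng = np.random.default_rng(0)
target = np.zeros(8)
mutant = np.ones(8)
print(exponential_crossover(target, mutant, CR=0.9, rng=rng))
```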

Journal ArticleDOI
TL;DR: It is found that the neural classifier outperforms the maximum likelihood classifier in this context, and is applied for pixel-level classification, followed by modal-filter based post-processing for robustness.
Abstract: Maximum likelihood and neural classifiers are two typical techniques in image classification. This paper investigates how to adapt these approaches to hyperspectral imaging for the classification of five kinds of Chinese tea samples, using visible light hyperspectral spectroscopy rather than near-infrared. After removal of unnecessary parts from each imaged tea sample using a morphological cropper, principal component analysis is employed for feature extraction. The two classifiers are then respectively applied for pixel-level classification, followed by modal-filter based post-processing for robustness. Although the samples look similar to the naked eye, promising results are reported and analysed in these comprehensive experiments. In addition, it is found that the neural classifier outperforms the maximum likelihood classifier in this context.

Journal ArticleDOI
TL;DR: This work critically reviews studies of selective attention from a multidisciplinary perspective to take lessons from psychological and biological studies of attention and considers how constraints from those studies can be imposed on computational models of selective attention.
Abstract: During the last half century, significant efforts have been made to explore the underlying mechanisms of visual selective attention using a variety of approaches--psychology, neuroscience, and computational models. Among them, the computational approach emerged with the development of computer science and computer vision, focusing researchers' interest in this area. However, computer scientists often face the difficulty of how to construct a computational model of selective attention suited to their own purpose. Here, we critically review studies of selective attention from a multidisciplinary perspective to take lessons from psychological and biological studies of attention. We consider how constraints from those studies can be imposed on computational models of selective attention.

Journal ArticleDOI
TL;DR: This article reviews the production scheduling problems focusing on those related to flexible job-shop scheduling and suggests ways in which artificial immune systems (AIS) can be used in solving these problems.
Abstract: This article reviews production scheduling problems, focusing on those related to flexible job-shop scheduling. Job-shop and flexible job-shop scheduling problems are among the most frequently encountered and hardest to optimize. This article begins with a review of the job-shop and flexible job-shop scheduling problem, follows with a review of the literature on artificial immune systems (AIS), and suggests ways of applying them to job-shop and flexible job-shop scheduling problems. For the purposes of this study, AIS is defined as a computational system based on metaphors borrowed from the biological immune system. This article also summarizes the direction of current research and suggests areas that might most profitably be given further scholarly attention.

Proceedings ArticleDOI
TL;DR: A scheme of soot and PM formation is discussed, and the influence of exhaust gas aftertreatment systems on their formation is described.
Abstract: The paper presents results of tests on particulate matter (PM) in PC and HDV category vehicles fitted with compression ignition engines and various systems of exhaust gas aftertreatment. The measurements were made under actual conditions of use in a city and used portable exhaust gas analyzers belonging to the group of PEMS (Portable Emissions Measurement Systems) – an AVL MSS concentration analyzer and a TSI EEPS mass spectrometer. The obtained results were processed and coefficients of correlation were determined for PC-PC and PC-HDV. The considerations were made according to the Euro 3-5 emission standards. Moreover, the paper discusses a scheme of soot and PM formation and describes the influence of exhaust gas aftertreatment systems on their formation.

Journal ArticleDOI
TL;DR: Using five medical datasets, it is discovered that for a two-class dataset, despite as high as 20–30% missing values, almost as good results as with no missing value could still be produced.
Abstract: Using five medical datasets we investigated the influence of missing values on true positive rates and classification accuracy. We randomly marked more and more values as missing and tested their effects on classification accuracy. The classifications were performed with nearest neighbour searching when none, 10, 20, 30% or more of the values were missing. We also used discriminant analysis and the naive Bayesian method for the classification. We discovered that for a two-class dataset, despite as much as 20-30% of values being missing, results almost as good as with no missing values could still be produced. If there are more than two classes, more than 10-20% missing values are probably too many, at least for small classes with relatively few cases. The more classes there are, and the more those classes differ in size, the more sensitive a classification task is to missing values. On the other hand, when values are missing on the basis of actual distributions affected by some selection or non-random cause and not fully at random, classification can tolerate even high numbers of missing values for some datasets.
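The sketch below reproduces the shape of this experiment on synthetic data: an increasing fraction of feature values is masked at random and 1-nearest-neighbour accuracy is measured, with distances computed only over features present in both vectors. The data generator and masking levels are assumptions; the paper uses five real medical datasets and additional classifiers.

```python
# Sketch of the kind of experiment described above: mask an increasing share
# of feature values at random and measure 1-nearest-neighbour accuracy, with
# distances computed only over features present in both vectors.
import numpy as np

rng = np.random.default_rng(0)

def make_data(n=200, d=8):
    y = rng.integers(0, 2, size=n)
    X = rng.normal(0.0, 1.0, size=(n, d)) + y[:, None] * 1.5
    return X, y

def nn_accuracy(X_train, y_train, X_test, y_test):
    correct = 0
    for x, label in zip(X_test, y_test):
        best, best_dist = None, np.inf
        for xt, yt in zip(X_train, y_train):
            mask = ~np.isnan(x) & ~np.isnan(xt)   # usable features only
            if not mask.any():
                continue
            dist = np.mean((x[mask] - xt[mask]) ** 2)
            if dist < best_dist:
                best, best_dist = yt, dist
        correct += int(best == label)
    return correct / len(y_test)

X, y = make_data()
X_train, y_train, X_test, y_test = X[:150], y[:150], X[150:], y[150:]
for missing in (0.0, 0.1, 0.2, 0.3):
    Xm_train, Xm_test = X_train.copy(), X_test.copy()
    for M in (Xm_train, Xm_test):
        M[rng.random(M.shape) < missing] = np.nan
    print(f"{missing:.0%} missing -> accuracy "
          f"{nn_accuracy(Xm_train, y_train, Xm_test, y_test):.2f}")
```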

Journal ArticleDOI
TL;DR: This paper presents state-of-the-art segmentation algorithms from the "MICCAI Grand Challenge and Conference 2007, 2008 and 2009" and exhibits the divergence in performance when the input data is varied.
Abstract: With the rapid increase in disease variety, the role of image segmentation has become crucial in image-guided surgery. Despite the existence of many methods, the robustness of an algorithm remains a concern with respect to the variety of input images. This paper presents state-of-the-art segmentation algorithms from the "MICCAI Grand Challenge and Conference 2007, 2008 and 2009". These algorithms are reported to have been tested on the real datasets used in the "MICCAI Grand Challenge 2007, 2008 and 2009". Due to page constraints, only papers selected according to certain criteria are included in this review. In this work, we have implemented and evaluated all these methods on a particular dataset. The objective of this paper is to exhibit the divergence in performance when the input data is varied.

Proceedings ArticleDOI
TL;DR: An attempt to evaluate the exhaust emissions generated by transport in the Poznan agglomeration by modeling the on-going characteristics of the individual groups of vehicles and a forecasted structure change of these groups for the years 2012–2030.
Abstract: The paper presents an attempt to evaluate the exhaust emissions generated by transport in the Poznan agglomeration. The basis for the modeling of the exhaust emissions was the on-going characteristics of the individual groups of vehicles and a forecasted structure change of these groups for the years 2012–2030. The determination of average on-road emissions as a function of daily distance covered by individual vehicle groups was the basis for the determination of the change in the annual pollution. The assumed values of the exhaust emission concentrations from passenger vehicles were based on the performed tests under actual traffic conditions while for the other categories the authors adopted the test results of earlier works of the research team with the Chair of Combustion Engines at Poznan University of Technology. In the paper the authors assumed an increase in the share of vehicles meeting the latest applicable emission standards as well as changes in the distance covered by the vehicles. The result of the analysis will be accumulated annual exhaust emissions for a given vehicle

Proceedings ArticleDOI
TL;DR: In this article, the authors use singular value decomposition (SVD) to determine the Denavit-Hartenberg (DH) parameters of a serial robot; these parameters, which are typically used to represent its architecture, are usually provided by its manufacturer.
Abstract: Kinematic identification of a serial robot has been an active field of research as the need for improving the accuracy of a robot is increasing with time. Denavit-Hartenberg (DH) parameters of a serial robot, which are typically used to represent its architecture, are usually provided by its manufacturer. At times these parameters do not match the actual ones and hence they need to be identified. An analytical method proposed elsewhere was used here for the identification of an industrial robot: the positions of a point on the end-effector due to the rotation of each joint, with all other joints locked, were recorded, and the parameters were found using singular value decomposition. The DH parameters of the robot determined using the proposed methodology matched the robot specifications satisfactorily. Also, the bounding volume of the joint ranges indicates that a measurement volume smaller than the robot workspace is sufficient, thus facilitating the use of measurement devices which have a smaller range of measurement.
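One building block of this kind of identification can be sketched as follows: while a single joint rotates and all others are locked, the traced end-effector points lie on a circle, and the SVD of the centred point cloud yields the plane of rotation, whose normal estimates that joint's axis. The code below simulates such a measurement; the geometry and noise level are invented and the full DH identification in the paper goes well beyond this step.

```python
# Sketch of one building block of SVD-based kinematic identification: points
# traced by the end-effector while a single joint rotates (all others locked)
# lie on a circle, and the SVD of the centred point cloud yields the plane of
# rotation; its normal is an estimate of that joint's axis. Measurement data
# here is simulated.
import numpy as np

rng = np.random.default_rng(0)

# Simulate noisy measurements of a point rotating about a known axis.
true_axis = np.array([0.0, 0.0, 1.0])
angles = np.linspace(0.0, 2 * np.pi, 30, endpoint=False)
radius, centre = 0.4, np.array([0.2, -0.1, 0.5])
points = np.stack([centre + radius * np.array([np.cos(a), np.sin(a), 0.0])
                   for a in angles])
points += rng.normal(scale=1e-3, size=points.shape)      # measurement noise

# Fit the plane of rotation: the right singular vector with the smallest
# singular value of the centred cloud is the plane normal, i.e. the joint axis.
centred = points - points.mean(axis=0)
_, _, vt = np.linalg.svd(centred, full_matrices=False)
axis_estimate = vt[-1]
if np.dot(axis_estimate, true_axis) < 0:
    axis_estimate = -axis_estimate

print("estimated joint axis:", np.round(axis_estimate, 4))
```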

Journal ArticleDOI
TL;DR: This survey paper starts with a basic explanation of robot soccer and its systems, then focuses on the strategies that have been used by previous researchers, and gives a time-line of the described robot soccer strategies showing the trend of strategies and technologies.
Abstract: This survey paper starts with a basic explanation of robot soccer and its systems, then focuses on the strategies that have been used by previous researchers. A time-line of the described robot soccer strategies shows the trend of strategies and technologies. The basic algorithm for each robot described here morphs from simple mechanical maneuvering strategies to biologically inspired strategies. These strategies are adapted from many realms: educational psychology produced reinforcement learning and Q-learning, commerce contributed concepts of the market-driven economy, engineering its potential fields, and AI its Petri nets, neural networks and fuzzy logic. Even insects and fish were simulated in particle swarm optimization (PSO) and adapted into robot soccer. All these strategies are surveyed in this paper. Another aspect surveyed here is the vision system trend, shifting from global vision to local omni-directional vision to front-facing local vision, which shows an evolution towards the biologically inspired robot soccer agent: the human soccer player.

Journal ArticleDOI
TL;DR: This paper investigates how changing the kernel function in an SVM classifier affects classification results and proposes that SVM is well suited to the automated taxa identification of benthic macroinvertebrates.
Abstract: Support vector machines are a relatively new classification method which has nowadays established a firm foothold in the area of machine learning. It has been applied to numerous application targets. Automated taxa identification of benthic macroinvertebrates has generally received very little attention, especially using a support vector machine. In this paper we investigate how changing the kernel function in an SVM classifier affects classification results. A novel question is how the choice of kernel function affects the number of ties in a majority voting method when we are dealing with a multi-class case. We repeated the classification tests with two different feature sets. Using SVM, we present accurate classification results, proposing that SVM is well suited to the automated taxa identification of benthic macroinvertebrates. We also show that the selection of a kernel has a great effect on the number of ties.
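To make the tie-counting question concrete, the sketch below trains one-vs-one SVMs per class pair for several kernels with scikit-learn, combines them by majority voting, and counts how often the vote is tied. The iris dataset stands in for the macroinvertebrate data, and the kernels and split are illustrative choices only.

```python
# Sketch of the kernel-comparison idea: train one-vs-one SVMs per class pair
# for different kernels, combine them by majority voting, and count how often
# the vote ends in a tie. Uses scikit-learn's SVC and the iris data as a
# stand-in for the paper's macroinvertebrate features.
from itertools import combinations
from collections import Counter
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.5, random_state=0)
classes = np.unique(y_tr)

for kernel in ("linear", "rbf", "poly", "sigmoid"):
    pair_models = {}
    for a, b in combinations(classes, 2):
        idx = np.isin(y_tr, [a, b])
        pair_models[(a, b)] = SVC(kernel=kernel).fit(X_tr[idx], y_tr[idx])
    ties, correct = 0, 0
    for x, label in zip(X_te, y_te):
        votes = Counter()
        for (a, b), model in pair_models.items():
            votes[int(model.predict(x.reshape(1, -1))[0])] += 1
        top = votes.most_common()
        if len(top) > 1 and top[0][1] == top[1][1]:
            ties += 1                          # majority vote is tied
        correct += int(top[0][0] == label)
    print(f"{kernel:8s} accuracy={correct / len(y_te):.2f} ties={ties}")
```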

Journal ArticleDOI
TL;DR: Experiments show that when compared with a leading LOD approach, Agreementmaker achieves considerably higher precision and F-measure, at the cost of a slight decrease in recall.
Abstract: The creation of links between schemas of published datasets is a key part of the Linked Open Data (LOD) paradigm. The ability to discover these links "on the go" requires that ontology matching techniques achieve good precision and recall within acceptable execution times. In this paper, we add similarity-based and mediator-based ontology matching methods to the Agreementmaker ontology matching system, which aim to efficiently discover high precision subclass mappings between LOD ontologies. Similarity-based matching methods discover subclass mappings by extrapolating them from a set of high quality equivalence mappings and from the interpretation of compound concept names. Mediator-based matching methods discover subclass mappings by comparing polysemic lexical annotations of ontology concepts and by considering external web ontologies. Experiments show that when compared with a leading LOD approach, Agreementmaker achieves considerably higher precision and F-measure, at the cost of a slight decrease in recall.

Journal ArticleDOI
TL;DR: This work presents a machine learning method inspired by the human immune system called Artificial Immune System (AIS) which is a new emerging method that still needs further exploration.
Abstract: Spam is a serious universal problem which causes difficulties for almost all computer users. This issue affects not only normal users of the internet, but also causes a big problem for companies and organizations, since it costs a huge amount of money in lost productivity, wasted user time and network bandwidth. Many studies on spam indicate that spam costs organizations billions of dollars yearly. This work presents a machine learning method inspired by the human immune system called the Artificial Immune System (AIS), which is a newly emerging method that still needs further exploration. Core modifications were applied to the standard AIS with the aid of a Genetic Algorithm. An Artificial Neural Network for spam detection is also applied in a new manner. The SpamAssassin corpus is used in all our simulations.

Journal ArticleDOI
TL;DR: In this paper, the authors present an algorithm portfolio for SAT that is extremely simple, but at the same time so efficient that it outperforms SATzilla. The main distinguishing feature of their algorithm portfolio is the locality of the selection procedure, which selects a SAT solver based only on a few instances similar to the input one.
Abstract: The importance of algorithm portfolio techniques for SAT has long been noted, and a number of very successful systems have been devised, including the most successful one--SATzilla. However, all these systems are quite complex (to understand, reimplement, or modify). In this paper we present an algorithm portfolio for SAT that is extremely simple, but at the same time so efficient that it outperforms SATzilla. For a new SAT instance to be solved, our portfolio finds its k-nearest neighbors from the training set and invokes the solver that performs best on those instances. The main distinguishing feature of our algorithm portfolio is the locality of the selection procedure--the selection of a SAT solver is based only on a few instances similar to the input one. An open source tool that implements our approach is publicly available.
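The selection rule can be sketched in a few lines: compute the distance from the new instance's feature vector to all training instances, take the k nearest, and pick the solver with the lowest total runtime on them. The feature vectors, solver names and runtimes below are invented for illustration; real portfolios use SAT-specific features and measured runtimes.

```python
# Sketch of the k-nearest-neighbour solver selection described above: for a
# new SAT instance, find the k training instances with the most similar
# feature vectors and pick the solver with the lowest total runtime on them.
# Feature vectors and runtimes below are invented for illustration.
import numpy as np

# Training set: per-instance feature vectors and per-solver runtimes (seconds).
train_features = np.array([
    [0.2, 1.5, 3.0],
    [0.3, 1.4, 2.8],
    [5.0, 0.1, 9.0],
    [4.8, 0.2, 8.5],
    [2.5, 0.8, 5.0],
])
runtimes = {
    "minisat": np.array([ 1.0,  1.2, 90.0, 85.0, 10.0]),
    "glucose": np.array([ 4.0,  3.5,  2.0,  2.5,  9.0]),
    "march":   np.array([30.0, 28.0, 50.0, 55.0,  3.0]),
}

def select_solver(features, k=3):
    dists = np.linalg.norm(train_features - features, axis=1)
    neighbours = np.argsort(dists)[:k]
    totals = {name: rt[neighbours].sum() for name, rt in runtimes.items()}
    return min(totals, key=totals.get)

print(select_solver(np.array([0.25, 1.45, 2.9])))   # close to the first group
print(select_solver(np.array([4.9, 0.15, 8.7])))    # close to the second group
```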

Proceedings ArticleDOI
TL;DR: It is demonstrated that the depth sensor can provide reliable estimates of topology coordinates and can replace the complex and expensive setup of a motion capture system.
Abstract: In this study we consider the problem of estimating the Human-Cloth topological relationship using a depth sensor and its application to robotic clothing assistance. In the past, reinforcement learning with low-dimensional topological representations has been used to learn the motor skills necessary to perform clothing assistance. In this framework, a motion capture system was used to observe the Human-Cloth relationship. Several problems were faced with the use of the motion capture system: 1) elaborate and expensive setup of the system, 2) occlusion of optical markers by other objects in the environment, and 3) observation of non-existent markers due to unwanted reflections. To overcome these difficulties, we propose a framework to observe the Human-Cloth topological relationship using a depth sensor. We demonstrate that the depth sensor can provide reliable estimates of topology coordinates and can replace the complex and expensive setup of a motion capture system.

Journal ArticleDOI
TL;DR: This work shows how argumentation schemes theory can provide valuable help to formalize and structure on-line discussions and user opinions in decision support and business-oriented websites that host social networks among their users.
Abstract: Nowadays, many websites allow social networking between their users in an explicit or implicit way. In this work, we show how argumentation schemes theory can provide valuable help to formalize and structure on-line discussions and user opinions in decision support and business-oriented websites that host social networks among their users. Two real case studies are analysed. Then, guidelines to enhance social decision support and recommendations with argumentation are provided.

Proceedings ArticleDOI
TL;DR: This work presents a cognitive medical robot system using lightweight robots with redundant kinematics, with the goal of creating a system that acts as a human assistant who perceives the situation, understands the context based on their knowledge and acts appropriately.
Abstract: To date, medical robots for minimally invasive surgery do not provide assistance appropriate to the workflow of the intervention. A simple concept of a cognitive system is presented, which is derived from a classic closed-loop control. As an implementation, we present a cognitive medical robot system using lightweight robots with redundant kinematics. The robot system includes several control modes and human-machine interfaces. We focus on describing knowledge acquisition about the workflow of an intervention and present two example applications utilizing the acquired knowledge: autonomous camera guidance and planning of minimally invasive port (trocar) positions in combination with an initial robot setup. Port planning is described as an optimization problem. The autonomous camera system includes a mid-term movement prediction of the ongoing intervention. The cognitive approach to a medical robot system includes taking the environment into account. The goal is to create a system that acts as a human assistant, who perceives the situation, understands the context based on their knowledge and acts appropriately.