
Showing papers by "Enrique Alba published in 2009"


Proceedings ArticleDOI
15 May 2009
TL;DR: A new multi-objective particle swarm optimization algorithm characterized by the use of a strategy to limit the velocity of the particles, called Speed-constrained Multi-Objective PSO (SMPSO), which makes it possible to produce new effective particle positions in those cases in which the velocity becomes too high.
Abstract: In this work, we present a new multi-objective particle swarm optimization (PSO) algorithm characterized by the use of a strategy to limit the velocity of the particles. The proposed approach, called Speed-constrained Multi-objective PSO (SMPSO), makes it possible to produce new effective particle positions in those cases in which the velocity becomes too high. Other features of SMPSO include the use of polynomial mutation as a turbulence factor and an external archive to store the non-dominated solutions found during the search. Our proposed approach is compared against five multi-objective metaheuristics representative of the state of the art in the area. For the comparison, two different criteria are adopted: the quality of the resulting approximation sets and the convergence speed to the Pareto front. The experiments carried out indicate that SMPSO obtains remarkable results in terms of both accuracy and speed.
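The speed-constraint idea is easy to picture in code. Below is a minimal Python sketch of a velocity-limiting mechanism of the kind the abstract describes; the Clerc-style constriction factor and the half-range clamp follow common PSO practice and are illustrative, not the paper's exact definitions.

```python
import math

def constriction_coefficient(c1, c2):
    # Clerc-style constriction factor; SMPSO applies a coefficient of this
    # kind to damp the velocity update (sketch, not the paper's exact code).
    phi = c1 + c2
    if phi <= 4.0:
        return 1.0
    return 2.0 / abs(2.0 - phi - math.sqrt(phi * phi - 4.0 * phi))

def constrain_speed(velocity, lower, upper):
    # Clamp a velocity component to +/- half the variable's range, so a
    # particle cannot overshoot the feasible interval in a single move.
    delta = (upper - lower) / 2.0
    return max(-delta, min(delta, velocity))

# Example: a runaway velocity of 40 on a variable bounded in [0, 10]
# is cut back to delta = 5 before the position update.
print(constrain_speed(40.0, 0.0, 10.0))  # -> 5.0
```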

563 citations


Journal IssueDOI
01 Jul 2009
TL;DR: This paper introduces a new cellular genetic algorithm called MOCell, characterized by using an external archive to store nondominated solutions and a feedback mechanism in which solutions from this archive randomly replace existing individuals in the population after each iteration.
Abstract: This paper introduces a new cellular genetic algorithm for solving multiobjective continuous optimization problems. Our approach is characterized by using an external archive to store nondominated solutions and a feedback mechanism in which solutions from this archive randomly replace existing individuals in the population after each iteration. The result is a simple and elitist algorithm called MOCell. Our proposal has been evaluated with both constrained and unconstrained problems and compared against NSGA-II and SPEA2, two state-of-the-art evolutionary multiobjective optimizers. For the studied benchmark, our experiments indicate that MOCell obtains competitive results in terms of convergence and hypervolume, and it clearly outperforms the other two compared algorithms concerning the diversity of the solutions along the Pareto front. © 2009 Wiley Periodicals, Inc.
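The feedback mechanism can be sketched in a few lines. The following Python fragment is illustrative only: the number of fed-back solutions and the replacement policy (uniformly random overwrite) are assumptions, not the paper's exact settings.

```python
import random

def archive_feedback(population, archive, n_feedback, rng=random):
    # After each iteration, solutions drawn from the external archive
    # overwrite randomly chosen individuals in the population.
    for _ in range(min(n_feedback, len(archive))):
        population[rng.randrange(len(population))] = rng.choice(archive)
    return population
```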

215 citations


Book ChapterDOI
21 Apr 2009
TL;DR: A new MOPSO algorithm is proposed, called SMPSO, characterized by including a velocity constraint mechanism, obtaining promising results where the rest perform inadequately.
Abstract: Particle Swarm Optimization (PSO) has received increasing attention in the optimization research community since its first appearance in the mid-1990s. Regarding multi-objective optimization, a considerable number of algorithms based on Multi-Objective Particle Swarm Optimizers (MOPSOs) can be found in the specialized literature. Unfortunately, no experimental comparisons have been made in order to clarify which MOPSO version shows the best performance. In this paper, we use a benchmark composed of three well-known problem families (ZDT, DTLZ, and WFG) with the aim of analyzing the search capabilities of six representative state-of-the-art MOPSOs, namely, NSPSO, SigmaMOPSO, OMOPSO, AMOPSO, MOPSOpd, and CLMOPSO. We additionally propose a new MOPSO algorithm, called SMPSO, characterized by including a velocity constraint mechanism, obtaining promising results where the rest perform inadequately.

163 citations


Journal ArticleDOI
TL;DR: This work proposes a multiobjective genetic algorithm for gene selection in Microarray datasets, guided by sensitivity and specificity, both used as quality indicators of the classification test applied to the previously selected genes.

87 citations


Proceedings ArticleDOI
25 Sep 2009
TL;DR: A simulated annealing algorithm is designed and applied to compute the optimal placement of wind turbines in a wind farm; several case studies under different wind conditions are analyzed, and the obtained solutions are shown to improve on the best results previously reported in the literature.
Abstract: In this paper, a simulated annealing algorithm is designed and applied to compute the optimal placement of wind turbines in a wind farm. The objective of this process is to find a combination of aero-generators that maximizes the annual profit obtained from a wind farm, measured as the power generated in one year. Simulated annealing (SA) is a metaheuristic for global optimization that we use here to search the large landscape of possible solutions. Maximizing the generated power depends on the distribution of the aero-generators and the geometry of the wind farm. In this article we analyze several case studies according to different wind conditions. We show optimal solutions and demonstrate that they improve on the best results previously reported in the literature.
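At the core of any such SA is the acceptance rule. Here is a minimal Python sketch, assuming the annual profit described above is the fitness to maximize; the Metropolis form is standard, while the cooling schedule and neighborhood moves (not shown) are where the paper's design effort lies.

```python
import math
import random

def accept(profit_current, profit_candidate, temperature, rng=random):
    # Metropolis rule for a maximization problem: always accept better
    # layouts, accept worse ones with probability exp(delta / T).
    delta = profit_candidate - profit_current
    if delta >= 0:
        return True
    return rng.random() < math.exp(delta / temperature)
```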

85 citations


Proceedings ArticleDOI
13 May 2009
TL;DR: The obtained results show that MOCell outperforms NSGA-II in terms of the range of solutions covered, while the latter is able to obtain better solutions than MOCell in large instances.
Abstract: One of the first issues that software companies have to take into account is determining what should be included in the next release of their products, so that as many customers as possible are satisfied while the cost to the company is kept to a minimum. This problem is known as the Next Release Problem (NRP). Since minimizing the total cost of including new features into a software package and maximizing the total satisfaction of customers are contradictory objectives, the problem has a multi-objective nature. In this work we study the NRP from the multi-objective point of view, paying attention to the quality of the obtained solutions, the number of solutions, the range of solutions covered by these fronts, and the number of optimal solutions obtained. Also, we evaluate the performance of two state-of-the-art multi-objective metaheuristics for solving the NRP: NSGA-II and MOCell. The obtained results show that MOCell outperforms NSGA-II in terms of the range of solutions covered, while the latter is able to obtain better solutions than MOCell in large instances. Furthermore, we have observed that the optimal solutions found are composed of a high percentage of low-cost requirements, as well as the requirements that produce the most satisfaction among the customers.
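The two conflicting objectives can be made concrete with a small evaluation sketch. The formulation below (a customer counts as satisfied only when all of their requested requirements are included) is one common NRP variant, assumed here for illustration; papers differ in the details.

```python
def nrp_objectives(selected, req_cost, customers):
    # selected  : set of chosen requirement ids
    # req_cost  : dict mapping requirement id -> implementation cost
    # customers : list of (weight, frozenset of required ids); a customer is
    #             satisfied only if all requested requirements are included.
    cost = sum(req_cost[r] for r in selected)
    satisfaction = sum(w for w, needed in customers if needed <= selected)
    return cost, satisfaction  # minimize the first, maximize the second

# Illustrative data: three requirements, two customers.
print(nrp_objectives({1, 2}, {1: 10, 2: 5, 3: 20},
                     [(3.0, frozenset({1})), (7.0, frozenset({2, 3}))]))
# -> (15, 3.0)
```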

68 citations


Journal Article
TL;DR: This paper presents a new tabu search (TS) algorithm for the problem of batch job scheduling on computational grids, defined as a bi-objective optimization problem consisting of the minimization of makespan and flowtime.
Abstract: The efficient allocation of jobs to grid resources is indispensable for high performance grid-based applications, and it is a computationally hard problem even when there are no dependencies among jobs. We present in this paper a new tabu search (TS) algorithm for the problem of batch job scheduling on computational grids. We define it as a bi-objective optimization problem, consisting of the minimization of the makespan and flowtime. Our TS is validated versus three other algorithms in the literature for a classical benchmark. We additionally consider some more realistic benchmarks with larger size instances in static and dynamic environments. We show that our TS clearly outperforms the compared algorithms.
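Both objectives are cheap to evaluate once a schedule is fixed. The sketch below uses the standard ETC (expected time to compute) model common in grid scheduling; processing jobs on each machine in the order they appear is an assumption for illustration.

```python
def evaluate_schedule(assignment, etc):
    # assignment[j] : machine chosen for job j
    # etc[j][m]     : estimated time to compute job j on machine m
    load, completion = {}, []
    for job, machine in enumerate(assignment):
        load[machine] = load.get(machine, 0.0) + etc[job][machine]
        completion.append(load[machine])
    makespan = max(load.values())   # latest machine to finish
    flowtime = sum(completion)      # sum of all job completion times
    return makespan, flowtime

print(evaluate_schedule([0, 1, 0], [[3, 9], [8, 2], [4, 7]]))  # -> (7.0, 12.0)
```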

60 citations


Book ChapterDOI
21 Apr 2009
TL;DR: This paper has implemented steady-state versions of the NSGA-II and SPEA2 algorithms, and compared them to the generational ones according to three criteria: the quality of the resulting approximation sets to the Pareto front, the convergence speed of the algorithm, and the computing time.
Abstract: Genetic Algorithms (GAs) have been widely used in single-objective as well as in multi-objective optimization for solving complex optimization problems. Two different models of GAs can be considered according to their selection scheme: generational and steady-state. Although most of the state-of-the-art multi-objective GAs (MOGAs) use a generational scheme, in the last few years many proposals using a steady-state scheme have been developed. However, the influence of using those selection strategies in MOGAs has not been studied in detail. In this paper we deal with this gap. We have implemented steady-state versions of the NSGA-II and SPEA2 algorithms, and we have compared them to the generational ones according to three criteria: the quality of the resulting approximation sets to the Pareto front, the convergence speed of the algorithm, and the computing time. The results show that multi-objective GAs can profit from the steady-state model in many scenarios.
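The difference between the two schemes boils down to the main loop. A minimal sketch of one steady-state iteration follows; select, crossover, mutate, and replace are placeholders for the operators of the algorithm at hand (e.g., a ranking/crowding-based insertion for a steady-state NSGA-II).

```python
def steady_state_step(population, select, crossover, mutate, replace):
    # Breed a single offspring and insert it back into the population
    # immediately, instead of first building a whole new generation.
    parent1, parent2 = select(population), select(population)
    offspring = mutate(crossover(parent1, parent2))
    replace(population, offspring)
    return population
```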

60 citations


BookDOI
23 Mar 2009
TL;DR: Solving Complex Problems addresses real problems and the modern optimization techniques used to solve them, making it especially useful to practitioners in those areas of computer science, engineering, transportation, telecommunications, and bioinformatics.
Abstract: Solving Complex Problems addresses real problems and the modern optimization techniques used to solve them. Thorough examples illustrate the applications themselves, as well as the actual performance of the algorithms. Application areas include computer science, engineering, transportation, telecommunications, and bioinformatics, making the book especially useful to practitioners in those areas.

59 citations


Journal ArticleDOI
TL;DR: A canonical RND problem formulation driven by two main directives: technology independence and a normalized comparison criterion is proposed and an exhaustive behavior comparison between 14 different techniques is included.
Abstract: Radio network design (RND) is an NP-hard optimization problem which consists of maximizing the coverage of a given area while minimizing the base station deployment. Solving RND problems efficiently is relevant to many fields of application and has a direct impact in the engineering, telecommunication, scientific, and industrial areas. Numerous works dealing with the RND problem can be found in the literature, although they all suffer from the same shortcoming: their efficiency results are not comparable across studies. Therefore, the aim of this paper is twofold: first, to offer a reliable RND comparison base reference covering a wide algorithmic spectrum, and, second, to offer a comprehensible insight into accurate comparisons of the efficiency, reliability, and swiftness of the different techniques applied to solve the RND problem. In order to achieve the first aim we propose a canonical RND problem formulation driven by two main directives: technology independence and a normalized comparison criterion. Following this, we include an exhaustive behavior comparison between 14 different techniques. Finally, this paper indicates algorithmic trends and patterns that can be observed through this analysis.
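For concreteness, a normalized fitness of the kind widely used in this literature can be sketched as follows; the exact canonical formulation proposed in the paper may differ, so treat the quadratic coverage term as an assumption.

```python
def rnd_fitness(cover_rate, n_stations):
    # Reward coverage (a ratio in [0, 1]) quadratically, and let every
    # deployed base station dilute the score.
    return 100.0 * cover_rate ** 2 / n_stations

print(rnd_fitness(0.90, 49))  # 90% coverage with 49 stations -> ~1.65
```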

39 citations


Book ChapterDOI
22 Jun 2009
TL;DR: From a large analysis of RND using different metaheuristics, the most interesting results are presented, indicating the pros and cons of the studied solvers.
Abstract: RND (Radio Network Design) is a telecommunication problem that consists of covering a certain geographical area with the smallest number of radio antennas while achieving the highest coverage rate. This is an important problem, for example, in mobile/cellular technology. RND can be solved by bio-inspired algorithms. In this work we use different metaheuristics to tackle this problem. PBIL (Population-Based Incremental Learning), based on genetic algorithms and competitive learning (typical in neural networks), is a population evolution model based on probabilistic models. DE (Differential Evolution) is a very simple population-based stochastic function minimizer used in a wide range of optimization problems, including multi-objective optimization. SA (Simulated Annealing) is a classic trajectory-descent optimization technique. CHC is a particular class of evolutionary algorithm which does not use mutation and relies instead on incest prevention and disruptive crossover. Due to the complexity of such a large analysis involving so many techniques, we have used not only sequential algorithms, but also grid computing with BOINC, executing thousands of experiments in just a few days on around 100 computers. In this paper we present the most interesting results from our work, indicating the pros and cons of the studied solvers.

Journal ArticleDOI
TL;DR: A new graph structure for abstracting multimodal networks, called the transfer graph, is described, which adapts to the distributed nature of real information sources of transportation networks; a decomposition of the Shortest Path Problem in the transfer graph is proposed to optimize the computation time.
Abstract: This paper presents an alternative approach to the time-dependent multimodal transport problem. We describe a new graph structure for abstracting multimodal networks, called the transfer graph, which adapts to the distributed nature of the real information sources of transportation networks. A decomposition of the shortest path problem in the transfer graph is proposed to optimize the computation time. This approach was computationally tested on several experimental multimodal networks of different sizes and complexities. The approach was integrated in the multimodal transport service of the European Carlink platform, where it has been validated in real scenarios. A comparison with other related works is provided.
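One way to picture the decomposition: within each single-network component a plain shortest-path search runs independently, and the partial results are combined at the transfer points. The sketch below shows the per-component search; the data layout and the stitching step are illustrative assumptions.

```python
import heapq

def dijkstra(adj, source):
    # Plain Dijkstra inside one transport network; adj maps a node to a
    # list of (neighbour, travel_time) pairs.
    dist, heap = {source: 0.0}, [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue
        for v, w in adj.get(u, ()):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w
                heapq.heappush(heap, (d + w, v))
    return dist
```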

Proceedings ArticleDOI
25 Sep 2009
TL;DR: This work presents a Differential Evolution (DE) approach for efficient automated gene subset selection; in this model, the selected subsets are evaluated by means of their classification rate using a Support Vector Machines (SVM) classifier.
Abstract: The efficient selection of predictive and accurate gene subsets for cell-type classification is nowadays a crucial problem in Microarray data analysis. The application and combination of dedicated computational intelligence methods holds great promise for tackling feature selection and classification. In this work we present a Differential Evolution (DE) approach for efficient automated gene subset selection. In this model, the selected subsets are evaluated by means of their classification rate using a Support Vector Machines (SVM) classifier. The proposed approach is tested on the DLBCL Lymphoma and Colon Tumor gene expression datasets. Experiments covering the effectiveness and biological analysis of the results, in addition to comparisons with related methods in the literature, indicate that our DE-SVM model is highly reliable and competitive. I. INTRODUCTION. DNA Microarrays (MA) (13) allow scientists to simultaneously analyze thousands of genes, thus giving important insights about cell function, since changes in the physiology of an organism are generally associated with changes in gene ensembles of expression patterns. The vast amount of data involved in a typical Microarray experiment usually requires a complex statistical analysis, with the important goal of classifying the dataset into the correct classes. The key issue in this classification is to identify significant and representative gene subsets that may be used to predict class membership for new external samples. Furthermore, these subsets should be as small as possible in order to develop fast and resource-efficient processes for future class prediction. The main difficulty in Microarray classification versus other domains is the availability of a relatively small number of samples in comparison with the number of genes in each sample (between 2,000 and more than 10,000 in MA). In addition, expression data are highly redundant and noisy, and most genes are believed to be uninformative with respect to the studied classes, as only a fraction of genes may present distinct profiles for different classes of samples. In this context, machine learning techniques have been applied to handle large and heterogeneous datasets, since they are capable of isolating the useful information by rejecting redundancies. Concretely, feature selection is often considered a necessary preprocessing step for analyzing large datasets, as it can reduce the dimensionality of the datasets and often leads to better analyses (9). Feature selection (gene selection in Biology) for gene expression analysis in cancer prediction often uses wrapper classification methods to discriminate a type of tumor (9), (11), to reduce the number of genes to investigate in the case of a new patient, and also to assist in drug discovery and early diagnosis.
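The wrapper loop is straightforward to sketch. Below, a continuous DE vector is thresholded into a gene mask and scored by cross-validated SVM accuracy; the kernel, cv folds, and the F/CR settings are typical textbook values assumed for illustration, not necessarily the paper's.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

def subset_fitness(vector, X, y):
    # Threshold the DE vector into a gene mask and score the subset by
    # cross-validated SVM accuracy (the wrapper evaluation).
    mask = vector > 0.5
    if not mask.any():
        return 0.0
    return cross_val_score(SVC(kernel="linear"), X[:, mask], y, cv=5).mean()

def de_generation(pop, scores, X, y, F=0.5, CR=0.9, rng=None):
    # One DE/rand/1/bin generation with greedy replacement.
    rng = rng or np.random.default_rng()
    n, d = pop.shape
    for i in range(n):
        a, b, c = pop[rng.choice(n, size=3, replace=False)]
        trial = np.clip(np.where(rng.random(d) < CR, a + F * (b - c), pop[i]),
                        0.0, 1.0)
        s = subset_fitness(trial, X, y)
        if s >= scores[i]:
            pop[i], scores[i] = trial, s
    return pop, scores
```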

Book ChapterDOI
21 Apr 2009
TL;DR: This work presents the application of a parallel cooperative optimization approach to the broadcast operation in mobile ad-hoc networks, and the results obtained for a MANET scenario representing a mall demonstrate the validity of the new proposed approach.
Abstract: This work presents the application of a parallel cooperative optimization approach to the broadcast operation in mobile ad-hoc networks (MANETs). The optimization of the broadcast operation implies satisfying several objectives simultaneously, so a multi-objective approach has been designed. The optimization consists of searching for the best configurations of the DFCN broadcast protocol for a given MANET scenario. The cooperation of a team of multi-objective evolutionary algorithms has been performed with a novel optimization model. This model is a hybrid parallel algorithm that combines a parallel island-based scheme with a hyperheuristic approach. Results achieved by the algorithms in different stages of the search process are analyzed in order to grant more computational resources to the most suitable algorithms. The results obtained for a MANET scenario representing a mall demonstrate the validity of the new proposed approach.

Proceedings ArticleDOI
08 Jul 2009
TL;DR: This article proposes a new parallel cellular genetic algorithm which maintains (or even improves upon, thanks to its asynchronicity) the numerical behaviour of a serial cGA, while at the same time achieving an important reduction in the execution time needed to find the optimal solution.
Abstract: Cellular genetic algorithms (cGAs) are characterized by their grid-structured population, in which individuals can only interact with their neighbors. These algorithms have demonstrated high numerical performance thanks to the good exploration/exploitation balance they maintain in the search space. Although cGAs seem very appropriate for parallelism, few works propose or study parallel models for clusters of computers. This is probably because the model requires a high level of communication between sub-populations due to the tight interactions among individuals. Such parallel versions are nevertheless needed to cope with the high computational requirements of current real-world problems. This article proposes a new parallel cellular genetic algorithm which maintains (or even improves upon, thanks to its asynchronicity) the numerical behaviour of a serial cGA, while at the same time achieving an important reduction in the execution time needed to find the optimal solution.

Proceedings ArticleDOI
08 Jul 2009
TL;DR: This work evaluates a Particle Swarm Optimizer hybridized with Differential Evolution, applying it to the Black-Box Optimization Benchmarking for noiseless functions (BBOB 2009) and obtaining an accurate coverage rate.
Abstract: In this work we evaluate a Particle Swarm Optimizer hybridized with Differential Evolution and apply it to the Black-Box Optimization Benchmarking for noiseless functions (BBOB 2009). We have performed the complete procedure established in this special session dealing with noiseless functions with dimensions of 2, 3, 5, 10, 20, and 40 variables. Our proposal reached an accurate coverage rate, despite the simplicity of the model and the relatively small number of function evaluations used.

Book ChapterDOI
26 Mar 2009
TL;DR: A very large instance with 1000 preselected available locations for placing sensors (ALS), modelled with a 287×287 point grid where both RSENS and RCOMM are set to 22 points, is solved using simulated annealing (SA) and CHC.
Abstract: When a WSN is deployed in a terrain (known as the sensor field), the sensors form a wireless ad-hoc network to send their sensing results to a special station called the High Energy Communication Node (HECN). The WSN is formed by establishing all possible links between any two nodes separated by at most RCOMM, then keeping only those nodes for which a path to the HECN exists. The sensing area of the WSN is the union of the individual sensing areas (circles of radius RSENS) of these kept nodes. The objective of this problem is to maximize the sensing area of the network while minimizing the number of sensors deployed. The solutions are evaluated using a geometric fitness function. In this article we solve a very large instance with 1000 preselected available locations for placing sensors (ALS). The terrain is modelled with a 287×287 point grid, and both RSENS and RCOMM are set to 22 points. The problem is solved using simulated annealing (SA) and CHC. Every experiment is performed 30 times independently and the results are averaged to ensure statistical confidence. The influence of the allowed number of evaluations is studied. In our experiments, CHC has outperformed SA for any number of evaluations: CHC with 100,000 and 200,000 evaluations outperforms SA with 500,000 and 1,000,000 evaluations, respectively. The average fitness obtained by the two algorithms grows following a logarithmic law on the number of evaluations.
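The geometric fitness described above decomposes into a connectivity filter and a coverage count, both easy to sketch in Python (the grid and radii below are the instance values quoted in the abstract; the BFS link model is the straightforward reading of it).

```python
import math
from collections import deque

R_SENS = R_COMM = 22   # radii from the instance above
GRID = 287             # 287x287 point grid

def reachable_sensors(sensors, hecn):
    # Keep only sensors with a multi-hop path to the HECN; a link exists
    # between any two nodes at most R_COMM apart.
    pts = [hecn] + list(sensors)
    seen, queue = {0}, deque([0])
    while queue:
        i = queue.popleft()
        for j in range(len(pts)):
            if j not in seen and math.dist(pts[i], pts[j]) <= R_COMM:
                seen.add(j)
                queue.append(j)
    return [pts[i] for i in sorted(seen) if i != 0]

def covered_points(sensors):
    # Sensing area = number of grid points inside at least one sensing
    # circle (the straightforward O(GRID^2 * |sensors|) evaluation).
    return sum(
        any(math.dist((x, y), s) <= R_SENS for s in sensors)
        for x in range(GRID) for y in range(GRID)
    )
```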

Book ChapterDOI
04 Jun 2009
TL;DR: This paper proposes a hybrid ACO approach to solve the Global Positioning System (GPS) surveying problem; its results outperform those achieved by the best-so-far algorithms in the literature and represent a new state of the art for this problem.
Abstract: Ant Colony Optimization (ACO) has been used successfully to solve hard combinatorial optimization problems. This metaheuristic method is inspired by the foraging behavior of ants, which manage to establish the shortest routes from their nest to feeding sources and back. In this paper, we propose a hybrid ACO approach to solve the Global Positioning System (GPS) surveying problem. In designing a GPS surveying network, a given set of earth points must be observed consecutively (a schedule). The cost of the schedule is the sum of the times needed to go from one point to another. The problem is to search for the best order in which these observations are executed, and minimizing the cost of this schedule is the goal of this work. Our results outperform those achieved by the best-so-far algorithms in the literature, and represent a new state of the art for this problem.
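The cost function being minimized is simply the travel time along the observation order, which makes the problem TSP-like:

```python
def schedule_cost(order, travel_time):
    # Sum of the times needed to go from each observed point to the next,
    # exactly as the abstract defines the cost of a schedule.
    return sum(travel_time[a][b] for a, b in zip(order, order[1:]))

print(schedule_cost([0, 2, 1], [[0, 4, 1], [4, 0, 6], [1, 6, 0]]))  # -> 7
```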

Proceedings ArticleDOI
08 Jul 2009
TL;DR: This work evaluates a Particle Swarm Optimizer hybridized with Differential Evolution, applying it to the Black-Box Optimization Benchmarking for noisy functions (BBOB 2009) and obtaining an accurate coverage rate.
Abstract: In this work we evaluate a Particle Swarm Optimizer hybridized with Differential Evolution and apply it to the Black-Box Optimization Benchmarking for noisy functions (BBOB 2009). We have performed the complete procedure established in this special session dealing with noisy functions with dimensions of 2, 3, 5, 10, 20, and 40 variables. Our proposal reached an accurate coverage rate, despite the simplicity of the model and the relatively small number of function evaluations used.

01 Jan 2009
TL;DR: This article analyzes two metaheuristic algorithms applied to wind farm optimization, CHC and GPSO, studying the performance of each algorithm and the behavior of the computed wind farm designs.
Abstract: In this article we analyze two kinds of metaheuristic algorithms applied to wind farm optimization. The basic idea is to use the CHC (a variant of GA) and GPSO (a variant of PSO) algorithms to obtain an acceptable configuration of wind turbines in the wind farm that maximizes the total output energy and minimizes the number of wind turbines used. The energy produced depends on the farm geometry, the wind conditions, and the terrain where the farm is located. In this work we analyze three wind farm scenarios with different wind speeds and apply both algorithms in order to study their performance and the behavior of the computed wind farm designs.

Proceedings ArticleDOI
08 Jul 2009
TL;DR: This work defines a branch distance for logical expressions containing the instanceof operator in Java programs and proposes two mutation operators based on the distance, and studies the behaviour of the mutation operators on a benchmark set composed of nine OO programs.
Abstract: Most of the software developed in the world follows the object-oriented (OO) paradigm. However, the existing work on evolutionary testing is mainly targeted to procedural languages. All this work can be used with small changes on OO programs, but object orientation introduces new features that are not present in procedural languages. Some important issues are polymorphism and inheritance. In this paper we want to make a contribution to the inheritance field by proposing some approaches that use the information of the class hierarchy for helping test case generators to better guide the search. To the best of our knowledge, no work exists using this information to propose test cases. In this work we define a branch distance for logical expressions containing the instanceof operator in Java programs. In addition to the distance measure, we propose two mutation operators based on the distance. We study the behaviour of the mutation operators on a benchmark set composed of nine OO programs. The results show that the information collected from the class hierarchy helps in the search for test cases.
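To make the idea concrete, here is a Python sketch of one plausible hierarchy-based distance for an `x instanceof T` predicate; the paper defines its own measure for Java class hierarchies, so the nearest-common-ancestor formulation below is an illustrative assumption.

```python
def instanceof_distance(obj, target_cls):
    # 0 when the predicate already holds; otherwise the number of
    # inheritance steps from the object's class and from the target class
    # up to their nearest common ancestor.
    if isinstance(obj, target_cls):
        return 0
    src, dst = type(obj).__mro__, target_cls.__mro__
    common = next(c for c in src if c in dst)  # `object` at the latest
    return src.index(common) + dst.index(common)

class Shape: pass
class Circle(Shape): pass
class Square(Shape): pass

print(instanceof_distance(Circle(), Square))  # siblings -> 1 + 1 = 2
```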

Book ChapterDOI
07 Feb 2009
TL;DR: Several Simulated Annealing (SA) algorithms are developed to provide near-optimal solutions for large networks with bounded computational effort.
Abstract: In designing Global Positioning System (GPS) surveying network, a given set of earth points must be observed consecutively (schedule). The cost of the schedule is the sum of the time needed to go from one point to another. The problem is to search for the best order in which this observation is executed. Minimizing the cost of this schedule is the goal of this work. Solving the problem for large networks to optimality requires impractical computational times. In this paper, several Simulated Annealing (SA) algorithms are developed to provide near-optimal solutions for large networks with bounded computational effort.


Book ChapterDOI
04 Jun 2009
TL;DR: The results show that AbYSS not only reaches very accurate solutions for the three instances, but also scales well with increasingly sized instances.
Abstract: Planning a cellular phone network makes engineers face a number of challenging optimization problems. This paper addresses the solution of one of these problems, Automatic Cell Planning (ACP), which consists of positioning the antennae of the network and configuring them properly in order to meet several objectives and constraints. This paper approaches, for the first time, the ACP problem with a Scatter Search technique. The algorithm used is called AbYSS. Three large-scale real-world instances have been used for evaluating the search capabilities of AbYSS on this optimization problem. The results show that AbYSS not only reaches very accurate solutions for the three instances, but also scales well with increasingly sized instances.

01 Jan 2009
TL;DR: In this article, Ant Colony Optimization (ACO) is used to find error trails in concurrent systems with a reduced amount of resources, outperforming in most cases the results of Nested Depth First Search (NDFS).
Abstract: Most model checkers found in the literature use exact deterministic algorithms to check the properties. The memory required for verification with these algorithms usually grows exponentially with the size of the system to verify. When the search for errors with a low amount of computational resources (memory and time) is a priority (for example, in the first stages of the implementation of a program), non-exhaustive algorithms using heuristic information can be used. In this work we summarize our observations after applying Ant Colony Optimization to find property violations in concurrent systems using an explicit-state model checker. The experimental studies show that ACO finds optimal or near-optimal error trails in faulty concurrent systems with a reduced amount of resources, outperforming in most cases the results of algorithms that are widely used in model checking, like Nested Depth First Search. This makes ACO suitable for checking properties in large faulty concurrent programs, in which traditional techniques fail to find counterexamples because of the model size.

OtherDOI
06 Feb 2009
TL;DR: This chapter introduces hybrid models for the Multidimensional Knapsack Problem and reports an experimental analysis of these models.
Abstract: This chapter contains sections titled: Introduction, Multidimensional Knapsack Problem, Hybrid Models, Experimental Analysis, Conclusions, and References.


Book ChapterDOI
01 Jan 2009
TL;DR: This chapter aims at giving an overview of Evolutionary Algorithms and Ant Colony Optimization when applied to the two-dimensional strip packing problem.
Abstract: In the last few years, metaheuristic approaches have shown important development in many application areas. This situation has turned them into one of the most appropriate candidates for dealing with difficult real-world problems for which timely, good-quality solutions are necessary. Furthermore, the class of metaheuristic approaches includes a large number of variants and designs which mainly depend on the concepts from which they are inspired. This chapter aims at giving an overview of Evolutionary Algorithms and Ant Colony Optimization when applied to the two-dimensional strip packing problem. The respective performances of these two metaheuristics are analyzed and compared from different perspectives by implementing a Genetic Algorithm and an Ant Colony System.


Proceedings ArticleDOI
13 May 2009
TL;DR: A perspective on the old times (before SBSE), the recent years of SBSE, and the many avenues for future research and development spinning around this exciting clash of stars.
Abstract: This paper is a brief description of the revamped presentation based on the original one I had the honor to deliver back in 2009 during the very first SSBSE in London. At that time, the many international forces dealing with search, optimization, and learning (SOL) met software engineering (SE) researchers in person, all of them looking for a quantified manner of modeling and solving problems in software. The contents of this work, as in the original one, build on the basis of metaheuristics to highlight the many good ways in which they can help to create a well-grounded domain where the construction, assessment, and exploitation of software are not just based on human expertise, but enhanced with intelligent automatic tools. Since the whole story started well before the first SSBSE in 2009, we mention a few earlier applications of intelligent algorithms in software engineering, and discuss the present interest and future challenges of the domain, structured in both short- and long-term goals. If we understand this as a cross-fertilization task between research fields, then we can learn a wider and more useful lesson for innovative research. In short, we offer here a semantic perspective of the old times (before SBSE), the recent years of SBSE, and the many avenues for future research and development spinning around this exciting clash of stars. A new galaxy has been born out of the body of knowledge in SOL and SE, creating forever a new class of researchers capable of building unparalleled tools and delivering scientific results for the benefit of software, that is, of modern societies.