Hybrid Single Node Genetic Programming for Symbolic Regression
Citations
Choosing function sets with better generalisation performance for symbolic regression models
Policy derivation methods for critic-only reinforcement learning in continuous spaces
Symbolic regression driven by training data and prior knowledge
Optimal Control via Reinforcement Learning with Symbolic Policy Approximation
Enhanced Symbolic Regression Through Local Variable Transformations
References
Regularization and variable selection via the elastic net
Regularization Paths for Generalized Linear Models via Coordinate Descent
Genetic Programming: On the Programming of Computers by Means of Natural Selection
Frequently Asked Questions (12)
Q2. What is the next step of the research?
The next step of their research will be a thorough experimental evaluation of the modified SNGP algorithms, with the primary objectives being the speed of convergence and the ability to react quickly to changes in the environment, so that the algorithm can be deployed in the dynamic symbolic regression scenario.
Q3. What is the reason for imposing high selection pressure on the root nodes?
In fact, imposing high selection pressure on the root nodes might be counter-productive in the end, as mutations applied to the root nodes are less likely to bring an improvement than mutations applied to the deeper structures of the trees.
Q4. What is the reason for the proposed modifications of the SNGP algorithm?
The proposed modifications of the SNGP algorithm are configured with the following parameters:
– upToN ∈ {1, 5}
– selection is either random (denoted as 'r') or depthwise (denoted as 'd')
– moveType is either moveLeft (denoted as 'l'), moveRight (denoted as 'r'), or no move (denoted as 'n')
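For concreteness, these parameter values imply 2 × 2 × 3 = 12 tested configurations, which a small Python sketch can enumerate (the short codes follow the paper's notation):

```python
from itertools import product

up_to_n   = [1, 5]
selection = ["r", "d"]        # random / depthwise
move_type = ["l", "r", "n"]   # moveLeft / moveRight / no move

configs = list(product(up_to_n, selection, move_type))
for n, sel, move in configs:
    print(f"upToN={n}, selection={sel}, moveType={move}")
print(len(configs), "configurations")   # -> 12
```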
Q5. What are the main objectives of the proposed hybrid SNGP?
Further investigations will include the utilization of new mutation operators, the identification of suitable "high-level" basic functions for the SNGP's function set, and the design of mechanisms to evolve the inner constants of the models as well as mechanisms for escaping from local optima.
Q6. What is the SNGP with generational replacement strategy?
Standard GP with a generational replacement strategy was used with the following parameters:
– Function set: {+, -, *, /}
– Terminal set: {x1, x2, 1.0}
– Population size: 500
– Initialization method: ramped half-and-half
– Tournament selection: 5 candidates
– Number of generations: 55, i.e. 54 generations plus the initialization of the whole population
– Crossover probability: 90%
– Reproduction probability: 10%
– Probability of choosing an internal node as crossover point: 90%
For the experiments with GP the authors used the Java-based Evolutionary Computation Research System ECJ.
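As an illustration only (the authors used the Java-based ECJ system, not this code), an equivalent standard-GP configuration could be set up with the Python DEAP library roughly as follows; the depth bounds of the ramped initialization and the protected division are our assumptions:

```python
import operator
from deap import base, creator, gp, tools

def protected_div(a, b):
    # protected division, a common convention for the '/' primitive
    return a / b if abs(b) > 1e-9 else 1.0

pset = gp.PrimitiveSet("MAIN", 2)            # two input variables
pset.addPrimitive(operator.add, 2)
pset.addPrimitive(operator.sub, 2)
pset.addPrimitive(operator.mul, 2)
pset.addPrimitive(protected_div, 2)
pset.addTerminal(1.0)
pset.renameArguments(ARG0="x1", ARG1="x2")   # terminal set {x1, x2, 1.0}

creator.create("FitnessMin", base.Fitness, weights=(-1.0,))
creator.create("Individual", gp.PrimitiveTree, fitness=creator.FitnessMin)

toolbox = base.Toolbox()
# ramped half-and-half initialization (depth bounds 2..6 are our guess)
toolbox.register("expr", gp.genHalfAndHalf, pset=pset, min_=2, max_=6)
toolbox.register("individual", tools.initIterate, creator.Individual, toolbox.expr)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)
toolbox.register("select", tools.selTournament, tournsize=5)
# termpb=0.1 => a leaf is chosen as crossover point with 10% probability,
# i.e. an internal node with 90% probability, matching the paper's setting
toolbox.register("mate", gp.cxOnePointLeafBiased, termpb=0.1)

pop = toolbox.population(n=500)              # population size 500
```

The generational loop itself (90% crossover, 10% reproduction, 54 generations) is omitted here for brevity.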
Q7. What is the significance level of the t-test?
Checked using the t-test calculated with the significance level α = 0.05. It has been widely reported in the literature that evolutionary algorithms work much better when hybridized with local search techniques, a concept known as memetic algorithms [7].
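A minimal sketch of such a significance check at α = 0.05, using SciPy's two-sample t-test; the per-run error samples below are made up for illustration and are not the paper's results:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
errors_sngp   = rng.normal(0.12, 0.03, size=30)   # hypothetical per-run errors
errors_hybrid = rng.normal(0.09, 0.03, size=30)

t_stat, p_value = stats.ttest_ind(errors_sngp, errors_hybrid)
alpha = 0.05
print(f"t = {t_stat:.3f}, p = {p_value:.4f}, significant: {p_value < alpha}")
```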
Q8. What is the way to learn a linear regression model?
Several methods have emerged [1], [2], [15], [21], [22] that explicitly restrict the class of models to generalized linear models, i.e. to a linear combination of possibly non-linear basis functions.
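The restriction can be made concrete with a small sketch: once a set of (possibly non-linear) basis functions is fixed, the model is linear in its coefficients and can be fitted by ordinary least squares. The basis functions below are illustrative; in the paper they are evolved by SNGP:

```python
import numpy as np

def basis(X):
    # hypothetical fixed basis functions phi_j(x)
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([x1, x2, x1 * x2, x1**2, np.sin(x2)])

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, size=(100, 2))
y = 2.0 * X[:, 0] ** 2 - 0.5 * X[:, 1] + rng.normal(0, 0.01, 100)

Phi = basis(X)
# the model y ≈ Phi @ coef is linear in coef, so OLS suffices
coef, *_ = np.linalg.lstsq(Phi, y, rcond=None)
print(coef)
```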
Q9. What is the complexity of the LASSO model?
The complexity of the LASSO model is controlled by (1) the maximal depth of features evolved in the population and (2) the maximum number of features the LASSO model can be composed of.
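A hedged sketch of the second control using scikit-learn's Lasso: one simple way to cap the number of features is to increase the regularization strength until at most the desired number of coefficients remains non-zero. The feature matrix below is synthetic, and the depth limit would be enforced during feature evolution (not shown):

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
Phi = rng.normal(size=(200, 20))          # stand-in for evolved feature matrix
y = Phi[:, 0] - 2 * Phi[:, 3] + rng.normal(0, 0.1, 200)

max_features = 5                          # cap on features in the final model
alpha = 1e-4
while True:
    model = Lasso(alpha=alpha, max_iter=10_000).fit(Phi, y)
    nnz = np.count_nonzero(model.coef_)
    if nnz <= max_features:
        break
    alpha *= 2.0                          # stronger shrinkage => fewer features
print(f"alpha = {alpha:.4g}, features used: {nnz}")
```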
Q10. What is the purpose of this paper?
This paper deals with the Single Node Genetic Programming (SNGP) method and proposes modifications and ways of hybridizing it to improve its performance.
Q11. What is the residua of the SNGP?
Each feature f_i is evolved in a separate run of the SNGP (line 6 of the algorithm) such that it correlates the most with the residua R (i.e. the vector of error values over all training samples) produced by the current LASSO regression model composed of i−1 features.
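The loop can be sketched as follows (our Python paraphrase, not the authors' code); evolve_feature is a hypothetical stand-in for one SNGP run, here just picking the random candidate most correlated with the current residua:

```python
import numpy as np
from sklearn.linear_model import Lasso

def evolve_feature(X, residuals, rng, n_candidates=100):
    """Stand-in for one SNGP run: among random non-linear candidate
    features, return the one most correlated with the residua."""
    candidates = [np.tanh(X @ rng.normal(size=X.shape[1]) + rng.normal())
                  for _ in range(n_candidates)]
    return max(candidates,
               key=lambda c: abs(np.corrcoef(c, residuals)[0, 1]))

def hybrid_sngp_lasso(X, y, n_features=5, alpha=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    features, R = [], y.copy()        # before any feature, residua = targets
    model = None
    for _ in range(n_features):
        features.append(evolve_feature(X, R, rng))   # feature f_i
        Phi = np.column_stack(features)
        model = Lasso(alpha=alpha, max_iter=10_000).fit(Phi, y)
        R = y - model.predict(Phi)    # residua of the current LASSO model
    return model, features

# usage on a toy problem
rng = np.random.default_rng(3)
X = rng.uniform(-1, 1, (200, 2))
y = X[:, 0] ** 2 + np.sin(3 * X[:, 1])
model, feats = hybrid_sngp_lasso(X, y)
print("final training MSE:",
      np.mean((y - model.predict(np.column_stack(feats))) ** 2))
```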
Q12. What is the SNGP algorithm for f2?
There is a clear trend showing that the SNGP without LASSO does well on the rather simple benchmarks f1 and f2 (it is even better than both hybrid algorithms on f2), i.e. on the polynomials that involve only trivial integer constants.