Proceedings ArticleDOI

Prioritized grammar enumeration: symbolic regression by dynamic programming

Tony Worm, +1 more
- pp 1021-1028
TLDR
Prioritized Grammar Enumeration (PGE) is a deterministic Symbolic Regression algorithm using dynamic programming techniques; it replaces genetic operators and random number use with grammar production rules and systematic choices, an alternative perspective the authors hope leads the community to new ideas.
Abstract
We introduce Prioritized Grammar Enumeration (PGE), a deterministic Symbolic Regression (SR) algorithm using dynamic programming techniques. PGE maintains the tree-based representation and Pareto non-dominated sorting from Genetic Programming (GP), but replaces genetic operators and random number use with grammar production rules and systematic choices. PGE uses non-linear regression and abstract parameters to fit the coefficients of an equation, effectively separating the exploration for form, from the optimization of a form. Memoization enables PGE to evaluate each point of the search space only once, and a Pareto Priority Queue provides direction to the search. Sorting and simplification algorithms are used to transform candidate expressions into a canonical form, reducing the size of the search space. Our results show that PGE performs well on 22 benchmarks from the SR literature, returning exact formulas in many cases. As a deterministic algorithm, PGE offers reliability and reproducibility of results, a key aspect to any system used by scientists at large. We believe PGE is a capable SR implementation, following an alternative perspective we hope leads the community to new ideas.
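The abstract describes the whole PGE loop: expand candidate forms with grammar production rules, reduce each candidate to a canonical form, memoize so every point of the search space is evaluated only once, and pull the next candidate from a Pareto Priority Queue. Below is a minimal, hypothetical Python sketch of that loop, not the authors' implementation: it assumes a toy grammar over a single variable, uses a scalar error in place of Pareto non-dominated sorting, and skips the nonlinear coefficient fitting the paper performs on abstract parameters.

```python
import heapq
import itertools
import math

# Hypothetical toy grammar: one variable, two unary and two binary operators.
VARIABLES = ["x"]
UNARY_OPS = ["sin", "cos"]
BINARY_OPS = ["+", "*"]


def canonical(expr):
    """Canonical string form of an expression tree (op, [children]).
    Children of commutative operators are sorted so algebraically identical
    trees collapse to the same key."""
    op, args = expr
    if not args:
        return op
    parts = [canonical(a) for a in args]
    if op in ("+", "*"):
        parts.sort()
    return f"{op}({','.join(parts)})"


def expand(expr):
    """Grammar production rules: wrap the expression in a unary operator or
    combine it with a variable through a binary operator."""
    successors = [(u, [expr]) for u in UNARY_OPS]
    for b in BINARY_OPS:
        for v in VARIABLES:
            successors.append((b, [expr, (v, [])]))
    return successors


def evaluate(expr, x):
    """Evaluate an expression tree at a single point."""
    op, args = expr
    if not args:
        return x
    if op == "+":
        return evaluate(args[0], x) + evaluate(args[1], x)
    if op == "*":
        return evaluate(args[0], x) * evaluate(args[1], x)
    if op == "sin":
        return math.sin(evaluate(args[0], x))
    return math.cos(evaluate(args[0], x))


def error(expr, data):
    """Mean squared error. A real PGE implementation would first fit abstract
    coefficients with nonlinear regression and would keep error and expression
    size as separate Pareto objectives."""
    return sum((y - evaluate(expr, x)) ** 2 for x, y in data) / len(data)


def pge_search(data, iterations=100):
    tie = itertools.count()                    # tie-breaker for the heap
    start = ("x", [])
    seen = {canonical(start)}                  # memoization of canonical forms
    queue = [(error(start, data), next(tie), start)]
    best_err, best_expr = queue[0][0], start
    for _ in range(iterations):
        if not queue:
            break
        err, _, expr = heapq.heappop(queue)    # scalar stand-in for the
                                               # Pareto Priority Queue
        if err < best_err:
            best_err, best_expr = err, expr
        for succ in expand(expr):
            key = canonical(succ)
            if key in seen:                    # evaluate each point only once
                continue
            seen.add(key)
            heapq.heappush(queue, (error(succ, data), next(tie), succ))
    return best_err, canonical(best_expr)


if __name__ == "__main__":
    data = [(x / 10.0, math.sin(x / 10.0) + x / 10.0) for x in range(1, 50)]
    print(pge_search(data))
```

The canonicalization step, sorting the children of commutative operators, is what collapses equivalent forms and shrinks the search space; everything else is an ordinary best-first search over grammar expansions.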


Citations
Proceedings ArticleDOI

Multiple regression genetic programming

TL;DR: MRGP's output method is shown to be superior to using the raw output of program execution, and it represents a practical, cost-neutral improvement to GP.
Posted Content

Discovering Symbolic Models from Deep Learning with Inductive Biases

TL;DR: A general approach is proposed to distill symbolic representations of a learned deep model by introducing strong inductive biases, though the approach is restricted to Graph Neural Networks (GNNs).
Journal ArticleDOI

Feature engineering and symbolic regression methods for detecting hidden physics from sparse sensor observation data

TL;DR: In this paper, a modular approach for distilling hidden flow physics from discrete and sparse observations is proposed, which combines evolutionary computation with feature engineering to provide a tool for discovering hidden parameterizations embedded in the trajectory of fluid flows in the Eulerian frame of reference.
Proceedings ArticleDOI

Geometric Semantic Genetic Programming with Local Search

TL;DR: The experimental results show that GSGP-LS achieves the best training fitness while converging very quickly, but severely overfits, and suggest that future GSGP algorithms should focus on finding the correct balance between the greedy optimization of a local search strategy and the more robust geometric semantic operators.
References
Book

Genetic Programming: On the Programming of Computers by Means of Natural Selection

TL;DR: This book discusses the evolution of architecture, primitive functions, terminals, sufficiency, and closure, and the role of representation and the lens effect in genetic programming.

SPEA2: Improving the Strength Pareto Evolutionary Algorithm

TL;DR: An improved version of SPEA, namely SPEA2, is proposed; in contrast to its predecessor, it incorporates a fine-grained fitness assignment strategy, a density estimation technique, and an enhanced archive truncation method.
Book ChapterDOI

A Fast Elitist Non-dominated Sorting Genetic Algorithm for Multi-objective Optimisation: NSGA-II

TL;DR: Simulation results on five difficult test problems show that the proposed NSGA-II, in most problems, is able to find a much better spread of solutions and better convergence near the true Pareto-optimal front compared to PAES and SPEA, two other elitist multi-objective EAs that pay special attention to creating a diverse Pareto-optimal front.
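Both NSGA-II and PGE rank candidates by Pareto non-dominated sorting rather than by a single score. As a point of reference, here is a minimal illustrative sketch (not taken from either paper) of extracting the first non-dominated front for two minimization objectives, such as model error and expression size:

```python
def dominates(a, b):
    """a dominates b if it is no worse in every objective and strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))


def first_front(points):
    """Return the points not dominated by any other point (the first Pareto front)."""
    return [p for p in points
            if not any(dominates(q, p) for q in points if q is not p)]


if __name__ == "__main__":
    # (error, size) pairs for hypothetical candidate expressions
    candidates = [(0.9, 3), (0.5, 5), (0.5, 4), (0.2, 9), (0.6, 4)]
    print(first_front(candidates))   # -> [(0.9, 3), (0.5, 4), (0.2, 9)]
```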
Proceedings Article

Genetic Algorithms for Multiobjective Optimization: Formulation, Discussion and Generalization

TL;DR: A rank-based fitness assignment method for Multiple Objective Genetic Algorithms (MOGAs) is proposed, and the genetic algorithm is viewed as the optimizing element of a multiobjective optimization loop that also comprises the decision maker (DM).