Mathematical expression exploration with graph representation and generative graph neural network.
Journal:
Neural Networks: The Official Journal of the International Neural Network Society
PMID:
40147161
Abstract
Symbolic Regression (SR) methods based on tree representations have achieved commendable results across both Genetic Programming (GP) and deep learning search paradigms. Nonetheless, the tree representation of mathematical expressions often contains redundant substructures; representing expressions as computation graphs is more succinct and intuitive. Although graph representations have been adopted in evolutionary SR strategies, they remain under-explored in deep learning paradigms. Motivated by the substantial advances deep learning has brought to tree-based SR, we propose to address SR tasks with a Directed Acyclic Graph (DAG) representation of mathematical expressions, coupled with a generative graph neural network. We name the proposed method Graph-based Deep Symbolic Regression (GraphDSR). We vectorize node types and use an adjacency matrix to encode connections. The graph neural network constructs the DAG incrementally, sampling node types and graph connections conditioned on the previous partial DAG at each step. At every sampling step, a validity check is applied to avoid meaningless samples, and four domain-agnostic constraints further prune the search space. The process terminates once a complete expression emerges. Constants are then optimized with SGD and BFGS, and the resulting rewards refine the graph neural network through reinforcement learning. A comprehensive evaluation across 110 benchmarks demonstrates the effectiveness of our approach.
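To make the two core ideas of the abstract concrete, the sketch below illustrates (a) a DAG encoding of an expression via one-hot node-type vectors and an adjacency matrix, and (b) constant optimization by gradient descent. This is a minimal illustration in NumPy, not the paper's implementation: the node-type vocabulary, the example expression c*sin(x)+x, and the plain-SGD constant fit are all our own assumptions (the paper also uses BFGS).

```python
import numpy as np

# Assumed node-type vocabulary (not from the paper).
NODE_TYPES = ["x", "const", "sin", "mul", "add"]

# DAG for the expression c * sin(x) + x: one node per row,
# with A[i, j] = 1 meaning node i feeds into node j.
nodes = ["x", "const", "sin", "mul", "add"]
A = np.zeros((5, 5), dtype=int)
A[0, 2] = 1   # x -> sin
A[2, 3] = 1   # sin(x) -> mul
A[1, 3] = 1   # const -> mul
A[3, 4] = 1   # c * sin(x) -> add
A[0, 4] = 1   # x -> add (root computes c * sin(x) + x)

# Vectorized node types: one one-hot row per node.
one_hot = np.eye(len(NODE_TYPES))[[NODE_TYPES.index(n) for n in nodes]]

def evaluate(c, x):
    """Evaluate the DAG above for a given constant c."""
    return c * np.sin(x) + x

# Constant optimization by plain SGD on mean squared error.
rng = np.random.default_rng(0)
x = rng.uniform(-3, 3, 200)
y = evaluate(1.7, x)               # synthetic target with true c = 1.7

c, lr = 0.0, 0.05
for _ in range(500):
    grad = np.mean(2 * (evaluate(c, x) - y) * np.sin(x))  # d MSE / d c
    c -= lr * grad

print(round(c, 2))  # recovers a constant close to 1.7
```

Note that the DAG is strictly more compact than the equivalent expression tree whenever a subexpression is reused: here the `x` node feeds both `sin` and `add`, whereas a tree would duplicate it.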