Symbolic regression
Genetic programming
Transformer
Computer science
Programming languages
Regression
Artificial intelligence
Natural language processing
Statistics
Mathematics
Engineering
Electrical engineering
Voltage
Authors
Philipp Anthes, Dominik Sobania, Franz Rothlauf
Source
Journal: Cornell University - arXiv
Date: 2025-01-30
Identifier
DOI: 10.48550/arxiv.2501.18479
Abstract
In standard genetic programming (stdGP), solutions are varied by modifying their syntax, with uncertain effects on their semantics. Geometric-semantic genetic programming (GSGP), a popular variant of GP, effectively searches the semantic solution space using variation operations based on linear combinations, although it results in significantly larger solutions. This paper presents Transformer Semantic Genetic Programming (TSGP), a novel and flexible semantic approach that uses a generative transformer model as a search operator. The transformer is trained on synthetic test problems and learns semantic similarities between solutions. Once the model is trained, it can be used to create offspring solutions with high semantic similarity, even for unseen and unknown problems. Experiments on several symbolic regression problems show that TSGP generates solutions with comparable or even significantly better prediction quality than stdGP, SLIM_GSGP, DSR, and DAE-GP. Like SLIM_GSGP, TSGP is able to create new solutions that are semantically similar without creating solutions of large size. An analysis of the search dynamics reveals that the solutions generated by TSGP are semantically more similar than those generated by the benchmark approaches, allowing better exploration of the semantic solution space.
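The abstract describes TSGP only at a high level: a generative transformer, trained offline on synthetic symbolic-regression problems, replaces syntactic variation and samples offspring that are semantically close to their parent. The paper's actual architecture, tokenization, and search loop are not given here, so the Python sketch below is purely illustrative; SemanticVariationModel, propose_offspring, and the random perturbation standing in for model sampling are assumed names and simplifications, not the authors' implementation.

import random
import numpy as np

def semantics(expr, X):
    # Semantics of a solution = its vector of outputs on the training inputs.
    return np.array([expr(x) for x in X])

def mse(expr, X, y):
    # Prediction quality used as the fitness of a candidate solution.
    return float(np.mean((semantics(expr, X) - y) ** 2))

class SemanticVariationModel:
    # Stand-in for the trained generative transformer. In TSGP the model is
    # trained offline on synthetic test problems so that, given a parent,
    # it samples offspring with similar semantics. Here a small random
    # perturbation of the parent's behaviour merely mimics that interface.
    def propose_offspring(self, parent):
        shift = random.gauss(0.0, 0.1)  # placeholder for sampling from the model
        return lambda x, p=parent, s=shift: p(x) + s

def tsgp_style_search(X, y, pop_size=50, generations=30):
    model = SemanticVariationModel()
    # Initial population: random linear expressions (a deliberately tiny prior).
    population = [lambda x, a=random.uniform(-1.0, 1.0): a * x
                  for _ in range(pop_size)]
    for _ in range(generations):
        next_population = []
        for parent in population:
            child = model.propose_offspring(parent)  # semantic variation step
            # Keep whichever of parent/child predicts the targets better.
            next_population.append(min((parent, child), key=lambda e: mse(e, X, y)))
        population = next_population
    return min(population, key=lambda e: mse(e, X, y))

if __name__ == "__main__":
    X = np.linspace(-1.0, 1.0, 20)
    y = 2.0 * X + 0.5  # toy regression target
    best = tsgp_style_search(X, y)
    print("best MSE:", round(mse(best, X, y), 4))

The point of the sketch is the interface suggested by the abstract: semantics are treated as output vectors on the training inputs, and the learned operator is queried once per parent to produce a semantically similar child, which is kept only if it improves the fit.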