On the Generalization Ability of Geometric Semantic Genetic Programming
Abstract
Geometric Semantic Genetic Programming (GSGP) is a recently proposed form of Genetic Programming (GP) that searches directly in the space of the underlying semantics of the programs. The fitness landscape seen by the GSGP variation operators is, by construction, unimodal with a linear slope and therefore easy to search. Despite this advantage, the offspring produced by these operators grow very quickly. A new implementation of the same operators was proposed that computes the semantics of the offspring without explicitly building their syntax, which allowed GSGP to be applied for the first time to real-life multidimensional datasets. GSGP exhibited a surprisingly good generalization ability, which was attributed to certain properties of the geometric semantic operators. In this paper, we show that the good generalization of GSGP was in fact the result of a small implementation deviation from the original formulation of the mutation operator, and that without it the generalization results would be significantly worse. We explain the reason for this difference and then propose two variants of the geometric semantic mutation that deterministically and optimally adapt the mutation step. These variants prove more efficient at learning the training data, and they achieve competitive generalization with only a single operator application. This provides a competitive alternative for semantic search, particularly since the variants produce small individuals and are fast to compute.
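To make the operator under discussion concrete, below is a minimal sketch of geometric semantic mutation applied directly to semantics (vectors of program outputs on the training cases), assuming NumPy arrays. The function names, the `ms` parameter, the optional logistic bounding of the random trees, and the least-squares step adaptation are illustrative assumptions, not the authors' code.

```python
import numpy as np

def logistic(x):
    # Bound random-tree outputs to (0, 1). Wrapping the random trees this way
    # is the kind of small implementation detail discussed in the abstract
    # (illustrative assumption, not the authors' implementation).
    return 1.0 / (1.0 + np.exp(-x))

def geometric_semantic_mutation(parent_sem, r1_sem, r2_sem, ms=0.1, bound=True):
    # Offspring semantics = parent + ms * (TR1 - TR2); only semantics are
    # manipulated, so the offspring syntax never has to be built explicitly.
    if bound:
        r1_sem, r2_sem = logistic(r1_sem), logistic(r2_sem)
    return parent_sem + ms * (r1_sem - r2_sem)

def optimal_mutation_step(parent_sem, r1_sem, r2_sem, target):
    # One plausible way to adapt the step deterministically: choose ms that
    # minimizes the squared training error ||target - (parent + ms * d)||^2,
    # which gives ms = d . (target - parent) / (d . d). This is a least-squares
    # sketch only; the paper's two variants may differ in detail.
    d = r1_sem - r2_sem
    denom = np.dot(d, d)
    return np.dot(d, target - parent_sem) / denom if denom > 0 else 0.0
```

Under these assumptions, a single call to `geometric_semantic_mutation` with `ms` set by `optimal_mutation_step` roughly corresponds to the "single operator application" mentioned in the abstract.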
