Improving differential evolution using inductive programming (Master thesis)

Abstract

Differential Evolution (DE) has emerged as one of the most powerful and versatile global numerical optimizers for non-differentiable, multimodal problems. Its popularity has led to extensive work on improving the algorithm, and significant advances have become increasingly difficult to achieve.

Most researchers seek to improve DE through formal analysis or manual exploration of the algorithm. However, many improvements are unlikely to be found with these approaches. In this thesis, we use another evolutionary algorithm, the inductive programming system Automatic Design of Algorithms Through Evolution (ADATE), to search for improvements empirically by systematically testing millions of synthesized modifications.

However, even the rigorous requirements for competitions among state-of-the-art algorithms leave considerable statistical uncertainty on many problems. This raises a significant challenge: how can modifications be evaluated quickly enough? Additionally, the algorithms are too large to improve all at once, which raises the question of which parts to improve. These questions were explored in three experiments.

The first experiment attempted to improve the mutation operator of the canonical DE algorithm by evaluating modifications quickly, at the cost of high statistical uncertainty. Although the resulting mutation operator performed worse when tested over a larger number of generations, it provided valuable knowledge for the subsequent experiments.
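For context, the mutation operator targeted here is the canonical DE/rand/1 operator, which can be sketched as follows (a minimal Python illustration, not the code evolved by ADATE; the function name mutate_rand_1 is chosen here for clarity):

```python
import numpy as np

def mutate_rand_1(pop: np.ndarray, i: int, F: float = 0.5) -> np.ndarray:
    """DE/rand/1 mutation: v = x_r1 + F * (x_r2 - x_r3),
    where r1, r2, r3 are distinct indices different from the target i."""
    candidates = [j for j in range(len(pop)) if j != i]
    r1, r2, r3 = np.random.choice(candidates, size=3, replace=False)
    return pop[r1] + F * (pop[r2] - pop[r3])
```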

The second experiment greatly increased the statistical certainty while improving the pool of strategies in Competitive Differential Evolution, a variant that uses a pool of competing strategies to produce mutated individuals. The improved pool achieved a significant performance increase on the CEC 2014 benchmark problems.
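The competitive mechanism can be sketched as roulette-wheel selection among strategies weighted by their recent success counts, roughly as below (an illustrative sketch of the general competitive-DE scheme; the constants n0 and delta and the reset rule are common defaults, not necessarily the thesis's exact configuration):

```python
import numpy as np

class CompetitivePool:
    """Selects among competing DE strategies with probability
    proportional to their smoothed success counts."""

    def __init__(self, num_strategies: int, n0: int = 2, delta: float = 0.05):
        self.n0 = n0          # smoothing constant so no strategy starves
        self.delta = delta    # reset threshold for selection probabilities
        self.successes = np.zeros(num_strategies)

    def probabilities(self) -> np.ndarray:
        w = self.successes + self.n0
        return w / w.sum()

    def pick(self) -> int:
        p = self.probabilities()
        if p.min() < self.delta:  # a strategy fell out of favour: restart the competition
            self.successes[:] = 0
            p = self.probabilities()
        return int(np.random.choice(len(p), p=p))

    def reward(self, h: int) -> None:
        # Called when strategy h produced an offspring that replaced its parent.
        self.successes[h] += 1
```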

This led to the third experiment, which improved the state-of-the-art LSHADE-EpSin algorithm. This DE variant uses an ensemble of sinusoidal functions to generate the scaling parameter F for the current-to-pbest mutation operator.

ADATE improved the mutation operator and the ensemble simultaneously, finding an improved ensemble consisting of a single sine wave that achieved statistically better performance on the CEC 2014 problems in 30 dimensions.
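To make the setting concrete, the sketch below shows current-to-pbest/1 mutation driven by one sinusoidal F schedule of the kind LSHADE-EpSin uses (an illustration under simplifying assumptions: the external archive is omitted, and the evolved single-sine ensemble reported in the thesis differs in its exact parameters):

```python
import numpy as np

def sinusoidal_F(g: int, g_max: int, freq: float = 0.5) -> float:
    """A decreasing sinusoidal schedule generating F from the
    generation counter g, as in one half of the LSHADE-EpSin ensemble."""
    return 0.5 * (np.sin(2 * np.pi * freq * g + np.pi) * (g_max - g) / g_max + 1)

def mutate_current_to_pbest(pop: np.ndarray, fitness: np.ndarray,
                            i: int, F: float, p: float = 0.1) -> np.ndarray:
    """current-to-pbest/1: v = x_i + F*(x_pbest - x_i) + F*(x_r1 - x_r2),
    where x_pbest is drawn from the best p*100% of the population
    (minimization assumed; archive handling omitted)."""
    n = len(pop)
    top = np.argsort(fitness)[: max(1, int(round(p * n)))]
    pbest = pop[np.random.choice(top)]
    r1, r2 = np.random.choice([j for j in range(n) if j != i], 2, replace=False)
    return pop[i] + F * (pbest - pop[i]) + F * (pop[r1] - pop[r2])
```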

Finally, this thesis proposes a specialized version of DE for training Spiking Neural Networks (SNNs). These networks have numerous advantages over traditional artificial neural networks, but no good training methods currently exist. While no experiment was conducted, the proposed algorithm outlines several modifications that exploit SNN-specific methods in a manner that might enable the search to handle the high number of parameters.
