This is part of a series entitled Genetics In Action.
To wrap up this series on evolutionary algorithms, I want to leave you with one final algorithm: the Flopping Fish Algorithm. It is not a good algorithm by any means, but it is one I’ve seen in practice.
Let’s start by taking a genetic algorithm. As you’ll recall from prior posts in the series, genetic algorithms have a pair of important modification rates: the crossover rate and the mutation rate. The crossover rate gives a genetic algorithm its power, whereas the mutation rate helps kick us out of local minima/maxima. The crossover rate should be fairly high (60-90% is a reasonable range) whereas mutation should be low (1% or so is a nice start, though in certain extreme cases, it might go as high as 20-30%).
Now let’s invert those genetic algorithm rates: a mutation rate above 50% and a crossover rate of 10% or less. As an aside, I’m not picking these numbers arbitrarily; they are rates I actually found in a production system once.
So what does this high-mutation, low-crossover search pattern look like? Flopping fish. Let’s motivate this with a story.
Imagine that you have a two-dimensional board of some arbitrarily large size. Each spot on the board is shaded somewhere between entirely white and entirely red. Your goal is to land on the reddest part of the board.
With a genetic algorithm, you build a fitness function whose output is the amount of red on the board. For every (x, y) combo, you get an R value in the range [0, 255]. You start out with a random sample of squares on the board and splice x and y values, and your strongest organisms tend to move in a semi-orderly fashion toward those darker-shaded spots. By the end of the run, you hope to have landed on the darkest spot.
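A minimal sketch of that search might look like the following. Everything here is illustrative: the `redness` function, the board size, the peak location, and the tournament-selection scheme are all my own stand-ins for whatever the real board and selection mechanism would be; only the crossover and mutation rates come from the ranges above.

```python
import random

# Toy board: redness peaks at one spot. In a real problem this would be
# a lookup of R values on the actual board. PEAK and BOARD_SIZE are
# assumptions for this demo, not anything from a real system.
BOARD_SIZE = 100
PEAK = (70, 30)

def redness(x, y):
    """Fitness function: an R value in [0, 255], highest at PEAK."""
    dist = abs(x - PEAK[0]) + abs(y - PEAK[1])
    return max(0, 255 - 2 * dist)

def evolve(crossover_rate=0.8, mutation_rate=0.01,
           pop_size=50, generations=100, seed=42):
    rng = random.Random(seed)
    pop = [(rng.randrange(BOARD_SIZE), rng.randrange(BOARD_SIZE))
           for _ in range(pop_size)]

    for _ in range(generations):
        # Tournament selection: keep the redder of two random organisms.
        def pick():
            a, b = rng.choice(pop), rng.choice(pop)
            return a if redness(*a) >= redness(*b) else b

        next_pop = []
        for _ in range(pop_size):
            x, y = pick()
            if rng.random() < crossover_rate:
                # Crossover: splice this parent's x with another parent's y.
                _, y = pick()
            # Mutation: occasionally reset a coordinate at random.
            if rng.random() < mutation_rate:
                x = rng.randrange(BOARD_SIZE)
            if rng.random() < mutation_rate:
                y = rng.randrange(BOARD_SIZE)
            next_pop.append((x, y))
        pop = next_pop

    return max(pop, key=lambda p: redness(*p))

best = evolve()
print("best spot:", best, "redness:", redness(*best))
```

Note how the two rates divide the labor: crossover recombines a good x from one organism with a good y from another, while the rare mutation occasionally teleports a coordinate to shake loose any local maximum.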
By contrast, the flopping fish algorithm has you start off with a bucket of fish. You pour them onto the board and watch them as they flop around. Their flopping is not entirely aimless, but there is little rhyme or reason to the motions. Eventually, the fish stop flopping and you collect the points at which they met their doom, choosing the darkest-hued spot as your solution.
The flopping fish algorithm is a waystation between a genetic algorithm (which you might consider controlled “randomness”) and random selection of points. It is a little different from pulling a bunch of random numbers from thin air—with the flopping fish, each fish can move only so far in its gasps—but the effect is about the same. A genetic algorithm with too large a mutation rate effectively ignores the signals that the fitness function provides—and remember, those signals are the *only* signals the algorithm gets in order to tell if it’s on the right track.
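One quick way to see why the high mutation rate drowns out the fitness signal: with a per-gene mutation rate m and a two-gene (x, y) genome, an organism passes through a generation with its genome intact with probability (1 - m)². A tiny sketch (the function name is mine):

```python
# Probability an organism survives one generation untouched by mutation,
# for a genome of `genes` independent genes each mutated with rate m.
def survival_prob(mutation_rate, genes=2):
    return (1 - mutation_rate) ** genes

print(survival_prob(0.01))  # ~0.98: selection's signal carries forward
print(survival_prob(0.60))  # ~0.16: most organisms are scrambled every generation
```

At a 1% mutation rate, roughly 98% of organisms carry their selected genes forward; at 60%, only about one in six does, so whatever the fitness function learned is mostly thrown away each generation.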
If you’re going to run a genetic algorithm with an extremely high mutation rate and an extremely low crossover rate, skip the pretense and just pluck random numbers from thin air; you’ll get about the same results with far less effort.