*This is part of a series entitled Genetics In Action.*

Up to this point, we have looked at genetic algorithms, one particular type of evolutionary algorithm. Now we’re going to take a look at a different evolutionary algorithm called genetic programming. John Koza popularized genetic programming with his series of books, starting with Genetic Programming and continuing through volumes II, III, and IV.

I want to start with the points where genetic programming and genetic algorithms overlap, and then we’ll look at the gigantic differences.

### More Of The Same

Genetic programs, like genetic algorithms, work from a population of candidate solutions. Each candidate solution has a chromosome (generally just one), made up of genes. Genes are the building blocks of the chromosome and the potential values of each gene are its alleles.

In order to determine which candidate solution is the best, we need a fitness function, which we generally implement as a single numeric value. To get from our starting population to the fittest candidate solution, we take advantage of the crossover and mutation operators and run this process for some number of iterations.
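That shared machinery — a population, a numeric fitness function, crossover, mutation, and a loop over generations — can be sketched in a few lines. This is a minimal Python illustration with placeholder operators; all of the names and the selection scheme here are my own choices for illustration, not code from the series:

```python
import random

def evolve(init_population, fitness, crossover, mutate, generations):
    """Minimal evolutionary loop: rank by fitness (lower is better),
    keep the top half, and refill the population with mutated children."""
    population = init_population()
    for _ in range(generations):
        population.sort(key=fitness)
        survivors = population[: len(population) // 2]
        children = []
        while len(survivors) + len(children) < len(population):
            a, b = random.sample(survivors, 2)
            child, _ = crossover(a, b)       # keep one child, discard one
            children.append(mutate(child))
        population = survivors + children
    return min(population, key=fitness)
```

Because the fittest candidates always survive into the next generation, the best fitness in the population never gets worse from one iteration to the next.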

In this regard, genetic programs and genetic algorithms are alike.

### But Wait, There’s More!

A big difference between a genetic algorithm and a genetic program is that genetic programs typically do not have fixed-length chromosomes. Genetic algorithms do not need to have fixed-length chromosomes either, but this is the norm, at least in Easy Mode. With genetic programs, even in the easiest setup, we don’t assume fixed numbers of genes.

The other big difference is in the name: we’re building **programs**. In the early genetic programming literature, Koza used Lisp as his language of choice for genetic programs because it has a tree-like syntax that really works well here. We won’t use Lisp ourselves, but let us take a moment to mourn all of those parentheses who gave their lives in order for us to solve a problem.

Anyhow, the programs can be as simple as mathematical functions or as complex as instruction sets for machinery. Each program can be displayed as a graph. I’m going to look at two separate scenarios, focusing mostly on problems in graph format (to make it easy to follow). First up is mathematical functions, followed by conditional programs.

### Mathematical Functions

Let’s say that we want to find the best result for a fitness function. We have a really, really simple fitness function, (56 – X)^2, where X is the value our program computes, and we want to find the global minimum. We have the integers 0 through 9 available to us, as well as the mathematical operators +, -, and x.

One potential solution could look like this:

In this case, we multiply 8 x 7 and get 56. (56-56)^2 = 0, which is our global minimum. We’ve solved the problem!

With a genetic program, the answer usually isn’t going to be that parsimonious. Instead, genetic programs will often produce somewhat noisier answers:

The answer solves our fitness function all the same, so we should be happy with those results. But if you want to understand the solution, you’ll often need to reduce the outcome to its simplest form.

Something to notice is that in this mathematical function, the mathematical operators are non-leaf nodes, whereas numeric values are leaf nodes. If we introduced variables like m and n, those variables could appear as leaf nodes as well, but they would not show up as non-leaf nodes.
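That operator/terminal split maps naturally onto a nested-tuple encoding in Python. Here is a minimal sketch; the tuple layout and function names are my own choices, purely for illustration:

```python
import operator

# Non-leaf nodes are (operator, left, right) tuples; leaf nodes are
# integer terminals drawn from 0-9.
OPS = {"+": operator.add, "-": operator.sub, "x": operator.mul}

def evaluate(node):
    """Recursively evaluate an expression tree."""
    if isinstance(node, int):            # leaf node: a numeric terminal
        return node
    op, left, right = node               # non-leaf node: an operator
    return OPS[op](evaluate(left), evaluate(right))

def fitness(tree):
    """The fitness function from the text: (56 - X)^2, minimized at X = 56."""
    return (56 - evaluate(tree)) ** 2

# The parsimonious solution from earlier: 8 x 7 = 56, fitness 0.
print(fitness(("x", 8, 7)))              # → 0
```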

### Conditional Programs

The other major type of genetic program is a conditional program. The end result here is not a mathematical formula, but rather a decision.

Here’s an old, old sample from my master’s thesis:

In this sample, we have non-leaf conditionals which lead to leaf node decisions. This is a simple problem based on a variant of the Prisoner’s Dilemma. Let’s describe this agent’s behavior. If the agent defected two turns ago, then check what the opponent agent did two turns ago. If the opponent defected two turns ago, then exit the game; otherwise, if the opponent cooperated two turns ago, defect. Finally, if the agent did not defect two turns ago, defect this turn. This agent’s kind of a jerk.
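Written out as code, that tree is just nested conditionals. A sketch in Python — the argument names and return values are my own hypothetical encoding, not the thesis representation:

```python
def jerk_agent(own_defected_2_ago, opp_defected_2_ago):
    """The agent described above: exit the game if both sides defected
    two turns ago; otherwise defect no matter what."""
    if own_defected_2_ago:         # did I defect two turns ago?
        if opp_defected_2_ago:     # did my opponent also defect?
            return "exit"
        return "defect"            # opponent cooperated: defect
    return "defect"                # I cooperated two turns ago: defect anyway
```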

Again, this is an example of a simplified tree. A more realistic scenario looks a bit like another example from my thesis:

```
(own_prev x (opp_ever_def (opp_prev x (own_prev2
x d)) (own_prev (own_prev2 (opp_ever_def
(opp_prev x (own_prev2 (own_prev2 x c) d))
(opp_ever_def d (opp_prev2 (good_res x x)
c))) d) (opp_ever_def (opp_ever_def (own_prev
x x) (good_res (opp_prev d c) (own_prev (opp_ever_def
(opp_prev x (own_prev2 (opp_ever_def (good_res
(opp_prev2 (good_res x x) c) x) (opp_prev2
(own_prev2 x c) c)) d)) (opp_ever_def (own_prev
x x) (own_prev2 (good_res (own_prev x x)
(own_prev (opp_ever_def (own_prev2 x c) (good_res
(opp_ever_def (good_res d d) (own_prev2 x
c)) x)) (good_res x x))) c))) (good_res x
(own_prev (own_prev x x) (opp_ever_def (opp_prev
x (own_prev2 (own_prev (good_res (good_res
d d) (opp_ever_def (good_res d d) (own_prev2
x c))) (opp_prev2 (good_res x x) c)) d))
(own_prev2 x c))))))) (own_prev2 x c)))))
```

Nope, not going to draw a graph for that one… It simplifies down to the following:

```
(own_prev x (opp_ever_def (opp_prev x (own_prev2 x d))) (own_prev2 x c))
```

If you go through the program, you’ll see areas that we can simplify: contradictory code branches (e.g., if you defected last turn and if you did not defect last turn), redundant results (if you defected last turn, then defect this turn; otherwise, defect this turn), and the like.
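The redundant-results case is the easiest of these to automate: when both branches of a conditional reduce to the same subtree, the test doesn’t matter. A sketch of that single simplification, using (predicate, if_true, if_false) tuples with string leaves — my own encoding for illustration, not the thesis representation:

```python
def simplify(node):
    """Collapse conditionals whose two branches reduce to the same subtree.
    Handles only the redundant-results case, not contradictory branches."""
    if isinstance(node, str):                 # leaf decision (c, d, x, ...)
        return node
    pred, if_true, if_false = node
    if_true, if_false = simplify(if_true), simplify(if_false)
    if if_true == if_false:                   # both branches agree,
        return if_true                        # so the test is redundant
    return (pred, if_true, if_false)

# "If you defected last turn, defect; otherwise, defect" → just defect.
print(simplify(("own_prev", "d", "d")))       # → d
```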

### Detailing The Operation

So we’ve looked at mathematical functions and conditional programs, but we haven’t quite described the mechanics behind how genetic programs form new candidate solutions over the generations. It’s easiest to think about this in graph mode, so let’s start with a pair of candidate solutions.

Let’s suppose we want to combine these two programs. What we would do is find a subtree and perform a lop-and-splice technique, which is *totally* different from the chop-and-swap of genetic algorithms. Our job is to cut off a subtree from each of the two parents and splice the new subtrees in.

In this case, we’re swapping the right subtree under x on the left-hand side with the left subtree under + on the right-hand side. The subtrees do **not** need to be the same size in order to swap. The subtrees do **not** need to be at the same level in order to swap. And the subtrees do **not** even need to contain non-leaf nodes.

Once we’re done, it’s time to splice the subtrees, creating two brand new programs.
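With trees encoded as nested tuples, lop-and-splice boils down to a pair of path operations: read the subtree at a chosen point in each parent, then write each one into the other parent. A minimal sketch — I use fixed crossover points for clarity, though in practice they would be chosen at random:

```python
def subtree_at(tree, path):
    """Follow a path of child indices down to a subtree."""
    for i in path:
        tree = tree[i]
    return tree

def replace_at(tree, path, new):
    """Return a copy of `tree` with the subtree at `path` swapped out."""
    if not path:
        return new
    return tuple(replace_at(child, path[1:], new) if j == path[0] else child
                 for j, child in enumerate(tree))

def crossover(parent_a, parent_b, path_a, path_b):
    """Lop-and-splice: swap the subtrees at the two chosen points."""
    sub_a, sub_b = subtree_at(parent_a, path_a), subtree_at(parent_b, path_b)
    return (replace_at(parent_a, path_a, sub_b),
            replace_at(parent_b, path_b, sub_a))

# Swap a lone leaf in one parent for a whole subtree of the other;
# the pieces need not match in size, depth, or shape.
a = ("x", ("+", 3, 4), 8)
b = ("+", ("x", 7, 8), 2)
child_a, child_b = crossover(a, b, path_a=(2,), path_b=(1,))
print(child_a)   # → ('x', ('+', 3, 4), ('x', 7, 8))
print(child_b)   # → ('+', 8, 2)
```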

We now have two new programs, which means crossover is complete. Mutation is similar to the genetic algorithms example. We can mutate non-leaf nodes as well as leaf nodes, but we need to follow the rules of what’s allowed to be where.
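Those rules of what’s allowed to be where are easy to enforce with a role-respecting point mutation: operators can only become other operators, and numeric leaves can only become other numeric leaves. A sketch over the same expression-tree encoding — the mutation rate and symbol sets are arbitrary assumptions:

```python
import random

def mutate(node, rate=0.1, ops=("+", "-", "x"), terminals=range(10)):
    """Point mutation that respects node roles: each node mutates with
    probability `rate`, but only into a value legal for its position."""
    if isinstance(node, int):                 # numeric leaf node
        if random.random() < rate:
            return random.choice(list(terminals))
        return node
    op, left, right = node                    # operator (non-leaf) node
    if random.random() < rate:
        op = random.choice(ops)
    return (op, mutate(left, rate, ops, terminals),
                mutate(right, rate, ops, terminals))
```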

The left-hand program experienced two mutations, whereas the right-hand tree experienced none. Now we have two new programs and can continue the process. Similar to genetic algorithms, we can run genetic programs for a certain amount of time, a certain number of generations, a certain number of operations, or until we get the result we’re expecting.

### Conclusion

This was a cursory introduction to genetic programming. Next, I’m going to showcase a couple of examples of genetic programming in R.