artificial-intelligence, machine-learning, genetic-algorithm, genetic-programming

Genetic Programming - Fitness functions


Let's say I have a set of training examples where each A_i is an attribute and the outcome is binary (yes or no):

A1,             A2,             A3,             Outcome
red             dark            large           yes
green           dark            small           yes
orange          bright          large           no

I know I have to define the fitness function. But what is it for this problem? My actual problem has 10 attributes and 100 training examples, but it is similar to this one.


Solution

  • I think the confusion here comes from the fact that fitness functions usually return a scalar, sometimes on a discrete scale, but never a binary yes/no (or true/false). In that sense, this looks more like a 'classification' problem to be solved with neural nets (or possibly Bayesian logic). That said, you could certainly devise a GA to evolve whatever kind of classifier, and the fitness function would basically be expressed in terms of correct classifications over total evaluations.

    Another pure GA approach, probably more relevant to the question, is to encode the whole classification rule set as a single individual for the genetic algorithm. In that sense, the fitness function could be expressed as a scalar representing how many yes/no classifications the candidate solution at hand gets right out of the total. A similar approach can be found in the paper Using Real-Valued Genetic Algorithms to Evolve Rule Sets for Classification.

    Example (one of the possible ways to encode this):

    A1,             A2,             A3,             Outcome
    red             dark            large           yes
    green           dark            small           yes
    orange          bright          large           no
    

    Encoding: red = 000, dark = 001, large = 010, green = 011, small = 100, orange = 101, bright = 111, etc. Outcome: yes = 1, no = 0

    Chromosome:

    A1,             A2,             A3,             Outcome
    000             001             010             1
    011             001             100             1
    101             111             010             0
    

    All of the above gets translated into a candidate solution as:

    000001010-1/011001100-1/101111010-0
    

    You would generate a random population of these and evolve them whichever way you like, testing the fitness of the entire rule set (correct classifications / total classifications in the rule set); be careful picking your cross-over strategy here! A minimal sketch of this is given at the end of this answer.

    I also suggest you listen to a binary solo, to get you in the mood.

    NOTE: I highly doubt this would work with a rule set composed of only 3 rules; there is not enough breadth for the GA.
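
    For concreteness, here is a minimal Python sketch of the encoding and fitness evaluation described above (my own illustration, not taken from the linked paper). Each rule is a fixed-width bit string made of three 3-bit attribute codes plus one outcome bit, the rules are simply concatenated into a chromosome (the -/ separators are dropped for simplicity), and fitness is the fraction of training examples the rule set classifies correctly. The function names and the default prediction of 0 when no rule matches are assumptions of this sketch.

    import random

    # 3-bit attribute codes from the encoding above
    CODE = {
        "red": "000", "dark": "001", "large": "010",
        "green": "011", "small": "100", "orange": "101", "bright": "111",
    }

    # ((A1, A2, A3), outcome) -- the training examples from the question
    TRAINING = [
        (("red", "dark", "large"), 1),
        (("green", "dark", "small"), 1),
        (("orange", "bright", "large"), 0),
    ]

    RULE_BITS = 3 * 3 + 1  # three 3-bit attributes + one outcome bit

    def encode_example(attrs, outcome):
        """Turn one training row into a 10-bit rule, e.g. '000001010' + '1'."""
        return "".join(CODE[a] for a in attrs) + str(outcome)

    def decode_rule(rule):
        """Split a 10-bit rule into its attribute codes and predicted outcome."""
        return (rule[0:3], rule[3:6], rule[6:9]), int(rule[9])

    def classify(chromosome, attrs):
        """Predict an outcome using the first rule whose attribute codes match;
        fall back to 0 if no rule matches (an arbitrary default)."""
        target = tuple(CODE[a] for a in attrs)
        for i in range(0, len(chromosome), RULE_BITS):
            rule_attrs, outcome = decode_rule(chromosome[i:i + RULE_BITS])
            if rule_attrs == target:
                return outcome
        return 0

    def fitness(chromosome):
        """Correct classifications over total evaluations on the training set."""
        correct = sum(1 for attrs, outcome in TRAINING
                      if classify(chromosome, attrs) == outcome)
        return correct / len(TRAINING)

    def random_chromosome(n_rules=3):
        """A random rule set, e.g. to seed the initial GA population."""
        return "".join(random.choice("01") for _ in range(n_rules * RULE_BITS))

    if __name__ == "__main__":
        # The chromosome built straight from the training data scores 1.0 ...
        perfect = "".join(encode_example(a, o) for a, o in TRAINING)
        print(perfect, fitness(perfect))
        # ... while a random individual will usually score lower.
        rand = random_chromosome()
        print(rand, fitness(rand))

    The GA itself (selection, cross-over, mutation) would then maximise this fitness over a population of such chromosomes; per the warning above about cross-over, one sensible choice is to cross over only at 10-bit rule boundaries so that individual rules stay intact.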