Genetic algorithms are a class of search techniques that apply a "fitness function" (essentially a heuristic) to a set of "genes", typically represented as a string of bit values, in an attempt to find a solution to a problem within the search space scoped by these "genes". A "population of individuals" (i.e. a set of candidate solutions, or the "fringe" of the search space) is maintained and updated by modifying the genes of "individuals" through mutation (the random flipping of bits) and cross-over (the exchange of genes between two individuals) in order to search the space of possible solutions.
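The loop described above can be sketched in a few lines of Python. This is a minimal illustration, not any particular published algorithm: I assume a toy fitness function that simply counts 1 bits, truncation selection of the fittest half, single-point cross-over, and a small per-bit mutation rate; all names and parameter values are illustrative.

```python
import random

GENE_LENGTH = 20    # bits per individual
POP_SIZE = 30       # individuals in the population
MUTATION_RATE = 0.01

def fitness(genes):
    # The "fitness function" (heuristic): here, simply the number of 1 bits.
    return sum(genes)

def mutate(genes):
    # Mutation: randomly flip bits with a small probability.
    return [1 - g if random.random() < MUTATION_RATE else g for g in genes]

def crossover(a, b):
    # Cross-over: exchange genes between two individuals at a random cut point.
    point = random.randrange(1, len(a))
    return a[:point] + b[point:]

def evolve(generations=100):
    # Maintain and update a population of candidate solutions.
    pop = [[random.randint(0, 1) for _ in range(GENE_LENGTH)]
           for _ in range(POP_SIZE)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[:POP_SIZE // 2]          # keep the fittest half
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(POP_SIZE - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = evolve()
```

Because the fittest parents are carried over unchanged each generation, the best fitness found never decreases, and on this trivial landscape the population converges quickly toward the all-ones string.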

Like most computing problems, the effective application of genetic algorithms depends heavily on the representation, or translation, of the problem into a form that can be manipulated by the algorithm(s). In addition to the problem of designing an effective heuristic (a non-trivial task even for traditional search techniques), genetic algorithms face the additional problem of defining a set of "genes" that can be mutated and recombined in such a way that the population approaches a solution in a (more or less) continuous manner. Techniques such as using Gray codes for numbers, rather than standard binary notation, seem critical to the success of these evolutionary algorithms: adjacent numbers are then always one bit flip apart, avoiding the "cliffs" created when the next item in a sequence requires multiple bit flips. It seems to me that this sensitivity to representation, in the context of very complex problems and thus huge search spaces, might be critical to whether or not a reasonable solution can be found. Forrest seems to at least partially agree with this, as she emphasizes the importance of mathematical analysis of the representation schemes used in genetic algorithms.
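The Gray-code point is easy to demonstrate. The sketch below uses the standard binary-reflected Gray code (n XOR n>>1) to show that consecutive integers always differ in exactly one bit, whereas plain binary has cliffs such as 7 to 8 (0111 to 1000, four flips); the function names are my own.

```python
def to_gray(n):
    # Binary-reflected Gray code: adjacent integers differ in exactly one bit.
    return n ^ (n >> 1)

def from_gray(g):
    # Invert the encoding by folding the bits back down with XOR.
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

# In plain binary, 7 -> 8 requires four simultaneous bit flips (a "cliff"):
assert bin(7 ^ 8).count("1") == 4

# In Gray code, every step in the sequence is a single bit flip:
for i in range(15):
    assert bin(to_gray(i) ^ to_gray(i + 1)).count("1") == 1
```

So a mutation that flips one bit of a Gray-coded number can always reach the adjacent values, which is exactly the "continuous" movement through the search space that the representation is supposed to enable.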

Forrest points out that "deceptive" problems are problematic for genetic algorithm search methods that use the cross-over technique. Cross-over allows the exchange of partial solutions that have high fitness values to create a new partial solution which combines the fitness of both "parent" partial solutions. This is very effective when the two partial solutions lead toward the complete or optimal solution, but causes a serious problem when two lesser partial solutions lead away from the optimal solution. I suspect that there are ways to alleviate this problem, such as maintaining a wide range of fitness values within the candidate solution population, but it seems to me that deception would be impossible to detect in any general way within the domain of genetic algorithms, and could only be addressed by an exhaustive search or by adding more randomness to the search, using something like simulated annealing.
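A small example makes the deception concrete. The "trap" function sketched below (a textbook illustration, not Forrest's own example) rewards strings for having *fewer* 1 bits everywhere except at the global optimum of all 1s, so the fitness gradient that selection and cross-over follow points directly away from the answer.

```python
def trap_fitness(genes):
    # A deceptive "trap" function: the global optimum is all 1s, but every
    # other string scores higher the fewer 1s it has, so combining fit
    # partial solutions pulls the population toward all 0s -- away from
    # the true optimum.
    ones = sum(genes)
    k = len(genes)
    if ones == k:
        return k + 1          # isolated global optimum: all ones
    return k - 1 - ones       # deceptive slope rewarding fewer ones

# The local gradient rewards removing 1s...
assert trap_fitness([0, 0, 0, 0]) > trap_fitness([1, 1, 0, 0])
# ...yet the best string of all is all 1s.
assert trap_fitness([1, 1, 1, 1]) > trap_fitness([0, 0, 0, 0])
```

Two fit parents under this function are strings with few 1s, and their cross-over offspring also have few 1s, so the recombination of "good" building blocks systematically moves the population away from the optimum.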

In an effort to develop the theory behind genetic algorithms, it would seem important to identify more parallels with general search theory. One important parallel would be identifying what constitutes an "admissible" fitness function (i.e. heuristic function). Is the fitness landscape monotonic? It would seem that deceptive problems are not. Another important question is how the interaction of "genes" contributes to a fitness function. Can a gene contribute a "negative" value to a fitness function? This would not necessarily make the fitness function inadmissible, but it would be important to keep in mind.

One of the great promises of genetic algorithms is in the field of self-modifying programs. The ability to have programs, or subsets of programs, modify themselves on the basis of some (possibly external) fitness criteria would allow programs to adapt to current conditions, detected either by the program itself or through an external monitoring agency. One immediately obvious possibility is in the area of virus or worm detection by components of the operating system, using a system similar to biological immune systems. This could be effected through the detection of characteristics (familiar and/or foreign) in an executable and/or in the patterns of system service calls. Once an intruder is identified, the system could "inoculate" itself against similar invaders by "remembering" the patterns or behaviors found in the sample "alien" program.