I had an idea this morning: why not write code that mimics evolution where it's needed?
Take a search engine, for example: its algorithms are constantly being changed and updated to fight spam and whatever else comes along.
It wouldn’t work as well on a small scale, but if you have a large number of end users (like a search engine), I don’t see why it wouldn’t work.
Have the system make hundreds or thousands of small random variations of the existing algorithms (small enough that they don't destroy the results). Then serve each end user a random version and keep track of how well it was received. A reasonable signal: the deeper a user had to dig into the results, or the sooner they ran a different search, the worse that change probably was. Then give the "good" changes more weight than the bad, and repeat the process.
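The loop described above can be sketched in a few lines. This is a toy simulation, not a real search engine: the ranking algorithm is reduced to a hypothetical vector of tunable weights, and `user_feedback` is an assumed stand-in for the real signal (click depth, follow-up searches) that a production system would measure.

```python
import random

random.seed(0)  # deterministic for the sake of the example

# Hypothetical: the "ideal" weights that best satisfy users. In reality
# nobody knows these -- user behavior is the only measurement available.
IDEAL = [0.8, 0.3, 0.5]

def mutate(weights, scale=0.05):
    """A small random variation -- small enough not to wreck the results."""
    return [w + random.uniform(-scale, scale) for w in weights]

def user_feedback(weights):
    """Simulated fitness: the closer to IDEAL, the less users have to dig.
    In production this would come from aggregated click-depth metrics."""
    error = sum((w - t) ** 2 for w, t in zip(weights, IDEAL))
    return -error  # higher is better

def evolve(base, generations=50, population=100, survivors=10):
    pool = [mutate(base) for _ in range(population)]
    for _ in range(generations):
        # Serve each variant to (simulated) users and rank by reception.
        scored = sorted(pool, key=user_feedback, reverse=True)
        # Give the "good" changes more weight: breed only from the best.
        best = scored[:survivors]
        pool = [mutate(random.choice(best)) for _ in range(population)]
    return max(pool, key=user_feedback)

winner = evolve([0.5, 0.5, 0.5])
print(winner)  # drifts toward IDEAL over the generations
```

The key design choice is the mutation scale: keep it small so no user ever sees a badly broken variant, and let the selection pressure accumulate over many generations instead.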
It’s survival of the fittest, and it becomes a semi-automated way to evolve their algorithms.