Re: No Free Lunch (NFL) Lecture

From: Lawrence Johnston
Date: Thu Jul 18 2002 - 01:29:18 EDT


    David -
    Sounds like you are into a program similar to our Bioinformatics program
    here. At present our program is working on the evolution of viruses placed
    under various kinds of stress. The virus genome is much smaller than that
    of most organisms.
    I will address your comments below. Also bear in mind that I am a mere
    physicist making a start in biological matters. The Biology people are
    very kind, however, and suffer a physics fool gladly, for what they can
    occasionally contribute.

    > Larry, on July 15:
    > >His lecture showed that any search algorithm that is used to
    > >optimize a given quality (such as a fitness function) can on the
    > >average be no more efficient than a RANDOM search.

    > David:
    > I am currently running analyses that seek to optimize a given quality
    > (minimum parsimony) in an analysis of genetic data. The program uses
    > heuristic search algorithms that are far better than random searches.
    > What is the difference between this and the genetic algorithms?

        It sounds like you _are_ using some sort of genetic algorithm. And
    maybe you are beating the NFL theorems.
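    As an aside, the averaging claim in the NFL theorems can be checked by
    brute force on a toy case. The sketch below is my own illustration (not
    anything from the lecture): it enumerates every possible 0/1 fitness
    function on a four-point space and shows that a fixed query order and a
    shuffled one need exactly the same average number of evaluations to find
    a maximum — no search order is better once you average over all landscapes.

    ```python
    import itertools
    import random

    # A 4-point search space; a "fitness function" assigns 0 or 1 to each point.
    DOMAIN = [0, 1, 2, 3]

    def evals_to_max(order, f):
        """Evaluations until the search first sees the maximum value of f."""
        best = max(f)
        for i, x in enumerate(order, start=1):
            if f[x] == best:
                return i

    # Average each search order over ALL 2^4 possible fitness functions.
    functions = list(itertools.product([0, 1], repeat=4))

    def average_cost(order):
        return sum(evals_to_max(order, f) for f in functions) / len(functions)

    fixed = [0, 1, 2, 3]                           # a deterministic "smart" order
    shuffled = random.Random(0).sample(DOMAIN, 4)  # a random order

    print(average_cost(fixed), average_cost(shuffled))  # both 1.6875
    ```

    The symmetry argument behind this: relabeling the search points maps the
    set of all fitness functions onto itself, so every fixed order has the
    same total cost over that set.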

    > Some possibilities that come to mind:
    > A search guaranteed to find the absolute optimum is much less
    > efficient than these heuristic approximations. However, living
    > organisms also get by just fine with suboptimal, but good, versions.

       I guess that such organisms are well enough adapted, like Stephen Jay
    Gould's observations of "stasis" situations.

    > The heuristic analyses start with trying to find a good option and
    > then try to improve on it. Again, living organisms are generally
    > starting with a functional gene, and the number of changes needed to
    > make another functional gene is less than for starting from scratch.
    > Organisms have extra DNA. This means that they can have parallel
    > processing in searching for a particular result.

       Yes, if the organism has two copies of that gene, it could cover
    for the fact that most mutations are fatal. If there is a second good gene
    it may be able to continue the function that was otherwise lost. And as
    you say, both copies may be a target for an adaptive mutation.
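    Your point that a working gene needs far fewer changes than a start from
    scratch can be made concrete with a toy bit-string model. This is my own
    sketch, with assumed parameters (a 40-bit "gene", a greedy rule that keeps
    only beneficial mutations); the number of accepted mutations needed is
    exactly the number of bits that start out wrong.

    ```python
    import random

    rng = random.Random(1)
    L = 40  # hypothetical gene length, in bits

    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))

    def accepted_mutations(start, target, rng):
        """Flip random bits; accept a flip only if it matches the target
        (i.e. only beneficial mutations survive). Count the accepted ones."""
        current = list(start)
        accepted = 0
        while current != target:
            i = rng.randrange(L)
            if current[i] != target[i]:
                current[i] = target[i]
                accepted += 1
        return accepted

    target = [rng.randint(0, 1) for _ in range(L)]
    near = list(target)
    for i in rng.sample(range(L), 4):    # a functional gene: only 4 bits off
        near[i] ^= 1
    scratch = [rng.randint(0, 1) for _ in range(L)]   # starting from scratch

    # near needs exactly 4 accepted mutations; scratch needs its full
    # Hamming distance to the target, around L/2 on average.
    print(accepted_mutations(near, target, rng),
          accepted_mutations(scratch, target, rng))
    ```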

    > Not knowing the details of the programs and the precise goal of the
    > algorithms, I do not know how applicable any of these replies may be.

    Whitely, being a rather pure mathematician, was dealing with very general,
    non-specific cases of search programs. And he was assuming that the
    population of critters being modeled had no information about their own
    fitness landscape, and so could only tell whether each mutation eventually
    helped their survival or not.

    One of the computer people in the audience said that he used very special
    algorithms for each organism he was running, and these usually converged
    quite rapidly on the target area of the genome space. But then someone else
    asked him if he was not unfairly putting a lot of privileged information
    into his algorithm; in that case he is getting out of the calculation the
    information that he himself put in. The first guy said that's true, but it
    gets you there fast.
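    That exchange is easy to reproduce in miniature. In my own toy sketch
    below, a blind search learns only pass/fail at each query, while the
    "insider" search exploits a warmer/colder comparison — structure the
    programmer wired into the landscape — and so bisects its way to the target
    in about log2(N) queries instead of N/2 on average.

    ```python
    import random

    N = 1 << 14                 # a 16384-point "genome space" (toy example)
    rng = random.Random(3)
    target = rng.randrange(N)

    # Blind search: each query only answers "is this the target?"
    order = list(range(N))
    rng.shuffle(order)
    blind = next(i for i, x in enumerate(order, start=1) if x == target)

    # "Insider" search: exploits an ordering signal we ourselves built in,
    # so it can bisect -- the speed comes from information we supplied.
    lo, hi = 0, N - 1
    informed = 0
    while lo < hi:
        mid = (lo + hi) // 2
        informed += 1
        if mid < target:        # the privileged "warmer/colder" comparison
            lo = mid + 1
        else:
            hi = mid

    print(blind, informed)      # informed is 14; blind averages about N/2
    ```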

    So at the end of the period, the NFL theorem seemed to hold, unless you
    used "Insider Information" which the organism itself would not have in a
    real evolutionary setting.

    All God's best, Larry Johnston

    Lawrence H. Johnston                   home: 917 E. 8th St.
    professor of physics, emeritus         Moscow, ID 83843
    University of Idaho                    (208) 882-2765
    Fellow of the American Physical Society
    =====================

    This archive was generated by hypermail 2b29 : Thu Jul 18 2002 - 01:39:23 EDT