Written by: Richard Lenski
Primary Source: Telliamed Revisited
This post follows up on my post from yesterday, which was about choosing a dilution factor in a microbial evolution experiment that avoids the loss of too many beneficial mutations during the transfer bottleneck.
If we only want to maximize the cumulative supply of beneficial mutations that survive dilution, then following the reasoning in yesterday’s post, we would choose the dilution factor (D) to maximize g Ne = g² Nmin = g² Nmax / 2^g, where Nmax is a constant (the final population size) and D = 1 / 2^g. Thus, we want to maximize g² / 2^g for g > 0, which gives g = 2 / ln(2) ≈ 2.885 and D ≈ 0.1354, in agreement with the result of Wahl et al. (2002, Genetics), as noted in a tweet by Danna Gifford.
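For anyone who wants to check that arithmetic, here’s a quick sketch (mine, not from the original post; the function name `supply` is my own label for the quantity being maximized):

```python
import math

# Per-cycle regrowth is g doublings, so the dilution factor is D = 1 / 2**g.
# The cumulative supply of surviving beneficial mutations scales as
# g * Ne = g**2 * Nmin, i.e. proportional to g**2 / 2**g once Nmax is fixed.
def supply(g):
    return g**2 / 2**g

# Setting d/dg [2*ln(g) - g*ln(2)] = 0 gives the optimum g = 2 / ln(2).
g_opt = 2 / math.log(2)
D_opt = 1 / 2**g_opt      # equals exp(-2), since g_opt * ln(2) = 2

print(round(g_opt, 3))    # 2.885
print(round(D_opt, 3))    # 0.135
```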
The populations would therefore be diluted and regrow by ~7.39-fold each transfer cycle. But as discussed in my previous post, this approach does not account for the effects of clonal interference, diminishing-returns epistasis, and perhaps other important factors. And if I had maximized this quantity, the LTEE would only now be approaching a measly 29,000 generations!
So let’s not be purists about maximizing the supply of beneficial mutations that survive bottlenecks. There’s clearly also a “wow” factor associated with having lots and lots of generations. This wow factor should naturally and powerfully reflect the increasing pleasure associated with more and more generations. So let’s define wow = g^e, which is both natural and powerful. Therefore, we should maximize wow × g² / 2^g = g^(2+e) / 2^g, which provides the perfect balance between the pleasure of having lots of generations and the pain of losing beneficial mutations during the transfer bottlenecks.
It turns out that the 100-fold dilution regime for the LTEE is almost perfect! It gives a value for wow × g² / 2^g of 75.93. You can do a tiny bit better, though, with the optimal ~112-fold dilution regime, which gives a value of 76.03.
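Here’s a similar sketch (again mine, not from the post; `wow_objective` is my own name for the combined quantity) that evaluates the wow-adjusted objective under both regimes:

```python
import math

# Combined objective: wow * g**2 / 2**g = g**(2 + e) / 2**g,
# where g is the number of generations (doublings) per transfer cycle.
def wow_objective(g):
    return g**(2 + math.e) / 2**g

g_100 = math.log2(100)               # ~6.64 generations per 100-fold cycle
g_best = (2 + math.e) / math.log(2)  # optimum of (2+e)*ln(g) - g*ln(2)

print(round(wow_objective(g_100), 2))   # 75.93
print(round(2**g_best))                 # 112, the optimal dilution factor
print(round(wow_objective(g_best), 2))  # 76.03
```

So the LTEE’s 100-fold regime lands within about 0.1% of the best achievable wow-adjusted value.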