A genetic algorithm implementation for Rust, inspired by the book *Genetic Algorithms in Elixir*.
There are three main elements to this approach:

* The Genotype (the search space)
* The Fitness function (the search goal)
* The Evolve strategy (the search strategy)
Example projects:

* `UniqueDiscreteGenotype<u8>` with a 64x64 chess board setup and a custom NQueensFitness fitness: `cargo run --example evolve_nqueens --release`
* `DiscreteGenotype<(weight, value)>` with a custom KnapsackFitness(weight_limit) fitness: `cargo run --example evolve_knapsack --release`
* `DiscreteGenotype<u8>` with 100 monkeys randomly typing characters in a loop: `cargo run --example evolve_monkeys --release`
* `cargo run --example evolve_binary_lru_cache_fitness --release`
```rust
use genetic_algorithm::evolve::prelude::*;

// the search space
let genotype = BinaryGenotype::builder() // boolean genes
    .with_gene_size(100)                 // 100 of them
    .build()
    .unwrap();

println!("{}", genotype);

// the search goal to optimize towards (maximize or minimize)
pub struct SimpleCount;
impl Fitness for SimpleCount {
    type Genotype = BinaryGenotype;
    fn call_for_chromosome(&mut self, chromosome: &Chromosome<Self::Genotype>) -> Option<FitnessValue> {
        // count the number of true values in the chromosome
        Some(chromosome.genes.iter().filter(|&value| *value).count() as FitnessValue)
    }
}

// the search strategy
let mut rng = rand::thread_rng(); // a randomness provider implementing Trait rand::Rng
let evolve = Evolve::builder()
    .with_genotype(genotype)
    .with_population_size(100)          // evolve with 100 chromosomes
    .with_target_fitness_score(100)     // goal is 100 times true in the best chromosome
    .with_crossover(CrossoverAll(true)) // crossover all individual genes between 2 chromosomes for offspring
    .with_mutate(MutateOnce(0.2))       // mutate a single gene with a 20% probability per chromosome
    .with_fitness(SimpleCount)          // count the number of true values in the chromosomes
    .with_compete(CompeteElite)         // sort the chromosomes by fitness to determine crossover order
    .build()
    .unwrap()
    .call(&mut rng);

println!("{}", evolve);
```
The following genotypes are implemented:

* `BinaryGenotype`
    * `with_gene_size(usize)`, mandatory, number of genes per chromosome
* `DiscreteGenotype<T>`
    * `with_gene_size(usize)`, mandatory, number of genes per chromosome
    * `with_gene_values(Vec<T>)`, mandatory, possible values for genes
* `UniqueDiscreteGenotype<T>`
    * `with_gene_values(Vec<T>)`, mandatory, possible values for genes
* `MultiDiscreteGenotype<T>`
    * `with_gene_multi_values(Vec<Vec<T>>)`, mandatory, possible values for each individual gene
* `ContinuousGenotype`
    * `with_gene_size(usize)`, mandatory, number of genes per chromosome
    * `with_gene_range(Range<f32>)`, mandatory, possible values for genes
* `MultiContinuousGenotype`
    * `with_gene_ranges(Vec<Range<f32>>)`, mandatory, possible value ranges for each individual gene

General initialization options for all genotypes:

* `with_seed_genes(Vec<_>)`, optional, start genes of all chromosomes in the population (instead of the default random genes). Sometimes it is efficient to start with a certain population (e.g. the Knapsack problem with no items in it); see the seed-genes sketch below.

Example usage:
```rust
let genotype = DiscreteGenotype::<u8>::builder()
    .with_gene_size(100)
    .with_gene_values(vec![2, 4, 6, 8])
    .build()
    .unwrap();
```
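For the seed genes option, a minimal sketch (assuming a `BinaryGenotype` where an all-`false` start represents an empty knapsack):

```rust
use genetic_algorithm::evolve::prelude::*;

// start every chromosome with all-false genes (e.g. an empty knapsack),
// instead of the default random initialization
let genotype = BinaryGenotype::builder()
    .with_gene_size(100)
    .with_seed_genes(vec![false; 100])
    .build()
    .unwrap();
```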
The fitness function has an associated Genotype. It returns an `Option<FitnessValue>`, where `FitnessValue` is an alias for `isize`. When a `None` is returned, the chromosome ends up last in the competition phase, regardless of whether the fitness is maximized or minimized.

It is usually better to add a penalty to invalid or unwanted solutions instead of returning a `None`, so "less" invalid chromosomes are preferred over "more" invalid ones. This usually steers the population towards a solution faster. See example/evolve_knapsack for an example of both a penalty and a `None`.
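To illustrate the penalty approach, here is a minimal sketch (not the actual KnapsackFitness from the example): it assumes a hypothetical fixed item list, a `BinaryGenotype` whose gene size equals the number of items, and subtracts a weight-overrun penalty instead of returning `None`:

```rust
use genetic_algorithm::evolve::prelude::*;

// hypothetical items as (weight, value) pairs, indexed by gene position,
// and a hypothetical weight limit for the knapsack
const ITEMS: [(u16, u16); 4] = [(23, 505), (26, 352), (20, 458), (18, 220)];
const WEIGHT_LIMIT: u16 = 67;

struct PenalizedKnapsackFitness;
impl Fitness for PenalizedKnapsackFitness {
    type Genotype = BinaryGenotype; // gene == true means the item is in the knapsack
    fn call_for_chromosome(&mut self, chromosome: &Chromosome<Self::Genotype>) -> Option<FitnessValue> {
        // sum the weight and value of the selected items
        let (weight, value) = chromosome
            .genes
            .iter()
            .zip(ITEMS.iter())
            .filter(|(selected, _)| **selected)
            .fold((0u16, 0u16), |(w, v), (_, item)| (w + item.0, v + item.1));
        if weight > WEIGHT_LIMIT {
            // penalize overweight solutions instead of returning None,
            // so "less" invalid chromosomes still rank above "more" invalid ones
            Some(value as FitnessValue - (weight - WEIGHT_LIMIT) as FitnessValue * 100)
        } else {
            Some(value as FitnessValue)
        }
    }
}
```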
The trait Fitness needs to be implemented for a fitness function. It only requires one method. The example below is taken from the Infinite Monkey Theorem, see example/evolve_monkeys:
```rust
const TARGET_TEXT: &str =
    "Be not afraid of greatness! Some are great, some achieve greatness, and some have greatness thrust upon 'em.";

struct MyFitness;
impl Fitness for MyFitness {
    type Genotype = DiscreteGenotype<u8>;
    fn call_for_chromosome(&mut self, chromosome: &Chromosome<Self::Genotype>) -> Option<FitnessValue> {
        // score the chromosome by the number of characters matching the target text
        let text = String::from_utf8_lossy(&chromosome.genes);
        Some(
            text.chars()
                .zip(TARGET_TEXT.chars())
                .filter(|(typed, target)| typed == target)
                .count() as FitnessValue,
        )
    }
}
```
The Evolve strategy is built with the following options:

* `with_genotype(Genotype)`: the genotype initialized earlier
* `with_population_size(usize)`: the number of chromosomes in the population
* `with_target_fitness_score(FitnessValue)`: if you know the ultimate goal in terms of fitness score, stop searching when it is met
* `with_max_stale_generations(usize)`: if you don't know the ultimate goal and depend on some convergence threshold, stop searching when improvement stalls
* `with_fitness(Fitness)`: the Fitness function
* `with_mutate(Mutate)`: the mutation strategy, very important for avoiding local optimum lock-in. But don't overdo it, as it degenerates the population too much if overused. The mutation probability is generally between 5% and 20%. Choose one:
    * `MutateOnce(mutation_probability: f32)`: Each genotype has a specific mutation strategy (e.g. a non-unique chromosome just picks a random new gene value, but a unique chromosome swaps two genes, in order to stay valid). With this approach each chromosome has the given probability to be mutated once.
* `with_crossover(Crossover)`: the crossover strategy. Every two parents create two children. The competition phase determines the order of the parent pairing (overall with fitter first). If you choose to keep the parents, the parents will compete with their own children, and the population is temporarily overbooked; half of it will be discarded in the competition phase. Choose one:
    * `CrossoverAll(keep_parents: bool)`: 50% probability for each gene to come from one of the two parents. Not allowed for unique genotypes.
    * `CrossoverClone(keep_parents: bool)`: Children are clones, effectively doubling the population if you keep the parents. Acts as a no-op if the parents are not kept. Allowed for unique genotypes.
    * `CrossoverRange(keep_parents: bool)`: Single random position in the genes, after which the genes are switched from one parent to the other. Not allowed for unique genotypes.
    * `CrossoverSingle(keep_parents: bool)`: Random position in the genes, where a single gene is taken from the other parent. Not allowed for unique genotypes.
* `with_compete(Compete)`: the competition strategy. Choose one:
    * `CompeteElite`: Simply sort the chromosomes with fittest first. This approach has the risk of locking in to a local optimum.
    * `CompeteTournament(tournament_size: usize)`: Run tournaments with randomly chosen chromosomes and pick a single winner. Do this population-size times, until the required population level is reached. This approach roughly sorts by fitness, but not very strictly, which preserves a level of diversity and avoids local optimum lock-in.
* `with_fitness_ordering(FitnessOrdering)`: defaults to `FitnessOrdering::Maximize`, so is not mandatory. Set to `FitnessOrdering::Minimize` when the search goal is to minimize the fitness function.
* `with_degeneration_range(Range<f32>)`: When approaching a (local) optimum in the fitness score, the variation in the population goes down dramatically. This not only slows down the search, but also carries the risk of local optimum lock-in. Set this parameter to simulate a cambrian explosion, where there is only mutation until the population diversity is large enough again. The controlling metric is the standard deviation of the fitness scores in the population. The degeneration has a hysteresis switch: degeneration is activated at the start bound of the range and deactivated at the end bound of the range. A typical value is `(0.005..0.995)`.
* `with_max_stale_generations_option(Option<usize>)`: the max_stale_generations value wrapped in an Option, to allow for a `None` value next to `Some(usize)` values (see the sketch after the example usage below)
* `with_target_fitness_score_option(Option<FitnessValue>)`: the target_fitness_score value wrapped in an Option, to allow for a `None` value next to `Some(FitnessValue)` values
* `with_degeneration_range_option(Option<Range<f32>>)`: the degeneration_range value wrapped in an Option, to allow for a `None` value next to `Some(Range<f32>)` values

After building, the strategy can be executed with a randomness provider (Trait rand::Rng): `evolve.call(&mut rng)`
Example usage:

```rust
let mut rng = rand::thread_rng();
let mut evolve = Evolve::builder()
    .with_genotype(my_genotype)
    .with_population_size(100)
    .with_max_stale_generations(1000)
    .with_target_fitness_score(0)
    .with_fitness_ordering(FitnessOrdering::Minimize)
    .with_degeneration_range(0.001..0.995)
    .with_mutate(MutateOnce(0.2))
    .with_fitness(my_fitness)
    .with_crossover(CrossoverAll(true))
    .with_compete(CompeteTournament(4))
    .build()
    .unwrap();

evolve.call(&mut rng);
println!("{}", evolve);
```
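The `*_option` builder variants are convenient when a setting may or may not be present at runtime. A minimal sketch, reusing the `SimpleCount` fitness and `BinaryGenotype` from the quick-start above and assuming a hypothetical `STALE_LIMIT` environment variable as the source of the optional value:

```rust
use genetic_algorithm::evolve::prelude::*;

// hypothetical: a stale-generations limit that may or may not be configured
let maybe_stale_limit: Option<usize> = std::env::var("STALE_LIMIT")
    .ok()
    .and_then(|s| s.parse().ok());

let genotype = BinaryGenotype::builder()
    .with_gene_size(100)
    .build()
    .unwrap();

let mut rng = rand::thread_rng();
let mut evolve = Evolve::builder()
    .with_genotype(genotype)
    .with_population_size(100)
    .with_target_fitness_score(100)
    .with_max_stale_generations_option(maybe_stale_limit) // accepts None as well as Some(usize)
    .with_mutate(MutateOnce(0.2))
    .with_fitness(SimpleCount)
    .with_crossover(CrossoverAll(true))
    .with_compete(CompeteElite)
    .build()
    .unwrap();

evolve.call(&mut rng);
println!("{}", evolve);
```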
Sometimes the population size is small enough to simply check all possible solutions. No randomness, mutation, crossover, or competition strategies are needed.

The Permutate strategy is built with the following options:
* `with_genotype(Genotype)`: the genotype initialized earlier
* `with_fitness(Fitness)`: the Fitness function
* `with_fitness_ordering(FitnessOrdering)`: defaults to `FitnessOrdering::Maximize`, so is not mandatory. Set to `FitnessOrdering::Minimize` when the search goal is to minimize the fitness function.

After building, the strategy can be executed without the need for a randomness provider.
```rust
use genetic_algorithm::fitness::FitnessSimpleCount;
use genetic_algorithm::permutate::prelude::*;

let genotype = BinaryGenotype::builder()
    .with_gene_size(16)
    .build()
    .unwrap();

println!("{}", genotype);

let mut permutate = Permutate::builder()
    .with_genotype(genotype)
    .with_fitness(FitnessSimpleCount)                 // count true values for fitness
    .with_fitness_ordering(FitnessOrdering::Minimize) // goal is as few true values as possible
    .build()
    .unwrap();

permutate.call();
println!("{}", permutate);
```
One cool thing to do with genotypes is to make a meta-genotype of all the Crossover/Mutate/Compete strategies and other Evolve parameters. This could be used to optimize the parameters of some other genetic algorithm. Yes, a simple nested for loop would also work, but where is the fun in that? I wasn't able to find an elegant approach to creating such a heterogeneous setup; Trait objects, Any and Enums were all tried, but none worked well.

So, after some consideration, I settled on a nested index-based genotype, `MultiDiscreteGenotype<usize>`, holding indices into external vectors of arbitrary types, which are then looked up in the fitness function. Only one type is allowed per external vector, so the Crossover/Mutate/Compete strategies all have a Dispatch implementation forwarding to the underlying types (e.g. `CompeteDispatch(Competes::Tournament, 4)`), as sketched below.
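A rough sketch of what such an index-based genotype could look like, assuming hypothetical option vectors (the meta module builds this setup internally from its configuration):

```rust
use genetic_algorithm::evolve::prelude::*;
use genetic_algorithm::meta::prelude::*;

// external option vectors of arbitrary (but per-vector homogeneous) types
let population_sizes: Vec<usize> = vec![10, 20, 50];
let mutates = vec![
    MutateDispatch(Mutates::Once, 0.1),
    MutateDispatch(Mutates::Once, 0.3),
];

// every gene is an index into one of the external option vectors
let meta_genotype = MultiDiscreteGenotype::<usize>::builder()
    .with_gene_multi_values(vec![
        (0..population_sizes.len()).collect::<Vec<usize>>(),
        (0..mutates.len()).collect::<Vec<usize>>(),
    ])
    .build()
    .unwrap();

// inside the fitness function the genes are used as lookups, e.g.:
// let population_size = population_sizes[chromosome.genes[0]];
// let mutate = mutates[chromosome.genes[1]].clone();
```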
See the example meta_evolve_binary for a meta-analysis of the evolution strategy:
cargo run --example meta_evolve_binary --release
cargo run --example meta_evolve_nqueens --release
The meta-analysis is currently implemented as a permutation, but with caching, an evolve strategy could also be used for larger search spaces.
```rust
use genetic_algorithm::fitness::FitnessSimpleCount;
use genetic_algorithm::meta::prelude::*;

let rounds = 10;
let population_sizes = vec![1, 2, 3, 4, 5, 10];
let max_stale_generations_options = vec![Some(100)];
let target_fitness_score_options = vec![Some(0)];
let degeneration_range_options = vec![None, Some(0.001..0.995)];
let mutates = vec![
    MutateDispatch(Mutates::Once, 0.05),
    MutateDispatch(Mutates::Once, 0.1),
    MutateDispatch(Mutates::Once, 0.2),
    MutateDispatch(Mutates::Once, 0.3),
    MutateDispatch(Mutates::Once, 0.4),
    MutateDispatch(Mutates::Once, 0.5),
];
let crossovers = vec![
    CrossoverDispatch(Crossovers::Single, true),
    CrossoverDispatch(Crossovers::Single, false),
    CrossoverDispatch(Crossovers::All, true),
    CrossoverDispatch(Crossovers::All, false),
    CrossoverDispatch(Crossovers::Range, true),
    CrossoverDispatch(Crossovers::Range, false),
    CrossoverDispatch(Crossovers::Clone, true),
    CrossoverDispatch(Crossovers::Clone, false),
];
let competes = vec![
    CompeteDispatch(Competes::Elite, 0),
    CompeteDispatch(Competes::Tournament, 2),
    CompeteDispatch(Competes::Tournament, 4),
    CompeteDispatch(Competes::Tournament, 8),
];

let genotype = BinaryGenotype::builder()
    .with_gene_size(10)
    .build()
    .unwrap();
let fitness = FitnessSimpleCount;
let evolve_builder = EvolveBuilder::new()
    .with_genotype(genotype)
    .with_fitness(fitness)
    .with_fitness_ordering(FitnessOrdering::Minimize);
let evolve_fitness_to_micro_second_factor = 1_000_000;

let config = MetaConfig::builder()
    .with_evolve_builder(evolve_builder)
    .with_evolve_fitness_to_micro_second_factor(evolve_fitness_to_micro_second_factor)
    .with_rounds(rounds)
    .with_population_sizes(population_sizes)
    .with_max_stale_generations_options(max_stale_generations_options)
    .with_target_fitness_score_options(target_fitness_score_options)
    .with_degeneration_range_options(degeneration_range_options)
    .with_mutates(mutates)
    .with_crossovers(crossovers)
    .with_competes(competes)
    .build()
    .unwrap();

let permutate = MetaPermutate::new(&config).call();
println!();
println!("{}", permutate);
```
```
meta-permutate population_size: 2304

[...]

meta-permutate:
  best_population_size: 2
  best_max_stale_generations: Some(100)
  best_target_fitness_score: Some(0)
  best_degeneration_range: None
  best_mutate: Some(MutateDispatch(Once, 0.5))
  best_crossover: Some(CrossoverDispatch(Clone, true))
  best_compete: Some(CompeteDispatch(Elite, 0))
```
Run tests with `cargo test`.

Benchmarks are implemented using criterion. Run benchmarks with `cargo bench`.

Profiling is implemented using criterion and pprof. Find the flamegraph in `./target/criterion/profile_evolve_binary/profile/flamegraph.svg` after running:

`cargo run --example profile_evolve_binary --release -- --bench --profile-time 5`