# emcee


A re-implementation of Dan Foreman-Mackey's emcee in Rust.

See the hosted documentation here.

The fitting_a_model_to_data example is a re-creation of the "fitting a model to data" example from the emcee documentation.

## Attribution

If you make use of emcee in your work, please cite Dan's paper (arXiv, ADS, BibTeX).

A copy of the original MIT license is given under DFM-LICENSE.

## Basic usage

### Implementing models

The sampler requires a struct that implements emcee::Prob, for example:

```rust
use emcee::{Guess, Prob};

struct Model;

impl Prob for Model {
    fn lnlike(&self, params: &Guess) -> f32 {
        // Insert actual implementation here
        0f32
    }

    fn lnprior(&self, params: &Guess) -> f32 {
        // Insert actual implementation here
        0f32
    }
}
```

The trait has a default implementation of lnprob, which computes the product of the likelihood and prior probability (a sum in log space) as per Bayes' rule. Invalid prior values are marked by returning -std::f32::INFINITY from lnprior. Note that your implementation is likely to need external data; this data should be stored in your Model struct, for example:

```rust
struct Model<'a> {
    x: &'a [f32],
    y: &'a [f32],
}

// Linear model y = m * x + c
impl<'a> Prob for Model<'a> {
    fn lnlike(&self, params: &Guess) -> f32 {
        let m = params[0];
        let c = params[1];

        -0.5 * self.x.iter().zip(self.y)
            .map(|(xval, yval)| {
                let model = m * xval + c;
                (yval - model).powf(2.0)
            }).sum::<f32>()
    }

    fn lnprior(&self, params: &Guess) -> f32 {
        // Uninformative priors
        0.0f32
    }
}
```
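The invalid-prior convention above can be shown in a standalone sketch (independent of the emcee trait, with purely illustrative bounds): the default lnprob is the sum of log-prior and log-likelihood, and an out-of-bounds parameter is flagged with negative infinity so the sampler rejects it.

```rust
// Hypothetical flat prior on a slope parameter m, bounded to [-5, 5].
// Outside the bounds the position is invalid, signalled with -infinity.
fn lnprior(m: f32) -> f32 {
    if m > -5.0 && m < 5.0 {
        0.0 // flat (uninformative) prior inside the bounds
    } else {
        -std::f32::INFINITY // invalid: this position is always rejected
    }
}

// The default lnprob is the sum of log-prior and log-likelihood.
fn lnprob(m: f32, lnlike: f32) -> f32 {
    lnprior(m) + lnlike
}

fn main() {
    assert_eq!(lnprob(1.0, -2.5), -2.5);
    assert!(lnprob(10.0, -2.5).is_infinite());
    println!("ok");
}
```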

### Initial guess

Next, construct an initial guess. A Guess represents a proposal parameter vector:

```rust
use emcee::Guess;

let initial_guess = Guess::new(&[0.0f32, 0.0f32]);
```

The sampler implemented by this crate uses multiple walkers, so the initial guess must be replicated once per walker, and typically dispersed from the initial position to aid exploration of the parameter space. This can be achieved with the create_initial_guess method:

```rust
let nwalkers = 100;
let perturbed_guess = initial_guess.create_initial_guess(nwalkers);
assert_eq!(perturbed_guess.len(), nwalkers);
```

### Constructing a sampler

The sampler generates new parameter vectors, assesses their probability using the user-supplied probability model, accepts the more likely parameter vectors, and repeats for a fixed number of iterations.
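The core accept/reject decision of such a sampler can be sketched in log space. This is a simplified Metropolis-style step for illustration only, not the crate's actual affine-invariant stretch move:

```rust
// Hypothetical accept/reject step in log space: accept when the new
// position is more probable, or with probability exp(lnprob_new - lnprob_old)
// otherwise. `u` is a uniform random number in (0, 1).
fn accept(lnprob_new: f32, lnprob_old: f32, u: f32) -> bool {
    lnprob_new - lnprob_old > u.ln()
}

fn main() {
    // A more probable position is accepted (ln(u) is always negative)...
    assert!(accept(-1.0, -2.0, 0.999));
    // ...while a much less probable one is usually rejected.
    assert!(!accept(-12.0, -2.0, 0.5));
    println!("ok");
}
```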

The sampler needs to know the number of walkers to use, which must be an even number and at least twice the size of your parameter vector. It also needs the size of your parameter vector, and your probability struct (which implements Prob):

```rust
let nwalkers = 100;
let ndim = 2; // m and c

// Build a linear model y = m * x + c (see above)
let initialx = [0.0f32, 1.0f32, 2.0f32];
let initialy = [5.0f32, 7.0f32, 9.0f32];

let model = Model {
    x: &initialx,
    y: &initialy,
};

let mut sampler = emcee::EnsembleSampler::new(nwalkers, ndim, &model)
    .expect("could not create sampler");
```

Then run the sampler:

```rust
let niterations = 100;
sampler.run_mcmc(&perturbed_guess, niterations).expect("error running sampler");
```

### Studying the results

The samples are stored in the sampler's flatchain, retrieved through the flatchain method:

```rust
let flatchain = sampler.flatchain();

for (i, guess) in flatchain.iter().enumerate() {
    // Skip possible "burn-in" phase
    if i < 50 * nwalkers {
        continue;
    }

    println!("Iteration {}; m={}, c={}", i, guess[0], guess[1]);
}
```
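As a further (hypothetical) post-processing step, point estimates of the parameters can be formed by averaging the post-burn-in samples. A standalone sketch, using plain Vec<f32> values in place of emcee::Guess:

```rust
// Average each parameter over the samples that survive the burn-in cut.
// `burn_in` is the number of leading samples to discard.
fn posterior_mean(samples: &[Vec<f32>], burn_in: usize) -> Vec<f32> {
    let kept = &samples[burn_in..];
    let n = kept.len() as f32;
    let ndim = kept[0].len();
    let mut mean = vec![0.0f32; ndim];
    for sample in kept {
        for (acc, value) in mean.iter_mut().zip(sample) {
            *acc += value;
        }
    }
    mean.iter().map(|acc| acc / n).collect()
}

fn main() {
    let samples = vec![
        vec![0.0f32, 0.0], // burn-in, discarded
        vec![1.0, 4.0],
        vec![3.0, 6.0],
    ];
    assert_eq!(posterior_mean(&samples, 1), vec![2.0f32, 5.0]);
    println!("ok");
}
```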