Rust Stackful Coroutine Library.
May is a high-performance stackful coroutine library that can be thought of as the Rust version of goroutines. You can use it to easily design and develop massively concurrent programs in Rust.
```rust
// a naive echo server
#[macro_use]
extern crate may;

use may::net::TcpListener;
use std::io::{Read, Write};

fn main() {
    let listener = TcpListener::bind("127.0.0.1:8000").unwrap();
    while let Ok((mut stream, _)) = listener.accept() {
        go!(move || {
            let mut buf = vec![0; 1024 * 16]; // alloc in heap!
            while let Ok(n) = stream.read(&mut buf) {
                if n == 0 {
                    break;
                }
                stream.write_all(&buf[0..n]).unwrap();
            }
        });
    }
}
```
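The benchmark runs below use 2 worker threads. As a rough sketch of how that knob could be set in your own program (assuming the `may::config().set_workers(..)` setter, which is not shown in the example above), the scheduler is configured once at startup, before any coroutine is spawned:

```rust
// A minimal sketch, assuming `may::config()` exposes `set_workers`:
// configure the scheduler once at program start, before spawning coroutines.
fn main() {
    may::config().set_workers(2); // two scheduler worker threads

    may::go!(|| {
        println!("running on a 2-worker scheduler");
    })
    .join()
    .unwrap();
}
```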
Here is a simple comparison with the Rust echo server implemented in tokio, to give a sense of May's performance.
Note: The tokio version is not at its maximum optimization. In theory, future scheduling does not involve context switches, so it should be a little faster than the coroutine version. But I can't find a proper example for a multi-threaded comparison, so it is included here just to give you some sense of May's performance. If you have a better implementation of a future-based echo server, I will update it here.
Machine Specs:
Echo server client:

```sh
$ cargo build --example=echo_client --release
```
tokio echo server:

Run the server (by default with 2 threads) in another terminal:

```sh
$ cd tokio-core
$ cargo run --example=echo-threads --release
```
```sh
$ target/release/examples/echo_client -t 2 -c 100 -l 100 -a 127.0.0.1:8080
==================Benchmarking: 127.0.0.1:8080==================
100 clients, running 100 bytes, 10 sec.

Speed: 315698 request/sec, 315698 response/sec, 30829 kb/sec
Requests: 3156989 Responses: 3156989
target/release/examples/echo_client -t 2 -c 100 -l 100 -a 127.0.0.1:8080  1.89s user 13.46s system 152% cpu 10.035 total
```
may echo server:

Run the server (by default with 2 threads) in another terminal:

```sh
$ cd may
$ cargo run --example=echo --release -- -p 8000 -t 2
```
```sh
$ target/release/examples/echo_client -t 2 -c 100 -l 100 -a 127.0.0.1:8000
==================Benchmarking: 127.0.0.1:8000==================
100 clients, running 100 bytes, 10 sec.

Speed: 419094 request/sec, 419094 response/sec, 40927 kb/sec
Requests: 4190944 Responses: 4190944
target/release/examples/echo_client -t 2 -c 100 -l 100 -a 127.0.0.1:8000  2.60s user 16.96s system 195% cpu 10.029 total
```
There is a detailed doc that describes MAY's main restrictions.
There are four things you should avoid when writing coroutines:

* Don't call thread-blocking APIs. It will hurt the performance; see the sketch after this list.
* Carefully use Thread Local Storage. Accessing TLS in a coroutine may trigger undefined behavior.

  It's considered unsafe with the following pattern:

  ```rust
  set_tls();
  coroutine::yield_now(); // or other coroutine API that would cause a scheduling
  use_tls();
  ```

  but it's safe if your code is not sensitive to the previous state of TLS, or if there is no coroutine scheduling between setting the TLS and using it.
* Don't run CPU-bound tasks for a long time; however, it's OK if you don't care about fairness.

* Don't exceed the coroutine stack size; make sure it's big enough for your application.
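To illustrate the first rule, here is a minimal sketch of swapping a thread-blocking call for its coroutine-aware counterpart; it assumes `may::coroutine::sleep` as the yielding equivalent of `std::thread::sleep`:

```rust
use std::time::Duration;

fn main() {
    let handle = may::go!(|| {
        // OK inside a coroutine: yields to the scheduler instead of
        // parking the whole worker thread.
        may::coroutine::sleep(Duration::from_millis(100));

        // Avoid inside a coroutine: std::thread::sleep(..) would block the
        // worker thread and stall every other coroutine scheduled on it.
    });
    handle.join().unwrap();
}
```

The same reasoning applies to I/O: prefer the `may::net` types over `std::net` inside coroutines, as the echo server example above does.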
Note: The first three rules are common to cooperative async libraries in Rust; even a future-based system has these limitations. So what you should really focus on is the coroutine stack size: make sure it's big enough for your application.

If you need to tune the coroutine stack size, please read here.
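As a rough sketch of such tuning (assuming the global `may::config().set_stack_size(..)` setter; the illustrative size below is not a recommendation and depends entirely on your application), the default stack size can be raised once at startup:

```rust
fn main() {
    // Illustrative value only: 0x8000 bytes (32 KiB) per coroutine stack.
    may::config().set_stack_size(0x8000);

    may::go!(|| {
        // This coroutine runs on the configured default stack size; deep
        // recursion or large stack buffers need a correspondingly larger value.
        println!("hello from a coroutine");
    })
    .join()
    .unwrap();
}
```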
This crate supports the platforms listed below; for more platform support, please refer to generator.
May is licensed under either of the following, at your option:

* Apache License, Version 2.0
* MIT License