This library provides utility functions:
💡 To learn more about this library, please read how it was built on developerlife.com:

💡 You can also read all the Rust content on developerlife.com here. Also, the equivalent of this library is available for TypeScript and is called `r3bl-ts-utils`.
Please add the following to your `Cargo.toml` file:

```toml
[dependencies]
r3bl_rs_utils = "0.7.35"
```
`Store` is thread safe and asynchronous (using Tokio). You have to implement `async` traits in order to use it, by defining your own reducer, subscriber, and middleware trait objects. You also have to supply the Tokio runtime; this library will not create its own runtime. However, for best results, use the multithreaded Tokio runtime.
Once you set up your Redux store w/ your reducer, subscriber, and middleware, you can use it by calling `store.dispatch_spawn(action)`. This kicks off a parallel Tokio task that will run the middleware functions, the reducer functions, and finally the subscriber functions. So this will not block the thread of whatever code you call it from. The `dispatch_spawn()` method itself is not `async`, so you can call it from non-`async` code; however, you still have to provide a Tokio executor / runtime, without which you will get a panic when `dispatch_spawn()` is called.
Your middleware (`async` trait implementations) will be run concurrently or in parallel via Tokio tasks. You get to choose which `async` trait to implement to do one or the other. Regardless of which kind you implement, the `Action` that is optionally returned will be dispatched to the Redux store at the end of execution of all the middlewares (for that particular `dispatch_spawn()` call).

- `AsyncMiddlewareSpawns<State, Action>` - Your middleware has to use `tokio::spawn` to run `async` blocks in a separate Tokio task and return a `JoinHandle` that contains an `Option<Action>`. A macro `fire_and_forget!` is provided so that you can easily spawn parallel blocks of code in your `async` functions. These are added to the store via a call to `add_middleware_spawns(...)`.
- `AsyncMiddleware<State, Action>` - These will all be run together concurrently using `futures::join_all()`. These are added to the store via a call to `add_middleware(...)`.
The subscribers will be run asynchronously via Tokio tasks. They are all run together concurrently, but not in parallel, using `futures::join_all()`.
The reducer functions are also `async` functions that are run in the Tokio runtime. They're run one after another, in the order in which they're added.
⚡ Any functions or blocks that you write which use the Redux library will have to be marked `async` as well. And you will have to start the Tokio runtime by using the `#[tokio::main]` macro. If you use the default runtime, then Tokio will use multiple threads and its task stealing implementation to give you parallel and concurrent behavior. You can also use the single threaded runtime; it's really up to you.
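For instance, a minimal entry point might look like the sketch below (it assumes the `State` and `Action` types, and the registration calls, from the full example later in this section):

```rust
// A minimal sketch, not a complete program: `#[tokio::main]` starts the multithreaded
// Tokio runtime that `dispatch_spawn()` needs. `State` & `Action` are assumed to be
// defined as in the example later in this section.
#[tokio::main]
async fn main() {
  let mut store = Store::<State, Action>::default();
  // ... register your reducer, subscriber, & middleware here ...
  store.dispatch_spawn(Action::Add(1, 2)).await;
}
```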
To create middleware you have to implement the `AsyncMiddleware<S, A>` trait or the `AsyncMiddlewareSpawns<S, A>` trait. Please read the `AsyncMiddleware` docs for examples of both. The `run()` method is passed two arguments: the `State` and the `Action`.

- For `AsyncMiddlewareSpawns<S, A>`, in your `run()` implementation you have to surround your code w/ the `fire_and_forget!` macro, and this will return a `JoinHandle<Option<A>>`.
- For `AsyncMiddleware<S, A>`, in your `run()` implementation you just have to return an `Option<A>`.

To create reducers you have to implement the `AsyncReducer` trait.

- These should be pure functions, and simply return a new `State` object.
- The `run()` method will be passed two arguments: a ref to `Action` and a ref to `State`.

To create subscribers you have to implement the `AsyncSubscriber` trait.

- The `run()` method will be passed a `State` object as an argument.
- It returns nothing `()`.

Here's the gist of how to make & use one of these:

- Create a struct and make it derive `Default`. Or you can add your own properties / fields to this struct, and construct it yourself, or even provide a constructor function.
  - A constructor function `new()` is provided for you by the trait.
- Implement the `AsyncMiddleware`, `AsyncMiddlewareSpawns`, `AsyncReducer`, or `AsyncSubscriber` trait on your struct.
- Register this struct w/ the store using the `add_middleware()`, `add_middleware_spawns()`, `add_reducer()`, or `add_subscriber()` methods. You can register as many of these as you like.
  - If your struct derives `Default`, you can use the provided `::new()` method to create an instance and pass that to the `add_???()` methods.
  - If your struct has its own properties, construct an instance yourself and pass it to the `add_???()` methods wrapped in a `Box::new($YOUR_STRUCT)`.
💡 There are lots of examples in the tests for this library and in this CLI application built using it.
Here's an example of how to use it. Let's start w/ the import statements.
```rust
/// Imports.
use async_trait::async_trait;
use r3bl_rs_utils::redux::{
  AsyncMiddlewareSpawns, AsyncMiddleware, AsyncReducer,
  AsyncSubscriber, Store, StoreStateMachine,
};
use std::sync::{Arc, Mutex};
use tokio::sync::RwLock;
```
- Make sure to have the `tokio` and `async-trait` crates installed, as well as `r3bl_rs_utils`, in your `Cargo.toml` file.
- Here's an example `Cargo.toml`.
Let's say we have the following action enum, and state struct.
```rust
/// Action enum.
// Note: the derives below are assumed; the original snippet did not show them.
#[derive(Clone, Debug, PartialEq, Eq, Hash)]
pub enum Action {
  Add(i32, i32),
  AddPop(i32),
  Clear,
  MiddlewareCreateClearAction,
  Noop,
}

impl Default for Action {
  fn default() -> Self { Action::Noop }
}

/// State.
#[derive(Clone, Default, PartialEq, Debug)]
pub struct State {
  pub stack: Vec<i32>, // `i32` is inferred from the test assertions at the end of this example.
}
```
Here's an example of the reducer function.
```rust
/// Reducer function (pure).
#[derive(Default)]
struct MyReducer;

#[async_trait]
impl AsyncReducer<State, Action> for MyReducer {
  async fn run(&self, action: &Action, state: &State) -> State {
    // Sketch of the body (the original snippet is truncated here); it matches the
    // assertions in the test at the end of this example.
    match action {
      Action::Add(a, b) => State { stack: vec![a + b] },
      Action::AddPop(a) => State { stack: vec![a + state.stack[0]] },
      Action::Clear => State { stack: vec![] },
      _ => state.clone(),
    }
  }
}
```
Here's an example of an async subscriber (which is run after an action is dispatched). The following example uses a subscriber struct that holds a reference to a shared object captured from the enclosing scope. This is a pretty common pattern that you might encounter when creating subscribers that share state in your enclosing block or scope.
```rust
/// This shared object is used to collect results from the subscriber
/// function & test it later.
let shared_object = Arc::new(Mutex::new(Vec::<i32>::new()));

struct MySubscriber {
  pub shared_object_ref: Arc<Mutex<Vec<i32>>>,
}

#[async_trait]
impl AsyncSubscriber<State> for MySubscriber {
  async fn run(&self, state: State) {
    // Sketch of the body (the original snippet is truncated here): copy the top of
    // the stack into the shared object so the test below can assert on it.
    if let Some(top) = state.stack.first() {
      self.shared_object_ref.lock().unwrap().push(*top);
    }
  }
}

let my_subscriber = MySubscriber {
  shared_object_ref: shared_object.clone(),
};
```
Here are two types of async middleware functions. One returns an action (which will get dispatched once the middleware returns), and the other doesn't return anything (like a logger middleware that just dumps the current action to the console). Note that both of these share the `shared_object` reference from above.
```rust
/// This shared object is used to collect results from the middleware
/// function & test it later.
struct MwExampleNoSpawn { pub shared_object_ref: Arc<Mutex<Vec<i32>>> }

#[async_trait]
impl AsyncMiddleware<State, Action> for MwExampleNoSpawn {
  // Sketch of the body (the original snippet is truncated here): just log the action
  // & don't dispatch a new one.
  async fn run(&self, action: Action, _state: State) -> Option<Action> {
    println!("{:?}", action);
    None
  }
}

let mw_example_no_spawn = MwExampleNoSpawn { shared_object_ref: shared_object.clone() };

/// This shared object is used to collect results from the middleware
/// function & test it later.
struct MwExampleSpawns { pub shared_object_ref: Arc<Mutex<Vec<i32>>> }

#[async_trait]
impl AsyncMiddlewareSpawns<State, Action> for MwExampleSpawns {
  // Sketch of the body (the original snippet is truncated here): `fire_and_forget!`
  // spawns the block on the Tokio runtime & returns its `JoinHandle`; the returned
  // action gets dispatched to the store.
  async fn run(&self, action: Action, _state: State) -> tokio::task::JoinHandle<Option<Action>> {
    fire_and_forget!({
      match action {
        Action::MiddlewareCreateClearAction => Some(Action::Clear),
        _ => None,
      }
    })
  }
}

let mw_example_spawns = MwExampleSpawns { shared_object_ref: shared_object.clone() };
```
Here's how you can set up a store with the above reducer, middleware, and subscriber functions.
```rust
// Setup store.
let mut store = Store::<State, Action>::default();
store
  .add_reducer(MyReducer::new()) // Note the use of `::new()` here.
  .await
  .add_subscriber(Box::new( // We aren't using `::new()` here
    my_subscriber,          // because the struct has properties.
  ))
  .await
  .add_middleware_spawns(Box::new( // We aren't using `::new()` here
    mw_example_spawns,             // because the struct has properties.
  ))
  .await
  .add_middleware(Box::new( // We aren't using `::new()` here
    mw_example_no_spawn,    // because the struct has properties.
  ))
  .await;
```
Finally, here's an example of how to dispatch an action in a test. You can dispatch actions in parallel using `dispatch_spawn()`, which is "fire and forget", meaning that the caller won't block or wait for `dispatch_spawn()` to return.
```rust
// Test the reducer and subscriber by dispatching `Add`, `AddPop`, & `Clear` actions in parallel.
store.dispatch_spawn(Action::Add(1, 2)).await;
assert_eq!(shared_object.lock().unwrap().pop(), Some(3));

store.dispatch_spawn(Action::AddPop(1)).await;
assert_eq!(shared_object.lock().unwrap().pop(), Some(4));

store.dispatch_spawn(Action::Clear).await;
assert_eq!(store.get_state().stack.len(), 0);
```
There are quite a few declarative macros that you will find in the library. They tend to be used internally in the implementation of the library itself. Here are some that are actually externally exposed via `#[macro_export]`.
Wrap the given `block` or `stmt` so that it returns a `Result<()>`. It is just syntactic sugar that helps avoid having to write `Ok(())` repeatedly at the end of each block. Here's an example.
```rust
throws! {
  match input_event {
    InputEvent::DisplayableKeypress(character) => {
      println_raw!(character);
    }
    _ => todo!()
  }
}
```
Here's another example.
```rust
fn test_simple_2_col_layout() -> CommonResult<()> {
  throws!({
    let mut canvas = Canvas::default();
    canvas.stylesheet = create_stylesheet()?;
    canvas.canvas_start(
      CanvasPropsBuilder::new()
        .set_pos((0, 0).into())
        .set_size((500, 500).into())
        .build(),
    )?;
    layout_container(&mut canvas)?;
    canvas.canvas_end()?;
  });
}
```
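Conceptually, `throws!` just runs your block and then supplies the trailing `Ok(())` for you. Here's a small sketch of that idea (not the literal macro expansion; `do_stuff` is a hypothetical fallible helper):

```rust
// With the macro: the trailing `Ok(())` is supplied by `throws!`.
fn with_the_macro() -> CommonResult<()> {
  throws!({
    do_stuff()?; // `do_stuff` is a hypothetical helper returning `CommonResult<()>`.
  });
}

// Without the macro: you have to write the `Ok(())` yourself.
fn without_the_macro() -> CommonResult<()> {
  do_stuff()?;
  Ok(())
}
```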
You can use this macro to dump log messages at 3 levels to a file. By default this file is named `log.txt` and is dumped in the current directory. Here's how you can use it. Please note that the macro returns a `Result`. A type alias called `CommonResult<T>` is provided to save some typing; it is just shorthand for `std::result::Result<T, Box<dyn Error>>`. The log file itself is overwritten for each "session" in which you run your program.
```rust
use r3bl_rs_utils::{init_file_logger_once, log, CommonResult};

fn run() -> CommonResult<()> {
  let msg = "foo";
  let msg_2 = "bar";
  log!(INFO, "This is an info message");
  log!(WARN, "This is a warning message {}", msg);
  log!(ERROR, "This is an error message {} {}", msg, msg_2);
  Ok(())
}
```
Please check out the source here.
This macro makes it easy to create simple HTTP GET requests using the `reqwest` crate. It generates an `async` function called `make_request()` that returns a `CommonResult<T>`, where `T` is the type of the response body. Here's an example.
```rust
use std::{error::Error, fmt::Display};
use r3bl_rs_utils::make_api_call_for;
use serde::{Deserialize, Serialize};

const ENDPOINT: &str = "https://api.namefake.com/english-united-states/female/";

make_api_call_for! { FakeContactData at ENDPOINT }

#[derive(Serialize, Deserialize, Debug, Default)] // Derives are assumed; not shown in the original snippet.
pub struct FakeContactData {
  pub name: String,
  pub phone_h: String,
  pub email_u: String,
  pub email_d: String,
  pub address: String,
}

let fake_data = fake_contact_data_api()
  .await
  .unwrap_or_else(|_| FakeContactData {
    name: "Foo Bar".to_string(),
    phone_h: "123-456-7890".to_string(),
    email_u: "foo".to_string(),
    email_d: "bar.com".to_string(),
    ..FakeContactData::default()
  });
```
You can find lots of examples here.
This is a really simple wrapper around `tokio::spawn()` for the given block. It's just syntactic sugar. Here's an example of using it for a non-`async` block.
```rust
pub fn foo() {
  fire_and_forget!(
    { println!("Hello"); }
  );
}
```
And, here's an example of using it for an `async` block.
```rust
pub fn foo() {
  fire_and_forget!(
    let fake_data = fake_contact_data_api()
      .await
      .unwrap_or_else(|_| FakeContactData {
        name: "Foo Bar".to_string(),
        phone_h: "123-456-7890".to_string(),
        email_u: "foo".to_string(),
        email_d: "bar.com".to_string(),
        ..FakeContactData::default()
      });
  );
}
```
Syntactic sugar to run a conditional statement. Here's an example.
```rust
const DEBUG: bool = true;

call_if_true!(
  DEBUG,
  eprintln!(
    "{} {} {}\r",
    r3bl_rs_utils::style_error("▶"),
    r3bl_rs_utils::style_prompt($msg),
    r3bl_rs_utils::style_dimmed(&format!("{:#?}", $err))
  )
);
```
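The snippet above comes from inside a macro definition (hence the `$msg` / `$err` metavariables), so here is a smaller self-contained sketch of the same idea:

```rust
// A minimal sketch: the second argument only runs when the first argument is true.
const DEBUG: bool = true;

fn maybe_log(msg: &str) {
  call_if_true!(DEBUG, eprintln!("debug: {}", msg));
}
```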
This is a really simple macro to make it effortless to use the color console logger. It takes a single identifier as an argument, or any number of them. It simply dumps an arrow symbol, followed by the identifier (stringified) along with the value that it contains (using the `Debug` formatter). All of the output is colorized for easy readability. You can use it like this.
```rust
let my_string = "Hello World!";
debug!(my_string);

let my_number = 42;
debug!(my_string, my_number);
```
You can also use it in these other forms for terminal raw mode output. This will dump the output to stderr.
```rust
if let Err(err) = $cmd {
  let msg = format!("❌ Failed to {}", stringify!($cmd));
  debug!(ERROR_RAW &msg, err);
}
```
This will dump the output to stdout.
```rust
let msg = format!("✅ Did the thing to {}", stringify!($name));
debug!(OK_RAW &msg);
```
This is a macro that takes inspiration from the `with` scoping function in Kotlin. It just makes it easier to express a block of code that needs to run after an expression is evaluated and saved to a given variable. Here's an example.
```rust
with! {
  /* $eval */ LayoutProps {
    id: id.to_string(),
    dir,
    req_size: RequestedSize::new(width_pc, height_pc),
  },
  as /* $id */ it,
  run /* $code */ {
    match self.is_layout_stack_empty() {
      true => self.add_root_layout(it),
      false => self.add_normal_layout(it),
    }?;
  }
}
```
It does the following:

- Evaluates the `$eval` expression and assigns it to `$id`.
- Runs the `$code` block.

A companion macro, `with_mut!`, is just like `with!` but it takes a mutable reference to the `$id` variable.
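Here's a small sketch of that mutable variant (the `with_mut!` name and the exact syntax are assumed to mirror `with!` above):

```rust
// A hedged sketch, assuming `with_mut!` mirrors the `with!` syntax shown above but
// binds `$id` mutably so the block can modify it in place.
with_mut! {
  Vec::<i32>::new(),
  as it,
  run {
    it.push(1);
    it.push(2);
    assert_eq!(it.len(), 2);
  }
}
```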
This macro can be useful when you are working w/ an expression that returns an `Option`, and if that `Option` is `None` you want to abort and return an error immediately. The idea is that you use this macro in a function that returns a `Result<T>`. Here's an example to illustrate.
```rust
pub fn from(
  width_percent: u8,
  height_percent: u8,
) -> CommonResult<RequestedSize> {
  let size_tuple = (width_percent, height_percent);
  let (width_pc, height_pc) = unwrap_option_or_run_fn_returning_err!(
    convert_to_percent(size_tuple),
    || LayoutError::new_err(LayoutErrorType::InvalidLayoutSizePercentage)
  );
  Ok(Self::new(width_pc, height_pc))
}
```
This macro is basically a way to compute something lazily when it (the `Option`) is set to `None`. It unwraps the `$option`, and if it is `None` then it runs the `$next` closure, whose return value is assigned to `$option`. Here's an example.
```rust
use r3bl_rs_utils::unwrap_option_or_compute_if_none;

fn test_unwrap_option_or_compute_if_none() {
  struct MyStruct {
    field: Option<bool>,
  }

  let mut my_struct = MyStruct { field: None };
  assert_eq!(my_struct.field, None);

  // Sketch (the original snippet is truncated here, & the exact invocation syntax is
  // assumed): the closure runs because the `Option` is `None`, and its return value
  // is stored back into `my_struct.field`.
  unwrap_option_or_compute_if_none!(my_struct.field, { || true });
  assert_eq!(my_struct.field, Some(true));
}
```
All the procedural macros are organized in 3 crates: the public crate, an internal or core crate, and the proc macro crate.
This derive macro makes it easy to generate builders when annotating a `struct` or `enum`. It generates a builder for the annotated type, and it has full support for generics. It can be used like this.
```rust
// Sketch (the original snippet is truncated here): the derive is assumed to be named
// `Builder`, & the generated builder `PointBuilder` w/ `set_*` & `build()` methods.
#[derive(Builder)]
struct Point<X, Y> {
  x: X,
  y: Y,
}

let my_pt: Point<i32, i32> = PointBuilder::new()
  .set_x(1)
  .set_y(2)
  .build();

assert_eq!(my_pt.x, 1);
assert_eq!(my_pt.y, 2);
```
This function-like macro (with custom syntax) makes it easy to manage shareability and interior mutability of a struct. We call this pattern the "manager" of "things".

🪄 You can read all about it here.

- The wrapped struct is protected by an `RwLock` for thread safety.
- It is wrapped in an `Arc` so we can share it across threads.
- The custom syntax also accepts an optional `where` clause.

Here's a very simple usage:
```rust
make_struct_safe_to_share_and_mutate! {
  named MyMapManager<K, V>
  where K: Default + Send + Sync + 'static, V: Default + Send + Sync + 'static
  containing my_map
  of_type std::collections::HashMap<K, V>
}
```
Here's an async example.

```rust
async fn test_custom_syntax_no_where_clause() {
  make_struct_safe_to_share_and_mutate! {
    named StringMap<K, V>
    // Note: no `where` clause this time.
    containing my_map
    of_type std::collections::HashMap<K, V>
  }

  // Sketch (the original snippet is truncated here): the generated manager struct is
  // assumed to be constructible via `Default` & to expose the wrapped `my_map`.
  let my_manager: StringMap<String, String> = StringMap::default();
  let locked_map = my_manager.my_map.read().await;
  assert_eq!(locked_map.len(), 0);
}
```
This function-like macro (with custom syntax) makes it easy to share functions and lambdas that are async. They are safe to share between threads, and they support either being invoked or spawned.

🪄 You can read all about how to write proc macros here.

- The function or lambda is wrapped in an `Arc<RwLock<>>` for thread safety and interior mutability.
- A `get()` method is generated which makes it possible to share this struct across threads.
- A `from()` method is generated which makes it easy to create this struct from a function or lambda.
- A `spawn()` method is generated which makes it possible to spawn the enclosed function or lambda asynchronously using Tokio.
- An `invoke()` method is generated which makes it possible to invoke the enclosed function or lambda synchronously.

Here's an example of how to use this macro.
```rust
use r3bl_rs_utils::make_safe_async_fn_wrapper;

make_safe_async_fn_wrapper! {
  named SafeMiddlewareFnWrapper<A>
  containing fn_mut
  of_type FnMut(A) -> Option<A>
}
```
Here's another example.

```rust
use r3bl_rs_utils::make_safe_async_fn_wrapper;

make_safe_async_fn_wrapper! {
  named SafeSubscriberFnWrapper<S>
  containing fn_mut
  of_type FnMut(S) -> ()
}
```
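To make the generated API concrete, here is a hedged usage sketch (the exact signatures of `from()`, `invoke()`, and `spawn()` are assumptions based on the description above):

```rust
// A hedged sketch, not the exact generated API: wrap a lambda, then either invoke it
// synchronously or spawn it asynchronously on the Tokio runtime.
let wrapper = SafeSubscriberFnWrapper::from(|state: i32| {
  println!("subscriber got state: {}", state);
});
wrapper.invoke(1);             // Invoke the enclosed lambda synchronously.
let handle = wrapper.spawn(2); // Spawn it via Tokio; assumed to return a `JoinHandle`.
```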
[`Arena`] and [`MTArena`] types are the implementation of a non-binary tree data structure that is inspired by memory arenas. Here's a simple example of how to use the [`Arena`] type:
```rust
use r3bl_rs_utils::{
  tree_memory_arena::{Arena, HasId, MTArena, ResultUidList},
  utils::{style_primary, style_prompt},
};

let mut arena = Arena::<usize>::new();

// Sketch (the original snippet is truncated here): add a single node w/ no parent &
// check the id that the arena assigns to it.
let node_1_value = 42;
let node_1_id = arena.add_new_node(node_1_value, None);
println!("{} {:?}", style_primary("node_1_id"), node_1_id);
assert_eq!(node_1_id, 0);
```
Here's how you get weak and strong references from the arena (tree), and how to walk the tree:
```rust
use r3bl_rs_utils::{
  tree_memory_arena::{Arena, HasId, MTArena, ResultUidList},
  utils::{style_primary, style_prompt},
};

// Same setup as the previous example (the original snippet is truncated here).
let mut arena = Arena::<usize>::new();
let node_1_value = 42;
let node_1_id = arena.add_new_node(node_1_value, None);

{
  assert!(arena.get_node_arc(&node_1_id).is_some());
  let node_1_ref = dbg!(arena.get_node_arc(&node_1_id).unwrap());
  let node_1_ref_weak = arena.get_node_arc_weak(&node_1_id).unwrap();
  assert_eq!(node_1_ref.read().unwrap().payload, node_1_value);
  assert_eq!(
    node_1_ref_weak.upgrade().unwrap().read().unwrap().payload,
    42
  );
}

{
  let node_id_dne = 200 as usize;
  assert!(arena.get_node_arc(&node_id_dne).is_none());
}

{
  let node_1_id = 0 as usize;
  let node_list = dbg!(arena.tree_walk_dfs(&node_1_id).unwrap());
  assert_eq!(node_list.len(), 1);
  assert_eq!(node_list, vec![0]);
}
```
Here's an example of how to use the [`MTArena`] type:
```rust
use std::{
  sync::Arc,
  thread::{self, JoinHandle},
};

use r3bl_rs_utils::{
  tree_memory_arena::{Arena, HasId, MTArena, ResultUidList},
  utils::{style_primary, style_prompt},
};

type ThreadResult = Vec<usize>;
type Handles = Vec<JoinHandle<ThreadResult>>;

let mut handles: Handles = Vec::new();
let arena = MTArena::<String>::new();

// Thread 1 - add root. Spawn and wait (since the 2 threads below need the root).
{
  let arena_arc = arena.get_arena_arc();
  let thread = thread::spawn(move || {
    let mut arena_write = arena_arc.write().unwrap();
    let root = arena_write.add_new_node("foo".to_string(), None);
    vec![root]
  });
  thread.join().unwrap();
}

// Perform tree walking in parallel. Note that the lambda captures a lot of the
// enclosing variable context.
{
  let arena_arc = arena.get_arena_arc();
  let fn_arc = Arc::new(move |uid, payload| {
    println!(
      "{} {} {} Arena weak_count:{} strong_count:{}",
      style_primary("walker_fn - closure"),
      uid,
      payload,
      Arc::weak_count(&arena_arc),
      Arc::weak_count(&arena_arc)
    );
  });

  // Walk tree w/ a new thread using the arc to the lambda.
  // Sketch (the original snippet is truncated here): the parallel tree-walk method is
  // assumed to be `tree_walk_parallel(&node_id, walker_fn)` returning a `JoinHandle`.
  {
    let thread_handle: JoinHandle<ResultUidList> =
      arena.tree_walk_parallel(&0, fn_arc.clone());
    let result_node_list = thread_handle.join().unwrap();
    println!("{:#?}", result_node_list);
  }

  // Walk tree w/ another thread using the arc to the lambda.
  {
    let thread_handle: JoinHandle<ResultUidList> =
      arena.tree_walk_parallel(&0, fn_arc.clone());
    let result_node_list = thread_handle.join().unwrap();
    println!("{:#?}", result_node_list);
  }
}
```
There are more complex ways of using [`Arena`] and [`MTArena`]. Please look at these extensive integration tests that put them through their paces here.
These two structs make it easier to work w/ `Result`s. They are just syntactic sugar and helper structs. You will find them used everywhere in the `r3bl_rs_utils` crate. Here's an example of using them both.
```rust
use r3bl_rs_utils::{CommonError, CommonResult};

pub struct Stylesheet {
  pub styles: Vec<Style>, // `Style` is a hypothetical element type; the original snippet is truncated here.
}
```
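Here is a hedged sketch of how the two fit together (the `Style` type and the `CommonError::new_err_with_only_msg()` constructor are assumptions used only for illustration):

```rust
// A hedged sketch: a fallible method returns `CommonResult<()>`, & failures are
// reported via `CommonError`. The `Style` type & the error constructor are assumed.
#[derive(Debug, Default, Clone)]
pub struct Style {
  pub id: String,
}

impl Stylesheet {
  pub fn add_style(&mut self, style: Style) -> CommonResult<()> {
    if style.id.is_empty() {
      return CommonError::new_err_with_only_msg("style id must be set");
    }
    self.styles.push(style);
    Ok(())
  }
}
```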