r3bl_rs_utils

This library provides utility functions:

  1. Thread safe asynchronous Redux library (uses Tokio to run subscribers and middleware in separate tasks). The reducer functions are run in sequence (not in Tokio tasks).
  2. Declarative macros, and procedural macros (both function like and derive) to avoid having to write lots of boilerplate code for many common (and complex) tasks.
  3. Non binary tree data structure inspired by memory arenas, that is thread safe and supports parallel tree walking.
  4. Functions to unwrap deeply nested objects inspired by Kotlin scope functions.
  5. Capabilities to make it easier to build TUIs (Text User Interface apps) in Rust. This is currently experimental and is being actively developed.

πŸ’‘ To learn more about this library, please read how it was built on developerlife.com:

  1. https://developerlife.com/2022/02/24/rust-non-binary-tree/
  2. https://developerlife.com/2022/03/12/rust-redux/
  3. https://developerlife.com/2022/03/30/rust-proc-macro/

πŸ’‘ You can also read all the Rust content on developerlife.com here. Also, the equivalent of this library is available for TypeScript and is called r3bl-ts-utils.

Usage

Please add the following to your Cargo.toml file:

```toml
[dependencies]
r3bl_rs_utils = "0.7.26"
```

redux

Store is thread safe and asynchronous (using Tokio). You have to implement async traits in order to use it, by defining your own reducer, subscriber, and middleware trait objects. You also have to supply the Tokio runtime; this library will not create its own. For best results, use the multi-threaded Tokio runtime.

Once you set up your Redux store w/ your reducer, subscriber, and middleware, you can use it by calling store.dispatch_spawn(action). This kicks off a parallel Tokio task that runs the middleware functions, the reducer functions, and finally the subscriber functions, so it will not block the thread of whatever code calls it. The dispatch_spawn() method itself is not async, so you can call it from non-async code; however, you still have to provide a Tokio executor / runtime, without which dispatch_spawn() will panic when called.

Middlewares

Your middleware (async trait implementations) will be run concurrently or in parallel via Tokio tasks. You get to choose which async trait to implement to do one or the other. Regardless of which kind you implement, the Action that is optionally returned will be dispatched to the Redux store at the end of execution of all the middleware (for that particular dispatch_spawn() call).

  1. AsyncMiddlewareSpawns<State, Action> - Your middleware has to use tokio::spawn to run async blocks in a separate task and return a JoinHandle that contains an Option<Action>. The fire_and_forget! macro is provided so that you can easily spawn parallel blocks of code in your async functions. These are added to the store via a call to add_middleware_spawns(...).

  2. AsyncMiddleware<State, Action> - These will all be run together concurrently using futures::join_all(). They are added to the store via a call to add_middleware(...).

Subscribers

The subscribers will be run asynchronously via Tokio tasks. They are all run together concurrently but not in parallel, using futures::join_all().

Reducers

The reducer functions are also async functions that run in the Tokio runtime. They're run one after another, in the order in which they're added.

⚑ Any functions or blocks that you write which use the Redux library will have to be marked async as well. And you will have to spawn the Tokio runtime by using the #[tokio::main] macro. If you use the default runtime, then Tokio will use multiple threads and its task-stealing implementation to give you parallel and concurrent behavior. You can also use the single-threaded runtime; it's really up to you.
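
For example, here's a minimal sketch of an entry point (assuming the `State`, `Action`, and `MyReducer` types defined in the Examples section below):

```rust
use r3bl_rs_utils::redux::Store;

// Minimal sketch; `State`, `Action`, and `MyReducer` refer to the example types
// defined in the Examples section below.
#[tokio::main] // Provides the (multi-threaded) Tokio runtime that the store needs.
async fn main() {
  let mut store = Store::<State, Action>::default();
  store.add_reducer(MyReducer::new()).await;
  store.dispatch_spawn(Action::Add(1, 2)).await;
}
```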

  1. To create middleware you have to implement the AsyncMiddleware<S,A> trait or AsyncMiddlewareSpawns<S,A> trait. Please read the AsyncMiddleware docs for examples of both. The run() method is passed two arguments: the State and the Action.

    1. For AsyncMiddlewareSpawns<S,A>, in your run() implementation you have to surround your code with the fire_and_forget! macro, which returns a JoinHandle<Option<A>>.
    2. For AsyncMiddleware<S,A>, in your run() implementation you just have to return an Option<A>.
  2. To create reducers you have to implement the AsyncReducer trait.

  3. To create subscribers you have to implement the AsyncSubscriber trait.

Summary

Here's the gist of how to make & use one of these:

  1. Create a struct. Make it derive Default. Or you can add your own properties / fields to this struct, and construct it yourself, or even provide a constructor function.
  2. Implement the AsyncMiddleware, AsyncMiddlewareSpawns, AsyncReducer, or AsyncSubscriber trait on your struct.
  3. Register this struct w/ the store using one of the add_middleware(), add_middleware_spawns(), add_reducer(), or add_subscriber() methods. You can register as many of these as you like.

Examples

πŸ’‘ There are lots of examples in the tests for this library and in this CLI application built using it.

Here's an example of how to use it. Let's start w/ the import statements.

```rust
/// Imports.
use async_trait::async_trait;
use r3bl_rs_utils::redux::{
  AsyncMiddlewareSpawns, AsyncMiddleware, AsyncReducer,
  AsyncSubscriber, Store, StoreStateMachine,
};
use std::sync::{Arc, Mutex};
use tokio::sync::RwLock;
use tokio::task::JoinHandle; // Needed for the `AsyncMiddlewareSpawns` example below.
```

  1. Make sure to have the tokio and async-trait crates installed as well as r3bl_rs_utils in your Cargo.toml file.
  2. Here's an example Cargo.toml.

Let's say we have the following action enum, and state struct.

```rust
/// Action enum.
#[derive(Debug, PartialEq, Eq, Hash, Clone)]
pub enum Action {
  Add(i32, i32),
  AddPop(i32),
  Clear,
  MiddlewareCreateClearAction,
  Noop,
}

impl Default for Action {
  fn default() -> Self {
    Action::Noop
  }
}

/// State.
#[derive(Clone, Default, PartialEq, Debug, Hash)]
pub struct State {
  pub stack: Vec<i32>,
}
```

Here's an example of the reducer function.

```rust
/// Reducer function (pure).
#[derive(Default)]
struct MyReducer;

#[async_trait]
impl AsyncReducer<State, Action> for MyReducer {
  async fn run(
    &self,
    action: &Action,
    state: &State,
  ) -> State {
    match action {
      Action::Add(a, b) => {
        let sum = a + b;
        State { stack: vec![sum] }
      }
      Action::AddPop(a) => {
        let sum = a + state.stack[0];
        State { stack: vec![sum] }
      }
      Action::Clear => State { stack: vec![] },
      _ => state.clone(),
    }
  }
}
```

Here's an example of an async subscriber function (subscribers are run concurrently after an action is dispatched). The following example uses a lambda that captures a shared object. This is a pretty common pattern that you might encounter when creating subscribers that share state in your enclosing block or scope.

```rust
/// This shared object is used to collect results from the subscriber
/// function & test it later.
let shared_object = Arc::new(Mutex::new(Vec::<i32>::new()));

#[derive(Default)]
struct MySubscriber {
  pub shared_object_ref: Arc<Mutex<Vec<i32>>>,
}

#[async_trait]
impl AsyncSubscriber<State> for MySubscriber {
  async fn run(
    &self,
    state: State,
  ) {
    let mut stack = self
      .shared_object_ref
      .lock()
      .unwrap();
    if !state.stack.is_empty() {
      stack.push(state.stack[0]);
    }
  }
}

let my_subscriber = MySubscriber {
  shared_object_ref: shared_object.clone(),
};
```

Here are two types of async middleware functions. One that returns an action (which will get dispatched once this middleware returns), and another that doesn't return anything (like a logger middleware that just dumps the current action to the console). Note that both these functions share the shared_object reference from above.

```rust
/// This shared object is used to collect results from the middleware
/// function & test it later.
#[derive(Default)]
struct MwExampleNoSpawn {
  pub shared_object_ref: Arc<Mutex<Vec<i32>>>,
}

#[async_trait]
impl AsyncMiddleware<State, Action> for MwExampleNoSpawn {
  async fn run(
    &self,
    action: Action,
    _store_ref: Arc<RwLock<StoreStateMachine<State, Action>>>,
  ) -> Option<Action> {
    let mut stack = self
      .shared_object_ref
      .lock()
      .unwrap();
    match action {
      Action::MwExampleNoSpawnAdd(_, _) => stack.push(-1),
      Action::MwExampleNoSpawnAddPop(_) => stack.push(-2),
      Action::MwExampleNoSpawnClear => stack.push(-3),
      _ => {}
    }
    None
  }
}

let mw_example_no_spawn = MwExampleNoSpawn {
  shared_object_ref: shared_object.clone(),
};

/// This shared object is used to collect results from the middleware
/// function & test it later.
#[derive(Default)]
struct MwExampleSpawns {
  pub shared_object_ref: Arc<Mutex<Vec<i32>>>,
}

#[async_trait]
impl AsyncMiddlewareSpawns<State, Action> for MwExampleSpawns {
  async fn run(
    &self,
    action: Action,
    _store_ref: Arc<RwLock<StoreStateMachine<State, Action>>>,
  ) -> JoinHandle<Option<Action>> {
    fire_and_forget!({
      let mut stack = self
        .shared_object_ref
        .lock()
        .unwrap();
      match action {
        Action::MwExampleSpawnsModifySharedObjectResetState => {
          stack.push(-4);
          return Some(Action::Reset);
        }
        _ => {}
      }
      None
    })
  }
}

let mw_example_spawns = MwExampleSpawns {
  shared_object_ref: shared_object.clone(),
};
```

Here's how you can setup a store with the above reducer, middleware, and subscriber functions.

```rust
// Setup store.
let mut store = Store::<State, Action>::default();
store
  .add_reducer(MyReducer::new()) // Note the use of `::new()` here.
  .await
  .add_subscriber(Box::new(        // We aren't using `::new()` here
    my_subscriber,                 // because the struct has properties.
  ))
  .await
  .add_middleware_spawns(Box::new( // We aren't using `::new()` here
    mw_example_spawns,             // because the struct has properties.
  ))
  .await
  .add_middleware(Box::new(        // We aren't using `::new()` here
    mw_example_no_spawn,           // because the struct has properties.
  ))
  .await;
```

Finally, here's an example of how to dispatch an action in a test. You can dispatch actions in parallel using dispatch_spawn(), which is "fire and forget": the caller won't block or wait for dispatch_spawn() to return.

```rust
// Test reducer and subscriber by dispatching `Add`, `AddPop`, `Clear` actions in parallel.
store.dispatch_spawn(Action::Add(1, 2)).await;
assert_eq!(shared_object.lock().unwrap().pop(), Some(3));

store.dispatch_spawn(Action::AddPop(1)).await;
assert_eq!(shared_object.lock().unwrap().pop(), Some(4));

store.dispatch_spawn(Action::Clear).await;
assert_eq!(store.get_state().stack.len(), 0);
```

Macros

Declarative

There are quite a few declarative macros that you will find in the library. They tend to be used internally in the implementation of the library itself. Here are some that are actually externally exposed via #[macro_export].

log!

You can use this macro to dump log messages at 3 levels to a file. By default this file is named log.txt and is dumped in the current directory. Here's how you can use it. Please note that the macro returns a Result. A type alias called ResultCommon<T> is provided to save some typing; it is just shorthand for std::result::Result<T, Box<dyn Error>>. The log file itself is overwritten for each "session" in which you run your program.

```rust
use r3bl_rs_utils::{init_file_logger_once, log, ResultCommon};

fn run() -> ResultCommon<()> {
  log!(INFO, "This is a info message");
  log!(WARN, "This is a warning message");
  log!(ERROR, "This is a error message");
  Ok(())
}
```

Please check out the source here.

make_api_call_for!

This macro makes it easy to create simple HTTP GET requests using the reqwest crate. It generates an async function called make_request() that returns a ResultCommon<T> where T is the type of the response body. Here's an example.

```rust
use std::{error::Error, fmt::Display};
use r3bl_rs_utils::make_api_call_for;
use serde::{Deserialize, Serialize};

const ENDPOINT: &str = "https://api.namefake.com/english-united-states/female/";

make_api_call_for! {
  FakeContactData at ENDPOINT
}

#[derive(Serialize, Deserialize, Debug, Default)]
pub struct FakeContactData {
  pub name: String,
  pub phone_h: String,
  pub email_u: String,
  pub email_d: String,
  pub address: String,
}

let fake_data = fake_contact_data_api()
  .await
  .unwrap_or_else(|_| FakeContactData {
    name: "Foo Bar".to_string(),
    phone_h: "123-456-7890".to_string(),
    email_u: "foo".to_string(),
    email_d: "bar.com".to_string(),
    ..FakeContactData::default()
  });
```

You can find lots of examples here.

fire_and_forget!

This is a really simple wrapper around tokio::spawn() for the given block. It's just syntactic sugar. Here's an example of using it for a non-async block.

```rust
pub fn foo() {
  fire_and_forget!({ println!("Hello"); });
}
```

And, here's an example of using it for an async block.

```rust
pub fn foo() {
  fire_and_forget!(
    let fake_data = fake_contact_data_api()
      .await
      .unwrap_or_else(|_| FakeContactData {
        name: "Foo Bar".to_string(),
        phone_h: "123-456-7890".to_string(),
        email_u: "foo".to_string(),
        email_d: "bar.com".to_string(),
        ..FakeContactData::default()
      });
  );
}
```

debug!

This is a really simple macro to make it effortless to use the color console logger. It takes an identifier as an argument. It simply dumps an arrow symbol, followed by the identifier (stringified) along with the value that it contains (using the Debug formatter). All of the output is colorized for easy readability. You can use it like this.

```rust
let my_string = "Hello World!";
debug!(my_string);
```

with!

This is a macro that takes inspiration from the with scoping function in Kotlin. It just makes it easier to express a block of code that needs to run after an expression is evaluated and saved to a given variable. Here's an example.

```rust
with! {
  /* $eval */ LayoutProps {
    id: id.to_string(),
    dir,
    req_size: RequestedSize::new(width_pc, height_pc),
  },
  as /* $id */ it,
  run /* $code */ {
    match self.is_layout_stack_empty() {
      true => self.add_root_layout(it),
      false => self.add_normal_layout(it),
    }?;
  }
}
```

It does the following:

  1. Evaluates the $eval expression and assigns it to $id.
  2. Runs the $code block.

with_mut!

This macro is just like with! but it takes a mutable reference to the $id variable.
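
Here's a sketch of what a call might look like, assuming it uses the same $eval / $id / $code syntax as with! shown above (the Vec being built up is just an illustrative stand-in for your own mutable value):

```rust
// Illustrative sketch: evaluate a `Vec`, bind it mutably to `it`, and mutate it
// inside the block (same custom syntax as `with!` above).
with_mut! {
  Vec::<i32>::new(),
  as it,
  run {
    it.push(1);
    it.push(2);
    assert_eq!(it.len(), 2);
  }
}
```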

unwrap_option_or_run_fn_returning_err!

This macro can be useful when you are working w/ an expression that returns an Option, and if that Option is None you want to abort and return an error immediately. The idea is that you use this macro in a function that returns a Result<T>.

Here's an example to illustrate.

```rust
pub fn from(
  width_percent: u8,
  height_percent: u8,
) -> ResultCommon<RequestedSize> {
  let size_tuple = (width_percent, height_percent);
  let (width_pc, height_pc) = unwrap_option_or_run_fn_returning_err!(
    convert_to_percent(size_tuple),
    || LayoutError::new_err(LayoutErrorType::InvalidLayoutSizePercentage)
  );
  Ok(Self::new(width_pc, height_pc))
}
```

Procedural

All the procedural macros are organized across 3 crates: the public crate, an internal or core crate, and the proc macro crate.

#[derive(Builder)]

This derive macro makes it easy to generate builders when annotating a struct or enum. It generates a builder struct and has full support for generics. It can be used like this.

```rust
#[derive(Builder)]
struct Point<X, Y>
where
  X: std::fmt::Display + Clone,
  Y: std::fmt::Display + Clone,
{
  x: X,
  y: Y,
}

let my_pt: Point<i32, i32> = PointBuilder::new()
  .set_x(1 as i32)
  .set_y(2 as i32)
  .build();

assert_eq!(my_pt.x, 1);
assert_eq!(my_pt.y, 2);
```

make_struct_safe_to_share_and_mutate!

This function-like macro (with custom syntax) makes it easy to manage the shareability and interior mutability of a struct. We call this pattern the "manager" of "things".

πŸͺ„ You can read all about it here.

  1. This struct gets wrapped in a RwLock for thread safety.
  2. That is then wrapped inside an Arc so we can share it across threads.
  3. Additionally it works w/ Tokio so that it is totally async. It also fully supports generics and trait bounds w/ an optional where clause.

Here's a very simple usage:

```rust
make_struct_safe_to_share_and_mutate! {
  named MyMapManager<K, V>
  where K: Default + Send + Sync + 'static, V: Default + Send + Sync + 'static
  containing my_map
  of_type std::collections::HashMap<K, V>
}
```

Here's an async example.

```rust
#[tokio::test]
async fn test_custom_syntax_no_where_clause() {
  make_struct_safe_to_share_and_mutate! {
    named StringMap<K, V>
    // where is optional and is missing here.
    containing my_map
    of_type std::collections::HashMap<K, V>
  }

  let my_manager: StringMap<String, String> = StringMap::default();
  let locked_map = my_manager.my_map.read().await;
  assert_eq!(locked_map.len(), 0);
  drop(locked_map);
}
```

make_safe_async_fn_wrapper!

This function-like macro (with custom syntax) makes it easy to share functions and lambdas that are async, so that they are safe to share between threads and support either being invoked or spawned.

πŸͺ„ You can read all about how to write proc macros here.

  1. A struct is generated that wraps the given function or lambda in an Arc<RwLock<>> for thread safety and interior mutability.
  2. A get() method is generated which makes it possible to share this struct across threads.
  3. A from() method is generated which makes it easy to create this struct from a function or lambda.
  4. A spawn() method is generated which makes it possible to spawn the enclosed function or lambda asynchronously using Tokio.
  5. An invoke() method is generated which makes it possible to invoke the enclosed function or lambda synchronously.

Here's an example of how to use this macro.

```rust
use r3bl_rs_utils::make_safe_async_fn_wrapper;

make_safe_async_fn_wrapper! {
  named SafeMiddlewareFnWrapper<A>
  containing fn_mut
  of_type FnMut(A) -> Option<A>
}
```

Here's another example.

```rust
use r3bl_rs_utils::make_safe_async_fn_wrapper;

make_safe_async_fn_wrapper! {
  named SafeSubscriberFnWrapper<S>
  containing fn_mut
  of_type FnMut(S) -> ()
}
```
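
Once a wrapper has been generated, the methods described in the list above might be used along these lines. This is a hypothetical sketch: the method names come from that list, but their exact signatures are assumptions.

```rust
// Hypothetical usage sketch. `from()`, `invoke()`, and `spawn()` are the
// generated methods described above; their exact signatures are assumptions.
let subscriber_fn = SafeSubscriberFnWrapper::from(|state: State| {
  println!("state: {:?}", state);
});

// Invoke the enclosed lambda synchronously.
subscriber_fn.invoke(State::default());

// Or spawn it asynchronously on the Tokio runtime and await its completion.
subscriber_fn.spawn(State::default()).await.unwrap();
```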

tree_memory_arena (non-binary tree data structure)

[Arena] and [MTArena] types are the implementation of a non-binary tree data structure that is inspired by memory arenas.

Here's a simple example of how to use the [Arena] type:

```rust
use r3bl_rs_utils::{
  tree_memory_arena::{Arena, HasId, MTArena, ResultUidList},
  utils::{style_primary, style_prompt},
};

let mut arena = Arena::<usize>::new();
let node_1_value = 42 as usize;
let node_1_id = arena.add_new_node(node_1_value, None);
println!("{} {:#?}", style_primary("node_1_id"), node_1_id);
assert_eq!(node_1_id, 0);
```

Here's how you get weak and strong references from the arena (tree), and tree walk:

```rust
use r3bl_rs_utils::{
  tree_memory_arena::{Arena, HasId, MTArena, ResultUidList},
  utils::{style_primary, style_prompt},
};

let mut arena = Arena::<usize>::new();
let node_1_value = 42 as usize;
let node_1_id = arena.add_new_node(node_1_value, None);

{
  assert!(arena.get_node_arc(&node_1_id).is_some());
  let node_1_ref = dbg!(arena.get_node_arc(&node_1_id).unwrap());
  let node_1_ref_weak = arena.get_node_arc_weak(&node_1_id).unwrap();
  assert_eq!(node_1_ref.read().unwrap().payload, node_1_value);
  assert_eq!(
    node_1_ref_weak.upgrade().unwrap().read().unwrap().payload,
    42
  );
}

{
  let node_id_dne = 200 as usize;
  assert!(arena.get_node_arc(&node_id_dne).is_none());
}

{
  let node_1_id = 0 as usize;
  let node_list = dbg!(arena.tree_walk_dfs(&node_1_id).unwrap());
  assert_eq!(node_list.len(), 1);
  assert_eq!(node_list, vec![0]);
}
```

Here's an example of how to use the [MTArena] type:

```rust
use std::{
  sync::Arc,
  thread::{self, JoinHandle},
};

use r3bl_rs_utils::{
  tree_memory_arena::{Arena, HasId, MTArena, ResultUidList},
  utils::{style_primary, style_prompt},
};

type ThreadResult = Vec<usize>;
type Handles = Vec<JoinHandle<ThreadResult>>;

let mut handles: Handles = Vec::new();
let arena = MTArena::<String>::new();

// Thread 1 - add root. Spawn and wait (since the 2 threads below need the root).
{
  let arena_arc = arena.get_arena_arc();
  let thread = thread::spawn(move || {
    let mut arena_write = arena_arc.write().unwrap();
    let root = arena_write.add_new_node("foo".to_string(), None);
    vec![root]
  });
  thread.join().unwrap();
}

// Perform tree walking in parallel. Note the lambda does capture many enclosing variable context.
{
  let arena_arc = arena.get_arena_arc();
  let fn_arc = Arc::new(move |uid, payload| {
    println!(
      "{} {} {} Arena weak_count:{} strong_count:{}",
      style_primary("walker_fn - closure"),
      uid,
      payload,
      Arc::weak_count(&arena_arc),
      Arc::strong_count(&arena_arc)
    );
  });

  // Walk tree w/ a new thread using arc to lambda.
  {
    let thread_handle: JoinHandle<ResultUidList> =
      arena.tree_walk_parallel(&0, fn_arc.clone());

    let result_node_list = thread_handle.join().unwrap();
    println!("{:#?}", result_node_list);
  }

  // Walk tree w/ a new thread using arc to lambda.
  {
    let thread_handle: JoinHandle<ResultUidList> =
      arena.tree_walk_parallel(&1, fn_arc.clone());

    let result_node_list = thread_handle.join().unwrap();
    println!("{:#?}", result_node_list);
  }
}
```

πŸ“œ There are more complex ways of using [Arena] and [MTArena]. Please look at these extensive integration tests that put them thru their paces here.

utils

LazyMemoValues

This struct allows users to create a lazy hash map. A function must be provided that computes the values when they are first requested. These values are cached for the lifetime of this struct. Here's an example.

```rust
use std::sync::atomic::{AtomicUsize, Ordering::SeqCst};
use r3bl_rs_utils::utils::LazyMemoValues;

// These are copied in the closure below.
let arc_atomic_count = AtomicUsize::new(0);
let mut a_variable = 123;
let mut a_flag = false;

let mut generate_value_fn = LazyMemoValues::new(|it| {
  arc_atomic_count.fetch_add(1, SeqCst);
  a_variable = 12;
  a_flag = true;
  a_variable + it
});

assert_eq!(arc_atomic_count.load(SeqCst), 0);
assert_eq!(generate_value_fn.get_ref(&1), &13);
assert_eq!(arc_atomic_count.load(SeqCst), 1);
assert_eq!(generate_value_fn.get_ref(&1), &13); // Won't regenerate the value.
assert_eq!(arc_atomic_count.load(SeqCst), 1); // Doesn't change.
```

tty

This module contains a set of functions to make it easier to work with terminals.

The following is an example of how to use is_stdin_piped():

```rust
fn run(args: Vec<String>) -> Result<(), Box<dyn Error>> {
  match is_stdin_piped() {
    true => piped_grep(PipedGrepOptionsBuilder::parse(args)?)?,
    false => grep(GrepOptionsBuilder::parse(args)?)?,
  }
  Ok(())
}
```

The following is an example of how to use readline():

```rust
use r3bl_rs_utils::utils::{
  print_header, readline, style_dimmed, style_error, style_primary, style_prompt,
};

fn make_a_guess() -> String {
  println!("{}", Blue.paint("Please input your guess."));
  let (bytes_read, guess) = readline();
  println!(
    "{} {}, {} {}",
    style_dimmed("#bytes read:"),
    style_primary(&bytes_read.to_string()),
    style_dimmed("You guessed:"),
    style_primary(&guess)
  );
  guess
}
```

A full list of the functions available in this module can be found in the module docs.

safe_unwrap

Functions that make it easy to unwrap a value safely. These functions are provided to improve the ergonomics of using wrapped values in Rust. Examples of wrapped values are Arc<RwLock<T>> and Option<T>. These functions are inspired by Kotlin scope functions and the expression-based TypeScript library r3bl-ts-utils.

Here are some examples.

```rust
use r3bl_rs_utils::utils::{
  call_if_some, unwrap_arc_read_lock_and_call, unwrap_arc_write_lock_and_call,
  with_mut,
};
use r3bl_rs_utils::utils::{ReadGuarded, WriteGuarded};
use r3bl_rs_utils::{
  arena_types::HasId, ArenaMap, FilterFn, NodeRef, ResultUidList, WeakNodeRef,
};

if let Some(parent_id) = parent_id_opt {
  let parent_node_arc_opt = self.get_node_arc(parent_id);
  call_if_some(&parent_node_arc_opt, &|parent_node_arc| {
    unwrap_arc_write_lock_and_call(&parent_node_arc, &mut |parent_node| {
      parent_node.children.push(new_node_id);
    });
  });
}
```

The full list of functions that are provided, along with the type aliases provided for better readability, can also be found in the module docs.

color_text

Helper methods for ANSI colorized text, built on top of https://github.com/ogham/rust-ansi-term. Here's an example.

```rust
use r3bl_rs_utils::utils::{
  print_header, readline, style_dimmed, style_error, style_primary, style_prompt,
};

fn make_a_guess() -> String {
  println!("{}", Blue.paint("Please input your guess."));
  let (bytes_read, guess) = readline();
  println!(
    "{} {}, {} {}",
    style_dimmed("#bytes read:"),
    style_primary(&bytes_read.to_string()),
    style_dimmed("You guessed:"),
    style_primary(&guess)
  );
  guess
}
```

Please see the module docs for the full list of functions available in this module.

tui (experimental)

🚧 WIP - This is an experimental module that isn't ready yet. It is the first step towards creating a TUI library that can be used to create sophisticated TUI applications. This is similar to the Ink library for Node.js & TypeScript (which uses React and Yoga). Or it is kind of like tui, built atop crossterm (and not termion).

Stability

πŸ§‘β€πŸ”¬ This library is in early development.

  1. There are extensive integration tests for code that is production ready.
  2. Everything else is marked experimental in the source.

Please report any issues to the issue tracker. And if you have any feature requests, feel free to add them there too πŸ‘.

Here are some notes on using experimental / unstable features in Tokio.

```toml
# The rustflags needs to be set since we are using unstable features
# in Tokio.
# - https://github.com/tokio-rs/console
# - https://docs.rs/tokio/latest/tokio/#unstable-features
#
# This is how you set rustflags for cargo build defaults.
# - https://github.com/rust-lang/rust-analyzer/issues/5828

[target.x86_64-unknown-linux-gnu]
rustflags = [
  "--cfg", "tokio_unstable",
]
```