This crate provides lots of useful functionality to help you build TUI (text user interface) apps, along w/ general niceties & ergonomics that all Rustaceans 🦀 can enjoy 🎉: from `stdout` and `stderr` output helpers, to less noisy `Result` and `Error` types. 🦜 To learn more about this library, please read how it was built (on developerlife.com):
- https://developerlife.com/2022/02/24/rust-non-binary-tree/
- https://developerlife.com/2022/03/12/rust-redux/
- https://developerlife.com/2022/03/30/rust-proc-macro/
🦀 You can also find all the Rust-related content on developerlife.com here.
- 🤷‍♂️ Fun fact: before we built this crate, we built a library that is similar in spirit for TypeScript (for TUI apps on Node.js) called r3bl-ts-utils. We have since switched to Rust 🦀🎉.
Please add the following to your `Cargo.toml` file:

```toml
[dependencies]
r3bl_rs_utils = "0.7.40"
```
You can build fully async TUI apps with a modern API that brings the best of reactive & unidirectional data flow architecture from frontend web development (React, Redux, CSS, flexbox) to Rust and TUI apps. And since this is using Tokio you get the advantages of concurrency and parallelism built-in. No more blocking on the main thread for user input, for async middleware, or even rendering 🎉.
This framework is loosely coupled and strongly coherent, meaning that you can pick and choose whatever pieces you would like to use w/out having the cognitive load of having to grok all the things in the codebase. It's more like a collection of mostly independent modules that work well w/ each other, but know very little about each other.
Here are some framework highlights:
There is a clear separation of concerns in this module. To illustrate what goes where, and how things work, let's look at an example that puts the main event loop front and center and deals w/ how the system handles an input event (key press or mouse) when the app is started (eg: via `cargo run`).

```text
input event → [TerminalWindow]
     ↑      ↓                 [ComponentRegistry] creates
     ┊   [TWApp] ───────────■ [Component]s at 1st render
     ┊      │
     ┊      │        ┌──────■ id=1 has focus
     ┊      │        │
     ┊      ├→ [Component] id=1 ───┐
     ┊      ├→ [Component] id=2    │
     ┊      └→ [Component] id=3    │
     ┊                             │
  default                          │
  handler ←────────────────────────┘
```
Let's trace the journey through the diagram when an input event is generated by the user (eg: a key press, or mouse event). When the app is started via `cargo run`, it sets up a main loop, lays out all 3 components (sizes and positions), and then paints them. Then it asynchronously listens for input events (no threads are blocked). When the user types something, this input is processed by the main loop of [TerminalWindow], which routes it to the [Component] that currently has focus (`id=1` in the diagram above).

Now that we have seen this whirlwind overview of the life of an input event, let's look at the details in each of the sections below.
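The routing described above can be sketched with plain `std` types. This is a hypothetical, simplified model (the `Component` and `route_event` names here are illustrative, not the crate's actual `ComponentRegistry` API): events go to the focused component first, and fall through to a default handler otherwise.

```rust
use std::collections::HashMap;

// Each component may handle an event; returns true if it consumed it.
struct Component {
    id: usize,
}

impl Component {
    fn handle_event(&self, event: &str) -> bool {
        // Pretend only component id=1 knows how to handle "keypress".
        self.id == 1 && event == "keypress"
    }
}

// Route an event to the focused component; fall back to a default handler.
fn route_event(
    components: &HashMap<usize, Component>,
    focused_id: usize,
    event: &str,
) -> String {
    match components.get(&focused_id) {
        Some(c) if c.handle_event(event) => format!("handled by component id={}", focused_id),
        _ => "handled by default handler".to_string(),
    }
}

fn main() {
    let mut registry = HashMap::new();
    for id in 1..=3 {
        registry.insert(id, Component { id });
    }
    println!("{}", route_event(&registry, 1, "keypress"));
    println!("{}", route_event(&registry, 2, "keypress"));
}
```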
The main building blocks of a TUI app are:
Inside your [TWApp], if you want to use flexbox-like layout and CSS-like styling, you can think of composing your code in the following way:
Typically your [TWApp] will look like this:
```rust
/// Async trait object that implements the [TWApp] trait.
pub struct AppWithLayout {
  pub component_registry: ComponentRegistry<State, Action>,
}
```
As we look at [Component] & [TWApp] more closely, we will find a curious thing: the [ComponentRegistry] (which is managed by the [TWApp]). The reason this exists is input event routing. Input events are routed to the [Component] that currently has focus.
The [HasFocus] struct takes care of this. It provides 2 things:

1. The `id` of a [TWBox] / [Component] that currently has focus.
2. A map of `id` to a position. This is used to represent a cursor (whatever that means to your app & component). This cursor is maintained for each `id`, which allows a separate cursor for each [Component] that has focus. This is needed to build apps like editors and viewers that maintain a cursor position between focus switches.

Another thing to keep in mind is that the [TWApp] and [TerminalWindow] are persistent between re-renders. The Redux store is also persistent between re-renders.
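The focus-plus-cursor bookkeeping can be modeled with a small stdlib-only sketch. The `FocusManager` name and shape here are hypothetical illustrations of the idea, not the crate's actual `HasFocus` API:

```rust
use std::collections::HashMap;

// Tracks which id has focus, plus a per-id cursor position that
// survives focus switches (as an editor or viewer would need).
#[derive(Default)]
struct FocusManager {
    focused_id: Option<usize>,
    cursor_positions: HashMap<usize, (usize, usize)>, // id -> (col, row)
}

impl FocusManager {
    fn set_focus(&mut self, id: usize) {
        self.focused_id = Some(id);
    }
    fn set_cursor(&mut self, id: usize, pos: (usize, usize)) {
        self.cursor_positions.insert(id, pos);
    }
    fn cursor_for(&self, id: usize) -> (usize, usize) {
        *self.cursor_positions.get(&id).unwrap_or(&(0, 0))
    }
}

fn main() {
    let mut focus = FocusManager::default();
    focus.set_focus(1);
    focus.set_cursor(1, (5, 2));
    focus.set_focus(2); // switch focus away...
    focus.set_focus(1); // ...and back: cursor for id=1 is preserved.
    assert_eq!(focus.cursor_for(1), (5, 2));
}
```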
[TerminalWindow] gives [Component] first dibs when it comes to handling input events. If it punts handling this event, it will be handled by the default input event handler. And if nothing there matches this event, then it is simply dropped.
If you use Redux for state management, then you will create a [crate::redux] [crate::Store] that is passed into the [TerminalWindow]. Here's an example of this.
```rust
use crossterm::event::*;
use r3bl_rs_utils::*;
use super::*;

const DEBUG: bool = true;

pub async fn run_app() -> CommonResult<()> {
  throws!({
    if DEBUG {
      try_to_set_log_level(log::LevelFilter::Trace)?;
    } else {
      try_to_set_log_level(log::LevelFilter::Off)?;
    }

    // Create store.
    let store = create_store().await;

    // Create an App (renders & responds to user input).
    let shared_app = AppWithLayout::new_shared();

    // Exit if these keys are pressed.
    let exit_keys: Vec<KeyEvent> = vec![KeyEvent {
      code: KeyCode::Char('q'),
      modifiers: KeyModifiers::CONTROL,
    }];

    // Create a window.
    TerminalWindow::main_event_loop(store, shared_app, exit_keys).await?
  });
}

async fn create_store() -> Store<State, Action> { /* elided */ }
```
Unicode is supported (to an extent). There are some caveats. The [crate::UnicodeStringExt] trait has lots of great information on graphemes and what is supported and what is not.
An implementation of [crate::lolcat::cat] w/ a color wheel is provided.
`Store` is thread safe and asynchronous (using Tokio). You have to implement `async` traits in order to use it, by defining your own reducer, subscriber, and middleware trait objects. You also have to supply the Tokio runtime; this library will not create its own runtime. However, for best results, it is best to use the multithreaded Tokio runtime.
Once you setup your Redux store w/ your reducer, subscriber, and middleware, you can use it by calling `spawn_dispatch_action!(store, action)`. This kicks off a parallel Tokio task that will run the middleware functions, reducer functions, and finally the subscriber functions. So this will not block the thread of whatever code you call this from. The `spawn_dispatch_action!()` macro itself is not `async`. So you can call it from non-`async` code; however, you still have to provide a Tokio executor / runtime, without which you will get a panic when `spawn_dispatch_action!()` is called.

Your middleware (`async` trait implementations) will be run concurrently or in parallel via Tokio tasks. You get to choose which `async` trait to implement to do one or the other. And regardless of which kind you implement, the `Action` that is optionally returned will be dispatched to the Redux store at the end of execution of all the middlewares (for that particular `spawn_dispatch_action!()` call).
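The fire-and-forget dispatch idea can be illustrated without Tokio, using a plain OS thread and a channel. This is a conceptual sketch only (the real store uses Tokio tasks and `async` traits; `spawn_dispatch` here is a hypothetical stand-in):

```rust
use std::sync::mpsc;
use std::thread;

// Dispatch an "action" on a background thread and return immediately;
// the resulting state is delivered via a channel, so the caller never blocks.
fn spawn_dispatch(action: i32, tx: mpsc::Sender<i32>) {
    thread::spawn(move || {
        let new_state = action * 2; // stand-in for middleware + reducer work
        tx.send(new_state).unwrap();
    });
}

fn main() {
    let (tx, rx) = mpsc::channel();
    spawn_dispatch(21, tx); // returns immediately, does not block
    // A subscriber-like consumer picks up the result later.
    assert_eq!(rx.recv().unwrap(), 42);
}
```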
- `AsyncMiddlewareSpawns<State, Action>` - Your middleware has to use `tokio::spawn` to run `async` blocks in a separate thread and return a `JoinHandle` that contains an `Option<Action>`. A macro `fire_and_forget!` is provided so that you can easily spawn parallel blocks of code in your `async` functions. These are added to the store via a call to `add_middleware_spawns(...)`.
- `AsyncMiddleware<State, Action>` - These will all be run together concurrently using `futures::join_all()`. These are added to the store via a call to `add_middleware(...)`.
The subscribers will be run asynchronously via Tokio tasks. They are all run together concurrently but not in parallel, using `futures::join_all()`.
The reducer functions are also `async` functions that are run in the Tokio runtime. They're run one after another in the order in which they're added.
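The in-order reducer pipeline is essentially a fold over the registered reducer functions. Here is a stdlib-only sketch of that idea (plain `fn` pointers and an `i32` state stand in for the crate's `async` trait objects and real `State`):

```rust
// Reducers run one after another, each receiving the state produced
// by the previous one, in registration order.
type Reducer = fn(i32, &str) -> i32;

fn dispatch(reducers: &[Reducer], state: i32, action: &str) -> i32 {
    reducers.iter().fold(state, |s, r| r(s, action))
}

fn add_one(state: i32, action: &str) -> i32 {
    if action == "bump" { state + 1 } else { state }
}

fn double(state: i32, action: &str) -> i32 {
    if action == "bump" { state * 2 } else { state }
}

fn main() {
    // Order matters: (0 + 1) * 2 = 2, whereas 0 * 2 + 1 = 1.
    assert_eq!(dispatch(&[add_one, double], 0, "bump"), 2);
    assert_eq!(dispatch(&[double, add_one], 0, "bump"), 1);
}
```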
⚡ Any functions or blocks that you write which use the Redux library will have to be marked `async` as well. And you will have to spawn the Tokio runtime by using the `#[tokio::main]` macro. If you use the default runtime, then Tokio will use multiple threads and its task-stealing implementation to give you parallel and concurrent behavior. You can also use the single-threaded runtime; it's really up to you.
To create middleware you have to implement the `AsyncMiddleware<S, A>` trait or the `AsyncMiddlewareSpawns<S, A>` trait. Please read the `AsyncMiddleware` docs for examples of both. The `run()` method is passed two arguments: the `State` and the `Action`.

1. For `AsyncMiddlewareSpawns<S, A>`, in your `run()` implementation you have to use the `fire_and_forget!` macro to surround your code, and this will return a `JoinHandle<Option<A>>`.
2. For `AsyncMiddleware<S, A>`, in your `run()` implementation you just have to return an `Option<A>`.

To create reducers you have to implement the `AsyncReducer` trait.

- These are pure functions that simply return a new `State` object.
- The `run()` method will be passed two arguments: a ref to `Action` and a ref to `State`.

To create subscribers you have to implement the `AsyncSubscriber` trait.

- The `run()` method will be passed a `State` object as an argument.
- It returns nothing `()`.

Here's the gist of how to make & use one of these:
1. Create a struct and make it derive `Default`. Or you can add your own properties / fields to this struct, and construct it yourself, or even provide a constructor function.
   - A default constructor function `new()` is provided for you by the trait.
2. Implement the `AsyncMiddleware`, `AsyncMiddlewareSpawns`, `AsyncReducer`, or `AsyncSubscriber` trait on your struct.
3. Register this struct w/ the store using one of the `add_middleware()`, `add_middleware_spawns()`, `add_reducer()`, or `add_subscriber()` methods. You can register as many of these as you like.
   - If your struct has no properties, you can just use the default `::new()` method to create an instance and pass that to the `add_???()` methods.
   - If your struct has properties, construct it yourself and pass it to the `add_???()` methods wrapped like this: `Box::new($YOUR_STRUCT)`.
.💡 There are lots of examples in the tests for this library and in this CLI application built using it.
Here's an example of how to use it. Let's start w/ the import statements.
```rust
/// Imports.
use async_trait::async_trait;
use r3bl_rs_utils::redux::{
  AsyncMiddlewareSpawns, AsyncMiddleware, AsyncReducer,
  AsyncSubscriber, Store, StoreStateMachine,
};
use std::sync::{Arc, Mutex};
use tokio::sync::RwLock;
```
- Make sure to have the `tokio` and `async-trait` crates installed, as well as `r3bl_rs_utils`, in your `Cargo.toml` file.
- Here's an example `Cargo.toml`.
Let's say we have the following action enum, and state struct.
```rust
/// Action enum.
pub enum Action {
  Add(i32, i32),
  AddPop(i32),
  Clear,
  MiddlewareCreateClearAction,
  Noop,
}

impl Default for Action {
  fn default() -> Self {
    Action::Noop
  }
}

/// State.
pub struct State {
  pub stack: Vec<i32>,
}
```
Here's an example of the reducer function.
```rust
/// Reducer function (pure).
struct MyReducer;

#[async_trait]
impl AsyncReducer<State, Action> for MyReducer {
  async fn run(&self, action: &Action, state: &State) -> State {
    // Match on `action` and return a new `State` (eg: for
    // `Action::Add(a, b)`, push `a + b` onto a copy of the stack).
    /* elided */
  }
}
```
Here's an example of an async subscriber function (which are run in parallel after an action is dispatched). The following example uses a lambda that captures a shared object. This is a pretty common pattern that you might encounter when creating subscribers that share state in your enclosing block or scope.
```rust
/// This shared object is used to collect results from the subscriber
/// function & test it later.
let shared_object = Arc::new(Mutex::new(Vec::<i32>::new()));

struct MySubscriber {
  pub shared_object_ref: Arc<Mutex<Vec<i32>>>,
}

#[async_trait]
impl AsyncSubscriber<State> for MySubscriber {
  // `run()` is passed the `State`; it records results into the
  // shared object. Body elided.
}

let my_subscriber = MySubscriber {
  shared_object_ref: shared_object.clone(),
};
```
Here are two types of async middleware functions. One that returns an action (which will get dispatched once this middleware returns), and another that doesn't return anything (like a logger middleware that just dumps the current action to the console). Note that both these functions share the `shared_object` reference from above.
```rust
/// This shared object is used to collect results from the middleware
/// function & test it later.
struct MwExampleNoSpawn {
  pub shared_object_ref: Arc<Mutex<Vec<i32>>>,
}

#[async_trait]
impl AsyncMiddleware<State, Action> for MwExampleNoSpawn {
  // `run()` is passed the `State` and `Action`; it returns an
  // `Option<Action>`. Body elided.
}

let mw_example_no_spawn = MwExampleNoSpawn {
  shared_object_ref: shared_object.clone(),
};

/// This shared object is used to collect results from the middleware
/// function & test it later.
struct MwExampleSpawns {
  pub shared_object_ref: Arc<Mutex<Vec<i32>>>,
}

#[async_trait]
impl AsyncMiddlewareSpawns<State, Action> for MwExampleSpawns {
  // `run()` uses `fire_and_forget!` and returns a
  // `JoinHandle<Option<Action>>`. Body elided.
}

let mw_example_spawns = MwExampleSpawns {
  shared_object_ref: shared_object.clone(),
};
```
Here's how you can setup a store with the above reducer, middleware, and subscriber functions.
```rust
// Setup store.
let mut store = Store::<State, Action>::default();
store
  .add_reducer(MyReducer::new()) // Note the use of `::new()` here.
  .await
  .add_subscriber(Box::new(        // We aren't using `::new()` here
    my_subscriber,                 // because the struct has properties.
  ))
  .await
  .add_middleware_spawns(Box::new( // We aren't using `::new()` here
    mw_example_spawns,             // because the struct has properties.
  ))
  .await
  .add_middleware(Box::new(        // We aren't using `::new()` here
    mw_example_no_spawn,           // because the struct has properties.
  ))
  .await;
```
Finally, here's an example of how to dispatch an action in a test. You can dispatch actions in parallel using `spawn_dispatch_action!()`, which is "fire and forget", meaning that the caller won't block or wait for the `spawn_dispatch_action!()` to return.

```rust
// Test reducer and subscriber by dispatching `Add`, `AddPop`, and
// `Clear` actions in parallel.
spawn_dispatch_action!( store, Action::Add(1, 2) );
assert_eq!(shared_object.lock().unwrap().pop(), Some(3));

spawn_dispatch_action!( store, Action::AddPop(1) );
assert_eq!(shared_object.lock().unwrap().pop(), Some(4));

spawn_dispatch_action!( store, Action::Clear );
assert_eq!(store.get_state().stack.len(), 0);
```
There are quite a few declarative macros that you will find in the library. They tend to be used internally in the implementation of the library itself. Here are some that are actually externally exposed via `#[macro_export]`.
Similar to [`assert_eq!`] but automatically prints the left and right hand side variables if the assertion fails. Useful for debugging tests, since cargo would just print out the left and right values w/out providing information on what variables were being compared.
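Here is a sketch of how such a macro can be written with `macro_rules!` (a simplified stand-in, not the crate's actual `assert_eq2!` implementation):

```rust
// On failure, panic with both the stringified expressions and their
// values, so you can see *which* variables were compared.
macro_rules! assert_eq2 {
    ($left:expr, $right:expr) => {{
        let (l, r) = (&$left, &$right);
        if l != r {
            panic!(
                "assertion failed: `{}` == `{}`\n  left: {:?}\n right: {:?}",
                stringify!($left),
                stringify!($right),
                l,
                r
            );
        }
    }};
}

fn main() {
    let x = 2 + 2;
    assert_eq2!(x, 4); // passes silently
}
```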
Wrap the given `block` or `stmt` so that it returns a `Result<()>`. It is just syntactic sugar that helps avoid having to write `Ok(())` repeatedly at the end of each block. Here's an example.
```rust
fn test_simple_2_col_layout() -> CommonResult<()> {
  throws! {
    match input_event {
      TWInputEvent::DisplayableKeypress(character) => {
        println_raw!(character);
      }
      _ => todo!()
    }
  }
}
```
Here's another example.
```rust
fn test_simple_2_col_layout() -> CommonResult<()> {
  throws!({
    let mut canvas = Canvas::default();
    canvas.stylesheet = create_stylesheet()?;
    canvas.canvas_start(
      CanvasPropsBuilder::new()
        .set_pos((0, 0).into())
        .set_size((500, 500).into())
        .build(),
    )?;
    layout_container(&mut canvas)?;
    canvas.canvas_end()?;
  });
}
```
This is very similar to `throws!` but it also returns the result of the block.
```rust
fn test_simple_2_col_layout() -> CommonResult<CommandQueue> {
  throws_with_return!({
    println!("⛵ Draw -> draw: {}\r", state);
    CommandQueue::default()
  });
}
```
You can use this macro to dump log messages at various levels to a file. By default this file is named `log.txt` and is dumped in the current directory. Here's how you can use it. Please note that the macro returns a `Result`. A type alias is provided to save some typing called `CommonResult<T>`, which is just shorthand for `std::result::Result<T, Box<dyn Error>>`. The log file itself is overwritten for each "session" that you run your program.
```rust
use r3bl_rs_utils::{init_file_logger_once, log, CommonResult};

fn run() -> CommonResult<()> {
  let msg = "foo";
  let msg_2 = "bar";

  log!(INFO, "This is a info message");
  log!(INFO, target: "foo", "This is a info message");

  log!(WARN, "This is a warning message {}", msg);
  log!(WARN, target: "foo", "This is a warning message {}", msg);

  log!(ERROR, "This is a error message {} {}", msg, msg_2);
  log!(ERROR, target: "foo", "This is a error message {} {}", msg, msg_2);

  log!(DEBUG, "This is a debug message {} {}", msg, msg_2);
  log!(DEBUG, target: "foo", "This is a debug message {} {}", msg, msg_2);

  log!(TRACE, "This is a trace message {} {}", msg, msg_2);
  log!(TRACE, target: "foo", "This is a trace message {} {}", msg, msg_2);

  Ok(())
}
```
To change the default log file to whatever you choose, you can use the try_to_set_log_file_path()
function. If the logger hasn't yet been initialized, this function will set the log file path.
Otherwise it will return an error.
```rust
use r3bl_rs_utils::{try_to_set_log_file_path, CommonResult, CommonError};

fn run() {
  match try_to_set_log_file_path("new_log.txt") {
    Ok(path_set) => debug!(path_set),
    Err(error) => debug!(error),
  }
}
```
To change the default log level or to disable the log itself, you can use the `try_to_set_log_level()` function. If you want to override the default log level `LOG_LEVEL`, you can use this function. If the logger has already been initialized, then it will return an error.
```rust
use r3bl_rs_utils::{try_to_set_log_level, CommonResult, CommonError};
use log::LevelFilter;

fn run() {
  match try_to_set_log_level(LevelFilter::Trace) {
    Ok(level_set) => debug!(level_set),
    Err(error) => debug!(error),
  }
}
```
To disable logging, simply set the log level to `LevelFilter::Off`.

```rust
use r3bl_rs_utils::{try_to_set_log_level, CommonResult, CommonError};
use log::LevelFilter;

fn run() {
  match try_to_set_log_level(LevelFilter::Off) {
    Ok(level_set) => debug!(level_set),
    Err(error) => debug!(error),
  }
}
```
Please check out the source here.
This macro is very similar to the `log!` macro, except that it won't return any error if the underlying logging system fails. It will simply print a message to `stderr`. Here's an example.
```rust
pub fn log_state(&self, msg: &str) {
  log_no_err!(INFO, "{:?} -> {}", msg, self.to_string());
  log_no_err!(INFO, target: "foo", "{:?} -> {}", msg, self.to_string());
}
```
This is a really simple macro to make it effortless to debug into a log file. It outputs `DEBUG` level logs. It takes a single identifier as an argument, or any number of them. It simply dumps an arrow symbol, followed by the identifier `stringify!`'d, along with the value that it contains (using the `Debug` formatter). All of the output is colorized for easy readability. You can use it like this.
```rust
let my_string = "Hello World!";
debug_log_no_err!(my_string);
```
This is very similar to `debug_log_no_err!` except that it outputs `TRACE` level logs.
```rust
let my_string = "Hello World!";
trace_log_no_err!(my_string);
```
This macro makes it easy to create simple HTTP GET requests using the `reqwest` crate. It generates an `async` function called `make_request()` that returns a `CommonResult<T>`, where `T` is the type of the response body. Here's an example.
```rust
use std::{error::Error, fmt::Display};
use r3bl_rs_utils::make_api_call_for;
use serde::{Deserialize, Serialize};

const ENDPOINT: &str = "https://api.namefake.com/english-united-states/female/";

make_api_call_for! {
  FakeContactData at ENDPOINT
}

pub struct FakeContactData {
  pub name: String,
  pub phone_h: String,
  pub email_u: String,
  pub email_d: String,
  pub address: String,
}

let fake_data = fake_contact_data_api()
  .await
  .unwrap_or_else(|_| FakeContactData {
    name: "Foo Bar".to_string(),
    phone_h: "123-456-7890".to_string(),
    email_u: "foo".to_string(),
    email_d: "bar.com".to_string(),
    ..FakeContactData::default()
  });
```
You can find lots of examples here.
This is a really simple wrapper around `tokio::spawn()` for the given block. It's just syntactic sugar. Here's an example of using it for a non-`async` block.
```rust
pub fn foo() {
  fire_and_forget!(
    { println!("Hello"); }
  );
}
```
And here's an example of using it for an `async` block.
```rust
pub fn foo() {
  fire_and_forget!(
    let fake_data = fake_contact_data_api()
      .await
      .unwrap_or_else(|_| FakeContactData {
        name: "Foo Bar".to_string(),
        phone_h: "123-456-7890".to_string(),
        email_u: "foo".to_string(),
        email_d: "bar.com".to_string(),
        ..FakeContactData::default()
      });
  );
}
```
Syntactic sugar to run a conditional statement. Here's an example.
```rust
const DEBUG: bool = true;

call_if_true!(
  DEBUG,
  eprintln!(
    "{} {} {}\r",
    r3bl_rs_utils::style_error("▶"),
    r3bl_rs_utils::style_prompt($msg),
    r3bl_rs_utils::style_dimmed(&format!("{:#?}", $err))
  )
);
```
This is a really simple macro to make it effortless to use the color console logger. It takes a single identifier as an argument, or any number of them. It simply dumps an arrow symbol, followed by the identifier (stringified), along with the value that it contains (using the `Debug` formatter). All of the output is colorized for easy readability. You can use it like this.
```rust
let my_string = "Hello World!";
debug!(my_string);
let my_number = 42;
debug!(my_string, my_number);
```
You can also use it in these other forms for terminal raw mode output. This will dump the output to stderr.
```rust
if let Err(err) = $cmd {
  let msg = format!("❌ Failed to {}", stringify!($cmd));
  debug!(ERROR_RAW &msg, err);
}
```
This will dump the output to stdout.
```rust
let msg = format!("✅ Did the thing to {}", stringify!($name));
debug!(OK_RAW &msg);
```
This is a macro that takes inspiration from the `with` scoping function in Kotlin. It just makes it easier to express a block of code that needs to run after an expression is evaluated and saved to a given variable. Here's an example.
```rust
with! {
  /* $eval */ LayoutProps {
    id: id.to_string(),
    dir,
    req_size: RequestedSize::new(width_pc, height_pc),
  },
  as /* $id */ it,
  run /* $code */ {
    match self.is_layout_stack_empty() {
      true => self.add_root_layout(it),
      false => self.add_normal_layout(it),
    }?;
  }
}
```
It does the following:

1. Evaluates the `$eval` expression and assigns it to `$id`.
2. Runs the `$code` block.

This macro is just like `with!` but it takes a mutable reference to the `$id` variable. Here's a code example.
```rust
with_mut! {
  StyleFlag::BOLD_SET | StyleFlag::DIM_SET,
  as mask2,
  run {
    assert!(mask2.contains(StyleFlag::BOLD_SET));
    assert!(mask2.contains(StyleFlag::DIM_SET));
    assert!(!mask2.contains(StyleFlag::UNDERLINE_SET));
    assert!(!mask2.contains(StyleFlag::COLOR_FG_SET));
    assert!(!mask2.contains(StyleFlag::COLOR_BG_SET));
    assert!(!mask2.contains(StyleFlag::MARGIN_SET));
  }
}
```
This macro is just like `with_mut!` except that it returns the value of the `$code` block. Here's a code example.
```rust
let tw_queue = with_mut_returns! {
  ColumnRenderComponent { lolcat },
  as it,
  return {
    it.render_component(tw_surface.current_box()?, state, shared_store).await?
  }
};
```
This macro can be useful when you are working w/ an expression that returns an `Option`, and if that `Option` is `None` you want to abort and return an error immediately. The idea is that you use this macro in a function that returns a `Result<T>`.
Here's an example to illustrate.
```rust
pub fn from(
  width_percent: u8,
  height_percent: u8,
) -> CommonResult<RequestedSize> {
  let size_tuple = (width_percent, height_percent);
  let (width_pc, height_pc) = unwrap_option_or_run_fn_returning_err!(
    convert_to_percent(size_tuple),
    || LayoutError::new_err(LayoutErrorType::InvalidLayoutSizePercentage)
  );
  Ok(Self::new(width_pc, height_pc))
}
```
This macro is basically a way to compute something lazily when it (the `Option`) is set to `None`. It unwraps the `$option`, and if `None`, runs the `$next` closure, which must return a value that is then set to `$option`. Here's an example.
```rust
use r3bl_rs_utils::unwrap_option_or_compute_if_none;

fn test_unwrap_option_or_compute_if_none() {
  struct MyStruct {
    field: Option<i32>,
  }
  let mut my_struct = MyStruct { field: None };
  assert_eq!(my_struct.field, None);
  unwrap_option_or_compute_if_none!(my_struct.field, { || 1 });
  assert_eq!(my_struct.field, Some(1));
}
```
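Note that for the common case of caching a value inside an `Option`, the standard library offers a similar pattern via `Option::get_or_insert_with`, which computes the value lazily only when the `Option` is `None`:

```rust
fn expensive_compute() -> i32 {
    // Stand-in for some costly computation.
    42
}

fn main() {
    let mut field: Option<i32> = None;
    // Compute lazily only because the Option is None; the result is cached.
    let value = *field.get_or_insert_with(expensive_compute);
    assert_eq!(value, 42);
    assert_eq!(field, Some(42));
    // A second call does NOT recompute; the cached value is returned.
    let again = *field.get_or_insert_with(|| unreachable!());
    assert_eq!(again, 42);
}
```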
All the procedural macros are organized in 3 crates: the public crate, an internal or core crate, and the proc macro crate.
This derive macro makes it easy to generate builders when annotating a `struct` or `enum`. It has full support for generics. It can be used like this.

```rust
#[derive(Builder)]
struct Point<X, Y> {
  x: X,
  y: Y,
}

let my_pt: Point<i32, i32> = PointBuilder::new()
  .set_x(1)
  .set_y(2)
  .build();

assert_eq!(my_pt.x, 1);
assert_eq!(my_pt.y, 2);
```
This function-like macro (with custom syntax) makes it easy to manage shareability and interior mutability of a struct. We call this pattern the "manager" of "things".

🪄 You can read all about it here.

1. A struct is generated that wraps the "thing" in an `RwLock` for thread safety.
2. That struct is wrapped in an `Arc` so it can be shared across threads.
3. A generic "thing" type is supported, along with a `where` clause.

Here's a very simple usage:

```rust
make_struct_safe_to_share_and_mutate! {
  named MyMapManager<K, V>
  where K: Default + Send + Sync + 'static, V: Default + Send + Sync + 'static
  containing my_map
  of_type std::collections::HashMap<K, V>
}
```
Here's an async example.

```rust
#[tokio::test]
async fn test_custom_syntax_no_where_clause() {
  make_struct_safe_to_share_and_mutate! {
    named StringMap<K, V>
    // No where clause is needed here.
    containing my_map
    of_type std::collections::HashMap<K, V>
  }

  let my_manager: StringMap<String, String> = StringMap::default();
  // Remainder of the example elided.
}
```
This function-like macro (with custom syntax) makes it easy to share functions and lambdas that are async. They should be safe to share between threads and they should support either being invoked or spawned.

🪄 You can read all about how to write proc macros here.

1. An `Arc<RwLock<...>>` wrapper is provided for thread safety and interior mutability.
2. A `get()` method is generated which makes it possible to share this struct across threads.
3. A `from()` method is generated which makes it easy to create this struct from a function or lambda.
4. A `spawn()` method is generated which makes it possible to spawn the enclosed function or lambda asynchronously using Tokio.
5. An `invoke()` method is generated which makes it possible to invoke the enclosed function or lambda synchronously.

Here's an example of how to use this macro.
```rust
use r3bl_rs_utils::make_safe_async_fn_wrapper;

make_safe_async_fn_wrapper! {
  named SafeMiddlewareFnWrapper<A>
  containing fn_mut
  of_type FnMut(A) -> Option<A>
}
```
Here's another example.
```rust
use r3bl_rs_utils::make_safe_async_fn_wrapper;

make_safe_async_fn_wrapper! {
  named SafeSubscriberFnWrapper<S>
  containing fn_mut
  of_type FnMut(S) -> ()
}
```
[`Arena`] and [`MTArena`] types are the implementation of a non-binary tree data structure that is inspired by memory arenas.
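The core idea can be sketched with stdlib types alone: nodes live in a `Vec` and refer to each other by `usize` ids instead of references, which sidesteps the ownership issues of pointer-based trees. (This `SimpleArena` is a hypothetical illustration, not the crate's `Arena` API.)

```rust
// Nodes are stored flat in a Vec; ids are just indices into it.
struct Node<T> {
    payload: T,
    children: Vec<usize>,
}

struct SimpleArena<T> {
    nodes: Vec<Node<T>>,
}

impl<T> SimpleArena<T> {
    fn new() -> Self {
        Self { nodes: Vec::new() }
    }

    // Adding a node returns its id; optionally attach it to a parent.
    fn add_node(&mut self, payload: T, parent: Option<usize>) -> usize {
        let id = self.nodes.len();
        self.nodes.push(Node { payload, children: Vec::new() });
        if let Some(p) = parent {
            self.nodes[p].children.push(id);
        }
        id
    }

    // Depth-first walk collecting node ids.
    fn dfs(&self, id: usize) -> Vec<usize> {
        let mut out = vec![id];
        for &child in &self.nodes[id].children {
            out.extend(self.dfs(child));
        }
        out
    }
}

fn main() {
    let mut arena = SimpleArena::new();
    let root = arena.add_node("root", None);
    let a = arena.add_node("a", Some(root));
    let _b = arena.add_node("b", Some(root));
    let _c = arena.add_node("c", Some(a));
    // DFS visits root (0), then a (1) and its child c (3), then b (2).
    assert_eq!(arena.dfs(root), vec![0, 1, 3, 2]);
}
```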
Here's a simple example of how to use the [`Arena`] type:
```rust
use r3bl_rs_utils::{
  tree_memory_arena::{Arena, HasId, MTArena, ResultUidList},
  utils::{style_primary, style_prompt},
};

let mut arena = Arena::<usize>::new();
let node_1_value = 42 as usize;
let node_1_id = arena.add_new_node(node_1_value, None);
assert_eq!(node_1_id, 0);
```
Here's how you get weak and strong references from the arena (tree), and tree walk:

```rust
use r3bl_rs_utils::{
  tree_memory_arena::{Arena, HasId, MTArena, ResultUidList},
  utils::{style_primary, style_prompt},
};

let mut arena = Arena::<usize>::new();
let node_1_value = 42 as usize;
let node_1_id = arena.add_new_node(node_1_value, None);

{
  assert!(arena.get_node_arc(&node_1_id).is_some());
  let node_1_ref = dbg!(arena.get_node_arc(&node_1_id).unwrap());
  let node_1_ref_weak = arena.get_node_arc_weak(&node_1_id).unwrap();
  assert_eq!(node_1_ref.read().unwrap().payload, node_1_value);
  assert_eq!(
    node_1_ref_weak.upgrade().unwrap().read().unwrap().payload,
    42
  );
}

{
  let node_id_dne = 200 as usize;
  assert!(arena.get_node_arc(&node_id_dne).is_none());
}

{
  let node_1_id = 0 as usize;
  let node_list = dbg!(arena.tree_walk_dfs(&node_1_id).unwrap());
  assert_eq!(node_list.len(), 1);
  assert_eq!(node_list, vec![0]);
}
```
Here's an example of how to use the [`MTArena`] type:

```rust
use std::{
  sync::Arc,
  thread::{self, JoinHandle},
};

use r3bl_rs_utils::{
  tree_memory_arena::{Arena, HasId, MTArena, ResultUidList},
  utils::{style_primary, style_prompt},
};

type ThreadResult = Vec<ResultUidList>;
type Handles = Vec<JoinHandle<ThreadResult>>;

let mut handles: Handles = Vec::new();
let arena = MTArena::<String>::new();

// Thread 1 - add root. Spawn and wait (since the 2 threads below need the root).
{
  let arena_arc = arena.get_arena_arc();
  let thread = thread::spawn(move || {
    let mut arena_write = arena_arc.write().unwrap();
    let root = arena_write.add_new_node("foo".to_string(), None);
    vec![root]
  });
  thread.join().unwrap();
}

// Perform tree walking in parallel. Note the lambda captures the
// enclosing variable context.
{
  let arena_arc = arena.get_arena_arc();
  let fn_arc = Arc::new(move |uid, payload| {
    println!(
      "{} {} {} Arena weak_count:{} strong_count:{}",
      style_primary("walker_fn - closure"),
      uid,
      payload,
      Arc::weak_count(&arena_arc),
      Arc::strong_count(&arena_arc)
    );
  });

  // Walk tree w/ a new thread using arc to lambda.
  {
    let thread_handle: JoinHandle<ResultUidList> =
      arena.tree_walk_parallel(&0, fn_arc.clone());
    let result_node_list = thread_handle.join().unwrap();
    println!("{:#?}", result_node_list);
  }

  // Walk tree w/ a new thread using arc to lambda.
  {
    let thread_handle: JoinHandle<ResultUidList> =
      arena.tree_walk_parallel(&1, fn_arc.clone());
    let result_node_list = thread_handle.join().unwrap();
    println!("{:#?}", result_node_list);
  }
}
```
📜 There are more complex ways of using [`Arena`] and [`MTArena`]. Please look at these extensive integration tests that put them through their paces here.
These two structs make it easier to work w/ `Result`s. They are just syntactic sugar and helper structs. You will find them used everywhere in the `r3bl_rs_utils` crate.

Here's an example of using them both.
```rust
use r3bl_rs_utils::{CommonError, CommonResult};

pub struct Stylesheet {
  pub styles: Vec<Style>,
}
```
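The underlying pattern is just a crate-wide `Result` alias over `Box<dyn Error>`. Here is a self-contained stdlib sketch of that pattern (the alias and error type here are hypothetical re-creations, not the crate's actual definitions):

```rust
use std::error::Error;
use std::fmt;

// The alias pattern: one short Result type used across a crate.
type CommonResult<T> = Result<T, Box<dyn Error>>;

// A simple error type that boxes into `dyn Error`.
#[derive(Debug)]
struct CommonError(String);

impl fmt::Display for CommonError {
    fn fmt(&self, f: &mut fmt::Formatter<'_>) -> fmt::Result {
        write!(f, "CommonError: {}", self.0)
    }
}

impl Error for CommonError {}

fn parse_positive(input: &str) -> CommonResult<i32> {
    // Any parse error coerces to Box<dyn Error> via `?`.
    let n: i32 = input.parse()?;
    if n > 0 {
        Ok(n)
    } else {
        Err(Box::new(CommonError("expected a positive number".into())))
    }
}

fn main() {
    assert_eq!(parse_positive("42").unwrap(), 42);
    assert!(parse_positive("0").is_err()); // custom error path
    assert!(parse_positive("abc").is_err()); // parse error path
}
```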