Context

R3BL TUI library & suite of apps focused on developer productivity

We are working on building command line apps in Rust which have rich text user interfaces (TUI). We want to lean into the terminal as a place of productivity, and build all kinds of awesome apps for it.

  1. 🔮 Instead of just building one app, we are building a library to enable any kind of rich TUI development w/ a twist: taking concepts that work really well for the frontend mobile and web development world and re-imagining them for TUI & Rust.

  2. 🌎 We are building apps to enhance developer productivity & workflows.

r3bl_tui crate

This crate is related to the first thing that's described above. It provides lots of useful functionality to help you build TUI (text user interface) apps, along w/ general niceties & ergonomics that all Rustaceans 🦀 can enjoy 🎉:

Text User Interface engine for Rust

You can build fully async TUI (text user interface) apps with a modern API that brings the best of the web frontend development ideas to TUI apps written in Rust:

  1. Reactive & unidirectional data flow architecture from frontend web development (React, Redux).
  2. Responsive design w/ CSS, flexbox like concepts.
  3. Declarative style of expressing styling and layouts.

And since this is using Rust and Tokio you get the advantages of concurrency and parallelism built-in. No more blocking the main thread for user input, for async middleware, or even rendering 🎉.

This framework is loosely coupled and strongly coherent, meaning that you can pick and choose whatever pieces you would like to use w/out the cognitive load of having to grok all the things in the codebase. It's more like a collection of mostly independent modules that work well w/ each other, but know very little about each other.

Here are some framework highlights:

Examples to get you started

Here's a video of the demo in action:

https://user-images.githubusercontent.com/2966499/206881196-37cf1220-8c1b-460e-a2cb-7e06d22d6a02.mp4

  1. You can run cargo run --example demo in the tui/examples folder to see a demo of the library in action and play with it. The examples cover the entire surface area of the TUI API. You can also take a look at the tests in the source (tui/src/).

  2. This document is a good place to start to get a feel for the architecture of the framework. You can get a mental model of how everything fits and what the TUI lifecycle is.

  3. Additionally, the r3bl_rs_utils_core crate has the tui_core module, which contains dependencies that are used by the tui module. They include:

    1. ANSI text support.
    2. Core dimensions and units that are used for positioning and sizing.
    3. Grapheme cluster segment and unicode support (emoji support).
    4. Lolcat support.
    5. CSS like styling support.

Quite a few scripts are provided for your convenience in the root folder of the repo. They are all fish script files. Here are a few worth mentioning that are related to running the examples.

  1. run.fish: This will simply run the examples. You can watch the logs by running log.fish.
  2. run-with-flamegraph-profiling.fish: This will run the examples and generate a flamegraph at the end so you can profile the performance of the app.
  3. run-with-crash-reporting.fish: This will run the examples and generate a crash_log.txt file (in the tui folder) in case the app crashes. This is useful for debugging.

How does layout, rendering, and event handling work in general?

Life of an input event

There is a clear separation of concerns in this module. To illustrate what goes where and how things work, let's look at an example that puts the main event loop front and center & deals w/ how the system handles an input event (key press or mouse).

text ๐ŸงโŒจ๏ธ๐Ÿ–ฑ๏ธ input โ†’ [TerminalWindow] event โ†‘ โ†“ [ComponentRegistry] creates โ”Š [App] โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ–  [Component]s at 1st render โ”Š โ”‚ โ”Š โ”‚ โ”Œโ”€โ”€โ”€โ”€โ”€โ”€โ–  id=1 has focus โ”Š โ”‚ โ”‚ โ”Š โ”œโ†’ [Component] id=1 โ”€โ”€โ”€โ” โ”Š โ”œโ†’ [Component] id=2 โ”‚ โ”Š โ””โ†’ [Component] id=3 โ”‚ default โ”‚ handler โ†โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”€โ”˜

Let's trace the journey through the diagram when an input event is generated by the user (eg: a key press, or mouse event). When the app is started via cargo run it sets up a main loop, lays out all 3 components (sizes and positions), and then paints them. Then it asynchronously listens for input events (no threads are blocked). When the user types something, this input is processed by the main loop of [TerminalWindow].

  1. The [Component] that is in [FlexBox] w/ id=1 currently has focus.
  2. When an input event comes in from the user (key press or mouse input) it is routed to the [App] first, before [TerminalWindow] looks at the event.
  3. The specificity of the event handler in [App] is higher than the default input handler in [TerminalWindow]. Further, the specificity of the [Component] that currently has focus is the highest. In other words, the input event gets routed by the [App] to the [Component] that currently has focus ([Component] id=1 in our example).
  4. Since it is not guaranteed that some [Component] will have focus, this input event can then be handled by [App], and if not, then by [TerminalWindow]'s default handler. If the default handler doesn't process it, then it is simply ignored.
  5. In this journey, as the input event is moved between all these different entities, each entity decides whether it wants to handle the input event or not. If it does, then it returns an enum indicating that the event has been consumed, else, it returns an enum that indicates the event should be propagated.
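To make step 5 concrete, here is a minimal, self-contained sketch of the consumed-vs-propagated idea. The names (EventPropagation, EventHandler, route) are illustrative stand-ins, not the actual r3bl_tui types:

```rust
// Illustrative only: simplified names, not the actual r3bl_tui types.
enum EventPropagation {
    Consumed,  // the handler dealt with the event; stop routing it
    Propagate, // the handler declined; pass it to the next handler
}

trait EventHandler {
    fn handle(&mut self, key: char) -> EventPropagation;
}

struct FocusedComponent;

impl EventHandler for FocusedComponent {
    fn handle(&mut self, key: char) -> EventPropagation {
        if key == 'x' {
            EventPropagation::Consumed // this component owns the 'x' key
        } else {
            EventPropagation::Propagate
        }
    }
}

/// Walk handlers in specificity order: focused component, app, default handler.
fn route(handlers: &mut [&mut dyn EventHandler], key: char) {
    for h in handlers.iter_mut() {
        if matches!(h.handle(key), EventPropagation::Consumed) {
            return; // stop at the first handler that consumes the event
        }
    }
    // Nothing consumed the event, so it is simply ignored.
}

fn main() {
    let mut focused = FocusedComponent;
    let mut handlers: Vec<&mut dyn EventHandler> = vec![&mut focused];
    route(&mut handlers, 'x'); // consumed by the focused component
    route(&mut handlers, 'q'); // falls through every handler and is dropped
}
```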

Now that we have seen this whirlwind overview of the life of an input event, let's look at the details in each of the sections below.

Here's an architecture diagram that will be useful to keep in mind as we go through the details of the following sections:

The window

The main building blocks of a TUI app are:

  1. [TerminalWindow] - You can think of this as the main "window" of the app. All the content of your app is painted inside of this "window". And the "window" conceptually maps to the screen that is contained inside your terminal emulator program (eg: tilix, Terminal.app, etc). Your TUI app will end up taking up 100% of the screen space of this terminal emulator. It will also enter raw mode, and paint to an alternate screen buffer, leaving your original scrollback buffer and history intact. When you exit this TUI app, it will return your terminal to where you'd left off. You don't write this code, this is something that you use.
  2. [App] - This is where you write your code. You pass an [App] to the [TerminalWindow] to bootstrap your TUI app. You can just use [App] to build your app, if it is a simple one & you don't really need any sophisticated layout or styling. But if you want layout and styling, now we have to deal with [FlexBox], [Component], and [crate::Style].

Layout and styling

Inside of your [App], if you want to use flexbox like layout and CSS like styling, you can think of composing your code in the following way:

  1. [FlexBox] is like a box or container. You can attach styles and an id here. The id has to be unique, and you can reference as many styles as you want from your stylesheet. Yes, cascading styles are supported! 👍 You can put boxes inside of boxes. You can make a container box and inside of that you can add other boxes (you can give them a direction and even relative sizing out of 100%).
  2. As you approach the "leaf" nodes of your layout, you will find [Component] trait objects. These are black boxes which are sized, positioned, and painted relative to their parent box. They get to handle input events and render [RenderOp]s into a [RenderPipeline]. This is kind of like virtual DOM in React. This queue of commands is collected from all the components and ultimately painted to the screen, for each render! You can also use Redux to maintain your app's state, and dispatch actions to the store, and even have async middleware!
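The snippet below is a conceptual model of this nesting, not the crate's real FlexBox / Style builder API: each box carries an id, a direction, a requested share of its parent (out of 100%), and style references, and leaf boxes would host a component.

```rust
// Conceptual model only; the real crate uses its own FlexBox / Style types and macros.
#[derive(Debug)]
enum Direction {
    Horizontal,
    Vertical,
}

#[derive(Debug)]
struct LayoutBox {
    id: u8,
    dir: Direction,
    size_percent: u8,           // requested share of the parent, out of 100
    style_ids: Vec<String>,     // styles referenced from a stylesheet (they cascade)
    children: Vec<LayoutBox>,   // leaf boxes would host a Component instead
}

fn main() {
    // A container split into two columns, 50% each.
    let layout = LayoutBox {
        id: 0,
        dir: Direction::Horizontal,
        size_percent: 100,
        style_ids: vec!["container".into()],
        children: vec![
            LayoutBox { id: 1, dir: Direction::Vertical, size_percent: 50,
                        style_ids: vec!["left".into()], children: vec![] },
            LayoutBox { id: 2, dir: Direction::Vertical, size_percent: 50,
                        style_ids: vec!["right".into()], children: vec![] },
        ],
    };
    println!("{layout:#?}");
}
```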

Component, ComponentRegistry, focus management, and event routing

Typically your [App] will look like this:

```rust
/// Async trait object that implements the [App] trait.
#[derive(Default)]
pub struct AppWithLayout {
  pub component_registry: ComponentRegistry,
  pub has_focus: HasFocus,
}
```

As we look at [Component] & [App] more closely we will find a curious thing: [ComponentRegistry] (which is managed by the [App]). The reason this exists is input event routing. The input events are routed to the [Component] that currently has focus.

The [HasFocus] struct takes care of this. It provides 2 things:

  1. It holds an id of a [FlexBox] / [Component] that has focus.
  2. It also holds a map that holds a [crate::Position] for each id. This is used to represent a cursor (whatever that means to your app & component). This cursor is maintained for each id. This allows a separate cursor for each [Component] that has focus, which is needed to build apps like editors and viewers that maintain a cursor position between focus switches.
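A minimal sketch of what such a focus tracker could look like (illustrative field and type names, not the crate's actual HasFocus definition):

```rust
use std::collections::HashMap;

// Illustrative stand-ins for the crate's id and Position types.
type Id = u8;

#[derive(Clone, Copy, Debug, Default)]
struct Position {
    col: u16,
    row: u16,
}

#[derive(Default)]
struct FocusTracker {
    /// Which FlexBox / Component currently has focus.
    focused_id: Option<Id>,
    /// One remembered "cursor" position per id, so a component (e.g. an editor)
    /// can restore its caret when focus switches back to it.
    cursor_positions: HashMap<Id, Position>,
}

impl FocusTracker {
    fn set_focus(&mut self, id: Id) {
        self.focused_id = Some(id);
    }
    fn set_cursor(&mut self, id: Id, pos: Position) {
        self.cursor_positions.insert(id, pos);
    }
    fn cursor_for_focused(&self) -> Option<Position> {
        self.focused_id
            .and_then(|id| self.cursor_positions.get(&id).copied())
    }
}

fn main() {
    let mut focus = FocusTracker::default();
    focus.set_cursor(1, Position { col: 5, row: 2 });
    focus.set_focus(1);
    println!("{:?}", focus.cursor_for_focused()); // Some(Position { col: 5, row: 2 })
}
```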

Another thing to keep in mind is that the [App] and [TerminalWindow] are persistent between re-renders. The Redux store is also persistent between re-renders.

Input event specificity

[TerminalWindow] gives [Component] first dibs when it comes to handling input events. If it punts handling this event, it will be handled by the default input event handler. And if nothing there matches this event, then it is simply dropped.

Rendering and painting

The R3BL TUI engine uses a high performance compositor to render the UI to the terminal. This ensures that only "pixels" that have changed are painted to the terminal. This is done by creating a concept of PixelChar, which represents a single "pixel" in the terminal screen at a given col and row index position. There are exactly as many PixelChars as there are cells (rows × cols) in the terminal screen, and the index maps directly to the position of the pixel in the terminal screen.

Offscreen buffer

Here is an example of what a single row of rendered output might look like in the OffscreenBuffer. This diagram shows each PixelChar in row_index: 1 of the OffscreenBuffer. In this example, there are 80 columns in the terminal screen. This is actual log output generated by the TUI engine when logging is enabled.

```text
row_index: 1
000 S ░░░░░░░╳░░░░░░░░
001 P 'j'→fg‐bg   002 P 'a'→fg‐bg   003 P 'l'→fg‐bg   004 P 'd'→fg‐bg
005 P 'k'→fg‐bg   006 P 'f'→fg‐bg   007 P 'j'→fg‐bg   008 P 'a'→fg‐bg
009 P 'l'→fg‐bg   010 P 'd'→fg‐bg   011 P 'k'→fg‐bg   012 P 'f'→fg‐bg
013 P 'j'→fg‐bg   014 P 'a'→fg‐bg   015 P '▒'→rev
016 S ░░░░░░░╳░░░░░░░░ … 080 S ░░░░░░░╳░░░░░░░░   spacer [ 0, 16-80 ]
```

(Columns 016 through 080 are all identical spacer cells; the log summarizes them as spacer [ 0, 16-80 ].)

When RenderOps are executed and used to create an OffscreenBuffer that maps to the size of the terminal window, clipping is performed automatically. This means that it isn't possible to move the caret outside of the bounds of the viewport (terminal window size). And it isn't possible to paint text that is larger than the size of the offscreen buffer. The buffer really represents the current state of the viewport. Scrolling has to be handled by the component itself (an example of this is the editor component).

Each PixelChar can be one of 4 things:

  1. Space. This is just an empty space. There is no flickering in the TUI engine. When a new offscreen buffer is created, it is filled w/ spaces. Then components paint over the spaces. Then the diffing algorithm only paints over the pixels that have changed. You don't have to worry about clearing the screen and painting, which typically will cause flickering in terminals. You also don't have to worry about printing empty spaces over areas that you would like to clear between renders. All of this is handled by the TUI engine.
  2. Void. This is a special pixel that is used to indicate that the pixel should be ignored. It is used to indicate a wide emoji is to the left somewhere. Most terminals don't support emojis, so there's a discrepancy between the display width of the character and its index in the string.
  3. Plain text. This is a normal pixel which wraps a single character that may be a grapheme cluster segment. Styling information is encoded in each PixelChar::PlainText and is used to paint the screen via the diffing algorithm, which is smart enough to "stack" styles that appear beside each other for quicker rendering in terminals.
  4. ANSI text. Styling information is not available w/ these characters because the styling information is encoded in the ANSI escape codes. lolcat_api.rs generates these ANSI strings for the rainbow effect. An example of this is the outline around a modal dialog box.
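Put together, a simplified model of a PixelChar might look like the enum below. The variant and field names are illustrative, not the crate's exact definitions:

```rust
// Simplified model of the four PixelChar cases described above; not the crate's exact types.
#[derive(Clone, Debug, PartialEq)]
enum PixelChar {
    /// Empty cell; new offscreen buffers start out filled with these.
    Spacer,
    /// Placeholder cell to the right of a wide glyph (e.g. an emoji) that
    /// occupies more than one display column.
    Void,
    /// A single grapheme cluster plus the style used to paint it.
    PlainText { text: String, style: Style },
    /// Text whose styling lives inside ANSI escape sequences (e.g. lolcat output),
    /// so no separate style is attached.
    AnsiText { text: String },
}

#[derive(Clone, Debug, PartialEq, Default)]
struct Style {
    fg: Option<(u8, u8, u8)>,
    bg: Option<(u8, u8, u8)>,
    bold: bool,
}

fn main() {
    // One row of an offscreen buffer is just a Vec<PixelChar>.
    let row = vec![
        PixelChar::PlainText { text: "j".into(), style: Style::default() },
        PixelChar::PlainText { text: "😀".into(), style: Style::default() },
        PixelChar::Void, // the emoji above is 2 columns wide, so this cell is skipped
        PixelChar::AnsiText { text: "\x1b[31mX\x1b[0m".into() },
        PixelChar::Spacer,
    ];
    println!("{row:?}");
}
```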

Render pipeline

The following diagram provides a high level overview of how apps (that contain components, which may contain components, and so on) are rendered to the terminal screen.

Each component produces a RenderPipeline, which is a map of ZOrder and Vec<RenderOps>. RenderOps are the instructions that are grouped together, such as move the caret to a position, set a color, and paint some text.

Inside of each RenderOps the caret is stateful, meaning that the caret position is remembered after each RenderOp is executed. However, once a new RenderOps is executed, the caret position is reset just for that RenderOps. Caret position is not stored globally. You should read more about "atomic paint operations" in the RenderOp documentation.
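Here is a rough sketch of that shape: a pipeline keyed by z-order, where each entry holds groups of ops and the caret state is local to each group. The type and variant names (ZOrder, RenderOp, MoveCaretTo, etc.) are illustrative stand-ins, not the crate's real definitions:

```rust
use std::collections::BTreeMap;

// Illustrative stand-ins; the crate's RenderOp / RenderPipeline types differ in detail.
#[derive(Clone, Copy, PartialEq, Eq, PartialOrd, Ord, Debug)]
enum ZOrder {
    Normal,
    Glass, // e.g. modal dialogs paint here, on top of everything else
}

#[derive(Debug)]
enum RenderOp {
    MoveCaretTo { col: u16, row: u16 },
    SetFgColor(u8, u8, u8),
    PaintText(String),
}

/// One atomic paint unit: the caret position is remembered between ops
/// inside this group, then forgotten when the group ends.
type RenderOps = Vec<RenderOp>;

/// A pipeline maps each z-order layer to the op groups that paint it.
type RenderPipeline = BTreeMap<ZOrder, Vec<RenderOps>>;

fn main() {
    let mut pipeline: RenderPipeline = BTreeMap::new();
    pipeline.entry(ZOrder::Normal).or_default().push(vec![
        RenderOp::MoveCaretTo { col: 0, row: 1 },
        RenderOp::SetFgColor(0, 255, 0),
        RenderOp::PaintText("hello".into()),
    ]);
    pipeline.entry(ZOrder::Glass).or_default().push(vec![
        RenderOp::MoveCaretTo { col: 10, row: 5 },
        RenderOp::PaintText("[ modal dialog ]".into()),
    ]);
    // Pipelines from several components can be merged before compositing
    // into the offscreen buffer.
    println!("{pipeline:#?}");
}
```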

Once a set of these RenderPipelines have been generated, typically after the user enters some input event, and that produces a new state which then has to be rendered, they are combined and painted into an OffscreenBuffer.

First render

The paint.rs file contains the paint function, which is the entry point for all rendering. Once the first render occurs, the OffscreenBuffer that is generated is saved to GlobalSharedState. The following table shows the various tasks that have to be performed in order to render to an OffscreenBuffer. There is a different code path that is taken for ANSI text and plain text (which includes StyledText which is just plain text with a color). Syntax highlighted text is also just StyledText. The ANSI text is an example of text that is generated by the lolcat_api.rs.

| UTF-8 | ANSI | Task |
| ----- | ---- | ---- |
| Y | Y | convert RenderPipeline to List<List<PixelChar>> (OffscreenBuffer) |
| Y | Y | paint each PixelChar in List<List<PixelChar>> to stdout using OffscreenBufferPainterImplCrossterm |
| Y | Y | save the List<List<PixelChar>> to GlobalSharedState |

Currently only crossterm is supported for actually painting to the terminal. But this process is really simple, making it very easy to swap in other terminal libraries such as termion, or even a GUI backend, or some other custom output driver.

Subsequent render

Since the OffscreenBuffer is cached in GlobalSharedState, a diff can be performed for subsequent renders. And only those diff chunks are painted to the screen. This ensures that there is no flicker when the content of the screen changes. It also minimizes the amount of work that the terminal or terminal emulator has to do to put the PixelChars on the screen.
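A rough sketch of that diffing step, assuming a buffer is just rows of cells (this is not the engine's actual implementation, which diffs styled PixelChars):

```rust
// Rough sketch of offscreen-buffer diffing; cell type and layout are simplified.
type Cell = char;
type Buffer = Vec<Vec<Cell>>; // rows of columns, same shape as the terminal

/// Compare the previously painted buffer with the new one and return only
/// the (row, col, new_cell) triples that actually changed.
fn diff(old: &Buffer, new: &Buffer) -> Vec<(usize, usize, Cell)> {
    let mut changes = Vec::new();
    for (row_idx, (old_row, new_row)) in old.iter().zip(new.iter()).enumerate() {
        for (col_idx, (old_cell, new_cell)) in old_row.iter().zip(new_row.iter()).enumerate() {
            if old_cell != new_cell {
                changes.push((row_idx, col_idx, *new_cell));
            }
        }
    }
    changes
}

fn main() {
    let old = vec![vec![' ', 'a'], vec!['b', ' ']];
    let new = vec![vec![' ', 'a'], vec!['b', 'c']];
    // Only the one changed cell gets painted; everything else is left alone,
    // which is what avoids flicker.
    assert_eq!(diff(&old, &new), vec![(1, 1, 'c')]);
}
```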

Redux for state management

If you use Redux for state management, then you will create a [crate::redux] [crate::Store] that is passed into the [TerminalWindow]. For more detailed information on Redux, please read the docs for the r3bl_redux crate.

Here's an example of this.

```rust
use crossterm::event::*;
use r3bl_rs_utils::*;
use super::*;

const DEBUG: bool = true;

pub async fn run_app() -> CommonResult<()> {
  throws!({
    if DEBUG {
      try_to_set_log_level(log::LevelFilter::Trace)?;
    } else {
      try_to_set_log_level(log::LevelFilter::Off)?;
    }

    // Create store.
    let store = create_store().await;

    // Create an App (renders & responds to user input).
    let shared_app = AppWithLayout::new_shared();

    // Exit if these keys are pressed.
    let exit_keys: Vec<KeyEvent> = vec![KeyEvent {
      code: KeyCode::Char('q'),
      modifiers: KeyModifiers::CONTROL,
    }];

    // Create a window.
    TerminalWindow::main_event_loop(store, shared_app, exit_keys).await?
  });
}

async fn create_store() -> Store<State, Action> {
  let mut store: Store<State, Action> = Store::default();
  store.add_reducer(MyReducer::default()).await;
  store
}

/// Action enum.
#[derive(Debug, PartialEq, Eq, Clone)]
pub enum Action {
  Add(i32, i32),
  AddPop(i32),
  Clear,
  MiddlewareCreateClearAction,
  Noop,
}

impl Default for Action {
  fn default() -> Self { Action::Noop }
}

/// State.
#[derive(Clone, Default, PartialEq, Debug)]
pub struct State {
  pub stack: Vec<i32>,
}

/// Reducer function (pure).
#[derive(Default)]
struct MyReducer;

#[async_trait]
impl AsyncReducer<State, Action> for MyReducer {
  async fn run(&self, action: &Action, state: &mut State) {
    match action {
      Action::Add(a, b) => {
        let sum = a + b;
        state.stack = vec![sum];
      }
      Action::AddPop(a) => {
        let sum = a + state.stack[0];
        state.stack = vec![sum];
      }
      Action::Clear => state.stack.clear(),
      _ => {}
    }
  }
}
```

How does the editor component work?

The EditorComponent struct can hold data in its own memory, in addition to relying on the state.

In other words,

  1. EditorEngine -> This goes in EditorComponent
  2. EditorBuffer -> This goes in the State

Here are the connection points w/ the impl of Component<S,A> in EditorComponent:

  1. handle_event(input_event: &InputEvent, state: &S, shared_store: &SharedStore<S, A>)
  2. render(has_focus: &HasFocus, current_box: &FlexBox, state: &S, shared_store: &SharedStore<S,A>)
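A skeleton of how that split might look, using simplified stand-in types rather than the crate's real Component trait (whose generics and return types differ). The point is that EditorEngine data lives inside the component while EditorBuffer data lives in the application state:

```rust
// Simplified stand-ins to show the EditorEngine / EditorBuffer split; not the real trait.
struct InputEvent;           // key press or mouse event
struct RenderOutput(String); // whatever the component asks the engine to paint

/// Lives in application State (persistent across renders): the text being edited.
#[derive(Default, Clone)]
struct EditorBuffer {
    lines: Vec<String>,
}

/// Lives inside the component itself: transient engine data such as caches
/// or scroll offsets.
#[derive(Default)]
struct EditorEngine {
    scroll_offset: usize,
}

#[derive(Default)]
struct EditorComponent {
    engine: EditorEngine,
}

impl EditorComponent {
    /// Mirrors handle_event(): reads the event, consults the buffer held in state.
    fn handle_event(&mut self, _event: &InputEvent, state_buffer: &EditorBuffer) {
        // e.g. move the caret, or dispatch an action that edits the buffer
        let _visible = state_buffer.lines.len().saturating_sub(self.engine.scroll_offset);
    }

    /// Mirrors render(): turns the buffer plus the engine's private data into output.
    fn render(&mut self, state_buffer: &EditorBuffer) -> RenderOutput {
        let text = state_buffer
            .lines
            .iter()
            .skip(self.engine.scroll_offset)
            .cloned()
            .collect::<Vec<_>>()
            .join("\n");
        RenderOutput(text)
    }
}

fn main() {
    let mut editor = EditorComponent::default();
    let buffer = EditorBuffer { lines: vec!["hello".into(), "world".into()] };
    editor.handle_event(&InputEvent, &buffer);
    println!("{}", editor.render(&buffer).0);
}
```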

Painting the caret

Definitions

Caret - the block that is visually displayed in a terminal which represents the insertion point for whatever is in focus. While only one insertion point is editable for the local user, there may be multiple of them, in which case there has to be a way to distinguish a local caret from a remote one (this can be done w/ bg color).

Cursor - the global "thing" provided by terminals that usually blinks to show where the cursor is. This cursor is moved around and then paint operations are performed on various different areas in a terminal window to paint the output of render operations.

There are two ways of showing cursors which are quite different (each w/ very different constraints).

  1. Using a global terminal cursor (we don't use this).

  2. Paint the character at the cursor w/ the colors inverted (or some other bg color) giving the visual effect of a cursor.
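Here is a tiny sketch of approach 2, painting the character under the caret with reversed colors using raw ANSI escapes. The real engine does this through its style / PixelChar machinery rather than by printing escapes directly:

```rust
// Sketch only: reverse-video caret using raw ANSI escape sequences.
fn paint_line_with_caret(line: &str, caret_col: usize) {
    for (col, ch) in line.chars().enumerate() {
        if col == caret_col {
            // \x1b[7m = reverse video on, \x1b[0m = reset
            print!("\x1b[7m{ch}\x1b[0m");
        } else {
            print!("{ch}");
        }
    }
    println!();
}

fn main() {
    // The second 'l' in "hello" is painted inverted, which reads as a block caret.
    paint_line_with_caret("hello", 3);
}
```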

How do modal dialog boxes work?

A modal dialog box is different from a normal reusable component. This is because:

  1. It paints on top of the entire screen (in front of all other components, in ZOrder::Glass, and outside of any layouts using FlexBoxes).
  2. It is "activated" by a keyboard shortcut (and hidden otherwise). Once activated, the user can accept or cancel the dialog box. And this results in a callback being called w/ the result.

So this activation trigger must be done at the App trait impl level (in the app_handle_event() method). Also, when this trigger is detected it has to:

  1. Set the focus to the dialog box, so that it will appear on the next render. When the trigger is detected it will return an EventPropagation::Consumed, which won't force a render.
  2. Set the title and text via a dispatch of the action SetDialogBoxTitleAndText. This will force a render, and the title and text will show up in the dialog box on the next render.
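Very roughly, the shape of that activation logic inside app_handle_event() could look like the sketch below. The action name SetDialogBoxTitleAndText and the EventPropagation result come from the description above; everything else (the stand-in Focus struct, the dispatch closure, the shortcut flag) is illustrative:

```rust
// Illustrative sketch of modal-dialog activation; simplified stand-in types.
const DIALOG_ID: u8 = 99;

#[derive(Debug)]
enum EventPropagation {
    Consumed,
    Propagate,
}

#[derive(Debug)]
enum Action {
    SetDialogBoxTitleAndText { title: String, text: String },
}

#[derive(Default)]
struct Focus {
    focused_id: Option<u8>,
}

fn app_handle_event(
    shortcut_pressed: bool, // stand-in for "the activation shortcut was detected"
    focus: &mut Focus,
    dispatch: &mut dyn FnMut(Action),
) -> EventPropagation {
    if shortcut_pressed {
        // 1. Give the dialog focus so it paints on the next render.
        focus.focused_id = Some(DIALOG_ID);
        // 2. Dispatch the action that sets title/text; the resulting state
        //    change is what forces the render.
        dispatch(Action::SetDialogBoxTitleAndText {
            title: "Open file".into(),
            text: String::new(),
        });
        EventPropagation::Consumed
    } else {
        EventPropagation::Propagate
    }
}

fn main() {
    let mut focus = Focus::default();
    let mut dispatched = Vec::new();
    let result = app_handle_event(true, &mut focus, &mut |a| dispatched.push(a));
    println!("{result:?}, focus = {:?}, actions = {dispatched:?}", focus.focused_id);
}
```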

There is a question about where the response from the user (once a dialog is shown) goes. This seems as though it would be different in nature from an EditorComponent, but it is the same. Here's why:

Two callback functions

When creating a new dialog box component, two callback functions are passed in:

  1. on_dialog_press_handler() - this will be called if the user chooses no, or yes (w/ their typed text).
  2. on_dialog_editors_changed_handler() - this will be called if the user types something into the editor.
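One way to picture those two hooks is the sketch below; the DialogChoice enum and field names are hypothetical shapes, not the crate's exact callback types:

```rust
// Hypothetical shapes for the two dialog callbacks described above.
#[derive(Debug)]
enum DialogChoice {
    Yes(String), // user accepted, carrying whatever they typed
    No,          // user cancelled
}

struct DialogHandlers {
    /// Called once, when the user accepts or cancels the dialog.
    on_dialog_press: Box<dyn FnMut(DialogChoice)>,
    /// Called every time the text inside the dialog's editor changes.
    on_dialog_editor_changed: Box<dyn FnMut(&str)>,
}

fn main() {
    let mut handlers = DialogHandlers {
        on_dialog_press: Box::new(|choice| println!("dialog closed: {choice:?}")),
        on_dialog_editor_changed: Box::new(|text| println!("editor now contains: {text}")),
    };
    (handlers.on_dialog_editor_changed)("he");
    (handlers.on_dialog_editor_changed)("hello");
    (handlers.on_dialog_press)(DialogChoice::Yes("hello".into()));
}
```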

How to use this dialog to make an HTTP request & pipe the results into a selection area?

So far we have covered the use case for a simple modal dialog box. In order to provide auto-completion capabilities, via some kind of web service, there needs to be a slightly more complex version of this. This is where the DialogEngineConfigOptions struct comes in. It allows us to create a dialog component and engine to be configured w/ the appropriate mode - simple or autocomplete.

In autocomplete mode, an extra "results panel" is displayed, and the layout of the dialog is different on the screen. Instead of being in the middle of the screen, it starts at the top of the screen. The callbacks are the same.

How to make HTTP requests

Instead of using the reqwest crate, we should use the hyper crate (which is part of the Tokio ecosystem) and drop support for reqwest in all our crates.

Grapheme support

Unicode is supported (to an extent). There are some caveats. The [crate::UnicodeStringExt] trait has lots of great information on graphemes and what is and isn't supported.

Lolcat support

An implementation of [crate::lolcat::cat] w/ a color wheel is provided.

Other crates that depend on this

This crate is a dependency of the following crates:

  1. r3bl_rs_utils crate (the "main" library)

Issues, comments, feedback, and PRs

Please report any issues to the issue tracker. And if you have any feature requests, feel free to add them there too ๐Ÿ‘.