Mýval (pronounced [m'ival]) is the Czech word for raccoon. The full common name for the raccoon in Czech is "medvídek mýval", where "medvídek" means "little bear".
Myval is not a competitor to Polars. Myval is a lightweight Arrow data frame library focused on data transformation and IPC.
Because Arrow uses a standardized data layout, data frames can be converted to Polars and back with zero copy:
```rust,ignore
let polars_df = polars::frame::DataFrame::from(myval_df);
let myval_df = myval::DataFrame::from(polars_df);
```
Like Polars, Myval is based on arrow2.
Consider there is an Arrow stream block (Schema + Chunk) received e.g. from RPC or Pub/Sub. Convert the block into a Myval data frame:
```rust,ignore
let df = myval::DataFrame::from_ipc_block(&buf).unwrap();
```
Need to send a data frame back? Convert it to an Arrow stream block with a single line of code:
```rust,ignore
let buf = df.into_ipc_block().unwrap();
```
Need to send a slice? No problem: there are methods that can easily return sliced series, sliced data frames, or sliced IPC chunks.
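Slicing is cheap in Arrow because a slice is just a view (offset + length) over the same backing buffer, with no data copied. A minimal pure-Rust sketch of that idea (the `slice_view` helper below is illustrative, not a myval API):

```rust
// Conceptual sketch: an Arrow-style slice is an (offset, length) view
// over a shared buffer, so "slicing" never copies the underlying data.
fn slice_view(data: &[i64], offset: usize, len: usize) -> &[i64] {
    &data[offset..offset + len]
}

fn main() {
    let column = vec![10i64, 20, 30, 40, 50];
    // A view over rows 1..4 -- same memory, no copy.
    let s = slice_view(&column, 1, 3);
    assert_eq!(s, &[20, 30, 40]);
    // The view points into the original allocation.
    assert_eq!(s.as_ptr(), column[1..].as_ptr());
    println!("slice: {:?}", s);
}
```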
Consider there is an i64 column "time" which contains nanosecond timestamps. Let us override its data type:
```rust,ignore
use myval::{DataType, TimeUnit};

df.set_data_type("time", DataType::Timestamp(TimeUnit::Nanosecond, None)).unwrap();
```
Consider there is a utf8 column "value" which should be parsed into floats:
```rust,ignore
df.parse_float("value").unwrap();
```
Need basic in-place arithmetic on an integer column?

```rust,ignore
df.add_i64("col", 1_000).unwrap();
df.sub_i64("col", 1_000).unwrap();
df.mul_i64("col", 1_000).unwrap();
df.div_i64("col", 1_000).unwrap();
```
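Each of these helpers applies the operation to every value of the column. The effect, sketched on a plain vector (the `div_i64_in_place` function below is illustrative, not a myval API):

```rust
// Conceptual sketch of what div_i64 does: apply the operation in place
// to every value of an i64 column.
fn div_i64_in_place(column: &mut [i64], by: i64) {
    for v in column.iter_mut() {
        *v /= by;
    }
}

fn main() {
    // e.g. convert nanosecond timestamps to microseconds
    let mut time = vec![1_500_000i64, 2_000_000, 3_250_000];
    div_i64_in_place(&mut time, 1_000);
    assert_eq!(time, vec![1_500, 2_000, 3_250]);
}
```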
Consider there is a Myval data frame with the columns "voltage", "temp1", "temp2" and "temp3", which has received data from a server column by column in random order. Let us restore the expected ordering:
```rust,ignore
df.set_ordering(&["voltage", "temp1", "temp2", "temp3"]);
```
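Conceptually, this rearranges the columns to match the given list of names. A self-contained sketch on (name, values) pairs (the `set_ordering` function below is a stand-in, not the myval implementation):

```rust
// Conceptual sketch of column reordering: sort (name, values) pairs to
// match a desired ordering of column names. Unknown names sort last.
fn set_ordering(columns: &mut Vec<(String, Vec<f64>)>, order: &[&str]) {
    columns.sort_by_key(|(name, _)| {
        order.iter().position(|&o| o == name.as_str()).unwrap_or(usize::MAX)
    });
}

fn main() {
    let mut df = vec![
        ("temp2".to_string(), vec![21.5]),
        ("voltage".to_string(), vec![3.3]),
        ("temp1".to_string(), vec![20.0]),
        ("temp3".to_string(), vec![22.1]),
    ];
    set_ordering(&mut df, &["voltage", "temp1", "temp2", "temp3"]);
    let names: Vec<&str> = df.iter().map(|(n, _)| n.as_str()).collect();
    assert_eq!(names, ["voltage", "temp1", "temp2", "temp3"]);
}
```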
Check the documentation: https://docs.rs/myval
Arrow provides several ways to work with databases. Myval additionally provides tools for working with PostgreSQL databases in an easy way via the popular sqlx crate (the "postgres" feature must be enabled):
```rust,ignore
use futures::stream::TryStreamExt;

let pool = Arc::new(PgPoolOptions::new()
    .connect("postgres://postgres:welcome@localhost/postgres")
    .await.unwrap());
let max_size = 100_000;
let mut stream = myval::db::postgres::fetch(
    "select * from test".to_owned(), Some(max_size), pool.clone());
// the stream returns data frames one by one with max data frame size (in
// bytes) = max_size
while let Some(df) = stream.try_next().await.unwrap() {
    // do some stuff
}
```
Why does the stream object require Arc&lt;PgPool&gt;? There is one important reason: such stream objects are 'static and can be stored anywhere, e.g. used as cursors in a client-server architecture.
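The underlying mechanics, sketched in plain Rust: a value that owns its handle via Arc borrows nothing, so it satisfies a 'static bound and can be boxed and stored anywhere. The `Cursor` type and `store_cursor` function below are illustrative stand-ins, not myval APIs:

```rust
use std::sync::Arc;

// A "cursor" that owns its pool handle via Arc. Because it borrows
// nothing, the type is 'static and can be stored anywhere, e.g. in a
// session map on a server.
struct Cursor {
    pool: Arc<String>, // illustrative stand-in for Arc<PgPool>
    position: usize,
}

// Boxing as dyn Send + 'static requires the value to own all of its data;
// a cursor holding a borrowed &PgPool would not compile here.
fn store_cursor(c: Cursor) -> Box<dyn Send + 'static> {
    Box::new(c)
}

fn main() {
    let pool = Arc::new("postgres://localhost/postgres".to_string());
    let stored = store_cursor(Cursor { pool: pool.clone(), position: 0 });
    // The stored cursor keeps the pool alive via its own Arc clone.
    assert_eq!(Arc::strong_count(&pool), 2);
    drop(stored);
    assert_eq!(Arc::strong_count(&pool), 1);
}
```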
In the opposite direction, received IPC blocks can be pushed into the database:

```rust,ignore
let df = DataFrame::from_ipc_block(payload).unwrap();
// The first received data frame must have a "database" field in its schema
// metadata. Subsequent data frames can go without it.
if let Some(dbparams) = df.metadata().get("database") {
    let params: myval::db::postgres::Params = serde_json::from_str(dbparams)
        .unwrap();
    let processed_rows: usize = myval::db::postgres::push(&df, &params,
        &pool).await.unwrap();
}
```
Let us push a Polars data frame into a PostgreSQL database:

```rust,ignore
use serde_json::json;

let mut df = myval::DataFrame::from(polars_df);
df.metadata_mut().insert(
    // set "database" metadata field
    "database".to_owned(),
    serde_json::to_string(&json!({
        // table, required
        "table": "test",
        // PostgreSQL schema, optional
        "postgres": { "schema": "public" },
        // keys, required if the table has keys/unique indexes
        "keys": ["id"],
        // some field parameters
        "fields": {
            // another way to declare a key field
            //"id": { "key": true },
            // the following data frame columns contain strings which must be
            // sent to the database as JSON (for json/jsonb PostgreSQL types)
            "data1": { "json": true },
            "data2": { "json": true }
        }
    }))?,
);
// send the data frame to the server in a single chunk or in multiple chunks/blocks
```
Supported PostgreSQL types:

- BOOL, INT2 (16-bit int), INT4 (32-bit int), INT8 (64-bit int), FLOAT4 (32-bit float), FLOAT8 (64-bit float)
- TIMESTAMP, TIMESTAMPTZ (time zone information is discarded, as Arrow arrays can not have different time zones for individual records)
- CHAR, VARCHAR
- JSON/JSONB (encoded to strings as LargeUtf8 when fetched)
Myval is not designed for data engineering. Use Polars.
Myval series can contain only a single chunk, and there are no plans to change this. When a Polars data frame with multiple chunks is converted to Myval, the chunks are automatically aggregated.
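Conceptually, the aggregation concatenates all chunks of a column into one contiguous chunk, in order. A sketch on plain vectors (the `aggregate_chunks` function is illustrative, not the myval implementation):

```rust
// Conceptual sketch: aggregating a multi-chunk column into a single
// chunk by concatenating the chunks in order.
fn aggregate_chunks(chunks: &[Vec<i64>]) -> Vec<i64> {
    let total: usize = chunks.iter().map(|c| c.len()).sum();
    let mut out = Vec::with_capacity(total);
    for chunk in chunks {
        out.extend_from_slice(chunk);
    }
    out
}

fn main() {
    // A Polars column arriving as three chunks...
    let chunks = vec![vec![1i64, 2], vec![3], vec![4, 5]];
    // ...becomes one contiguous Myval chunk.
    assert_eq!(aggregate_chunks(&chunks), vec![1, 2, 3, 4, 5]);
}
```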
Some features (Polars conversion, PostgreSQL support) are experimental; use them at your own risk.