This is a library for integrating Web Scraper into your flow function for flows.network.

Visit Web Scraper

The example below shows a lambda service that responds with the text content of the web page whose URL is passed as the `url` query parameter.

```rust
use std::collections::HashMap;

use lambda_flows::{request_received, send_response};
use serde_json::Value;
use webscraper_flows::get_page_text;

#[no_mangle]
#[tokio::main(flavor = "current_thread")]
pub async fn run() {
    // Register the handler to be called whenever the lambda receives a request.
    request_received(handler).await;
}

async fn handler(qry: HashMap<String, Value>, _body: Vec<u8>) {
    // Read the target URL from the `url` query parameter.
    let url = qry.get("url").expect("No url provided").as_str().unwrap();

    match get_page_text(url).await {
        // On success, return the page text as a plain-text response.
        Ok(text) => send_response(
            200,
            vec![(
                String::from("content-type"),
                String::from("text/plain; charset=UTF-8"),
            )],
            text.as_bytes().to_vec(),
        ),
        // On failure, return the error message with a 400 status.
        Err(e) => send_response(
            400,
            vec![(
                String::from("content-type"),
                String::from("text/plain; charset=UTF-8"),
            )],
            e.as_bytes().to_vec(),
        ),
    }
}
```
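
Outside the lambda entry point, `get_page_text` can be called from any async context. The snippet below is a minimal sketch, not part of the library: a hypothetical helper that logs the result instead of sending an HTTP response, assuming the error value is string-like, as `e.as_bytes()` in the handler above implies.

```rust
use webscraper_flows::get_page_text;

// Hypothetical helper for illustration only: fetch a page's text and log it.
// Assumes get_page_text is async and its error converts to a string,
// as the handler above implies.
async fn log_page_text(url: &str) {
    match get_page_text(url).await {
        Ok(text) => println!("fetched {} characters of text", text.chars().count()),
        Err(e) => eprintln!("web scraper error: {}", e),
    }
}
```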

The full documentation is here.