WebScrapingApi is an API that lets you scrape websites while using rotating proxies to prevent bans. This SDK for Rust makes the API easier to use in any Rust project.
Add the following dependency to your `Cargo.toml`:

```
webscrapingapi = "0.1.0"
```
To use the API and the SDK you will need an API key. You can get one by registering at WebScrapingApi.
Using the SDK is quite easy. An example of a GET call to the API is the following:
```rust
use webscrapingapi::WebScrapingAPI;
use webscrapingapi::QueryBuilder;
use std::collections::HashMap;
use std::error::Error;

async fn get_example(wsa: &WebScrapingAPI<'_>) -> Result<(), Box<dyn Error>> {
    // Build the request parameters with the SDK's query builder
    let mut query_builder = QueryBuilder::new();

    query_builder.url("http://httpbin.org/headers");
    query_builder.render_js("1");

    // Custom headers to forward to the target site
    let mut headers: HashMap<String, String> = HashMap::new();
    headers.insert("Wsa-test".to_string(), "abcd".to_string());

    query_builder.headers(headers);

    let html = wsa.get(query_builder).await?.text().await?;
    println!("{}", html);

    Ok(())
}

async fn raw_get_example(wsa: &WebScrapingAPI<'_>) -> Result<(), Box<dyn Error>> {
    // With raw_get, the query parameters are passed as a plain map
    // instead of going through the query builder
    let mut params: HashMap<&str, &str> = HashMap::new();
    params.insert("url", "http://httpbin.org/headers");
    params.insert("render_js", "1");

    let mut headers: HashMap<String, String> = HashMap::new();
    headers.insert("Wsa-test".to_string(), "abcd".to_string());

    let html = wsa.raw_get(params, headers).await?.text().await?;
    println!("{}", html);

    Ok(())
}

#[tokio::main]
async fn main() -> Result<(), Box<dyn Error>> {
    // Initialize the client with your WebScrapingApi API key
    let wsa = WebScrapingAPI::new("YOUR_API_KEY");

    get_example(&wsa).await?;
    raw_get_example(&wsa).await?;

    Ok(())
}
```
Note that in order to run the async requests for webscrapingapi from `main` we used the dependency:

```
tokio = { version = "1", features = ["full"] }
```
All dependencies of the crate:

```
urlencoding = "2.1.0"
reqwest = { version = "0.11", features = ["json"] }
tokio = { version = "1", features = ["full"] }
```
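Putting the pieces together, the `[dependencies]` section of your project's `Cargo.toml` might look like the following sketch (the crate versions are the ones listed above):

```toml
[dependencies]
webscrapingapi = "0.1.0"
# tokio provides the async runtime used to drive the requests from main
tokio = { version = "1", features = ["full"] }
```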
For a better understanding of the parameters, please check out our documentation.