google-cloud-bigquery

Google Cloud Platform BigQuery Client library.


Installation

```toml
[dependencies]
google-cloud-bigquery = <version>
google-cloud-default = { version = <version>, features = ["bigquery"] }
```

Quick Start

CreateClient

The function create() will try to read the credentials from a file specified in the environment variable GOOGLE_APPLICATION_CREDENTIALS, from GOOGLE_APPLICATION_CREDENTIALS_JSON, or from a metadata server.

This is also described in google-cloud-auth.

```rust
use google_cloud_bigquery::client::{ClientConfig, Client};
use google_cloud_default::bigquery::CreateAuthExt;

async fn run() {
    let (config, project_id) = ClientConfig::new_with_auth().await.unwrap();
    let client = Client::new(config).await.unwrap();
}
```

When you can't use the gcloud authentication but you have a different way to get your credentials (e.g. a different environment variable), you can parse your own version of the 'credentials-file' and use it like this:

```rust
use google_cloud_auth::credentials::CredentialsFile;
use google_cloud_bigquery::client::{ClientConfig, Client};
use google_cloud_default::bigquery::CreateAuthExt;

async fn run(cred: CredentialsFile) {
    let (config, project_id) = ClientConfig::new_with_credentials(cred).await.unwrap();
    let client = Client::new(config).await.unwrap();
}
```

Read Data

Query

```rust
use google_cloud_bigquery::http::job::query::QueryRequest;
use google_cloud_bigquery::query::row::Row;
use google_cloud_bigquery::client::Client;

async fn run(client: &Client, project_id: &str) {
    let request = QueryRequest {
        query: "SELECT * FROM dataset.table".to_string(),
        ..Default::default()
    };
    let mut iter = client.query(project_id, request).await.unwrap();
    while let Some(row) = iter.next::<Row>().await.unwrap() {
        let col1 = row.column::<String>(0);
        let col2 = row.column::<Option<String>>(1);
    }
}
```

Read Table

```rust
use google_cloud_bigquery::storage::row::Row;
use google_cloud_bigquery::client::Client;
use google_cloud_bigquery::http::table::TableReference;

async fn run(client: &Client, project_id: &str) {
    let table = TableReference {
        project_id: project_id.to_string(),
        dataset_id: "dataset".to_string(),
        table_id: "table".to_string(),
    };
    let mut iter = client.read_table::<Row>(&table, None).await.unwrap();
    while let Some(row) = iter.next().await.unwrap() {
        let col1 = row.column::<String>(0);
        let col2 = row.column::<Option<String>>(1);
    }
}
```

Values

Default supported types to decode by row.column::<T>() are the following (a short decoding sketch follows the list):

* String (for STRING)
* bool (for BOOL)
* i64 (for INT64)
* f64 (for FLOAT)
* bigdecimal::BigDecimal (for NUMERIC, BIGNUMERIC)
* Vec<u8> (for BINARY)
* time::OffsetDateTime (for TIMESTAMP)
* time::Date (for DATE)
* time::Time (for TIME)
* T: StructDecodable (for STRUCT)
* Option<T> (for all NULLABLE)
* Vec<T> (for ARRAY)
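As a quick illustration of this mapping, here is a minimal sketch decoding a query row whose columns are TIMESTAMP, NUMERIC, ARRAY<STRING>, and a NULLABLE STRING. The column layout and names are hypothetical; the calls follow the query example above.

```rust
use google_cloud_bigquery::query::row::Row;

// Hypothetical layout: col 0 TIMESTAMP, col 1 NUMERIC,
// col 2 ARRAY<STRING>, col 3 NULLABLE STRING.
fn decode(row: &Row) {
    let ts = row.column::<time::OffsetDateTime>(0);        // TIMESTAMP
    let amount = row.column::<bigdecimal::BigDecimal>(1);  // NUMERIC
    let tags = row.column::<Vec<String>>(2);               // ARRAY<STRING>
    let note = row.column::<Option<String>>(3);            // NULLABLE STRING
}
```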

Insert Data

Table data API

```rust
use google_cloud_bigquery::http::tabledata::insert_all::{InsertAllRequest, Row};
use google_cloud_bigquery::client::Client;

#[derive(serde::Serialize)]
pub struct TestData {
    pub col1: String,
    #[serde(with = "time::serde::rfc3339::option")]
    pub col_timestamp: Option<time::OffsetDateTime>,
    // Must serialize as a base64 string to insert binary data
    // #[serde(default, with = "Base64Standard")]
    pub col_binary: Vec<u8>,
}

async fn run(client: &Client, project_id: &str, data: TestData) {
    let data1 = Row {
        insert_id: None,
        json: data,
    };
    let request = InsertAllRequest {
        rows: vec![data1],
        ..Default::default()
    };
    let result = client
        .tabledata()
        .insert(project_id, "dataset", "table", &request)
        .await
        .unwrap();
    let error = result.insert_errors;
}
```
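When some rows are rejected, insert_errors reports them per row while the rest of the batch may still succeed. Below is a hedged sketch of inspecting it; the response type path and field shapes are assumptions mirroring the BigQuery insertAll REST response, so consult the crate docs for the exact types.

```rust
use google_cloud_bigquery::http::tabledata::insert_all::InsertAllResponse;

// Hedged sketch: field names here are assumptions based on the REST insertAll response.
fn report_insert_errors(result: &InsertAllResponse) {
    if let Some(errors) = &result.insert_errors {
        for row_error in errors {
            // Each entry carries the index of the failing row and its error details.
            println!("row {} failed: {:?}", row_error.index, row_error.errors);
        }
    }
}
```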

Run load job

Example: loading CSV data from GCS.

```rust
use google_cloud_bigquery::client::Client;
use google_cloud_bigquery::http::bigquery_job_client::BigqueryJobClient;
use google_cloud_bigquery::http::job::cancel::CancelJobRequest;
use google_cloud_bigquery::http::job::get::GetJobRequest;
use google_cloud_bigquery::http::job::get_query_results::GetQueryResultsRequest;
use google_cloud_bigquery::http::job::query::QueryRequest;
use google_cloud_bigquery::http::job::{Job, JobConfiguration, JobConfigurationLoad, JobReference, JobState, JobType, OperationType, TrainingType, WriteDisposition};
use google_cloud_bigquery::http::table::{SourceFormat, TableReference};

async fn run(client: &Client, project_id: &str, data_path: &str) {
    let job = Job {
        job_reference: JobReference {
            project_id: project_id.to_string(),
            job_id: "job_id".to_string(),
            location: Some("asia-northeast1".to_string()),
        },
        // CSV configuration
        configuration: JobConfiguration {
            job: JobType::Load(JobConfigurationLoad {
                source_uris: vec![format!("gs://{}.csv", data_path)],
                source_format: Some(SourceFormat::Csv),
                field_delimiter: Some("|".to_string()),
                encoding: Some("UTF-8".to_string()),
                skip_leading_rows: Some(0),
                autodetect: Some(true),
                write_disposition: Some(WriteDisposition::WriteTruncate),
                destination_table: TableReference {
                    project_id: project_id.to_string(),
                    dataset_id: "dataset".to_string(),
                    table_id: "table".to_string(),
                },
                ..Default::default()
            }),
            ..Default::default()
        },
        ..Default::default()
    };

    // Run job
    let created = client.job().create(&job).await.unwrap();

    // Check status
    assert!(created.status.errors.is_none());
    assert!(created.status.error_result.is_none());
    assert!(created.status.state == JobState::Running || created.status.state == JobState::Done);
}
```
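Load jobs run asynchronously: create() can return while the job is still Running, so callers typically poll until it reaches Done. Below is a hedged sketch of such a loop built from the GetJobRequest import above; the exact get() signature and status fields are assumptions inferred from the snippet, and the example assumes a tokio runtime.

```rust
use google_cloud_bigquery::client::Client;
use google_cloud_bigquery::http::bigquery_job_client::BigqueryJobClient;
use google_cloud_bigquery::http::job::get::GetJobRequest;
use google_cloud_bigquery::http::job::JobState;

// Hedged sketch: poll until the job leaves the Running state.
async fn wait_for_done(client: &Client, project_id: &str, job_id: &str) {
    loop {
        let job = client
            .job()
            .get(project_id, job_id, &GetJobRequest::default())
            .await
            .unwrap();
        if job.status.state == JobState::Done {
            // error_result is populated when the job failed.
            assert!(job.status.error_result.is_none());
            return;
        }
        tokio::time::sleep(std::time::Duration::from_secs(1)).await;
    }
}
```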

Features

HTTP API