The interface is not stable.
This is a Minimum Viable Product for feedback, experimentation, and iteration.
Provides a backend for the `metrics` facade crate that emits metrics in CloudWatch Embedded Metrics Format.
```rust
// Initialize the metrics backend with a CloudWatch namespace
let metrics = metrics_cloudwatch_embedded::Builder::new()
    .cloudwatch_namespace("MyApplication")
    .init()
    .unwrap();

// Record a counter via the metrics facade
metrics::increment_counter!("requests", "Method" => "Default");

// Attach a property to the metric document and flush it
metrics
    .set_property("RequestId", "ABC123")
    .flush();
```
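Beyond counters, the same backend accepts the other facade macros. The sketch below is illustrative (the metric names and values are made up, and the macro signatures follow the metrics 0.21-style API used above); the comment shows roughly the shape of the EMF document that a flush produces.

```rust
// Illustrative only: other metrics-facade macros recorded through the same backend.
metrics::gauge!("queue_depth", 17.0, "Method" => "Default");
metrics::histogram!("latency_ms", 12.5, "Method" => "Default");

// On the next flush (e.g. the chained set_property(...).flush() call above), the
// recorded values are serialized into a CloudWatch Embedded Metrics Format document,
// roughly of this shape (abbreviated, illustrative):
//
// {
//   "_aws": {
//     "Timestamp": 1690000000000,
//     "CloudWatchMetrics": [{
//       "Namespace": "MyApplication",
//       "Dimensions": [["Method"]],
//       "Metrics": [
//         { "Name": "requests" },
//         { "Name": "queue_depth" },
//         { "Name": "latency_ms" }
//       ]
//     }]
//   },
//   "Method": "Default",
//   "RequestId": "ABC123",
//   "requests": 1,
//   "queue_depth": 17.0,
//   "latency_ms": [12.5]
// }
```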
The Lambda Runtime integration feature handles flushing metrics after each invoke, either via the `run()` alternatives or via `MetricService`, which implements the `tower::Service` trait.
It also provides optional helpers for emitting a metric on cold starts and for decorating metric documents with the request id and/or X-Ray trace id.
In your Cargo.toml add:
```toml
metrics_cloudwatch_embedded = { version = "0.3", features = ["lambda"] }
```
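The example below also pulls in a few companion crates. One possible set of additions is sketched here; the crate names are real, but the versions are illustrative, and the `metrics` version should match what `metrics_cloudwatch_embedded` expects:

```toml
# Illustrative companion dependencies for the example below; versions are not authoritative.
lambda_runtime = "0.8"
metrics = "0.21"
serde = { version = "1", features = ["derive"] }
tokio = { version = "1", features = ["macros"] }
tracing-subscriber = { version = "0.3", features = ["env-filter"] }
```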
```rust
use lambda_runtime::{Error, LambdaEvent};
use metrics_cloudwatch_embedded::lambda::handler::run;
use serde::{Deserialize, Serialize};

#[derive(Deserialize)]
struct Request {}

#[derive(Serialize)]
struct Response {}

async fn function_handler(_event: LambdaEvent<Request>) -> Result<Response, Error> {
    Ok(Response {})
}

#[tokio::main]
async fn main() -> Result<(), Error> {
    // Set up tracing so runtime logs and metric errors are visible
    tracing_subscriber::fmt()
        .with_env_filter(tracing_subscriber::filter::EnvFilter::from_default_env())
        .with_target(false)
        .without_time()
        .compact()
        .init();

    // Configure the metrics backend: namespace, a Function dimension,
    // a cold start metric, and request id decoration
    let metrics = metrics_cloudwatch_embedded::Builder::new()
        .cloudwatch_namespace("MetricsExample")
        .with_dimension("Function", std::env::var("AWS_LAMBDA_FUNCTION_NAME").unwrap())
        .lambda_cold_start_metric("ColdStart")
        .with_lambda_request_id("RequestId")
        .init()
        .unwrap();

    // Run the Lambda runtime; metrics are flushed after each invoke
    run(metrics, function_handler).await
}
```
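Anything recorded through the `metrics` facade while the handler runs is picked up by the per-invoke flush described above. As a sketch (reusing the counter macro from the first example), the empty handler could instead count requests:

```rust
// Illustrative handler body: metrics recorded during the invoke are flushed
// automatically by the runtime integration after the handler returns.
async fn function_handler(_event: LambdaEvent<Request>) -> Result<Response, Error> {
    metrics::increment_counter!("requests", "Method" => "Default");
    Ok(Response {})
}
```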
* Histogram values are buffered between calls to `collector::Collector::flush`; overflow will report an error via the `tracing` crate
* Dimensions set via `Builder::with_dimension(...)` may not overlap with metric labels
* Only the subset of metric units in `metrics::Unit` that CloudWatch accepts are supported; see https://docs.aws.amazon.com/AmazonCloudWatch/latest/APIReference/API_MetricDatum.html
* Registering different metric types under the same `metrics::Key` will fail with an error via the `tracing` crate
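For the units limitation, a unit can be attached when a metric is described through the facade. The sketch below is illustrative (it uses the metrics 0.21-style `describe_histogram!` macro and assumes the chosen `metrics::Unit` variant maps onto a unit CloudWatch accepts):

```rust
// Illustrative: attach a unit and description via the metrics facade.
// Units that do not map onto CloudWatch's MetricDatum units are not usable here.
metrics::describe_histogram!("latency_ms", metrics::Unit::Milliseconds, "handler latency");
metrics::histogram!("latency_ms", 12.5);
```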