Crawler Data Client

This is a CLI tool that helps automate the market data distribution process.

Setup

First, make sure Rust is installed on the system via rustup.
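
If rustup is not yet installed, the official installer script can be used; this is just a sketch, see https://rustup.rs for the canonical instructions for your platform.

```
# Install rustup and the default Rust toolchain (see https://rustup.rs)
curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs | sh
```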

Then install the CLI with:

```
cargo install crawler_data_client
```

To decompress the downloaded files, you also need a working installation of the zstd package.
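
If zstd is not already available, it can usually be installed from your platform's package manager; the commands below are a sketch for Debian/Ubuntu and Homebrew, and the file name in the decompression example is purely illustrative.

```
# Debian/Ubuntu
sudo apt install zstd

# macOS (Homebrew)
brew install zstd

# Manually decompress a downloaded archive (illustrative file name)
zstd -d trades.BTCUSDT.2022-08-07.csv.zst
```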

Usage

```
crawler_data_client --address --username --password --crawler-type --exchange --market --symbol --month --output

OPTIONS:
    -h, --help            display help text
    -a, --address         ftp server address, eg 12.34.56.78:21 (the port is usually 21)
    -u, --username        your ftp login username
    -p, --password        your ftp login password
    -c, --crawler-type    options: trades, l2_events, l2_topk, bbo, funding_rate, ticker (raw only)
    -e, --exchange        options: binance, ftx, bybit
    -m, --market          options: spot, linear_swap, inverse_swap
    -s, --symbol          eg BTCUSDT
    -o, --output          the local download destination folder, eg ~/Downloads/crawler-data
    --month               optional. the year-month in ISO format, eg 2022-08. required if not using --latest-day or --latest-month
    --day                 optional. the day of the month, eg 07. if not provided, the whole month is downloaded. must be ISO, so days less than 10 need a leading zero
    --keep-compressed     optional. if set, the data is kept compressed as zst. the default is to decompress each file for usage
    --raw                 optional. download the raw data instead of parsed data
    --latest-day          optional. download the latest day, ie the previous utc day at the time of download
    --latest-month        optional. download the latest whole month, ie the month containing the previous utc day
```
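
As an illustration, the following downloads a single day of parsed trades, assuming the installed binary is named crawler_data_client (matching the crate name); the address, username, and password are placeholders, so substitute your own values.

```
crawler_data_client \
    --address 12.34.56.78:21 \
    --username myuser \
    --password mypassword \
    --crawler-type trades \
    --exchange binance \
    --market spot \
    --symbol BTCUSDT \
    --month 2022-08 \
    --day 07 \
    --output ~/Downloads/crawler-data
```

Omitting --day would download the whole month, and passing --latest-day instead of --month and --day would fetch the previous UTC day.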