The Jieba Chinese Word Segmentation Implemented in Rust
Add it to your Cargo.toml:
```toml
[dependencies]
jieba-rs = "0.6"
```
Then you are good to go. If you are using Rust 2015, you also have to add `extern crate jieba_rs;` to your crate root.
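Concretely, that means the crate root (`main.rs` or `lib.rs`) starts with the declaration below; on the 2018 edition and later it is unnecessary.

```rust
// Required only on the 2015 edition: pull the external crate into the crate root
extern crate jieba_rs;
```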
```rust
use jieba_rs::Jieba;

fn main() {
    let jieba = Jieba::new();
    let words = jieba.cut("我们中出了一个叛徒", false);
    assert_eq!(words, vec!["我们", "中", "出", "了", "一个", "叛徒"]);
}
```
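The same `Jieba` instance also offers full-mode and search-engine-mode segmentation. A minimal sketch, assuming the `cut_all` and `cut_for_search` methods as exposed by the 0.6 series:

```rust
use jieba_rs::Jieba;

fn main() {
    let jieba = Jieba::new();

    // Full mode: list every dictionary word found in the sentence
    let all = jieba.cut_all("我们中出了一个叛徒");
    println!("{:?}", all);

    // Search-engine mode: further splits long words for indexing;
    // the boolean toggles the HMM model for out-of-vocabulary words
    let search = jieba.cut_for_search("我们中出了一个叛徒", true);
    println!("{:?}", search);
}
```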
* `default-dict` feature enables the embedded dictionary; this feature is enabled by default
* `tfidf` feature enables the TF-IDF keyword extractor (see the sketch after the snippet below)
* `textrank` feature enables the TextRank keyword extractor

```toml
[dependencies]
jieba-rs = { version = "0.6", features = ["tfidf", "textrank"] }
```
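With the `tfidf` feature enabled, keyword extraction is handled by a separate extractor built on top of a `Jieba` instance. The sketch below assumes the `TFIDF` type with a `new_with_jieba` constructor, the `KeywordExtract` trait, and an `extract_tags(sentence, top_k, allowed_pos)` method returning keywords with weights, as documented for the 0.6 series; the TextRank extractor is expected to follow the same pattern. Check the crate docs for your exact version.

```rust
use jieba_rs::{Jieba, KeywordExtract, TFIDF};

fn main() {
    let jieba = Jieba::new();
    // Build the TF-IDF extractor on top of an existing segmenter
    // (constructor name assumed from the 0.6-era docs)
    let extractor = TFIDF::new_with_jieba(&jieba);

    // Top 3 keywords; the empty Vec means no part-of-speech filter
    let top_k = extractor.extract_tags(
        "今天纽约的天气真好啊，京华大酒店的张尧经理吃了一只北京烤鸭",
        3,
        vec![],
    );
    for kw in top_k {
        // `keyword` and `weight` field names are assumptions from the 0.6-era docs
        println!("{} {}", kw.keyword, kw.weight);
    }
}
```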
To run the benchmarks:

```bash
cargo bench --all-features
```
jieba-rs bindings

This work is released under the MIT license. A copy of the license is provided in the LICENSE file.