This crate wraps the C API of the Baidu PaddlePaddle inference library (paddle_inference_c); see the official documentation for details.

The library (paddle_inference_c.lib on Windows, libpaddle_inference_c.a on Linux) must be in a location the linker can search: on Windows (MSVC) set the `LIB` environment variable, on Linux set `LD_LIBRARY_PATH`, pointing at the directory containing the library.

```rust
use paddle_inference::config::model::Model;
use paddle_inference::config::setting::Cpu;
use paddle_inference::Predictor;
let predictor = Predictor::builder(Model::path(
    "path to the model file",
    "path to the model params file",
))
// run inference on the CPU
.cpu(Cpu {
    threads: Some(std::thread::available_parallelism().unwrap().get() as i32),
    mkldnn: None,
})
// set the optimization cache directory
.set_optimization_cache_dir("caches".to_string())
// create the Predictor
.build();
let names = predictor.input_names();
println!("input names len: {}", names.len());
// get the input handle and set the input data
let input = predictor.input(&names.get(0).unwrap());
input.reshape(&[1, 3, 100, 100]);
input.copy_from_f32(&[0.0; 3 * 100 * 100]);
// run inference
println!("run: {}", predictor.run());
let names = predictor.output_names();
println!("output names len: {}", names.len());
let output = predictor.output(&names.get(0).unwrap());
println!("output type: {:?}", output.data_type());
println!("output shape: {:?}", output.shape());
```
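Before building and running the example, the linker and loader need to find the C library, as described above. On Linux that might look like the following sketch (the install path is only an example, not something this crate prescribes):

```shell
# Point the loader at the directory containing libpaddle_inference_c.a
# (example path; replace with wherever you extracted the Paddle C library)
PADDLE_C_LIB_DIR="$HOME/paddle_inference_c/paddle/lib"
export LD_LIBRARY_PATH="$PADDLE_C_LIB_DIR${LD_LIBRARY_PATH:+:$LD_LIBRARY_PATH}"
```

On Windows (MSVC), setting the `LIB` environment variable to the same directory serves the equivalent purpose.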