This model is a character-level tokenization variant of ernie1.0: every non-whitespace character is split into its own token, and all tokens longer than one character have been removed from both the tokenizer and the model.
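
The per-character splitting rule described above can be sketched as follows. This is an illustrative sketch only; `char_split` is a hypothetical helper and not the model's actual tokenizer implementation:

```python
def char_split(text: str) -> list[str]:
    """Split text into individual non-whitespace characters,
    mirroring the character-level tokenization described above.
    Whitespace is dropped; every remaining character becomes one token."""
    return [ch for ch in text if not ch.isspace()]

# Each non-whitespace character (Latin or CJK) becomes a separate token.
print(char_split("ERNIE 模型"))  # → ['E', 'R', 'N', 'I', 'E', '模', '型']
```

In the actual model, any entry longer than one character was additionally pruned from the tokenizer vocabulary and the corresponding embedding rows removed.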
