
---
language: zh
thumbnail: https://ckip.iis.sinica.edu.tw/files/ckip_logo.png
tags:
  - pytorch
  - token-classification
  - bert
  - zh
license: gpl-3.0
---

# CKIP BERT Base Chinese

This project provides traditional Chinese transformers models (including ALBERT, BERT, GPT2) and NLP tools (including word segmentation, part-of-speech tagging, named entity recognition).


## Homepage

- https://github.com/ckiplab/ckip-transformers

## Contributors

- Mu Yang

## Usage

Please use `BertTokenizerFast` as the tokenizer instead of `AutoTokenizer`.


```python
from transformers import (
    BertTokenizerFast,
    AutoModel,
)

# The tokenizer comes from the base bert-base-chinese vocabulary;
# the weights come from the CKIP NER checkpoint.
tokenizer = BertTokenizerFast.from_pretrained('bert-base-chinese')
model = AutoModel.from_pretrained('ckiplab/bert-base-chinese-ner')
```
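
The snippet above loads the backbone only. For end-to-end NER, a minimal inference sketch using this checkpoint's token-classification head (the example sentence and the use of `AutoModelForTokenClassification` are illustrative, not part of the original card):

```python
import torch
from transformers import BertTokenizerFast, AutoModelForTokenClassification

# AutoModelForTokenClassification exposes per-token logits over the
# checkpoint's label set (BIO-style entity tags).
tokenizer = BertTokenizerFast.from_pretrained('bert-base-chinese')
model = AutoModelForTokenClassification.from_pretrained('ckiplab/bert-base-chinese-ner')

text = '傅達仁今將執行安樂死'  # illustrative sentence
inputs = tokenizer(text, return_tensors='pt')

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, seq_len, num_labels)

# Map each token to its highest-scoring label name via the config's id2label.
predictions = logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs['input_ids'][0].tolist())
for token, label_id in zip(tokens, predictions):
    print(token, model.config.id2label[label_id.item()])
```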

For full usage and more information, please refer to https://github.com/ckiplab/ckip-transformers.

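That package also wraps this checkpoint behind a higher-level NER driver; a short sketch of that route (class and field names follow the ckip-transformers documented API, shown here as an illustration):

```python
from ckip_transformers.nlp import CkipNerChunker

# The driver downloads and runs the NER checkpoint internally.
ner_driver = CkipNerChunker(model='bert-base')

# Input is a list of sentences; output is one list of entities per sentence.
results = ner_driver(['傅達仁今將執行安樂死'])
for sentence in results:
    for entity in sentence:
        # Each entity carries the surface word, its NER class, and char offsets.
        print(entity.word, entity.ner, entity.idx)
```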