---
language:
- zh
thumbnail: https://ckip.iis.sinica.edu.tw/files/ckip_logo.png
tags:
- pytorch
- token-classification
- bert
- zh
license: gpl-3.0
---
# CKIP BERT Base Chinese
This project provides traditional Chinese Transformers models (including ALBERT, BERT, and GPT2) and NLP tools (including word segmentation, part-of-speech tagging, and named entity recognition).
## Homepage
- https://github.com/ckiplab/ckip-transformers
## Contributors
- [Mu Yang](https://muyang.pro) at [CKIP](https://ckip.iis.sinica.edu.tw) (Author & Maintainer)
## Usage
Please use BertTokenizerFast as the tokenizer instead of AutoTokenizer.
```python
from transformers import (
    BertTokenizerFast,
    AutoModel,
)

tokenizer = BertTokenizerFast.from_pretrained('bert-base-chinese')
model = AutoModel.from_pretrained('ckiplab/bert-base-chinese-ner')
```
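The snippet above only loads the tokenizer and model. To obtain named entities, the model's per-token label predictions (see `model.config.id2label` for the actual label set) still need to be grouped into entity spans. A minimal grouping sketch, assuming an IOB-style tag scheme; `merge_bio` and the sample tokens/tags below are illustrative and not part of the CKIP API:

```python
# Hypothetical post-processing sketch: merge per-token IOB tags into entity
# spans. The real label scheme of ckiplab/bert-base-chinese-ner may differ
# (check model.config.id2label); this only illustrates the grouping idea.

def merge_bio(tokens, tags):
    """Group (token, tag) pairs into (entity_text, entity_type) spans."""
    entities = []
    current_tokens, current_type = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith('B-'):
            # A new entity begins; flush any entity in progress.
            if current_tokens:
                entities.append((''.join(current_tokens), current_type))
            current_tokens, current_type = [token], tag[2:]
        elif tag.startswith('I-') and current_type == tag[2:]:
            # Continuation of the current entity.
            current_tokens.append(token)
        else:
            # 'O' or a tag that breaks the current span.
            if current_tokens:
                entities.append((''.join(current_tokens), current_type))
            current_tokens, current_type = [], None
    if current_tokens:
        entities.append((''.join(current_tokens), current_type))
    return entities

# Illustrative tokens and tags (not actual model output):
tokens = ['傅', '達', '仁', '今', '天', '去', '台', '北']
tags = ['B-PERSON', 'I-PERSON', 'I-PERSON', 'O', 'O', 'O', 'B-GPE', 'I-GPE']
print(merge_bio(tokens, tags))
# [('傅達仁', 'PERSON'), ('台北', 'GPE')]
```

The same grouping is available out of the box via the transformers `pipeline("token-classification", ...)` helper with an `aggregation_strategy`, which may be more convenient in practice.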
For full usage and more information, please refer to https://github.com/ckiplab/ckip-transformers.