Migrate model card from transformers-repo

Read announcement at https://discuss.huggingface.co/t/announcement-all-model-cards-will-be-migrated-to-hf-co-model-repos/2755
Original file history: https://github.com/huggingface/transformers/commits/master/model_cards/distilgpt2-README.md
---
language: en
tags:
- exbert

license: apache-2.0
datasets:
- openwebtext
---
# DistilGPT2

DistilGPT2 is an English-language model pre-trained with the supervision of [GPT2](https://huggingface.co/gpt2) (the smallest version of GPT2) on [OpenWebTextCorpus](https://skylion007.github.io/OpenWebTextCorpus/), a reproduction of OpenAI's WebText dataset. The model has 6 layers, a hidden dimension of 768 and 12 attention heads, for a total of 82M parameters (compared to 124M parameters for GPT2). On average, DistilGPT2 is twice as fast as GPT2.
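The card itself does not include a usage snippet, but as a minimal sketch, DistilGPT2 can be loaded through the `transformers` text-generation pipeline; the prompt and generation settings below are arbitrary example choices, not part of the original card.

```python
from transformers import pipeline, set_seed

# Load DistilGPT2 through the generic text-generation pipeline.
generator = pipeline("text-generation", model="distilgpt2")
set_seed(42)  # make the sampled continuations reproducible

# Sample two short continuations of an arbitrary example prompt.
for output in generator(
    "Hello, I'm a language model,",
    max_length=30,
    num_return_sequences=2,
    do_sample=True,
):
    print(output["generated_text"])
```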
On the [WikiText-103](https://blog.einstein.ai/the-wikitext-long-term-dependency-language-modeling-dataset/) benchmark, GPT2 reaches a test-set perplexity of 16.3, compared to 21.1 for DistilGPT2 (after fine-tuning on the train set).
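Perplexity here is the exponential of the mean cross-entropy loss on held-out text. A minimal sketch with `transformers` is shown below; the toy string stands in for the WikiText-103 test set, and the exact evaluation protocol (context length, stride) behind the reported numbers is not specified in the card.

```python
import math
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("distilgpt2")
model = AutoModelForCausalLM.from_pretrained("distilgpt2")
model.eval()

# Toy stand-in for a WikiText-103 test passage; the reported figures use the full test set.
text = "The quick brown fox jumps over the lazy dog."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # With labels provided, the model returns the mean token-level cross-entropy loss.
    loss = model(**inputs, labels=inputs["input_ids"]).loss

print(f"perplexity: {math.exp(loss.item()):.2f}")
```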
We encourage you to check out the [GPT2](https://huggingface.co/gpt2) model card to learn more about usage, limitations and potential biases.

<a href="https://huggingface.co/exbert/?model=distilgpt2">
	<img width="300px" src="https://cdn-media.huggingface.co/exbert/button.png">
</a>