huggingface/distilgpt2 is a repository forked from huggingface. License: apache-2.0
Latest commit: 96699c9fd5 "upload flax model" by Patrick von Platen, 2021-05-21 09:15:46 +00:00

| File | Last commit | Date |
| --- | --- | --- |
| .gitattributes | allow flax | 2021-05-21 09:15:28 +00:00 |
| 64.tflite | Update 64.tflite | 2019-11-29 21:18:39 +00:00 |
| README.md | Migrate model card from transformers-repo | 2020-12-11 22:24:14 +01:00 |
| config.json | Update config.json | 2020-05-11 21:02:01 +00:00 |
| flax_model.msgpack | upload flax model | 2021-05-21 09:15:46 +00:00 |
| merges.txt | Update merges.txt | 2019-10-03 14:08:14 +00:00 |
| pytorch_model.bin | Update pytorch_model.bin | 2019-10-07 16:29:52 +00:00 |
| rust_model.ot | Update rust_model.ot | 2020-04-24 19:42:34 +00:00 |
| tf_model.h5 | Update tf_model.h5 | 2019-10-07 16:29:52 +00:00 |
| tokenizer.json | Update tokenizer.json | 2020-10-12 12:56:42 +00:00 |
| vocab.json | Update vocab.json | 2019-10-03 14:08:13 +00:00 |
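
The repository ships the same weights in several framework-specific formats (pytorch_model.bin for PyTorch, tf_model.h5 for TensorFlow, flax_model.msgpack for Flax, rust_model.ot for the Rust bindings), alongside the tokenizer and config files. These are the files that transformers' from_pretrained machinery resolves; as a rough sketch, individual files can also be fetched with huggingface_hub (the repo id "distilgpt2" is assumed here):

```python
# Sketch: fetch individual repo files with huggingface_hub.
# Assumes the repo id "distilgpt2"; transformers' from_pretrained()
# normally performs this download and caching automatically.
from huggingface_hub import hf_hub_download

config_path = hf_hub_download(repo_id="distilgpt2", filename="config.json")
weights_path = hf_hub_download(repo_id="distilgpt2", filename="pytorch_model.bin")
print(config_path, weights_path)
```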

README.md

---
language: en
tags:
- exbert
license: apache-2.0
datasets:
- openwebtext
---

DistilGPT2

DistilGPT2 is an English-language model pretrained with the supervision of the smallest version of GPT2 on OpenWebTextCorpus, a reproduction of OpenAI's WebText dataset. The model has 6 layers, a hidden size of 768, and 12 attention heads, totaling 82M parameters (compared to 124M for GPT2). On average, DistilGPT2 is twice as fast as GPT2.
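
As with GPT2, the model can be used directly with a text-generation pipeline; a minimal sketch (the prompt and sampling settings below are arbitrary):

```python
# Sketch: text generation with DistilGPT2 via the transformers pipeline.
from transformers import pipeline, set_seed

generator = pipeline("text-generation", model="distilgpt2")
set_seed(42)  # make the sampled continuations reproducible
for out in generator("Hello, I'm a language model,",
                     max_length=30, num_return_sequences=3):
    print(out["generated_text"])
```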

On the WikiText-103 benchmark, GPT2 reaches a perplexity of 16.3 on the test set, compared to 21.1 for DistilGPT2 (after fine-tuning on the train set).
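
For reference, such a perplexity figure can be estimated with a sliding-window evaluation over the WikiText-103 test set. The sketch below uses transformers and datasets; the window size, stride, and text preprocessing are illustrative assumptions, not necessarily the setup behind the reported numbers:

```python
# Sketch: sliding-window perplexity of (Distil)GPT2 on the WikiText-103 test set.
# Window size, stride, and text joining are assumptions; the reported numbers
# may come from a different setup (and, per the card, after fine-tuning).
import math
import torch
from datasets import load_dataset
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "distilgpt2"  # or "gpt2" for the baseline comparison
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id).eval()

test = load_dataset("wikitext", "wikitext-103-raw-v1", split="test")
encodings = tokenizer("\n\n".join(test["text"]), return_tensors="pt")

max_length = model.config.n_positions  # 1024 for GPT2-family models
stride = 512
seq_len = encodings.input_ids.size(1)

nll_sum, n_tokens, prev_end = 0.0, 0, 0
for begin in range(0, seq_len, stride):
    end = min(begin + max_length, seq_len)
    trg_len = end - prev_end              # tokens newly scored in this window
    input_ids = encodings.input_ids[:, begin:end]
    target_ids = input_ids.clone()
    target_ids[:, :-trg_len] = -100       # ignore the context-only prefix
    with torch.no_grad():
        nll = model(input_ids, labels=target_ids).loss
    nll_sum += nll.item() * trg_len
    n_tokens += trg_len
    prev_end = end
    if end == seq_len:
        break

print(f"perplexity: {math.exp(nll_sum / n_tokens):.2f}")
```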

We encourage you to check out GPT2 to learn more about its usage, limitations, and potential biases.