huggingface/distilroberta-base is a repository forked from huggingface. License: apache-2.0
Latest commit: ec58a5b7f7 "upload flax model" by Patrick von Platen, 2021-05-20 22:47:11 +00:00

| File | Last commit message | Date |
|---|---|---|
| .gitattributes | allow flax | 2021-05-20 22:46:55 +00:00 |
| README.md | Migrate model card from transformers-repo | 2020-12-11 22:24:18 +01:00 |
| config.json | Update config.json | 2020-04-24 15:58:11 +00:00 |
| dict.txt | Update dict.txt | 2019-10-17 19:12:55 +00:00 |
| flax_model.msgpack | upload flax model | 2021-05-20 22:47:11 +00:00 |
| merges.txt | Update merges.txt | 2019-10-17 19:12:54 +00:00 |
| pytorch_model.bin | Update pytorch_model.bin | 2019-10-17 19:12:53 +00:00 |
| rust_model.ot | addition of Rust model | 2020-11-24 16:51:19 +01:00 |
| tf_model.h5 | Update tf_model.h5 | 2019-10-17 19:12:53 +00:00 |
| tokenizer.json | Update tokenizer.json | 2020-10-12 12:56:43 +00:00 |
| vocab.json | Update vocab.json | 2019-10-17 19:12:55 +00:00 |
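
The checkpoint files above cover several frameworks. As a minimal sketch (not part of the original card), assuming the transformers library is installed together with PyTorch, TensorFlow and Flax, they can be loaded as follows:

```python
from transformers import AutoTokenizer, AutoModel, TFAutoModel, FlaxAutoModel

model_id = "distilroberta-base"

# tokenizer.json / vocab.json / merges.txt back the tokenizer
tokenizer = AutoTokenizer.from_pretrained(model_id)

pt_model = AutoModel.from_pretrained(model_id)        # pytorch_model.bin
tf_model = TFAutoModel.from_pretrained(model_id)      # tf_model.h5
flax_model = FlaxAutoModel.from_pretrained(model_id)  # flax_model.msgpack
# rust_model.ot targets Rust-based libraries (e.g. rust-bert) and is not loaded here
```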

README.md

language: en
tags: exbert
license: apache-2.0
datasets: openwebtext

DistilRoBERTa base model

This model is a distilled version of the RoBERTa-base model. It follows the same training procedure as DistilBERT. The code for the distillation process can be found here. This model is case-sensitive: it makes a difference between english and English.

The model has 6 layers, a hidden size of 768, and 12 attention heads, for a total of 82M parameters (compared to 125M parameters for RoBERTa-base). On average, DistilRoBERTa is twice as fast as RoBERTa-base.
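
A quick way to check these figures, sketched here under the assumption that transformers and PyTorch are installed:

```python
from transformers import AutoConfig, AutoModel

config = AutoConfig.from_pretrained("distilroberta-base")
print(config.num_hidden_layers, config.hidden_size, config.num_attention_heads)  # 6 768 12

model = AutoModel.from_pretrained("distilroberta-base")
n_params = sum(p.numel() for p in model.parameters())
print(f"{n_params / 1e6:.0f}M parameters")  # roughly 82M
```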

We encourage you to check out the RoBERTa-base model to learn more about usage, limitations and potential biases.
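
As a usage sketch, the checkpoint can be queried directly with the fill-mask pipeline (RoBERTa-style models use `<mask>` as the mask token); the example sentence is illustrative only:

```python
from transformers import pipeline

# returns the top candidate tokens for the masked position, with scores
unmasker = pipeline("fill-mask", model="distilroberta-base")
print(unmasker("Hello I'm a <mask> model."))
```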

Training data

DistilRoBERTa was pre-trained on OpenWebTextCorpus, a reproduction of OpenAI's WebText dataset (roughly 4 times less training data than was used for the teacher model, RoBERTa).
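
For illustration only, the corpus can be inspected with the datasets library; the `openwebtext` dataset id used below is an assumption of this sketch, not something specified by the card:

```python
from datasets import load_dataset

# streaming avoids downloading the full corpus up front
openwebtext = load_dataset("openwebtext", split="train", streaming=True)
print(next(iter(openwebtext))["text"][:200])  # peek at the first document
```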

Evaluation results

When fine-tuned on downstream tasks, this model achieves the following results:

GLUE test results:

| Task | MNLI | QQP | QNLI | SST-2 | CoLA | STS-B | MRPC | RTE |
|------|------|-----|------|-------|------|-------|------|-----|
| Score | 84.0 | 89.4 | 90.8 | 92.5 | 59.3 | 88.3 | 86.6 | 67.9 |
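
A minimal fine-tuning sketch for one of these tasks (SST-2) is shown below. It assumes the transformers and datasets libraries; the hyperparameters are illustrative rather than the ones used to produce the scores above:

```python
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

tokenizer = AutoTokenizer.from_pretrained("distilroberta-base")
model = AutoModelForSequenceClassification.from_pretrained("distilroberta-base", num_labels=2)

# SST-2: binary sentiment classification on single sentences
sst2 = load_dataset("glue", "sst2")
encoded = sst2.map(lambda batch: tokenizer(batch["sentence"], truncation=True), batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="distilroberta-sst2",
                           per_device_train_batch_size=16,
                           num_train_epochs=3),
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
    tokenizer=tokenizer,  # enables dynamic padding via DataCollatorWithPadding
)
trainer.train()
print(trainer.evaluate())
```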

BibTeX entry and citation info

@article{Sanh2019DistilBERTAD,
  title={DistilBERT, a distilled version of BERT: smaller, faster, cheaper and lighter},
  author={Victor Sanh and Lysandre Debut and Julien Chaumond and Thomas Wolf},
  journal={ArXiv},
  year={2019},
  volume={abs/1910.01108}
}