huggingface/distilbert-base-uncased-finetuned-sst-2-english is a repository forked from huggingface. License: apache-2.0
Files in this repository:

  • .gitattributes
  • README.md
  • config.json
  • pytorch_model.bin
  • rust_model.ot
  • tf_model.h5
  • vocab.txt

README.md

language: en
license: apache-2.0
datasets: sst-2

DistilBERT base uncased finetuned SST-2

This model is a fine-tuned checkpoint of DistilBERT-base-uncased, fine-tuned on SST-2. It reaches an accuracy of 91.3 on the dev set (for comparison, the BERT bert-base-uncased version reaches an accuracy of 92.7).
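
The card itself does not include usage code; the snippet below is a minimal inference sketch, assuming the Transformers pipeline API and the distilbert-base-uncased-finetuned-sst-2-english model id on the Hub.

```python
# Minimal inference sketch; the model id is assumed to be hosted on the Hugging Face Hub.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

print(classifier("I really enjoyed this movie!"))
# Output is a list of dicts, e.g. [{'label': 'POSITIVE', 'score': ...}]
```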

Fine-tuning hyper-parameters

  • learning_rate = 1e-5
  • batch_size = 32
  • warmup = 600
  • max_seq_length = 128
  • num_train_epochs = 3.0
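
For illustration, a hypothetical reproduction of this setup with the Trainer API might look like the sketch below; the original fine-tuning script is not part of this repository, so the dataset loading call, column names, and output directory are assumptions.

```python
# Hypothetical fine-tuning sketch that maps the listed hyper-parameters onto
# TrainingArguments; this is not the original training script.
from datasets import load_dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=2
)

# SST-2 as distributed through the GLUE benchmark (column name "sentence" assumed).
dataset = load_dataset("glue", "sst2")
encoded = dataset.map(
    lambda batch: tokenizer(
        batch["sentence"], truncation=True, max_length=128, padding="max_length"
    ),
    batched=True,
)

args = TrainingArguments(
    output_dir="distilbert-base-uncased-finetuned-sst-2",
    learning_rate=1e-5,
    per_device_train_batch_size=32,
    warmup_steps=600,
    num_train_epochs=3.0,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded["train"],
    eval_dataset=encoded["validation"],
)
trainer.train()
trainer.evaluate()  # reports dev-set loss; accuracy needs a compute_metrics function
```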