huggingface/distilbert-base-cased-distilled-squad is a model repository from huggingface. License: apache-2.0
Latest commit: Julien Chaumond 626af3168b "Migrate model card from transformers-repo" (2020-12-11 22:23:50 +01:00)
Read the announcement at https://discuss.huggingface.co/t/announcement-all-model-cards-will-be-migrated-to-hf-co-model-repos/2755
Original file history: https://github.com/huggingface/transformers/commits/master/model_cards/distilbert-base-cased-distilled-squad-README.md
| File | Last commit message | Date |
| --- | --- | --- |
| .gitattributes | initial commit | 2020-02-07 19:16:00 +00:00 |
| README.md | Migrate model card from transformers-repo | 2020-12-11 22:23:50 +01:00 |
| config.json | Update config.json | 2020-04-24 15:57:53 +00:00 |
| pytorch_model.bin | Update pytorch_model.bin | 2020-02-07 19:16:00 +00:00 |
| rust_model.ot | Update rust_model.ot | 2020-04-24 19:38:16 +00:00 |
| saved_model.tar.gz | Update saved_model.tar.gz | 2020-04-17 21:29:58 +00:00 |
| tf_model.h5 | Update tf_model.h5 | 2020-02-07 19:16:00 +00:00 |
| tfjs.tar.gz | Update tfjs.tar.gz | 2020-04-17 21:31:29 +00:00 |
| tokenizer.json | Update tokenizer.json | 2020-10-12 12:56:41 +00:00 |
| tokenizer_config.json | Add tokenizer configuration | 2020-11-25 15:16:04 +01:00 |
| vocab.txt | Add vocab | 2020-11-25 15:37:42 +01:00 |

README.md

---
language: en
datasets:
- squad
metrics:
- squad
license: apache-2.0
---

# DistilBERT base cased distilled SQuAD

This model is a fine-tuned checkpoint of DistilBERT-base-cased, trained using (a second step of) knowledge distillation on SQuAD v1.1. It reaches an F1 score of 87.1 on the dev set (for comparison, the BERT bert-base-cased version reaches an F1 score of 88.7).
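Since this checkpoint is published as a standard Hub repository, it can be loaded with the `transformers` question-answering pipeline. The snippet below is a minimal usage sketch (it assumes `transformers` and a backend such as PyTorch are installed, and that the model weights can be downloaded from the Hub); the example question and context are illustrative only:

```python
from transformers import pipeline

# Load the extractive question-answering pipeline with this checkpoint.
qa = pipeline(
    "question-answering",
    model="distilbert-base-cased-distilled-squad",
)

# Hypothetical example: the model extracts the answer span from the context.
context = (
    "DistilBERT is a small, fast, cheap and light Transformer model "
    "trained by distilling BERT base."
)
result = qa(question="What was DistilBERT distilled from?", context=context)

# result is a dict with keys "score", "start", "end", and "answer".
print(result["answer"], result["score"])
```

The pipeline returns the extracted answer text together with a confidence score and the character offsets of the span within the context.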