typeform/distilbert-base-uncased-mnli is a repository forked from Hugging Face. License: None

README.md

---
language: en
pipeline_tag: zero-shot-classification
tags:
- distilbert
datasets:
- multi_nli
metrics:
- accuracy
---

# DistilBERT base model (uncased)

This model is a version of the uncased DistilBERT base model fine-tuned on the Multi-Genre Natural Language Inference (MNLI) dataset.
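Since the model is tagged for zero-shot classification, it can be loaded with the standard `transformers` pipeline. A minimal sketch (the input sentence and candidate labels below are illustrative, not from the model card):

```python
from transformers import pipeline

# Load the fine-tuned MNLI model as a zero-shot classifier.
classifier = pipeline(
    "zero-shot-classification",
    model="typeform/distilbert-base-uncased-mnli",
)

# Classify an arbitrary sentence against labels the model never saw in training.
result = classifier(
    "I love playing the guitar on weekends.",
    candidate_labels=["music", "sports", "cooking"],
)

# `result` contains the labels sorted by score, highest first.
print(result["labels"], result["scores"])
```

Under the hood, the pipeline frames each candidate label as an NLI hypothesis ("This example is about music.") and uses the entailment score as the label score, which is why an MNLI fine-tune works as a zero-shot classifier.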