---
language: en
pipeline_tag: zero-shot-classification
tags:
- distilbert
datasets:
- multi_nli
metrics:
- accuracy
---
# DistilBERT base model (uncased)
This model is the [uncased DistilBERT model](https://huggingface.co/distilbert-base-uncased) fine-tuned on the Multi-Genre Natural Language Inference (MNLI) dataset, which makes it suitable for zero-shot classification.
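Since the card declares `pipeline_tag: zero-shot-classification`, the model can be loaded through the `transformers` zero-shot pipeline. A minimal sketch follows; the repo id shown is an assumption based on this card's path, so substitute the actual Hub id of this model:

```python
from transformers import pipeline

# Repo id assumed from this card's path; replace with the real Hub id.
classifier = pipeline(
    "zero-shot-classification",
    model="distilbert-base-uncased-mnli",
)

result = classifier(
    "The new GPU delivers twice the throughput of its predecessor.",
    candidate_labels=["technology", "sports", "politics"],
)

# The pipeline returns candidate labels ranked by entailment score.
print(result["labels"][0], result["scores"][0])
```

Under the hood, the pipeline frames each candidate label as an NLI hypothesis ("This example is about {label}.") and scores it against the input as the premise, which is why an MNLI fine-tune works for arbitrary label sets.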