diff --git a/README.md b/README.md
index d19d2e7..493f2c9 100644
--- a/README.md
+++ b/README.md
@@ -42,7 +42,7 @@ As of December 2021, mDeBERTa-base is the best performing multilingual transform
 ```python
 from transformers import AutoTokenizer, AutoModelForSequenceClassification
 import torch
-model_name = "MoritzLaurer/mDeBERTa-v3-base-xnli-mnli"
+model_name = "MoritzLaurer/mDeBERTa-v3-base-mnli-xnli"
 tokenizer = AutoTokenizer.from_pretrained(model_name)
 model = AutoModelForSequenceClassification.from_pretrained(model_name)
 premise = "Angela Merkel ist eine Politikerin in Deutschland und Vorsitzende der CDU"
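
For context, a minimal sketch of how the README snippet in this hunk can be continued into a full NLI prediction with the corrected model name. The hypothesis string is a placeholder, and the label order entailment/neutral/contradiction is assumed for this MNLI/XNLI checkpoint:

```python
from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

model_name = "MoritzLaurer/mDeBERTa-v3-base-mnli-xnli"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)

# Premise from the diff above; hypothesis is an illustrative placeholder
premise = "Angela Merkel ist eine Politikerin in Deutschland und Vorsitzende der CDU"
hypothesis = "Angela Merkel ist Politikerin"

# Encode the premise/hypothesis pair and run the classifier
inputs = tokenizer(premise, hypothesis, truncation=True, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Assumed label order for this checkpoint: entailment, neutral, contradiction
label_names = ["entailment", "neutral", "contradiction"]
probs = torch.softmax(logits[0], dim=-1).tolist()
print({name: round(p, 3) for name, p in zip(label_names, probs)})
```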