From 574dbac1d76aa546d792c43ab140315c8b2385e7 Mon Sep 17 00:00:00 2001
From: Moritz Laurer
Date: Sun, 25 Sep 2022 18:34:23 +0000
Subject: [PATCH] update readme with easier zeroshot code

---
 README.md | 14 ++++++++++++--
 1 file changed, 12 insertions(+), 2 deletions(-)

diff --git a/README.md b/README.md
index a1051b8..c5107ad 100644
--- a/README.md
+++ b/README.md
@@ -38,8 +38,18 @@ This multilingual model can perform natural language inference (NLI) on 100 lang
 As of December 2021, mDeBERTa-base is the best performing multilingual base-sized transformer model, introduced by Microsoft in [this paper](https://arxiv.org/pdf/2111.09543.pdf).

-## Intended uses & limitations
-#### How to use the model
+### How to use the model
+#### Simple zero-shot classification pipeline
+```python
+from transformers import pipeline
+classifier = pipeline("zero-shot-classification", model="MoritzLaurer/mDeBERTa-v3-base-mnli-xnli")
+
+sequence_to_classify = "Angela Merkel ist eine Politikerin in Deutschland und Vorsitzende der CDU"
+candidate_labels = ["politics", "economy", "entertainment", "environment"]
+output = classifier(sequence_to_classify, candidate_labels, multi_label=False)
+print(output)
+```
+#### NLI use-case
 ```python
 from transformers import AutoTokenizer, AutoModelForSequenceClassification
 import torch
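For context on what the new pipeline example computes: with `multi_label=False`, the zero-shot pipeline turns each candidate label into an NLI hypothesis, collects the model's entailment logit per label, and softmaxes those logits into a single probability distribution. A minimal, self-contained sketch of that final scoring step (the logit values below are made up for illustration; the real pipeline obtains them from the model):

```python
import math

def zero_shot_scores(entailment_logits):
    # Softmax over the per-label entailment logits, mirroring what the
    # zero-shot pipeline does when multi_label=False: the labels compete
    # for probability mass and the scores sum to 1.
    exps = [math.exp(x) for x in entailment_logits]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical entailment logits for the four candidate labels from the
# README example: ["politics", "economy", "entertainment", "environment"].
logits = [3.1, -0.5, -1.2, -0.9]
scores = zero_shot_scores(logits)
best = scores.index(max(scores))  # highest-scoring label index
```

This is only the normalization step; the hypothesis construction and model forward pass are handled internally by the pipeline.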