average | ar | bg | de | el | en | es | fr | hi | ru | sw | th | tr | ur | vi | zh
---------|---------|---------|---------|---------|---------|---------|---------|---------|---------|---------|---------|---------|---------|---------|---------
0.808 | 0.802 | 0.829 | 0.825 | 0.826 | 0.883 | 0.845 | 0.834 | 0.771 | 0.813 | 0.748 | 0.793 | 0.807 | 0.740 | 0.795 | 0.8116
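As a quick sanity check, the reported average can be recomputed from the per-language accuracies (a minimal Python sketch; the values are copied from the table above):

```python
# Recompute the reported XNLI average from the per-language accuracies
# in the table above (pure Python, no dependencies).
scores = {
    "ar": 0.802, "bg": 0.829, "de": 0.825, "el": 0.826, "en": 0.883,
    "es": 0.845, "fr": 0.834, "hi": 0.771, "ru": 0.813, "sw": 0.748,
    "th": 0.793, "tr": 0.807, "ur": 0.740, "vi": 0.795, "zh": 0.8116,
}
average = sum(scores.values()) / len(scores)
print(round(average, 3))  # 0.808, matching the "average" column
```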
## Limitations and bias
Please consult the original DeBERTa-V3 paper and literature on different NLI datasets for potential biases.
## BibTeX entry and citation info
If you use this model, please cite: Laurer, Moritz, Wouter van Atteveldt, Andreu Salleras Casas, and Kasper Welbers. 2022. ‘Less Annotating, More Classifying – Addressing the Data Scarcity Issue of Supervised Machine Learning with Deep Transfer Learning and BERT-NLI’. Preprint, June. Open Science Framework. https://osf.io/74b8k.
## Ideas for cooperation or questions?
If you have questions or ideas for cooperation, contact me at m{dot}laurer{at}vu{dot}nl or on [LinkedIn](https://www.linkedin.com/in/moritz-laurer/).
## Debugging and issues
Note that DeBERTa-v3 was released in late 2021, and older versions of Hugging Face Transformers seem to have issues running the model (e.g. tokenizer errors). Using Transformers==4.13 or higher might solve some issues. Note that mDeBERTa currently does not support FP16; see https://github.com/microsoft/DeBERTa/issues/77
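To illustrate the version caveat above, here is a minimal, hypothetical helper that checks whether an installed Transformers version (e.g. `transformers.__version__`) meets the 4.13 minimum. The function name and parsing logic are illustrative sketches, not part of any library:

```python
# Hypothetical helper (illustrative only): compare a version string such as
# transformers.__version__ against the >=4.13 minimum mentioned above.
def meets_minimum(version: str, minimum: str = "4.13") -> bool:
    # Compare only the numeric major.minor components of each version string.
    parse = lambda v: tuple(int(p) for p in v.split(".")[:2])
    return parse(version) >= parse(minimum)

print(meets_minimum("4.12.5"))  # False: releases below 4.13 had tokenizer issues
print(meets_minimum("4.13.0"))  # True
```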