---
license: mit
widget:
- text: "I like you. </s></s> I love you."
---

## roberta-large-mnli
Trained by Facebook, [original source](https://github.com/pytorch/fairseq/tree/master/examples/roberta)
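
The widget example above pairs a premise and a hypothesis with RoBERTa's `</s></s>` separator tokens. Below is a minimal sketch of running the same query locally; the use of the `transformers` text-classification pipeline and the `roberta-large-mnli` model id are assumptions for illustration, not part of the original card.

```python
# Minimal sketch: NLI prediction with the Hugging Face transformers pipeline.
# Assumes `transformers` is installed and "roberta-large-mnli" is available on the Hub.
from transformers import pipeline

classifier = pipeline("text-classification", model="roberta-large-mnli")

# Premise and hypothesis joined with RoBERTa's separator tokens,
# exactly as in the widget example above.
result = classifier("I like you. </s></s> I love you.")
print(result)
# e.g. [{'label': 'NEUTRAL', 'score': ...}]  (entailment / neutral / contradiction)
```
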
```bibtex
@article{liu2019roberta,
    title   = {RoBERTa: A Robustly Optimized BERT Pretraining Approach},
    author  = {Yinhan Liu and Myle Ott and Naman Goyal and Jingfei Du and
               Mandar Joshi and Danqi Chen and Omer Levy and Mike Lewis and
               Luke Zettlemoyer and Veselin Stoyanov},
    journal = {arXiv preprint arXiv:1907.11692},
    year    = {2019},
}
```