huggingface/roberta-large-mnli is a repository forked from huggingface. License: MIT
Latest commit: 130fb28e15 ("upload flax model") by Patrick von Platen, 2021-05-20 19:32:30 +00:00

| File               | Last commit message                        | Date                       |
| ------------------ | ------------------------------------------ | -------------------------- |
| .gitattributes     | allow flax                                 | 2021-05-20 19:31:56 +00:00 |
| README.md          | Migrate model card from transformers-repo  | 2020-12-11 22:25:36 +01:00 |
| config.json        | Update config.json                         | 2020-06-17 14:42:29 +00:00 |
| flax_model.msgpack | upload flax model                          | 2021-05-20 19:32:30 +00:00 |
| merges.txt         | Update merges.txt                          | 2019-08-09 15:05:41 +00:00 |
| pytorch_model.bin  | Update pytorch_model.bin                   | 2019-09-24 12:53:04 +00:00 |
| tf_model.h5        | Update tf_model.h5                         | 2019-09-24 13:39:28 +00:00 |
| tokenizer.json     | Update tokenizer.json                      | 2020-10-12 12:57:01 +00:00 |
| vocab.json         | Update vocab.json                          | 2019-08-26 14:15:16 +00:00 |
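
The listing above shows the same checkpoint in three weight formats: pytorch_model.bin, tf_model.h5, and flax_model.msgpack (added in the latest commit). As a minimal sketch, each can be loaded with the corresponding transformers auto class; this reflects standard transformers usage and is not documented in the repo itself:

```python
# Sketch: loading each weight format listed above via transformers auto classes.
# Assumes standard transformers usage; not documented in this repo itself.
from transformers import (
    AutoTokenizer,
    AutoModelForSequenceClassification,       # loads pytorch_model.bin
    TFAutoModelForSequenceClassification,     # loads tf_model.h5
    FlaxAutoModelForSequenceClassification,   # loads flax_model.msgpack
)

tokenizer = AutoTokenizer.from_pretrained("roberta-large-mnli")
pt_model = AutoModelForSequenceClassification.from_pretrained("roberta-large-mnli")
tf_model = TFAutoModelForSequenceClassification.from_pretrained("roberta-large-mnli")
flax_model = FlaxAutoModelForSequenceClassification.from_pretrained("roberta-large-mnli")
```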

README.md

---
license: mit
widget:
- text: "I like you. </s></s> I love you."
---

# roberta-large-mnli

Trained by Facebook, original source.
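
Since the card carries no usage snippet, here is a minimal sketch of querying the model with the transformers text-classification pipeline; the pipeline call is standard transformers usage rather than something stated in this card. The input mirrors the widget example above, with premise and hypothesis joined by RoBERTa's `</s></s>` separator:

```python
# Minimal usage sketch (standard transformers usage; not part of the original card).
from transformers import pipeline

# The model predicts the MNLI labels: CONTRADICTION, NEUTRAL, ENTAILMENT.
classifier = pipeline("text-classification", model="roberta-large-mnli")

# Premise and hypothesis are joined with RoBERTa's </s></s> separator,
# exactly as in the widget example above.
result = classifier("I like you. </s></s> I love you.")
print(result)  # a list of dicts like [{'label': ..., 'score': ...}]
```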

@article{liu2019roberta,
    title   = {RoBERTa: A Robustly Optimized BERT Pretraining Approach},
    author  = {Yinhan Liu and Myle Ott and Naman Goyal and Jingfei Du and
               Mandar Joshi and Danqi Chen and Omer Levy and Mike Lewis and
               Luke Zettlemoyer and Veselin Stoyanov},
    journal = {arXiv preprint arXiv:1907.11692},
    year    = {2019},
}