diff --git a/README.md b/README.md
index b34a0dc..eea5cea 100644
--- a/README.md
+++ b/README.md
@@ -19,6 +19,23 @@ The other (non-default) version which can be used is:
 
 Disclaimer: The team releasing TAPAS did not write a model card for this model so this model card has been written by the Hugging Face team and contributors.
 
+## Results on SQA - Dev Accuracy
+
+Size | Reset | Dev Accuracy | Link
+-------- | -------- | -------- | ----
+LARGE | noreset | 0.7223 | [tapas_sqa_inter_masklm_large.zip](https://huggingface.co/google/tapas-large-finetuned-sqa/tree/no_reset)
+LARGE | reset | 0.7289 | [tapas_sqa_inter_masklm_large_reset.zip](https://huggingface.co/google/tapas-large-finetuned-sqa/tree/main)
+BASE | noreset | 0.6737 | [tapas_sqa_inter_masklm_base.zip](https://huggingface.co/google/tapas-base-finetuned-sqa/tree/no_reset)
+BASE | reset | 0.6874 | [tapas_sqa_inter_masklm_base_reset.zip](https://huggingface.co/google/tapas-base-finetuned-sqa/tree/main)
+MEDIUM | noreset | 0.6464 | [tapas_sqa_inter_masklm_medium.zip](https://huggingface.co/google/tapas-medium-finetuned-sqa/tree/no_reset)
+MEDIUM | reset | 0.6561 | [tapas_sqa_inter_masklm_medium_reset.zip](https://huggingface.co/google/tapas-medium-finetuned-sqa/tree/main)
+SMALL | noreset | 0.5876 | [tapas_sqa_inter_masklm_small.zip](https://huggingface.co/google/tapas-small-finetuned-sqa/tree/no_reset)
+SMALL | reset | 0.6155 | [tapas_sqa_inter_masklm_small_reset.zip](https://huggingface.co/google/tapas-small-finetuned-sqa/tree/main)
+MINI | noreset | 0.4574 | [tapas_sqa_inter_masklm_mini.zip](https://huggingface.co/google/tapas-mini-finetuned-sqa/tree/no_reset)
+MINI | reset | 0.5148 | [tapas_sqa_inter_masklm_mini_reset.zip](https://huggingface.co/google/tapas-mini-finetuned-sqa/tree/main)
+TINY | noreset | 0.2004 | [tapas_sqa_inter_masklm_tiny.zip](https://huggingface.co/google/tapas-tiny-finetuned-sqa/tree/no_reset)
+TINY | reset | 0.2375 | [tapas_sqa_inter_masklm_tiny_reset.zip](https://huggingface.co/google/tapas-tiny-finetuned-sqa/tree/main)
+
 ## Model description
 
 TAPAS is a BERT-like transformers model pretrained on a large corpus of English data from Wikipedia in a self-supervised fashion.
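
For context, the checkpoints linked in the results table above are regular Hugging Face model repos, so they can be queried through the `transformers` table-question-answering pipeline. The sketch below is illustrative only (it is not part of the diff): it assumes `transformers` with PyTorch and `pandas` are installed (older `transformers` versions may additionally require `torch-scatter` for TAPAS), and it uses the `reset` BASE checkpoint whose URL appears in the table.

```python
# Minimal sketch: asking a question over a small table with one of the
# SQA-finetuned TAPAS checkpoints listed in the README table above.
import pandas as pd
from transformers import pipeline

# "reset" BASE model = main branch of google/tapas-base-finetuned-sqa (see table).
tqa = pipeline("table-question-answering", model="google/tapas-base-finetuned-sqa")

# TAPAS expects every table cell to be a string.
table = pd.DataFrame(
    {
        "Size": ["LARGE", "BASE", "MEDIUM"],
        "Dev Accuracy": ["0.7289", "0.6874", "0.6561"],
    }
)

result = tqa(table=table, query="What is the dev accuracy of the BASE model?")
print(result["answer"])  # e.g. "0.6874", selected from the table cells
```

Swapping the model id for any other repo linked in the table (e.g. `google/tapas-mini-finetuned-sqa`) yields the corresponding smaller or larger variant; the `no_reset` checkpoints live on the `no_reset` branch of each repo.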