---
language: en
tags:
- tapex
- table-question-answering
license: mit
---

# TAPEX (large-sized model)

TAPEX was proposed in the paper TAPEX: Table Pre-training via Learning a Neural SQL Executor by Qian Liu, Bei Chen, Jiaqi Guo, Morteza Ziyadi, Zeqi Lin, Weizhu Chen, and Jian-Guang Lou. The original repo can be found here.

## Model description

TAPEX (Table Pre-training via Execution) is a conceptually simple and empirically powerful pre-training approach to empower existing models with table reasoning skills. TAPEX realizes table pre-training by learning a neural SQL executor over a synthetic corpus, which is obtained by automatically synthesizing executable SQL queries.
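
As an illustration of the pre-training signal, the sketch below uses a toy table and a hand-written SQL query (the real corpus is synthesized automatically, as described in the paper) to show how `TapexTokenizer` flattens a table and concatenates it with a query; during pre-training, the decoder learns to emit the query's execution result.

```python
# Illustrative sketch only: shows the linearized input format, not the
# actual synthetic pre-training corpus.
import pandas as pd
from transformers import TapexTokenizer

tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-large")

table = pd.DataFrame.from_dict({"year": [1896, 2008], "city": ["athens", "beijing"]})
sql = "select city where year = 2008"

encoding = tokenizer(table=table, query=sql)
print(tokenizer.decode(encoding["input_ids"]))
# roughly: "<s> select city where year = 2008 col : year | city
#           row 1 : 1896 | athens row 2 : 2008 | beijing</s>"
# The pre-training target for this instance would be the execution
# result of the query, i.e. "beijing".
```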

TAPEX is based on the BART architecture: a transformer encoder-decoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder.
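
Since the checkpoint is architecturally plain BART, it loads with the standard seq2seq classes in Transformers; a minimal loading sketch (the layer-count print is just a sanity check):

```python
# TAPEX reuses the BART architecture, so the standard BART seq2seq class
# applies; TapexTokenizer adds the table-flattening logic on top.
from transformers import BartForConditionalGeneration, TapexTokenizer

tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-large")
model = BartForConditionalGeneration.from_pretrained("microsoft/tapex-large")

# BART-large: 12 encoder layers and 12 decoder layers
print(model.config.encoder_layers, model.config.decoder_layers)
```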

## Intended Uses

⚠️ This model checkpoint is intended ONLY for fine-tuning on downstream tasks; you CANNOT use it to simulate neural SQL execution, i.e., to employ TAPEX to execute a SQL query on a given table. The checkpoint that can neurally execute SQL queries is available here.

Two separate checkpoints are provided for these two use cases because of a known issue with BART-large; we recommend reading this comment for more details.

## How to Fine-tune

Please find the fine-tuning script here.
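
For orientation, here is a minimal, hypothetical single-step sketch of the fine-tuning setup, assuming a toy table-QA pair and the `answer=` keyword of `TapexTokenizer` for encoding the target; the linked script is what handles real datasets, batching, and evaluation:

```python
# Hypothetical minimal fine-tuning sketch, not the official script:
# one toy table-QA pair and a single optimization step.
import pandas as pd
import torch
from transformers import TapexTokenizer, BartForConditionalGeneration

tokenizer = TapexTokenizer.from_pretrained("microsoft/tapex-large")
model = BartForConditionalGeneration.from_pretrained("microsoft/tapex-large")

table = pd.DataFrame.from_dict({"year": [1896, 2008], "city": ["athens", "beijing"]})
question = "in which year did beijing host the olympic games?"
answer = "2008"

# Encode the flattened table + question as the source sequence ...
encoding = tokenizer(table=table, query=question, return_tensors="pt")
# ... and the gold answer as the target sequence (labels).
labels = tokenizer(answer=answer, return_tensors="pt")["input_ids"]

optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)
loss = model(**encoding, labels=labels).loss  # seq2seq cross-entropy over answer tokens
loss.backward()
optimizer.step()
```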

## BibTeX entry and citation info

```bibtex
@inproceedings{
    liu2022tapex,
    title={{TAPEX}: Table Pre-training via Learning a Neural {SQL} Executor},
    author={Qian Liu and Bei Chen and Jiaqi Guo and Morteza Ziyadi and Zeqi Lin and Weizhu Chen and Jian-Guang Lou},
    booktitle={International Conference on Learning Representations},
    year={2022},
    url={https://openreview.net/forum?id=O50443AsCP}
}
```