encoder-encoder -> encoder-decoder (#1)

- encoder-encoder -> encoder-decoder (efb1f2a83adf338d09a84cacb045aeac50454c24)


Co-authored-by: scott schmidt <scottschmidt@users.noreply.huggingface.co>
This commit is contained in:
Niels Rogge 2023-03-10 12:32:02 +00:00 committed by system
parent 85ba1baa39
commit b60f7b51d9
1 changed file with 1 addition and 1 deletion


@@ -14,7 +14,7 @@ TAPEX was proposed in [TAPEX: Table Pre-training via Learning a Neural SQL Execu
 TAPEX (**Ta**ble **P**re-training via **Ex**ecution) is a conceptually simple and empirically powerful pre-training approach to empower existing models with *table reasoning* skills. TAPEX realizes table pre-training by learning a neural SQL executor over a synthetic corpus, which is obtained by automatically synthesizing executable SQL queries.
-TAPEX is based on the BART architecture, the transformer encoder-encoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder.
+TAPEX is based on the BART architecture, the transformer encoder-decoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder.
 ## Intended Uses
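The corrected sentence distinguishes a bidirectional encoder, which sees the whole input at once, from an autoregressive decoder, which emits one token at a time conditioned on what it has already produced. A minimal, library-free Python sketch of that seq2seq data flow (all names hypothetical; the "model" is a stub that echoes its input, not a trained network):

```python
def encode(tokens):
    # Bidirectional encoder stub: every position's state is a function of
    # the ENTIRE input sequence, mimicking BERT-like full attention.
    return [(tok, tuple(tokens)) for tok in tokens]

def decode_step(encoder_states, generated):
    # Autoregressive decoder stub: the next token depends only on the
    # encoder states and the tokens generated so far (left-to-right).
    # This toy simply copies the input to stay self-contained.
    if len(generated) < len(encoder_states):
        return encoder_states[len(generated)][0]
    return "<eos>"

def seq2seq(tokens, max_len=10):
    # Greedy generation loop: encode once, then decode step by step.
    states = encode(tokens)
    out = []
    for _ in range(max_len):
        nxt = decode_step(states, out)
        if nxt == "<eos>":
            break
        out.append(nxt)
    return out

print(seq2seq(["select", "city", "where", "year", "=", "2012"]))
# → ['select', 'city', 'where', 'year', '=', '2012']
```

The echo behavior is only a placeholder; in the real model, both functions are learned transformer stacks and the decoder attends to the encoder states via cross-attention.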