From 03c7da25f9d6ffd4bfb286065b8095fd1c9d892e Mon Sep 17 00:00:00 2001
From: Qian Liu <qian.liu@buaa.edu.cn>
Date: Thu, 10 Mar 2022 08:26:35 +0000
Subject: [PATCH] Update README.md

---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index 5a760d0..12beb1f 100644
--- a/README.md
+++ b/README.md
@@ -15,7 +15,7 @@ TAPEX (**Ta**ble **P**re-training via **Ex**ecution) is a conceptually simple an
 
 TAPEX is based on the BART architecture, a transformer encoder-decoder (seq2seq) model with a bidirectional (BERT-like) encoder and an autoregressive (GPT-like) decoder.
 
-This model is the `tapex-base` model fine-tuned on the [WikiSQL](https://huggingface.co/datasets/wikisql) dataset.
+This model is the `tapex-base` model fine-tuned on the [WikiTableQuestions](https://huggingface.co/datasets/wikitablequestions) dataset.
 
 ## Intended Uses
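
For illustration, a minimal usage sketch of querying the fine-tuned model with `transformers` follows. The Hub checkpoint name `microsoft/tapex-base-finetuned-wtq` and the example table are assumptions for this sketch, not taken from the patch itself.

```python
from transformers import TapexTokenizer, BartForConditionalGeneration
import pandas as pd

# Assumed Hub checkpoint name for tapex-base fine-tuned on WikiTableQuestions
model_name = "microsoft/tapex-base-finetuned-wtq"
tokenizer = TapexTokenizer.from_pretrained(model_name)
model = BartForConditionalGeneration.from_pretrained(model_name)

# A small example table; cells are cast to strings before the
# tokenizer flattens the table into the encoder input
data = {
    "year": [1896, 1900, 2008, 2012],
    "city": ["athens", "paris", "beijing", "london"],
}
table = pd.DataFrame.from_dict(data).astype(str)

query = "in which year did beijing host the olympic games?"
encoding = tokenizer(table=table, query=query, return_tensors="pt")

# The GPT-like decoder generates the answer autoregressively
outputs = model.generate(**encoding)
print(tokenizer.batch_decode(outputs, skip_special_tokens=True))
```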