From 9d7568e4b20ed5db15ee30e99c7219bde9990762 Mon Sep 17 00:00:00 2001
From: Sylvain Gugger
Date: Mon, 7 Nov 2022 20:17:26 +0000
Subject: [PATCH] Update README.md (#2)

- Update README.md (c66c58b6d926dd0d37987e69cb6083e4e40afb89)

Co-authored-by: Bartek Szmelczynski
---
 README.md | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/README.md b/README.md
index c20a7a0..76251ef 100644
--- a/README.md
+++ b/README.md
@@ -11,7 +11,7 @@ datasets:
 
 This model is a distilled version of the [BERT base model](https://huggingface.co/bert-base-cased). It was
 introduced in [this paper](https://arxiv.org/abs/1910.01108). The code for the distillation process can be found
-[here](https://github.com/huggingface/transformers/tree/master/examples/distillation).
+[here](https://github.com/huggingface/transformers/tree/main/examples/research_projects/distillation).
 This model is cased: it does make a difference between english and English.
 
 All the training details on the pre-training, the uses, limitations and potential biases (included below) are the same as for [DistilBERT-base-uncased](https://huggingface.co/distilbert-base-uncased).
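
The card text preserved by this patch stresses that the model is cased, i.e. it distinguishes "english" from "English". A minimal sketch of what that means in practice, assuming the card belongs to the `distilbert-base-cased` checkpoint (the patch itself does not name the repository):

```python
from transformers import AutoTokenizer

# Assumption: the checkpoint id is distilbert-base-cased, inferred from the
# card's links; the patch does not state the repository name explicitly.
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-cased")

# A cased vocabulary preserves capitalization, so the two spellings
# produce different token sequences (exact pieces depend on the vocab).
print(tokenizer.tokenize("english"))
print(tokenizer.tokenize("English"))
```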