diff --git a/README.md b/README.md
index 59c8e30..fa5247f 100644
--- a/README.md
+++ b/README.md
@@ -15,7 +15,7 @@ datasets:
 
 ## Model Description
 
-GPT-Neo 2.7B is a transformer model designed using EleutherAI's replication of the GPT-3 architecture. GPT-Neo refers to the class of models, while 2.7B represents the number of parameters of this particular pre-trained model. This model is the same size as OpenAI's "Ada" model.
+GPT-Neo 2.7B is a transformer model designed using EleutherAI's replication of the GPT-3 architecture. GPT-Neo refers to the class of models, while 2.7B represents the number of parameters of this particular pre-trained model.
 
 ## Training data
 
@@ -60,7 +60,7 @@ EleutherAI is currently in the process of carrying out further evaluations of GP
 | GPT-3 1.3B | ------ | ----- | ----- |
 | GPT-2 1.5B | 1.0468 | ----- | 17.48 |
 | **GPT-Neo 2.7B** | **0.7165** | **5.646** | **11.39** |
-| GPT-3 Ada 2.7B | 0.9631 | ----- | ----- |
+| GPT-3 2.7B | 0.9631 | ----- | ----- |
 | GPT-3 175B | 0.7177 | ----- | ----- |
 
 All GPT-2 and GPT-3 scores are from their respective papers, except for the Pile test results which are from the Pile paper.
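
The model card edited above describes GPT-Neo 2.7B; for context, a minimal sketch of generating text with this model through the Hugging Face `transformers` pipeline API (the prompt and sampling settings below are illustrative choices, not taken from the card):

```python
# Minimal usage sketch for EleutherAI/gpt-neo-2.7B via the
# transformers text-generation pipeline. The prompt and the
# generation parameters are illustrative, not from the model card.
from transformers import pipeline

generator = pipeline("text-generation", model="EleutherAI/gpt-neo-2.7B")

prompt = "GPT-Neo 2.7B is a transformer model"
outputs = generator(prompt, do_sample=True, max_length=50)
print(outputs[0]["generated_text"])
```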