Update README.md
This commit is contained in:
parent 40fb0549d9
commit 058e8e2c95
@@ -15,7 +15,7 @@ datasets:
## Model Description
-GPT-Neo 2.7B is a transformer model designed using EleutherAI's replication of the GPT-3 architecture. GPT-Neo refers to the class of models, while 2.7B represents the number of parameters of this particular pre-trained model. It is the same size as OpenAI's "Ada" model.
+GPT-Neo 2.7B is a transformer model designed using EleutherAI's replication of the GPT-3 architecture. GPT-Neo refers to the class of models, while 2.7B represents the number of parameters of this particular pre-trained model. This model is the same size as OpenAI's "Ada" model.
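The model described above can be tried out with the Hugging Face `transformers` library. This is a minimal sketch, assuming `transformers` is installed and that the checkpoint is published on the Hub under the id `EleutherAI/gpt-neo-2.7B`; the `generate_text` helper name is our own, not part of any API.

```python
# Hypothetical usage sketch: text generation with GPT-Neo 2.7B.
# Assumes the `transformers` library and network access to the Hub;
# the checkpoint id "EleutherAI/gpt-neo-2.7B" is an assumption here.
from transformers import pipeline


def generate_text(prompt: str, max_length: int = 50) -> str:
    """Return a sampled continuation of `prompt` using GPT-Neo 2.7B."""
    # Building the pipeline downloads the ~10 GB checkpoint on first use.
    generator = pipeline("text-generation", model="EleutherAI/gpt-neo-2.7B")
    result = generator(prompt, max_length=max_length, do_sample=True)
    # `pipeline` returns a list of dicts with a "generated_text" key.
    return result[0]["generated_text"]
```

Note that loading a 2.7B-parameter model requires substantial RAM (or GPU memory); smaller GPT-Neo variants follow the same calling pattern.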
## Training data