---
language: es
tags:
- GPT-2
- Spanish
- ebooks
- nlg
datasets:
- ebooks
widget:
- text: "Quisiera saber que va a suceder"
license: mit
---
# GPT2-Spanish
GPT2-Spanish is a language generation model trained from scratch on 11.5 GB of Spanish text, using a Byte Pair Encoding (BPE) tokenizer trained for this purpose. The model uses the same parameters as the medium version of the original OpenAI GPT-2 model.
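A BPE tokenizer like the one mentioned above is trained by repeatedly merging the most frequent adjacent pair of symbols in the corpus. A minimal sketch of that merge loop on a toy Spanish word-frequency table (illustrative only; not this model's actual training code or vocabulary):

```python
from collections import Counter

def get_pair_counts(words):
    # words: dict mapping a tuple of symbols to its corpus frequency
    pairs = Counter()
    for symbols, freq in words.items():
        for a, b in zip(symbols, symbols[1:]):
            pairs[(a, b)] += freq
    return pairs

def merge_pair(words, pair):
    # Replace every occurrence of the chosen adjacent pair with one merged symbol
    a, b = pair
    merged = {}
    for symbols, freq in words.items():
        out, i = [], 0
        while i < len(symbols):
            if i < len(symbols) - 1 and symbols[i] == a and symbols[i + 1] == b:
                out.append(a + b)
                i += 2
            else:
                out.append(symbols[i])
                i += 1
        merged[tuple(out)] = merged.get(tuple(out), 0) + freq
    return merged

# Toy corpus: each word split into characters, mapped to its frequency
words = {tuple("hola"): 5, tuple("hoja"): 3, tuple("sol"): 2}
for _ in range(3):
    best = max(get_pair_counts(words), key=get_pair_counts(words).get)
    words = merge_pair(words, best)
# After three merges, the frequent word "hola" has become a single symbol
```

In a real tokenizer the loop runs for tens of thousands of merges over the full corpus, and the learned merge order is saved so new text can be segmented the same way.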