Add Transformers.js library tag and sample code
README.md CHANGED
```diff
@@ -11,51 +11,68 @@ language:
 library_name: transformers
 tags:
 - text2text-generation
+- transformers.js
 widget:
 - text: >-
-    Teapot is an open-source small language model (~800 million parameters)
-
-
-
-
-
+    Teapot is an open-source small language model (~800 million parameters)
+    fine-tuned on synthetic data and optimized to run locally on
+    resource-constrained devices such as smartphones and CPUs. Teapot is trained
+    to only answer using context from documents, reducing hallucinations. Teapot
+    can perform a variety of tasks, including hallucination-resistant Question
+    Answering (QnA), Retrieval-Augmented Generation (RAG), and JSON extraction.
+    TeapotLLM is a fine-tune of flan-t5-large that was trained on synthetic data
+    generated by Deepseek v3. TeapotLLM can be hosted on low-power devices with
+    as little as 2GB of CPU RAM such as a Raspberry Pi. Teapot is a model built
+    by and for the community.


     What devices can teapot run on?
   example_title: Question Answering
 - text: >-
-    Teapot is an open-source small language model (~800 million parameters)
-
-
-
-
-
+    Teapot is an open-source small language model (~800 million parameters)
+    fine-tuned on synthetic data and optimized to run locally on
+    resource-constrained devices such as smartphones and CPUs. Teapot is trained
+    to only answer using context from documents, reducing hallucinations. Teapot
+    can perform a variety of tasks, including hallucination-resistant Question
+    Answering (QnA), Retrieval-Augmented Generation (RAG), and JSON extraction.
+    TeapotLLM is a fine-tune of flan-t5-large that was trained on synthetic data
+    generated by Deepseek v3. TeapotLLM can be hosted on low-power devices with
+    as little as 2GB of CPU RAM such as a Raspberry Pi. Teapot is a model built
+    by and for the community.


     Tell me about teapotllm
   example_title: Summarization
 - text: >-
-    Teapot is an open-source small language model (~800 million parameters)
-
-
-
-
-
+    Teapot is an open-source small language model (~800 million parameters)
+    fine-tuned on synthetic data and optimized to run locally on
+    resource-constrained devices such as smartphones and CPUs. Teapot is trained
+    to only answer using context from documents, reducing hallucinations. Teapot
+    can perform a variety of tasks, including hallucination-resistant Question
+    Answering (QnA), Retrieval-Augmented Generation (RAG), and JSON extraction.
+    TeapotLLM is a fine-tune of flan-t5-large that was trained on synthetic data
+    generated by Deepseek v3. TeapotLLM can be hosted on low-power devices with
+    as little as 2GB of CPU RAM such as a Raspberry Pi. Teapot is a model built
+    by and for the community.


     Extract the number of parameters
   example_title: Information Extraction
 - text: >-
-    Teapot is an open-source small language model (~800 million parameters)
-
-
-
-
-
+    Teapot is an open-source small language model (~800 million parameters)
+    fine-tuned on synthetic data and optimized to run locally on
+    resource-constrained devices such as smartphones and CPUs. Teapot is trained
+    to only answer using context from documents, reducing hallucinations. Teapot
+    can perform a variety of tasks, including hallucination-resistant Question
+    Answering (QnA), Retrieval-Augmented Generation (RAG), and JSON extraction.
+    TeapotLLM is a fine-tune of flan-t5-large that was trained on synthetic data
+    generated by Deepseek v3. TeapotLLM can be hosted on low-power devices with
+    as little as 2GB of CPU RAM such as a Raspberry Pi. Teapot is a model built
+    by and for the community.


     How many parameters is Deepseek?
-  example_title: Hallucination Resistance
+  example_title: Hallucination Resistance
 base_model:
 - google/flan-t5-large
 pipeline_tag: text2text-generation
```
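For context on what this first hunk changes: the new `transformers.js` tag is what surfaces the model under the Hub's Transformers.js library filter, and the hosted inference widget passes each widget `text` block to the model as-is. Below is a minimal sketch, not part of this commit, of running the Hallucination Resistance example locally (assuming Node.js with `@huggingface/transformers` installed; the exact refusal wording may vary from run to run):

```js
// Sketch (not part of this commit): reproducing the "Hallucination Resistance"
// widget example locally in Node.js, after `npm i @huggingface/transformers`.
import { pipeline } from "@huggingface/transformers";

const teapot_ai = await pipeline("text2text-generation", "teapotai/teapotllm");

// The widget sends the whole `text` block as one input. The question asks for
// Deepseek's parameter count, which the context never states, so a
// hallucination-resistant model should decline rather than invent a number.
const context =
  "Teapot is an open-source small language model (~800 million parameters) " +
  "fine-tuned on synthetic data and optimized to run locally on " +
  "resource-constrained devices such as smartphones and CPUs.";
const question = "How many parameters is Deepseek?";

const answer = await teapot_ai(context + "\n" + question);
console.log(answer[0].generated_text);
```

The second hunk below adds the usage documentation itself.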
````diff
@@ -264,6 +281,25 @@ answer = teapot_ai(context+"\n"+question)
 print(answer[0].get('generated_text')) # => The Eiffel Tower stands at a height of 330 meters.
 ```

+### Transformers.js Support
+
+You can even run the model in-browser (or any other JavaScript environment) with [Transformers.js](https://huggingface.co/docs/transformers.js) as follows:
+
+```js
+// npm i @huggingface/transformers
+import { pipeline } from "@huggingface/transformers";
+
+const teapot_ai = await pipeline("text2text-generation", "teapotai/teapotllm");
+
+const context = `
+The Eiffel Tower is a wrought iron lattice tower in Paris, France. It was designed by Gustave Eiffel and completed in 1889.
+It stands at a height of 330 meters and is one of the most recognizable structures in the world.
+`;
+const question = "What is the height of the Eiffel Tower?";
+const answer = await teapot_ai(context + "\n" + question);
+console.log(answer[0].generated_text); // => " The Eiffel Tower stands at a height of 330 meters."
+```
+
 ---

````
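The added snippet uses the library defaults. Transformers.js v3 also takes loading options on the `pipeline` factory, which can matter on the 2GB-RAM devices the card targets. A minimal sketch of those options, assuming standard Transformers.js v3 behavior rather than anything this commit configures:

```js
// Sketch (library options, not settings from this commit): quantized weights
// and WebGPU offload for resource-constrained targets.
import { pipeline } from "@huggingface/transformers";

const teapot_ai = await pipeline("text2text-generation", "teapotai/teapotllm", {
  dtype: "q8",      // request 8-bit quantized weights (smaller download, less RAM)
  device: "webgpu", // run on WebGPU in browsers that support it
});

// Generation options are passed per call rather than at load time.
const answer = await teapot_ai("Tell me about teapotllm", { max_new_tokens: 128 });
console.log(answer[0].generated_text);
```

`q8` cuts the weight download to roughly a quarter of fp32, which is usually the right trade-off for the in-browser use case the new section describes.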