Trouter-Library committed
Commit ec5f667 · verified · 1 Parent(s): 0d6b48c

Update README.md

Files changed (1):
  1. README.md +15 -0
README.md CHANGED
@@ -33,6 +33,21 @@ Helion-V1-Embeddings is a lightweight text embedding model designed for semantic
   - **Embedding Dimension:** 384
   - **Max Sequence Length:** 256 tokens
 
+ ## Model Parameters
+
+ | Parameter | Value | Description |
+ |-----------|-------|-------------|
+ | **Architecture** | BERT-based | 6-layer transformer encoder |
+ | **Hidden Size** | 384 | Dimension of hidden layers |
+ | **Attention Heads** | 12 | Number of attention heads |
+ | **Intermediate Size** | 1536 | Feed-forward layer size |
+ | **Vocab Size** | 30,522 | WordPiece vocabulary |
+ | **Max Position Embeddings** | 512 | Maximum sequence length |
+ | **Pooling Strategy** | Mean Pooling | Average of token embeddings |
+ | **Output Dimension** | 384 | Final embedding size |
+ | **Total Parameters** | ~22.7M | Trainable parameters |
+ | **Model Size** | ~80MB | Disk footprint |
+
   ## Intended Use
 
   Helion-V1-Embeddings is designed for:
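The mean-pooling strategy added in the table above (averaging token embeddings into one 384-dimensional vector) can be sketched as follows. This is a minimal illustrative sketch, not the model's published code: the `mean_pool` helper and the random input tensors are assumptions, with shapes taken from the table (hidden size 384, max length 256).

```python
import torch

def mean_pool(token_embeddings: torch.Tensor, attention_mask: torch.Tensor) -> torch.Tensor:
    """Average token embeddings over the sequence, ignoring padding positions."""
    # Expand the mask to the embedding dimension: (batch, seq_len) -> (batch, seq_len, 1)
    mask = attention_mask.unsqueeze(-1).float()
    # Sum only the real (unmasked) tokens, then divide by how many there are
    summed = (token_embeddings * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1e-9)
    return summed / counts

# Shapes matching the table: batch of 2, 256 tokens, hidden size 384
emb = torch.randn(2, 256, 384)
mask = torch.ones(2, 256, dtype=torch.long)

pooled = mean_pool(emb, mask)
print(pooled.shape)  # torch.Size([2, 384]) — one 384-dim embedding per input
```

With an all-ones mask this reduces to a plain `mean` over the sequence dimension; the mask handling only matters for padded batches.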