---
base_model: google/gemma-2-27b-it
library_name: peft
---

# Model Card for gemma-2-27b-it PEFT Adapter (UNLP 2025 Shared Task)

<!-- Provide a quick summary of what the model is/does. -->

A PEFT adapter for google/gemma-2-27b-it, produced by continued pretraining with an unmasking (MLM-style) objective as part of the UNLP 2025 shared task work on span identification.

## Model Details

### Model Description

<!-- Provide a longer summary of what this model is. -->

This repository contains a PEFT adapter trained on top of google/gemma-2-27b-it with an unmasking (MLM-style) continued-pretraining objective. The training notebook is linked under Model Sources.

- **Developed by:** Anton Bazdyrev, Ivan Bashtovyi, Ivan Havlytskyi, Oleksandr Kharytonov, and Artur Khodakhovskyi at the National Technical University of Ukraine "Igor Sikorsky Kyiv Polytechnic Institute"
- **Finetuned from model:** google/gemma-2-27b-it, with an unmasking objective

### Model Sources

<!-- Provide the basic links for the model. -->

- **Repository:** https://github.com/AntonBazdyrev/unlp2025_shared_task/blob/master/llm_encoder_pretrain/gemma2_27b_pretrain-mlm.ipynb
- **Paper:** TBD
- **Demo:** https://github.com/AntonBazdyrev/unlp2025_shared_task/tree/master/span_ident
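
## How to Get Started with the Model

The snippet below is a minimal, illustrative sketch of loading this PEFT adapter on top of the base model with `transformers` and `peft`; it is not taken from the training repository. The adapter id (`adapter_id`) and the sample input are placeholders, and the dtype/device settings are assumptions you may need to adjust for your hardware.

```python
import torch
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

base_id = "google/gemma-2-27b-it"
adapter_id = "path/to/this-adapter"  # placeholder: replace with this adapter's repo id or local path

# Load the tokenizer and the base model (bf16 weights, automatic device placement).
tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForCausalLM.from_pretrained(
    base_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

# Attach the PEFT adapter weights on top of the base model.
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

# Run a plain forward pass with the adapted model.
inputs = tokenizer("Example input text", return_tensors="pt").to(model.device)
with torch.no_grad():
    outputs = model(**inputs)
```

For the downstream span-identification setup that uses this adapter, see the Demo link above.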

### Framework versions

- PEFT 0.15.0