|
|
---
base_model:
- mistralai/Mistral-7B-Instruct-v0.3
datasets:
- noystl/Recombination-Extraction
language:
- en
library_name: transformers
license: cc
pipeline_tag: feature-extraction
---
|
|
|
|
|
This repository hosts a fine-tuned Mistral model that classifies scientific abstracts according to whether they describe idea recombination, as introduced in the paper [CHIMERA: A Knowledge Base of Idea Recombination in Scientific Literature](https://huggingface.co/papers/2505.20779). Concretely, the model is a LoRA adapter trained on top of the Mistral-7B-Instruct-v0.3 base model.
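Below is a minimal loading-and-inference sketch using `transformers` and `peft`. The adapter id is a placeholder for this repository's Hub id, and the yes/no prompt is an illustrative assumption rather than the exact prompt used in the paper; see the linked GitHub repository for the authors' actual setup.

```python
# Minimal sketch (assumptions flagged below), not the authors' exact pipeline.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

BASE_ID = "mistralai/Mistral-7B-Instruct-v0.3"
ADAPTER_ID = "<this-repo-id>"  # placeholder: substitute this repository's Hub id

tokenizer = AutoTokenizer.from_pretrained(BASE_ID)
base_model = AutoModelForCausalLM.from_pretrained(
    BASE_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)
# Attach the fine-tuned LoRA adapter to the frozen base model.
model = PeftModel.from_pretrained(base_model, ADAPTER_ID)
model.eval()

abstract = "We combine retrieval-augmented generation with symbolic planners to ..."
# Illustrative yes/no prompt; the paper's prompt format may differ.
prompt = (
    "Does the following scientific abstract describe a recombination of ideas? "
    "Answer 'yes' or 'no'.\n\n" + abstract
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=5, do_sample=False)
# Decode only the newly generated tokens.
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```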
|
|
|
|
|
For detailed usage instructions and to reproduce the paper's results, please refer to the GitHub repository linked under Quick Links below.
|
|
|
|
|
**BibTeX**
|
|
```bibtex
@misc{sternlicht2025chimeraknowledgebaseidea,
      title={CHIMERA: A Knowledge Base of Idea Recombination in Scientific Literature},
      author={Noy Sternlicht and Tom Hope},
      year={2025},
      eprint={2505.20779},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2505.20779},
}
```
|
|
|
|
|
**Quick Links** |
|
|
- [Project](https://noy-sternlicht.github.io/CHIMERA-Web)
- [Paper](https://arxiv.org/abs/2505.20779)
- [Code](https://github.com/noy-sternlicht/CHIMERA-KB)