---
base_model:
- mistralai/Mistral-7B-Instruct-v0.3
datasets:
- noystl/Recombination-Extraction
language:
- en
library_name: transformers
license: cc
pipeline_tag: feature-extraction
---

This repository hosts a fine-tuned Mistral model that classifies scientific abstracts according to whether they describe idea recombination, as introduced in the paper [CHIMERA: A Knowledge Base of Idea Recombination in Scientific Literature](https://huggingface.co/papers/2505.20779). The model is a LoRA adapter trained on top of [mistralai/Mistral-7B-Instruct-v0.3](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.3).

For detailed usage instructions and to reproduce the results, please refer to the linked GitHub repository.
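In the meantime, here is a minimal loading sketch. It assumes the adapter weights in this repository can be applied with `peft`; the adapter repo id, prompt wording, and generation settings below are illustrative assumptions, not the project's documented inference recipe:

```python
# Minimal sketch: load the base model and apply the LoRA adapter with peft.
# The adapter repo id is a placeholder; the exact prompt template and
# inference settings are documented in the project's GitHub repository.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import PeftModel

base_id = "mistralai/Mistral-7B-Instruct-v0.3"
adapter_id = "<this-repo-id>"  # assumption: replace with this model's repo id

tokenizer = AutoTokenizer.from_pretrained(base_id)
base = AutoModelForCausalLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base, adapter_id)
model.eval()

# Illustrative classification prompt (hypothetical wording).
abstract = "We combine graph neural networks with symbolic planners to ..."
prompt = (
    "Does the following abstract describe an idea recombination? "
    f"Answer yes or no.\n\n{abstract}"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    out = model.generate(**inputs, max_new_tokens=5)
print(tokenizer.decode(out[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```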

**Bibtex**
```bibtex
@misc{sternlicht2025chimeraknowledgebaseidea,
      title={CHIMERA: A Knowledge Base of Idea Recombination in Scientific Literature}, 
      author={Noy Sternlicht and Tom Hope},
      year={2025},
      eprint={2505.20779},
      archivePrefix={arXiv},
      primaryClass={cs.CL},
      url={https://arxiv.org/abs/2505.20779}, 
}
```

**Quick Links**
- ๐ŸŒ [Project](https://noy-sternlicht.github.io/CHIMERA-Web)
- ๐Ÿ“ƒ [Paper](https://arxiv.org/abs/2505.20779)
- ๐Ÿ› ๏ธ [Code](https://github.com/noy-sternlicht/CHIMERA-KB)