Improve model card: Add pipeline tag, library name, RAG tag, sample usage, and citation
This PR enhances the model card for the HierSearch model by:
* Adding `pipeline_tag: question-answering` so the Hub correctly classifies the model's primary function of retrieving information and answering questions.
* Specifying `library_name: transformers` to indicate compatibility with the Hugging Face Transformers library, enabling the "Use in Transformers" button and code snippets.
* Including `retrieval-augmented-generation` in the tags for better discoverability, since the model is the planner agent of a deep search framework built on RAG principles.
* Adding a sample usage snippet to demonstrate how to run inference with the model using the `transformers` library.
* Adding the full BibTeX citation for proper attribution.
These updates improve the model's discoverability and usability on the Hugging Face Hub. The resulting metadata block is shown below, followed by the full diff.
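For quick reference, the model card front matter as proposed in this PR (assembled verbatim from the diff below) is:

```yaml
base_model:
- Qwen/Qwen2.5-7B-Instruct
language:
- en
- zh
license: mit
pipeline_tag: question-answering
library_name: transformers
tags:
- biology
- finance
- text-generation-inference
- retrieval-augmented-generation
```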
Full diff of the model card (`README.md`):

````diff
@@ -1,22 +1,27 @@
 ---
-license: mit
+base_model:
+- Qwen/Qwen2.5-7B-Instruct
 language:
 - en
 - zh
-base_model:
-- Qwen/Qwen2.5-7B-Instruct
+license: mit
+pipeline_tag: question-answering
+library_name: transformers
 tags:
 - biology
 - finance
 - text-generation-inference
+- retrieval-augmented-generation
 ---
 
+# HierSearch: A Hierarchical Enterprise Deep Search Framework Integrating Local and Web Searches
+
 ## Model Information
 
-We release agent model used in **HierSearch: A Hierarchical Enterprise Deep Search Framework Integrating Local and Web Searches**.
+We release the agent model used in **HierSearch: A Hierarchical Enterprise Deep Search Framework Integrating Local and Web Searches**.
 
 <p align="left">
-Useful links: 📝 <a href="https://arxiv.org/abs/2508.08088" target="_blank">Paper</a> • 🤗 <a href="https://huggingface.co/papers/2508.08088" target="_blank">Hugging Face</a> • 🧩 <a href="https://github.com/plageon/HierSearch" target="_blank">Github</a>
+Useful links: 📝 <a href="https://arxiv.org/abs/2508.08088" target="_blank">Paper (arXiv)</a> • 🤗 <a href="https://huggingface.co/papers/2508.08088" target="_blank">Paper (Hugging Face)</a> • 🧩 <a href="https://github.com/plageon/HierSearch" target="_blank">Github Repository</a>
 </p>
 
 1. We explore the deep search framework in multi-knowledge-source scenarios and propose a hierarchical agentic paradigm and train with HRL;
@@ -26,5 +31,43 @@ Useful links: 📝 <a href="https://arxiv.org/abs/2508.08088" target="_blank">Pa
 
 🌹 If you use this model, please ✨star our **[GitHub repository](https://github.com/plageon/HierSearch)** or upvote our **[paper](https://huggingface.co/papers/2508.08088)** to support us. Your star means a lot!
 
+## Usage
+
+This model is designed as a "planner agent" within the HierSearch framework, coordinating local and web searches to answer complex questions. It is based on `Qwen2.5-7B-Instruct`. You can load and use it with the `transformers` library for general text generation, or refer to the full codebase for the complete deep search functionality.
+
+```python
+from transformers import AutoModelForCausalLM, AutoTokenizer
+import torch
+
+model_name = "plageon/HierSearch-Planner-Agent"
+
+tokenizer = AutoTokenizer.from_pretrained(model_name)
+model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16, device_map="auto")
+
+messages = [
+    {"role": "system", "content": "You are a helpful and knowledgeable assistant specializing in enterprise search."},
+    {"role": "user", "content": "What are the main findings of the paper 'HierSearch: A Hierarchical Enterprise Deep Search Framework Integrating Local and Web Searches'?"}
+]
+
+text = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
+model_inputs = tokenizer([text], return_tensors="pt").to(model.device)
+
+generated_ids = model.generate(model_inputs.input_ids, max_new_tokens=512)
+decoded_output = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
+
+print(decoded_output)
+```
 
+## Citation
 
+```bibtex
+@misc{tan2025hiersearchhierarchicalenterprisedeep,
+      title={HierSearch: A Hierarchical Enterprise Deep Search Framework Integrating Local and Web Searches},
+      author={Jiejun Tan and Zhicheng Dou and Yan Yu and Jiehan Cheng and Qiang Ju and Jian Xie and Ji-Rong Wen},
+      year={2025},
+      eprint={2508.08088},
+      archivePrefix={arXiv},
+      primaryClass={cs.IR},
+      url={https://arxiv.org/abs/2508.08088},
+}
+```
````