---
license: bsd-3-clause-clear
language:
- en
tags:
- Transformer
- ONNX
- ocr
- mmocr
- satrn
---
# satrn
[original repo](https://github.com/open-mmlab/mmocr/blob/main/configs/textrecog/satrn/README.md)
## Conversion tools
If you are interested in model conversion, you can export the ONNX or axmodel files with
[satrn.axera](https://github.com/AXERA-TECH/satrn.axera).
## Installation
```
conda create -n open-mmlab python=3.8 pytorch=1.10 cudatoolkit=11.3 torchvision -c pytorch -y
conda activate open-mmlab
pip3 install openmim
git clone https://github.com/open-mmlab/mmocr.git
cd mmocr
mim install -e .
```
## Support Platform
- AX650
- [M4N-Dock (Sipeed MaixIV Pro)](https://wiki.sipeed.com/hardware/zh/maixIV/m4ndock/m4ndock.html)
- [M.2 Accelerator card](https://axcl-docs.readthedocs.io/zh-cn/latest/doc_guide_hardware.html)
Latency of the two parts of SATRN, (1) backbone + encoder and (2) decoder, under different NPU configurations:

|NPU configuration|backbone+encoder (ms)|decoder (ms)|
|--|--|--|
|NPU1|20.494|2.648|
|NPU2|9.785|1.504|
|NPU3|6.085|1.384|
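For a rough per-image latency estimate, the two stages can simply be summed. Note that the SATRN decoder is autoregressive, so if the reported decoder time covers a single step (this is not stated above), it would be incurred once per predicted character:

```python
# Stage latencies (ms) copied from the table above.
stage_ms = {
    "NPU1": (20.494, 2.648),
    "NPU2": (9.785, 1.504),
    "NPU3": (6.085, 1.384),
}

for npu, (backbone_encoder, decoder) in stage_ms.items():
    total = backbone_encoder + decoder  # one backbone+encoder pass, one decoder step
    print(f"{npu}: {total:.3f} ms (+{decoder:.3f} ms per extra decoded character)")
```

This ignores pre- and post-processing on the host CPU, which run_axmodel.py also has to perform.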
## How to use
Download all files from this repository to the device:
```
.
βββ axmodel
β βββ backbone_encoder.axmodel
β βββ decoder.axmodel
βββ demo_text_recog.jpg
βββ onnx
β βββ satrn_backbone_encoder.onnx
β βββ satrn_decoder_sim.onnx
βββ README.md
βββ run_axmodel.py
βββ run_model.py
βββ run_onnx.py
```
### Python environment requirements
#### 1. pyaxengine
https://github.com/AXERA-TECH/pyaxengine
```
wget https://github.com/AXERA-TECH/pyaxengine/releases/download/0.1.1rc0/axengine-0.1.1-py3-none-any.whl
pip install axengine-0.1.1-py3-none-any.whl
```
#### 2. satrn
[satrn installation](https://github.com/open-mmlab/mmocr/tree/main?tab=readme-ov-file#installation)
#### Inference with the ONNX model
```
python run_onnx.py
```
input: demo_text_recog.jpg

output:
```
pred_text: STAR
score: [0.9384028315544128, 0.9574984908103943, 0.9993689656257629, 0.9994958639144897]
```
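The score list contains one confidence per predicted character of `STAR`. If a single confidence for the whole word is needed, common choices are the mean or the product of the per-character scores; this aggregation is an assumption for illustration, not something run_onnx.py does:

```python
# Per-character scores as printed by run_onnx.py for "STAR".
scores = [0.9384028315544128, 0.9574984908103943,
          0.9993689656257629, 0.9994958639144897]

mean_conf = sum(scores) / len(scores)  # average character confidence
prod_conf = 1.0
for s in scores:                       # product treats scores as probabilities
    prod_conf *= s

print(f"mean: {mean_conf:.4f}, product: {prod_conf:.4f}")
```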
#### Inference with AX650 Host
See [satrn.axera](https://github.com/AXERA-TECH/satrn.axera) for more information.