Screen2AX
This model is a fine-tuned version of Ultralytics/YOLO11, trained to detect UI elements in macOS application screenshots.
It is part of the Screen2AX project, a research effort focused on generating accessibility metadata using computer vision.
This model detects common interactive components typically surfaced in accessibility trees on macOS. The detected classes are: AXButton, AXDisclosureTriangle, AXImage, AXLink, AXTextArea.
Hugging Face repository: MacPaw/Screen2AX-Element

Install the required packages:

pip install huggingface_hub ultralytics
from huggingface_hub import hf_hub_download
from ultralytics import YOLO
# Download the model
model_path = hf_hub_download(
    repo_id="macpaw-research/yolov11l-ui-elements-detection",
    filename="ui-elements-detection.pt",
)
# Load and run prediction
model = YOLO(model_path)
results = model.predict("/path/to/your/image")
# Display result
results[0].show()
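
Beyond displaying the annotated image, the detections can also be read programmatically: the Ultralytics results object exposes bounding boxes, class ids, and confidences via results[0].boxes, and results[0].names maps class ids back to the class names listed above. The sketch below is a minimal, illustrative example of turning those detections into structured, accessibility-style records; the element_records helper name and the 0.25 confidence threshold are assumptions, not part of the original card.

# Minimal sketch (not part of the original card): convert detections into
# accessibility-style records. The helper name and the 0.25 confidence
# threshold are illustrative assumptions.
def element_records(result, min_conf=0.25):
    records = []
    for box in result.boxes:
        conf = float(box.conf[0])
        if conf < min_conf:
            continue
        cls_id = int(box.cls[0])
        x1, y1, x2, y2 = (float(v) for v in box.xyxy[0])
        records.append({
            "role": result.names[cls_id],  # e.g. "AXButton"
            "confidence": conf,
            "frame": {"x": x1, "y": y1, "width": x2 - x1, "height": y2 - y1},
        })
    return records

for record in element_records(results[0]):
    print(record["role"], record["confidence"], record["frame"])

Records like these, pairing a detected role with its bounding box, are the kind of per-element metadata the broader Screen2AX project aims to generate.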
This model is licensed under the GNU Affero General Public License v3.0 (AGPL-3.0), inherited from the Ultralytics YOLO11 base model.
If you use this model in your research, please cite the Screen2AX paper:
@misc{muryn2025screen2axvisionbasedapproachautomatic,
  title={Screen2AX: Vision-Based Approach for Automatic macOS Accessibility Generation},
  author={Viktor Muryn and Marta Sumyk and Mariya Hirna and Sofiya Garkot and Maksym Shamrai},
  year={2025},
  eprint={2507.16704},
  archivePrefix={arXiv},
  primaryClass={cs.LG},
  url={https://arxiv.org/abs/2507.16704},
}
Learn more at https://research.macpaw.com
Base model: Ultralytics/YOLO11