# Depth-Anything-3
This version of Depth-Anything-3 has been converted to run on the Axera NPU using w8a16 quantization.
Compatible with Pulsar2 version: 5.0-patch1
## Convert tools links

If you are interested in model conversion, you can export the axmodel yourself with the Pulsar2 toolchain; the AXera Platform repository contains the detailed guide.
## Support Platform
- AX650
- AX630C
| Chip | Inference time (ms) |
|---|---|
| AX650 | - |
| AX630C | - |
## How to use

Download all files from this repository to the device.
### Python environment requirements

pyaxengine: https://github.com/AXERA-TECH/pyaxengine

```
wget https://github.com/AXERA-TECH/pyaxengine/releases/download/0.1.3.rc2/axengine-0.1.3-py3-none-any.whl
pip install axengine-0.1.3-py3-none-any.whl
```
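If the wheel installed correctly, you should be able to open the axmodel from Python. The snippet below is a minimal sketch assuming pyaxengine's onnxruntime-style `InferenceSession` API; the model path refers to the file shipped in this repository.

```python
# Quick sanity check for the pyaxengine install (illustrative sketch).
import axengine as axe

# Model file from this repository; adjust the path to where you downloaded it.
session = axe.InferenceSession("models/da3metric-large.axmodel")

# pyaxengine mirrors the onnxruntime session API, so get_inputs()/get_outputs()
# describe the model's tensors.
for t in session.get_inputs():
    print("input :", t.name, t.shape)
for t in session.get_outputs():
    print("output:", t.name, t.shape)
```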
### Other dependencies

None should be required beyond the above.
### Inference on an AX650 host, such as the M4N-Dock (爱芯派Pro)
Input image:
```
root@ax650:~/AXERA-TECH/Depth-Anything-3# python3 python/infer.py --model models/da3metric-large.axmodel --img examples/demo01.jpg
[INFO] Available providers: ['AxEngineExecutionProvider']
[INFO] Using provider: AxEngineExecutionProvider
[INFO] Chip type: ChipType.MC50
[INFO] VNPU type: VNPUType.DISABLED
[INFO] Engine version: 2.12.0s
[INFO] Model type: 2 (triple core)
[INFO] Compiler version: 3.3 ae03a08f
root@ax650:~/AXERA-TECH/Depth-Anything-3# ls
```
Output image:
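For reference, the sketch below outlines roughly what an inference script such as `python/infer.py` needs to do: load the axmodel with pyaxengine, resize the image to the model's input resolution, run the session, and write the predicted depth map as a normalized grayscale image. The NCHW input layout, the float [0, 1] normalization, and the assumption that the first output is the depth map are all guesses; check the actual script and the model's reported input shape before relying on it.

```python
# Illustrative sketch only -- not the repo's infer.py.
# Input layout, normalization, and output interpretation are assumptions.
import cv2
import numpy as np
import axengine as axe

MODEL_PATH = "models/da3metric-large.axmodel"
IMAGE_PATH = "examples/demo01.jpg"

session = axe.InferenceSession(MODEL_PATH)
inp = session.get_inputs()[0]      # single image input assumed
n, c, h, w = inp.shape             # NCHW layout assumed

# Load and resize the image to the model's expected resolution.
img = cv2.imread(IMAGE_PATH)
img = cv2.cvtColor(img, cv2.COLOR_BGR2RGB)
img = cv2.resize(img, (w, h))

# Plain [0, 1] float normalization assumed; the compiled model may instead
# expect raw uint8 input if preprocessing was folded in at conversion time.
x = (img.astype(np.float32) / 255.0).transpose(2, 0, 1)[None, ...]

# Run the model; the first output is assumed to be the depth map.
depth = np.squeeze(session.run(None, {inp.name: x})[0])

# Normalize to 0-255 for a quick visualization and save it.
vis = (depth - depth.min()) / (depth.max() - depth.min() + 1e-8)
cv2.imwrite("depth_vis.png", (vis * 255).astype(np.uint8))
print("saved depth_vis.png, depth shape:", depth.shape)
```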