MobileNet-v2-INT8: Image Classification

MobileNet-v2 is an efficient deep convolutional neural network designed for mobile and embedded devices. It improves on MobileNet-v1 with two ideas: inverted residual blocks, which expand a low-dimensional input to a wider representation, filter it with cheap depthwise convolutions, and project it back down; and linear bottlenecks, which drop the non-linearity on the narrow projection layer so that low-dimensional features are not destroyed. Together these substantially reduce parameters and computation with little loss of accuracy, making the network well suited to resource-constrained environments. It is widely used for tasks such as image classification, object detection, and semantic segmentation.
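The expand → depthwise → project pattern above can be illustrated by tracing channel counts through a single block. This is a minimal pure-Python sketch; the default expansion ratio of 6 and the shortcut rule follow the general MobileNet-v2 design, not this model's exact layer list:

```python
def inverted_residual(in_ch, out_ch, stride, expand_ratio=6):
    """Trace channel counts through one MobileNet-v2 inverted residual block.

    The block expands a narrow input to a wide hidden representation,
    filters it with a cheap depthwise 3x3 convolution, then projects it
    back down through a *linear* 1x1 convolution (no ReLU6 on the
    projection, hence "linear bottleneck")."""
    hidden = in_ch * expand_ratio
    layers = []
    if expand_ratio != 1:
        layers.append(("expand_1x1 + ReLU6", hidden))
    layers.append(("depthwise_3x3 + ReLU6", hidden))
    layers.append(("project_1x1 (linear)", out_ch))
    # The residual shortcut is only used when input and output shapes match.
    use_residual = (stride == 1 and in_ch == out_ch)
    return layers, use_residual

layers, res = inverted_residual(24, 24, stride=1)
# 24 channels expand to 144, are filtered depthwise, then project back to 24,
# and the residual shortcut is active.
```

The key point the trace shows: the expensive spatial filtering happens depthwise in the wide (144-channel) space, while the pointwise convolutions carry all cross-channel mixing, which is what keeps the block cheap.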

Source model

  • Input shape: 224x224
  • Number of parameters: 3.34M
  • Model size: 13.34 MB
  • Output shape: 1x1000
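
The reported model size is consistent with 32-bit weights: 3.34M parameters × 4 bytes ≈ 13.4 MB. A quick arithmetic check (assuming the size is in decimal megabytes and the source model stores FP32 weights; the INT8 deployable model would be roughly 4× smaller):

```python
params = 3.34e6                 # reported parameter count
fp32_mb = params * 4 / 1e6      # 4 bytes per FP32 weight -> 13.36 MB
int8_mb = params * 1 / 1e6      # 1 byte per INT8 weight  -> 3.34 MB
```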

Source model repository: MobileNet-v2

Performance Reference

The benchmark table on the model page lists Device, Backend, Precision, Inference Time, Accuracy Loss, and File Size; the measured values are displayed interactively on Model Farm.
Model Optimization

Click Model Conversion Reference in the Performance Reference panel on the right to view the model conversion steps.

Inference with AidLite SDK

The model performance benchmarks and inference example code provided on Model Farm are all based on the APLUX AidLite SDK.

SDK installation

For details, please refer to the AidLite Developer Documentation.

  • Install AidLite SDK
# install aidlite sdk c++ api
sudo aid-pkg -i aidlite-sdk

# install aidlite sdk python api
python3 -m pip install pyaidlite -i https://mirrors.aidlux.com --trusted-host mirrors.aidlux.com
  • Verify AidLite SDK
# aidlite sdk c++ check
python3 -c "import aidlite; print(aidlite.get_library_version())"

# aidlite sdk python check
python3 -c "import aidlite; print(aidlite.get_py_library_version())"

Inference example

  • Click Model & Test Code to download the model files and inference code. The file structure is shown below:
/model_farm_{model_name}_aidlite
    |__ models     # folder where model files are stored
    |__ python     # aidlite python model inference example
    |__ cpp        # aidlite cpp model inference example
    |__ README.md
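
Whichever example you run, the input must match the 224x224 shape listed above. A minimal NumPy preprocessing sketch follows; note that the ImageNet mean/std constants and the NCHW layout are common-practice assumptions, not taken from this repository, and an INT8 model may instead expect raw uint8 input with quantization handled inside the SDK:

```python
import numpy as np

def preprocess(img_u8):
    """img_u8: HxWx3 uint8 image, already resized to 224x224."""
    x = img_u8.astype(np.float32) / 255.0
    mean = np.array([0.485, 0.456, 0.406], dtype=np.float32)  # ImageNet stats (assumed)
    std = np.array([0.229, 0.224, 0.225], dtype=np.float32)   # ImageNet stats (assumed)
    x = (x - mean) / std
    x = x.transpose(2, 0, 1)[np.newaxis]  # HWC -> 1x3x224x224 (NCHW layout, assumed)
    return x

batch = preprocess(np.zeros((224, 224, 3), dtype=np.uint8))
```

The model's 1x1000 output then maps one score to each ImageNet class; the predicted label is the index of the largest score.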
License

  • Source Model: APACHE-2.0
  • Deployable Model: APLUX-MODEL-FARM-LICENSE