Ahmed Fawzy Gad published this piece about running machine learning on edge devices. Deep learning models created with TensorFlow require significant processing power to perform inference. Fortunately, there is a lightweight version of TensorFlow called TensorFlow Lite (TFLite for short) that allows such models to run on devices with limited capabilities, performing inference in less than a second.
In this tutorial, the author prepares a Raspberry Pi (RPi) to run a TFLite model for classifying images. After that, the TFLite version of the MobileNet model is downloaded and used to make predictions on-device.
The sections covered in this tutorial are as follows:
- Accessing Raspberry Pi from PC
- Preparing TFLite in Raspberry Pi
- Downloading MobileNet
- Classifying a single image
This tutorial assumes that you already have a TensorFlow model converted into a TensorFlow Lite model. If not, there are plenty of TensorFlow Lite models available for download; the author decided to use the Lite version of MobileNet. You get a great step-by-step tutorial plus links to other resources. Great read!
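The on-device classification step described above could look roughly like the sketch below, using the `tflite_runtime` Python package's `Interpreter` API. The model and label file names (a quantized 224x224 MobileNet v1 and its labels file) are assumptions for illustration, not taken from the original article:

```python
# Rough sketch of classifying one image with a quantized MobileNet in
# TFLite. The file names passed to classify() are assumptions.
import numpy as np

def preprocess(pixels):
    """Cast to uint8 (quantized MobileNet input) and add a batch dimension."""
    return np.expand_dims(np.asarray(pixels, dtype=np.uint8), axis=0)

def classify(model_path, label_path, image_path):
    # Imported here so preprocess() stays usable without TFLite installed.
    from tflite_runtime.interpreter import Interpreter
    from PIL import Image

    interpreter = Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    input_details = interpreter.get_input_details()[0]
    output_details = interpreter.get_output_details()[0]

    # MobileNet v1 expects 224x224 RGB input.
    image = Image.open(image_path).convert("RGB").resize((224, 224))
    interpreter.set_tensor(input_details["index"], preprocess(image))
    interpreter.invoke()
    scores = interpreter.get_tensor(output_details["index"])[0]

    with open(label_path) as f:
        labels = [line.strip() for line in f]
    top = int(np.argmax(scores))
    return labels[top], scores[top]
```

On the Pi you would call something like `classify("mobilenet_v1_1.0_224_quant.tflite", "labels.txt", "cat.jpg")` after installing the `tflite-runtime` and `Pillow` packages.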
[Read More]