Machine Learning: Integrating Tensorflow in Android
As we all know, Google has open-sourced a library called TensorFlow that can be used in Android for implementing machine learning.
TensorFlow is an open-source software library for Machine Intelligence provided by Google.
I searched the internet a lot but did not find a simple way or a simple example to build TensorFlow for Android. After going through many resources, I was able to build it. Then, I decided to write about it so that others would not have to spend the same time.
Credit: The classifier example has been taken from Google TensorFlow example.
This article is for those who are already familiar with machine learning and know how to build a model for machine learning (for this example, I will be using a pre-trained model). Soon, I will be writing a series of articles on machine learning so that everybody can learn how to build a model for machine learning.
Edit: There is no longer any need to build the library as described below, as it is now available through Maven. Check this pull request.
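With the Maven artifact available, pulling the library in is a one-line Gradle dependency, something like the following (the exact artifact name and version should be checked against the current release before use):

```groovy
// app/build.gradle — pulls the prebuilt TensorFlow Android library,
// replacing the manual bazel build described below.
dependencies {
    compile 'org.tensorflow:tensorflow-android:+'
}
```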
Let’s start with the build process for Android.
A few important pointers that we should know:
- The core of TensorFlow is written in C++.
- In order to build for Android, we have to use JNI (Java Native Interface) to call the C++ functions like loadModel, getPredictions, etc.
- We will have a .so (shared object) file, which is the compiled C++ code, and a jar file consisting of the Java API that calls into the native C++. Then, we will call the Java API to get things done easily.
- So, we need the jar (Java API) and a .so (compiled C++) file.
- We must have the pre-trained model file and a label file for the classification.
We will be building the object-detection demo shown below.
So let’s build the jar and .so file.
git clone --recurse-submodules https://github.com/tensorflow/tensorflow.git
--recurse-submodules is important to pull submodules.
Download NDK from here.
Download the Android SDK, or we can provide the path to the Android Studio SDK.
Install Bazel from here. Bazel is the primary build system for TensorFlow.
Now, edit the WORKSPACE file; we can find it in the root directory of the TensorFlow repository that we cloned earlier.
# Uncomment and update the paths in these entries to build the Android demo.
#android_sdk_repository(
#    name = "androidsdk",
#    api_level = 23,
#    build_tools_version = "25.0.1",
#    # Replace with path to Android SDK on your system
#    path = "<PATH_TO_SDK>",
#)
#
#android_ndk_repository(
#    name="androidndk",
#    path="<PATH_TO_NDK>",
#    api_level=14)
Uncomment and update these entries with our SDK and NDK paths, like below:
android_sdk_repository(
    name = "androidsdk",
    api_level = 23,
    build_tools_version = "25.0.1",
    # Replace with path to Android SDK on your system
    path = "/Users/amitshekhar/Library/Android/sdk/",
)
android_ndk_repository(
    name="androidndk",
    path="/Users/amitshekhar/Downloads/android-ndk-r13/",
    api_level=14)
Then build the .so file.
bazel build -c opt //tensorflow/contrib/android:libtensorflow_inference.so --crosstool_top=//external:android/crosstool --host_crosstool_top=@bazel_tools//tools/cpp:toolchain --cpu=armeabi-v7a
Replace armeabi-v7a with the desired target architecture.
The library will be located at:
To build the Java counterpart:
bazel build //tensorflow/contrib/android:android_tensorflow_inference_java
We can find the JAR file at:
Now we have both the jar and the .so file. I have already built both, so you can use them directly from the project below.
I have created a complete running sample application here.
But we still need the pre-trained model and the label file.
In this example, we will use the Google pre-trained model which does the object detection on a given image.
We can download the model from here.
Unzip it, and we will get imagenet_comp_graph_label_strings.txt (labels for the objects) and tensorflow_inception_graph.pb (the pre-trained model).
Now, create an Android sample project in Android Studio.
Put imagenet_comp_graph_label_strings.txt and tensorflow_inception_graph.pb into the assets folder.
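imagenet_comp_graph_label_strings.txt is a plain text file with one label per line; at runtime we read it into a list so that index i of the model's output maps to line i of the file. A minimal parsing sketch (the class name is illustrative, and the Android AssetManager stream — context.getAssets().open(...) — is replaced here by a plain InputStream so the helper stays self-contained):

```java
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.InputStream;
import java.io.InputStreamReader;
import java.io.UncheckedIOException;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.List;

public class LabelLoader {

    // On Android the stream would come from
    // context.getAssets().open("imagenet_comp_graph_label_strings.txt").
    public static List<String> readLabels(InputStream in) {
        List<String> labels = new ArrayList<>();
        try (BufferedReader reader =
                new BufferedReader(new InputStreamReader(in, StandardCharsets.UTF_8))) {
            String line;
            while ((line = reader.readLine()) != null) {
                String trimmed = line.trim();
                if (!trimmed.isEmpty()) {
                    labels.add(trimmed); // one label per non-empty line
                }
            }
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
        return labels;
    }

    public static void main(String[] args) {
        String fileContents = "kit fox\nEnglish setter\nSiberian husky\n";
        InputStream in = new ByteArrayInputStream(
                fileContents.getBytes(StandardCharsets.UTF_8));
        List<String> labels = readLabels(in);
        System.out.println(labels.size()); // → 3
        System.out.println(labels.get(1)); // → English setter
    }
}
```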
Put libandroid_tensorflow_inference_java.jar into the libs folder, then right-click it and add it as a library.
Create a jniLibs folder in the main directory and put libtensorflow_inference.so into the jniLibs/armeabi-v7a/ folder.
Now, we will be able to call TensorFlow Java API.
The TensorFlow Java API exposes all the required methods through a single class.
Now, we have to call the TensorFlow Java API with the model path to load the model.
And, then we can provide the input image to get the prediction.
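For the Inception model, running inference yields one confidence score per line of the label file, so mapping the output back to a label name is a plain argmax over that array. A minimal sketch of this post-processing step (the class and method names here are illustrative helpers, not part of the TensorFlow API):

```java
import java.util.Arrays;
import java.util.List;

public class PredictionMapper {

    /**
     * Returns the label with the highest confidence.
     * `confidences` is the model's output tensor (one score per class);
     * `labels` is the parsed label file, where index i matches class i.
     */
    public static String bestLabel(float[] confidences, List<String> labels) {
        int best = 0;
        for (int i = 1; i < confidences.length; i++) {
            if (confidences[i] > confidences[best]) {
                best = i; // track the index of the highest score
            }
        }
        return labels.get(best);
    }

    public static void main(String[] args) {
        List<String> labels =
                Arrays.asList("kit fox", "English setter", "Siberian husky");
        float[] confidences = {0.02f, 0.91f, 0.07f};
        System.out.println(bestLabel(confidences, labels)); // → English setter
    }
}
```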
To see it completely working, clone the project, build and run.
If you are getting any problem in building the project, connect with me, I will be happy to help.
Update: Check Creating Custom Model For Android Using TensorFlow Here.
Happy Coding 🙂