NEO Share

Sharing The Latest Tech News
Custom TensorFlow Lite model on Android using Firebase ML

February 16, 2021 by systems

Once your machine learning model is ready, you have to deploy it to a device. One way to do that is to ship the model inside the application itself. The drawback of this approach is that every time the model changes, you need to publish a new APK, and because every app update has to pass store review, rolling out a new model takes time. Now imagine being able to update the model over the air, without shipping a new application at all. In this article, we will see how that can be done using Firebase Machine Learning.

Getting Started

Before we can get this plane in the air, we first need to ensure that we’ve connected to Firebase. Head over to the Firebase Console and create a project. After that, you need to register your application with that project.

Next, download the google-services.json file and place it in the app folder of your application. This file contains configuration that is specific to the Firebase project and the app you just registered.

The next step is to add the Firebase SDK to the application. Add the following to the project-level build.gradle file.
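The original snippet did not survive extraction, so here is a sketch of a typical project-level build.gradle for this setup; the plugin version shown is illustrative, not necessarily the one used in the original post.

```groovy
buildscript {
    repositories {
        // Google's Maven repository hosts the Firebase and Google Services artifacts
        google()
        mavenCentral()
    }
    dependencies {
        // Google Services Gradle plugin; version is an example, use the current release
        classpath 'com.google.gms:google-services:4.3.5'
    }
}

allprojects {
    repositories {
        google()
        mavenCentral()
    }
}
```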

Add the Google Services plugin to the App-level build.gradle file.
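The snippet here was also lost in extraction; applying the plugin conventionally looks like the following, at the top of the app-level build.gradle.

```groovy
apply plugin: 'com.android.application'
// Reads google-services.json and wires the app up to Firebase
apply plugin: 'com.google.gms.google-services'
```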

Let’s also declare that TFLite files shouldn’t be compressed. This is crucial because we will load an on-device model before the online model is downloaded. We do this to ensure that the user gets the expected output before the model is downloaded.
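A typical way to declare this in the same app-level build.gradle is shown below; the aaptOptions block is the standard mechanism for excluding file extensions from APK compression.

```groovy
android {
    aaptOptions {
        // Leave .tflite files uncompressed so the interpreter can memory-map them
        noCompress "tflite"
    }
}
```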

In the same file define the dependencies needed for this application.

The Firebase Android BoM (Bill of Materials) lets you specify only the BoM version; the BoM then ensures that the Firebase library versions used in your application are compatible with one another. When you update the BoM, every Firebase library moves to the version associated with that BoM release. Note that when the BoM is used, you do not declare versions on the individual Firebase libraries.

com.google.firebase:firebase-ml-model-interpreter is the dependency for the Firebase ML Custom Models library. Dependencies for TensorFlow Lite are also included.
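Since the original dependencies block was stripped, the following is a sketch of what it likely looked like; the BoM and TensorFlow Lite version numbers are illustrative assumptions.

```groovy
dependencies {
    // Firebase BoM pins compatible versions of all Firebase libraries
    implementation platform('com.google.firebase:firebase-bom:26.5.0')
    // Firebase ML Custom Models library (no version needed under the BoM)
    implementation 'com.google.firebase:firebase-ml-model-interpreter'
    // TensorFlow Lite runtime for on-device inference; version is illustrative
    implementation 'org.tensorflow:tensorflow-lite:2.4.0'
}
```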

Since internet access is needed in order to download the model, you’ll need to include that in the Android Manifest file.
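The permission declaration goes inside the manifest element of AndroidManifest.xml:

```xml
<uses-permission android:name="android.permission.INTERNET" />
```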

Deploy the model

Next, you need to upload your model to Firebase. In this case, let's use a pre-trained MobileNet model; however, you can train and upload your own custom model instead. In the Machine Learning section of the Firebase console, click "Add custom model" to get started.

After this, you will be prompted to name and upload your model. Note that this is the name that you will use to download the model.
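As a preview of what that name is used for, here is a hedged Kotlin sketch of downloading the hosted model with the Firebase ML Custom Models API; the model name "mobilenet_v1" and the Wi-Fi-only download condition are assumptions for illustration.

```kotlin
import com.google.firebase.ml.common.modeldownload.FirebaseModelDownloadConditions
import com.google.firebase.ml.common.modeldownload.FirebaseModelManager
import com.google.firebase.ml.custom.FirebaseCustomRemoteModel

// "mobilenet_v1" must match the name given when the model was uploaded
val remoteModel = FirebaseCustomRemoteModel.Builder("mobilenet_v1").build()

// Only download over Wi-Fi; relax or tighten these conditions as needed
val conditions = FirebaseModelDownloadConditions.Builder()
    .requireWifi()
    .build()

FirebaseModelManager.getInstance()
    .download(remoteModel, conditions)
    .addOnSuccessListener {
        // The model file is now available locally and can be handed to the interpreter
    }
```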

Filed Under: Machine Learning
