Fritz AI in Flutter Applications

Add Fritz Vision in Flutter Applications

Derrick Mwiti
6 min read · Dec 21, 2020


In order to work with Fritz AI in Flutter, you’ll need some knowledge of how to write native Java/Kotlin code from Flutter. At the moment there is no Flutter plugin for Fritz AI; however, since Flutter allows you to write native code, Fritz AI can still be integrated. Let’s look at how the image labeling* API from Fritz can be used in a Flutter app.

*Note: Flutter also allows you to write other kinds of native code (Obj-C, Swift), but for the purposes of this tutorial, we’ll focus on Java.

Getting Started

The first step is to jump into your favorite editor and create a project. The default language for Android in Flutter applications is Kotlin. You can change that by declaring Java explicitly on the terminal when you create your project:

flutter create -i objc -a java native_code

Once you do, copy the application ID. You can find it in your app-level build.gradle file (android/app/build.gradle).
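It will look something like this; the application ID below is only a placeholder, so copy the value from your own file:

android {
    defaultConfig {
        // This is the value to copy into the Fritz AI dashboard.
        // "com.example.native_code" is a placeholder; yours will differ.
        applicationId "com.example.native_code"
    }
}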

Next, you’ll need a Fritz AI account. For this project, you’ll want to select the “Pre-Trained Models” option. Just as a note, Fritz AI also has a model-building platform (Fritz AI Studio) that allows you to build custom, mobile-ready models end to end.

Once you’re logged in, follow the four prompts to register your new application. First, select your target platform (here, Android).

Give your application a name and enter your application ID. Make sure the application ID matches the one in your app’s build.gradle file exactly. Otherwise, the Fritz SDK won’t be able to communicate with your application.

The next step is to install the Fritz SDK. Jump over to your root-level Gradle file (build.gradle) and include the Maven repository for Fritz AI:
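As a sketch, the change looks like this; the repository URL is the one given in the Fritz AI docs at the time of writing, so double-check it against the current docs:

allprojects {
    repositories {
        google()
        jcenter()
        // Maven repository hosting the Fritz AI artifacts.
        maven { url "https://fritz.mycloudrepo.io/public/repositories/android" }
    }
}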

Next, add the dependencies for the SDK to your app-level Gradle file (app/build.gradle). We add the Fritz Core, Vision, and Image Labeling dependencies. Note that bundling the Image Labeling model will increase the size of your application. Once you’ve changed the Gradle files, sync your project so that all the necessary dependencies are downloaded.
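The dependency block looks roughly like this. The artifact names and version below are indicative only; take the exact coordinates and the latest version from the Fritz AI docs:

dependencies {
    // Replace fritzVersion with the latest release listed in the Fritz AI docs.
    def fritzVersion = "4.+"

    // Fritz Core and Vision.
    implementation "ai.fritz:core:${fritzVersion}"
    implementation "ai.fritz:vision:${fritzVersion}"

    // Bundled on-device image labeling model; this is what grows the app size.
    // Confirm the exact artifact name for the labeling model in the Fritz docs.
    implementation "ai.fritz:vision-labeling-model-fast:${fritzVersion}"
}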

Now, let’s register the FritzCustomModelService in our AndroidManifest:
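A sketch of the entry; the fully qualified class name here assumes the ai.fritz.core package used by the SDK, so verify it against the Fritz docs:

<application>
    <!-- Registers Fritz's job service, which handles model management for the SDK. -->
    <service
        android:name="ai.fritz.core.FritzCustomModelService"
        android:exported="true"
        android:permission="android.permission.BIND_JOB_SERVICE" />
</application>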

The only thing remaining now is to initialize the SDK by calling Fritz.configure() with our API key. This is done in MainActivity.java, inside the android folder.
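Here’s a minimal sketch of that call, assuming the Flutter v2 Android embedding and a placeholder API key; confirm the exact Fritz.configure() signature for the SDK version you installed. We’ll extend this same method later when we wire up the platform channel.

package com.example.native_code; // placeholder; use your own package

import androidx.annotation.NonNull;

import io.flutter.embedding.android.FlutterActivity;
import io.flutter.embedding.engine.FlutterEngine;

import ai.fritz.core.Fritz;

public class MainActivity extends FlutterActivity {
    @Override
    public void configureFlutterEngine(@NonNull FlutterEngine flutterEngine) {
        super.configureFlutterEngine(flutterEngine);
        // Initialize the Fritz SDK; replace the placeholder with your own API key.
        Fritz.configure(this, "YOUR_API_KEY");
    }
}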

With that in place, click next to verify that your application is able to communicate with Fritz AI.

Use Fritz Pre-trained Models

The pre-trained model we’ll use here has labels for more than 680 common objects. The model will give us the predicted label accompanied by a confidence score.


According to the Fritz AI docs, we also need to specify aaptOptions in order to prevent compression of the tflite model. After adding it, your app/build.gradle file will look like this:
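Here’s a sketch; the SDK versions and application ID below are placeholders, so keep the values already in your own file:

android {
    compileSdkVersion 29

    defaultConfig {
        applicationId "com.example.native_code"   // placeholder
        minSdkVersion 21
        targetSdkVersion 29

        // RenderScript support, used to speed up image processing.
        renderscriptTargetApi 21
        renderscriptSupportModeEnabled true
    }

    // Keep the bundled .tflite model uncompressed so the SDK can read it.
    aaptOptions {
        noCompress "tflite"
    }
}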

As you can see, RenderScript support is also added in order to improve image processing performance.

The App Elements

This application contains just two key elements:

  • a Text widget that will display the label of the image
  • a Button that processes the image when clicked

Here is the entire widget tree for reference.
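It boils down to a Column holding those two widgets. The names _label and _labelImage below are placeholders, defined in the full Dart file later in this post:

Scaffold(
  appBar: AppBar(title: Text('Fritz Image Labeling')),
  body: Center(
    child: Column(
      mainAxisAlignment: MainAxisAlignment.center,
      children: [
        // Displays the label returned by the native code.
        Text(_label),
        // Kicks off the platform channel call that runs the model.
        ElevatedButton(
          onPressed: _labelImage,
          child: Text('Label Image'),
        ),
      ],
    ),
  ),
)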

Model optimization is only one part of a complicated development lifecycle for mobile ML projects. To explore in more depth, download our free ebook for full lifecycle coverage and best practices from our team of experts.

Obtaining the Image

For simplicity, we’ll use an image from the drawable folder, so add the image you’d like to experiment with to that folder. That said, you could also work with an image from the web or ask the user to select one; we’ve done something similar to that here. We’ll get to the image loading in a moment. First, let’s see how we can get Flutter to communicate with Java.

Calling Android Code Using Platform Channels

In Flutter, messages are passed between the Flutter side and the native side via a platform channel. A message is sent to the host operating system (in this case, Android), which does the work and sends a response back to Flutter. In this transaction, Flutter is the client and the native code is the host. The transaction is asynchronous, so it returns a Future. We’ll see this shortly.

For this transaction to succeed, we have to add a MethodChannel in Flutter. Its purpose is to communicate with platform plugins via asynchronous method calls. The name given to the channel has to be the same in Flutter as it is in the Java code; if the names differ, the communication will fail.

On the Java side, it is defined as CHANNEL. Since Dart (the language Flutter apps are written in) and Java have different type systems, serialization and deserialization of the data passed over the channel are handled automatically.

With that out of the way, let’s start with the Flutter side of things. Begin by importing the services and async packages.
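At the top of the Dart file:

import 'dart:async';                      // Future support
import 'package:flutter/services.dart';   // MethodChannel and PlatformException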

The next step is to define the MethodChannel. You can give it any name as long as it's the same on the Java side of things.
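For example, using 'fritz.ai/label' as a placeholder name; any string works as long as the Java side uses the same one:

// Must match the CHANNEL constant defined in MainActivity.java.
static const platform = MethodChannel('fritz.ai/label');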

Let’s now define the function that will handle the communication between the host and Flutter. As mentioned earlier, the process is async, and so we define a function that returns a Future.
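Here’s a sketch, assuming the widget’s state holds a _label string that the UI displays and that the native method is registered under the name 'labelImage':

// Asks the native side to run the Fritz image labeling model and stores the result.
Future<void> _labelImage() async {
  String label;
  try {
    final String result = await platform.invokeMethod('labelImage');
    label = result;
  } on PlatformException catch (e) {
    label = "Failed to label image: '${e.message}'.";
  }
  // Update _label so the Text widget re-renders with the new value.
  setState(() {
    _label = label;
  });
}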

This function will run once the button has been clicked. When the result is obtained, we display it on the Text widget; both the button and the Text widget are wired up in the full Dart file below.

Here’s the entire Dart file for reference:
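The sketch below is self-contained; the widget, variable, and channel names are placeholders introduced above:

import 'dart:async';

import 'package:flutter/material.dart';
import 'package:flutter/services.dart';

void main() => runApp(MyApp());

class MyApp extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return MaterialApp(home: LabelPage());
  }
}

class LabelPage extends StatefulWidget {
  @override
  _LabelPageState createState() => _LabelPageState();
}

class _LabelPageState extends State<LabelPage> {
  // Must match the CHANNEL constant in MainActivity.java.
  static const platform = MethodChannel('fritz.ai/label');

  String _label = 'Press the button to label the image';

  // Invokes the native labelImage method over the platform channel.
  Future<void> _labelImage() async {
    String label;
    try {
      final String result = await platform.invokeMethod('labelImage');
      label = result;
    } on PlatformException catch (e) {
      label = "Failed to label image: '${e.message}'.";
    }
    setState(() {
      _label = label;
    });
  }

  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: Text('Fritz Image Labeling')),
      body: Center(
        child: Column(
          mainAxisAlignment: MainAxisAlignment.center,
          children: [
            // Shows the predicted labels and confidence scores.
            Text(_label),
            // Runs the model via the platform channel when tapped.
            ElevatedButton(
              onPressed: _labelImage,
              child: Text('Label Image'),
            ),
          ],
        ),
      ),
    );
  }
}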

The Java Side

In MainActivity.java, define the CHANNEL with the same name as the one used in Flutter. MainActivity.java is found in the android folder.
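Using the example channel name from the Flutter side (any name works as long as both sides agree):

public class MainActivity extends FlutterActivity {
    // Must be identical to the channel name used in the Dart code.
    private static final String CHANNEL = "fritz.ai/label";
    // The rest of the class is filled in below.
}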

The next step is to define the MethodChannel and set a MethodCallHandler. Notice the labelImage method: it returns the label as a string, which we then attach to the result.
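A sketch of that setup, extending the configureFlutterEngine method from earlier; the method and channel names are the placeholders introduced above:

// Also requires: import io.flutter.plugin.common.MethodChannel;
@Override
public void configureFlutterEngine(@NonNull FlutterEngine flutterEngine) {
    super.configureFlutterEngine(flutterEngine);

    // Initialize Fritz before any predictions are made.
    Fritz.configure(this, "YOUR_API_KEY");

    new MethodChannel(flutterEngine.getDartExecutor().getBinaryMessenger(), CHANNEL)
        .setMethodCallHandler((call, result) -> {
            if (call.method.equals("labelImage")) {
                // Run the prediction and hand the label string back to Flutter.
                result.success(labelImage());
            } else {
                result.notImplemented();
            }
        });
}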

Next, let’s define that method. We start by obtaining the image as a Bitmap, which is what Fritz AI requires. After that, we:

  • create a FritzVisionImage from it
  • create an instance of the on-device image labeling model
  • define the FritzVisionLabelPredictor
  • run the prediction and obtain the result
  • get the result as a string.

This gives us the label and the confidence level and, finally, we return that result:
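Putting those steps together, the method looks roughly like this. The Fritz class and method names below follow the patterns in the Fritz AI Android docs, so verify them against the SDK version you installed, and replace R.drawable.test_image with your own image:

private String labelImage() {
    // 1. Load the bundled drawable as a Bitmap (what Fritz AI expects).
    Bitmap bitmap = BitmapFactory.decodeResource(getResources(), R.drawable.test_image);

    // 2. Create a FritzVisionImage from it.
    FritzVisionImage visionImage = FritzVisionImage.fromBitmap(bitmap);

    // 3. Create an instance of the on-device image labeling model and its predictor.
    LabelingOnDeviceModel labelingModel = FritzVisionModels.getImageLabelingOnDeviceModel();
    FritzVisionLabelPredictor predictor = FritzVision.ImageLabeling.getPredictor(labelingModel);

    // 4. Run the prediction on the image.
    FritzVisionLabelResult labelResult = predictor.predict(visionImage);

    // 5. Return the labels and confidence scores as a single string.
    return labelResult.getResultString();
}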

Editor’s Note: Heartbeat is a contributor-driven online publication and community dedicated to exploring the emerging intersection of mobile app development and machine learning. We’re committed to supporting and inspiring developers and engineers from all walks of life.

Editorially independent, Heartbeat is sponsored and published by Fritz AI, the machine learning platform that helps developers teach devices to see, hear, sense, and think. We pay our contributors, and we don’t sell ads.

If you’d like to contribute, head on over to our call for contributors. You can also sign up to receive our weekly newsletters (Deep Learning Weekly and the Fritz AI Newsletter), join us on Slack, and follow Fritz AI on Twitter for all the latest in mobile machine learning.
