
Using ML Predictions in Mobile Apps With Couchbase Lite's Predictive Query API


In this article, see how to use ML predictions in mobile apps with Couchbase Lite's Predictive Query API.


Couchbase Lite is a full-fledged NoSQL JSON document database for mobile and desktop applications. Couchbase Lite’s Predictive Query API allows applications to leverage pre-trained Machine Learning (ML) models to run predictive queries against data stored in the application's local Couchbase Lite database in a convenient, fast, and always-available way. These predictions can be combined with predictions made against real-time data captured by your app to enable a range of compelling applications. 

In this post, I provide an overview of the feature, including context around why we built it and the kinds of applications it can enable. I also demonstrate the use of the Predictive Query API with an example. 

Machine Learning on Mobile — Motivation

Machine Learning has traditionally been done in the cloud. With access to “unbounded” compute and storage resources, it seemed like the only viable place to do it. While that may be true for certain applications, there are a number of reasons to perform machine learning locally on the mobile device.

Fast and Responsive

By avoiding the round trip to the cloud, prediction responses can be returned within a few milliseconds regardless of the state of the network connection. This matters because the responsiveness of an app is directly correlated with the end-user experience, which is important for app retention.

Always Available

By running machine learning locally on the device, you can make predictions even when disconnected. Your application and your data are always available. This is particularly relevant for field applications deployed in disconnected environments.

Data Privacy

Running predictions locally on the device means that sensitive data never has to leave the device. You can use this to build apps with highly personalized experiences without compromising data privacy.

Bandwidth Savings

By avoiding the need to transfer data to the cloud, you can save on bandwidth costs. This is particularly important in bandwidth-constrained environments where data plans come at a premium.

All of the above benefits align well with the paradigm of edge computing, where there is a growing need to bring storage and compute closer to the edge.

Mobile ML — Ecosystem

Mobile ML is a reality, and there are several ecosystem enablers that make it possible.

Powerful Mobile Devices

Mobile devices are becoming incredibly powerful. For instance, the recently launched iPhone 11 and Pixel 4 include dedicated neural engines. Innovation at the silicon layer makes it feasible to run ML models locally on mobile devices. By taking advantage of platform hardware acceleration capabilities, the models can run in a performant manner.

Mobile Optimized Models

Machine Learning models have traditionally been storage and compute intensive. There are several efforts and projects underway to optimize models for resource utilization while maintaining the required level of accuracy. There is a growing collection of open-source, pre-trained ML models, such as MobileNet and SqueezeNet, that are optimized to run on mobile devices.

Machine Learning Frameworks

All the major mobile platforms include support for Machine Learning frameworks and libraries that make it extremely easy for developers to integrate ML support into their apps. These frameworks leverage hardware acceleration and low-level APIs that make them very performant. Examples include Core ML for iOS, TensorFlow Lite for Android, ML.NET for Windows, and ML Kit, among others.

Creating ML Models

There is a growing number of open-source tools and platforms, such as TensorFlow and Create ML, that make it easier than ever to create machine learning models. You don’t need to be a data scientist to create one.

Predictive Query API

The Predictive Query API enables mobile apps to leverage pre-trained machine learning (ML) models to make predictions on Couchbase Lite data. The API can be used to combine predictions made on data input into the app in real time with predictions on app data stored in Couchbase Lite.

By building a prediction index at write time, users can see significant improvements in query performance. The prediction results can be materialized in Couchbase Lite and synced over to other clients via Sync Gateway.

Applications

There are several applications of Machine Learning on mobile, including image classification, voice recognition, object recognition, and content-based recommender systems. Combining those predictions with data stored in Couchbase Lite has the potential to revolutionize day-to-day activities across several industries. All of these applications benefit from running machine learning locally: guaranteed high availability, faster responses, efficient network bandwidth usage, and data privacy. Here are a few examples.

Retail and E-Commerce

Consider this workflow:

  • You walk into a store with a photo of an item of interest. For instance, a photo of a handbag that you saw on someone
  • You ask the sales clerk if the item is available
  • The sales clerk takes a photo of the item with his iPad
  • The captured image is used to search the local catalog database on his iPad 
  • If the item is available, you are directed to the right aisle. You can also be presented with alternatives and recommendations on other related items.
  • The captured image is deleted and never leaves the iPad

Image-based search augments the more traditional text-based and voice-based search experiences, which rely on the user’s ability to describe the item of interest. Applications with image-based search can significantly improve both the online and in-store shopping experience. While the advantages of image-based search are quite obvious in e-commerce applications, workflows such as the one described above provide an opportunity for traditional brick-and-mortar stores to compete with online shopping experiences.

Behind the scenes, the catalog/inventory app uses an Image Classifier ML model to identify the item captured with the tablet camera. Once identified, the app looks up the local Couchbase Lite database to check if the item is available at the store and retrieves other relevant details, such as whether it is in stock and the aisle where it can be found. The images never leave the store app and can be deleted locally after use, alleviating any privacy concerns.
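
As a rough sketch of how that catalog lookup could be expressed with the Predictive Query API (the "ItemClassifier" model name, its "label" output key, the "item" document fields, and the catalogDB/capturedPhotoBlob variables are illustrative assumptions, not part of any actual app):

Swift

import CouchbaseLiteSwift

// Assumed: an "ItemClassifier" model has been registered via
// Database.prediction.registerModel(...) and returns its top class under "label".
let input = Expression.dictionary(["photo": Expression.parameter("capturedPhoto")])
let prediction = Function.prediction(model: "ItemClassifier", input: input)

// Find catalog items whose category matches the predicted label.
let query = QueryBuilder
    .select(SelectResult.property("name"),
            SelectResult.property("aisle"),
            SelectResult.property("inStock"))
    .from(DataSource.database(catalogDB))
    .where(Expression.property("type").equalTo(Expression.string("item"))
        .and(Expression.property("category").equalTo(prediction.property("label"))))

// Bind the photo captured by the tablet camera as a query parameter.
let params = Parameters()
params.setValue(capturedPhotoBlob, forName: "capturedPhoto")
query.parameters = params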

Hospitality

Consider a self-ordering kiosk at a fast food restaurant. Here is a typical workflow.

  • You walk up to a self-ordering kiosk in a fast food restaurant
  • The kiosk is equipped with a camera that takes your image (with your permission, of course)
  • The captured face image is used to look up a local database of registered patrons to identify your preferences
  • The system can pull up order history, suggest meal orders, apply loyalty points, and so on. Your order can be placed using the credit card on file.
  • The captured image is deleted and never leaves the kiosk

Self-ordering kiosks are fairly ubiquitous and have revolutionized the meal ordering experience by reducing wait times, resulting in faster service. Equipping these kiosks with facial recognition technology can make that experience even more seamless and even faster.

Behind the scenes, the kiosk is running an app that uses a Facial Recognition ML model such as OpenFace to generate a unique fingerprint of the captured face image. The app then performs a similarity match between the fingerprint of the captured face image and the fingerprints of images in the Couchbase Lite database to identify the closest match.

Feature Walkthrough

The best way to understand how the API works is through an example. In this post, I am going to describe how you can implement the facial recognition application discussed above using the Predictive Query API. This is probably the more complicated of the two workflows. The image-based search app using a classifier model follows a similar pattern, except that it is a lot simpler.

  • Prerequisites
    • A Face Recognition ML model is available in the app. The ML model could have been bundled with the app or pulled down from an external repository, with techniques similar to this. The model takes in an image and outputs a “fingerprint” or “face embedding”, which is essentially a vector representation of image features.
    • The Couchbase Lite database is populated with relevant data. In our use case, it corresponds to a registered-user database with “user” type documents. Each user document includes a blob corresponding to a photo of the registered user.
  • Step 1: Register the ML model with Couchbase Lite
    You can run predictions using any ML model. As an app developer, you implement the PredictiveModel interface and register it with Couchbase Lite. This interface is very straightforward and defines a single predict() method. Any time the prediction function is invoked on Couchbase Lite, the underlying predict() method of your PredictiveModel is invoked. (A sketch of a minimal model implementation follows the registration snippet below.)

Swift

// 1: Register Model
Database.prediction.registerModel(mlModel, withName: modelName)


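For context, here is a minimal sketch of what such a model implementation could look like. The hypothetical FaceNetWrapper class and the "photo"/"embedding" keys are illustrative assumptions; the fixed part is the PredictiveModel protocol and its single predict() method. An instance of this class is what is passed as mlModel in the registration call above.

Swift

import CouchbaseLiteSwift

// A minimal sketch of a PredictiveModel wrapping a (hypothetical) face embedding model.
class FaceEmbeddingModel: PredictiveModel {
    // Hypothetical Core ML wrapper that turns image bytes into a face embedding.
    let faceNet = FaceNetWrapper()

    func predict(input: DictionaryObject) -> DictionaryObject? {
        // "photo" must match the key used in the input to Function.prediction().
        guard let photo = input.blob(forKey: "photo"),
              let data = photo.content,
              let embedding = faceNet.embedding(for: data) else {
            return nil
        }
        // Return the fingerprint under a known output key.
        let output = MutableDictionaryObject()
        output.setValue(embedding, forKey: "embedding")
        return output
    }
}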

  • Step 2: Create Predictive Index
    Create a predictive index on the images in the Couchbase Lite database by running a prediction on all the images using the registered ML model. While this step is optional, it is highly recommended to create the index, as it has a significant impact on query-time performance.

Swift

// 2: Build prediction index of predictions on images in database
// (imageInDB is the input expression over the stored photo blob,
//  e.g. Expression.dictionary(["photo": Expression.property("photo")]))
let index = IndexBuilder.predictiveIndex(model: modelName, input: imageInDB)
try db.createIndex(index, withName: "faceIndex")

// Generate fingerprint by running predictions on images in database
let fingerPrintOfImagesInDB = Function.prediction(model: modelName, input: imageInDB)



  • Step 3: Prediction on captured image
    Run a prediction on the image input into the app using the prediction function.

Swift

// 3: Generate fingerprint by running prediction on input image
// (inputPhoto is the input expression over the image captured by the app)
let fingerPrintOfInputImage = Function.prediction(model: modelName, input: inputPhoto)



  • Step 4: Similarity Match and User document query
    Run a similarity match between the captured image and the images in the database using one of the built-in vector distance functions, then query the Couchbase Lite database for the documents with the closest match.
Swift

// 4: Find distance between fingerprints (similarity match of prediction results)
let distanceBetweenImages = Function.cosineDistance(between: fingerPrintOfInputImage, and: fingerPrintOfImagesInDB)

// Query for documents in database with closest match (as measured by distance comparison)
let query = QueryBuilder
            .select(SelectResult.all())
            .from(DataSource.database(db))
            .orderBy(Ordering.expression(distanceBetweenImages).ascending())
            .limit(Expression.int(1))



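To complete the picture, here is a rough sketch of executing that query and reading back the best match. The "name" field is an assumption about the user document schema; also note that, depending on how your model shapes its output, you may need to select a specific output property of the prediction result (via the prediction function's property() accessor) before computing the distance.

Swift

// Execute the query and read the closest matching user document.
do {
    for result in try query.execute() {
        // With SelectResult.all(), each row is keyed by the database name.
        if let user = result.dictionary(forKey: db.name) {
            print("Closest match: \(user.string(forKey: "name") ?? "unknown")")
        }
    }
} catch {
    print("Query failed: \(error)")
}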
That’s it! In just 4 simple steps, you can use the Predictive Query API to implement a facial recognition application based on data in Couchbase Lite.

Resources

In this post, we discussed a few applications and walked through an example of how you would leverage Couchbase Lite's Predictive Query API. 

Here are direct links to some helpful resources:
- Predictive Query Documentation
- Sample App: a sample app demonstrating the use of the Predictive Query API with a classifier ML model

