
How Personalized App Experience Around AI Will Shape Mobile App Development


Let's take a look at how AI is going to change the way apps interact with humans. Are we already there? Let us know your opinion in the comments.


The speaker asks Google Assistant to book him a haircut appointment. Google Assistant places a call to a nearby salon. The receptionist picks up, and Google Assistant talks its way through the conversation, asking in a very human manner for an appointment, which she obligingly books. All of this happened during Google's annual keynote, I/O 2018, in front of thousands of people. Google Assistant passed the Turing Test with flying colors: the receptionist had no idea she was talking to a machine.


AI gives machines abilities we thought only living beings were capable of. With intelligence, machines can train themselves to learn new things much as humans do: by seeing, listening, talking, and observing others.

Today's portable devices, such as smartphones and tablets, sport more sensors than any other computing device. They can not only see and hear, but also detect variations in temperature, determine their orientation relative to the ground, and even measure the device's real-time speed and the humidity around it. But sensors alone aren't enough for AI.
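As a small illustration of what reading those sensors looks like in code, here is a minimal sketch that samples the accelerometer and device orientation on iOS with Apple's CoreMotion framework; the one-second update interval and the print statements are arbitrary choices for the example.

import CoreMotion

// Minimal sketch: sample the device's attitude (orientation) and acceleration
// using CoreMotion. Values are simply printed; a real app would feed them
// into its own logic or an on-device model.
let motionManager = CMMotionManager()

if motionManager.isDeviceMotionAvailable {
    motionManager.deviceMotionUpdateInterval = 1.0  // one reading per second
    motionManager.startDeviceMotionUpdates(to: .main) { motion, error in
        guard let motion = motion, error == nil else { return }
        // Attitude describes the device's orientation relative to a reference frame.
        print("pitch: \(motion.attitude.pitch), roll: \(motion.attitude.roll)")
        // User acceleration is the motion the user imparts, with gravity removed.
        print("acceleration x: \(motion.userAcceleration.x)")
    }
}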

AI capabilities need an insane amount of processing power. One such capability is natural language understanding (NLU); another is intelligent routing in a content delivery network. There is a reason these battery-operated devices are always connected to the internet and lean so heavily on cloud computing.

AI in Mobile App Development and NLU

A mobile app is essentially an assemblage of services borrowed from various providers by means of APIs and packed inside a native software package. Third-party APIs give an application the ability to fetch weather information, check the status of a stock, make a payment on an e-commerce website, and so on, at a user's request.

In a similar way, AI capabilities, particularly natural language understanding (NLU), can be added to an app through third-party APIs. There are many such APIs available. For devices running Android and iOS, Actions on Google (for the Google Assistant) and SiriKit are the most popular.
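To make the "assemblage of services" idea concrete, here is a minimal sketch of an app calling one such third-party service over HTTP. The endpoint and the response fields are hypothetical placeholders, not a real weather provider.

import Foundation

// Hypothetical response shape from an imaginary weather provider.
struct WeatherReport: Codable {
    let city: String
    let temperatureCelsius: Double
}

// Minimal sketch: fetch current weather from a placeholder REST endpoint.
// "api.example-weather.com" is not a real service; swap in your provider.
func fetchWeather(for city: String, completion: @escaping (WeatherReport?) -> Void) {
    guard let url = URL(string: "https://api.example-weather.com/v1/current?city=\(city)") else {
        completion(nil)
        return
    }
    URLSession.shared.dataTask(with: url) { data, _, error in
        guard let data = data, error == nil,
              let report = try? JSONDecoder().decode(WeatherReport.self, from: data) else {
            completion(nil)
            return
        }
        completion(report)
    }.resume()
}

fetchWeather(for: "London") { report in
    print(report?.temperatureCelsius ?? "no data")
}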


With Siri and Google Assistant so tightly integrated into iOS and Android, they are the perfect candidates to bring AI capabilities into our everyday apps.

App Integration: Intent, Action, Fulfillment

With AI becoming an integral part of the app ecosystem, future user interaction will revolve around three things:

  • Intent: A goal or task that the user wants to accomplish, such as ordering coffee or finding a piece of music.
  • Action: An interaction you build for the Assistant or Siri that supports a specific intent and has a corresponding fulfillment that processes the intent.
  • Fulfillment: A service, app, feed, conversation, or other logic that handles an intent and carries out the corresponding Action (a sketch of how these three pieces fit together follows below).
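Here is that relationship as a toy model. The types below are invented purely for illustration; they are not part of SiriKit or Actions on Google, which each have their own APIs for the same three ideas.

// Toy model of the three concepts; the names are hypothetical, not a real SDK.

// Intent: what the user wants ("order a cappuccino").
struct Intent {
    let name: String                    // e.g. "order_coffee"
    let parameters: [String: String]
}

// Fulfillment: the logic that actually carries the task out.
protocol Fulfillment {
    func fulfill(_ intent: Intent) -> String   // returns a reply for the user
}

// Action: binds one supported intent name to the fulfillment that handles it.
struct Action {
    let supportedIntent: String
    let fulfillment: Fulfillment

    func handle(_ intent: Intent) -> String? {
        guard intent.name == supportedIntent else { return nil }
        return fulfillment.fulfill(intent)
    }
}

struct CoffeeOrderFulfillment: Fulfillment {
    func fulfill(_ intent: Intent) -> String {
        let drink = intent.parameters["drink"] ?? "coffee"
        return "Your \(drink) has been ordered."
    }
}

let orderCoffee = Action(supportedIntent: "order_coffee",
                         fulfillment: CoffeeOrderFulfillment())
let reply = orderCoffee.handle(Intent(name: "order_coffee",
                                      parameters: ["drink": "cappuccino"]))
print(reply ?? "Sorry, I can't help with that.")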

Today, if you want to order a cup of coffee from your phone, you have to open the coffee-ordering app, select the coffee you want, add toppings, and make the payment. Too long a process? That is how we do things now with our smartphones.

With AI integration such as SiriKit, every one of your mobile interactions will revolve around those three things: intent, action, and fulfillment. The user just has to make his intent clear; the phone will take care of the rest. It will interact with the necessary apps, make payments, and even share your location. Gone are the days when you had to install hundreds of apps on your phone.
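On iOS, the entry point for this kind of integration is a SiriKit Intents extension. The sketch below handles the system-provided send-payment intent; the print statement stands in for real payment logic, which is up to the app.

import Intents

// Minimal SiriKit sketch: an Intents extension that handles "send a payment"
// requests spoken to Siri. Real payment processing is omitted; the responses
// below simply report readiness and success.
class IntentHandler: INExtension, INSendPaymentIntentHandling {

    override func handler(for intent: INIntent) -> Any {
        // Route every supported intent to this object.
        return self
    }

    func confirm(intent: INSendPaymentIntent,
                 completion: @escaping (INSendPaymentIntentResponse) -> Void) {
        // Tell Siri the app is ready to carry out the payment.
        completion(INSendPaymentIntentResponse(code: .ready, userActivity: nil))
    }

    func handle(intent: INSendPaymentIntent,
                completion: @escaping (INSendPaymentIntentResponse) -> Void) {
        let amount = intent.currencyAmount?.amount ?? NSDecimalNumber.zero
        let payee = intent.payee?.displayName ?? "someone"
        print("Sending \(amount) to \(payee)")   // placeholder for real payment logic
        completion(INSendPaymentIntentResponse(code: .success, userActivity: nil))
    }
}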

So, when you have to order a burger and book a cab, just tell your phone and it'll take care of the rest. You'll soon receive your food and ride.

Delivering Personalized Experience With AI

A user can express his intent to a smartphone in two ways: text and speech. However, there is more to AI than just routing a user's intent to the right place. What about a personalized experience? The purpose of an app is to provide a personalized experience to each user. Let's take an example: you use Maps in the morning to check traffic and Facebook before going to bed. Because an AI program can keep training itself on the data available, it gets better with time, and better at predicting your behavior. The OS will suggest different apps at different times of day: Maps in the morning, Facebook at bedtime.
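A heavily simplified sketch of that idea, assuming nothing more than a log of app launches and the hour each one happened (the real suggestion engines in iOS and Android are far more sophisticated):

import Foundation

// Toy suggestion model: count how often each app is launched in each hour of
// the day, then suggest the most frequent app for the current hour.
struct LaunchEvent {
    let appName: String
    let hour: Int   // 0-23
}

func suggestedApp(for hour: Int, history: [LaunchEvent]) -> String? {
    let launchesThisHour = history.filter { $0.hour == hour }
    let counts = Dictionary(grouping: launchesThisHour, by: { $0.appName })
        .mapValues { $0.count }
    return counts.max(by: { $0.value < $1.value })?.key
}

let history = [
    LaunchEvent(appName: "Maps", hour: 8),
    LaunchEvent(appName: "Maps", hour: 8),
    LaunchEvent(appName: "Facebook", hour: 22),
    LaunchEvent(appName: "Facebook", hour: 23),
]

let hourNow = Calendar.current.component(.hour, from: Date())
print(suggestedApp(for: hourNow, history: history) ?? "no suggestion yet")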


Suggesting apps is one thing, but delivering a personalized experience is another. To understand how important AI is to app experience and how it will shape the future of mobile app development, let me give you an example:

James uses his favorite food delivery app to order lunch every day. Every day, 30 minutes before lunchtime, he picks up his phone, opens the food delivery app, and chooses a restaurant and an item on its menu. He then pays for it and waits for the food to get delivered. This takes about 10 minutes of his time.

Another app comes onto the market. It's very similar to the one James has on his phone, except this one is integrated with an AI engine; think of it as a future version of Siri. James is tired of doing the same thing every time he has to order his lunch, so he gives this new app a try.

It looks the same for the first ten days, apart from a few cosmetic changes. However, in those ten days the app learns many of James's preferences: James has chicken on Saturdays and a salad on Mondays. He has a burger on Tuesdays and pizza on Wednesdays. On Fridays, he often orders fish and chips. He orders the fish and chips from Johnny's Diner, the pizza from Pizza Hut, and the burger from White Castle.

Then, on a Wednesday, an hour before his lunch, he receives a system notification generated by the AI engine: "Order pizza from Pizza Hut?" A second notification follows: "50% discount on American-style pizza at Pizza Hut." James taps it, and soon a pizza is on its way.
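Stripped to its bones, the learning in this story is "remember what James ordered on each weekday, then suggest the most frequent choice next time." A toy version of that lookup, with the order history hard-coded for the example:

import Foundation

// Toy preference model: remember past orders per weekday and suggest the most
// frequent one for today. Real systems would also weigh recency, discounts, etc.
struct Order {
    let weekday: Int      // 1 = Sunday ... 7 = Saturday (Calendar convention)
    let dish: String
    let restaurant: String
}

func suggestion(for weekday: Int, history: [Order]) -> String? {
    let pastOrders = history.filter { $0.weekday == weekday }
    let counts = Dictionary(grouping: pastOrders, by: { "\($0.dish) from \($0.restaurant)" })
        .mapValues { $0.count }
    guard let favorite = counts.max(by: { $0.value < $1.value })?.key else { return nil }
    return "Order \(favorite)?"
}

let history = [
    Order(weekday: 4, dish: "pizza", restaurant: "Pizza Hut"),
    Order(weekday: 4, dish: "pizza", restaurant: "Pizza Hut"),
    Order(weekday: 6, dish: "fish and chips", restaurant: "Johnny's Diner"),
]

let today = Calendar.current.component(.weekday, from: Date())
print(suggestion(for: today, history: history) ?? "No pattern for today yet.")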

This is the kind of personalized experience we want from the apps we use.

Final Words

AI will reshape the way we interact with apps. The time will come when platforms like Google Assistant and Siri displace the OS as the center of a user's phone experience. The AI engine inside them will deliver the whole flow from action to fulfillment based on a user's intent, interacting with apps, the system, servers, payment gateways, and so on.

In the near future, when you are lying on your bed with your phone in your pocket, it could check whether you're sick by monitoring your vital signs. If you're ill, it will make a doctor's appointment, share your medical information with the doctor, call a cab on the day of the appointment, note down your prescription, order your medicines, and so on. Welcome to the future.
