Is Artificial Intelligence a Booster for Edge Computing?
Edge computing is the technology that allows us to relocate data processing as close as possible to the application. It benefits heavily from AI.
In January of this year, a truck equipped with the latest technologies traveled more than 3,800 km across the United States, from west to east, autonomously. The unique combination of technologies involved in accomplishing this feat perfectly illustrates the power of edge computing. Are you familiar with this concept and its implications?
Edge computing is the technology that allows us to relocate data processing as close as possible to the application.
Even though a human operator was on board to ensure the safety of this pilot test, it was a regular truck, modified with added sensors and software, that accomplished this. The distinctive feature of Embark Trucks' approach lies in the fact that they did not use detailed route maps to guide their autonomous system; instead, they considered an alternative way to guide the truck. Embark relied entirely on data collected by the vehicle's sensors and its embedded machine learning algorithms.
What Is Edge Computing?
This technology is characterized by a software and hardware architecture in which data is processed as close as possible to its origin. It relates primarily to the Internet of Things and mobile computing and relies on smart (intelligent) devices, as well as, to a lesser extent, the cloud. Another aspect worth noting: edge computing mainly concerns data whose value lies within a short time window.
As in the previous example, the data produced by the truck's sensors is processed within its onboard devices. Obviously, the truck has to "see" the road in real time to perform its driving tasks accordingly.
It necessarily relies on advanced technologies, including low-power sensors, RFID (radio-frequency identification), low-cost battery power, low-cost data communication links, and data storage and computing systems.
These observations help us appraise more precisely what makes edge computing interesting. Edge computing lets processes take advantage of data as a continuous stream, at the very time and place where it is acquired. This also brings technical advantages, including security (data is neither transported across the network nor stored in data centers) and cost optimization.
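To make this concrete, here is a minimal sketch (all names and thresholds are illustrative, not from any real edge platform) of the pattern described above: raw sensor readings are processed on the device as they arrive, and only a compact summary leaves it, rather than the full stream.

```python
from statistics import mean

def process_at_edge(readings, threshold):
    """Summarize a window of sensor readings locally and flag anomalies.

    Only this small summary is sent upstream; the raw stream never
    crosses the network, which saves bandwidth and limits exposure.
    """
    anomalies = [r for r in readings if r > threshold]
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "anomalies": len(anomalies),
    }

# A window of (hypothetical) temperature readings processed on-device:
window = [21.0, 21.4, 22.1, 35.7, 21.2]
summary = process_at_edge(window, threshold=30.0)
```

The design choice is the essence of edge computing: the expensive, latency-sensitive work happens where the data is born, and the cloud only receives what is worth keeping.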
“Edge computing is a way to streamline the flow of traffic from IoT devices and provide real-time local data analysis.” — Brandon Butler
IDC says that by 2019, almost 50% of IoT-created data will be stored, processed, analyzed, and acted upon close to or at the edge of the network.
McKinsey estimates that the economic impact of IoT applications could be from $3.9 trillion to $11.1 trillion per year in 2025. They give as an example: “The value of improved health of chronic disease patients through remote monitoring could be as much as $1.1 trillion per year in 2025.”
A new research report from MarketsAndMarkets anticipates that the edge computing market will grow from $1.47 billion in 2017 to $6.72 billion by 2022, at a compound annual growth rate of more than 35% during the forecast period.
Gartner's analysts report that currently, around 10% of enterprise-generated data is created and processed outside a traditional centralized data center or cloud. By 2022, Gartner predicts that this figure will reach 50%.
The Scope of Edge Computing
According to McKinsey, "The IoT data that are used today are mostly for anomaly detection and control, not optimization and prediction, which provide the greatest value." McKinsey also states that most data generated by IoT devices (and more generally, by intelligent systems at the edge) is not used today. Their analyses lead them to believe that companies could capture up to 90% of the economic value of this data. The main domains that, in their view, can take the best advantage of edge computing include:
It seems clear that although there are many different usage scenarios for edge computing, they remain, by their very nature, closely tied to IoT. The best-known examples might be self-driving vehicles and smartphones. However, the biggest innovative industrial companies, including General Electric Digital, have been working on it for years, mainly in the context of the Industrial Internet of Things (IIoT). In the same way, more and more smart city projects are flourishing all over the world, involving IoT and AI technologies and, a fortiori, edge computing.
By 2020, the global available storage capacity will be able to store less than 15% of the amount of data in the digital universe. — IDC
As per the "digital universe" study conducted by IDC, the world's data will climb above 40 zettabytes over the next two years, with the IoT field accounting for 10% of it. It's easy to see why the industrial sector has a great interest and great expertise in IIoT and edge computing.
There is a multiplicity of potential uses, but typical use cases in the industrial field are:
Energy efficiency management
Smart manufacturing (customization of production modes)
Flexible device replacement (rapid deployment of new processes and models)
Low/intermittent connectivity (closed-loop interaction between machine insights and actuation)
AI Takes Edge Computing to the Next Level
Among the various ways in which edge computing could take advantage of artificial intelligence (and vice versa), three areas especially stand out:
1. Autonomous Vehicles
Trucks, taxis, personal cars, trains, drones: not a month goes by without a press release announcing an innovation, a significant technical advance, or a new project applying the latest AI and edge computing technology to self-driving vehicles. The autonomous vehicle ecosystem (software vendors, hardware makers, application developers, data scientists, automotive manufacturers, sensor makers, etc.) is pooling technology and expertise to implement intelligent co-pilot and self-driving capabilities. These rely on applications and algorithms that leverage the data acquired by the sensors that equip vehicles. For example, they work to develop and perfect AI algorithms that process sensor data to let a vehicle make instant decisions, such as emergency stops. Another example is an obstacle avoidance algorithm that monitors the vehicle's trajectory to verify that it remains free of obstacles.
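The kind of instant, on-device decision mentioned above can be sketched as follows. This is a toy model, not a real vehicle algorithm: the parameters (reaction time, deceleration) and the basic stopping-distance formula, reaction distance plus braking distance v²/(2a), are illustrative assumptions.

```python
def emergency_stop_needed(obstacle_m, speed_mps,
                          reaction_s=0.1, max_decel_mps2=8.0):
    """Decide on-device whether an emergency stop must be triggered.

    Stopping distance = reaction distance + braking distance v^2/(2a).
    All parameter values here are illustrative, not real vehicle specs.
    """
    stopping_distance = (speed_mps * reaction_s
                         + speed_mps ** 2 / (2 * max_decel_mps2))
    # Brake if the obstacle is within the computed stopping distance.
    return obstacle_m <= stopping_distance

# At 25 m/s (90 km/h), the sketch yields a stopping distance of ~41.6 m,
# so an obstacle detected 30 m ahead triggers a stop, one 60 m ahead does not.
near = emergency_stop_needed(obstacle_m=30.0, speed_mps=25.0)
far = emergency_stop_needed(obstacle_m=60.0, speed_mps=25.0)
```

The point of the sketch is latency: a decision like this must complete in milliseconds on the vehicle itself; a round trip to a remote data center would arrive too late.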
2. Robots
In this area, there are two main categories: one where "robot" means a machine and a second where it means software automation. Regarding software automation (AKA robotic process automation), please refer to my article How AI Will Take Robotic Process Automation to the Next Level.
When it comes to robots, in order to function efficiently in their working areas, on top of their functional capabilities (e.g. moving heavy loads and working in complex, perilous environments), they have to be empowered with vital functions that might include machine vision, speech recognition, and complex decision-making algorithms.
The challenge is to have robots working in human environments while ensuring the safety of their human coworkers. The fact is, humans make mistakes: they can behave erratically, and they can disobey or misunderstand safety rules.
3. Maintenance and Monitoring
While IoT has been involved in these areas for a long time, having AI algorithms process sensor data at the edge gives another dimension to maintenance and monitoring processes. Predictive maintenance is now mainstream for airlines, to the point that it has become a service they can monetize. A March 2018 article states that Airbus will provide a predictive maintenance platform to EasyJet. The next challenge for industrial companies is to enhance predictive maintenance to improve their processes, reduce time to market, approach zero downtime, save money, and even save lives.
This is a domain where edge computing AI can make a substantial difference.
The quality of a machine, whether it's an airliner, a transport truck, or a soda machine, is measured not only by how efficient it is but also by how reliable it is. Let's remember that the first connected thing was set up even before the invention of the World Wide Web: it was the famous Internet Coke Machine created in 1982 by Carnegie Mellon University students, who wanted to make sure the machine was not empty before walking to it.
To ensure reliability, then, maintenance processes need to improve. Scheduling the maintenance of complex, sophisticated machines like aircraft can be a nightmare. An A380, for example, has about seven million parts. The challenge lies in finding exactly the right moment for maintenance: the part must not be replaced prematurely, but neither should it be allowed to fail. Answering the industry's most pressing questions requires combining sensor data, machine learning algorithms, and advanced models.
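A deliberately simple sketch of this "right moment" problem: fit a straight line to wear measurements from a sensor and extrapolate when the part will reach its limit. Real predictive maintenance uses far richer models (and the data here is invented), but the structure of the estimate is the same.

```python
def hours_until_limit(history, wear_limit):
    """Extrapolate a linear wear trend to estimate remaining useful life.

    history: list of (operating_hour, wear_measurement) samples.
    Fits a least-squares line wear = slope * hour + intercept and
    solves for the hour at which wear reaches wear_limit.
    """
    n = len(history)
    sx = sum(h for h, _ in history)
    sy = sum(w for _, w in history)
    sxx = sum(h * h for h, _ in history)
    sxy = sum(h * w for h, w in history)
    slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    intercept = (sy - slope * sx) / n
    if slope <= 0:
        return None  # no measurable wear trend; nothing to predict
    return (wear_limit - intercept) / slope

# Invented wear samples: 1 wear unit per 100 hours, limit at 5 units.
samples = [(0, 0.0), (100, 1.0), (200, 2.0)]
eta = hours_until_limit(samples, wear_limit=5.0)
```

Running such an estimator at the edge, on the machine itself, means a maintenance alert can be raised even when connectivity to a central platform is slow or intermittent.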
As Jon Markman rightly points out, it should be clear by now that before the advent of cloud computing (which, schematically speaking, consists of moving data from local databases to data centers), most computing did take place at the edge of the network.
Indeed, today, it seems clear that technological developments, increased volumes of data, the need for real-time data processing, and the increasing sophistication and wide availability of smart devices, sensors, and connected things have led to an inexorable shift to edge computing.
“Connected does not equal smart” — Bill Schmarzo
I cannot end this article without another quote from Bill, made two years ago:
“Being able to capture, store, and manage the data created by connected devices and humans creates amazing possibilities. ... But that data by itself isn’t useful unless I apply some analytics to understand what it’s telling me.”
Beyond the operational use of data from edge applications, Bill's comment leads to the concept of edge analytics. But I think that's a story for another day.