
Using AI To Optimize IoT at the Edge

Artificial intelligence has the potential to revolutionize the combined application of IoT and edge computing. Here are some thought-provoking possibilities.

By Devin Partida · Mar. 15, 23 · Opinion

As more companies combine Internet of Things (IoT) devices and edge computing capabilities, people are becoming increasingly curious about how they could use artificial intelligence (AI) to optimize those applications. Here are some thought-provoking possibilities.

Improving IoT Sensor Inference Accuracy With Machine Learning

Technology researchers are still in the early stages of investigating how to improve the performance of edge-deployed IoT sensors with machine learning. Early applications include using sensors for image-classification tasks and natural language processing. One example shows how that work is progressing.

Researchers at IMDEA Networks recognized that IoT sensors handling certain deep-learning tasks may not be able to guarantee quality-of-service requirements such as latency and inference accuracy. To address that challenge, the team developed a machine learning algorithm called AMR².

AMR² utilizes an edge computing infrastructure to make IoT sensor inferences more accurate while enabling faster responses and real-time analyses. Experiments suggested the algorithm improved inference accuracy by up to 40% compared to a basic scheduling approach that did not use it.

The team found that an efficient scheduling algorithm such as this one is essential for helping IoT sensors work properly when deployed at the edge. A project researcher pointed out that the AMR² algorithm could affect execution delay if a developer used it for a service similar to Google Photos, which classifies images by the elements they contain. A developer could deploy the algorithm to ensure users do not notice such delays when using the app.
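
The article doesn't describe AMR²'s internals, so the sketch below only illustrates the general idea behind accuracy- and latency-aware scheduling at the edge: for each request, pick the most accurate model variant that still fits the latency budget, and fall back to the fastest one otherwise. The model names and numbers are illustrative, not taken from the IMDEA work.

```python
from dataclasses import dataclass

@dataclass
class ModelOption:
    name: str
    est_latency_ms: float  # measured average inference latency at the edge node
    est_accuracy: float    # validation accuracy of this model variant

def pick_model(options, deadline_ms):
    """Return the most accurate model whose estimated latency fits the
    request's deadline; fall back to the fastest model if none fits."""
    feasible = [m for m in options if m.est_latency_ms <= deadline_ms]
    if feasible:
        return max(feasible, key=lambda m: m.est_accuracy)
    return min(options, key=lambda m: m.est_latency_ms)

# Hypothetical image classifiers exported at different sizes.
options = [
    ModelOption("classifier_small",  est_latency_ms=15,  est_accuracy=0.71),
    ModelOption("classifier_medium", est_latency_ms=45,  est_accuracy=0.78),
    ModelOption("classifier_large",  est_latency_ms=120, est_accuracy=0.85),
]
print(pick_model(options, deadline_ms=50).name)  # -> classifier_medium
```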

Reducing Energy Usage of Connected Devices With AI at the Edge

A 2023 study of chief financial officers at tech companies found that 80% expect revenue increases in the coming year. Arguably, that is most likely to happen if employees understand customers' needs and provide products or services accordingly.

The manufacturers of many IoT devices intend for people to wear those products almost constantly. Some wearables detect if lone workers fall or become distressed or if people in physically demanding roles are becoming too tired and need to rest. In such cases, users must feel confident that their IoT devices will work reliably through their workdays and beyond.

That’s one of the reasons researchers explored how using AI at the edge could improve the energy efficiency of IoT devices deployed to study how a sedentary lifestyle affects health and how correct posture could improve outcomes. Any IoT device that captures data about how people live must collect it continuously, with few or no interruptions caused by the device running out of battery.

In this case, subjects wore wireless devices powered by coin-cell batteries. Each of these gadgets had inertial sensors to collect accurate data about how much people moved throughout the day. However, the main problem was that the batteries only lasted a few hours due to the large volume of data transmitted. For example, research showed a nine-channel motion sensor that reads 50 samples every second produces more than 100 MB of data daily.
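
As a rough sanity check of that figure, assuming each reading is stored as a 4-byte value (an assumption, not stated above) and the device records around the clock, the arithmetic lands in the same range:

```python
# Back-of-the-envelope data volume for a 9-channel sensor sampling at 50 Hz.
channels = 9
sample_rate_hz = 50
bytes_per_value = 4          # assumed 4-byte float per channel reading
seconds_per_day = 24 * 60 * 60

bytes_per_day = channels * sample_rate_hz * bytes_per_value * seconds_per_day
print(f"{bytes_per_day / 1e6:.0f} MB per day")  # ~156 MB, i.e. "more than 100 MB"
```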

However, researchers recognized that machine learning could enable the devices to transfer only the critical data from edge-deployed IoT devices to the smartphones or other devices that assist people in analyzing the information. They used a pre-trained recurrent neural network and found it achieved real-time performance, improving the IoT devices’ functionality.
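
The study's exact pipeline isn't detailed here, but the core idea of that kind of on-device filtering can be sketched as follows: run a small pretrained model over each window of sensor samples and transmit only the windows it flags as critical. The `model` callable and the threshold below are placeholders, not the researchers' actual recurrent network.

```python
import numpy as np

def is_critical(window, model, threshold=0.5):
    """Score one window of sensor samples with a small pretrained model and
    decide whether it is worth transmitting off the device."""
    return float(model(window)) >= threshold

def stream(windows, model, transmit):
    """Send only the windows the model flags as critical; drop the rest on
    the device to save radio time and battery."""
    sent = 0
    for window in windows:
        if is_critical(window, model):
            transmit(window)
            sent += 1
    return sent

# Toy usage: a stand-in "model" that flags windows with high motion energy.
rng = np.random.default_rng(0)
windows = [rng.normal(scale=s, size=(50, 9)) for s in (0.1, 0.1, 2.0)]
dummy_model = lambda w: float(np.mean(np.abs(w)) > 1.0)
print(stream(windows, dummy_model, transmit=lambda w: None))  # -> 1 window sent
```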

Creating Opportunities for On-Device AI Training

Edge computing advancements have opened opportunities to use smart devices in more places. For example, people have suggested deploying smart street lights that turn on and off in response to real-time traffic levels. Tech researchers and enthusiasts are also interested in the increased opportunities associated with AI training that happens directly on edge-deployed IoT devices. This approach could increase those products’ capabilities while reducing energy consumption and improving privacy.

An MIT team studied the feasibility of training AI algorithms on intelligent edge devices. They tried several optimization techniques and arrived at one that required only 157 KB of memory to train a machine-learning algorithm on a microcontroller. Other lightweight training methods typically require between 300 and 600 MB of memory, making this innovation a significant improvement.
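
The article doesn't spell out which optimization techniques the MIT team used, but one common way to shrink training memory on constrained hardware is to update only a small fraction of the network's parameters, so gradients and optimizer state are kept for a few hundred values instead of the whole model. A minimal PyTorch sketch of that general idea, using a made-up toy model rather than anything from the MIT work:

```python
import torch
import torch.nn as nn

# A tiny classifier standing in for a microcontroller-scale model.
model = nn.Sequential(
    nn.Conv2d(1, 8, 3, padding=1), nn.ReLU(),
    nn.AdaptiveAvgPool2d(1), nn.Flatten(),
    nn.Linear(8, 2),
)

# Freeze everything except the final linear layer.
for p in model.parameters():
    p.requires_grad = False
for p in model[-1].parameters():
    p.requires_grad = True

trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.SGD(trainable, lr=1e-2)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step on fake 1-channel 16x16 "images".
x = torch.randn(4, 1, 16, 16)
y = torch.randint(0, 2, (4,))
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()

print(sum(p.numel() for p in trainable), "trainable parameters")  # 18
```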

The researchers explained that any data generated for training stays on the device, reducing privacy concerns. They also suggested use cases where the training happens throughout normal use, such as algorithms learning from what a person types on a smart keyboard.

This approach produced some undoubtedly impressive results. In one case, the team trained the algorithm for only 10 minutes, which was enough for it to detect people in images. This example shows that optimization can go in both directions.

Whereas the first two examples focused on improving how IoT devices work, this approach enhanced the AI training process itself. And if developers train algorithms on the same IoT devices that will eventually use them to perform better, the approach mutually benefits the AI algorithms and the IoT-edge devices.

How Will You Use AI to Improve How IoT-Edge Devices Work?

These examples show some of the things researchers have focused on while exploring how artificial intelligence could improve the functionality of IoT devices deployed at the edge. Let them serve as insight and inspiration for how you might achieve similar results. It’s almost always best to start with a clearly defined problem you want to solve, then explore how technology and innovative approaches could help meet that goal.

AI IoT Machine learning

Opinions expressed by DZone contributors are their own.
