True Offline Neural Networks Soon to Come in Phones, Too

Simulating just a second of a real human's brain activity still takes the world's fastest supercomputers a significant amount of time, but neural networks are nonetheless on their way to running entirely on your mobile device.

Creating a real silicon brain is no longer a far-fetched sci-fi theory. It’s already been done to a certain extent, and it’s only a matter of time before we have computers powerful enough to run a full brain simulation with billions of neurons involved.

Simulating just a second of a real human's brain activity still requires a significant amount of time on the fastest supercomputers in the world. Yet as technology progresses in both hardware and software, this gap will eventually be bridged, perhaps to the point where an AI becomes sentient in some limited sense.

However, while the state of machine learning has come a long way since the turn of the millennium, on most mobile devices, AI models have had to be evaluated in the cloud, with the results sent back over the internet to the phone or tablet in order to run in a timely manner.

For obvious reasons, such as cost, connectivity issues, and availability, it's not always feasible to build an application that requires a constant internet connection to work properly. And while most types of apps have had their spot in the limelight, machine learning apps still rely on an internet connection and server-computed ML results because mobile CPUs are too inefficient to do the work locally.

In 2017, technological progress from two opposite directions has managed to meet in the middle. On one hand, we're building better processing units to put inside small mobile devices; on the other, we're building lightweight machine learning applications that can handle intensive workloads without needing a CPU upgrade or, worse, a round trip to the cloud for calculations.
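The article doesn't name the techniques behind such "lightweight" models, but one common approach is weight quantization: storing a model's 32-bit float weights as 8-bit integers, cutting memory and bandwidth to a quarter at the cost of a small rounding error. The sketch below is purely illustrative plain Python, not the API of any particular mobile framework.

```python
import struct

def quantize(weights):
    """Linearly map float weights onto 8-bit integers (0..255).

    Returns the quantized bytes plus the (scale, offset) needed to
    approximately reconstruct the original floats.
    """
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / 255 or 1.0  # avoid division by zero for constant weights
    q = [round((w - lo) / scale) for w in weights]
    return bytes(q), scale, lo

def dequantize(q, scale, lo):
    """Reconstruct approximate floats from the 8-bit representation."""
    return [b * scale + lo for b in q]

weights = [-1.2, 0.0, 0.5, 2.3, -0.7, 1.1]
q, scale, lo = quantize(weights)

# 8-bit storage is a quarter of 32-bit float storage.
float_bytes = len(weights) * struct.calcsize("f")
print(f"{float_bytes} float bytes -> {len(q)} quantized bytes")

restored = dequantize(q, scale, lo)
print("max error:", max(abs(a - b) for a, b in zip(weights, restored)))
```

The reconstruction error is bounded by half the quantization step, which is usually negligible next to a neural network's own tolerance for noisy weights.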

Improvements in random-access memory, too, have been large enough to justify truly self-contained mobile AI. This has been made possible in part by applying machine learning to itself: a team of researchers at Google first built a mobile ML system, then had it optimize itself for energy efficiency.

The published results show a more than 70% decrease in power consumption compared to the first model, so clearly there is great potential for further improvement. And if a single team can manage this, we're not far from running state-of-the-art ML calculations on a tiny mobile chip with a minuscule energy budget.
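The article doesn't describe how Google's system optimized itself, but the core idea of software tuning its own configuration can be sketched as a search over candidate model settings, each scored by an energy-cost proxy. Everything below, the cost model and the parameter ranges alike, is invented for illustration; real systems use far more sophisticated search and measured, not modeled, power draw.

```python
import random

random.seed(0)  # make the illustrative run repeatable

def energy_cost(layers, width):
    """Hypothetical proxy: deeper/wider models cost more compute energy,
    while too-small models pay an accuracy penalty, modeled as extra cost."""
    compute = layers * width ** 2
    accuracy_penalty = max(0, 50_000 - layers * width * 100)
    return compute + accuracy_penalty

def random_search(trials=200):
    """Sample configurations and keep the cheapest one found."""
    best, best_cost = None, float("inf")
    for _ in range(trials):
        cfg = (random.randint(1, 12), random.choice([32, 64, 128, 256]))
        cost = energy_cost(*cfg)
        if cost < best_cost:
            best, best_cost = cfg, cost
    return best, best_cost

baseline = energy_cost(12, 256)  # a hand-picked "first model"
best, best_cost = random_search()
print(f"baseline cost: {baseline}, searched cost: {best_cost}")
print(f"reduction: {1 - best_cost / baseline:.0%}")
```

Even this naive random search finds configurations far cheaper than the hand-picked baseline, which is the same shape of result the article describes, though the 70% figure comes from Google's published work, not from a toy loop like this one.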

The idea of having artificial intelligence improve itself is not new, but it still might be slightly scary. Elon Musk, Stephen Hawking, and a number of other prominent people in the tech and science industry have already warned the general public of the dangers of an omnipotent artificial intelligence and the ramifications of the mass adoption of such technology.

While we're nowhere near a point where machines will rise up against humanity in a revolution, it is alarming how many jobs are lost to automation every day, and even more alarming just how well software can replace a human being without anyone around noticing a change in the workflow.

ai, algorithms, app development, automation, deep learning, machine learning, neural networks

Opinions expressed by DZone contributors are their own.
