"I'll Be Back:" How The Internet of Things is Giving Birth to an AI Renaissance

By Jason Kolb · Mar. 09, 2015


In 1977, an idea was born that would change computing forever. The world just didn’t realize it yet.

It was the heyday of computing innovation—transistor density increased daily, Moore’s law was in full effect, and the power of computing was poised to revolutionize the global economy. Computers were no longer room-sized behemoths, and researchers were gaining access to the computing power that would allow science, medicine, and a fledgling Silicon Valley to blossom.

This Wild-West environment was giving birth to ideas that would eventually change the world, from TCP/IP, which would give birth to modern networking and eventually the Internet, to the semiconductor, which would one day shrink computers to fit inside our pockets.

But some ideas were ahead of their time. Technology often needs a long incubation period, during which it is dismissed as too novel or impractical; Gartner calls this the “trough of disillusionment.” Some of the ideas from the early age of computing, though, have finally started to germinate. The actor model of computing is a perfect example: conceived in the 1970s, it is the basis for many stream-processing and distributed-computation engines like Apache Spark, which enable real-time predictions on Big Data. Another idea ahead of its time, and the topic at hand, is artificial intelligence.


In the context of the Internet of Things and Big Data, AI is poised to usher in another, quieter revolution—and one that is not directly related to walking, talking, sentient robots, Skynet, or a singularity (conversation with intelligent, “sentient” computer programs is still a parlor trick, as most Siri users attest). But the concept of systems that are capable of reasoning and making decisions on their own, with some degree of autonomy, is beginning to come into its own, and in ways that nobody 40 years ago could have imagined.

In 1977, Edward Feigenbaum, a Carnegie Ph.D., wrote a paper describing “reasoning systems”: systems whose power comes from deduction over the data they possess rather than from hard-coded rules. Today this seems obvious; data is a prized asset in the era of wearable tech and Google. But it is a relatively new way of thinking.

In the 1970s, using a store of data to drive decision making was a radical concept, and it spurred artificial-intelligence activity for the next decade. Well into the 1980s, universities offered courses in “expert systems,” the first real applications of systems that could reason on their own. Famous expert systems were built to diagnose infectious diseases and to identify organic molecules. This was the birth of AI.

By the 1990s, however, reasoning and expert systems had fallen into obscurity, largely because sister technologies like connectivity and distributed computing hadn’t yet caught up. As transistor technology improved, the focus shifted to personal computers, which didn’t have the firepower to run expert systems. The personal computing gold rush pulled attention almost completely away from expert systems, resulting in an “AI Winter.”

But thanks to Big Data and an interconnected digital world, I believe we are now entering an “AI Spring.” Reasoning systems are capable of making intelligent decisions based on known information, and they are opening the door to smart devices that themselves contain intelligence and can make autonomous decisions. The Internet of Things feeds those decision systems with all the data they need to act both intelligently and autonomously.

A reasoning system can synthesize available knowledge and come to conclusions about the world with little to no input from its operators. Such a system can determine what action is needed to achieve a goal, whether it is a refrigerator ordering food based on its knowledge of historical consumption or a locomotive applying the brakes when it senses conditions worsening. Expert systems can also house inference engines that can deduce new information based on already-established knowledge. Your refrigerator might know that the shelf life of milk is 7 days; given that your risotto contains milk, your fridge could infer that the risotto’s shelf life is also 7 days and give you a warning when it’s about to go off.
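That kind of inference can be sketched in a few lines. The facts and recipe below are hypothetical, and this is a toy forward-chaining rule rather than any particular expert-system engine, but it shows how new knowledge (the risotto’s shelf life) falls out of already-established knowledge (the ingredients’ shelf lives):

```python
# Toy inference: a dish keeps only as long as its shortest-lived
# ingredient. Facts and recipes are illustrative, not real data.

SHELF_LIFE_DAYS = {"milk": 7, "rice": 365, "butter": 30}
RECIPES = {"risotto": ["rice", "milk", "butter"]}

def infer_shelf_life(item):
    """Return a known shelf life, or infer one from ingredients."""
    if item in SHELF_LIFE_DAYS:
        return SHELF_LIFE_DAYS[item]
    ingredients = RECIPES.get(item, [])
    if not ingredients:
        return None  # no facts to reason from
    return min(infer_shelf_life(i) for i in ingredients)

print(infer_shelf_life("risotto"))  # milk is the limiting ingredient: 7
```

Nothing here was stated explicitly about risotto; the rule derived it, which is exactly the appeal of an inference engine.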

But the capabilities of data science don’t stop there. Predictive analytics give us a whole new type of knowledge to infer from: probable knowledge. The original expert systems had inference engines that inferred information from historical data and explicit facts only; we have the opportunity to make use of implied information—information that’s created from predictive algorithms based on the data from actual processes and actions.

What if the wheel bearings in your car could be equipped with an acoustic sensor, and the wave forms given off by that sensor could tell a predictive algorithm that there’s an 80% chance of those bearings going bad in the next 90 days? If the inference engine knows that a bad wheel bearing could result in the wheel separating from the car, it could strongly suggest maintenance—or an associated expert system could refuse to drive the car anywhere but a mechanic’s shop.
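A minimal sketch of that decision rule, assuming the 80% figure and the thresholds are illustrative (in practice the probability would come from a predictive model trained on the acoustic sensor data):

```python
# Turn a predicted failure probability into an action. The thresholds
# are hypothetical policy choices, not values from any real system.

def recommend_action(failure_probability, consequence_severe):
    """Map probable knowledge to an autonomous decision."""
    if consequence_severe and failure_probability >= 0.5:
        return "restrict: drive only to a mechanic"
    if failure_probability >= 0.2:
        return "suggest: schedule maintenance"
    return "ok: no action needed"

# 80% chance the bearing fails, and failure can separate the wheel:
print(recommend_action(0.8, consequence_severe=True))
# -> restrict: drive only to a mechanic
```

The point is the division of labor: the predictive algorithm supplies the probability, and the expert system’s rules decide what to do about it.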

The Internet of Things is all about connected, smart devices, typically ones packed with sensors. If those sensors feed into an expert system and also inform predictions, then the expert system can proactively make decisions based on probabilities. That kind of autonomy could revolutionize the way we interact with the world.

Regardless, the emergence of these 40-year-old technologies is fun to watch. Who doesn’t like asking Siri to open the pod bay doors—or to find local film showings? But combined with modern connectivity and data-gathering systems, these ideas can offer us an incredibly exciting future packed with the possibility of ever-more-accurate predictions.


Published at DZone with permission of Jason Kolb, DZone MVB. See the original article here.

