
Exploring the Artificial Intelligence Ecosystem: AI, Machine Learning, and Deep Learning


AI is not the next big thing — it's the current big thing! Make sure not to miss out with help from this introductory article.


Welcome to the artificial intelligence era. Unless you have been hiding under a rock for the last decade, you have come across some form of artificial intelligence tool or solution in your life. If you are anything like me, you have been excited to welcome the innovations brought on by artificial intelligence. But understanding the landscape of the artificial intelligence ecosystem can be truly confusing. It includes terms such as general artificial intelligence, artificial narrow intelligence, machine learning, deep learning, and many others.

This post explains the subtle and not-so-subtle differences among all of the above.

AI's Beginnings

While artificial intelligence has most recently revolutionized knowledge and insight gathering in the business world, its origins reach back over 60 years. The first AI conference took place in 1956, based on the idea that significant advances could be made on problems then reserved for humans. To explore such possibilities, a project was launched in collaboration among Dartmouth College, Harvard University, IBM, and Bell Labs. At that time, AI research addressed automatic computers, computer processing of language, neuron nets, and self-improvement.

I will let you in on a dirty little secret.

Today's state-of-the-art artificial intelligence tools and solutions are all derivatives of scientific work from the mid-1950s. Can you imagine that Siri is built on Lisp, a programming language invented by John McCarthy in 1958? And if you believe that this is just a one-off and AI won't really matter that much, check out the research firm IDC's papers on the topic. In its most recent reports, IDC estimates that the artificial intelligence market will be worth USD 47 billion by 2020. That's only three years from now. The problems addressed by AI today are so big that not only high-tech companies are investing in AI innovation, but entire industries are, too.

“If you woke up as an industrial company today, you will wake up as a software and analytics company tomorrow.” Jeff Immelt, 2014

This famous quote from GE's then-CEO highlights his attitude toward software as part of every product and service offering, no matter what business you are in. Yet as innovative as that sounds, it is almost outdated already. Immelt made his own observation obsolete by driving GE beyond his 2014 vision: General Electric is now leveraging AI and machine learning to continue the digital transformation journey it began in 2016.

Differentiating General AI, Artificial Narrow Intelligence, Machine Learning, and Deep Learning

In my recent article Artificial Intelligence, the New Cornerstone of Big Data Analytics, I proposed a short definition for AI:

“Artificial Intelligence is hardware and software systems combined to simulate human intelligence.”

With this understanding in mind, we can define general artificial intelligence as the behavior of a machine that "acts" like a human. This is the full premise of "The Terminator," or C-3PO if you prefer heroes.

Now, if you want to replicate a specific slice of human intelligence, like the ability to distinguish images of Shar-Peis from images of towels, to crawl a web page, or to play chess with you, that is artificial narrow intelligence. An electronic chess set can't even see the dog pictures; it has a narrow focus and a skill set fine-tuned by artificial intelligence.

Back in 2000, approximately 50,000 computer viruses appeared over the entire year. By 2015, nearly one million new malware threats were being released every day, and that figure is up to three million today. Enumerating every threat in order to protect computer systems has therefore been impossible for a while. This exponential growth makes it prohibitive for humans to keep feeding computers with data manually, and it created the wave of innovation we now call machine learning. This type of artificial intelligence gives systems, applications, and computers the ability to learn without being explicitly programmed; in our example, without being programmed to recognize what a threat or virus is. For example, Symantec is developing a set of machine learning technologies that inspect thousands of static characteristics of a file, then dig deeper to understand the program's dynamic behavior, and finally examine the file's relationships with other files in order to decide whether it is a threat. This is machine learning at its finest.
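As a rough illustration of learning from labeled data rather than hand-written rules, here is a minimal Python sketch of a nearest-centroid classifier over static file features. The feature names and every number below are hypothetical, invented purely for this example; this is not Symantec's actual method.

```python
def centroid(rows):
    """Average each feature across a list of feature vectors."""
    n = len(rows)
    return [sum(col) / n for col in zip(*rows)]

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

# Each vector: [file size in KB, byte entropy, count of suspicious API calls].
# These labeled examples stand in for a real training set.
benign  = [[120, 4.1, 0], [300, 4.8, 1], [90, 3.9, 0]]
malware = [[250, 7.6, 9], [180, 7.9, 12], [400, 7.2, 7]]

centroids = {"benign": centroid(benign), "malware": centroid(malware)}

def classify(features):
    # Assign the label of the nearest class centroid.
    return min(centroids, key=lambda label: distance(features, centroids[label]))

# A high-entropy file with many suspicious API calls.
print(classify([270, 7.5, 10]))
```

The point is that no rule like "entropy above 7 means malware" was ever written down; the decision boundary falls out of the labeled examples, and adding fresh examples updates it with no code change.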

This type of artificial intelligence processes data and learns from it in order to make a determination or prediction such as "This one is a dog; this one is a towel."

“A lot of successful machine learning applications depend on hand-engineering features where the researcher manually encodes relevant information about the task at hand and then there is learning on top of that.” George E. Dahl

The next level of innovation is deep learning. Here, you want to "get the system to engineer its own features as much as is feasible," as George E. Dahl said. At this point, the system automates its own feature engineering and refines its analysis algorithms as it learns. Deep learning is used to tackle new industrial challenges such as computer vision for driverless vehicles, speech recognition, and natural language processing for human voice interfaces. For example, Google uses deep learning in its voice recognition algorithms.

Key Differences

Now, to differentiate machine learning from deep learning in as few words as possible: deep learning mostly works by itself, while machine learning works with human intervention.

In more detail, deep learning creates large-scale networks of neural connections, also called neural nets, modeled after the human brain, which allow the machine to learn and "think" on its own without requiring the supervision of a human.
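To make the idea of a net of neural connections concrete, here is a toy forward pass through a two-layer network in plain Python. The weights and inputs are fixed, hypothetical numbers chosen only for illustration; in a real deep learning system they are learned from data rather than written by hand.

```python
import math

def neuron(inputs, weights, bias):
    """Weighted sum of inputs pushed through a sigmoid nonlinearity."""
    z = sum(i * w for i, w in zip(inputs, weights)) + bias
    return 1 / (1 + math.exp(-z))

def layer(inputs, weight_rows, biases):
    """One layer: every neuron sees all inputs from the layer below."""
    return [neuron(inputs, w, b) for w, b in zip(weight_rows, biases)]

x = [0.5, 0.8]  # raw input features

# Hidden layer: two neurons that re-encode the raw inputs as
# intermediate features the network "engineers" for itself.
hidden = layer(x, [[1.0, -2.0], [0.5, 1.5]], [0.0, -1.0])

# Output layer: one neuron producing a prediction between 0 and 1.
output = layer(hidden, [[2.0, -1.0]], [0.5])

print(round(output[0], 3))
```

Stacking more such layers is what puts the "deep" in deep learning: each layer transforms the previous layer's output, so later layers operate on features no human ever specified.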

Software companies, researchers, and businesses across industries are paying closer attention to artificial intelligence in many innovative areas, such as autonomous vehicles, image recognition, language translation, and natural language processing for analytics.

Business sectors most interested in AI are:

  • Finance: Fraud detection; robo-trading (things are happening now in milliseconds).
  • Automotive: Speech recognition, computer vision, connected cars, virtual assistants, self-driving cars, etc.
  • Data security: Intrusion detection, data privacy, malware fighting, etc.
  • Healthcare and medicine: Advanced and fast diagnostics.
  • Sales and marketing: User experience (UX); bots.

AI is not the next big thing — it's the current big thing! Welcome to the artificial intelligence era. Be sure not to miss out!

Shout out to Niki for your help. Thank you so much for your support.


Published at DZone with permission of Fred Jacquet. See the original article here.

