Exploring the Artificial Intelligence Ecosystem: AI, Machine Learning, and Deep Learning

AI is not the next big thing — it's the current big thing! Make sure not to miss out with help from this introductory article.

By Frederic Jacquet · Jul. 04, 17 · Opinion

Welcome to the artificial intelligence era. Unless you have been hiding under a rock for the last decade, you will have come across some form of artificial intelligence tool or solution in your life. If you are anything like me, you have been excited to welcome the innovations brought on by artificial intelligence. But when it comes to understanding the landscape of the artificial intelligence ecosystem, it can truly be confusing. This ecosystem includes terms such as general artificial intelligence, artificial narrow intelligence, machine learning, deep learning, and so many others.

This post intends to explain the subtle, and not so subtle, differences between all of the above.

AI's Beginnings

While artificial intelligence has most recently revolutionized knowledge and insight gathering in the business world, its origins reach back over 60 years. The first AI conference took place in 1956, based on the idea that significant advances could be made on problems then reserved for humans. To explore such possibilities, a project was launched in collaboration by Dartmouth College, Harvard University, IBM, and Bell Labs. At that time, AI research addressed automatic computers, computer processing of language, neuron nets, and self-improvement.

I will let you in on a dirty little secret.

Today's state-of-the-art artificial intelligence tools and solutions are all derivatives of scientific work from the mid-1950s. Can you imagine that Siri is built on the Lisp computer language, invented by McCarthy in 1958? And if you believe that this is just a one-off and that AI won't really matter that much, then you should check out IDC's papers on the topic. In its most recent reports, IDC estimates that the artificial intelligence market will be worth USD 47 billion by 2020. That's only three years from now. The issues addressed by AI today are so big that not only high-tech companies are investing in AI innovation; entire industries are, too.

“If you woke up as an industrial company today, you will wake up as a software and analytics company tomorrow.” Jeff Immelt, 2014

This famous quote from GE's CEO highlights his attitude toward software as part of every company's product and service offering, no matter what business it is in. Yet as innovative as that sounds, it is almost outdated already. Immelt made his own observation obsolete by driving GE beyond his 2014 vision: General Electric is now leveraging AI and machine learning to continue the digital transformation journey it began in 2016.

Differentiating General AI, Artificial Narrow Intelligence, Machine Learning, and Deep Learning

In my recent article Artificial Intelligence, the New Cornerstone of Big Data Analytics, I proposed a short definition for AI:

“Artificial Intelligence is hardware and software systems combined to simulate human intelligence.”

With this understanding in mind, we can define general artificial intelligence as the behavior of a machine that “acts” like a human. This is the full premise of “The Terminator,” or of C-3PO if you prefer heroes.

Now, if you want to handle one specific slice of human intelligence, like the ability to tell images of Shar Peis from images of towels, to crawl a web page, or to play chess with you, that is artificial narrow intelligence. Your electronic chess set won't even be able to see the dog pictures; it has a narrow focus and a skill set fine-tuned by artificial intelligence.

Back in 2000, there were approximately 50,000 computer viruses in the entire year. In 2015, nearly one million new malware threats were released every day, and that figure is up to three million today. So, simply listing all known threats in order to protect computer systems has been impossible for a while. This exponential growth makes it impractical for humans to keep feeding computers with data manually, and it created the wave of innovation we today call machine learning. This type of artificial intelligence gives systems, applications, and computers the ability to learn without being explicitly programmed; in our example, without being programmed with a definition of every threat or virus. For example, Symantec is developing a set of machine learning technologies that inspects thousands of static characteristics of a file, then digs deeper to understand the program's dynamic behaviors, and finally examines the file's relationships with other files in order to determine whether it is a threat. This is machine learning at its finest.

This type of artificial intelligence processes data and learns from it in order to make a determination or prediction, like “this one is a dog, this one is a towel.”
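
To make the distinction concrete, below is a minimal sketch of this kind of feature-based machine learning in Python, assuming scikit-learn is installed. The static file features, the toy dataset, and the labels are all invented for illustration and have nothing to do with Symantec's actual technology; the point is only that a human picks the features and the algorithm learns the decision rule from labeled examples.

# A minimal sketch of feature-based ("classic") machine learning.
# The file features and labels below are invented for illustration only.
from sklearn.ensemble import RandomForestClassifier

# Hand-engineered static file features (hypothetical):
# [file size in KB, number of imports, byte entropy, digitally signed (1/0)]
X = [
    [120, 15, 4.2, 1],   # benign
    [300, 80, 7.8, 0],   # threat
    [90, 10, 3.9, 1],    # benign
    [450, 120, 7.5, 0],  # threat
    [200, 20, 4.5, 1],   # benign
    [380, 95, 7.9, 0],   # threat
]
y = [0, 1, 0, 1, 0, 1]   # 0 = benign, 1 = threat

# A human chose the features; the model only learns how to combine them.
model = RandomForestClassifier(n_estimators=50, random_state=42)
model.fit(X, y)

# Classify a new, unseen file described by the same hand-picked features.
print(model.predict([[410, 110, 7.6, 0]]))  # expected output: [1] (threat)

The prediction can only be as good as the hand-picked features, which is exactly the limitation George E. Dahl points to and the reason deep learning goes a step further.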

“A lot of successful machine learning applications depend on hand-engineering features where the researcher manually encodes relevant information about the task at hand and then there is learning on top of that.” George E. Dahl

The next level of innovation is deep learning. Here, you want to “get the system to engineer its own features as much as is feasible,” as George E. Dahl said. At this point, the machine learning system automates its own feature engineering and analysis and improves itself. Deep learning is used to tackle new industrial challenges such as computer vision for driverless vehicles, speech recognition, and natural language processing for human voice interfaces. For example, Google uses deep learning in its voice recognition algorithms.
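
As a contrast to the feature-based example above, here is a minimal sketch of a deep learning model in Python, assuming PyTorch is available. The tiny convolutional network is trained on random toy images and labels purely for illustration; what matters is that it receives raw pixels and learns its own feature detectors in the early layers instead of relying on features a human engineered.

# A minimal sketch of deep learning: raw pixels in, learned features inside.
# The images, labels, and architecture are illustrative only.
import torch
import torch.nn as nn

images = torch.rand(8, 1, 28, 28)        # fake batch of 8 grayscale 28x28 images
labels = torch.randint(0, 2, (8,))       # fake binary labels, e.g. dog vs. towel

model = nn.Sequential(
    nn.Conv2d(1, 8, kernel_size=3, padding=1),   # learns low-level visual features
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Conv2d(8, 16, kernel_size=3, padding=1),  # learns higher-level combinations
    nn.ReLU(),
    nn.MaxPool2d(2),
    nn.Flatten(),
    nn.Linear(16 * 7 * 7, 2),                    # final classification layer
)

optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(50):                           # tiny training loop
    optimizer.zero_grad()
    loss = loss_fn(model(images), labels)
    loss.backward()
    optimizer.step()

print("final training loss:", loss.item())

In a real project, the random tensors would be replaced by a labeled image dataset, but the shape of the solution stays the same: raw input goes in, learned features and a prediction come out.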

Key Differences

Now, to differentiate machine learning from deep learning in as few words as possible, you could say that deep learning mostly works by itself, while classic machine learning works with human intervention.

In more detail, deep learning builds large-scale networks of neural connections, also called neural nets, which are modeled after the human brain and lead the machine to learn and “think” on its own without requiring the supervision of a human.

Software companies, researchers, industrial firms, and businesses are paying closer attention to artificial intelligence in many innovative areas, such as autonomous vehicles, image recognition, language translation, and natural language processing for analytics.

Business sectors most interested in AI are:

  • Finance: Fraud detection; robo-trading (things are happening now in milliseconds).
  • Automotive: Speech recognition, computer vision, connected cars, virtual assistants, self-driving cars, etc.
  • Data security: Intrusion detection, data privacy, malware fighting, etc.
  • Healthcare and medicine: Advanced and fast diagnostics.
  • Sales and marketing: User experience (UX); bots.

AI is not the next big thing — it's the current big thing! Welcome to the artificial intelligence era. Be sure not to miss out!

Shout out to Niki for your help. Thank you so much for your support.


Published at DZone with permission of Frederic Jacquet. See the original article here.

Opinions expressed by DZone contributors are their own.
