
Navigating the AI and Cognitive Maze


This article explains what terms like AI and cognitive computing mean, how they relate to one another, and where they all fit along the AI/cognitive time continuum.


If you work in the area of artificial intelligence (AI) and cognitive computing, you might use buzzwords and phrases that others perceive as confusing jargon. This article attempts to explain what these terms mean, how they relate to one another, and where they all fit along the AI and cognitive time continuum. I also include a glossary of my top 20 useful AI/cognitive terms and advice on getting started on your AI/cognitive journey.

Machine Learning, Cognitive Computing, and Artificial Intelligence

Think of machine learning (ML) as a set of libraries and an execution engine that run algorithms, packaged as a model, to predict one or more outcomes. Each outcome has an associated score indicating the confidence level at which it will occur. Cognitive computing is the ability of computers to simulate the human behaviors of understanding, reasoning, and thought processing. The ultimate goal is to simulate intelligence through a set of software and hardware services to produce better business outcomes; hence, the term "artificial intelligence," as shown in Figure 1.
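To make that concrete, here's a minimal sketch of a model that predicts outcomes with confidence scores. (The library choice, scikit-learn, and the toy data are my own for illustration; the article names no specific tools here.)

```python
# A toy "predict an outcome with a confidence score" example.
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Synthetic dataset: 200 labeled examples with 4 features each.
X, y = make_classification(n_samples=200, n_features=4, random_state=0)

model = LogisticRegression().fit(X, y)

# Each prediction carries a probability per class -- the "confidence
# level" associated with each possible outcome.
for row in model.predict_proba(X[:3]):
    print({f"class_{i}": round(p, 3) for i, p in enumerate(row)})
```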

From Checkers to Jeopardy

Machine learning is defined as a field of computer science that gives computers the ability to learn without being explicitly programmed. Arthur Samuel, an IBMer known for his groundbreaking work on computer checkers in 1959 (Figure 2), developed a scoring function, based on the position of the board at any given time, that measured each side's chance of winning while taking many game factors into account. In what he called rote learning, his program remembered every position it had already seen, along with the terminal value of the "reward function," an early form of reinforcement learning.
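As a rough illustration of rote learning, here's a hedged sketch in Python. This is illustrative pseudocode of the idea only, not Samuel's actual checkers program: every position visited in a game is remembered, and its stored value is nudged toward the game's terminal reward.

```python
from collections import defaultdict

position_values = defaultdict(float)  # board position -> learned value

def remember_game(positions, reward, lr=0.1):
    """After a game ends, nudge every visited position toward the outcome."""
    for pos in positions:
        position_values[pos] += lr * (reward - position_values[pos])

# Two toy "games": sequences of (hashable) positions ending in a win or loss.
remember_game(["start", "mid_a", "mid_b", "end_win"], reward=+1.0)
remember_game(["start", "mid_a", "mid_c", "end_loss"], reward=-1.0)
print(dict(position_values))  # positions seen in both games average out
```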

Machine learning grew out of the quest for AI. In the early days of AI as an academic discipline, some researchers were interested in having machines learn from data, approaching the problem with various symbolic methods as well as what were then termed "neural networks."

Two important learning concepts to know about (both are sketched in the code example after this list):

  • Supervised learning. An algorithm is given training data that contains the "correct answer" for each example. For instance, a supervised learning algorithm for credit card fraud detection would take as input a set of recorded transactions. For each transaction, the training data would contain a flag that says whether it is fraudulent or not.
  • Unsupervised learning. An algorithm looks for structure in the training data, like finding which examples are similar to each other, and groups them in clusters.
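Here's a minimal sketch of both concepts side by side, using scikit-learn on synthetic data. (The library and data are my illustrative choices; the synthetic points stand in for the credit card transactions above.)

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.linear_model import LogisticRegression

# Synthetic two-group data standing in for real transactions.
X, y = make_blobs(n_samples=300, centers=2, random_state=1)

# Supervised: the training data includes the "correct answer" y
# (e.g., a fraudulent/not-fraudulent flag per transaction).
clf = LogisticRegression().fit(X, y)
print("supervised prediction:", clf.predict(X[:1]))

# Unsupervised: labels are withheld; the algorithm looks for structure
# on its own, grouping similar examples into clusters.
clusters = KMeans(n_clusters=2, n_init=10, random_state=1).fit_predict(X)
print("cluster assignments:", clusters[:10])
```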

Around 1957, Frank Rosenblatt conceived the "perceptron," an algorithm for supervised learning of binary classifiers.
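The learning rule itself is simple enough to show from scratch. Here's a minimal sketch (the toy data is mine; the update rule is the classic perceptron rule): whenever a training point is misclassified, nudge the decision boundary toward it.

```python
import numpy as np

def perceptron_train(X, y, epochs=20, lr=0.1):
    """Classic perceptron rule for a binary classifier (labels -1/+1)."""
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            if yi * (np.dot(w, xi) + b) <= 0:  # misclassified point
                w += lr * yi * xi              # nudge the boundary toward it
                b += lr * yi
    return w, b

# Linearly separable toy data: the sign of the first feature decides the class.
X = np.array([[2.0, 1.0], [1.5, -0.5], [-1.0, 0.5], [-2.0, -1.0]])
y = np.array([1, 1, -1, -1])
w, b = perceptron_train(X, y)
print(np.sign(X @ w + b))  # -> [ 1.  1. -1. -1.]
```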

During the "AI Winter" of the 1970s and 1980s, there was pessimism in the AI community, reflected by the press and followed by severe cutbacks in funding and research. Three years later, the billion-dollar AI industry began to collapse.

During the 1980s, "backpropagation" caused a resurgence in ML research, followed by a shift to a data-driven approach in the 1990s. Scientists began creating programs for computers to analyze large amounts of data and draw conclusions or "learn" from the results.
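To see what backpropagation actually does, here's a compact NumPy sketch of a two-layer network learning XOR, a classic toy task a single perceptron cannot solve. (The task, network size, and learning rate are my illustrative choices.) The output error is propagated backward through each layer to compute weight updates.

```python
import numpy as np

rng = np.random.default_rng(0)

# XOR inputs and targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1, b1 = rng.normal(size=(2, 4)), np.zeros(4)  # hidden layer, 4 units
W2, b2 = rng.normal(size=(4, 1)), np.zeros(1)  # output layer
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

for _ in range(10000):
    # Forward pass.
    h = sigmoid(X @ W1 + b1)
    out = sigmoid(h @ W2 + b2)

    # Backward pass: propagate the error gradient layer by layer.
    d_out = (out - y) * out * (1 - out)  # gradient at the output
    d_h = (d_out @ W2.T) * h * (1 - h)   # gradient at the hidden layer

    # Gradient-descent weight updates.
    W2 -= 0.5 * h.T @ d_out; b2 -= 0.5 * d_out.sum(axis=0)
    W1 -= 0.5 * X.T @ d_h;   b1 -= 0.5 * d_h.sum(axis=0)

print(out.round(2).ravel())  # typically converges toward [0, 1, 1, 0]
```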

By 2010, deep learning was helping ML become integral to many widely used software services and applications. Deep learning stacks many neural network layers, convolutional neural networks among them, so that each layer learns a progressively more abstract representation. It has been applied to computer vision, speech recognition, natural language processing, audio recognition, social network filtering, machine translation, bioinformatics, and drug design, where it has produced results comparable to, and in some cases superior to, human experts.
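For a sense of what those layers look like in practice, here's a minimal convolutional network sketch in Keras (an assumed framework choice; the article names none), of the sort used for the image tasks above:

```python
import tensorflow as tf
from tensorflow.keras import layers

# Small CNN for 28x28 grayscale images in one of 10 classes (illustrative).
model = tf.keras.Sequential([
    layers.Conv2D(16, kernel_size=3, activation="relu",
                  input_shape=(28, 28, 1)),
    layers.MaxPooling2D(),  # each stage summarizes the layer below it
    layers.Conv2D(32, kernel_size=3, activation="relu"),
    layers.MaxPooling2D(),
    layers.Flatten(),
    layers.Dense(10, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.summary()  # prints the stack of layers and parameter counts
```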

In 2011, IBM's "Watson" competed on the TV quiz show Jeopardy!, beating the show's top champions. Its natural language processing, predictive scoring, and models were key to its success.

The Impact of Big Data

Big data just means all of your data. Remember the three Vs: volume, variety, and velocity. The more data (volume and variety) we have, the better informed our insights should be. Making a decision on limited data, or along just one dimension, incurs risk because we may not have the full picture. Including customer sentiment, product reviews, social media data, and similar sources should enable a greater "understanding" of a situation, and applying AI techniques to all of this data should produce smarter business outcomes by surfacing interrelationships across many different data sources.

Cognitive Computing Comes of Age

Around 2015, the many facets of ML and deep learning described above converged, and cognitive computing, the ability of computers to simulate the human behaviors of understanding and thought processing, came of age. Open source software, improved tools, demand for self-service pay-as-you-go analytics, cheap compute power, massive data ingest, scale-out processing, and flexible deployment options helped democratize cognitive computing, putting it within reach of the vast majority of the data science community.

Since the Jeopardy! match, AI has been applied across many industries: in financial markets, to predict and prevent fraud in real time; in retail, to predict what customers will purchase next; in security and protection, to help prevent attacks and crimes; in media, to tailor the viewing experience with targeted advertising; and in healthcare, to help doctors design cancer treatment plans.

Machine Learning for the Masses

But one thing was still missing: making all of this consumable by the data science community, regardless of skill level.

The IBM Data Science Experience (DSX) is a single unifying tool that allows multiple personas to collaborate across the data science lifecycle, from data preparation and ingest, to ML model creation and training, to deployment and management. DSX suits all skill levels, whether you prefer notebooks or an intuitive step-by-step visual interface that applies cognitive techniques to choose the best algorithms for you. This IBM video on DSX provides more information.

Summary and Call to Action

Hopefully, this article has helped you understand, at a high level, how AI appeared and developed over time, how AI, ML, and cognitive computing relate to one another, and what some of the key terms in this exciting industry mean. So, that's the educational portion of this blog post. Your next step is to try machine learning for yourself in the IBM Data Science Experience.

Glossary: My Top 20 ML/AI Terms Defined and Explained

