AI Will Not Eat the World

...well, not for a good while anyway.

By Christopher Lamb · Jan. 24, 2019 · Opinion

So I work at the intersection of cybersecurity and machine learning. I use a variety of neural network architectures and machine learning techniques to develop new ways of detecting previously unseen malware. I've worked on other projects using machine learning and AI too.

And we have nothing to worry about.

We're about to enter a new valley of despair with AI technologies. The explosion in neural networks that has led to self-driving cars, autonomous drones, and other modern AI applications rests on two singular developments: more complex, biologically inspired network architectures, and the GPU.

These kinds of networks, like the ubiquitous convolutional networks we use in deep learning applications today, were originally inspired by the structure of the visual system and were first used for visual recognition tasks. The earliest CNNs showed monumental gains over the other methods folks had been using for character recognition, essentially solving handwritten character recognition over the MNIST data set. These networks were certainly novel in their architecture, but the techniques they used weren't really anything new. They represented an evolution of neural techniques, not the revolution they seemed to be.
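
To make the kind of architecture being described concrete, here is a minimal convolutional network for MNIST-style digit images, sketched in PyTorch. The layer sizes are arbitrary choices for the example, not the historical LeNet configuration, and the data is faked for a quick shape check.

```python
# A minimal CNN for 28x28 grayscale digit images, in the spirit of the
# early MNIST work described above. Layer sizes here are illustrative
# choices, not the historical LeNet configuration.
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    def __init__(self, num_classes: int = 10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1),   # 28x28 -> 28x28
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 28x28 -> 14x14
            nn.Conv2d(16, 32, kernel_size=3, padding=1),  # 14x14 -> 14x14
            nn.ReLU(),
            nn.MaxPool2d(2),                              # 14x14 -> 7x7
        )
        self.classifier = nn.Linear(32 * 7 * 7, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)        # extract local visual features
        x = x.flatten(start_dim=1)  # one feature vector per image
        return self.classifier(x)   # scores for the ten digit classes

# Quick shape check with a fake batch of images.
print(SmallCNN()(torch.randn(8, 1, 28, 28)).shape)  # torch.Size([8, 10])
```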

But what really enabled these deep, complex architectures was computational power. Neural models are fast once trained, but they are very expensive to train in the first place. Propagating error vectors back through multiple layers is slow, and deep networks of the kind convolutional architectures require simply took too long to train. When graphics processing units and associated development tools like CUDA became genuinely usable and affordable, that kind of training suddenly became feasible. Networks that took days to train on a CPU could be trained in hours.
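
A hedged sketch of that CPU-versus-GPU gap: the snippet below times the same backpropagation loop on the CPU and, if PyTorch can see one, a CUDA GPU. The layer sizes, batch size, and step count are invented for illustration; the point is the relative gap, not the absolute numbers, which will vary by hardware.

```python
# Time the same small training loop on the CPU and, if available, a GPU.
import time
import torch
import torch.nn as nn

def time_training(device: str, steps: int = 20) -> float:
    model = nn.Sequential(
        nn.Linear(2048, 2048), nn.ReLU(),
        nn.Linear(2048, 2048), nn.ReLU(),
        nn.Linear(2048, 10),
    ).to(device)
    optimizer = torch.optim.SGD(model.parameters(), lr=0.01)
    loss_fn = nn.CrossEntropyLoss()
    x = torch.randn(256, 2048, device=device)        # fake training batch
    y = torch.randint(0, 10, (256,), device=device)  # fake labels

    start = time.time()
    for _ in range(steps):
        optimizer.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()   # propagate error back through every layer
        optimizer.step()
    if device == "cuda":
        torch.cuda.synchronize()  # wait for queued GPU kernels to finish
    return time.time() - start

print(f"CPU: {time_training('cpu'):.2f}s")
if torch.cuda.is_available():
    print(f"GPU: {time_training('cuda'):.2f}s")
```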

But we're beginning to hit the end of the road with AI applications. All of the obvious problems have been, if not solved, at least approached with neural network-based architectures. And they've been reasonably successful, though we've certainly had some notable failures like Uber's self-driving car fiasco and recidivism prediction systems, which seem to codify training set bias more than anything else.

The simple fact is that, yes, you can train these kinds of systems to do amazing things if you have immense amounts of data, which you have access to if you're Google, Facebook, or Netflix. Otherwise, it's not so easy. And assembling unbiased data that trains a model to key on the features you actually care about, rather than coincident but meaningless ones, is much harder than people think.
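
A toy illustration of that trap, with synthetic data and scikit-learn used purely for convenience: a spurious feature coincidentally matches the label throughout the training set, so the classifier leans on it, scores nearly perfectly in training, and then likely falls back toward chance once the coincidence breaks at test time.

```python
# Synthetic demo: a coincident-but-meaningless feature dominates training.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def make_data(n: int, spurious_matches_label: bool):
    signal = rng.normal(size=(n, 1))  # weakly predictive, genuine feature
    labels = (signal[:, 0] + rng.normal(scale=2.0, size=n) > 0).astype(int)
    if spurious_matches_label:
        spurious = labels.reshape(-1, 1).astype(float)  # coincidentally equals the label
    else:
        spurious = rng.integers(0, 2, size=(n, 1)).astype(float)  # coincidence gone
    return np.hstack([signal, spurious]), labels

X_train, y_train = make_data(1000, spurious_matches_label=True)
X_test, y_test = make_data(1000, spurious_matches_label=False)

model = LogisticRegression().fit(X_train, y_train)
print("train accuracy:", model.score(X_train, y_train))  # close to 1.0
print("test accuracy:", model.score(X_test, y_test))     # likely not far from 0.5
```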

We don't have another GPU revolution on the horizon. We'll certainly continue to develop new algorithms, but we won't have a shiny new engine to run them. And we'll tax GPU clusters just like we did CPU clusters and hit limits with them too. And sure, we have lots of work to do trying to understand the internals of these kinds of systems and what we can do to protect them from interference.

But general intelligence? Mass replacement of jobs with AI? Forget about it. Not with the tools we have today.

Tags: AI, machine learning, neural network, network architecture

Opinions expressed by DZone contributors are their own.
