How AI Is Changing
Organizations are beginning to see real business value from their data, in part because GPUs have become affordable.
To gather insights on the state of artificial intelligence (AI) and all of its sub-segments — machine learning (ML), natural language processing (NLP), deep learning (DL), robotic process automation (RPA), regression, et al. — we talked to 21 executives who are implementing AI in their own organizations and helping others understand how AI can help their business. We began by asking, "How has AI changed in the past year?" Here's what they told us:
- The past year has seen many breakthroughs in AI, specifically in deep learning. For example, AlphaGo Zero taught itself how to play Go, Chess, and Shogi without any human intervention or games with human opponents. Tacotron and Baidu's Deep Voice have made speech generation nearly identical to human speech (no more of that funny Mac robot voice). Additionally, computer vision, object detection, and image segmentation are becoming extremely accurate and rivaling humans in medical diagnosis and biology research. However, other technologies haven't progressed quite as expected. Natural language processing, chatbots, and text summarization haven't quite lived up to expectations.
- AI has been around a long time. There are more "not new" things than new things. It's important not to underestimate the power of public awareness. When Deep Blue was beating Kasparov, that was different. What happened with AlphaGo was important: a person was beaten by a machine that was being innovative. We had seen this before in movies, but now it's real. This changed perceptions quite a bit. We also have a lot of applications providing business value through AI, so AI starts to mean something.
- We've seen people investing a lot over the last two or three years. This year, they are looking for the results. Projects get turned off if organizations are not seeing the results. It's important to determine the problem to be solved rather than just playing with technology. Within the Fortune 500 to 2000, a lot of R&D investment is slowly pushing into real-world use.
- AI is not considered science fiction anymore. Among most tech companies, there's a common understanding regarding AI's benefit to the enterprise. The technology has evolved over the last few months with better learning capabilities, increased availability of off-the-shelf libraries, and machines' ability to improve their learning process in real time.
- Over the last year, we’ve grown our focus on building truly conversational AI. Current assistants are useful, but they are very far from the capabilities of a real human and lack the ability to handle more complex and valuable tasks. Getting to the next level requires AI technologies that take advantage of, but also go beyond, pattern-matching to a true dynamic dialog that leverages knowledge-based reasoning, context, and personalization to make sense of incomplete and ambiguous language. We have also started to add other modalities into the mix. When humans communicate, they use gestures, gaze, and other factors beyond speech, and we have started to also enable our systems to do the same. There’s also an important need to connect to other services, including other virtual assistants. That’s why we introduced our cognitive arbitrator, which seamlessly connects and integrates disparate virtual assistants, third-party services, and content via a single interface that spans the automotive, smart home, and the Internet of Things (IoT) ecosystem to complete complex tasks and enhance the user experience. As a result, we maximize our customers’ ability to provide their own unique and differentiated experiences to end users, while also offering interoperability to the world of other assistants that deliver useful services. It’s a win-win for everyone in the ecosystem, especially the humans that buy and use our customers’ products and services. This need to connect to other services is also owed to another important change that has happened in the last few years and even accelerated in the last year: AI is becoming mainstream.
- AI and ML have moved out of the fringes in the labs and into more mainstream applications. We're in the next chapter, which we're just beginning. Six years ago, the title of data scientist did not exist. Today, skills have become deeply specialized, and data scientists and developers are magical at finding ways to do things faster with AI.
- Adoption was slow and gradual from 2000 to 2003 as all trading firms went to algorithmic trading. Over the past few years, we've seen an increase in ML because of aspirational applications. Humans are being replaced by AI in creative situations. Machines are making decisions by themselves based on new sources of signals and more data. Organizations are able to rent GPUs for a few pennies an hour.
- Technologically speaking, in the past year, GPU-based servers have become commonplace as developers leverage the processing power to accelerate their applications. Specialized processors such as Google's TPU are starting to emerge, and competitive cloud providers are collaborating to develop an open-source deep-learning library. There has also been a steady transition from Big Data and point tools such as Hadoop and Spark to a broader class of data analysis using AI and neural networks. ML is bridging the gap between these approaches by taking the large disparate data sets and applying algorithmic intelligence to the analysis. Currently, the ability to self-learn is in the early stages, and learning algorithms are still very rudimentary. AI has gained a stronger foothold in our lives. As a result, product and service recommendation engines and image processing systems have improved dramatically, and fear about AI's threat to jobs has begun to subside, replaced by optimism over new career paths. The pace of innovation in this area is increasing rapidly. There is no doubt AI will have a transformative effect on our world.
- AI has become fundamental to the products you ship. AI is being infused into, and digital assistants are surfacing across, many channels and use cases. TensorFlow use is wider, and people are more knowledgeable. GPUs have gotten better. It's still hard to find AI talent, and that's not changing, but fresh graduates from universities are exposed to data science earlier. Object recognition is just getting started. AI is not democratized yet, but it is on that path and accelerating. In the next 24 months, we will see more AI-driven use cases.
- A lot more adoption, some innovation in techniques, and practical usage. More data is available, and organizations are smarter about putting that data to use. Expectations are practical and realistic: improve a process or product in a variety of ways.
- The concept of AI and ML is now a key ingredient of a cloud environment, but this only works if users have the data at their hands. Increased automation through ML has helped enterprises increase employee productivity, and it will do so even more down the road as employees become more familiar working with AI tools. In addition, streamlining data integration efforts is on the rise, especially as enterprises look to glean insights from their data. The increased focus on predictive analytics allows companies to turn real-time data into actions.
- AI is nothing new, but it is experiencing a resurgence due to new capabilities for handling the volume, velocity, and variety of data required. Organizations can now harness vast lakes of information. There is such a cacophony of information that you need AI to get value from it, and AI can help convert data swamps into data lakes. The problem is that many organizations can't get their arms around their data.
- AI has dramatically evolved in the past year due to two primary reasons: 1) the speed at which digital transformation is taking place across all businesses; and 2) the rate at which new business and operations datasets are being introduced, whose velocity is reinforcing the need for AI to automate business and operational activities. The need for AI has evolved from "nice-to-have" to "must-have." There is a good degree of awareness among key decision makers and stakeholders about implementing AI to make their business successful. AI is now a key item on the agenda of CIOs and CFOs of every company.
- There's a lot of substance behind the hype. It's a continuation of trends: the democratization of machine learning, where ordinary engineers can do it. It's much easier than one year ago for software engineers to do interesting ML. There's hardware available at a lower cost, there's data available, and there are techniques for transfer learning — all of these things are making it so you don't have to be a super Ph.D. You can be a subject matter expert who knows your data, and the learning itself is becoming more commoditized.
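The transfer learning mentioned above is one reason non-specialists can now do interesting ML: a model pretrained elsewhere is reused as a frozen feature extractor, and only a small, cheap "head" is trained on the new task. The sketch below is a deliberately toy illustration of that workflow — the `frozen_features` function is a hand-written stand-in for a real pretrained network, and the task and perceptron head are illustrative assumptions, not anyone's production method.

```python
import random

# Frozen "pretrained" feature extractor: in practice this would be a
# network trained on a large corpus; here it is a fixed stand-in that
# we never update during training.
def frozen_features(x):
    return [x, x * x, abs(x)]

# Labeled data for the new task: is x*x greater than 1?
# (Points very close to the boundary are dropped so the simple
# perceptron head below converges quickly.)
random.seed(0)
xs = [random.uniform(-2.0, 2.0) for _ in range(300)]
data = [(x, 1 if x * x > 1 else 0) for x in xs if abs(x * x - 1) > 0.2]

# Transfer learning step: train ONLY the lightweight linear head
# (perceptron updates); the feature extractor stays frozen.
weights, bias = [0.0, 0.0, 0.0], 0.0
for _ in range(50):
    for x, y in data:
        f = frozen_features(x)
        pred = 1 if sum(w * v for w, v in zip(weights, f)) + bias > 0 else 0
        err = y - pred  # -1, 0, or +1
        if err:
            weights = [w + err * v for w, v in zip(weights, f)]
            bias += err

def predict(x):
    f = frozen_features(x)
    return 1 if sum(w * v for w, v in zip(weights, f)) + bias > 0 else 0

accuracy = sum(predict(x) == y for x, y in data) / len(data)
```

Because the extractor already exposes a feature (x squared) in which the task is linearly separable, the tiny head learns it with no deep-learning expertise required — which is the commoditization point the quote is making.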
- We're at an inflection point: we're seeing a real, serious uptick in people aware of the production issue. The shortage of data scientists raises a couple of issues, but today there is more online education, and universities are setting up data science programs; now we have citizen data scientists. There is also the trend of AutoML, where the selection of algorithms is assisted by the machines themselves. Organizations have data science; now the problem is taking the next step to production.
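At its core, the AutoML trend described above is a selection loop: fit several candidate algorithms and keep whichever scores best on held-out data. The sketch below shows only that loop, under toy assumptions — the dataset, the three candidate models, and their names are illustrative stand-ins; real AutoML systems also search hyperparameters and feature pipelines.

```python
import random

# Synthetic regression data (y = 3x + noise), split into train/validation.
random.seed(1)
points = [(x, 3.0 * x + random.gauss(0, 0.1))
          for x in (random.uniform(0, 1) for _ in range(200))]
train, valid = points[:150], points[150:]

def fit_mean(data):
    # Baseline: always predict the mean of the training targets.
    mean_y = sum(y for _, y in data) / len(data)
    return lambda x: mean_y

def fit_linear(data):
    # Closed-form one-dimensional least squares.
    n = len(data)
    mx = sum(x for x, _ in data) / n
    my = sum(y for _, y in data) / n
    slope = (sum((x - mx) * (y - my) for x, y in data)
             / sum((x - mx) ** 2 for x, _ in data))
    return lambda x: my + slope * (x - mx)

def fit_nearest(data):
    # 1-nearest-neighbor regression.
    return lambda x: min(data, key=lambda p: abs(p[0] - x))[1]

def mse(model, data):
    # Mean squared error on a held-out set.
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

# The "auto" part: evaluate every candidate and pick the lowest
# validation error -- the machine, not the analyst, chooses.
candidates = {"mean": fit_mean, "linear": fit_linear, "1-nn": fit_nearest}
scores = {name: mse(fit(train), valid) for name, fit in candidates.items()}
best = min(scores, key=scores.get)
```

The design point is that the analyst supplies candidates and a metric; the selection itself is mechanical, which is what makes it automatable for citizen data scientists.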
- It's not an inflection point with regards to deployment. We have enough data points to see where people are on the hype cycle, and very few are at the top of it. We're a bridge to help people get to that point more competently. The data scientists have it figured out in their lab loop: if you give them A, B, and C, they'll get it done — but without understanding the business problem they're trying to solve.
- No one knows how to do this. They don't have the skillsets for the cloud, and they don't have the data scientists. We're part of the way into a multi-year journey to smarter computing at the edge. The edge will become smarter with semantically smart ML. Can we make these edge systems do something like remembering? We'll see more diverse deployments of diverse devices that naturally instantiate digital personas and roll apps into a compositional model, becoming more semantically enriched.
- We certainly see a bit of fatigue with deep learning and black-box techniques. On the research side of things, there appears to be a big shift toward creating algorithms that are less opaque and less data-hungry. How can we come to insights not with big data but with real data? In the case of marketing, there are certain systems that are very data-rich, but there are others that are not. How can we derive meaningful solutions using statistics and other mathematical techniques to get where we need to go?
- The fundamental changes have been in the neural networks. Diverse companies are adopting machine learning because most people are seeking interpretable results. One example is AI for healthcare — AI has opened up cutting-edge opportunities and adoption for the healthcare industry thanks to that interpretability.
Here's who we spoke to:
- Assaf Gad, Vice President, Strategic Partnerships, Audioburst
- Tyler Foxworthy, Chief Scientist, DemandJump
- Patric Palm, CEO, Favro
- Sameer Padhye, CEO, FixStream
- Matthew Tillman, CEO, Haven
- Dipti Borkar, VP of Product Marketing, Kinetica
- Ted Dunning, Chief Application Architect, MapR
- Jeff Aaron, VP Marketing and Ebrahim Safavi, Data Scientist, Mist Systems
- Dominic Wellington, Global IT Evangelist, Moogsoft
- Dr. Nils Lenke, Director, Corporate Research, Nuance Communications
- Mark Gamble, Senior Director of Product Marketing, OpenText
- Sri Ramanathan, Group Vice President of Mobile, Oracle
- Sivan Metzger, CEO and Co-founder, ParallelM
- Nisha Talagala, CTO and Co-founder, ParallelM
- Stuart Feffer, Co-founder and CEO, Reality AI
- Sven Denecken, SVP Head of Product Management, SAP S/4 Hana Cloud
- Steve Sloan, Chief Product Officer, SendGrid
- Simon Crosby, CTO, Swim
- Liran Zvibel, CEO and Co-founder, WekaIO
- Daniel DeMillard, AI Architect, zvelo
Opinions expressed by DZone contributors are their own.