Cognitive Computing In 2017
Prediction based on trends in GPU-database adoption.
Cognitive computing, which aims to simulate human thought and reasoning, could be considered the ultimate goal of information technology, and 2017 is set to begin a new era for it, according to Eric Mizell, vice president of global solutions engineering at Kinetica.
“The foundation for affordable cognitive computing already exists based on steady advances in CPU, memory, storage, and networking technologies,” said Mizell. “A major breakthrough came with the advent of the graphics processing unit (GPU) and its use in data analytics.”
According to Mizell, four major trends are driving the adoption of GPU-accelerated databases in this new era of cognitive computing:
Trend #1: GPUs will revolutionize real-time intelligence in 2017
Graphics processing units (GPUs) deliver up to 100 times better performance than leading in-memory and analytical databases. This is due to the GPU’s parallelized processing architecture, which contains over 4,000 cores, compared to the 16-32 cores in today’s typical multi-core CPUs. The GPU’s many small cores efficiently perform the same instruction in parallel across the data, making them ideally suited to the compute-intensive workloads essential for real-time analysis of large streaming data sets.
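The data-parallel pattern described above can be sketched in a few lines. This is a hypothetical illustration, not any vendor's API: the data is split into partitions, the same simple kernel runs on every partition, and the partial results are merged. On a GPU each partition would map to one of the thousands of cores; here the partitions are processed in a plain loop for clarity.

```python
def chunked(values, n_chunks):
    """Split a list into roughly equal partitions."""
    size = max(1, len(values) // n_chunks)
    return [values[i:i + size] for i in range(0, len(values), size)]

def scan_partition(partition, threshold):
    """The per-core 'kernel': count values above a threshold."""
    return sum(1 for v in partition if v > threshold)

def parallel_count(values, threshold, n_chunks=4):
    """Map the same kernel over every partition, then reduce the partials."""
    partials = [scan_partition(p, threshold) for p in chunked(values, n_chunks)]
    return sum(partials)

readings = [3, 18, 7, 42, 25, 1, 30, 9]
print(parallel_count(readings, threshold=10))  # counts 18, 42, 25, 30 -> 4
```

Because every partition executes the identical instruction stream, this shape of work is exactly what thousands of small GPU cores accelerate well.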
Trend #2: GPUs will transform the Cloud into a “turbo-charged” network
Amazon already offers GPU-powered instances in its cloud service, and Microsoft and Google are not far behind: both have announced plans for GPU instances in their cloud offerings and are expected to begin offering them in 2017. All of these cloud service providers are turning to GPUs to gain a competitive advantage and offer their customers dramatic performance improvements.
Trend #3: GPU-accelerated databases will soon have enterprise-class capabilities
Major enhancements in both security and high availability for GPU-accelerated databases will arrive in 2017, building on their proven enterprise-class performance and scalability. On the security side, role- and group-based authorization along with user authentication support will make GPU-accelerated databases a good choice for applications that must comply with security regulations, including personal privacy protections. Data replication with automatic failover will meet today’s high-availability requirements, making GPU-accelerated databases a powerful choice for even the most mission-critical applications.
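To make the role- and group-based authorization idea concrete, here is a minimal, hypothetical sketch. The role names, groups, and permissions are illustrative only and are not taken from any actual GPU-database product: users accumulate roles both directly and through group membership, and an action is allowed if any of those roles grants it.

```python
# Illustrative role -> permission and group -> role mappings (assumed names).
ROLE_PERMISSIONS = {
    "analyst":  {"select"},
    "engineer": {"select", "insert"},
    "admin":    {"select", "insert", "delete", "grant"},
}

GROUP_ROLES = {
    "data_science": {"analyst"},
    "platform":     {"engineer", "admin"},
}

def permissions_for(user_roles, user_groups):
    """Union of permissions from direct roles and group-inherited roles."""
    roles = set(user_roles)
    for group in user_groups:
        roles |= GROUP_ROLES.get(group, set())
    perms = set()
    for role in roles:
        perms |= ROLE_PERMISSIONS.get(role, set())
    return perms

def authorized(user_roles, user_groups, action):
    """True if any direct or inherited role grants the action."""
    return action in permissions_for(user_roles, user_groups)

print(authorized(["analyst"], [], "delete"))   # False: analysts can only select
print(authorized([], ["platform"], "delete"))  # True: inherited via admin role
```

The point of the group layer is administrative: access is granted and revoked by moving users between groups rather than editing per-user permission lists.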
Trend #4: 2017 will usher in the new era of Cognitive Computing
These powerful forces will help usher in the new era of cognitive computing, making it possible for organizations to affordably converge artificial intelligence, business intelligence, machine learning, natural language processing, expert systems, pattern recognition, and other data analytics to create groundbreaking systems capable of real-time self-learning. With so much raw compute power, GPU-accelerated databases offer extraordinary performance without needing to redefine schemas or build indexes in advance. GPU acceleration makes it possible for organizations to perform the real-time, exploratory analytics that will be a requirement in the new, exciting era of cognitive computing in 2017.
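The "no indexes prepared in advance" claim boils down to brute-force scanning: with enough raw compute, an ad-hoc predicate can simply be evaluated against every row. A minimal sketch of that exploratory style, with made-up table and column names for illustration:

```python
# A tiny in-memory "table" of rows (illustrative data, not a real dataset).
rows = [
    {"city": "Austin",  "rides": 120, "avg_fare": 14.5},
    {"city": "Boston",  "rides": 95,  "avg_fare": 18.2},
    {"city": "Chicago", "rides": 210, "avg_fare": 12.9},
]

def scan(table, predicate):
    """Full-table scan with an arbitrary predicate -- no index required."""
    return [row for row in table if predicate(row)]

# An analyst can pose a brand-new question immediately, with no prior
# index build or schema change:
busy_and_cheap = scan(rows, lambda r: r["rides"] > 100 and r["avg_fare"] < 15)
print([r["city"] for r in busy_and_cheap])  # ['Austin', 'Chicago']
```

On a CPU this full scan is what indexes exist to avoid; the argument in the article is that GPU parallelism makes the scan itself fast enough that exploratory queries need no such preparation.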
Opinions expressed by DZone contributors are their own.