
Kinetica Operationalizes Big Data in Real-Time


Real-time analytics with large-scale data is driving use cases across many industries.




We had the opportunity to meet with Daniel Raskin, CMO; Irina Farooq, Chief Product Officer; and Ken Wattana, Sr. Marketing Manager of Kinetica during the IT Press Tour in San Francisco. Kinetica is enabling the adoption of real-time analytics in telecommunications, retail, and financial services, as well as the public sector.


Kinetica was founded in 2016, having grown out of an innovation initiative for the U.S. Army to track and analyze national security threats and deliver real-time analytics. Today, its GPU-accelerated platform brings diverse data and techniques together for massively parallel processing, providing sub-second response times for real-time analysis, recommendations, and response.

Data is at the center of innovation, as exemplified by analysis of the human genome, autonomous driving, and smart cities. Data and analytics are moving from passive (looking at the past) to active (of the moment and the future) analytics. Kinetica is enabling developers to build smart, real-time applications that transform businesses and verticals.

Use cases include designing rich customer experiences, reducing fuel costs and carbon footprints, bringing new medicines to market faster, gaining real-time knowledge of risks, optimizing 5G network coverage, and making self-driving vehicles smarter by augmenting traditional data platforms with an active data layer.

OVO is a digital payment and loyalty platform in Indonesia. The company is building real-time, 360-degree customer profiles for personalization and is using real-time analytics to serve up promotions and campaigns as customers go through life, with the goal of making their lives simpler and easier.

Financial services firms are able to make continuous value-at-risk calculations as market activity streams in, rather than only at the beginning and end of the day. They are able to recalculate risk on a minute-by-minute basis.
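The shift from end-of-day to streaming risk can be sketched as a rolling historical value-at-risk calculation that is recomputed as each market tick arrives. This is a minimal, illustrative Python sketch of the general technique, not Kinetica's implementation; the window size and return series are made up:

```python
from collections import deque

def historical_var(returns, confidence=0.95):
    """One-period historical VaR: the loss at the given confidence level."""
    ordered = sorted(returns)              # worst (most negative) returns first
    idx = int((1 - confidence) * len(ordered))
    return -ordered[idx]                   # report VaR as a positive loss

# Recompute VaR over a sliding window as each return streams in.
window = deque(maxlen=500)                 # keep the last 500 observations
for r in [-0.021, 0.004, -0.013, 0.009, -0.030, 0.011, -0.008]:
    window.append(r)
    if len(window) >= 5:                   # need a few points before quoting VaR
        var_95 = historical_var(window, confidence=0.95)
```

In a streaming platform the loop body would run per incoming tick; the `deque` stands in for whatever windowed state the engine maintains.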

The U.S. Postal Service is optimizing mail delivery worldwide, aggregating data and using it as a foundation to deliver new services. Retail enterprises are making real-time replenishment a reality by streaming data from stores and warehouses to trigger replenishment decisions. Machine learning is being production-enabled in one unified platform.

Telecommunications companies are accelerating geospatial analytics for network optimization, determining in particular how tower placement affects coverage. Getting this information used to take years; now it takes minutes. They are also analyzing signal and coverage data with what-if scenarios to see how changes affect coverage areas. They are able to visualize what a city looks like for deeper network and subscriber insights, understand revenue reporting and subscriber analytics in real-time, and improve network performance and customer service with real-time geospatial visualizations.
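At its core, a tower-placement what-if reduces to geospatial distance queries: given a candidate site and a coverage radius, which subscriber locations fall inside it? A minimal sketch in plain Python using the standard haversine formula; the coordinates, the 5 km radius, and the fixed-radius coverage model are illustrative assumptions, not Kinetica's actual geospatial engine:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometers between two (lat, lon) points."""
    R = 6371.0                              # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))

# What-if: which subscribers would a tower at this candidate site cover?
tower = (37.7749, -122.4194)                # hypothetical site
radius_km = 5.0
subscribers = [(37.80, -122.41), (37.33, -121.89)]  # illustrative points
covered = [s for s in subscribers
           if haversine_km(*tower, *s) <= radius_km]
```

A GPU-accelerated engine parallelizes this distance test across millions of points per candidate site, which is what turns a years-long study into a minutes-long query.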

GSK is just one pharmaceutical company accelerating drug discovery using huge amounts of data in real-time.

Automotive companies are powering regulatory reporting for autonomous vehicles using real-time, in-car analytics, with Kinetica Edge running on NVIDIA hardware in the vehicle.

Orbital Insight, a geospatial analytics company working with satellite imagery, is doing real-time, global-scale mapping and querying.

PubMatic is delivering real-time campaign reporting in digital marketing.

Hundreds of billions of data points can be analyzed in real-time. The platform does the number crunching; all CPU and GPU resources are managed underneath, enabling hardware consolidation so that a handful of GPUs can do more powerful analysis.

The in-memory data platform is able to scale and handle the complexity of data engineering for intraday replenishment, a workload existing data infrastructures are unable to handle at that size. The unified architecture of the platform provides a simple solution with a unified data model, developer APIs, and tools for streaming analytics, historical analytics, location intelligence, and ML-powered analytics.

The GPU-accelerated database supports streaming data sources, high-speed parallel distributed ingest, real-time streaming, active analytical techniques, and an in-memory OLAP database.

An oil and gas company was able to go from 9 million data points in 2-D to 90 billion data points in 3-D for much richer insights into oil fields.

ML-powered analytics seamlessly integrate algorithms and ML models into active analytical applications. With a container, you can import logic and trained models, select a deployment mode and publish, enable automated data orchestration, and audit data and model activity.
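The publish-and-audit pattern described above — a trained model imported into the platform, scoring records as they stream through, with every decision logged — can be sketched as follows. This is an illustrative stand-in (a hand-rolled logistic scorer with made-up field and model names), not Kinetica's container-based deployment API:

```python
import math

# Illustrative "imported" model: logistic-regression weights trained offline.
WEIGHTS = {"amount": 0.8, "velocity": 1.2}
BIAS = -2.0

def score(record):
    """Score one streaming record with the published model."""
    z = BIAS + sum(WEIGHTS[k] * record[k] for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))       # probability in (0, 1)

def audit(record, s):
    """Audit trail: keep each input alongside its model output and version."""
    return {"input": record, "score": round(s, 3), "model": "fraud-lr-v1"}

# Streaming records flow through the deployed scorer; every one is audited.
stream = [{"amount": 1.5, "velocity": 0.2}, {"amount": 3.0, "velocity": 2.5}]
audited = [audit(r, score(r)) for r in stream]
```

In the platform, the `score` function would live inside the deployed container and the audit records would land in a table, so data and model activity stay queryable after the fact.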

The core differentiators of the platform are the breadth of active analytics, simplicity and agility, and performance at scale. You can run more scenario analyses faster to make more informed decisions.

Further Reading

Real-Time Analytics in the World of Virtual Reality and Live Streaming

Apps Depend on Real-Time Streaming Data, Here’s How to Manage Them

Topics:
artificial intelligence, massively parallel processing, big data analytics, real-time analytics

Opinions expressed by DZone contributors are their own.
