I was recently talking with Eric Mizell, Vice President of Global Solution Engineering at Kinetica, about another topic when we began discussing use cases for big data and the vast opportunities ahead for artificial intelligence and machine learning.
Kinetica has a few use cases that I thought would inspire our community to come up with solutions to problems using data. Here are a few for your consideration.
Problem: A pipeline and well analytics company had a large volume of data from its IoT devices. It wanted to augment its SaaS offering to provide research data and analytics on oil and gas to energy investors and operators, with geospatial query, visualization, and analytics capabilities.
Solution/results: Geospatial visualization and analytics of a massive number of wells and pipelines by land ownership, region, and more. Custom visualizations and charts provided data-driven insights. This was delivered as an embedded solution with seamless Node.js integration and GPU acceleration, running in RSEG's AWS VPC deployment.
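To make the geospatial query idea concrete, here is a minimal sketch of filtering well records by region. The well data, field names, and bounding box are hypothetical illustrations; in the actual deployment this kind of filter runs as a GPU-accelerated query inside the database rather than in application code:

```python
# Minimal sketch of a geospatial bounding-box query over well records.
# The records, field names, and coordinates are hypothetical examples.

def wells_in_region(wells, min_lon, min_lat, max_lon, max_lat):
    """Return wells whose coordinates fall inside a bounding box."""
    return [
        w for w in wells
        if min_lon <= w["lon"] <= max_lon and min_lat <= w["lat"] <= max_lat
    ]

wells = [
    {"name": "Well-A", "lon": -102.1, "lat": 31.9, "owner": "Operator 1"},
    {"name": "Well-B", "lon": -97.5,  "lat": 35.4, "owner": "Operator 2"},
    {"name": "Well-C", "lon": -103.6, "lat": 32.4, "owner": "Operator 1"},
]

# Approximate Permian-Basin-like bounding box, for illustration only
permian = wells_in_region(wells, -104.5, 30.5, -101.0, 33.5)
print([w["name"] for w in permian])  # Well-A and Well-C fall inside
```

The same shape of predicate extends naturally to joins on land ownership or region once the data lives in a geospatially indexed store.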
Real-Time Data and Analytics
Problem: An independent oil and natural gas company was looking for modern geospatial visualization and OLAP technology for high-value drilling and data analytics (i.e., well-level, production, historic and go-forward, and economic data). The plan was to jumpstart the project with huge expansion potential based on results.
Solution/results: Replaced the legacy system to ingest, join, and visualize data in real time, at scale, and with speed. This provided real-time drilling analytics with visualizations and charting on a 1TB cluster running on Azure with MapR as the warm layer.
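The ingest-and-join pattern above can be sketched in a few lines: drilling metadata keyed by well ID merged with the latest production figures as they arrive. The field names and records are hypothetical; the real system performs this join continuously at terabyte scale:

```python
# Sketch of joining two feeds in near real time: drilling metadata keyed
# by well ID merged with incoming production figures. All field names
# and values here are hypothetical illustrations.
drilling = {
    "W-001": {"depth_ft": 9500, "status": "active"},
    "W-002": {"depth_ft": 8700, "status": "drilling"},
}
production = [
    {"well": "W-001", "bbl_per_day": 420},
    {"well": "W-003", "bbl_per_day": 150},   # no drilling record yet
]

# Left join: enrich each production record with its drilling metadata
joined = [{**rec, **drilling.get(rec["well"], {})} for rec in production]
print(joined[0])  # production fields plus depth_ft and status
```

A database-side join avoids shipping both feeds to the application, which is what makes the real-time version viable at scale.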
Problem: A customer in the entertainment business wanted to accelerate Tableau dashboards for faster customer 360-degree analytics.
Solution/results: Dashboard loads became 24x faster, and slice-and-dice, drill-downs, and filters became 3.5x faster, delivering real-time 360-degree customer information for more timely and relevant offers. Tableau Server runs on GCP with an accelerated EDW workload.
Advanced In-Database Analytics
Problem: A financial services company wanted to move counterparty risk analysis from batch/overnight to a streaming/real-time system for flexible real-time monitoring by traders, auditors, and management.
Solution/results: Able to handle time-sensitive, compute-intensive risk computations that project years into the future across hundreds of variables. In-database analytics run custom XVA algorithms at scale using the massive parallelization of GPUs. The client now has a real-time risk modeling engine running on public cloud Microsoft Azure GPU instances, a turn-key solution with elasticity, security, ease of use, and faster time-to-value.
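The compute pattern behind this kind of exposure projection can be sketched with a toy Monte Carlo simulation. This is not the client's XVA algorithm: the model (geometric Brownian motion on a single underlying), the parameters, and the portfolio are all simplified assumptions, and a real system would vectorize the path loop across GPU cores rather than iterate in Python:

```python
# Toy sketch of projecting exposure years into the future via Monte Carlo.
# Model, parameters, and instrument are hypothetical simplifications.
import math
import random

def simulate_exposure(spot, vol, rate, years, steps_per_year, n_paths, seed=42):
    """Estimate expected positive exposure per time step for a long
    forward struck at the initial spot, under geometric Brownian motion."""
    rng = random.Random(seed)
    dt = 1.0 / steps_per_year
    n_steps = years * steps_per_year
    exposures = [0.0] * n_steps
    for _ in range(n_paths):
        price = spot
        for t in range(n_steps):
            z = rng.gauss(0.0, 1.0)
            price *= math.exp((rate - 0.5 * vol**2) * dt
                              + vol * math.sqrt(dt) * z)
            exposures[t] += max(price - spot, 0.0)  # positive exposure only
    return [e / n_paths for e in exposures]

profile = simulate_exposure(spot=100.0, vol=0.2, rate=0.02,
                            years=5, steps_per_year=12, n_paths=2000)
print(f"peak expected exposure: {max(profile):.2f}")
```

Every path is independent, which is exactly why this workload maps so well onto thousands of GPU threads.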
Problem: An ad tech company wanted to be the first to market with game-changing technologies that put publishers' needs first while supporting real-time campaign reporting.
Solution/results: High-speed capabilities to ingest, store, and persist data for processing, with ad hoc analytics on ad impression and bid data. A functional replacement of a 40-node Apache Apex cluster resulted in a smaller hardware footprint. High-speed data ingestion arrives via native Kafka integration, and Python access to the data store simplifies data science discovery. Fast-data capabilities complement the long-term retention and archive Hadoop data lake.
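The ingest-then-query pattern can be sketched as follows. In production the events would arrive through a Kafka consumer and land in a database table; here a generator stands in for the topic and a list stands in for the table, and all record fields and values are hypothetical:

```python
# Sketch of streaming ingest followed by ad hoc analytics on ad
# impression data. A generator stands in for a Kafka topic and a list
# for the data store; all names and values are hypothetical examples.
from collections import Counter

def impression_stream():
    """Stand-in for a Kafka topic of ad impression events."""
    yield from [
        {"campaign": "c1", "publisher": "p1", "bid": 0.8},
        {"campaign": "c1", "publisher": "p2", "bid": 1.2},
        {"campaign": "c2", "publisher": "p1", "bid": 0.5},
        {"campaign": "c1", "publisher": "p1", "bid": 0.9},
    ]

store = []
for event in impression_stream():   # high-speed ingest loop
    store.append(event)

# Ad hoc analytics: impression counts and average bid per campaign
counts = Counter(e["campaign"] for e in store)
avg_bid = {c: sum(e["bid"] for e in store if e["campaign"] == c) / n
           for c, n in counts.items()}
print(counts["c1"], round(avg_bid["c1"], 3))
```

Because ingest and query hit the same store, there is no separate ETL hop between the stream and the analyst, which is what collapses the hardware footprint.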
Do you have some big data use cases you'd like to share with the DZone developer community?