
Database Use Cases

DZone's Guide to Database Use Cases

The most common use cases are in financial services and retail, and revolve around forecasting, risk mitigation, and improving customer experience.


To learn about the current and future state of databases, we spoke with and received insights from 19 IT professionals. We asked, "What are some real-world problems you, or your clients, are solving with databases?" Here’s what they shared with us:

Industry

  • Financial Services

    • We operate in the capital markets industry. Our use cases involve capturing high-frequency datasets for electronic trading, surveillance, and regulatory reporting, where you have to run analytics against trillions of data points. IoT and IIoT are also big, capturing data from sensors in manufacturing plants, helping customers optimize the manufacture of semiconductors, and doing predictive analytics on when a machine might fail so a company can take action before a failure occurs.
    • Quite a few organizations with large legacy on-prem datacenters are driving a rise in demand for cloud database support and all of the work required to move to the cloud. Large banks and retailers are pursuing cloud initiatives, with database changes that need to flow through a software pipeline and better use of cloud platforms.
    • Santander has banking systems running on mainframe hardware with COBOL databases, and they need to modernize their applications. We work with them on augmentation: building new services for applications and sharing data between the two systems to address customer requirements, adding more interactive capabilities without replacing the existing systems. Another example is Kaminos, cloud-native, cloud-agnostic financial banking software; we work on any cloud.
  • Retail

    • Customer journey and hyper-personalization for marketing across all industries: retail, financial services, oil and gas, telcos. The goal is to retain and acquire more customers. Instead of being purely transactional, these use cases get into behavioral aspects based on transactional and social data ("what triggers you to refinance with us?"). These are greenfield applications. Classical use cases like risk management, fraud detection, and inventory management are getting rebuilt with analytics context built right in. Current engines are unable to scale beyond 90 days' worth of data, while customers want one- and two-year views to make more informed decisions.
  • Other

    • One of our customers is collecting events and incidents in cloud databases, providing their customers with real-time actions/alerts/insights, and enabling companies to make better use of cloud technologies. We allowed them to save a lot of money doing so, and also opened up new use cases within their software. Another customer (in the packaging industry) is using our IoT Data Platform to increase the efficiency of their manufacturing production by providing real-time insights and alerts to the shop floor operators, who can then fix issues faster and more precisely.
    • 1) Genomics England: Working with the NHS, Genomics England is sequencing 100K genomes from patients with rare diseases and their families, as well as patients with common cancers. MongoDB provides the flexibility needed to store and analyze complex data sets together as the team seeks to deliver better diagnostic approaches for patients and new discoveries for doctors and scientists.

      2) Coinbase: As crypto and the blockchain continue to carve out real-world use cases (international payments, debt settlement, verification for sensitive systems such as voting, etc.), Coinbase has scaled its operation to handle the massive amount of data that can transfer during peak windows, requiring transactional guarantees at huge scale.
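
The predictive-maintenance use case above (acting before a machine fails) often starts with something as simple as flagging sensor readings that drift from their recent baseline. The following is a minimal sketch of that idea in plain Python; the window size, threshold, and sample values are illustrative assumptions, not values from any contributor:

```python
from collections import deque
from statistics import mean, stdev

def anomaly_alerts(readings, window=5, z_threshold=3.0):
    """Flag sensor readings that drift far from the recent rolling mean.

    `window` and `z_threshold` are illustrative tuning knobs; a real
    predictive-maintenance pipeline would learn them from history.
    """
    recent = deque(maxlen=window)
    alerts = []
    for i, value in enumerate(readings):
        if len(recent) == window:
            mu, sigma = mean(recent), stdev(recent)
            # Alert when the reading is more than z_threshold std-devs away.
            if sigma > 0 and abs(value - mu) / sigma > z_threshold:
                alerts.append((i, value))
        recent.append(value)
    return alerts

# A vibration series that is steady, then spikes before a "failure".
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 1.0, 9.5]
print(anomaly_alerts(vibration))  # → [(6, 9.5)]
```

In production this check would run against a stream of sensor events rather than a list, but the shape of the computation (rolling statistics, then an alert threshold) is the same.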


Application

  • Fraud/Risk

    • Our typical use cases are focused on the need for high performance at scale; we deliver value in every industry. With the latest version, we enable clusters to provide strong consistency, giving us an even stronger position in the finance industry and anywhere else it is essential that data is consistent. Here are just a couple:
      1) Market risk: A global Fortune 100 financial company needed to analyze trades in as near real-time as possible to mitigate risk faster than their competition and continue trading. They were not able to do this without Hazelcast. They could have gone with other IMDGs or spent time and money optimizing their application and backend database, but as the volumes increased and the business grew horizontally, they would have had to continuously fine-tune their environment with no guarantee of meeting the SLAs.

      By using our platform, they were able to use existing developer skills in .NET to very quickly deploy Hazelcast between their application and backend ORCL/Windows platform, enabling analysts to view 200 trades at a time, merge them with market metadata, and analyze risk accordingly, in near real-time, giving them a competitive advantage.

      2) Fraud detection: We provide a unique set of features that are ideally suited to unifying corporate risk data, making the needed analytics and reporting relatively easy, and doing all of this across a highly available, low-latency platform. The customer was exceeding transaction rate limits when applying fraud detection algorithms against customer data sitting on its relational database platform; the technical restriction caused them to break fraud detection SLAs and blocked new business acquisition. With an IMDG, they were able to identify false positives/negatives well within the SLA, enabling them to add additional rules and checks, resulting in much less fraudulent activity.
    • 1) Anadarko (since bought by Chevron) was trying to figure out how to be first to bid on land for extracting oil: $2,000 per acre versus $20,000 per acre. They modeled out basin data, going from 9 million to 90 billion data points, to run more realistic models and make better decisions in real-time.

      2) Retail: Ovo, a subsidiary in Indonesia, builds a 360-degree view of the customer to market across all retail channels. Ovo's payment platform aggregates different profiles for customers and, when they walk past a store, sends real-time promotions. Real-time engagement extends to vending machines that capture people walking by and people looking at the machine. They have 140 million customers.

      3) Real-time value-at-risk in financial services: banks and financial institutions look at portfolios to make decisions in the moment, running value-at-risk on an ongoing basis. Related use cases include insurance portfolio risk, and telecommunications network optimization and predictive analysis, such as deciding where to put infrastructure in a 5G roll-out to support the customer base. Putting graph capabilities into a GPU-accelerated database became important for shortest-path analysis.
  • CX

    • 1) Things like a 360-degree view of the customer, product, or portfolio. 2) Fraud detection and money laundering. 3) Recommendations. 4) Identity access management. 5) Risk. For technologists: 1) big, slow joins that bog down the database; 2) the complexity of modeling data for networks or deep hierarchies; and 3) dealing with fast-moving business requirements. Also real-time recommendation engines for products, jobs, next-best offers, and the optimal path for data.
    • 1) Our clients are using databases to operationalize larger data sets. One example is 'personalization.' For example, a large media company wants to show personalized news headlines on their mobile application, so that the news headlines you see on your mobile device might be slightly different than the news headlines your friend sees on her mobile device. The personalization is done by backend software that uses your recent browsing history, your location, and your demographic characteristics to determine how to rank news articles personalized for you. This query needs to do a set of filters, aggregations and sorting on gigabytes of clickstream events to produce this newsfeed in a few milliseconds.

      2) Another example of operationalizing large data sets using a database is to do real-time inventory management. A retailer is using a transactional database like Amazon DynamoDB to record the current state of its inventory. The changelog from DynamoDB continuously flows into an operational-analytics database. The data in the analytics database is only one or two seconds behind the transactional database.

      The retailer now makes complex SQL queries on the analytics database to figure out when to order new supplies to refill the inventory, or to uncover fraud in the retail orders as they occur. The reason the customer deploys these live analytics applications on the analytics database is that these queries do not impact the production transactional workloads. The analytics database also has historical information on transactions, which is used by analytics queries to perform active surveillance on longer-term trends.
    • Big ones are customer 360, IoT, real-time offers, ad targeting, security analytics, threat detection, fraud detection, risk engines, personalization, domain-specific analytics for service applications, sales analytics, sensor data, pricing, and analytics-as-a-service apps.
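
The personalization query described above (a set of filters, aggregations, and sorting over clickstream events to produce a ranked newsfeed) can be sketched in miniature. This is a toy in-memory stand-in, not any vendor's actual engine; the event fields (`user`, `topic`) and the click-count ranking are illustrative assumptions:

```python
from collections import Counter

def rank_headlines(headlines, click_events, user_id, top_n=3):
    """Rank headlines by how often this user clicked each topic.

    A toy stand-in for the filter/aggregate/sort query described in the
    article; field names (`user`, `topic`) are invented for illustration.
    """
    # Filter: keep only this user's clickstream events.
    mine = [e for e in click_events if e["user"] == user_id]
    # Aggregate: count clicks per topic.
    topic_clicks = Counter(e["topic"] for e in mine)
    # Sort: most-clicked topics first (stable for ties), then truncate.
    ranked = sorted(headlines, key=lambda h: -topic_clicks[h["topic"]])
    return [h["title"] for h in ranked[:top_n]]

events = [
    {"user": "u1", "topic": "sports"},
    {"user": "u1", "topic": "sports"},
    {"user": "u1", "topic": "tech"},
    {"user": "u2", "topic": "politics"},
]
headlines = [
    {"title": "Election update", "topic": "politics"},
    {"title": "New phone released", "topic": "tech"},
    {"title": "Cup final tonight", "topic": "sports"},
]
print(rank_headlines(headlines, events, "u1"))
```

The real system described in the article does this over gigabytes of events in milliseconds, which is precisely why it needs a database built for that workload rather than an in-process loop like this one.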

Other

  • Customers looking for document and relational databases are pleasantly surprised you can get both through MySQL, and even more pleasantly surprised when they find out that MySQL licensing is over 90 percent cheaper than SQL Server.
  • What customers do with Flow helps them integrate databases into the deployment process. We hook the integrations into the product to make them part of the overall pipeline: roll out the schema changes, then roll out the application changes, keeping an eye on the database changes to ensure they don't have a negative impact on current functionality. If there are problems, you have a handle on where the problems are.
  • Our customers range from small local counties to large Fortune 25 companies. They all have one goal in common: securely managing sensitive databases while at the same time providing the highest level of database availability. We can help our customers achieve that goal at the lowest deployment cost.
  • We’ve maximized our database design to ensure data aligns with customer needs and our support communication strategy/efforts.
  • A newer product category is the event log. Apache Kafka acts like a distributed log, and it's like a database because you can query it: it stores the data coming into the company as events. Over time, Apache Kafka is taking on features only databases would provide, like KSQL for writing queries against it. For databases themselves, one use case is migrating from commercial to open source, e.g., Oracle to PostgreSQL. Another is bringing MongoDB in and doing schema-on-read rather than schema-on-write: you still do the work, just further down the road.
  • A particularly interesting use case is how PubNub turned to a NoSQL database strategy (Apache Cassandra, managed by Instaclustr) to handle massive data volume. The data streaming company has scaled to manage 1.3 trillion messages per month and needs to meet a 99.999 percent uptime guarantee to customers. It's no small task. The open source strategy worked well in that it avoided vendor lock-in if requirements changed and was a good way to store time-series data. Having it managed as-a-service then helped PubNub ensure Cassandra expertise is available for continual database optimization, so PubNub can focus more on building its solution.
  • As companies are modernizing their infrastructure from data warehouses and going to the cloud, they need help migrating their on-prem legacy data warehouse to the cloud. As they modernize in the cloud, we see the undercurrent of more automation as part of the strategy. Automating to do more with existing resources in the cloud to address the backlog of business needs.
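
The schema-on-read approach mentioned above (as with MongoDB) can be illustrated with a small sketch: raw event payloads are stored as-is, and structure is applied only when reading, so older or differently shaped events keep working. The event shapes, field names, and defaults below are invented for illustration:

```python
import json

# Schema-on-write would validate these before storing; schema-on-read
# stores the raw payloads untouched and defers structure to query time.
raw_events = [
    '{"order_id": 1, "amount": 9.99, "currency": "USD"}',
    '{"order_id": 2, "amount": 5}',                  # older shape: no currency
    '{"id": 3, "total": 12.5, "currency": "EUR"}',   # different field names
]

def read_order(raw):
    """Apply a schema at read time, tolerating older event shapes."""
    doc = json.loads(raw)
    return {
        "order_id": doc.get("order_id", doc.get("id")),
        "amount": float(doc.get("amount", doc.get("total", 0.0))),
        "currency": doc.get("currency", "USD"),  # assumed default
    }

orders = [read_order(r) for r in raw_events]
print(orders[2])  # → {'order_id': 3, 'amount': 12.5, 'currency': 'EUR'}
```

The trade-off the contributor describes is visible here: ingestion stays trivial, but every reader must carry the normalization logic, so the work is deferred rather than eliminated.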

Here are the contributors of insight, knowledge, and experience:


Opinions expressed by DZone contributors are their own.
