
Databases: The Future


The future of databases is fluid.


To learn about the current and future state of databases, we spoke with and received insights from 19 IT professionals. We asked, "Where do you think the biggest opportunities are in the evolution of databases?" Here’s what they shared with us:

AI/ML

  • It will be interesting to see how artificial intelligence (AI) and machine learning (ML) impact the dynamic elements of data compilation and how they will evolve database strategy.
  • Continuing to push the limits of what can be done in real time versus precomputed: stateful processing in Apache streaming engines is an active area of work, and ML technologies are being inserted into real-time decision making. This is an exciting area for the industry to explore, and graph can enhance ML.
  • The edge and/or the fog will play an increasingly important role. There is so much data generated that moving and storing it in the cloud already has too much overhead. There need to be solutions that work on the edge and play well with the cloud. Databases need (and will get closer to) machine learning tools.
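The stateful, real-time processing mentioned above can be sketched in plain Python; the class name and running-average example are illustrative, not from the article, and real streaming engines add fault tolerance, partitioning, and much more:

```python
from collections import defaultdict

class KeyedRunningStats:
    """Minimal sketch of keyed, stateful stream processing: each
    incoming event updates per-key state (count and mean), so results
    are available in real time instead of being recomputed from history."""

    def __init__(self):
        self._count = defaultdict(int)
        self._mean = defaultdict(float)

    def process(self, key, value):
        # Incremental mean update: no need to store or rescan past events.
        self._count[key] += 1
        n = self._count[key]
        self._mean[key] += (value - self._mean[key]) / n
        return self._mean[key]

    def mean(self, key):
        return self._mean[key]

stats = KeyedRunningStats()
for sensor, reading in [("s1", 10.0), ("s1", 20.0), ("s2", 5.0)]:
    stats.process(sensor, reading)

print(stats.mean("s1"))  # 15.0
print(stats.mean("s2"))  # 5.0
```

The same update pattern is what keyed state in a streaming engine maintains per partition; here it is reduced to a single process for clarity.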

Autonomous

  • 1) The user experience around simplicity: managed services are taking over and becoming popular because they make access, management, and monitoring easy. 2) High-speed processing opportunities are growing; we want analytics built in, and the hybrid nature is important. 3) Autonomous databases are becoming the biggest opportunity.
  • Around autonomous databases: reducing the operational footprint in terms of security, management, maintainability, and the associated cost by making databases more automated and autonomous.
  • More automation on the administration side to reduce overhead for IT resources. Snowflake, for example, provides elasticity in these platforms, and platforms now provide connectivity to object stores through data tables. How will these platforms integrate with streaming data and Kafka, and what will the dominant architectures be within the modern infrastructure?


Other

  • The world of IoT is very exciting. IoT is now producing data volumes and sets of data we’ve never seen before. Because it is machine generated, it is highly structured, and the frequency is escalating all the time. The challenge is how to capture, ingest, and get value from IoT data.
  • Managed services around databases shed responsibility to someone else, the same way moving to public cloud providers sheds responsibility for power and cooling. This is a huge benefit for companies that are heavy database users without many DBAs. We also expect heavier use of the ELK stack and other non-RDBMS tools.
  • The challenge of treating database code as application code is that databases are inherently stateful systems. Managing changes is hard, especially when using container and virtualization technology to move to ephemeral environments. In the future, a lot of this work will shift further left: the ability to more rapidly retry the change you are going to make to your stateful system, i.e., the database. Organizations that release more quickly do more rework to their database changes; they’ve found a way to do that rework faster and more easily.
  • Database storage and access will continue to be a great area of opportunity. We will see more quantum technologies adopted here.
  • Products will continue to evolve as each product borrows ideas from others. We’ll see a convergence of product functionality. A lot of developer tooling is converging, and the software tools are ahead of the data; I see this maturing over the next couple of years. There is potential for some databases to bring event logging functionality onboard.
  • One opportunity is leveraging other technologies (Kubernetes is a prime example) to reduce the amount of database management and operations overhead involved. For example, we built an open-source, freely available Cassandra operator to more seamlessly run and operate Cassandra within Kubernetes. There will continue to be new databases and optimizations to existing processes, but working on the ecosystem that surrounds all of that, so that those database solutions are easier to manage, is a big opportunity in this part of the industry.
  • There is a risk of a new wave of lock-in with hyperscale cloud providers. Many organizations have just freed themselves from the lock-in of proprietary database vendors but now face repeating the exact same plot with proprietary cloud databases. Security is another concern. Data is the lifeblood of all organizations and it’s becoming more and more valuable. That makes it a honeypot for malicious actors, and enterprises must make securing their data a top priority with fine-grained access controls, encryption everywhere, and complete audit logging.
  • 1) Traditional databases are schema-on-write. They have a fixed schema, which means that the number of columns in a database table is fixed and each column has a fixed type. If a column is configured to store integer values, its type is INTEGER, and if a developer tries to insert a value that is not an integer into this field, the insert will fail. Similarly, the maximum number of columns in a traditional database is small: Postgres supports a maximum of 1,600 columns, while Oracle supports 1,000. These constraints are required so that database queries remain fast and efficient. 2) For a modern database, the need of the hour is flexibility with appropriate tradeoffs between efficiency and flexibility. Can we build a database that is schema-on-read while still allowing standard ANSI SQL queries? Can we build a database that supports hundreds of thousands of columns in a table? Can we have a database that stores multi-valued types in the same column? These are the opportunities and challenges for the current generation of database developers!
  • 1) I think more companies will use real-time data to make critical business decisions. That will pave the way for more innovations and allow for accurate decision-making and forecasts. The need to perform “windowing” analytics with a simplified technology architecture, enabled by in-memory technology, is seeing increasing adoption. There is also a demand to support operational workloads that incorporate real-time analysis, such as recommendations, targeting, and fraud analysis. 2) In-memory is a key technological enabler of hybrid transactional/analytical databases, which also provide the added benefit of architectural simplicity: one system to maintain with no data movement. 3) Data-driven decisions may prove popular by removing the human factor. Companies will use real-time analytics and enhanced BI reporting to build a data-driven culture where machines advise us what to do. 4) Leading database vendors are actively building automation for common DBA tasks: performance tuning and monitoring, disaster recovery, high availability, low latency, and auto-scaling based on historical workloads. 5) Quantum computing may one day realize its theoretical potential and could have an enormous impact on all areas of computing, databases included. I’ve read articles about quantum transactions, quantum search, and even a quantum query language!
  • Shifting to streaming data and building customer analytical applications allows for disruption across different industries. The challenge is how to normalize and create an off-the-shelf platform that allows creativity for new solutions, and to infuse data into the business.
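The “database code as application code” point above can be made concrete with a minimal migration runner. The table name and migration list here are hypothetical, and production tools such as Flyway or Liquibase do far more; this is only a sketch of the versioned, rerunnable pattern that lets database changes shift left:

```python
import sqlite3

# Hypothetical ordered list of schema changes, versioned like application code.
MIGRATIONS = [
    (1, "CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)"),
    (2, "ALTER TABLE users ADD COLUMN email TEXT"),
]

def migrate(conn):
    """Apply any migrations not yet recorded, so the same script is
    safe to rerun against ephemeral or long-lived databases alike."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS schema_version (version INTEGER PRIMARY KEY)"
    )
    applied = {v for (v,) in conn.execute("SELECT version FROM schema_version")}
    for version, sql in MIGRATIONS:
        if version not in applied:
            conn.execute(sql)
            conn.execute("INSERT INTO schema_version VALUES (?)", (version,))
    conn.commit()

conn = sqlite3.connect(":memory:")
migrate(conn)
migrate(conn)  # idempotent: rerunning applies nothing new
cols = [row[1] for row in conn.execute("PRAGMA table_info(users)")]
print(cols)  # ['id', 'name', 'email']
```

Because the runner records what it has applied, the same code path works in a throwaway container and in production, which is exactly the rework-friendly loop the contributor describes.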
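The schema-on-read idea in the flexible-databases bullet can be illustrated in a few lines of Python; the records and the `select` helper are invented for the example, and a real engine would compile this into an ANSI SQL layer:

```python
import json

# Heterogeneous records with no fixed column set, as in a schema-on-read store.
raw = [
    '{"id": 1, "name": "alice", "age": 30}',
    '{"id": 2, "name": "bob", "tags": ["admin", "ops"]}',
]

def select(rows, columns):
    """Apply a schema at query time: project the requested columns,
    filling missing ones with None instead of rejecting the row,
    which a schema-on-write INSERT would have done."""
    return [tuple(row.get(c) for c in columns) for row in rows]

records = [json.loads(line) for line in raw]
print(select(records, ["id", "age"]))   # [(1, 30), (2, None)]
print(select(records, ["id", "tags"]))  # [(1, None), (2, ['admin', 'ops'])]
```

The tradeoff the contributor raises is visible even here: flexibility at read time costs the query planner the fixed types and column counts that make schema-on-write queries fast.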
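The “windowing” analytics mentioned for real-time operational workloads can be sketched as a tumbling-window aggregation; the event times, amounts (in cents), and 60-second window are made up for illustration:

```python
from collections import defaultdict

def tumbling_window_sums(events, window_seconds):
    """Group (timestamp, value) events into fixed, non-overlapping time
    windows and sum the values in each, as a streaming engine would for
    real-time dashboards, targeting, or fraud scoring."""
    sums = defaultdict(int)
    for ts, value in events:
        # Align each event to the start of its window.
        window_start = ts - (ts % window_seconds)
        sums[window_start] += value
    return dict(sums)

# Purchases (second offset, amount in cents) bucketed into 60-second windows.
events = [(1, 999), (5, 450), (61, 2000), (62, 125)]
print(tumbling_window_sums(events, 60))  # {0: 1449, 60: 2125}
```

An in-memory hybrid transactional/analytical system effectively keeps these window aggregates alongside the operational rows, which is the single-system simplicity the contributor highlights.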



Opinions expressed by DZone contributors are their own.
