2020 Performance and Security Predictions
Three industry experts predict what the security ecosystem will look like and how companies will measure performance and efficiency in 2020.
Surely, 2019 was the year we finally put an end to massive security breaches. Right?
Of course not. Barely a month went by without some news story breaking of dozens, or even hundreds, of millions of records being accessed by bad actors.
In contrast, the APM ecosystem quietly hummed along in the background. In the DevOps realm, focus on test automation has quietly surged (at least if our readers' habits are any indication), and everyone seems increasingly intent on getting rid of manual tasks and testing.
With all of that in mind, we asked a few of our contacts what they thought was up next for the performance and security worlds.
- Paul Ponzeka, CTO, Abacus: Support and security: Software vendors are getting increasingly strict about the lifecycle of their deployed software. This means that if a client wants support, they are forced either to stay on a version that by design will lack new features and productivity improvements, or to move to a software release model that requires upgrading every three months or so to remain supported. The most troubling driver, leaving clients scrambling to migrate their workloads, is security patching.
- Anurag Kahol, CTO and co-founder, Bitglass: Cloud database disasters: Misconfigurations of cloud databases will continue to plague enterprises around the world and will be a leading cause of data breaches in 2020. Gartner forecasts that global public cloud revenue will reach $249.8 billion in 2020, a 16.6% increase from 2019. This rapid rise in revenue is spurred by continued growth in cloud adoption. However, cloud adoption is clearly outpacing the adoption of the tools and expertise needed to properly protect data in cloud environments; this is supported by the fact that 99% of cloud security failures will be the customer’s fault through 2025, according to Gartner. Consequently, misconfigurations will continue to be a leading cause of data leakage across all verticals.
In addition to the above, highly niche cloud tools provided by second-tier cloud service providers are making their way into enterprises. While services that cater specifically to individual industries or company departments are gaining traction, they do not typically have the same native security measures that mainstream cloud services do. Regardless, companies are gaining confidence (even if it is a false sense of confidence) in their ability to utilize the cloud and are adopting these second-tier and long-tail cloud apps without considering all of the security ramifications. Enterprises will need visibility into, and control over, their entire cloud footprint, including niche services, in order to proactively mitigate vulnerabilities and properly secure data in the cloud.
- Tomer Shiran, CEO and co-founder, Dremio: Goodbye performance benchmarks, hello efficiency benchmarks: Escalating public cloud costs have forced enterprises to re-prioritize the evaluation criteria for their cloud services, with higher efficiency and lower costs now front and center. The highly elastic nature of the public cloud means that cloud services can (but don't always) release resources when not in use. And services that deliver the same unit of work with higher performance are in effect more efficient and cost less. In the on-premises world of over-provisioned assets, such gains are hard to reclaim. But in the public cloud, time really is money. This has created a new battleground where cloud services are competing on the dimension of service efficiency to achieve the lowest cost per unit of compute, and 2020 will see that battle heat up.
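The misconfiguration problem Kahol describes is usually a matter of a few overlooked settings. As a minimal sketch of what an automated audit might check, the rules, field names, and bucket configurations below are illustrative assumptions, not any cloud provider's real API:

```python
# Hypothetical sketch: scanning storage-bucket configurations for the
# common misconfigurations discussed above. Field names and rules are
# made up for illustration; a real audit would use the provider's API.

PUBLIC_ACL = "public-read"

def audit_bucket(config: dict) -> list:
    """Return a list of findings for a single bucket configuration."""
    findings = []
    if config.get("acl") == PUBLIC_ACL:
        findings.append("bucket is publicly readable")
    if not config.get("encryption_at_rest", False):
        findings.append("encryption at rest is disabled")
    if not config.get("access_logging", False):
        findings.append("access logging is disabled")
    return findings

if __name__ == "__main__":
    buckets = {
        "customer-exports": {"acl": PUBLIC_ACL, "encryption_at_rest": False},
        "internal-backups": {"acl": "private", "encryption_at_rest": True,
                             "access_logging": True},
    }
    for name, cfg in buckets.items():
        for finding in audit_bucket(cfg):
            print(f"{name}: {finding}")
```

Tools in the cloud security posture management space apply essentially this pattern, continuously, across every service in an enterprise's footprint.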
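Shiran's efficiency benchmark boils down to a simple metric: cost per unit of work rather than raw speed. A quick sketch of the arithmetic, with made-up hourly rates and per-query timings, shows why a pricier but faster service can still win:

```python
# Illustrative efficiency-benchmark arithmetic: compare cloud services
# by cost per unit of work, not raw performance. Prices and timings
# below are invented for the example.

def cost_per_unit(hourly_rate: float, seconds_per_unit: float) -> float:
    """Cost of one unit of work on a service billed by the hour,
    assuming elastic billing (you pay only while the work runs)."""
    return hourly_rate * (seconds_per_unit / 3600.0)

# Service A: higher hourly rate, but finishes each query faster.
a = cost_per_unit(hourly_rate=4.00, seconds_per_unit=2.0)
# Service B: cheaper per hour, but slower per query.
b = cost_per_unit(hourly_rate=2.50, seconds_per_unit=4.0)

print(f"Service A: ${a:.6f} per query")
print(f"Service B: ${b:.6f} per query")
```

Here Service A costs less per query despite the higher rate, because elastic billing lets it release resources as soon as the work is done; on over-provisioned on-premises hardware, that saving never materializes.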
Opinions expressed by DZone contributors are their own.