Did Amazon Just Kill Open Source?
Just like the trend outside of tech, convenience seems to be trumping forethought. As AWS rolls out more and more fully integrated tools, what's happening to open source?
After re:Invent, it's clear that Amazon is unstoppable. AWS announced more products than ever, all fully integrated and simple to use. And if you thought infrastructure companies were its only competition, think again: the new Amazon offerings compete with established database vendors, the open-source Big Data and container ecosystems, security software, and even developer and APM tools.
Open source is a key ingredient, but Amazon seems to prove that usability and integration are more important to many customers than access to an endless variety of overlapping open source projects.
It is interesting to watch Amazon bash the open source ecosystem and highlight the advantages of its own tools, while at the same time taking a project like Presto, developed in the open by Facebook, and turning it into a packaged, revenue-generating product (the newly announced Athena service).
This should be a wake-up call for the tech and software industry!
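To see how directly the open engine maps onto the paid service, here is a minimal sketch of submitting the kind of Presto-style SQL that Athena runs, using the boto3 client. The region, database, table, and results bucket are hypothetical placeholders, not values from this article.

```python
import boto3

# A minimal sketch: Athena executes Presto-style SQL over data in S3.
# "weblogs", "logs", and the results bucket below are hypothetical.
athena = boto3.client("athena", region_name="us-east-1")

response = athena.start_query_execution(
    QueryString="SELECT status, COUNT(*) AS hits FROM logs GROUP BY status",
    QueryExecutionContext={"Database": "weblogs"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)

# Each query becomes a metered API call against the managed service.
print(response["QueryExecutionId"])
```

The same SELECT would run unchanged on an open Presto cluster; what Amazon sells is the packaging, metering, and integration around it.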
Back in the old days, we used to focus on creating modular architectures. We had standard wire protocols like NFS and RPC, and standard API layers like BSD and POSIX. Those were fun days: you could buy products from different vendors, and they actually worked well together and were interchangeable. There were always open source implementations of the standard fare, but people could also build commercial variations to extend functionality or durability.
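A small illustration of what that standardization bought us: the POSIX file API looks the same to an application regardless of who implements the storage underneath. The mount point below is hypothetical.

```python
import os

# A sketch of POSIX portability: these calls behave identically whether
# /mnt/data (a hypothetical mount point) is a local filesystem, an NFS
# share, or a commercial storage product that implements the standard.
fd = os.open("/mnt/data/example.txt", os.O_CREAT | os.O_WRONLY, 0o644)
os.write(fd, b"written through a standard, vendor-neutral interface\n")
os.close(fd)
```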
The most successful open source project is Linux, and we tend to forget that it has very strict APIs and layers. New kernel implementations must often be backed by official standards (USB, SCSI…). Yet open source and commercial implementations live happily side by side in Linux.
If we contrast Linux with the state of open source today, we see countless implementations that overlap. Take the Big Data ecosystem as an example: in most cases there are no standard APIs or layers, let alone standard wire protocols. Projects are not interchangeable, which causes far worse lock-in than commercial products that conform to a common standard.
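To make the contrast concrete, compare sending the same event to Kafka and to Amazon Kinesis: two popular streaming systems with no shared API or wire protocol, so swapping one for the other means rewriting client code. The broker address and stream names are hypothetical.

```python
import boto3
from kafka import KafkaProducer  # kafka-python client

event = b'{"user": "alice", "action": "login"}'

# Kafka: its own client library and wire protocol.
producer = KafkaProducer(bootstrap_servers="localhost:9092")  # hypothetical broker
producer.send("events", value=event)
producer.flush()

# Kinesis: a different client, call shape, and partitioning model
# for exactly the same logical operation.
kinesis = boto3.client("kinesis", region_name="us-east-1")
kinesis.put_record(StreamName="events", Data=event, PartitionKey="alice")
```

Neither call can be pointed at the other system. Contrast that with NFS, where any conforming server would do.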
How Did We Get Here?
The tech industry is going through a monumental change driven by digital transformation. This changes the infrastructure and software stack dramatically. The old guard is in survival mode, and we seem to be missing responsible tech leadership that will define and build a modular stack for the new age. Strong players like Amazon and Azure are building their own fully integrated offerings, so the rest of us need to exercise responsibility and work together with a focus on integration, not code.
We don't need 20 more Apache projects that do the same thing, just slightly better. We don't need 10 more open source container management platforms, and we can't turn poorly architected frameworks into a de facto standard. We need to start by defining the layers and components in the new stack, followed by APIs, protocols, and common management paradigms. We should work to make existing projects and products fit into this new model, while adding better ones.
That's the only way to get back to a decent user experience, one in which we can easily build, secure, and operate integrated stacks from independent components, with the ability to swap parts if need be, without being locked into project-specific or cloud-provider APIs. If we don't, we will all lose and end up technically enslaved by the cloud.
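What would that look like in practice? One sketch, under stated assumptions: applications code against an agreed-upon layer, and projects or vendors compete on implementations behind it. The ObjectStore interface and the in-memory backend here are hypothetical, purely to illustrate the shape.

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """A hypothetical standard layer: applications depend on this,
    never on a specific project's or cloud provider's SDK."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """One interchangeable implementation; an S3-backed or Ceph-backed
    store would plug into the same slot without touching callers."""

    def __init__(self):
        self._items = {}

    def put(self, key, data):
        self._items[key] = data

    def get(self, key):
        return self._items[key]

def archive_report(store: ObjectStore, key: str, payload: bytes) -> None:
    # Application logic sees only the standard layer and can swap backends.
    store.put(key, payload)

archive_report(InMemoryStore(), "report-q4", b"...")
```

The design choice is the point: the interface, not any one implementation, is what the community would standardize.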
Published at DZone with permission of Yaron Haviv, DZone MVB.