Continuous Integration in the cloud: good idea?


Continuous Integration can be tricky to provision. It’s IO- or CPU-bound at the start of a build, then has a tendency to batter your database for a long time while otherwise staying almost idle. Slava Imeshev of Viewtier kindly commented on my outsourcing continuous integration post:

My take on this is that hosted CI in a common virtualized environment such as EC2 won’t work. A CI or build server, unlike most applications, needs all four components of a build box: CPU, RAM, disk, and network I/O. Industry wisdom says that applications which are good candidates for virtualization may demand at most two. Sure, you can run a build in EC2, but you will have to sacrifice build speed, and that’s usually the last thing you want to do. If you want fast builds, you have to run in the opposite direction, towards a dedicated, big fat box hosted locally.

Viewtier has been hosting Continuous Integration for open source projects for five years, and our experience shows that even builds on a dedicated build box begin to slow down once the number of long-running builds exceeds double the number of CPUs. In fact, we observe a trend towards farms of build machines hosted locally.

Hmm. He makes an interesting point. It seems we might need to do more than throw EC2 agents at our favourite Continuous Integration servers. The great appeal of cloud Continuous Integration is that there’s no limit to the amount of resources we can buy, but there’s an implicit assumption that we can actually make use of them. I’m wondering what patterns will emerge to deal with that. Will we fire off many builds that compile and run unit tests (proper unit tests, without database calls) on EC2, and then queue them elsewhere for slow functional test runs?
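That split could be sketched as a two-tier pipeline: fast unit-test builds fan out across elastic agents, while slow functional runs queue up for a single dedicated box. This is a minimal Python sketch of the idea, not a real CI server; the names (`unit_build`, `functional_queue`) and the sleep standing in for build work are my own assumptions.

```python
import queue
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical hand-off queue: commits that passed the fast stage wait
# here for the slow functional-test stage on a dedicated machine.
functional_queue = queue.Queue()

def unit_build(commit):
    # Fast stage: compile + unit tests (no database calls), cheap to
    # parallelize across many cloud agents. time.sleep stands in for work.
    time.sleep(0.01)
    functional_queue.put(commit)  # hand off to the slow stage
    return (commit, "unit: passed")

def drain_functional_queue():
    # Slow stage: functional tests run one at a time on the dedicated box.
    results = []
    while not functional_queue.empty():
        commit = functional_queue.get()
        results.append((commit, "functional: passed"))
    return results

commits = [f"abc{n}" for n in range(4)]
with ThreadPoolExecutor(max_workers=4) as cloud_agents:
    unit_results = list(cloud_agents.map(unit_build, commits))
functional_results = drain_functional_queue()
```

The point of the shape, rather than the toy code, is that the elastic tier only runs work that actually parallelizes well, and the scarce dedicated hardware never sits behind cheap jobs.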

Ideally we’d get more effective at writing tests; I suspect that parallelizing test execution via the cloud could instead let us sweep test performance issues under the rug. We’ll just have to wait and see.
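The rug-sweeping effect is easy to demonstrate: with enough parallel agents, wall-clock time collapses to roughly the slowest single test, even while the total compute cost keeps growing. A small sketch, with made-up test durations:

```python
import time
from concurrent.futures import ThreadPoolExecutor

# Hypothetical per-test durations (seconds); one test is pathologically slow.
test_durations = [0.05, 0.05, 0.05, 0.5]

def run_test(duration):
    time.sleep(duration)  # stands in for actual test execution
    return duration

start = time.monotonic()
with ThreadPoolExecutor(max_workers=len(test_durations)) as pool:
    # Total "CPU" seconds paid for is the sum of all test durations...
    cpu_seconds = sum(pool.map(run_test, test_durations))
# ...but wall-clock time is roughly just the slowest test, so the suite
# *feels* fast while the bill (and the underlying problem) keeps growing.
wall_seconds = time.monotonic() - start
```

Parallelism hides the slow test from the developer’s feedback loop without making it any cheaper to run, which is exactly the performance issue being swept under the rug.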


