At the end of 2016, we surveyed nearly 400 IT professionals about the daily challenges they face when developing new applications. We found that legacy testing processes no longer cut it for DevOps and microservices app development. In fact, a staggering 60% of development team members spend up to half their day debugging errors instead of building new features that add end-user value.
Why is debugging becoming such a huge resource drain? Testing is a crucial part of an application’s lifecycle, but it’s inherently challenging to ensure that tests run in development will mirror what happens in production. When we asked respondents what most often leads to bugs appearing in production, they cited:
- Inability to fully recreate production environments in testing: 33%.
- Dependence on external systems, making integration testing difficult: 27%.
- Testing against unrealistic data before moving into production: 26%.
- Difficulty sharing test data across different teams: 10%.
- Difficulty creating staging environments for testing: 4%.