
Why Unit Tests Should Be Avoided For Web Applications


This developer argues that unit tests have become much less useful for web applications, which are no longer focused purely on computation.


Here's why you may not want to use unit tests for your web apps.

If you are a mid- or senior-level developer, by now you have already learned why you should write unit tests and how they help make your web application robust.

However, I believe that unit test cases are hidden anti-patterns for modern web applications.

I realize that my statement goes against the common perception of the software industry, but if you read the whole article, I hope you will agree with me.

This argument applies only to web applications where we write APIs to perform CRUD operations against a backend.

Drawbacks of Unit Tests in Web Applications

Ineffective Due to Excessive Object Mocking

Unit tests have been around since the early days of programming, and that is why they still command huge respect in the developer community.

We need to understand that things have changed. Earlier, application architectures were mostly procedural or functional, and programs were mostly used for computation. There was little dependency on databases and other services, and code consisted of small pieces of logic that you could test easily.

Now we have any number of dependencies on other services and databases, and very few methods contain logic that can (and should) be tested as a separate unit.

To write unit tests, we need to mock the service and database dependencies. Once you mock parts of your code, you will never get absolute confidence that the code will work in production.
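A minimal sketch of the problem, with hypothetical names (`get_user_handler`, `repository.find_user` are illustrations, not any real framework's API): the handler is little more than a pass-through to the data layer, so once the repository is mocked, the test can only confirm that the handler echoes what the mock returns.

```python
from unittest.mock import Mock

def get_user_handler(user_id, repository):
    # Almost all the real risk lives inside repository.find_user
    # (SQL, connections, serialization) -- none of it runs here.
    user = repository.find_user(user_id)
    if user is None:
        return {"status": 404}
    return {"status": 200, "body": user}

# The "unit test": the repository is a Mock, so the assertion only
# verifies that the handler forwards whatever the mock returns.
repo = Mock()
repo.find_user.return_value = {"id": 1, "name": "Ada"}
assert get_user_handler(1, repo) == {"status": 200, "body": {"id": 1, "name": "Ada"}}
# This test passes even if the real find_user query is broken in production.
```

The test is green regardless of whether the actual query, connection handling, or serialization works, which is exactly the false confidence described above.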


Obsession With Test Coverage Tools

There are many tools that measure the test coverage of code. Companies are obsessed with the percentage of code covered by tests, and this leads to writing unit tests for code that does not even need them (like getters and setters). The quantity of test cases does not define their quality.

It's a bad idea to equate coverage percentage with code quality. I have often seen tests cover 90% of the code while being written in a way that misses important scenarios. Sometimes developers mocked the very response object they were asserting on, because the method contained a simple DB operation and they had to write a test for it to meet the coverage standard.
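The circular pattern described above can be sketched as follows (the `save_order` method and `db` object are hypothetical): every line of the method executes, so the coverage tool reports 100%, yet the assertion compares against a value the test itself configured.

```python
from unittest.mock import Mock

def save_order(order, db):
    # One DB write and one read -- no real logic to verify.
    db.insert("orders", order)
    return db.fetch_last("orders")

# This test reports full line coverage of save_order, yet it merely
# asserts the value it configured on the mock one line earlier.
db = Mock()
db.fetch_last.return_value = {"id": 42}
assert save_order({"item": "book"}, db) == {"id": 42}  # circular assertion
```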

Unnecessary Layer of Testing

We have other kinds of tests, such as integration tests, which cover the same scenarios in a better and more sensible way; this makes unit tests an unnecessary extra layer for web applications.

One could argue that unit tests validate small blocks, so a test failure pinpoints which block is causing the issue and makes debugging easy.

The point here is that languages have evolved, and the exceptions in your logs take only minutes to trace to the failing code. Even when the logs are poor, well-written integration/feature tests make it easy to identify which block is causing the issue.
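As a hedged sketch of the integration-style alternative (the `create_user` function and table schema are made up for illustration), the test below swaps the mock for an in-memory SQLite database, so the real `INSERT` and `SELECT` actually execute; a typo in either statement fails loudly with a traceback at that exact line.

```python
import sqlite3

def create_user(conn, name):
    # The real SQL runs against SQLite, so a broken statement cannot
    # hide behind a mocked return value.
    conn.execute("INSERT INTO users (name) VALUES (?)", (name,))
    row = conn.execute(
        "SELECT id, name FROM users WHERE name = ?", (name,)
    ).fetchone()
    return {"id": row[0], "name": row[1]}

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
user = create_user(conn, "Ada")
assert user == {"id": 1, "name": "Ada"}
```

An in-memory database keeps the test fast while still exercising the real query path, which is the layer the mocked unit test skipped entirely.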

Repetitive Effort

As already mentioned, there are several layers of testing, so we write overlapping test cases at different levels. A developer first writes unit tests, then integration tests; then, if there is a testing team, they write their own tests on top. It seems fine at first blush, but consider how much repetitive effort goes into testing the same thing. Worse, when we modify a feature, we have to update the same behavior in several places, making the test suite harder to maintain.

Summary

Languages, design patterns, and application architectures are evolving over time, and we should change with them.

Unit tests are the best option even now if you are writing methods/functions which are doing computations and are not dependent on other services and databases.

Web applications are the opposite of this: most of the code either fetches data from or sends data to other services or databases. These kinds of methods are not good candidates for unit testing; they are better covered by integration tests.

I am not saying that we should never write unit tests for web applications. We should identify which parts of the code follow functional programming or perform pure computation, and write unit tests for those parts and those only; otherwise we will end up writing garbage.
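For contrast, here is the kind of code that still deserves a unit test under this argument (a hypothetical pricing helper): pure computation, deterministic, no I/O, and so trivially testable in isolation.

```python
def apply_discount(total_cents, percent):
    # Deterministic, no external dependencies: a classic unit-test candidate.
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return total_cents - (total_cents * percent) // 100

assert apply_discount(10_000, 15) == 8_500
assert apply_discount(999, 100) == 0
```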

Further Reading

Why Do Programmers Fail to Write Good Unit Tests?

Unit Testing Best Practices: How to Get the Most Out of Your Test Automation


Opinions expressed by DZone contributors are their own.
