Just wondering about the pros and cons of TDD/automated unit testing, and looking for the community's view: is it acceptable for professional developers to write applications without supporting unit tests?
To test what?
If you’re talking about 100% code coverage, it’s extremely rare, not useful and often impossible. In the same way, testing CRUD-related code is a waste of time, and it would be highly unprofessional to spend hours writing code you don’t need instead of doing something actually useful.
Now, as a developer, you have to know how to write unit tests, and where you need them.
Theory
There’s a common misconception that unit tests are for testing “units”.
Unit tests, like all other tests, test functionality. There is simply nothing else that can be tested.
However, the functionality of a complex system often can't be tested effectively as a whole.
Say a user presses a "Delete" button and nothing happens. Why? The error may be in the database, the connection may be broken, the core logic may malfunction, or the operation may even have succeeded while the interface failed to update. Each layer may contain many functions that call one another, and it is not always clear where the error is.
Unit tests are based on the paradigm of separating components and testing them individually.
Once again, it does not guarantee the whole system will work, but it simplifies testing.
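Testing a component in isolation can be sketched with a small hypothetical example. Here, a service's delete logic is exercised with the repository replaced by a mock, so the test does not depend on a real database; the `OrderService` and repository names are invented for illustration:

```python
from unittest.mock import Mock

class OrderService:
    """Hypothetical service layer: delete_order depends on a repository."""
    def __init__(self, repository):
        self.repository = repository

    def delete_order(self, order_id):
        if not self.repository.exists(order_id):
            return False  # nothing to delete
        self.repository.delete(order_id)
        return True

# Unit tests: the real database is replaced by a mock, so only the
# service's own logic is exercised, in isolation from other layers.
def test_delete_existing_order():
    repo = Mock()
    repo.exists.return_value = True
    assert OrderService(repo).delete_order(42) is True
    repo.delete.assert_called_once_with(42)

def test_delete_missing_order():
    repo = Mock()
    repo.exists.return_value = False
    assert OrderService(repo).delete_order(42) is False
    repo.delete.assert_not_called()

test_delete_existing_order()
test_delete_missing_order()
```

If one of these tests fails, the problem is known to be in the service's own logic rather than somewhere in the database or the UI, which is exactly the simplification the paradigm promises.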
TDD is not a requirement in itself. It simplifies coding by forcing the developer to answer, before writing anything, "what am I actually going to code?".
Costs
Many think that by not writing tests, they save time. Wrong. Thinking strategically, the cost of an error increases exponentially with the time between its introduction and its detection.
Say you make an error in your code and detect and fix it the same day. The cost is $X.
Let's suppose the error remains undetected and goes into the weekly internal build. Fortunately, QA detects it, raises a bug in the tracker, the issue becomes the subject of a 5-minute discussion in a meeting of 5 people, and so on. Finally, you fix it. The cost is the sum of everyone's time, perhaps $3X in this case.
What if the error goes to beta testing and involves more people? Let's say $10X.
If it stays undetected for a long time and ships to 1,000 customers (hopefully not to a million!), 10 of them hit it, they open discussions with your support staff, some may call your boss, your teammates attempt to reproduce it, and so on. Finally, the bug comes back to you, and you fix it. How much, in total? Well more than $50X.
As you see, a bug will sooner or later come back to you (or your colleague). The only difference is when it happens and how much it would cost.
Unit tests shorten the lifecycle of bugs and therefore reduce costs.
Pros
- Unit tests help the developer write better code. As simple as that.
- They save time.
- They decrease costs.
- Unit tests live alongside the code they test. When a change request comes in (and they come in all the time), the tests adapt with it.
Cons
I can see a single excuse not to write tests: you are writing a prototype, i.e. something that will never go to other people, or something intended for a single use.
Sure, it is acceptable not to write unit tests for little internal helper utilities, test tools, or scenarios where the business genuinely needs something other than quality, and you, as a professional developer, find you can get the software done and working just as quickly without them.
In my experience, 95% of the errors that can be caught by unit tests come from calls to the data layer, especially after database design changes. If you’re using a database, just put a test over every method you use to access it. The tests don’t even need to be elaborate, just sanity checks.
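Such a sanity check over a data-access method doesn't have to be elaborate. A minimal sketch, using Python with an in-memory SQLite database (the `get_user` helper and the schema are invented for illustration):

```python
import sqlite3

def get_user(conn, user_id):
    """Hypothetical data-access method under test."""
    row = conn.execute(
        "SELECT id, name FROM users WHERE id = ?", (user_id,)
    ).fetchone()
    return {"id": row[0], "name": row[1]} if row else None

# Sanity check: run the query against a throwaway in-memory database
# so a schema change (renamed column, dropped table) fails the test
# immediately instead of surfacing after deployment.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER PRIMARY KEY, name TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice')")

assert get_user(conn, 1) == {"id": 1, "name": "alice"}
assert get_user(conn, 99) is None
```

The point is only that every SQL statement gets executed at least once against the current schema; asserting one known row and one missing row is usually enough to catch the "database design changed" class of bug.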
In answer to your question: if you access a database and you are a professional developer, then you should use unit tests. Otherwise, it depends.
It really depends on how the application will be used. Is this a mission-critical application? Or is it a simple verification application only used internally by developers? When working on the core applications for your company, there should be as many unit tests as necessary to ensure the business rules are covered. Depending on your company, those are the applications customers see, and bugs could cost money. When done well, unit tests can be a big help in ensuring your applications will work when deployed. But they are not a cure-all, and human testing should still be done. Simple internal (or ancillary) applications don't need unit tests, but should still be written well.
TDD is not just about writing tests first; it is about making sure your code is easily testable. More often than not, easily testable code is also easier to read, debug, and maintain, and it tends to follow patterns and OO principles such as SOLID. I would argue that, as a professional developer, your code should always be written that way.
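A small sketch of what "easily testable" means in practice: compare a function that reaches for a hard-coded dependency with one that receives it as a parameter (function names invented for illustration):

```python
import datetime

# Hard to test: the clock is hard-coded inside the function, so the
# result depends on when the test happens to run.
def greeting_hardwired():
    hour = datetime.datetime.now().hour
    return "Good morning" if hour < 12 else "Good afternoon"

# Easy to test: the dependency is injected, so a test can pass a
# fixed timestamp and get a deterministic answer.
def greeting(now):
    return "Good morning" if now.hour < 12 else "Good afternoon"

assert greeting(datetime.datetime(2024, 1, 1, 9, 0)) == "Good morning"
assert greeting(datetime.datetime(2024, 1, 1, 15, 0)) == "Good afternoon"
```

The injected version is also the one that follows the dependency-inversion spirit of SOLID: the caller decides where the time comes from, and the logic stays pure and checkable.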
You definitely do not want "no testing". If you write unit tests, you have at least some assurance that your code matches your tests (although you'd need to ensure that your tests match your specification).
You’re not done if all you have is unit tests, though. You probably still need to do integration tests and end-to-end tests (and over time accumulate test cases to catch bug regressions).
I'm going to go out on a limb here and say it's mostly subjective and depends on your goals. tl;dr: It's good to do, but being dogmatic about it is just going to lead to more problems.
TDD/unit tests are going to improve the stability of your code. They make it easier to make changes without knowing the code base really well, they let you refactor faster, and they let you be sure you're not doing something silly. Unit tests can also be a waste of time, time that could be spent writing code. Followed blindly, they can also fool you into thinking your code works when it doesn't.
If you're working for a company that supports best practices and gives you the time to implement them, and you want an application that will last, then it's really best for everyone to use and practice unit tests and code reviews. Having a QA team can be an acceptable substitute if the developers don't abuse it. If you're writing a prototype (even a production one), it may be faster to just do smoke tests.
If you're working on something where a mess-up isn't going to be the end of the world, less coverage is probably fine. Financial transactions? Lots. If you have a team of strong developers who know the codebase, work well together, and there's no turnover, then you probably need less coverage. Etc.
So it’s probably going to be some function of
- Team size/turnover
- Desired shelf life of the application
- The application and how critical failures are
- Time provided to implement the best practices relative to the development schedule
- Team aptitude (this doesn't mean that being a good programmer means you shouldn't write unit tests, but a team without junior developers might be able to fly by the seat of its pants with more success)
There are plenty of situations where not writing unit tests would be acceptable. TDD is ‘in’ right now, but it’s not a silver bullet.
It is professional to write maintainable unit tests that will save you time and tears!
There is a misconception that unit tests find bugs. Well, that is simply not true in all cases. Unit testing is not about finding bugs or detecting regressions; it is, by definition, about examining each unit of your code separately. But when your application runs for real, all those units have to work together, and the whole is more complex and subtle than the sum of its independently tested parts.
Thus, the goal of unit testing (within the TDD process) is to design software components robustly.
Edit: There is one exception where unit tests actually do detect bugs: when you are refactoring, i.e. restructuring a unit's code without meaning to change its behaviour. In that case, unit tests can often tell you if the unit's behaviour has changed.
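A sketch of how a test pins behaviour across a refactoring (the `word_count` function and both implementations are invented for illustration):

```python
def word_count(text):
    """Original implementation: a manual character loop."""
    count, in_word = 0, False
    for ch in text:
        if ch.isspace():
            in_word = False
        elif not in_word:
            count, in_word = count + 1, True
    return count

def word_count_refactored(text):
    """Refactored implementation: same intended behaviour, simpler code."""
    return len(text.split())

# The same assertions run against both versions; if the refactoring
# had accidentally changed the behaviour, they would fail here.
for f in (word_count, word_count_refactored):
    assert f("") == 0
    assert f("one") == 1
    assert f("  two   words ") == 2
```

The tests say nothing about how the counting is done, only what the result must be, which is exactly what lets the internals change freely.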
As a professional developer, is it acceptable to not write unit tests?
No.
Without question – This should be one of the first lessons that a fresher learns.
You must develop unit tests to prove that your code works before you tell QA that your code is ready for testing. A failure to do so would inflate costs for the project and lower team morale.
How thorough should these unit tests be?
Here is the litmus test: If QA discovers a bug in your code, will you be comfortable using your unit test(s) to prove your due diligence to your boss?
If your answer is ‘No’, then you should craft a better unit test (or tests).
If your answer is ‘Yes’, then your code is ready for verification.
As a professional developer, is it acceptable to not write automated unit tests?
Yes.
Particularly if the time and effort spent writing automated unit tests would outweigh the benefit gained by the test. This includes, but is not restricted to, UI code that can be difficult to mock.
Emphasis: I’m not saying that it is impossible to write automated unit tests for UI code. I’m just saying that, in my experience, it is sometimes difficult to write automated unit tests for some UI code.