I work on a C++ project that generates bajillions of warnings. Most of the warnings appeared after the code was written:
- Initially the project used Visual C++ 8, soon switching to 9; there is little difference between them in the generated warnings. Warnings were either fixed or silenced, so the build was warning-free back then.
- Then a 64-bit target was added. It generated a huge number of warnings, mainly because of sloppy use of types (e.g. `unsigned` vs. `size_t`). No one bothered, or had time, to fix them once the code worked for the purpose.
- Then support for other platforms using GCC (4.5 and 4.6, some 4.4 initially) was added. GCC is much pickier, so it generated many more warnings. Again, nobody bothered, or had time, to fix them. This was complicated by the fact that GCC didn’t have a pragma to silence a warning in a particular bit of code until 4.5, and according to the documentation it is still not quite what one would need (see the pragma sketch just after this list).
- In the meantime, some deprecation warnings popped up.
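For reference, this is roughly what the GCC diagnostic pragmas look like; a minimal sketch, and note that the `push`/`pop` pair only arrived in GCC 4.6, so on older versions the `ignored` line turns the warning off for the rest of the file:

```cpp
#include <vector>

void count(const std::vector<int>& v) {
    // Disable one named warning for a small region, then restore it.
#pragma GCC diagnostic push                    // push/pop: GCC 4.6+
#pragma GCC diagnostic ignored "-Wconversion"
    unsigned n = v.size();  // size_t -> unsigned, deliberate here
#pragma GCC diagnostic pop
    (void)n;                // silence unused-variable in this sketch
}
```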
So now we have a project generating thousands of warnings, and I can’t even say from how many places, since even `.cpp` files are compiled several times by different compilers and warnings in headers get printed over and over.
Is there a best practice for cleaning up something like that? Or at least some positive experience dealing with it?
Don’t touch any of the legacy code. Find a way to suppress the warnings for now. Making seemingly innocuous changes can introduce bugs into code which, I assume, has already been tested and is relatively bug-free.
By suppressing all of the warnings that are currently in the codebase, you can pretend to have a clean slate. Any new warnings introduced into the code should be eliminated, and when old code is rewritten the new implementation should not suppress warnings.
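One concrete way to get that clean slate is to route the legacy headers through a thin wrapper that disables their known warnings around the include. This is a sketch with hypothetical header names and example warning numbers:

```cpp
// legacy_wrap.h - hypothetical wrapper around a noisy legacy header.
#pragma once

#if defined(_MSC_VER)
#  pragma warning(push)
#  pragma warning(disable : 4244 4267)   // e.g. size_t/unsigned conversions
#elif defined(__GNUC__)
#  pragma GCC diagnostic push            // GCC 4.6+
#  pragma GCC diagnostic ignored "-Wconversion"
#endif

#include "legacy.h"                      // the untouched legacy code

#if defined(_MSC_VER)
#  pragma warning(pop)
#elif defined(__GNUC__)
#  pragma GCC diagnostic pop
#endif
```

New code includes `legacy_wrap.h` instead of `legacy.h`; warnings coming from the legacy `.cpp` files themselves still need per-file compiler flags.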
Your goals here should be:
- Improve the signal-to-noise ratio. If developers see thousands of notices when they build, they aren’t going to notice that the master branch has 2004 notices and their branch has 2006.
- Don’t introduce any more bugs. By making the aforementioned innocuous changes, you could be introducing unforeseen bugs. This is especially pertinent when you have thousands of warnings: even if your error rate is incredibly low, you still have thousands of chances to screw up.
- Don’t let the legacy code lower your standards for new code. Just because the warnings for an old class were suppressed doesn’t mean that a new class can do the same. Eventually, with diligence, the legacy code will be eliminated from the codebase and replaced with newer, warning-free code.
I think warnings that are not treated as errors are useless. You get tons of them, so nobody bothers to look at them, and they miss their purpose.
My suggestion is to fix them all, and start treating them as errors.
Fixing them looks scary, but I think it can be done. A good programmer who picks up this work will very soon figure out how to deal with each kind of warning, and will handle them mechanically and quickly.
We did a similar project in my company, and it did work – we now have no warnings in the part of the code where it’s important to us, and I’m talking about many thousands of files (don’t know how many warnings).
This must be immediately followed by treating warnings as errors. Once a part of your code is warning-free, change the compilation flags for this part, and let no new warnings enter. If you fail to do that, you’ll find new warnings pop up like mushrooms, and your work will be wasted.
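In practice that means `-Werror` on GCC and `/WX` on Visual C++ for the cleaned-up parts. If your build system can’t set flags per module, one rough per-file alternative on GCC (for warnings you name explicitly) is the `error` form of the diagnostic pragma:

```cpp
// At the top of a cleaned-up file: promote specific warnings to errors,
// so any regression in this file breaks the build immediately.
#pragma GCC diagnostic error "-Wsign-compare"
#pragma GCC diagnostic error "-Wunused-variable"
```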
One concern you may have is introducing bugs as a result of fixing the warnings. This is especially likely because the programmer fixing the warnings probably doesn’t know most of the code he’s changing. However, most changes would be very safe – e.g. adding a function prototype or a cast.
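To show how mechanical most of the changes are, here are the two common shapes of the `size_t`/`unsigned` fix (a sketch; the function names are made up):

```cpp
#include <cstddef>
#include <vector>

// Fix 1: use the right type, when the change stays local.
std::size_t count_items(const std::vector<int>& v) {
    return v.size();  // was: unsigned, which truncates on 64-bit targets
}

// Fix 2: when the narrowing is intended and has been reviewed, make it
// explicit with a cast so the compiler and the next reader both know.
unsigned legacy_count(const std::vector<int>& v) {
    return static_cast<unsigned>(v.size());
}
```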
I was reading C++ Coding Standards: 101 Rules, Guidelines, and Best Practices, and its second guideline is “Compile cleanly at high warning levels”. It outlines a number of reasons why you would not want that many warnings in a program; chief among them:
- Each warning is a potential problem in your code.
- You should have silent, successful builds; if not, you may be missing real problems.
- Benign warnings that are ignored can obscure warnings pointing to real dangers.
Programs that compile with warnings may appear correct, but there may be real problems in the code that will crop up at some point. I would suggest categorizing the warnings, giving them a priority, and correcting them when you have time.
Either set aside a sprint/cycle to do preventative maintenance on the system, clearing out dead code, eliminating warnings, cleaning up bad comments, things like that, OR institute a policy to eliminate warnings as sources are touched anyway for something else.
I’d not do the former myself just to get rid of warnings, but if such a cycle were planned that’d be part of the workload. The reason is that touching a file for any purpose introduces a risk of bugs, and thus risks the stability of your product.
I feel your pain because I am in a similar situation. The way I approached the problem is to remove the warnings one module/subproject at a time and to set the “treat warnings as errors” compiler flag (`-Werror` on GCC, `/WX` on Visual C++) for each module after it has been cleaned up. Also, I decided to focus on one platform only, even though we build on several operating systems with different compilers. Finally, for the third-party libraries that we compile, I just turn the warnings off.
Mind you, getting rid of some of these warnings is much easier said than done. For instance, the `size_t` vs. `unsigned` issues you mention can cause integer overflows if you just silence the warnings with casts. In some cases you replace `unsigned` with `size_t`; in other cases you need to detect an overflow at run-time. It is all case-by-case, and very tedious and hard.
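For the cases where a silencing cast could hide a real overflow, a checked narrowing helper is one option; a sketch, with a made-up name:

```cpp
#include <cstddef>
#include <limits>
#include <stdexcept>

// Narrow size_t to unsigned, but fail loudly instead of wrapping around.
unsigned checked_narrow(std::size_t n) {
    if (n > std::numeric_limits<unsigned>::max())
        throw std::overflow_error("value does not fit in unsigned");
    return static_cast<unsigned>(n);
}
```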