I had a discussion with a co-worker yesterday about design philosophy. The other coder is more experienced than me, and I fully admit that he is likely much better at properly automating his testing; I’m only now trying to break my own sloppy habits there. However, it seems that some of our dispute is a philosophical issue where I’m unwilling to simply yield to his greater experience.
Generally he is a Directing programmer who follows all of the formal approaches. He has interfaces for nearly every class and automated tests for everything, down to the getters and setters on basic POJOs. To him, refactoring is a dirty word: a sign that things weren’t written/designed properly from the start. It’s inevitable and done as necessary, but everything should be done to avoid having to do it, and you should feel bad when you have to.
I feel I’m a more flexible/Enabling programmer. I feel that interfaces should be used as necessary, not thrown in before they are needed (KISS), and that it’s easy enough to pull an interface out of an existing class by refactoring if you later decide an interface/polymorphism makes more sense (see the sketch below). I also felt that, while I admit I need to be better at testing than I currently am, the level of automated testing he suggested seemed too rigorous, since it would take so long to implement and, more importantly, make refactoring a nightmare: I wouldn’t be free to modify the behavior of any non-private method. As my requirements grow and change I find myself refactoring heavily, and that level of testing feels like it would kill my flexibility. His view is that I shouldn’t be refactoring in the first place, so it doesn’t matter if the tests make it harder.
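To make concrete what I mean by pulling an interface out later, here’s a rough sketch (all the names are made up for illustration):

    // Before: only the concrete class exists, and callers use it directly.
    public class FileReportStore {
        public void save(String report) { /* ... write to disk ... */ }
    }

    // After: once a second implementation (or a test double) is needed,
    // the interface is extracted mechanically and the class declares it.
    // Callers can then be retargeted to ReportStore one at a time.
    public interface ReportStore {
        void save(String report);
    }

    public class FileReportStore implements ReportStore {
        @Override
        public void save(String report) { /* ... write to disk ... */ }
    }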
Ultimately, all of our other philosophical disagreements seemed to boil down to the word ‘refactor’. I love refactoring. I feel it should be used liberally any time something isn’t quite as clean and beautiful as it should be, and that the natural course of growing a program or adding new requirements will necessitate refactoring: to better enable code reuse, to move logic into new classes/packages as it grows more complex, and, yes, to fix the inevitable design mistakes that come with a flexible, agile development approach (and that’s not always bad if your code is written to allow modification!). He disagrees with all of that.
I’m wondering what others’ thoughts are on refactoring. How common should it be? Is writing code with a philosophy of “I’m doing it simple now, I can refactor it later if more complicated logic proves necessary” inherently flawed/dangerous? Does the acceptability of refactoring depend on programming approach and style? I generally work on rapid prototyping or “clean up this ugly, but working, prototype some non-engineers wrote”, while it sounds like he actually gets to do formal programming with official requirements that don’t change; I’ve never once had a job like that myself!
I think the answer lies somewhere between the two of you.
Changes are to be expected as software evolves. No matter how competent you are, you can’t predict the future with 100% accuracy. And even if you could, your predictions may not match your customer’s predictions or taste. Since he’s the one who pays, he (most unfortunately 🙂 ) has a say in it. So unless it’s dead simple, it’s almost impossible to design an application with everything in its right place from day one.
On the other hand, you still have the responsibility to stay, at all times, as close to that as possible. You have to plan and think ahead of the requirements so that you’re less likely to paint yourself into a corner and make it nearly impossible for your code to adapt to new demands (and believe me, there will be demands).
Therefore, while I fully agree with you that refactoring is a great thing to do whenever you feel things could be better, calling it a “rule” doesn’t sound quite right to me. If refactoring code is something you do on a daily basis then clearly something is wrong with the way you are coding your apps.
On this question:
    Is writing code with a philosophy of “I’m doing it simple now, I can refactor it later if more complicated logic proves necessary” inherently flawed/dangerous?
If you expect – or even just suspect – that more complicated logic will eventually be required, the answer is most definitely yes. If not, then it really depends on the code’s usefulness, how deep it will sit in your app, project deadlines and, perhaps most importantly, common sense.
If your customer is a bike workshop in Wisconsin, it’s very unlikely that he’ll need an app that supports international phone numbers. It’s therefore reasonable for you to design a PhoneNumber class that really just holds a three-digit area code and a seven-digit number. In the unlikely event that your customer someday needs to store data on bikers from France, refactoring may then become helpful.
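A minimal sketch of what such a deliberately simple class might look like (assuming North-American-style numbers; everything here is made up for illustration):

    // Deliberately simple: assumes a NANP-style number, i.e.
    // a three-digit area code and a seven-digit local number.
    public class PhoneNumber {
        private final String areaCode;   // e.g. "608"
        private final String number;     // e.g. "5551234"

        public PhoneNumber(String areaCode, String number) {
            if (areaCode.length() != 3 || number.length() != 7) {
                throw new IllegalArgumentException(
                    "Expected a 3-digit area code and a 7-digit number");
            }
            this.areaCode = areaCode;
            this.number = number;
        }

        @Override
        public String toString() {
            return "(" + areaCode + ") "
                + number.substring(0, 3) + "-" + number.substring(3);
        }
    }

If French numbers ever do show up, this class and its constructor validation are exactly where the refactoring would start.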
This is of course just an example, but you get the idea: while at first it may make sense to write pretty straightforward code, as time passes and requirements change you might find that the way your application now relies on that code becomes a problem. This is the kind of scenario you should normally try to avoid, but unless you’re God you won’t be able to avoid all of them. Sometimes it won’t even be your fault; software customers tend to be very imaginative when it comes to transforming a bike workshop management application into a pizza ordering system.
In any case, you should still always write code the way you think it should be upon release. Think of refactoring as a way to fix the things you made wrong assumptions about.
Embrace refactoring. Embrace the fact that you can create code that works, with tests that prove it works, and then change the internals with confidence that you aren’t breaking anything. And, as you mention, KISS.
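As a rough sketch of the idea (JUnit 4, with a made-up Invoice class): the test pins down the observable behavior, leaving the internals free to change:

    import static org.junit.Assert.assertEquals;
    import org.junit.Test;

    public class InvoiceTest {

        // A trivial class under test, invented for this sketch.
        static class Invoice {
            private final double subtotal;
            private final double taxRate;

            Invoice(double subtotal, double taxRate) {
                this.subtotal = subtotal;
                this.taxRate = taxRate;
            }

            // The internals of this method can be rewritten at will;
            // only the observable result is pinned down by the test.
            double calculateTotal() {
                return subtotal * (1 + taxRate);
            }
        }

        @Test
        public void totalIncludesTax() {
            Invoice invoice = new Invoice(100.00, 0.08);
            assertEquals(108.00, invoice.calculateTotal(), 0.001);
        }
    }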
On the view that refactoring means things weren’t written/designed correctly from the start: for me, the answer is that I have never worked on a project in which there were no changes at some point (or rather, very often), even when someone spent a month “designing” the solution. Changes that introduce duplication, that introduce generalization, that introduce new use cases. For all of them you will refactor.
Even as you are writing a fully “designed” solution, you will find that there is duplicated code, and that there are generalizations that weren’t taken into account. Coding is designing, and until you are done coding, you have not finished designing.
One other thing to add: YAGNI.
It all comes down to who writes better code relative to the amount of time spent. Your company/team should define what “better” means. Customers have something to say about this as well.
Make sure you have a solid definition of refactoring. It sounds like you do, but in your question you mention it very close to changes in code behavior. Be careful: the two of you can’t maintain this debate if you don’t agree on the definition.
Out of everything the two of you are claiming, the biggest issue is his claim to be able to write code correctly the “first” time. Does that mean when code is checked in for the very first time, or before it goes into production? Obviously the latter gives you a little more time to get things right. Maybe he has worked on this particular project for so long that he can do it because he understands the code and the people asking for features. A new PM could come along who doesn’t write specs in as much detail, or doesn’t understand the domain as well as the current one. Your method may make you more adaptable.
Does he take a disproportionate amount of time to achieve the same objective as you, even if he produces fewer bugs or less need for improvement? Code can be improved to a certain extent forever, but truly good code gets shipped. Do you introduce more bugs into the system? Users want new features and corrections sooner, but there is a point where they get too frustrated with too many bugs popping up too often. Every time we update…
Your philosophical arguments are only personal preferences until you can determine who is the more effective developer. Would your Directing programmer be able to do this on a new project? Are you using “I can refactor it later” as a crutch?
I’d say refactoring should be the exception.
Every time you refactor, you’re basically changing the way the code looks rather than the way it works (e.g. you might extract a block of code into a method, or change a switch into a polymorphic function call; in all cases the functionality hasn’t changed, you’ve only altered the structure).
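For instance, the switch-to-polymorphism case might look something like this (a sketch with made-up names; the behavior is identical before and after):

    // Before: one method switches on a type tag.
    enum Kind { CIRCLE, RECTANGLE }

    class TaggedShape {
        Kind kind;
        double radius, width, height;

        double area() {
            switch (kind) {
                case CIRCLE:    return Math.PI * radius * radius;
                case RECTANGLE: return width * height;
                default:        throw new IllegalStateException("unknown kind: " + kind);
            }
        }
    }

    // After: the same behavior, dispatched polymorphically.
    interface Shape {
        double area();
    }

    class Circle implements Shape {
        private final double radius;
        Circle(double radius) { this.radius = radius; }
        public double area() { return Math.PI * radius * radius; }
    }

    class Rectangle implements Shape {
        private final double width, height;
        Rectangle(double width, double height) { this.width = width; this.height = height; }
        public double area() { return width * height; }
    }

Nothing the caller observes has changed, yet almost every line has.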
Now while this may be useful for making the code easier to maintain or understand, it still makes large changes that really affect the way your code shows up in the source control logs. You can no longer look at the diffs and spot a bug introduced by a code change, because the code is radically different. It’s the same problem you get when you change the look of code just to prettify it (e.g. changing brace positions or renaming variables).
So refactoring should really be kept for cases where it is truly worthwhile, just so you can keep a sense of what the code does. I used to work with a guy who changed the code dramatically – I’d come in after a weekend and find the codebase was practically a different product. It made working with it tremendously difficult, as I had to take the time to figure out where the bits I was familiar with had gone.
There is also a sense that refactoring is overused nowadays – a sense that coders should hack something together quickly and then tidy it up to make it look good later (or even just tidy as they go). I think this is poor practice. You should think about what you’re trying to achieve before you do it; if that were done first, you wouldn’t need to refactor. In the old days, when design-first was much more prevalent, refactoring was used solely for legacy code, never for daily work.
One thing I do know, though – ‘beauty is in the eye of the beholder’. You may refactor to make things look nicer, but then someone else may come along and refactor again to make it look nice to them (in Fowler’s Refactoring book there are refactorings that are each other’s opposites!). It would never stop: you could spend all day refactoring code and never actually write product features.