The big question on my mind: how many developers are doing brownfield (“enterprise”) work compared to greenfield (all new code, from the ground up)?
I’m constantly reading breathless articles about the latest technology, only to find out that It Just Won’t Work On Our Enterprise Software codebase. People aren’t ready for automated testing (because the logic is in the click-handlers and/or the database; see the sketch below). People aren’t ready for ORM tools because we have a horrendous amount of logic in stored procs and triggers. People aren’t ready for WPF because our existing stuff is all WinForms. We can’t get the latest version of Reactive Extensions because our existing code uses Rx 1.0 and there are breaking changes that would require more testing effort than is justified by the return. Etc., etc., etc.
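To make the click-handler problem concrete: when the business rule lives inside the UI event handler, a unit test has nothing to call short of instantiating the whole form. A minimal C# sketch (all names hypothetical) of the pattern, and the usual first refactoring step out of it:

```csharp
using System;
using System.Windows.Forms;

// The classic brownfield shape (all names hypothetical): the discount rule
// lives inside the click handler, so the only way to exercise it is to
// spin up the whole form and click the button.
public class InvoiceForm : Form
{
    private readonly TextBox txtTotal = new TextBox();
    private readonly Label lblResult = new Label();

    private void btnApply_Click(object sender, EventArgs e)
    {
        decimal total = decimal.Parse(txtTotal.Text);
        if (total > 1000m)
            total *= 0.9m;   // 10% volume discount, buried in the UI layer
        lblResult.Text = total.ToString("C");
    }
}

// The usual first step toward testability: hoist the rule into a plain
// class that a test can call with no UI at all.
public static class Pricing
{
    public static decimal ApplyVolumeDiscount(decimal total)
    {
        return total > 1000m ? total * 0.9m : total;
    }
}
```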
Very few articles seem to be oriented toward the brownfield developer, for whatever reason (can’t sell ads for articles that start off with “you probably can’t use this, but…”?).
So, I’m truly wondering: is the software development industry just chock full of greenfield developers, developing new projects for clients which are then released and enjoy a short existence until complete replacement for whatever reason? Or are there hordes of brownfield programmers silently laboring away in the ADO.NET/T-SQL/VB.NET software mines, looking wistfully up at the sunshine of Entity Framework 5.0 and Haskell, et cetera?
How do we even measure that? Salaries (wages?) paid to software engineers in the two categories? How do we measure THAT? Maybe… revenue generated from selling said software? (There’s an assumption that the crappy old software sold by XYZ Corp. actually has maintainers.)
My question: Does anybody have any numbers that speak to how much of the industry is greenfield vs. brownfield?
It is a well-known fact that most software development effort is spent maintaining existing software, not writing new software.
Why? Because the first version of a program is only written once. Every subsequent version builds on the original, and steady revenue comes from steadily maintaining, promoting and upgrading an existing product, not from constantly creating brand new inventions.
The life cycle of a software product can vary greatly. Some software systems written for banking on mainframes have life cycles measured in decades, while other programs only have an effective life of a few years (or less). For that reason, any attempt to quantify the relative percentage of effort on greenfield vs brownfield development would not be representative of the industry as a whole.
I think the crux of your problem is that most of the “new technologies” you mention are development tools or development methods.
You simply cannot apply a new development methodology to software that has already been developed.
I would also question whether some of these “new” technologies are better. In particular, I have never seen the benefit of ORM frameworks. They turn a 1980s relational database technology back into a 1970s hierarchical database, and generate hundreds of lines of code whilst doing so.
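That “hierarchical” jab is easier to see in code. An ORM entity model invites you to walk an object graph parent-to-child, where plain SQL states the set operation directly. A rough Entity Framework-style sketch (entity and column names hypothetical):

```csharp
using System.Collections.Generic;

// ORM style: relational rows come back as an object hierarchy, and the
// natural usage is a parent-to-child walk of navigation properties.
public class Customer
{
    public int Id { get; set; }
    public string Name { get; set; }
    public List<Order> Orders { get; set; }  // the "hierarchy" in question
}

public class Order
{
    public int Id { get; set; }
    public decimal Total { get; set; }
}

public static class Reports
{
    // One customer at a time, children dangling off the parent record:
    // structurally the same access pattern as a 1970s hierarchical database.
    public static decimal Spend(Customer c)
    {
        decimal sum = 0m;
        foreach (Order o in c.Orders)
            sum += o.Total;
        return sum;
    }
}

// The relational version states the whole set operation at once:
//   SELECT c.Name, SUM(o.Total)
//   FROM Customers c JOIN Orders o ON o.CustomerId = c.Id
//   GROUP BY c.Name;
```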
Another point is that many/most of these “old” technologies are pretty good at what they do. Just ask yourself what the business benefit would be of replacing an existing, working WinForms application with a WPF-based system that did the same thing.
Most of the value of a system is NOT in its technology but in the business rules and processes which it supports.
In terms of revenue, pure greenfield development probably accounts for close to 0.0% of the revenue generated from development.
Why?
Development is almost always building on earlier code: doing rewrites of part of a design, bringing out the new and improved version 4.0, and so on. I can think of very few commercial situations where a new product is designed completely in isolation and then generates substantial revenue.
Even in start-ups, by the time the product is actually being used by paying customers, it has probably been in development for so many months that it doesn’t feel like greenfield code any more.