A Sorry State

As the two or three people who stumble across my blog might know, I've been in the software business for a fair while now. It seems to me that the industry that I've come to know and love has fallen recently into a pretty sorry state.

For many years I spent my time as an independent consultant, going from project to project, helping teams build systems, often doing a lot of the building myself. There was always something new to learn, and ways to make software construction easier, better, faster, and more effective.

In the last few years I've had the good fortune to help a whole team of consultants do the same thing: Help clients at all kinds of companies leverage the latest tools and techniques to improve their software results, and the results of their business.

I've noticed a few things, lately, and when I reflect on these things, I realize it's been going on for a long time. It's a slow but steady reduction in the interest in software quality.

Don't get me wrong, there has always been "bad code": Most beginners in the field turn out some pretty bad code, code that they will look back on six months or more later and think, "Who wrote this mess?" But the only way you can look back and be embarrassed by the code you wrote six months ago is by improving.

The reason we look back at our own code and are embarrassed by it is that we've learned better in those six months (or whatever period it is). We know more than we did when we wrote it - in many cases, so much more that it takes us a moment to recognize our own work, written back when we didn't know what we know now. This has been characteristic of the whole industry, at least on the technical side: Most people in the development game enjoyed learning at a blazing pace, and their own skills, and therefore their output, improved steadily for years at a time.

In the last few years, though, I've started running into what I once thought was a rare bird in our business: The developer who is uninterested in learning. There have always been a few who were content with the status quo, and part of my job was to transfer my own enthusiasm for new techniques and technologies to these few (and often their bosses) to the point where they too got excited about the future, and got back on the learning bandwagon.

Lately, though, it's not been just one or two here and there, but whole teams who are actively hostile to learning new things. Their employers often seem to support this condition, being uninterested in providing the opportunities and training that would better their teams.

At first, I thought it was because I was encountering many software teams that were composed entirely of contractors. If one of your developers (or all of them) is not actually employed by you, it's perhaps natural to be less invested in their development. Then I found that even the companies that did employ these developers (when they weren't independents) were also uninterested in their enablement. The idea of taking even two or three days to significantly upgrade their team's skills was not compelling to them.

I was baffled: The economics of increasing skill on a team pay off so quickly that the idea of rejecting it seemed unbelievable to me. The old adage holds true: "What if we train our employees and they leave?" can always be countered with "What if we don't, and they stay?"

But here I was finding whole companies who seemed entirely content with the idea of "don't train them, and they stay".

Teams of this nature seemed to have certain common characteristics. They often had a wide range of skills, but uniformly low depth of skill: That is, people on the team knew a lot of different things - one member might be a JavaScript guy, the next knew build systems, and so on - but only at a superficial level. They seemed satisfied with this level, in contrast to what I'd expect. Turnover tends to be pretty high, but perhaps this is characteristic of the whole industry right now - no-one seems to raise an eyebrow at 25% or greater turnover every 6 months, and it's considered normal for someone to shift jobs every two years or so.

As you might expect, the output of such teams is less than stellar. In a fine proof of the "broken windows" theory, the quality of the code they work on tends to drop to the lowest common denominator. Results come slowly and intermittently, niceties like frequent releases and continuous integration fall by the wayside, and the venerable practice of "big bang" releases at intervals measured in months becomes normal.

We see teams like this inspire the businesses they support to adopt self-defeating techniques such as "code freezes", where changes to a code-base are halted weeks or even months ahead of any anticipated major load, such as holiday shopping. This of course indicates an expectation that changes to the code increase risk, and have a greater-than-even chance of introducing new problems rather than fixing existing ones - therefore, the smart thing to do is to live with the errors you've got now, rather than take the risk of introducing any more.

This terrifying state of affairs starts to become the new "normal" in such cases, and is hardly even questioned, much less any attempt made to rectify it.

Not only do members of such teams not press for continued improvement in their own skills, they also don't seem much motivated to help each other sharpen up. It used to be the case that if one person on your team knew a skill, the rest of the team soon would - teaching each other was fundamental. This seems to have gone out of fashion - instead, developers I've encountered recently are reluctant to share their knowledge, perhaps because they realize the makeup of the team they are on is ephemeral: People come and go quickly, so helping teammates raise the group's collective abilities seems less valuable somehow.

Perhaps a contributor to all of this is the rising "noise" level in the industry. It seems unlikely now for any two developers to agree on a good approach, or a proper technique. Such things evolve, of course, but generally in the past there has been some agreement on what constitutes a good technique and what does not. Now, there seems to always be another pundit somewhere advocating whatever nonsense you care to believe. Like tests? No problem, lots of people will agree with you. Hate tests? Also no problem, lots of people will tell you that you don't need tests, they're a waste of time.

Never mind that old-fashioned notion of empirical evidence, or of the reputation of the pundit in question, or of a track record of successful projects to prove out an idea: We'll just take the opinion that supports our own beliefs the most closely, and stick to it tightly. No matter if it's been proven false.

Much like fashion, the latest trends in software seem to have more to do with fads than with the practicalities of building systems. Containers are in? Quick, we must put everything into containers! That causes network issues? Quick, we must build a mesh, and orchestrate, and deploy a host of other solutions to problems we created for ourselves.

Snake oil has always sold well, but there was at least a healthy dose of skepticism in the industry, driving new ideas to prove themselves first. Now "experts" two years out of college are telling us solemnly that they have seen the future, and the future is ... well, fill in the blank, but you must have it, and now!

Teams with shallow, scattered skills have less of a chance of resisting the whims of their often "technically challenged" managers, and are more easily swayed by these value-free trends, further compounding the problem.

Software quality, of course, plummets in this environment. No problem, we'll just flout the lessons of the "mythical man month" and put three more developers on the team... sorry, three more "resources".

Most frightening of all, though, is the reaction of the business stakeholders to all of these shenanigans: Plummeting quality, no predictable velocity, no consistency, massive risk, sky-high turnover, and no growth in team skills. They greet all this with a collective shrug, and accept the slow trickle of crumbling failures as the new status quo.

Have you seen some of the stuff that passes for "production ready" software nowadays? It's awful: Slow, buggy, poorly designed, unreliable Rube Goldberg creations that barely hang together. It's the new norm. And that's just the desktop stuff: Don't even get me started on web and mobile applications.

Maybe it's just time for me to stop pounding the drum of software craftsmanship. No-one seems to be buying it, least of all the developers themselves in today's corporations.

As Robert Martin warned some years ago, it may take only another few nasty accidents clearly caused by software for the public to cry out to legislators to "do something", and I very much fear they will. The "something" will probably be externally-imposed regulation and licensing of the software trade, and that will be the last nail in the coffin of software craftsmanship.

I should probably end with </rant>, I suppose. :-) Oh, and yes, get off my lawn.