There will be code
Posted by Uncle Bob on 08/28/2008
During the last three decades, several things about software development have changed, and several other things have not. The things that have changed are startling. The things that have not are even more startling.
What has changed? Three decades have seen a 1000-fold increase in speed, another 1000-fold increase in memory, yet another 1000-fold decrease in size (by volume), and yet another 1000-fold decrease in power consumption. Adding up all those zeros (three per factor, four factors of a thousand) implies that the resources we have to play with have increased by twelve orders of magnitude. Even if I have overestimated by five orders of magnitude, the remaining seven are still an astounding increase.
I remember building RSX-11M on a PDP-11/60 from source. It took several hours. Nowadays I can build huge Java applications in a matter of seconds. I remember that compiling small C programs required dozens of minutes. Now much larger programs compile in an eyeblink. I remember painstakingly editing assembly language on punch cards. Now I use refactorings in huge Java programs without thinking about it. I remember when 10,000 lines of code was five boxes of cards that weighed 50 pounds. Now, such a program is considered trivial.
Nowadays we have tools! We have editors that compile our code while we type, and complete our thoughts for us. We have analyzers that will find code duplication in huge systems, and identify flaws and weaknesses. We have code coverage tools that will tell us each line of code that our unit tests fail to execute. We have refactoring browsers that allow us to manipulate our code with unprecedented power and convenience.
But in the face of all this massive change, this rampant growth, this almost unlimited wealth of resources, there is something that hasn’t changed much at all. Code.
Fortran, Algol, and Lisp are over fifty years old. These languages are the clear progenitors of the static and dynamic languages we use today. The roots of C++, Java, and C# clearly lie in Algol and Fortran. The connection between Lisp and Ruby, Python, and Smalltalk may be less obvious, but only slightly so. Today’s modern languages may be rich with features and power, but they are not 12 orders of magnitude better than their ancestors. Indeed, it’s hard to say that they are even ONE order of magnitude better.
When it comes down to it, we still write programs made out of calculations, ‘if’ statements, and ‘for’ loops. We still assign values to variables and pass arguments to functions. Programmers from 30 years ago might be surprised that we use lower case letters in our programs, but little else would startle them about the code we write.
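To make the point concrete, here is a minimal sketch (the names are invented, and any language would do): it assigns to variables, branches with ‘if’, loops with ‘for’, and calls a function. Apart from the lower case letters, a programmer from 1978 would recognize every part of it.

    def count_errors(log_lines):
        # Assign to a variable, loop with 'for', branch with 'if',
        # call a function: the same raw material as 30 years ago.
        errors = 0
        for line in log_lines:
            if "ERROR" in line:
                errors = errors + 1
        return errors

    lines = ["INFO start", "ERROR disk full", "INFO done"]
    print(count_errors(lines))   # prints 1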
We are like carpenters who started out using hammers and saws, and have progressed to using air-hammers and power saws. These power tools help a lot; but in the end we are still cutting wood and nailing it together. And we probably will be for the next 30 years.
Looking back we see that what we do hasn’t changed all that much. What has changed are the tools and resources we can apply to the task. Looking forward I anticipate that the current trend will continue. The tools will get better, but the code will still be code. We may see some “minor” improvements in languages and frameworks, but we will still be slinging code.
Some folks have put a great deal of hope in technologies like MDA. I don’t. The reason is that I don’t see MDA as anything more than a different kind of computer language. To be effective it will still need ‘if’ and ‘for’ statements of some kind. And ‘programmers’ will still need to write programs in that language, because details will still need to be managed. There is no language that can eliminate the programming step, because the programming step is the translation from requirements to systems irrespective of language. MDA does not change this.
Some folks have speculated that we’ll have “intelligent agents” based on some kind of AI technology, and that these agents will be able to write portions of our programs for us. The problem with this is that we already have intelligent agents that write programs for us. They are called programmers. It’s difficult to imagine a program that is able to communicate with a customer and write a program better than a human programmer.
So, for the foreseeable future I think software will remain the art of crafting code to meet the requirements of our customers.
There is something else that needs to change, however. And I believe it is changing. Our professionalism.
In some ways our “profession” has paralleled that of medicine. 300 years ago there were a few thinkers, and far too many practitioners. There were no standards, no common rituals or behaviors, no common disciplines. If you got sick you might go to a barber, or a healer. Perhaps he’d let out some blood, or ask you to wear some garlic. Over time, however, the few thinkers gained knowledge, discipline, and skill. They adopted standards and rituals. They set up a system for policing and maintaining those standards, and for training and accepting new members.
THIS is the change that I hope the next thirty years holds for software development.
Comments
mab@cruzio.com about 9 hours later:
Uncle Bob,
Enjoyed your blog piece, though I have both similar and different perspectives borne of apparently similar experience: I’ve loaded paper tapes into PDP-8/e’s and 8/m’s in order to boot systems, and of course I’ve toggled in a boot-loader using real switches. I totally remember computers before the microprocessor. I even remember Hollerith Code—how could one forget?!
Thought I’d share these perspectives; in doing so I only wish to convey some possibly divergent ideas that came out of a transformative process of learning, and not to convey any disrespect.
Some of the topics you touched on resonate with me, and these views also seem to be in synchronism with Dewar and Schonberg’s article (hope the link posts properly), which very much matches how I think about the state of the software world today: http://www.stsc.hill.af.mil/CrossTalk/2008/0801DewarSchonberg.html
I went back and revisited Ada after reading this article. I had kind of written it off long ago, and I now believe that was a mistake, made primarily as a tempestuous youth, perhaps. Some of the things one has to do for mission-critical programming that must be provably correct are handled directly in the Ada language. Not only that, they are even convenient. So, much like the Structured Design fad of the 1980s, I really think MDA is a poor technological substitute for correct representation at the level where the real work gets done: the nuts-and-bolts programming that your blog piece alluded to.
I particularly liked your power hammers and saws analogy, and I think it’s quite apt to the state of things today.
Where I differ, I suppose, is that there are other kinds of thinking aside from the kind assisted by rituals and standards, though those are perhaps a commendable idea for some domains.
I see that C++ is now very nearly an orthodoxy, and while I personally got reasonably good at that language, I quite dislike it now. I much prefer Objective-C, which I think is much more intelligently designed. No offense to Bjarne, Josuttis, and others I have very great respect for, but I still feel differently. Similarly, IMHO, again meaning no disrespect, I think Java to a certain extent has become an orthodoxy too. This orthodoxy, and its adherents, are a large part of why I decided not to renew my membership in the ACM. I felt there was way too much focus on IT, and not enough on computer science, especially computer science at the edges.
I think to a certain extent, the homogenization of many things being done in C++ or Java is actually a detriment to advances, and it’s still ignoring some gains that were made long ago. I think I must be a non-conformist, belonging only to clubs for non-joiners.
The specific issue for me is not strong typing, it’s static typing. I really believe Alan Kay when he said “I invented the term ‘Object Oriented Programming,’ and C++ is not what I had in mind.”
One of the things made possible by the 7- to 12-order-of-magnitude improvement in computational speed is the use of higher-order representations to do the picayune iteration, if/else, and other “business logic” we need to do.
I got very far with Python in that regard, having written entire tools and compilers; but what I found I really wanted to do to get work done was write programs that wrote programs, then execute them. Python greatly enabled that through its introspection and other features, but control over representation in source code, and its more modest object model, were starting to get in my way.
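Roughly what I mean, as a toy sketch with invented names: build the source text of a bit of loop logic at run time, compile it, and call it. Python makes this almost trivial with its built-in compile and exec.

    # A toy example of a program that writes, compiles, and runs a program.
    def make_summer(field_name):
        # Build the source text of a small function at run time.
        source = (
            "def summer(records):\n"
            "    total = 0\n"
            "    for r in records:\n"
            "        total = total + r['" + field_name + "']\n"
            "    return total\n"
        )
        namespace = {}
        exec(compile(source, "<generated>", "exec"), namespace)
        return namespace["summer"]

    sum_prices = make_summer("price")
    print(sum_prices([{"price": 3}, {"price": 4}]))   # prints 7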
So, in the last half-decade, I have gone back to one ancient dynamic language. My main work these days is done in Common Lisp. It has been gloriously productive to safely modify programs while they are running, to introspect a live program, and to do metaprogramming at runtime, not compile time, as I had to do with C++.
I was well-versed in Design Patterns, and found that I could do them directly within Common Lisp without having to deal with hugely distracting details. I am completely in disbelief at how long it has taken for modern tools to finally get to the quality, utility, and productivity level of where Lisp was 30 years ago. It’s really unbelievable. I’m having huge productivity gains using today’s Lisp tools, like SLIME. And I feel huge disappointment that XML, while quite wonderful, has an implementation apparently done incognizant of previously solved problems in computer science.
In my own more specialized field in the semiconductor industry, I’ve seen activity in Hardware Verification or High-Level Design Languages repeat similar kinds of “mistakes,” while other related developments based on purely dynamic languages are clearly much superior. The specific case I’ll cite (and again, this only IMHO …), which might not mean much to most conventional programmers, is OpenVera versus MyHDL. One feels like severe bloat, and the other is rather elegant. But, there are other examples in CAD tools like this as well.
It has finally gotten to the point where the “Yech! Haven’t you guys learned anything from history?” factor is just too overwhelming. Life is too short not to have excellent tools, excellent representation, and the ability to organically modify and incrementally improve code.
So, I have personally settled for the “Programmable Programming Language,” and I am just not going back to static typing, nor to the numerous other restrictions on what is easily, conveniently, and productively representable in code. The loop logic still has to be done, but now I can write programs that write those programs, and it’s quite convenient to do so.
Wish you well.
Best, M
Sebastian Kübeck about 20 hours later:
Nice Post! It’s true that there’s no imaginable way (at least for me) to escape from code. No matter how you do it, once it’s getting more complex, it’s programming anyway. I think there’s a kind of perpetuum mobile of the information age in the heads of sales people today that is able to magically reduce complexity of any given problem and make it a triviality.
What I could imagine is that in the future we just write test code, or some testing rules, and there is a generator that writes the software that fits them.
Simon Kirk 3 days later:
What an interesting analogy between medicine and software development.
Thinking about it a little deeper, I think the reason why Software has this lack of professionalism problem is both a blessing and a curse: accessibility.
In medicine, the accessibility is fairly limited: sooner or later there’s a limit to how many people can be involved, lopping off one another’s limbs, blood letting, etc.
But in Software, anybody with a computer can have a go.
I think that accessibility must be maintained, just as the professionalism must also come to be. The ability for anybody who wishes to create something wonderful is part of what gives Software its beauty: the limit is only the imagination of the programmer(s), and those without knowledge of boundaries often transcend them. Professionalism mustn’t stifle that creativity.
I hope that Agile will provide the links of craftsmanship that allow creativity to bloom while keeping chaos at bay.
Ravi Venkataraman 7 days later:
Twelve orders of magnitude? Some of the stated advantages are not really additive or multiplicative. If the volume of the hardware decreases by a factor of 1000, how does it affect my program? If the power consumption decreases by a factor of 1000, how does it affect my program? As long as the data can be stored in a reasonable volume of space, and as long as the power consumption is affordable, these factors don’t matter at all.
The fact that almost everyone can afford to compute does not mean that we’ll write programs that are “x” orders of magnitude better than the ones we wrote earlier. Writing programs is still a human endeavour.
It is like saying that because CPUs are 1000 times faster and we have 1000 times as much RAM, our cars should run 1000000 times as fast as they did!
Programming is still a human activity, as you pointed out. Unless we start thinking 1000 times as fast, and increase our memory a thousand-fold, we should not expect programs to be several orders of magnitude “better”, whatever “better” means.
As for AIs writing programs, I believe that there is work being done on genetic algorithms, etc., that may lead to it. One only has to look at the chess-playing programs of today, which are better than the strongest grandmaster, to realize that in the not-too-distant future computers will be able to “think”, make decisions, and write programs.
For now, though, the human rules!
Ken 7 months later:
I think it’s safe to say that basically every language in common use today is at least an order of magnitude better than Fortran in 1959. That was FORTRAN II, which did not yet have Boolean logic or a non-arithmetic IF statement. They had just added support for subroutines!
Even today’s Lisp is drastically better than 1959’s Lisp: that would have been LISP 1, which I believe didn’t have hardware numeric types yet (only Church numerals).
That’s not to say that software has kept pace with hardware, because it clearly has not. But if we’d really made virtually no progress in 50 years, then many of us would still be using 1959’s languages. Some of us still use Lisp because we find its abstraction-building capabilities still unparalleled, but we’re not using Lisp 1.
Marc 7 months later:
“There were no standards, no common rituals or behaviors, no common disciplines. If you got sick you might go to a barber, or a healer.”
And there is a reason there will continue to be custom solutions requiring custom skill sets and experiences. The body is finite; it can be figured out. However, not every business rule or chunk of business logic will work for everyone. We aren’t just programmers, we are problem solvers as well; that’s half of it.
You may be able to standardize on the process one goes through in solving a problem… there are best practices for that. But definitely not on the implementation.
v 7 months later:
I doubt the professionalism will emerge, though. Taking the saw/power saw analogy a bit further, you could imagine the “coding” industry move along from the garages (individual machines) to large manufacturing units (software companies with 1000s of programmers) where the level of professionalism would presumably increase multiple-fold similar to the manufacturing industry.
But we do have large software companies and not as much professionalism, nor is there perceivable change across the spectrum (in fact, smaller shops might have a better grip on professionalism, but I digress). The standard excuse given is a version of “the human body is finite and can be figured out, but the mind is limitless, thus no standard need apply”. There are no cookie cutters in this industry. The “software ICs” that were talked about in the 80s have still not arrived. Not that they should have, but after half a century of development, shouldn’t we expect standard interfaces for the most common applications instead of a set of guidelines, best practices, and de facto standards?
It seems to me, therefore, that not only will there always be code, but also that there will always be more and more of it. How do we explain to outsiders (most of whom are our clients) our penchant for reinventing (or renaming) things over and over? Take RPC, for example: we’ve gone through Unix RPC, sockets, CORBA, RMI, and many more. And yet the defining thought behind them all is the same.
While it’s required of us, and behoves us as practitioners, to call out the deficiencies of something like MDA, what is the way (or ways) to better code?
How do we become doctors from the barbers and healers that we are now? We certainly have the critical mass, but lack the will. Is something like the certified software professional the way? I’m not too sure.
I have two suggestions though:
1. Come up with a peer-reviewed definition of what it means to be a software professional, using IETF-style BoF meetings, unconferences, or some such informal yet workable format; anything that would result in the average developer saying, “Yeah, that makes sense to me.” It worked for building the Internet; why shouldn’t it work for building our careers and industry?
2. If we cannot standardise the mind, can we not look for patterns in what the mind has produced and standardise those? We have a ton of software that has already been produced. What’s stopping us from doing a Cyc-style large-scale project to analyse it, derive the standard patterns (or anti-patterns), and feed these into the rules we use to write code in the future? (And here I mean something at least an order of magnitude more insightful and higher-level than our current crop of checkstyle, findbugs, etc.)