What You Need To Know
Stevey's Drunken Blog Rants™
What do you really need to know?
I'm often wondering what the "bar" is. You know, that mysterious "bar" that we talk about when we interview or calibrate developers, and we try to figure out whether they (a) clear the bar, (b) merely reach the bar, or (c) couldn't hit it with a running jump and a trampoline.
I know the standard definition, which is (paraphrasing): "smarter than the average bear". Or perhaps it was "I forgot my pic-a-nic basket!" Whatever. The standard definition is pretty useless, in my opinion, because we still argue daily about what the hiring bar is. So I'll ignore the standard definition for now.
We do need a definition, though: some rule of thumb that we can apply when we're not sure what to ask, or how to rate someone.
So I'll propose a new definition: the hiring bar is what you need to know in order to be good at your job.
What do you need to know in order to be a good SDE [software development engineer]?
I might ask first: what do you need to know to be good at being a person? I.e. an adult human being in the United States.
Hopefully you're thinking: "Gosh. A lot."
To illustrate just how much you need to know, here's a random good thing that people should know: never put a wet dog in the microwave to dry it off. You could argue that you can be an effective person without knowing this. After all, you may never need to dry off a wet dog. Or you may never own a microwave. Or it simply may never occur to you to try it.
But you know what? I think it's something you need to know, because if you don't, then you might just try it someday. Chances are slim, but I don't want to be there when it happens, nor does any dog I know.
We get through life mostly by memorizing rules: Look both ways before you cross the street. Do your taxes. Bring an extra pair of socks. Don't put your dog in the microwave. And don't tell lies; it just causes more trouble for you.
We also learn to program mostly by memorizing rules: Write short functions. Order your resource locks. Save your work regularly. Don't override your pure virtual private abstract copy destructor operator without sacrificing a goat to Bjarne. You know. Rules.
Rules are helpful because you can apply them safely without worrying too much about their derivations. They were created by wise people who've already felt the pain.
As we go through life trying to be effective people, we also perform computations. For instance, if a tape gets stuck in the VCR, you may not have been equipped with a specific rule for that situation. There aren't any old homespun adages or proverbs about VCR stuckageness (or about nounjectifying words, for that matter.) Sometimes you need to deduce the right thing to do.
If your tape is stuck in the VCR, you're unlikely to try to dislodge it with a Polish sausage, or with anything squishy, for that matter. You probably won't try anything too large (like another VCR), or a living thing, or a very very sharp object, or a fragile object. Even though you may have no rules memorized for this situation, you've probably learned some applicable methodologies, such as pounding the VCR, pressing the buttons a bunch more times, cursing loudly (which may bring another person around who might want to help), prodding the tape with a tool of some sort, or, if all else fails, reading the instruction manual.
I'm coming to an important point, eventually. Really.
First I'll hazard a guess: when you read the VCR problem statement above, you probably thought of prying at the tape with a makeshift tool (any tool, or a pencil, or whatever). This would be the "object-oriented methodology," I suppose. But you probably didn't think of cursing loudly, or pressing the buttons a bunch of times. However, I'd venture to guess that last time you actually had a stuck tape, you didn't use a tool right away. You probably tried a lot of quicker approaches, such as pressing the buttons, or turning the power on and off, or even prying at it with your fingers, before you went off looking for a suitable tool. That's because the quicker approaches have often worked before, so you may as well try them first.
I'll bet that you've broken a jammed object before, sometime in your lifetime. You had some media-playing electronics device, or a kitchen drawer, or a box, and it had something wedged inside of it, and you broke it. You used too much physical force, and — oops! — you either broke the holder or the thing being held. When you did this, you filed the information away: a lesson for the future. And you used that lesson when you were computing how to get your last stuck VCR tape un-stuck. The last time it happened, I'll bet you didn't try pulling on the tape until something yielded, because you knew this approach would probably create even more problems.
Obviously I'm talking about common sense here. We all know what it is, but can you define it? That's harder. One way to define it, I think, is to say it's the knack of choosing a near-optimal solution when faced with an unusual problem.
Put that way, it sounds a lot like the definition of Design Patterns, which were one of the early (c. 1994) attempts to apply homespun common sense to software engineering. The "Gang of Four" authors wrote down 23 "patterns" that they felt were tried-and-true solutions to everyday programming problems.
This article isn't about Design Patterns per se, but it's certainly a related idea. Design Patterns codify a few reasonable common-sense rules with memorable names. They're just the software engineering equivalent of proverbs or old sayings.
How do you learn common sense as a developer? The same way you learn real-life common sense. You learn some of it in school, with stern teachers admonishing you to do (or not do) certain things that don't really sound that bad, but you try to remember them all the same. You learn some of it by making mistakes yourself, and learning from the pain, or from introspection after the fact. You learn from reading, certainly, and you learn from your friends.
Sometimes, when there's something people really screw up a lot, a proverb or saying is made just for that situation, and we memorize it. Like: "don't light a match 'til you know which end of the dog is barkin'." Well, or maybe some better example.
Sadly, some people never really get the hang of the whole "common sense" thing. Some developers never get it, either, even with 30 years of experience.
To learn common sense, you have to go through a period in which you're lacking sufficient amounts of it, during which you learn by making lots of mistakes. This is your childhood, and your adolescence, and really you're learning more common sense your whole life, which is why it seems that the truly wise always seem to be grizzled old farts.
It also explains why so many of your best teachers, and also so many of the great pioneers of computer science and software engineering, are also old. They've been doing this a long time, so they've got a store of common sense that it would be difficult to accumulate as a kid.
Over time, the wisest people find ways to encapsulate their wisdom in a way that makes it easily accessible to future generations. The vehicle doesn't really matter: it could be via a famous speech like the Gettysburg Address, or a funny anecdote like the urban legend about the woman who brought a Mexican sewer rat across the border thinking it was a chihuahua. One way or another, we remember it, and over time, common sense becomes institutionalized, part of our culture. Fewer people make mistakes after they've become famous mistakes, or so we hope.
Some things, of course, we just never get better at. No amount of homespun wisdom will help; you just need to experience them yourself before you can have any common sense about them. Personal relationships are a good example. The homespun wisdom doesn't sink in until you've made a few real mistakes. Similarly, software Design Patterns don't really sink in until you've made some software design mistakes.
So where does that leave us for software development common sense? What rules do we need to memorize? What general problem-solving approaches do we need to know, so we don't do the software equivalent of the dog in the microwave or the Polish sausage in the VCR?
Well, for starters, there's a lot to know, just like there's a lot to know in order to get through life as a competent adult. That's why it takes 18 to 30 years. You may have a bright, competent software developer with 5 years of experience (including school), but they're still basically just a kid. Maybe the ratio isn't the same; maybe it's 2 computer years for every 1 person year, but even then, someone a year out of school is still only as sensible as a 10-year-old, when it comes to creating software.
Of course, there are some really mature 10-year-olds out there, but you probably don't want them piloting an aircraft, or designing a building's emergency exit system. There are certain things you can't entrust to a 10-year-old, no matter how mature.
I realize it sounds like I'm insulting all junior developers when I offer these opinions. I thought I was cool stuff when I was one year out of school, after getting my CS degree. I also thought I was pretty cool as a 10-year-old kid. But I didn't actually have much of a store of common sense, and I can tell you I've made a great many mistakes in my time, both as a person, and as a software developer. Each year I think I make fewer, though, because I'm becoming more sensible. Or at least I try to make different mistakes.
So what does this mean for interviewing?
For one thing, it means you ought to interview people for common sense. I don't think we do this very much. Testing for common sense isn't something you automatically think to do. Most interviewers assume that a person has common sense, and skip right over that stuff and get to the complicated things.
I find that I'm an effective interviewer because for many years, I've been testing candidates on common sense. And you'd be amazed at what I've turned up. I interviewed one guy who didn't know the odds of getting heads when you flip a coin. It wasn't a mistake or a communication problem; he really didn't know, as I was able to verify in a famously funny story that would take me too far afield here.
Once I interviewed a PM candidate who felt that it was best, when faced with two week-long projects both due in a week, to do a half-assed job on both of them. I was so amazed that she was choosing to fail at both of them that I offered her a clear verbal alternative: she could choose to succeed fully with one of them, or do a half-assed job at both. I said it just like that. She decided to stick with being half-assed. Not only was that generally a bad answer; she didn't even possess enough interviewing sense to steer clear of an alternative with "half-assed" in the description. Geez.
I put a lot of stock in common sense, and I'll turn down any candidate who appears to lack it, even if the person is otherwise well-qualified.
Here's something I consider to be common sense: never lie to the interviewer or try to bluff the interviewer. But it happens so often! On a recent recruiting trip, I interviewed a candidate who claimed to have lots of experience with multithreading. I asked him to write a synchronized block in his stated favorite language, Java. He said he'd forgotten the syntax, and that he was really more comfortable doing it in C#. I saw that look in his eye — he was fugging bluffing, and I knew it. I told him to go right ahead and do it in C#. 5 seconds later, he said that he was really having a mental block, and that he couldn't really remember right now. 5 minutes later, after lying about some other stuff, he was on his way out of the building.
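For the record, the syntax he blanked on fits in a few lines. Here's a minimal sketch in Java (the class and counter names are my own invention, just to give the synchronized block something to guard): two threads hammer a shared counter, and the block makes each read-modify-write atomic.

```java
// Minimal illustration of Java's synchronized block. Without the
// synchronized block, the two threads would race on counter++ and
// the final count would usually come up short.
public class SyncDemo {
    private static final Object lock = new Object();
    private static int counter = 0;

    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            for (int i = 0; i < 100_000; i++) {
                synchronized (lock) {  // the syntax in question
                    counter++;
                }
            }
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        System.out.println(counter);  // prints 200000
    }
}
```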
Another common sense tip: don't bring stuff up, in the interview, that you don't know anything about. If I ask you what your favorite classes were, don't pick one that you don't remember jack about, because that's the one I'm going to ask you about next. C'mon, is that really surprising? But candidates do it all the time.
More common sense: don't tell me you can't remember what classes you took (in your major) in school, unless it's been longer than I've been out of school. I've been out for over fifteen years now, and I still remember all the ones I took, and each year I'm slightly more irked when someone says they've forgotten it all because it's been a whole farging 2 years since they graduated. If you can't remember the approximate names of the CS courses you took 2 years ago, then I'll fight to keep you out of this company to my dying breath.
These things happen so often. I'd love to put together a list of Interview Patterns: two lists, really; one for the candidates and one for the interviewers. Someday...
If you were interviewing someone for a job, and you were talking to them by phone, how could you tell if the person was a 10-year-old? Or an adult with the common sense of a 10-year-old?
You could certainly ask something like: "what's the fastest way to dry off a wet dog", or "how do you get a stuck tape out of the VCR", and after a few questions of this nature, someone with the sense of a 10-year-old is likely to say something really insensible: something that gives them away as a 10-year-old (in real-life experience years.)
If you were interviewing a software developer, and you wanted to find out if they lacked sufficient sense, what would you ask? What is "common sense" for a software developer?
Here are a few examples of things that I would consider to be software common sense. I've stuck just to things that people on a recent 2-day recruiting trip didn't know, or got wrong:
2^32 is not a 32-digit-long number, in decimal.
The elements of an array do occupy contiguous locations in RAM.
"is-a" and "has-a" are logical relationships that don't depend on implementation issues.
...and a Person object should not multiple-inherit from Head, Arm, Leg, and Torso.
Locks are not guarded by other locks, any more than the earth is held up by a giant turtle.
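Items like the first one are easy to sanity-check for yourself. A quick sketch (using BigInteger just to keep the arithmetic explicit):

```java
import java.math.BigInteger;

// 2^32 is a 10-digit decimal number, not a 32-digit one:
// a binary digit is worth about log10(2) ~ 0.30 decimal digits,
// so 32 bits buy you roughly 32 * 0.30 ~ 10 digits.
public class DigitCount {
    public static void main(String[] args) {
        BigInteger n = BigInteger.valueOf(2).pow(32);
        System.out.println(n);                     // prints 4294967296
        System.out.println(n.toString().length()); // prints 10
    }
}
```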
How often would you ever think to ask these things specifically? I don't set out to ask things like this, but if a candidate says something fishy, then I may kick into "common-sense detection mode" and start asking simple questions for a while.
What constitutes "fishy?" Basically any time a candidate makes a statement that's completely wrong — a gross conceptual error that any 11-year-old developer ought to know — your Fishy Detector should fire. The first thing you do is ask if they're sure. If they revisit the assumption, think about it a bit, and then say "gosh, you're right, I had that backwards", then you can probably write it off to interview nervousness. But if they stick to their claim, you've entered Official Fishy Territory, and it's time to start digging to see just how far off base this person has wandered.
Someday I may start assembling a common-sense questions list. They're actually hard to think of: it's hard to think of something that (a) goes against common sense, and (b) is likely to generate the wrong answer from a candidate who is merely common-sense challenged, as opposed to certifiably insane (which candidates sometimes are.)
OK, I'm finally ready to make My Main Point. Thanks for bearing with me so far!
My point today is that although I don't feel you need to have a Computer Science degree in order to be a good developer, I do believe that there is a set of learnings, which you normally acquire in the course of a CS degree, that I would consider to be part of the core set of ideas that are "software common sense". If you don't know them, then I will feel that you lack common sense.
In our CS degree, we saw a lot of proofs, and had to work through many of them by hand. I feel that this was to some extent unnecessary, since it obscured the importance of the things we were proving by putting us all to sleep. Being able to prove something does allow you to reason through it to assure yourself of its correctness, and to derive it from first principles if you've forgotten it. But in everyday programming, you don't need to do proofs. You ought to have a feel for the outline of the proof, the sort of intuitive derivation of a rule, so you can reason through it with yourself and with others. But you don't need to be ultra-formal about it unless you happen to love proofs, as some folks do.
Instead, you need to be aware of the major findings and learnings of Computer Science, and the rough reasons for them. I realize this opinion is going to displease two audiences: CS theory folks, who will think it's too weak, and professional programmers with no CS degree, who will think it's too strong. But it's my opinion and I'll stick with it.
Here are some examples of "CS common sense" — stuff you would expect anyone with a CS degree to know:
Computers are not able to perform all possible computations; there are in fact very simple and important computations that no computer is capable of performing, nor ever will be.
Regular expressions aren't powerful enough to match a^n b^n (n a's followed by n b's), or nested structures like balanced parentheses. That's why every developer needs to know how to use a real parser, whether written by hand or built with a generator like yacc or antlr.
Recursion and iteration are formally equivalent; each can always be expressed in the other form.
Most hardware architectures have (at least) an instruction pointer, a frame pointer, and a stack pointer register, in addition to the general-purpose registers.
The Von Neumann architecture is not the only one out there, nor is it going to last much longer (in the grand 400-year scheme of things.) It's just a slightly glorified Turing machine.
Threads share memory but have their own stacks.
Algorithms can be analyzed for their space and time performance, which is expressed in a semi-formal "big-O" notation.
Space and time are usually involved in a tradeoff, where you must optimize for one or the other.
A balanced binary tree has O(log n) performance on its insert and find operations.
Exponential big-O performance is really bad.
Imperative programming (C/C++/Java/Perl) is not the only programming model out there, nor is it going to last much longer (in the grand 400-year Scheme of things, if you didn't catch the hint the first time). Imperative languages are just syntax for Von Neumann machine operations.
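To make the a^n b^n item concrete, here's a sketch of a recognizer for that language. All it needs beyond a finite automaton is one counter — and unbounded counting is precisely the capability classical regular expressions lack, since a finite automaton would have to remember n in a fixed number of states.

```java
public class AnBn {
    // Returns true iff s consists of n 'a's followed by n 'b's (n >= 0).
    // One counter comparison does what no regular expression can.
    static boolean matches(String s) {
        int i = 0, len = s.length();
        int as = 0;
        while (i < len && s.charAt(i) == 'a') { as++; i++; }
        int bs = 0;
        while (i < len && s.charAt(i) == 'b') { bs++; i++; }
        // Must consume the whole string, and the counts must agree.
        return i == len && as == bs;
    }

    public static void main(String[] args) {
        System.out.println(matches("aabb")); // prints true
        System.out.println(matches("aab"));  // prints false
        System.out.println(matches(""));     // prints true
    }
}
```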
There are many more, of course. You might be surprised to hear that not all CS classes are filled with such findings. Some, like graphics, have a more practical focus. But all of them are impacted by the findings produced by CS research over the past 50 years, things that now constitute "software common sense".
How do you learn these things?
Well, I don't recommend that you just memorize them, since people who do that tend to memorize them wrong. You need to understand them, but (I argue) only informally, not to the level of being able to prove them on paper. E.g. for "recursion and iteration are the same", all you really need to understand is that with iteration you'd substitute an explicit stack for the call stack used in recursion. It's a pragmatic way to look at it, but it gives you enough power to be able to translate any recursive algorithm to an iterative one, or vice-versa, should the need arise.
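Here's what that substitution looks like in practice (the little tree class is my own, invented for illustration): the recursive version leans on the call stack to remember pending subtrees, and the iterative version keeps them on an explicit stack instead. Same answer either way.

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class StackForRecursion {
    static class Node {
        int value;
        Node left, right;
        Node(int v, Node l, Node r) { value = v; left = l; right = r; }
    }

    // Recursive version: the call stack holds the pending subtrees.
    static int sumRecursive(Node n) {
        if (n == null) return 0;
        return n.value + sumRecursive(n.left) + sumRecursive(n.right);
    }

    // Iterative version: an explicit stack holds them instead.
    static int sumIterative(Node root) {
        int total = 0;
        Deque<Node> stack = new ArrayDeque<>();
        if (root != null) stack.push(root);
        while (!stack.isEmpty()) {
            Node n = stack.pop();
            total += n.value;
            if (n.left != null) stack.push(n.left);
            if (n.right != null) stack.push(n.right);
        }
        return total;
    }

    public static void main(String[] args) {
        Node tree = new Node(1,
                new Node(2, null, null),
                new Node(3, new Node(4, null, null), null));
        System.out.println(sumRecursive(tree)); // prints 10
        System.out.println(sumIterative(tree)); // prints 10
    }
}
```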
You don't have to have a CS degree, but I don't know many people who've learned the fundamentals just by reading books on their own. I guarantee you that it's possible, and I know a few people who've done it — one in particular at Amazon who is loads better than most people with CS degrees. But it takes quite a commitment, essentially equal (in hours) to what you'd need to get a CS degree — 2 years full time, more or less. There really aren't any shortcuts. If you don't have a CS degree, you have some reading and practicing ahead of you.
So was there a moral to today's story? Well, er, you caught me there. There wasn't one, really. Unless it's that you should definitely never dry off a wet dog in a microwave. Or perhaps that chihuahuas bear a remarkable resemblance to sewer rats, at least in some countries.
This is more of a transitional essay, one that's going to bridge the gap between my uber-grumpy Bob Paradox essay from a few weeks back, and one that I'm preparing that's a sort of tutorial on how to practice (in fun ways) at becoming a better programmer — one with a greater store of common sense.
But next time you interview someone, take a good look at them, and ask yourself: "Have I verified for myself that this person has software common sense?" It doesn't take long, and I think you'll find it's well worth spending a few minutes exploring with them.
I really enjoy reading your blog, and it motivates me (a recent grad) to want to constantly improve my skills. When I read your earlier entry filled with interview questions, I made myself write down an answer to most of them, so I am sure to know the difference between "I'm sure I could do that" and "I just did that."
I have a slight disagreement with one of your "common sense" principles about software development, namely the one about object-orientation. You say that the difference between is-a and has-a is a logical distinction, and not implementation-dependent. I agree and disagree.
I don't believe object-oriented design is primarily about making pure logical relationships. This is software, not math. Going down the path of "object-oriented design as world-modeling" leads to silly debates like "shouldn't Circle derive from Ellipse?" (http://www.parashift.com/c++-faq-lite/proper-inheritance.html#faq-21.6)
Inheritance and composition aren't logical assertions, they are practical ones. Inheritance doesn't mean "is-a" in a deep, philosophical way. It means "can-be-treated-as-if-it-were-a", in a practical way. Composition doesn't necessarily mean "has-a" in the same way that a car has-an ownership title, it just means "uses-a-private-instance-of-a".
The issue was probably more cut and dried in your interview when you were talking about a specific problem. But I think that in general, it is better to stress interfaces and substitutability in object-oriented design than logical relationships.
Posted by: Josh H. at November 5, 2004 06:08 PM