20.40 Stacks and Problem Complexity

Again, I'm hoping you're wondering --

Does the number of stacks tell us something about the complexity of a machine?

Is there some fundamental difference between, say, a machine with one stack and a machine with many stacks?

The Difference

Why, yes, it does. And, yes, there is. Let's see if we can pin down the difference. Let's look at those two handy-dandy calculators again, bc and dc. Let's talk a little more about what's happening under the hood. (Well, I'm going to oversimplify a little to avoid getting lost in the source code. Erm, okay, make that oversimplify a lot.)

Sample problem: We have a circle whose diameter is made up of three component lengths, 3.1 cm, 1.1 cm, and .7 cm, and we want to find the area and circumference of the circle.
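
Here's the sort of dc session I mean -- a sketch, since the exact trailing digits depend on your version of dc. ("10 k" sets the scale, the number of digits kept after the decimal point, to 10, to match the bc examples below.)

$ dc
10 k
3.1 1.1 .7 + + 2 /
d d * 3.1415927 * n
18.8574101817
2 * 3.1415927 * p
15.3938042300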

With dc, the program reads numbers and one-letter (one "character") commands. When it sees a number, it pushes the number on the operand stack. (That's the only stack it has, as far as we care in this discussion.)

When it sees a character, it performs whatever operation that character means in the language of dc: "+" for add, "-" for subtract, "*" for multiply, "/" for divide, "p" for print, "n" for print and remove, "d" for duplicate, and so on. These are the operators. (I typed an extra return after the "n" to move down a line.)

The operands are on the stack, so the operators just operate on whatever is on the stack. Very simple. (Simplicity is why most early mechanical desktop calculators operated this way, though with a very small stack.)

(And I'm not going to mention strings or named memory here. If you're interested, you can find the manual pages on-line or look it up on Wikipedia. If you're motivated enough, break out a terminal shell window and do "man dc", pop up another shell window and do "dc" and have fun. If you insist on trying it on MSWindows instead of a real OS, you can download and install cygwin or mingw. I can't really tell you where to go from there, though. It would probably be simpler and faster to download a Linux or BSD live CD or DVD. I like Fedora's XFCE or LXDE spins.)

Oh, wait. I guess I do want to mention named memory here. Notice how I had to type the value of π twice? With only a single stack, if I tried to duplicate π, there would be two copies of π on the top of the stack. How do I get to the radius that I went to the trouble of calculating?

Without some extension of the calculating engine model, I'm stuck. Have to type that long number in twice. On the other hand, if I had even just one extra memory, I could save π off there while I calculate the area. Of course, the guys who wrote dc were building tools, not just proofs of arcane mathematics, so they went one or two better than that. dc actually has 26 named registers, one for each letter of the alphabet. (No, wait! Order now and get, not just named registers, but stacks! Heh. Check the man pages.) Anyway, if you don't want to type π twice, you can save it to a register:
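
Something like this (again, a sketch):

$ dc
10 k
3.1415927 sp
3.1 1.1 .7 + + 2 /
d d * lp * n
18.8574101817
2 * lp * p
15.3938042300

Here, "sp" pops π off the stack and saves it in register p, and each "lp" pushes a copy of it back.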

But this is no longer using the strictly single stack model. You can see there are limitations on the single stack model which are overcome by even a little extra, non-stack memory, in this case, the register p.

I guess, because I want to make what's happening very clear, I'll go ahead and write down the detailed play-by-play:
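
Each row shows one token of the register version, and what dc's stack and register look like after it. (I've trimmed the fractional zeros that scale 10 drags along, to keep the columns readable.)

input      stack (bottom ... top)          register p
10 k       (scale is now 10)
3.1415927  3.1415927
sp         (empty)                         3.1415927
3.1        3.1                             3.1415927
1.1        3.1 1.1                         3.1415927
+          4.2                             3.1415927
.7         4.2 .7                          3.1415927
+          4.9                             3.1415927
2          4.9 2                           3.1415927
/          2.45                            3.1415927
d          2.45 2.45                       3.1415927
d          2.45 2.45 2.45                  3.1415927
*          2.45 6.0025                     3.1415927
lp         2.45 6.0025 3.1415927           3.1415927
*          2.45 18.8574101817              3.1415927
n          2.45 (prints 18.8574101817)     3.1415927
2          2.45 2                          3.1415927
*          4.9                             3.1415927
lp         4.9 3.1415927                   3.1415927
*          15.3938042300                   3.1415927
p          15.3938042300 (prints it)       3.1415927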

Now, let's look at what bc does with the same calculations. First, without using named storage (variables) or the math library that the most common versions of bc provide (and again ignoring the built-in exponentiation):

$ bc
bc 1.06.95
Copyright 1991-1994, 1997, 1998, 2000, 2004, 2006 Free Software Foundation, Inc.
This is free software with ABSOLUTELY NO WARRANTY.
For details type `warranty'.
scale = 10
( ( 3.1 + 1.1 + .7 ) / 2 ) * ( ( 3.1 + 1.1 + .7 ) / 2 ) * 3.1415927
18.8574101817
( 3.1 + 1.1 + .7 ) * 3.1415927
15.39380423

As expected, this is the algebraic grammar you were probably taught in school. You know how to read it, but what hoops does the computer have to jump through to do the calculations? Let's look at it in detail.
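
GNU bc actually compiles each statement to byte code before running it, but the bookkeeping amounts to the same thing as juggling two stacks directly: an operand stack like dc's, plus an operator stack for operators and parentheses that can't be applied yet. Roughly, for the area expression (fractional zeros trimmed again):

token      operator stack   operand stack (bottom ... top)
(          (
(          ( (
3.1        ( (              3.1
+          ( ( +            3.1
1.1        ( ( +            3.1 1.1
+          ( ( +            4.2               (apply the pending +, push the new one)
.7         ( ( +            4.2 .7
)          (                4.9               (apply +, pop the inner parenthesis)
/          ( /              4.9
2          ( /              4.9 2
)          (empty)          2.45              (apply /, pop the outer parenthesis)
*          *                2.45
( ... )    * ( ...          2.45 2.45         (the second group repeats the same dance)
*          *                6.0025            (equal precedence: apply the pending *, push the new one)
3.1415927  *                6.0025 3.1415927
(newline)  (empty)          18.8574101817     (apply *, print)

Notice that during the second parenthesized group, the first * stays pending on the stack while another + piles up on top of it -- two operators pending at once, plus the parentheses. A single "pending operator" variable could not keep track of that.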

Whew!

Why do I go to all the effort to detail all of this?

(A better question might be why I don't write a program to generate the above stuff automatically, but I'm going to let that question pass.)

An even better question would be the reason for not using the built-in exponentiation in either the dc or bc code, and the reason for not using named storage (variables) in the bc code.

I wanted to show where we get stuck without a second stack. (Or, in the alternative, addressable memory, which we can also call named store or variables.)

When we were calculating the radius the second time so we could multiply the radius by itself, we had two operators pending for much of the sequence. That's the reason for using an operator stack instead of just a single pending operator variable. Hold that thought while I get sidetracked for just a little bit.

Let's look at the bc code using a few more of the features of the language. We are going to invoke the program with the "-l" option so that we can use the arctangent function, "a(θ)"; since the arctangent of 1 is π/4, "4 * a(1)" gives us π to the working scale. And then we will save some intermediate values in named variables. And we'll use the exponentiation operator, "^".

$ bc -l
bc 1.06.95
Copyright 1991-1994, 1997, 1998, 2000, 2004, 2006 Free Software Foundation, Inc.
This is free software with ABSOLUTELY NO WARRANTY.
For details type `warranty'.
scale = 10
diameter = 3.1 + 1.1 + .7
pi = 4 * a(1)
( diameter / 2 ) ^ 2 * pi
18.8574099008
diameter * pi
15.3938040006
pi
3.1415926532

If we worked through the details of the interpreter's operation for this code, we would see that the use of named variables lets us avoid the forms of expression that require more than one pending operator, and thus avoid the need for the second stack.
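
For instance, here is the play-by-play for just the first line, diameter = 3.1 + 1.1 + .7 (ignoring the assignment itself):

token   operator stack   operand stack (bottom ... top)
3.1     (empty)          3.1
+       +                3.1
1.1     +                3.1 1.1
+       +                4.2        (apply the pending +, push the new one)
.7      +                4.2 .7
(end)   (empty)          4.9        (apply +, store in diameter)

The operator stack never holds more than one operator at a time. Rewriting with named variables can always flatten expressions this far -- one pending operator at a time -- at the cost of naming the intermediate results ourselves.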

But we must be careful here. The use of named variables does not provide a general solution for all parenthetic expressions. It allows us to re-write the expression. There's a difference. When we re-write the expression, we have to think about it for a moment.

Having the second stack allows us to make the computer do more work for us. It really is not anthropomorphizing to say we are letting the computer think for us. It is performing calculations that took us a certain amount of time and effort to learn in the early school grades.

The second stack allows the computer to perform calculations that are equivalent to some of those which we consider part of our higher reasoning skills.

Stacks vs. Random Addressing

Stacks are devices which allow a computer to be programmed to take an ordered path through symbolically solving a problem.

The ability to address memory, or to name storage, or to create named variables such as pi and diameter, above, provides an alternative to having a second stack. Or even the first stack. A stack can be constructed from an array of addressable memory used in conjunction with another variable that tells us where the current top of stack is.
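
Just to make that concrete, here is a toy stack built in bc itself, kept in an array s[] with a variable sp marking the current top. (The names push, pop, s, sp, and t are mine, and I've skipped the banner.)

$ bc
sp = 0
define push (x) {
    s[sp] = x
    sp = sp + 1
    return (x)
}
define pop () {
    sp = sp - 1
    return (s[sp])
}
t = push(3.1); t = push(1.1); t = push(.7)
pop() + pop() + pop()
4.9

That is really all a hardware stack is, too: a block of memory plus one register pointing at the current top.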

Other structures or devices may be used instead of stacks in specific cases to work through a single list of options to be checked, but when you start trying to check options within options, you end up needing at least one stack, whether a simulated stack or an implicit stack or a hardware stack.

(I probably should have already explained implicit stacks somewhere. Sometimes, it's not obvious that a list or a tree is being traversed in such a way that the references to the structure are being held as if in a stack. It's a little more obvious when using explicit recursion, but there are also times when even indirect recursion is not involved, but the chain of references can be backed up and forked. This is what I'm talking about when I say an implicit stack.)
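
bc makes the implicit stack fairly easy to see, because it allows recursive functions. Here is a toy factorial (my own throwaway example): every call that has not finished yet keeps its own n, and its pending multiplication, on a stack we never declared anywhere.

$ bc
define f (n) {
    if (n <= 1) return (1)
    return (n * f(n - 1))
}
f(5)
120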

An interesting result of some theoretical work is that two stacks are sufficient for options within options within options, for any finite nested branching of options. Mathematicians have also proven that the set of problems which can be solved with a machine with two or more stacks is the same set as can be solved with randomly addressable memory. (The intuition: two stacks can simulate a tape, or a big block of memory, by holding everything to the left of the current position on one stack and everything to the right on the other; stepping through the data is just a pop from one stack and a push onto the other.)

Often, it is more convenient, because of the nature of the data being tracked for each set of options, to use separate stacks, but, mathematically speaking, two stacks make for a general-purpose problem-solving machine.

Solving Problems

One stack allows a computer to back up when it makes a mistake.

A second stack allows one to back up when a mistake is compounded by a second mistake.

Stacks allow a computer to try various options when searching for a solution to a problem.

WHOA! Computers can solve problems!!!!

Any problem which can be formally defined, for which a deterministic solution can be formally defined, can theoretically be symbolically solved by a computer with two stacks.

You might need very large stacks. You might need a lot of time to arrive at the solution. Addressable memory may allow more optimal solutions in many cases. You may want faster processors, or even more processors working in parallel. But that's not a big deal, because money can be used to buy those things, right?

For the first time in human history, the impulse to try to solve problems by throwing money at them can apparently be shown to be justified.

Apparently.

But this fact can lead us to a number of mistaken conclusions.

Computers can be used to do the symbolic part of solving many problems. Nowhere is there a guarantee that they can actually solve problems. That part is up to us.

For instance, a computer can be used to design cars and even to test the design. But the car is not on the road until some humans actually build it.

Computer systems can recover from errors in some cases, if the programmers and systems engineers have correctly identified the possible errors beforehand, and if the computer programs and the systems which use them are correctly constructed and operated.

Bridges, buildings, clothing, tools, materials -- there are all sorts of things which computer systems can help us make, and every one of them is subject to the skill and foresight of the engineers and programmers building the systems. And they all have limits.

Why should there be limits? I mean, the innate rebellion we humans feel against limitations means something, doesn't it? And computers are so good at solving problems, aren't they?

At the very least, when you build a computer, it has a finite amount of memory. That means the stacks and other data structures that the programs use can't expand forever. Maybe you can add some memory, but you have to buy it first. The same goes for CPU time and other resources. Buy a faster processor, buy more processors, but you can't have the future now, even if you haven't yet hit the physical limits of the hardware.

Computer systems have to be designed and built. Some systems available now are useful for a wide range of problems, but there is no guarantee that they fully answer the problems you need to solve. And even if they can, you still have to go find the systems, obtain them (preferably legally), install them, and learn to use them. You can't have the future now, no matter how sweetly the sales-sirens sing.

And then there is the issue of turning the symbolic solution into a real solution. Even after you have learned enough about the systems to start using them, using them takes time and effort.

(It's tempting to drag music players into the conversation, to try to force my point, but, ...)

Why? Why can't these magic boxes do for us what our gods never have?

I can't tell you the answer to that one. I can wave my hands around, talking about my religion, and talking about how most religions include some philosophical reason that we must put up with non-ideal realities in this world, but I can't really prove to anyone that gaining experience in this world is necessary or even a good thing. I can tell you I think it is so, and I can talk about some of the reasons why, but ultimately it appears to be up to each individual to decide whether experience is a good thing or not.

One thing is sure: if experience is a good thing, it is more than a little bit unreasonable to try to get computers to solve all our problems. And the attempt to push them too far in that direction smells just a little more than faintly of the same sort of thing as when many religions of the past have ceased to be useful.

Just a word of friendly advice, but worshiping your computer is not conducive to your general health and well-being.

At this point, I have to admit that I am not being very formal in my descriptions, but I think I have shown that these things computer scientists call stacks provide us with two more levels of complexity in our efforts to solve problems.

What's left? Are there problems that computers can't solve?

Copyright 2011 Joel Matthew Rees