Here is a good handle on Anthroposophical thinking —
and on the kind of thinking that will replace it.
“The brain is a belief engine. It relies on two processes:
patternicity [our tendency to find or invent patterns]
and agenticity [the belief that events always result from the intentions of beings].
It finds meaningful patterns in both meaningful and meaningless data.
It infuses patterns with meaning, and imagines intention and agency
in inanimate objects and chance occurrences.
We believe before we reason. Once beliefs are formed, we seek out
confirmatory arguments and evidence to justify them.
We ignore contrary evidence or make up rationalizations to explain it away.
We do not like to admit we are wrong.
We seldom change our minds ... [But] as science advances,
the things we once thought of as supernatural acquire natural explanations.
Thunderstorms are caused by natural processes of electricity in clouds,
not by a god throwing thunderbolts.”
— THE SKEPTICAL INQUIRER, Sept.-Oct. 2011, p. 57.

Oh Why?
Oh Why?
Oh Why?

Much of my childhood was spent under the supervision of smart Anthroposophists. I saw no inherent contradiction. The teachers at the Waldorf school I attended seemed to be quite intelligent, and they embraced Anthroposophy. No contradiction. [1] I tended to accept my teachers’ apparent evaluation of themselves: They were very “deep” and very “good” — far more so than virtually anyone outside the school. They “knew” things that most other people didn’t know, and while they kept mum about much of their knowledge, they imparted enough to wow me.

These days, understanding how weird Anthroposophy is, I do see a contradiction. How can smart people believe such nonsense as the doctrines of Anthroposophy? I’ve offered some answers. [See "Inside Scoop".] Here are a few more. I will quote from, and paraphrase, several authors. There will be some overlap between the points made by the authors, but I'll try to hold the repetitions to a minimum by keeping things concise.

Michael Shermer is the author of WHY PEOPLE BELIEVE WEIRD THINGS (Henry Holt, 2002). Some of his conclusions apply quite well to Anthroposophy. Here are Shermer’s explanations for the acceptance of weird beliefs by all sorts of people, intelligent and not-so-intelligent.

CREDO CONSOLANS [I believe what consoles me]: “The reason people believe weird things is because they want to. It feels good. It is comforting. It is consoling.” [p. 275.]

Yes. It feels good to believe, for instance, that human beings are the center of the universe; to believe that the gods worship us; to believe that we will become God. [2] Such beliefs feel much better than thinking that we are unimportant, peripheral actors in a random, uncaring universe. Anthroposophy feels good. It feels right. But whether it is true is a different question.

IMMEDIATE GRATIFICATION: If you embrace Anthroposophy, POW! — right then, right now — you “know” you are on the path to spiritual salvation. The path itself will be long, spanning many future incarnations in various planetary stages (Jupiter, Venus, Vulcan...), but immediately (now, here — POW!) you know all is swell.

SIMPLICITY: “Scientific explanations are often complicated and require training and effort to work through. Superstition and belief in fate and the supernatural provide a simpler path through life’s complex maze.” [p. 277.] Hm. Simple? Anthroposophy is simple? Actually, Anthroposophy is a highly complex body of doctrines. And yet, at the core, Shermer’s point is well taken. Anthroposophy is really much, much simpler than reality: Steiner returned over and over to a few basic propositions (we are evolving through planetary stages; Christ is our prototype; everything becomes clear when you develop exact clairvoyance) — his teachings are far simpler than the reality revealed by modern science. (Indeed, they are simpler than their primary source, Theosophy. The cosmology of Theosophy is vast; Steiner focused on a single portion of cosmic history, as described in Theosophy, and he simplified it further by marrying it to the familiar narrative — with variations — of Christianity.)

MORALITY AND MEANING: “To most people, science seems to offer only cold and brutal logic in its presentation of an infinite, uncaring, and purposeless universe. Pseudoscience [e.g., "spiritual science": Anthroposophy], superstition, myth, magic, and religion offer simple, immediate, and consoling canons of morality and meaning.” [p. 277.]  Shermer suggests that science’s discoveries are not really so bleak, that in fact a highly moral and meaningful life can be built upon truth — that is, it can be built upon the reality revealed by science. But, Shermer says, many people do not understand this, and therefore many run to bogus palliatives like Anthroposophy instead of facing reality head-on. [3]

HOPE SPRINGS ETERNAL: “[H]umans are, by nature, a forward-looking species always seeking greater levels of happiness and satisfaction.” [p. 278.] This human characteristic may seem upbeat, but really it is the opposite. We humans are almost never satisfied with what we have, we always want more, we always want better. [See "Spiritual Science" for a discussion of dissatisfaction as a human motive.] Like so many other con men, Steiner offered us something “better” than reality, and some people leap at it. But that alternate, “better” universe is bogus, and the desire for it misses the important recognition that reality is really quite wonderful. Really. If we humans could just agree to stop destroying the Earth and the creatures upon it, we might realize that we live in a wonderful oasis of beauty and (potentially) truth. But no. Steiner had to offer us an alternative, a wholly imaginary alternative in which beautiful truth and true beauty fade away.

Those are Shermer’s general explanations for the human tendency to believe weird nonsense. He also offers some observations on the seeming paradox that smart people may believe weird falsehoods. Here’s a summary:

OUR BELIEFS DON’T USUALLY COME FROM OUR INTELLIGENCE OR EDUCATION: They come from our emotional needs, or our cultural heritage, or our dreams... We tend to accept or develop our beliefs irrationally. Our intelligence can help us to analyze our irrational beliefs and cast them aside when we see that they are baseless. But our intelligence can also be misused, to concoct rationalizations that defend our irrational beliefs, convincing us that the silly things we believe are true. “Smart people believe weird things because they are skilled at defending beliefs they arrive at for non-smart reasons.” [p. 283.]

INTELLIGENCE DOESN’T NECESSARILY PREVENT SILLY BELIEFS: There are, after all, all sorts of intelligence. A math prodigy may have a foolish fear of ghosts. If you are great at doing square roots in your head, this doesn’t prevent you from having weird metaphysical beliefs. Math and metaphysics are different; the one doesn’t apply to the other. You can be a math genius and a metaphysics fool. “Smart people might be smart in only one field.” [p. 287.]

EDUCATION DOESN’T NECESSARILY PREVENT SILLY BELIEFS: There are, after all, all sorts of subjects to study. A Ph.D. in literature might be a dunce when it comes to studying fossils. If you are great at figuring out the poetry of John Keats, this doesn’t prevent you from having weird beliefs about evolution. Literature and biology are different; the one doesn’t apply to the other. You can be a superb literary critic and a fool on the subject of biology. [pp. 291-292.]

BELIEFS STEM MORE FROM PERSONALITY THAN I.Q.: If you are highly confident, you may have an irrational belief in the future. If you are highly insecure, you may have an irrational need for invisible friends (angels, etc., who can do more for you than you can do for yourself). Beliefs stem more from the psyche than from the rational brain. “Individuals high on hypnotic susceptibility are also more likely to report having undergone religious conversion.” [p. 294.] Your emotional needs may overwhelm your ability to reason things through.

MUCH DEPENDS ON YOUR FEELINGS ABOUT THE LOCUS OF CONTROL: If you feel weak, you may think that you are controlled by forces outside yourself; you see yourself as a victim. Being smart or highly educated doesn’t change this. If you feel strong, you may think that you control your own destiny; you see yourself as an independent agent, perhaps as a leader. Being relatively unintelligent or badly educated doesn’t change this. [pp. 294-295.]

BEING SMART, EDUCATED, OR NORMAL DOESN’T SAVE YOU FROM THE LURE OF CULTS: “[T]wo-thirds of cult members come from normal functioning families and showed no psychological abnormalities whatever when they joined the cult.” [p. 295.] “[J]oining such groups is an integral part of the human condition to which we are all subject ... Banding together in closely knit groups was a common practice in our evolutionary history.” [p. 296.] Membership in a cult can result from several factors: “cognitive dissonance; obedience to authority; group compliance and conformity; and especially the manipulation of rewards, punishments, and experiences.” [p. 296.] Anthroposophy was clearly a cult when it was a small group gathered around the physical presence of Rudolf Steiner. Today it is a far more dispersed movement, but it remains a cult.


Shermer also identifies two biases that particularly afflict smart people:

#1: INTELLECTUAL ATTRIBUTION BIAS: Smart people know they are smart, which can make them smug. Smarties are inclined to think that whatever ideas they come up with must be true, since they are so smart. “[W]e see our own actions as being rationally motivated, whereas we see those of others as more emotionally driven.” [p. 298.]

#2: CONFIRMATION BIAS: Smart people trust their intelligence. They develop ideas that they defend cleverly, and they may find all sorts of evidence to support their ideas while dismissing contrary evidence. They have a “tendency to seek or interpret evidence favorable to already existing beliefs, and to ignore or reinterpret evidence unfavorable to already existing beliefs.” [p. 299.]

OK. So being smart and well-educated gets us nowhere.

No, that is not the message. Being smart is valuable, and being well-educated is indispensable. The lesson of Shermer’s work is that intelligence and education can be misused. Anthroposophists misuse them. But the rest of us need not misuse them. We can be smart enough and well-informed enough to reject Anthroposophy.

Perhaps Shermer's most important insight is that the human brain often functions as a "belief engine." [p. xxiv.] We gather sense impressions, and from these our brains automatically produce explanations or beliefs that enable us to deal with the phenomena of the world around us. Sometimes the explanations we think up are rational and can be tested — they are, in effect, scientific hypotheses. But far more often, the explanations we offer ourselves are akin to dreams or fantasies — ideas that seem true enough, or possible enough for our ordinary purposes, ideas that comfort us, or inspire us, or encourage us, but that are wrong. These false ideas become the myths to which mankind has long turned, an easier and perhaps more natural human tendency than submitting to the arduous disciplines of logic and scientific proof.

Anthroposophy consists of many foolish beliefs (belief in ghosts; in goblins; in reincarnation; in the Sun God) rolled together to create one large foolish belief. It is the sort of thing that results from the human belief engine misfiring; one of the numberless unfounded belief systems that we humans have inflicted on ourselves over the millennia. [See, e.g., "Choosing".]

The great task for all of us is to winnow our beliefs, affirming the ones that can be proven logically and factually, and ditching the others, the faulty explanations that cannot stand up to careful scrutiny. But thinking carefully — with logical rigor — does not come naturally or easily. It is a skill that must be developed. One of the chief goals of education should be to develop this skill; but this will happen only if we truly prize clear, logical thought. Steiner taught that we live in an era when it is necessary for humans to develop their thinking brains — but he also taught that real knowledge is not produced by use of the brain. He taught that real cognition is clairvoyance (the formation of spiritual pictures, "imaginations"), and that real or "living" thoughts come to us from prior lives or spiritual agents; they are not produced by logical thought. Submission to Steiner's doctrines makes the development of clear, logical thinking a minor goal, at best, in Waldorf schools. The faithful Anthroposophists on the faculty will emphasize imagination as a stepping stone to full-blown clairvoyance (the fantasy on which they are fixated); along the way, they will follow Steiner in giving logic, intellect, and science short shrift. Students steered away from reality in this manner may have a very hard time orienting themselves and functioning in the real world.

Shermer also presents an intriguing analysis of Objectivism as a cult. Most of what he says applies equally well to Anthroposophy. Check it out: chapter 8. The characteristics of a cult he identifies include "veneration of the leader" and "hidden agendas." [p. 119.] Anthroposophy certainly displays such characteristics. 

So. Steiner was wrong and Shermer is right. Yes?

No. That's not what I mean.

What I mean is that Steiner was wrong and Rawlings is right. Yes?

No. That's not what I mean, either.

I mean it is easy to be wrong and hard to be right. But it is possible to be right. Intelligence and education can work. We can reject error and find truth. It is hard, but it can be done.

Anthroposophists are wrong. This is clear. We need not deride them or despise them for their errors. They are wrong, but so what? They have the right to be wrong. They can make their own decisions, and if their decisions are wrong — as they clearly are — then they can live with the consequences. This is their right.

But if you have children, do not hand them over to Anthroposophists for schooling. Anthroposophists are wrong. Find other people — teachers with a handle on reality — to educate your children.

Other books comparable to WHY PEOPLE BELIEVE WEIRD THINGS include

HOW WE KNOW WHAT ISN'T SO, by Thomas Gilovich (Free Press, 1993)

THE PSYCHOLOGY OF THE PSYCHIC, by David Marks (Prometheus, 2000)

SCIENCE: Good, Bad and Bogus, by Martin Gardner (Prometheus, 1989)

HOW WE BELIEVE, by Michael Shermer (Henry Holt and Company, 2000)

PSEUDOSCIENCE AND THE PARANORMAL, by Terence Hines (Prometheus, 2003)

THE DEMON-HAUNTED WORLD, by Carl Sagan (Ballantine, 1996).

All these books are worth a look. You may also want to see "Clairvoyance".

“People who want to be anthroposophists...must pour what they have been during their dreamless sleep into the pure thoughts of anthroposophy with the help of their strong will ... We must not read anthroposophical books in the way we usually read ... Generally, people read only with their waking life ... To read anthroposophical books, we must enter into them with our whole being. Since we are unconscious during sleep and have no thoughts then — though our will is, of course, still there — we must put our whole will into the reading of anthroposophical books. If you make the contents of an anthroposophical book the object of your will, then you will become immediately clairvoyant....” — Rudolf Steiner, EARTHLY KNOWLEDGE AND HEAVENLY WISDOM (SteinerBooks, 1990), pp. 30-31.

You must will yourself to believe Anthroposophical teachings; you must close off your thoughts as if in deep sleep — but you must also exert your will. And then, bingo, you will be clairvoyant. Or, anyway, you may convince yourself that you are clairvoyant. All unknowing, Steiner prescribed a process of self-deception; this is the essence of Anthroposophy. And, indeed, Steiner's followers exert their will in just this way.

In THE PSYCHOLOGY OF THE PSYCHIC (Prometheus, 2000), David Marks lays out several reasons people often embrace pseudoscientific nonsense:

"[T]here are common psychological processes, or cognitive fallacies ... My goal here is to understand the difference between rationality and rationalization. By rationality I mean a self-correcting system of discovery. By rationalization I refer to all processes that make beliefs self-perpetuating regardless of the evidence." [p. 257.]

Here are the forms of rationalization Marks identifies:

SUBJECTIVE VALIDATION: Without meaning to, and usually without realizing it, we tend to grab any piece of information, any statement, and any opinion that supports the beliefs we already have, while we disregard contrary information, statements, and opinions. "Our beliefs are not automatically updated by the best evidence available. They often have an active life of their own and fight tenaciously for their own survival. They tell us what to read, what to listen to, who to trust, and how to explain away contrary information." [p. 259.]

The process even leads us to misunderstand evidence, interpreting it in ways that seem to support our beliefs — even when the information is actually antithetical to our beliefs. "Whenever a person misreads unfavorable or neutral evidence as giving positive support to his beliefs, a subjective validation has occurred." [p. 259.]

ILLUSORY CORRELATION: We tend to see what we want to see. If we are convinced that Americans are the happiest people on Earth (because we think America is the greatest nation with the best system of government) we tend to notice all the happy Americans we meet while disregarding or rationalizing away all the unhappy Americans we meet. Likewise, we note all the unhappy foreigners we read about, while disregarding reports of happy foreigners. "Illusory correlation is thus a type of subjective validation in which expected matches are imagined to occur more often than they really do." [p. 262.] (By the way, polls indicate that Americans are far from being the happiest people on Earth.)

FUDGING [4]: We all want to find meaning in life, and the strength of this desire leads us to fudge. Leaders who offer their followers meaning tend to use vague and even confusing terminology, while their yearning followers tend to interpret these utterances and all other "evidence" in ways that seem to — but do not really — offer confirmation. "Seers and prophets like Nostradamus play on human desires to see meaning by deliberately making their predictions in a 'cloudy manner with abstruse and twisted sentences.'" [p. 266.] If we are believers, we accept these sentences as true without subjecting them to rigorous, rational examination.

Marks offers the example of Nostradamus, whose predictions were often so vague as to lack any real, rational content. This enables followers of Nostradamus to interpret the predictions as they please, and to see the predictions "fulfilled" whenever any event occurs that might conceivably be interpreted as a realization of a favored reading of the prophecy. If Nostradamus said, for instance, "The blood of the just shall be demanded in London" [p. 263], this could be taken as a prediction of any event in which good or just Londoners lose blood: an IRA bombing in London, or a subway disaster in London, or any other misfortune that befalls any good people in London at any time for any reason. Then again, the "prophecy" might mean that London needs good, just people — perhaps to lead the government, or perhaps to donate blood for London hospitals, or perhaps to undergo eugenics experimentation, or perhaps...

BENDING THE FACTS: We tend to interpret the things we see, hear, read, etc., as substantiating our preferred view of reality. This is particularly true when we interpret the words of the "wise" leaders whom we choose to follow. "Once we think we know the correct link between a description and an event, we can easily make everything else fit around it." [p. 266.]

Sticking with examples from Nostradamus, Marks writes: "First one finds an apparent similarity between one or two notable facts in a [prediction] and a historical event. Given this organizing [link, one] can then shape both [prediction] and history to maximize an apparent agreement." [p. 264.]

PROCEEDING ON INSUFFICIENT EVIDENCE: Rationally gathering a large body of evidence and then logically analyzing it is hard work and takes a lot of time. We tend to take the easier way, basing our beliefs on minimal (or even no) real evidence. We are especially prone to do this in matters of the greatest importance, bearing on the meaning of our lives. We resist acknowledging that we do not have answers to some of life's deepest questions. Therefore, we leap to conclusions that satisfy us emotionally even though these conclusions do not stand up to careful, rational examination. "Human judgment is fallible, yet we make judgments constantly that are based on inadequate and probabilistic information. We can't really avoid this because, in spite of our uncertainties, we need a stable construction of reality in order to plan the future in a sensible way. Belief in the paranormal helps tie up 'loose ends' and clean out the oddities of experience that the normal [scientific] theory...cannot handle." [pp. 266-267.] So in the very area where we should tread most carefully — our core beliefs — we tend to jump to unwarranted conclusions. The obvious danger is that doing so may lead us to waste our lives, embracing beliefs that are largely or even wholly baseless, and doing so for years or decades or until the day we die.

In HOW WE KNOW WHAT ISN'T SO (Free Press, 1993), Thomas Gilovich discusses six errors that lead people to accept false ideas.

1. MAKING SOMETHING OUT OF NOTHING, OR MISUNDERSTANDING RANDOMNESS [6]: "We are predisposed to see [or imagine] order, pattern, and meaning in the world, and we find randomness, chaos, and meaninglessness unsatisfying." [p. 9.] So, confronted with confusing, random information or events, we concoct myths that "explain" them. The key problem here is failure to rationally test the stories we tell ourselves. A hypothesis is simply a story or idea framed in a way that allows us to determine whether it is true or false. The scientific method is simply the use of careful observation, experimentation, and reasoning to test hypotheses, to see if they are true. Unfortunately, people often embrace myths without making any genuine attempt to learn whether or not they are true.

2. ACCEPTING — AND MISINTERPRETING — INCOMPLETE OR UNREPRESENTATIVE INFORMATION. A small amount of evidence is necessary to begin with, but evidence must be extensive if it is to truly prove a hypothesis. "Unfortunately, people do not always appreciate [the] distinction between necessary and sufficient evidence, and they are often overly impressed by data that, at best, only suggests that a belief may be true ... [A] willingness to base conclusions on incomplete or unrepresentative information is a common cause of people's questionable and erroneous beliefs." [pp. 29-30.]

3. BIAS, THAT IS, SEEING WHAT WE WANT OR EXPECT TO SEE. "When examining evidence relevant to a given belief, people are inclined to see what they expect to see, and conclude what they expect to conclude. Information that is consistent with our pre-existing beliefs is often accepted at face value, whereas evidence that contradicts them is critically scrutinized and discounted."

4. DESIRE: SEEING WHAT WE WANT TO SEE. This is similar to bias, but desire operates at the emotional level while bias operates at the intellectual level. For example, "We are capable of believing the most flattering things about ourselves, and many scholars have argued that we do so for no other reason than that we want them to be true." [p. 76.] Anthroposophy flatters us in the most extraordinary way, teaching that we are the center of the universe — that indeed the gods worship us. Our desire to find meaning in our lives is thus satisfied (as long as we don't rationally examine the role our desire plays in this system of belief).

5. BELIEVING WHAT WE ARE TOLD, ESPECIALLY BY PEOPLE WE ADMIRE. "Much of what we know in today's world comes not from direct experience, but from what we read and what others tell us. An ever-higher percentage of our beliefs rest on a foundation of evidence that we have not collected ourselves. Therefore, by shedding light on the ways in which secondhand information can be misleading, we can better understand a common source of questionable and erroneous beliefs." [p. 90.] In Anthroposophy, theoretically, individuals can acquire their own direct knowledge of the spirit realm by becoming clairvoyant. But this is a delusion. In reality, Anthroposophists tend to believe what Steiner told them to believe. He was a great spiritual master, they say; thus their beliefs stem in large part from their admiration of their great guru.

6. OVERESTIMATING HOW MANY OTHER PEOPLE SHARE OUR BELIEFS, AND FINDING "PROOF" OF OUR OWN BELIEFS IN THE AGREEMENT OF OTHERS. This is a particularly prevalent error when one enters a close-knit group that limits ties to the wider world. Anthroposophical communities and Waldorf schools are examples: When you primarily hear only one set of ideas, endlessly repeated, you come to accept them more or less wholeheartedly. The small group and its beliefs are mistaken for wider reality and verifiable truth. "We often exaggerate the extent to which other people hold the same beliefs we do. Because our beliefs appear to enjoy more social support than is actually the case, they are more resistant to change than they would be otherwise." [p. 113.]

The human brain is predisposed to deceive itself. 
When it is confused or uncertain,
it tends to concoct "answers" that seem more or less sufficient
without actually providing the truth.

We have to work hard to be rational, to tame our tendency toward self-deception.
As Bertrand Russell said, "What is wanted is not the will to believe
but the wish to find out, which is the exact opposite."

Eminent astronomers once convinced themselves that the surface of Mars
is crisscrossed by many straight lines — channels or canals.
Seeing a jumble of confusing details on Mars, they unconsciously linked the various
dots and smudges revealed by their telescopes; they thus "saw" totally illusory patterns 
that they took to be real, as shown in the Martian maps they produced.

All of us share this tendency to "see" things that aren't there.
It is hard not to see a square and a large
downward-pointing triangle in the images below.
But there is no square, there is no downward triangle:
When we see them, we are deceiving ourselves.

[Images from Michael Shermer's HOW WE BELIEVE (Henry Holt and Company, 2000), 
and Terence Hines' PSEUDOSCIENCE AND THE PARANORMAL (Prometheus Books, 2003).]

The human brain's tendency to create illusory patterns
— and then to accept these as real —
can be seen in the "constellations" of the sky.

Constellations of the southern sky.

Constellations do not really exist,
although occultism (including Anthroposophy) finds great importance in them.
The sky overhead is dotted with an enormous, random swarm of lights.*
Disliking this confused spectacle, our brains get busy imposing an illusion of order.
Constellations consist of stars and galaxies that our brains falsely tie together, 
although in reality the pinpoints of light we assign to each constellation are nowhere near each other.**
If we were to travel far enough from Earth, all of the constellations that
we "see" from Earth would vanish — the illusory patterns would break apart.
But our brains would doubtless "see" new constellations then —
that is, confronted with a different swarm of random lights scattered across the sky, 
our brains would impose new illusory patterns on them.

Seeing something where nothing actually exists
is characteristic of fallacies such as Anthroposophy.
An important part of wisdom is to comprehend
how our brains sometimes trick us, and to resist this trickery.
When we do so, false systems of thought such as astrology and Anthroposophy
lose much of their power over us — they break apart like illusory constellations.


*There is order in the cosmos, such as the orderly shape of our galaxy, the Milky Way.
But we cannot perceive much of this actual order from our vantage point inside the Milky Way.
In the image above, the Milky Way is the disorderly white stripe.

**Some of the stars "in" any constellation may be fairly near to Earth, as cosmic distances go,
but others may be vastly farther from both the Earth and the other stars "in" the constellation.
Moreover, some of the lights that appear to be stars "in" a constellation are actually entire galaxies
that are so far away as to appear as single points of light.

Some scientists think that a tendency to believe in the supernatural is built into human nature.
This tendency does not prove the existence or non-existence of God(s),
nor does it mean that any particular religious doctrine is true or false.
Rather, these scientists argue that believing in the existence of supernatural beings has helped humans to survive and flourish.

“Religion has the hallmarks of an evolved behavior, meaning that it exists because it was favored by natural selection. It is universal because it was wired into our neural circuitry before the ancestral human population dispersed from its African homeland.” — Nicholas Wade, “The Evolution of the God Gene”, THE NEW YORK TIMES, November 15, 2009.

Developing and affirming a set of religious beliefs helped early humans in many ways — not because God or the gods came to their aid, necessarily, but because shared beliefs united societies and emboldened their members.

“It is easier to see from hunter-gatherer societies how religion may have conferred compelling advantages in the struggle for survival. Their rituals emphasize not theology but intense communal dancing that may last through the night. The sustained rhythmic movement induces strong feelings of exaltation and emotional commitment to the group. Rituals also resolve quarrels and patch up the social fabric.

“...Religion served them as an invisible government. It bound people together, committing them to put their community’s needs ahead of their own self-interest. For fear of divine punishment, people followed rules of self-restraint toward members of the community. Religion also emboldened them to give their lives in battle against outsiders. Groups fortified by religious belief would have prevailed over those that lacked it, and genes that prompted the mind toward ritual would eventually have become universal.” — Ibid.

So a propensity to believe spread throughout humanity because of natural selection: Societies that were strengthened by supernatural beliefs prevailed over societies that lacked such beliefs. The believers passed on their genes while the unbelievers, losers in the evolutionary struggle, tended to die out. Importantly, note that religion could provide benefits — such as making warriors brave — whether or not the religious doctrines of any particular group were true or false. Believing was the key. Truth was largely irrelevant.

“A propensity to learn the religion of one’s community [5] became so firmly implanted in the human neural circuitry, according to this new view, that religion was retained when hunter-gatherers, starting from 15,000 years ago, began to settle in fixed communities. In the larger, hierarchical societies made possible by settled living, rulers co-opted religion as their source of authority. Roman emperors made themselves chief priest or even a living god....

“Religion was also harnessed to vital practical tasks such as agriculture, which in the first societies to practice it required quite unaccustomed forms of labor and organization. Many religions bear traces of the spring and autumn festivals that helped get crops planted and harvested at the right time. Passover once marked the beginning of the barley festival; Easter, linked to the date of Passover, is a spring festival.” — Ibid.

Our predisposition to hold supernatural beliefs can become a serious hurdle. We want to believe, so many of us do believe, whether our beliefs are right or wrong, true or false. The same predisposition discourages us from probing deeply: we feel a strong psychological aversion to examining our beliefs, to testing them to determine whether they are true. We are primed, in other words, not to think critically about the very concepts that are most central to our lives, the very concepts we should examine more carefully than any others. The bias that is "so firmly implanted in the human neural circuitry" makes it hard for us to question our doctrines in order to discover the truth about them.

Much of the thinking that runs afoul of real science can be classified as pseudoscience — 
that is, it is unscientific thinking dressed up in an imitation of scientific respectability. 
Spiritual science — Anthroposophy — falls in this category.

Here are excerpts from some more books that deal with pseudoscience. 
I'll begin by quoting Martin Gardner in the Introduction to his book SCIENCE: Good, Bad and Bogus (Prometheus, 1989).

"We all know there have been occasions when top scientists ridiculed ideas that later proved to be true ... [But we must not] forget that for every example of a crank who later became a hero there were thousands of cranks who forever remained cranks. We must not forget that for every outcast theory raised to respectability by a scientific revolution there were thousands of crazy theories that permanently bit the dust.

"...The science community today assigns a probability close to one [i.e., 100%] to the belief that Venus was a planet long before the human race evolved. By the same token, it gives a probability close to zero that Venus originated as a comet....

"Cranks by definition believe their theories, and charlatans do not, but this does not prevent a person from being both a crank and a charlatan. It is a familiar combination in the history of pseudoscience and occultism....

"I hope no one will imagine that I believe that cranks should be silenced by any kind of legislation. In a free society every crank has a right to be heard ... Crank books — on how to lose weight without cutting down on calories, on how to talk to plants, on how to cure your ailments by rubbing your feet, on how to apply horoscopes to your pets, on how to use ESP in making business decisions, on how to sharpen razor blades by putting them under little models of the Great Pyramid of Egypt — far outsell most books by reputable scientists.

"I do not believe that books on worthless science, promoted into bestsellers by cynical publishers, do much damage except in areas like medicine, health, and anthropology ... Although I am opposed to any kind of law that tells a publisher or a movie or TV producer what cannot be done, I reserve the right of moral indignation both as an individual and as a member of a pressure group.

"I was among four representatives of the Committee for Scientific Investigation of Claims of the Paranormal who met in 1977 with a group of NBC officials to protest that network's outrageous pseudodocumentaries about the marvels of occultism. One official shouted in anger, "I'll produce anything that gets high ratings!" I thought to myself: this should be engraved on his tombstone ... The sad fact was that not a single NBC official at our meeting knew enough about science to comprehend in the slightest the degree to which their moronic shows about the paranormal were in bad taste.

"...In discussing extremes of unorthodoxy in science I consider it a waste of time to give rational arguments. Those who are in agreement do not need to be educated about such trivial matters, and trying to enlighten those who disagree is like trying to write on water. People are not persuaded by arguments to give up childish beliefs; either they never give them up or they grow up ... As for those who have not yet made up their minds about evolution (and there are millions), the best advice you can give is to suggest that they go to a university and take some introductory courses in geology. Without such basic knowledge they will not even understand your arguments.

"...[W]hen writing about extreme eccentricities of science, I have adopted H. L. Mencken's sage advice: one horse-laugh is worth ten thousand syllogisms. Concerning less extreme claims, such as those of parapsychology, I have occasionally called attention to poor experimental design and the prevalence of fraud; but even in this twilight area such arguments are unlikely to have any effect on the true believer."

Belief in God, or gods, or angels, does not necessarily contradict the findings of modern science. 
But the distinction between belief and knowledge must be recognized, as well as the physical and psychological influences that can come into play. 
The level of science education can also be critically important. 
The following is from Michael Shermer's HOW WE BELIEVE (Henry Holt and Company, 2000), pp. 244-248:

"[P]aranormal thoughts and beliefs may be associated with high levels of dopamine in the brain. The significant effect is that L-dopa makes skeptics less skeptical. By contrast, and surprisingly, L-dopa did not seem to increase the tendency of believers [to see "confirmations" of their beliefs] ... [T]his could mean that there is a plateau effect for believers, with more dopamine having relatively little effect above their belief threshold.

"It is possible that studies like these will begin to explain at least some of the reasons why people believe in paranormal, supernatural, and spiritual entities. Just how many believe in such ephemera? ...[In a poll of Americans] 77 percent answered 'Yes' to the question: 'Do you believe angels, that is, some kind of heavenly beings who visit Earth, in fact exist?'

"...Consider the following account given by Catherine Forbes ... 'Yes, I absolutely believe in angels. I met one.' The circumstances of this experience are telling. After the death of her husband, Forbes decided to take a trip to Jerusalem with a friend in 1953. On their way through the Dallas airport they got lost and became anxious. 'All of a sudden, the nicest voice I ever heard said, "May I help you?" I turned and saw a clean-cut young man, just the most handsome, beautiful man. He picked up my luggage and showed me where to go and which people I was traveling with. I turned around to thank him, and he had absolutely disappeared.'

"...There is no doubt that Forbes's experience was a real one. [Was Forbes helped by a young man? Probably. Was he an angel? There is no evidence. Are her memories about the man's appearance and voice reliable? Gratitude and pre-existing belief, e.g. in angels, can easily color memory.] But was the source of her experience inside or outside the brain? The scientific evidence shows that such experiences are brain-generated, mediated by past experiences (in the form of memories) and the context in which they occur (in this case the airport during an episode of extreme grief). There is no need to call forth supernatural explanations when natural ones will do.

"[From] the Gallup News Service ... June 8, 2001 ... '[H]alf or more of Americans believe in...psychic or spiritual healing, and extrasensory perception (ESP), and a third or more believe in such things as haunted houses, possession by the devil, ghosts, telepathy, extraterrestrial beings having visited earth, and clairvoyance.'

"...[Low levels of science education play a role.] In April, 2002, the NSF [National Science Foundation] published their biennial report of the state of science understanding and public attitudes toward science ... 'Belief in pseudoscience, including astrology, extrasensory perception (ESP), and alien abductions, is relatively widespread and growing ... [Many people said] astrology was at least somewhat scientific, and a solid majority (60 percent) agreed with the statement "some people possess psychic powers or ESP." Gallup polls show substantial gains in almost every category of pseudoscience during the last decade. Such beliefs may sometimes be fueled by the media's miscommunication of science and the scientific process.'

"...70 percent of Americans still do not understand the scientific process, [defined as] grasping three concepts: probability, the experimental method, and hypothesis testing. One solution is more and better science education.

"...The NSF report concluded: 'Although more than 50 percent of NSF survey respondents in 2001 had some understanding of probability, and more than 40 percent were familiar with how an experiment is conducted, only one-third could adequately explain what it means to study something scientifically. Understanding how ideas are investigated and analyzed is a sure sign of scientific literacy. Such critical thinking skills can also prove advantageous in making well-informed choices at the ballot box and in other daily living activities.'"

Here is Terence Hines, in his book PSEUDOSCIENCE AND THE PARANORMAL (Prometheus, 2003), pp. 13-17:

"The most common characteristic of pseudoscience is the nonfalsifiable or irrefutable hypothesis. This is a hypothesis against which there can be no evidence — that is, no evidence can show the hypothesis to be wrong. It might at first seem that such a hypothesis must be true, but a bit of reflection...will demonstrate just the opposite. Consider the following hypothesis: 'I, Terence Michael Hines, am God incarnate, and I created the universe thirty seconds ago.' Now, you probably don't believe this hypothesis, but how could you go about disproving it? You could argue, 'You say you created the universe thirty seconds ago, but I have memories from years ago. So, you're not God.' But I reply, 'When I created the universe, I created everyone complete with memories.' We could go on like this for some time and you would never be able to prove that I'm not God. Nonetheless, this hypothesis is clearly absurd!

"...Those who are skeptical of pseudoscientific and paranormal claims are frequently accused of being close-minded for demanding adequate evidence and proof ... But who is really being close-minded? ... [T]he believer, who likes to paint him or herself as open-minded and accepting of new possibilities, is actually extremely close-minded. After all, the irrefutable hypothesis is really saying 'There is no conceivable piece of evidence that will cause me to change my mind!' This is true close-mindedness.

"...A second characteristic of pseudoscience is the proponents' unwillingness to look closely at the phenomenon they claim exists. In other words, careful, controlled experiments that would demonstrate the existence of the phenomenon — if it were real — are not conducted. The reality of the phenomenon is uncritically accepted, and the need for hard data and facts is belittled.

"...In reality, the burden of proof should rest squarely on the one who is making the extraordinary claim. This is because, as we have seen, it is often impossible to disprove even a clearly ridiculous claim. Consider the claim that Santa Claus is a real, living person ... The burden of proof must rest with the proponent. He or she must bring forth clear, acceptable evidence that Santa Claus is real and not simply demand that skeptics explain away miscellaneous reports to prove that Santa doesn't exist."

In THE DEMON-HAUNTED WORLD (Ballantine, 1996), pp. 210-216, Carl Sagan offers a set of rules for detecting “baloney.”

People who spread baloney (i.e., illogical nonsense) may deceive themselves even as they deceive others.

Sagan begins with principles we all should observe if we want to make sense; he then lists errors we all should avoid.


• CONFIRM [7]: "Wherever possible there must be independent confirmation of the 'facts.'"

• DEBATE: "Encourage substantive debate on the evidence by knowledgeable proponents of all points of view."

• THINK FOR YOURSELF: "Arguments from authority carry little weight — 'authorities' have made mistakes in the past. They will do so again in the future. Perhaps a better way to say it is that in science there are no authorities; at most, there are experts."

• CONSIDER ALTERNATIVES: "Spin [or develop] more than one hypothesis. If there's something to be explained, think of all the different ways in which it could be explained. Then think of tests by which you might systematically disprove each of the alternatives. What survives, the hypothesis that resists disproof in this Darwinian selection among 'multiple working hypotheses,' has a much better chance of being the right answer than if you had simply run with the first idea that caught your fancy." 

• DON'T GET ATTACHED: "Try not to get overly attached to a hypothesis just because it's yours. It's only a way station in the pursuit of knowledge. Ask yourself why you like the idea. Compare it fairly with the alternatives. See if you can find reasons for rejecting it. If you don't, others will."

• MEASURE, WEIGH, COUNT: "Quantify. If whatever it is you're explaining has some measure, some numerical quantity attached to it, you'll be much better able to discriminate among competing hypotheses. What is vague and qualitative is open to many explanations. Of course there are truths to be sought in the many qualitative issues we are obliged to confront, but finding them is more challenging."

• CHECK EACH LINK: "If there's a chain of argument, every link in the chain must work (including the premise) — not just most of them."

• AVOID NEEDLESS COMPLICATION: "Occam's Razor. This convenient rule-of-thumb urges us when faced with two hypotheses that explain the data equally well to choose the simpler." [Of course, this is not an invitation to oversimplify; it is a recommendation to avoid elaborations that are, in truth, unnecessary. If a phenomenon can be explained by referring to one law of physics, this explanation is probably better than one that calls in several laws of physics.]

• AVOID IRREFUTABLE HYPOTHESES [8]: "Always ask whether the hypothesis can be, at least in principle, falsified [that is, a hypothesis is useless if there is no way to prove it right or wrong]. Propositions that are untestable, unfalsifiable are not worth much. Consider the grand idea that our Universe and everything in it is just an elementary particle — an electron, say — in a much bigger Cosmos. But if we can never acquire information from outside our Universe, is not the idea incapable of disproof? You must be able to check assertions out. Inveterate skeptics must be given the chance to follow your reasoning, to duplicate your experiments and see if they get the same result." [pp. 210-211.]


"...In addition to teaching us what to do when evaluating a claim to knowledge, any good baloney detection kit must also teach us what not to do. It helps us recognize the most common and perilous fallacies of logic and rhetoric. Many good examples can be found in religion and politics, because their practitioners are so often obliged to justify two contradictory propositions. Among these fallacies are:

"• AD HOMINEM [9] — Latin for 'to the man,' attacking the arguer and not the argument (e.g., The Reverend Dr. Smith is a known Biblical fundamentalist, so her objections to evolution need not be taken seriously);

"• ARGUMENT FROM AUTHORITY (e.g., President Richard Nixon should be re-elected because he has a secret plan to end the war in Southeast Asia — but because it was secret, there was no way for the electorate to evaluate it on its merits; the argument amounted to trusting him because he was President: a mistake, as it turned out);

"• ARGUMENT FROM ADVERSE CONSEQUENCES (e.g., A God meting out punishment and reward must exist, because if He didn't, society would be much more lawless and dangerous — perhaps even ungovernable.  Or: The defendant in a widely publicized murder trial must be found guilty; otherwise, it will be an encouragement for other men to murder their wives); 

"• APPEAL TO IGNORANCE — the claim that whatever has not been proved false must be true, and vice versa (e.g., There is no compelling evidence that UFOs are not visiting the Earth; therefore UFOs exist — and there is intelligent life elsewhere in the Universe. Or: There may be seventy kazillion other worlds, but not one is known to have the moral advancement of the Earth, so we're still central to the Universe.) This impatience with ambiguity can be criticized in the phrase: absence of evidence is not evidence of absence.

"• SPECIAL PLEADING, often to rescue a proposition in deep rhetorical trouble (e.g., How can a merciful God condemn future generations to torment because, against orders, one woman induced one man to eat an apple? Special plead: you don't understand the subtle Doctrine of Free Will. Or: How can there be an equally godlike Father, Son, and Holy Ghost in the same Person? Special plead: You don't understand the Divine Mystery of the Trinity. Or: How could God permit the followers of Judaism, Christianity, and Islam — each in their own way enjoined to heroic measures of loving kindness and compassion — to have perpetrated so much cruelty for so long? Special plead: You don't understand Free Will again. And anyway, God moves in mysterious ways.)

"• BEGGING THE QUESTION, also called assuming the answer (e.g., We must institute the death penalty to discourage violent crime. But does the violent crime rate in fact fall when the death penalty is imposed? Or: The stock market fell yesterday because of a technical adjustment and profit-taking by investors — but is there any independent evidence for the causal role of 'adjustment' and profit-taking; have we learned anything at all from this purported explanation?);

"• OBSERVATIONAL SELECTION, also called the enumeration of favorable circumstances, or as the philosopher Francis Bacon described it, counting the hits and forgetting the misses  (e.g., A state boasts of the Presidents it has produced, but is silent on its serial killers); 

"• STATISTICS OF SMALL NUMBERS — a close relative of observational selection (e.g., 'They say 1 out of every 5 people is Chinese. How is this possible? I know hundreds of people, and none of them is Chinese. Yours truly.' Or: 'I've thrown three sevens in a row. Tonight I can't lose.');

"• MISUNDERSTANDING OF THE NATURE OF STATISTICS (e.g., President Dwight Eisenhower expressing astonishment and alarm on discovering that fully half of all Americans have below average intelligence);

"• INCONSISTENCY (e.g., Prudently plan for the worst of which a potential military adversary is capable, but thriftily ignore scientific projections on environmental dangers because they're not 'proved.' Or: Attribute the declining life expectancy in the former Soviet Union to the failures of communism many years ago, but never attribute the high infant mortality rate in the United States (now highest of the major industrial nations) to the failures of capitalism. Or: Consider it reasonable for the Universe to continue to exist forever into the future, but judge absurd the possibility that it has infinite duration into the past);

"• NON SEQUITUR — Latin for 'It doesn't follow' (e.g., Our nation will prevail because God is great. But nearly every nation pretends this to be true; the German formulation was 'Gott mit uns'). Often those falling into the non sequitur fallacy have simply failed to recognize alternative possibilities;

"• POST HOC, ERGO PROPTER HOC — Latin for 'It happened after, so it was caused by' (e.g., Jaime Cardinal Sin, Archbishop of Manila: 'I know of...a 26-year-old who looks 60 because she takes [contraceptive] pills.' Or: Before women got the vote, there were no nuclear weapons);

"• MEANINGLESS QUESTION (e.g., What happens when an irresistible force meets an immovable object? But if there is such a thing as an irresistible force there can be no immovable objects, and vice versa);

"• EXCLUDED MIDDLE, or false dichotomy — considering only the two extremes in a continuum of intermediate possibilities (e.g., 'Sure, take his side; my husband's perfect; I'm always wrong.' Or: 'Either you love your country or you hate it.' Or: 'If you're not part of the solution, you're part of the problem');

"• SHORT-TERM VS. LONG-TERM — a subset of the excluded middle, but so important I've pulled it out for special attention (e.g., We can't afford programs to feed malnourished children and educate pre-school kids. We need to urgently deal with crime on the streets. Or: Why explore space or pursue fundamental science when we have so huge a budget deficit?);

"• SLIPPERY SLOPE, related to excluded middle (e.g., If we allow abortion in the first weeks of pregnancy, it will be impossible to prevent the killing of a full-term infant. Or, conversely: If the state prohibits abortion even in the ninth month, it will soon be telling us what to do with our bodies around the time of conception);

"• CONFUSION OF CORRELATION AND CAUSATION (e.g., A survey shows that more college graduates are homosexual than those with lesser education; therefore education makes people gay. Or: Andean earthquakes are correlated with closest approaches of the planet Uranus; therefore — despite the absence of any such correlation for the nearer, more massive planet Jupiter — the latter causes the former); 

"• STRAW MAN — caricaturing a position to make it easier to attack (e.g., Scientists suppose that living things simply fell together by chance — a formulation that willfully ignores the central Darwinian insight, that Nature ratchets up by saving what works and discarding what doesn't. Or — this is also a short-term/long-term fallacy — environmentalists care more for snail darters and spotted owls than they do for people);

"• SUPPRESSED EVIDENCE, or half-truths (e.g., An amazingly accurate and widely quoted 'prophecy' of the assassination attempt on President Reagan is shown on television; but — an important detail — was it recorded before or after the event? Or: These government abuses demand revolution, even if you can't make an omelette without breaking some eggs. Yes, but is this likely to be a revolution in which far more people are killed than under the previous regime? What does the experience of other revolutions suggest? Are all revolutions against oppressive regimes desirable and in the interests of the people?);

"• WEASEL WORDS (e.g., The separation of powers of the U.S. Constitution specifies that the United States may not conduct a war without a declaration by Congress. On the other hand, Presidents are given control of foreign policy and the conduct of wars, which are potentially powerful tools for getting themselves re-elected. Presidents of either political party may therefore be tempted to arrange wars while waving the flag and calling the wars something else — 'police actions,' 'armed incursions,' 'protective reaction strikes,' 'pacification,' 'safeguarding American interests,' and a wide variety of 'operations,' such as 'Operation Just Cause.' Euphemisms for war are one of a broad class of reinventions of language for political purposes. Talleyrand said, 'An important art of politicians is to find new names for institutions which under old names have become odious to the public')." [pp. 211-216.] 
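Sagan's "statistics of small numbers" entry includes the gambler's reasoning, "I've thrown three sevens in a row. Tonight I can't lose." The flaw can be checked numerically: with fair dice, a streak of sevens tells you nothing about the next roll. The sketch below is a minimal illustration in Python (the function and parameter names are my own, invented for this example, not drawn from any of the quoted authors).

```python
import random


def roll_pair(rng):
    """Roll two fair six-sided dice and return their sum."""
    return rng.randint(1, 6) + rng.randint(1, 6)


def seven_rate_after_streak(trials=200_000, streak=3, seed=42):
    """Estimate P(next roll is 7 | the previous `streak` rolls were all 7)."""
    rng = random.Random(seed)
    hits = total = 0
    run = 0  # length of the current streak of sevens
    for _ in range(trials):
        r = roll_pair(rng)
        if run >= streak:  # the preceding `streak` rolls were all sevens
            total += 1
            if r == 7:
                hits += 1
        run = run + 1 if r == 7 else 0
    return hits / total if total else float("nan")


# The conditional estimate should hover around 1/6, the same as the
# unconditional chance of rolling a seven: past sevens change nothing.
print(seven_rate_after_streak())
```

Running this with different seeds keeps the conditional estimate near 1/6 ≈ 0.167, which is exactly the unconditional probability of rolling a seven. The streak is a small, unrepresentative sample, not a trend.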

The following is excerpted from THE SKEPTICAL INQUIRER, Volume 34, Issue 5 (Sept-Oct, 2010):

“Are true believers in the paranormal crazy? ... Contrary to popular myth, schizophrenics do not have ‘split’ or ‘multiple’ personalities. Many schizophrenics have auditory hallucinations in which they hear voices. Actually, auditory hallucinations are relatively common even among perfectly normal people. Often, schizophrenics claim their [imagined] voices and perceptual distortions come from outside sources, like space aliens, CIA operatives, therapists, their mothers, and so on, trying to control their thoughts. Why do such delusions emerge?

“One popular theory is that terrified and confused schizophrenics try to make some sense of a frightening world by concluding that the voices come from an outside source ... When such explanations are challenged or don’t fit apparent reality (no flying saucers are found), the patient patches up the false belief so it is still believable.

“Logically, there are only a few ways to do this. First, you can conclude you are very special, a truly unique person, the only one who has paranormal abilities ... Second, you can conclude that your voices could be heard by others if some external paranormal force or agency wasn’t preventing it....

“I propose that perfectly sane, intelligent, and honest true believers in the paranormal — those who refuse to question or use sensible tools of critical thinking — possess a similar thinking process. For example, Sue might believe that she has been abducted by space aliens. Maybe one night she woke up paralyzed and saw the dark figure of a space creature that stood by her bed, touched her, and took her breath away. Of course, there may be numerous alternative explanations: perhaps Sue experienced a sleep-related hallucination, misinterpreted a shadow, or has faulty memory....

“However, Sue is convinced that her alien experience is real and can’t be explained through science — that it was a paranormal event. Sue looks for reassuring answers. Like the schizophrenic, she has two choices.

“One possibility is that she is unique and special....

“Alternatively, Sue...might favor an external explanation ... [Perhaps] scientists see no evidence for abductions because some external paranormal force or agency is getting in the way ... [Or perhaps] evidence for alien abductions eludes scientists and experts because there is a conspiracy among those in power....

“Like the schizophrenic, Sue may be motivated to ‘patch up’ her beliefs when confronted with conflicting evidence....

“Sane, intelligent, educated, honest, and perfectly decent people are quite capable of experiencing profound delusions and distortions....

“...For those blessed with at least a modicum of sanity, the tools of critical thinking give some chance of remaining grounded in the world as it is.”

— Compilation and commentary by Roger Rawlings



[1] I don’t remember when I first heard the word “Anthroposophy,” and it is hard now to remember how many Anthroposophical doctrines I internalized and accepted while still very young. My teachers were circumspect, leading us students into Steiner’s universe without openly telling us — or our parents — what they were doing.

[2] These are Anthroposophical tenets.

[3] Shermer does not mention Anthroposophy. The application of his conclusions to Anthroposophy is my own contribution.

[4] This heading and the following ones are my own contribution. Marks covers topics 4-6 under the heading "GETTING THE BEST OF NOSTRADAMUS", but in the summary of the chapter he separates these topics. This is the pattern I have followed.

[5] Some people become Anthroposophists by accepting it from their parents, teachers, or other members of their community. Indeed, converting young people to Anthroposophy is the central unannounced goal of Waldorf schools. But for other people, the process is reversed: Individuals who were not raised in Anthroposophy leave their original faiths to select Anthroposophy. Thereafter, they may associate mainly, if not exclusively, with other Anthroposophists. In other words, for them the community may arise from the faith rather than vice versa.

[6] I have altered the wording of Gilovich's headings, aiming at greater clarity.

[7] For consistency, I have added the capitalized headings; they do not appear in the book.

[8] See Terence Hines, above.

[9] For consistency, I have capitalized these headings; in the book, they are shown lower case.

[Public domain optical illusions/tricks.] 

Our brains and senses can deceive us. This does not mean, however, that we are unable to discover the truth.

And it does not justify the sorts of illogical arguments Anthroposophists often use,

such as "Our physical brains and senses are unreliable, therefore we must turn to nonphysical organs of clairvoyance."

Or, "Physical science is imperfect, therefore we must turn to spiritual science."

We can see the error in such arguments if we ask the simple question:

Do the things being advocated (organs of clairvoyance and spiritual science) actually exist?

They don't. They are fantasies, so the arguments being offered get us nowhere.

Looked at more formally, these arguments are clearly logical fallacies:

"A is wrong, therefore B is right." No. B may be wrong as well, indeed it may be more wrong than A.

"A is wrong, therefore the exact opposite of A (call it contra-A or X) must be right." No. A and X may both be wrong,

and some other answer (call it T) may be the truth.

Our brains can work, and we can find truth. We just have to work at it 

while rejecting fantastical, false answers such as those Steiner offered.