I like arguing (discussing, if you'd rather). Especially about complicated topics. And let's not hide it, I am mainly thinking about Internet arguments (although not exclusively). However, it is so annoying when (as if it were uncommon...) people resort to fallacious arguments, don't understand the basic premises of an argument, or don't see the psychological traps they are falling into. And it's even worse when you try to explain it to them and their reaction is a confused face: "What is a fallacy?"
Image: Ad Hominem Attacks, a comic representation.
(Sorry, SMBC, I hope you don't mind that I use your comic)
Let me start with a positive message: I am of the strong opinion that everyone is entitled to an opinion about anything (pun intended). By writing this I am not trying to say "shut up, you don't know what you're talking about". A lot of very wise people agree that uninformed opinions sometimes provide very useful, original points of view that experts couldn't have come up with. And even disregarding that, not being an expert in something doesn't make you unable to say useful things about it. So please, do keep expressing your opinion. But please, also do so with a minimum of civility, logic and basic common sense. That is what I'm trying to help provide here: civility, logic and basic common sense.
We all fall into the things I talk about here (and yes, that intentionally means "ME INCLUDED"). However, knowing about them can help us make an effort to avoid them and make arguments more useful and interesting. Introspection is key to making arguments evolve instead of running into loops or deadlocks. I will keep adding well-known concepts from psychology for this purpose (disclaimer: I am not actually a psychologist, not a certified one at least, but I do believe I know what I'm talking about here more than most people (kill me)). Currently there's no particular order to these. Once I find one that makes sense, I may use it.
So let's start with what I just mentioned: Introspection. Most people are familiar with the word, but I find a lot of people don't really understand what it means and why it is important. You can look up definitions and explanations that look a lot more professional all over the place, but let me give my own. Introspection is the ability to understand one's own thought processes: to think about your own thinking as a physical process in which causes have effects, and to realize that no one is safe from conditioning.
For example, if I shout at someone "Leave me alone!", introspection may be that, a second later, I think about why I did that, and realize that I haven't slept much today and I'm irritable, and that is probably what caused me to react in such a way. It's useful because realizing this allows me to apologize for it, and therefore have a more constructive argument: "I'm sorry. I'm not feeling well and I'm irritable. Could we just do this some other time?". Without introspection, I would just let the tiredness and irritability control me. It is not wrong to be tired and irritable sometimes, but if you can be introspective about it, it's a lot easier to deal with.
When someone really believes something (because they have an intuition, or because they really want it to be true), they will notice the arguments in favour of it more than the ones against it. In general, a bias is any way in which empirical facts and arguments are not treated equally, in which we favour some over others. Confirmation bias is when our desire to be right creates this bias.
Maybe the best way to see examples of confirmation bias is to note that optimism and pessimism are, in some sense and in some cases, forms of confirmation bias. Take a pessimistic person who truly believes the world is against them. They will take anything that happens, find the worst possible interpretation of it, and use it to justify that the world is indeed against them. They are biased towards arguments and facts that reinforce this idea, and ignore or minimize any good things that happen to them.
A more specific example of confirmation bias would be that of a person who has a strong belief in, for example, ghosts. (I'm not arguing over whether ghosts are real or not; that is irrelevant here. The point is that there is a discussion to be had over whether ghosts exist, with arguments for and against.) That person may have had an experience for which they find the most plausible explanation to be the existence of ghosts. Confirmation bias would be this person giving all the weight in the discussion to this particular experience and ignoring all the scientific evidence that seems to indicate that ghosts do not exist (which there is, even though scientific evidence is never conclusive). They believe ghosts exist, and therefore they will give more value to arguments that validate their opinion than to arguments that invalidate it.
It's important to understand that confirmation bias is an unconscious behaviour. It is not done on purpose. We all have a tendency toward it. But understanding what it is and how it may affect us can help us, through introspection, realize when we're doing it and avoid it.
This is not really a psychological concept, but I mentioned scientific arguments in Confirmation Bias, so I thought it would be appropriate to talk about it here.
Science does not prove things. Nothing is scientifically proven.
Let's say it again. Science does not prove things. Nothing is scientifically proven.
If you are a scientist, you know this. The world is, however, full of "science fans" who are not really scientists, and who get really confused about this. Science is, by definition, empirical. This means that it is based on real-world experiments, measurements and observations. Because of this, all it ever does is provide evidence. But no amount of evidence, no matter how large, is a proof. Proofs belong exclusively to mathematics, and mathematics is but a tool that we use to deal with certain parts of arguments (in other words, to transform evidence of a certain thing X into evidence of a certain other thing Y, without losing any value). Mathematics cannot, per se, provide us with truths about our physical universe. We need science for that, and therefore we will never have proof.
Yes, this means that relativity, quantum physics, Newton's laws (which were, by the way, disproven long ago), etc. are not proven. Science does disprove things, though. A scientific theory predicts things, and finding evidence that those predictions are correct provides evidence for that theory, but finding just one case in which those predictions are wrong disproves it: it is certainly false, or rather, certainly not entirely correct. This is the case with, for example, Newton's laws. They are disproven, but they are still useful for thinking about the vast majority of cases; they are almost correct.
So please, don't say something as stupid as "science has proven that god does not exist", because you're only making yourself look like a fool. Science has never proven anything. It has only provided evidence for a lot of things. Lots of evidence, enough to make us almost sure about them. But they're likely still a bit wrong, and arguing against any generally accepted scientific theory is always a legitimate thing to do. It just so happens that you are going to have to deal with a lot of evidence against you, which, if you don't fall into confirmation bias, means you will have a lot of good arguments to successfully counter.
"Echo Chamber" is a term used to refer to a situation where a group of people who agree on some particular topic reinforce each other's opinions on the topic. Say I believe the earth was created by Mickey Mouse. Say you do too. If we meet and share our opinion, we're both going to believe more strongly in that. Now, because we agree on this, it is likely that we stay close to each other, and so we are going to keep hearing each other talk about all the details of how Mickey Mouse created the earth. This is an Echo Chamber. It's not something that you can really avoid, but if you're aware (again, through introspection) that you may be part of one or several of these, you may be more able to healthily include other less similar opinions into your thought processes.
Being part of an Echo Chamber does not automatically invalidate your argument. You can be part of an Echo Chamber which says something that is entirely correct. But the important thing is, hearing the echoes from the chamber over and over will not strengthen the validity of your opinion, and psychology clearly says that you are likely to stay near the chamber because everyone likes hearing other people agree with them. Be aware of it, and control how it affects your thinking.
"Who's right? Them or us?"
"Oh, you don't agree that meat should be banned? Then you must think that animals don't deserve any respect."
The binary mentality is the tendency to think that there are only two possible positions on a topic. It's either them or us, there's nothing else. It is a dangerous way of thinking because... well... because in most cases there is actually a wide range of different opinions, and oversimplifying to just two limits our ability to understand the problem.
Put like this, it seems clear to most people that binary mentality is a dangerous thing, but why is it worth talking about? Once again, the answer comes through introspection. Try to evaluate whether you're falling into a binary way of seeing the world in your arguments, whether you are subconsciously hiding reasoning patterns of the form "because you don't agree with X, you must think not-X". You'd be surprised how often you do it without even realizing; we all do. Our minds like simplicity; we are subconsciously always trying to understand things in a simpler way that lets us think about them with more confidence and more quickly. And binary mentality is just very simple, so it is very attractive to the subconscious.
If you don't know what a fallacy is, you're clearly the target of this text. A fallacy is an argument pattern that is, simply, not valid: concluding B from A through reasoning that doesn't actually hold. For example (introducing one of the most typical kinds of fallacious arguments, the ad hominem attack): "You say A, but you're stupid. So A must be false". This is not valid (even if we assumed that the person saying A is indeed stupid, that would not automatically make what they say wrong).
There are a lot of well-explained lists of fallacious arguments, so I won't explain them all here. Try, for example, https://thebestschools.org/magazine/15-logical-fallacies-know/. If you didn't know what a fallacy was, you need to read this. It is eye-opening: all 15 of them are very common, and everyone has committed them at least once in their lives. These fallacies may seem obvious when laid out like this, but next time you're in an argument, try to find (through introspection) how many of them you use. You'll be surprised. We all use fallacies constantly; they're just hardwired into our minds. But, once again, knowing about them helps us control their effect and avoid them.
Read the linked article to understand them, but I'd like to list the 15 fallacies mentioned there (with the causal fallacy unfolded), each with a very short summary:
Ad Hominem: attack the person giving the argument instead of the argument.
Straw Man: attacking an argument that is not really the one your opponent gave, because it is easier to argue against it.
Appeal to ignorance: using the lack of knowledge about a topic as an argument for some position within that topic.
False dichotomy: unjustifiedly assuming that either A or B must be true, then concluding that B must be true because A isn't.
Slippery slope: picking specific predicted outcomes to build a long sequence of causes and effects that is actually very unlikely to happen.
Circular argument: using an assumption, cleverly veiled, to prove itself.
Hasty generalization: unjustifiedly assuming something will be true in a more general sense than what we have proof or evidence for.
Red Herring: distracting the attention from the argument by appealing to emotions and sentiment.
Tu Quoque: implying that because your opponent has also done something, you cannot be blamed for doing it.
Causal Fallacy (non causa pro causa): inferring a causal relation from mere plausibility.
Causal Fallacy (Post hoc): deducing that if something happened after something else, then it was a consequence of it.
Sunk Costs: using the cost of accepting an argument as wrong (economic, psychological, in energy, or even something like "but then I would have been wrong my entire life") as an argument for it being correct.
Appeal to Authority: implying that an argument is correct just because someone reputable said it.
Equivocation: confusing and playing with the vague meaning of words to extract conclusions that would have been impossible to obtain otherwise.
Appeal to Pity: using an emotion related to the topic being discussed as an argument itself.
Bandwagon Fallacy: assuming that if a lot of people agree on something, that automatically makes it right.
Some fallacies are related to things I already mentioned. For example, the binary mentality leads to false-dichotomy arguments, and the same goes for confirmation bias and slippery-slope arguments.
We all fall for confirmation bias, we all like echo chambers, we all find it comforting to think in terms of them and us, we all find comfort in thinking that truths can be absolute if we use enough scientists. But that doesn't make any of those things good things to bring into an argument. My intent is not to demonize people who fall into these things, it is to let them know about them (if they didn't already), and equip their introspection with more tools so that we can all have a few more constructive and useful arguments.
This is something people amazingly often forget. Let's take Hitler as an example of someone most people would agree was blatantly wrong. Hitler did not think he was evil. He thought he was totally right in what he did. Does that make him any less guilty for what he did? I'm not saying that! But if we think that Hitler knew his entire life that what he did was wrong and did it anyway, we are denying ourselves the ability to understand what happened during those years much better. Or, put more shortly: we are lying to ourselves.
Everyone thinks they're right. When someone says something that seems clearly wrong to you, two things need to go through your mind:
Could they actually be right? Does their argument counter mine? (Introspection needs to come into play).
Assuming they are indeed wrong, what is making them think they are right? If we forget that they think they are right, we will never understand why they think so, and we will never be able to provide them with the counter-argument that lets their introspection help them realize their mistake.
The natural reaction may be anger, disbelief, or exasperation. These feelings are understandable, and you shouldn't feel guilty about having them, but they are not going to help either of you reach an agreement or learn anything from the argument. They are not useful.