Hi, welcome to lesson six, which is going to be about cognitive biases. Psychologists and philosophers have discovered a large number of biases that people have, which produce errors. Not all biases are bad, by the way. Bias just means inclination or tendency. The biases that psychologists are interested in usually promote accurate perception of the world. But we usually mean mistaken or unfair when we say that someone has a bias. And this is a course on critical thinking, so I'm going to talk about biases that sometimes, or even usually, produce errors.

One of the most profound cognitive biases is the illusion of objectivity. This is the belief that we understand the world by direct perception, whereas in fact our understanding of even the simplest thing is guided by layers of cognitive processes. Philosophers call this error naive realism: we think we've got the real world in a clear, direct way, and that belief is naive, because it's not true. Every belief about the world is achieved via a huge overlay of cognitive processes operating on the raw material of perception. These mental processes themselves have underlying biases. Those biases make us inclined to view the world in particular ways. Most of the biases help us to understand the world correctly most of the time.

Please take a look at this pair of tables. It's obvious that one of these tables is longer and narrower than the other; obvious, but wrong. The two tables are exactly the same size and shape. So why are your eyes deceiving you? They're deceiving you in a good cause. Your perceptual apparatus (lens, retina, optic nerves, and the part of the brain where the image is projected and processed) is designed to help you see the world as it is. The higher-order visual processes, the ones in the brain, back behind the retina, operate so as to increase the apparent size of things that are far away. If they didn't, we would see things far away as being much smaller than they are. The perceptual system is trying to achieve what's called size constancy. Our brains doctor the evidence so that things known to be far away seem larger than their image on the retina would suggest. That's crucial in the real world. But those tables are not the real world. They've been drawn so that our perceptual machinery decides for us that we're looking at the end of the table on the right and the side of the table on the left. This has the effect of lengthening the table on the right and broadening the table on the left. When you see a table in real life, the adjustment the brain makes to the appearance of the image on the retina is a distortion of what's actually there, but that bias helps us to see the world as it is. In an artificial world, though, that same biased operation can cause us to make errors. Not until the 20th century was it understood that perception has as much to do with programs in the brain as with images on the retina.

Please read about a fictitious person named Howard. Howard spent a great amount of his time in search of what he liked to call excitement. He had already climbed Mt. McKinley, shot the Colorado rapids in a kayak, driven in a demolition derby, and piloted a jet-powered boat without knowing very much about boats. He had risked injury, and even death, a number of times. Now he was in search of new excitement. He was thinking, perhaps, he would do some skydiving, or maybe cross the Atlantic in a sailboat. By the way he acted, one could readily guess that Howard was well aware of his ability to do many things well.
Other than business engagements, Howard's contacts with people were rather limited. He felt he didn't really need to rely on anyone. Once Howard made up his mind to do something, it was as good as done, no matter how long it might take or how difficult the going might be. Only rarely did he change his mind, even when it might well have been better if he had.

So does Howard seem to be an attractive, adventurous sort of person or an unappealing, reckless sort of person? If several minutes before you read that paragraph you had been exposed to words like self-confident, independent, persistent, you would likely have rated him as an attractive, adventurous sort of person. Those words conjure up what's called a schema, or a template; in this case, a schema of an active, exciting, interesting person. If instead you had read words like conceited, aloof, stubborn, you would likely have rated him as an unappealing, reckless person, because that set of words calls up a schema of an unpleasant person concerned only with his own pleasures and stimulation.

Would you believe me if I told you this man was a high-powered Manhattan lawyer who plays polo and hangs out in trendy bars? I didn't think so. The picture instantly conjures up a schema, or a stereotype, of a person who doesn't remotely resemble a high-powered Manhattan lawyer. Schemas are cognitive structures that guide our understanding of the world, usually making us perceive things more accurately than we would without them and prompting us to behave more appropriately. We have schemas for house, family, civil war, insect, introvert, party animal, policeman, doctor, fast food restaurant, fancy restaurant. We should behave differently toward policemen than toward doctors, and we should behave differently in fast food restaurants than in fancy restaurants. So these kinds of schemas are mostly useful, although, of course, they can lead us astray. Stereotypes, in particular, can get us in trouble.

Schemas can be triggered by very impoverished stimuli, events that we're scarcely aware of. Have a look at these coconuts. If you placed one of them on a shelf above a coffee urn in an office with an honesty box next to it, you would get many contributions. Which of those do you think would get more contributions? To give you a hint, coco is Spanish for head. The coconut on the left calls up the schema for a human face, and if someone seems to be watching me, I want to appear honest. You actually don't even have to have something that looks as much like a head as a coconut. Just having a poster with three dots in the pattern of the coconut on the left is enough to trigger the schema for a face and get more contributions. People are more likely to vote for increased taxes for education if the polling place is a school; that brings up all kinds of schemas about school and studying and the importance of education. And people are more likely to vote against abortion if the polling place is a church; it makes you conscious of the sacredness of life. Believe it or not, if you have someone read a persuasive communication in a room where there's a fishy smell, the person is less likely to believe it. This seems fishy to me. This only works, by the way, in countries where fishy means dubious. In Denmark, it's a rat that you smell, so the fish trick doesn't work there to make people less persuaded. If you want someone to like you, hand them a cup of coffee; they will perceive you as warmer. And avoid iced tea at all costs.
I hope all this makes the point that schemas are ubiquitous. They often operate completely unconsciously. They usually help us to understand the world, to behave differently toward policemen than doctors, and to behave differently in McDonald's than in a fancy restaurant. But schemas sometimes intrude in a way that's irrelevant or unhelpful.

Schemas shape not just perception but also memory. In one study, students watched a video of a husband and wife having dinner together. Some of the participants were told that the woman was a librarian and some were told that she was a waitress. After seeing the videos, students were asked a number of questions about what they'd seen. If they thought they had seen a librarian, they tended to remember she had been drinking a glass of wine. If they thought she was a waitress, they tended to remember she had been drinking a glass of beer. If she was a librarian, they remembered her mentioning a history book. If she was a waitress, they remembered her mentioning a romance novel. So schemas have a profound effect on what we see, what we hear, what we remember, and how we behave. Schemas usually help us to perceive the world as it is, but right or wrong, it's always schemas operating on raw sensation that determine what we see and believe.

When political convictions are involved in what we see, we're particularly subject to erroneously believing that we're seeing the world as it is. Social psychologists conducted a study in the immediate aftermath of a particularly horrible event in the long, tragic history of the Israeli-Palestinian conflict: the 1982 massacre of civilians in refugee camps on the outskirts of Beirut by Christian Falangist gunmen. Israel had some political ties to the Falangists, who were in conflict with various Muslim groups. The question was how fair the media coverage of the massacre seemed to different constituencies, in particular Stanford University students, some of whom were pro-Israeli and some of whom were pro-Palestinian. Both groups were shown the same samples of coverage by the major news networks, and they were then asked about what they had seen. Both saw the coverage as decidedly biased in favor of the other side. In fact, not a single one of the 68 pro-Israeli participants thought the coverage was as favorable to Israel as any of the 27 pro-Palestinian participants did. Moreover, and this is what's really important, both groups came away convinced that nonpartisan viewers would become more favorable to the other side as a result of what they saw.

What this means is that since people tend to think of their own understanding of events not as a take but as a direct, accurate readout of what's going on, anyone who tries to offer an even-handed account of events will tend to be seen as biased and hostile to their interests. Right-wingers in the US talk about the lamestream media, who can't be trusted. And people on the left are angry if the major news outlets maintain what they regard as a mindless neutrality by giving the same coverage and treatment to extreme right views as to the sensible, much more centrist views offered by their side of the political spectrum. And in general, we conclude that people who see the world as radically different from the way we do are stupid, crazy, or hopelessly biased by their wrong views of the world. If my belief were wrong, I wouldn't have it, so if your belief is different from mine, it's wrong.
The comedian George Carlin used to ask his audience: did you ever notice that anyone driving slower than you is an idiot and everyone driving faster is a maniac? In the next segment, we'll talk about what are called heuristics, mental tools that we use to understand the world.