Bounded awareness: Socrates 2.0

Socrates gave us the foundation of modern philosophy when he claimed that his only wisdom was in knowing his own ignorance. By implication, of course, everyone else was even stupider than he and just didn't know it, believing they were thinking/acting with all the information available and all the mental faculties necessary to put that information to good use, when in fact they were hopelessly crippled by their unrecognized obtuseness. Admit it, you'd have made him drink hemlock too.

According to an article by Chugh and Bazerman, entitled "Bounded Awareness: What You Fail to See Can Hurt You," economists and behavioral psychologists make the same mistake for which Socrates chastised the ancient Athenians. Researchers in both disciplines, despite believing very different things about human behavior, tend to assume that people will "accurately perceive the stimuli available to them". That is, they assume people receive, correctly interpret, and use all the information there is. According to Chugh and Bazerman, that just ain't so, and they lay out an argument for the consistent failure of individuals to recognize relevant information, a tendency they term "bounded awareness".

(Note: there is an earlier and more extensive version of the paper here. Quotes are taken from both papers.)

Bounded awareness differs from bounded rationality in that the latter is "a behavioral model in which human rationality is very much bounded by the situation and by human computational powers", whereas the former arises from failure to "see, seek, use or share highly relevant, easily-accessible and readily-perceivable information during the decision-making process". The authors introduce three principal types of bounded awareness:

1. Inattentional Blindness

Go to this video and watch it. You will see two teams of people passing basketballs. While you watch it, keep track of the number of aerial passes and the number of bounce passes made by the team wearing white.

Done yet? Did you notice anything unusual? If not, watch it again without counting anything. You will see a woman with an umbrella walk through the field. In experiments using this video, a substantial fraction of viewers fail to notice the umbrella lady wandering through the game. What this indicates is that when we are concentrating on a difficult cognitive task, we may fail to notice highly salient events that are not relevant to the task at hand. This is not such a big deal when it's an umbrella and a basketball game, but what if it's a pilot closely monitoring his instruments on an ILS approach who fails to notice another airplane on the runway?

2. Change Blindness

Somewhat related to the first type of bounded awareness, change blindness occurs when individuals fail to notice changes in visual information.

The possible influence of change blindness in decision making is seen in a study in which participants are asked to choose the more attractive of two faces displayed on a computer screen. As participants moved the cursor to indicate their choice, a flash on the screen distracted them, and the two pictures were reversed. Nonetheless, most subjects continued to move the cursor in the same direction, and selected the picture they originally viewed as the less attractive. Importantly, they both failed to notice the switch and provided reasons, post-hoc, to support their unintended decision. (citations removed)

In other words, they didn't notice that they ended up choosing the less attractive option, and then provided justification for the (suboptimal) choice they made.

It has been postulated that change blindness may contribute to human versions of the "boiling frog" syndrome, in which an allegorical frog will jump out at once if thrown into a pot of boiling water, but will allow itself to be cooked to death if placed in cold water that is then slowly heated to boiling.

3. Focalism and the Focusing Illusion

Focalism is the tendency to focus on a particular event to the near exclusion of other events that are likely to occur at the same time.

Individuals overestimate the degree to which their future thoughts will be occupied by the focal event, as well as the duration of their emotional response to the event. For example, individuals overestimate the impact of positive events, such as the win of a preferred sports team or political candidate, on their overall happiness. And even more dramatically, individuals overestimate the impact of negative events, such as a major medical condition, on overall happiness.

This gives rise to the so-called "focusing illusion", in which individuals make decisions based on only an overweighted subset of available information, underweighting and ignoring other information.

The earlier version of the paper has a section on bounded awareness in groups which is quite interesting, and which I think fits nicely here. Basically, when a group of individuals gets together with some shared information and some information unique to individuals, they tend to focus on the shared information when making decisions. That is, they leave out the uniquely-held information, which can result in the group making even worse decisions than any given individual acting alone would. This, of course, defeats the whole point of group decision-making, which is to pool as much information as possible to inform the decision!

The authors go on to illustrate some common negotiating situations from game theory to demonstrate how bounded awareness can harm individuals. In negotiating situations, they argue, two types of information are crucial for negotiating effectively: the decisions of others and the rules of the game. Despite ready availability, this information is often neither seen nor used in negotiations. Two examples:

1. Monty Hall Game

Okay, so "Let's Make a Deal" was waaaaay before my time, but I know you've heard of the challenge posed to the contestants. Monty would present them with three doors, one of which concealed a super-duper grand prize, the other two of which concealed dud prizes. They would pick a door, Monty would open one of the other two doors to reveal a dud prize, and they would have the option to stay with their original door or switch. After the show came off the air, statisticians worked out that, given the rules of the game, in which Monty always opens a dud door you haven't picked, you should always choose to switch. Quick-and-dirtily, ex ante each door has a 1/3 probability of hiding the grand prize. You pick a door, which keeps its 1/3 probability, leaving a 2/3 probability that the prize is behind one of the other two doors. Because Monty knows where the prize is and always opens a dud, his reveal tells you nothing new about your own door; the whole 2/3 probability collapses onto the one unopened door you didn't pick. You should switch to the door carrying 2/3 probability rather than keep the one with only 1/3 attached to it.

Recall, though, that the rules of the game are crucial. If Monty were mean, and strategized to minimize your chances of winning by observing your choice and either ending the game or revealing a different door and offering you a switch, you should always stay with your original choice if asked. Different rules, different optimal strategies. In experiments, only 24% of subjects correctly strategized under both sets of rules.
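If you don't trust the 1/3-versus-2/3 argument, you can brute-force it. Here's a quick simulation under the standard (nice-Monty) rules; the door count and number of trials are my own choices, not from the paper:

```python
import random

def play(switch, rounds=100_000):
    """Simulate the standard Monty Hall game, where Monty always
    opens a dud door the contestant didn't pick."""
    wins = 0
    for _ in range(rounds):
        prize = random.randrange(3)
        pick = random.randrange(3)
        # Monty opens a door that is neither the pick nor the prize.
        opened = random.choice([d for d in range(3)
                                if d != pick and d != prize])
        if switch:
            # Switch to the one remaining unopened door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == prize)
    return wins / rounds

print(f"stay:   {play(switch=False):.3f}")  # hovers around 0.333
print(f"switch: {play(switch=True):.3f}")   # hovers around 0.667
```

Under the mean-Monty rules you'd have to model his strategy explicitly, which is exactly the point: the optimal play depends entirely on the rules.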

2. The Acquiring a Company Problem

I don't know why this is called "Acquiring a Company" since it's a blatant rip-off of Akerlof's "The Market for Lemons." Possibly they recast it for MBA students who couldn't imagine why anybody would want to buy a used car in the first place. I'll stick with the company example. It's quite a brilliant concept: you are a buyer trying to acquire a company. The owner of the company knows how much it is worth. You don't, except that it's worth somewhere between $0 and $100 per share with equal probability on all values. However, you do know that under your management, the company is worth 50% more than under present ownership.

When should you buy? Here is where it gets tricky. Most people will say that, on average, the company is worth $50 per share to the present owner, which makes it worth $75 on average to you, so you should buy it at any price between $50 and $75. But not so fast. What is crucial to remember in this situation is that the other side is strategizing as well. The first question to ask is, "when will the seller want to sell me the company?". The answer is, of course, whenever the price is greater than or equal to the value. So at a given price, if the seller is willing to sell, the value must be less than or equal to the price, and the expected value conditional on a sale is half the price. If the price is $60, the seller's expected value is $30, so the expected value to the prospective buyer is 1.5 × $30 = $45, which is less than the price. The best option is not to buy. So it goes for every possible price, and the only optimal strategy is to offer $0.
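The conditional-expectation argument can be checked mechanically. A small sketch, using a whole-dollar grid as my own simplification of the uniform distribution:

```python
def expected_profit(price):
    """Buyer's expected profit per share at a given offer price.
    The seller accepts only when the true value v (uniform on
    0..100) is at most the price; the firm is worth 1.5 * v
    to the buyer."""
    accepted = [v for v in range(101) if v <= price]
    # Average buyer value among accepted deals, minus the price paid.
    return sum(1.5 * v for v in accepted) / len(accepted) - price

for p in (0, 20, 40, 60, 80, 100):
    print(f"offer ${p}: expected profit {expected_profit(p):.2f}")
```

Every positive offer loses money in expectation (the profit works out to -0.25 × price), which is why $0 is the only rational bid.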

Extensive research on this problem suggests that bounded awareness leads decision makers to ignore or simplify the cognitions of opposing parties, as well as the rules of the game. Across studies, the modal response range falls between $50 and $75. The common reasoning is: "On average, the firm will be worth $50 to the target and $75 to the acquirer; consequently, a transaction in this range will, on average, be profitable to both parties." Typically, less than 10 per cent of participants offer $0 per share. Replications with accounting firm partners, CEOs, investment bankers, and many other skilled groups have produced similar results.

3. Auctions

Most people have probably already heard of the "Winner's Curse", so I won't belabor the point. Basically, if you are bidding in an auction for an object about whose value every participant has a different signal, the fact that you win means you probably overpaid, since every single one of your opponents believed the object to be worth less than you did. Failing to consider that other people are bidding based on their own information about the object leads you to spend more than you should have for that groovy retro lava lamp on eBay. (I know eBay is actually a second-price auction and therefore doesn't really apply here, but I was looking for a quick dig at 70's aesthetics. Sue me.)
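The curse is easy to see in a toy simulation of naive bidding in a common-value auction. The bidder count, true value, and noise level here are invented for illustration:

```python
import random

def winners_curse(n_bidders=8, true_value=100.0, noise=20.0,
                  rounds=50_000):
    """Each bidder gets a noisy private estimate of a common value
    and naively bids that estimate; the highest bid wins and pays it.
    Returns the winner's average realized profit."""
    total_profit = 0.0
    for _ in range(rounds):
        bids = [true_value + random.uniform(-noise, noise)
                for _ in range(n_bidders)]
        total_profit += true_value - max(bids)
    return total_profit / rounds

print(f"average winner's profit: {winners_curse():.2f}")  # negative
```

Even though each bidder's estimate is unbiased, the *maximum* of the estimates is biased upward, so the winner systematically overpays.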

In summary,

Bounded awareness is the phenomenon whereby individuals do not 'see' accessible and perceivable information during the decision-making process, while 'seeing' other equally-accessible and perceivable information; as a result, useful information remains out-of-focus for the decision-maker.

Somebody pass the hemlock.


I enjoyed the heck out of this. I especially love the post-hoc explanations telling us what we did and why we did it. I recall hearing that you could hypnotize someone, tell them to get down on all fours on your carpet, bring them out of their trance, and say "What are you doing on my floor?" They would respond by saying something like "We were thinking of getting new carpet for the office, and I wanted to look at yours." It's that 'explainer' circuit in our brains, telling us the story of ourselves, albeit with incomplete information, the poor thing. It's always there, always telling us tales, more or less true tales, unless we are asleep or insane, but tales nonetheless. Great post, thanks, rb

