This paper should include learnings and opinions from your reading and discussion of the book Blind Spots. It should include a self-assessment of how you can improve your decision-making skills through your key learnings from the book. Include the following:
Key learnings, opinions, and observations from the book (75 pts)
What are the observations from the book on why individuals suffer ethical lapses?
What are the observations from the book on why organizations suffer ethical lapses?
Both personally and organizationally, why is it important to acknowledge susceptibility to ethical lapse?
No strategy guarantees the elimination of blind spots; however, what are positive strategies that increase the odds of success?
What things can you do to improve and eliminate your blind spots? (15 pts)
Five-page minimum, seven-page maximum, 12-point font, double spaced, Word.
Properly formatted title page; reference page in APA format.
Use a topic sentence (summarize main idea) first in each paragraph followed by statements that support or explain that idea.
Narrative style throughout. Bullet points are not acceptable in an academic paper.
Use and refer to relevant readings or lectures to support assumptions and evidence. Use additional articles, books, and other sources if useful (Wikipedia is not a source).
What did you find valuable in the book, and would you recommend we continue to use this book? (10 pts)
Summary (25 pts)
Chapter 1
The Gap between Intended and Actual Ethical Behavior
For some reason I can’t explain, I know St. Peter won’t call my name. —“Viva La Vida,” Coldplay
How ethical do you think you are compared to other readers of this book? On a scale of 0 to 100, rate yourself relative to the other readers. If you believe you are the most ethical person in this group, give yourself a score of 100. If you think you’re the least ethical person in this group, give yourself a score of 0. If you are average, give yourself a score of 50. Now, if you are part of an organization, also rate your organization: On a scale of 0 to 100, how ethical is it compared to other organizations? How did you and your organization do? If you’re like most of the people we’ve asked, each
of your scores is higher than 50. If we averaged the scores of those reading this book, we guess that it would probably be around 75. Yet that can’t actually be the case; as we told you, the average score would have to be 50. Some of you must be overestimating your ethicality relative to others.1 It’s likely that most of us overestimate our ethicality at one point or another. In effect, we are unaware of the gap between how ethical we think we are and how ethical we truly are. This book aims to alert you to your ethical blind spots so that you are aware of that gap—the
gap between who you want to be and the person you actually are. In addition, by clearing away your organizational and societal blind spots, you will be able to close the gap between the organization you actually belong to and your ideal organization. This, in turn, will help us all to narrow the gap between the society we want to live in and the one in which we find ourselves. Drawing on the burgeoning field of behavioral ethics, which examines how and why people behave the way they do in the face of ethical dilemmas, we will make you aware of your ethical blind spots and suggest ways to remove them.
Behavioral Ethics: A New Way of Understanding Unethical Behavior
Consider these two opinions regarding responsibility for the financial crisis that began in 2008:
This recession was not caused by a normal downturn in the business cycle. It was caused by a perfect storm of irresponsibility and poor decision-making that stretched from Wall
Bazerman, Max H., and Ann E. Tenbrunsel. Blind Spots : Why We Fail to Do What's Right and What to Do about It, Princeton University Press, 2011. ProQuest Ebook Central, http://ebookcentral.proquest.com/lib/gguu-ebooks/detail.action?docID=664630. Created from gguu-ebooks on 2023-07-16 18:51:43.
Copyright © 2011 Princeton University Press. All rights reserved.
Street to Washington to Main Street. —President Barack Obama
The mistakes were systemic—the product of the nature of the banking business in an environment shaped by low interest rates and deregulation rather than the antics of crooks and fools. —Richard Posner
Same financial crisis, two different explanations from two famous citizens. The first blames the “bad boys” who operated in our financial system, the second the system in which those bad boys operated. Who’s right? Both are—but, even if combined, both opinions are incomplete. Did some greedy, ill-intentioned individuals contribute to the crisis? Absolutely! As
President Obama notes, self-interested actors engaged in clearly illegal behavior that helped bring about the crisis, and these criminals should be sent to jail. Was the financial system destined to produce such behavior? Again, absolutely! Many of our institutions, laws, and regulations are in serious need of reform. Do these two explanations, even when combined, fully explain the financial crisis? Absolutely not! Missing from these analyses are the thousands of people who were culpably ignorant,
engaged in what they thought were seemingly harmless behaviors without consciously recognizing they were doing anything wrong: the mortgage lenders who only vaguely understood that buyers couldn’t afford the homes they wanted, the analysts who created mortgage-backed securities without understanding the ripple effect of such a product, the traders who sold the securities without grasping their complexity, the bankers who lent too much, and the regulators biased by the lobbying efforts and campaign donations of investment banks. The crisis also involves the multitude of people who were aware of the unethical behavior of others, yet did little or nothing in response, assuming perhaps that “someone smarter than them understood how it all worked,” as BusinessWeek speculated.2 Numerous scandals that have occurred in the new millennium have damaged our confidence
in our businesses and our leaders. Under pressure to become more ethical, organizations and financial institutions have undertaken efforts aimed at improving and enforcing ethical behavior within their walls. They have spent millions of dollars on corporate codes of conduct, value-based mission statements, ethical ombudsmen, and ethical training, to name just a few types of ethics and compliance management strategies. Other efforts are more regulatory in nature, including the Sarbanes-Oxley Act passed by the U.S. Congress; changes to the rules that determine how the New York Stock Exchange governs its member firms; and changes in how individual corporations articulate and communicate their ethical standards to their employees, monitor employees’ behavior, and punish deviance. While we support efforts to encourage more ethical decisions within organizations, the
results of these efforts have been decidedly mixed. One influential study of diversity programs even found that creating diversity programs—an organizational attempt to “do the right thing”—has a negative impact on the subsequent diversity of organizations.3 Moreover, such
interventions are nothing new. Many similar changes have been made in the past to address ethical indiscretions. Despite these expensive interventions, new ethical scandals continue to emerge. Similarly, ethics programs have grown at a rapid rate at business schools across the globe,
and ratings of business schools now often explicitly assess the prevalence of ethics training in the curriculum. Yet the effects of such ethics training are arguably short-lived, and MBA honor codes, usually part of the educational process, have in some cases been proven to produce no discernible improvement in ethical behavior. In fact, according to a 2008 survey conducted by the Aspen Institute, MBA students feel less prepared to deal with value conflicts the longer they are in school.4 Could the financial crisis have been solved by giving all individuals involved more ethics
training? If the training resembled what has historically been and is currently being used, the answer to that question is no. Ethics interventions have failed and will continue to fail because they are predicated on a false assumption: that individuals recognize an ethical dilemma when it is presented to them. Ethics training presumes that emphasizing the moral components of decisions will inspire executives to choose the moral path. But the common assumption this training is based on—that executives make explicit trade-offs between behaving ethically and earning profits for their organizations—is incomplete. This paradigm fails to acknowledge our innate psychological responses when faced with an ethical dilemma. Findings from the emerging field of behavioral ethics—a field that seeks to understand how
people actually behave when confronted with ethical dilemmas—offer insights that can round out our understanding of why we often behave contrary to our best ethical intentions. Our ethical behavior is often inconsistent, at times even hypocritical. Consider that people have the innate ability to maintain a belief while acting contrary to it.5 Moral hypocrisy occurs when individuals’ evaluations of their own moral transgressions differ substantially from their evaluations of the same transgressions committed by others. In one research study, participants were divided into two groups. In one condition, participants were required to distribute a resource (such as time or energy) to themselves and another person and could make the distribution fairly or unfairly. The “allocators” were then asked to evaluate the ethicality of their actions.
In the other condition, participants viewed another person acting in an unfair manner and subsequently evaluated the ethicality of this act. Individuals who made an unfair distribution perceived this transgression to be less objectionable than did those who saw another person commit the same transgression.6 This widespread double standard—one rule for ourselves, a different one for others—is consistent with the gap that often exists between who we are and who we think that we should be. Traditional approaches to ethics, and the traditional training methods that have accompanied
such approaches, lack an understanding of the unintentional yet predictable cognitive patterns that result in unethical behavior. By contrast, our research on bounded ethicality focuses on the
psychological processes that lead even good people to engage in ethically questionable behavior that contradicts their own preferred ethics. Bounded ethicality comes into play when individuals make decisions that harm others and when that harm is inconsistent with these decision makers’ conscious beliefs and preferences. If ethics training is to actually change and improve ethical decision making, it needs to incorporate behavioral ethics, and specifically the subtle ways in which our ethics are bounded. Such an approach entails an understanding of the different ways our minds can approach ethical dilemmas and the different modes of decision making that result. We have no strong opinion as to whether or not you, personally, are an ethical person.
Rather, we aim to alert you to the blind spots that prevent all of us from seeing the gap between our own actual behavior and our desired behavior. In this book, we will provide substantial evidence that our ethical judgments are based on factors outside of our awareness. We will explore the implicit psychological processes that contribute to the gap between goals and behavior, as well as the role that organizations and political environments play in widening this divide. We will also offer tools to help you weigh important ethical decisions with greater reflection and less bias—at the individual level, the organizational level, and the societal level. We will then offer interventions that can more effectively improve the morality of decision making at each of these three levels.
What about You? The Implications of Ethical Gaps for Individuals
Most local and national journalists questioned in a recent survey expressed the strong belief that most reporters are more ethical than the politicians they cover. In stark contrast, most government and business leaders surveyed, including members of Congress, believed that reporters were no more ethical than the targets of their news stories.7 Who’s right? While it would be almost impossible to reach an objective conclusion, the vast literature that documents the way we view ourselves suggests that both groups have inflated perceptions of their own ethicality. Here’s another question: Did former president George W. Bush act ethically or unethically
when he decided to invade Iraq? How would you have answered this question during the early days of the war, when it looked as if the United States was “winning”? To what extent might political preferences bias answers to these questions? Most people believe they are fairly immune from bias when assessing the behavior of elected officials. Moreover, even when they try to recall their view at the time they made a decision, most people are affected by their knowledge of how well the decision turned out. Our preferences and biases affect how we assess ethical dilemmas, but we fail to realize that this is the case. At this point, we may have convinced you that others have inflated perceptions of their own
ethicality and a limited awareness of how their minds work. In all likelihood, though, you remain skeptical that this information applies to you. In fact, you probably are certain that you are as ethical as you have always believed yourself to be. To test this assumption, imagine that
you have volunteered to participate in an experiment that requires you to try to solve a number of puzzles. You are told that you will be paid according to your performance, a set amount for each successfully solved puzzle. The experimenter mentions in passing that the research program is well funded. The experimenter also explains that, once you have finished the task, you will check your answers against an answer sheet, count the number of questions you answered correctly, put your answer sheet through a shredder, report the number of questions you solved correctly to the experimenter, and receive the money that you reported you earned. Would you truthfully report the number of puzzles you solved to the experimenter, or would
you report a higher number?8 Note that there is no way for the experimenter to know if you cheated. While we do not know if you personally would cheat on this task, we do know that lots of seemingly nice people do cheat—just a little. In comparison to a group of individuals who are not allowed to shred their answers, those who are allowed to shred report that they solved significantly more problems than did those who didn’t shred. Those who cheat likely count a problem they would have answered correctly, if only they hadn’t made a careless mistake. Or they count a problem they would have aced if they only had had another ten seconds. And when piles of cash are present on a table in the room, participants are even more likely to cheat on the math task than when less money is visually available.9 In this case, participants presumably justify their cheating on the grounds that the experimenters have money to burn. Ample evidence suggests that people who, in the abstract, believe they are honest and would never cheat, do in fact cheat when given such an easy, unverifiable opportunity to do so. These people aren’t likely to factor this type of cheating into their assessments of their ethical character; instead, they leave the experiment with their positive self-image intact. The notion that we experience gaps between who we believe ourselves to be and who we
actually are is related to the problem of bounded awareness. Bounded awareness refers to the common tendency to exclude important and relevant information from our decisions by placing arbitrary and dysfunctional bounds around our definition of a problem.10 Bounded awareness results in the systematic failure to see information that is relevant to our personal lives and professional obligations.
Figure 1. Photograph copyright © 1965 by Ronald C. James
Take a look at figure 1. What did you see? Now take a look at the Dalmatian sniffing on the ground. Most people do not see the Dalmatian on the first look. Once they know she is there, however, they easily see her— and, in fact, they can no longer look at the picture without noticing she is there. The context of the black-and-white background keeps us from noticing the Dalmatian, just as our profit-focused work environments can keep us from seeing the ethical implications of our actions. As the Dalmatian picture demonstrates, we are “boundedly aware”: our perceptions and
decision making are constrained in ways we don’t realize. In addition to falling prey to bounded awareness, recent research finds we are also subject to bounded ethicality, or systematic constraints on our morality that favor our own self-interest at the expense of the interest of others. As an example, a colleague of Ann’s once mentioned that she had decided not to vaccinate her children given a perceived potential connection between vaccines and autism. After noting that this was a decision her colleague had a right to make, Ann suggested that she might be overweighing the risks of the vaccine in comparison to the risk of the disease. Ann also raised the possibility that her colleague was not fully considering the impact of her decision on others, particularly immune-compromised children who could die if they contracted diseases as commonplace as chicken pox from unvaccinated children. Several days later, Ann’s colleague mentioned that she was rethinking her decision not to vaccinate her children, as she had never considered the other children who might be affected by her decision. The psychological study of the mistakes of the mind helps to explain why a parent might
overweigh the risks of a vaccine relative to the risk of a disease for the sake of her or his own child. Going a step further, bounded ethicality helps to explain how a parent might act in ways that violate her own ethical standards—by putting other people’s children in danger— without being aware that she is doing so. We will explore how psychological tendencies produce this
type of accidental unethical behavior. Philosopher Peter Singer’s book The Life You Can Save: Acting Now to End World Poverty
provides ample documentation of how our limited awareness restricts our charitable giving and even our willingness to think about many ethical problems.11 He opens his book with the following problem:
On your way to work, you pass a small pond. On hot days, children sometimes play in the pond, which is only about knee-deep. The weather’s cool today, though, and the hour is early, so you are surprised to see a child splashing about in the pond. As you get closer, you see that it is a very young child, just a toddler, who is flailing about, unable to stay upright or walk out of the pond. You look for the parents or babysitter, but there is no one else around. The child is unable to keep his head above the water for more than a few seconds at a time. If you don’t wade in and pull him out, he seems likely to drown. Wading in is easy and safe, but you will ruin the new shoes you bought only a few days ago, and get your suit wet and muddy. By the time you hand the child over to someone responsible for him, and change your clothes, you’ll be late for work. What should you do?
Singer notes that most people see this as an easy problem to solve. Clearly, one should jump in and save the child, as failing to do so would be a massive ethical failure. Singer then goes on to describe a challenge described by a man in Ghana:
Take the death of this small boy this morning, for example. The boy died of measles. We all know he could have been cured at the hospital. But the parents had no money and so the boy died a slow and painful death, not of measles but out of poverty. Think about something like that happening 27,000 times every day. Some children die because they don’t have enough to eat. More die, like that small boy in Ghana, from measles, malaria, diarrhea, and pneumonia, conditions that either don’t exist in developed nations, or, if they do, are almost never fatal. The children are vulnerable to these diseases because they have no safe drinking water, or no sanitation, and because when they do fall ill, their parents can’t afford any medical treatment. UNICEF, Oxfam, and many other organizations are working to reduce poverty and provide clean water and basic health care, and these efforts are reducing the toll. If the relief organizations had more money, they could do more, and more lives would be saved.
While one could quibble about whether the two stories are perfectly parallel, most people feel uncomfortable when reading this second story (we know that we were). In fact, the stories are quite similar, except for one difference. In the first, you would likely be aware of any gap that arises between what you should do and what you actually do: you should save the boy, and if you do not, it will be obvious to you that you failed to meet your own ethical standards. In the second example, your ethical blinders are firmly in place. Most people likely would be ashamed if they knew they had failed to save a life for a relatively small amount of money, yet most of us do exactly that. We will explore the psychological tendencies that produce those blind spots and suggest ways to remove them.
As another example, take the case of Bernard Madoff. Over the course of three decades, Madoff’s Ponzi scheme racked up enormous losses: more than 15,000 claims approaching $300 million in damages, and $64.8 billion in paper profit was wiped out. Madoff sold most of his investments through feeder funds—that is, other funds that either marketed their access to Madoff to potential investors or claimed they had access to some exotic investment strategy. In reality, the feeder funds were doing nothing more than turning much of the money they collected over to Madoff. These intermediaries were extremely well paid, often earning a small percentage of the funds invested plus 20 percent of any investment profits earned. Thus, as Madoff claimed an amazing record of success, the feeder funds were getting rich. It is now clear that Madoff was a crook, and his purposeful, deceitful behavior lies outside
of this book’s focus on unintentional ethical behavior. Yet we are fascinated by the harmful behavior of so many other people in this story, people who had no intention of hurting Madoff’s eventual victims. Many analysts have now concluded that outperforming all kinds of markets, as Madoff did, is statistically impossible. Did the managers of the feeder funds know that Madoff was running a Ponzi scheme, or did they simply fail to notice that Madoff’s performance reached a level of return and stability that was impossible? Ample evidence suggests that many feeder funds had hints that something was wrong, but lacked the motivation to see the evidence that was readily available. For example, Rene-Thierry Magon de la Villehuchet, a descendant of European nobility and the CEO of Access International Advisors and Marketers, had invested his own money, his family’s money, and money from his wealthy European clients with Madoff. He was repeatedly warned about Madoff and received ample evidence that Madoff’s returns were not possible, but he turned a blind eye to the overwhelming evidence. Two weeks after Madoff surrendered to authorities, de la Villehuchet killed himself in his New York office. Here’s a final example of the type of psychological blind spots that affect us. In perhaps the
most famous experiment in psychology, Stanley Milgram demonstrated the amazing degree to which people will engage in unethical behavior in order to fulfill their obligations to authority. Each participant in Milgram’s study played the role of “teacher,” while a study confederate (someone trained by the experimenter) played the role of “learner.” The learner was portrayed as a forty-seven-year-old accountant. The teacher and learner were physically separated, such that the teacher could not see the learner. The teacher was told that it was his job to administer shocks of increasing magnitude, ranging from 15 volts to 450 volts, as the learner made mistakes in a task requiring the matching of word pairs. The learner did make mistakes on the task, requiring the teacher to administer shocks.
Up to 150 volts, occasional grunts were heard from the other side of the wall where the learner was located. (The learner was not actually receiving shocks; he was an actor.) At 150 volts, the learner shouted that he wanted to stop the experiment and let out some cries of pain. If the teacher resisted continuing, the experimenter insisted that the experiment must go on. From 150 to 300 volts, the teacher heard the learner as he pleaded to be released and complained about his heart condition. At 300 volts, the learner banged on the wall and demanded to be released. After 300 volts, the learner was completely silent.