The (Honest) Truth About Dishonesty
Book notes for "The (Honest) Truth About Dishonesty: How We Lie to Everyone – Especially Ourselves" by Dan Ariely
I liked that most of the research was done by the author, so we were able to see his thought process. I often found myself reading a study, thinking "but surely this could be caused by X", then seeing a study to exclude X on the next page.
It also contains one of the most British studies I have ever heard of:
> This experiment took place in the kitchen of the psychology department at the University of Newcastle where tea, coffee, and milk were available for the professors and staff. Over the tea-making area hung a sign saying that beverage drinkers should contribute some cash to the honesty box located nearby. For ten weeks the sign was decorated with images, but the type of image alternated every week. On five of the weeks the sign was decorated with images of flowers, and on the other five weeks the sign was decorated with images of eyes that stared directly at the beverage drinkers. At the end of every week, the researchers counted the money in the honesty box. What did they find? There was some money in the box at the end of the weeks when the image of flowers was hung, but when the glaring eyes were “watching,” the box contained almost three times more money.
I'm sure there were many passive-aggressive emails sent on the subject of that "honesty box" every day.
I liked the idea that "Essentially, we cheat up to the level that allows us to retain our self-image as reasonably honest individuals." The rest of the book does seem to back this up (showing how making the immorality of the act more or less salient and harder to rationalize away, and changing our current view of ourselves as moral or not, can both change the amount of cheating), and it makes intuitive sense. A useful addition to the mental toolbox.
In some cases, however, I still felt that the conclusions drawn from the studies were a stretch. For example, in one of the ego depletion studies, one group has a more difficult task to complete before the standard 'cheating' test, and it is found that they cheat more and get a higher payout. It is claimed that
> What do these findings suggest? Generally speaking, if you wear down your willpower, you will have considerably more trouble regulating your desires, and that difficulty can wear down your honesty as well.
But how do we know this is the case? Maybe they just feel that they 'deserve' more money due to the difficulty of the task.
In the same chapter it says
> Ego depletion also helps explain why our evenings are particularly filled with failed attempts at self-control—after a long day of working hard to be good, we get tired of it all. And as night falls, we are particularly likely to succumb to our desires (think of late-night snacking as the culmination of a day’s worth of resisting temptation).
But could this not just be tiredness? Maybe a day with zero temptations would also result in an evening of low self-control?
Obviously this is a pop-science book and it may be that these points are covered in the studies themselves, but some parts felt very hand-wavy and imprecise. I think a deep look into one or two points would have been more convincing than this quick tour of many aspects of dishonesty.
Overall, a quick fun read with some interesting ideas.
is dishonesty largely restricted to a few bad apples, or is it a more widespread problem? I realized that the answer to this last question might dramatically change how we should try to deal with dishonesty: that is, if only a few bad apples are responsible for most of the cheating in the world, we might easily be able to remedy the problem. Human resources departments could screen for cheaters during the hiring process or they could streamline the procedure for getting rid of people who prove to be dishonest over time. But if the problem is not confined to a few outliers, that would mean that anyone could behave dishonestly at work and at home—you and I included. And if we all have the potential to be somewhat criminal, it is crucially important that we first understand how dishonesty operates and then figure out ways to contain and control this aspect of our nature. WHAT DO WE know about the causes of dishonesty?
THE PRIMARY PURPOSE of this book is to examine the rational cost-benefit forces that are presumed to drive dishonest behavior but (as you will see) often do not, and the irrational forces that we think don’t matter but often do.
This result suggests that cheating is not driven by concerns about standing out. Rather, it shows that our sense of our own morality is connected to the amount of cheating we feel comfortable with. Essentially, we cheat up to the level that allows us to retain our self-image as reasonably honest individuals.
In a nutshell, the central thesis is that our behavior is driven by two opposing motivations. On one hand, we want to view ourselves as honest, honorable people. We want to be able to look at ourselves in the mirror and feel good about ourselves (psychologists call this ego motivation). On the other hand, we want to benefit from cheating and get as much money as possible (this is the standard financial motivation). Clearly these two motivations are in conflict. How can we secure the benefits of cheating and at the same time still view ourselves as honest, wonderful people?
As it turned out, those who lied for tokens that a few seconds later became money cheated by about twice as much as those who were lying directly for money.
As it turns out, people are more apt to be dishonest in the presence of nonmonetary objects—such as pencils and tokens—than actual money.
We took a group of 450 participants and split them into two groups. We asked half of them to try to recall the Ten Commandments and then tempted them to cheat on our matrix task. We asked the other half to try to recall ten books they had read in high school before setting them loose on the matrices and the opportunity to cheat. Among the group who recalled the ten books, we saw the typical widespread but moderate cheating. On the other hand, in the group that was asked to recall the Ten Commandments, we observed no cheating whatsoever. And that was despite the fact that no one in the group was able to recall all ten. This result was very intriguing. It seemed that merely trying to recall moral standards was enough to improve moral behavior.
Sadly, they were not. When the Princeton students were asked to sign the honor code, they did not cheat at all (but neither did the MIT or Yale students). However, when they were not asked to sign the honor code, they cheated just as much as their counterparts at MIT and Yale. It seems that the crash course, the propaganda on morality, and the existence of an honor code did not have a lasting influence on the moral fiber of the Princetonians.
the experiments described here show that doing something as simple as recalling moral standards at the time of temptation can work wonders to decrease dishonest behavior and potentially prevent it altogether. This approach works even if those specific moral codes aren’t a part of our personal belief system.
These results suggest that once someone (or some organization) does us a favor, we become partial to anything related to the giving party—and that the magnitude of this bias increases as the magnitude of the initial favor (in this case the amount of payment) increases. It’s particularly interesting that financial favors could have an influence on one’s preferences for art, especially considering that the favor (paying for their participation in the study) had nothing at all to do with the art, which had been created independently of the galleries.
Ego depletion also helps explain why our evenings are particularly filled with failed attempts at self-control—after a long day of working hard to be good, we get tired of it all. And as night falls, we are particularly likely to succumb to our desires (think of late-night snacking as the culmination of a day’s worth of resisting temptation).
> or just tiredness
What do these findings suggest? Generally speaking, if you wear down your willpower, you will have considerably more trouble regulating your desires, and that difficulty can wear down your honesty as well.
> or they felt they deserved more money
After all, if a bunch of people buy knockoff Burberry scarves for $10, others—the few who can afford the real thing and want to buy it—might not be willing to pay twenty times more for the authentic scarves. If it is the case that when we see a person wearing a signature Burberry plaid or carrying a Louis Vuitton LV-patterned bag, we immediately suspect that it is a fake, then what is the signaling value in buying the authentic version? This perspective means that the people who purchase knockoffs dilute the potency of external signaling and undermine the authenticity of the real product (and its wearer). And that is one reason why fashion retailers and
These results suggest that wearing a genuine product does not increase our honesty (or at least not by much). But once we knowingly put on a counterfeit product, moral constraints loosen to some degree, making it easier for us to take further steps down the path of dishonesty.
THE BOTTOM LINE is that we should not view a single act of dishonesty as just one petty act. We tend to forgive people for their first offense with the idea that it is just the first time and everyone makes mistakes. And although this may be true, we should also realize that the first act of dishonesty might be particularly important in shaping the way a person looks at himself and his actions from that point on—and because of that, the first dishonest act is the most important one to prevent.
The moral of this story? We may not always know exactly why we do what we do, choose what we choose, or feel what we feel. But the obscurity of our real motivations doesn’t stop us from creating perfectly logical-sounding reasons for our actions, decisions, and feelings.
This anecdote illustrates an extreme case of a tendency we all have. We want explanations for why we behave as we do and for the ways the world around us functions. Even when our feeble explanations have little to do with reality. We’re storytelling creatures by nature, and we tell ourselves story after story until we come up with an explanation that we like and that sounds reasonable enough to believe. And when the story portrays us in a more glowing and positive light, so much the better.
This means that those who cheated more on each of the three tasks (matrices, dots, and general knowledge) had on average higher creativity scores compared to noncheaters, but their intelligence scores were not very different. We also studied the scores of the extreme cheaters, the participants who cheated almost to the max. In each of our measures of creativity, they had higher scores than those who cheated to a lower degree. Once again, their intelligence scores were no different.
We wanted to see if the customers who had been so rudely ignored would keep the extra money as an act of revenge against Daniel. Turns out they did. In the no-annoyance condition 45 percent of people returned the extra money, but only 14 percent of those who were annoyed did so. Although we found it pretty sad that more than half the people in the no-annoyance condition cheated, it was pretty disturbing to find that the twelve-second interruption provoked people in the annoyance condition to cheat much, much more. In terms of dishonesty, I think that these results suggest that once something or someone irritates us, it becomes easier for us to justify our immoral behavior. Our dishonesty becomes retribution, a compensatory act against whatever got our goat in the first place. We tell ourselves that we’re not doing anything wrong, we are only getting even. We might even take this rationalization a step further and tell ourselves that we are simply restoring karma and balance to the world. Good for us, we’re crusading for justice!
When I think about all of these justifications together, I realize how extensive and expansive our ability to justify is and how prevalent rationalization can be in just about every one of our daily activities. We have an incredible ability to distance ourselves in all kinds of ways from the knowledge that we are breaking the rules, especially when our actions are a few steps removed from causing direct harm to someone else.
> eg. eating meat
As it turned out, the level of moral flexibility was highly related to the level of creativity required in their department and by their job. Designers and copy-writers were at the top of the moral flexibility scale, and the accountants ranked at the bottom. It seems that when “creativity” is in our job description, we are more likely to say “Go for it” when it comes to dishonest behavior.
The combination of positive and desired outcomes, on the one hand, and the dark side of creativity, on the other, leaves us in a tight spot. Though we need and want creativity, it is also clear that under some circumstances creativity can have a negative influence. As the historian (and also my colleague and friend) Ed Balleisen describes in his forthcoming book Suckers, Swindlers, and an Ambivalent State, every time business breaks through new technological frontiers—whether the invention of the postal service, the telephone, the radio, the computer, or mortgage-backed securities—such progress allows people to approach the boundaries of both technology and dishonesty. Only later, once the capabilities, effects, and limitations of a technology have been established, can we determine both the desirable and abusive ways to use these new tools. For example, Ed shows that one of the first uses of the U.S. postal service was for selling products that did not exist. It took some time to figure that out, and eventually the problem of mail fraud ushered in a strong set of regulations that now help ensure the high quality, efficiency, and trust in this important service. If you think about technological development from this perspective, it means that we should be thankful to some of the creative swindlers for some of their innovation and some of our progress.
For a few weeks, Nina Mazar and I used it to see what would happen if we gave people a probabilistic discount instead of a fixed discount. Translated, that means that we set up the machine so that some candy slots were marked with a 30 percent discount off the regular price of $1, while other slots gave users a 70 percent chance of paying the full price of $1.00 and a 30 percent chance of getting all their money back (and therefore paying nothing). In case you are interested in the results of this experiment, we almost tripled sales by probabilistically giving people back their money.
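The interesting thing about this setup is that the two schemes cost buyers exactly the same on average: a 30 percent chance of paying nothing is arithmetically identical to a flat 30 percent discount, yet it nearly tripled sales. A quick sketch to check that equivalence (the function names and simulation are mine, not from the book):

```python
import random

def fixed_discount_price(price=1.00, discount=0.30):
    """Every purchase costs the flat discounted price."""
    return price * (1 - discount)

def probabilistic_discount_price(price=1.00, p_free=0.30):
    """With probability p_free the buyer pays nothing; otherwise full price."""
    return 0.0 if random.random() < p_free else price

random.seed(1)
n = 100_000
avg = sum(probabilistic_discount_price() for _ in range(n)) / n
print(f"fixed discount:            ${fixed_discount_price():.2f} per purchase")
print(f"probabilistic (simulated): ${avg:.2f} per purchase on average")
```

Both come out at about $0.70 per purchase, so the sales difference is purely down to how the price is framed, not what it costs.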
When the cheater is part of our social group, we identify with that person and, as a consequence, feel that cheating is more socially acceptable. But when the person cheating is an outsider, it is harder to justify our misbehavior, and we become more ethical out of a desire to distance ourselves from that immoral person and from that other (much less moral) out-group.
Although the Broken Windows Theory has been difficult to prove or refute, its logic is compelling. It suggests that we should not excuse, overlook, or forgive small crimes, because doing so can make matters worse. This is especially important for those in the spotlight: politicians, public servants, celebrities, and CEOs. It might seem unfair to hold them to higher standards, but if we take seriously the idea that publicly observed behavior has a broader impact on those viewing the behavior, this means that their misbehavior can have greater downstream consequences for society at large. In contrast to this view, it seems that celebrities are too often rewarded with lighter punishments for their crimes than the rest of the population, which might suggest to the public that these crimes and misdemeanors are not all that bad.
This experiment took place in the kitchen of the psychology department at the University of Newcastle where tea, coffee, and milk were available for the professors and staff. Over the tea-making area hung a sign saying that beverage drinkers should contribute some cash to the honesty box located nearby. For ten weeks the sign was decorated with images, but the type of image alternated every week. On five of the weeks the sign was decorated with images of flowers, and on the other five weeks the sign was decorated with images of eyes that stared directly at the beverage drinkers. At the end of every week, the researchers counted the money in the honesty box. What did they find? There was some money in the box at the end of the weeks when the image of flowers was hung, but when the glaring eyes were “watching,” the box contained almost three times more money.
> depressingly British approach..
What level of cheating did we find? None at all. Despite the general inclination to cheat that we observe over and over, and despite the increase in the propensity to cheat when others can benefit from such actions, being closely supervised eliminated cheating altogether.
but the fudge factor theory we have developed in these pages suggests that our capacity for flexible reasoning and rationalization allows us to do just that. Basically, as long as we cheat just a little bit, we can have the cake and eat (some of) it too. We can reap some of the benefits of dishonesty while maintaining a positive image of ourselves. As we’ve seen, certain forces—such as the amount of money we stand to gain and the probability of being caught—influence human beings surprisingly less than one might think. And at the same time other forces influence us more than we might expect: moral reminders, distance from money, conflicts of interest, depletion, counterfeits, reminders of our fabricated achievements, creativity, witnessing others’ dishonest acts, caring about others on our team, and so on.
the half-full part of the story is that human beings are, by and large, more moral than standard economic theory predicts. In fact, seen from a purely rational (SMORC) perspective, we humans don’t cheat nearly enough. Consider how many times in the last few days you’ve had the opportunity to cheat without getting caught.
In fact, the amount of cheating seems to be equal in every country—at least in those we’ve tested so far.
Our matrix test exists outside any cultural context. That is, it’s not an ingrained part of any social or cultural environment. Therefore, it tests the basic human capacity to be morally flexible and reframe situations and actions in ways that reflect positively on ourselves. Our daily activities, on the other hand, are entwined in a complex cultural context. This cultural context can influence dishonesty in two main ways: it can take particular activities and transition them into and out of the moral domain, and it can change the magnitude of the fudge factor that is considered acceptable for any particular domain. Take plagiarism, for example. At American universities, plagiarism is taken very seriously, but in other cultures it is viewed as a kind of poker game between the students and faculty.
From the social science perspective, religion has evolved in ways that can help society counteract potentially destructive tendencies, including the tendency to be dishonest. Religion and religious rituals remind people of their obligations to be moral in various ways; recall, for example, the Jewish man with the tzitzit from chapter 2 (“Fun with the Fudge Factor”). Muslims use beads called tasbih or misbaha on which they recount the ninety-nine names of God several times a day. There’s also daily prayer and the confessional prayer (“Forgive me, Father, for I have sinned”), the practice of prayaschitta in Hinduism, and countless other religious reminders that work very much as the Ten Commandments did in our experiments.
> importance of faith
The woman’s sister lived in South America, and one day the sister realized that her maid had been stealing a little bit of meat from the freezer every few days. The sister didn’t mind too much (other than the fact that sometimes she didn’t have enough meat to make dinner, which became rather frustrating), but she clearly needed to do something about it. The first part of her solution was to put a lock on the freezer. Then the sister told her maid that she suspected that some of the people who were working at the house from time to time had been taking some meat from the freezer, so she wanted only the two of them to have keys. She also gave her maid a small financial promotion for the added responsibility. With the new role, the new rules, and the added control, the stealing ceased. I think this approach worked for a number of reasons. I suspect that the maid’s habit of stealing developed much like the cheating we’ve been discussing. Perhaps it began with a single small action (“I’ll just take a little bit of meat while I’m cleaning up”), but having stolen once, it became much easier to continue doing so. By locking the freezer and giving the maid an additional responsibility, the sister offered the maid a way to reset her honesty level. I also think that trusting the maid with the key was an important element in changing her view on stealing meat and in establishing the social norm of honesty in that household. On top of that, now that a key was needed to open the freezer, any act of stealing would have to be more deliberate, more intentional, and far more difficult to self-justify. That is not unlike what happened when we forced participants to deliberately move the mouse to the bottom of the computer screen to reveal an answer key (as we saw in chapter 6, “Cheating Ourselves”).
Finally, I suspect that one other important characteristic for making rules is to have each rule link to a larger meaning. If the rule is set in an arbitrary way (exercise for thirty minutes, three times a week; eat two pieces of fruit and up to two thousand calories a day), the rule itself, and breaking it, is going to be relatively meaningless. But if the rules link us to other people (we are all doing this together), to some other larger purpose (this is what good people do), or to a deep belief (God’s commandments), breaking the rule is more difficult and less likely to happen. In AA, for example, everything is linked to a sense of surrender to a “higher power.”
> importance of faith