Influence: The Psychology of Persuasion explores the art of compliance. It sets out to answer the question, “How do we become convinced to do the things that we do?”
A lot of persuasion rests on the manipulation of human fixed-action patterns. Fixed-action patterns are the mental shortcuts and assumptions that we use to fill in the blanks of our everyday experience. For example, we assume that when other drivers on the road are braking, we should brake too. Or that a long line of people means that there must be some desirable attraction at the end. Or that a high price for an item at the store indicates that it’s rarer or of higher quality (or both). A fixed-action pattern causes us to respond in the same, predictable way to certain stimuli, over and over again.
These fixed-action patterns are useful because it’s impossible for us to individually assess every single situation on its own merits: we would just get overwhelmed with information and be unable to make any decisions at all. These shortcuts let us make decisions without being burdened by endless analysis and weighing of pros and cons. And usually, our fixed-action patterns lead us to the right conclusions and help us make correct decisions.
While our fixed-action patterns are usually an asset for us in everyday life, they are easily manipulated and exploited by compliance practitioners. These are professional persuaders, people whose job it is to get you to say “yes” to whatever it is they’re offering. They’re usually salespeople, but they can also be fundraisers asking you to contribute to a charitable cause or sign a petition, or they can be politicians asking for your vote. If they’re trying to get you to do something that you wouldn’t do on your own, they’re a compliance practitioner.
These individuals are very skilled at using our fixed-action patterns against us: that is, they manipulate us into behaving the right way in response to the wrong stimuli. A store owner might, for example, mark up the price of a low-quality item to make it more desirable to you, knowing full well that your mental shortcut would usually lead you to believe that a high-priced item is of high quality. They're turning your fixed-action advantage into a powerful disadvantage that clouds your judgement and leads you to make faulty decisions.
Most compliance practitioners use six psychological principles of persuasion:
The Reciprocity Principle tells us to repay others when they do something for us. Most of the time, it just feels like common decency: when a friend treats you to lunch, you pay for their meal the next time; when your neighbors invite you over, you return the courtesy.
The principle is an evolutionary inheritance: early human communities with strong social cohesion and an ability to work together had a better chance of survival. By knowing that a favor would be returned, reciprocity helped to lower the “costs” of helping one’s neighbors and kin.
Thus, compliance practitioners know that you're more likely to feel obliged to them if they present you with some small gift or token gesture of kindness before they make their request. This is why sellers are so fond of promotional offers, free samples, and small gifts: they know you'll respond with a "gift" of your own by buying whatever it is they're trying to promote. Or they'll make an initially ludicrous request that you'll reject, only to present you with a second, smaller one. This rejection-then-retreat tactic is designed to lull you into making a reciprocal concession by giving in to their second ask.
To say “no” to this kind of reciprocity, you have to distinguish between people who are engaging in genuine acts of kindness and those who are simply trying to trick you into doing something for them. You are socially obligated to return a genuine favor with another favor: you’re not obligated to return a trick with a favor.
The Consistency Principle says that humans have an obsession with sticking to their guns. Once we’ve committed to something, we pressure ourselves to conform to that commitment. In fact, we’ll convince ourselves that our current behavior and beliefs align with our past behavior and beliefs, even when they clearly don’t.
For example, gamblers who are unsure about their bets before they place them have been shown to be far more confident after they’ve put their money on the table. They convince themselves that they were always confident in the horse they picked or the color on the roulette wheel they chose.
Consistency is generally useful for day-to-day human experience and it’s usually a good attribute for someone to have. It frees us from potential mental overload by giving us an easy, one-size-fits-all guide for how to react to a multitude of situations and people that we encounter each day.
Unfortunately, consistency and commitment can also be exploited. Our desire for internal consistency can turn even a small commitment into larger and larger ones. In one California study, homeowners were shown to be far more willing to have an unsightly billboard installed on their property after they had agreed to have a much smaller one erected a few weeks earlier. Thus, compliance practitioners who are “only” asking you to sign a petition or answer a few seemingly innocuous questions are usually trying to build you up to agree to ever-larger requests.
You can feel it intuitively when you're being asked to do something you don't want to do. The key to fighting back is spotting these situations quickly; otherwise, the compliance professional will corner you with your own commitment. Then turn the tables on the compliance practitioner: tell them that you're onto them and you know exactly what they're trying to do. Make decisions for a reason: don't make reasons for a decision.
The Social Proof Principle posits that we decide what’s correct based on what other people think is correct. If lots of other people are doing something or thinking something, then it must be good and worthy of imitation. It’s why television producers add laugh tracks to unfunny sitcoms: they know that, through social proof, we’ll be more likely to laugh if we hear others laughing (even if we don’t find the content to be funny on its own).
Of course, social proof is often valuable: you’ll tend to make fewer mistakes if you follow social evidence than if you ignore it. When a lot of people are doing something, it usually is the right thing to do. We can look to others for how to model our behavior in everyday situations, rather than needing to meticulously analyze everything.
Social proof can also be faked or manufactured, however, or used for self-serving purposes by compliance practitioners. It’s why so many product advertisements talk about being the “fastest-growing” or “highest-selling”: the marketers want to convince you that there’s a groundswell of demand for the product from others. Or even worse, they’ll create fake “person-on-the-street” commercials where allegedly “real” people (who are actually paid actors) talk up the merits of the product.
To resist the manipulation, you need to look closer at group behavior. Is there a reason to do something, beyond just the fact that everyone else is doing it? Don’t be like a pilot who flies by relying solely on her instruments. You also need to actually see the sky in front of you. Sometimes you do need to look critically at the world around you, take the time to assess situations, think for yourself, and apply your own individual judgement.
The Liking Principle stipulates that we're more likely to comply with requests from people that we know and like. Thus, we are more amenable to the compliance efforts of neighbors, friends, and family. It's why salespeople will often mention the names of members of your family or friends that they've done business with. The salesperson wants you to transfer some of your warm feelings about those individuals onto them.
We are also more willing to acquiesce to people who we see as being good-looking, affable, or who profess to like us. This creates a wide opening for compliance practitioners. If you like the seller, you’ll like what she’s selling. The efforts at manipulation can be almost comically transparent and still be effective: one car salesman claimed great success just by mailing generic postcards to his customers every month saying nothing more than “I like you.”
There’s nothing wrong with liking people, and usually someone’s charm or warmth indicates that they are trustworthy and reliable. But to avoid being manipulated, you need to evaluate each situation on the merits. If you find yourself strongly liking someone after only a brief acquaintance, pause and assess what is producing these feelings. Always separate your personal feelings for the person trying to sell you something from the thing you’re actually looking to buy. Judge your potential decision solely on the merits: don’t comply with a request just because you like the requester.
The Authority Principle states that people are hard-wired to comply with requests that come from an acknowledged and accepted source of authority. Thus, we are strongly inclined to be deferential to people whom we consider to be in a position of power or expertise, like teachers, members of the armed forces, police officers, doctors, and judges. In fact, we respond to even just the symbols of authority—like titles and uniforms.
Of course, there are good and legitimate reasons why we’re strongly conditioned to obey authority. Leadership, hierarchy, and authority are obviously necessary ingredients in any functioning society. Our ancestors wouldn’t have been able to organize complex societies if there hadn’t been some authority figure giving orders, assigning priorities, and allocating resources. Indeed, authority is the basis of government and law: without it, there’s only anarchy.
Unfortunately, authority can also be abused and exploited. In the famous Milgram experiment at Yale, ordinary people were shown to be highly vulnerable to pressure from an authority figure who instructed them to administer painful and dangerous electric shocks to fellow experiment participants. Clearly, the instinct to obey authority runs deep and can be easily exploited by compliance professionals who only need to adopt the most superficial patina of authority to trick people into acceding to their requests.
To avoid getting suckered, don’t blindly obey authority. Always assess an authority figure’s credentials and the relevance of those credentials. A cop telling you to pull over is a legitimate authority figure whose training and expertise clearly compel you to comply in this situation. An actor who plays a doctor on a TV show, on the other hand, is not a legitimate authority from whom to take medical advice in a pharmaceutical commercial. Their training is as an actor, not as a physician.
The Scarcity Principle tells us that we find more appealing those things with limited availability. Thus, rare goods are expensive and abundant items are cheap. Scarcity is closely related to the idea of loss aversion. We’re inherently conservative and cautious: in fact, we’re more afraid of losing something than we are enticed by the hope of gaining something of equal value.
Like our other fixed-action mental shortcuts, scarcity usually is a good gauge of how valuable something is. It’s simple supply-and-demand: when there’s less of something and there’s a high demand for it, the price increases. It’s why gold is more valuable than iron and why high-skilled workers earn more than low-skilled workers.
But compliance practitioners know how to twist this instinct to their own advantage. It’s why we see so many “limited-time only” or “first-come, first-served” sales pitches: the goal is to drive you into a scarcity frenzy that forces you to suspend your better judgement and rush headlong into an ill-considered decision.
Our scarcity instinct is accelerated when things become recently scarce (where they were previously abundant) and when they become scarce through social competition. For a salesperson, then, there’s no better scenario than when customers are bidding against one another for a product of limited availability: the sense of loss aversion will compel many people to grossly inflate the value and desirability of the item.
To avoid being manipulated this way, you need to ask yourself if you truly wish to use the item for its intended purpose, or if you merely wish to possess it because of the rarity itself. Do you really want that sports car because of its inherent features, or do you just want it because so few other people have it? If your answer is the latter, then you’ve probably fallen into a scarcity compliance trap. You should want things because of their intrinsic value, not because of their rarity or status.
All of these principles of persuasion turn our greatest strengths into some of our greatest vulnerabilities. Compliance practitioners are adept at fooling us by activating our fixed-action patterns to get us to agree to whatever it is they’re trying to push on us. Thus, they’ll give you “free” samples; manipulate you into making seemingly innocuous commitments; create phony social proof; butter you up with flattery; put on a fake uniform to lend themselves the air of authority; or give you a made-up deadline to make you think your time to act is limited. Knowledge is power: the more you know about the compliance tricks, the better prepared you’ll be to resist them.
Have you ever been persuaded to purchase something that you later regretted? Or been manipulated into contributing money to a charitable cause that you didn’t actually support? If your answer is “yes,” then don’t worry: you’re hardly alone.
Every day, we are bombarded with advertisements and appeals that ask us to buy something, join some organization, or get involved in some cause. Clearly, there is great advantage in persuading people to do things.
But who are these persuaders and how are they so effective at manipulating us into doing what they want?
A compliance practitioner is anyone whose job is to get you to say “yes” to what they’re offering. They can be salespeople pitching a product, fundraisers soliciting donations or signatures, or politicians asking for your vote.
Their specific agendas may be different, but they’re all after the same thing. They’re all in the persuasion business and they all want to persuade you. How do they do this? By manipulating the decision-making instincts that get you to say “yes” before you even consider the consequences.
A brief example will illustrate what we mean. Turkey mothers are known to be caring and protective of their young. But what animal researchers have discovered is that the turkey mothers’ nurturing instinct is triggered by a “cheep cheep” noise that the turkey chicks make. This noise is the mother’s signal to care for the chicks. Remarkably, however, the mother will neglect to care for the chicks if the latter fail to make the “cheep cheep” noise.
Researchers further discovered that these same maternal instincts can be triggered by man-made replicas of animals other than turkeys (even natural predators), as long as the replicas make the same “cheep cheep” noise. This is a classic fixed-action pattern: a sequence of behaviors that consistently happens in the same way and in the same order. Stimulus A (the “cheep cheep” noise) can always be counted on to produce Behavior B (the nurturing behavior), every time.
But it turns out that we humans aren’t so different from turkeys. We have our fixed-action patterns too. Like clockwork, we will behave the same way in response to the same stimuli. Compliance practitioners are experts in exploiting our fixed-action patterns: they know exactly which inputs to use in order to produce the outputs that they want from us.
You might be thinking that fixed-action patterns are a bad thing for human beings to have, that they’re some design flaw in our cognitive wiring. But they’re not.
In fact, fixed-action patterns are essential for human beings to process and order all of the information in our world. Think of human fixed-action patterns as mental shortcuts. We could never individually assess every aspect of every situation we encountered, even in the course of a normal day: it would lead to mental overload and an inability to make any decisions at all.
This is why general categories are useful. You can’t examine the properties of every blade of grass before you walk across a field or measure every grain of sand before you walk across a beach. We need some way to aggregate all of this information and distill it down to general rules of behavior that inform our responses to situations.
This is the power and utility of fixed-action patterns. They fill in the blanks for us so that our brains don’t become overloaded.
As the world becomes even more complex and we have access to more and more information, fixed-action patterns will become more important than ever. The more information we need to process, the more we’ll rely on mental shortcuts.
The problem, then, isn’t that we have these mental shortcuts. It’s that compliance practitioners have become skilled at exploiting them for their own advantage. In doing so, they pull a trick on us: compliance practitioners short-circuit our mental shortcuts by getting us to behave the right way in response to the wrong stimuli.
For example, most people’s instincts would tell them that a higher-priced item is more valuable than a lower-priced item. And the vast majority of the time, this instinct would lead them to the correct conclusion: pricey items generally are expensive because they’re rarer or higher quality. But this instinct can also be exploited by clever salespeople. The book shares the story of an antique shop owner who raised the price of an item that had previously been selling poorly. Within a day, customers had bought every unit!
These customers weren’t drawn to the item because they knew it was better or rarer. They were relying solely on the big price tag to guide their behavior, reasoning, “It must be good if they’re charging this much for it!” Thus, the shop owner was able to manipulate the customers into doing what their instincts told them to do, but for the wrong reason (the item wasn’t really valuable or rare): not all that different from a turkey.
There’s no end to the list of specific persuasion tricks and tactics, but most compliance practitioners play upon (or prey upon) six psychological principles that guide human behavior: reciprocity, consistency, social proof, liking, authority, and scarcity.
All of these principles are fixed-action patterns. When we encounter any of them in the real world, our instincts prime us to respond in particular ways. Compliance practitioners know this better than anyone. And with an effective command of these principles, they can manipulate us into doing just about anything.
They are able to do this with a light touch. When compliance practitioners do their job right, they convince you that you truly want what they’re offering—and all they’re doing is satisfying that want. The most effective persuasion is that which doesn’t feel like persuasion at all.
But you can fight back. You don’t have to meekly resign yourself to getting manipulated. After reading this summary, you’ll know how each of the six principles works, how compliance practitioners exploit them, and how to defend yourself against their tactics.
The Reciprocity Principle tells us to repay others when they do something for us. This fixed-action pattern of behavior is so deeply ingrained that we hardly think about it, yet we practice it all the time. When a friend treats you to lunch, you make sure you pick up the check the next time you go out; when your neighbors invite you to a party, you invite them the next time you’re hosting an event.
That the phrase “much obliged” is a synonym for “thank you” is a powerful encapsulation of the Reciprocity Principle: we naturally feel obliged, indebted to those who have done something for us.
The Reciprocity Principle can even extend beyond these small day-to-day courtesies to the world of international diplomacy. When Mexico City was devastated by an earthquake in 1985, Ethiopia dutifully made a foreign aid contribution to help the rebuilding and recovery effort. But why would a country like Ethiopia, poverty-stricken and suffering through a devastating famine, spend its scarce resources to help people all the way on the other side of the world? Simple: the Reciprocity Principle. In 1935, when Ethiopia was invaded and occupied by the Italians, Mexico was one of the few countries to send aid. The Ethiopians were returning the favor, 50 years later.
Why is the Reciprocity Principle such an intuitive part of the human experience? Because evolution favored early human communities with strong social cohesion and an ability to work together.
The Reciprocity Principle was the glue that enabled social cohesion. If another individual brought you some firewood, for example, bringing them some of your own firewood would help the two of you survive and make the overall clan or tribe stronger.
This created networks of obligation among early humans that made it easier for the group as a whole to multiply and survive. In a harsh and unforgiving environment, like many prehistoric peoples faced, this was the only way to ensure group survival and prosperity.
It also lowered the cost of giving things up to one’s neighbors: you weren’t really losing something if you knew it would eventually come back to you. This is how humans came to rely on the principle so heavily. These networks of obligation, in turn, enabled communities to divide labor, trade for scarce goods with their neighbors, create systems of mutual defense, and develop hierarchies and functional divisions within society. This inheritance from evolution is why we still adhere to the Reciprocity Principle today: we’re all taught that it’s bad to be a moocher or freeloader.
(Shortform note: The Reciprocity Principle became even more deeply ingrained as primitive communities grew into complex and interdependent societies. Looking at early Mesopotamia, for example, we see that large-scale, collective efforts like monument-building or the irrigation projects that enabled the Sumerians to control the flooding of the Tigris and Euphrates Rivers could have only been possible if there were some social mechanism that obliged people to assist one another in these endeavors.)
Unfortunately, the Reciprocity Principle also represents an evolutionary blind spot that compliance practitioners know how to exploit. They know that you’re more likely to feel obliged to them if they offer you some small gift or token gesture of kindness before they make their request.
The Reciprocity Principle overpowers our other senses: you don’t even have to like the person making you the offer, you don’t need to have asked for it, nor do you have to desire the thing being offered to you for the principle to work its persuasive magic.
The Reciprocity Principle is why sellers are so fond of promotional offers, free samples, and small gifts: without the sense of indebtedness these create, most people would never agree to their requests.
They know that by accepting these offers, you’ll become indebted to them: and their goal is to get you to fulfill your social obligation by purchasing their product. The best part for them is how subtle it is. They don’t have to directly ask you for anything, and they can never be accused of pressuring you into anything: all they did was offer you a free sample!
Of course, they know how hard it is for you to take a small piece of cheese, for example, without buying at least a little: they know how painful it is for you to feel like a moocher. Thus, you either comply with their request or abandon your ingrained sense of fairness and obligation.
The Soda Experiment
In an experiment that demonstrates the power of the Reciprocity Principle, experimenters had a participant sit in a room with a researcher pretending to be a fellow participant. In the experimental group, the fake participant (the book calls him “Joe”) offered the real participant a free can of soda; in the control group, he did not. Later, Joe came back and asked the participant if they would be willing to purchase raffle tickets he was selling: in effect, testing whether they would return the favor of the earlier can of soda.
The contrast between the experimental and control groups was stark: participants who had been offered soda bought twice as many tickets as those who hadn’t. Even those who later reported that they didn’t particularly like Joe were no less likely to purchase raffle tickets than those who did like him, as long as they’d been offered a can of soda. This is the power of the Reciprocity Principle to induce compliance.
The Hare Krishnas
The members of the Hare Krishna Society are a familiar sight at airports around the world. Recognizable by their shaved heads, robes, beads, and bells, they are known for soliciting contributions from people passing by. They are also highly skilled exploiters of the Reciprocity Principle. They don’t begin their request for donations with a straightforward ask: instead, they present the prospective donor with a small gift, like a flower or a book. Even if you don’t want it, they are insistent that you take what they’re offering: “No, please accept it, it’s our gift to you.”
It’s only at this point that the Hare Krishnas will ask for a monetary contribution. They know that the Reciprocity Principle will work its effects even on those who didn’t want the gift or who found the Hare Krishnas weird or off-putting. The tactic has been a great success for the society, funding the creation and maintenance of over 300 religious centers around the world.
A favorite of compliance practitioners looking to make use of the Reciprocity Principle is the rejection-then-retreat tactic. The tactic plays upon the contrast effect, in which we exaggerate the differences between things depending on the order in which they’re presented to us.
A classic example is lifting a light object and then lifting a heavier object. Studies show that people who lifted the lighter object first consistently rated the second object as being heavier than those who had only lifted the heavier object.
Here’s how compliance professionals use the rejection-then-retreat tactic:
1) They open with an extreme first request. For example, they might quote a high initial price for an item you’re looking to buy or ask you to commit a large amount of time volunteering for some cause. These first asks are meant to be rejected: they’re counting on you balking at the initial request.
2) After you’ve rejected them the first time, they make you a more “reasonable” second request. The second request looks small and reasonable because it comes immediately after a larger initial request: the compliance professional has primed your brain by using the contrast effect.
They’ve scaled back their initial request: they’ve offered you a concession. Because of the Reciprocity Principle, they know that you’ll be more likely to match their concession with one of your own, by saying “yes” to their second request.
The technique also fosters a deeper commitment from the customer. You’re more invested in following through on the contract or purchase because you’ve been tricked into thinking that you “won” a hard-earned concession from the compliance professional. Of course, you haven’t won anything: you’ve fallen perfectly into the trap they laid for you. The compliance professional is hoping to lure you into this cycle of reciprocal concessions.
The rejection-then-retreat tactic has been shown to be highly effective at getting people to accede to requests that they would otherwise reject out of hand. The “reasonable” second request doesn’t even have to be small for the tactic to work. It can still be an objectively large ask, as long as it’s smaller than the first request.
It should be noted, however, that there are limits to how effective this tactic is. If the initial request is too outlandish, the tactic will backfire. For example, a compliance professional can’t ask $1 million for something that they really only want to charge $200 for.
TV Producers Vs. Censors: A Tale of Reciprocal Concessions
Producers of shows that air on network television in the United States have to clear the shows’ content with the network’s censors before it can be aired. The censors will flag material that they deem to be inappropriate. But producers are good at using the Reciprocity Principle and the rejection-then-retreat tactic to get the content they want on the air. Producers will deliberately insert lines into scripts that they know will be rejected by the censors: they then pull back to the lines that they really wanted to include all along. When Happy Days was on the air in the 1970s, producers included the word “virgin” in the initial script seven times. The censors ended up cutting six uses of the word and leaving in one, which was the producers’ plan from the start.
Talking the Top of the Line: A Sales Trick
You might think that salespeople would want to show you the cheaper, economy model of a product first, because you’d be more likely to buy it. They could then talk you up to purchasing the more expensive model.
But you’d have it backwards. Salespeople always want to show you the most expensive model first. This technique, called “talking the top of the line,” is a time-honored Reciprocity Principle move. If you buy the expensive model right away, that’s obviously a win for the salesperson. But if you reject it, they’ll then show you the more reasonably priced model. Thanks to the contrast effect, it will look far more affordable than it really is. You’ll want to match the salesperson’s concession by conceding to buy the cheaper model.
The numbers bear this out: according to Consumer Reports, sales figures were twice as high when customers were shown more expensive models first than when they were shown the cheaper models first. The lesson? Rejection-then-retreat works.
It’s important to defend yourself against compliance practitioners who employ the techniques we’ve discussed. It’s hard to tell if someone is using reciprocity to exploit you or if they’re genuinely offering you something with no expectation of anything in return. Accepting an invitation to a party and then inviting that person to your own party isn’t being exploited by the Reciprocity Principle: that’s exactly how networks of obligation are supposed to work. There are plenty of people in our lives who simply wish to do acts of kindness for us.
Clearly then, you can’t just refuse gifts and offers of help from everyone. If you did that, you’d end up being pretty unhappy and socially isolated: as we’ve discussed, reciprocity is the glue that holds so much of society together.
The solution is to distinguish between exploiters and benefactors, and to treat the former accordingly. Figure out if the person is an exploiter by asking yourself if they stand to gain by your acceptance of their favor. Are they doing something for you just to get something in return later? If so, then the favor isn’t genuine: you should think of these interactions as a form of social predatory lending.
Once you’ve determined that someone offering you a gift, sample, or favor is really employing a compliance trick, you should cut off your reciprocity instinct. See the gift, not as a gift, but as a sales device. You have no social obligation to return a trick with a favor. Don’t think of accepting the favor as putting you in their debt. You’re not.
It’s useful to be explicit in calling out the compliance practitioner for exactly what they’re doing. You can tell them, “I see what you’re doing. You’re offering me this free gift in the hopes that I’ll be more inclined to buy whatever it is you’re really selling. But I don’t owe you anything, because your free offer was made in bad faith.”
In fact, by taking their free sample and not making a subsequent purchase, you are living up to the Reciprocity Principle. You’re matching their attempt to take advantage of you by taking advantage of them.
Avoid getting sucked into the reciprocity trap.
Have you ever agreed to do something you didn’t want to do because someone gave you an initial concession or performed a small favor for you? If so, describe why you felt compelled to comply with this person’s request.
How can you distinguish between people offering you genuine favors and people trying to manipulate you through reciprocity so as to not get tricked in the future?
The Consistency Principle says that humans have an obsession with sticking to their guns. Consistency is closely related to commitment. Once we’ve committed to a course of action or to a belief, we pressure ourselves to conform to that commitment. We go through great mental gymnastics to convince ourselves that our current behavior and beliefs align with our past behavior and beliefs, even when they clearly don’t.
In one Canadian study, experimenters looked at the beliefs and behavior of bettors at a racetrack. Thirty seconds before they placed their bets, they were uncertain about their horse. Just thirty seconds after, however, they were far more optimistic and confident in their choices. Nothing had objectively changed in this short span of time. The Consistency Principle just forced the bettors to bring their beliefs into line with the action they had already committed to.
The principle creates a valuable opening for those ever-present compliance practitioners. By getting you to make just a small commitment, a skilled compliance practitioner can get you to make larger and larger ones.
Like the other instincts, consistency and commitment are powerful instincts that usually do lead us to the correct conclusions and behaviors. Consistency is a luxury: it frees us from having to assess each situation individually.
We don’t have to weigh every pro and con, sift through every obscure fact, or think through every possible ramification of every decision. Instead, we have an easy, one-size-fits-all guide for how to react to a multitude of situations and people that we encounter each day.
The obvious benefits of consistency are why we value it so highly and have great disdain for those who don’t seem to embody it. We label people whose words and deeds don’t align as indecisive, weak, vacillating, and even dishonest.
(Shortform note: In the 2004 U.S. presidential election, Democratic nominee John Kerry was lambasted by the media and his political opponents for his alleged inconsistency and was labelled a “flip-flopper.” This was based chiefly on his incongruous votes and statements with regard to the Iraq War. Exit poll results showed that this criticism of Kerry’s character played a role in his narrow defeat).
Conversely, we view people who exhibit high levels of consistency as being strong, decisive, resolute, and honest. Thus, there is a strong social incentive for us to be consistent in our words, deeds, and even our thoughts.
The experience of American POWs held by the Chinese Communists during the Korean War is a telling example of how the Consistency Principle can radically alter someone’s beliefs and behavior through even small, token acts of prior commitment.
The Chinese engaged in what they called a “lenient policy” toward their captives. Unlike their North Korean allies, they didn’t physically beat or torture their American captives. Instead, they engaged in a long campaign of psychological warfare against them.
In doing so, the Chinese were able to get these POWs to collaborate with their captors and inform on one another. In fact, American authorities concluded that virtually all Chinese-held American POWs collaborated with their captors in some way.
How did the Chinese achieve this level of compliance? By manipulating the instinct for consistency and commitment.
They started small, first convincing prisoners to write down mildly anti-American statements like “America isn’t perfect,” with which it would be difficult for any reasonable person to disagree. By getting the prisoners to take even these seemingly innocuous positions, the Chinese could extract more and more. They could rely on the Consistency Principle to bring the prisoners’ later actions into line with their previous commitment.
Next, the Chinese escalated the commitment. They might, for example, ask a man to make a list of everything wrong with America, sign his name to it, and read it to his fellow POWs. If he resisted, his captors knew how to use consistency to bring him back in line. They could remind him, “But this is really what you believe, right?”
Psychologically it was hard for the man to wriggle out of his commitment: after all, his statement was plainly written on paper, in his handwriting. How could he deny the truth of the statement when he had written it himself? They would then take it even further by broadcasting his essay over the radio to the other prisoners and to all American troops and allies in the region.
The public nature of the man’s commitment was critical: he had now gone on record as being a collaborator, for everyone to see.
This is where the exploitation of the Consistency Principle really worked its effects. In making the prisoner’s collaboration known to the whole world, the Chinese had changed his self-conception. He now thought of himself as a collaborator, because of all his previous commitments.
With this label as an ingrained part of his self-identity, the prisoner could reliably be expected to model all of his future behavior and beliefs to conform to it. The prisoner’s change in beliefs could be staggering. By the end of the war, the POWs had come to believe wild Chinese propaganda about American germ warfare; the United States as an aggressive, imperialist power; and the merits of the communist system.
Interestingly, while the Chinese didn’t resort to harsh punishment, they also didn’t give generous material incentives to collaborators. The rewards they did give were minor luxuries like cigarettes or fresh fruit.
This was also by design. If the Chinese granted the collaborators overly generous rewards for their compliance, they might provide the prisoners with a psychological escape hatch: the Americans would be able to convince themselves that they were only collaborating for material gain.
This was not what the Chinese wanted. They wanted the Americans to genuinely come to embrace their new identity as collaborators. The use of the Consistency Principle was just as much about changing belief as changing behavior.
Of course, most of us (thankfully) won’t have to go through the experience of psychological manipulation in a POW camp. But everyday compliance practitioners like salespeople and fundraisers are just as adept at using the Consistency Principle against us as were the Chinese Communists.
By getting you to make one small commitment, savvy compliance practitioners know how to rope you along and lock you into progressively larger commitments. This is the time-honored “foot-in-the-door” sales tactic. Below, we’ll explore some examples that show just how effective this technique can be.
In a study conducted in California in the 1960s, a researcher posing as a volunteer worker went door-to-door asking residents if they would be willing to install a small, three-inch-square sign on their front lawn that read “BE A SAFE DRIVER.” Since it appeared to be a well-intentioned and public-spirited request that involved no sacrifice on the part of the homeowner, the great majority of respondents said yes.
Two weeks later, the research team went back to the same neighborhood and asked people if they would consent to having a massive, unattractive billboard (one that would almost obscure their entire house) that read “DRIVE CAREFULLY” set up on their property.
Among those who hadn’t been asked to install the small sign two weeks before, the response was expected: 83 percent said no to the outlandish request, and only 17 percent said yes. But among those who had been asked and said yes to that earlier request, the results were startling: 76 percent of them agreed to let their front yards be taken up by the billboard! Their earlier commitment made these homeowners far more willing to comply with the second, larger request.
The researchers then replicated their experiment in another California neighborhood, but this time with a twist. They sent a researcher (again posing as a volunteer) around to make another simple request: this time, for residents to sign a petition to “keep California beautiful.” Then, two weeks later, they sent a different “volunteer” around to ask the petition-signers if they would consent to erecting the same massive billboard on their lawns that the previous group had been asked to. Around half of the petition-signers agreed to this obviously preposterous request!
But how could this be? The first commitment was about state beautification, while the second was about driver safety. The two commitments seemingly had nothing to do with each other. How could they have been linked by the Consistency Principle?
Because the researchers, like the Chinese Communists, had altered their subjects’ self-identity. By signing the petition, residents came to view themselves as civic-minded, public-spirited citizens. With this newfound identity, the Consistency Principle did the rest of the work: they complied in order to be consistent with their new vision of themselves.
“Why I Love Tide Detergent”: The Power Of Product Testimonials
Large consumer-facing companies like Procter & Gamble often host contests asking participants to write short essays (usually less than 100 words) explaining why they love the company’s products, awarding large prizes to the winners.
On paper, it seems like the company gets the short end of the bargain: they have to give out a big prize and all they get is a few words of praise from customers. What’s the upside?
It turns out there’s a big upside. These companies are taking a page out of the Chinese Communist playbook: they’re getting as many people as they can to go on record as liking their products. From what we’ve learned about the power of consistency and commitment, writing a “Why I Love Tide Detergent” essay would greatly influence future consumer behavior.
It’s an easy way to get potentially hundreds of thousands of customers to self-identify as admirers of a product—and then buy lots of it to bring their actions in line with that identity.
We’ve seen that putting a commitment down in writing is extremely effective in activating the Consistency Principle to guide future behavior. This is because writing something down requires effort: and the more effort that goes into making a commitment, the more likely it is to influence our beliefs and behavior.
We see this dynamic most clearly in the elaborate hazing rituals practiced by American college fraternities and sororities. Pledges are subjected to harrowing ordeals: they are beaten, exposed to extreme weather conditions, forced to drink to excess, deprived of food and water, and put through other elaborate forms of painful initiation. Hazing like this has resulted in severe injuries, psychological trauma, and even death for many college students.
Why do these organizations continue subjecting their members to what can only be described as ritualized torture? It’s not that frat boys are uniquely sociopathic or deviant (as their detractors would like to believe).
It’s really all about group cohesion: the pledges will value their membership in the fraternity more if they’ve gone to excruciating lengths to earn it. Researchers believe that the roots of this lie in cognitive dissonance—the mental burden of carrying two contradictory beliefs at once. The worse the hazing is, the more your mind needs to convince you that joining the group will be a positive, fulfilling experience. Thus, hazing binds new recruits closer to the group through commitment and consistency: you invest more into the group, because it’s impossible to stomach the idea that you went through all this hazing for something you don’t actually want.
The fraternities strongly resist any attempts to replace their hazing rituals with some other, more socially acceptable activity, like community service work. Like the Chinese Communists, the fraternities don’t want to give their pledges a mental “out.” They don’t want new members to be able to tell themselves that they’re going through the ordeal for any reason other than their loyalty and commitment to the group. Fraternities want the pledges to own the commitment intrinsically.
Charitable work (even if it was genuinely unpleasant) would provide such a mental escape route: it would dilute the pledges’ intrinsic commitment to the group. Thus, the fraternities have a strong incentive to keep up practices like “Hell Week”: it fosters the behavior and beliefs in new pledges that make the group stronger. In fact, it’s the key to their survival.
This effect applies to any group that requires a tough initiation to get in—the military, a career, a workplace, even a casual social group.
So how do you resist the tricks and manipulations of compliance professionals looking to use your instinct for consistency and commitment against you?
As we’ve seen, thinking and behaving in a consistent way can be useful. It makes our lives more predictable, stable, and unburdens us from having to assess each and every situation according to its unique merits. But stubborn, rigid consistency is bad. It leads us to make the wrong choices and shut off our faculties of reason and critical thinking.
You need to know when to switch off your consistency fixed-action pattern and focus on the merits of the situation at hand.
You can feel it in your gut when you’re being lured into a compliance trap. As a reminder, this is when you’re being asked to do something or affirm something that you intuitively know you don’t want to. You need to be able to spot these situations quickly, otherwise the compliance professional will corner you with your own commitment.
When you get this sinking feeling that you’re being manipulated, turn the tables on the compliance practitioner. Tell them that you’re onto them and you know exactly what they’re trying to do. You can be perfectly blunt and direct:
“I know in my heart of hearts that I don’t want the movie magazine subscriptions that you’re selling. It doesn’t matter that I answered in the affirmative to your questions asking if I liked cinema. All you’re doing is trapping me into statements that you can then use to get me to buy something I don’t need. It would be dumb of me to throw my money away on something I don’t want, and I won’t let you manipulate me. Please leave.”
You can learn from past instances where you were lured into a consistency/commitment trap.
After you make a poor decision, ask yourself, “Knowing what I know now, would I make this decision again?” If your answer is “no,” then you know you were bamboozled by compliance trickery.
The solution is to trust your instincts better. Train yourself to be attentive and listen to the little voice inside your head that tells you “You don’t actually want to make this decision. Stop yourself.”
Also, don’t over-intellectualize. As humans, we experience our feelings about something before we have time to intellectualize about it. Trust those feelings and don’t give yourself too much time to rationalize a poor decision. Make decisions for a reason; don’t make reasons for a decision.
Fight the urge to be stubbornly consistent.
Think of something that you now believe in strongly that you didn’t used to believe in. In a few sentences, describe the differences between your past belief and your current belief, and why you’ve come to see things differently.
How do you reconcile the inconsistency between what you used to believe and what you believe now?
What tactics can you use to keep an open mind and avoid sticking to a stubborn consistency in the future?
The Social Proof Principle states that we decide what’s correct based on what other people think is correct. If lots of other people are doing something or thinking something, then it must be good and worthy of imitation.
We see this all the time in everyday life. When we see a crowd of people forming on a street, we instinctively want to join. “If all those people are gawking at something, it must be something interesting that’s worth checking out,” our brains tell us.
As you can probably guess by now, however, compliance practitioners of all stripes are very good at manipulating social proof to get us to behave in ways that we otherwise wouldn’t.
For example, look at canned laughter on TV sitcoms. Although it’s not as prevalent as it was a generation ago, TV producers still overlay recordings of human laughter following “funny” lines on the show. Experiments show that the audience watching at home finds the show funnier if they hear other people laughing.
The Social Proof Principle is strong enough to override our basic intuition and knowledge. We laugh along with the laugh track even when the underlying content isn’t funny, and even when we know that it’s a fake recording. The transparency of the ruse does nothing to diminish its effectiveness.
Like the turkey mothers with the fake “cheep cheep” noise, we’re responding to a stimulus (the noise of laughter) according to the same predictable, fixed-action pattern (laughing along ourselves).
The motive, then, is clear for compliance practitioners. If they can convince you that lots of other people are doing something, they can make you do it too.
The Social Proof Principle is one that usually serves us well. In general, you’ll make fewer mistakes if you follow social evidence than if you ignore it. When a lot of people are doing something, it usually is the right thing to do.
Like the other fixed-action mental shortcuts, social proof saves us a lot of mental effort. We can look to others for how to model our behavior in everyday situations, rather than needing to meticulously analyze everything.
(Shortform note: You can see how social proof would have been a highly useful psychological trait for early humans. By encouraging adherence to group standards of behavior and thought, it probably played an important role in allowing many basics of society like religion, morality, and political hierarchy to develop. It was also a useful aid in achieving important collective goals like agriculture, public works, and military campaigns).
Thus, we are strongly conditioned to do as others are doing. The problem is when we respond to fraudulent or manufactured social proof or when our social proof instinct leads to harmful consequences.
Perhaps unsurprisingly, advertisers are among the most prevalent manipulators of the Social Proof Principle. Advertisements often tout products as being the nation’s “fastest-growing” or “highest-selling.”
The subtext is clear: if so many other people are enjoying this product, why aren’t you? By using this trick, an advertiser doesn’t even need to convince you that the product is good on the merits: they just need to convince you that lots of others think it is.
You can see this also in the charitable, nonprofit sector. Fundraisers love to promote how many donors they have and how much they’ve raised so far in their campaigns (both of these are staples of fundraising telethons). By demonstrating how many people have already contributed to their cause, fundraisers know that they can compel many non-donors to start giving.
The Social Proof Principle will be more potent under certain conditions than others. One such condition is uncertainty.
In unclear or ambiguous situations, we’re more likely to use the actions of others to model our own behavior. This can lead to a phenomenon called pluralistic ignorance, in which a group of people behaves contrary to the norms and standards of most of the individual members of that group.
It really comes down to the difference between how a person acts alone and how people act in a group. Pluralistic ignorance explains why bystanders fail to help individuals in need.
In 1964, in the Kew Gardens section of Queens in New York City, a young woman named Kitty Genovese was murdered. While undeniably tragic, what made her murder internationally famous were the circumstances under which it occurred.
Genovese was stabbed by her killer over the course of a prolonged 35-minute attack, during which she was audibly in excruciating pain and screaming for someone to help her. Several neighbors in the crowded urban environment either saw or heard portions of the attack. Yet many of them failed to intervene.
The media seized on apathy as an explanation for the neighbors’ behavior. According to this theory, we were becoming a “Cold Society” in which people were unwilling to lift a finger to help each other in an hour of need. Given the attack’s setting in New York City, pundits pointed to apathy as a regrettable feature of modern urban life. They warned that episodes like this would only become more common as the country became more urbanized.
But was this really what was going on? Did these witnesses simply shrug their shoulders at Genovese’s plight? Or was something else going on? Psychologists Bibb Latane and John Darley begged to differ.
They argued that individual witnesses failed to help the victim precisely because they knew there were so many other witnesses. There was a diffusion of responsibility: everyone assumed that someone else would intervene or call the police, so no one actually did. The knowledge of fellow witnesses lowered the psychological cost of non-involvement for each individual.
They also argued that uncertainty drove the witnesses’ actions (or non-actions).
As we mentioned above, the Social Proof Principle thrives in conditions of uncertainty and ambiguity. In these scenarios, we strongly model our behavior on what others are doing.
In Latane and Darley’s analysis, this created a feedback loop with deadly consequences for Genovese. In the confusing and chaotic atmosphere of a crowded urban environment, Genovese’s screams could have been any number of things: drunken exuberance, perhaps, or a lover’s quarrel, neither of which would warrant intervention from a stranger.
Everyone hearing it could plausibly claim that they didn’t know what was really going on. So they looked to everyone else to guide their own behavior. And since everyone else was doing nothing, each person thought that the screams were a non-emergency. “After all, if this were an emergency, at least someone else should be looking alarmed and calling the police.” But if everyone is thinking this, no one is acting, and so everyone believes it’s a non-emergency. So as a whole, the group made the collective decision to let Genovese die.
(Shortform note: Subsequent investigations of the murder have discredited much of the early mythology that formed around it in the 1960s. Far fewer people heard parts of the attack than had previously been thought. None of them witnessed it in its entirety. At least two neighbors did call the police. And, in fact, Genovese died in the arms of an elderly neighbor who came down to help when she heard Genovese’s anguished cries.
The story is still useful as a demonstration of the principle of pluralistic ignorance, so we’ve opted to keep it in the summary, but we want to note that the facts have changed quite a bit since Influence was published. If you’re interested in learning more, check out this article from the American Psychological Association on pop psychology’s “tall tales.”)
The Genovese story shows the folly of the idea of “safety in numbers.” With the powerful force of pluralistic ignorance working against you, you’re probably in greater danger in a large group than you are in a small group.
As proof, Darley and Latane staged an experiment in which a college student pretended to have a seizure. The student received help 85 percent of the time when there was only a single witness. But he received help only 31 percent of the time when there were five bystanders.
On an individual level, then, people are not apathetic. Quite the contrary, they’re remarkably eager to help. People quickly leap into action once they realize an emergency is underway.
The lesson is clear. Single out an individual if you are ever in need of help in a public place. General, non-directed cries of “Help!” can too easily be sucked into the vortex of pluralistic ignorance. People will either be able to tell themselves that there’s no real emergency or that someone else will handle it. Don’t let diffusion of responsibility and uncertainty take root. Call out, “Hey you, in the red cardigan. I need help, go call 911 now!”
The second condition under which social proof thrives is similarity or familiarity. We model our own behavior after that of people we believe to be similar to us.
This has real-world ramifications. We’re more likely to offer assistance to people that we perceive as being in our in-group and are susceptible to appeals or calls to action from people that we judge to be fellow members of the same in-group.
The Wallet Experiment
In one Columbia University experiment, researchers wanted to test whether Americans would be more willing to assist someone they perceived as being a fellow American.
The researcher placed wallets on the ground in Manhattan for passersby to pick up. All the wallets contained the same amount of money. But there was a crucial difference between some of them. In one group of wallets, the researchers enclosed a letter written in broken English—to convey to the finders that the “owner” was a foreigner.
In other wallets, the letter was written in standard English—signifying that the “owner” was a native-born American.
The results were stark. People only returned 33 percent of the “foreign” wallets, whereas they returned 70 percent of the “native” wallets. Clearly, people were more willing to do an act of kindness for someone they saw as being similar to themselves than they were for someone who they perceived as being “other” or an outsider.
The Person On the Street Commercials
A favorite tactic of advertisers is the “person on the street” testimonial in television commercials. We’ve all seen it: average-looking and average-sounding people appear on screen and offer glowing personal testimony about the phenomenal experience they had with a given product.
The purpose is clear: to convince you that the person on screen is just like you. Once you identify with the person in the commercial, you’ll want to emulate their behavior: by purchasing the product in question.
Advertisers know how to use this tactic to drill down into the specific demographics of their product’s target audience, like having young people appear in ads for youth-oriented products.
Of course, the problem is that many of these people are not real users of the product: they’re paid actors cast to look like everyday, on-the-street people. The advertisers are hoping to build an affinity between the audience and the actors in the commercial.
We emulate those who we see as being like ourselves. This can even extend to life-and-death circumstances. Research has shown that when a suicide is prominently featured in the news, copycat suicides start popping up. Within two months of a famous suicide, there are about 58 more suicides than there otherwise would be.
Given what we know about social proof, it should come as no surprise that there is often a similarity between the original suicide victim and the copycats. When a young person takes their own life, the imitative suicides occur among young people. When an older person dies by suicide, the copycats are other older people.
The most startling and grisly example of compliance through social proof was the Jonestown mass suicide event of 1978. Jim Jones, the charismatic leader of the People’s Temple cult, convinced over 900 men, women, and children to poison themselves at the Jonestown temple complex, located in the remote jungles of Guyana. They did this by drinking Kool-Aid laced with cyanide.
(Shortform note: This is where the phrase “drink the Kool-Aid” comes from. To drink the Kool-Aid is to blindly engage in some self-destructive act at the behest of a leader who promises lavish rewards.)
How did Jones manage to get hundreds of people to comply with this ghastly request? He did it through a potent mixture of the uncertainty and similarity factors of the Social Proof Principle.
Under these circumstances, all it took was a few people to start drinking the poison for social proof to compel the entire group to follow in kind. Correct behavior was defined not by any individual’s sense of right and wrong, but by what the group was doing.
How can you push back against the social proof manipulations of compliance practitioners? How can you overcome the overwhelming instinct to conform to group behavior?
Social proof acts like an autopilot: usually it steers us right, but it can land us in trouble if it’s being fed the wrong data. You need to recognize situations where your social proof instinct is being triggered based on manufactured or faulty evidence. Once you recognize that this is happening, you’ll know to think for yourself.
We discussed this earlier when we talked about “person-on-the-street” commercials and canned laughter in sitcoms, but it bears repeating: compliance practitioners want to convince you that lots of people are behaving the way the manipulators want you to behave.
But ruses like fake “real people” or artificial laugh tracks are often comically transparent in their phoniness. (Shortform note: So are other “tells” like implausibly high statistics (“99 percent of dentists agree”) or, in online shopping, companies buying positive, fake user reviews on sites like Amazon and Yelp.) Treat any over-the-top praise from a “neutral” observer with skepticism.
You should immediately pounce once you recognize an effort to deceive you with counterfeit social proof. For example, don’t purchase products that are sold via these unscrupulous “unrehearsed interview” techniques.
You can even go a step further and demand that the company in question fire the advertising agency that produced the ad and vow never to work with them. Blatantly calling out compliance practitioners like this and forcefully demonstrating that these tactics won’t work is the only way to get them to stop.
Not all Social Proof Principle errors are the result of deliberate manipulation. Natural, genuine errors of judgement (like those made by the Kitty Genovese witnesses) can snowball into a group making a bad decision.
A lot of traffic accidents, for example, are caused by just a few drivers switching lanes. The drivers behind them start switching lanes too, thinking that those in front must have had a good reason for switching. Through social proof, this leads to a wave of sudden lane departures, which increases the likelihood of a collision.
Look closer at group behavior. Is there a reason to do something, beyond just the fact that everyone else is doing it? Don’t be like a pilot who flies by relying solely on her instruments. You also need to actually see the sky in front of you. Sometimes you do need to look critically at the world around you, take the time to assess situations, and apply your own individual judgement.
See how you can resist the pull of social proof.
Have you ever gone along with group behavior, despite your own private reservations about what the group was doing? Describe how you felt and why you decided to do as others were doing.
Now, describe a situation where you did the opposite, and bucked group behavior to make your own choice. What made you act on your own instead of going along with the crowd?
Going forward, how can you distinguish between scenarios where it’s right to follow the group and situations where you need to analyze the facts for yourself?
The Liking Principle stipulates that we’re more likely to comply with requests from people that we know and like. Thus, we are more amenable to the compliance efforts of neighbors, friends, and family, or from people who claim to know them. We are also more willing to acquiesce to people who we see as being good-looking, affable, or who profess to like us.
As you’ve probably guessed, however, this principle of human behavior creates a wide opening for compliance practitioners who wish to exploit it for personal gain. If they can get us to like them, we’re much more likely to be putty in their hands. If you like the seller, you’ll like what she’s selling.
(Shortform note: It makes sense, from an evolutionary and group-cohesion perspective, why we would be more willing to comply with the wishes of people we know. Early human communities were extended kinship groups, where the survival of the individual was closely tied to the larger familial group. In this world of localized, tight-knit communities, helping people in your in-group was a valuable survival trait: you relied on them to help with the necessities of life, like food-gathering and child-rearing. The members of your kin group would also probably have been the only people you actually encountered. Thus, there evolved a strong propensity to assist individuals who were personally known to us.)
We are far more likely to comply with requests from people we know. The social costs of saying “no” to a neighbor or acquaintance are much higher than they are for a stranger.
Compliance professionals harp on this instinct and use our natural empathy for our friends, acquaintances, and neighbors against us. This explains, for example, why charitable organizations recruit volunteers to go door-to-door in the neighborhoods where they live.
People are less willing to slam the door in someone’s face if the canvasser starts their pitch with, “Hello, I live in this neighborhood…”
The familiarity bias is so strong that the person making the compliance request doesn’t even need to be known to us personally: they just need to drop the name of someone that we do know. This is the “endless chain” technique.
Salespeople will ask a customer for a list of names of friends and neighbors who might be interested in the product. Most customers comply, since it seems like an innocent enough request.
They then approach the people on that list, opening their sales pitch with, “Your friend __ recommended that I call on you.” This puts you, the new customer, in a social bind. Turning away the salesperson feels tantamount to turning away your friend.
It’s called the “endless chain” technique because the salesperson can always rely on references from their current customers to generate the next round of customers. If someone refuses their sales pitch, they can always do a rejection-then-retreat by saying, “OK, sorry that you’re not interested. Would you be able to provide me with some names of friends that might be interested in taking advantage of this incredible offer?” And on and on it goes: the salesperson rarely walks away from a visit without at least a reference.
Going further, we also see that people are more willing to cooperate with requests from people they see as being similar to themselves. We explored some of this when we talked about social proof, but similarity also works on a one-on-one level.
The source of similarity can be almost anything: religion, ethnic background, personality traits, shared preferences, or physical appearance and style of dress. We have evolved to form a bond with people as soon as we can identify some common ground with them.
In one 1970s experiment, researchers assessed how the style of dress of a petition-carrier at an anti-war rally affected people’s willingness to sign it. People were more willing to sign the petition of a requester who was dressed like themselves.
They often did so without bothering to read the petition itself: similarity alone did all the work for the requester.
Familiarity does not always lead to liking and compliance. In the wrong circumstances, it can lead to contempt and viciousness.
The mixed results of school desegregation in the United States speak further to this point. Unofficial, de facto segregation in schools remained widespread decades after the end of formal, legal segregation following the Supreme Court’s decision in Brown v. Board of Education in 1954. Children of different races rarely interacted on a social level even at schools with a high degree of diversity. Clearly, claims that increased exposure to children of different races would lead to racial harmony and inclusion were wrong.
The issue, however, was not desegregation itself. It was the circumstances under which it was being done. School is a highly competitive environment. Children are jostling with one another for the approval of peers, teachers, and administrators. Desegregation was always going to be in trouble in such an environment. Children saw peers outside their own racial group as potential competitors for these scarce resources. Just throwing children of all different racial groups together was a recipe for conflict.
The solution was to bring the children together in a multi-racial group project, where each individual could only succeed if the group succeeded. Research done by Elliot Aronson in the 1970s demonstrated the potential of cooperative learning techniques to break down social and racial barriers in the classroom.
The “jigsaw puzzle” classroom is a twist on this. Class activity is centered around answering questions that have multi-part answers. Each child in a group is responsible for giving one part of a multi-part answer. Thus, the children have an incentive to help each other succeed: they all fail if one of them fails.
We also associate “liking” someone with their degree of physical attractiveness. We are more likely to acquiesce to requests from people we think are attractive.
Attractive people benefit from what social scientists call a “halo effect.” This means that we extrapolate from one positive characteristic (being attractive) and credit that person with having other positive characteristics, even with no evidence. Thus, we associate physical attractiveness with intelligence, humor, talent, kindness, and honesty.
It is no wonder that persuaders of all stripes know how to exploit this cognitive bias to maximum effect. A study of Canadian elections showed that, controlling for other factors, attractive candidates on the whole received more than two and a half times as many votes as unattractive candidates.
Based on the candidates’ attractiveness, voters seem to have been disproportionately persuaded of their other good qualities (policies, ideology, trustworthiness, competence). What makes this bias even more effective is that most people deny that it even exists—73 percent of voters claimed that a candidate’s looks had nothing to do with how they voted. No one wants to be seen as superficial, but it seems most of us behave that way.
The attractiveness bias extends beyond the political arena. In a 1990 study of job applicants, researchers found that the physical appearance of an interviewee was more positively correlated with being hired than experience and qualifications. This was despite employers claiming that physical appearance played no role in their hiring decisions.
Similarly, even the supposedly impartial judicial system is vulnerable to the attractiveness bias. A Pennsylvania study found that handsome male defendants received significantly lighter sentences than unattractive defendants. The good-looking defendants were twice as likely to avoid prison as the others.
We also like or dislike people based on what we associate them with. We like people who bring us good news, and “shoot the messengers” who bring us bad news.
In the ancient Persian Empire, imperial messengers were always in a precarious position. They would be fêted and celebrated by the emperor if they brought tidings of a military victory. But if they came bearing news of a defeat, they would be summarily executed.
In a modern-day analogue, consider the plight of weather forecasters. Weather forecasters have been shown to be met with hostility, threats, and even violence when they predict bad weather. The association bias is so strong that people come to believe that adverse weather events are caused by the forecasters themselves—meteorologists have been accused of causing snow, tornadoes, and hurricanes!
Compliance practitioners are adept at associating their products or brands with people or things that the public knows and likes, in the hopes that we’ll come to like the products more by association.
Presidential candidates seek the support of athletes, performing artists, and other non-political cultural figures during a campaign, even though the approval of these people has nothing to do with government or public policy.
After the 1969 moon landing, all sorts of products incorporated space travel or lunar themes into their marketing and advertising. And every two years, we’ve become used to the spectacle of brands labeling themselves “the official fast food,” “the official toothpaste,” or “the official soft drink” of the U.S. Olympic team.
Athletes are probably the most sought-after product endorsers, because marketers know that their associative persuasive appeal is deep and widespread, cutting across ethnic, regional, age, and economic demographic groups. Also, athletes are linked to many positive attributes that brands are eager to associate themselves with: youth, strength, winning, prowess, and physical attractiveness.
(Shortform note: A quick look at some of the biggest endorsement deals in sports will show just how high a premium brands place on the association principle. NBA Star Steph Curry’s contract with Under Armour is reportedly worth $285 million. Global soccer icon Cristiano Ronaldo’s deal with Nike is estimated at around $1 billion. And NBA megastar LeBron James has a lifetime endorsement deal with Nike that goes into ten figures.)
Not only do marketers wish to associate their products with well-known symbols of public admiration—people try to associate themselves with the achievements and victories of others.
The psychology of sports fandom illustrates this quite well. Sports are a deadly serious business for the fan. Their very sense of self hangs in the balance depending on the outcome of the game.
This is compounded because most sports teams are rooted in a specific geographic location. Thus, the fan wraps up the team’s performance with their own sense of pride in their culture, their home, their family, and their being: when the team loses, the fan personally feels like a loser. Thus, fans cheer the players who spur the team on to victory and mercilessly hound those whom they judge responsible for its failures.
(Shortform note: One famous example is the story of Boston Red Sox first baseman Bill Buckner. His error in Game 6 of the 1986 World Series caused the team to lose the game, and ultimately the series, when they were on the verge of clinching the championship. The reaction from fans and the media in Boston was brutal: Buckner received death threats and was forced to leave the city for his own safety.)
Association shows itself most strongly in the language fans use to describe a team’s performance. Studies have shown that when the team wins, fans use more first-person pronouns to describe the outcome: “We won,” “Our offense dominated theirs today,” “My team wiped the floor with you.” Because of the victory, fans are looking to deepen their connection to the team.
The opposite phenomenon happens in the wake of a loss, with fans using more indirect, third-person language: “They totally blew it out there,” “Those guys were sleepwalking through that game,” “That performance was horrible.” Because of the loss, fans are looking to disassociate themselves from the team.
Not only do we like other people, but we’re especially predisposed to like people who claim to like us. This might seem obvious, but it’s surprising how effective it is.
Indeed, the bias is so strong that compliance practitioners can be shockingly transparent, even vague, in their flattery and still succeed with this tactic.
Joe Girard was acknowledged by the Guinness Book of World Records as the world’s greatest salesperson, having personally sold over 13,000 Chevrolets during the 1960s and 70s. Clearly, he was a compliance practitioner par excellence. But what was his secret?
Every month, he would send his customers (numbering well into the thousands) a greeting card with the same message: “I like you.” That’s it. No variation on the message, no further detail than that. And with Joe’s track record of success and humans’ known susceptibility to flattery, he was very likely on to something.
In one study, men received comments from an evaluator who, they were told, needed a favor from them. The comments were either positive, negative, or neutral.
The evaluators who gave the men pure praise were rated as the most-liked, even though the men knew that the praiser had an ulterior motive. Going even further, the subjects still gave these evaluators high marks for likeability even when the compliments they received were completely divorced from reality. Apparently, praise doesn’t have to be genuine or even accurate for us to be susceptible to it.
So, how do you resist the charms of seemingly likeable and personable compliance practitioners? It’s hard to come up with a single rule or a one-size-fits-all defense tactic for a simple reason: there is enormous variability in what each of us considers to be likeable.
As with the defenses against the other fixed-action manipulations, it all comes back to recognition. You can’t simply commit yourself to not liking anyone. That path leads only to isolation and unhappiness. Instead, you should let yourself like people you find to be charming, but be aware when you feel that you like someone more than you should given the circumstances.
If you like someone or something more quickly or more intensely than would be reasonable, hit the pause button and rethink. For example, if you find yourself liking a salesperson after less than an hour of interacting with her, ask yourself whether she did anything that might have unduly influenced your judgement.
If you can answer “yes,” then you should mentally separate your liking for the person from the thing they’re trying to get you to comply with.
Make decisions based solely on the pros and cons or on the intrinsic merits of that decision. Don’t fall into the Liking Principle and say “yes” to people simply because you like them personally.
Make rational decisions based on their own merits, not because you like the person asking you to make them.
Have you ever been in a situation where you found yourself liking a person more than you otherwise would, given the circumstances? In a few sentences, explain what happened.
In the future, how can you help yourself separate the person making a request of you from the request itself?
The Authority Principle states that people are hard-wired to comply with requests that come from an acknowledged and accepted source of authority. Thus, we are strongly inclined to be deferential to people who we consider to be in a position of power or expertise. Examples would include teachers, members of the armed forces, police officers, doctors, and judges, to name just a few.
The instinct is so powerful that humans are responsive even to the mere vestiges or symbols of authority—titles, uniforms, and insignia can exert a strong influence. Given this strong fixed-action pattern of compliance, it is no wonder that compliance professionals know how to use the appearance or suggestion of authority to force us to accede to their requests.
Like the other fixed-action instincts, there are good and legitimate reasons why we’re strongly conditioned to obey authority. Leadership, hierarchy, and authority are obviously necessary ingredients in any functioning society.
Unless we want a state of complete anarchy and lawlessness, there’s always going to be somebody making decisions and giving orders. Authority of some people over others was a great advantage to early humans as they were building the first organized societies—and eventually, the first civilizations. It allowed for resource allocation, trade, military organization, economic development, and the rule of law.
Early civilizations were civilizations precisely because they had some centralized authority that was able to make decisions for the collective good and marshal the necessary resources to implement them. In the absence of authority, there is anarchy, the state of nature that English philosopher Thomas Hobbes famously described as being “nasty, brutish, and short.”
This is why we’re so strongly oriented toward obedience and deference to authority. We learn this in the very beginning of life when we are taught to obey and respect our parents, and the message only gets reinforced in the educational, legal, religious, military, and political systems we navigate throughout the course of our lives.
Authority and Obedience in the Old Testament
For many people in what we would call “the West,” the Bible is the touchstone text that forms the foundation of our moral values. It also happens to be shot through with the theme of deference to authority.
In the Book of Genesis in the Old Testament, Adam and Eve’s fall from grace is the result of a failure to obey God’s command to not eat fruit from the Tree of Knowledge—the origin of the concept of original sin.
Later in the Old Testament, God commands Abraham to sacrifice his son Isaac. Abraham is perfectly willing to go through with this until God stops him: the episode was merely a test of Abraham’s willingness to obey a higher authority.
Just how deep does our obedience to authority run? What can normal human beings be compelled to do at the behest of an authority figure? The results of the (in)famous experiment conducted by Stanley Milgram at Yale in 1961 demonstrated just how far authority can be used—and abused.
The test subjects were told that they were participating in an experiment to measure the effects of punishment on learning and memory. One set of participants (the Learners) was tasked with memorizing lists of words. The other set of participants (the Teachers) had to measure the Learners’ progress and administer electric shocks to the Learners whenever the latter made a mistake. The lab-coated researcher was always present to ensure that both the Teacher and the Learner carried out their assigned responsibilities.
The Learners, however, weren’t really test subjects, but were instead hired actors. And there was no real electric shock being delivered. But the Teachers didn’t know this: they believed they were delivering real shocks to real people.
The experiment was really measuring people’s willingness to inflict pain on others if it was part of their “job.” At first, the Teachers were told they were administering mild shocks to the Learners, akin perhaps to the static electricity you might feel when you touch a doorknob after walking across a carpet.
But for every wrong answer, the Teachers were told to increase the intensity of the shocks by 15 volts. By the fourth or fifth administration, the shocks were clearly painful.
Yet, the Teachers continued to administer them, even when they themselves were clearly distraught at the idea of having to inflict severe pain on someone else.
They might glance over nervously at the researcher leading the experiment. But the researcher would only order the shocks to continue, in a series of increasingly harsh commands:
As a result, the Teacher would dutifully, if reluctantly, carry on with the task. Of the 40 Teachers, not a single one refused to continue the shocks when the victim begged to be released. Only when the victims began to scream in agony and/or show signs of unconsciousness or heart failure did any of the Teachers quit—and even then it was a distinct minority.
Milgram was shocked at how willing his subjects were to carry on with their task as long as there was an authority figure (the stern-faced researcher in the lab coat) urging them on.
There was nothing particularly special about the group of people selected to be Teachers: they were chosen randomly and represented a broad cross-section of ages, occupations, and educational levels. In repeated trials, Milgram found that people of all backgrounds, and of both sexes, were equally willing to administer the shocks: it was a universal characteristic.
Variations on the experiment homed in on the authority of the researcher as the decisive factor. In one twist, the researcher and the Learner switched scripts: the lab-coated researcher ordered the Teacher to stop administering the shocks, while the Learner insisted on carrying through with the experiment. Every single Teacher stopped administering the shocks when the person asking them to continue was merely an equal instead of a superior.
Similarly, when the researcher was receiving the shocks and begged to be released, all of the Teachers refused to administer any more. Milgram concluded that the sense of duty to an acknowledged authority figure was compelling enough to drive the test subjects to inflict excruciating pain on others.
The Milgram experiment presents an illuminating and disturbing picture of just how far our fixed-action pattern of compliance with authority will take us. But authority can also be faked or manufactured.
Titles, uniforms, and other outward symbols of dominance or expertise can create the patina of authority. Naturally, compliance practitioners know how to present themselves in a way that conveys authority and expertise to their victims.
Beyond the usual mix of salespeople and politicians we’ve talked about so far, a whole variety of con men, grifters, and charlatans are quite savvy at projecting artificial authority to force compliance.
Titles often have more power to direct our behavior and perceptions than the actual person claiming the title.
In one Australian study, different groups of subjects rated the same person as being taller depending on the title they were told that he held. With each increase in title (student, lecturer, professor), the subjects’ perception of his height grew by half an inch! This perhaps explains why door-to-door scammers frequently wear lifts in their shoes. Psychologically, it conveys a certain air of authority and gravitas.
This deference to title can have frightening effects in the medical community. One study from the Midwest demonstrated that 95 percent of nurses were willing to unquestioningly comply with orders to give dangerously high doses of medication to patients, as long as the order came from someone who claimed to be a “doctor.”
Con artists know that most people will obey the orders or recommendations of someone in a uniform. Whether it’s a police uniform, military camouflage, a doctor’s lab coat, or even a snappy business suit, uniforms and modes of dress send a powerful message of authority.
In one experiment in Texas, researchers had a young man cross a busy intersection into oncoming traffic, to test how willing other pedestrians would be to follow him. When the young man was dressed in a freshly pressed business suit, people were 3.5 times as likely to follow him as when he was dressed in regular street clothes.
The “bank examiner” scam is beloved by con artists because it uses the simple power of dress to bamboozle people out of their savings. The con artist, dressed in conservative business attire, knocks on the door of the victim’s home, claiming to be a bank examiner from the victim’s financial institution. The con artist will claim that there has been some irregularity with the victim’s bank account, and inform the victim that a bank officer might be doing phony transactions with the victim’s money.
The victim is then asked to withdraw all their money from their account, so that the “bank examiner” can trace the record of the transaction to see if they can catch the embezzling bank officer in the act.
After the victim has withdrawn all their money, another bank official turns up at their door. This time, it’s a uniformed bank officer who reassures the victim that there was no malfeasance with their account after all. This person then offers to re-deposit the victim’s money to save them the trouble of another trip to the bank—which, of course, they never do. The scam works because the counterfeited uniforms lull the victim into compliance.
Refusing to comply with authority can be extremely difficult given how strongly conditioned most of us are to obeying orders. And, of course, we shouldn’t seek to rebel against all authority figures in all situations.
For example, it would be a bad idea to refuse to pull over your vehicle when police officers signal for you to do so. Most authority figures like police officers, judges, lawyers, and doctors are in their positions for good reasons. They have special training, expertise, or legal sanction to be in charge in certain situations.
The trick, as always, comes in recognition. When should authoritative promptings be followed and when should they be ignored? There are two tests to apply to help you answer this question.
Ask yourself, “Is this person truly an expert?” This will prompt you to assess the person’s credentials.
It will also prompt you to evaluate the relevance of the authority figure’s credentials to whatever it is that they’re trying to get you to do. This test will steer you in the right direction both ways: its logic will compel you to obey legitimate authority figures and be highly skeptical of fraudulent authority figures or those with ulterior motives.
For example, a doctor in a hospital giving you medical advice about your upcoming surgery has a) legitimate credentials and b) highly relevant subject-matter expertise in the topic at hand.
Conversely, an actor who plays a doctor on TV and is endorsing some pharmaceutical or health product in a commercial has a) no credentials or training as a real physician and b) irrelevant credentials in another field: she’s an expert in acting and performance, not medicine.
In addition to examining an authority’s expertise, you need to look at how truthful they’re being. Do they have an ulterior motive? Do they have a reason to lie or mislead? Often, experts stand to gain personally from our compliance with their recommendations.
Compliance practitioners know that we might be looking closely at their true motivations, however. In a further attempt to deceive us, they’ll often go out of their way to appear to argue against their own interests.
For example, a waiter might dissuade you from ordering an expensive entree by telling you that it’s actually not that good, and instead recommend a slightly cheaper main course. In doing so, the waiter establishes trust and credibility: he must have your best interests at heart, because he’s talking himself out of money! Having won your trust, the waiter can then persuasively recommend expensive side dishes, wines, and desserts: after all, he’s proven that he’s a subject-matter expert and a guardian of your best interests. Of course you should trust his subsequent recommendations!
Influence truly is a cat-and-mouse game.
Discover when you should and shouldn’t listen to authority.
Have you ever complied with a request from an authority figure who turned out to be illegitimate or self-serving? Describe what happened.
What compelled you to comply with this request?
How can you apply what you’ve learned in this chapter to distinguish between authorities you should listen to and those you should treat with skepticism?
The Scarcity Principle tells us that we find more appealing those things with limited availability. On a basic level, we encounter this all the time: rare goods are expensive, while abundant items are cheap.
Scarcity is closely related to the idea of loss aversion. As humans, we are powerfully guided by our desire to avert losing what we already have. We are inherently conservative and cautious. Loss aversion is a strong framing effect: we are more afraid of losing something than we are enticed by the hope of gaining something of equal value.
As you know by now, compliance practitioners know how to frame their proposals to make them seem like rare, fleeting opportunities that we’ll miss if we don’t capitalize on them immediately. “Limited-time only” or “first come, first served” sales offers are the most common use of the Scarcity Principle. Compliance practitioners can greatly enhance the appeal of their product if they can convince you that what they’re selling is in short supply, will be taken off the market imminently, or is only available to an exclusive set of customers.
The Scarcity Principle is powerful because it manipulates our desire to be in control and have as many options as possible: when we face a deadline or a competitive scramble for a rare item, our freedom to have whatever we want is limited.
As we’ll see, the Scarcity Principle is most effective when an item has only recently become scarce and when we’re in competition with others for it.
Scarcity, or the appearance of it, is a powerful motivating factor in influencing our behavior. We want things we can’t have—or at least think we won’t be able to have if we don’t act quickly and decisively.
Like our other fixed-action mental shortcuts, scarcity usually is a good gauge of how valuable something is. It’s simple supply-and-demand: when there’s less of something and there’s a high demand for it, the price increases.
(Shortform note: Throughout human history, across all societies, humans have valued rare goods and skills. Gold was used as the standard for currency in ancient societies because it was a highly rare metal. In the late medieval and early modern period, European traders braved dangerous sea and land routes to access the spices of the East Indies, because they knew that the rarity of these commodities would make them desirable and valuable to other Europeans who couldn’t access them locally. Today, people with expertise in fields like bioengineering and cryptocurrency are paid very well because there are very few people who have their knowledge—there is a scarcity of their skillset.)
The Scarcity Principle derives much of its strength from a phenomenon known as psychological reactance. This is the adverse reaction we have to any restriction of our choices.
When something is freely available and abundant, we don’t feel any limitation in our options: we can have as much of it as we want. Conversely, scarcity limits our choices, especially when whatever we desire was previously abundant.
Psychological reactance stems from loss aversion, our desire to preserve what we already have: when this freedom is restricted, we desire the item more than we did before.
“The Terrible Twos”
Anyone who has toddlers can tell you that the job of parenting becomes much harder after the child’s second birthday. Children at this age start exhibiting:
Outright defiance (doing the opposite of what their parents tell them).
Refusal to let their parents hold them and simultaneous refusal to be put down.
Rejection of food, sleep, and any other source of comfort that used to be able to calm them down.
This happens because children at this age develop a sense of individuality: they realize that they are persons in their own right. They come to desire autonomy and freedom of choice, and saying “no” is the most powerful expression of that autonomy. They feel that this new freedom is being taken away from them (being made scarce) by attempts to force them to eat, sleep, or clean up their toys.
Thus, psychological reactance kicks in: they strenuously resist their parents’ attempts to curb their freedom through tantrums.
The Romeo and Juliet Effect
The famous Shakespearean tragedy Romeo and Juliet tells the story of two young star-crossed lovers who ultimately choose to take their own lives rather than let their feuding families tear them apart. Romeo and Juliet, however, might just be literature’s most famous example of psychological reactance.
Psychologists have studied the “Romeo and Juliet” effect to see the role that parental interference and restriction play in driving teenage romances. One study of 140 Colorado couples showed that couples who reported parental disapproval of the relationship were more likely to want to get married.
Moreover, when the parental interference increased, so did the feelings of attraction. The lesson is clear: don’t spark psychological reactance if you’re a parent. If you don’t like your child’s significant other, your disapproval will likely drive your child right into their arms.
The Dade County Phosphate Saga
By now, you should be getting the idea that psychological reactance makes us want what we can’t have. It’s the forbidden fruit, the scarcity that makes it attractive.
One case from Florida stands as a powerful demonstration of this concept on a community-wide scale. In the early 1970s, Dade County, Florida (containing Miami) banned the use and possession of phosphate in cleaning products. The reaction of residents was classic psychological reactance: they began importing phosphate cleaners in droves, often resorting to smuggling and hoarding!
Not only did they start consuming more phosphate cleaners, they also started changing their attitudes and opinions about these products. The majority of Miami consumers came to see phosphate cleaners as being better products than they had before.
Psychological reactance produced the desire, and then people retroactively assigned positive attributes to the product to justify that desire.
We’ve established that scarcity is a powerful motivator of human behavior and a useful tool for compliance practitioners.
A famous experiment by social psychologist Stephen Worchel demonstrated just how vulnerable human beings are to man-made scarcity. Test subjects were asked to eat a chocolate chip cookie from a jar and rate its taste and overall quality. But there was a twist: half the participants were asked to evaluate a cookie from a jar that contained ten, while the other participants were asked to rate a cookie from a jar containing only two.
Based on what you know about scarcity from the other examples in this chapter, you can probably guess that participants rated the “rarer” cookie as being tastier and of higher quality (though the cookies in both jars were identical).
OK, no surprise so far. But two additional findings from the study show that scarcity is even stronger under certain conditions.
Worchel wanted to test whether people would desire the cookies more not only when they were scarce from the start, but when they had recently become scarce.
In this experimental twist, participants were first asked to evaluate cookies from the ten-cookie jar. They were then asked to rate a cookie from the two-cookie jar. Thus, the previously abundant supply of cookies was suddenly and drastically reduced.
This group that experienced a sharp drop in their cookie supply rated the cookies higher than those who had only known scarcity from the beginning.
We can see this dynamic playing out in the real world, beyond the confines of this cookie experiment. People in the Soviet Union in the 1980s had become accustomed to a higher material standard of living and greater government tolerance of free expression under Premier Mikhail Gorbachev's twin policies of perestroika (reform) and glasnost (openness).
When a cadre of Communist Party hardliners ousted Gorbachev in 1991 in a coup and sought to reinstate the repressive policies of the Soviet past, the results were shocking to party leadership: the people rebelled, rioted in the streets, and refused to give back their newly earned and hard-fought freedoms.
This was a classic example of the power of recent scarcity: freedoms were being taken away that people had become accustomed to. Once the Soviet people had tasted freedom, it was clear that they would fiercely resist any attempts to claw it back.
Worchel added yet another twist to the cookie experiment. Certain participants who saw their cookie supply dwindle from ten to two were told that the experimenters had made a mistake and over-assigned cookies to their jar.
Another subset of participants, meanwhile, was told that their cookies had to be taken away so that they could be given to other raters.
The results were clear: people liked the cookies more when they became scarce through social competition than they did when they became scarce by accident.
Scarcity in Television
A famous example from the world of network television illustrates just how firm a grasp scarcity-through-competition can have over people's rational thought processes. In the 1970s, Barry Diller was in charge of prime-time programming at ABC. In 1973, he and his rival network executives were presented with what seemed like a great opportunity: to air the hit film The Poseidon Adventure.
A bidding war for the rights to air the film broke out between Diller and the executives from NBC and CBS. In the end, Diller won the prize by agreeing to pay the movie studio the then-unprecedented sum of $3.3 million for one airing of the film. This was a gross overpayment by any standard: ABC ended up losing $1 million on the deal!
The Scarcity Principle (and its social competition accelerant) was the likely culprit for Diller’s colossal misjudgement. This was the first time a studio had put the rights to a film up for auction to the networks: the competitive frenzy and desire not to be outbid made Diller vastly over-value the film.
How do you avoid getting caught up in the mania of scarcity? How do you resist the temptation of “buy now, limited-time only?”
Knowledge, as with the other principles, is power. When you sense that your desire for some scarce item or experience is clouding your judgement, precisely because it is scarce, you should hit the pause button.
You're never going to make a wise decision in this state of irrational exuberance. Compliance practitioners know this all too well; that's why they try to work you up with scarcity tactics.
Once you realize that this is where your mind is going, you can begin to use your rational faculties to better assess the potential decision. Ask yourself, "Do I truly want to experience this scarce thing or merely possess it?"
In other words, do you really value it for its intrinsic social, economic, material, or psychological benefits, or is your desire driven solely by scarcity itself? If your answer is the latter, then you are probably being lulled into a scarcity compliance trap.
Rare or scarce things aren't inherently better than common things. Compliance practitioners know this better than anyone. Used-car salespeople, for example, know that they can greatly increase the desirability of a given car simply by making potential customers wait in line while other potential customers inspect it and take it for test drives.
But the fact that they have to wait doesn’t make the car any better: the tires, transmission, shocks, and alignment are still the same. These customers become consumed by their desire to have the car, not by the actual utility of the car.
You should remember this as you confront the compliance practitioners of the world. Is the rarity of the thing itself what’s drawing you to it? If so, are there any inherently desirable qualities that would make the item rare? If there aren’t, then you might be buying something for entirely the wrong reasons.
Figure out whether you really want what you think you want.
Have you ever wanted something, solely because it was rare or difficult to obtain? Describe your emotions and thought process.
Think of something rare that’s actually cheap or undesirable. What do you think this says about the relationship between scarcity and quality?
How can you teach yourself to desire things because of their intrinsic value and not because they’re rare?
By reading this summary, you’ve hopefully gleaned some important insight into how the compliance practitioners of the world are looking to pull the wool over your eyes by manipulating your fixed-action patterns.
Whether through reciprocity, consistency/commitment, social proof, liking, authority, or scarcity, the goal is always the same: to get you to suspend your rational judgement and comply with a request that you would otherwise never agree to. Although our fixed-action patterns were useful survival tools for our prehistoric ancestors (and still largely serve us well today), they are exploitable mental blind spots in the hands of a seasoned compliance practitioner.
By recognizing when someone is trying to pull one of these tricks on you, questioning your own decision-making process, and forcefully calling the tricksters out, you’ll lead a happier, less stressful life. You’ll make decisions for the right reasons. You’ll keep more of your money. Most importantly, you won’t be a sucker.
Examine your own mental blind spots.
Of the six principles of persuasion described in this summary, which one do you feel you’re the most vulnerable to? Explain why.
How can you apply what you’ve learned to cover this mental blind spot and resist the compliance practitioners who’d like to exploit it?