Online Manipulation: All The Ways You’re Currently Being Deceived

by Alex Birkett, November 25, 2015

There’s a fine line between online persuasion and manipulation.


In university, most classes, at least in the humanities and social sciences, dealt at least in part with the morality of the lessons we learned. In marketing and communications classes especially, we wrestled with the line between persuasion and propaganda.


Online, though, for whatever reason, ethics aren’t discussed as frequently. Of course, we’re all in the business of persuasion, at least to the extent that we’d like people to buy our products. Nir Eyal put it well a while back:



“We build products meant to persuade people to do what we want them to do. We call these people “users” and even if we don’t say it aloud, we secretly wish every one of them would become fiendishly addicted.”


But, in the end, what differentiates persuasion from its more evil cousin?


The Facebook Controversy: A Contemplation of Ethics

Not that ethics in experimentation has ever been very clear cut, but the internet has made things even weirder.


We all (hopefully) remember the Facebook scandal, where Facebook manipulated the content seen by more than 600,000 users in an attempt to see if they could affect their emotional state. They basically skewed the number of positive or negative items on random users’ news feeds and then analyzed these people’s future postings. The result? Facebook can manipulate your emotions.




As AV Club put it, “Result: They can! Which is great news for Facebook data scientists hoping to prove a point about modern psychology. It’s less great for the people having their emotions secretly manipulated.”


The publication of these results, of course, led to massive blowback. Though users had technically given Facebook permission in its terms of service, it still came as a shock that the company would test emotional manipulation on such a massive scale. It became one of the most salient discussions of online manipulation in recent memory.
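To make the mechanic concrete, here’s a minimal sketch of the kind of feed skewing described above. This is purely illustrative, not Facebook’s actual system; the sentiment labels, drop rate, and function name are assumptions.

```python
import random

def skew_feed(posts, suppress="positive", drop_rate=0.3, seed=42):
    """Randomly withhold a fraction of posts carrying the targeted sentiment.

    Each post is assumed to already have a hypothetical "sentiment" label.
    """
    rng = random.Random(seed)
    return [
        post for post in posts
        if post["sentiment"] != suppress or rng.random() > drop_rate
    ]

feed = [
    {"id": 1, "sentiment": "positive"},
    {"id": 2, "sentiment": "negative"},
    {"id": 3, "sentiment": "neutral"},
    {"id": 4, "sentiment": "positive"},
]

# The user sees a feed with fewer positive items than their friends actually posted.
print(skew_feed(feed, suppress="positive"))
```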


What is “Skinner Box Marketing”?

You remember B.F. Skinner from Psychology 101? He conducted all sorts of messed-up studies on ‘operant conditioning’. Well, there are a few voices claiming that we’re entering a new age of Digital Skinner Box Marketing. According to The Atlantic, it’s essentially this:



We’re entering the age of Skinnerian Marketing. Future applications making use of big data, location, maps, tracking of a browser’s interests, and data streams coming from mobile and wearable devices, promise to usher in the era of unprecedented power in the hands of marketers, who are no longer merely appealing to our innate desires, but programming our behaviors.


Joseph Bentzel broke the process down into three broad pillars.


Three Pillars of Skinner Box Marketing:


1. Emotional manipulation as ‘strategy’


According to Bentzel, “the emotional manipulation is always and often rationalized under the flag of ‘serving the customer’.”


2. The new ‘chief marketing technologist’ as that ‘man behind the strategy curtain’.


Bentzel:



“On the surface the ‘chief marketing technologist’ looks like strategic marketing progress in the digital age. But when you combine these powerful technologies with emotional manipulation as strategy—you’re getting pretty close to something more than a few folks see as unethical.”


3. The ‘big data scientist’ as ‘skinner box management’ provider.


You need big data to fuel these insights. Therefore, Bentzel lists this as the third pillar of Skinner Box Marketing:



“And not just any big data but the kind of big data that provides a 360 degree surround-sound view of the specimen in the digital skinner box, aka the consumer. The big data component of the skinner box data management model is now on the agenda at the big analyst firms. They call it ‘customer context’.”


Digital Market Manipulation


The ascension of Skinner Box Marketing on the internet is backed by an academic concept called Digital Market Manipulation.


Market Manipulation, the original theory, supplements and challenges law and economics with extensive evidence that people do not always behave rationally in their own best interest, as traditional economic models assume. Rather, to borrow a phrase from Dan Ariely, humans are “predictably irrational.”




Digital Market Manipulation builds on this theory, focusing on how the digitization of commerce dramatically increases firms’ ability to influence consumers at a personal level.


The theory’s author, M. Ryan Calo, posits that emerging technologies and techniques will increasingly allow companies to exploit consumers’ irrationality or vulnerability. Essentially, the internet makes it that much easier to exploit emotions on a personal level and manipulate actions.


Anyway, all of this is to say that companies can and do manipulate consumers in a variety of ways. On the internet, these techniques are most often referred to as “dark patterns.”


Dark Patterns, Reintroduced

You’ve almost assuredly heard of ‘dark patterns’:



“Normally when you think of “bad design”, you think of the creator as being sloppy or lazy but with no ill intent. This type of bad design is known as a “UI anti-pattern”. Dark Patterns are different – they are not mistakes, they are carefully crafted with a solid understanding of human psychology, and they do not have the user’s interests in mind. We as designers, founders, UX & UI professionals and creators need to take a stance against Dark Patterns.”


The holy grail of dark pattern examples is darkpatterns.org. Check it out. They list 14 categories of dark patterns:



  1. Bait and Switch
  2. Disguised Ads
  3. Faraway Bill
  4. Forced Continuity
  5. Forced Disclosure
  6. Friend Spam
  7. Hidden Costs
  8. Misdirection
  9. Price Comparison Prevention
  10. Privacy Zuckering
  11. Roach Motel
  12. Road Block
  13. Sneak in the Basket
  14. Trick Questions

Flip around the site and check out some of their examples. I know you’ll recognize these techniques and will probably be able to find plenty more on your own.


Example: Google’s Disguised Ads


As the SEO Doctor pointed out, even Google isn’t always so righteous in its practices. He gave two examples of its (possible) use of the disguised-ads dark pattern.


Though I think the first one may be a bit exaggerated, the article points to the deceptive background color of Google’s sponsored links:


[Image: the background color of Google’s sponsored links]


The second example I thought was a little more legitimate, because, well, I’ve definitely fallen for it many times.


SEO Doctor draws attention to the AdWords arrow box – you know, when you’re on a blog and there’s a button that looks as if it will send you to the next page. But, no. It’s a subtle ad.


[Image: an AdWords arrow-box ad styled like a next-page button]


Blackhat Copywriting

Ethical challenges are nothing new in advertising. Read any advertising history book to see the messed-up tactics advertisers used to manipulate readers. But, of course, manipulative copywriting still exists on the internet (and it’s probably easier to get away with it there, too).


Instead of scouring the internet for bogus claims (of which there are many – check some out here), here are three very specific, and not often talked about, copywriting manipulations:



  1. Testiphonials
  2. False Scarcity
  3. The Damning Admission

Testiphonials


You know not all testimonials are legit, right?


Of course they aren’t. Testimonials (at least the trustworthy ones) work more often than not in A/B tests, but they’re not all obtained ethically. Take this one, for example, where a company pulled a quote from a ConversionXL article and passed it off as a testimonial from Peep:


[Screenshot: a quote from a ConversionXL article presented as a testimonial from Peep]


Testimonials, when authentic or perceived to be authentic, boost the credibility of your offer by engaging social proof. It’s more common than you’d think for companies to fake or embellish testimonials. Keep an eye out for that.


False Scarcity


False scarcity is a pretty common tactic in online marketing. It’s also infuriating, and it can obviously backfire if the consumer catches onto it.


Peep wrote about this a while ago, giving the example of the following email he received:


[Screenshot: the scarcity email]


Normally scarcity works (it’s one of Cialdini’s six persuasion principles), but when it’s deliberately manufactured like this, it screams sleazy. The email referred to a digital course that had somehow sold out of monthly pay-as-you-go memberships but still had yearly memberships available. Funny how that works.


The Damning Admission


This one is pretty common in infomarketing, but it’s been applied to more than just that field.


Basically, the damning admission is designed to lower the consumer’s guard and make your offer seem more authentic and credible. Here’s how White Hat Crew described it:



“Just before making a claim that you want people to believe, you admit something negative about your product. By demonstrating your willingness to show your product in its true light, you gain credibility, and the prospect is more likely to believe your positive claim.


Of course, this technique can be used either to shine a more perfect light on the truth (to help people accept a true but exceptional claim that would usually be received with skepticism), or to manipulate (to trick people into accepting a false claim).


Your intent may guide your decision to use the technique or not, but it’s the truth or falsehood of the claim that determines whether it’s persuasive or manipulative.”


As implied by the above quote, the damning admission isn’t always unethical. Heck, most of the time it’s an example of truly and authentically describing your product. But when one employs the strategy in order to legitimize a false claim that follows, it’s an example of blackhat copywriting.


The Power of Defaults

The power of defaults? That’s something Jakob Nielsen wrote about roughly a decade ago in relation to search engines. It turns out users almost always pick the top result (even when the first and second results are swapped). The broader point is that users overwhelmingly choose the default option in online decision making. Of course, many companies exploit this.


An example of this online comes from none other than Ryanair (via darkpatterns.org). They offer travel insurance during checkout, but hide the free option to decline it.


Not only do you have to look in a dropdown called ‘Insurance – country of residence’, but the free option, ‘Don’t insure me’, isn’t distinguished in any way for easy access. They’ve placed it alphabetically between Denmark and Finland.


[Screenshot: Ryanair’s ‘Insurance – country of residence’ dropdown]


Negative Option Features


The power of defaults can be used for good, but when the default is turned against the consumer, the tactic becomes known as a ‘negative option feature.’


These negative option features have been written about extensively by the BBB, Visa, and many others interested in deterring online deception. Here’s Visa’s description of a negative option feature:



“Consumers accept an offer online, often for a “free trial” or “sample.” They provide their card information in order to pay a small amount for shipping. What they may not realize is that there is a pre-checked box at the bottom of the page in fine print or buried in the terms and conditions that authorizes future charges. Consumers are required to un-click or opt-out of a pre-checked terms and conditions or payment authorization box or cancel before the end of the trial period to avoid being billed a recurring monthly charge.


For free trials with a negative option feature, a company takes a consumer’s failure to cancel as permission to continue billing. Cancelling can also be complicated by merchants with poor customer service, slow response times, and untimely refunds.”
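Here’s a minimal sketch of that mechanic, assuming a hypothetical checkout model (the class, field names, and prices are invented for illustration). The point is that the pre-checked authorization turns consumer inaction into consent:

```python
from dataclasses import dataclass

@dataclass
class TrialSignup:
    """Hypothetical model of a 'free trial' checkout form."""
    shipping_charge: float = 1.99
    # The negative option: the authorization box arrives pre-checked,
    # so doing nothing counts as consent to future charges.
    authorize_recurring_billing: bool = True

def monthly_charge(signup: TrialSignup, plan_price: float) -> float:
    """Amount billed each month once the trial period ends."""
    return plan_price if signup.authorize_recurring_billing else 0.0

# A consumer who never touches the pre-checked box:
print(monthly_charge(TrialSignup(), plan_price=79.00))  # 79.0 -- billed by default

# The same form with an unchecked (opt-in) default flips the burden:
print(monthly_charge(TrialSignup(authorize_recurring_billing=False), plan_price=79.00))  # 0.0
```

Flipping that default to unchecked would make the recurring charge opt-in rather than opt-out, which is the whole difference between a disclosure and a negative option.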


Tricking You Out of Tips


For one last real world example, picture yourself purchasing a large latte at your local coffee shop.


When you go to pay, it’s now almost always done on an iPad. According to Nir Eyal, “digital payment systems use subtle tactics to increase tips, and while it’s certainly good for hard-working service workers, it may not be so good for your wallet.”


Research from Software Advice backs this up. They found that digital point-of-sale terminals, like the one at your coffee shop, increase the frequency and amount of tips left by customers. But how?


Again, the power of defaults.


First, by removing physical cash from the equation, they remove many barriers to tipping. For example, what Dan Ariely calls the “pain of paying” no longer applies. Also, leaving a tip on the digital system is easy – exactly as easy as not tipping. Here’s how Nir Eyal put it:


Nir Eyal:


“When cash was king, anyone not wanting to give a tip could easily leave the money and dash. “Whoops, my bad!” However, with a digital payment system the transaction isn’t complete until the buyer makes an explicit tipping choice. Clicking on the “No Tip” button is suddenly its own decision. This additional step makes all the difference to those who may have previously avoided taking care of their server.”


Another way they increase our tip amounts is anchoring. At a coffee shop the effect may be at its worst, because you’re likely purchasing a cup of coffee that costs less than $4. The three tip options you’re usually presented on the screen are $1, $2, and $3 (plus a custom amount).
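To see just how generous those fixed-dollar defaults are relative to the bill, here’s a quick back-of-the-envelope calculation (the $4 order and the 15–20 percent norm are illustrative assumptions, not figures from the Software Advice research):

```python
# Implied tip percentages for fixed-dollar buttons on a small order.
order_total = 4.00                  # a roughly $4 latte (illustrative)
suggested_tips = [1.00, 2.00, 3.00]

for tip in suggested_tips:
    print(f"${tip:.2f} on a ${order_total:.2f} order = {tip / order_total:.0%}")

# Output:
# $1.00 on a $4.00 order = 25%
# $2.00 on a $4.00 order = 50%
# $3.00 on a $4.00 order = 75%
# Even the lowest button already exceeds a typical 15-20 percent tip.
```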


Because of the anchoring effect, we’re drawn to pick the middle option, even though that’s far more than the customary tip percentage. Nir Eyal used the example of a taxi ride to explain this:


Nir Eyal:


The vendor knows you likely won’t pick the least expensive amount — only cheapskates would do that. So even though 15 percent is squarely within the normal tipping range, by making it the first option, you’re more likely to choose 20 percent. Picking the middle-of-the-road option is in-line with your self-image of not being a tightwad. Therefore, you tip more and you’re not alone. The New York City Taxi and Limousine Commission reported tips increased from 10 percent to 22 percent on average when the new payment screens were turned on.


Note: this isn’t an example of online manipulation, but rather an illustration of the power of defaults. If anchoring and defaults make it easy to increase physical tips, then think about this example next time you’re confronted with a pricing decision online.


Conclusion: The Blurred Line of Right or Wrong

So online manipulation exists, clearly. What, then, do we do about it?


A common answer might very well be: “Caveat Emptor (let the buyer beware).”


That, however, places a lot of responsibility on consumers who, as we’ve seen, are predictably irrational. That’s where knowledge (via articles like this and archives like darkpatterns.org) comes in handy; if you can detect deception, you’re more likely to avoid it.


Then there’s the question of the difference between manipulation and marketing, which I never really answered. That’s a tough question for anyone to answer. I like Roger Dooley’s answer to it, though:


Roger Dooley:


“My response to the “manipulation” question is always, “If you are being honest, and if you are helping the customer get to a better place, it’s not manipulation and it’s not unethical.”


In today’s age of enforced transparency for business, manipulative tactics that deceive the customer simply won’t work. They will be quickly exposed and, with consumer voices amplified by social media, cause far more damage to the business than any short-term benefit.”


Some of the tactics covered here are used widely, even by large companies, yet most people view them as unethical. Dark patterns and online manipulation therefore come down to a complex ethical decision, one that hinges on the balance between doing what’s right and doing what’s effective.
