Three years ago, when Facebook Messenger users opened the app on an Android device, they were greeted by an adorable cartoon yeti texting a big pink heart. Below it, a prompt read: "Text anyone in your phone."
The design seemed innocuous—friendly, even. Customers were given two choices: tap the highlighted "turn on" button, which would give Facebook access to their contacts as well as their call and text history, or press the grayed-out "not now" button.
Millions tapped "turn on."
This week, those users learned that Facebook not only collected their call and text histories but also allegedly held on to that information. The revelation set off panic among some customers, adding to growing consumer distrust of a social network already facing congressional inquiries and a Federal Trade Commission investigation into the mishandling of the personal information of 50 million customers.
Facebook said last weekend that customers had to opt into the texting feature and, therefore, knew they were handing over their data. What it didn't say was that those users were all subject to a sophisticated design strategy common in the technology industry—one that nudges users to do what companies want them to do.
"In digital products, every action the users perform is always being driven toward some type of goal," said Caspar Lam, an assistant professor of communication design at Parsons School of Design at the New School in New York.
For a shopping website, the goal could be to complete a transaction. On a news website, it might be to get people to sign up for a newsletter or subscribe. And on social networks, it's to sign up, log in, like, share, poke or comment—each offering the company more data that can be used to sell targeted advertising.
By now, most consumers understand that data collection is a core part of advertising-based businesses such as Facebook, Google and Snapchat. The practice can often be a boon to consumers: The more people share with the companies, the better they are able to serve up ads, search results, product recommendations and music and movie suggestions tailored to an individual's liking.
Yet many remain unaware of the type of data collected and what companies do with it. While the answers often lie in privacy policies and terms-of-service agreements, few take the time to look them over. A 2017 Deloitte survey found that more than 90 percent of people agree to terms and conditions without reading them.
The problem isn't that Facebook and other companies don't get users' consent or that they're not disclosing details on the data they collect, design critics say. It's that when the companies ask for consent, they use interfaces designed to get users to opt in without a second thought.
Facebook's yeti campaign is just one of many design strategies the social network has used to guide users to click or tap on a certain button.
It's no accident, for example, that Facebook notifications are bright red—a color that commands attention.
It's also no accident that the company dressed up a prompt asking for long-term access to users' entire call and text message history with a friendly, disarming cartoon yeti.
Even the language used—"turn on" and "not now," as opposed to "yes" or "no"—plays on user psychology.
In this case, declining to hand over data was framed simply as "delaying the inevitable," said Mayo Nissen, an associate creative director at design agency Frog.
"It's clear from those options that Facebook was trying to encourage users to turn on the feature in question, and essentially frame the question as only having one correct answer," Nissen said.
These design tactics are so effective that consumers often don't realize they're at play. Lam compared it to the way delicious-looking food operates on our subconscious: When we look at an appealing dish, our minds don't break down the individual components that make it appetizing. Rather, it's the whole package that works on our senses. The same goes for the software and websites we use. Designers use color, images, shapes, button size, placement and language to herd users toward certain actions.
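To make those mechanics concrete, here is a minimal, hypothetical Android sketch in Kotlin of an asymmetric consent prompt of the kind described above. The dialog text echoes the article's example, but the function name, colors and wording are illustrative assumptions, not Facebook's actual code.

```kotlin
import android.app.AlertDialog
import android.content.Context
import android.graphics.Color

// Hypothetical sketch of an asymmetric consent prompt.
// All strings, colors and names are illustrative, not Facebook's code.
fun showConsentPrompt(context: Context, onOptIn: () -> Unit) {
    val dialog = AlertDialog.Builder(context)
        .setTitle("Text anyone in your phone")
        // "Turn on" / "Not now" rather than "yes" / "no":
        // declining is framed as merely delaying the inevitable.
        .setPositiveButton("Turn on") { _, _ -> onOptIn() }
        .setNegativeButton("Not now", null)
        .create()

    dialog.show() // buttons only exist after show()

    // Visual asymmetry: a bright, saturated opt-in button beside
    // a grayed-out decline button herds users toward "Turn on."
    dialog.getButton(AlertDialog.BUTTON_POSITIVE)?.setTextColor(Color.parseColor("#1877F2"))
    dialog.getButton(AlertDialog.BUTTON_NEGATIVE)?.setTextColor(Color.parseColor("#B0B3B8"))
}
```

Nothing in such a sketch is deceptive on its face; the steering comes entirely from the contrast in color and framing between the two buttons.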
These design decisions can often be helpful. In a banking app, consumers might want designers to make it as easy as possible for them to pay their bills. In an email program, consumers don't want to spend all day searching for the "compose" button.
Sophisticated, behavior-steering design becomes a problem when designers don't consider the broader consequences of their work, industry experts say.
In the technology industry, design teams are often hyper-focused on achieving a particular goal, said Sara Wachter-Boettcher, author of "Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech." Such teams are often asking, "What are we trying to get people to do with this product? How do we get people to do that thing?" Wachter-Boettcher said. "When people get hyper-focused on that goal, they lose sight of the potential negative consequences."
This is how Sophie Xie, a designer at Facebook from 2012 to 2013, described her experience working on the controversial texting feature. At a company as large and stratified as Facebook, designers like her were often briefed on a goal but given little time to understand the broader scope.
"On the one hand, your top-level goal is to get the project done, to do your job well," said Xie, who was responsible for art direction of stickers—including the yeti—that Facebook later used across its security and privacy messaging. "Then there's the ambient goal, which is, are you doing the right thing?
"If you pull at that thread, it causes you to question every objective that is pitched to you. And then you're at an impasse where it's hard to find a path forward if every product objective causes you to really question and dig."
Xie left Facebook after a year and is now a freelance designer, a move she said she made to gain more control over the kinds of projects she works on. The work she did for Facebook continues to weigh on her, though.
Facebook did not respond to a request for comment.
A common refrain among Silicon Valley critics is that engineers often ask themselves whether they can do something but forget to ask whether they should. Designers could ask themselves a similar question, said Erika Hall, a philosopher and co-founder of design agency Mule. They should also ask: What am I serving? Who is my master? Who gets paid by whom?
"Designers were meant to be the people best able to make good choices," said Hall. "But you can't do good design if you're working in service of a bad business model."
If a company's goal is to increase sign-ups so it can collect user data so it can sell ads so it can meet Wall Street's expectations so it can appease shareholders so it can increase its stock price, a designer working within those confines will inevitably push for company growth, Hall said.
Making matters more difficult, the intangible nature of software and web design means it's hard to critique, and even harder to call out a company for unethical design. In a field like architecture, critics can talk about a physical building's aesthetic, function and social impact, Hall said. If a building is bad for the community, it's hard to ignore. In software, however, problems often aren't evident until much later, when the damage has already been done.
The solution to this, according to Hall and Wachter-Boettcher, is three-pronged.
"The first place to start is raising consciousness among designers," Hall said.
Hall's business partner, designer Mike Monteiro, told designers at a conference five years ago that "it doesn't take malice to bring bad design into the world. All it takes is carelessness."
"The problem with designers and ethics is they see it as something to possibly strive for and maybe incorporate into their work, but they don't see it as core to what they do," Monteiro said. "We as designers should see this as part of the job."
The second is for companies to re-evaluate their culture and value ethical design as much as they value profit.
"If you can kill every conversation about ethics by just mentioning your profit margins, then you don't have ethics," Wachter-Boettcher said. "They don't exist."
The third is tougher external regulation, which companies such as Facebook could potentially face after its recent string of consumer trust violations. (Facebook in 2011 agreed to an FTC consent decree in which it promised to be transparent with users about the data it collected.)
As pressure mounted, the social network announced this week that it will make it easier for customers to find and manage their privacy settings. The new privacy page is decorated with pastel-colored cartoon humans. There are big, clear buttons encouraging users to "manage security" and "manage privacy." Facebook said in its announcement that it wants to give users control of their information. It's calling the changes a redesign.