Toasters aren’t technology. They’re toasters.
Technology, as inventor Danny Hillis puts it, “is everything that doesn’t quite work yet.” Tried to load this article on your smartphone but lost signal for a second? That’s technology: amazing, but still sometimes kind of broken. Over time the rough edges are smoothed off and our products just work. Technologists succeed when their inventions are taken for granted.
The unprecedented number of new technologies we’re seeing means that there are lots of unsmoothed edges out there. After all, there are lots of ways technologies might not quite work.
Usually, when something goes wrong it annoys, inconveniences, or frustrates us. In the digital world, these failures—reminders that we’re dealing with technology—often come in the form of hardware glitches or software crashes. Today’s products, however, don’t exist in isolation. They’re parts of complex systems that try to learn about us and grow smarter, faster, and better. When systems this complex show their design flaws, sometimes they cause more than annoyance: they creep us out. If you’re using a product or service and it feels creepy, that’s because it’s broken in a fundamental way.
Take Google and Facebook. Considered alone or together, they have more information about more people than any organizations ever before. Google knows intimately what we’re interested in, Facebook knows who our friends are, and both companies know what we talk about online. So why is it only Facebook that’s widely regarded as being creepy?
In early 2012 Google announced its intention to consolidate user information across its services, arguing that this would allow it to serve users better. In the months leading up to the consolidation, it was impossible to visit a Google site or product without an information bar pleading for your attention to explain the forthcoming changes. Right in the explanation, Google offered users a way to delete their personal data if the change made them uncomfortable. The approach was open and respectful; Google had clearly learned from earlier missteps in which customer data had been overzealously shared.
Contrast this with Facebook, where the internal mantra is “move fast and break things.” While an approach like this might appeal to curious, clever, and creative engineers, from a user experience perspective it can be very unpleasant. One example is when the company changes privacy settings without telling us, forcing us to play whack-a-mole to return them to parameters we can live with. Another is the decision to let users pay to force their friends to see their updates. It might be technically straightforward, but subordinating the interests of users to the company’s profitability is a reliable formula for alienation. It’s creepy.
We’re constantly creating data that’s being stored and analyzed by a growing number of companies. It’s possible to learn, infer, and predict surprising and unexpected things about the users of complex systems. This knowledge can be profitable: Telecoms can now identify their socially popular customers and target them with special promotions to reduce churn among their friends while retail stores can use analytics to infer that a customer has recently become pregnant. These are very early examples. We’re all users of these complex systems. And as users of these systems we have a reasonable expectation to be treated with respect.
While Facebook and Google’s businesses force them to think of users as inventory rather than customers, there’s no reason why their products can’t be designed with respect for customers built-in. Just as design includes safety and profitability, design now needs to include respect for users, their privacy, and the data they’re choosing to share.
It’s not that hard to avoid being creepy:
Test your new products and features on your own personal data. Any decision maker who chooses to mine and analyze a customer’s data should first know what it feels like to have their own data crunched. It turns out that the golden rule applies to big data too.
Give your users access to and control over their personal data—and the option to delete it. Google provides both a personal data collection dashboard and an easy export of all user data entered into the system.
Keep the user’s best interests at heart. Mistakes can and will happen, but they should always happen while you’re trying to do the right thing.
Customers know what they dislike much more clearly than they know what they like. If your product is making your customers unhappy, then no matter what kind of strategic advantages you have (network effects, in Facebook’s case), they’ll eventually find a competitor that treats them better. To create non-creepy experiences, we need to recognize creepiness as the design flaw that it is—a flaw that materializes only when you’ve done something wrong, whether by mistake or by design. Products and services should never creep us out.