Kindle Notes & Highlights
Read between October 14 and October 14, 2017
The more I started paying attention to how tech products are designed, the more I started noticing how often they’re full of blind spots, biases, and outright ethical blunders—and how often those oversights can exacerbate unfairness and leave vulnerable people out.
But when we start looking at them together, a clear pattern emerges: an industry that is willing to invest plenty of resources in chasing “delight” and “disruption,” but that hasn’t stopped to think about who’s being served by its products, and who’s being left behind, alienated, or insulted.
Ten years ago, tech was still, in many ways, a discrete industry—easy to count and quantify. Today, it’s more accurate to call it a core underpinning of every industry.
Because tech has spent too long making too many people feel like they’re not important enough to design for.
And what all these stories indicate to me is that, despite tech companies talking more and more about diversity, far too much of the industry doesn’t ultimately care that its practices are making smart people feel uncomfortable, embarrassed, unsafe, or excluded.
That’s great in theory, but when personas are created by a homogenous team that hasn’t taken the time to understand the nuances of its audience—teams like those we saw in Chapter 2—they often end up designing products that alienate audiences, rather than making them feel at home.
But once you hand them out at a meeting or post them in the break room, personas can make it easy for teams to start designing only for that narrow profile.
Default settings can be helpful or deceptive, thoughtful or frustrating. But they’re never neutral. They’re designed.
Today, more sites are defaulting to neutral avatars—either by making the silhouettes more abstract, and therefore less gendered, or by using some other icon to represent a user, such as their initials.
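For illustration only, here is a minimal sketch of that kind of neutral default, assuming a hypothetical User record and fallback function (neither comes from the book): when no photo exists, the interface falls back to the person's initials instead of a gendered silhouette, reusing only information the user already supplied.

    # Illustrative sketch, not from the book: a default avatar that falls back
    # to the user's initials rather than a gendered silhouette.
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class User:
        display_name: str
        avatar_url: Optional[str] = None  # None means no photo was ever uploaded

    def avatar_label(user: User) -> str:
        """Return the uploaded avatar if present, otherwise the user's initials."""
        if user.avatar_url:
            return user.avatar_url
        parts = user.display_name.split()
        initials = "".join(p[0].upper() for p in parts[:2] if p)
        return initials or "?"

    print(avatar_label(User("Sara Wachter-Boettcher")))  # "SW"
    print(avatar_label(User("Prince")))                  # "P"

The design point is that the fallback assumes nothing about gender or appearance; it only echoes the name the person chose for themselves.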
But when default settings present one group as standard and another as “special”—such as men portrayed as more normal than women, or white people as more normal than people of color—the people who are already marginalized end up having the most difficult time finding technology that works for them. Perhaps worse, the biases already present in our culture are quietly reinforced.
Edge case is a classic engineering term for scenarios that are considered extreme, rather than typical. It might make sense to avoid edge cases when you’re adding features: software that includes every “wouldn’t it be nice if . . . ?” scenario that anyone has ever thought of quickly becomes bloated and harder to use. But when applied to people and their identities, rather than to a product’s features, the term “edge case” is problematic—because it assumes there’s such a thing as an “average” user in the first place.
When designers call someone an edge case, they imply that they’re not important enough to care about—that they’re outside the bounds of concern. In contrast, a stress case shows designers how strong their work is—and where it breaks down.
Instead, says Bawcombe, it’s an exercise in seeing the problem space differently: Identifying stress cases helps us see the spectrum of varied and imperfect ways humans encounter our products, especially taking into consideration moments of stress, anxiety and urgency. Stress cases help us design for real user journeys that fall outside of our ideal circumstances and assumptions.
Reader, I died.
But what they missed—because, I recognize now, our personas encouraged them to miss it—was that demographics weren’t the point. Differing motivations and challenges were the real drivers behind what these people wanted and how they interacted with the organization. We thought adding photos, genders, ages, and hometowns would give our personas a more realistic feel. And they did—just not the way we intended. Rather than helping folks connect with these people, the personas encouraged the team to assume that demographic information drove motivations—that, say, young women tended to be highly …
Pretty soon, we’d removed all the stock photos and replaced them with icons of people working—giving presentations, sitting nose-deep in research materials, that sort of thing. I haven’t attached a photo to a persona since.
The only thing that’s normal is diversity.
Most of the personas and other documents that companies use to define who a product is meant for don’t need to rely on demographic data nearly as much as they do. Instead, they need to understand that “normal people” include a lot more nuance—and a much wider range of backgrounds—than their narrow perceptions would suggest.
But in the meantime, suffice it to say, an “authentic name” policy doesn’t fix abuse—but it does alienate and unfairly target the people who are most vulnerable.
Imagine if that form listed a bunch of racial and ethnic categories, but not white—just a field that said “other” at the bottom. Would white people freak out? Yes, yes they would.
Is being forced to use a gender you don’t identify with (or a title you find oppressive, or a name that isn’t yours) the end of the world? Probably not. Most things aren’t. But these little slights add up—day after day, week after week, site after site—making assumptions about who you are and sticking you into boxes that just don’t fit. Individually, they’re just a paper cut. Put together, they’re a constant thrumming pain, a little voice in the back of your head: This isn’t for you. This will never be for you.
Because when everyone’s talking incessantly about engagement, it’s easy to end up wearing blinders, never asking whether that engagement is even a good thing.
When systems don’t allow users to express their identities, companies end up with data that doesn’t reflect the reality of their users.
I didn’t say any of that, of course. I didn’t say much at all. Mostly, I wondered to myself, “How the hell did I end up here?”
Once that assumption is baked in, it skews the results: the more often women are incorrectly labeled as men, the more it looks like men dominate tech websites—and the more strongly the system starts to correlate tech website usage with men.
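To make that feedback loop concrete, here is a toy calculation with invented numbers (not from the book): every woman the system mislabels as a man makes its "tech site visitors" data look more male, which in turn strengthens the very correlation doing the mislabeling.

    # Toy simulation of the feedback loop described above.
    # All numbers are invented for illustration; no real system is modeled.
    true_female_share = 0.40   # actual share of tech-site visitors who are women
    misclassify_rate = 0.30    # share of those women the system labels "male"

    observed_female_share = true_female_share * (1 - misclassify_rate)
    observed_male_share = 1 - observed_female_share

    print(f"Actual women among visitors: {true_female_share:.0%}")   # 40%
    print(f"Women the system reports:    {observed_female_share:.0%}")  # 28%
    print(f"Men the system reports:      {observed_male_share:.0%}")    # 72%

    # The inflated "male" share then feeds back into the model as evidence that
    # visiting tech sites predicts maleness, so the next round of labeling is
    # even more likely to mark women as men. The skew compounds instead of
    # correcting itself.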
In this way, cuteness becomes another cloak for tech companies: a shiny object that deflects criticism. Because the more we focus on the balloons and streamers, the clever videos and animated icons and punny humor, the less we’ll question why companies are asking us for so much information. The less we’ll notice just how much of our identities Facebook is making assumptions about, and who’s using those assumptions for their own purposes. The less we’ll care that almost every app we use is investing thousands of hours into figuring out new ways to squeeze information out of our usage patterns.
Because, no matter how much tech companies talk about algorithms like they’re nothing but advanced math, they always reflect the values of their creators: the programmers and product teams working in tech. And as we’ve seen time and again, the values that tech culture holds aren’t neutral.
Because, as powerful as algorithms are, they’re not inherently “correct.” They’re just a series of steps and rules, applied to a set of data, designed to reach an outcome. The questions we need to ask are, Who decided what that desired outcome was? Where did the data come from? How did they define “good” or “fair” results? And how might that definition leave people behind? Otherwise, it’s far too easy for teams to carry the biases of the past with them into their software, creating algorithms that, at best, make a product less effective for some users—and, at worst, wreak havoc on their lives.
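As one way to picture those questions, here is a deliberately simplistic, hypothetical scoring rule (invented for this note, not described in the book): it defines "good" as "resembles the people we hired before," so whatever skew exists in the historical data quietly becomes the definition of merit.

    # Illustrative-only sketch of how a plausible-looking rule inherits past bias.
    # The data and scoring rule are invented; no real system is being described.
    past_hires = [
        {"school": "Stanford", "hired": True},
        {"school": "Stanford", "hired": True},
        {"school": "State College", "hired": False},
    ]

    def score(candidate_school: str) -> float:
        """Score a candidate by how often their school appears among past hires."""
        hired = [h for h in past_hires if h["hired"]]
        hits = sum(1 for h in hired if h["school"] == candidate_school)
        return hits / max(1, len(hired))

    print(score("Stanford"))       # 1.0: the past is treated as the definition of "good"
    print(score("State College"))  # 0.0: never hired before, so never scored as worth hiring

Nothing in that rule is "incorrect" as math; the bias lives in the choice of data and in the unexamined decision that yesterday's outcomes define tomorrow's desired ones.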
Friedler. “People need to understand that data is not truth. It is not going to magically solve these hard societal problems for us.”
Tied up in this meritocracy myth is also the assumption that technical skills are the most difficult to learn—and that if people study something else, it’s because they couldn’t hack programming. As a result, the system prizes technical abilities—and systematically devalues the people who bring the very skills to the table that could strengthen products, both ethically and commercially: people with the humanities and social science training needed to consider historical and cultural context, identify unconscious bias, and be more empathetic to the needs of users.
People call this the “leaky bucket”: when women and underrepresented groups leave because they’re fed up with biased cultures where they can’t get ahead. No pipeline in the world can make up for a steady flow out of tech companies.
If tech wants to be seen as special—and therefore able to operate outside the rules—then it helps to position the people working inside tech companies as special too. And the best way to ensure that happens is to build a monoculture, where insiders bond over a shared belief in their own brilliance.

