Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
Kindle Notes & Highlights
4%
Courts are using software algorithms to influence criminal sentencing.
Brian
I’m guessing *Weapons of Math Destruction* will be more helpful on this line of criticism...
6%
When she asked about establishing flex schedules and making work travel more predictable, she was shot down. “We have three other women of childbearing age on our team, and we don’t want to set a precedent,” the owner told her, as if pregnancy were some sort of new trend. She had to quit—and so did other women who got pregnant there after she left.
Brian
Great work, Team.
6%
And what all these stories indicate to me is that, despite tech companies talking more and more about diversity, far too much of the industry doesn’t ultimately care that its practices are making smart people feel uncomfortable, embarrassed, unsafe, or excluded.
Brian
I hope she’s bothered to find some counterexamples?
7%
But you might wonder, why do these companies’ stats always emphasize technical positions (which typically means people with titles like “engineer,” “developer,” or “programmer”), when a whole host of others are involved in creating a new digital product or service? Here’s why: in most tech companies, these roles—much more than designers, copywriters, marketers, and others who work on the creative or “soft skills” end of the spectrum—are seen as the masterminds of new technology.
Brian
Less cynically, it’s probably also the roles where the disparity is greatest and there’s the most work to be done. That’s why we often see stats for manager and leadership positions.
10%
Her cycle had been recently irregular, and she wanted to do a better job of tracking both her period and her moods in relation to it. So she test-drove some menstrual cycle apps, looking for one that would help her get the information she needed. What she found wasn’t so rosy. Most of the apps she saw were splayed with pink and floral motifs, and Delano immediately hated the gender stereotyping. But even more, she hated how often the products assumed that fertility was her primary concern—rather than, you know, asking her.
10%
The first thing I was asked when I opened the app was what my “journey” was: The choices were avoiding pregnancy, trying to conceive, or fertility treatments. And my “journey” involves none of these.
Brian
They may have done research and realized this covers a high percentage of the market? Yes, they probably should have offered “other,” but they might have done the analysis and realized it’s not worth building an app for that. I don’t think this was stereotyping out of ignorance; there’s just not that much there there... well, also there’s a helluva lot more ads they can push on the fertility/maternity angle, so that’s probably who they’re angling for.
11%
“We live in a time when people are tracking everything about their bodies . . . yet it’s still uncomfortable to talk about your reproductive health, whether you’re trying to get pregnant or just wondering how ‘normal’ your period is,” the company website stated. “We believe this needs to change.” 4 And the people who thought they were the ones to change it? Glow’s founding team: Max Levchin, Kevin Ho, Chris Martinez, and Ryan Ye. All men, of course—men who apparently never considered the range of real people who want to know whether their period is “normal.”
Brian
How do you know they never considered it? Aren’t those trying to get treatment interested in “normal”?
11%
Eve by Glow, a newer app designed by the makers of Glow for young women. Except it, too, makes assumptions about what its audience cares about.
Brian
Ok, this feels like it was designed by dudes. 😧
13%
these services all have another thing in common: women’s voices serve as the default for each of them. As Adrienne LaFrance, writing in the Atlantic, put it, “The simplest explanation is that people are conditioned to expect women, not men, to be in administrative roles” 9 (just think about who you picture when you hear the term “secretary”).
14%
Eric Meyer and I wrote about this in Design for Real Life, calling on designers to let go of their narrow ideas about “normal people,” and instead focus on those people whose identities and situations are often ignored:
Brian
But should they *focus* on these? These aren’t necessarily a continuum like neck size in the Air Force example... these are different *purposes*. If you design for the rare purpose, won’t you miss what the majority want and thus fail to succeed? You do have to go for the primary pain/use case first, then you can accommodate rarer needs?
14%
Libby Bawcombe wanted to know how to make design decisions that were more inclusive to a diverse audience, and more compassionate to that audience’s needs. So she led a session to identify stress cases for news consumers, and used the information she gathered to guide the team’s design decisions. The result was dozens of stress cases around many different scenarios, such as: •  A person feeling anxious because a family member is in the location where breaking news is occurring •  An English language learner who is struggling to understand a critical news alert •  A worker who can only access ...more
Brian
Dozens of scenarios? That seems difficult to design for. But maybe it works? I’d be interested in the cases where they have cross-purpose needs. You’ll have to pick one use case over another, and what criteria do you use? The more common case (what Silicon Valley does), or the more traumatic case for an individual? Letting the majority have a slightly degraded experience in favor of the rare? It seems that these add up, slow development, and muddy the experience.
15%
That’s not to say NPR plans to customize its design for every single situation. Instead, says Bawcombe, it’s an exercise in seeing the problem space differently: Identifying stress cases helps us see the spectrum of varied and imperfect ways humans encounter our products, especially taking into consideration moments of stress, anxiety and urgency. Stress cases help us design for real user journeys that fall outside of our ideal circumstances and assumptions.
Brian
Ok, so this lets them consciously exclude use cases, rather than magically satisfy them all. That sounds more plausible, but still very difficult. Must lead to lots of debates, perhaps a perception of moving slowly? Bigger thing though: how do you know if you get it right? You need to find all of these scenarios and see if they work for the customers experiencing them. That’d be awesome to be able to pull off. Companies have a hard enough time figuring out whether customers like the majority use cases... ;)
15%
For example, the old NPR News app displayed all stories the same way: just a headline and a tiny thumbnail image. This design is great for skimming—something many users rely on—but it’s not always great for knowing what you’re skimming. Many stories are nuanced, requiring a bit more context to understand what they’re actually about. Even more important, Bawcombe says, is that the old design didn’t differentiate between major and minor news: each story got the same visual treatment. “There is no feeling of hierarchy or urgency when news is breaking,” she told me.15 Finally, the old design ...more
Brian
Wait wait. What does this have to do with the stress cases? This sounds like it was designed to better serve the skimmer / the base case?
16%
These are small details, to be sure—but it’s just these sorts of details that are missed when design teams don’t know, or care, to think beyond their idea of the “average” user: the news consumer sitting in a comfy chair at home or work, sipping coffee and spending as long as they want with the day’s stories. And as this type of inclusive thinking influences more and more design choices, the little decisions add up—and result in products that are built to fit into real people’s lives. It all starts with the design team taking time to think about all the people it can’t see.
Brian
I actually don’t see the connection here. I think they made it better more for the “sitting at home” consumer than a desperate-to-know “edge case”?
16%
We went on like this for some time, the executive nodding along as he leafed through our document. Until we reached the last persona, “Linda.” A stock photo of a fortyish black woman beamed at us from above her title: “CEO.” Our client put down his paper. “I just don’t think this is realistic,” he said. “The CEO would be an older white man.”
Brian
Um.
17%
Thankfully, our clients’ disagreement over the right way to present race turned into a rethinking of our whole approach. Pretty soon, we’d removed all the stock photos and replaced them with icons of people working—giving presentations, sitting nose-deep in research materials, that sort of thing. I haven’t attached a photo to a persona since.
Brian
Personas are about the customer’s motivation/pain/goals, not their demographics. Separate those out and go more cartoon with the identity, focus on the motivation.
18%
And it would work in tech too. Most of the personas and other documents that companies use to define who a product is meant for don’t need to rely on demographic data nearly as much as they do. Instead, they need to understand that “normal people” include a lot more nuance—and a much wider range of backgrounds—than their narrow perceptions would suggest.
23%
Why does Gmail need to know your gender? How about Spotify? Apps and sites routinely ask for this information, for no other reason except to analyze trends or send you marketing messages (or sell your data so that others can do that).
Brian
People should feel more empowered to lie to / mess with Big Tech. Could become a trend to have the wrong age/pronouns etc. I def do this to obscure my birthdate from data vacuums; it’s almost used as a passkey by some companies (e.g., banks).
23%
While users can change settings once they have a profile, Facebook’s sign-up process still forces them to select Male or Female initially.
Brian
And why? This is probably for ad purposes? I guess they’ve measured that the friction it creates (for binary, trans, and non-binary people alike, since it slows down even those whom the answers serve well) is worth the extra ad revenue?
24%
A primary way advertisers want to filter is by gender—either because they sell a product that’s specifically geared toward one group (like bras), or because they want to customize their messaging for different audiences. When a company goes into Facebook’s advertising interface to select the types of profiles where it wants its ads to appear, it can select from three options: All, Women, or Men.
Brian
Do advertisers want to market to people who don’t fit into one of the binary genders? Is this actually serving Facebook at the expense of both Facebook users *and* advertisers? Can advertisers take an active role here?...
27%
But metrics are only as good as the goals and intentions that underlie them. And in this case, a high number of reports doesn’t lead to greater neighborhood safety. At best, they create clutter—too much noise for users to sift through. That can actually make people less safe, because they become less likely to notice, or take seriously, those reports that do have merit. At worst, as we’ve seen, unhelpful reports perpetuate stereotypes and ultimately target people of color.
Brian
Let’s assume a decent percentage of dropouts were because the form-fillers couldn’t recall enough detail to fill out the bigger form. So these incidents were probably not going to be actionable individually, but might still have been of value collectively, to monitor crime rates or “suspicious activity.” You can spin a positive narrative, but it’s hard to know why they didn’t complete the forms (unless Nextdoor did some research to find out why?).
27%
When it did, the company realized that losing some Crime & Safety posts posed a lot less risk than continuing to develop a reputation as a hub for racism.
Brian
That was good product design and analysis, yes. :)
32%
They usually say something like, “The page you are looking for does not exist.” But at the time, the team was really focused on developing a funny, unique voice for MailChimp. So they decided to call it an “oops” moment and started brainstorming funny ways to communicate that idea. Pretty soon, someone had designed a page showing a pregnancy test with a positive sign. Everyone thought this was hilarious—right up until Kiefer Lee took it to her CEO. “Absolutely not,” he told her. Looking back, she’s glad he killed it.
Brian
Can’t see how that could’ve gone wrong... 😳Good save there, Sport.
34%
And that’s the sort of design sensibility we’ve seen throughout this chapter: clever copy, peppy music, the insistence that your memories must be rehashed and shared. All of it is based on the belief that true delight can be reduced to the digital equivalent of whipped cream in a can: a layer of fluffy sweetness sprayed on top of the experience. But as we’ve seen over and over, when teams laser-focus on delight, they lose sight of all the ways that fun and quirky design can fail—creating experiences that are dissonant, painful, or inappropriate.
35%
Take any one of the examples in this chapter, and just underneath its feel-good veneer you’ll find a business goal that might not make you smile. For example, Twitter’s birthday balloons are designed to encourage people to send good wishes to one another. They’re positioned as harmless fun: a little dose of delight that makes users feel more engaged with Twitter. But of course, Twitter doesn’t really care about celebrating your special day. It cares about gathering your birth date, so that your user profile is more valuable to advertisers. The fluttering balloons are an enticement, not a ...more
36%
But there’s a more insidious goal here too: companies that are seeking to be acquired (which is what a good percentage of tech startups want) are valuated higher when they have larger subscriber lists.15 Personal data, in this case, is an asset in and of itself—even if the quality of the list is low. That’s why these companies are so shameless: They’re not really trying to build loyalty. All they want is data.
Brian
I’ve heard this a few times on a venture pitch podcast I used to listen to...
36%
When Savage says the “beep beep!” message “performs,” he means that the notification gets a lot of people to open up Tumblr—a boon for a company invested in DAUs and MAUs. And for most tech companies, that’s all that matters. Questions like, “is it ethical?” or “is it appropriate?” simply aren’t part of the equation, because ROI always wins out.
Brian
Well, they probably consider it, but put much lower weight on each foul-up than the author here. If a single person had an offensive experience out of billions of experiences, would she let it slide? Or is exactly one misstep serious enough to cancel a feature? Be realistic and compare to offline businesses before answering....
38%
In November of 2016, ride-hailing company Uber released a new update to its iPhone app—and with it, a new detail you might have glossed right over while you were tapping “accept” and “continue”: rather than tracking your location while you’re using the app (which makes sense, since GPS location helps your driver find you easily), Uber now defaults to tracking your location all the time, even when the app is closed. Did that make your eyes widen?
Brian
Yes. This (and how they patronized those of us who refused) is precisely why I quit Uber.
38%
But iPhone settings don’t actually work that way. There are only three options: you can allow an app to use your location at all times, only while you’re using the app, or never. This Uber update purposely disables the middle option, so all you’re left with are the extremes. And while you can select “never,” Uber strongly discourages it, even misleading you along the way: doing so triggers a pop-up in the app with a large headline, “Uber needs your location to continue,” followed by a big button that says “enable location services.” By all appearances, you don’t have a choice: Uber needs it, ...more
38%
Because, it turns out, you certainly can use Uber without location data. The company just doesn’t want you to.
Brian
Not when they first did this. The app was broken. Then they emailed you saying that Uber would work so much better, you’d love it, just give them location data all the time. It was worded in a manner that didn’t even hint at the real concerns many of us had. So: delete.
38%
Should we take the company at its word? No, one anonymous user told tech publication the Verge: “I simply don’t trust Uber to limit their location tracking to ‘five minutes after the trip ends.’ There’s nothing I can do to enforce this as an end user.” 5
Brian
No way they’re not sucking up those data. Knowing location at all times is hugely valuable to them. Even if you’re in the middle of a Lyft ride. ;)
39%
In 2014, an executive used the software’s “god view”—which shows customers’ real-time movements, as well as their ride history—to track a journalist who was critical of the company. Other sources have claimed that most corporate employees have unfettered access to customers’ ride data—and have, at times, tracked individual users just for fun.
39%
Back in 2010, technologist Tim Jones, then of the Electronic Frontier Foundation, wrote that he had asked his Twitter followers to help him come up with a name for this method of using “deliberately confusing jargon and user-interfaces” to “trick your users into sharing more info about themselves than they really want to.” 8 One of the most popular suggestions was Zuckering.
Brian
That’s perfect
41%
There’s also something deeply worrisome about Facebook assigning users an identity on the back end, while not allowing those same users to select their own identity in the front end of the system, says Safiya Noble, the information studies scholar we heard from back in Chapter 1. “We are being racially profiled by a platform that doesn’t allow us to even declare our own race and ethnicity,” she told me. “What does that mean to not allow culture and ethnicity to be visible in the platform?”
Brian
Why do people want to declare their identity for things that aren’t visible to others on the network? This is really just data for advertisers’ purposes... I guess also to assert that many women do like technology, but should we feel responsible for correcting Big Tech so that it gets this right, at the same time enriching Big Tech’s data and profits? There should be a way to tell advertisers that Google/FB got it wrong so that it reflects the true value of the data (and all flaws therein)... and maybe give advertisers an excuse to claw back some of their money from FB and Goog.
41%
What it means is that Facebook, once again, controls how its users represent themselves online—preventing people from choosing to identify themselves the way they’d like, while enabling advertisers to make assumptions.
Brian
Doesn’t getting this wrong just work against advertisers but for FB and Goog?
49%
Google Photos was making the same mistakes. Why? The answer is right back in those tech-company offices we encountered in Chapter 2. If you recall, Google reported that just 1 percent of technical staff was black in 2016. If the team that made this product looked like Google as a whole, it would have been made up almost entirely of white and Asian men. Would they have noticed if, like COMPAS, their failure rates disproportionately affected black people?
50%
Sound familiar? That’s the classic edge-case thinking we looked at back in Chapter 3—thinking that allows companies to shrug their shoulders at all the people whose identities aren’t whatever their narrow definition of “normal” is. But Alciné and his friend aren’t edge cases. They’re people—customers who deserve to have a product that works just as well for them as for anyone else.
Brian
Actually, I think this is different (and even worse). The use cases for black consumers are probably the same as for white and Asian males when it comes to photo organization. The earlier cases of women who didn’t fit into “trying to have a baby,” “trying not to have a baby,” “fertility treatments,” or whatever they were, were actually distinct (and deliberately excluded) use cases.
50%
When you think about it, these pairings aren’t very surprising: There are more women who are nurses than doctors.
Brian
Heh, but aren’t there also more women doctors than men? (At least graduating from medical school...or at least specializing in internal medicine?)
51%
they need to make it part of their job to scrub the bias from word embeddings before using them. The latter is what the researchers from Boston University and Microsoft propose. They demonstrate a method for algorithmically debiasing word embeddings, ensuring that gender-neutral words, like “nurse,” are not embedded closer to women than to men—without breaking the appropriate gender connection between words like “man” and “father.” They also argue that the same could be done with other types of stereotypes, such as racial bias.
Brian
This sounds really hard without identifying which words are/are not gender-neutral....
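A minimal numpy sketch of the projection idea behind this kind of debiasing. Everything here is invented for illustration (toy 2-D vectors, a hand-picked “neutral” word), not the researchers’ actual method or real embedding values:

```python
import numpy as np

# Toy 2-D embeddings: axis 0 roughly tracks a "gender direction",
# axis 1 tracks everything else. Values are invented for illustration.
emb = {
    "man":    np.array([ 1.0, 0.6]),
    "woman":  np.array([-1.0, 0.6]),
    "father": np.array([ 1.0, 0.9]),
    "mother": np.array([-1.0, 0.9]),
    "nurse":  np.array([-0.7, 0.8]),  # biased: leans toward "woman"
}

# Estimate the gender direction from a definitional pair (man - woman).
g = emb["man"] - emb["woman"]
g = g / np.linalg.norm(g)

def neutralize(v, direction):
    """Remove the component of v along the bias direction."""
    return v - np.dot(v, direction) * direction

# "nurse" is hand-labeled gender-neutral here, so it gets neutralized;
# "man"/"father" keep their gender component by design.
debiased_nurse = neutralize(emb["nurse"], g)

# After neutralizing, "nurse" is equidistant from "man" and "woman".
d_man   = np.linalg.norm(debiased_nurse - emb["man"])
d_woman = np.linalg.norm(debiased_nurse - emb["woman"])
print(round(d_man - d_woman, 6))  # 0.0
```

Note that this sketch sidesteps exactly the hard part flagged above: it simply hand-labels “nurse” as neutral, whereas a real system needs some principled way to decide which words should be neutralized and which (like “father”) should keep their gender association.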
57%
But during all of these product improvements, Twitter built precious few features to prevent or stop the abuse that had become commonplace on the platform. For example, the ability to report a tweet as abusive didn’t come until a full six years after the company’s founding, in 2013—and then only after Caroline Criado-Perez, a British woman who had successfully led a campaign to get Jane Austen onto the £10 note, was the target of an abuse campaign that generated fifty rape threats per hour.
58%
According to many in the industry, Twitter’s failure to fix its abuse problem is part of the reason why it’s struggling—and why no one wants to buy the company, despite Twitter’s best efforts to sell. In 2016, Alphabet (Google’s parent company) turned them down. Then, so did both Salesforce and Disney—both, at least in part, because of Twitter’s reputation for harassment.
61%
Speaking of broken platforms and the American president, I would be remiss not to talk about perhaps the biggest story related to technology and the 2016 election: “fake news.” As I write this today, the term has pretty much lost its meaning: the president and his staff now use it to vilify any press coverage they don’t like. But during the fall of 2016, actual fake news was all over Facebook: The Pope endorses Donald Trump! Hillary Clinton sold weapons to ISIS!
Brian
There must be a solution to this situation. I guess long term it’s returning to the notion that quality universal education is a matter of national security. Wish I knew what that means we need to do, but I know that people are experimenting. And I don’t mean DeVos. :/
61%
According to a BuzzFeed News analysis, in the run-up to Election Day 2016, the top twenty fake stories from hoax and hyperpartisan sources generated more engagement on Facebook than the top twenty election stories from major news sites: 8.7 million shares, comments, and reactions, versus 7.4 million from real news sources.42
Brian
Zoiks.
61%
Facebook, which had been making a play to be seen as a credible news source, didn’t like being criticized for missing the country’s biggest news story at the time, so it decided to bring in some humans to help curate the news. That’s how the team that Gizmodo profiled in 2016 came to be. In the wake of the allegations, Facebook launched its own investigation, finding “no evidence of systemic bias.” But it didn’t matter: in August, the Trending team was suddenly laid off, and a group of engineers took its place to monitor the performance of the Trending algorithm.45 Within three days, that ...more
63%
SLOW DOWN AND FIX STUFF
Brian
The Slow Data movement.
64%
a tech culture that’s built on white, male values—while insisting it’s brilliant enough to serve all of us. Or, as they call it in Silicon Valley, “meritocracy.”
Brian
Eh, heh.
67%
As more colleges started offering computer science degrees, in the 1960s, women flocked to the programs: 11 percent of computer science majors in 1967 were women. By 1984, that number had grown to 37 percent. Starting in 1985, that percentage fell every single year—until, in 2007, it leveled out at the 18 percent figure we saw through 2014. That shift coincides perfectly with the rise of the personal computer—which was marketed almost exclusively to men and boys.
Brian
I’ve heard the PC marketing narrative. Is this really all it took to steer generations of women away from technology? It’s frightening how powerful marketing can be.
68%
In other words, blame the pipeline all you want, but diverse people won’t close their eyes and jump in until they know it’ll be a safe place for them when they get to the other side. No matter how many black girls you send to code camp.
Brian
A) Individual people aren’t “diverse.” B) Totally right. Even subtle reminders of being an outsider cause cognitive load and stress that eventually lead people from underrepresented groups to bail out and find more welcoming environments.
68%
I’ll probably get that email no matter what I say next, but here’s the truth: Study after study shows that diverse teams perform better.
Brian
This is great news. But even if this weren’t observed (or even if the opposite were found), isn’t it just the right thing to remove bias and be more inclusive and diverse?
69%
Cristian Deszö of the University of Maryland and David Ross of Columbia wanted to understand how gender representation in top firms correlated with those firms’ performance. They reviewed the size and gender composition of senior management teams across the Standard & Poor’s Composite 1500 list, and then compared that information with financial data. What they found was staggering: “Female representation in top management leads to an increase of $42 million in firm value,” they wrote.
Brian
Is there also a phenomenon of bringing in women to lead companies as they head into disaster? Does this lead to greater turnaround opportunities for women leaders? But if so, do they get to stick around long enough to return to glory days, or does a man steal the opportunity away?