Technically Wrong Quotes
Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
by Sara Wachter-Boettcher
2,531 ratings, 4.07 average rating, 349 reviews
Showing 1-17 of 17
“no matter how much tech companies talk about algorithms like they’re nothing but advanced math, they always reflect the values of their creators:”
― Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
“Study after study shows that diverse teams perform better. In a 2014 report for Scientific American, Columbia professor Katherine W. Phillips examined a broad cross section of research related to diversity and organizational performance. And over and over, she found that the simple act of interacting in a diverse group improves performance, because it “forces group members to prepare better, to anticipate alternative viewpoints and to expect that reaching consensus will take effort.””
― Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
“when designers use clean aesthetics to cover over a complex reality—to take something human, nuanced, and rife with potential for bias, and flatten it behind a seamless interface—they’re not really making it easier for you. They’re just hiding the flaws in their model, and hoping you won’t ask too many difficult questions.”
― Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
“The good news is there’s actually no magic to tech. As opaque as it might seem from the outside, it’s just a skill set—one that all kinds of people can, and do, learn. There’s no reason to allow tech companies to obfuscate their work, to call it special and exempt it from our pesky ethics.”
― Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
“The reality is a lot more mundane: design and programming are just professions—sets of skills and practices, just like any other field. Admitting that truth would make tech positions feel a lot more welcoming to diverse employees, but tech can’t tell that story to the masses. If it did, then the industry would seem normal, understandable, and accessible—and that would make everyday people more comfortable pushing back when its ideas are intrusive or unethical.”
― Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
“Dash concludes that, ultimately, the tech industry doesn’t really exist. It’s just in these organizations’ best interests to be seen as “tech.””
― Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
“Cathy O’Neil claims that this reliance on historical data is a fundamental problem with many algorithmic systems: “Big data processes codify the past,” she writes. “They do not invent the future.””
― Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
“In 2014, an executive used the software’s “god view”—which shows customers’ real-time movements, as well as their ride history—to track a journalist who was critical of the company. Other sources have claimed that most corporate employees have unfettered access to customers’ ride data—and have, at times, tracked individual users just for fun.6 In 2012, Uber even published an official blog post bragging about how it can tell when users have had a one-night stand—dubbing them, somewhat grossly, “rides of glory.” (Uber deleted the post a couple years later, when major stories started breaking about its disregard for privacy.)”
― Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
“In January 2017, Bloomberg reported that although Facebook had started giving recruiters an incentive to bring in more women, black, and Latino engineering candidates back in 2015, the program was netting few new hires. According to former Facebook recruiters, this was because the people responsible for final hiring approvals—twenty to thirty senior leaders who were almost entirely white and Asian men—still assessed candidates by using the same metrics as always: whether they had gone to the right school, already worked at a top tech company, or had friends at Facebook who gave them a positive referral.15 What this means is that, even after making it through round after round of interviews designed to prove their skills and merits, many diverse hires would be blocked at the final stage—all because they didn’t match the profile of the people already working at Facebook.”
― Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
“if tech wants to be seen as special—and therefore able to operate outside the rules—then it helps to position the people working inside tech companies as special too. And the best way to ensure that happens is to build a monoculture, where insiders bond over a shared belief in their own brilliance. That’s also why you see so many ridiculous job titles floating around Silicon Valley and places like it: “rock-star” designers, “ninja” JavaScript developers, user-experience “unicorns” (yes, these are all real). Fantastical labels like these reinforce the idea that tech and design are magical:”
― Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
“It’s not that their founders intended to build platforms that cause harm. But every digital product bears the fingerprints of its creators. Their values are embedded in the ways the systems operate: in the basic functions of the software, in the features they prioritize (and the ones they don’t), and in the kind of relationship they expect from you. And as we’ve seen throughout this book, when those values reflect a narrow worldview—one defined by privileged white men dead set on “disruption” at all costs—things fall apart for everyone else.”
― Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
“That’s great in theory, but when personas are created by a homogenous team that hasn’t taken the time to understand the nuances of its audience—teams like those we saw in Chapter 2—they often end up designing products that alienate audiences, rather than making them feel at home.”
― Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
“organizations of all types use tools called personas—fictional representations of people who fit their target audiences”
― Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
“guiding users through a process quickly and easily is good for business, because the fewer people who get frustrated or confused, the more sales or sign-ups are completed. The problem, though, is that making interactions feel smooth and simple sounds nice, but it starts to fail as soon as you’re asking users for messy, complicated information. And as you’ll see in this chapter, all kinds of everyday questions can be messy and complicated—often in ways designers haven’t predicted.

NAMING THE PROBLEM

Sara Ann Marie Wachter-Boettcher. That’s how my birth certificate reads: five names, one hyphen, and a whole lot of consonant clusters (thanks, Mom and Dad!). I was used to it being misspelled. I was used to it being pronounced all sorts of ways. I was even used to everyone who looks at my driver’s license commenting that it takes up two whole lines. But I didn’t expect my name to cause me so many problems online.

As it turns out, tons of services haven’t thought much about the wide range of names out there. So, on Twitter I forgo spaces to fit my professional name in: SaraWachterBoettcher. On online bill pay, they’ve truncated it for me: Sara Wachter-Boettch. In my airline’s online check-in system, hyphens straight up don’t exist. The list goes on.

It’s irritating. It takes some extra time (do I enter a space between my last names, or just squish them together?). I see more error messages than I’d like. But it’s still a minor inconvenience, compared to what other people experience.”
― Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
“don’t think this is realistic,” he said. “The CEO would be an older white man.” My colleague and I agreed that might often be the case, but explained that we wanted to focus more on Linda’s needs and motivations than on how she looked. “Sorry, it’s just not believable,” he insisted. “We need to change it.” I squirmed in my Aeron chair. My colleague looked out the window. We’d lost that one, and we knew it.

Back at the office, “Linda” became “Michael”—a suit-clad, salt-and-pepper-haired guy. But we kept Linda’s photo in the mix, swapping it to another profile so that our personas wouldn’t end up lily-white.

A couple weeks later, we were back in that same conference room, where our client had asked us to share the revised personas with another member of his executive team. We were halfway through our spiel when executive number two cut us off. “So, you have a divorced black woman in a low-level job,” he said. “I have a problem with that.” Reader, I died.

Looking back, both of these clients were right: most of the CEOs who were members of their organization were white men, and representing their members this way wasn’t a good plan for their future. But what they missed—because, I recognize now, our personas encouraged them to miss it—was that demographics weren’t the point. Differing motivations and challenges were the real drivers behind what these people wanted and how they interacted with the organization. We thought adding photos, genders, ages, and hometowns would give our personas a more realistic feel. And they did—just not the way we intended. Rather than helping folks connect with these people, the personas encouraged the team to assume that demographic information drove motivations—that”
― Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
“During the process of redesigning the NPR News mobile app, senior designer Libby Bawcombe wanted to know how to make design decisions that were more inclusive to a diverse audience, and more compassionate to that audience’s needs. So she led a session to identify stress cases for news consumers, and used the information she gathered to guide the team’s design decisions. The result was dozens of stress cases around many different scenarios, such as:

• A person feeling anxious because a family member is in the location where breaking news is occurring
• An English language learner who is struggling to understand a critical news alert
• A worker who can only access news from their phone while on a break from work
• A person who feels upset because a story triggered their memory of a traumatic event13

None of these scenarios are what we think of as “average.” Yet each of these is entirely normal: they’re scenarios and feelings that are perfectly understandable, and that any of us could find ourselves experiencing. That’s not to say NPR plans to customize its design for every single situation. Instead, says Bawcombe, it’s an exercise in seeing the problem space differently:

Identifying stress cases helps us see the spectrum of varied and imperfect ways humans encounter our products, especially taking into consideration moments of stress, anxiety and urgency. Stress cases help us design for real user journeys that fall outside of our ideal circumstances and assumptions.14

Putting this new lens on the product helped the design team see all kinds of decisions differently.”
― Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
“you’ll see tech more like I do: not magical, but fallible—and ripe for change. Even more, I hope you’ll feel comfortable asking hard questions of the digital products you use, and the people who make them. Because tech has spent too long making too many people feel like they’re not important enough to design for. But, as we’ll see, there’s nothing wrong with you. There’s something wrong with tech.”
― Technically Wrong: Sexist Apps, Biased Algorithms, and Other Threats of Toxic Tech
