Dr. Sasha Costanza-Chock sent me a video message while at the local CVS. Holding up a Cosmopolitan magazine, they shouted excitedly, “Look what we found!” In addition to running print campaigns with Vogue, Olay also ran ads in many major beauty magazines, including not just Cosmopolitan but also Allure and Harper’s Bazaar. This was phase one, and it was now time for phase two of #DecodeTheBias, which would include TV commercials, a satellite media tour, and a segment on Good Morning America on ABC. The kid who wasn’t allowed to watch commercials was now starring in one. The door to reaching a …
As much as the engineer in me wanted straightforward answers and straightforward technical solutions, the reality is far more complex. There are costs of inclusion and costs of exclusion to be considered in the design and deployment of AI systems, and both must be contextualized.
In 2017, Stanford researchers released a paper showing promising results for skin cancer diagnosis with computer vision. They were able to match the performance of dermatologists, and enthusiastic headlines followed. However, the dataset used to benchmark that performance was later revealed to consist overwhelmingly of images of lighter-skinned individuals. Thus far, this analysis is a rather technical view of the problem.
We also needed more companies stepping up to check their AI products with algorithmic audits. The final component of the #DecodeTheBias campaign was an audit of Olay’s Skin Advisor tool. When I was first asked to audit the tool, I warned the team that, given everything I had learned about how it was developed, there was a high chance we would find unflattering bias. I also did not want to be part of conducting an audit where the final results could not be made public. The Olay team did not back down. “If we find bias in our tool, we will do what it takes to fix it.” I asked, “What if it cannot …
After the meeting with Olay, I called my blue-haired fellow freedom fighter Cathy O’Neil, the founder of algorithmic auditing company ORCAA.
“What do you know about skin care?”
“Basically nothing.”
“No worries, I have an epic audit we should do.”
“Tell me more.”
Knowing I had no backup, I started what I called “beauty bootcamp.” I did all the right things for vanity reasons. I drank mainly water and removed all caffeinated drinks from my diet. I worked out five times a week and went to bed at a reasonable hour, averaging eight hours of sleep a night. And, of course, I used the Olay skin products that adorned my sink. Eye creams, face creams, serums, clay masks, and a mist that smelled of cucumbers.
Olay sought the audit on a voluntary basis, but we cannot assume all companies would do the same. While there may be some good actors, relying on the goodwill and moral motivations of the tech industry is not a responsible or reliable strategy. The year after the Olay campaign, I would also see firsthand that government agencies would need to be pressured to test technology from companies offering AI services.
Amy, a tall Nigerian woman, was my makeup artist. When I showed her the print ad, she looked at me with tears welling in her eyes. “I know this is supposed to be for little girls, but it’s for me too. We don’t get to see ourselves celebrated like this.” Her reaction impressed on me the joy of inclusion. I didn’t need another credential to have others feel seen or to be taken seriously.
“A PhD is an exercise in humility. Smile and nod and get it done.”
Remember the significant small moments of acknowledgment and support.
She would often say to me, “I am highly educated, not highly schooled.”[*2] I love that phrase because it reminds me that credentials and degrees have their place, but they are not requirements to learn about the impacts of technology or push for change. You don’t need a PhD from MIT to make a difference in the fight for algorithmic justice. All you need is a curious mind and a human heart.
You don’t have to know what a neural net is to know that if an AI system denies you a job because of your race, gender, age, disability, or skin color, something is wrong. You don’t have to be an AI researcher to know that if companies take your creative work and use it to create products without permission and compensation, you have been wronged.
The EU AI Act was under deliberation. When passed, it would set a precedent for how AI would be governed not just in the European Union but in other parts of the world.
Though often left out of global conversations, many of the Kenyan workers are paid wages of less than $2.00 an hour to go through trauma-inducing content for products like ChatGPT, TikTok, and Facebook. Their initiative to unionize and bring attention to the lack of mental health support, low pay, and unstable work is an important step in combating exploitative practices that power headline-grabbing AI products.[1]
If you have a face, you have a place in the conversation and the decisions that impact your daily life—decisions that are increasingly being shaped by advancing technology that sits under the umbrella of artificial intelligence.
We do not have to sit idly by and watch the strides gained in liberation movements for racial equality, gender equality, workers’ rights, disability rights, immigrant rights, and so many others be undermined by the hasty adoption of artificial intelligence that promises efficiency but further automates inequality.
After the launch event, Tawana and I grabbed some cupcakes on our way out of the Executive Office Building. We paused at the top of the Navy Steps, which overlook the White House. We stood for a moment, tiny figures on silver stone, still willing to believe our tomorrows will be better than our yesterdays—this belief inspired not by machines and the progress of technology, but by the perseverance and the creativity of everyday people. The future of AI remains open-ended. Will we let power in the hands of a few tech companies dictate our lives? Will we strive for a society that protects the …
Juneteenth—and
ready to meet the moment.
Just last week European lawmakers voted to push forward the EU AI Act, which restricts the use of live facial recognition in public spaces because of the discriminatory and invasive impacts of biometrics. The U.S. is going the opposite way, with the Transportation Security Administration (TSA) piloting …
President Biden left the same way he entered, offering warm handshakes and asking us to continue to advise. “I’m like your poor relative, I will ask for your help and offer no money in return,” he joked. Arati and the sleep-deprived staffers looked pleased with the meeting. “He usually doesn’t stay overtime. He was very engaged,” Arati whispered in my ear.
He turned out to be a manager at the Fairmont, and when he heard I needed to change rooms because of a barking dog in the neighboring room, he brought me to one of the best, the Diplomat Suite, also known as the Tony Bennett Suite, room 2211.
penthouse suite was occupied by someone who could not be named.
I slipped off my red blazer and sank into the bed. I looked at the sweeping panoramic view from the multi-room suite, seeing my reflection overlaid on a glittering skyline. My mind was occupied by hope. Sleep cradled me into California dreaming.