Diversity at Google AI: Is that the Pope?

If you were to paint a Pope, what would he look like? Maybe you’re thinking of a gold-embroidered, pointed headpiece or a simple, circular skullcap. But almost certainly both would be sitting on the head of an older white gentleman. If you were to draw a young black woman instead, it would be quite unusual. You could say: a mistake.

But that’s exactly what an artificial intelligence from Google recently did. When asked to generate images of popes, the system showed a black woman and a black man. An astonishing reaction – especially because AI systems have so far been known to behave in the opposite way: they reproduce social prejudices. Successful managers are portrayed as white men, drug dealers as black men.

Google's AI, called Gemini, by contrast, even generated Wehrmacht soldiers with non-white skin.

All of this caused significantly more excitement than previous creations that perpetuated racist prejudices. Eventually, Google decided to temporarily turn off the ability to generate images of people. The company apologized and promised: “We will do better.”

But how? And anyway: Why can’t Google and Co. get their AIs under control?

It is not the first time since the AI hype broke out a year and a half ago that such a system has caused problems. Microsoft, for example, is currently having trouble with its image generator because the program creates violent and sexualized images. And when chatbots suddenly talk about falling in love with the user, that might still be cute. Other mistakes, however, are a threat to democracy, such as those of Microsoft's AI search engine Bing. When researchers examined how accurately Bing could provide information about upcoming state elections, the AI answered a third of the questions incorrectly; it reported polling figures that were wrong, for example.

There is an entire branch of the AI industry dedicated to bringing AI into line. The discipline is called alignment, and it is apparently still in its infancy. At least that is what John Schulman argues. He is one of the founders of OpenAI, the company behind ChatGPT, and is responsible for exactly this alignment work. When Google was criticized for its absurd images, Schulman jumped to the competitor's defense in a post on the platform X: alignment is a fairly young discipline, he wrote, and the "hyper-wokeness" was simply a bug, the term software developers use for errors.

Jakob Foerster may know how to fix this. He has worked on artificial intelligence for Google, OpenAI and Meta, and is now a professor at the University of Oxford, where he researches how AI and humans can work together. Foerster is not one for long preambles; in the video call he sits in front of a whiteboard scrawled with graphs, ties his hair into a braid and immediately gets worked up. "The methods used are obviously relatively stupid," he says.

In fact, during the Gemini debate it emerged that Google had been running a kind of unannounced experiment on users of its chatbot. In the background, each user input, the so-called prompt, was apparently supplemented with an invisible addition. Google did not comment on the exact method, but some users tricked the bot into revealing the secret text. The request "Show me a Pope" was therefore passed on to the system together with the instruction to "explicitly include different genders and ethnicities".
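To make the mechanism users uncovered concrete, here is a minimal sketch of how such hidden prompt augmentation could work in principle. The function names and the exact wording of the injected note are illustrative assumptions, not Google's actual code.

```python
# Minimal sketch of hidden prompt augmentation, as users inferred it.
# The names and the injected wording are assumptions for illustration,
# not Google's actual implementation.

HIDDEN_NOTE = "Explicitly include different genders and ethnicities."

def augment_prompt(user_prompt: str) -> str:
    """Silently append the hidden instruction to whatever the user typed."""
    return f"{user_prompt}. {HIDDEN_NOTE}"

if __name__ == "__main__":
    # The user only types the first part; the image model receives the full string.
    print(augment_prompt("Show me a Pope"))
    # Output: Show me a Pope. Explicitly include different genders and ethnicities.
```

Because the addition happens after the user hits enter and before the text reaches the image model, the user never sees it, which is why it only came to light through prompt-extraction tricks.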

