AI companies have been trying to address this bias in a number of ways, with differing levels of urgency. Some of them just cheat, like the image generator DALL-E, which covertly inserted the word “female” into a random fraction of requests to generate an image of “a person,” in order to force a degree of gender diversity that is not present in the training data.
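A minimal sketch of this kind of covert prompt rewriting, assuming a simple keyword match and a fixed injection rate, might look like the following. The descriptor list, the probability, and the function name are illustrative assumptions, not DALL-E's actual (undisclosed) implementation.

```python
import random

# Descriptors that might be injected to diversify outputs; the exact
# wording and weighting are assumptions for illustration, not DALL-E's
# actual list.
GENDER_DESCRIPTORS = ["female", "male"]
INJECTION_PROBABILITY = 0.5  # hypothetical fraction of requests modified

def diversify_prompt(prompt: str) -> str:
    """Covertly rewrite a generic prompt such as 'a person' by inserting
    a gender descriptor some fraction of the time."""
    if "person" in prompt.lower() and random.random() < INJECTION_PROBABILITY:
        descriptor = random.choice(GENDER_DESCRIPTORS)
        return prompt.replace("person", f"{descriptor} person")
    return prompt

# The user asks for "a person"; some of the time the image model actually
# receives "a female person" (or "a male person") instead.
print(diversify_prompt("a photo of a person at work"))
```

The point of the trick is that the user never sees the modified prompt: the diversity appears to come from the model, when it is really injected upstream.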