In 2016, Microsoft introduced a chatbot, Tay, on Twitter. Its conversational style was modeled on the speech patterns of a teenage girl, and the bot was supposed to grow more sophisticated as it interacted with people and learned their conversational styles. Within twenty-four hours, a group on the 4chan discussion forum coordinated their responses to Tay. They flooded the system with racist, misogynistic, and anti-Semitic tweets, thereby transforming Tay into a racist, misogynistic anti-Semite. Tay learned from them, and, with no actual understanding, parroted their ugliness back to the world.
A Hacker's Mind: How the Powerful Bend Society's Rules, and How to Bend Them Back