Stephen H. Karten

an improvement on the biases of traditional content editors, the platforms were surreptitiously implementing algorithmic filters that lacked the value system of human editors. Algorithms would not act in a socially responsible way on their own. Users would think they were seeing a balance of content when in fact they were trapped in what Eli called a “filter bubble” created and enforced by algorithms. He hypothesized that giving algorithms gatekeeping power without also requiring civic responsibility would
Zucked: Waking Up to the Facebook Catastrophe