As algorithms take over, YouTube’s recommendations highlight a human problem

By Ben Popken


YouTube is a supercomputer working to achieve a specific goal — to get you to spend as much time on YouTube as possible.


But no one told its system exactly how to do that. After YouTube built the system that recommends videos to its users, Guillaume Chaslot, a former employee and artificial intelligence software engineer who worked on the site's recommendation engine from 2010 to 2011, said he watched as it started pushing users toward conspiracy videos. Chaslot said the platform's complex "machine learning" system, which uses trial and error combined with statistical analysis to learn what keeps people watching, figured out that the best way to get people to spend more time on YouTube was to show them videos light on facts but rife with wild speculation.
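To make that trial-and-error dynamic concrete, here is a minimal sketch of an epsilon-greedy "bandit" recommender that maximizes average watch time. It is an illustration of the general technique, not YouTube's actual system; the video names and watch-time figures are invented for the example.

```python
import random

# Illustrative only: a toy epsilon-greedy bandit, NOT YouTube's actual system.
# "Arms" are candidate videos; the reward is minutes watched per impression.
# The hypothetical mean watch times below are invented for this sketch.
TRUE_MEAN_WATCH_MINUTES = {
    "news_report": 2.0,        # factual, short
    "indie_documentary": 4.0,  # engaging, accurate
    "wild_speculation": 7.0,   # light on facts, heavy on watch time
}

EPSILON = 0.1  # fraction of recommendations spent exploring at random
counts = {v: 0 for v in TRUE_MEAN_WATCH_MINUTES}
mean_reward = {v: 0.0 for v in TRUE_MEAN_WATCH_MINUTES}

def simulate_watch(video: str) -> float:
    """Simulated user: noisy watch time around the video's true mean."""
    return max(0.0, random.gauss(TRUE_MEAN_WATCH_MINUTES[video], 1.0))

for _ in range(10_000):
    if random.random() < EPSILON:
        choice = random.choice(list(counts))            # explore
    else:
        choice = max(mean_reward, key=mean_reward.get)  # exploit best so far
    reward = simulate_watch(choice)
    counts[choice] += 1
    # Incremental update of the running average watch time for this video.
    mean_reward[choice] += (reward - mean_reward[choice]) / counts[choice]

for video, n in sorted(counts.items(), key=lambda kv: -kv[1]):
    print(f"{video}: recommended {n} times, est. {mean_reward[video]:.1f} min")
```

In this toy setup the recommender converges on whichever video yields the longest average watch time, regardless of its accuracy, which mirrors the dynamic Chaslot describes: a system rewarded only for watch time has no reason to prefer facts over speculation.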


Routine searches on YouTube can generate quality, personalized recommendations that lead to good information, exciting storytelling from independent voices, and authoritative news sources.


But they can also return recommendations for videos that assert, for example, that the Earth is flat, aliens are underneath Antarctica, and mass shooting survivors are crisis actors.


Published on April 23, 2018 07:03