I designed algorithms at Facebook. Here’s how to regulate them.


These are strange times. Our social media feeds teem with sensational and divisive content, thanks to social media companies’ adoption of two key technological developments: personalization, driven by the massive collection of user data through web cookies and big-data systems, and algorithmic amplification, the use of powerful artificial intelligence to select the content shown to users.

On their own, algorithmic personalization and amplification have no doubt made wonderful new internet services possible. As technology users, we take for granted our ability to customize apps and websites with our favorite sports teams, musicians and hobbies. And the use of ranking algorithms by news websites in their user comment sections, traditionally cesspools of spam, has been widely successful.

But when data scientists and software engineers mix content personalization with algorithmic amplification, as they do to produce Facebook’s News Feed, TikTok’s For You tab and YouTube’s recommendation engine, they create uncontrollable, attention-hungry beasts. While these algorithms, such as Facebook’s ‘engagement-based ranking’, are marketed as surfacing ‘relevant’ content, they perpetuate bias and affect society in ways that are barely understood by their creators, let alone by users or regulators.

In 2007, I started working at Facebook as a data scientist, and my first assignment was to work on the algorithm that powered News Feed. Facebook has had more than 15 years to demonstrate that algorithmic personalized feeds can be built responsibly; if it hasn’t happened by now, it never will. As the whistleblower Frances Haugen put it, it should now be humans, not computers, deciding whom we hear from.

While understaffed teams of data scientists and product managers like Ms. Haugen try to rein in the worst effects of these algorithms, social media platforms have a fundamental economic incentive to keep users engaged. This ensures that these feeds will continue to promote the most sensational and inflammatory content, and it creates an impossible task for content moderators, who struggle to police problematic viral content across hundreds of languages, countries and political contexts.

