- For years, Facebook has known that its algorithms encourage and amplify antisocial behavior like hate speech and extreme political bias to keep users engaged, according to company documents reported in The Wall Street Journal.
- When given proposals to make the platform better, executives often balked. They didn’t want to offend bad actors, and they didn’t want to release their hold on people’s attention. At Facebook, attention equals money.
- So Facebook’s algorithms have been allowed to continue being sociopaths — pushing divisive content and exploiting people’s visceral reactions without a thought for the consequences or any remorse for their actions.
- Meanwhile, by letting bad actors on the platform do their thing, Facebook is feeding an inherent political bias into the algorithms themselves, and the company at large.
- This is an opinion column. The thoughts expressed are those of the author.
Facebook has always claimed that its mission is to bring people together, but a new report from The Wall Street Journal laid bare what many have suspected for some time: Its algorithms encourage and amplify harmful, antisocial behavior for money.
In other words, Facebook’s algorithms are by nature sociopaths. And company executives have been OK with that for some time.
Here’s what we learned from Jeff Horwitz and Deepa Seetharaman at The Journal:
- A 2016 internal Facebook report showed “64% of all extremist group joins are due to our recommendation tools.”
- A 2018 internal report found that Facebook’s “algorithms exploit the human brain’s attraction to divisiveness” and warned that if left unchecked they would simply get nastier and nastier to attract more attention.
- An internal review also found that the algorithms were amplifying users who spent 20 hours on the platform and posted the most inflammatory content (users who may not be people at all, but rather Russian bots, for example).
- Facebook executives, especially Mark Zuckerberg, time and time again ignored or watered down recommendations to fix these problems. Executives were afraid of looking biased against Republicans — who, according to internal reports, were posting the highest volume of antisocial content.
- And of course, executives had to protect the company’s moneymaking, attention-seeking, antisocial algorithms — regardless of the damage they may be doing to society as a whole. Politics played into that as well.
Article URL : https://www.businessinsider.com/facebook-algorithm-sociopath-management-too-greedy-to-stop-it-2020-5