"Facebook quietly stopped recommending that people join online groups dealing with political or social issues", and will "assess when to lift them afterwards, but they are temporary."
This part of Facebook's AI plays a key role in pushing "people down a path of radicalization... groups reinforce like-minded views and abet the spread of misinformation and hate". While it's impossible to verify this without access to Facebook's data, "an internal Facebook researcher found in 2016 that "64% of all extremist group joins are due to our recommendation tools"... Zuckerberg said he was "not familiar with that specific study," despite the fact that he had criticized the Journal's story internally to employees".