Facebook’s own recommendation algorithms have been inviting users to like pages that share pro-military propaganda that violates the platform’s rules... in 2018 the social media company admitted it could have done more to prevent incitement of violence in the run-up to a military campaign against the Rohingya Muslim minority ...
Facebook has “hired more than 100 Burmese-speaking content moderators [and] built algorithms [but these are] still rudimentary ... [it is] not doing enough to stop repeat offenders from returning ... Facebook has policies against recidivism ... it is not enforcing them”
According to Global Witness, “it was incredibly easy... to find this content”: “Three of those five pages contained content that violated Facebook’s policies.”