My notes
Truly excellent, insightful piece.
"if Facebook did make the content that users see more positive, should we simply be happy? ... If Alice is happier when she is oblivious to Bob’s pain because Facebook chooses to keep that from her, are we willing to sacrifice Bob’s need for support and validation?
This is a hard ethical choice ... Facebook is making these choices every day without oversight, transparency, or informed consent...
most of what Facebook’s algorithms do involves curating content to provide you with what they think you want to see ... Their everyday algorithms are meant to manipulate your emotions... Facebook is not alone ...
an entire industry has emerged to produce crappy click bait content ... we as a society believe that it’s totally acceptable for news ... to manipulate people’s emotions through the headlines they produce and the content they cover....
We need ethics to not just be tacked on, but to be an integral part of how everyone thinks about what they study, build, and do...
any company that manipulates user data [should] create an ethics board... that has visibility into all proprietary algorithms that could affect users...
there needs to be a mechanism for whistleblowing regarding ethics...
I want to see users have the ability to meaningfully influence what’s being done with their data ... their voices to be represented in these processes."
- What does the Facebook experiment teach us? — The Message — Medium
The above notes were curated from the full post:
medium.com/message/what-does-the-facebook-experiment-teach-us-c858c08e287f?source=email-1cb9fa8d4bee-1404453157117-newsletter