Unless you’ve been living under a rock — in which case, get this list from the Washington Post in front of you immediately — you know that Facebook is now the most-cited social media platform in the U.S. For years, numerous studies have shown that Facebook heavily influences how people engage with one another: how they share, like, write, vote, and, all too often, succumb to manipulative propaganda. Yet during the 2016 Presidential election, some studies suggested that Facebook did little to influence the voting process.
A study by NYU and Columbia published this month shows that, in the moment of Trump’s win, Facebook users were much more likely to post, interact with, and engage with pro-Trump content than with pro-Clinton content. Perhaps because of this, The Washington Post reports that a Facebook executive is trying to wash the company’s hands of any active role in the process.
As Adrian Chen reports for The Washington Post, Mike Schroepfer, Facebook’s Chief Technology Officer, has been furiously trying to become less of a target of criticism. More than that, he and Facebook are refusing to accept blame, talking instead about “maintaining a healthy ecosystem.”
It’s true that Facebook often hosts images and memes that spur social discord, and it is often the place where a discussion about hate speech and casual racism can end up trending and quickly spread across the social sphere. But it’s just as true that social media plays a part in a wide range of political connections, initiatives, and rallies, whether they are organized by white nationalists, public protests by Black Lives Matter, or boycotts against Ivanka Trump’s brand.
Although many of Facebook’s protest hashtags emerged around the time of the presidential election, the realization that the platform held so much power over how people acted in the moment and connected afterward — from well before election day — was probably jarring to many who considered it.
But many internet commenters seem to feel that Schroepfer and Facebook shouldn’t bear any responsibility for this.
Yet some Facebook figures have suggested that the criticism of Schroepfer is much ado about nothing. Julie Zhuo of TechCrunch recently wrote, “If you’re the CEO of Facebook, there’s a widespread, incorrect belief that you owe the entire country your opinion,” adding, “The truth is your job is not to act on behalf of those who have elected a leader.”
Writing for The New York Times yesterday, reporter Cathy Young made a similar claim:
What we have here is a “fake news” incident in which the anti-media crusaders are attempting to render it impossible for Facebook to be accountable for the mass of content flowing across its platform. It’s a desperate attempt to take away one of Facebook’s most important defenses against the political attacks. People are constantly coming to Facebook — or setting up Facebook — because they believe that they can have a say in what is posted, and that there is a way to vet those posts.
Young is dismissive of what she sees as a smear campaign meant to get critics off Facebook’s back. But is that right? Facebook should probably have no intention of steering its users toward posts they agree with, or away from those they believe to be harmful. But the company must, along with the rest of us, retain some internal ability to shape how political content is posted, shared, and acted upon. When users aren’t happy, they tend to give the site visible, behavioral feedback that informs how it shapes the experience. The fake news issue doesn’t actually change Facebook’s ability to encourage user empowerment and community, but it does put the potential for more unethical behavior in perspective.
This post originally appeared on The Express and is republished here with permission.