Facebook, the 13-year-old social media giant recently reported to have 1.86 billion users worldwide, is one of the largest, if not the largest, media platforms distributing information on the planet.
That number has increased roughly 17 percent a year, and there is no sign of that growth slowing down.
Users are responsible for the content they post, but what responsibility does Facebook have for the information posted on its platform, and what can – and should – it do when someone posts intentionally false or illegal information or materials to a Facebook account?
Recently, Mark Zuckerberg, Facebook’s founder and CEO, tried to answer some of these questions. He released a manifesto, a nearly 5,500-word proclamation on his own social media site, touting the benefits of building a global community and addressing Facebook’s position on “fake” news and so-called “echo chambers” or “filter bubbles” in our news consumption.
The essay discusses how Facebook and its users should go about helping people “see a more complete picture, not just alternate perspectives.” Zuckerberg makes it seem as if the only way for a person to make an informed decision is to have a “range of perspectives,” saying, “We must be careful how we do this.”
Indeed, research shows that some of the most obvious ideas, like showing people an article from the opposite perspective, actually deepen polarization by framing other perspectives as foreign.
We as users and consumers are at least partially responsible for our media diet. We should be careful about what news and information we consume, and how much weight and credibility we give to that material. However, the source of the material, whether that’s an extremely conservative website such as Breitbart news, a liberal-leaning newspaper such as the New York Times, or a friend telling you what they saw happening at the store, needs to be accurate to be considered news.
News is fact, not fiction. If it’s not true, it’s not news.
How much of the premise Zuckerberg laid out is just lip service, and how committed is he to making sure Facebook provides fair, unbiased news to everyone?
A few years back, in a 2012 letter to potential investors, he wrote, “People sharing more – even if just with their close friends or families – creates a more open culture and leads to a better understanding of the lives and perspectives of others.” He went on to say that Facebook relationships expose users to a greater number of diverse perspectives.
Not only is that not necessarily true, but this attitude has led to claims that Facebook’s algorithms leave users receiving only news and information that echo their own beliefs. Those claims have been amplified by recent bad press in connection with the presidential election and dismay over a glut of widely shared fake news.
In his manifesto, Zuckerberg bragged “we’ve seen (that) the candidate with the largest and most engaged following on Facebook usually wins.” He went on to compare social media to the television of the 1960s as the primary, driving medium for civic debate.
The major difference in 2017, however, is that television networks and traditional journalism outlets like newspapers and magazines face ramifications if they publish blatantly false, misleading, or illegal information. Facebook, meanwhile, has no real penalty to keep its users honest or to make sure everyone posts only proper materials – beyond, at best, saying “Stop it, or we will delete your account – well, at least until you make a new one!”
Our mistrust of Facebook’s promises to stop fake news posts may be valid, especially considering more recent news. The BBC reportedly informed Facebook that child pornography was being shared on its site, sending Facebook more than 100 pictures along with links to the accounts of the users sharing them.
Instead of removing the material, taking down the pages, and turning the users over to police, Facebook went to the police and lodged a complaint against the BBC. The BBC had used Facebook’s own reporting tools, and even after finding and reporting the roughly 100 images, it found 82 still online when it later checked on the situation.
Understandably, Facebook and some concerned users caution against the corporation imposing regulations that are too strict, worrying the site will turn into a “police state” and arguing that people are always entitled to their opinions.
However, much harm can be done when people can freely present false information as true. This is nothing new; it is why we have libel laws, because saying false things about individuals, groups, or even companies or countries is damaging. Since the founding of the U.S., we have had some form of libel law on the books to help protect individuals and their reputations.
Facebook and Zuckerberg say they are doing a great deal to help, making changes and implementing plans meant to fix this. But this is not the first time we’ve heard from Facebook that it is going to stop fake news from making its way to users.
It boils down to this: Facebook has reaped the rich rewards of providing a platform for the world to connect and converse. It made nearly $20 billion in ad revenue last year alone, without having to produce much in the way of content. Now we are at a crossroads where Facebook and Mr. Zuckerberg need to decide whether they will keep things the way they are, or stick to their word and make some real, clear changes to their network.
Technology has revolutionized the world. It’s a different place than it was 10 or even two years ago, and how we communicate is changing right along with it. But we must have a way to inform people of what’s happening that lets them be certain their news is valid.
Even if it’s colored with opinions and different interpretations, as Gertrude Stein said, “A rose is a rose is a rose” – and as long as everyone knows they are talking about a rose, they can argue over the color!