A Letter to Mark Zuckerberg About Facebook’s Fake News Problem

Elizabeth Edersheim
Posted: 01/27/2017

As Facebook confronts criticism about its role in mixing real news and fake news, I envisioned a letter to Mark Zuckerberg.

Dear Mark,

As Facebook addresses the challenges of its role in the post-truth world, I encourage you to reframe the questions your team appears to be asking. How to do that? Try the Master of Management’s 20th century approach to 21st century problems.

Rather than ask “Is this really our fault?” or “What is mainstream media doing that we can adapt?” I suggest you focus on yesterday’s classic Peter F. Drucker questions as Facebook invents tomorrow’s communications system:

  • What is reality?
  • What is the social challenge?
  • What needs to be true for Facebook to be able to address these challenges?
  • How can we make it true? 

Remember—keep it simple. Then step back and ask, what is the Facebook strategy? 

What Is Reality? 

Reality 1: Facebook has succeeded in becoming a community, the equivalent of the forum in Athens. It is where information, opinions, and lives are shared. It is so successful that more than half of U.S. adults claim to get some news from Facebook posts. In fact, the postings are so influential that some people have even blamed Facebook for promoting false news that shaped the 2016 presidential election.

Reality 2: Much of what is being called “news” and posted on your site consists of false or distorted facts meant to influence readers.

Reality 3: Facebook was not set up to be a newscaster and is not a censor—nor does being a censor fit with your values and mission. That leaves you in a quandary: You are the de facto news provider, yet you don’t intend to determine what is news.

What Is the Social Challenge?

In the post-truth environment, people have almost endless ways to share untruths—knowingly and unknowingly. The new reality: No organization can control the facts and get news-sharing back to the old, hierarchical reality.

The social challenge is one that Peter F. Drucker, an Austrian immigrant to the U.S., framed way back in 1942—not a challenge of propaganda but of the survival of democracy:

The danger of total propaganda is not that the propaganda will be believed; the danger is that nothing will be believed and that every communication will become suspect…

The end results of total propaganda are not fanatics, but cynics—but this, of course, may be an even greater and more dangerous corruption.

He continued, with what some now see as special prescience: 

Although propaganda has usually been associated with totalitarian regimes, propaganda can also pose a grave danger to democracies.

That is the social challenge that Facebook has a responsibility to address.

What Needs to Be True for Facebook to Address This Challenge?

Facebook and technologies like live video create challenges for society while also building accountability and trust in ways that didn’t exist before.

You and your team must contemplate this odd situation with the same open minds that made you ask what must be true to get the Internet to the entire world, and the same inventiveness that led you to create a fleet of solar powered planes to deliver signals every place the Internet is not.

I think that if you believe in accountability and trust, then so do the vast majority of posters and re-posters on Facebook, who do not want their trustworthiness questioned because they posted something untrue. Let us, the members of Facebook-based communities, be the gatekeepers. Let us choose settings that alert us as we are about to post or read something.

For example, suppose Max sets her own settings so that she is alerted if something she is about to post fails system A’s truth-o-meter or system B’s fact checker, or if the source is not a five-star source. She can pick which fact-checker to use. Max may still choose to post it, or may post it with a comment indicating that she is not certain it is accurate.

Max also can set her settings to symbolically highlight the material that others post, and again she can decide on the gatekeeper. She may choose not to read what I post, because it is marked by a black skull or other ominous sign. Maybe Facebook’s AI can be used to see if the facts are corroborated anyplace else.
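The member-chosen gatekeeping described above could be sketched roughly as follows. Everything here is a hypothetical illustration under stated assumptions, not an actual Facebook feature: the names (`GatekeeperSettings`, `Post`), the pluggable fact-checker functions, and the five-star source rating are all invented for the sketch.

```python
from dataclasses import dataclass, field
from typing import Callable, List, Tuple

# A fact-checker is modeled as a function: text -> True if the claim checks out.
FactChecker = Callable[[str], bool]

@dataclass
class Post:
    text: str
    source_stars: int  # hypothetical 1-5 rating of the post's source

@dataclass
class GatekeeperSettings:
    """Per-member settings: which checkers to consult, minimum source rating."""
    checkers: List[Tuple[str, FactChecker]] = field(default_factory=list)
    min_source_stars: int = 5

    def alerts_for(self, post: Post) -> List[str]:
        """Return the reasons this post would trigger an alert (empty if none)."""
        reasons = []
        for name, check in self.checkers:
            if not check(post.text):
                reasons.append(f"failed {name}")
        if post.source_stars < self.min_source_stars:
            reasons.append("source is not five-star")
        return reasons

# Max picks her own gatekeeper; here "system A's truth-o-meter" is a stand-in
# that checks against a tiny database of verified claims.
verified_claims = {"the sky is blue"}
settings = GatekeeperSettings(
    checkers=[("system A's truth-o-meter", lambda t: t in verified_claims)],
    min_source_stars=5,
)

flagged = settings.alerts_for(Post("a dubious claim", source_stars=2))
clean = settings.alerts_for(Post("the sky is blue", source_stars=5))
```

The key design point matches the letter: the platform never censors. It only surfaces alerts, and the member decides which checkers to trust and whether to post anyway.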

Or perhaps greater innovation won’t require censorship and monitoring. It might involve the extended community of writers and thinkers.

How Can Facebook Make This True? 

As Facebook so often does, you can model ownership and the gatekeeping behavior for all 1.8 billion members of your community.

Mark, you often visit communities to learn for Facebook. For example, a few days ago you visited the Dallas Police Department to learn how social media is changing law enforcement, and how you can help officers connect to their communities. Do the same thing with truth-seeking citizens; take the findings back to your teams.

And. Keep. It. Simple.

* * *

Many of us want you to adopt a common-sense approach. We admire your progress.

To paraphrase Drucker, the survival of our basic beliefs, the very meaning of our society, depends on Facebook’s ability to help communities keep credibility.

And as you yourself wrote in a post a few weeks ago, your Facebook team has a responsibility to make sure Facebook has the greatest positive impact on the world.

What’s on your mind? This very challenge, I believe. And I’m betting Facebook will meet the challenge.
