Facebook co-founder and CEO Mark Zuckerberg likes to write open letters during times of great import. He wrote one when the company went public, he wrote one to his infant daughter when he became a father, and he has written a new one that was published on Thursday.
So why is now a time that requires an almost 6,000-word essay that touches on world affairs, U.S. politics, the value of high-quality journalism and the crumbling social fabric of America? Because it seems that Zuckerberg has become “woke,” as the kids like to say.
In other words, the Facebook co-founder seems to have awakened to some of the unintended consequences of the mega-platform that he has built.
In the case of the media, for example, Zuckerberg has undergone a significant evolution from his original position just after the election, when he scoffed at the idea that “fake news” distributed on the social network was a problem and insisted that Facebook is not a media company, and is therefore more or less blameless when it comes to such things.
Note: This was originally published at Fortune, where I was a senior writer from 2015 to 2017.
Since then, Zuckerberg has admitted that disinformation spread via social media is a problem and that Facebook needs to help address it, and he has introduced a series of measures to fact-check news stories on the network. Facebook has also started a journalism project and reached out to local media outlets.
Some of this might just be a public-relations exercise, designed to simulate interest in the problem in order to get critics off his back. But in his letter at least, the Facebook CEO confronted the problem head on, saying: “Giving everyone a voice has historically been a very positive force for public discourse because it increases the diversity of ideas shared. But the past year has also shown it may fragment our shared sense of reality.” He goes on to say:
“We know there is misinformation and even outright hoax content on Facebook, and we take this very seriously. We’ve made progress fighting hoaxes the way we fight spam, but we have more work to do. We are proceeding carefully because there is not always a clear line between hoaxes, satire and opinion.”
Zuckerberg has clearly been listening to critics of attempts to make Facebook the arbiter of what is true and what isn’t, critics who argue that putting such power in the hands of a single company is unwise. As the Facebook CEO says: “In a free society, it’s important that people have the power to share their opinion, even if others think they’re wrong. Our approach will focus less on banning misinformation, and more on surfacing additional perspectives and information.”
In a larger sense, Zuckerberg seems to be admitting in his essay that Facebook’s original, somewhat simplistic goal — to bring the world together and connect everyone to everyone else — was a little flawed, or at least didn’t take into account some realities of the world we live in.
If you are trying to create a single, unified global platform with homogeneous standards and practices for billions of people, which Facebook has arguably been trying to do, then you are doomed to fail. This has happened in both large and small ways, whether it’s Facebook removing breast-feeding photos because they might offend someone, or taking down historic Vietnam War pictures.
As Zuckerberg puts it in his letter, “there are questions about whether we can make a global community that works for everyone, and whether the path ahead is to connect more or reverse course.” Needless to say, the Facebook co-founder’s default is to connect more, but he admits that “our community spans many countries and cultures, and the norms are different in each region.”
The solution, he says, is to allow for more personal customization, so that each user can decide what they wish to see or not see. And for those who argue that this perpetuates “filter bubbles,” Zuckerberg says that Facebook will try to expose users to alternate perspectives. But he is also aware of the dangers of doing this, which he describes in this way:
“Research shows that some of the most obvious ideas, like showing people an article from the opposite perspective, actually deepen polarization by framing other perspectives as foreign. A more effective approach is to show a range of perspectives, let people see where their views are on a spectrum and come to a conclusion on what they think is right.”
The Facebook CEO also seems to be aware of how much social platforms like his (and Twitter, for that matter) encourage people to take extreme positions, or create division in order to get attention. “Social media is a short-form medium where resonant messages get amplified many times,” Zuckerberg says. “This rewards simplicity and discourages nuance. At its worst, it oversimplifies important topics and pushes us towards extremes.”
So what can Facebook do to solve these problems, when the whole structure of the network — not to mention the monetization model — is arguably oriented towards maintaining the status quo? Zuckerberg is a little vague on how that is going to happen. He talks about better tools for understanding each other, better ways to have conversations, connecting more people from different walks of life, etc. But there is little detail about how exactly any of those will work.
Perhaps there are some social and political — and even just human — problems that not even Facebook and Mark Zuckerberg can solve, as well-intentioned as they might be.