This article is part of the On Tech newsletter. You can sign up here to receive it weekdays.
Days of fury over the killing of George Floyd last week in Minneapolis have, strangely, shown the best of technology, which gives nearly everyone a megaphone to show and tell the world what they’re feeling, doing and witnessing.
But the same technology also enables the spread of dangerous misinformation and can be used to incite violence and empower genocidal mobs.
Can these two sides be separated? Can we have one without the other, or is it a package deal?
I spent a good chunk of the weekend on Twitter, which, like other internet hangouts, felt like a place to bear witness to the raw, beautiful and terrible moments of an important point in U.S. history.
I will never forget the video of two men and a teenager talking about how powerless they feel. I watched a moment of silence online, a woman angry at people tagging graffiti on a store, fury directed at a police car in my city, and another police vehicle appearing to ram through a crowd, also in New York City. I needed to see all of this. We needed to see all of this.
News outlets are publishing articles and videos on the protests, but social media brings the reporting to an even wider audience.
And journalists can’t be everywhere and see everything. Smartphones let people at every protest in every city capture what they see and tell their own stories. Otherwise we’d have a flatter view of what’s happening. Watching, listening and witnessing don’t on their own fix social inequalities, but they’re necessary steps.
There are days I wonder if the last decade of the internet was all a horrible mistake. Maybe it was a terrible idea to let people broadcast whatever they wanted to potentially billions of people without a moment’s hesitation or oversight, and to let a handful of corporations turn the dials that determine how prominently those messages got circulated.
This has empowered and encouraged drug dealers, lynch mobs, child sex abusers and backers of dangerous health information to run wild until — maybe — one of those dial-twisting companies noticed and — again, maybe — stopped them.
And then there are moments when I can’t imagine not putting the power to broadcast in the hands of billions.
I want to have the good, the bad and the in-between of bearing witness to moments of history without everything else.
Facebook is making a choice, too
My least favorite words in the English language are that Facebook shouldn’t be “the arbiter of truth” of what people say online.
Mark Zuckerberg, Facebook’s founder and chief executive, said those words (again) last week in response to questions about why the company, unlike Twitter, let stand without comment or action posts from President Trump that falsely claimed that mail-in voting ballots would mean the November presidential election was “rigged.”
It’s completely fair to say that the words of powerful people like the American president should stand on their own no matter what. There is inherent value in seeing the unvarnished comments of world leaders and being able to debate whether those words are right or wrong.
But Zuckerberg didn’t stop at that sentiment. He reiterated a big, bold statement that Facebook doesn’t want to be the arbiter of truth. Guess what? It is, and Zuckerberg knows it.
Facebook, like those other dial-twisting companies I wrote about above, is not a zone of complete free expression.
Tens of millions of times each month, people who work on Facebook’s behalf — or computer systems for which Facebook writes the rules — enforce the company’s policies that prohibit calling for violence against a person or a group of people, discussions about suicide or self-harm, or posting sexually explicit material about a child. Facebook likewise determines what counts as bullying on its online hangouts and what is spam to be blocked or deleted.
Facebook does not stand back and simply let anything happen inside its digital walls. And I don’t think we want it to. Everything that we see on Facebook is because Facebook actively chose to do something, or actively chose not to do something. Facebook is not neutral.
I have to assume that Zuckerberg saying that Facebook shouldn’t be the arbiter of truth is either a stunningly simplistic comment from one of the world’s most powerful people, or a blatant attempt to win over politicians and political conservatives who have proposed regulatory crackdowns. Either way, it’s not good.
Before we go …
Be careful about claims regarding “outside” agitators: Some federal and local officials have said anti-government extremists organized online to incite violent confrontations or property damage at the protests sparked by the death of George Floyd. These claims are often made without evidence and may be inflating the role of far-left or far-right “outsiders” at protests, NBC News reported.
Inside the messy policymaking at Facebook and Twitter: My colleague Kate Conger has a back story on how Twitter executives, in a late-night virtual conference last week, debated and decided to add a warning label to Mr. Trump’s tweet appearing to threaten military violence against protesters who looted.
At Facebook, there are clusters of unrest among some employees unhappy with how the company has handled inflammatory posts from the president, the Times tech reporters Sheera Frenkel and Mike Isaac write. Dozens of these employees are staging a virtual “walkout” on Monday, in what my colleagues said was rare public criticism of their own company.
Real retail therapy: In response to rising anxiety over the coronavirus and everything else, the online shoe retailer Zappos revamped its customer service hotline to let people call and talk about anything — their future travel plans, help with homework or whatever is on their minds, my colleague Jenny Gross writes.
We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at email@example.com.