Sat. Jan. 23, 2021

Toxic Trade-Offs at Facebook

This article is part of the On Tech newsletter. You can sign up here to receive it weekdays.

Three years ago, Mark Zuckerberg rebooted Facebook to try to fix one of its big problems. He inadvertently turbocharged a different problem, feeding the growth of dangerous conspiracy theories like QAnon. I worry that we might see a repeat of this phenomenon.

In 2017, Facebook began a revamp to emphasize personal posts and interactions and to steer us away from aimlessly scrolling past news articles and puppy videos in the news feed. Among the changes was pushing people toward Facebook Groups, or online forums of like-minded people.

For many people, groups can be a wonderful resource and social outlet. But they also have become places for people to wallow in fake health treatments, plot violence or spread false theories like QAnon.

Groups that post frequently and have a lot of avid back-and-forth — and that often applies to discussions of fringe ideas — tend to get circulated more in the Facebook news feed, which funnels more people into those groups.

Charlie Warzel, an Opinion writer for The New York Times, wrote that the tilt toward groups helped fertilize QAnon, the sprawling and false conspiracy theory that a child-abusing cabal leads powerful institutions.

Facebook didn’t invent QAnon or other conspiracy theories. But Facebook’s computerized nudges that benefited the most hyperactive groups “most likely supercharged the QAnon community,” Charlie wrote.

Some of the harmful consequences of that remodel show that when internet companies with billions of users make even well-intended changes, they can set in motion damaging ripple effects.

There is a line of thinking that Facebook could never have predicted how people would misuse its creations. That’s not quite true. There were warning signs for years about dangerous activity in groups, but the company didn’t pay attention to the toxic trade-offs of emphasizing them.

Facebook now wants to become a place for us to have more private and meaningful conversations — a continuing evolution from a global public message board to the more cloistered space that Zuckerberg started to emphasize in 2017. I worry that this may create Facebook’s next unintended consequence.

Part of this privacy plan is a march to encrypt, or scramble, all activity so that there are no digital trails of what we post or say. There are good reasons for this. Facebook wouldn’t be able to peer into our private messages, and authoritarian rulers couldn’t demand that Facebook identify the person behind an account critical of the government.

But the potential pitfalls terrify me. Encrypting Facebook apps including Instagram and Messenger will make it difficult or impossible for Facebook to help law enforcement figure out who is selling drugs on Instagram or calling for violence in its groups. It will be harder to trace a propaganda campaign to a foreign government. Facebook will be able to say, truthfully, that it can’t see behind its own curtain.

Facebook is aware of these risks. “Encryption is a powerful tool for privacy, but that includes the privacy of people doing bad things,” Zuckerberg wrote last year. Executives have sometimes said, vaguely, that they’ll consult with experts to limit the downsides of encryption.

The result of Facebook’s next changes may be that it absolves itself of responsibility for many of the social network’s horrors. A violent plot organizing in Facebook groups? Facebook can’t spot it. Child sex abuse imagery spreading on the site? That’s not Facebook’s problem anymore.

There’s a pattern of Facebook revamping itself to stay popular, and then creating new problems it has to respond to. But with this encryption shift, it cannot afford — actually, strike that; humanity cannot afford — to ignore the trade-offs until they become blindingly obvious.

If you don’t already get this newsletter in your inbox, please sign up here.

Brian X. Chen, a personal technology columnist for The Times, has advice for how to make our “private” internet browsing actually private.

Over the last decade-plus, Google Chrome and other popular browsers have offered a so-called private mode that purportedly keeps our browsing activity hidden. That may have lulled us into a false sense of security, because the vast majority of the time, private web browsing isn’t truly private.

In general, private browsing sessions serve to prevent others who share your device from seeing which websites you have visited. In other words, they prevent the browser from saving a history.

In the mobile era, when many of us are the sole users of our devices, this definition of privacy is outdated. There are also now a gazillion ways for your browsing activity to be tracked by third parties like marketing companies.

But Brave, Firefox Focus and DuckDuckGo are browsers that take private sessions a step further: They automatically block the tracking technology embedded inside websites that helps advertisers target you. (If you’ve ever wondered why, after doing a web search for blenders, you kept seeing ads for blenders all over the web, you could probably blame ad trackers.)

Even these browsers fall short in some ways. They generally don’t hide your browsing traffic from your internet service provider. To do that, you would have to use a virtual private network, or VPN, which creates a virtual tunnel that shields your browsing information from the service provider. Brave offers a version of its browser with a built-in VPN for $10 per month.

In short, privacy is an increasingly complex term, and when a company says it is offering you privacy, take that with a grain of salt. Your definition of private may not mesh with a tech company’s business model.

  • We all need fun diversions, including the kids: My colleague Kellen Browning wrote about teens and tweens flocking to the online gaming site Roblox during the pandemic. About three-quarters of Americans from ages 9 to 12 are on the site, according to Roblox, and many of its young players are thriving financially by developing new games.

  • The risk of the U.S. government’s Chinese tech crackdown: U.S. efforts to begin booting Chinese apps and telecom companies from America’s digital borders mirror aspects of China’s authoritarian regulation of the internet, my Times colleagues wrote. U.S. efforts to keep out Chinese tech could trigger retaliation against American tech companies abroad.

  • Having trouble buying a computer for school? It’s not just you, Axios reported. As schools start to reopen — virtually or in person — computer makers are having trouble keeping up with the demand from both grown-ups working at home and kids learning from home. (I’ll have more in Tuesday’s newsletter about what to do if you can’t find the computer you want.)

Playing peek-a-boo with gigantic hippos.

We want to hear from you. Tell us what you think of this newsletter and what else you’d like us to explore. You can reach us at ontech@nytimes.com.