Wednesday, October 6, 2021

Is it time to regulate Facebook?

I’ve argued against regulating social media on numerous occasions. The recent testimony of Facebook whistleblower Frances Haugen, however, is persuasive evidence that something needs to be done about the company.

To be clear, I still don’t support rewriting Section 230 to mandate fairness on social media. I agree with Facebook’s and Twitter’s decisions to ban Donald Trump from their platforms and to enforce their own community standards. As I’ve said in the past, these are private platforms, and they have the right to set their own rules. That was true even when Facebook deleted a political page that I had operated for more than a decade.


Rather than viewpoint discrimination, the real danger of social media is the prospect of the platforms spreading mental illness and deepening our political divisions. Both dangers are among the allegations made by Haugen, many of which are backed up by internal company documents that she took with her before leaving Facebook in May.

While it is not the government’s job to regulate the hurt feelings of private citizens who are ejected from private platforms, public health and national security do fall directly under government jurisdiction. Haugen’s allegations, along with documents obtained by the Wall Street Journal, show that Facebook negatively affects both mental health and national security. Unlike free speech claims against private entities, these are areas where government regulation is appropriate.

Internal documents released over the past few weeks show that Facebook targeted teenage girls with its Instagram app even though it knew that its algorithms steered them toward topics that caused about a third of them to experience anxiety and depression. Alarmingly, about six percent of teens who reported suicidal thoughts said that the idea originated on Instagram. Reports that some teens say Instagram improves their self-image do not overcome this chilling fact.

Beyond that, engineers design social media sites to be addictive. They want to keep you engaged so that you keep scrolling and keep seeing ads. We engage on social media at the expense of real-world interactions. As a result, we are both more connected and more isolated than ever before.

On top of that, Facebook’s attempts to increase “meaningful social interactions” (MSIs) between family members and friends backfired in a big way. An algorithm change meant to encourage people to interact rather than just read content online caused content creators to shift toward “clickbait” posts that were full of outrage and sensationalism. These posts generated lots of comments and shares, but they also generated a lot of anger.

As David French recently pointed out, Democrats and Republicans agree on a lot of issues, yet both sides see the other as radicals who threaten America’s very existence. Sensationalist clickbait makes us focus on our differences rather than on what we have in common. This drives us further apart and deepens our divisions.

“Our approach has had unhealthy side effects on important slices of public content, such as politics and news,” a team of Facebook scientists said in documents obtained by the Wall Street Journal, calling the problem “an increasing liability.”

Further, leaked documents show that Facebook was aware that its platform was being used for nefarious purposes, from sex trafficking to inciting violence against minorities to organ selling to pornography to governments quashing political dissent and, of course, the widespread dissemination of misinformation and conspiracy theories, yet it failed to take action to stop them. There is also evidence that Facebook was used by the January 6 insurrection plotters.

“I saw Facebook repeatedly encounter conflicts between its own profit and our safety. Facebook consistently resolved these conflicts in favor of its own profits,” Haugen said in her testimony. “As long as Facebook is operating in the shadows, hiding its research from public scrutiny, it is unaccountable. Until the incentives change, Facebook will not change.”

The picture painted by Haugen’s testimony is of an evil corporation bent on profits at all costs. The revelations about Facebook’s internal workings and the bevy of cover-ups make the company seem to rival, if not outpace, the tobacco companies in the damage it does to the world.

At the same time, Facebook allows us to exist in bubbles of confirmation bias. Users need never encounter a contradictory opinion, at least not one that isn’t shouted down by like-minded users in the comments, unless they choose to seek out opposing points of view. Most people would rather have their beliefs affirmed than questioned, so massive blind spots to reality have developed.

The question is what to do about it. The problem is complicated by the fact that Facebook is heavily used by small businesses that rely on the platform for marketing and sales. Around the world, 200 million businesses use Facebook’s various tools. At this point, shutting the company down would damage the economy; the six-hour outage on Monday alone cost some businesses thousands of dollars.

I’m no tech expert, but there are a few changes that seem like obvious ways to start fixing the problem. First, trash the algorithms. If you’re on Facebook, you’ve probably noticed that you never see posts from a lot of your friends. That’s because Facebook’s algorithms reward posts that generate engagement: likes, comments, and shares. The same incentive, however, also rewards controversial and anger-inducing posts from internet trolls.

A quick fix would be to just let posts appear on users’ walls in chronological order. If users want to seek out clickbait that’s one thing, but it’s another entirely to shove it in their faces.
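
To make the contrast concrete, here is a minimal sketch of the two approaches in Python. The scoring model and weights are purely hypothetical stand-ins for engagement ranking in general; Facebook’s actual algorithm is secret and far more complex:

    from dataclasses import dataclass
    from datetime import datetime

    @dataclass
    class Post:
        author: str
        created: datetime
        likes: int
        comments: int
        shares: int

    # Hypothetical weights: comments and shares signal "engagement" more
    # strongly than likes, so heated posts float to the top.
    def engagement_score(p: Post) -> int:
        return p.likes + 5 * p.comments + 10 * p.shares

    def engagement_feed(posts: list[Post]) -> list[Post]:
        # What the algorithms do now: most provocative first, regardless of age.
        return sorted(posts, key=engagement_score, reverse=True)

    def chronological_feed(posts: list[Post]) -> list[Post]:
        # The quick fix: newest first, no engagement signal at all.
        return sorted(posts, key=lambda p: p.created, reverse=True)

Under the first feed, a week-old outrage post with hundreds of angry comments outranks this morning’s vacation photos from a friend. Under the second, it simply scrolls away with time.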

In the meantime, users can help themselves by not engaging with trolls. If we don’t feed the trolls, engagement is limited and their posts won’t be prioritized as highly.

Second, as Haugen pointed out, Facebook’s tools designed to deter the spread of misinformation are woefully inadequate. We’ve probably all had the experience of having an innocuous post removed or flagged because it tripped some aspect of the algorithms.

Haugen said that Facebook is “overly reliant on artificial intelligence systems that they themselves say will likely never get more than 10 to 20 percent of the [malicious] content.” Judging from the anti-vax, stolen election, and pandemic fascism posts that I see as I scroll, she’s probably right.
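
To see why automated filtering fails in both directions, consider a deliberately crude keyword-based flagger. This is entirely my own illustration of the general problem, not a description of Facebook’s actual systems, which use machine-learning classifiers rather than keyword lists:

    # A deliberately crude keyword flagger, illustrating why blunt automated
    # moderation both misses bad content and hits good content.
    BLOCKLIST = {"miracle cure", "stolen election"}

    def flag(post: str) -> bool:
        text = post.lower()
        return any(term in text for term in BLOCKLIST)

    # False negative: misinformation phrased to dodge the keywords sails through.
    flag("Doctors HATE this one weird trick Big Pharma won't tell you")  # False

    # False positive: a post debunking the claim gets flagged for quoting it.
    flag("There is no miracle cure; please just get vaccinated")         # True

The failure modes Haugen describes are the same in kind: malicious content phrased evasively slips through, while innocuous posts that merely mention a flagged idea get caught.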

Better AI and a much larger team of human moderators could solve this problem. It would take an army to police Facebook’s 1.8 billion daily active users, but the company can afford it: Facebook’s profit in 2020 was $32 billion.
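
As a rough back-of-envelope check, every figure below other than the user count and the profit number is an assumption I’m inventing for the sake of argument:

    # Back-of-envelope moderation cost. The staffing ratio and salary are
    # hypothetical assumptions, not real figures.
    daily_active_users = 1_800_000_000  # figure cited above
    users_per_moderator = 20_000        # assumption: one moderator per 20,000 users
    cost_per_moderator = 60_000         # assumption: annual fully loaded cost, USD

    moderators = daily_active_users // users_per_moderator  # 90,000 moderators
    annual_cost = moderators * cost_per_moderator           # $5.4 billion per year
    share_of_profit = annual_cost / 32_000_000_000          # about 17% of 2020 profit

Even with those generous staffing assumptions, the bill comes to roughly a sixth of a single year’s profit. Expensive, but hardly ruinous.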

For years, Facebook’s Mark Zuckerberg has said that the company welcomes federal regulation of the social media industry. In response to Haugen’s testimony this week, the company put out another statement that argued that Facebook was being mischaracterized but nevertheless called for regulation.

“We agree on one thing; it’s time to create standard rules for the internet,” Facebook’s statement said. “Instead of expecting the industry to make societal decisions that belong to legislators, it’s time for Congress to act.”

The congressional Facebook hearings did show two things. One is that Facebook needs to be held accountable for its actions and decision-making. The other is that we cannot trust Mark Zuckerberg and Facebook to hold themselves accountable.

Now, if you’ll excuse me, I’m off to schedule The Racket’s social media posts for the day.

From the Racket
