Las Vegas Review-Journal

Brazil may just have shown what not to do in tackling social media lies

- Eric Foster is a columnist for cleveland.com.

In March 2018, three Massachusetts Institute of Technology scholars released the results of their research on the spread of misleading information on Twitter (now X). They found that false news stories were 70% more likely to be retweeted than true stories. Meanwhile, true stories took about six times as long as false ones to reach 1,500 people.

The scholars had a theory on why lies traveled farther and faster than the truth on Twitter: People like new things. False news is more “novel” than the truth. Said another way, it stands out. People gain attention by being the first to share previously unknown (novel) information. And people who share novel information are seen as being in the know.

The results triggered a conversation about ways to limit the spread of misinformation online. One scholar offered that, because humans — not bots — spread false news more quickly, “behavioral interventions” were more important than technological solutions. Another scholar noted that because some people spread false news deliberately and others do so unwittingly, the problem has “two parts” instead of one. The third scholar said simply, “Think before you retweet.”

Late last month, in a decision upheld by a panel of the full court, Brazil Supreme Court Justice Alexandre de Moraes blocked X across the country. He did so because X’s owner, Elon Musk, ignored court orders to remove certain user accounts. X also subsequently closed its office in Brazil.

In 2022, Moraes was given the authority to order social networks to take down content that he believed threatened democracy. At that time, Brazil was struggling with a torrent of online misinformation ahead of the country’s presidential election.

No doubt, Brazilian officials were concerned that their country would experience what Americans had witnessed on Jan. 6, 2021: political violence fueled by social media.

As you might expect, the response to Moraes’ decision has been largely negative. His earlier efforts were credited with stifling far-right attempts to overturn the results of the 2022 presidential election, leading some to wonder whether Brazil had found one possible solution to the online misinformation problem. Now, many wonder whether his current order goes too far. The justice’s subsequent modifications to his order suggest that even he may think so.

A democracy is defined as a system of government in which power is vested in the people and exercised by them directly or through freely elected representatives. The underlying premise of a democracy is that each of its members, the people, has a voice. And further, each member’s voice is equal.

As the power of a democracy is vested in the people, it follows that a pursuit of power within a democracy is essentially a pursuit of the people. And by “the people,” I mean their voices. In a democracy, he who controls the voices of the people controls the democracy.

Social media usage is widespread in our American democracy. At this point, if you do not use social media, you are very much in the minority. Eighty-three percent of American adults use YouTube, according to a Pew Research Center survey last year. Sixty-eight percent use Facebook. Forty-seven percent use Instagram.

If I wanted to get the attention of a lot of people in this kind of democracy where such a large percentage of the people are online, what would be the easiest way to do it? The answer is clear: spread false news.

Again, the research supports this conclusion. False news travels faster than true information online. It also travels farther. From a practical perspective (morality aside), it makes the most sense to spread false news if I want to influence the greatest number of Americans in the shortest period of time.

Plus, people will see you as an authority figure because you seem to know things that the general public doesn’t. The information “they” don’t want the people to know. With authority comes control. On social media, they call it influence.

There’s an economic incentive to spreading false news, as well. Social platforms pay content creators who have large followings. Advertisers pay these content creators, as well, to get products mentioned. Content creation is its own version of entrepreneurship in today’s age.

Given the practical incentive to spread false news online, as well as the economic incentive, what do we do about it?

We know the harm that can result from the spread of misinformation on social media. The political violence we saw on Jan. 6, 2021, is one kind of harm.

But we should be more concerned about the less violent but more impactful harm that widespread misinformation causes: distrust. The distrust of everything. Institutions. News organizations. Intelligence. Each other.

Some of this is human nature. Truthfully, it is not shocking to hear that a lie travels faster than the truth online. Well before social media, a popular saying had it that “a lie can travel halfway around the world while the truth is putting on its shoes.”

Some of this is the design of social media. Platforms are designed to encourage engagement, to create community. The promise of social media is that there is somebody out there for anybody. No matter how different you feel you are, where you are, there is someone like you on the world wide web who feels the same. Platform algorithms are designed to help you find your tribe — even if your tribe coalesces around lies.

Still, human nature is not a reason to do nothing. Arguably, violence and murder are also human nature. Yet, society puts rules in place to prevent or punish the darkest parts of our nature.

The design of social media is similarly not a reason to do nothing. In every other area of commerce, the design of products is regulated, either by rule or by litigation. A well-intentioned design can still be incredibly harmful to the public. In that instance, the designer must reimagine the product to achieve the original goal with minimal public harm.

Brazil tried something to combat online misinformation, and it is still trying. At one point, its approach seemed like the solution. Now, some argue that it is part of the problem.

But that’s what happens when you try something. It might work. It might not. But either way, you learn something.

Here in America, we’re not doing anything about online misinformation. We will censor books, but not tweets. We are comfortable censoring stories but not lies.

Now, I understand the First Amendment. I understand that we all have freedom of speech. But that freedom is not limitless. No freedom can be. Not if our goal is to live in a society.

Living together and, ideally, prospering together, requires compromise. Even when — perhaps especially when — our most sincerely held beliefs are involved. Compromise is an acknowledgment of each other’s humanity. My exercise of freedom must necessarily end (or at least be scrutinized) when it harms others. That is the price of community.

The spread of false news harms us all, whether we admit it or not. As such, we need to talk about what to do about it. And under no circumstances can the answer be “nothing.”

Perhaps Brazil hasn’t gotten it right. But I believe America can. That is, if we really want to.

JENNY KANE / ASSOCIATED PRESS FILE (2019): Facebook reportedly failed to detect election-related misinformation in ads ahead of Brazil’s 2022 election.
