Is A New Law Forcing Facebook To Completely Change?
You may soon be visiting a completely different Facebook because of one new law.
As countries around the world struggle with issues related to COVID-19, research shows how misinformation has influenced attitudes and actions regarding protective measures like mask-wearing. The research indicates it’s an ongoing problem, especially on Facebook. Misinformation (also known as fake news) refers to any claims or depictions that are false or misleading. Psychological studies of both misinformation and disinformation (a subset of misinformation intended to mislead) are helping to expose the harmful impact of fake news and offering potential remedies. However, scholars who study fake news warn that fighting it will be an uphill battle, one that will ultimately require a global collaborative effort from researchers, governments, and social media companies. Still, a new bill could pave the way for Facebook to change completely.
People use five criteria to determine whether information is true: compatibility with other known information, the credibility of the source, whether others believe it, whether the information is internally consistent, and whether supporting evidence exists. The research also shows that the easier misinformation is to hear or read, the more likely people are to believe it. Easily read misinformation is particularly rampant on Facebook.
On Wednesday, Congress unveiled a new bill that could change how we spread and receive information. The bill is a first step toward addressing the algorithmic amplification of toxic content. It aims to spur research into “content-neutral” techniques for adding friction to online content sharing, directing academics to identify strategies for restricting the spread of harmful content and misinformation, such as requiring users to read an article before sharing it (as Twitter has done). The proposals would then be codified by the Federal Trade Commission, which would require social media platforms like Facebook and Twitter to implement them.
Since the 2016 presidential election in the United States, when misinformation spread like wildfire on Facebook and other social media platforms, psychological research on the subject has exploded. Studies of motivated reasoning by psychologist Peter Ditto, PhD, of the University of California, Irvine, show that people apply skepticism selectively.
Democrats have pursued strategies to combat misinformation online for years, while Republicans have decried these attempts as a danger to free speech. Still, members of both parties have recently been working together to find ways to regulate algorithms, addressing both harms to children and misinformation, inspired by testimony from Facebook whistleblower Frances Haugen in 2021. Sen. Cynthia Lummis’ support for the bill is a big step in the right direction.
Lummis strongly favors the NUDGE Act, calling it a good start toward comprehensively tackling Big Tech overreach. By allowing the National Science Foundation (NSF) and the National Academies of Sciences, Engineering, and Medicine (NASEM) to research the addictiveness of social media platforms like Facebook, she argues, we will be able to fully understand the influence the platforms’ designs and algorithms have on society and to construct guardrails that protect against the spread of misinformation.
The biggest roadblock for politicians trying to address harmful algorithmic amplification has been the fight over removing Section 230 liability protections. Public Knowledge and other tech and public interest groups have already endorsed the Klobuchar bill, arguing that its lack of Section 230 modifications makes it one of the better models for algorithm regulation. Public Knowledge supports the bill because it promotes informed decision-making to address the well-known problem of spreading misinformation. Most critically, the bill accomplishes all of this without touching Section 230 immunity. However, it remains to be seen if and when Facebook will be required to adhere to it.