
In times of crisis, states have only a few tools to combat misinformation


When deadly wildfires raged in Los Angeles this month, local officials had to contend with a host of lies and falsehoods that quickly spread online.

From AI-generated images of the famous Hollywood sign surrounded by flames to unfounded rumors that firefighters were using women's handbags to carry water, misinformation was widespread. While officials battled fire and falsehood in Southern California, Meta — the parent company of Facebook and Instagram — announced it would end its fact-checking program in the name of free expression.

This left some wondering what, if anything, state governments can do about the spread of harmful lies and rumors that multiply on social media. Emergency responders are now experiencing what election officials have had to fight in recent years, as false claims of election fraud — fueled by President Donald Trump's refusal to acknowledge his 2020 loss — have proliferated.

A California law passed last year requires online platforms to remove posts containing deceptive, AI-generated content related to the state's elections within 72 hours of a user's complaint.

The measure allows California politicians and election officials harmed by such content to sue social media companies to force compliance. Federal statute, by contrast, broadly shields social media companies from lawsuits and protects them from liability for the content they host.

“Meta's recent announcement that it would follow X's model of deferring to a community forum instead of vetting experts shows exactly why the law was needed and why voluntary commitments are not sufficient,” Democratic state Assemblymember Marc Berman, the law's author, told Stateline in an email.

X, the company formerly known as Twitter, sued California in November over the measure, comparing the law to state-sponsored censorship.

“Rather than allow covered platforms to make their own decisions about the content at issue here, it authorizes the government to substitute its judgment for that of the platforms,” the company wrote in the lawsuit.

The law plainly violates the First Amendment, according to the lawsuit. Further hearings in the case will likely come this summer. Berman said he is confident the law will prevail in the courts, since it is narrowly tailored to protecting the integrity of elections.

California's measure was the first of its kind in the nation. Depending on how it fares in the courts, it could inspire laws in other states, Berman said.

Few state laws

The algorithm-fueled spread of misinformation about the Los Angeles fires shows how social media companies are unwilling or unable to police this “crisis moment,” said Jonathan Mehta Stein, executive director of California Common Cause, a pro-democracy advocacy organization. States have to do more, he said.

“You don't get information from fire agencies or from local authorities unless the social media companies make sure that you do,” he said in an interview. “And unfortunately, the social media companies are not only failing to do that, they are actively making it difficult for government to do anything about online misinformation.”

The two terms are sometimes used interchangeably, but “misinformation” refers to false or misleading information, while “disinformation” refers to falsehoods deliberately spread by people who know the information is wrong.

California Common Cause and its California Initiative for Technology and Democracy project helped craft Berman's legislation and are working to promote similar state legislation across the country.

Misinformation laws in other states have been far more limited. In Colorado, for example, Democratic lawmakers last year passed legislation requiring the attorney general to develop statewide resource and education initiatives to curb the spread of online misinformation. But it does not target social media companies.

In July, the U.S. Supreme Court sent back to lower courts cases over laws in Florida and Texas that would have prevented social media companies from banning or restricting posts from politicians. Social media companies argued that those laws violated their First Amendment protections.

The laws were a reaction to what Republican state lawmakers saw as anti-conservative bias at social media companies, especially after Trump was banned from Twitter and Facebook following the Jan. 6, 2021, attack on the U.S. Capitol. The justices unanimously agreed that the legal questions must be examined further before the cases can proceed.

No state law to combat falsehoods has come close to the European Union's model, which legally compels social media companies to curb misinformation and falsehoods on their platforms.

Nor should one, said Ari Cohn, a senior attorney for technology policy at the Philadelphia-based Foundation for Individual Rights and Expression, an advocacy group for free speech on college campuses.

“If the government forces platforms to remove information on the basis of falsity, that is a textbook violation of the First Amendment,” he said. “It is just a terrible idea to give the government the power to determine what the truth is.”

Cohn's organization, also known as FIRE, has praised Meta's recent shift to community-based fact-checking, saying it removes the personal biases of the company's fact-checkers and promotes a more democratic approach to correcting false information. X expanded its use of the community notes model in 2022.

However, critics have argued that Meta's decision will worsen the spread of misinformation and hate speech, and that the move was a blatant political ploy to curry favor with Trump.

Limited tools to combat misinformation

Without legal avenues to reduce misinformation, officials have had to confront falsehoods directly, with some even launching websites dedicated to combating and correcting online rumors. Experts call the practice “debunking.”

California Gov. Gavin Newsom launched the California Fire Facts website, which debunks “lies” about the state's response to the wildfires alongside accurate information. Among the claims on social media: Democratic leaders started the fires to hide “pedophile tunnels”; one fire was caused by a satanic ritual; and the state turned away out-of-state fire crews offering to help fight the blazes. All of these, according to the website, are false. And the Federal Emergency Management Agency updated a page it had previously used during hurricanes to address rumors.

People, even more than before, have to be their own gatekeepers, fact-checkers and editors when it comes to their information diet.

– Peter Adams, Senior Vice President for Research and Design at the News Literacy Project

X's model of crowdsourced fact-checking, in which users submit notes flagging misleading or incorrect posts, has been in full swing as officials counter falsehoods online.

For example, when conservative provocateur Dinesh D'Souza posted on X to his 4.7 million followers claiming that fire trucks from Oregon were forced to stop in Sacramento for emissions tests, users added a note.

“This information is false and misleading,” the note reads, linking to an official Oregon state fire agency account.

However, the community notes model is not sufficient, said Imran Ahmed, founder and CEO of the Center for Countering Digital Hate, a nonprofit that advocates for civil liberties online.

The center published a study in October showing that 74% of community notes correcting false claims about the U.S. election were never shown to users. Moreover, posts with false information had 13 times as many views as the community notes themselves.

“It relies on the goodwill of users,” Ahmed said in an interview. “And of course, the behavior of some of the people who run these platforms shows that quite often there is no goodwill, that these are not environments in which people are trying to establish the truth.”

Natural disasters like the Los Angeles wildfires act as flashpoints for misinformation, and programs like community notes are not effective enough to combat the falsehoods, said Peter Adams, senior vice president of research and design at the News Literacy Project, which develops news literacy curricula adopted by states.

The organization operates a digital rumor-tracking tool that alerts subscribers when widespread misinformation is circulating on a specific topic. The group published an alert on Jan. 9, two days after the fires began.

“People, even more than before, have to be their own gatekeepers, fact-checkers and editors when it comes to their information diet,” Adams said.

Berman, the California Democratic assemblymember, agrees. Misinformation about natural disasters can have dangerous real-world consequences and can mislead people in their most vulnerable moments, he said.

In 2023, he successfully sponsored legislation to add media literacy to the K-12 curriculum, joining several states that have enacted similar laws in recent years.

Recently, Berman said, a close friend who lives in the Pacific Palisades community told him about false information spreading in his social circles. Berman encouraged the friend to always ask: “Who is making the post?” and “What are their motives behind what they post?”

“He joked that he could use a media literacy training session,” Berman recalled in an email to Stateline. “But the truth is that we all could.”
