
Meta is leaving its users to wade through hate and disinformation


Experts warn that Meta’s decision to terminate its third-party fact-checking program could allow misinformation and hate to fester online and permeate the real world.

The company announced today that it will gradually phase out a program launched in 2016, under which it has partnered with independent fact-checkers around the world to identify and review misinformation on its social media platforms. Meta will replace the program with a crowdsourced approach to content moderation similar to X’s Community Notes.

Meta is essentially shifting the responsibility to users to weed out lies on Facebook, Instagram, Threads, and WhatsApp, raising fears that it will become easier to spread misleading information about climate change, clean energy, public health risks, and communities often targeted with violence.

“It’s going to hurt Meta users first”

“It’s going to hurt Meta users first because the program has worked well to reduce the virality of hoax content and conspiracy theories,” says Angie Drobnic Holan, director of the International Fact-Checking Network (IFCN) at Poynter.

“A lot of people think Community Notes-style moderation doesn’t work at all and it’s just window dressing so platforms can say they’re doing something … most people don’t want to have to wade through a bunch of misinformation on social media, fact-checking everything for themselves,” Holan adds. “The losers here are the people who want to be able to go on social media and not be overwhelmed by false information.”

In a video, Meta CEO Mark Zuckerberg said the decision was a matter of promoting freedom of expression, while also calling fact-checkers “too political.” Meta also said the program was overly sensitive, estimating that 1 to 2 out of every 10 pieces of content it flagged in December were errors that did not actually violate the company’s policies.

Holan says the video was “incredibly unfair” to fact-checkers who have worked with Meta as partners for nearly a decade. Meta specifically worked with IFCN-certified fact-checkers, who had to follow the network’s Code of Principles as well as Meta’s own policies. Fact-checkers reviewed content and assessed its accuracy, but it was Meta, not the fact-checkers, that made the call on whether to remove content or limit its reach.

Poynter owns PolitiFact, which is one of the fact-checking partners Meta works with in the United States. Holan was the editor-in-chief of PolitiFact before moving into her role at the IFCN. What makes the fact-checking program effective, she says, is that it serves as a “speed bump in the way of false information.” Content that is flagged typically has a screen placed over it to let users know that fact-checkers found the claim questionable, and to ask whether they still want to see it.

That process covers a wide range of topics, from false information about celebrities dying to claims of miracle cures, Holan notes. Meta launched the program in 2016 amid growing public concern about social media’s potential to amplify unverified rumors online, such as fake stories about the pope supporting Donald Trump for president that year.

Meta’s decision looks more like an effort to curry favor with President-elect Trump. In his video, Zuckerberg described the recent election as “a cultural tipping point” toward free speech. The company recently named Republican lobbyist Joel Kaplan as its new chief global affairs officer and added UFC CEO and President Dana White, a close friend of Trump, to its board. Trump also said today that the changes at Meta were “probably” in response to his threats.

“Zuck’s announcement is a complete knee-jerk reaction to Trump and an attempt to catch up to [Elon] Musk in his race to the bottom. The implications will be widespread,” Nina Jankowicz, CEO of the nonprofit American Sunlight Project and an adjunct professor at Syracuse University who researches disinformation, said in a post on Bluesky.

Twitter launched its community moderation program, called Birdwatch at the time, in 2021, before Musk took over. Musk, who helped fund Trump’s campaign and is now poised to lead the incoming administration’s “Department of Government Efficiency,” leaned on Community Notes after gutting the teams responsible for content moderation at Twitter. Hate speech, including slurs against Black and transgender people, increased on the platform after Musk bought the company, according to research by the Center for Countering Digital Hate. (Musk then sued the center, but a federal judge dismissed the case last year.)

Advocates are now concerned that harmful content could spread unhindered on Meta’s platforms. “Meta is now saying it’s up to you to spot the lies on its platforms, and that it’s not their problem if you can’t tell the difference, even if those lies, hate, or scams end up hurting you,” Imran Ahmed, founder and CEO of the Center for Countering Digital Hate, said in an email. Ahmed describes the move as a “big step backwards for online safety, transparency, and accountability” and says it “could have dire offline consequences in the form of real-world harm.”

“By abandoning fact-checking, Meta opens the door to hateful, unchecked disinformation about already targeted communities like Black, brown, immigrant, and trans people, which all too often leads to offline violence,” said Nicole Sugerman, campaign director at the nonprofit Kairos, which works to counter race- and gender-based hate online, in an emailed statement to The Verge today.

Meta’s announcement today specifically says it will “get rid of a number of restrictions on topics such as immigration, gender identity and gender that are the subject of frequent political discourse and debate.”

Scientists and environmental groups are also paying attention to the changes at Meta. “Mark Zuckerberg’s decision to abandon efforts to fact-check and correct misinformation and disinformation means that anti-science content will continue to proliferate on Meta platforms,” said Kate Cell, senior climate campaign manager at the Union of Concerned Scientists, in an emailed statement.

“I think this is a terrible decision … the effects of disinformation on our politics are becoming more and more apparent,” says Michael Khoo, director of the climate disinformation program at Friends of the Earth. He points to attacks on wind power that hurt renewable energy projects as an example.

Khoo also likens the Community Notes approach to the fossil fuel industry’s marketing of recycling as a solution to plastic waste. In reality, recycling has done little to stem the tide of plastic pollution flooding the environment, since the material is difficult to reprocess and many plastic products are not really recyclable. The strategy also puts the onus on consumers to deal with a company’s waste. “[Tech] companies need to own the misinformation problem that their own algorithms are creating,” Khoo tells The Verge.


