

Facebook owner Meta is quietly scaling back some of the safeguards designed to thwart voting misinformation and foreign interference in US elections as the November midterms approach.

It’s a sharp departure from the social media giant’s multibillion-dollar efforts to improve the accuracy of posts about US elections and regain the trust of lawmakers and the public after their outrage over learning that the company had exploited people’s data and allowed imposters to flood its site during the 2016 campaign.

The pivot raises alarm about Meta’s priorities and about how some could exploit the world’s most popular social media platforms to spread misleading claims, launch fake accounts and rile up partisan extremists.

“They’re not talking about it,” said former Facebook policy director Katie Harbath, now CEO of technology and policy firm Anchor Change. “Best-case scenario: They’re still doing a lot behind the scenes. Worst-case scenario: They pull back, and we don’t know how that will play out for the midterms on the platforms.”

Since last year, Meta has shut down an examination of how falsehoods are amplified in political ads on Facebook by indefinitely banning the researchers behind it from the site.

CrowdTangle, the online tool the company offered hundreds of newsrooms and researchers so they could identify trending posts and misinformation across Facebook or Instagram, is now inoperable on some days.

Public communication about the company’s response to election disinformation has been muted. Between 2018 and 2020, the company released more than 30 statements detailing how it would crack down on US election misinformation, prevent foreign adversaries from running ads or posts around the vote, and subdue divisive hate speech.

Top executives hosted question-and-answer sessions with reporters about the new policies. CEO Mark Zuckerberg wrote Facebook posts promising to take down false voting information and authored opinion pieces calling for more regulation to address foreign interference in US elections via social media.

But this year Meta has released only a one-page document outlining its plans for the fall election, even as potential threats to the vote remain clear. Several Republican candidates are promoting false claims about the US election on social media. In addition, Russia and China continue to wage aggressive social media propaganda campaigns aimed at furthering political divisions among the American public.

Meta says elections remain a priority and policies developed in recent years around election disinformation or foreign interference are now hard-wired into the company’s operations.

“With each election, we incorporate what we’ve learned into new processes and have created channels to share information with government and our industry partners,” said Meta spokesman Tom Reynolds.

He declined to say how many employees would be on the job protecting American elections full-time this year.

During the 2018 election cycle, the company offered tours and photos and produced reports on its election “war room.” This year, however, the New York Times reported that the number of Meta employees working on the election full time has dropped from 300 to 60, a figure Meta disputes.

Reynolds said Meta will pull in hundreds of employees who work across 40 of the company’s other teams to monitor the upcoming vote alongside its election team, which has an unspecified number of staff.

The company is continuing several initiatives it developed to curb election misinformation, such as a fact-checking program launched in 2016 that enlists the help of news outlets to vet popular falsehoods spreading on Facebook or Instagram. The Associated Press is part of Meta’s fact-checking program.

This month, Meta also introduced a new feature for political ads that allows the public to search for details on how advertisers target people based on their interests on Facebook and Instagram.

However, Meta has stifled other efforts to identify election disinformation on its sites.

It has stopped making improvements to CrowdTangle, a website it offered to newsrooms around the world that provides insights into trending social media posts. Journalists, fact-checkers and researchers used the tool to analyze Facebook content, including tracking popular misinformation and who is responsible for it.

That tool is now “dying,” former CrowdTangle CEO Brandon Silverman, who left Meta last year, told the Senate Judiciary Committee in the spring.

Silverman told the AP that CrowdTangle had been working on upgrades that would, for example, make it easier to search the text of internet memes, which can often be used to spread half-truths and escape the oversight of fact-checkers.

“There’s no real shortage of ways you can organize this data to make it useful to many different segments of the fact-checking community, newsrooms and the broader civil society,” Silverman said.

Not everyone at Meta agreed with this transparent approach, Silverman said. The company hasn’t released new updates or features to CrowdTangle in more than a year, and in recent months has experienced hours-long outages.

Meta also shut down efforts to investigate how misinformation travels through political ads.

The company indefinitely revoked the Facebook access of a pair of New York University researchers who it said had collected unauthorized data from the platform. The move came hours after NYU professor Laura Edelson said she shared plans with the company to investigate the spread of misinformation on the platform around the January 6, 2021, attack on the US Capitol, which is now the subject of a House investigation.

“What we found, when we looked closely, is that their systems were probably dangerous to many of their users,” Edelson said.

Privately, former and current Meta employees say that exposing these risks around US elections has created public and political blowback for the company.

Republicans routinely accuse Facebook of unfairly censoring conservatives, some of whom have been kicked off the platform for breaking the company’s rules. Democrats, meanwhile, regularly complain that the tech company hasn’t gone far enough to curb misinformation.

“It’s something that’s so fraught with politics, they’re more likely to avoid it than jump in,” said Harbath, Facebook’s former director of policy. “They just see it as a big old pile of headaches.”

Meanwhile, the possibility of US regulation no longer looms over the company, with lawmakers failing to reach any consensus on the oversight the multibillion-dollar company should be subject to.

Without that threat, Meta’s leaders have devoted the company’s time, money and resources to a new project in recent months.

Zuckerberg dived into this massive rebrand and reorganization of Facebook last October, when he changed the company’s name to Meta Platforms Inc. He plans to spend years and billions of dollars evolving its social media platforms into a nascent virtual reality construct called the “metaverse,” sort of like the internet brought to life, rendered in 3D.

His public Facebook posts now focus on product announcements, hailing artificial intelligence and photos of him enjoying life. News about election readiness is announced in company blog posts not written by him.

In one of Zuckerberg’s posts last October, after a former Facebook employee leaked internal documents showing how the platform magnifies hate and misinformation, he defended the company. He also reminded his followers that he had pushed Congress to modernize election regulations for the digital age.

“I know it’s frustrating to see the good work we do being maligned, especially for those of you who make significant contributions to security, integrity, research and product,” he wrote on Oct. 5. “But I think over time if we continue to try to do what’s right and provide experiences that improve people’s lives, it’s going to be better for our community and better for our business.”

It was the last time he discussed the Menlo Park, California-based company’s campaign work in a public Facebook post.

___

Associated Press technology writer Barbara Ortutay contributed to this report.

___

Follow AP’s misinformation coverage at https://apnews.com/hub/misinformation.
