
X Defends Its Enforcement Actions Around the Israel-Hamas War, Amid Concerns Over Its New Approach

Yesterday, TikTok came out with a detailed defense of its efforts to police misinformation around the Israel-Hamas war, amid accusations of bias and trend interference. And today, X has addressed the same, after a new report suggested that the Elon Musk-owned platform, under its updated approach to moderation, is failing to remove the vast majority of reported posts that include antisemitism, Islamophobia, and anti-Palestinian hate.

As per X:

From the onset of the conflict, we activated our crisis protocol and stood up the company to address the rapidly evolving situation with the highest level of urgency. That includes the formation of a cross-functional leadership team that has been working around the clock to ensure our global community has access to real-time information and to safeguard the platform for our users and partners.

X says that it’s undertaken significant countermeasures to protect the public conversation around the conflict, which have resulted in:

  • The removal of over 3,000 accounts run by violent entities in the region, including Hamas
  • Direct action taken on over 325,000 pieces of content that violate its Terms of Service, including removals in the worst cases
  • Warnings and suspensions being sent to over 375,000 accounts as a result of “proactive investigations to protect authentic conversation regarding the conflict”

X has also continued to evolve its Community Notes feature, which it hopes will become a key driver of community-led moderation. The idea is that Community Notes will not only enable users to dictate what is and is not acceptable in the app (as opposed to such decisions being made by management), but will also reduce the moderation burden on X’s own staff and systems, saving labor costs.

Which seems like a reasonable collective response. But when you compare it to TikTok’s reported actions, it does suggest that there may be some room for improvement.

TikTok says that in October alone, it removed more than 925,000 videos in the conflict region due to violations of its policies around violence, hate speech, misinformation, and terrorism, while it also removed 730,000 videos across the platform for breaking its rules on hateful behavior.

So 1,655,000 removals in October, versus X’s “action taken” on 325,000 posts overall.

X, of course, has far fewer users than TikTok (244 million versus TikTok’s 1 billion), which is another element to factor in. But even with that in mind, X is still actioning a lot less content, which suggests that either X is seeing less discussion around the conflict overall, that it’s actioning less in line with its more “free speech”-aligned moderation approach, or that it’s simply not being as proactive as other apps.
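As a rough, back-of-envelope illustration of how audience size factors in, here’s a quick sketch (in Python) that normalizes the figures above by user count. Note that the two metrics aren’t strictly equivalent, since TikTok’s numbers count removals while X’s count broader “actions”, so the comparison is indicative only:

```python
# Back-of-envelope comparison of October enforcement rates,
# using the figures reported above. TikTok's numbers count
# removals while X's count broader "actions", so the two
# rates are indicative rather than directly comparable.

tiktok_removals = 925_000 + 730_000   # conflict-region + platform-wide removals
tiktok_users = 1_000_000_000          # ~1 billion users

x_actions = 325_000                   # posts actioned by X
x_users = 244_000_000                 # ~244 million users

tiktok_rate = tiktok_removals / tiktok_users * 1_000   # per 1,000 users
x_rate = x_actions / x_users * 1_000                   # per 1,000 users

print(f"TikTok: {tiktok_rate:.3f} removals per 1,000 users")  # ~1.655
print(f"X:      {x_rate:.3f} actions per 1,000 users")        # ~1.332
```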

Which, as noted, is the suggestion of a new report published by the Center for Countering Digital Hate (CCDH), an organization that Musk is actually in the process of suing over its past criticism of his app.

As per The Daily Beast:

Researchers at the CCDH used X’s internal reporting system to flag 200 posts written since Hamas’ attack on Israel on Oct. 7. They identified posts containing antisemitism, Islamophobia, and anti-Palestinian hate, which were chosen as “a means of testing X’s moderation systems,” according to the report. A week later, they say, 98 percent of the posts were still up.

It’s an interesting example because, now that Musk has seeded doubt among his supporter base around the CCDH’s past findings, he’ll no doubt use this as a further example of the organization’s bias against him and X, which will invalidate the findings in his supporters’ eyes.

And 200 posts is a relatively small sample, especially when you consider the above numbers on total actions taken. But the findings do also seem to align with the discrepancy in actions taken between X and TikTok outlined above.

In any event, at present, it doesn’t seem that X’s increased reliance on Community Notes is producing the results that it would hope for in addressing these key areas of concern. Other third-party reports have found the same: that Community Notes, while an interesting and potentially valuable concept, is simply not able to provide the level of enforcement capacity that X is now asking of it under this new approach.

Maybe that, however, is the point. Maybe X will argue that other approaches are too restrictive, and that this is why fewer posts are being removed under its system.

But that’s unlikely to sit well with advertisers, or with the regional regulatory groups that are watching, and monitoring X’s approach.

Which could result in more questions being raised about Musk’s drive to allow more open discussion in the app.


