X Corp., owner of the social media platform X (formerly known as Twitter), has cut its worldwide trust and safety team by 30% and its safety engineering staff by 80% since billionaire Elon Musk took control in 2022, according to an Australian internet safety watchdog.

A summary of X's responses to questions about how it enforces its rules against hate speech was released by Australia's eSafety Commission, which describes itself as the world's first government agency dedicated to keeping people safer online.

According to the commission, X had previously given estimates of the workforce reduction, but these responses were the first detailed, publicly released data on which departments had been cut.

(Photo: ALAIN JOCARD / AFP via Getty Images) A combination of pictures created on October 10, 2023, showing SpaceX, Twitter and Tesla CEO Elon Musk at the Vivatech technology fair in Paris on June 16, 2023 (left), and the new Twitter logo rebranded as X, pictured on a screen in Paris on July 24, 2023 (right).

Decrease Within the Trust and Safety Team

According to a report by ABC News, X's worldwide trust and safety workforce, counting both staff and contractors, shrank from 4,062 to 2,849 between October 28, 2022, the day before Musk took control of the San Francisco-based Twitter, and May 31, 2023, the end of the reporting period. That amounts to a 30% decline worldwide and a 45% decline in the Asia-Pacific region.

The number of X engineers focused primarily on trust and safety fell by 80%, from 279 to 55. Full-time content moderators dropped from 107 to 51, a 52% reduction, while contract content moderators fell from 2,613 to 2,305, a decrease of 12%.

Meanwhile, X disclosed that it had reinstated 6,100 previously banned accounts, including 194 that had been suspended for hate speech; the commission said those accounts were identified as Australian. Tech publication Platformer reported in November 2022 that 62,000 suspended accounts had been restored, but X did not provide worldwide figures.

According to the commission, these accounts were reactivated without further investigation, even though they had previously violated X's regulations.

Under X's policy against hateful conduct, users are prohibited from directly attacking other people on the basis of race, ethnicity, national origin, caste, sexual orientation, gender, gender identity, religious affiliation, age, disability, or serious illness.

See Also: Meta Announces New Policies to Put Teens Into Restrictive Facebook, Instagram Control Settings, Reduce Self-Harm Content Exposure

What Could Happen?

Since Musk took control, X has also been slower to respond to user reports about offensive material.

eSafety Commissioner Julie Inman Grant warned that the decline in safety personnel and the reinstatement of banned accounts would lead to an increasingly toxic and unsafe social media network. She added that X's brand image and advertising income would be jeopardized if the company did not raise user safety standards, even though it could not be forced to do so.

Advertisers prefer to place their ads on platforms they see as pleasant, safe, and free of harmful content, and users who find a platform dangerous or toxic will also act by leaving, Inman Grant said.

See Also: AI Making Online Dating, Social Media More Vulnerable to Fraud: EU Police