X fined over $380,000 for failing Australian child abuse content moderation questions

Stellar Snippets

Australia's eSafety Commissioner fined X over $380,000 for not revealing how it moderates child sexual abuse content on its platform.

Australia's 2021 Online Safety Act requires online service providers to explain how they regulate child sexual abuse (CSA) content on their platforms.

In February, the eSafety office sent legal notices to five platforms: Google, TikTok, Twitch, Discord, and X.

The notices posed specific questions about how the companies tackle child abuse content. X reportedly left some sections "entirely blank" and has now been fined.

All five platforms, the commission said, had "serious shortfalls." Google received a "formal warning" for giving "generic responses."

X was fined because its failure to comply was more severe; for example, it did not answer questions about how quickly it can detect and respond to CSA content.

According to CNBC, just 25 of X's 1,600 employees held titles related to "Trust and Safety" as of December 2022.

X must respond to the questions within 28 days. Though modest, the fine marks an important step in holding these companies accountable for their safety protocols.
