
T4K3.news

TikTok UK moderators at risk as AI shift continues

Hundreds of trust and safety roles in the UK and parts of Asia could be moved or cut as TikTok increases automation in moderation amid new online safety rules.

August 22, 2025 at 03:17 PM
Hundreds of TikTok UK moderator jobs at risk despite new online safety rules

The company shifts moderation work toward automation while new UK safety rules tighten penalties for breaches.

TikTok has told staff that hundreds of trust and safety roles in the UK and parts of Asia could be affected as part of a global reorganisation. The plan shifts moderation work to other European offices and third‑party providers, with a portion of the jobs staying in the UK.

The company says the move is part of a broader shift toward automated moderation, with automation now responsible for removing more than 85% of content that violates its guidelines. The timing coincides with new UK online safety rules that require age checks for potentially harmful content and allow penalties of up to £18 million or 10% of global turnover, whichever is greater, for breaches.

Past actions include cutting 300 moderator roles in the Netherlands and replacing about 500 moderators in Malaysia, while workers in Germany have held strikes over layoffs. TikTok also reports continued revenue growth: 2024 revenue reached $6.3 billion, up 38% year on year, with a narrowed operating loss.

A company spokesperson described the reorganisation as a way to strengthen the global operating model for trust and safety, concentrating operations in fewer locations to improve effectiveness and speed as the company applies new technology to this critical function.

Key Takeaways

✔️ TikTok reorganises trust and safety across regions, including the UK
✔️ Automation identifies most rule violations
✔️ UK online safety rules bring higher fines for breaches
✔️ Hundreds of moderation jobs in the UK could be affected
✔️ Past layoffs in the Netherlands and Malaysia signal a global trend
✔️ Union voices warn automation may undermine safety if not properly overseen
✔️ Revenue growth continues despite staffing changes
✔️ Public trust hinges on how moderation quality is maintained under automation

"TikTok workers have long been sounding the alarm over the real-world costs of cutting human moderation teams in favour of hastily developed, immature AI alternatives."

Comment from John Chadfield of the Communications Workers Union

"Continuing a reorganisation that we started last year to strengthen our global operating model for trust and safety, which includes concentrating our operations in fewer locations globally to ensure that we maximise effectiveness and speed as we evolve this critical function for the company with the benefit of technological advancements"

Statement from a TikTok spokesperson

The move highlights a broader tension between rapid automation and the need for reliable content safeguards. Regulators want faster, more consistent enforcement, but AI can misread context or miss edge cases, potentially increasing risk if human oversight is reduced. The political and budget implications are real: stricter rules come with stiff penalties, and investors will watch whether automation delivers safety without eroding jobs or public trust.

If TikTok manages to maintain high safety standards while cutting costs, other platforms may follow with similar shifts. If not, critics will push for stronger human oversight and more transparent moderation practices, especially in markets with strict new rules and active union activity.

Highlights

  • Safety needs hands, not just code
  • AI can speed things up, but context still matters
  • Trust is earned, not erased by automation
  • Humans guard the edges where machines stumble

AI moderation could affect safety and jobs

The shift to automated moderation while enforcing stricter UK safety rules raises concerns about consistency and safeguards. The changes touch on budget pressures, public reaction, and investor scrutiny, making this a sensitive and high-impact move.

As platforms balance speed, safety, and staffing, the real test will be whether users see a safer space without sacrificing human judgment.
