TikTok’s Chinese owner has been accused of “bare-faced union busting” after announcing hundreds of layoffs just a week before staff were due to vote on unionisation.
The social media platform’s Beijing-based owner ByteDance announced on Friday that hundreds of staff in its London office would be made redundant, with their roles being reallocated to other offices across Europe or outsourced to third-party providers.
It came just a week before a vote on a bid by employees in the firm’s content moderation arm to establish a branch of the Communication Workers Union (CWU) for its 2,500 staff.

The announcement of redundancies on Friday raised fears that ByteDance was seeking to pile pressure on staff ahead of the vote on unionisation.
But on Friday morning, TikTok wrote to the CWU announcing that it had pulled its support for the ballot altogether while the redundancy consultation process takes place.
In a letter to the CWU shortly after the redundancies were announced, a senior ByteDance employee said: “Given these exceptional circumstances, we have decided that it is necessary for us to suspend the planned voluntary ballot process with immediate effect.”
CWU national officer John Chadfield told The Independent: “The timing is deliberate… and it is deliberately cruel.
“It is bare-faced union busting, leaves the members who have organised facing massive uncertainty and, from what we can see, they are just going to be offshoring these jobs to a third party in Lisbon.”
The union has been working with ByteDance since last November to gain recognition for TikTok’s London-based content moderators, who Mr Chadfield said “have the most dangerous job on the internet”. “The stuff they have to see is literally the stuff of nightmares,” he added.

He also warned that while ByteDance has said it plans to adopt AI to take on some content moderation responsibilities, the technology is not yet ready and human moderators will remain essential.
Mr Chadfield said that, once the redundancy consultation process closes, the CWU’s bid to unionise will continue. “The unionisation of TikTok is inevitable. They might want to delay it in the most spiteful way possible, but it is inevitable.”

And, as the government clamps down on harmful content online, the CWU said ByteDance’s decision to lay off hundreds of content moderators would leave TikTok a more dangerous platform.
TikTok said it had engaged voluntarily with the union, despite having no requirement to do so.
Announcing the redundancies, ByteDance said it was “concentrating our operations in fewer locations globally”.
TikTok’s UK head office is in Farringdon in London, and it is set to open a new office in Barbican in the capital early next year.
A spokesman for the social media firm said: “We are continuing a reorganisation that we started last year to strengthen our global operating model for trust and safety, which includes concentrating our operations in fewer locations globally to ensure that we maximise effectiveness and speed as we evolve this critical function for the company with the benefit of technological advancements.”
TikTok has increasingly been harnessing AI to moderate content shared on the app.
More than 85 per cent of the content removed for violating its community guidelines is identified and taken down by automation, according to the platform.
It also says that AI can reduce the amount of distressing or graphic content that its moderation teams are exposed to.
The restructuring plans come shortly after the UK’s Online Safety Act, enforced by Ofcom, took effect last month.
This requires online platforms to protect UK viewers from illegal material, such as child sexual abuse and extreme pornography.
Platforms are also required to prevent children from accessing harmful and age-inappropriate content.
TikTok’s safety moderation teams are trained to spot signs that an account is being used by a child, and they can then suspend the account.
The platform can also use AI-based systems to identify signals, such as keywords and in-app reports from the user community, that might point to a potentially underage account.