By Michael Race
Business reporter, BBC News
TikTok has said it "immediately" took action to counter misinformation after the EU warned the platform following the attack by Hamas on Israel.
On Friday, the EU called on TikTok boss Shou Zi Chew to "urgently step up" efforts and "spell out" within 24 hours how the platform was complying with European law.
Social media firms have seen a surge of misinformation about the conflict, such as doctored images and mislabelled videos.
TikTok said it had removed "violative content and accounts".
"We immediately mobilised significant resources and personnel to help maintain the safety of our community and integrity of our platform," the company said in a statement on Sunday.
In a letter to the company on Friday, EU commissioner Thierry Breton warned TikTok needed to be mindful of its popularity with young people and "protect children and teenagers from violent content and terrorist propaganda as well as death challenges and potentially life-threatening content".
The bloc also handed X (formerly Twitter), YouTube, and Meta, the owner of Facebook and Instagram, similar warnings about misinformation, along with a 24-hour deadline.
TikTok, which is owned by Chinese firm ByteDance, listed on its website the actions it said it had taken to combat misinformation and hateful content.
It said it had created a command centre, enhanced its automated detection systems to remove graphic and violent content, and added more moderators who speak Arabic and Hebrew.
"We do not tolerate attempts to incite violence or spread hateful ideologies," TikTok said.
"We have a zero-tolerance policy for content praising violent and hateful organisations and individuals, and those organisations and individuals aren't allowed on our platform.
"TikTok stands against terrorism. We are shocked and appalled by the horrific acts of terror in Israel last week. We are also deeply saddened by the intensifying humanitarian crisis unfolding in Gaza."
In August 2023, the EU introduced new laws regulating the kind of content that is allowed online.
The Digital Services Act (DSA) requires so-called very large online platforms - those with over 45 million EU users - to proactively remove "illegal content", and show they have taken measures to do so if requested.
The EU previously told the BBC it was not in a position to comment on what would come next in these specific cases, but it has explained what is hypothetically possible under the law.
The DSA allows the EU to conduct interviews and inspections and, if it is unsatisfied, proceed to a formal investigation.
If it decides that a platform has not complied or is not addressing the problems it has identified, and risks harming users, the commission can take steps including issuing fines and, as a last resort, asking judges to temporarily ban a platform from the EU.