
TikTok announced it's making changes to its recommendation engine in an attempt to limit how much negative content users see. In the future, users will be able to select tags for content they don't want to see in their personal feed.
The move responds to pressure from children's advocates and regulators, who have warned that the algorithms social media platforms use to serve up similar content based on user interests may reinforce negative feelings, such as low self-esteem or poor body image.
TikTok said it's looking at how its system can better vary the kinds of content that may be recommended in a sequence. It's testing ways to avoid recommending a series of similar content – such as around extreme dieting or fitness, sadness, or breakups – to protect against viewing too much of a content category "that may be fine as a single video but problematic if viewed in clusters," the company said.
In addition, the system may sometimes recommend only very limited types of content that, while not violating the platform's policies, could have a negative effect if that's the majority of what someone watches, such as content about loneliness or weight loss.
This work is being informed by talks with experts across medicine, clinical psychology, and AI ethics, members of the company's Content Advisory Council, and the wider TikTok community, the company said. It did not give a timeframe for implementing the changes, but said it would take time to get right, and more updates will follow.
TikTok is also working on a feature that would let people choose words or hashtags associated with content they don't want to see in their "For You" feed. Users can already tap any video and select "Not interested" to automatically skip future videos from the same creator or using the same audio; the new tool will offer another way to customize the feed.