TikTok improves safety measures for users

News


TikTok, a well-known social media platform used by millions of influencers and young adults, has decided to go a notch higher to improve the safety of its users.

It recently refreshed its community guidelines with additional details about what is allowed on the platform.

The community guidelines support the authentic and entertaining TikTok experience that people know and enjoy.

One of the updates to the guidelines is that it will incorporate feedback and language used by mental health experts to improve its policies on self-harm and suicide and avoid normalising self-injury behaviours.

It has also bolstered its policies on bullying and harassment, and its guidelines are now more explicit about the types of content and behaviours that are not welcome on the platform, such as cyberstalking.

In line with its dangerous acts policy, it has taken steps to limit, label or remove content that depicts dangerous acts or challenges.

It has now added a harmful activities section to its minor safety policy to reiterate that content promoting dangerous dares, games and other acts that may jeopardise the safety of youth is not allowed on the platform.

It has updated its previous dangerous individuals and organizations policy to focus more holistically on the issue of violent extremism.

The new guidelines describe in detail what is considered a threat or incitement to violence and the content and behaviour the platform prohibits.

Apart from the updated guidelines, TikTok also has new tools to support people with photosensitive epilepsy, and it will start to roll out a text-to-speech feature that allows people to convert typed text to voice that plays over the text as it appears on the video.

New resources will support well-being by offering help to those who are struggling. These resources were created with guidance from leading behavioural psychologists and suicide prevention experts.

Users who search for terms such as self-harm are likely to see evidence-based actions they can take.

It is also introducing the opt-in viewing screens on top of videos that some may find graphic and distressing.

Such videos are already ineligible for recommendation into anyone’s For You Feed, and this feature aims to further reduce unexpected viewing of such content by offering viewers the choice to skip the video or keep watching.

It continues to develop tools that help users manage their app experience, from automatically filtering unwanted comments to the ability to say “not interested” on videos in their For You Feed.

This is part of an effort to support people who want to share their stories and use their voices to raise awareness of topics that others may find triggering.

Since the start of the pandemic, it has provided in-app access to public health information from experts, as well as relief for frontline workers and families.
