Muted on TikTok live by moderator

As it looks to evolve its live-streaming option into an eCommerce channel, as part of its broader monetization push, TikTok is adding a new control for live broadcasters that will enable them to mute comments from individual viewers within streams for variable time periods.


Live-stream hosts will now have the option to mute specific viewers for a period of the broadcast, or for the entire stream if they so choose.

As explained by TikTok:

Now, the host or their trusted helper can temporarily mute an unkind viewer for a few seconds or minutes, or for the duration of the LIVE. If an account is muted for any amount of time, that person's entire comment history will also be removed. Hosts on LIVE can already turn off comments or limit potentially harmful comments using a keyword filter. We hope these new controls further empower hosts and audiences alike to have safe and entertaining livestreams.

The added capacity to remove all of a muted user's previous comments is a significant step, which could help hosts better manage live-stream interaction and reduce unwelcome distractions flooding the comment stream.

Which has always been a problematic element. Twitter was forced to update its rules around live-stream interaction back in 2018, after various investigations showed that women and young people, in particular, tended to attract all manner of offensive remarks and comments during their broadcasts.


And as noted, with TikTok exploring live-stream commerce via various partnerships with big-name brands, it also needs to provide a brand- and consumer-safe environment in order to maximize appeal. With this in mind, the capacity to quickly cut off inappropriate commenters, and negate their impact, could be a valuable addition.

TikTok also added a new live-stream moderator option back in July, providing extra management capacity in this respect.

The announcement comes as part of a broader overview of TikTok's latest Community Guidelines Enforcement Report, which outlines the actions TikTok took in response to platform rule violations between April and June this year.

TikTok notes that it removed more than 81 million videos in the period, equating to less than 1% of all videos uploaded to the platform - which would suggest that TikTok is now seeing more than 90 million videos uploaded every day (if 81 million is less than 1% of total uploads, the quarter saw over 8.1 billion uploads, or roughly 90 million per day across its 91 days). Which makes sense, given the app is now up to a billion users, but it does add some extra perspective on the scale of the platform's growth.

As per TikTok:

“Of those videos, we identified and removed 93.0% within 24 hours of being posted and 94.1% before a user reported them. 87.5% of removed content had zero views, which is an improvement since our last report (81.8%).”

TikTok also notes that the new alerts it added back in March, which prompt users to reconsider potentially offensive comments, are having an impact.

“The effect of these prompts has already been felt, with nearly 4 in 10 people choosing to withdraw and edit their comment. Though not everyone chooses to change their comments, we're encouraged by the impact of features like this and we continue to develop and try new interventions to prevent potential abuse.”


Twitter and Instagram have also implemented similar prompts, which, based on this data, could go some way toward reducing angst in replies.

User safety is a major focus for TikTok, with the app's appeal to younger audiences also, potentially, facilitating unwanted exposure and connection if left unchecked. The platform has come under scrutiny in several regions for past failures to protect young users from harm, and with lingering concerns around its previous moderation processes, which were defined by Chinese regulations, TikTok knows that it's under close watch on this front, and that it needs to work hard to maintain trust.

Which is why measures like this are important, while they’ll also, ultimately, help the app maximize advertiser interest by providing a safer, more welcoming environment.

What happens if you get muted on TikTok live?

If an account is muted for any amount of time - a few seconds, a few minutes, or the duration of the LIVE - that viewer won't be able to comment for that period, and their entire comment history will also be removed. Hosts on LIVE can already turn off comments entirely or limit potentially harmful comments using a keyword filter.

Can moderators mute on TikTok?

During the LIVE, both you and your moderator can mute and block users to help keep the stream welcoming and positive. This way, you can focus on creating your content while the moderator helps maintain your LIVE as an entertaining, safe space.

Why did I get muted on TikTok?

Due to TikTok's copyright policy, if any content in your video is auto-detected as copyrighted music by its system, the audio in the video could be muted, or the platform may prevent the video from being uploaded altogether.

What does it mean to be a moderator on TikTok live?

Moderators help manage your live comments and mute or block accounts if needed. Having a moderator or a team of moderators is a good step towards having a clean, healthy live stream chat. TikTok allows creators to add up to 20 moderators to their live streams.