December 2, 2022
While many have spoken about what a positive year TikTok has had, and how its dominant market share has cost so many top platforms screen time against the archrival, there is some bad news too.

The company has published its latest Community Guidelines enforcement report for Q2 of this year, covering all videos and profiles that were either deleted or otherwise actioned.

And as the saying goes, the more users, the greater the problems. That seems to be playing out right now.

TikTok is being extra stringent about taking action against those violating its Community Guidelines, whether by breaking the rules or by creating fake accounts. Either way, offenders are removed from the platform, but a massive 62% rise in that number is striking, to say the least.

For starters, let's begin with the amount of video content removed from the platform: around 113 million videos between April and June, an 11% rise from the previous quarter. But the trend does make some sense, as it coincided with steady growth in viewership.

As a chart in the app's report overview shows, there is plenty of misuse, and this particular challenge keeps growing along with the app.

What we love is how the app manages to remove much of the violating content before anyone gets a peek, and this kind of fast action is always welcome.


TikTok is doing plenty to safeguard its users against harmful exposure, but critics aren't happy with the way it's handling misinformation.

A recent study by NewsGuard found that up to 1 in 5 search results on the app contains misinformation, across searches ranging from '2022 election' to 'mRNA vaccine'. This doesn't seem to be improving with time, which is a concern seen on other platforms as well. It's also an interesting counter to TikTok's own data, which, taken in isolation, would suggest the platform is getting much, much better on this front.

There was also plenty of talk about one of the app's most pressing concerns: the safety of minors. With so many users on the platform, allegations have surfaced that it may be surfacing more content featuring young women in order to lure people into continuing to scroll the feed.

Digging deeper into the matter, TikTok revealed that explicit content and nudity were among the major reasons for content removal on the app, and this continues to be an area of growing concern for many. TikTok says it's trying hard to get rid of such content, but experts feel its algorithms also promote it.

But the biggest reason for eliminating accounts has to do with fake profiles, and the 62% rise compared with the previous quarter is simply mind-blowing.


The sheer number of fake profiles being created shows just how valuable today's content creator industry has become, and how the market of fake users is on the rise.

TikTok also revealed that it's testing a new fact-checking program so that false content can be flagged immediately.

Read next: TikTok is the Least Trusted News Source According to This Survey