To that end, YouTube has begun sharing information about a metric it calls the Violative View Rate (VVR): the percentage of video views on YouTube that come from videos that violate the platform's policies.
The Violative View Rate
The Violative View Rate will be reported quarterly in YouTube's Community Guidelines Enforcement Report.
YouTube reportedly began tracking the Violative View Rate internally in 2017, although it is only now making the figures public. While YouTube won't be totally cleaned up overnight, the hope is that publishing the rate each quarter will show a steady decrease in violative views over time.
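In essence, the metric is a sampled percentage: YouTube samples views, has reviewers label the underlying videos, and reports what fraction of views landed on violative content. As a rough illustration only (the sampling figures below are made up, not YouTube's), the calculation can be sketched as:

```python
def violative_view_rate(sampled_views: int, violative_views: int) -> float:
    """Estimate the Violative View Rate (VVR) as a percentage.

    Hypothetical sketch: a sample of views is labeled by reviewers, and the
    VVR is the share of those views that landed on policy-violating videos.
    """
    if sampled_views <= 0:
        raise ValueError("need at least one sampled view")
    return 100.0 * violative_views / sampled_views

# Made-up example: 18 violative views in a sample of 10,000 views
print(f"VVR: {violative_view_rate(10_000, 18):.2f}%")  # VVR: 0.18%
```

Because the figure comes from a sample rather than a full count, YouTube reports it as a range rather than a single number.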
In a blog post announcing the change, Jennifer O'Connor, YouTube's Director of Trust and Safety, explains the thinking behind the metric.
The post notes that YouTube has, to date, removed more than 83 million videos and 7 billion comments that violated its Community Guidelines. According to O'Connor, the popular video platform's AI-aided algorithms can now detect 94% of rule-violating content via automatic flagging, and three-quarters of that content is removed before it racks up even 10 views.
The Violative View Rate isn't the only metric YouTube uses to assess its success at removing violating content; it also tracks turnaround time, i.e. how quickly violating content comes down. But, as O'Connor observes, turnaround time alone is an imperfect measure.
Making YouTube Work for Everyone
There’s still plenty more work that needs to be done to ensure that platforms like YouTube are inclusive, non-harmful places for as many users as possible. The company continues to tweak its rules regarding what is, and isn’t, allowed.
Nonetheless, work like this shows that YouTube is doing its best to move things in the right direction.