Building greater transparency and accountability with the Violative View Rate

In 2018, we introduced our Community Guidelines Enforcement Report to increase transparency and accountability around our efforts to protect viewers. It was a first-of-its-kind look at the content we removed from YouTube for violating our policies, and it included the number of videos removed, how that violative content was first identified, the reasons for removal, and more. Over the years, we've continued to share additional metrics, such as the number of content appeals and subsequent reinstatements. Since launching the report, we've removed over 83 million videos and 7 billion comments for violating our Community Guidelines. Just as important, the report has tracked the impact of the deep investments in machine learning technology we began making in 2017, measuring how well we catch violative content. For example, automated flagging now detects 94% of all violative content on YouTube, with 75% of it removed before receiving even 10 views. Today, we're releasing a new data point in the report that provides even more transparency around the effectiveness of our systems: the Violative View Rate.

Put simply, the Violative View Rate (VVR) tells us what percentage of views on YouTube comes from content that violates our policies. Our teams started tracking this back in 2017, and across the company it's the primary metric used to measure our responsibility work. As we've expanded our investment in people and technology, we've seen the VVR fall. The most recent VVR is 0.16%-0.18%, which means that out of every 10,000 views on YouTube, 16-18 come from violative content. This is down by over 70% compared to the same quarter of 2017, in large part thanks to our investments in machine learning. Going forward, we will update the VVR quarterly in our Community Guidelines Enforcement Report.
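To make the arithmetic concrete, here is a minimal sketch of the calculation described above. The function name and numbers are illustrative only; this is not YouTube's actual measurement tooling.

```python
# Illustrative only: a toy violative-view-rate calculation,
# not YouTube's actual measurement pipeline.

def violative_view_rate(violative_views: int, total_views: int) -> float:
    """Return the share of views that came from violative content, as a percentage."""
    return 100.0 * violative_views / total_views

# A VVR of 0.17% means 17 out of every 10,000 views hit violative content.
print(violative_view_rate(17, 10_000))  # 0.17
```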

VVR data gives critical context around how we're protecting our community. Other metrics, like the turnaround time to remove a violative video, are important, but they don't fully capture the actual impact of violative content on viewers. For example, compare a violative video that got 100 views but stayed on our platform for more than 24 hours with content that reached thousands of views in the first few hours before removal. Which ultimately has more impact? We believe the VVR is the best way for us to understand how harmful content affects viewers, and to identify where we need to make improvements.

We calculate the VVR by taking a sample of videos on YouTube and sending it to our content reviewers, who tell us which videos violate our policies and which do not. By sampling, we gain a more comprehensive view of the violative content we might not be catching with our systems. However, the VVR will fluctuate, both up and down. For example, immediately after we update a policy, you might see this number temporarily go up as our systems ramp up to catch content that is newly classified as violative.
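As a rough sketch of how an estimate like this might be produced from a reviewed sample, consider the following. It assumes view-weighted sampling (each sampled item corresponds to one view drawn at random from all views) and binary reviewer labels; neither assumption is spelled out in the post, and none of this reflects YouTube's actual sampling design.

```python
import math
import random

def estimate_vvr(labels: list[bool]) -> tuple[float, float]:
    """Estimate a violative view rate from reviewer labels.

    labels: one boolean per sampled view, True if the reviewed
    video behind that view was judged violative.
    Returns (estimated rate, 95% margin of error) as percentages.
    """
    n = len(labels)
    p = sum(labels) / n                          # sample proportion of violative views
    margin = 1.96 * math.sqrt(p * (1 - p) / n)   # normal-approximation 95% CI half-width
    return 100 * p, 100 * margin

# Simulate a review batch where the true rate is 0.17% of views.
random.seed(42)
sample = [random.random() < 0.0017 for _ in range(200_000)]
rate, moe = estimate_vvr(sample)
print(f"estimated VVR: {rate:.2f}% +/- {moe:.2f}%")
```

One plausible reason the published figure is a range (0.16%-0.18%) rather than a single number is sampling uncertainty of exactly this kind, which would also contribute to the quarter-to-quarter fluctuation mentioned above.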

Our ongoing goal is for the YouTube community to thrive as we continue to live up to our responsibility. The Community Guidelines Enforcement Report documents the clear progress made since 2017, but we also recognize our work isn't done. It's critical that our teams continually review and update our policies, work with experts, and remain transparent about improvements in our enforcement work. We're committed to these changes because they are good for our viewers and good for our business: violative content has no place on YouTube. We invest significantly in keeping it off, and the VVR holds us accountable and helps us better understand the progress we've made in protecting people from harmful content on YouTube.

Posted by Jennifer O'Connor, Director, YouTube Trust & Safety
