To few people’s surprise, YouTube removed more videos from its platform in the second quarter of 2020 than ever before. The surge came as the popular video-sharing platform leaned on its own algorithms to delete videos, moving away from the human moderators normally assigned to review content.
The company released its Community Guidelines Enforcement Report this week, which shows YouTube deleted a whopping 11.4 million videos in the second quarter, covering April through June. By comparison, the company removed close to 9 million videos during the same period in 2019.
The crackdown also led to the removal of nearly 2 million channels for violating the company’s community guidelines over the past three months. Perhaps even more staggering, more than 2 billion comments were removed from the platform after being flagged by the automated systems now in place.
One clear reason for the increase is the coronavirus pandemic and the company’s need to automate with fewer people working at YouTube. YouTube opted to be overly cautious and enforce a stricter policy when removing videos. The company explained the decision in a blog post, stating, “When reckoning with greatly reduced human review capacity due to COVID-19, we were forced to make a choice between potential under-enforcement or potential over-enforcement. Because responsibility is our top priority, we chose the latter – using technology to help with some of the work normally done by reviewers.”
Google, the parent company of YouTube, told its employees at the onset of the coronavirus pandemic in March that they could work from home, which meant a heavier reliance on technology without human intervention. As a result, many videos that a human moderator would normally have left up were removed by the automated systems in error. Human moderators need to work from specific offices to review content, because work done outside those offices risks exposing sensitive videos and user data.
Removing more videos also led to more appeals from the creators who made them, prompting the company to hire additional employees to handle the appeals process so requests could be dealt with promptly. The numbers tell the story: there were 325,000 appeals in the second quarter, a drastic increase from 166,000 in the first. YouTube ultimately reinstated 160,000 of these videos in the second quarter, nearly four times the 41,000 from the first quarter.
The increase in removals was concentrated in areas like violent extremism and child safety. The child-safety videos involved things like dares or other activities that could be misinterpreted by young children and endanger them. The company concluded that the short-term inconvenience of having a video removed was far outweighed by the risk of keeping videos that violated its terms and policies.
This comes at a time when many people around the world are criticizing censorship and the moderation policies of social media platforms like Facebook and Twitter. Each company has been forced to defend its practices, and in some cases individual posts and pages, on a case-by-case basis.
These employees are to remain at home until at least the end of the year, so expect these numbers to stay high through the third and fourth quarters of 2020. In the meantime, Google will inevitably be working out the kinks in its systems so that fewer videos that don’t actually violate its policies are removed.