So, YouTube finally put their foot down. As the largest video-sharing platform on the internet, YouTube will no longer allow children under 12 years old to live stream without adult supervision, according to the NY Times.
But what does this mean for content creators of all ages? It means that every channel not in compliance with the policy risks losing its ability to live stream.
It also means that content creators have to be more responsible about the content they upload. This new policy shows that the video platform is taking a firm stance on child safety. Additionally, YouTube is making parental supervision mandatory for pre-teen and younger content creators.
YouTube and Anti-Bullying
YouTube has launched algorithm-based classifiers to detect when children under 12 are live streaming without supervision. These classifiers can also identify and disable inappropriate comments. While there has been backlash against these restrictions, many parents and families support the safety precaution. The change has been a long time coming: YouTube has disabled tens of millions of comments since last February, when it began blocking comments to better protect minors using the platform.
What Else Has Changed on YouTube?
The video platform has also implemented a ban on videos that promote the superiority of any group over others based on gender, race, age, sexual orientation, veteran status, religion, or caste. As a result, YouTube is in the process of removing videos that promote Nazism and other discriminatory ideologies. The YouTube community can expect thousands of channels to be removed from the platform.
Removal of Dangerous Propaganda
YouTube has also announced that it will remove content that denies “well-documented violent events,” such as the Sandy Hook Elementary School shooting and 9/11. It is also targeting content that spreads inaccurate information, such as fake miracle cures and flat-earth theories. The algorithm has already cut the number of views these videos receive by 50%, and YouTube will continue to limit how often they are recommended, steering viewers toward videos from more credible sources instead.
How Will This Affect YouTube Creators?
If your channel does not violate the community guidelines, you should be fine. However, if your content is edgy and includes “borderline content and harmful misinformation,” your chances of losing your channel are high. Additionally, if YouTube finds that your channel repeatedly violates its hate speech policies, you will not be allowed to run ads, and your subscribers will not be able to pay you for extra chat features. YouTube refuses to support channels that use the platform to promote racism and hate speech, hence the recent changes.
While the algorithm still needs improvement, this new approach will help ensure child safety on the platform. So far in 2019, the video platform has removed 800,000 videos for violating child safety policies. Moving forward, Contentflow hopes to see a significant decrease in inappropriate, harmful content as these community guidelines are enforced.
Need clarity? Read YouTube’s full statement here.