Oh boy. Seems YouTube ruined the internet. Again. When was the last time this happened? A month ago? Well, whenever it was, this time there seems to be some amount of misinformation floating around, so I'm going to attempt to set the record straight, to the best of my ability.
Let's start with the most important thing you should take away from this post: I am not a lawyer. If you are a content creator and have urgent legal concerns, I recommend you speak with one, or perhaps even better, try to get lawyers who make YouTube content (there are quite a few) to cover this matter, so more people can benefit from it.
Okay, that said, let's get into things.
So, long story short, there is this US law called COPPA, also known as the Children's Online Privacy Protection Act. If you've grown up on the internet, like most kids these days, you've probably run into this law a couple of times when registering on a forum while you were below 13. Or maybe you didn't, but that's neither here nor there.
Either way, COPPA is a law that forbids unauthorized data collection on kids. (You can collect data, but you need the consent of the parents, and due to the inherent anonymity of the internet, the majority of sites opt not to bother with that additional step and just ban kids under 13 from talking on their platforms.) It was originally designed after concerns that digital advertisers would try to track the behavior of kids and use that to market products to their easily impressionable minds.
I can honestly say that the law is, from beginning to end, in my completely unprofessional opinion: a joke. The intent is nice, but the implementation leaves a lot to be desired. For starters, it causes a really weird schism, especially if you're a 12-year-old. Due to the wording, and the fact that most sites opt to forego the additional verification step, COPPA has really only accomplished one thing: teaching kids to lie about their age online. That of course has other side effects, such as making it easier for predators to hide behind incorrect birthdates, since “everyone just lies about their age anyway”.
So COPPA is a bit bad. It's also not a law that's heavily enforced, since the government body tasked with doing so is the FTC, which isn't staffed to police the entirety of the internet. Mostly, you only see COPPA enforcement against larger corporations that very bluntly run afoul of it.
One such company is Google.
Google vs the FTC
Back in September 2019, Google was officially fined by the FTC for touting the data they collected on kids to YouTube advertisers, whilst not properly taking care to follow COPPA themselves. It's the largest recorded COPPA fine in history, coming in at over 170 million USD.
Also, I'm just going to get overtly political for a second: FTC fines are decided by a vote. This vote went along the so-called “party line”, which is politics talk for “this is what the party they belong to thinks is right”. The result was 3-2, with Republicans casting the 3 votes and Democrats the 2. If you're someone with the ability to vote, you might want to take the way the current commission is set up into account the next time you get to elect an administration.
Overt political talk off, let's continue. So YouTube was fined the 170 million USD, and in response they promised to start taking care of the issue of collecting data on children, which was part of the requirements of the fine they received from the FTC.
The way they've gone about this is... blunt, to say the least, and the flaw here is mainly due to how intensive data collection is on its own, even when not marketed to advertisers. Specifically, Google promised to start treating any content that is made for children as only being watched by children. If that sounds weird or confusing, especially coming from Google, the company that knows everything about you, sometimes even before you know it yourself, there's a reason for it: they can't collect data on kids anymore, which limits their ability to identify that audience in the first place.
Yesterday, November 20th 2019, Google laid out its plans to fulfill that promise. And... here is the issue.
The biggest change is that YouTube now requires content creators to specify whether their videos are made for kids or not. Videos that are made for kids get the aforementioned heavy-handed treatment, which disables pretty much every capability the platform has for users to interact with that specific video. The video will not appear in notification boxes and will be removed from recommended videos; the in-video “notification” function, which permits linking to a poll or to another video on the platform, is disabled; end screens (these work similarly to notifications, except they appear as full boxes over the last couple of seconds of the video) are unusable; and finally, comment sections are disabled.
Personalised advertisements are also disabled for the video, which for content creators who rely on YouTube's ad system for their income (already a rare situation these days) has meant reductions of around 90% in their income.
This also, to my understanding, goes hand in hand with the videos now being able to appear in the “YouTube Kids” app, an application designed by YouTube to prevent children from seeing adult content. The app, for the record, has been in the public eye in the past due to child predators abusing the fact that videos can be marked as being “for children” to essentially groom kids remotely. Scary stuff, and these changes could well cause a risky growth of other undesirable content for children (not on the level of the child predator stuff) appearing on the platform.
For existing uploads on the platform, YouTube has also promised to let its own algorithms determine whether a video is made for children or not. Putting aside for the moment that YouTube's algorithms are questionable in the best case and outright dogcrap in the worst, the problem is that this is not enough to cover the asses of content creators.
The FTC has very clearly stated that it plans to go after content creators whose videos appeal to children but aren't marked as such, with the risk of fines of up to $42,000 per affected video.
And here is where we get to the broad strokes. You see, while the “is it made for children” question can often be answered with a clear-cut yes or no, the question of “does it appeal to children” is a really broad one.
Few people will be confused about Peppa Pig videos or a reupload of a song from the movie Frozen being aimed at children, or about someone playing an H-Game (a term for a game with adult content in it) with the NSFW scenes disabled appealing to adults. But what about, say, a content creator who makes animations with gallows humor in them?
Hazbin Hotel, a recently released animated series pilot, has all the stylings of a Disney XD cartoon whilst clearly not being marketed at children. It's an example of content that could unintentionally appeal to children, even though it isn't targeted at them. Under the FTC settlement, content creators in this position would have to mark their videos as being made for children, which incurs the aforementioned restrictions.
Or, for another example, to continue with my “vaguely related examples from the not-confusing list”: what about channels like SiIvaGunner, whose entire existence relies on making remixes of and modifications to existing music tracks? These often blend in music that could potentially appeal to children, but can just as easily be highly offensive and not something you'd want a child to hear. Their content would again be regarded as “appealing to children”, since video game music is something children can easily like, which would mark the videos as being targeted at children. The result: your kids could end up listening to remixes that feature, for example, content based on JustinRPG's vore fetish for a dragon Pokémon. I don't need to be a moral guardian to point out that that's not what you want, but it's what could easily happen due to these broad strokes.
And probably the most complex and ambiguous situation: what about the largest segment of YouTube creators, those who play video games? Let's Plays are a large part of YouTube, and whilst a lot are very unprofessional, just someone playing whilst talking into the camera about boring things, others are high-value productions that aim to show the best of a particular game (chuggaaconroy is a really good example of the latter). Does this content appeal to children? It's a tough question, considering video games often appeal to people of all ages, which under these guidelines would make them marketed at children.
This comes hot on the heels of the so-called “YouTube adpocalypse”, when several major advertisers started pulling their ads from YouTube after a couple of news sites ran articles showcasing racist and similarly awful behavior by some of its major content creators (the biggest example being PewDiePie, with lesser cases like Logan Paul and prank channels being dicks to homeless people as good examples of this blowing up in their faces).
YouTube's response was to start recommending that creators who didn't want their videos demonetized make more “child friendly” content, as well as reducing the ability of content creators to customize their videos and link to affiliate sites.
And now YouTube seems to have put itself in a lose/lose situation, since this settlement will have a very clear effect: creators are now going to have to skew their content to be so blatantly adult that there can be no confusion about what it is, so the FTC cannot sue them for COPPA violations. That in turn will cause YouTube's advertisers to pull out, since most don't want to be associated with that stuff, and now nobody is happy anymore.
I mentioned it earlier, but the FTC isn't equipped to deal with the entirety of the internet, let alone a site the scale of YouTube. In response to these concerns, the FTC has claimed it can and will run its own bots to identify videos that violate COPPA guidelines and take action from there.
The realistic effect here will probably be that larger channels (those with at least 1 million subscribers would seem like a practical cutoff point, if I were the FTC) end up at bigger risk of being sued over this than smaller channels.
Again, I'm not a lawyer, and this shouldn't be an excuse not to take the issue seriously; it could totally affect smaller creators too.
Okay, so that was all really damn negative. Is there any positive side? Well, the FTC is still accepting public comments, and I would urge you to leave one. A serious one, though: troll comments or just bombarding it with spammy messages isn't going to be helpful. Try to stay respectful and remember that a human on the other side will eventually be tasked with processing your comments. Basically, don't be an arsewipe, okay?
I would also like to point out that whilst I mainly spent my time ranting and rambling about the negative knock-on effects, that doesn't change the fact that Google did break COPPA in a significant way, and the fine is entirely deserved. The issue here is with the FTC, however, who have now made a conscious choice to put the onus of dealing with COPPA on content creators rather than on Google.