Raising Children With YouTube

Ugh. Where do I start on being a parent of boys and dealing with YouTube, Instagram, and the Internet?

The internet is such a wonderful tool until you become a parent. Suddenly at your child's fingertips are filth and the darkest parts of the world, things that shouldn't be seen by anyone that young or innocent. A child can find porn, hate groups, and distasteful content with a whisper or a dirty joke told to Siri. Finding recommended porn videos on YouTube can be as simple as sharing a wifi connection with an adult who watches porn and being linked through cross-device ad retargeting. Trying to control children's access to an unregulated media industry is impossible without the help of our largest content hosts, such as Google and Facebook.

The need to moderate YouTube content for children is huge, considering that YouTube has replaced almost all media.

With YouTube standing in for so much traditional media, video is now most children's primary source of content. Magazines, recipe books, exercise videos, television shows, movies, music - once carefully curated by publishers and media outlets for appropriate audiences - are now published on an open platform where anyone can create, upload, and advertise. That makes it easy for children to be exposed to sexually explicit material that was once hidden in a back room or guarded by a ticket booth and a standardized rating system. A child watching a Sesame Street video on YouTube can get recommendations for adult content based on his father's private viewing habits. We need to support content platforms that protect children and curate tasteful content over sites whose algorithms surface whatever is crudely popular.

 

No One is Accountable for the Content

Google and several other hardware and software providers have tried to solve this problem. Despite those efforts, X-rated content on YouTube still gets through the filters, largely because the rating system is unenforced: video creators rate their own content!

Despite YouTube's ban on sexually explicit content, it's still easily found among popular video recommendations. With roughly 400 hours of video uploaded every minute, YouTube's catalog is impossible to moderate manually. When Google found itself facing an advertiser boycott this spring, an army of interns was hired to flag racist and terrorist videos rather than lose a potential $750 million in ad revenue.

A search for "sex" on YouTube returns thumbnail images of topless women or explicit poses on most of the recommended videos. Though clearly against YouTube's content policy, this material isn't removed until users flag it. To flag an offensive video or thumbnail, someone has to click the thumbnail, open the video - which immediately starts autoplaying - and then dig into the settings to flag it. Banning YouTube.com at the router level doesn't help much either, because videos embedded on other sites still get through.

 

Restricted Mode on YouTube Isn't Enough

The two solutions Google provides are Restricted Mode and YouTube Kids. According to my youngest (11), "YouTube Kids is for two-year-olds." That's a non-option for my children. Restricted Mode filters YouTube down to content rated less than mature, and it blocks all unrated content. Google recently added the ability to password-protect the Restricted Mode toggle: parents can lock a browser in Restricted Mode with their Google account, and it can't be switched back without the password. The catch is that the lock only applies to the current browser - switch browsers and you're around it. The other limitation I've found is that Restricted Mode is too strict for anyone older than 10-12, so I'm constantly turning it on and off to let my kids watch videos for homework or find music.

Restricted Mode also misses channels and thumbnails, and it relies on content creators to rate their own videos (ha ha).
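
To see how thin that rating data really is, here's a minimal sketch (assuming a YouTube Data API v3 key and the google-api-python-client library; the API key and video ID below are placeholders) that checks the only rating flag a public video carries, ytAgeRestricted. If the creator never marks the video and nobody flags it, the flag simply isn't there.

```python
# Minimal sketch: look up the one rating flag YouTube exposes per video.
# Assumes a YouTube Data API v3 key; API_KEY below is a placeholder.
from googleapiclient.discovery import build

API_KEY = "YOUR_API_KEY"
youtube = build("youtube", "v3", developerKey=API_KEY)

def is_age_restricted(video_id):
    """Return True only if YouTube marks the video 'ytAgeRestricted'."""
    response = youtube.videos().list(
        part="contentDetails",
        id=video_id,
    ).execute()
    items = response.get("items", [])
    if not items:
        return False  # video not found, private, or removed
    rating = items[0]["contentDetails"].get("contentRating", {})
    return rating.get("ytRating") == "ytAgeRestricted"

# Most videos return False simply because nothing was ever rated.
print(is_age_restricted("dQw4w9WgXcQ"))
```

That single yes/no flag, set largely on the honor system, is about all a parental control has to work with.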

Disney's Circle does a great job of enforcing Restricted Mode on any YouTube video coming through the home wifi, and it can be set per device. The limitation is Restricted Mode itself: it's too strict for teenagers, blocking all unrated content along with anything mature or explicit.

What if We Did Things Differently?

What would Google need to do to actually make their platform childproof, or at least enjoyable for people who don't want explicit videos showing up in their feeds? Here are four changes I would recommend to improve the experience.

  • Set the content filter at the account level rather than the browser level, so it follows a signed-in Google account across every browser and device. Tie SafeSearch settings in with YouTube Restricted Mode.
  • The Restricted Mode found in the iOS YouTube app should also be available in the Chrome app; on iOS, filtering YouTube is currently best done through the YouTube app or Chrome. Additional levels of filtering should be created, based on a standardized rating system like the ones used for music, movies, or TV. Allow a toggle for content warnings on thumbnails and video previews.
  • Use bots to automate ratings. Content creators could file an objection, but a video would not be published until a bot had rated it. Ratings would no longer run on the honor system.
  • Include ratings in the video metadata so third-party solutions (e.g., Circle) could provide levels of filtering for each family member, similar to Netflix's family settings - see the sketch below.
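
To make that last idea concrete, here is a purely hypothetical sketch - nothing like this rating field exists in YouTube's metadata today - of how a device like Circle could use bot-assigned ratings to filter per family member, the way Netflix profiles do.

```python
# Hypothetical sketch only: imagines the standardized, bot-assigned
# rating proposed above, and how a home filter could apply it per person.
from dataclasses import dataclass

# Hypothetical ratings, ordered from most to least restrictive.
RATING_ORDER = ["kids", "general", "teen", "mature"]

@dataclass
class FamilyMember:
    name: str
    max_rating: str  # the highest rating this person is allowed to watch

def allowed(video, viewer):
    """Allow a video only if its rating is at or below the viewer's limit;
    unrated videos are blocked by default."""
    rating = video.get("rating")  # hypothetical metadata field
    if rating not in RATING_ORDER:
        return False
    return RATING_ORDER.index(rating) <= RATING_ORDER.index(viewer.max_rating)

dad = FamilyMember("dad", max_rating="mature")
kid = FamilyMember("kid", max_rating="general")
video = {"title": "Some upload", "rating": "teen"}  # hypothetical metadata

print(allowed(video, dad))  # True
print(allowed(video, kid))  # False - "teen" is above the "general" limit
```

With ratings carried in the metadata, unrated videos stay blocked for kids by default without making the whole platform unusable for teens.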
 

Put a Lid on It

It's easier to protect a child from a bottle of pills than from harmful content on the internet. Something is wrong with how little accountability these platforms take for the amount of unacceptable content that slips through the cracks. We need to put a lid on self-publishing platforms.

This experience is possible, but it may take an act of Congress or the FCC to force content platforms to end children's unlimited access to inappropriate content on the internet. We've already created laws that prohibit targeting children's websites with profiling ads, so why can't we create regulations that protect children from this kind of content? When financially motivated, Google was able to categorize all of its video content for racism and terrorism. It can do more for us as parents, but it may take a massive boycott or a law to see change. As a society, we should demand it.

Automating ratings with bots, flagging explicit content with warning labels, strengthening parental controls, and sharing that metadata with hardware devices are all ways to create a safer web for kids, and for adults who don't want to see this material. Setting clear standards will shape the culture and improve what gets published.