YouTube Kids has faced problems with disturbing videos, such as clips depicting Disney characters in sexual or violent situations. Now the app has launched new settings that give parents more control over what their kids can watch. The company announced an "approved content only" option that lets parents whitelist the channels their kids are allowed to watch. The changes come four months after reports of inappropriate videos surfaced on the kid-friendly platform. Content is screened by machine learning algorithms, and cartoons disguised as age-appropriate material can sometimes slip through.
Another tool, expected later this year, will let parents choose every individual channel or video their kids can see in the app. The company says it is continually testing and fine-tuning its filters, and it urged parents to block and flag for review any videos they do not consider suitable for the kids' app. When a video for children is uploaded to YouTube, it is reviewed by the algorithms before being made available on YouTube Kids, an automated process that takes a few days. Even so, that may not be enough.
The company keeps changing its rules to add more parental controls, but creators keep finding ways to get around them.