Video-sharing site YouTube is being accused of failing to tackle dangerous content on its child-focused platform, following numerous complaints from users online.
Over the weekend, former TV personality and mother of three Anne Kiguta reshared a tweet cautioning mothers against allowing their children to watch YouTube videos unsupervised, as harmful content appeared just minutes into the videos.
This is so disturbing! Completely demonic! Read through TL. And always pray, pray, pray for your children. https://t.co/3lQmkoHjZa
— Anne Kiguta (@AnneKiguta) February 23, 2019
Many Kenyans on Twitter weighed in, pointing to suicide-related content on the platform, including videos of cartoon characters slitting their wrists and others showing children turning on a stove and leaving it on.
YouTube Kids, billed as a safer, child-friendly version of the video-sharing site, has been criticized by parents for failing to remove cartoons containing clips that depict suicide methods.
Google told the BBC it works hard to remove such content. “We have strict policies that prohibit videos which promote self-harm. We rely on both user-flagging and smart-detection technology to flag this content for our reviewers,” the firm said in a statement.
It is unclear how or why the clips depicting suicide methods were embedded in children’s cartoons.
In its latest transparency report, Google said it had removed more than 7,845,000 videos from its platform between July and September 2018, 74% of which were removed before receiving any views, the BBC reported.