Yesterday, a coalition of child and consumer advocacy groups complained that Google's YouTube Kids app exposes violent content to underage viewers. The Federal Trade Commission received the complaint and will review it and take measures, if needed.
Google launched the app for young children in February of this year, declaring that all of its videos were narrowed down to "content appropriate for kids".
Now, the Campaign for a Commercial-Free Childhood and the Center for Digital Democracy have flagged videos they consider extremely dangerous, disturbing, and potentially harmful for kids to view. They asked for a thorough investigation to assess whether Google's marketing claims could be deceptive in any way.
The family-friendly application was supposed to surface content suitable for kids and suppress violence, sexually explicit scenes, and other disturbing material. Although it is aimed at preschoolers, it contains videos of a nail gun being shot into a mannequin's head, the Daily Mirror reports.
Google didn't even filter the commercials: adverts for alcohol were spotted, along with graphic discussions of pornography and drug use. Explicit sexual language escaped Google's filters as well.
This doesn't sound like a bug in the system; it sounds like a system riddled with flaws and errors. A kids' app that filters out content harmful to children's perception and understanding of commonly accepted social behavior is a fantastic idea, if we come to think about it. We sometimes expose our children to harmful content and images, given an informational explosion that can seldom be efficiently controlled.
The app, available for both Apple and Android phones and tablets, uses an inadequate algorithm to screen out inappropriate content, although Google says a team of people constantly checks it as well. On the other hand, it is hard even for the most advanced algorithms and human reviewers to filter such vast amounts of content, considering that more than 300 hours of video are uploaded to YouTube every minute. Google came up with a great idea, but it is having trouble turning it into a safe and viable reality for children.
The YouTube Kids app exposes violent content to underage viewers, and that's a fact. Google must either refine its algorithms or pull the product from the market to avoid a greater scandal.
Further complaints followed, claiming that "Google's deceptive practices toward parents are even more widespread and pervasive than we documented in our initial request for investigation," in a letter addressed to Donald Clark, Federal Trade Commission Secretary.
Image Source: venturebeat.com