Google Settles FTC COPPA Violation Charges

Google has agreed to settle with the Federal Trade Commission over its investigation into YouTube for violations of federal data-privacy laws protecting children. The settlement, supported by the FTC's three Republican commissioners and opposed by its two Democrats, finds that Google did not adequately protect kids who use the YouTube video-streaming service.

The suit further claims that Google improperly collected data from these children, in breach of the Children's Online Privacy Protection Act (COPPA). COPPA prohibits tracking and targeting users under the age of 13 without verifiable parental consent.

With this in mind, the company will likely pay a fine worth upwards of several million dollars. The exact amount has yet to be determined, however, as the shape of the FTC settlement has not been finalized. That decision rests solely in the hands of the United States Department of Justice, which tends to be quite strict on matters such as this.

The announcement follows an official federal probe, which began in June after complaints from consumer groups. Many complainants claimed that YouTube consistently fails to protect the children who use the video service. Perhaps more importantly, the consumer groups argue that the Google-run company collects those children's data.

In addition to the consumer complaints, the probe also followed a news story, likewise published in June, which reported that YouTube's recommendation algorithm directed salacious videos of children wearing bathing suits to users whose account histories contained a preponderance of videos featuring prepubescent children. According to the report, the algorithm recommended these videos to users after they had watched other sexually themed content.

YouTube ultimately removed many of these videos, though some remain, and the removals are believed to have resulted more from tweaks to the algorithm than from deleting the videos outright. As such, authorities continue to suggest that YouTube is not working hard enough to remedy the problem and keep children safe. Still, YouTube has removed at least 800,000 videos that, it says, violated its child-safety policies.
