After Google agreed on Wednesday to pay a record $170 million fine, YouTube is suddenly changing what it shows to kids. Critics have dismissed the fine as paltry, much as they did Facebook’s $5 billion penalty, but the video giant is actually changing its practices. People will soon learn whether the slap on the wrist for one of the world’s richest companies will bring better safety to children.
The new business practices, along with $170 million in fines, settle allegations by the Federal Trade Commission and New York state that YouTube owner Google violated children’s online privacy by collecting personal data without parents’ permission.
Some of the new responsibilities will be on video creators themselves, as they will have to label videos that are geared toward kids under 13.
Here’s a look at what’s behind the dispute and what’s changing.
WHAT THE LAW SAYS
The FTC’s complaint is based on a 1998 federal law called the Children’s Online Privacy Protection Act, or COPPA. It bans websites from collecting personal information from children under 13 without their parents’ consent.
Tech companies, however, have long skirted this by saying they officially exclude kids from their services, even though they don’t really check. A group of privacy advocates asked the FTC in April 2018 to investigate YouTube’s compliance.
YouTube has long said its service is intended for people ages 13 and older, a message that theoretically kept it in line with that law.
Ask any kid or parent, however, and the reality was far different. Younger kids commonly watch videos on YouTube, and many popular YouTube channels feature cartoons or sing-alongs made for children. YouTube acknowledged Wednesday that “the likelihood of children watching without supervision has increased” since its founding because there are more shared devices and a “boom in family content.”
The FTC’s complaint details how Google boasted about its youthful audience when talking to major advertisers. The FTC includes as evidence Google’s visual presentations made to toy companies Mattel and Hasbro, in which YouTube is described as the “new Saturday Morning Cartoons” and the “#1 website regularly visited by kids.”
CHANGES ON YOUTUBE’S MAIN SERVICE
Starting early next year, anyone who uploads a video to YouTube will have to designate whether that video is directed at children.
If a video is identified as child-focused, such as a cartoon or the “unboxing” of a new toy, Google has agreed not to put up “behavioral” ads — those that cater to specific viewers based on their age and other social characteristics. Google also won’t track the viewers’ online identities. Google says these restrictions will be in place even if the viewer is an adult.
But Google will still show generic ads, as well as “contextual” ads — those that cater to the type of content rather than the specific viewer. These typically don’t bring in as much money as viewer-specific ads.
And Google is stopping short of seeking parental consent on its main service, even for kids-focused video. The law doesn’t require it to, as long as there’s no data collection.
CHANGES ON YOUTUBE KIDS
Google already gets parental consent for its kids-focused service, YouTube Kids. But the service has traditionally been used far less frequently — after all, the main service had all the same videos and more.
YouTube said it will start promoting the kids service more aggressively. On Wednesday, kids-focused pages on YouTube’s main service had pop-ups suggesting YouTube Kids.
YouTube Kids similarly does not offer behavioral ads targeted at individuals, but it does collect some basic viewer information to recommend videos. It also collects the device’s numeric IP address.
YouTube said it will dole out $100 million over three years to encourage more videos for children.
NEW ONUS ON CREATORS
Google says the changes to the main service will happen in four months to give video creators a chance to adjust. In taking this approach, Google is putting much of the responsibility on video creators themselves, though the company says it will also use artificial intelligence to flag content that targets children but wasn’t properly identified as such.
Those who consider the settlement too weak are already concerned about what happens when video creators try to cheat the new system.
Democratic FTC Commissioner Rebecca Kelly Slaughter, in a dissenting opinion, said high-profile companies like Hasbro and Mattel will likely comply, as they won’t want to run afoul of federal rules even if it means fewer kids seeing their toy promotions.
But she said it’s less clear how it will curb abuses by the millions of others who post videos on YouTube — especially those outside the United States who are beyond the FTC’s “practical reach.”
Google Gets Slap On Wrist With $170 Million FTC Settlement Fine
Google will pay $170 million to settle allegations its YouTube video service collected personal data on children without their parents’ consent.
The company agreed to work with video creators to label material aimed at kids and said it will limit data collection when users view such videos, regardless of their age.
Some lawmakers and children’s advocacy groups, however, complained that the settlement terms aren’t strong enough to rein in a company whose parent, Alphabet, made a profit of $30.7 billion last year on revenue of $136.8 billion, mostly from targeted ads.
Google will pay $136 million to the Federal Trade Commission and $34 million to New York state, which had a similar investigation. The fine is the largest the FTC has levied against Google, but it’s tiny compared with the $5 billion fine against Facebook this year for privacy violations.
YouTube “baited kids with nursery rhymes, cartoons, and more to feed its massively profitable behavioral advertising business,” Democratic Commissioner Rohit Chopra said in a tweet. “It was lucrative, and it was illegal.”
The federal government has increased scrutiny of big tech companies in the past two years — especially questioning how the tech giants collect and use personal information from their billions of customers. Many of the huge Silicon Valley companies are also under antitrust investigations aimed at determining whether the companies have unlawfully stifled competition.
Kids under 13 are protected by a 1998 federal law that requires parental consent before companies can collect and share their personal information.
Tech companies typically skirt that by banning kids under 13 entirely, though such bans are rarely enforced. In YouTube’s lengthy terms of service, those who are under 13 are simply asked, “please do not use the Service.”
Yet many popular YouTube channels feature cartoons or sing-alongs made for children. According to the FTC, YouTube assigned ratings to its video channels and even had a “Y” category directed at kids ages 7 or under, but YouTube targeted ads to those kids just as it would to adults.
The FTC’s complaint includes as evidence Google presentations describing YouTube to toy companies Mattel and Hasbro as the “new Saturday Morning Cartoons” and the “#1 website regularly visited by kids.”
“YouTube touted its popularity with children to prospective corporate clients,” FTC Chairman Joe Simons said. But when it came to complying with the law, he said, “the company refused to acknowledge that portions of its platform were clearly directed to kids.”
According to the settlement, Google and YouTube will get “verifiable” consent from parents before they collect or use personal information from children. The company also agreed not to use data it had previously collected from children.
YouTube has its own service for children, YouTube Kids. The kids-focused service already requires parental consent and uses simple math problems to ensure that kids aren’t signing in on their own.
YouTube Kids does not target ads based on viewer interests the way the main YouTube service does. But the children’s version does track information about what kids are watching in order to recommend videos. It also collects personally identifying device information.
On Wednesday, Google said that starting early next year, YouTube will also limit personalized ads on its main service for videos meant for kids. Google is relying on video creators to label such items but will employ artificial intelligence to help.
YouTube won’t seek parental consent there, however, even on videos intended for children. YouTube is avoiding that precaution by instead turning off any personal tracking on those videos, saying it will collect only what is needed to make the service work. For such videos, YouTube also won’t offer features like comments and notifications.
The settlement now needs to be approved by a federal court in Washington. As with the Facebook settlement, the FTC vote was 3-2, with both Democrats opposing it as too weak.
Sen. Edward Markey, a Massachusetts Democrat, said the settlement won’t turn YouTube into a safe place for children and “makes clear that this FTC stands for ‘Forgetting Teens and Children.’”
A coalition of advocacy groups that helped trigger the investigation said the outcome will reduce behavioral advertising targeting children.
Jeff Chester, executive director of the Center for Digital Democracy, said the settlement “finally forced Google to confront its longstanding lie that it wasn’t targeting children on YouTube.”
But he said the “paltry” fine signals that politically powerful corporations can break the law without serious consequences.
Other critics, including dissenting Democratic Commissioner Rebecca Slaughter, said too much responsibility was being placed on video creators to classify their own content as kid-oriented, and thus limited to less-lucrative ads. They say that potentially allows Google to turn a blind eye as some try to cheat the system to make more money through ad revenue sharing.
Andrew Smith, the FTC’s consumer protection director, acknowledged that concern as valid, but said YouTube “has strong incentives to police its platform” to avoid further action.