California pushes ahead with its own children’s online privacy protections

California lawmakers have passed a bill that seeks to make apps and other online spaces safer for kids in the absence of robust federal standards. The bill, if signed into law, would impose a set of new protections for people under the age of 18 in California, potentially punishing tech companies with thousands of dollars in fines for every child affected by a violation.

The bill, the California Age-Appropriate Design Code Act, still needs to be signed by California Governor Gavin Newsom before becoming law. If signed, its provisions would go into effect on July 1, 2024, giving platforms an interval of time to come into compliance.

The new privacy rules would apply to social apps like Instagram, TikTok and YouTube — frequent targets of criticism for their mishandling of young users’ safety and mental health — but also to other businesses that offer “an online service, product, or feature likely to be accessed by children.” That broader definition would also extend the bill’s requirements to gaming and education platforms that kids might use, along with any other websites or services that don’t explicitly limit their use to adults.

The bill defines a child as anyone under the age of 18, pushing apps and other online products that might attract minors to enact more privacy protections for all under-18 users, not just the youngest ones. The federal law that carves out some privacy protections for children online, the Children’s Online Privacy Protection Act (COPPA), only extends its protections to children under age 13.

Among its requirements, the California children’s privacy legislation would prohibit companies from collecting any minor’s user data beyond what is absolutely necessary or leveraging children’s personal information in any way “materially detrimental to the physical health, mental health, or well-being of a child.” It would also require affected companies to default users under 18 to the strongest privacy settings, “including by disabling features that profile children using their previous behavior, browsing history, or assumptions of their similarity to other children, to offer detrimental material.”
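
To make the “strongest privacy settings by default” idea concrete, here is a minimal sketch in Python of how a service might gate defaults on age. The field names and the split between “minimal” and “standard” data collection are illustrative assumptions, not language from the bill.

```python
from dataclasses import dataclass

@dataclass
class PrivacySettings:
    profile_discoverable: bool
    precise_location_enabled: bool
    behavioral_profiling_enabled: bool  # profiling based on prior behavior/browsing history
    data_collection: str                # "minimal" = only what the service strictly needs

def default_settings_for(age: int) -> PrivacySettings:
    """Return account defaults; users under 18 get the most protective configuration."""
    if age < 18:
        return PrivacySettings(
            profile_discoverable=False,
            precise_location_enabled=False,
            behavioral_profiling_enabled=False,  # no recommendations built from past behavior
            data_collection="minimal",
        )
    return PrivacySettings(
        profile_discoverable=True,
        precise_location_enabled=False,
        behavioral_profiling_enabled=True,
        data_collection="standard",
    )
```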

The bill would also create a new working group, composed of members appointed by the governor and state agencies, dedicated to implementing its requirements. The California Attorney General would be empowered to fine companies that violate its rules $2,500 per affected child for violations deemed to be “negligent” and $7,500 per affected child for intentional violations.
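
Because the penalties are structured per child, exposure grows linearly with the number of affected minors. A back-of-the-envelope sketch (the affected-user count below is a made-up example):

```python
NEGLIGENT_FINE_PER_CHILD = 2_500
INTENTIONAL_FINE_PER_CHILD = 7_500

def max_exposure(children_affected: int, intentional: bool) -> int:
    """Upper bound on fines for a single violation under the bill's per-child structure."""
    rate = INTENTIONAL_FINE_PER_CHILD if intentional else NEGLIGENT_FINE_PER_CHILD
    return children_affected * rate

# A negligent violation touching a hypothetical 100,000 minors:
print(max_exposure(100_000, intentional=False))  # 250000000, i.e. $250 million
```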

“We are very encouraged by today’s bi-partisan passage of AB 2273, a monumental step toward protecting California kids online,” the children’s advocacy organization Common Sense said in a statement Tuesday. “Today’s action, authored by Assembly members Wicks, Cunningham and Petrie-Norris, sends an important signal about the need to make children’s online health and safety a greater priority for lawmakers and for our tech companies, particularly when it comes to websites that are accessed by young users.”

While there’s plenty of detail still to be worked out, the California bill could force the hand of tech companies that have historically prioritized explosive user growth and monetization above all else while dragging their feet on the less lucrative work of verifying the age of their users and protecting young people from online threats to safety and mental health. Inspired by the U.K. children’s privacy legislation known as the “Age Appropriate Design Code,” the California legislation could similarly force tech companies to improve their privacy standards for minors across the board rather than creating customized experiences for regionally specific user segments that fall under new legal protections.

FTC fines kids app developer HyperBeard $150K for use of third-party ad trackers

The U.S. Federal Trade Commission (FTC) today announced a $150,000 settlement with HyperBeard, the developer of a collection of children’s mobile games, over violations of the U.S. Children’s Online Privacy Protection Act Rule (COPPA Rule). The company’s applications have been downloaded more than 50 million times worldwide to date, according to data from app intelligence firm Sensor Tower.

A complaint filed by the Dept. of Justice on behalf of the FTC alleged that HyperBeard had violated COPPA by allowing third-party ad networks to collect personal information in the form of persistent identifiers to track users of the company’s child-directed apps. And it did so without notifying parents or obtaining verifiable parental consent, as is required. These ad networks then used the identifiers to target ads to children using HyperBeard’s games.

The company’s lineup included games like Axolochi, BunnyBuns, Chichens, Clawbert, Clawberta, KleptoCats, KleptoCats 2, KleptoDogs, MonkeyNauts, and NomNoms (not to be confused with toy craze Num Noms).

The FTC determined HyperBeard’s apps were marketed towards children because they used brightly colored, animated characters like cats, dogs, bunnies, chicks, monkeys, and other cartoon characters and were described in child-friendly terms like “super cute” and “silly.” The company also marketed its kids’ apps on a kids’ entertainment website, YayOMG, published children’s books, and licensed other products, including stuffed animals and block construction sets, based on its app characters.

Unbelievably, the company posted disclaimers on its marketing materials stating that these apps were not meant for children under 13.

Above: A disclaimer on the NomNoms game website. 

In HyperBeard’s settlement with the FTC, the company has agreed to pay a $150,000 fine and to delete the personal information it illegally collected from children under the age of 13. The settlement had originally included a $4 million penalty, but the FTC suspended it over HyperBeard’s inability to pay the full amount. However, that larger amount will become due if the company or its CEO, Alexander Kozachenko, is ever found to have misrepresented their finances.

HyperBeard is not the first tech company to be charged with COPPA violations. Two high-profile examples preceding it were YouTube and Musical.ly (TikTok)’s settlements of $170 million and $5.7 million, respectively, both in 2019. By comparison, HyperBeard’s fine seems minimal. However, its case is different from either video platform as the company itself was not handling the data collection – it was permitting ad networks to do so.

The complaint explained that HyperBeard let third-party advertising networks serve ads and collect personal information, in the form of persistent identifiers, in order to serve behavioral ads — meaning, targeted ads based on users’ activity over time and across sites.

This requires parental consent, but companies have skirted this rule for years — or outright ignored it, like YouTube did.

The ad networks used in HyperBeard’s apps included AdColony, AdMob, AppLovin, Facebook Audience Network, Fyber, IronSource, Kiip, TapCore, TapJoy, Vungle, and UnityAds. Despite being notified of the issue by watchdogs and the FTC, HyperBeard neither alerted any of the ad networks that its apps were directed toward kids nor made changes itself.
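
Each network exposes this differently, so the following is only a hypothetical sketch of what “alerting an ad network” amounts to in practice: dropping persistent identifiers from the ad request and flagging it as child-directed so only contextual, non-behavioral ads are served. The dictionary keys here are invented for illustration and do not correspond to any specific SDK.

```python
def build_ad_request(base_request: dict, child_directed: bool) -> dict:
    """Prepare a request for a third-party ad network (hypothetical field names).

    For child-directed content, strip persistent identifiers and flag the request
    so the network serves contextual rather than behavioral ads.
    """
    request = dict(base_request)
    if child_directed:
        request.pop("advertising_id", None)         # persistent identifier used for cross-app tracking
        request.pop("device_fingerprint", None)
        request["child_directed_treatment"] = True  # tell the network not to build a profile
        request["ad_type"] = "contextual"           # target the content, not the viewer
    return request
```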

The issues around the invasiveness of third-party ad networks and trackers — and their questionable data collection practices — have come into the spotlight thanks to in-depth reporting about app privacy issues, various privacy experiments, petitions against their use — and, more recently, as a counter-argument to Apple’s marketing of its iPhone as a privacy-conscious device.

Last year, these complaints finally led Apple to ban the use of third-party networks and trackers in any iOS apps aimed at kids.

HyperBeard’s install base was below 50 million at the time of the settlement, we understand. According to Sensor Tower, around 12 million of HyperBeard’s installs to date have come from its most popular title, Adorable Home, which only launched in January 2020. U.S. consumers have so far accounted for about 18% of the company’s total installs to date, followed by the Chinese App Store at 14%. So far in 2020, Vietnam has emerged as the leading market with close to 24% of all installs since January, while the U.S. has dropped to No. 7 overall, with a 7% share.

The FTC’s action against HyperBeard should serve as a warning to other app developers that simply saying an app is not meant for kids doesn’t exempt them from following COPPA guidelines, when it’s clear the app is targeting kids. In addition, app makers can and will be held liable for the data collection practices of third-party ad networks, even if the app itself isn’t storing kids’ personal data on its own servers.

“If your app or website is directed to kids, you’ve got to make sure parents are in the loop before you collect children’s personal information,” said Andrew Smith, Director of the FTC’s Bureau of Consumer Protection, in a statement about the settlement. “This includes allowing someone else, such as an ad network, to collect persistent identifiers, like advertising IDs or cookies, in order to serve behavioral advertising,” he said.

YouTube starts limiting ad targeting and data collection on kids content

YouTube now officially limits the amount of data it and creators can collect on content intended for children, following promises made in November and a costly $170 million FTC fine in September. Considering how lucrative kids’ content is for the company, this could have serious financial ramifications for both it and its biggest creators.

The main change is, as announced in November, that for all content detected or marked as being for kids, viewers will be considered children no matter what. Even if you’re a verified, paying YouTube Premium customer (we know you’re out there), your data will be sanitized as if YouTube thinks you’re a 10-year-old kid.

There are plenty of reasons for this, most of them to do with avoiding liability. It’s just the safer path for the company to make the assumption that anyone viewing kids’ content is a kid — but it comes with unfortunate consequences.

Reduced data collection means no targeted ads. And targeted ads are much more valuable than ordinary ones. So this is effectively a huge revenue hit to anyone making children’s content — for instance YouTube’s current top-earning creator, Ryan Kaji (a kid himself).

It also limits the insights creators can have on their viewers, crucial information for anyone hoping to understand their demographics and improve their metrics. Engagement drivers like comments and notifications are also disabled, to channels’ detriment.

Google for its part says that it is “committed to helping creators navigate this new landscape and to supporting our ecosystem of family content.” How exactly it plans to do that isn’t clear; many have already complained that the system is confusing and that this could be a death sentence for kids’ channels on YouTube.

Now that the policy is official, we’ll probably soon hear exactly how it is impacting creators and what, if anything, Google actually does to mitigate that.

YouTube asks the FTC to clarify how video creators should comply with COPPA ruling

YouTube is asking the U.S. Federal Trade Commission for further clarification and better guidance to help video creators understand how to comply with the FTC’s guidelines set forth as part of YouTube’s settlement with the regulator over its violations of children’s privacy laws. The FTC in September imposed a historic fine of $170 million for YouTube’s violations of COPPA (the U.S. Children’s Online Privacy Protection Act). It additionally required YouTube creators to now properly identify any child-directed content on the platform.

To comply with the ruling, YouTube created a system where creators could either label their entire channel as child-directed, or they could identify only certain videos as being directed at children, as needed. Videos that are considered child-directed content would then be prohibited from collecting personal data from viewers. This limited creators’ ability to leverage Google’s highly profitable behavioral advertising technology on videos kids were likely to watch.
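
Conceptually, the designation cascades from the channel level down to individual videos, and the label then switches off a bundle of features. The sketch below is a simplified, hypothetical model of that logic (not YouTube’s actual implementation), using the restrictions described across these articles:

```python
from typing import Optional

def resolve_made_for_kids(channel_flag: bool, video_flag: Optional[bool]) -> bool:
    """A per-video label, when set, overrides the channel-level designation."""
    return channel_flag if video_flag is None else video_flag

def features_for(made_for_kids: bool) -> dict:
    """The feature restrictions described in these articles, as a simple lookup."""
    if made_for_kids:
        return {
            "personalized_ads": False,   # behavioral ad targeting disabled
            "comments": False,
            "notifications": False,
            "viewer_data_collection": "limited",
        }
    return {
        "personalized_ads": True,
        "comments": True,
        "notifications": True,
        "viewer_data_collection": "standard",
    }
```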

As a result, YouTube creators have been in an uproar since the ruling, arguing that it’s too difficult to tell the difference between what’s child-directed content and what’s not. Several popular categories of YouTube videos — like gaming, toy reviews, and family vlogging, for instance — fall under gray areas, where they’re watched by children and adults alike. But because the FTC’s ruling leaves creators liable for any future violations, YouTube could only advise creators to consult a lawyer to help them work through the ruling’s impact on their own channels.

Today, YouTube says it’s asking the FTC to provide more clarity.

“Currently, the FTC’s guidance requires platforms must treat anyone watching primarily child-directed content as children under 13. This does not match what we see on YouTube, where adults watch favorite cartoons from their childhood or teachers look for content to share with their students,” noted YouTube in an announcement. “Creators of such videos have also conveyed the value of product features that wouldn’t be supported on their content. For example, creators have expressed the value of using comments to get helpful feedback from older viewers. This is why we support allowing platforms to treat adults as adults if there are measures in place to help confirm that the user is an adult viewing kids’ content,” the company said.

Specifically, YouTube wants the FTC to clarify what’s to be done when adults are watching kids’ content. It also wants to know what’s to be done about content that doesn’t intentionally target kids — like videos in the gaming, DIY and art space, for example — if those videos end up attracting a young audience. Are these also to be labeled “made for kids,” even though that’s not their intention? YouTube asks.

The FTC had shared some guidance in November, which YouTube passed along to creators. But YouTube says it’s not enough as it doesn’t help creators to understand what’s to be done about this “mixed audience” content.

YouTube says it supports platforms treating adults who view primarily child-directed video content as adults, as long as there are measures in place to help confirm the user is an adult. It didn’t suggest what those measures would be, though possibly this could involve users logged in to an adult-owned Google account or perhaps an age-gate measure of some sort.

YouTube submitted its statements as a part of the FTC’s comment period on the agency’s review of the COPPA Rule, which has been extended until December 11, 2019. The FTC is giving commenters additional time to submit comments and an alternative mechanism to file them as the federal government’s Regulations.gov portal is temporarily inaccessible. Instead, commenters can submit their thoughts via email to the address secretary@ftc.gov, with the subject line “COPPA comment.” These must be submitted before 11:59 PM ET on Dec. 11, the FTC says.

YouTube’s announcement, however, pointed commenters to the FTC’s website, which isn’t working right now.

“We strongly support COPPA’s goal of providing robust protections for kids and their privacy. We also believe COPPA would benefit from updates and clarifications that better reflect how kids and families use technology today, while still allowing access to a wide range of content that helps them learn, grow and explore. We continue to engage on this issue with the FTC and other lawmakers (we previously participated in the FTC’s public workshop) and are committed to continue [sic] doing so,” said YouTube.

Instagram still doesn’t age-check kids. That must change.

Instagram dodges child safety laws. By not asking users their age upon signup, it can feign ignorance about how old they are. That way, it can’t be held liable for $40,000 per violation of the Children’s Online Privacy Protection Act. The law bans online services from collecting personally identifiable information about kids under 13 without parental consent. Yet Instagram is surely stockpiling that sensitive info about underage users, shrouded by the excuse that it doesn’t know who’s who.

But here, ignorance isn’t bliss. It’s dangerous. User growth at all costs is no longer acceptable.

It’s time for Instagram to step up and assume responsibility for protecting children, even if that means excluding them. Instagram needs to ask users’ age at signup, work to verify by all practical means that they volunteer their accurate birthdate, and enforce COPPA by removing users it knows are under 13. If it wants to allow tweens on its app, it needs to build a safe, dedicated experience where the app doesn’t suck in COPPA-restricted personal info.

Instagram is woefully behind its peers. Both Snapchat and TikTok require you to enter your age as soon as you start the sign-up process. This should really be the minimum regulatory standard, and lawmakers should close the loophole that lets services skirt compliance by simply not asking. If users register for an account, they should be required to enter an age of 13 or older.
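
Mechanically, an age gate of the kind Snapchat and TikTok use is simple: collect a birthdate at signup, compute the user’s age, and branch on it. A minimal sketch, assuming a plain birthdate field and the under-13/under-18 thresholds discussed here:

```python
from datetime import date

COPPA_MIN_AGE = 13

def age_on(birthdate: date, today: date) -> int:
    """Completed years of age as of `today`."""
    had_birthday = (today.month, today.day) >= (birthdate.month, birthdate.day)
    return today.year - birthdate.year - (0 if had_birthday else 1)

def handle_signup(birthdate: date) -> str:
    """Reject under-13 signups; flag under-18 accounts for stricter protections."""
    age = age_on(birthdate, date.today())
    if age < COPPA_MIN_AGE:
        return "rejected"        # no account without verifiable parental consent
    if age < 18:
        return "minor_account"   # apply more protective defaults
    return "adult_account"
```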

Instagram’s parent company Facebook has been asking for birthdate during account registration since its earliest days. Sure, it adds one extra step to signup, and it impedes growth numbers by discouraging kids from getting hooked early on the social network. But it also benefits Facebook’s business by letting it accurately age-target ads.

Most importantly, at least Facebook is making a baseline effort to keep out underage users. Of course, as kids do when they want something, some are going to lie about their age and say they’re old enough. Ideally, Facebook would go further and try to verify the accuracy of a user’s age using other available data, and Instagram should too.

Both Facebook and Instagram currently have moderators lock the accounts of any users they stumble across who they suspect are under 13. Users must upload government-issued proof of age to regain control. That policy only went into effect last year, after the UK’s Channel 4 reported that a Facebook moderator was told to ignore seemingly underage users unless they explicitly declared they were too young or were reported for being under 13. An extreme approach would be to require this for all signups, though that might be expensive, slow, significantly hurt signup rates, and annoy of-age users.

Instagram is currently on the other end of the spectrum. Doing nothing around age-gating seems recklessly negligent. When asked for comment about why it doesn’t ask users’ ages, how it stops underage users from joining, and whether it’s in violation of COPPA, Instagram declined to comment. The fact that Instagram claims not to know users’ ages seems to be in direct contradiction with its offering marketers custom ad targeting by age, such as reaching just those who are 13.

Luckily, this could all change soon.

Mobile researcher and frequent TechCrunch tipster Jane Manchun Wong has spotted code inside Instagram’s Android app showing that it’s prototyping an age-gating feature that rejects users under 13. It’s also tinkering with requiring your Instagram and Facebook birthdates to match. Instagram gave me a “no comment” when I asked whether these features would officially roll out to everyone.

Code in the app explains that “Providing your birthday helps us make sure you get the right Instagram experience. Only you will be able to see your birthday.” Beyond just deciding who to let in, Instagram could use this info to make sure users under 18 aren’t messaging with adult strangers, that users under 21 aren’t seeing ads for alcohol brands, and that potentially explicit content isn’t shown to minors.

Instagram’s inability to do any of this clashes with its and Facebook’s big talk this year about their commitment to safety. Instagram has worked to improve its approach to bullying, drug sales, self-harm, and election interference, yet there’s been not a word about age gating.

Meanwhile, underage users promote themselves on pages for hashtags like #12YearOld where it’s easy to find users who declare they’re that age right in their profile bio. It took me about 5 minutes to find creepy “You’re cute” comments from older men on seemingly underage girls’ photos. Clearly Instagram hasn’t been trying very hard to stop them from playing with the app.

I brought up the same unsettling situations on Musical.ly, now known as TikTok, to its CEO Alex Zhu on stage at TechCrunch Disrupt in 2016. I grilled Zhu about letting 10-year-olds flaunt their bodies on his app. He tried to claim parents run all of these kids’ accounts, and he got frustrated as we dug deeper into Musical.ly’s failures here.

Thankfully, TikTok was eventually fined $5.7 million this year for violating COPPA and forced to change its ways. As part of its response, TikTok started showing an age gate to both new and existing users, removed all videos of users under 13, and restricted those users to a special TikTok Kids experience where they can’t post videos, comment, or provide any COPPA-restricted personal info.

If even a Chinese social media app that Facebook’s CEO has warned threatens free speech with censorship is doing a better job protecting kids than Instagram, something’s gotta give. Instagram could follow suit, building a special section of its apps just for kids where they’re quarantined from conversing with older users who might prey on them.

 

Perhaps Facebook and Instagram’s hands-off approach stems from the fact that CEO Mark Zuckerberg doesn’t think the ban on under-13-year-olds should exist. Back in 2011, he said “That will be a fight we take on at some point . . . My philosophy is that for education you need to start at a really, really young age.” He’s put that into practice with Messenger Kids, which lets 6- to 12-year-olds chat with their friends if parents approve.

The Facebook family of apps’ ad-driven business model and earnings depend on constant user growth that could be inhibited by stringent age gating. It surely doesn’t want to admit to parents that it has let kids slide into Instagram, to advertisers that they were paying to reach children too young to buy anything, or to Wall Street that it might not have the 2.8 billion legal users across its apps that it claims.

But given Facebook and Instagram’s privacy scandals, addictive qualities, and impact on democracy, it seems like proper age-gating should be a priority as well as the subject of more regulatory scrutiny and public concern. Society has woken up to the harms of social media, yet Instagram erects no guards to keep kids from experiencing those ills for themselves. Until it makes an honest effort to stop kids from joining, the rest of Instagram’s safety initiatives ring hollow.

YouTube to update harassment and gaming policies, clarify changes to kids content

In YouTube CEO Susan Wojcicki’s quarterly letter, released today, the exec addresses a number of changes to YouTube policies, including the recent FTC-mandated rules for kids content that have alternately confused and infuriated video creators, as well as forthcoming policies around harassment and gaming videos, among other things.

On the latter, Wojcicki said the company was now in the process of developing a new harassment policy and was talking to creators about what needed to be addressed. She did not give an ETA for the rollout, but said creators would be kept posted when the changes were finalized.

YouTube also responded to creator concerns over policies around gaming videos that include violence.

“We’ve heard loud and clear that our policies need to differentiate between real-world violence and gaming violence,” Wojcicki said. “We have a policy update coming soon that will do just that. The new policy will have fewer restrictions for violence in gaming, but maintain our high bar to protect audiences from real-world violence.”

This topic was recently discussed at YouTube’s Gaming Creator Summit, as well.

The company also said it’s now working to match edgier content with advertisers who may be interested in it — like a marketer who wants to promote an R-rated movie, for example.

The letter briefly addressed the creator uproar over kids’ content, with promises of more clarity.

In September, YouTube reached a settlement with the Federal Trade Commission over its violation of the Children’s Online Privacy Protection Act (COPPA), which required it to pay a $170 million fine and set into place a series of new rules for creators to comply with. These rules require creators to mark videos that are directed at kids (or entire channels, if need be). This, in turn, will limit data collection, put an end to personalized ads on kids’ content, disable comments, and reduce their revenues, creators say.

Creators will also lose out on a number of key YouTube features, The Verge recently reported, including click-through info cards, end screens, notification functions, and the community tab.

YouTube creators say they don’t have enough clarity around where to draw the line between content that’s made for kids and content that may attract kids. For example, family vlog channels and some gaming videos may appeal to kids and adults alike. And if the FTC decides a creator is in violation, they can be held liable for future COPPA violations now that YouTube’s new policy and content labeling system is in place. YouTube’s advice to creators on how to proceed? Consult a lawyer, it has said.

In today’s letter, Wojcicki acknowledges the fallout of these changes, but doesn’t offer any further clarity — only promises of updates to come.

“We know there are still many questions about how this is going to affect creators and we’ll provide updates as possible along the way,” Wojcicki said.

She also points to a long thread on the YouTube Community forum where many questions about the system are being answered — like the policy’s reach, what’s changing, how and when to mark content as being for kids or not, and more. The forum’s Q&A also addressed some of the questions that keep coming up about all-ages content, including some example scenarios. Creators, of course, have read through these materials and say they still don’t understand how to figure out if their video is for kids or not. (And clearly, they don’t want to err on the side of caution at the risk of reduced income.)

That being said, the rise of a kid-friendly YouTube has had a range of negative consequences. YouTube had to shut down comments after finding a ring of child predators on videos with kids, for instance. Parents roped their kids into the “family business” before the children even knew what being public on the internet meant. Some young stars have been put to work more than should be legal due to the lack of child labor laws for online content. There’s even been child abuse at the hands of the parents. Children watching the videos, meanwhile, were being marketed to without their understanding, addicted to consumerism by toy unboxings and playtime videos, and targeted with personalized ads. Kid YouTube was overdue for a reining in.

The letter addresses a few other key issues, as well, including the launch of the new Creator Studio and the latest on the EU’s copyright directive Article 17, which is now being translated into local law. Wojcicki cheers some of the changes to the policy, including the one that secures liability protections when YouTube makes its best efforts to match copyrighted material with rights owners.

And Wojcicki addresses the growing concerns over creator burnout by reminding video creators to take a break and practice self-care, adding that doing so won’t harm their business. In fact, YouTube scoured its data from the past six years and found that, on average, channels both large and small had more views when they returned than they had right before they left.

“If you need to take some time off, your fans will understand,” she said.

 

Ahead of FTC ruling, YouTube Kids is getting a website

Ahead of the official announcement of an FTC settlement which could force YouTube to direct under-13-year-old users to a separate experience for YouTube’s kid-friendly content, the company has quietly announced plans to launch its YouTube Kids service on the web. Previously, parents would have to download the YouTube Kids app to a mobile device in order to access the filtered version of YouTube.

By bringing YouTube Kids to the web, the company is preparing for the likely outcome of an FTC settlement that would require it to implement an age-gate on its site, then redirect under-13-year-olds to a separate kid-friendly experience.

In addition, YouTube Kids is gaining a new filter that will allow parents to limit content to what’s appropriate for preschoolers.

The announcement, published to the YouTube Help forums, was first spotted by Android Police.

It’s unclear if YouTube was intentionally trying to keep these changes from being picked up by a larger audience (or the press) by publishing the news to a forum instead of its official YouTube blog. (The company tells us it publishes a lot of news to the forum site. Sure, okay. But with an FTC settlement looming, it seems an odd destination for such a key announcement.)

It’s also worth noting that, around the same time as the news was published, YouTube CEO Susan Wojcicki posted her quarterly update for YouTube creators.

The update is intended to keep creators abreast of what’s in store for YouTube and its community. But this quarter, her missive spoke solely about the value in being an open platform, and didn’t touch on anything related to kids content or the U.S. regulator’s investigation.

However, it’s precisely YouTube’s position on “openness” that concerns parents when it comes to their kids watching YouTube videos. The platform’s (almost) “anything goes” nature means kids can easily stumble upon content that’s too adult, controversial, hateful, fringe, or offensive.

The YouTube Kids app is meant to offer a safer destination, but YouTube isn’t manually reviewing each video that finds its way there. That has led to inappropriate and disturbing content slipping through the cracks on numerous occasions, and eroding parents’ trust.

Above: The YouTube Kids website.

Because many parents don’t believe YouTube Kids’ algorithms can filter content appropriately, the company last fall introduced the ability for parents to whitelist specific videos or channels in the Kids app. It also rolled out a feature that customized the app’s content for YouTube’s older users, ages 8 through 12. This added gaming content and music videos.

Now, YouTube is further breaking up its “Younger” content level filter, which was previously 8 and under, into two parts. Starting now, “Younger” applies to ages 5 through 7, while the new “Preschool” filter is for the age 4 and under group. The latter will focus on videos that promote “creativity, playfulness, learning, and exploration,” says YouTube.
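
Put another way, the filter now maps a child’s age onto three tiers rather than two. A small sketch of that mapping, with the 13-and-over fallback added as an assumption for completeness:

```python
def content_tier(age: int) -> str:
    """Map a child's age to the YouTube Kids filter levels described above."""
    if age <= 4:
        return "Preschool"   # creativity, playfulness, learning, and exploration
    if age <= 7:
        return "Younger"
    if age <= 12:
        return "Older"       # adds gaming content and music videos
    return "Standard YouTube"  # assumption: 13+ falls outside YouTube Kids
```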

Above: the content filter before

YouTube confirmed to TechCrunch that its forum announcement is accurate, but the company would not say when the YouTube Kids web version would go live, beyond “this week.”

The YouTube Kids changes are notable because they signal that YouTube is getting things in place before an FTC settlement announcement that will impact how the company handles kids content and its continued use by young children.

It’s possible that YouTube will be fined by the FTC for its violations of COPPA, as Musical.ly (TikTok) was earlier this year. One report, citing unnamed sources, says the FTC’s YouTube settlement has, in fact, already been finalized and includes a multimillion-dollar fine.

YouTube will also likely be required to implement an age-gate on its site and in its apps that will direct under-13-year-olds to the YouTube Kids platform instead of YouTube proper. The settlement may additionally require YouTube to stop targeting ads on videos aimed at children, as has been reported by Bloomberg. 

We probably won’t see the FTC issuing a statement about its ruling ahead of this Labor Day weekend, but it may do so in advance of its October workshop focused on refining the COPPA regulation — an event that has the regulator looking for feedback on how to properly handle sites like YouTube.

 

 

The FTC looks to change children’s privacy law following complaints about YouTube

The U.S. Federal Trade Commission is considering an update to the laws governing children’s privacy online, known as the COPPA Rule (or, the Children’s Online Privacy Protection Act). The Rule first went into effect in 2000 and was amended in 2013 to address changes in how children use mobile devices and social networking sites. Now, the FTC believes it may be due for more revisions. The organization is seeking input and comments on possible updates, some of which are specifically focused on how to address sites that aren’t necessarily aimed at children, but have large numbers of child users.

In other words, sites like YouTube.

The FTC’s announcement comes only weeks after U.S. consumer advocacy groups and Senator Ed Markey (D-Mass.) sent complaint letters to the FTC, urging the regulators to investigate YouTube for potential COPPA violations.

The advocacy groups allege that YouTube is hiding behind its terms of service, which claim YouTube is “not intended for children under 13” — a statement that’s clearly no longer true. Today, the platform is filled with videos designed for viewing by kids. Google even offers a YouTube Kids app aimed at preschool- to tween-aged children.

According to the letter written by the Campaign for a Commercial-Free Childhood (CCFC) and the Center for Digital Democracy (CDD), Google has now collected personal information from nearly 25 million children in the U.S., and it used this data to engage in “very sophisticated digital marketing techniques.”

The groups want YouTube to delete the children’s data, set up an age-gate on the site, and separate out any kids content into its own app where YouTube will have to properly follow COPPA guidelines.

These demands are among those pushing the FTC to this action.

The Commission says it wants input as to whether COPPA should be updated to better address websites and online services that are not traditionally aimed at children but are used by kids, as well as whether these “general audience platforms” should have to identify and police the child-directed content that’s uploaded by third parties.

In other words, should the FTC amend COPPA so it can protect the privacy of the kids using YouTube?

“In light of rapid technological changes that impact the online children’s marketplace, we must ensure COPPA remains effective,” said FTC Chairman Joe Simons, in a published statement. “We’re committed to strong COPPA enforcement, as well as industry outreach and a COPPA business hotline to foster a high level of COPPA compliance. But we also need to regularly revisit and, if warranted, update the Rule,” he added.

While YouTube is a key focus, the FTC will also seek comment on whether there should be an exception for parental consent for the use of educational technology in schools. And it wants to better understand the implications for COPPA in terms of interactive media, like interactive TV (think Netflix’s Minecraft: Story Mode, for example), or interactive gaming.

More broadly, the FTC wants to know how COPPA has impacted the availability of sites and services aimed at children, it says.

The decision to initiate a review of COPPA was unanimous among the FTC’s five commissioners, who include three Republicans and two Democrats.

Led by Simons, the FTC in February took action against Musical.ly (now TikTok), by issuing a record $5.7 million fine for its COPPA violations. Similar to YouTube, the app was used by a number of under-13 kids without parental consent. The company knew this was the case, but continued to collect the kids’ personal information, regardless.

“This record penalty should be a reminder to all online services and websites that target children: We take enforcement of COPPA very seriously, and we will not tolerate companies that flagrantly ignore the law,” Simons had said at the time.

The settlement with TikTok required the company to delete children’s videos and data and restrict underage users from being able to film videos.

It’s unclear why the FTC can’t now require the same of YouTube, given the similarities between the two services, without amending the law.

“They absolutely can and should fine YouTube, not to mention force YouTube to make significant changes, under the current regulations,” says Josh Golin, the Executive Director for CCFC. “As for the YouTube decision – by far the most important COPPA case in the agency’s history – it’s extremely concerning that the Commission appears to be signaling they do not have the authority under the current rules to hold YouTube accountable,” he says.

“COPPA rules could use some updating but the biggest problem with the law is the FTC’s lack of enforcement, which is something the Commission could address right away without a lengthy comment period,” Golin adds.

The FTC says it will hold a public workshop on October 7, 2019 to examine the COPPA Rule.

U.S. Senator and consumer advocacy groups urge FTC to take action on YouTube’s alleged COPPA violations

The groups behind a push to get the U.S. Federal Trade Commission to investigate YouTube’s alleged violation of children’s privacy law, COPPA, have today submitted a new letter to the FTC that lays out the appropriate sanctions the groups want the FTC to now take. The letter comes shortly after news broke that the FTC was in the final stages of its probe into YouTube’s business practices regarding this matter.

They’re joined in pressing the FTC to act by COPPA co-author, Senator Ed Markey, who penned a letter of his own, which was also submitted today.

The groups’ formal complaint with the FTC was filed back in April 2018. The coalition, which then included 20 child advocacy, consumer and privacy groups, had claimed YouTube doesn’t get parental consent before collecting the data from children under the age of 13 — as is required by the Children’s Online Privacy Protection Act, also known as COPPA.

The organizations said, effectively, that YouTube was hiding behind its terms of service which claims that YouTube is “not intended for children under 13.”

This simply isn’t true, as any YouTube user knows. YouTube is filled with videos that explicitly cater to children, from cartoons to nursery rhymes to toy ads — the latter of which often come about by way of undisclosed sponsorships between toy makers and YouTube stars. The video creators will excitedly unbox or demo toys they received for free or were paid to feature, and kids just eat it all up.

In addition, YouTube curates much of its kid-friendly content into a separate YouTube Kids app that’s designed for the under-13 crowd — even preschoolers.

Meanwhile, YouTube treats children’s content like any other. That means targeted advertising and commercial data collection are taking place, the groups’ complaint states. YouTube’s algorithms also recommend videos and autoplay its suggestions — a practice that led to kids being exposed to inappropriate content in the past.

Today, two of the leading groups behind the original complaint — the Campaign for a Commercial-Free Childhood (CCFC) and Center for Digital Democracy (CDD) — are asking the FTC to impose the maximum civil penalties on YouTube because, as they’ve said:

Google had actual knowledge of both the large number of child-directed channels on YouTube and the large numbers of children using YouTube. Yet, Google collected personal information from nearly 25 million children in the U.S over a period of years, and used this data to engage in very sophisticated digital marketing techniques. Google’s wrongdoing allowed it to profit in two different ways: Google has not only made a vast amount of money by using children’s personal information as part of its ad networks to target advertising, but has also profited from advertising revenues from ads on its YouTube channels that are watched by children.

The groups are asking the FTC to impose a 20-year consent decree on YouTube.

They want the FTC to order YouTube to destroy all data from children under 13, including any inferences drawn from the data, that’s in Google’s possession. YouTube should also stop collecting data from anyone under 13, including anyone viewing a channel or video directed at children. Kids’ ages also need to be identified so they can be prevented from accessing YouTube.

Meanwhile, the groups suggest that all the channels in the Parenting and Family lineup, plus any other channels or videos directed at children, be removed from YouTube and placed into a separate platform for children (e.g., the YouTube Kids app).

This is something YouTube is already considering, according to a report from The Wall Street Journal last week.

This separate kids platform would have a variety of restrictions, including no commercial data collection; no links out to other sites or online services; no targeted marketing; no product or brand integration; no influencer marketing; and even no recommendations or autoplay.

The removal of autoplaying videos and recommendations, in particular, would be a radical change to how YouTube operates, but one that could protect kids from inappropriate content that slips in. It’s also a change that some employees inside YouTube itself were pushing for, according to The WSJ’s report.

The groups also urge the FTC to require Google to fund educational campaigns around the true nature of Google’s data-driven marketing systems, admit publicly that it violated the law, and submit to annual audits to ensure its ongoing compliance. They want Google to commit $100 million to establish a fund that supports the production of noncommercial, high-quality and diverse content for kids.

Finally, the groups are asking that Google face the maximum possible civil penalties — $42,530 per violation, which could be counted either per child or per day. This monetary relief needs to be severe, the groups argue, so Google and YouTube will be deterred from ever violating COPPA in the future.

This laundry list of suggestions is more like a wish list of what the ideal resolution would look like; it doesn’t mean the FTC will follow through on all of them.

However, it seems likely that the Commission would at least require YouTube to delete the improperly collected data and isolate the kids’ YouTube experience in some way. After all, that’s precisely what it just did with TikTok (previously Musical.ly), which earlier this year paid a record $5.7 million fine for its own COPPA violations. It also had to implement an age gate where under-13 kids were restricted from publishing content.

The advocacy groups aren’t the only ones making suggestions to the FTC.

Senator Ed Markey (D-Mass.) also sent the FTC a letter today about YouTube’s violations of COPPA — a piece of legislation that he co-authored.

In his letter, he urges the FTC to take a similar set of actions, saying:

“I am concerned that YouTube has failed to comply with COPPA. I therefore, urge the Commission to use all necessary resources to investigate YouTube, demand that YouTube pay all monetary penalties it owes as a result of any legal violations, and instruct YouTube to institute policy changes that put children’s well-being first.”

His suggestions are similar to those being pushed by the advocacy groups. They include demands for YouTube to delete the children’s data and cease data collection on those under 13; implement an age gate on YouTube to come into compliance with COPPA; prohibit targeted and influencer marketing; offer detailed explanations of what data is collected if it’s for “internal purposes”; undergo a yearly audit; provide documentation of compliance upon request; and establish a fund for noncommercial content.

He also wants Google to sponsor a consumer education campaign warning parents that no one under 13 should use YouTube, and wants Google to be prohibited from launching any new child-directed product until it has been reviewed by an independent panel of experts.

The FTC’s policy doesn’t allow it to confirm or deny nonpublic investigations. YouTube hasn’t yet commented on the letters.