YouTube is turning off kids’ ability to write comments in supervised experiences

YouTube’s comments section has historically had a bad reputation, but a change now rolling out could prevent kids from wading into the comments cesspool. In an email to parents who supervise a child’s account, the company announced the introduction of a “read-only” comments option on their child’s supervised experience on YouTube. The feature will roll out […]

© 2024 TechCrunch. All rights reserved. For personal use only.

Snapchat turns off controversial ‘Solar System’ feature by default after bad press

Less than a week after The Wall Street Journal reported on how a Snapchat feature dubbed “solar system” was adding to teens’ anxiety, the company has responded by adjusting how the feature works. The ranking system for paid subscribers today shows you how close you are to your Snapchat friends by displaying your position in […]

k-ID launches a solution that helps game developers comply with ever-changing child safety regulations

Making a video game successful is already hard. Doing so while complying with the growing number of child safety laws and regulations around the world is an almost insurmountable task. A new technology company called k-ID aims to make this process much easier for game makers, by offering a framework that protects publishers and developers […]

Pinterest rolls out new teen safety features, including wiping followers from users 15 and under

Pinterest today introduced a series of new safety features aimed at better protecting teens using its service. The features — which include things like private profiles, more control over followers, and message safety controls — are similar to those introduced on other social media platforms popular with teens, like Instagram.

On Pinterest, users ages 16 and older will now be able to opt into either a private or public profile. If they choose to go private, they can still network with friends by sending a unique link that allows them to connect. These private profiles — which remain the default for users under the age of 16 — are not discoverable by others on the platform.

This mirrors the decision made at Instagram, which also defaults users under 16 to private but allows teens to make their profiles public if they choose.

In addition, all users, including teens under 16, will have more control over their followers and will be able to review and remove followers they don’t want tracking their posts. For teens under 16, existing followers will be removed, forcing them to start fresh and specifically choose who they want to collaborate with and who they trust.

That’s likely going to generate some backlash from teens who have been growing their following and will have to start over re-adding all their friends.

However, these changes come as a necessity given the reporting from earlier this year by NBC News, which exposed how pedophiles had been using Pinterest’s service to curate image boards of young girls. Many of the users under 16 had lied about their ages, which is why their profiles were discoverable. As a result, Pinterest announced a new set of parental controls and updated age verification policies, among other things to better protect its younger users.

At the time, it said it would update other features so teens would be able to safely connect with people they know and trust in the future.

Another of today’s changes follows up on that promise: messaging and group boards are now available to all users, but only with people you’ve given permission to contact you or collaborate. Teens under 16 will only be able to receive messages from mutual followers who were accepted through a shared profile link. That link expires after three days or after the sender gains their first five followers through it, whichever comes first.
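The expiry rule for these shared profile links — three days or five accepted followers, whichever comes first — can be sketched in a few lines. This is purely an illustrative model; the class and field names are hypothetical, not Pinterest’s actual implementation:

```python
from datetime import datetime, timedelta

# Hypothetical model of the shared-link expiry rule described above:
# a link stops working after three days, or after it has been used to
# accept five followers, whichever comes first.
LINK_TTL = timedelta(days=3)
MAX_ACCEPTS = 5

class ProfileLink:
    def __init__(self, created_at: datetime):
        self.created_at = created_at
        self.accepts = 0  # followers accepted through this link so far

    def is_valid(self, now: datetime) -> bool:
        # Both conditions must still hold for the link to work.
        return (now - self.created_at) < LINK_TTL and self.accepts < MAX_ACCEPTS

    def accept_follower(self, now: datetime) -> bool:
        # Accept a follower only while the link is still valid.
        if not self.is_valid(now):
            return False
        self.accepts += 1
        return True
```

Either condition alone is enough to invalidate the link: a sixth accept is rejected even within the three-day window, and an unused link still dies after three days.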

Meanwhile, for parents and guardians, Pinterest already offers a feature that requires a passcode they set before a teen can change their account settings. Now caregivers will have the ability to opt in and out of this passcode feature and can update the protected settings on their teen’s account.

Image Credits: Pinterest

Pinterest says all the new features and changes will begin rolling out this week, which means you may not immediately see them today.

The company has needed to address teen safety for some time, given its heavy use among younger users. In its most recent earnings call with investors, the company touted its young demographic.

“We’ve noted this multiple times that Gen Z is our fastest-growing demographic and our largest contributor to overall engagement, which means it’s not just a high growth rate on a small denominator. It’s our largest contributor to overall engagement growth,” said Pinterest CEO Bill Ready. “To be able to say that we have Gen Z, a more recent cohort that’s actually engaging even more deeply than our prior cohorts, I think is quite exceptional.”

Ready noted how young people’s use of the site helped Pinterest pursue its goals of being the first stop to plan for things like travel and shopping. The company has also targeted Gen Z with its shoppable scrapbooking app Shuffles, which went viral last year. 

After an investigation exposes its dangers, Pinterest announces new safety tools and parental controls

Following last month’s NBC News investigation into Pinterest that exposed how pedophiles had been using the service to curate image boards of young girls, the company on Tuesday announced further safety measures for its platform, including a new set of parental controls and updated age verification policies, among other things. However, the company also said that it would soon re-open some of its previously locked-down features for teens to allow them to once again message and share content with others.

The investigation had stained Pinterest’s carefully crafted reputation as one of the last safe and positive places on the internet. Instead, the report found that, much like any other social media platform, Pinterest was at risk of being abused by bad actors. The report found that grown men were using the site to create image boards of young girls, sexualizing innocuous photos of children doing things like gymnastics, dancing, sticking out their tongues, wearing bathing suits, and more. In addition, Pinterest’s recommendation engine was making it easier to find similar content as it would suggest more images like those the predators initially sought out.

Immediately after the report, two U.S. Senators reached out to Pinterest for answers about what was being done and to push for more safety measures. The company said it increased the number of human content moderators on its team and added new features that allow users to report content and accounts for “Nudity, pornography or sexualized content,” including those involving “intentional misuse involving minors.” Before, it had only allowed users to report spam or inappropriate cover images.

Pinterest also swept its platform to deactivate thousands of accounts — a move that may have accidentally disabled legitimate accounts in the process, according to some online complaints.

Now, the company is announcing even more safety controls are in the works.

For starters, it says it will expand its age verification process. By the end of this month, if someone who entered their age as under 18 tries to edit their date of birth in the Pinterest app, the company will require them to send additional information to its third-party age verification partner. This process includes sending a government ID or birth certificate and may also require users to take a selfie for an ID photo.

Image Credits: Pinterest

In addition, Pinterest announced it will soon offer more parental controls. Parents and guardians of children under the age of 18 will have the ability to require a passcode before their teen is allowed to change certain account settings. This would prevent a younger child from trying to change their account to an adult’s age — which matters because minors’ accounts have further protections.

On Pinterest, teens under the age of 16 have accounts that are set private by default and aren’t discoverable by others on the service. In part, the reason why so many pedophiles had been able to discover the young girls’ photos is that they had been using accounts where their ages weren’t set accurately.

In addition, boards and Pins saved by teens under 16 aren’t visible or accessible to anyone but the user, either. Messaging features and group boards are also not available to this demographic. But Pinterest says that will soon change, noting in its Help Documentation that it’s currently “working on updating these features so you can safely connect with those you know.” In a blog post, the company explains that it will soon re-introduce the ability for teens to “share inspiration” with the people they know, “as long as they give them permission.”

Image Credits: Pinterest

Pinterest further touted other teen safety measures in its post, including its ban on beauty filters, which have negative impacts on mental health, as well as its bans on weight loss ads and policies against body shaming.

Unfortunately, the problems Pinterest has been facing, which NBC News exposed, are not limited to its service. A lack of regulation over social media, including outdated children’s online safety laws, has led to a sort of free-for-all in the U.S. where companies are creating their own policies and making up their own rules. Financially, there’s little incentive to do anything that would make it harder for people to sign up, use and stay engaged with their platforms, due to their ad-supported business models.

But despite hauling in tech execs for numerous briefings over the years, Congress has failed to pass legislation to regulate social media. This has led to some states now even crafting their own laws, as Utah just did, which is complicating matters further.

In the meantime, parents and families are dealing with the consequences. Like anywhere else on the internet, when children’s images are posted publicly online, they can attract a dangerous audience and platforms historically have not done enough around minor safety.

Recently, there’s been heightened awareness of this problem thanks to work from creators like TikTok’s Mom.Uncharted, aka Sarah Adams, whose viral videos have helped to explain how online predators are leveraging creator content, including the work of family vloggers, for dangerous and repulsive purposes. She and others who cover the topic have been calling for parents to pull back on posting public images of their children online and stop supporting creators who share photos and videos of young children.

Young people who were forced by their parents to create online content before the age of consent are also pushing for regulations that would protect future generations. Congress has yet to act here, either.

Image Credits: Pinterest

But on Pinterest, it’s not necessarily parents who are publishing the kids’ photos — Pinterest is often used by children, which parents have tended not to mind, believing in the site’s “safe” reputation.

Today, some 450 million people use Pinterest every month, and 60% are women, according to Pinterest’s official statistics. Millennials are the largest age group, while the second largest is women ages 18-24. In theory, Pinterest’s policies should restrict its use to users 13 and up. However, the real demographics around Pinterest usage may not be captured in online statistics, as kids often lie about their age to use apps and websites intended for an older audience. In an annual study of kids’ and teens’ app usage by parental control software maker Qustodio, for instance, Pinterest was the No. 3 social media platform in the U.S. in terms of time spent. Globally, it was No. 5.

“Our mission is to bring everyone the inspiration to create a life they love, and it is our guiding light in how we have created Pinterest, developed our products, and shaped our policies,” Pinterest stressed in the announcement. “As part of this ongoing work, we’ll continue to focus on ways that we can keep teens safe,” it promised.

After an investigation exposes its dangers, Pinterest announces new safety tools and parental controls by Sarah Perez originally published on TechCrunch

TikTok questioned on ineffective teen time limits in Congressional hearing

In hopes of heading off concerns over the addictiveness of its app, TikTok earlier this month rolled out new screen time controls that set a default 60-minute daily limit for minors under the age of 18. But in a Congressional hearing today before the House Committee on Energy and Commerce, TikTok CEO Shou Zi Chew was questioned about the new tool’s ineffectiveness, forcing the exec to admit that the company didn’t have data on how many teens were continuing to watch beyond the default limits.

The line of questioning is notable because TikTok’s algorithm and vertical video-based feed are among the most addictive products to emerge from the broader tech industry in recent years. Each swipe on the app’s screen delivers a new and interesting video personalized to the user’s interests, leading users to waste an inordinate amount of time on TikTok compared with older social media services.

In fact, a recent study found that TikTok was now even crushing YouTube in terms of kids’ and teens’ app usage in markets around the world thanks, in part, to its addictive feed.

The format has become so popular that it has since been adopted by nearly all other major U.S. tech companies, including Facebook, Instagram, YouTube, and Snap. So an examination of any sort of addiction mitigation techniques is certainly warranted.

That said, the time limit TikTok designed for teens is really more for show — it doesn’t actually prevent younger users from watching TikTok.

A hard limit on TikTok viewing is still up to the teen’s parents, who would have to use the app’s included parental controls to set screen time and session limits. Otherwise, they could turn to other parental controls bundled with the mobile OS from Apple or Google or those from third parties.

In the hearing, Chew touted how TikTok was the first to launch a 60-minute watch limit for teen users, and had other teen protections, like disabled direct messaging for users under 16. He also noted that content couldn’t go viral on the app’s For You page if its creator was under 18.

However, when pushed on the teen time limit’s real-world impact, the exec didn’t have any substantial data to share.

“My understanding is that teens can pretty easily bypass the notification to continue using the app if they want to,” suggested Representative John Sarbanes (D-Md.). “I mean, let’s face it, our teens are smarter than we are by half and they know how to use technology and they can get around these limits if they want to,” he said.

Sarbanes is correct. There’s really nothing to bypassing the feature — when your time limit is up, it only takes a tap of a button before you’re returned to the feed. A more effective mitigation technique would actually force a teen user to take a break from the app entirely. This could better disrupt the dopamine-fueled addiction cycle by requiring a short time-out where they’d be forced to find something else to do rather than continue scrolling videos.
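The distinction Sarbanes is pointing at — a dismissible prompt versus an enforced time-out — comes down to whether the limit actually locks the feed once it’s hit. A minimal sketch of the two behaviors, with hypothetical names and logic rather than TikTok’s real code:

```python
# Hypothetical contrast between a dismissible daily limit (tapping past the
# prompt resumes the feed immediately) and an enforced cooldown that locks
# the feed for a short period. Illustrative only — not TikTok's implementation.
class ScreenTimeLimit:
    def __init__(self, daily_limit_s: int, cooldown_s: int = 0):
        self.daily_limit_s = daily_limit_s
        self.cooldown_s = cooldown_s   # 0 means the prompt is merely dismissible
        self.watched_s = 0
        self.locked_until = 0.0        # timestamp until which the feed is locked

    def watch(self, seconds: int, now: float) -> bool:
        """Return True if viewing is allowed at time `now`."""
        if now < self.locked_until:
            return False               # enforced break still in effect
        self.watched_s += seconds
        if self.watched_s >= self.daily_limit_s and self.cooldown_s:
            self.locked_until = now + self.cooldown_s
        return True
```

With `cooldown_s=0`, hitting the limit changes nothing — viewing continues the moment the prompt is dismissed, which matches the behavior Sarbanes described. A nonzero cooldown is what would actually interrupt the scrolling cycle.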

When asked if TikTok was measuring how many teens were still exceeding the 60-minute time limit after the new feature was added, Chew didn’t know and didn’t share any sort of guess, either. Instead, he avoided a direct answer.

“We understand those concerns,” the TikTok CEO responded. “Our intention is to have the teens and their parents have these conversations about what is the appropriate amount of time for social media,” he added, noting that the app offered a Family Pairing feature that does enforce a real screen time limit.

In other words, TikTok doesn’t think real teen protections are its call to make. To be fair, neither do any U.S.-based social media companies. They want parents to shoulder the responsibility.

This answer, however, showcases how a lack of U.S. regulation over these platforms is allowing the cycle of app addiction to continue. If lawmakers won’t create rules to protect kids from algorithms that tap into human psychology to keep them scrolling, then it really will be up to parents to step in. And many do not know or understand how parental controls work.

Sarbanes asked TikTok to follow up by providing the Congressional committee with research on how the time limits were implemented, how they’re being bypassed, and the measures TikTok is taking to address these sorts of issues.

In a further line of questioning, this time from Rep. Buddy Carter (R-Ga.), the app’s addictive nature and the dangerous stunts and challenges it showcased were suggested to be “psychological warfare…to deliberately influence U.S. children.” While that may be a bit of a leap, it’s worth noting that when Carter asked whether the Chinese version of TikTok, Douyin, had the same “challenges” as TikTok, Chew also admitted he didn’t know.

“This is an industry challenge for all of us,” he said.

The TikTok CEO later reiterated how kids’ use of its app is ultimately up to parents. When responding to questions about the appropriate age for TikTok use, he noted there were three different experiences aimed at different age groups: one for users under 13, another for younger teens, and another for adults. As an interesting side note, in Singapore, where Chew is based, there’s no under-13 experience available, meaning his own kids are not on TikTok.

“Our approach is to give differentiated experiences for different age groups — and that the parents have these conversations with their children to decide what’s best for their family,” Chew said.

TikTok questioned on ineffective teen time limits in Congressional hearing by Sarah Perez originally published on TechCrunch

TikTok is crushing YouTube in annual study of kids’ and teens’ app usage

For another year in a row, TikTok has found itself as the social app kids and teens are spending the most time using throughout the day, even outpacing YouTube. According to an ongoing annual review of kids’ and teens’ app usage and behavior globally, the younger demographic — minors ranging in age from 4 through 18 — began to watch more TikTok than YouTube on an average daily basis starting in June 2020, and TikTok’s numbers have continued to grow ever since.

In June 2020, TikTok overtook YouTube for the first time, with kids watching an average of 82 minutes per day on TikTok versus an average of 75 minutes per day on YouTube, according to new data from parental control software maker Qustodio.

This past year, the gulf between the two widened, it said, as kids in 2022 saw their average daily use of TikTok climb to a whopping 107 minutes, or 60% longer than the time they spent watching video content on YouTube (67 minutes).

TikTok not only topped the average daily usage of other video apps, like Netflix (48 mins.) and Disney+ (40 mins.), it also came out ahead of other social apps, including Snapchat (72 mins.), Instagram (45 mins.), Facebook (20 mins.), Pinterest (16 mins.) and Twitter (10 mins.) among the under-18 crowd.

Image Credits: Qustodio

Meanwhile, as the U.S. grapples with TikTok bans across college campuses and in the government, the app’s addictive video content was viewed, on average, 113 minutes per day in this market, compared with 77 minutes per day on YouTube, 52 minutes for Netflix, 90 minutes on Snapchat, and 20 minutes on Pinterest.

There is still some good news for YouTube, though. The study found that the average daily time spent on YouTube was up by 20% year-over-year, to reach 67 minutes — the highest number since Qustodio began reporting on annual trends in 2019. YouTube also gained sizable global market share and mindshare last year, as 63% of kids worldwide were using the service in 2022. The report additionally broke down a few top markets in more detail, noting that 60% of U.S. kids watch YouTube, compared with 67% in the U.K., 73% in Spain, and 58% in Australia. The second most popular video service was Netflix, with 39% popularity among kids worldwide.

Overall, kids under 18 managed to increase their video content viewing by 18% in 2022, watching 45 minutes daily, on average, across long-form video services like YouTube, Netflix, Disney+, Prime Video, and others.

Other winners for the year included Netflix and Amazon Prime Video, which saw 7% and 10% gains in popularity, respectively — where popularity measures whether the app was used at some point by under-18-year-olds. But in terms of average daily minutes spent, Prime Video dropped 15% year-over-year to 34 minutes. Disney+ declined by the same percentage, dropping from a daily average of 47 minutes to 40 minutes in 2022. Twitch also suffered last year, with only 11% of under-18-year-olds tuning in compared with 16% in 2021.

Image Credits: Qustodio

TikTok’s growth among the younger demographic has forced big tech giants to combat the threat with short-form video of their own. YouTube Shorts is YouTube’s solution to the problem; Google this month reported Shorts crossed 50 billion daily views. Instagram, of course, has been cramming Reels into its experience — and receiving some backlash over the changes. Instagram head Adam Mosseri even admitted earlier this year that the platform has been pushing “too many videos” on users.

It’s not clear this shoehorning of Reels into Instagram has paid off with the younger crowd. In Qustodio’s analysis, the app fell out of the top 5 most popular social media apps in the U.S., U.K., and Australia with users under 18. It still ranked No. 5 globally, however, behind TikTok, Facebook (38% of kids used it globally!), Snapchat, and Pinterest.

Though the software firm chose to analyze Roblox among other video games, it’s also worth noting the game is a social network of sorts — and an extremely popular destination among kids worldwide. The gaming platform was popular with 59% of kids globally, and average daily time spent grew 4% year-over-year to 180 minutes. That’s more than any other game, including the No. 2 game Minecraft (up 37% to 48 mins.), Clash Royale, Brawl Stars, Clash of Clans, and What Would You Choose?

Qustodio’s full report digs into other app trends as well, including Twitter’s 7% growth in popularity worldwide, which also led to it appearing in the list of most-blocked apps by parents in 2022 for the first time. It also delved into educational app usage, where Google Classroom ruled on school devices and Duolingo remained a top app on personal devices. And it looked at communication, where WhatsApp and Discord edged out Messages as the most popular way to chat with friends, though Zoom saw more minutes spent daily.

While the report’s data is limited to the app usage Qustodio tracks on its own platform, it’s a sizable group that includes over 400,000 global families with children in the Gen Z and/or Gen Alpha demographic. It additionally surveyed 1,617 parents directly to ask them how they manage their children’s access to technology.

TikTok is crushing YouTube in annual study of kids’ and teens’ app usage by Sarah Perez originally published on TechCrunch

Instagram’s new ‘Quiet Mode’ helps you take a break from the app

Instagram announced today it’s expanding its selection of time management tools with the launch of a new feature called “Quiet Mode.” The feature aims to reduce users’ anxiety about taking time off from the app by silencing incoming notifications, auto-replying to DMs, and setting your status to ‘In Quiet Mode’ to inform friends that you’re not active on the app at present. The company said it will prompt teen users to enable the feature if they’re using the app late at night.

The update is one of several changes rolling out today, which also include expanded parental control tools and other tools to manage recommendations.

The launches come as Instagram works to make its app less of a target for regulators and lawmakers who have been concerned with social media’s potential harms, particularly for teenage users. To date, Instagram has added several teen safety features, including those to protect teens’ privacy and reduce unwanted adult contact, limit ad targeting, restrict teens’ access to mature content, and others to help parents monitor and manage their teens’ Instagram use through parental controls.

Quiet Mode joins a handful of other screen time management tools Instagram now offers, including daily time-spent controls that allow people to track their app use and send themselves alerts, “take a break” reminders that trigger after individual app sessions extend beyond a certain amount of time, and various tools to pause, snooze, and unfollow pages, groups, and people to help further reduce engagement with addictive or otherwise unwanted content.

With the new Quiet Mode feature, however, the idea isn’t just to introduce a tool that pushes users to take a break. Instead, it focuses on the real-world impacts that accompany trying to step away for a bit from an app that you regularly use — and one where others expect you to be available.

For young people in particular, Instagram has grown to become a popular messaging tool — so much so that the company in years past launched two different variations on standalone communication apps, Direct and Threads. (The latter was shut down in 2021.) While breaking out messaging as its own separate experience didn’t work out for Instagram, messaging remains a key draw for the main app. For Instagram’s heaviest users, that means not answering DMs is up there with ignoring text messages — it’s something that’s considered rude, even though some teens admit that the expectation to be constantly available can be stressful.

With Quiet Mode, Instagram users can choose to take a break — to study, to sleep, or to otherwise disengage. It’s sort of like the Instagram equivalent of turning your instant messaging light off, for those who remember the AIM and ICQ era. When you exit Quiet Mode, the app will offer a summary of what you missed during your downtime to help you get caught up.

Instagram says teens will be prompted to enable the feature when they spend a “short” amount of time on Instagram at night, but Quiet Mode will be offered to all users. (The company didn’t provide details on what it considers short.)

Quiet Mode is launching first to users in the U.S., United Kingdom, Ireland, Canada, Australia, and New Zealand, and Instagram says it hopes to roll it out to more countries soon.

While Quiet Mode was the highlight of today’s news, Instagram is rolling out a handful of other updates as well.

One is the newly added ability for parents to see the accounts their teen has blocked, when using Instagram’s built-in parental control tools. This could make it easier for parents to have discussions with their teens so they can talk about the change, if need be.

Instagram is also letting users better control what shows up for them on the app’s Explore page. This is something that’s algorithmically driven by user behavior, though doesn’t always reflect what a user wishes they were seeing.

In fact, there was a viral thread on Twitter about this subject just the other day, related to a complaint about too much adult-oriented content appearing on some people’s Instagram Explore tabs (referred to in the thread as the “Discover” tab). As many people pointed out in the replies, this tab is driven by an algorithm that shows you what you might like, based on your app usage. That means you’ll only see “adult” content if that’s what you view and interact with — meaning those complaining were just “telling on themselves,” the original poster wrote. Other people’s Explore tabs may be filled with kittens, art, tattoos, cooking, fashion, and other innocuous content, users agreed. (The thread also offered a fascinating glimpse into what people’s Explore pages look like, if you’ve ever been curious about the variety.)

Image Credits: Instagram

Going forward, Instagram says users will be able to hide multiple pieces of unwanted content from Explore at once, which could help those whose pages resemble their behavior but not their actual interests. Plus, if you select “Not Interested” on a post in Explore, Instagram will try to avoid showing you this same type of content going forward elsewhere in the app’s recommendations — like Reels, Search and more.

A final tweak to users’ recommendations will be driven by blocked words.

If you’ve already configured Instagram to hide comments and DMs with certain words, emojis or hashtags, that block will also apply to recommended posts across the app. That means if you block a word like “fitness” or “recipes,” the company explains, it won’t show you content where those words appear in the caption or hashtag.
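Mechanically, extending a comment/DM blocklist to recommendations amounts to filtering out any post whose caption or hashtags contain a blocked term. A rough sketch under that assumption — the function names here are illustrative, not Instagram’s actual API:

```python
# Rough sketch: a user's comment/DM word blocklist reused to filter feed
# recommendations, hiding any post whose caption or hashtags contain a
# blocked term. Illustrative only — not Instagram's real implementation.
def build_filter(blocked_terms: set[str]):
    """Return a predicate that is True when a post may be recommended."""
    blocked = {t.lower() for t in blocked_terms}

    def allows(caption: str, hashtags: list[str]) -> bool:
        words = set(caption.lower().split())          # caption tokens
        tags = {t.lower().lstrip("#") for t in hashtags}
        # Recommend only if no blocked term appears in caption or tags.
        return blocked.isdisjoint(words) and blocked.isdisjoint(tags)

    return allows
```

So blocking “fitness” would suppress a post captioned “my daily fitness routine” from Reels or Search recommendations, while a post about kittens would pass through untouched.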

Image Credits: Instagram

This seems an attempt to address the problem where searches for things like “workout tips” or “healthy recipes” easily lead users to content associated with extreme dieting and eating disorders. This topic had been the focus of a 2021 Congressional hearing about the harms to teen mental health that come from using apps like Facebook and Instagram. As U.S. lawmakers have yet to take serious action here, Meta is again attempting to regulate itself on the matter by putting the controls in the hands of the end user, rather than holding itself accountable for algorithmic failures.

While Quiet Mode is only launching in select markets, the other updates are rolling out globally to iOS and Android, Instagram said.

Instagram’s new ‘Quiet Mode’ helps you take a break from the app by Sarah Perez originally published on TechCrunch