German teens went crazy for this ‘compliments’ app, and now VCs are backing its next phase

The teenage market for apps is a tough nut to crack and stay relevant in. Just ask Snapchat. Equally, teens are going through a stage in life where almost every social interaction seems to carry portent of some kind or other. This would explain in part why apps like Sendit, NGL and Nocapp (some of them Snapchat-connected tools) took off as ways for teens to anonymously comment on each other. And AskFM would probably like us all to forget the various suicides that occurred when it was released in its initial form, back in the day. (And you thought Instagram was bad for mental health…)

Meanwhile, somehow (somehow!?) a new startup has appeared with the idea that yet another app is going to help fix this dumpster fire of social interactions, but let’s hear the company out before jumping to conclusions.

“Slay” bills itself as a “positive social media network for teenagers”. The reason we are talking about it today is that it has grown like a weed since launching last year in Germany, where it reached No. 1 on the German iOS App Store four days after launch. It now claims over 250,000 registered users and says it’s gaining traction in other countries, including the UK, where it recently launched.

So what’s the attraction here? When users open the app, they are shown 12 questions, each of which can only be answered by choosing another user (from their school, class or peer group) to pay an anonymous compliment, or “slay”, to. For example, the app may ask, “Who inspires me to do my best?” The user can then choose from four other users from their school to pay this ‘slay’ to. They can then view compliments from other kids, provided they answer the 12 questions when logging on. The identity of whoever sent a compliment remains hidden.

This reminds me of BeReal’s mechanic, where you can only see other people’s BeReal photos by uploading your own.

Slay is also not dissimilar to Gas, the messaging platform popular among teens for its positive spin on social media, which was acquired by Discord yesterday. On Gas, anonymous polling is intended to boost users’ confidence.

The other reason Slay has popped onto the TC radar is that its growth has attracted the interest of VCs.

It has now raised a $2.63 million (€2.5 million) pre-seed funding round led by Accel, with participation from 20VC. Additional investors include Supercell co-founder and CEO Ilkka Paananen, Behance founder Scott Belsky, football star Mario Götze, Kevin Weil (Scribble Ventures) and musician Alex Pall (The Chainsmokers).

Slay says it is aiming to reset the teen relationship with social apps by rebalancing things away from the negative sentiment found on social platforms and normalising the giving of compliments. It also says it’s been designed with safety, content moderation and teenage mental wellbeing built in. We shall see…

Digging into the app, one can see that it’s been built very simply as a ‘compliment app’. Whether that is going to be enough to keep users coming back is hard to say. Teenage behaviour is hard to second-guess. Receiving zero compliments can also send a ‘signal’, for instance.

Suffice it to say, Slay claims it will “never sell or share personal data with third parties.” Given the history of social apps, let’s see how long this lasts.

There is also no direct messaging facility, although users will be able to add links to social media profiles, so clearly they will be able to message each other eventually, off-app.

Adults are supposedly not allowed to ‘join’ schools, and approximate location is requested in order to suggest nearby schools. All questions and interactions are posed by the app, not by users themselves.

SLAY was founded in 2022 by a team of three 23-year-old, Berlin-based co-founders: Fabian Kamberi, Jannis Ringwald and Stefan Quernhorst. The idea was Kamberi’s; he had been building consumer apps since he was a teenager and says he was inspired by the experiences of his siblings, who struggled with the negativity of social media apps during the COVID-19 pandemic.
 
Kamberi, Slay’s CEO and co-founder, told me via email: “We see Slay in the future not only as an anonymous polling app [referring to the aforementioned Gas], but as the go-to spot for teens to rediscover social interactions in various play modes.”

“Our app is similar to Gas, and their acquisition shows a great proof of what we have built and what is in store for the future in our space. However, apps that rely solely on anonymous Q&A, for example, carry a high cyberbullying risk, which – by contrast – we prevent through our rigorous content moderation as well as specially designed gamemodes,” he added.

But the question is, why does he think a social app can improve mental health when so many social apps have not?

“We have received thousands of feedback messages from users thanking us for making them feel valued in times of fast moving, negative social media interactions,” he told me.

He said the startup could well ship new features that might create more engagement but at the same time might bring a risk of negativity: “So we focus very much on the individual experience that each user has, aiming to make it as positive as possible.” He said the startup’s job is “content safety.”

So what’s Slay’s business model? How will it make money? Kamberi says it will likely be premium features, services or tools which users pay for: “We are currently building several exclusive, paid play modes as well as add ons, which we will release through feedback cycles with users and supported by data.”

SLAY is available in Germany, Austria, Switzerland and the United Kingdom.

Julien Bek, Principal at Accel, added via a statement: “We’re extremely impressed by the SLAY app, both in its immediate popularity among teenagers and the team’s positive goal of improving teenage mental health in the digital world. Already, the SLAY team has seen almost half its active users use it every school day.”


Top anonymous social app NGL forced to stop tricking its users

A popular anonymous social app that was misleading its users with fake messages has been forced to change. The top-ranked app NGL, which became the No. 1 app on the U.S. App Store in June, quietly rolled out an update yesterday that sees it now informing users when they receive messages that aren’t actually from their friends, as they had previously been led to believe. Before, NGL sent these fake messages as a means of creating engagement, then charged for “hints” about the message’s sender.

The app has also lowered the price of its subscription, which promises to reveal details about who is behind the anonymous messages.

NGL is one of a handful of anonymous social apps that had recently shifted their attention to Instagram after Snapchat cracked down on apps of this nature using its developer tools, as part of Snap’s broader efforts to reduce harm to minors.

To use NGL, users would tap a button in the app to copy a unique URL they could share with friends and followers across the web.

Image Credits: NGL App Store listing

While Snap could prevent direct integrations with its own developer tools, NGL users could still copy and paste the special link into their Snapchat Stories or wherever they chose — like Twitter or any other app. However, a “Share” button in the app made it easy to post directly to Instagram Stories. Then, when others saw the link on their friend’s Story or post, they could click it to anonymously ask that person a question. These questions would appear as messages in NGL’s in-app “Inbox” for users to read and respond to.

However, NGL had a trick up its sleeve. If users didn’t get any engagement on their shared link, the app itself would generate messages automatically. Users had no real way of knowing that these messages were actually fake questions the app was sending them. But many suspected that was the case, as the questions sounded like things their friends wouldn’t ask. (We confirmed the messages were fake by generating an NGL link but not sharing it. We then received messages.)

NGL’s app reviews have been filled with complaints that its questions seemed to be coming from bots. What’s worse, the app developer was charging users for “hints” to learn more about who was asking the question. This means users were paying, in some cases, for hints about bots!  This could be considered fraud. (We’d advise impacted users to request refunds from Apple.)

The NGL app got its ideas from rival Sendit, a similar social app that also offers a variety of Snapchat games. In fact, Sendit’s maker is now suing NGL for stealing its ideas — the NGL developer previously worked on Sendit before realizing the potential in simply cloning the idea and raking in the money himself. As it turns out, there is some business to be had here. By July, NGL had topped 15 million downloads and had pulled in  $2.4 million in revenue by selling its subscriptions.

TechCrunch had called out NGL for its misleading tactics and, apparently, someone was listening. (Actually, we do understand there was a discussion between the developer and Apple about this). NGL has not commented.

Yesterday, NGL issued an update that now sees it labeling its fake messages with a tag that reads “sent with ❤ from the NGL team.” This is meant to indicate the message is not from a friend, but from the app itself. (Arguably, the wording could be clearer. Some users — particularly among its target market of young adults —  could interpret this tag to mean the message is simply being delivered by the app.)

These messages also don’t show a subscription prompt. In addition, the subscription cost was lowered a bit, from $9.99/week to $6.99/week, and now includes other features beyond “hints.” For instance, it touts users will get “early access” to exclusive games besides the anonymous Q&A. One of the paid games is already included — an anonymous confessions game.

The app’s rival Sendit’s Q&A feature had worked in much the same way and it, too, just updated its subscription. Now, instead of just charging for hints, Sendit “Diamond members” can reveal the name and Bitmoji of the sender (in some cases), access exclusive games, unlock a custom icon, and remove ads from the experience, the app claims. However, its pricing remains $9.99 per week.

Though the viral buzz around these apps has since died down a bit, they still remain highly ranked. NGL is the No. 9 app on the U.S. App Store’s Lifestyle charts and Sendit is No. 12 among Social Networking apps.

This Week in Apps: Elon wants out, TikTok’s content ratings, a new milestone for subscription revenue

Welcome back to This Week in Apps, the weekly TechCrunch series that recaps the latest in mobile OS news, mobile applications and the overall app economy.

Global app spending reached $65 billion in the first half of 2022, up only slightly from the $64.4 billion during the same period in 2021, as hypergrowth fueled by the pandemic has slowed down. But overall, the app economy is continuing to grow, having produced a record number of downloads and consumer spending across both the iOS and Google Play stores combined in 2021, according to the latest year-end reports. Global spending across iOS and Google Play last year was $133 billion, and consumers downloaded 143.6 billion apps.

This Week in Apps offers a way to keep up with this fast-moving industry in one place with the latest from the world of apps, including news, updates, startup fundings, mergers and acquisitions, and much more.

Do you want This Week in Apps in your inbox every Saturday? Sign up here: techcrunch.com/newsletters

Top Stories

TikTok is getting a rating system

Some TikToks are too racy or mature for younger teens — a problem TikTok aims to address with the upcoming launch of a new content ratings system. The “Content Levels” system, as it will be called, is meant to provide a means of classifying content on the video app — similar to how movies, TV shows and video games today feature age ratings.

TikTok acknowledged some content on its app may contain “mature or complex themes that may reflect personal experiences or real-world events that are intended for older audiences.” It will work to assign these sorts of videos a “maturity score” that will block them from being viewed by younger users. Not all videos will be rated, however. The goal will be to rate videos that get flagged for review and those that are gaining virality. Initially, the system will focus on preventing inappropriate content from reaching users ages 13 to 17, TikTok says, but will become a broader system over time.

The launch follows a 2021 congressional inquiry into social apps, including TikTok and others, which focused on how their algorithmic recommendation systems could be promoting harmful content, like eating disorder content, to younger users. TikTok has also been making headlines for its promotion of dangerous and destructive viral stunts, like kids destroying public school bathrooms, shooting each other with pellet guns or jumping off milk crates, among other things.

TikTok, like other social apps, is in hot water over the potential negative impacts to minors using its service. But it’s under particular scrutiny since the reveal that parent company ByteDance — in China — was accessing U.S. TikTok user data. Alongside the maturity ratings, TikTok says it will also launch content filters that will let users block videos with hashtags or certain words from their feeds.

For all its ills, TikTok has more developed parental controls than its U.S. rivals and the launch of a content ratings system could push other apps reaching minors, like Instagram and Snapchat, to do the same.

Will he or won’t he? The Twitter deal heads to court

Elon wants out. The Tesla and SpaceX exec has got a serious case of buyer’s remorse. Musk offered to buy Twitter at $54.20 per share — it’s a weed joke! Get it? 420! — but the stock today is only trading at $36.29 per share. So it’s not so funny anymore. Now the exec is attempting to use some flimsy excuses about “bots” on the network in order to get out of the legal agreement. But Twitter just said, see you in court! (Well, in legalese, it said Musk’s termination was “invalid and wrongful.”) Twitter then delivered a few more jabs in a letter filed with the SEC, noting Musk “apparently believes that he — unlike every other party subject to Delaware contract law — is free to change his mind, trash the company, disrupt its operations, destroy stockholder value, and walk away.” Burn!

Sadly, caught in the chaos are Twitter’s advertisers, some of whom are exiting, and of course, the Twitter employees who often don’t know what’s going on, who will prevail or what Musk may do if the deal is forced through. (Vent here if you want!) And what does this mean for Twitter’s conference Chirp later this year, if the deal is still in limbo?

This has been such a weird and fraught acquisition since day one, with some poor folks at the SEC having to collate tweets of poop emoji and memes as investor alerts. It’s also one that makes a pretty good case as to why we should tax billionaires more — too much money turns large companies and people’s livelihoods into toys for their amusement, apparently.

Non-game revenue tops games for the first time on the U.S. App Store

App Store icon on an iPhone screen. Image Credits: TechCrunch

A major shift in the U.S. app economy has just taken place. In the second quarter of this year, U.S. consumer spending in non-game mobile apps surpassed spending in mobile games for the first time in May 2022, and the trend continued in June. This drove the total revenue generated by non-game apps higher for the quarter, reaching about $3.4 billion on the U.S. App Store, compared with $3.3 billion spent on mobile games.

After the shift in May, 50.3% of the spending was coming from non-game apps by June 2022, according to new findings in a report from app intelligence firm Sensor Tower. By comparison, games had accounted for more than two-thirds of total spending on the U.S. App Store just five years ago.

The trend was limited to the U.S. App Store and was not seen on Google Play, however. In Q2, games accounted for $2.3 billion in consumer spending on Google Play in the U.S., while non-game apps accounted for about $1 billion. Read more about the new data here.

Kids and teens now spend more time on TikTok than YouTube

The TikTok logo seen on an iPhone 11 Pro Max. Image Credits: NurPhoto / Getty Images

A study of 400,000 families performed by parental control software maker Qustodio found that kids and teens ages 4-18 now spend more time watching videos on TikTok than they do watching YouTube — and that’s been the case since June 2020, in fact. That month, TikTok overtook YouTube for the first time, as this younger demographic began averaging 82 minutes per day on TikTok versus an average of 75 minutes per day on YouTube.

YouTube had still been ahead in 2019 as kids and teens were spending an average of 48 minutes on the platform on a global basis, compared with 38 minutes on TikTok. But with the shift in usage that took place in June 2020, TikTok came out on top for 2020 as a whole, with an average of 75 minutes per day, compared with 64 minutes for YouTube.

In the years since, TikTok has continued to dominate with younger users. By the end of 2021, kids and teens were watching an average of 91 minutes of TikTok per day compared with just 56 minutes per day spent watching YouTube, on a global basis.

Likely aware of this threat, YouTube launched its own short-form platform called Shorts, which it now claims has topped 1.5 billion logged-in monthly users. The company believes this will push users toward its long-form content — but so far, that hasn’t happened, it seems. Read the full report here.

TikTok is eating into Google Search and Maps, says Google

In a bit of an incredible reveal (if one that helps Google from an anticompetitive standpoint), a Google exec admitted that younger people’s use of TikTok and Instagram is actually impacting the company’s core products, like Search and Maps.

TechCrunch broke this news following comments made at Fortune’s Brainstorm Tech event this week.

“In our studies, something like almost 40% of young people, when they’re looking for a place for lunch, they don’t go to Google Maps or Search,” said Google SVP Prabhakar Raghavan, who runs Google’s Knowledge & Information organization. “They go to TikTok or Instagram.”

Google confirmed to us his comments were based on internal research that involved a survey of U.S. users, ages 18 to 24. The data has not yet been made public, we’re told, but may later be added to Google’s competition site, alongside other stats — like how 55% of product searches now begin on Amazon, for example.

Weekly News

Platforms: Apple

  • The iOS 16 public beta has arrived. It’s here, it’s surprisingly functional, and it brings a number of great new features to iPhone users, including a customizable Lock Screen with support for new Lock Screen widgets, more granular Focus Mode features, an improved messaging experience with an Undo Send option, SMS filters, iCloud Shared Photo Library for families, CAPTCHA bypassing and a clever new image cutout feature that lets you “pick up” objects from photos and copy them into other apps. On iPadOS 16, there are a number of specialized features, including the new Stage Manager multitasking interface.

Apple’s new visual lookup feature. Image Credits: Apple

Platforms: Google

  • Samsung rolled out its One UI 4.5 update for Galaxy Watches, which is powered by Wear OS 3.5. The update includes a full QWERTY keyboard, customizable watch faces and dual-SIM support, and will run on the Galaxy Watch4, the Galaxy Watch4 Classic and other models.
  • Google expanded its Play Games for PC beta, which brings Android apps to Windows, to more regions, including Thailand and Australia.
  • Google released the fourth and final Android 13 beta ahead of its official launch, which the company says is “just a few weeks away.” There were not many changes with this update, as Google already reached platform stability with Android 13 beta 3 last month.

E-commerce

  • TikTok launched a new educational program targeting small businesses that want to learn how to use its platform to drive sales. The launch follows TikTok’s decision to pause the expansion of its Shop initiative. The program walks businesses through setting up an account, creating content and using TikTok ads products, and features coaching and tips from other SMBs.
  • NYC fast delivery apps could face a shutdown if new bills proposed by New York’s City Council get approved. The city is concerned about worker safety at the companies’ dark stores.

Augmented Reality

  • Shopify showed off a wild internal experiment using Apple’s new RoomPlan API that allows users to more easily reset their room in order to see how new furniture could work. The test lets you remove the furniture already in your room to create a lifelike digital twin that can be overlaid in your real space using AR. Users can then swipe through new room sets to see how they’d look in their own space. Shopify said it has nothing in production related to this right now — but wow, someone should!

Fintech/Crypto

  • FlickPlay, an AR social app that lets users unlock NFTs and display them in a wallet, was among those selected to participate in Disney’s 2022 startup accelerator, among others focused on AR, web3 and AI experiences.

Social

Image Credits: TechCrunch

  • Two anonymous social Q&A apps are heading to court. Sendit’s maker, Iconic Hearts, is suing rival NGL for stealing its proprietary business data in order to build what’s since become a top-ranked Q&A app on the App Store. Of note, the court filing reveals that the apps are using fake questions to engage their users — something many had already suspected.
  • Reddit and GIPHY partner. Reddit is now allowing its safe-for-work and non-quarantined subreddits to enable GIPHY for use in the comments. Those moderators who don’t want the GIF comments will need to opt out. Previously GIFs in comments were available as a paid subscription perk (via Reddit’s Powerups), but most of these will now be available for free.
  • TikTok’s head of global security stepped down. Someone had to pay for that security debacle which found that U.S. TikTok user data was being viewed in China. Global security head Roland Cloutier will be stepping down effective September 2 and will be replaced by Kim Albarella, who’s been appointed the interim head of TikTok’s Global Security Organization.
  • A children’s rights group called out TikTok for age-appropriate design issues, ahead of TikTok’s launch of new safety features. The group’s research looked at the default settings and terms various apps, including WhatsApp and Instagram, offer to minors across 14 different countries, including the U.S., Brazil, Indonesia and the U.K. The report noted TikTok was defaulting 17-year-olds to public accounts outside of certain EU markets and the U.K., lacked terms in people’s first languages and wasn’t being transparent about age requirements, among other things.
  • Instagram began testing a Live Producer tool that lets creators go live from their desktop using streaming software, like OBS,  Streamyard and Streamlabs. Only a small group of participants currently has access to the tool, which opens up access to using additional cameras, external mics and graphics.
  • Instagram also rolled out more features to its creator subscription test, including subscriber group chats, reels and posts for subscribers only, and a subscriber-only tab on a creator’s profile.
  • Twitter is testing custom timelines built by developers around specific themes, starting with a custom timeline for The Bachelorette in the U.S. This is the latest product that attempts to offer users different views into Twitter, along with Lists, Topics, Communities and Trending. It’s also now testing a feature that reminds users to add image descriptions for accessibility.

Twitter custom timeline. Image Credits: Twitter/Amir Shevat

  • Facebook started testing a way for users to have up to five separate profiles tied to a single account. The company said this would allow users to take advantage of different profiles for interacting with specific groups — like a profile for use with friends and another one for coworkers.
  • Activist investor Elliott Management told Pinterest that it has acquired a 9%+ stake in the company. Pinterest stock jumped more than 15% after hours on the news.

Messaging

  • WhatsApp rolled out the ability for users to react to messages using any emoji, instead of just the six it had offered previously. The feature is one of several WhatsApp developed for its broader Communities update, but it’s now being made available to all app users.
  • Meta’s smartglasses, Ray-Ban Stories, now let users make calls, hear message readouts and send end-to-end encrypted messages with WhatsApp. The glasses already support Messenger and offer other features like photo-taking and video recording, listening to music and more.

Dating

Image Credits: Match

  • Match Group is expanding its use of free background checks across more of its dating apps. The feature, powered by Garbo, was first launched on Tinder earlier this year. It’s now available on other Match Group apps, including Match and Stir.
  • Google has responded to Match Group’s antitrust lawsuit in a new court filing, which refers to Match’s original complaint as a “cynical attempt” to take advantage of Google Play’s distribution platform and other tools while attempting to sidestep Google’s fees. The two tech giants have been battling it out in court after Match sued Google this May over its alleged monopoly power in Android app payments. The companies have a temporary truce that sees Match setting aside its commissions in escrow while they await the court’s decision. If Google prevails, it wants to kick Match out of its app store altogether. 

Streaming & Entertainment

  • Truecaller is taking on Clubhouse — even though the hype has worn off over live audio. The caller ID app maker ventured into a new market with the launch of Open Doors, a live audio app that lets people communicate in real time. Unlike Clubhouse and others, the new app offers no rooms, invites, recording tools or extensive moderation features. It claims to only scan user contacts on the local device.
  • Netflix inked a deal with Microsoft for its upcoming ad-supported plan. According to reports, Netflix appreciated Microsoft’s approach to privacy and ability to iterate quickly. (It also helped that Microsoft isn’t a streaming competitor, like Comcast’s NBCU or Roku.)
  • Apple added a new perk for Apple Music subscribers, Apple Music Sessions, which gives listeners access to exclusive releases in spatial audio that have been recorded in Apple’s music studios around the world. The sessions began by featuring country artists, including Carrie Underwood and Tenille Townes.

Gaming

  • Twitter’s H1 2022 report found there were roughly 1.5 billion tweets about gaming on its platform, up 36% year-over-year. Genshin Impact (No. 1) and Wordle (No. 2) were the most tweeted-about games.

Reading & News

  • Upnext launched a read-it-later app and Pocket competitor for iOS, iPad and web. The app aims to differentiate itself by supporting anything users want to save, not just articles but also things like videos, podcasts, Twitter threads, PDFs and more. It then organizes this in a home screen that curates your collection with Daily Picks, and offers a swipe-based interface for archiving content.

Government & Policy

  • TikTok this week paused a privacy policy change in Europe after a regulator inquiry over how the platform planned to stop asking users for consent to receive targeted ads.
  • Confirming earlier reports, Kakao said it’s removing the external payment link from its KakaoTalk messaging app on the Play Store to come into compliance with Google’s terms, after being blocked from issuing updates. The move brought more attention to the policy and saw the regulator get involved in talks, which was likely the point of Kakao’s protest in the first place.
  • After an FTC commissioner urged the U.S. to ban TikTok, rival Triller reported a surge in users. Triller had pivoted to focus more on entertainment and events as TikTok established itself as the top short-form video platform in the U.S.

Funding and M&A

🤝 Match Group acquired the members-only dating app The League, which focuses on matching ambitious and career-focused professionals. The app has previously faced accusations it’s elitist, particularly because it screens and vets members after an application process instead of being open to all. Deal terms weren’t revealed.

🤝 Spotify acquired the Wordle-inspired music-guessing game Heardle for an undisclosed sum. The company believes the deal could help support music discovery in its app and could help drive organic social sharing. Heardle’s website had 41 million visits last month.

💰 Tutoring marketplace and app Preply raised $50 million in Series C funding led by edtech-focused Owl Ventures. The startup has 32,000 tutors from 190 countries teaching over 50 languages, it says, and claims to have grown revenues and users 10x since 2019.

🤝 Kids’ fintech app GoHenry acquired Pixpay to help it expand into Europe. The latter has 200,000 members across France and Spain. U.K.-based GoHenry has over 2 million users in the U.K. and U.S.

💰 Japan’s SmartBank raised $20 million in Series A funding for its prepaid card and finance app. The round was led by Globis Capital Partners. The startup claims 100,000+ downloads so far and is aiming for 1 million by the end of next year.

🤝 Israeli company ironSource is merging with the game development platform Unity Software, after the latter saw its share price fall over 70% in 2022, leaving it with a market cap of under $12 billion. ironSource went public a year ago at an $11.1 billion valuation and was valued at $4.4 billion at the time of the merger. Silver Lake and Sequoia will invest $1 billion in Unity after the merger.

💰 Consumer fintech startup Uprise raised $1.4 million in pre-seed funding from a range of investors. The company offers a website and app aimed at Gen Z users that takes in their full financial picture, including overlooked items like employer benefits, and offers recommendations.

💰 Indian fintech OneCard raised over $100 million in a Series D round of financing that values the business at over $1.4 billion. The company offers a metal credit card controlled by an app that also offers contactless payments. The startup has over 250,000 customers.

💰 Stori, a Mexican fintech offering credit cards controlled by an app, raised $50 million in equity at a $1.2 billion valuation and another $100 million in debt financing. BAI Capital, GIC and GGV Capital co-led the equity portion of the deal. The company claims to have seen 20x revenue growth in 2021, but doesn’t share internal metrics.

💰 U.K. stock trading app Lightyear raised $25 million in Series A funding led by Lightspeed. The startup said it’s launching in 19 European countries, including Germany and France.

Downloads

Linktree launches a native app

Linktree, a website that allows individuals, including online creators, to manage a list of links they can feature in their social media bios via a Linktree URL, launched its first mobile app this week. The new app for iOS and Android lets users create a Linktree from their phone, add and manage their links, customize their design and more. Users can also track analytics, sales and payments, among other things. (You can read more about the new app here on TechCrunch.)

Not gonna lie, this NGL lawsuit is kinda juicy

The anonymous Q&A app NGL climbed to the top of the App Store by tricking its users with questions it claims are sent in by their friends and by charging for useless hints about who supposedly wrote those messages. But many of the questions users receive aren’t from real people; they’re generated automatically — an idea NGL’s top competitor, the maker of the Sendit apps, is now alleging NGL’s maker stole alongside other confidential business information, according to a new lawsuit.

In a complaint filed on July 1, 2022, in the Superior Court of California, Sendit’s creator, Iconic Hearts Holdings, Inc. (previously known as FullSenders), claims that NGL acquired its trade secrets through “improper means” as a result of a breach of duties by the suit’s defendant, Raj Vir, an Instagram software engineer, who had worked on Sendit on the side.

For those who don’t keep up with teen app trends, both Sendit and NGL are leading anonymous Q&A apps, a subgroup of social apps currently popular among a younger demographic. The apps have been ranking at the top of the app stores’ charts for months, as anonymous apps typically do — before they implode from bullying or lawsuits, or get banned by the app stores themselves.

As of today, NGL is the No. 5 top (non-game) free app on the U.S. App Store. Since launching late last year, the company has generated over $2.4 million in revenue, according to third-party estimates. Sendit’s apps are currently ranked at No. 12 in Social Networking (Sendit) and No. 57 in Social Networking (Sendit — Q&A on Instagram), and have earned over $11 million, per data from Sensor Tower.

Both Sendit and NGL allow users to post links to their social accounts, like Instagram or Snapchat Stories, which friends can click on to send the poster anonymous questions. (Think: “who do you have a crush on?” and other teenage gossip.)

The recipient, in turn, receives the questions in the app’s inbox, and can then post their response to their social accounts for all to read. The apps monetize this activity by offering their users “hints” about the person asking the questions so they can find out who asked what.

While NGL focuses only on anonymous Q&As, Sendit offers two variations of its service. Its original app is aimed at Snapchat users and provides a variety of games in addition to the anonymous Q&A feature. Its newer app, meanwhile, brings anonymous Q&A’s to Instagram. It launched following Snapchat’s rollout of stricter policies earlier this year that banned anonymous apps from using its developer tools. (Sendit received an extension to come into compliance with those policies, Snapchat told us.)

The apps are problematic, however, because they’ve been demonstrated to be using misleading tactics to trick their young users into thinking they were receiving engagement from friends when they were not.

Both apps are also strikingly similar in their visual design, how they work, their business models and other aspects.

As it turns out, that may not have been an accident.

The recently filed Iconic Hearts lawsuit (see below) states that the company hired Vir to develop Sendit’s mobile apps back in September 2018. Vir then continued to consult with the company afterward, it says. In May 2021, Iconic Hearts began having conversations with Vir about offering him a full-time position or allowing him to continue as a contractor. But instead of taking the job, Vir took the company’s ideas and insights and used them to build his own version of Sendit’s app, the complaint explains.

“Vir was integral in founding, building, and launching ‘NGL – anonymous q&a,’ an app that is nearly identical to, and directly competes with, the Sendit apps,” reads the filing. It additionally details how Vir used his friendship with Iconic Hearts’ founder, Hunter Rice, and his role as a Sendit developer and consultant in order to gain information about the company and its apps. (Apparently, Rice and Vir weren’t just business colleagues, they were friends — former high school classmates who had bonded after college over their shared interest in tech, the filing notes.)

During Vir’s time working on Sendit’s apps, he had access to insider information — like which features drove the most user engagement and other future development plans, the lawsuit states. He had also signed a developer agreement, which forbade him from using this information for any other purpose beyond his work with the Sendit apps, it says.

Rice believes Vir was never serious about the job offered to him at Iconic Hearts, the complaint continues, but was instead using his ongoing access to build NGL, a copy of Sendit which launched in late 2021 on the App Store and soon became the App Store’s No. 1 app in June 2022.

The filing explains how Vir had access to detailed app data and KPIs (key performance indicators) and other metrics designed to make the app succeed. Because of his relationship with Sendit, Vir asked for and was given access to all sorts of business data and metrics — like click-through rates, conversion rates, which prompts were the highest performing, how they were ordered to create virality, the placement of call-to-action buttons, financial performance, MRR (monthly recurring revenue), churn rate, LTV (lifetime value), metrics related to average response rates, share counts, viral coefficients, and much more.

Among these business details was Sendit’s use of fake questions. The company had previously denied using bots when TechCrunch asked.

Many users of Sendit and NGL’s apps had already suspected some of the questions they received were not really coming from their friends, but had been automatically generated. The app stores are filled with user reviews that claim these apps are tricking them, then ripping them off by charging for unhelpful hints — like those that only share a user’s city or the type of phone they have.

TechCrunch also recently tested both NGL and Sendit’s anonymous Q&A systems by generating a link for questions without showing it to anyone, and yet we still received half a dozen so-called “questions from friends” in our inboxes.

This feature is actually detailed in the new lawsuit as one of the many aspects of Sendit’s apps that NGL supposedly stole. Reads the complaint:

Iconic Hearts had also developed a unique system, “Engagement Messages,” which sends content to a inbox if interactions with the user had been idle over a certain period of time. “Engagement Message” re-trigger a user to use the app. This generates more “shares” on the app, more density within a user’s trend network (i.e. more people sharing more times), which adds to an app’s saturation, the most critical measure of success and growth. It took Iconic Hearts years of trial-and-error, testing, and iterating its product to optimize its proprietary Engagement Messages System and various components thereof, such as the optimal period of time after which to send an Engagement Message, how the Engagement Message gets pushed, the design of the Engagement Message, and the content of the Engagement Message.
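
To make the mechanism concrete, here is a minimal, hypothetical sketch of how an idle-triggered “engagement message” system like the one described in the complaint might work. It is purely illustrative: the function names, the six-hour idle window and the canned prompts are our assumptions, not code or settings from Iconic Hearts or NGL.

```python
# Hypothetical illustration only; not Iconic Hearts' or NGL's actual code.
# Sketch of an idle-triggered "engagement message" system: if a user has had
# no activity for a set period, the app drops a canned, app-generated prompt
# into their inbox to pull them back in.
import random
import time

IDLE_THRESHOLD_SECONDS = 6 * 60 * 60  # assumed idle window before a nudge

CANNED_PROMPTS = [  # assumed example prompts
    "what's your biggest secret?",
    "who do you have a crush on?",
]

last_interaction: dict[str, float] = {}  # user_id -> last activity timestamp
inboxes: dict[str, list] = {}            # user_id -> received messages

def record_interaction(user_id: str) -> None:
    """Call whenever a user shares a link or receives a real message."""
    last_interaction[user_id] = time.time()

def send_engagement_messages() -> None:
    """Deliver an app-generated prompt to every user idle past the threshold."""
    now = time.time()
    for user_id, last_seen in last_interaction.items():
        if now - last_seen >= IDLE_THRESHOLD_SECONDS:
            inboxes.setdefault(user_id, []).append({
                "text": random.choice(CANNED_PROMPTS),
                "from_app": True,  # the kind of label NGL now surfaces to users
            })
            last_interaction[user_id] = now  # reset the timer so we don't spam
```

The real systems described in the complaint are, per the filing, far more elaborate, tuned over years around timing, design and message content, but the basic loop is the same: idle user, automated prompt, renewed engagement.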

This section essentially confirms users’ suspicions about the fake questions. It also now places a burden on the app stores to take action, we should think, as neither company discloses to its users that these “engagement messages” are not being sent by their friends, as the apps’ descriptions would lead them to believe.

Surprisingly, Iconic Hearts didn’t know of Vir’s betrayal until recently. Even as late as June 2022, Vir concealed his involvement with NGL, the complaint states. The lawsuit claims Vir finally admitted his involvement to Rice on June 21, 2022, by saying “okay, I’ll clear the air. I’ve been lying to your face this entire time. I am building NGL,” and then, “congratulations for being the Head of Product at NGL.”

Yikes, if true.

Neither party has responded to our requests for comment at this time.

To what extent Iconic Hearts will be able to prove its claims in court remains to be seen. The suit is asking for damages and injunctive relief. It also names dozens of unknown defendants who may be working or partnering with NGL, whom Iconic Hearts hopes the court will reveal and name.

ICONIC HEARTS HOLDINGS, INC. vs. RAJ VIR; NGL LABS LLC; and DOES 1 through 50, inclusive, by TechCrunch on Scribd

Anonymous social NGL tops 15M installs, $2.4M in revenue as users complain about being scammed

A popular anonymous social app called NGL has now topped 15 million global installs, according to new data from app intelligence firm Sensor Tower, released today. The app, which is now one of a handful of unregulated and potentially problematic anonymous apps targeting teens, has been swiftly climbing the charts since its December 2021 launch. But while NGL is now sitting in the top 10 in the U.S. App Store, much of its current growth is being driven by Android users in markets like India and Indonesia, Sensor Tower’s data indicates.

Emerging markets can help push new apps like NGL up the charts, where they gain even more attention from consumers. But in the case of NGL and others, there are reasons to be concerned about its rapid adoption. The app and some of its rivals have been accused of using bots to drive user engagement.

This is an issue because the app promises “anonymous” social interactions in the form of Q&A’s from online friends and then monetizes by offering “hints” as to who sent you those messages, which has allowed it to pull in millions. In other words, both the App Store and Google Play as well as the app developer itself appear to be profiting from an illegitimate operation that tricks people into thinking their friends are asking questions, when some of the questions are actually automated.

When TechCrunch recently tested NGL and its rival Sendit, we copied the provided short links into an Instagram Story that was only live for a mere moment before we took it down. This tricked the apps into thinking we were now awaiting anonymous questions from our friends. A few hours later, questions — supposedly from our friends who saw our link — appeared in the apps’ respective inboxes. But in reality, no one had seen our link, as it was never live long enough to be clicked by anyone, much less by the half dozen people who supposedly sent us messages, according to NGL.

It seems pretty clear from these tests that NGL is attempting to mislead users — and its low-ranked app store reviews are filled with complaints about its use of bots.

Keyword analysis for the U.S. finds reviews referencing words like “bot” or “bots,” “fake messages,” “fake questions,” “scam,” and other terms about the app’s tricks or how the user “wasted” ten dollars to get hints about who had asked the questions. The app’s five-star reviews, meanwhile, come across as fairly suspect as many repeat the same phrase: “very good.”

Image Credits: NGL app reviews via Sensor Tower

NGL charges users $9.99 per week for a subscription providing unlimited hints as to who sent the questions you receive. The hints aren’t useful, either, users said in the reviews, as they only offer broad details like location and device model. To date, Sensor Tower says NGL’s users have spent around $2.4 million in the app. The $9.99 unlimited hints subscription was the top in-app purchase, followed by the $1.99 per week unlimited hints option, the firm noted.

Despite these issues, the app continues to gain traction. NGL’s Google Play installs now outpace its App Store installs, with the app seeing some 6.5 million Android downloads in June compared with 4.4 million for iOS.

Indonesia is NGL’s largest market to date, accounting for approximately a fifth of NGL’s lifetime installs or about 3.1 million installs, 83% of which are from Google Play, Sensor Tower says. The U.S. is the app’s second-largest market by installs, representing nearly a fifth of its lifetime installs, or about 3 million installs. India represents about 10% of total installs with about 1.5 million.

So far this month, NGL’s growth trends have continued, with the app pulling in another 4.6 million installs as of July 10 (2.7 million installs on Android and 1.9 million on iOS). That’s up 107x from the 43,000 installs the app saw during the first 10 days of June, but it’s down by 41% when compared with the prior 10-day period (June 21-30), when the app saw 7.8 million installs. This indicates adoption levels may be somewhat normalizing since the surge in late June, but the app is continuing to grow steadily for the time being.

It’s not clear why the app stores have not taken action against this app and its rivals, given how trivially easy it was to trick this app and competitors into sending us fake questions. At the very least, when an app reaches the top of the app store, it begs for a closer inspection.

NGL has not commented on the new data or users’ complaints at this time.

Amid growing concerns around app addiction, TikTok rolls out more screen time tools

TikTok users regularly complain of hours lost on the platform, thanks to the video app’s unmatched ability to distract, entertain and engage users by way of its advanced recommendation technology. Its addictive nature has been the subject of numerous psychological studies and parents’ concerns, as TikTok becomes one of the most-used apps among children. In lieu of dialing back its digital dopamine dispenser, TikTok is today rolling out a new set of screen time features designed to put users in better control of their TikTok usage.

Among the new features are two new controls for monitoring and managing screen time usage, as well as a new digital well-being guide that’s being added to the app’s Safety Center.

The guide, titled “How can I reflect on my digital well-being with my family and friends?,” is meant to help users “reflect more holistically” about how they spend time online, TikTok said.

Image Credits: TikTok’s new Safety Guide

 

The new screen time features, meanwhile, are in addition to the screen time controls TikTok already launched for families, which have been available globally for over two years and include a way for parents to set screen time limits for children. The features are also supplemental to the daily screen time limits tool TikTok added in February 2020, which is available to all users from the app’s existing Digital Wellbeing section.

Instead, the newly launched in-app tool arriving today helps users control how much time they spend on TikTok in a single sitting by allowing them to schedule regular screen time breaks.

The feature aims to address app addiction issues that aren’t related to total consumption, but to losing track of how long you’re spending on TikTok each time you open the app. In other words, it’s a tool for managing screen time on a per-session basis.

Image Credits: TikTok

With the new tool, users can ask the app to remind them to take a break after a certain amount of time of their choosing.

By default, it suggests break reminders of either 10, 20 or 30 minutes, though users can set reminders for a custom time if they want longer or shorter sessions before being shown the notification. These default suggested times for session breaks are much shorter than the daily screen time limits the existing tool recommends. The latter asks users to set a daily screen time limit of 40, 60, 90 or 120 minutes.

TikTok notes the tool can also be snoozed or switched off at any time, which would allow people to use reminders as they saw fit. For instance, users may not want to limit screen time on a lazy weekend at home, but would want to limit their TikTok time during the work week when they’re trying to get to bed early.

Related to this addition, TikTok is rolling out a new screen time dashboard that will provide more information about how much time users are spending on the app. It includes summaries of the user’s daily time spent on the app, the number of times they opened the app, and a breakdown of daytime and nighttime usage. Users can also opt in to receive weekly notifications that remind them to review their dashboard.

Image Credits: TikTok

While the new screen time tools are rolling out to global users of any age, TikTok says it’s also introducing new screen time prompts for minor users between the ages of 13 and 17. Going forward, if a minor has been on the app for more than 100 minutes in a single day, TikTok will remind them of its new screen time limits tool the next time they open the app.

That such a feature even needs to exist suggests that many parents have still not bothered to configure TikTok’s existing parental controls, leaving the company to stand in as the digital parent.

Millennial and Gen X users — today’s parents — may have grown up with technology or used it throughout the majority of their adult lives, but a surprising number still do not supervise their children’s screen time and digital device usage. A 2021 Kaspersky study found only half of U.S. parents were using parental control apps and only 44% supervised children’s use of devices. In other markets, the adoption of parental controls may be even lower. A 2020/2021 Ofcom study, for instance, suggested that around six in ten U.K. parents were aware of digital parental controls, but only around a third actually used them.

Image Credits: TikTok well-being page

The rollout of new screen time features follows a recent April 2022 investigation by the Wall Street Journal into the impact TikTok is having on children’s brains. The report cited a widely reported scientific study that examined how the app’s algorithm activates the reward centers of the brain — including those associated with addiction. While the study focused on college students and young adults, not kids, it found that about 5.9% of TikTok users may have “significant problematic use.”

Young people have an even harder time managing their screen time, the WSJ reported, because their brain’s prefrontal cortex — which governs impulse control and decision-making — isn’t fully developed until age 25.

Arguably, TikTok’s screen time tools, including those for minor children, are more robust than those on rival platforms like Instagram and YouTube due to the granularity of TikTok’s controls. And they’re well ahead of Snapchat, which has yet to launch its own parental controls. But given the now well-documented health impacts of addictive social apps — particularly on children’s brains — it’s expected regulators will soon step in to exert more control over the market so it’s not up to app makers to pick and choose what tools to offer and how they should work.

Snapchat’s stricter policies for anonymous apps and friend finders aren’t yet fully enforced

A small handful of Snap Kit platform developers have not yet complied with the new guidelines around anonymous messaging and friend-finding apps announced in March. The Snapchat maker revamped its developer platform policies on March 17, 2022, to ban anonymous apps and require developers to build friend-finding apps to limit access to users 18 or older. The policy changes were effective immediately and existing developers were given 30 days to come into compliance — a date that would have passed last month.

It is now mid-May and some developers of the newly banned and restricted apps are not yet meeting Snap’s new requirements, we’ve found.

Snap says only a small number of developers asked for and were granted additional time to bring apps into compliance, as they worked in good faith to make the necessary changes. But it may be difficult for consumers to tell which apps are compliant, which are skirting the new rules, and which are marketing Snap Kit integrations that they actually don’t have.

For example, one of the apps that was offered an extension is Sendit, the anonymous Q&A app that surged to the top of the App Store last year after Snap suspended other top anonymous Q&A apps, YOLO and LMK. Those latter two apps had been banned from Snap’s platform after the company was sued by the mother of a teen who died by suicide after being bullied via those tools. This year, Snap was named in a second lawsuit, alongside Meta, related to an alleged lack of safeguards across social media platforms, which a mother says contributed to her 11-year-old’s suicide.

Snap has since conducted a review of its platform policies with a focus on potential child safety issues related to third-party apps that integrate with Snapchat’s features and functionality.

This culminated in the new policies introduced in March, which impact apps built using the Snap Kit platform. This suite of developer tools allows third-party apps to offer sign-in with Snapchat for user verification, or to utilize Snapchat features like Snap’s Camera, Bitmoji, Stories and more.

At the time Snap announced its new policies, it said the changes would impact a small subset of its over 1,500 developers in its wider community. Around 2% would be impacted by the ban on anonymous messaging apps, Snap said, and another 3% would be impacted by the new requirement to age-gate friend-finding apps.

Sendit appeared to be non-compliant as it was not utilizing a required feature, as specified by Snap’s own developer documentation.

Here, Snap offered an example of how something called “Identity Web View” could be adopted by third-party developers who today use Snap Kit to build anonymous Q&A apps. The feature would allow anonymous Q&A apps to come into compliance with the new policies, as it requires apps to present a new modal that users must click to send their Bitmoji avatar URL and abbreviated Display Name to the third-party application. Then, they can use the third-party app to post their question — but no longer anonymously. Essentially, it allows Q&A apps to continue to function in much the same way as before, but without the potential dangers of anonymous bullying — the user is identified.

Sendit, however, doesn’t currently use this modal even though it’s the example shown in the developer documentation screenshot. But Snap says the developer asked for more time to make these changes, which was granted. Snap believes the app, currently No. 8 in the Lifestyle section on the App Store, will soon come into compliance.

Image Credits: Snap, via Snapchat Developer documentation

Other third-party apps also appear to be still operating as usual and, at first glance, seem to not be in compliance with Snap’s new policies.

Apps skirting the rules — or operating outside them?

But this is where things get more tricky — some apps have been granted an extension, some are routing around Snap’s rules, and others are marketing themselves as Snapchat-connected apps when they’re not actually using the SDK.

For instance, the app LMK — to be clear, a different LMK than the original LMK app that was banned last year — is still offering its “anonymous polls” app, which integrates with Snapchat’s features. The app is rated 12+ on the App Store and is functioning as usual. But LMK was among those that requested, and were granted, an extension.

Anonymous messaging apps HMU, rated ages 9+ on the App Store, and Intext, rated ages 4+, are also still operating. Both advertise themselves as Snapchat-connected apps. But Intext has been banned from Snap’s platform — you’ll get an error if you try to authenticate with Snapchat using the app’s “Login with Snapchat” button.

HMU appears to have skirted the ban, however. Its app still works.

Meanwhile, a number of friend-finding apps, like Hoop, Wink, Swipr, Purp, and Dope — all of which are now supposed to be only available to adult users  — are still published on the App Store with an age rating of 12+, as of the time of writing. If Snap had vetted and approved them, then they would have the highest age rating on the App Store, which is 17+. (Apple should change this to 18+!)

Confusingly, these apps’ lower age ratings don’t necessarily mean all the apps are breaking Snap’s policies. As it turns out, some of these apps are simply positioning themselves as being Snapchat-connected in their marketing materials — like their App Store screenshots. But in reality, they’re working around their lack of access to Snap’s SDK in other ways.

For instance, Hoop’s App Store page says it’s for making friends on Snapchat, and yet it’s downloadable to anyone aged 12 or up. If it was a Snap Kit platform app, then it would be in violation. But Hoop is not in violation because it’s no longer using the SDK. (But who could tell?!)

Image credit: Hoop

Instead, Hoop has users enter their Snapchat username during onboarding and provides an in-app Snapchat button to “request” someone share their username with you. It’s a workaround that allows the app to still function as a tool for finding friends on Snapchat, but allows the app to operate without relying on developer access to the SDK. But this sort of deviousness on developers’ parts could cause complications for Snap in the future, as it faces potential litigation and regulations related to platform safety.

Requests for comment to the third-party app makers themselves were not returned.

For parents, this lack of consistency across the Snapchat app ecosystem also means they can’t rely on Apple’s or Google’s built-in parental controls to block the Snapchat friend-finding apps from being downloaded to their child or younger teen’s device. And, once in the hands of younger users, bypassing the age gate is as simple as entering a fake birthdate.

Snap tells us that since it announced the new policies, it has removed the vast majority of apps that were out of compliance.

But given the extent to which apps are skirting the rules, it could be more useful if the app stores themselves would integrate these same guidelines into their own app review processes. Or perhaps this is all a sign that regulation, in fact, is needed to protect children and teens from accessing experiences that are either potentially harmful or designed for adults.

After all, Snapchat shares the top charts with other apps that cater to a younger, often teenaged, user base — and the rules that apply to it should apply to them as well.

For instance, one app eating into the Gen Z market is BeReal, a newer app that prompts users for spontaneous photos. BeReal has surpassed 10 million cumulative downloads, according to estimates from app intelligence firm data.ai (formerly App Annie). Some 3.3 million of those downloads took place in Q1 alone, and the majority of users in key markets are Gen Z, the firm said.

Another app, LiveIn, caters to Gen Z as well by allowing users to post photos to each other’s Home Screens via a widget — a feature BeReal also adopted. It’s now No. 2 on the U.S. App Store’s Top Apps chart, while its rival, Locket Widget, is No. 24.

These apps offer experiences that not only cater to Snapchat’s core demographic, but also overlap in some ways with what Snapchat is used for — fun, off-the-cuff photos that aren’t meant to stick around. While Snapchat is still growing, its rivals could expand their own platforms to adopt more Snapchat-like features over time, at which point they could also become a cause for concern if they ventured into similar anonymous Q&A or friend-finding spaces.

For now, however, these apps present a different type of threat: one that could see Snapchat losing its users’ time and engagement as they try out new ways to connect with friends.

Does it smell like teen spirit, or teen bankruptcy?

Hello and welcome back to Equity, a podcast about the business of startups, where we unpack the numbers and nuance behind the headlines.

This was a live week! Which meant that Mary Ann Azevedo was on the mic with Alex Wilhelm, and Grace Mendenhall, our ever-trusty producer, helped us power through. A big shoutout to Dennis, Julio, and Yashad for getting all the tech working well.

Right, what did we dive into during our live taping? A lot!

Equity is back Monday! Chat then!

Equity drops every Monday at 7 a.m. PDT and Wednesday and Friday at 6 a.m. PDT, so subscribe to us on Apple Podcasts, Overcast, Spotify and all the casts.

Following suicides and lawsuits, Snapchat restricts apps building on its platform with new policies

After a bullied teen died by suicide, a grieving mother last year sued the platform where the abuse had taken place — Snapchat — for not doing enough to protect its younger users. Another lawsuit, related to another suicide, followed last month. In response to the former, Snap banned the anonymous messaging apps that had facilitated online bullying and vowed to revamp its policies to address what sort of Snapchat-connected experiences could be built using its developer tools. Today, the company announced the results of its policy review and the changes it’s making.

Effective immediately for new developers building on its Snap Kit platform, Snap is banning anonymous messaging apps and will require anyone building friend-finding apps to limit those apps to users 18 and up. Existing developers are being given 30 days to come into compliance with the new policies.

These changes are limited to third-party apps integrated with Snapchat but don’t address Snap’s wider platform safety issues.

Snap says the policy update will impact a small subset of its community of over 1,500 developers. Only around 2% of developers will be impacted by the prohibition on anonymous messaging apps, while another 3% will be impacted by the new requirement to age-gate their apps. The company also noted that developers who remove anonymous messaging from their apps can have their apps re-reviewed and remain a Snap Kit partner.

Sendit, one app that greatly benefited from the earlier ban on anonymous messaging apps YOLO and LMK, is among those that will need to make changes in order to continue to work with Snapchat. In a matter of months following the bans, Sendit had gained millions more downloads from teens who still wanted a way to post anonymous Q&As.

The draw of anonymous social apps is unquestionable, especially for young people. But over the years, time and again, it’s been proven that such apps cannot be used responsibly and can result in devastating consequences. From the early MySpace days to the teen suicides linked to Ask.fm to the unfortunately well-funded anonymous apps like Secret and Yik Yak (neither of which lasted), anonymity in the hands of young people has been tested and has consistently failed. Considering this history, it was arguably irresponsible to permit this sort of activity on Snapchat in the first place, given its core demographic of teens and young adults.

In addition to the anonymous messaging ban, Snap will also now limit friend-finding apps to adult users ages 18 and up.

Friend-finding apps are designed to connect users with strangers on Snapchat, can encourage people to share their personal information, and are a common avenue for child predators to reach younger, vulnerable Snapchat users. Often, the apps are used for dating purposes or sexting, not “friend-finding,” and can be filled with porn bots. For years, law enforcement officials and child safety experts have warned about child predators on Snapchat and dubbed friend-finding apps as “Tinder for teens.”

Issues with these apps continue today. For example, an investigation published last month by The Times detailed the rampant sexual abuse and racism taking place on one of these apps, Yubo.

The anonymous messaging ban and restrictions on friend-finding apps are the only two major changes being made to Snap’s policies today, but the company notes that developers’ apps will still have to go through a review process where they have to answer questions about their use cases and demo their proposed integrations. Snap also said it will conduct periodic reviews every six months to ensure the functionality of the apps hasn’t changed in a way that would violate its policies. Any developer who intentionally seeks to deceive Snap will be removed from Snap Kit and the developer platform altogether, it added.

“As a platform that works with a wide range of developers, we want to foster an ecosystem that helps apps protect user safety, privacy and well-being while unlocking product innovation for developers and helping them grow their businesses,” a Snap spokesperson said in reference to the policy updates. “We believe we can do both, and will continue to regularly evaluate our policies, monitor app compliance, and work with developers to better protect the well-being of our community.”

Snap’s platform safety still needs work

While the changes impact third-party apps integrating with Snapchat, the company has yet to address the broader child safety issues on its platform via other features, like age-gated experiences or its promised parental controls. Instead, today’s changes represent a first step toward what could be a lot more work ahead for the company in terms of child safety.

But platform safety is already top of mind for social media companies industry-wide as regulatory pressure heats up. In its case, Snap was hauled before Congress last fall to answer lawmakers’ questions over various safety issues impacting minors and young adults using its app, including the prevalence of eating disorder content and adult-oriented fare that’s inappropriate for Snapchat’s younger teenage users but not blocked by an age gate.

Snap was also sued this January by another family that lost their child to suicide after she succumbed to pressure to send sexually explicit photos that were later leaked among her classmates. The complaint states that Snapchat’s lack of verification of the child’s age and its use of disappearing messages contributed to her death. In addition, the suit mentions how anonymous messaging played a role, though it doesn’t directly reference the use of third-party anonymous apps.

In the same month, Snap addressed other issues with its friend recommendation feature to make it harder for drug dealers to connect with teens on the app. The problem had been the subject of an NBC News investigation that connected Snapchat with the sale of fentanyl-laced pills that had killed teens and young adults in over a dozen states.

Prior to that, the company faced lawsuits for its “speed filter” that let users take photos that showed how fast they were going. The filter contributed to numerous car accidents, injuries, and even deaths over the years, but wasn’t disabled at driving speed until 2021. (Snap declined to comment on this matter because litigation is pending.)

Now that lawmakers are finally looking to rein in the Wild West days of Big Tech, where growth and engagement were consistently prioritized over user safety, Snap has been preparing to make changes. It hired its first-ever head of platform safety, Jacqueline Beauchere, in September.

Snap CEO Evan Spiegel in October also said the company was developing parental control tools. These tools — which would follow the launch of parental controls on TikTok and, just this week, Instagram — will allow parents to see who their teens are talking to on the app.

Snap hasn’t said if the tools will address parents’ other concerns — including a way for parents to disable the child’s access to sending or receiving disappearing messages, restrict friend requests or require approvals, block the child from sharing photos and other media, or hide the adult-oriented (and often clickbait-y) content that features prominently in the app’s Discover section.

“We want to help provide ways for parents and teens to partner together to ensure their safety and well-being online — similar to the ways parents help prepare their kids in real life,” a Snap spokesperson said of the parental controls. “We hope that these new tools will serve as a conversation starter between parents and their teens about how to be safe online.”

The company said its initial suite of parental controls is on track for a launch this year. The developer policy changes are live now.

If you or someone you know is struggling with depression or has had thoughts of harming themselves or taking their own life, The National Suicide Prevention Lifeline (1-800-273-8255) provides 24/7, free, confidential support for people in distress, as well as best practices for professionals and resources to aid in prevention and crisis situations.

TikTok updates its policies with focus on minor and LGBTQ safety, age appropriate content and more

Months after TikTok was hauled into its first-ever major congressional hearing over platform safety, the company is today announcing a series of policy updates and plans for new features and technologies aimed at making the video-based social network a safer and more secure environment, particularly for younger users. The changes attempt to address some concerns raised by U.S. senators during their inquiries into TikTok’s business practices, including the prevalence of eating disorder content and dangerous hoaxes on the app, which are particularly harmful to teens and young adults. In addition, TikTok is laying out a roadmap for addressing other serious issues around hateful ideologies, LGBTQ safety and minor safety — the latter of which will involve having creators designate the age-appropriateness of their content.

TikTok also said it’s expanding its policy to protect the “security, integrity, availability, and reliability” of its platform. This change follows recent news that the Biden administration is weighing new rules for Chinese apps to protect U.S. user data from being exploited by foreign adversaries. The company said it will open cyber incident monitoring and investigative response centers in Washington, D.C., Dublin and Singapore this year, as part of this expanded effort to better prohibit unauthorized access to TikTok content, accounts, systems and data.

Another one of the bigger changes ahead for TikTok is a new approach to age-appropriate design — a topic already front of mind for regulators.

In the U.K., digital services aimed at children now have to abide by legislative standards that address children’s privacy, tracking, parental controls, the use of manipulative “dark patterns” and more. In the U.S., meanwhile, legislators are working to update the existing children’s privacy law (COPPA) to add more protection for teens. TikTok already has different product experiences for users of different ages, but it now wants to also identify which content is appropriate for younger and older teens versus adults.

Image Credits: TikTok’s age-appropriate design

TikTok says it’s developing a system to identify and restrict certain types of content from being accessed by teens. Though the company isn’t yet sharing specific details about the new system, it will involve three parts. First, community members will be able to choose which “comfort zones” — or levels of content maturity — they want to see in the app. Parents and guardians will also be able to use TikTok’s existing Family Pairing parental control feature to make decisions on this on behalf of their minor children. Finally, TikTok will ask creators to specify when their content is more appropriate for an adult audience.

Image Credits: TikTok’s Family Pairing feature
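TikTok hasn’t shared how these three inputs will fit together, but conceptually the routing could look something like the sketch below. Every tier name, field and rule here is an assumption for illustration, not TikTok’s actual system.

```typescript
// Assumed maturity tiers, ordered from least to most mature.
type MaturityLevel = "general" | "older_teen" | "adult";
const order: MaturityLevel[] = ["general", "older_teen", "adult"];

interface ViewerSettings {
  comfortZone: MaturityLevel;  // the "comfort zone" a user picks for themselves
  guardianCap?: MaturityLevel; // an optional ceiling set via Family Pairing
}

interface Video {
  creatorDesignation: MaturityLevel; // creators tag content meant for adults
}

// A video is shown only if its designation doesn't exceed the stricter of the
// viewer's own choice and any guardian-imposed cap.
function canShow(video: Video, viewer: ViewerSettings): boolean {
  const viewerLevel = order.indexOf(viewer.comfortZone);
  const capLevel = viewer.guardianCap !== undefined
    ? Math.min(viewerLevel, order.indexOf(viewer.guardianCap))
    : viewerLevel;
  return order.indexOf(video.creatorDesignation) <= capLevel;
}
```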

“We’ve heard directly from our creators that they sometimes have a desire to only reach a specific older audience. So, as an example, maybe they’re creating a comedy that has adult humor, or offering kind of boring workplace tips that are relevant only to adults. Or maybe they’re talking about very difficult life experiences,” explained Tracy Elizabeth, TikTok’s U.S. head of Issue Policy, who oversees minor safety for the platform, in a briefing with reporters. “So given those varieties of topics, we’re testing ways to help better empower creators to reach the intended audience for their specific content,” she noted.

Elizabeth joined TikTok in early 2020 to focus on minor safety and was promoted into her new position in November 2021, which now sees her overseeing the Trust & Safety Issue Policy teams, including the Minor Safety, Integrity & Authenticity, Harassment & Bullying, Content Classification and Applied Research teams. Before TikTok, she spent over three and a half years at Netflix, where she helped the company establish its global maturity ratings department. That work will inform her efforts at TikTok.

But, Elizabeth notes, TikTok won’t go as far as having “displayable” ratings or labels on TikTok videos, which would allow people to see the age-appropriate nature of a given piece of content at a glance. Instead, TikTok will rely on categorization on the back end, which will lean on having creators tag their own content in some way. (YouTube takes a similar approach, as it asks creators to designate whether their content is either adult or “made for kids,” for example.)

TikTok says it’s running a small test in this area now, but has nothing yet to share publicly.

“We’re not in the place yet where we’re going to introduce the product with all the bells and whistles. But we will experiment with a very small subset of user experiences to see how this is working in practice, and then we will make adjustments,” Elizabeth noted.

Image Credits: TikTok

TikTok’s updated content policies

In addition to its plans for a content maturity system, TikTok also announced today it’s revising its content policies in three key areas: hateful ideologies, dangerous acts and challenges, and eating disorder content.

While the company had policies addressing each of these subjects already, it’s now clarifying and refining these policies and, in some cases, moving them to their own category within its Community Guidelines in order to provide more detail and specifics as to how they’ll be enforced.

In terms of hateful ideologies, TikTok is adding clarity around prohibited topics. The policy will now specify that practices like deadnaming or misgendering, misogyny, or content supporting or promoting conversion therapy programs will not be permitted. The company says these subjects were already prohibited, but it heard from creators and civil society organizations that its written policies should be more explicit. GLAAD, which worked with TikTok on the policy, shared a statement from its CEO Sarah Kate Ellis in support of the changes, noting that this “raises the standard for LGBTQ safety online” and “sends a message that other platforms which claim to prioritize LGBTQ safety should follow suit with substantive actions like these.”

Another policy being expanded focuses on dangerous acts and challenges. This is an area the company recently addressed with an update to its Safety Center and other resources in the wake of upsetting, dangerous and even fatal viral trends, including “slap a teacher,” the blackout challenge and another that encouraged students to destroy school property. TikTok denied hosting some of this content on its platform, saying, for example, that it found no evidence of any asphyxiation challenges on its app, and claiming “slap a teacher” was not a TikTok trend. However, TikTok still took action to add more information about challenges and hoaxes to its Safety Center and added new warnings when such content was searched for on the app, as advised by safety experts and researchers.

Today, TikTok says dangerous acts and challenges will also be broken out into their own policy, and it will launch a series of creator videos as part of a broader PSA-style campaign aimed at helping TikTok’s younger users better assess online content. These videos will relay the message that users should “Stop, Think, Decide, and Act” when they come across online challenges — meaning, take a moment to pause, consider whether the challenge is real (or check with an adult, if unsure), decide if it’s risky or harmful, then act by reporting the challenge in the app and by choosing not to share it.

Image Credits: TikTok

On the topic of eating disorder content — a major focus of the congressional hearing not only for TikTok, but also for other social networks like Instagram, YouTube and Snapchat — TikTok is taking more concrete steps. The company says it already removes “eating disorder” content, like content that glorifies bulimia or anorexia, but it will now broaden its policy to restrict the promotion of “disordered eating” content. This term aims to encompass other early-stage signs that can later lead to an eating disorder diagnosis, like extreme calorie counting, short-term fasting and even over-exercise. This is a more difficult area for TikTok to tackle because of the nuance involved in making these calls, however.

The company acknowledges that some of these videos may be fine by themselves, but it needs to examine what sort of “circuit breakers” can be put into place when it sees people becoming trapped in filter bubbles where they’re consuming too much of this sort of content. This follows news TikTok announced in December, when the company shared how its product team and trust and safety team began collaborating on features to help “pop” users’ filter bubbles in order to lead them, by way of recommendations, into other areas for a more diversified experience.

While this trio of policy updates sounds good on paper, enforcement here is critical — and difficult. TikTok has had guidelines against some of this content, but misogynistic and transphobic content has slipped through the cracks, repeatedly. At times, violative content was even promoted by TikTok’s algorithms, according to some tests. This sort of moderation failure is an area TikTok says it aims to learn from and improve on.

“At TikTok, we firmly believe that feeling safe is what enables everybody’s creativity to truly thrive and shine. But well-written, nuanced and user-first policies aren’t the finish line. Rather, the strength of any policy lies in enforceability,” said TikTok’s policy director for the U.S. Trust & Safety team, Tara Wadhwa, about the updates. “We apply our policies across all the features that TikTok offers, and in doing so, we absolutely strive to be consistent and equitable in our enforcement,” she said.

At present, content goes through technology that’s been trained to identify potential policy violations, which results in immediate removal if the technology is confident the content is violative. Otherwise, it’s held for human moderation. But this lag time impacts creators, who don’t understand why their content is held for hours (or days!) as decisions are made, or why non-violative content was removed, forcing them to submit appeals. These mistakes — which are often attributed to algorithmic or human errors — can make the creator feel personally targeted by TikTok.
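In rough terms, the triage described above amounts to routing on a confidence score. The sketch below is an illustration only; the thresholds and names are assumptions, not TikTok’s actual pipeline.

```typescript
// Illustrative triage: automated removal for high-confidence violations,
// a human review queue for borderline cases, and publication otherwise.
interface Classification {
  videoId: string;
  violationScore: number; // 0..1 confidence that the video violates policy
}

type Decision =
  | { action: "remove"; reason: string }
  | { action: "human_review" } // this queue is where creators feel the lag time
  | { action: "publish" };

const AUTO_REMOVE_THRESHOLD = 0.95; // assumed cutoff for automated removal
const REVIEW_THRESHOLD = 0.5;       // assumed cutoff for sending to a moderator

function triage(c: Classification): Decision {
  if (c.violationScore >= AUTO_REMOVE_THRESHOLD) {
    return { action: "remove", reason: "high-confidence policy violation" };
  }
  if (c.violationScore >= REVIEW_THRESHOLD) {
    return { action: "human_review" };
  }
  return { action: "publish" };
}
```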

To address moderation problems, TikTok says it’s invested in specialized moderator training in areas like body positivity, inclusivity, civil rights, counter speech and more. The company claims around 1% of all videos uploaded in the third quarter of last year — or 91 million videos — were removed through moderation policies, many before they ever received views. The company today also employs “thousands” of moderators, both as full-time U.S. employees as well as contract moderators in Southeast Asia, to provide 24/7 coverage. And it runs post-mortems internally when it makes mistakes, it says.

However, problems with moderation and policy enforcement become more difficult with scale, as there is simply more content to manage. And TikTok has now grown big enough to be cutting into Facebook’s growth as one of the world’s largest apps. In fact, Meta just reported Facebook saw its first-ever decline in users in the fourth quarter, which it blamed, in part, on TikTok. As more young people turn to TikTok as their preferred social network, it will be pressed not just to say the right things, but to actually get them right.