Datagen raises $50 million Series B to empower computer vision teams

Datagen raised a $50 million Series B round to boost the growth of its synthetic data solution for computer vision (CV) teams, bringing its total funding to over $70 million, the Israel-born company announced today. The round was led by new investor Scale Venture Partners, with partner Andy Vitus joining Datagen’s board of directors.

With offices in Tel Aviv and New York, Datagen “is creating a complete CV stack that will propel advancements in AI by simulating real world environments to rapidly train machine learning models at a fraction of the cost,” Vitus said. The Palo Alto-based VC predicts that “this will fundamentally transform the way computer vision applications are developed and tested.”

Investors that had backed Datagen’s $18.5 million Series A round 11 months ago participated in this new round. This includes VC firms TLV Partners and Spider Capital, as well as Series A leader Viola Ventures, this time also through its growth arm Viola Growth. High-profile individuals from the AI and data field doubled down too, such as computer scientists Michael J. Black and Trevor Darrell, NVIDIA’s director of AI Gal Chechik, and Kaggle’s CEO Anthony Goldbloom.

The list of investors could get longer, Datagen’s CEO Ofir Zuk (Chakon) told TechCrunch. Although the round closed a few weeks ago, the startup left “a small part in deferred closing” with a few names that remain to be confirmed.

One of Datagen’s main milestones since its Series A was building a self-serve platform that its target users demanded in their early feedback, Zuk said. This gives Datagen a more scalable way to help clients generate the visual data that they need to train their computer vision applications.

Datagen’s solution is used by computer vision teams and machine learning engineers inside a variety of organizations, including some Fortune 100 and ‘big tech’ companies. It has a wide range of applications, but there are four that are accelerating faster than others, Zuk said: AR/VR/metaverse, in-cabin automotive and automotive in general, smart conferencing, and home security.

In-cabin automotive is a good example to better understand what Datagen does. The term refers to what happens inside a car, such as whether or not the passenger is wearing a seatbelt. Passengers and cars come in many forms, which is where AI comes in handy. Based on some initial real-life 3D motion capture, Datagen lets its customers generate the much larger quantity of data that they need to, for instance, decide where exactly an airbag should be deployed.
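
To make that idea concrete, here is a minimal, hypothetical sketch of this kind of expansion: a handful of captured 3D poses is turned into a much larger set of synthetic variants by applying randomized body-size and posture changes. The array shapes, function names and parameters are illustrative assumptions, not Datagen’s actual pipeline.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def expand_poses(seed_poses: np.ndarray, samples_per_seed: int = 100,
                 jitter_cm: float = 2.0) -> np.ndarray:
    """Expand captured poses into synthetic variants.

    seed_poses: (n_seeds, n_joints, 3) joint positions in centimetres.
    Returns (n_seeds * samples_per_seed, n_joints, 3).
    """
    variants = []
    for pose in seed_poses:
        # Random scaling stands in for different body sizes; Gaussian jitter
        # stands in for natural variation in posture between occupants.
        scales = rng.uniform(0.9, 1.1, size=(samples_per_seed, 1, 1))
        jitter = rng.normal(0.0, jitter_cm, size=(samples_per_seed, *pose.shape))
        variants.append(pose * scales + jitter)
    return np.concatenate(variants, axis=0)

# Example: 10 captured poses of 17 joints each become 1,000 synthetic poses.
seeds = rng.normal(size=(10, 17, 3)) * 30.0
print(expand_poses(seeds).shape)  # (1000, 17, 3)
```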

We just touched on the common thread of synthetic data: it takes real-world data and extrapolates it into the kind of data that companies need more and more of, data that is plentiful and enriched to remove bias, cover edge cases, and more.

Datagen’s focus is visual data, but it isn’t tied to a sector in particular. If use cases in retail and robotics take off, for example, it will only need to collect specific real-life data, such as motion capture from warehouses. The algorithms and technology on top of this are domain-agnostic, Zuk said.

A twenty-plus-year-old enterprise-focused VC firm, Scale already invested in automotive simulation platform Cognata, and is bullish about simulated data. So is Zuk: “Synthetic data is taking over real data,” he summed up.

TrueCircle scoops $5.5M to use AI to drive recycling efficiency

UK-based TrueCircle, a computer vision startup founded just last year, has nabbed $5.5 million in pre-seed funding in a bid to bring data-driven AI to the recycling industry to improve recovery rates and quality — with the overarching goal of transforming the economics of waste reuse to shrink demand for virgin materials.

So far the startup has its tech up and running in eight UK waste sorting facilities but is ramping up quickly, with more launches coming in Q2 — when it will be expanding internationally into Europe and the US.

It tells TechCrunch it’s shooting to have some 30-40 customers using its tech within 12 months’ time. 

The pre-seed is notable for its size. The round is led by Chris Sacca’s climate-focused Lowercarbon Capital fund, with participation from Passion Capital, Giant Ventures and firstminute Capital, as well as the founders of companies including Revolut, Monzo, Infarm and Unity investing in a personal capacity.

Commenting on TrueCircle’s pre-seed raise in a statement, Lowercarbon Capital’s Clay Dumas, said: “Single-use plastic is a 300 million tonne scourge on our oceans and landfills that keeps the petrochemical industry in business. We backed TrueCircle because they’re harnessing technology and markets to build a solution that scales to the dimensions of the problem.”

TrueCircle’s two co-founders, Eamon Jubbawy and Rishi Stocker, are not new to the startup game. (Indeed, Jubbawy actually has two startups on the go at once right now; the other being an a16z-backed fintech called Sequence.)

The pair, who originally met at school, tout a lot of relevant tech and business smarts they’re bringing to bear here: computer vision experience from Onfido, another of Jubbawy’s startups, where he built up a computer vision team focused on identity document verification and face matching (he left Onfido in summer 2020); and commercial experience from fintech startup Revolut, where Stocker was one of its first employees and spent four years running global partnerships. He also previously worked at FMCG giant Unilever, and says he’s no stranger to the challenges of increasing packaging recycling rates.

Recycling isn’t the most glamorous topic, of course, but low levels of efficiency in the waste processing industry are a pressing problem from multiple angles — not least when combined with humanity’s urgent need to radically shrink global consumption in order to cut emissions and avoid catastrophic climate change — meaning there are real, meaningful problems here that tech could help solve.

Problems that scale all over the globe, too. So the disruption potential — and revenue ‘opportunities’ — look huge.

Regulation is also driving a lot more attention to what’s passing down the conveyor belts, as lawmakers start to impose conditions on use of virgin materials for things like packaging — actively changing the economics of recycling.

Equally, widespread public anger over direct environmental impacts of discarded waste, like single-use plastic polluting the oceans and creating a risk to marine life, is creating energy for change.

Meanwhile AI-driven efficiency gains — and the digitalization of industrial processes more generally — are being looked to specifically as a way to address climate change, including by policymakers in the European Union who are pushing a combined ‘green and digital’ transformation investment strategy for the bloc to try to hit net zero carbon emissions by 2050.

“The beauty of [our approach] is if you scale it up across the tonnage that’s been processed in the world today it’s a very scalable business model — if we were to just focus on this data-as-a-service business but our ambitions don’t stop there,” says Stocker. “I think this is the thing that gets us all super excited. We have a chance here to disrupt a $20BN per year industry through a much more digitalized trading infrastructure.”

“Historically, attempts to revolutionize this industry were maybe a bit more academic and technology based but I think the approach which we’ve taken, from our experience of building and commercializing technology companies — at Onfido; Rishi was heading up a lot of our monetization strategies at Revolut — we’ve realized you need a lot more than that,” adds Jubbawy.

“You need great tech but you also need to find a way to make this industry work commercially. Hence turning our focus to getting the sales process working really effectively because that’s just another reason why the recycling industry hasn’t been given the attention that it should.”

Rewinding slightly, TrueCircle’s founders are starting with a pretty elementary idea: applying computer vision technology to the waste streams flowing through processing facilities so it can provide its customers with real-time flows of data on what’s passing through their plants — powering waste analytics and alerts.

This means the startup is installing connected camera and lighting kits in its customers’ waste processing plants — and doing that free of charge, since the business model is a SaaS-style fee based on the tonnage scanned.

“What we realized is these facilities — their major issue today is they have absolutely no data,” explains Stocker. “It’s a completely data-sparse industry.

“In the absence of any data, on the composition of waste coming in, and more importantly the actual quality they’re able to achieve on those distinct output lines, they come across the same problem again and again: 1) Their buyers don’t trust what they’re purchasing so they always get underpaid for the materials. And 2) they actually don’t know when there are issues in their plant because they have no way of capturing real-time data.”

“That was the real lightbulb moment for us — especially where Eamon’s computer vision knowledge of setting up and building Onfido from scratch came into play — we realized with a few very quick tests, actually by installing a very cheap camera with a lighting set-up on the conveyor belts that are in these facilities we could then feed that data to the cloud and apply a computer vision machine learning model to tag every single item,” he adds.
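
As a rough illustration of that pipeline (an assumption-laden sketch, not TrueCircle’s code), the idea boils down to: for each frame coming off the belt camera, run a detection model and tally the materials it tags. The `ItemTag` type and `classify_items` stub below are hypothetical stand-ins for the real model.

```python
from dataclasses import dataclass

@dataclass
class ItemTag:
    material: str      # e.g. "PET", "HDPE", "aluminium"
    confidence: float

def classify_items(frame):
    """Hypothetical stand-in for the real detector: tags each item in a frame."""
    # A production system would run a trained vision model here; this stub
    # returns a fixed composition so the loop below can be demonstrated.
    return [ItemTag("PET", 0.97), ItemTag("PET", 0.93), ItemTag("HDPE", 0.88)]

def belt_composition(frames):
    """Yield per-frame material counts for items passing on the belt."""
    for frame in frames:
        counts = {}
        for tag in classify_items(frame):
            counts[tag.material] = counts.get(tag.material, 0) + 1
        yield counts

# Example: three frames captured from the conveyor-belt camera.
for counts in belt_composition([None, None, None]):
    print(counts)  # {'PET': 2, 'HDPE': 1}
```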

We’ve seen this sort of idea before — such as by TechCrunch Disrupt battlefield alum Greyparrot (another UK-based startup), which was founded back in 2019 and already sells an AI waste recognition system that’s been globally deployed.

But TrueCircle suggests its approach is more “full stack” as it’s also building an automation piece, initially via digital alerts its system sends to factory employees when quality thresholds drop below a customizable level — providing them with a root cause diagnosis so they can take immediate action to correct a problem with their sorting machinery.
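
A minimal sketch of that alerting logic, with assumed thresholds and hypothetical names (this is not TrueCircle’s implementation), might compare a line’s measured purity against a configurable target and surface the most common contaminant as a first root-cause hint:

```python
def check_purity(counts, target_material, threshold=0.95):
    """Return an alert string if a line's purity drops below the threshold, else None.

    counts: material -> item count for one output line (e.g. from belt analytics).
    """
    total = sum(counts.values())
    if total == 0:
        return None
    purity = counts.get(target_material, 0) / total
    if purity >= threshold:
        return None
    # Report the most common contaminant as a first root-cause hint.
    contaminants = {m: n for m, n in counts.items() if m != target_material}
    worst = max(contaminants, key=contaminants.get)
    return (f"Purity {purity:.0%} below target {threshold:.0%} on the "
            f"{target_material} line; main contaminant: {worst}")

print(check_purity({"PET": 90, "HDPE": 7, "film": 3}, "PET"))
# Purity 90% below target 95% on the PET line; main contaminant: HDPE
```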

Later, it says, it wants to integrate the alert system with the plant’s machinery so that its software can automatically undertake those sorts of corrections too.

“The next step that we’re working on now is actually programmatically integrating with their existing machinery — such that when we spot an issue we can adjust the settings of that device and ensure it resolves it without manual intervention,” says Stocker. “So that’s really where we want to get to. We want to be this data as a service layer that spots issues, fixes them and then certifies the quality to maximize the selling potential.”

There’s more too: In parallel, TrueCircle is building a marketplace to support waste processing facilities in selling the verified material they reclaim.

Here its premise is that it will be able to help facilities achieve better prices for the processed waste as a result of the data that will come attached to it — aka, the analytics and quality/purity guarantee its AI is able to provide.

So the pitch is that — finally — waste processing facilities will have the data to show buyers, ensuring they can get a fair price.

“By having a bit more of a full stack approach, to helping recycling facilities work with each other, connect with each other, obviously have better data on what they’re doing and make better decisions you can get the whole industry working more effectively,” suggests Jubbawy.

“We go after buyers who care about quality,” adds Stocker. “We’ve been able to attract buyers from Germany, for example, onto the platform — because they can see exactly what they’re buying and they can place a bid that’s reflective of that quality.

“This is a classic data as a service business — at least in its first module — because now a facility can come onto the platform and say okay I want to understand the quality of my outputs to help our facility get better revenues from a range of buyers. So they’re able to log on and generate a report for buyers. When they sell material at the end of every month they’re able to attach this report of real-time data which shows the exact quality of that line to all of the buyers.”

“I come from the fintech world so I kind of bring it back to Moody’s ratings,” he adds. “We see it as we become this Moody’s equivalent for the recycling industry and then that enables us to build the rest of the infrastructure that the industry needs to facilitate efficient recycling.”

TrueCircle says its AI models can currently identify around 50 different categories associated with waste — such as the material of the item, its weight, the brand, whether it’s a food-grade item, and so on.

Accuracy rates for its waste-scanning AIs are pegged at between 92% and 98%.

And after two months, the startup says it was able to demonstrate — in “some” of the initial facilities using its alerts dashboard — that its customers were getting a 10-15% higher recovery rate vs how they were operating before, i.e. without any AI to keep an eye on waste purity.

Given the types of jobs set to be automated here — i.e. dirty, smelly and potentially dangerous low paid manual labor — this is one application of AI that might be more welcomed than feared, Jubbawy also suggests.

“Ultimately the reason I’m motivated by this is I remember reading Bill Gates’ book on How to Avoid a Climate Disaster where he categorizes all the causes of this 51BN tonnes of greenhouse gases that we need to remove and the unnecessary use of virgin materials for packaging adds around 2-3% — so well above 1 gigaton,” he says, adding that the team’s overriding motivation is “doing our part to get those 51BN tonnes down to zero”.

Agot AI gives restaurants computer vision to see where food orders go wrong

Artificial intelligence has infiltrated a number of industries, and the restaurant industry was one of the latest to embrace this technology, driven in large part by the global pandemic and the need to shift to online orders.

That need continues to grow. In 2021, some 60% of Americans ordered takeout or delivery at least once a week, and 31% used a third-party delivery service. Market Study Report predicts the global restaurant management software market will grow nearly 15% annually to reach $6.95 billion by 2025.

However, we’ve all had that experience where you receive your food delivery only to find the order is wrong. Agot AI is using machine learning to develop computer vision technology, initially targeting the quick-serve restaurant (QSR) industry, so those types of errors can be avoided.

The company was founded three years ago by Evan DeSantola and Alex Litzenberger to tackle that operational gap in restaurant technology, reward employee success and improve a restaurant’s customer satisfaction.

Its product confirms order accuracy in real time for online ordering and notifies employees if an order needs a correction; for example, if cheese or ketchup was left off.
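
The underlying check can be imagined as a simple comparison between the ticket and what the vision system observed going into the bag. The sketch below is a hedged illustration of that idea, not Agot AI’s implementation; the item names are made up.

```python
from collections import Counter

def check_order(ordered, observed):
    """Return human-readable corrections, or an empty list if the order matches."""
    wanted, seen = Counter(ordered), Counter(observed)
    issues = []
    for item, count in (wanted - seen).items():
        issues.append(f"Missing {count} x {item}")
    for item, count in (seen - wanted).items():
        issues.append(f"Unexpected {count} x {item}")
    return issues

# Example: the ticket asked for cheese that never made it into the bag.
print(check_order(["burger", "cheese", "ketchup"], ["burger", "ketchup"]))
# ['Missing 1 x cheese']
```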

Since unveiling its technology, the company has worked with a group of large food service brands to deploy it, including Yum! Brands, which Agot is partnering with to pilot the technology in about 20 restaurants (with plans to expand to 100 restaurants if the pilot is successful), CEO DeSantola told TechCrunch.

Gavin Felder, chief strategy officer at Yum! Brands, said via a written statement that the company is “always looking for innovative ways to leverage technology to empower our team members, and improve both their experience and the customer experience in our restaurants,” and that early results from the pilot program “indicate a promising potential to deliver more accurate orders to our customers across all the channels we serve.”

Yum! Brands isn’t just a customer, but one of Agot AI’s investors — it participated in Agot’s $12 million funding round that included Conti Ventures, the venture arm of strategic investor Continental Grain Co., Kitchen Fund and Grit Ventures. That brings the company’s total fundraising to date to $16 million.

Agot will deploy its new capital into growing its engineering team, securing more pilot programs with QSR brands and adding more features so restaurants can provide a better overall experience both at the drive-thru and behind the counter.

The company has shown operational capabilities in small and medium-sized proof-of-concept deployments and is ready to scale its technology to larger markets and audiences.

Though Agot did not disclose its growth metrics, Mike Regan, the company’s chief business officer, said he was an investor when he first met DeSantola and saw both how big an opportunity order accuracy was and how holistic Agot’s approach to it was, calling the company’s vision “nothing short of transformational.”

While companies like Toast were some of the restaurant management pioneers, other startups have picked up on this need from the industry in the past two years and not only arrived with their own approaches, but also received venture capital.

For example, just in the past few months, we saw Lunchbox, Deliverect, Orda, Zak, Sunday and Margin Edge all announce new rounds, suggesting that there is plenty of money to go around as restaurants figure out how to quickly adapt to the new ordering norm.

The restaurant industry is “tight” right now, Regan added, but Agot is “way beyond where a startup of three years” should be in terms of success, and it is poised to take on a majority of the QSR business in the next couple of years.

Indeed, the new capital will “bring Agot to the next stage of our business,” DeSantola said.

“We have demonstrated success in early pilots and are excited to scale across current and additional enterprise partners,” he added. “We intend to use the capital to expand our suite of offerings, customer pace and analytics, operations analytics and drive-thru technology.”

Tiger, Coatue double down on Hinge Health with new $400M infusion

Hinge Health, the San Francisco-based company that offers a digital solution to treat chronic musculoskeletal (MSK) conditions, like back and joint pain, closed on $400 million in Series E funding to give the company a $6.2 billion valuation.

Tiger Global and Coatue Management, which co-led the company’s $300 million Series D round back in January, are back again to lead this one. They are joined by Alkeon Capital and Whale Rock, which put in a $200 million secondary investment to acquire some ownership in Hinge. Tiger, Coatue and Alkeon also recently invested together in Abacus.ai, which announced a $50 million round this week.

The new investment, which brings Hinge Health’s total funds raised to date to over $1 billion, will be funneled into technology and product development to improve access, outcomes and patient experiences, Daniel Perez, co-founder and CEO of Hinge Health, said in a written statement. The company also touts that it is “now one of the most valuable companies in digital health.”

Hinge Health’s technology aims to reduce MSK pain, surgeries and opioid use through the use of advanced wearable sensors and computer vision technology that is monitored by a clinical care team of physical therapists, physicians and board-certified health coaches.

So far this year, the company, which is approaching 1,000 employees, more than doubled its customer base to now serve over 575 enterprise customers. It also rounded out its leadership team with new hires Lalith Vadlamannati, who joined as chief technology officer after spending 13 years at Amazon, and Vincent Lim, chief people officer, who has previous experience scaling teams at Google, Medium and JUUL Labs.

Amazon partners with AXS to install Amazon One palm readers at entertainment venues

Amazon’s biometric scanner for retail, the Amazon One palm reader, is expanding beyond the e-commerce giant’s own stores. The company announced today it has signed its first third-party customer, ticketing company AXS, which will implement the Amazon One system at Denver, Colorado’s Red Rocks Amphitheatre as an option for contactless entry for event-goers.

This is the first time the Amazon One system will be used outside an Amazon-owned retail store, and the first time it will be used for entry into an entertainment venue. Amazon says it expects AXS to roll out the system to more venues in the future, but didn’t offer any specifics as to which ones or when.

At Red Rocks, guests will be able to associate their AXS Mobile ID with Amazon One at dedicated stations before they enter the amphitheatre, or they can enroll at a second station once inside in order to use the reader at future AXS events. The enrollment process takes about a minute and customers can choose to enroll either one or both palms. Once set up, ticketholders can use a dedicated entry line for Amazon One users.

“We are proud to work with Amazon to continue shaping the future of ticketing through cutting-edge innovation,” said Bryan Perez, CEO of AXS, in a statement. “We are also excited to bring Amazon One to our clients and the industry at a time when there is a need for fast, convenient, and contactless ticketing solutions. At AXS, we are continually deploying new technologies to develop secure and smarter ticketing offerings that improve the fan experience before, during, and after events,” he added.

Amazon’s palm reader was first introduced amid the pandemic in September 2020, as a way for shoppers to pay at Amazon Go convenience stores using their palm. To use the system, customers would first insert their credit card then hover their palm over the device to associate their unique palm print with their payment mechanism. After setup, customers could enter the store just by holding their palm above the biometric scanner for a second or so. Amazon touted the system as a safer, “contactless” means of payment, as customers aren’t supposed to actually touch the reader. (Hopefully, that’s the case, considering the pandemic rages on.)

On the tech side, Amazon One uses computer vision technology to create the palm signatures, it said.

In the months that followed, Amazon expanded the biometric system to several more stores, including other Amazon Go convenience stores, Amazon Go Grocery stores, and its Amazon Books and Amazon 4-star stores. This April, it brought the system to select Whole Foods locations. To encourage more sign-ups, Amazon even introduced a $10 promotional credit to enroll your palm prints at its supported stores.

When palm prints are linked to Amazon accounts, the company is able to collect data from customers’ offline activity to target ads, offers, and recommendations over time. And the data remains with Amazon until a customer explicitly deletes it, or until the customer hasn’t used the feature for at least two years.

While the system offers an interesting take on contactless payments, Amazon’s track record in this area has raised privacy concerns. The company had in the past sold biometric facial recognition services to law enforcement in the U.S. Its facial recognition technology was the subject of a data privacy lawsuit. And it was found to be still storing Alexa voice data even after users deleted their audio files.

Amazon has responded by noting its palm print images are encrypted and sent to a secure area built for Amazon One in the cloud, where Amazon creates the customers’ palm signatures. It has also noted that it allows customers to unenroll either from a device or from its website, one.amazon.com, once all transactions have been processed.

Advanced rider assistance systems: Tech spawned by the politics of micromobility

The desire to achieve something as simple as keeping shared electric scooters off sidewalks has driven the development of some advanced technology in the micromobility industry. Once the province of geofencing, the problem is now being tackled by scooter companies so eager to get a leg up on the competition that they’re implementing technology similar to the advanced driver assistance systems (ADAS) usually found in cars.

Operators like Spin, Voi, Zipp, Bird and Superpedestrian are investing in camera-based or location-based tech that can detect and even correct poor rider behavior, sometimes going to the extent of slowing scooters to a stop if they’re riding on a sidewalk.
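
In spirit, the behavior described above amounts to mapping a sidewalk-detection confidence to a speed cap. The following sketch uses assumed thresholds purely for illustration; it is not any operator’s actual control logic.

```python
def allowed_speed_kph(sidewalk_confidence, current_limit_kph=25.0):
    """Map a sidewalk-detection confidence (0..1) to a speed cap in km/h."""
    if sidewalk_confidence < 0.5:   # likely on the road or in a bike lane
        return current_limit_kph
    if sidewalk_confidence < 0.8:   # uncertain: slow the rider down as a warning
        return min(current_limit_kph, 8.0)
    return 0.0                      # confident sidewalk riding: roll to a stop

print(allowed_speed_kph(0.9))  # 0.0 -> the scooter slows to a stop
```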

People riding or parking scooters on sidewalks is a big problem for cities and forms one of the main complaints from NIMBYist residents who dislike change all the more when it becomes a tripping hazard. Companies are trying to solve this problem with tech that effectively puts the onus of rider behavior on operators, which may result in cities requiring scooter operators to have this sort of ADAS tech.

Scooter ADAS is probably the most doable and cost-effective method that cities can use to prevent unwanted rider behavior. And it’s far cheaper than trying to police rider behavior themselves or to address the lack of protected cycling infrastructure.

“This technology comes from a need for protected bike lanes,” said Dmitry Shevelenko, co-founder and president of Tortoise, an automated vehicle positioning service for micromobility companies. “It exists in this world where riders kind of have to do things that aren’t that great for others, because they have nowhere else to go. And so that’s the true driver of the need for this.”

Cities can solve this problem for the long term by building bike lanes or creating scooter parking bays, but until that happens, operators need to reassure local administrations that micromobility is safe, compliant and a good thing for cities.

“Until cities have dedicated infrastructure for whatever new modality comes to play, you have to figure out a way to use technology to make sure things don’t mix poorly,” said Alex Nesic, co-founder and chief business officer of Drover AI, a computer vision startup that provides camera-based scooter ADAS. “That’s really what we’re after. We want to enable this kind of maturation of the industry.”

Street views versus satellite views

Drover AI works with Spin, while Luna, another computer vision company, works with Voi and Zipp to attach cameras, sensors and a microprocessor to scooters to detect lanes, sidewalks, pedestrians and other environmental surroundings.