Digital biomarkers are healthcare’s next frontier

Blood pressure, body temperature, hemoglobin A1c levels and other biomarkers have been used for decades to track disease. While this information is essential for chronic condition management, these and many other physiological measurements are typically captured only periodically, making it difficult to reliably detect meaningful changes early.

Moreover, biomarkers extracted from blood require uncomfortable blood draws, can be expensive to analyze and, like other periodic measurements, are not always timely.

Historically, continuous tracking of an individual’s vital signs meant they had to be in a hospital. But that’s not true anymore. Digital biomarkers, collected from wearable sensors or through a device, offer healthcare providers an abundance of traditional and new data to precisely monitor and even predict a patient’s disease trajectory.

With cloud-based servers and sophisticated, yet inexpensive, sensors both on the body and off, patients can be monitored at home more effectively than in a hospital, especially when the sensor data is analyzed with artificial intelligence (AI) and machine-learning technology.
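None of the vendors in this space publish their models, but the underlying idea can be sketched in a few lines: fit a trend to a continuously collected vital sign and flag drift that periodic clinic visits would miss. The signal choice (daily resting heart rate), the least-squares trend, and the threshold below are all illustrative assumptions, not any product's actual method.

```python
# Illustrative sketch: flagging a drift in a continuously collected vital
# sign (here, daily resting heart rate) long before a periodic measurement
# would catch it. Thresholds and the simple trend fit are invented.

def trend_per_day(values):
    """Least-squares slope of evenly spaced daily readings."""
    n = len(values)
    mean_x = (n - 1) / 2
    mean_y = sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(values))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

def drifting(values, threshold=0.5):
    """True if the reading rises by more than `threshold` units per day."""
    return trend_per_day(values) > threshold
```

A week of readings climbing one beat per minute per day would trip this toy detector, while a flat series would not.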

Opportunities for digital biomarkers

A major opportunity for digital biomarkers is in addressing neurodegenerative diseases such as mild cognitive impairment, Alzheimer’s disease and Parkinson’s disease.

Neurodegenerative disease is a major target for digital biomarker development due to a lack of easily accessible indicators that can help providers diagnose and manage these conditions. A definitive diagnosis for Alzheimer’s disease today, for example, generally requires positron emission tomography (PET), magnetic resonance imaging (MRI) or other imaging studies, which are often expensive and not always accurate or reliable.

Cost savings and other benefits

Digital biomarkers have the potential to unlock significant value for healthcare providers, companies and, most importantly, patients and families, by detecting and slowing the development of these diseases.

Fetcher raises $27M to automate aspects of job candidate sourcing

Reflecting the growing investor interest in HR technology startups, Fetcher, the talent acquisition platform formerly known as Scout, today closed a $27 million Series B funding round led by Tola Capital with participation from G20 Ventures, KFund, and Accomplice. The new money — $7 million in debt and $20 million in equity — brings the startup’s total capital raised to $40 million, which co-founder and CEO Andres Blank says is being put toward international expansion and building out the Fetcher platform with new applicant tracking system (ATS) integrations and customer relationship management capabilities.

Fetcher was launched in 2014 by Blank, Chris Calmeyn, Javier Castiarena, and Santi Aimetta as a professional networking app called Caliber. After a few years, the founding team decided to pivot into recruitment, leveraging some of the automation technology they’d built into Caliber.

“Hiring high-quality, diverse candidates had always been a pain point for me. At one of my prior startups, I personally experienced this issue, and after bringing on a recruiting team to help scale hiring efforts, I saw that their time was also too valuable to be spent on the manual, repetitive tasks that come with sourcing candidates,” Blank told TechCrunch in an email interview. “Rather than relying on expensive staffing fees, I thought there must be a better way to keep sourcing in-house, without it taking up too much time and energy on the talent acquisition teams and hiring managers.”

Through a Chrome extension, Fetcher’s platform ties in with ATS products as well as Gmail and Outlook to allow recruiters to source candidates directly from LinkedIn. Fetcher filters jobseekers into prebuilt email workflows, offering analytics including progress toward diversity goals at the individual, team, position, and company levels.


The Fetcher candidate directory.

Fetcher also performs predictive modeling, automatically gauging the interest of job candidates from their replies, and “automated sourcing,” which runs in the background to push applicants through vetting processes via automated emails.
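Fetcher’s model is proprietary, so as a purely hypothetical sketch of what “gauging interest from replies” means in practice, here is a naive keyword-based classifier; the cue lists and labels are invented for illustration:

```python
# Illustrative only: a naive keyword-based classifier for candidate replies.
# A production system would use a learned model; this just shows the concept
# of mapping reply text to an interest label.

POSITIVE_CUES = {"interested", "sounds great", "let's talk", "happy to chat"}
NEGATIVE_CUES = {"not interested", "no thanks", "unsubscribe", "remove me"}

def gauge_interest(reply: str) -> str:
    """Label a candidate reply as 'interested', 'declined', or 'unclear'."""
    text = reply.lower()
    # Check negative cues first: "not interested" contains "interested".
    if any(cue in text for cue in NEGATIVE_CUES):
        return "declined"
    if any(cue in text for cue in POSITIVE_CUES):
        return "interested"
    return "unclear"
```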

“A great candidate experience is essential for any company, and part of that experience comes from building long-term relationships with candidates over time. Fetcher’s candidate directory allows companies to remarket to qualified candidates, set up reminders for future connections, and add additional outreach emails to the automated sequences,” Blank said. “Overall, the goal is to make it simple for companies to store, update, and connect with great candidates over time, messaging them about future job opportunities, milestones at the company, and more.”

The reliance on algorithms is a bit concerning, given the potential for bias — Amazon infamously scrapped a recruitment algorithm that favored male engineers and New York City recently placed restrictions on the use of AI in hiring. When asked about it, Blank asserted that the platform’s automation technologies allow for “a more diverse group of prospects” to push through the hiring funnel. He also highlighted Fetcher’s outreach policy, noting that people who don’t wish to be contacted about opportunities via Fetcher can send data deletion requests.

“[O]ur secret sauce here at Fetcher is combining both machine and human intelligence in order to minimize the biases that exist on both sides,” Blank said. “Beyond this, we also have diversity metrics on each search (visible on our platform to the client too), which keeps us in check. If we’re over- or under-indexing anywhere on the gender or demographics front, the platform can course correct. Finally, we remove selection biases from the client. The way we do this is that once a client trusts that the search is heading in the right direction (after vetting a handful of candidates upfront), they place the search on full automation. This means that going forward, they are no longer vetting every candidate, but simply reaching out to all qualified candidates that are found for [a given] open role.”
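Blank’s description of over- and under-indexing checks suggests a simple per-search computation. The sketch below is a guess at the idea, with invented group names and target shares; it is not Fetcher’s actual metric:

```python
# Hypothetical sketch of a per-search diversity check: compare each group's
# share of sourced candidates against a target share.
from collections import Counter

def diversity_gap(candidates, targets):
    """Share of each group among candidates minus its target share.

    Positive values mean over-indexing on that group; negative values
    mean under-indexing, i.e. the search should course correct.
    """
    counts = Counter(c["group"] for c in candidates)
    total = len(candidates)
    return {group: counts.get(group, 0) / total - share
            for group, share in targets.items()}
```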

Blank linked to case studies from customers, including one that recently used Fetcher to hire employees mostly from underrepresented groups. But biases can enter at many different, often unpredictable stages of the pipeline. As Harvard Business Review’s Miranda Bogen writes: “For example, if [a] system notices that recruiters happen to interact more frequently with white men, it may well find proxies for those characteristics (like being named Jared or playing high school lacrosse) and replicate that pattern. This sort of adverse impact can happen without explicit instruction, and worse, without anyone realizing.”


Image Credits: Fetcher

The risk doesn’t appear to be dissuading recruiters. Fetcher currently has over 350 customers (growing 10% month-over-month), including Behr Paint, Albertson’s, Foursquare, and Shutterstock, and annual recurring revenue tripled in the last 12 months.

Beyond the strong top-line numbers, Fetcher is benefiting from the broader boom in the HR tech segment, which has seen high venture capital activity over the past few months. According to Pitchbook, HR tech startups collected more than $9.2 billion in venture capital funding globally from January 2021 to October 2021 — a 130% jump from 2020’s total.

“Fetcher is uniquely positioned as one of the only software-as-a-service recruiting platforms to automate both candidate sourcing and email outreach efficiently,” Blank said. “Rather than using a straight database model, Fetcher is the only sourcing solution that can truly automate the sourcing process for companies, based on its unique combination of ‘machine learning with human intelligence.’ This model allows for what feels like a 24/7 sourcer to work in the background for each client. By automating both the sourcing and outreach sides of recruiting, Fetcher can reduce the number of internal sourcers and recruiters a company needs, as well as significantly reduce the budget being spent on outside recruiting firms, agencies, or consultants.”

Fetcher currently employs 45 people and plans to double that number by the end of the year.

Everstream Analytics secures new cash to predict supply chain disruptions

Everstream Analytics, a supply chain insights and risk analytics startup, today announced that it raised $24 million in a Series A round led by Morgan Stanley Investment Management with participation from Columbia Capital, StepStone Group, and DHL. CEO Julie Gerdeman said that the new money would be used to “propel technology innovation” and “further global expansion.”

Everstream, which was launched as Resilience360 and Riskpulse, provides predictive insights for supply chains. Drawing on billions of supply chain interactions, the company applies AI to assess materials, suppliers, and facilities for risk.

Plenty of startups claim to do this, including Backbone, Altana, and Craft. Project44 recently raised $202 million to expand its own set of predictive analytics tools, including estimated time of arrivals for shipments.

But what sets Everstream apart is its access to proprietary data that goes beyond what competitors are leveraging, according to Gerdeman.

“[Everstream provides] visibility into essentially every network, component, ingredient, and raw material around the world,” she told TechCrunch via email. “Connected business networks, scalable computing power, graph database technology, and advances in AI algorithms enable Everstream to combine massive volumes of public and proprietary data to build a model of the global supply chain.”

As new data enters the platform, Everstream, which integrates with existing enterprise resource planning systems, retrains its AI system to reflect the current supply chain environment. Customers receive proactive warnings based on signals including financial reports and news of weather events, environmental and sustainability risks, and natural disasters.
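Everstream’s thresholds and models are proprietary. As a minimal sketch of how heterogeneous signals might roll up into a proactive warning, consider the following, where the signal names, weights, and alert threshold are all invented for illustration:

```python
# Hypothetical sketch: combine normalized disruption signals into a single
# supplier risk score and alert when it crosses a preset threshold.

SIGNAL_WEIGHTS = {
    "weather_alert": 0.3,
    "financial_distress": 0.4,
    "sustainability_violation": 0.2,
    "port_congestion": 0.1,
}

def risk_score(signals: dict) -> float:
    """Weighted sum of signal intensities, each clamped to [0, 1]."""
    return sum(SIGNAL_WEIGHTS[name] * min(max(value, 0.0), 1.0)
               for name, value in signals.items() if name in SIGNAL_WEIGHTS)

def should_alert(signals: dict, threshold: float = 0.5) -> bool:
    """Proactive warning when the combined score crosses the threshold."""
    return risk_score(signals) >= threshold
```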

For example, Everstream can warn businesses when it might be difficult to source a specific material and how likely customers are to cancel, increase, or move forward orders. It can also provide suggestions for optimizing logistics operations based on metrics such as timeliness, quality, and cost of goods shipped.

“Everstream’s AI-based models and preset dynamic thresholds can be used to predict disruptions and prescribe recommendations to mitigate risk and deliver better results to the business needs,” Gerdeman added. “[Everstream] identifies the most impactful risks in the network and creates targeted insights based on inputs from the … platform, including incident monitoring, predictive risks, ESG, and shipment data — slashing time, cost, and complexity.”

Most would argue these are useful tools at a time when uncertainty continues to dog the supply chain — assuming Everstream’s AI systems perform as well as advertised. While some surveys show tepid adoption of predictive analytics among the supply chain industry, Gartner recently found that 87% of supply chain professionals plan to invest in “resilience” within the next two years, including automation and AI.

Investors seemingly see the potential. Last year was a banner year for venture-backed supply chain management companies, which saw $11.3 billion in funding, according to Crunchbase.

For its part, Everstream claims its customer base has grown 550% to date in 2022 and now includes brands like AB InBev, Google, Bayer, Schneider Electric, Unilever, and Whirlpool. Mum’s the word on concrete revenue numbers; Gerdeman demurred when asked about them.

“The pandemic has illustrated why deep visibility is needed not only into a company’s network, but down to the component, ingredient, and raw material level, because it doesn’t matter if the company’s supplier is operational if their suppliers are not,” Gerdeman said. “Everstream’s insights are not only predictive in nature, but they are also prescriptive – meaning we not only tell clients what’s coming next, but also what they should do about it.”

Everstream, which employs 100 people, has raised $70 million in equity and debt funding so far.

Shadow announces premium plan for its cloud gaming service

It’s been a year since Octave Klaba, the founder of OVHcloud, acquired Shadow following a commercial court order. After a stabilizing period, the company is now ready to launch a new plan, a new service and a new B2B offering.

Shadow is a cloud computing service for gamers. People can pay a monthly subscription fee to access a full-fledged computer in a data center near them. It is a Windows instance, which means that you can install whatever you want — games, photo editing software, Microsoft Office, you name it.

But the service works particularly well for gamers, as everything has been optimized for video games, from latency and 4:4:4 color support to gamepad compatibility and hardware specifications. Currently, subscribers get the equivalent of an Nvidia GeForce GTX 1080, 12GB of RAM and 256GB of storage for $29.99 per month, or €29.99 in Europe.

This is fine if you want to play Fortnite or Minecraft, but there are more recent GPUs that can improve your gaming experience. That’s why the company is announcing an upgrade for people who want better specifications.

Instead of a separate plan, the Shadow Power Upgrade is an add-on on top of your base subscription. For another $14.99 per month (or €14.99), you can access a server with an AMD EPYC 7543P CPU with 4 cores and 8 threads, 16GB of RAM and a recent GPU.

Depending on the data center, users will get an Nvidia GeForce RTX 3070 or the equivalent GPU in Nvidia’s professional GPU lineup. Users could also get a professional AMD Radeon GPU based on the RDNA 2 architecture (AMD Radeon Pro V620).

As you can see, Shadow is partnering extensively with AMD instead of relying exclusively on Intel for CPU models and Nvidia for GPU models. This could help when it comes to sourcing negotiations and supply chain constraints.

When it comes to availability, users will be able to pre-order the Power Upgrade this summer and it will be available this fall.

As for storage, if you think 256GB is not enough, you can purchase additional storage blocks in 256GB increments for $2.99 (or €2.99) per block per month. The maximum is 2TB.
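Putting the published prices together, a hypothetical monthly-cost calculator might look like this (assuming the 2TB cap applies to additional storage, which the announcement leaves ambiguous):

```python
# Hypothetical calculator for the USD prices quoted above; EUR prices match.
BASE = 29.99                  # base Shadow subscription
POWER_UPGRADE = 14.99         # Shadow Power Upgrade add-on
STORAGE_BLOCK = 2.99          # per extra 256GB storage block
MAX_EXTRA_BLOCKS = 8          # assuming the 2TB cap is on *additional* storage

def monthly_cost(power_upgrade: bool = False, extra_storage_gb: int = 0) -> float:
    """Monthly price in USD for a given configuration."""
    blocks = -(-extra_storage_gb // 256)      # round up to whole 256GB blocks
    if blocks > MAX_EXTRA_BLOCKS:
        raise ValueError("extra storage is capped at 2TB")
    total = BASE + (POWER_UPGRADE if power_upgrade else 0.0) + blocks * STORAGE_BLOCK
    return round(total, 2)
```

A subscriber taking the Power Upgrade plus 512GB of extra storage would pay $50.96 per month under these assumptions.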

The company is also entering new markets: people living in Canada and Austria will be able to subscribe this fall. As a reminder, Shadow is currently available in France, Belgium, Luxembourg, the Netherlands, Germany, Switzerland, the U.K. and the U.S.

International expansion can be difficult for a cloud computing service, as users need to live as close as possible to a data center due to latency concerns. There are currently eight data centers with Shadow servers — three in France, one in Germany and four in North America.

From hubiC to Shadow Drive

OVHcloud founder Octave Klaba also owns a cloud storage service called hubiC. For the past few years, that service has been on hold as it stopped accepting new customers and is going to shut down soon. hubiC never really managed to compete with Dropbox, Google Drive or Microsoft OneDrive when it comes to features and reliability.

But that doesn’t mean it was a bad idea. There’s still some room for competition in the online storage space. That’s why Shadow is going to launch a new service called Shadow Drive this fall.

Based on Nextcloud, a popular open-source online storage application, Shadow Drive will store and sync your files so that they can be accessible through a web browser, a desktop app or a mobile app. There will be a free plan with 20GB of storage and a premium plan with 2TB of storage for $8.99 per month (€8.99).
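Nextcloud exposes user files over WebDAV, so a Shadow Drive client would most likely address files through URLs shaped like the ones below. The host name here is invented, and the path layout is the stock Nextcloud convention rather than anything Shadow has confirmed:

```python
# Sketch of how a Nextcloud-based client addresses files over WebDAV.
# Host name and layout are assumptions for illustration, not Shadow's API.
from urllib.parse import quote

def webdav_url(host: str, username: str, remote_path: str) -> str:
    """Build the WebDAV URL for a file on a Nextcloud-style server."""
    # Nextcloud's stock WebDAV endpoint lives under remote.php/dav/files/.
    return f"https://{host}/remote.php/dav/files/{quote(username)}/{quote(remote_path)}"
```

A desktop or mobile sync client would then issue standard WebDAV verbs (GET, PUT, PROPFIND) against such URLs.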

Expanding to business use cases

From the very beginning, Shadow has always been thinking about B2B use cases for its cloud computing service. Essentially, if the company can make the service work for gamers, it will be perfectly capable of running productivity apps and other professional software.

The company is now accepting customers for its new division called Shadow Business Solutions. Clients will be able to create, manage and share access to several virtual machines running on Shadow servers.

For instance, Bandai Namco worked with Shadow for the Elden Ring press campaign. The company shared a login and password with video game journalists so that they could play and review the game on a powerful computer and in a secure environment.

In many ways, Shadow hit the reset button last year with Octave Klaba’s acquisition of the company. While the company might not attract millions of users with such a premium positioning for its cloud computer, it sounds like Shadow now has a strong foundation for future iterations.

Image Credits: Shadow

Heartex raises $25M for its AI-focused, open source data labeling platform

Heartex, a startup that bills itself as an “open source” platform for data labeling, today announced that it landed $25 million in a Series A funding round led by Redpoint Ventures. Unusual Ventures, Bow Capital, and Swift Ventures also participated, bringing Heartex’s total capital raised to $30 million.

Co-founder and CEO Michael Malyuk said that the new money will be put toward improving Heartex’s product and expanding the size of the company’s workforce from 28 people to 68 by the end of the year.

“Coming from engineering and machine learning backgrounds, [Heartex’s founding team] knew what value machine learning and AI can bring to the organization,” Malyuk told TechCrunch via email. “At the time, we all worked at different companies and in different industries yet shared the same struggle with model accuracy due to poor-quality training data. We agreed that the only viable solution was to have internal teams with domain expertise be responsible for annotating and curating training data. Who can provide the best results other than your own experts?”

Software developers Malyuk, Maxim Tkachenko, and Nikolay Liubimov co-founded Heartex in 2019. Liubimov was a senior engineer at Huawei before moving to Yandex, where he worked as a backend developer on speech technologies and dialogue systems.


Heartex’s dashboard.

The ties to Yandex, a company sometimes referred to as the “Google of Russia”, might unnerve some — particularly in light of accusations by the European Union that Yandex’s news division played a sizeable role in spreading Kremlin propaganda. Heartex has an office in San Francisco, California, but several of the company’s engineers are based in the former Soviet Republic of Georgia.

When asked, Heartex says that it doesn’t collect any customer data and open sources the core of its labeling platform for inspection. “We’ve built a data architecture that keeps data private on the customer’s storage, separating the data plane and control plane,” Malyuk added. “Regarding the team and their locations, we’re a very international team with no current members based in Russia.”

Setting aside its geopolitical affiliations, Heartex aims to tackle what Malyuk sees as a major hurdle in the enterprise: extracting value from data by leveraging AI. There’s a growing wave of businesses aiming to become ‘data-centric’ — Gartner recently reported that enterprise use of AI grew a whopping 270% over the past several years. But many organizations are struggling to use AI to its fullest.

“Having reached a point of diminishing returns in algorithm-specific development, enterprises are investing in perfecting data labeling as part of their strategic, data-centric initiatives,” Malyuk said. “This is a progression from earlier development practices that focused almost exclusively on algorithm development and tuning.”

If, as Malyuk asserts, data labeling is receiving increased attention from companies pursuing AI, it’s because labeling is a core part of the AI development process. Many AI systems “learn” to make sense of images, videos, text and audio from examples that have been labeled by teams of human annotators. The labels enable the systems to extrapolate the relationships between the examples (e.g., the link between the caption “kitchen sink” and a photo of a kitchen sink) to data the systems haven’t seen before (e.g., photos of kitchen sinks that weren’t included in the data used to “teach” the model).
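As a toy illustration of labels driving extrapolation (unrelated to Heartex’s actual formats or models), even a trivial word-overlap “model” can generalize from a few labeled captions to an unseen one:

```python
# A minimal, invented illustration of labeled examples and how a trivial
# model generalizes from them to data it hasn't seen before.
labeled = [
    ("stainless steel kitchen sink", "kitchen sink"),
    ("double basin kitchen sink", "kitchen sink"),
    ("cordless power drill", "power tool"),
]

def predict(caption: str) -> str:
    """Pick the label whose training caption shares the most words."""
    words = set(caption.split())
    def overlap(example):
        text, _label = example
        return len(words & set(text.split()))
    return max(labeled, key=overlap)[1]
```

A caption the model has never seen, like “ceramic kitchen sink with faucet,” still lands on the right label purely because of the human-provided examples; mislabeled examples would mislead it just as directly.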

The trouble is, not all labels are created equal. Labeling data like legal contracts, medical images, and scientific literature requires domain expertise that not just any annotator has. And — being human — annotators make mistakes. In an MIT analysis of popular AI data sets, researchers found mislabeled data like one breed of dog confused for another and an Ariana Grande high note categorized as a whistle.

Malyuk makes no claim that Heartex completely solves these issues. But in an interview, he explained that the platform is designed to support labeling workflows for different AI use cases, with features that touch on data quality management, reporting, and analytics. For example, data engineers using Heartex can see the names and email addresses of annotators and data reviewers, which are tied to labels that they’ve contributed or audited. This helps to monitor label quality and — ideally — to fix problems before they impact training data.

“The angle for the C-suite is pretty simple. It’s all about improving production AI model accuracy in service of achieving the project’s business objective,” Malyuk said. “We’re finding that most C-suite managers with AI, machine learning, and/or data science responsibilities have confirmed through experience that, with more strategic investments in people, processes, technology, and data, AI can deliver extraordinary value to the business across a multitude of diverse use cases. We also see that success has a snowball effect. Teams that find success early are able to create additional high-value models more quickly building not just on their early learnings but also on the additional data generated from using the production models.”

In the data labeling toolset arena, Heartex competes with startups including AIMMO, Labelbox, Scale AI, and Snorkel AI, as well as Google and Amazon (which offer data labeling products through Google Cloud and SageMaker, respectively). But Malyuk believes that Heartex’s focus on software as opposed to services sets it apart from the rest. Unlike many of its competitors, the startup doesn’t sell labeling services through its platform.

“As we’ve built a truly horizontal solution, our customers come from a variety of industries. We have small startups as customers, as well as several Fortune 100 companies. [Our platform] has been adopted by over 100,000 data scientists globally,” Malyuk said, while declining to reveal revenue numbers. “[Our customers] are establishing internal data annotation teams and buying [our product] because their production AI models aren’t performing well and recognize that poor training data quality is the primary cause.”

Glean aims to help employees surface info across sprawling enterprise systems

At enterprises of a certain size, keeping track of information about apps, employees, and projects becomes increasingly challenging. According to McKinsey, employees spend 1.8 hours every day — 9.3 hours per week, on average — searching for and gathering information. The veracity of metrics like these has been challenged over the years, but it’s reasonable to say that knowledge workers in particular devote a sizeable chunk of their workdays to sifting through data, whether to find basic contact info or domain-specific files.

The emergence in recent years of AI algorithms that can parse natural language has fueled the rise of platforms that can shrink that chunk. At least, that’s the assertion of Arvind Jain, a former Google engineer and Rubrik co-founder, whose startup, Glean, employs AI to power a unified search experience across all apps used at a company.

Arvind began work on Glean at Rubrik, the cloud data management company he co-founded. In Rubrik’s annual employee pulse survey, he observed that one of the biggest productivity challenges was workers not being able to find the information they needed, whether a specific document or a subject-matter expert.

“Engineers were spending too much time outside code; account managers couldn’t find the latest research or presentation needed to close deals; new employees took too long to ramp,” Arvind told TechCrunch in an email interview. “This growing problem was not only destroying productivity, but also sapping energy and detracting from the employee experience.”

Other companies were experiencing the same issues, as it turned out — exacerbated by their embrace of the cloud and distributed work setups. Sensing an opportunity, Arvind managed to convince former engineering lead Piyush Prahladka, ex-Facebook and -Microsoft engineer T.R. Vishwanath, and Tony Gentilcore, previously at AT&T and Google, to build the prototype for Glean.


Image Credits: Glean

Fast forward to 2022, and Glean has over 70 customers including Okta, Confluent, Samsara, Grammarly, and Outreach. Reflecting growth since its 2019 founding, Glean today closed a $100 million Series C round led by Sequoia with participation from the Slack Fund at a $1 billion post-money valuation.

On the one hand, Glean’s technology isn’t incredibly novel. Services like Microsoft’s SharePoint Syntex, Amazon Kendra, and Google Cloud Search tap natural language processing technology to understand not only document minutia but the searches employees across an organization might perform, like “How do I invest in our company’s 401k?” They fall under the banner of “cognitive search,” a product category encompassing search tools that implement AI to ingest, understand, organize, and query data from multiple sources.

But Arvind claims that Glean is simpler to set up and use than the competition, including smaller outfits like Coveo, Elastic, Lucidworks, and Mindbreeze.

“[Glean] takes less than two hours for initial setup, and doesn’t require any engineering talent or manual fine-tuning for implementation,” Arvind said. “And Glean has seamless workflow integration, whether you’re using Glean in the web app, new tab page, sidebar search, native search, or Slack commands.”

Arvind notes that one of the major problems in enterprise search is the diversity of data sources, like knowledge bases, tickets, chat messages, and pull requests. To address this, Glean uses AI systems to predict for every query the relative importance of content across these sources, training separate systems on customer data to learn company-specific jargon, concepts, entities, and acronyms. To deliver personalized results, as well as proactively recommend documents, Glean accounts for variables like a person’s role, work patterns, job function, and specific projects and responsibilities in its indexing.
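Glean’s rankers are learned per customer, so the following is only an invented sketch of the general shape Arvind describes: a text-match score combined with a per-source prior and a personalization boost. The weights and features are illustrative assumptions:

```python
# Toy illustration of source-aware, personalized ranking; the priors,
# boost, and document fields below are invented for the sketch.

SOURCE_PRIOR = {"wiki": 0.8, "ticket": 0.5, "chat": 0.3}

def score(doc: dict, user: dict) -> float:
    """Combine a text-match score, a per-source prior, and a role boost."""
    base = doc["match"]                          # e.g. a text-relevance score
    prior = SOURCE_PRIOR.get(doc["source"], 0.4) # learned per source in practice
    boost = 1.2 if doc.get("team") == user.get("team") else 1.0
    return base * prior * boost

def rank(docs, user):
    """Order candidate results for this particular user."""
    return sorted(docs, key=lambda d: score(d, user), reverse=True)
```

Under this toy scoring, the same query can surface different top results for an engineer than for a sales executive, which is the personalization behavior Arvind describes.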

“Glean’s biggest competitor is the status quo: employees continuing to deal with the complexity of finding the information and people they need at work. In a typical sales process, potential customers often require Glean to first start with a pilot to demonstrate how much value implementing Glean can provide,” Arvind said. “Glean uses the user’s information to personalize the search experience for the user along several dimensions — for example, for the same query an engineer may see very different results than a sales executive. Glean also uses the user’s activity, such as clicks on search results, to improve the search relevance.”

Because Glean acts like a layer on top of all other apps a company uses, it can double as a work portal from where managers can create and share “shortlinks” to resources (e.g., “go/benefits” instead of a long URL). Management can also share news, handbooks, expense policies, KPI dashboards, and company OKRs and expose the company’s people directory, which shows who people are and what projects they’re working on.
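A shortlink layer like “go/benefits” is straightforward to sketch; the mapping table and fallback search URL here are invented, not Glean’s implementation:

```python
# Hypothetical "go/" shortlink resolver; names and destinations invented.
SHORTLINKS = {
    "go/benefits": "https://intranet.example.com/hr/benefits/overview-2022",
    "go/okrs": "https://intranet.example.com/strategy/okrs/q3",
}

def resolve(shortlink: str) -> str:
    """Return the destination URL, falling back to a search for the term."""
    if shortlink in SHORTLINKS:
        return SHORTLINKS[shortlink]
    query = shortlink[3:] if shortlink.startswith("go/") else shortlink
    return "https://intranet.example.com/search?q=" + query
```

In practice such a resolver sits behind a short internal hostname so that typing `go/benefits` in a browser redirects to the long URL.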


Image Credits: Glean

In a dashboard, Glean surfaces answers to frequently asked questions and devotes a space to links and descriptions of those links that can be shared with the wider organization. A control panel allows users to run data loss prevention reports on the sources from which Glean draws and check for compliance with GDPR, CCPA, and other privacy regulations.

“Prospective customers are often anxious about providing Glean with access to all of their data, which is why Glean has spent so much time ensuring it respects all privacy controls from the applications that it integrates with, and invested heavily in security certifications and processes from the very beginning … When [a] user deletes a document in the underlying application (Slack, Drive, Office, etc.), the document gets deleted from the Glean system as well,” Arvind said. “Glean customers can choose to have Glean host them or self-host Glean to keep their information within their environment. With Glean’s e-discovery and data loss prevention tools, companies can be confident about what data is available within their organization — and how that information is used.”

No enterprise search tool is without limitations. In a 2021 survey by APQC, which provides benchmarks and best practices for businesses, 19% of workers said that poor search functionality is a key problem in their organizations. But there’s a healthy market for enterprise search solutions regardless. The same survey found that 41% of respondents expect to “significantly” increase investment in search and findability within their organizations in the next 12 to 18 months.

Glean, whose total capital raised stands at $155 million, plans to use the proceeds from the latest round to expand its team, build out a go-to-market plan, and “drive new feature innovation.” Glean has more than 100 employees today and expects to have over 250 by the end of the year.

“Increased value put on employee productivity and happiness has been a boon to Glean’s growth among fast-growing companies that care about employee experience,” Arvind said. “Glean provides value to prospective customers from the first minute they start searching, and Glean is constantly working through product development and customer enablement to ensure customers have the best experience from then on.”

Tractian, which uses AI to monitor industrial equipment, raises $15M

Tractian, a startup developing a product to monitor the status of machines and electrical infrastructure, today announced that it closed a $15 million Series A funding round led by Next47, with participation from Y Combinator and others. The money will be put toward product development, expanding Tractian’s workforce and geographic footprint, and ongoing customer acquisition efforts, according to co-founder and co-CEO Igor Marinelli.

Founded in 2019, Tractian is the brainchild of Y Combinator alumni Marinelli and Gabriel Lameirinhas. Prior to starting Tractian, the two worked as software engineers at the paper manufacturer International Paper, where Marinelli says they noticed how backward the systems for monitoring machinery health were.

“Industrial managers of any kind need traceability of work orders, and need to know the health of their machines from kilometers away from the operations,” Marinelli said. “[W]ithout the proper combination of hardware and software, you can’t solve the industry’s real challenge.”

Tractian’s flagship product, which Marinelli says is patent pending in the U.S., uses AI to identify mechanical problems a machine might be having by analyzing its “rotational assets,” like motors, pumps and compressors. Tractian can spot signs of looseness, imbalance and misalignment from vibration and temperature anomalies measured by custom sensors, Marinelli claims, in addition to potential electrical failures.

Tractian provides sensors that attach to — and send data about — machines via 3G or 4G cellular networks. The company’s software provides checklist and inspection steps for each machine, plus diagnostics, recommendations, alerts and scheduling tools and inventories.
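Tractian’s patent-pending models are undisclosed, but the basic idea of flagging imbalance-like spikes in a sensor stream can be illustrated with a rolling z-score; the window and threshold below are arbitrary choices for the sketch:

```python
# Illustrative only: flagging vibration anomalies with a rolling z-score.
# Real condition-monitoring models are far more involved; this just shows
# the idea of spotting spikes relative to a machine's recent baseline.
from statistics import mean, stdev

def anomalies(readings, window=10, z_threshold=3.0):
    """Return indices where a reading deviates sharply from the recent window."""
    flagged = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mu, sigma = mean(recent), stdev(recent)
        if sigma > 0 and abs(readings[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged
```

Fed a steady vibration baseline followed by a sudden spike, the sketch flags only the spike, which is the signal a maintenance alert would be built on.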


Monitoring equipment with Tractian. Image Credits: Tractian

Marinelli readily acknowledges that Tractian isn’t the first to the machine analytics space. Predictive maintenance technologies have been used for decades in jet engines and gas turbines, and companies including Samsara, Augury, Upkeep and MaintainX offer solutions with capabilities similar to Tractian. In April, Amazon threw its hat in the ring with the general launch of Lookout for Equipment, a service that ingests sensor data from a customer’s industrial equipment and then trains a machine learning model to predict early warning signs of machine failure.

In a sign of the segment’s competitiveness, Augury just this month acquired Seebo, a startup that provided manufacturing teams with the insights to optimize their industrial processes. Augury is one of the better-funded startups in the sector, having raised nearly $300 million in venture capital to date.

But both Marinelli and Lameirinhas sense opportunity in a market that could be worth $12.3 billion by 2025. In 2018, Gartner predicted that by 2022, spend on internet of things-enabled predictive maintenance would increase to $12.9 billion, up from $3.4 billion in 2018.

While Marinelli declined to go into the technical details of Tractian’s platform, including the accuracy of its algorithms, he noted that Tractian’s customer base of roughly 200 companies spans well-known brands like John Deere, Bosch, Embraer and Hyundai.

Looking ahead, the key for Tractian will be convincing would-be customers that its technology performs better than the rest. In a survey, McKinsey analysts highlight the dangers of an underperforming predictive maintenance algorithm, claiming that one company saved over 10% on equipment breakdowns using an algorithm but spent significantly more overall as a result of the algorithm’s high false-positive rate.

“[O]ur technology involves the same concept of Shazam, but for machines,” Marinelli said. “[The pandemic especially] increased the necessity of real-time monitoring of assets because many operators [can’t] be physically working [near them] for long periods of time.”

In March, Tractian announced its expansion to North America, opening a new office in Mexico with a team dedicated to developing the company’s activities there. Tractian plans to follow up with market entry in Atlanta, Georgia later this year.

When reached for comment, Debjit Mukerji, a partner at Next47 who plans to join Tractian’s board of directors, said: “This is a critical space, the heartbeat of our economy. Next47 is thrilled to join Tractian on its mission to transform the maintenance experience for enterprises globally. Having followed this space for years, we concluded that frictionless deployment, intuitive user interfaces and a mobile/cloud-first approach are essential ingredients of success, particularly in the underserved medium enterprise segment. Tractian combines these in its extraordinary product vision and consistently delights its customers.”

Tractian currently has 100 employees and it expects to expand its headcount to 200 in the next 18 months. The company’s total capital raised stands at $19 million; Marinelli demurred when asked about the valuation.

Microsoft joins Salesforce, Google and IBM in offering sustainability tracking products

As environmental issues take center stage, with increasingly severe weather incidents, wildfires, droughts and floods across the globe, companies that generate pollution up and down the supply chain are looking for ways to measure their environmental impact, with the ultimate goal of minimizing their overall contribution to the problem.

But in order to improve your carbon footprint, you first need a baseline measurement, and then you need to follow the data over time. That takes a set of software tools.

Microsoft wants to help, and today it announced a new offering called Microsoft Cloud for Sustainability. The company outlined its plans in a blog post authored by Alysa Taylor, corporate vice president of industry, apps, and data marketing, and Elisabeth Brinton, corporate vice president for sustainability.

The solution aims to use a set of measuring devices to collect the data, then take advantage of Microsoft’s cloud-based data services to process and understand it.

As Taylor and Brinton wrote: “To effectively drive sustainability reporting, sustainability efforts, and business transformation, organizations need better visibility into activities across their enterprise and value chain. Collecting and connecting IoT data from devices using sensors — combined with rich services at the edge or in the cloud — provides the basis to monitor and measure activities at scale.”

The data collection piece takes place in a tool called Microsoft Sustainability Manager, which they write “will empower organizations to more easily record, report and reduce their environmental impact through increasingly automated data connections that deliver actionable insights.”

Microsoft is not alone in this type of effort; other major players have made similar announcements over the last year, including Salesforce, Google and IBM. In fact, as IBM CEO Arvind Krishna said at a press event last week, sustainability is going to be big business moving forward as more companies try to reduce their carbon output. Salesforce and Google have built similar products, while IBM acquired Envizi, a startup that helps companies measure their carbon usage.

All of these solutions are data-driven and look to help companies collect the key data they need to understand their environmental impact, while finding ways to reduce it over time and meet sustainability goals.

Microsoft will also be working with a variety of partners to help extend the solution beyond its core offering. The tool will be available for testing or purchasing on June 1st.

Vivun is digitizing pre-sales, and just got $75M in new investment

Vivun was built by Matt Darrow, a guy who cut his teeth in pre-sales at Zuora. He believed that pre-sales teams had a lot to offer the organization, but there really wasn’t any software to capture that knowledge. That’s why he built Vivun.

The company is helping to define a new software category, which tends to get the attention of investors. But more than that, Darrow says, the company is fundamentally sound and growing fast.

“I think at the core of it is that we’ve been able to grow and run a business that has just really great fundamentals and metrics. And we’re leaning into a really big opportunity,” Darrow told me.

Among the milestones he chose to highlight: in the company’s most recent fiscal year, it grew ARR more than 4x, surpassed 100 employees and increased its customer count by more than 160%. Customers include Snowflake, Genesys and UiPath, among others.

While he didn’t want to discuss specific revenue numbers around that 4x ARR growth, he did give a strong hint that he’s building a substantial company. “We’re on the phase of our journey where our objective over the next year is how do we get to around the $30 million in ARR. So that’s kind of where we are in size and scope,” he told me.

Paul Drews, partner and head of U.S. investments at Salesforce Ventures, which is leading the investment in this round, says a company like Vivun is particularly important to help push sales at a time when traditional sales methods are shifting to buyer-driven processes. And with Salesforce’s focus on sales, the investment from its venture arm makes a lot of sense.

“The buying process has just changed a lot over the years from more of a top down process to a bottoms up, where buyers want to get their hands on the product. As part of this evolution, we’ve noticed that the role of pre-sales has really just grown in importance,” Drews said.

He said that to this point, he hadn’t seen a purpose-built product addressing this issue, and that as an investor, seeing someone break new ground really drew him to the company.

The startup just surpassed 140 employees and expects to reach 200 before the end of the year. Darrow says that unlike in 2019, when he was launching the company, Vivun now has a people/HR team in place to help drive its DEI initiatives. Further, he says that being 100% distributed helps; the company already has employees based in 10 different countries.

The $75 million investment comes as part of the company’s Series C round. In addition to lead investor Salesforce Ventures, new investor Tiger Global also participated along with existing investors Menlo Ventures, Accel and Unusual Ventures. The company reports it has now raised a total of $131 million.

Pipedream lands $20M to connect disparate apps

Software integrations can be challenging. While serverless functions-as-a-service (FaaS) products like AWS Lambda, Google Cloud Functions, and Cloudflare Workers can help ease the complexity, they aren’t always the right fit for developers looking to deploy and run integrations with other software apps. Just ask the team at BrightRoll, the programmatic video advertising platform acquired by Yahoo (full disclosure: TechCrunch’s parent company) in 2014, which ran 176 unique data pipelines to manage various app integrations.

Frustrated by the process, Tod Sacerdoti, who co-founded BrightRoll before a stint as an investor at Flux Capital, teamed up with seven former BrightRoll employees in product and engineering roles to found Pipedream, an integration platform for building workflows and connecting cloud services. Pipedream allows customers to create workflows with open source connectors to APIs and extensions coded in Node.js, Python, Go, or Bash for custom logic.

Investors lined up behind the idea. Today, Pipedream announced that it raised $20 million in a funding round led by True Ventures with participation from CRV, Felicis Ventures, and the World Innovation Lab. Sacerdoti says that the proceeds will be put toward further developing Pipedream’s product, expanding the number of apps on the platform, and building out a go-to-market team.


Pipedream’s app integration interface.

“When BrightRoll was acquired, there was an internal team of ten employees called the ‘product hackers’ that was responsible for [app] integrations … While the business value of the team was massive, it took tremendous resources to build and manage the integrations. We felt there had to be a better way,” Sacerdoti told TechCrunch in an email interview. “When we interviewed our first 30 potential customers, one of them said, ‘This would be so valuable if it existed but it is impossible to build. It’s a Pipedream.’ Hence, our name.”

Founded in 2019, Pipedream is akin to workflow automation platforms like Zapier, Integromat, Workato, and Mulesoft — albeit more developer-focused. Any user, customer, or partner can add an integration to the open source portion of the platform, while developers can add proprietary or internal integrations that aren’t intended to be shared with the larger Pipedream community. For example, Sacerdoti says that many customers use Pipedream to interact with existing AI platforms, building integrations to leverage AI models such as OpenAI’s GPT-3 or a cloud AI product from Google, Microsoft, or AWS.

As of today, which marks the general release of the latest version of the Pipedream platform, customers get data stores for storing and retrieving data across steps and workflows, as well as single sign-on support.

“A huge percentage of the software engineering being done today is integrating APIs, often called ‘glue code,'” Sacerdoti said. “Data engineering teams are nearly universally under-staffed and engineering leadership wants those teams working on the highest value parts of the business that only they can work on. Integrations rarely clear this bar and, as a result, integrations often don’t get prioritized. As enterprises look for automation and efficiencies, integrations are one of the most impactful and profitable efforts they can take.”
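The “glue code” Sacerdoti describes is usually unglamorous: pull a record out of one service, reshape it, and hand it to another. A minimal sketch of that kind of step is below; the services, field names, and schemas are invented for illustration and are not taken from Pipedream’s platform:

```python
# Typical "glue code": reshape records from one service's schema into
# the payload another service expects. The billing and CRM services,
# and all field names here, are hypothetical.

def to_crm_contact(signup):
    """Map a signup record from a (hypothetical) billing API into the
    contact format a (hypothetical) CRM API expects."""
    first, _, last = signup["full_name"].partition(" ")
    return {
        "firstName": first,
        "lastName": last or None,
        "email": signup["email"].lower(),  # CRM requires lowercase emails
        "source": "billing",
    }

signup = {"full_name": "Ada Lovelace", "email": "Ada@Example.com"}
print(to_crm_contact(signup))
```

Integration platforms like Pipedream aim to host, trigger, and retry steps like this one, so teams write the transformation but not the surrounding plumbing.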

Pipedream has a number of competitors in the workflow automation space, chiefly Zapier and the aforementioned FaaS products. Newer players include n8n, which provides a framework for users to synchronize data and workflows, and Merge, which helps its users build customer-facing integrations with third-party tools.

But Pipedream’s business is booming nevertheless. Sacerdoti claims that the company has over 100 enterprise customers including Checkr and Scale AI and sees more than 600 developers sign up every day. Sacerdoti attributes part of the growth to the number of triggers and actions on Pipedream’s platform (over 700), which he believes positions the platform well against rivals.


Image Credits: Pipedream

It’s no coincidence that Pipedream’s success comes amid explosive growth in API — and software-as-a-service (SaaS) app — usage. According to a recent survey from RapidAPI, 61.6% of developers relied on APIs more in 2021 than in 2020. And Statista found that organizations worldwide were using an average of 110 SaaS apps in 2021. That’s a lot of apps to connect, and developers tend to prefer API-forward approaches a la Pipedream’s for their simplicity.

“With Pipedream, a single technical user can produce and maintain the same level of integrations that a team of ten could do previously. In addition, the integrations can be worked on collaboratively among a team and hosted/run in the cloud, allowing larger teams to be even more efficient,” Sacerdoti said.

To date, 15-employee Pipedream has raised $22 million.