Pienso builds no-code tools for training AI models

AI might be the “it thing” of the moment. But that doesn’t mean it’s getting easier to deploy. According to a 2023 S&P Global survey, about half of companies with at least one AI project in production are still at the pilot or proof-of-concept stages. The reasons for the slow ramp-ups vary, but the commonly cited […]

Zama’s homomorphic encryption tech lands it $73M on a valuation of nearly $400M

Homomorphic encryption, a complex technique that uses cryptographic algorithms to keep data secure as it travels around networks and to third parties, continues to elude mass-market scalability and thus adoption — not least because currently, the complexity that makes it so effective also makes it slow and hard to use widely. But in a world […]

Hundreds of AI luminaries sign letter calling for anti-deepfake legislation

Hundreds in the artificial intelligence community have signed an open letter calling for strict regulation of AI-generated impersonations, or deepfakes. While this is unlikely to spur real legislation (despite the House’s new task force), it does act as a bellwether for how experts lean on this controversial issue. The letter, signed by over 500 people […]

House punts on AI with directionless new task force

The House of Representatives has founded a Task Force on artificial intelligence that will “ensure America continues leading in this strategic area,” as Speaker Mike Johnson put it. But the announcement feels more like a punt after years of indecision that show no sign of ending. In a way this task force — chaired by […]

Survey finds relatively few Americans actually use (or fear) ChatGPT

From its prominence in the tech world, you might think that everyone on the planet is using ChatGPT or some other AI tool for everything from automating their job to planning their garden. But recent Pew polling suggests the language model isn’t quite as popular or threatening as some would have you think. Ongoing polling […]

Amazon brings new AI-driven features to Thursday Night Football

As Amazon’s Prime Video gears up for its second year as the exclusive rights holder to the NFL’s Thursday Night Football (TNF), the streaming service hopes to give fans an enhanced viewing experience with a slew of new AI-driven features.

During a demo with Prime Video executives, TechCrunch learned about the AI elements coming to TNF this season, as well as the first Black Friday NFL game and when viewers can expect HDR video quality.

AI is changing how sports content is consumed

If we had asked someone about sports broadcasting a few years ago, we would have bet that artificial intelligence (AI) and machine learning (ML) would be the last things on their mind. However, the use of AI and ML models in sports has drastically altered how hardcore fans watch games, allowing them to dive deeper into the analytics.

“We don’t want to just put math on the screen,” Betsy Riley, senior coordinating producer at Prime Video, told us. “It’s about using data to tell a deeper story and to bring our fans insights so that they understand the game better. We think doing that lets people understand the chess match that’s unfolding on the field. For us as a tech company, it’s been really fun to dig into the numbers and to think about ways we can innovate and use tech to tell the story of the game in deeper and more meaningful ways.”

Amazon introduced AI to TNF last year, including X-Ray, which gives fans real-time access to live statistics and data; Rapid Recap, which generates up to 13 two-minute-long highlights so viewers can catch up on plays during a game; and more. And after winning its first Sports Emmy award in May, it’s safe to say the tech behemoth isn’t easing off the gas.

All the new AI features will live within Prime Vision with Next Gen Stats, TNF’s weekly alternate stream that features various graphic overlays on the screen during plays so fans can see stats and analysis in real time.

Note that Amazon will internally test the features during tonight’s preseason game at 8 p.m. ET. However, fans won’t be able to experience them just yet. The features roll out on September 14, when the 2023 season begins. (Fans can find the complete TNF schedule on Amazon’s website).

Defensive Alerts

What if we said that AI can predict blitzes? Defensive Alerts is Amazon’s in-house ML neural network that recognizes when defensive players are about to rush the opposing quarterback. A red orb will appear around the players of interest so fans know exactly who to focus on.

“It’s able to look at all players’ XY coordinate data, their relationship to each other, as well as their acceleration, where they are moving and how fast they are moving directionally, to predict who’s going to blitz,” explained Sam Schwartzstein, TNF analytics expert at Prime Video.

The ML model was trained on 35,000 plays and will continue to get smarter, Schwartzstein told TechCrunch, adding that it’s identifying blitzes and situations better than offensive linemen do. He also said the team has a panel of NFL experts, made up of former quarterbacks, coaches and offensive linemen, who help annotate the plays.

“Having this as an in-house neural network can only expand the kind of features that we can do in the future,” he said.
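To make the idea concrete, here is a hypothetical sketch of how the inputs Schwartzstein describes could feed a per-defender classifier. It is an illustration only, not Amazon’s Defensive Alerts model: the feature set, the scikit-learn network and the threshold logic are all assumptions.

```python
# Illustrative sketch only -- not Amazon's Defensive Alerts model. It mirrors the
# inputs described above (per-player XY coordinates, acceleration and spacing
# relative to the quarterback) and feeds them to a small neural network that
# scores each defender's likelihood of blitzing.
import numpy as np
from sklearn.neural_network import MLPClassifier

def blitz_features(xy, velocity, accel, qb_xy):
    """Build one feature row per defender from (hypothetical) tracking data.

    xy, velocity, accel: arrays of shape (num_defenders, 2)
    qb_xy: quarterback position, shape (2,)
    """
    dist_to_qb = np.linalg.norm(xy - qb_xy, axis=1, keepdims=True)
    speed = np.linalg.norm(velocity, axis=1, keepdims=True)
    accel_mag = np.linalg.norm(accel, axis=1, keepdims=True)
    # Closing rate: positive when a defender is moving toward the pocket.
    to_qb = (qb_xy - xy) / np.maximum(dist_to_qb, 1e-6)
    closing_rate = np.sum(velocity * to_qb, axis=1, keepdims=True)
    return np.hstack([dist_to_qb, speed, accel_mag, closing_rate])

# A small network trained offline on labeled plays (the article cites ~35,000).
model = MLPClassifier(hidden_layer_sizes=(32, 16), max_iter=500)
# model.fit(train_X, train_y)  # labels: did this defender blitz on the play?
# blitz_prob = model.predict_proba(blitz_features(xy, vel, acc, qb_xy))[:, 1]
# Defenders whose blitz_prob clears a threshold would get the on-screen red orb.
```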

Prime Targets

Prime Targets (featured in the first image at the very top of the page) works similarly: a green orb lights up a player who is open for a pass. The feature automatically tracks when a quarterback drops back to throw, and highlights the receiver (lit up by the green orb) as he runs his route and creates separation between himself and the defenders.

This feature was previously called Open Receiver, which tracked which players were most likely to convert a first down. Amazon tested it during last season’s games.

“This is the first statistic that is measuring the process of the play,” Schwartzstein noted. “Everything that we do on Prime Vision is predictive… This is all in real-time.”
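At its core, the openness signal comes down to geometry: how much distance each receiver has from the nearest defender. The snippet below is purely illustrative; the function name, threshold and coordinate conventions are assumptions, not Amazon’s implementation.

```python
# Hypothetical sketch of an "openness" signal like the one Prime Targets visualizes:
# for each receiver, measure separation from the nearest defender and flag the
# most open target. Field coordinates are assumed to be in yards.
import numpy as np

def most_open_receiver(receiver_xy, defender_xy, min_separation=3.0):
    """Return (receiver_index, separation_yards) for the most open receiver, or None.

    receiver_xy: (num_receivers, 2) positions
    defender_xy: (num_defenders, 2) positions
    """
    # Pairwise distances: each receiver to every defender.
    diffs = receiver_xy[:, None, :] - defender_xy[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)     # shape (receivers, defenders)
    separation = dists.min(axis=1)             # nearest defender per receiver
    best = int(np.argmax(separation))
    if separation[best] < min_separation:
        return None                            # nobody is meaningfully open
    return best, float(separation[best])
```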

Fourth Down Territory

Amazon is also launching a feature that may help fans understand how fourth-down decisions are made while potentially helping teams prepare for fourth downs.

Fourth down territory is the stretch of the field where an offense can realistically go for it on fourth down in an attempt to tie or win the game. Historically, coaches have usually opted to punt the ball away since it feels less risky. In recent years, however, more and more teams have been going for the fourth-down conversion.

Instead of putting analytics on the screen after a play has happened, Fourth Down Territory operates the way a real NFL analytics coordinator does: it shows viewers in real time exactly when a team should go for it on fourth down and what the probability is.
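Conceptually, that kind of recommendation comes from comparing the expected value of each option available on fourth down. The sketch below is a generic illustration of such a decision rule, not Amazon’s model; every probability and expected-points number in it is a placeholder a real system would estimate from field position, score and time remaining.

```python
# Generic fourth-down decision sketch (not Amazon's model): pick the option with
# the highest expected points, given placeholder probabilities and values.
def fourth_down_call(conversion_prob, ep_if_convert, ep_if_fail,
                     fg_prob, ep_fg_make=3.0, ep_fg_miss=-0.5, ep_punt=0.5):
    """Return (best_option, expected_points_by_option)."""
    options = {
        "go for it": conversion_prob * ep_if_convert + (1 - conversion_prob) * ep_if_fail,
        "field goal": fg_prob * ep_fg_make + (1 - fg_prob) * ep_fg_miss,
        "punt": ep_punt,
    }
    return max(options, key=options.get), options

# Example: 4th-and-2 near midfield, a 55% conversion chance and no realistic field goal.
call, values = fourth_down_call(0.55, ep_if_convert=2.4, ep_if_fail=-1.8, fg_prob=0.0)
print(call, values)   # "go for it" edges out the punt under these made-up numbers
```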

Field Goal Target Zones

NFL fans are accustomed to seeing a field goal target line on broadcasts: the digital line that appears near the end of a half or the end of the game, showing how far a team needs to advance before it can attempt a field goal. Amazon’s Field Goal Target Zones feature will put multiple lines on the screen that tell viewers the likelihood that a kicker will make a field goal from each point.
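One way to picture how such zones might be computed is a make-probability curve over kick distance, with zone boundaries drawn where that curve crosses chosen thresholds. The sketch below is a toy illustration with made-up coefficients, not Amazon’s or the NFL’s model.

```python
# Toy stand-in for Field Goal Target Zones: a logistic make-probability curve over
# kick distance, and the yard lines where it first drops below given thresholds.
# The intercept/slope values are invented for illustration; a real model would be
# fit to historical kick data and account for the kicker, weather, altitude, etc.
import math

def fg_make_probability(kick_distance_yards, intercept=5.5, slope=-0.11):
    """Hypothetical logistic make probability for a given kick distance."""
    return 1.0 / (1.0 + math.exp(-(intercept + slope * kick_distance_yards)))

def target_zone_lines(thresholds=(0.9, 0.75, 0.5)):
    """Yards to the end zone at which make probability first drops below each threshold.

    Kick distance = yards to the end zone + 17 (10-yard end zone plus ~7-yard hold).
    """
    lines = {}
    for threshold in thresholds:
        for yards_to_endzone in range(1, 60):
            if fg_make_probability(yards_to_endzone + 17) < threshold:
                lines[threshold] = yards_to_endzone
                break
    return lines

print(target_zone_lines())   # e.g. {0.9: ..., 0.75: ..., 0.5: ...}
```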

Key Plays

Key Plays gives fans the ability to view in-game highlights and critical moments, whether they’re already watching the game live or streaming on demand afterward. Much like Rapid Recap ensures fans never miss the action, Key Plays leverages AI and machine learning to offer viewers a full rundown of what’s happening on the field.

What can viewers expect for the NFL’s first Black Friday game?

Amazon and the NFL announced last fall that Prime Video would exclusively stream the NFL’s first Black Friday game on November 24, with the Miami Dolphins playing against the New York Jets and an expected kick-off at 3 p.m. ET.

Notably, the game will be free to watch for non-subscribers. The e-commerce giant will also use this as an opportunity to promote exclusive shopping deals to viewers.

During a press call earlier this week, Prime Video’s global head of sports, Jay Marine, hinted that Amazon has some “interesting things” planned for Black Friday that will be “additive” for fans.

While the company declined to share more details, Riley told TechCrunch that we can expect to hear more in the coming weeks. It’s our guess that Amazon will roll out interactive shopping elements in order to take advantage of the busiest shopping day of the year. Last year, the company launched a dedicated fan store page, TNF Central, offering TNF-related items, NFL-branded merch and Amazon devices.

We also spoke with Eric Orme, who serves as director of live events at Prime Video. Orme oversees global product, engineering, and operations for live sports events, including Thursday Night Football, the Premier League, UEFA Champions League, MLB, NBA, and US Open tennis, among others.

Prime Video will likely see a surge in viewership during the Black Friday game since everyone will be home for the holiday weekend. However, Orme is confident it will be a smooth streaming experience for all viewers.

“We work really closely with the retail teams and everybody’s leveraging AWS, so we spend a lot of time around traffic projections,” Orme said. “We ran a bunch of scenarios and are really confident where we think that number is going to be.”

Prime Video brings HDR video quality to TNF

While the streamer already tested HDR streaming last season, it will officially be available to all TNF viewers this year. Many fans will likely appreciate the visual upgrade because it provides a more compelling experience, with greater contrast and vivid colors.

The company said it would be available on any HDR-enabled device, and subscribers don’t need to change anything in settings as the livestream will automatically be upgraded.

Modular secures $100M to build tools to optimize and create AI models

Modular, a startup creating a platform for developing and optimizing AI systems, has raised $100 million in a funding round led by General Catalyst with participation from GV (Google Ventures), SV Angel, Greylock and Factory.

The round brings Modular’s total raised to $130 million; the proceeds will be put toward product expansion, hardware support and the continued development of Modular’s programming language, Mojo, CEO Chris Lattner says.

“Because we operate in a deeply technical space that requires highly specialized expertise, we intend to use this funding to support the growth of our team,” Lattner said in an email interview with TechCrunch. “This funding will not be primarily spent on AI compute, but rather improving our core products and scaling to meet our incredible customer demand.”

Lattner, an ex-Googler, co-founded Palo Alto-based Modular in 2022 with Tim Davis, a former Google colleague in the tech giant’s Google Brain research division. Both Lattner and Davis felt that AI was being held back by an overly complicated and fragmented technical infrastructure, and founded Modular with a focus on removing the complexity of building and maintaining AI systems at large scale.

Modular provides an engine that tries to improve the inferencing performance of AI models on CPUs — and, beginning later this year, GPUs — while delivering cost savings. Compatible with existing cloud environments, machine learning frameworks like Google’s TensorFlow and Meta’s PyTorch, and even other AI accelerator engines, Modular’s engine, currently in closed preview, lets developers import trained models and run them up to 7.5 times faster than on their native frameworks, Lattner claims.
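Because the engine is in closed preview, its API isn’t shown here. For context, the hedged sketch below just measures the baseline it competes against: native PyTorch CPU inference latency on a standard torchvision model, the sort of per-request number a drop-in inference engine would aim to cut by the multiples Lattner cites. The specific model and iteration counts are arbitrary choices for illustration.

```python
# Baseline-only sketch: time native PyTorch CPU inference on a stock ResNet-50.
# This is NOT Modular's API (which is in closed preview); it just shows the kind
# of latency measurement such engines are benchmarked against.
import time
import torch
import torchvision.models as models

model = models.resnet50(weights=None).eval()   # random weights; we only care about timing
batch = torch.randn(1, 3, 224, 224)            # one 224x224 RGB image

with torch.inference_mode():
    for _ in range(5):                         # warm-up runs
        model(batch)
    runs = 50
    start = time.perf_counter()
    for _ in range(runs):
        model(batch)
    latency_ms = (time.perf_counter() - start) / runs * 1000

print(f"Native PyTorch CPU latency: {latency_ms:.1f} ms per image")
```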

Modular’s other flagship product, Mojo, is a programming language that aims to combine the usability of Python with features like caching, adaptive compilation techniques and metaprogramming. Mojo is currently available in preview to “hundreds” of early adopters, and Modular plans to make it generally available early next month.

“Our developer platform enables our customers, and the world’s developers, to defragment their AI technology stacks — pushing more innovations into production faster and realizing more value from their investment in AI,” Lattner said. “We’re attacking the complexity that slows AI development today by solving the fragmentation issues that plague the AI stack, starting with where AI software meets AI hardware.”

Ambitious much? Perhaps. But none of what Modular, a roughly 70-employee company, is proposing is out of the realm of possibility.

Deci, backed by Intel, is among the startups offering tech to make trained AI models more efficient — and performant. Another in that category is OctoML, which automatically optimizes, benchmarks and packages models for an array of different hardware.

In any case, to Lattner’s point, AI demand is fast approaching the limits of sustainability — making any tech to cut down on its compute requirements hugely desirable. The generative AI models in vogue today are 10 to 100 times bigger than older AI models, as a recent piece in The Wall Street Journal points out, and much of the public cloud infrastructure wasn’t built for running these systems — at least not at this scale.

It’s already had an impact. Microsoft is facing a shortage of the server hardware needed to run AI so severe that it might lead to service disruptions, the company warned in an earnings report. Meanwhile, the sky-high appetite for AI inferencing hardware — mainly GPUs — has driven GPU provider Nvidia’s market cap to $1 trillion. But Nvidia’s become a victim of its own success; the company’s best-performing AI chips are reportedly sold out until 2024.

For these reasons and others, more than half of AI decision makers in top companies report facing barriers to deploying the latest AI tools, according to a 2023 poll from S&P Global.

“The compute power needed for today’s AI programs is massive and unsustainable under the current model,” Lattner said. “We’re already seeing instances where there is not enough compute capacity to meet demand. Costs are skyrocketing and only the big, powerful tech companies have the resources to build these types of solutions. Modular solves this problem, and will allow for AI products and services to be powered in a way that is far more affordable, sustainable and accessible for any enterprise.”

Modular’s Mojo programming language, a ‘fast superset’ of Python.

That’s reasonable. But I’m less convinced that Modular can drive widespread adoption of its new programming language, Mojo, when Python is so entrenched in the machine learning community. According to one survey, as of 2020, 87% of data scientists used Python on a regular basis.

But Lattner argues that Mojo’s benefits will drive its growth.

“One thing that is commonly misunderstood about AI applications is that they are not just a high-performance accelerator problem,” he said. “AI today is an end-to-end data problem, which involves loading and transforming data, pre-processing, post-processing and networking. These auxiliary tasks are usually done in Python and C++, and only Modular’s approach with Mojo can bring all these components together to work in a single unified technology base without sacrificing performance and scalability.”

He might be right. The Modular community grew to over 120,000 developers in the four months since Modular’s product keynote in early May, Lattner claims, and “leading tech companies” are already using the startup’s infrastructure, with 30,000 on the waitlist.

“The most important enemy of Modular is complexity: complexity in software layers that only work in special cases, software that’s tied to specific hardware and complexity driven by the low-level nature of high-performance accelerators,” he said. “The very thing that makes AI such a powerful and transformative technology is the reason it requires so much effort to reach scale, so much talent invested in building bespoke solutions and so much compute power to deliver consistent results. The Modular engine and Mojo together level the playing field, and this is just the start.”

And — at least from a funding standpoint — what an auspicious start it is.

Hugging Face raises $235M from investors including Salesforce and Nvidia

As first reported by The Information, then seemingly verified by Salesforce CEO Marc Benioff on X (formerly known as Twitter), AI startup Hugging Face has raised $235 million in a Series D funding round.

The tranche, which had participation from Google, Amazon, Nvidia, Intel, AMD, Qualcomm, IBM, Salesforce and Sound Ventures, values Hugging Face at $4.5 billion. That’s double the startup’s valuation from May 2022 and reportedly more than 100 times Hugging Face’s annualized revenue, reflecting the enormous appetite for AI and platforms to support its development.

Hugging Face has raised a total of $395.2 million to date, placing it among the better-funded AI startups in the space. Those ahead of it are OpenAI ($11.3 billion), Anthropic ($1.6 billion), Inflection AI ($1.5 billion), Cohere ($435 million) and Adept ($415 million).

“AI is the new way of building all software. It’s the most important paradigm shift of the decade and, compared to the software shift, it’s going to be bigger because of new capabilities and faster because software paved the way,” co-founder and CEO Clément Delangue told TechCrunch via email. “Hugging Face intends to be the open platform that empowers this paradigm shift.”

Delangue, a French entrepreneur, launched Brooklyn-based Hugging Face in 2016 alongside Julien Chaumond and Thomas Wolf. The trio originally built a chatbot app targeted at teenagers. But after open sourcing the algorithm behind the app, Hugging Face pivoted to focus on building a platform for creating, testing and deploying machine learning models.

Today, Hugging Face offers a number of data science hosting and development tools, including a GitHub-like hub for AI code repositories, models and data sets as well as web apps to demo AI-powered applications. Hugging Face also provides libraries for tasks like data set processing and evaluating models in addition to an enterprise version of the hub that supports software-as-a-service and on-premises deployments.
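As a concrete taste of the hub workflow, the snippet below pulls a public checkpoint with Hugging Face’s transformers library and runs it locally; the model ID is just one widely used public example.

```python
# Minimal example of loading a model from the Hugging Face hub with transformers.
# The checkpoint named here is a public example; any hub model ID would work.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)
print(classifier("Hugging Face just raised a Series D."))
# Expected output shape: [{'label': 'POSITIVE' or 'NEGATIVE', 'score': ...}]
```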

Hugging Face’s paid functionality includes AutoTrain, which helps to automate the task of training AI models; Inference API, which allows developers to host models without managing the underlying infrastructure; and Infinity, which is designed to increase the speed with which an in-production model processes data.
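For a sense of how the hosted Inference API works in practice, the sketch below follows Hugging Face’s documented pattern of POSTing inputs to a model endpoint with an access token; the token and model ID here are placeholders.

```python
# Hosted Inference API sketch: send inputs to a model endpoint over HTTP.
# Replace the placeholder token with a real Hugging Face access token.
import requests

API_URL = "https://api-inference.huggingface.co/models/distilbert-base-uncased-finetuned-sst-2-english"
headers = {"Authorization": "Bearer hf_XXXXXXXXXXXX"}   # placeholder token

response = requests.post(
    API_URL,
    headers=headers,
    json={"inputs": "Hugging Face just raised a Series D."},
)
print(response.json())
```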

Hugging Face has 10,000 customers today, it claims, and more than 50,000 organizations on the platform. And its model hub hosts over 1 million repositories.

Contributing to the growth is the strong, sustained interest in AI from the enterprise. According to a HubSpot poll, 43% of business leaders say that they plan to increase their investment in AI and automation tools over the course of 2023, while 31% say AI and automation tools are very important to their overall business strategy.

Much of what Hugging Face delivers falls into MLOps, a category of tools for streamlining the process of taking AI models to production and then maintaining and monitoring them. The MLOps market is substantial in its own right, with one report estimating that it’ll reach $16.61 billion by 2030.

But Hugging Face dabbles in other areas, too.

In 2021, Hugging Face launched BigScience, a volunteer-led project to produce an open source language model as powerful as OpenAI’s GPT-3, but free and open for anyone to use. It culminated in Bloom, a multilingual model that for more than a year has been available to tinker with on Hugging Face’s model hub.

Bloom is but one of several open source models to which Hugging Face has contributed development resources.

Hugging Face collaborated with ServiceNow, the enterprise software company, to release a free code-generating AI model called StarCoder (a follow-up model, SafeCoder, debuted this week). And the startup made available its own free alternative to ChatGPT, OpenAI’s viral AI-powered chatbot, in partnership with the German nonprofit LAION.

Hugging Face’s team-ups extend to major cloud providers, some of which are strategic investors.

Hugging Face recently worked with Nvidia to expand access to cloud compute via Nvidia’s DGX computing platform. It has a partnership with Amazon to extend its products to AWS customers and leverage Amazon’s custom Trainium chips to train the next generation of Bloom. And Hugging Face collaborated with Microsoft on Hugging Face Endpoints on Azure, a way to turn Hugging Face-developed AI models into scalable production solutions hosted through Azure.

With this latest investment, Delangue says Hugging Face plans to “double down” on its support across several domains, including research, enterprise and startups. It has 170 employees today and plans to recruit new talent over the coming months.