AccuWeather acquires air pollution startup Plume Labs

Weather forecast company AccuWeather is acquiring French startup Plume Labs; terms of the deal are undisclosed. Founded in 2014, Plume Labs has gradually expanded its offering to three products focused on air pollution data.

The startup first launched a mobile app for iOS and Android that provides air quality information. Initially, it was a simple city-level air pollution forecasting app: the company aggregated data from different sources to predict how pollution would evolve over time.

Over time, Plume Labs improved its forecasting abilities and can now predict air quality for the next few days, using machine learning models for its predictions. It also offers detailed maps with street-by-street information. This way, if you’re commuting to work on a bike or a moped, you know which busy streets to avoid.

Plume Labs then wanted to empower its users by making air quality tracking visual and actionable. That’s why it designed its own air quality tracker that connects to your smartphone using Bluetooth Low Energy.

The second-generation device can track particulate matter (PM1, PM2.5 and PM10) as well as polluting gases (nitrogen dioxide and volatile organic compounds). It’s been relatively successful: there are currently more Plume Labs devices in use than government-supported monitoring stations.

Finally, Plume Labs started offering air pollution data as an API. The company has aggregated data from thousands of environmental monitoring stations around the world and applied its machine learning model to that data. This way, Plume Labs customers get a head start if they want to integrate air quality data into their products. They don’t have to reconcile different data sources into a unified data set, and they don’t have to allocate resources to machine learning applied to air pollution.
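To illustrate the unification problem such an API solves, here is a minimal sketch of normalizing readings from two hypothetical monitoring networks into one schema. All field names, units and network names are illustrative assumptions, not Plume Labs' actual data format:

```python
# Hypothetical sketch: different monitoring networks report air quality
# in different schemas and units; a customer without an aggregated API
# would have to normalize each one. Field names here are made up.

def normalize_reading(source: str, raw: dict) -> dict:
    """Map one station's raw reading into a single unified schema."""
    if source == "network_a":
        # Reports PM2.5 directly in micrograms per cubic meter.
        return {"station": raw["id"], "pm25_ugm3": raw["pm25"]}
    elif source == "network_b":
        # Reports in milligrams per cubic meter; convert to ug/m3.
        return {"station": raw["station_code"],
                "pm25_ugm3": raw["pm2_5_mgm3"] * 1000}
    raise ValueError(f"unknown source: {source}")

# Two readings from incompatible sources, now in one comparable format.
unified = [
    normalize_reading("network_a", {"id": "A-001", "pm25": 12.0}),
    normalize_reading("network_b", {"station_code": "B-42", "pm2_5_mgm3": 0.009}),
]
```

Multiply this by thousands of stations and dozens of reporting formats, and outsourcing the work to a single API becomes attractive.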

In January 2020, AccuWeather integrated Plume Labs’ data in its weather forecasting products. The weather forecasting company used that opportunity to acquire a stake in Plume Labs. And now, AccuWeather is going one step further and acquiring the rest of the company.

“Air quality plays an intrinsic role in AccuWeather’s mission of saving lives and helping people prosper, and this acquisition will help us provide users and customers with an even more personalized experience as well as a 360-degree understanding of the impact of weather on their wellness,” AccuWeather president Steven R. Smith said in a statement. “Our exclusive alliance delivered on the promise to help put our users in greater control of their health, and we are committed to that goal even more firmly with this new strategic direction.”

Plume Labs will become AccuWeather’s center for climate and environmental data. The acquisition underscores that air pollution is becoming a key metric for many industries.

“Seven years ago, David Lissmyr and I launched Plume Labs to make air quality information accessible to everyone,” Plume Labs co-founder and CEO Romain Lacombe said in a statement. “Since then, our work has helped galvanize the fight for clean air by making the health impact of climate change personal. Joining forces with AccuWeather now is an extraordinary opportunity to amplify our impact at planetary scale and help 1.5 billion people avoid air pollution around the world.”

Up next, the Plume Labs team and technology will continue to operate, expanding their work to other environmental risks such as wildfires. Climate risk forecasting is still in its early days, but today’s acquisition confirms that it is going to become more important over time.

At ‘Lens Fest,’ Snap debuts creation tools for more sophisticated augmented reality experiences

As Snap’s creators begin to experiment with the company’s augmented reality Spectacles hardware, the company is expanding the capabilities of its Lens Studio to build augmented reality filters that are more connected, more realistic and more futuristic. At the company’s annual Lens Fest event, Snap debuted a number of changes coming to its lens creation suite, ranging from efforts to integrate outside media and data to more AR-centric features designed with a glasses future in mind.

On the media side, Snap will be debuting a new sounds library that allows creators to add audio clips and millions of songs from Snapchat’s library of licensed music directly into their lenses. Snap is also making efforts to bring real-time data into lenses via an API library that showcases evolving trends like weather information from AccuWeather or cryptocurrency prices from FTX. One of the bigger feature updates will allow creators to embed links inside lenses that send users to different web pages.


Snap’s once-goofy selfie filters remain a big growth opportunity for the company, which has long had augmented reality in its sights. Snap detailed that more than 2.5 million lenses have now been built by more than a quarter-million creators. Those lenses have been viewed by users a collective 3.5 trillion times, the company says. The company is building out its own internal “AR innovation lab,” called Ghost, which will help the company bankroll lens designers looking to push the limits of what’s possible, dishing out grants of up to $150,000 for individual projects.

As the company looks to make lenses smarter, it’s also looking to make them more technically capable.

Beyond integrating new data types, Snap is also working on the underlying AR tech to make lenses enjoyable for users with lower-end phones. Its World Mesh feature has allowed users with higher-end phones to view lenses that integrate real-world geometry data, giving digital objects in a lens surfaces to interact with. Now, Snap is enabling this feature across more basic phones as well.


Similarly, Snap is also rolling out tools to make digital objects react more realistically in relation to each other, debuting an in-lens physics engine that allows for more dynamic lenses, which can not only interact more deeply with the real world but also adjust to simultaneous user input.

Snap’s efforts to create more sophisticated lens creation tools on mobile come as the company is also building out more future-flung support for the tools developers may need to design hands-free glasses experiences on its new AR Spectacles. Creators have been crafting experiences with the new hardware for months, and Snap has been building new lens functionality to address their concerns and open up new opportunities.

Ultimately, Snap’s glasses are still firmly in developer mode and the company hasn’t offered any timeline for when it might ship a consumer product with integrated AR capabilities, so it theoretically has plenty of time to build in the background.

Some of the tools Snap has been quietly building include Connected Lenses, which enable shared experiences so multiple users can interact with the same content using AR Spectacles. In their developer iteration, the AR Spectacles don’t have the longest battery life, meaning that Snap has had to get creative in ensuring that lenses are there when you need them without running persistently. The company’s Endurance mode allows lenses to continue running in the background off-display while waiting for a specific trigger, like reaching a certain GPS location.
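The battery-saving pattern behind a GPS-triggered wake-up can be sketched roughly as follows. This is an illustrative approximation of the idea, not Snap's Lens Studio API; the function names and the 50-meter trigger radius are assumptions:

```python
# Sketch of an "endurance mode" pattern: a dormant lens-like process
# periodically evaluates a cheap trigger condition (distance to a target
# GPS location) and only activates its expensive rendering path once
# the wearer is close enough. Names are hypothetical, not Snap's API.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Approximate great-circle distance in meters between two lat/lon points."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def should_activate(current, target, radius_m=50.0):
    """Fire the trigger once the wearer comes within radius_m of the target."""
    return haversine_m(*current, *target) <= radius_m
```

The key design choice is that the dormant loop only runs the cheap distance check; the display and the full lens logic stay off until `should_activate` returns true.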

Byton enlists ViacomCBS, AccuWeather, Xperi and more to take on Tesla’s in-car entertainment

Byton is starting to amass key partners such as global media giant ViacomCBS to bring video content as well as information and other services to the eye-popping 48-inch wraparound digital dashboard screen in its upcoming electric M-Byte SUV.

The China-based electric car startup calls its dashboard screen the “Byton Stage,” and its plan to turn it, along with several other displays, into an interactive experience delivering everything from entertainment and navigation to health stats and even PowerPoint review is part of a bigger goal to compete against Tesla.

“In a world where Tesla has prominently taken its place during the past years, people are constantly telling me they are ready for an alternative option in choosing a new premium electric vehicle,” Byton CEO Daniel Kirchert said during a press conference Sunday ahead of CES in Las Vegas. “I believe Byton’s ready to be that alternative.”

Byton showed off a production version of the M-Byte on Sunday during a press conference ahead of CES, the annual tech trade show in Las Vegas. The M-Byte, which is expected to go into volume production later this year, will be priced at $45,000, Kirchert said.

Byton announced Sunday several partnerships to bring content into the vehicle, including ViacomCBS, Access, AccuWeather, Aiqudo, Cloud Car, Road.Travel and Xperi.

Each partnership fills out that interactive content ecosystem. Access Twine is the platform Byton will use to deliver content to its “stage,” while Cloud Car will handle cloud-connected natural language recognition. Aiqudo’s Voice to Action platform enables customers to use voice commands and integrates with the mobile apps on their smartphones. Road.Travel lets users plan and book trips, and Xperi will deliver digital HD Radio and DTS Connected Radio.

Some of that content, such as video from ViacomCBS, is meant to be viewed while the M-Byte is parked, following in the footsteps of Tesla, which offers a number of games as well as video streaming in its vehicles.

When the vehicle is parked, users can hit “cinema mode” to view videos or “office mode” to review PowerPoint slides. And soon there will be more. Byton also announced it is launching an app developer program.

“Our aim is to make the Byton M-Byte a seamless part of your digital life and easy access to compelling video content will be integral to that experience,” Andreas Schaaf, the company’s chief customer officer, said in a prepared statement.