Why MPG should matter for electric vehicles

If saving the environment is merely a lifestyle choice, the automakers and their latest electric vehicles have got us covered. Tesla’s Plaid touts performance. Leafs, Priuses and Volts preach humility. And Ford is flexing its muscle with launches of electric Mustangs and F-150s.

But if consumers’ choices are going to contribute to a greener future — if they’re going to opt for energy efficiency over flash — they need the ability to make smart purchasing decisions. To enable that, an old-fashioned measuring stick from the gasoline era could come in handy: the concept of miles per gallon.

In the electric vehicle (EV) era, car shopping is no longer a simple matter of finding a high-MPG car and a cheap gallon of gas. Electricity costs are confusing. Price and efficiency information is hard to find and harder to understand. And ultimately, you have to do the math.

That means getting to know electric energy’s unit of choice: the kilowatt-hour, or kWh — a string of characters better suited to an engineering textbook. To determine their costs and carbon footprints, drivers must solve the brain teasers that turn kWh into dollars and miles.

If you don’t do that, you’re trusting the automakers to do the right thing for you and the environment.

The government can lead on this problem. In fact, it has, and it does. Gas pumps have long been required to list the price of a gallon, gallons pumped and total fill-up cost. A vehicle’s EPA-mandated miles-per-gallon rating — displayed on dashboards and on every new car’s MPG sticker — ties it all together.

So maybe we already have a common denominator for the EV age. A familiar, tangible energy unit that gives us an apples-to-apples way to think about cost, efficiency and pollution.

Fellow Americans, say hello — again — to the gallon. Even as we leave the gas-powered car behind, we can keep its energy unit. It’s tangible, and if it works for the energy contained in gas, we can make it work for electricity.

According to the Environmental Protection Agency, a gallon of unleaded gas contains about 34 kWh of energy. Knowing that, you can easily deduce how much your energy purchase costs and how far it can take you. The gallon can even help you better understand your other electricity usage, putting your home energy costs on an apples-to-apples basis with your automobile’s energy costs.

When I gallon-ized my energy bills for the month of August, I learned:

  • My house used 56 gallons (1,888 kWh) worth of electricity.
  • My average home electricity cost was $6.34 per gallon.
  • At a Tesla supercharger, I paid $8.43 per gallon (25 cents per kWh).
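The conversions above are simple enough to script. A minimal sketch, using the EPA’s 33.7 kWh-per-gallon energy equivalence (the roughly 34 kWh figure cited below):

```python
# Gallon-izing electricity: convert kWh quantities and prices into
# gasoline-gallon equivalents using the EPA's energy-equivalence figure.
KWH_PER_GALLON = 33.7  # EPA: energy content of a gallon of gasoline (~34 kWh)

def gallons_equivalent(kwh: float) -> float:
    """Energy in kWh expressed as gasoline-gallon equivalents."""
    return kwh / KWH_PER_GALLON

def dollars_per_gallon_equivalent(price_per_kwh: float) -> float:
    """An electricity price in $/kWh expressed as $/gallon-equivalent."""
    return price_per_kwh * KWH_PER_GALLON

# The August numbers above: 1,888 kWh is about 56 "gallons" of electricity,
# and a 25-cent/kWh Supercharger works out to roughly $8.43 per gallon.
print(round(gallons_equivalent(1888), 1))
print(round(dollars_per_gallon_equivalent(0.25), 2))
```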

The government already publishes an MPG equivalent for electric and hybrid vehicles. Using MPG, it becomes clear that electric vehicles make up for a lot of that high cost-per-gallon in efficiency, often with ratings over 100 MPG.
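The MPGe rating works the same way: the miles a vehicle travels on 33.7 kWh, the energy in one gallon. A small sketch (the 28 kWh-per-100-miles consumption is a hypothetical EV, not a specific model):

```python
KWH_PER_GALLON = 33.7  # EPA gasoline-gallon energy equivalent

def mpge(miles: float, kwh_used: float) -> float:
    """Miles-per-gallon-equivalent: miles traveled per 33.7 kWh of electricity."""
    return miles / kwh_used * KWH_PER_GALLON

# A hypothetical EV that uses 28 kWh to go 100 miles rates about 120 MPGe,
# comfortably over the 100-MPG mark mentioned above.
print(round(mpge(100, 28)))
```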

MPG is already good for more than car shopping. New York City’s MPG mandates have doubled taxis’ fuel efficiency since 2009. (The city also reserves a portion of taxi licenses — medallions — for hybrids.) Uber and Lyft have announced green initiatives, but their lightly regulated status has let them avoid MPG standards.

Smart energy shopping alone will not solve climate change. Energy watchdogs also need to monitor the industry’s carbon impact from both electricity generation and EV-related hardware manufacturing.

All else equal, though, using less energy means less pollution. And common units can steer us toward smart choices that encompass far more than our cars. Should I buy batteries so I can stock up on electricity when it’s cheapest? Do solar panels make sense? What about better insulation or more efficient appliances?

A high-MPG vehicle and a home that also goes a long way on a gallon? Together, that would be a solid lifestyle choice.

Light is the key to long-range, fully autonomous EVs

Advanced driver assistance systems (ADAS) hold immense promise. At times, the headlines about the autonomous vehicle (AV) industry seem ominous, with a focus on accidents, regulation or company valuations that some find undeserved. None of this is unreasonable, but it makes the amazing possibilities of a world of AVs seem opaque.

One of the universally accepted upsides of AVs is the potential positive impact on the environment, as most AVs will also be electric vehicles (EVs).

Industry analyst reports project that by 2023, 7.3 million vehicles (7% of the total market) will have autonomous driving capabilities, requiring $1.5 billion of autonomous-driving-dedicated processors. This is expected to grow to $14 billion in 2030, when upward of 50% of all vehicles sold will be classified as SAE Level 3 or higher under the automation levels defined by SAE International and adopted by the National Highway Traffic Safety Administration (NHTSA).


Because photonic chips are faster and more energy efficient, fewer of them will be needed to reach SAE Level 3. We can also expect this increased compute performance to accelerate the development and availability of fully autonomous SAE Level 5 vehicles. In that case, the market for autonomous-driving photonic processors will likely far surpass the $14 billion projection for 2030.

When you consider all of the broad-based potential uses of autonomous electric vehicles (AEVs) — including taxis and service vehicles in major cities, or the clean transport of goods on our highways — it becomes clear how quickly this technology could significantly impact our environment: by helping to bring clean air to some of the most populated and polluted cities.

Yet AEVs currently have a sustainability problem of their own.

To operate efficiently and safely, AEVs must leverage a dizzying array of sensors: cameras, lidar, radar and ultrasonic sensors, to name just a few. These work together, gathering data to detect, react and predict in real time, essentially becoming the “eyes” for the vehicle.

While there’s some debate surrounding the specific number of sensors required to ensure effective and safe AVs, one thing is unanimously agreed upon: These cars will create massive amounts of data.

Reacting to the data generated by these sensors, even in a simplistic way, requires tremendous computational power — not to mention the battery power required to operate the sensors themselves. Processing and analyzing the data involves deep learning algorithms, a branch of AI notorious for its outsized carbon footprint.

To be a viable alternative, both in energy efficiency and economics, AEVs need to get close to matching gas-powered vehicles in range. However, the more sensors and algorithms an AEV has running over the course of a journey, the lower the battery range — and the driving range — of the vehicle.

Today, EVs are barely capable of reaching 300 miles before they need to be recharged, while a traditional combustion engine averages 412 miles on a single tank of gas, according to the U.S. Department of Energy. Adding autonomous driving into the mix widens this gap even further and potentially accelerates battery degradation.

Recent work published in the journal Nature Energy claims that the range of an automated electric vehicle is reduced by 10%-15% during city driving.

At the 2019 Tesla Autonomy Day event, it was revealed that driving range could be reduced by up to 25% when Tesla’s driver-assist system is enabled during city driving. This reduces the typical range for EVs from 300 miles to 225 — dropping below a perceived threshold of attractiveness for consumers.

A first-principles analysis takes this a step further. NVIDIA’s AI compute solution for robotaxis, DRIVE, has a power consumption of 800 watts, while a Tesla Model 3 has an energy consumption rate of about 11.9 kWh/100 km. At the typical city speed limit of 50 km/hour (about 30 mph), the Model 3 is consuming approximately 6 kW — meaning the power dedicated solely to AI compute amounts to approximately 13% of the battery power intended for driving.
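That back-of-the-envelope calculation can be checked directly (all inputs are the figures quoted above):

```python
# First-principles estimate of the share of battery power consumed by AI compute.
compute_power_kw = 0.8             # NVIDIA DRIVE robotaxi platform: ~800 W
consumption_kwh_per_100km = 11.9   # Tesla Model 3 energy consumption
speed_kmh = 50                     # typical city speed limit (~30 mph)

# Power drawn for driving = energy per km * km per hour
driving_power_kw = consumption_kwh_per_100km / 100 * speed_kmh  # ~5.95 kW
ai_share = compute_power_kw / driving_power_kw

print(round(driving_power_kw, 2))  # approximately 6 kW
print(round(ai_share * 100))       # approximately 13% of driving power
```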

This illustrates how the power-hungry compute engines used for automated EVs pose a significant problem for battery life, vehicle range and consumer adoption.

This problem is compounded by the power overhead of cooling the current generation of power-hungry chips used for advanced AI algorithms. When processing heavy AI workloads, these semiconductor architectures generate massive amounts of heat.

As these chips process AI workloads, they generate heat; their temperature rises, and performance declines. Heat sinks, fans and other cooling methods must then dissipate that heat, wasting energy and further reducing battery power and, ultimately, EV range. As the AV industry continues to evolve, new solutions to this AI compute heat problem are urgently needed.

The chip architecture problem

For decades, we have relied on Moore’s law, and its lesser-known cousin Dennard scaling, to deliver more compute power per footprint year after year. Today, it’s well known that electronic computers are no longer significantly improving in performance per watt, resulting in overheating data centers all over the world.

The largest gains to be had in computing are at the chip architecture level, specifically in custom chips, each for specific applications. However, architectural breakthroughs are a one-off trick — they can only be made at singular points in time in computing history.

Currently, the compute power required to train artificial intelligence algorithms and perform inference with the resulting models is growing exponentially — five times faster than the rate of progress under Moore’s law. One consequence of that is a huge gap between the amount of computing needed to deliver on the massive economic promise of autonomous vehicles and the current state of computing.

Autonomous EVs find themselves in a tug of war between maintaining battery range and the real-time compute power required to deliver autonomy.

Photonic computers give AEVs a more sustainable future

Fundamental innovation in computing and battery technology may be required to fully deliver on the promise of AEVs with the range, safety and performance demanded by consumers. While quantum computers are an unlikely short- or even medium-term solution to this AEV conundrum, there’s another, more available solution making a breakthrough right now: photonic computing.

Photonic computers use laser light, instead of electrical signals, to compute and transport data. This results in a dramatic reduction in power consumption and an improvement in critical, performance-related processor parameters, including clock speed and latency.

Photonic computers also enable inputs from a multitude of sensors to run inference tasks concurrently on a single processor core (each input encoded in a unique color), while a traditional processor can only accommodate one job at a time.

The advantage that hybrid photonic semiconductors have over conventional architectures lies within the special properties of light itself. Each data input is encoded in a different wavelength, i.e., color, while each runs on the same neural network model. This means that photonic processors not only produce more throughput compared to their electronic counterparts, but are significantly more energy efficient.
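One way to picture the claim: in electronics, N sensor inputs sharing one model are processed sequentially or as a batch of matrix multiplications, while a wavelength-multiplexed photonic core carries all N through one physical set of weights at once. A purely illustrative sketch of the shared-model workload:

```python
import numpy as np

rng = np.random.default_rng(0)
n_inputs = 4  # e.g. four sensor feeds; on a photonic core, four wavelengths
dim = 8

weights = rng.standard_normal((dim, dim))      # one shared network layer
inputs = rng.standard_normal((n_inputs, dim))  # one vector per sensor feed

# An electronic processor applies the same weight matrix to each input in
# turn (or as a batch); a photonic core would evaluate all four inputs
# against the same weights simultaneously, each on its own wavelength.
outputs = inputs @ weights.T
print(outputs.shape)  # one result row per sensor input
```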

Photonic computers excel in applications that require extreme throughput with low latency and relatively low power consumption — applications like cloud computing and, potentially, autonomous driving, where the real-time processing of vast amounts of data is required.

Photonic computing technology is on the brink of becoming commercially available and has the potential to supercharge the current roadmap of autonomous driving while also reducing its carbon footprint. It’s clear that interest in the benefits of self-driving vehicles is increasing and consumer demand is imminent.

So it is crucial for us to not only consider the industries it will transform and the safety it can bring to our roads, but also ensure the sustainability of its impact on our planet. In other words, it’s time to shine a little light on autonomous EVs.

Dabbel gets $4.4M to cut CO2 by automating HVAC for commercial buildings

Düsseldorf-based proptech startup Dabbel is using AI to drive energy efficiency savings in commercial buildings.

It’s developed cloud-based self-learning building management software that plugs into the existing building management systems (BMS) — taking over control of heating and cooling systems in a way that’s more dynamic than legacy systems based on fixed set-point resets.

Dabbel says its AI considers factors such as building orientation and thermal insulation, and reviews calibration decisions every five minutes — meaning it can respond dynamically to changes in outdoor and indoor conditions.

The 2018-founded startup claims this approach of layering AI-powered predictive modelling atop legacy BMS to power next-gen building automation is able to generate substantial energy savings — touting reductions in energy consumption of up to 40%.

“Every five minutes Dabbel reviews its decisions based on all available data,” explains CEO and co-founder, Abel Samaniego. “With each iteration, Dabbel improves or adapts and changes its decisions based on the current circumstances inside and outside the building. It does this by using cognitive artificial intelligence to drive a Model-Based Predictive Control (MPC) System… which can dynamically adjust all HVAC setpoints based on current/future conditions.”

In essence, the self-learning system predicts ahead of time the tweaks that are needed to adapt for future conditions — saving energy vs a pre-set BMS that would keep firing the boilers for longer.
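Dabbel hasn’t published its control algorithm, so the following is a purely hypothetical sketch of a predictive setpoint loop of the general kind described: every five minutes, pick the HVAC setpoint whose predicted comfort error plus energy effort is lowest. The thermal model, coefficients and function names are toy values for illustration, not Dabbel’s system:

```python
def predict_indoor_temp(current: float, setpoint: float, outdoor: float) -> float:
    # Toy thermal model: indoor temperature drifts toward the setpoint,
    # nudged by outdoor conditions. Coefficients are illustrative only.
    return current + 0.6 * (setpoint - current) + 0.1 * (outdoor - current)

def choose_setpoint(current: float, outdoor: float, target: float,
                    candidates=(18.0, 19.0, 20.0, 21.0, 22.0)) -> float:
    """Pick the candidate setpoint minimizing comfort error plus effort."""
    def cost(sp: float) -> float:
        comfort_error = abs(predict_indoor_temp(current, sp, outdoor) - target)
        energy_effort = abs(sp - outdoor) * 0.05  # crude heating/cooling proxy
        return comfort_error + energy_effort
    return min(candidates, key=cost)

# Re-evaluated every five minutes with fresh indoor/outdoor readings:
print(choose_setpoint(current=21.5, outdoor=18.0, target=21.0))
```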

The added carrot for commercial building owners (or tenants) is that Dabbel squeezes these energy savings without the need to rip and replace legacy systems — nor, indeed, to install lots of IoT devices or sensor hardware to create a ‘smart’ interior environment; the AI integrates with (and automatically calibrates) the existing heating, ventilation, and air conditioning (HVAC) systems.

All that’s needed is Dabbel’s SaaS — and less than a week for the system to be implemented (it also says installation can be done remotely).

“There are no limitations in terms of Heating and Cooling systems,” confirms Samaniego, who has a background in industrial engineering and several years’ experience automating high tech plants in Germany. “We need a building with a Building Management System in place and ideally a BACnet communication protocol.”

Average reductions achieved so far across the circa 250,000m² of space where its AI is in charge of building management systems are a little more modest but still impressive: 27%. (He says the maximum savings seen at some “peak times” is 42%.)

The touted savings aren’t limited to a single location or type of building/client, according to Dabbel, which says they’ve been “validated across different use cases and geographies spanning Europe, the U.S., China, and Australia”.

Early clients are facility managers of large commercial buildings — Commerzbank clearly sees potential, having incubated the startup via its early-stage investment arm — and several schools.

A further 1,000,000m² is in the contract or offer phase — slated to be installed “in the next six months”.

Dabbel envisages its tech being useful to other types of education institutions and even other use-cases. (It’s also toying with adding a predictive maintenance functionality to expand its software’s utility by offering the ability to alert building owners to potential malfunctions ahead of time.)

And as policymakers around the globe turn their attention to achieving the major reductions in carbon emissions needed to meet ambitious climate goals, the energy efficiency of buildings certainly can’t be overlooked.

“The time for passive responses to addressing the critical issue of carbon emission reduction is over,” said Samaniego in a statement. “That is why we decided to take matters into our own hands and develop a solution that actively replaces a flawed human-based decision-making process with an autonomous one that acts with surgical precision and thanks to artificial intelligence, will only improve with each iteration.”

If the idea of hooking your building’s heating/cooling up to a cloud-based AI sounds a tad risky for Internet security reasons, Dabbel points out it’s connecting to the BMS network — not the (separate) IT network of the company/building.

It also notes that it uses one-way communication via a VPN tunnel — “creating an end-to-end encrypted connection under high market standards”, as Samaniego puts it.

The startup has just closed a €3.6 million (~$4.4M) pre-Series A funding round led by Target Global, alongside main incubator (Commerzbank’s early-stage investment arm), SeedX, plus some strategic angel investors.

Commenting in a statement, Dr. Ricardo Schaefer, partner at Target Global, added: “We are enthusiastic to work with the team at Dabbel as they offer their clients a tangible and frictionless way to significantly reduce their carbon footprint, helping to close the gap between passive measurement and active remediation.”

 

Crusoe Energy is tackling energy use for cryptocurrencies and data centers and greenhouse gas emissions

The two founders of Crusoe Energy think they may have a solution to two of the largest problems facing the planet today — the increasing energy footprint of the tech industry and the greenhouse gas emissions associated with the natural gas industry.

Crusoe, which uses excess natural gas from energy operations to power data centers and cryptocurrency mining operations, has just raised $128 million in new financing from some of the top names in the venture capital industry to build out its operations — and the timing couldn’t be better.

Methane emissions are emerging as a new area of focus for researchers and policymakers focused on reducing greenhouse gas emissions and keeping global warming within the 1.5 degree targets set under the Paris Agreement. And those emissions are just what Crusoe Energy is capturing to power its data centers and bitcoin mining operations.

The reason why addressing methane emissions is so critical in the short term is because these greenhouse gases trap more heat than their carbon dioxide counterparts and also dissipate more quickly. So dramatic reductions in methane emissions can do more in the short term to alleviate the global warming pressures that human industry is putting on the environment.

And the biggest source of methane emissions is the oil and gas industry. In the U.S. alone roughly 1.4 billion cubic feet of natural gas is flared daily, said Chase Lochmiller, a co-founder of Crusoe Energy. About two thirds of that is flared in Texas with another 500 million cubic feet flared in North Dakota, where Crusoe has focused its operations to date.

For Lochmiller, a former quant trader at some of the top American financial services institutions, and Cully Cavness, a third-generation oil and gas scion, the ability to capture natural gas and harness it for computing operations is a natural combination of the two men’s interests in financial engineering and environmental preservation.

(Photo: three oil wells and natural gas flaring on the Fort Berthold Indian Reservation near New Town, N.D., on August 13, 2014. About $100 million worth of natural gas burns off per month because a pipeline system isn’t yet in place to capture and safely transport it. The reservation, home to the Mandan, Hidatsa and Arikara Nations, is at the epicenter of the fracking and oil boom that has brought oil royalties to many Native Americans living there. Photo by Linda Davidson/The Washington Post via Getty Images)

The two Denver natives met in prep school and remained friends. When Lochmiller left for MIT and Cavness headed off to Middlebury, they didn’t know that they’d eventually be launching a business together. But through Lochmiller’s exposure to large-scale computing and the financial services industry, and Cavness’s assumption of the family business, they came to the conclusion that there had to be a better way to address the massive waste associated with natural gas.

Conversation around Crusoe Energy began in 2018 when Lochmiller and Cavness went climbing in the Rockies to talk about Lochmiller’s trip to Mt. Everest.

When the two men started building their business, the initial focus was on finding an environmentally friendly way to deal with the energy footprint of bitcoin mining operations. It was this pitch that brought the company to the attention of investors at Polychain, the investment firm started by Olaf Carlson-Wee (and Lochmiller’s former employer), and investors like Bain Capital Ventures and new investor Valor Equity Partners.

(This was also the pitch that Lochmiller made to me to cover the company’s seed round. At the time I was skeptical of the company’s premise and was worried that the business would just be another way to prolong the use of hydrocarbons while propping up a cryptocurrency that had limited actual utility beyond a speculative hedge against governmental collapse. I was wrong on at least one of those assessments.)

“Regarding questions about sustainability, Crusoe has a clear standard of only pursuing projects that are net reducers of emissions. Generally the wells that Crusoe works with are already flaring and would continue to do so in the absence of Crusoe’s solution. The company has turned down numerous projects where they would be a buyer of low cost gas from a traditional pipeline because they explicitly do not want to be net adders of demand and emissions,” wrote a spokesman for Valor Equity in an email. “In addition, mining is increasingly moving to renewables and Crusoe’s approach to stranded energy can enable better economics for stranded or marginalized renewables, ultimately bringing more renewables into the mix. Mining can provide an interruptible base load demand that can be cut back when grid demand increases, so overall the effect to incentivize the addition of more renewable energy sources to the grid.”

Other investors have since piled on including: Lowercarbon Capital, DRW Ventures, Founders Fund, Coinbase Ventures, KCK Group, Upper90, Winklevoss Capital, Zigg Capital and Tesla co-founder JB Straubel.

The company now operates 40 modular data centers powered by otherwise wasted and flared natural gas throughout North Dakota, Montana, Wyoming and Colorado. Next year that number should expand to 100 units as Crusoe enters new markets such as Texas and New Mexico. Since launching in 2018, Crusoe has emerged as a scalable solution to reduce flaring through energy-intensive computing such as bitcoin mining, graphical rendering, artificial intelligence model training and even protein folding simulations for COVID-19 therapeutic research.

Crusoe boasts 99.9% combustion efficiency for its methane, and is also bringing additional benefits in the form of new networking buildout at its data center and mining sites. Eventually, this networking capacity could lead to increased connectivity for rural communities surrounding the Crusoe sites.

Currently, 80% of the company’s operations are being used for bitcoin mining, but there’s increasing demand for use in data center operations, and some universities, including Lochmiller’s alma mater, MIT, are looking at the company’s offerings for their own computing needs.

“That’s very much in an incubated phase right now,” said Lochmiller. “A private alpha where we have a few test customers… we’ll make that available for public use later this year.”

Crusoe Energy Systems should have the lowest data center operating costs in the world, according to Lochmiller. And while the company will spend money to support the infrastructure buildout necessary to get the data to customers, those costs are negligible compared to energy consumption, he said.

The same holds true for bitcoin mining, where the company can offer an alternative to coal powered mining operations in China and the construction of new renewable capacity that wouldn’t be used to service the grid. As cryptocurrencies look for a way to blunt criticism about the energy usage involved in their creation and distribution, Crusoe becomes an elegant solution.

Institutional and regulatory tailwinds are also propelling the company forward. New Mexico recently passed laws limiting flaring and venting to no more than 2 percent of an operator’s production by April of next year; North Dakota is pushing for incentives to support on-site flare capture systems; and Wyoming signed a law creating incentives for flare gas reduction applied to bitcoin mining. The world’s largest financial services firms are also taking a stand against flare gas, with BlackRock calling for an end to routine flaring by 2025.

“Where we view our power consumption, we draw a very clear line in our project evaluation stage where we’re reducing emissions for an oil and gas project,” Lochmiller said.

Immersion cooling to offset data centers’ massive power demands gains a big booster in Microsoft

LiquidStack does it. So does Submer. They’re both dropping servers carrying sensitive data into goop in an effort to save the planet. Now they’re joined by one of the biggest tech companies in the world in their efforts to improve the energy efficiency of data centers, because Microsoft is getting into the liquid-immersion cooling market.

Microsoft is using a liquid it developed in-house that’s engineered to boil at 122 degrees Fahrenheit (lower than the boiling point of water) to act as a heat sink, reducing the temperature inside the servers so they can operate at full power without any risks from overheating.

The vapor from the boiling fluid is converted back into a liquid through contact with a cooled condenser in the lid of the tank that stores the servers.

“We are the first cloud provider that is running two-phase immersion cooling in a production environment,” said Husam Alissa, a principal hardware engineer on Microsoft’s team for datacenter advanced development in Redmond, Washington, in a statement on the company’s internal blog. 

While that claim may be true, liquid cooling is a well-known approach to moving heat around to keep systems working. Cars use liquid cooling to keep their motors humming as they head out on the highway.

As technology companies confront the physical limits of Moore’s Law, the demand for faster, higher-performance processors means designing new architectures that can handle more power, the company wrote in a blog post. Power flowing through central processing units has increased from 150 watts to more than 300 watts per chip, and the GPUs responsible for much of bitcoin mining, artificial intelligence applications and high-end graphics each consume more than 700 watts per chip.

It’s worth noting that Microsoft isn’t the first tech company to apply liquid cooling to data centers and the distinction that the company uses of being the first “cloud provider” is doing a lot of work. That’s because bitcoin mining operations have been using the tech for years. Indeed, LiquidStack was spun out from a bitcoin miner to commercialize its liquid immersion cooling tech and bring it to the masses.

“Air cooling is not enough”

More power flowing through the processors means hotter chips, which means the need for better cooling or the chips will malfunction.

“Air cooling is not enough,” said Christian Belady, vice president of Microsoft’s datacenter advanced development group in Redmond, in an interview for the company’s internal blog. “That’s what’s driving us to immersion cooling, where we can directly boil off the surfaces of the chip.”

For Belady, the use of liquid cooling technology brings the density and compression of Moore’s Law up to the datacenter level.

The results, from an energy consumption perspective, are impressive. Microsoft investigated liquid immersion as a cooling solution for high-performance computing applications such as AI, and the investigation revealed that two-phase immersion cooling reduced power consumption for any given server by 5% to 15% (every little bit helps).

Meanwhile, companies like Submer claim they reduce energy consumption by 50%, water use by 99%, and take up 85% less space.

For cloud computing companies, the ability to keep these servers up and running even during spikes in demand, when they’d consume even more power, adds flexibility and ensures uptime even when servers are overtaxed, according to Microsoft.

“[We] know that with Teams when you get to 1 o’clock or 2 o’clock, there is a huge spike because people are joining meetings at the same time,” Marcus Fontoura, a vice president on Microsoft’s Azure team, said on the company’s internal blog. “Immersion cooling gives us more flexibility to deal with these burst-y workloads.”

At this point, data centers are a critical component of the internet infrastructure that much of the world relies on for… well… pretty much every tech-enabled service. That reliance however has come at a significant environmental cost.

“Data centers power human advancement. Their role as a core infrastructure has become more apparent than ever and emerging technologies such as AI and IoT will continue to drive computing needs. However, the environmental footprint of the industry is growing at an alarming rate,” Alexander Danielsson, an investment manager at Norrsken VC noted last year when discussing that firm’s investment in Submer.

Solutions under the sea

If submerging servers in experimental liquids offers one potential solution to the problem — then sinking them in the ocean is another way that companies are trying to cool data centers without expending too much power.

Microsoft has already been operating an undersea data center for the past two years. The company trotted out the tech as part of a push to aid in the search for a COVID-19 vaccine last year.

These pre-packed, shipping container-sized data centers can be spun up on demand and run deep under the ocean’s surface for sustainable, high-efficiency and powerful compute operations, the company said.

The liquid cooling project shares the most similarity with Microsoft’s Project Natick, which is exploring the potential of underwater datacenters that are quick to deploy and can operate for years on the seabed, sealed inside submarine-like tubes, without any onsite maintenance by people.

In those data centers, a nitrogen atmosphere replaces the engineered fluid, and the servers are cooled with fans and a heat exchanger that pumps seawater through a sealed tube.

Startups are also staking claims to cool data centers out on the ocean (the seaweed is always greener in somebody else’s lake).

Nautilus Data Technologies, for instance, has raised over $100 million (according to Crunchbase) to develop data centers dotting the surface of Davy Jones’ locker. The company is currently developing a data center project co-located with a sustainable energy project in a tributary near Stockton, Calif.

With its two-phase immersion cooling tech, Microsoft is hoping to bring the benefits of ocean cooling onto the shore. “We brought the sea to the servers rather than put the datacenter under the sea,” Microsoft’s Alissa said in a company statement.

Ioannis Manousakis, a principal software engineer with Azure (left), and Husam Alissa, a principal hardware engineer on Microsoft’s team for datacenter advanced development (right), walk past a container at a Microsoft datacenter where computer servers in a two-phase immersion cooling tank are processing workloads. Photo by Gene Twedt for Microsoft.

ConsenSys launches a more energy-efficient NFT ecosystem with a project from artist Damien Hirst as its first drop

The NFT craze has been an intriguing moment for digital artists, who have seen great leaps in how tech allows them to create their work, but not as much progress in shifting how they profit off of it.

Though crypto’s early-adopter artists have gained the most attention thus far, more institutionally established artists are dipping their toes into the token world. One of the bigger barriers has been the environmental concerns tied to the Ethereum blockchain, which requires intense energy usage to mint new artwork and carries incredibly high transaction fees, something that has invited controversy for early artists over climate change.

There have been a number of blockchain products to emerge in recent months that promise the benefits of Ethereum with greater speed, lower costs and lower energy usage, most notably Dapper Labs’ Flow blockchain, which powers its NBA Top Shot product. Today, we saw the debut of a new “layer-two” entrant from ConsenSys, called Palm, which operates as a sidechain on Ethereum’s main network but will be supported via the popular crypto wallet MetaMask.

As part of Palm’s launch, the artist Damien Hirst announced he will be launching an NFT project, his first, called The Currency Project on the platform’s Palm NFT Studio.

Ethereum has already committed to transitioning to a more energy-efficient proof-of-stake consensus structure, but it’s unclear how quickly that will happen. The network currently relies on a proof-of-work system (as does Bitcoin), which uses an energy-intensive competition to determine who mines the next block in the chain, a competition that gets more intensive as the network sees more traffic. That’s one reason crypto mining operations have had to consistently invest in the latest hardware to maintain an edge, using ever more power. Proof-of-stake does away with most of that, instead choosing the nodes that mine the next block based on reputation or their existing stake. There are some real security tradeoffs that have required workarounds, and plenty in the crypto community aren’t satisfied with the compromises, though proponents argue that environmental concerns should take precedence.

In a press release, the team behind Palm says the ecosystem is “99% more energy efficient than proof of work systems.”

Unlike Dapper Labs’ Flow, Palm benefits from its interconnectedness with the community of Ethereum developers, something on display in today’s announcement, which showcased several industry partnerships, including Nifty. The news arrived alongside details this morning of Dapper Labs’ monster $305 million fundraise, which will give the company backing to build on the momentum of Top Shot, the product behind the wave of enthusiasm the broader NFT space is currently experiencing.

To improve accountability, Norrsken VC ties partner compensation to its portfolio’s sustainable successes

With the close of its latest investment fund, Norrsken VC is taking an unprecedented step in tying the compensation of its partners to the positive changes the firm’s portfolio companies have on the world, and not just their financial returns.

The firm, which released its impact assessment for 2020 last week, has invested in companies that address seven of the United Nations’ seventeen sustainable development goals, and is benchmarking its performance on goals that range from the tightly monitored to the slightly tautological.

In some instances, the goals are simply customer metrics (with the assumption that the more customers on a product, the better they’re doing). To be fair, these are in areas like education and healthcare where the true impact of a company’s services is harder to measure.

The firm’s portfolio has made much more tangible progress in the climate change mitigation and sustainability space. Here, emissions avoided or increases in energy efficiency can be measured quite easily. And those energy efficiency gains and emissions reductions, along with the lower waste associated with the firm’s food and agtech businesses, are where the firm has seen its best performance.

When they exit, this performance will matter a great deal to the partners at Norrsken, because their compensation is directly affected by it.

“For each investment that we make, we set targets pre-investment for what we want to see in terms of impact,” said Tove Larsson, a general partner with Norrsken VC. “We do that together with some of our key LPs in the fund. We need to get the advisory committee’s approval of the targets. We set those targets for an individual year and then on an annual basis.”

When the fund reaches the end of its cycle, the firm will look at the aggregated outcome of all of the impact KPIs and weight each company’s results by the amount invested in it. Based on that, the firm decides whether the team gets any carried interest or not.

If the portfolio companies hit sixty percent of the impact targets that have been set by the firm and its advisory board members, then they receive half of the carried interest, with the rest donated to charity. “There’s a linear escalation up to 100 percent. And if we don’t achieve that then the carried interest will be paid out to a charity organization or an NGO,” said Larsson.
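Larsson’s escalation rule lends itself to a quick sketch. The function below is a hypothetical reading of the structure described above (the firm hasn’t published an exact formula): below 60 percent of the impact targets the team keeps no carry, at 60 percent it keeps half, and the share scales linearly to the full carry at 100 percent.

```python
def carry_share(impact_achieved: float) -> float:
    """Fraction of carried interest the team keeps, given the share of
    impact targets achieved (both as fractions, e.g. 0.6 for 60%).

    Below the 60% threshold everything goes to charity; hitting 60%
    earns half the carry, scaling linearly to the full carry at 100%.
    """
    if impact_achieved < 0.60:
        return 0.0
    if impact_achieved >= 1.0:
        return 1.0
    # Linear interpolation: 0.60 -> 0.50, 1.00 -> 1.00
    return 0.50 + 0.50 * (impact_achieved - 0.60) / 0.40

def charity_share(impact_achieved: float) -> float:
    """Whatever the team doesn't keep is donated."""
    return 1.0 - carry_share(impact_achieved)
```

Under this reading, at 80 percent achievement the team would keep 75 percent of the carry, and the remaining quarter would go to charity.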

Image Credit: Norrsken VC

The partners at Norrsken see their novel compensation structure as a point of differentiation, especially as the number of firms focused on themes related to the UN’s sustainable development goals continues to increase dramatically.

“When we started to invest four years ago, we were one of the first. Then the market evolved so quickly that we got questions around how do you stand out and how do you know whether you’re truly an impact player,” said Agate Freimane, a general partner at the firm.

“This is a core part of the DNA. We need to do better and show that we can walk the talk,” Freimane said. So the firm took a page from the European Investment Fund, whose operations impose similar restrictions on compensation, she said. “When we heard about this way of doing it, we said this makes 100 percent sense, and why doesn’t everyone do it?”

So far, the team hasn’t had any problems hitting the targets it has set. “We’re at 119 percent of the 2020 targets,” Freimane said. Still, that’s only 12 percent of the long-term targets. “At the moment, we’ve done one-tenth of what we need to do over the lifetime of the fund.”

Even if some of the targets may be… imprecise… the steps that the firm’s portfolio companies have taken to reduce greenhouse gas emissions and food waste, and to improve energy efficiency, are having a real, measurable impact. That includes the reduction of data center energy demand by 10 gigawatt-hours thanks to the deployment of Submer technologies; the 11,000 tons of food waste avoided through operations at Karma, Whywaste, Matsmart and Olio; the 4 million liters of water saved from carwashes using Woshapp; and the development of 38 megawatts of solar projects thanks to the work of Alight.

Image Credit: Norrsken VC

“What we’re most proud of is that we’re actually doing this now,” said Larsson. “It’s not perfect, what we have delivered now, but we really think we need to start somewhere, and it is key that the industry becomes more transparent. The first thing we mentioned is that we think it is an achievement that we are tracking it and making it public.”

With an ARR topping $250 million, LA’s vertical SaaS superstar ServiceTitan is now worth $8.3 billion

Who knew building a vertical software as a service toolkit focused on home heating and cooling could be worth $8.3 billion?

That’s how much Los Angeles-based ServiceTitan, a startup founded just eight years ago, is now worth, thanks to some massive tailwinds around homebuilding and energy efficiency that are boosting the company’s bottom line and netting it an unprecedented valuation for a vertical software company, according to bankers.

The company’s massive mint comes thanks to a new $500 million financing round led by Sequoia’s Global Equities fund and Tiger Global Management.

ServiceTitan’s backers are a veritable who’s who of the venture industry, with longtime white-shoe investors like Battery Ventures, Bessemer Venture Partners and Index Ventures joining later-stage investment funds like T. Rowe Price, Dragoneer Investment Group and ICONIQ Growth.

In all, the new $500 million round likely sets the stage for a public offering later this year or before the end of 2022 if market conditions hold.

ServiceTitan now boasts more than 7,500 customers that employ more than 100,000 technicians and conduct nearly $20 billion worth of transactions, providing services ranging from plumbing, air conditioning and electrical work to chimney, pest and lawn care services.

If Angi and Thumbtack are the places where homeowners go to find services and technicians, then ServiceTitan is where those technicians go to manage and organize their own businesses.

Based in Glendale, Calif., with satellite offices in Atlanta and Armenia, ServiceTitan built its business to solve a problem that its co-founders knew intimately as the children of parents whose careers were spent in the HVAC business.

The market for home services employs more than 5 million workers in the U.S. and represents a trillion-dollar global market.

Despite the siren song of global expansion, there’s likely plenty of room for ServiceTitan to grow in the U.S. Home ownership in the country is at a ten-year high thanks to the rise of remote work and an exodus from the largest American cities accelerated by the COVID-19 pandemic.

A focus on energy efficiency and a desire to reduce greenhouse gas emissions will likely cause a surge in residential and commercial retrofits, which will also boost new business. Indeed, these trends were already apparent in the statistic that home improvement spending was up 3 percent in 2020 even though the broader economy shrank by 3.5 percent.

“We depend on the men and women of the trades to maintain our life support systems: running water, heat, air conditioning, and power,” said Ara Mahdessian, co-founder and CEO of ServiceTitan. “Today, as both homeownership rates and time spent at home reach record highs, these essential service providers are facing rising demand from an increasingly tech-savvy homeowner. By providing contractors with the tools they need to deliver a great customer experience and grow their businesses with ease, ServiceTitan is enabling the hardworking men and women of the trades to reach the level of success they deserve.”

Three energy-innovation takeaways from Texas’ deep freeze

Individual solutions to the collective crisis of climate change abound: backup diesel generators, Tesla powerwalls, “prepper” shelters. However, the infrastructure that our modern civilization relies on is interconnected and interdependent — energy, transportation, food, water and waste systems are all vulnerable in climate-driven emergencies. No one solution alone and in isolation will be the salvation to our energy infrastructure crisis.

After Hurricane Katrina in 2005, Superstorm Sandy in 2012, the California wildfires last year, and the recent deep freeze in Texas, the majority of the American public has not only realized how vulnerable infrastructure is, but also how critical it is to properly regulate it and invest in its resilience.

What is needed now is a mindset shift in how we think about infrastructure. Specifically, how we price risk, how we value maintenance, and how we make policy that is aligned with our climate reality. The extreme cold weather in Texas wreaked havoc on electric and gas infrastructure that was not prepared for unusually cold weather events. If we continue to operate without an urgent (bipartisan?) investment in infrastructure, especially as extreme weather becomes the norm, this tragic trend will only continue (with frontline communities bearing a disproportionately high burden).

A month after Texas’ record-breaking storm, attention is rightly focused on helping the millions of residents putting their lives back together. But as we look toward the near-term future and get a better picture of the electric mobility tipping point on the horizon, past-due action to reform our nation’s energy infrastructure and utilities must take precedence.

Emphasize energy storage

Seventy-five percent of Texas’ electricity is generated from fossil fuels and uranium, and about 80% of the power outages in Texas were caused by these systems. The state and the U.S. are overly dependent on outdated energy generation, transmission and distribution technologies. As the price of energy storage is expected to drop to $75/kWh by 2030, more emphasis needs to be placed on “demand-side management” and distributed energy resources that support the grid, rather than trying to supplant it. By pooling and aggregating small-scale clean energy generation sources and customer-sited storage, 2021 can be the year that “virtual power plants” realize their full potential.
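The “virtual power plant” idea above comes down to aggregation: many small, customer-sited batteries dispatched together behave like one large plant. A minimal sketch of that arithmetic, with an entirely invented fleet:

```python
# Illustrative only: aggregate customer-sited batteries into a
# "virtual power plant" and total up what the fleet can deliver.
# All fleet figures below are invented for the example.

fleet = [
    # (number of units, usable storage per unit in kWh, max discharge in kW)
    (5_000, 13.5, 5.0),   # residential batteries
    (200, 500.0, 250.0),  # small commercial systems
]

total_energy_kwh = sum(n * kwh for n, kwh, _ in fleet)
total_power_kw = sum(n * kw for n, _, kw in fleet)

print(f"Aggregated storage: {total_energy_kwh / 1000:.1f} MWh")
print(f"Aggregated discharge capacity: {total_power_kw / 1000:.1f} MW")

# At the $75/kWh storage price projected for 2030, the packs alone:
print(f"Hardware cost at $75/kWh: ${75 * total_energy_kwh / 1e6:.1f}M")
```

Real virtual power plants layer telemetry, market bidding and dispatch control on top of this arithmetic; the point here is simply that thousands of small systems add up to utility-scale capacity.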

Policymakers would do well to mandate new incentives and rebates to support new and emerging distributed energy resources installed on the customers’ side of the utility meter, such as California’s Self-Generation Incentive Program.

Invest in workforce development

For the energy transition to succeed, workforce development will need to be a central component. As we shift from coal, oil and gas to clean energy sources, businesses and governments — from the federal to the city level — should invest in retraining workers into well-paying jobs across emerging verticals, like solar, electric vehicles and battery storage. In energy efficiency (the lowest-hanging fruit of the energy transition), cities should seize the opportunity to tie equity-based workforce development programs to real estate energy benchmarking requirements.

These policies will not only boost the efficiency of our energy systems and the viability of our aging building stock, creating a more productive economy, but will also lead to job growth and expertise in a growth industry of the 21st century. According to analysis from Rewiring America, an aggressive national commitment to decarbonization could yield 25 million good-paying jobs over the next 15 years.

Build microgrids for reliability

Microgrids can connect and disconnect from the grid. By operating on normal “blue-sky” operating days as well as during emergencies, microgrids provide uninterrupted power when the grid goes down — and reduce grid constraints and energy costs when grid-connected. Previously the sole domain of military bases and universities, microgrids are growing 15% annually, reaching an $18 billion market in the U.S. by 2022.

For grid resiliency and reliable power supply, there is no better solution than community-scale microgrids that connect critical infrastructure facilities with nearby residential and commercial loads. Funding feasibility studies and audit-grade designs — so that communities have zero-cost but high-quality pathways to constructable projects, as New York State did with the NY Prize initiative — is a proven way to involve communities in their energy planning and engage the private sector in building low-carbon resilient energy systems.

Unpredictability and complexity are quickening, and technology has its place, but not simply as an individual safeguard or false security blanket. Instead, technology should be used to better calculate risk, increase system resilience, improve infrastructure durability, and strengthen the bonds between people in a community both during and in between emergencies.

Bringing jobs and health benefits, BlocPower unlocks energy efficiency retrofits for low-income communities

Retrofitting buildings to make them more energy efficient and better at withstanding climate change-induced extreme weather is going to be a big, multibillion-dollar business. But it’s one that’s been hard for low-income communities to tap, thanks to obstacles ranging from faulty incentive structures to an inability to adequately plan for which upgrades will be most effective in which buildings.

Enter BlocPower, a New York-based startup founded by a longtime advocate for energy efficiency and the job creation that comes with it, which has a novel solution for identifying, developing and profiting off of building upgrades in low income communities — all while supporting high-paying jobs for workers in the communities the company hopes to serve.

The company also has managed to raise $63 million in equity and debt financing to support its mission. That money is split between an $8 million investment from some of the country’s top venture firms and a $55 million debt facility structured in part by Goldman Sachs to finance the redevelopment projects that BlocPower is creating.

These capital commitments aren’t charity. Government dollars are coming for the industry, and private companies, from healthcare providers to utility companies to real estate developers and property managers, all have a vested interest in seeing this market succeed.

There’s going to be over $1 billion carved out for weatherization and building upgrades in the stimulus package that’s still making its way through Congress.

For BlocPower’s founder, Donnel Baird, the issue of seeing buildings revitalized and good high-paying jobs coming into local communities isn’t academic. Baird was born in Brooklyn’s Bedford Stuyvesant neighborhood and witnessed firsthand the violence and joblessness that was ripping the fabric of that rich and vibrant community apart during the crack epidemic and economic decline of the 1980s and early 90s.

Seeing that violence firsthand, including a shooting on his way to school, instilled in Baird a desire to “create jobs for disconnected Black and brown people” so they would never feel the hopelessness and lack of opportunity that fosters cycles of violence.

Some time after the shooting, Baird’s family relocated from Brooklyn to Stone Mountain, Georgia, and after graduating from Duke University, Baird became a climate activist and community organizer, with a focus on green jobs. That led to a role in the presidential campaign for Barack Obama and an offer to work in Washington on Obama’s staff.

Baird declined the opportunity, but did take on a role reaching out to communities and unions to help implement the first stimulus package that Obama and Biden put together to promote green jobs.

And it was while watching the benefits of that stimulus collapse under the weight of a fragmented building industry that Baird came up with the idea for BlocPower.

“It was all about the implementation challenges that we ran into,” Baird said. “If you have ten buildings on a block in Oakland and they were all built by the same developer at the same time. If you rebuild those buildings and you retrofit all of those buildings, in five of those buildings you’re going to trap carbon monoxide in and kill everybody and in the other five buildings you’re going to have a reduction in emissions and energy savings.”

Before conducting any retrofits to capture energy savings (and health savings, but more on that later), Baird says developers need to figure out the potential for asbestos contamination in the building; understand the current heating, ventilation, and cooling systems that the building uses; and get an assessment of what actually needs to be done.

That’s the core problem that Baird says BlocPower solves. The company has developed software to analyze a building’s construction by creating a virtual twin based on blueprints and public records. Using that digital twin the company can identify what upgrades a building needs. Then the company taps lines of credit to work with building owners to manage the retrofits and capture the value of the energy savings and carbon offsets associated with the building upgrades.
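BlocPower hasn’t published details of how its software works, but the idea of screening a building from records before anyone visits the site can be illustrated with a toy rule-based model. Every field, rule and threshold below is hypothetical:

```python
from dataclasses import dataclass

@dataclass
class BuildingTwin:
    # Hypothetical fields assembled from blueprints and public records.
    year_built: int
    heating_system: str          # e.g. "oil boiler", "gas furnace", "heat pump"
    has_combustion_appliances: bool
    annual_kwh_per_sqft: float   # energy-use intensity from utility data

def screen_retrofit(b: BuildingTwin) -> list[str]:
    """Toy screen: flag likely upgrades and safety checks before a site visit."""
    steps = []
    if b.year_built < 1980:
        steps.append("asbestos survey before any demolition work")
    if b.has_combustion_appliances:
        # Baird's warning: tightening the envelope around combustion
        # appliances can trap carbon monoxide indoors.
        steps.append("ventilation/CO assessment before air-sealing")
    if b.heating_system in ("oil boiler", "gas furnace"):
        steps.append("evaluate electric heat pump conversion")
    if b.annual_kwh_per_sqft > 15:
        steps.append("envelope and insulation upgrade")
    return steps

building = BuildingTwin(1962, "oil boiler", True, 22.0)
print(screen_retrofit(building))
```

The real system reportedly builds a fuller digital twin of a building’s mechanical, electrical and plumbing systems; this sketch only captures the kind of screening logic Baird describes, including his carbon monoxide warning.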

For BlocPower to work, the financing piece is just as important as the software. Without getting banks to sign off on loans to make the upgrades, all of those dollars from the federal government remain locked up. “That’s why the $7 billion earmarked for investment in green buildings did not work,” Baird said. “At BlocPower our view is that we could build software to simulate using government records… we could simulate enough about the mechanicals, electrical, and plumbing across buildings in NYC so that we could avoid that cost.”

Along with co-founder Morris Cox, Baird built BlocPower while at Columbia University’s business school so that he could solve the technical problems and overcome the hurdles for community financing of renewable retrofit projects.

Right before his graduation, in 2014, the company had applied for a contract to do energy efficiency retrofits and was set to receive financing from the Department of Energy. The finalists had to go down to the White House and pitch the President. That pitch was scheduled for the same day as a key final exam for one of Baird’s Columbia classes, which the professor said was mandatory. Baird skipped the test and won the pitch, but failed the class.

After that it was off to Silicon Valley to pitch the business. Baird met with 200 or more investors who rejected his pitch. Many of these investors had been burned in the first cleantech bubble or had witnessed the fiery conflagrations that engulfed firms that did back cleantech businesses and swore they’d never make the same mistakes.

That was the initial position at Andreessen Horowitz when Baird pitched them, he said. “When I went to Andreessen Horowitz, they said ‘Our policy is no cleantech whatsoever. You need to figure out how software is going to eat up this energy efficiency market’,” Baird recalled.

Working with Mitch Kapor, an investor and advisor, Baird worked on the pitch and got Kapor to talk to Ben Horowitz. Both men agreed to invest and BlocPower was off to the races.

The company has completed retrofits in over 1,000 buildings since its launch, Baird said, mainly to prove out its thesis. Now, with the revolving credit facility in hand, BlocPower can take bigger bites out of the market. That includes a contract with utility companies in New York that will pay $30 million if the company can complete its retrofits and verify the energy savings from that work.

There are also early projects underway in Oakland and Chicago, Baird said.

Building retrofits do more than just provide energy savings, as Goldman Sachs managing director Margaret Anadu noted in a statement.

“BlocPower is proving that it is possible to have commercial solutions that improve public health in underserved communities, create quality jobs and lower carbon emissions,” Anadu said. “We are so proud to have supported Donnel and his team…through both equity and debt capital to further expand their reach.”

These benefits have potential additional revenue streams associated with them that BlocPower can also capture, according to investor and director Mitch Kapor.

“There are significant linkages that are known between buildings and pollution that are a public health issue. In a number of geographies community hospitals are under a mandate to improve health outcomes and BlocPower can get paid from health outcomes associated with the reduction in carbon. That could be a new revenue stream and a financing mechanism,” Kapor said. “There’s a lot of work to be done in essentially taking the value creation engine they have and figuring out where to bring it and which other engines they need to have to have the maximum social impact.”

Social impact is something that both Kapor and Baird talk about extensively and Baird sees the creation of green jobs as an engine for social justice — and one that can reunite a lot of working class voters whose alliances were fractured by the previous administration. Baird also believes that putting people to work is the best argument for climate change policies that have met with resistance among many union workers.

“We will not be able to pass shit unless workers and people of color are on board to force the U.S. Senate to pass climate change policy,” Baird said. “We have to pass the legislation that’s going to facilitate green infrastructure in a massive way.”

He pointed to the project in Oakland as an example of how climate policies can create jobs and incentivize political action.

“In Oakland, we’re doing a pilot project in 12 low-income buildings. I sent them $20K to train these workers, local people of color in Oakland… they are being put to work in Oakland,” Baird said. “That’s the model for how this gets built. So now we need them to call Chuck Schumer to push him to the left on green building legislation.”