Forerunner is software for NFIMBYs, or no flooding in my backyard

Mayors have the toughest job in the world, and leading a city is only getting harder. Even as populations swell in urban cores across the world, climate change is constraining the geographies where that growth can happen. Coastal communities that are popular with residents are also gambling on rising sea levels. How do you trade off the need for growth against the requirement to protect residents from disaster?

In most cases, the pendulum has swung fully toward growth. Coastal towns continue to allow widespread sprawl and development, chasing ever more property taxes and residents even as sea levels creep uncomfortably higher. It’s a recipe for disaster — and one that many cities have chosen to bake anyway.

Forerunner wants that pendulum to swing the other way. Its platform allows city planners and building managers to survey, investigate and enforce stricter building codes and land use standards with a focus on mitigating future flood damage. It’s particularly focused on American cities with heavy usage of the federal flood insurance program, and Forerunner helps cities maximize their adherence to that program’s byzantine rules.

The company pulls in data from FEMA and other sources to determine a property’s mandatory lowest floor height requirement, and whether the property conforms to that rule. It also tracks flood zone boundaries and helps with the administrative overhead of processing federal flood insurance documentation, such as creating and managing elevation certificates.
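
That conformity check is conceptually simple. Here is a minimal sketch of the idea (the field names and the one-foot freeboard margin are illustrative assumptions, not Forerunner’s actual implementation), comparing a structure’s lowest floor against FEMA’s base flood elevation (BFE):

```python
from dataclasses import dataclass

@dataclass
class Parcel:
    address: str
    lowest_floor_ft: float          # surveyed, e.g., from an elevation certificate
    base_flood_elevation_ft: float  # FEMA's BFE for the parcel's flood zone

def is_compliant(parcel: Parcel, freeboard_ft: float = 1.0) -> bool:
    """Many ordinances require the lowest floor to sit at or above the
    BFE plus a locally chosen 'freeboard' safety margin."""
    return parcel.lowest_floor_ft >= parcel.base_flood_elevation_ft + freeboard_ft

home = Parcel("12 Shoreline Dr", lowest_floor_ft=9.5, base_flood_elevation_ft=9.0)
print(is_compliant(home))  # False: 9.5 ft < 9.0 ft BFE + 1.0 ft freeboard
```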

Co-founders JT White and Susanna Pho have been friends for years and worked at the MIT Media Lab before eventually coming together in early 2019 to build out this floodplain management product. “It cannot be underscored enough that a lot of communities just don’t follow [federal flood] regulations,” Pho said. “They will revert their ordinances from something more strict … since they can’t do a lot of day-to-day compliance.”

Coastal cities devastated by floods are protected by federal flood insurance, but that often creates a moral hazard: since damage is paid for, there isn’t much incentive to avoid it in the first place. The federal government is attempting to tighten those standards, and there is also a sense among a new generation of city planners and municipal leaders that the build-devastation-rebuild model of many cities needs to stop given climate change. After flooding, “we want to see communities rebuild to higher standards,” White said. “The sort of cycle of rebuilding and doing the same thing over and over again is infuriating to us.”

Transitioning to a new model isn’t easy, of course. “There are a lot of hard decisions that these communities must make,” he said, but “our software makes it a bit easier to do these things.” So far, the company has gotten early traction, with 33 communities currently using Forerunner, according to the founders.

Although it has customer clusters in Louisiana and northern New Jersey, the company’s largest customer is Harris County, which includes much of the Houston, Texas metro area. The county could potentially save $5 million on its flood insurance premiums with better adherence to federal standards, according to White. “One of the benefits of our product is that we can help you protect and increase this immediate discount to every flood insurance policyholder in your community starting next year,” he said. Ultimately though, FEMA focuses on disincentives rather than incentives. “The biggest stick that FEMA has is that it can suspend communities from the flood insurance program,” he noted.

The company raised an early seed round in 2019, and has been focused on building up the platform’s capabilities and getting the sales flywheel spinning — which can be a tough order in the govtech space.

Even as demand intensifies for more housing and growth, climate change is simultaneously placing its own demands on cities. Mayors and city leaders are increasingly going to have to transition from the growth models of the past to the resilient models of the future.

BreezoMeter, which powers air quality data in Apple’s Weather app, launches Wildfire Tracker

BreezoMeter has been on a mission to make environmental health hazards accessible to as many people as possible. Through its air quality index (AQI) calculations, the Israel-based company can now resolve air quality down to a few meters in dozens of countries. A partnership with Apple to include its data in the iOS Weather app, along with its own popular apps, delivers those metrics to hundreds of millions of users, and an API product allows companies to tap into its dataset for their own purposes.

Right on the heels of a $30 million Series C round a few weeks ago, the company is expanding outward from air quality into the real-time detection of wildfire perimeters with its new product, Wildfire Tracker.

The new product takes advantage of the company’s fusion of sensor data, satellite imagery and local eyewitness reports to identify the edges of wildfires in real time. “People expect accurate wildfire information just as they expect accurate weather or humidity data,” Ran Korber, CEO and co-founder, said. “It has an immediate effect on their life.” He added that BreezoMeter wants to “try to connect the dots between climate tech and human health.”

Fire danger zones will be indicated with polygonal boundaries marked in red, and as always, air quality data will be viewable in these zones and in surrounding areas.

BreezoMeter’s air quality maps can show the spread of wildfire pollution easily. Image Credits: BreezoMeter.

Korber emphasized that getting these perimeters accurate across dozens of countries was no easy feat. Sensors can be sparse, particularly in the forests where wildfires ignite. Meanwhile, satellite data that focuses on thermal imaging can be fooled. “We’re looking for abnormalities … many of the times you have these false positives,” Korber said. He gave the example of a large solar panel array, which can look very hot to thermal sensors but obviously isn’t a fire.

The identified fire perimeters will be available for free to consumers on BreezoMeter’s air quality map website, and will shortly come to the company’s apps as well. Later this year, these perimeters will be available from the company’s APIs for commercial customers. Korber hopes the API endpoints will give companies like car manufacturers the ability to forewarn drivers that they are approaching a conflagration.
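
For a sense of how such an endpoint might be consumed, here is a minimal sketch. The perimeter polygon and the warning logic are illustrative assumptions (BreezoMeter has not published its API schema), but a point-in-polygon test is a standard way a vehicle could decide whether to warn its driver:

```python
from shapely.geometry import Point, shape

# Hypothetical response from a wildfire-perimeter API: a GeoJSON-style
# polygon of (longitude, latitude) pairs tracing the fire's edge.
fire_perimeter = shape({
    "type": "Polygon",
    "coordinates": [[
        [-120.52, 38.91], [-120.44, 38.93],
        [-120.41, 38.87], [-120.50, 38.85],
        [-120.52, 38.91],
    ]],
})

def should_warn_driver(lon: float, lat: float, buffer_deg: float = 0.05) -> bool:
    """Warn if the vehicle is inside the perimeter or within a rough buffer.
    (Production code would buffer in meters using a projected CRS, not degrees.)"""
    vehicle = Point(lon, lat)
    return fire_perimeter.buffer(buffer_deg).contains(vehicle)

print(should_warn_driver(-120.47, 38.95))  # True: the vehicle is just north of the fire's edge
```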

The new feature is just a continuation of BreezoMeter’s long-time expansion of its product. “When we started, it was just air quality … and only forecasting air pollution in Israel,” Korber said. “Almost every year since then, we expanded the product portfolio to new environmental hazards.” He pointed to the addition of pollen in 2018 and the increasingly global nature of the app.

Wildfire detection is an, ahem, hot area these days for VC investors. For example, Cornea is a startup focused on helping firefighters identify and mitigate blazes, while Perimeter wants to help identify boundaries of wildfires and give explicit evacuation instructions complete with maps. As Silicon Valley’s home state of California and much of the world increasingly become a tinderbox for fires, expect more investment and products to enter this area.

A16Z leads investment in Firemaps, a marketplace for home hardening against wildfires

Wildfires are burning in countries all around the world. California is dealing with some of the worst wildfires in its history (a superlative that I use essentially every year now) with the Caldor fire and others blazing in the state’s north. Meanwhile, Greece and other Mediterranean nations have been fighting fires for weeks to bring a number of massive blazes under control.

With the climate increasingly warming, millions of homes in the United States alone sit in zones at high risk for wildfires. Insurance companies and governments are putting acute pressure on homeowners to invest more in defending their homes through what is typically dubbed “hardening”: ensuring that if fires do arrive, a home has the best chance to survive and not spread the disaster further.

SF-based Firemaps has a bold vision for rapidly scaling up and solving the problem of home hardening by making a complicated and time-consuming process as simple as possible.

The company, which was founded just a few months ago in March, sends out a crew with a drone to survey a homeowner’s house and property if it is in a high-risk fire zone. Within 20 minutes, the team will have generated a high-resolution 3D model of the property down to the centimeter. From there, hardening options are identified and bids are sent out to trade contractors to perform the work on the company’s marketplace.

Once the drone scans a house, Firemaps can create a full CAD model of the structure and the nearby property. Image Credits: Firemaps.

While early, it’s already gotten traction. In addition to hundreds of homeowners who have signed up on its website and a few dozen homes that have been scanned, Andrew Chen of A16Z has led a $5.5 million seed round into the business (the Form D places the round sometime around April). Uber CEO Dara Khosrowshahi and Addition’s Lee Fixel also participated.

Firemaps is led by Jahan Khanna, who co-founded it along with his brother, who has a long-time background in civil engineering, and Rob Moran. Khanna was co-founder and CTO of early ride-sharing startup Sidecar, where Moran joined as one of the company’s first employees. The trio spent cycles exploring how to work on climate problems, while staying focused on helping people in the here and now. “We have crossed certain thresholds [with the climate] and we need to get this problem under control,” Khanna said. “We are one part of the solution.”

Over the past few years, Khanna and his brother explored building a solar farm and a solar-powered home in California. “What was wild, whenever we talked to someone, is they said you cannot build anything in California since it will burn down,” Khanna said. “What is kind of the endgame of this?” As they explored fire hardening, they realized that millions of homeowners needed faster and cheaper options, and they needed them sooner rather than later.

While there are dozens of ways to harden a home against fire, popular options include creating an ember-free zone within a few feet of the home, often by laying granite gravel on the ground, and ensuring that attic vents, gutters and siding are fireproof and can withstand high temperatures. These options vary widely in cost, although some local and state governments have created reimbursement programs that let homeowners recoup at least some of the expense of these improvements.
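
To illustrate the kind of check a property scan enables, here is a hypothetical sketch (not Firemaps’ actual pipeline; the footprints and the five-foot ember-free threshold are illustrative), which reduces flagging vegetation inside the ember-free zone to a simple distance query:

```python
from shapely.geometry import Polygon, Point

# Simplified 2D footprints extracted from a hypothetical property scan (units: feet).
house = Polygon([(0, 0), (40, 0), (40, 30), (0, 30)])
vegetation = {
    "juniper hedge": Point(43, 15).buffer(2),  # shrub roughly 1 ft from the wall
    "oak tree":      Point(70, 40).buffer(8),  # canopy well away from the house
}

EMBER_FREE_ZONE_FT = 5  # assumed "ember-free" clearance around the structure

for name, canopy in vegetation.items():
    gap = house.distance(canopy)
    if gap < EMBER_FREE_ZONE_FT:
        print(f"Flag for removal: {name} ({gap:.1f} ft from structure)")
```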

A Firemaps house in 3D model form with typical hardening options and associated prices. Image Credits: Firemaps.

The company’s business model is simple: vetted contractors pay Firemaps to be listed as an option on its platform. Khanna believes that because its drone scans produce a comprehensive model of a home, contractors will be able to bid for contracts without doing their own site visits. “These contractors are getting these shovel-ready projects, and their acquisition costs are basically zero,” Khanna said.

Long-term, “our operating hypothesis is that building a platform and building these models of homes is inherently valuable,” Khanna said. Right now, the company is launched in California, and the goal for the next year is to “get this model repeatable and scalable and that means doing hundreds of homes per week,” he said.

Paladin publicly launches Knighthawk, a first response drone for cities

Emergency response is a time-sensitive business. When fires burn or a driver crashes their car, seconds can mean the difference between saving lives and watching a situation spiral rapidly out of control. For fire and police departments, getting teams on site can be challenging, what with the vagaries of traffic and bad routing.

Houston-headquartered Paladin is a startup building a custom drone hardware and software solution for cities to be able to respond to emergencies faster and with better data. After years of development, the company is publicly unveiling its Knighthawk and Watchtower products.

The Knighthawk is a custom-made drone designed for the specific needs of emergency response personnel. It comes complete with two cameras — one with 10x optical zoom and one thermal — to provide the best possible video feed on a developing situation by day or night, with only a half second of latency. Importantly, the drone has a flight time of 55 minutes and can travel multiple miles to reach a site, according to the company. Launch can happen within seconds of a 911 call coming in.

Paladin Drones’ Knighthawk operating during the day. Image Credits: Paladin Drones

To manage the drones and watch the video feeds, operators use the company’s Watchtower software (available as an app) to place a pin on a map to direct the drone to the likely site of an emergency. Once there, uploaded video feeds will display in the app as well as in a 911 center’s existing computer-aided dispatch systems, a topic we covered quite a bit in our RapidSOS EC-1 from a few weeks ago.
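
The dispatch logic behind “drop a pin, launch a drone” can be sketched in a few lines. This is an illustrative feasibility check, not Paladin’s software; the cruise speed and loiter time are assumptions, with only the endurance figure borrowed from the Knighthawk specs above:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometers."""
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def can_dispatch(station, pin, speed_kmh=60.0, endurance_min=55.0, loiter_min=20.0):
    """Launch only if the drone can fly out, loiter on scene, and return."""
    dist_km = haversine_km(*station, *pin)
    transit_min = dist_km / speed_kmh * 60
    return 2 * transit_min + loiter_min <= endurance_min, transit_min

ok, eta = can_dispatch(station=(29.772, -95.461), pin=(29.790, -95.430))
print(f"dispatch={ok}, on scene in ~{eta:.1f} min")  # dispatch=True, ~3.6 min
```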

Paladin Drones’ Watchtower allows operators to direct, manage and watch video from Knighthawk drones. Image Credits: Paladin Drones.

The public launch is a huge step forward for the company, which TechCrunch last profiled in 2019 as it was emerging from Y Combinator with a $1.3 million seed from the likes of Khosla, Correlation Ventures, and Paul Buchheit. Back then, the focus was on building software to integrate with an off-the-shelf DJI drone. Paladin was experimenting with a beta Android app where an operator could place a pin on a map and direct the drone to a site.

Yet, that model proved insufficient for the task. CEO and co-founder Divy Shrivastava said that as the company developed its product, it learned it needed to own the hardware stack as well. “The drones that we were using weren’t purpose-built for automation,” he said. “We ended up coming up with our own communication technology for our drones … so that we won’t lose connection.”

CEO and co-founder Divy Shrivastava. Image Credits: Paladin Drones.

Since the company’s founding in 2018, it has responded to about 1,600 emergencies, according to its own internal data. The company has spent prodigious hours with departments in two locations — Memorial Villages in Houston and Orange Township in Ohio — responding to a handful of calls per day at specific hours.

That restriction hints at what has been one of the toughest challenges for the drone startup: regulations. The FAA has put in place strict rules around visual line of sight for operators of drones. In order to realize its vision of a completely seamless and easily deployed system, Paladin has had to collect extensive data and present it to the FAA to get operating waivers, which the agency offers through a “First Responder Tactical Beyond Visual Line of Sight” exception. So far, it has secured these types of waivers for the two cities it works with, and Shrivastava is confident that the company has developed a repeatable process for any new cities that want to purchase its products.

Installation is relatively simple, according to Shrivastava. The drones themselves can be placed anywhere, even “a parking lot,” and are often stationed at a police department or firehouse. No special hardware, sensors or guidance infrastructure needs to be installed in the city for the drones to process the terrain or understand their surroundings. Some software integration is required to connect the drones into the computer-aided dispatch system used by 911 call takers.

With the public launch and more proof points on the board, the company is focusing on ramping up sales, and “our long-term goal is to have every single fire, police and first response agency use us,” Shrivastava said. The team has expanded to about eight people, although the company’s other co-founder, Trevor Pennypacker, departed in late 2019 and now works at Tesla.

FloodMapp wants to predict where water goes before it washes away your home

Floods are devastating. They rip communities asunder, wipe out neighborhoods and force the evacuation of thousands of people every year, and recovering from them can take years — assuming recovery is possible at all. The U.S. government estimates that floods in recent decades (excluding hurricanes and tropical storms) have caused some $160 billion in damage and killed hundreds of people.

One would think that we should have a real-time model for where water is and where it is going around the world, what with all of those sensors on the ground and satellites in orbit. But we mostly don’t, instead relying on antiquated models that fail to take into account the possibilities of big data and big compute.

FloodMapp, a Brisbane, Australia-based startup, is aiming to wash out the old approaches to hydrology and predictive analytics and put in place a much more modern approach to help emergency managers and citizens know when the floods are coming — and what to do.

CEO and co-founder Juliette Murphy has spent a lifetime in the water resources engineering field and has seen firsthand the heavy destruction that water can cause. In 2011, she watched as her friend’s home was submerged in the midst of terrible flooding. The “water went right over the peak of her house,” she said. Two years later in Calgary, she saw the same situation again: floods, and fear, as friends tried to determine whether and how to evacuate.

Those memories and her own professional career led her to think more about how to build better tools for disaster managers. She ultimately synced up with CTO and co-founder Ryan Prosser to build FloodMapp in 2018, raising $1.3 million AUD along with a matching grant.

The company’s premise is simple: we have the tools to build real-time flood models today; we have simply chosen not to take advantage of them. Water follows gravity, which means that if you know the topography of a place, you can predict where the water will flow. The challenge is that solving the governing second-order differential equations at high resolution remains computationally expensive.

Murphy and Prosser decided to eschew the traditional physics-based approach that has dominated hydrology for decades in favor of a purely data-driven approach, using widely available machine learning techniques to make those calculations far more tractable. “We do top down what used to be bottoms up,” Murphy said. “We have really sort of broken the speed barrier.” That work led to the creation of DASH, the startup’s real-time flood model.
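
To make the contrast concrete, here is a toy version of the data-driven substitution, using synthetic numbers (FloodMapp has not published DASH’s architecture, so this only gestures at the general approach): rather than solving the governing equations, a model learns the mapping from upstream conditions to downstream water levels from historical observations, and inference then takes milliseconds.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Synthetic history: upstream gauge height (m), rainfall (mm/hr), tide (m)
# -> downstream water level (m). Real systems learn from years of gauge data.
X = rng.uniform([0, 0, -1], [6, 50, 1], size=(5000, 3))
y = 0.6 * X[:, 0] + 0.02 * X[:, 1] + 0.4 * X[:, 2] + rng.normal(0, 0.05, 5000)

model = GradientBoostingRegressor().fit(X, y)

# Inference is near-instant, versus hours for a high-resolution physics solve.
print(model.predict([[4.2, 35.0, 0.5]]))  # predicted downstream level in meters
```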

FloodMapp’s modeling of the river flooding in Brisbane. Image Credits: FloodMapp

Unlike typical tech startups though, FloodMapp isn’t looking to be its own independent platform. Instead, it interoperates with existing geographic information systems (GIS) like Esri’s ArcGIS by offering a data layer that can be combined with other data streams to provide situational awareness to emergency response and recovery personnel. Customers pay a subscription fee for access to FloodMapp’s data layer, and so far, the company is working with the Queensland Fire and Emergency Services in Australia as well as the cities of Norfolk and Virginia Beach in Virginia.

But it’s not just emergency services the startup is ultimately hoping to attract. Any company with physical assets, from telcos and power companies to banks and retail chains with physical stores could potentially be a customer of the product. In fact, FloodMapp is betting that the SEC will mandate further climate change financial disclosures, which could lead to a … flood of new business (I get one flood pun, okay, I get one).

FloodMapp’s team has expanded from its original two founders to a whole crop of engineering and sales personnel. Image Credits: FloodMapp

Murphy notes that “we are still in our early stages” and that the company is likely to raise further financing early next year as it gets through this year’s flood season and onboards several new customers. She hopes that ultimately, FloodMapp will “not only help people, but help our country change and adapt in the face of a changing climate.”

When the Earth is gone, at least the internet will still be working

The internet is now our nervous system. We are constantly streaming and buying and watching and liking, our brains locked into the global information matrix as one universal and coruscating emanation of thought and emotion.

What happens when the machine stops though?

It’s a question that E.M. Forster was intensely focused on more than a century ago in a short story called, rightly enough, “The Machine Stops,” about a human civilization connected entirely through machines that one day just turn off.

Those fears of downtime are not just science fiction anymore. An outage no longer just means missing a must-watch TikTok clip. Hospitals, law enforcement, the government, every corporation — the entire spectrum of human institutions that constitute civilization — now deeply rely on connectivity to function.

So when it comes to disaster response, the world has dramatically changed. In decades past, the singular focus could be roughly summarized as rescue and mitigation — save who you can while trying to limit the scale of destruction. Today though, the highest priority is by necessity internet access, not just for citizens, but increasingly for the on-the-ground first responders who need bandwidth to protect themselves, keep abreast of their mission objectives, and have real-time ground truth on where dangers lurk and where help is needed.

The sales cycles might be arduous, as we learned in part one, and the data trickles may finally have turned to streams, as we saw in part two, but none of that matters if there isn’t connectivity to begin with. So in part three of this series on the future of technology and disaster response, we’re going to analyze the changing nature of bandwidth and connectivity and how they intersect with emergencies. We’ll look at how telcos are creating resilience in their networks while defending against climate change, how first responders are integrating connectivity into their operations, and how new technologies like 5G and satellite internet will affect these critical activities.

Wireless resilience as the world burns

Climate change is inducing more intense weather patterns all around the world, creating second- and third-order effects for industries that rely on environmental stability to operate. Few industries have to be as responsive to this changing context as telecom companies, whose wired and wireless infrastructure is regularly buffeted by severe storms. Resiliency of these networks isn’t just needed for consumers — it’s absolutely necessary for the very responders trying to mitigate disasters and get the network back up in the first place.

Unsurprisingly, no issue looms larger for telcos than access to power — no juice, no bars. So all three of America’s major telcos — Verizon (which owns TechCrunch’s parent company Verizon Media, although not for much longer), AT&T and T-Mobile — have had to dramatically scale up their resiliency efforts in recent years to compensate both for the demand for wireless and the growing damage wrought by weather.

Jay Naillon, senior director of national technology service operations strategy at T-Mobile, said that the company has made resilience a key part of its network buildout in recent years, with investments in generators at cell towers that can be relied upon when the grid cannot. In “areas that have been hit by hurricanes or places that have fragile grids … that is where we have invested most of our fixed assets,” he said.

Like all three telcos, T-Mobile pre-deploys equipment in anticipation of disruptions. So when a hurricane begins to swirl in the Atlantic Ocean, the company will strategically fly in portable generators and mobile cell towers in anticipation of potential outages. “We look at storm forecasts for the year,” Naillon explained, and do “lots of preventative planning.” The company also works with emergency managers to “run through various drills with them and respond and collaborate effectively with them,” determining which parts of the network are most at risk for damage in an emergency. Last year, the company partnered with StormGeo to accurately predict weather events.

Predictive AI for disasters is also a critical need for AT&T. Jason Porter, who leads public sector and the company’s FirstNet first-responder network, said that AT&T teamed up with Argonne National Laboratory to create a climate-change analysis tool that evaluates the siting of its cell towers and how they will weather the next 30 years of “floods, hurricanes, droughts and wildfires.” “We redesigned our buildout … based on what our algorithms told us would come,” he said. The company has also been elevating vulnerable cell towers four to eight feet on “stilts” to improve their resilience to at least some weather events, which, he said, “gave ourselves some additional buffer.”

AT&T has also had to manage the growing complexity of creating reliability amid the chaos of a climate-change-induced world. In recent years, “we quickly realized that many of our deployments were due to weather-related events,” and the company has been “very focused on expanding our generator coverage over the past few years,” Porter said. It has also been very focused on building out its portable infrastructure. “We essentially deploy entire data centers on trucks so that we can stand up essentially a central office,” he said, emphasizing that the company’s national disaster recovery team responded to thousands of events last year.

Particularly on its FirstNet service, AT&T has pioneered two new technologies to try to get bandwidth to disaster-hit regions faster. First, it has invested in drones to offer wireless services from the sky. After Hurricane Laura hit Louisiana last year with record-setting winds, the company’s “cell towers were twisted up like recycled aluminum cans … so we needed to deploy a sustainable solution,” Porter said. So it deployed what it dubs the FirstNet One — a “dirigible” that “can cover twice the cell coverage range of a cell tower on a truck, and it can stay up for literally weeks, refuel in less than an hour and go back up — so long-term, sustainable coverage,” he said.

AT&T’s FirstNet One dirigible to offer internet access from the air for first responders. Image Credits: AT&T/FirstNet

Second, the company has been building out what it calls FirstNet MegaRange: a set of high-powered wireless equipment, announced earlier this year, that can deploy signals from miles away — say, from a ship moored off a coast — to deliver reliable connectivity to first responders in the hardest-hit disaster zones.

As the internet has absorbed more of daily life, the norms for network resilience have become ever more exacting. Small outages can disrupt not just a first responder, but a child taking virtual classes and a doctor conducting remote surgery. From fixed and portable generators to rapid-deployment mobile cell towers and dirigibles, telcos are investing major resources to keep their networks running continuously.

Yet, these initiatives are ultimately costs borne by telcos increasingly confronting a world burning up. Across conversations with all three telcos and others in the disaster response space, there was a general sense that utilities increasingly have to insulate themselves in a climate-changed world. For instance, cell towers need their own generators because — as we saw with Texas earlier this year — even the power grid itself can’t be guaranteed to be there. Critical applications need to have offline capabilities, since internet outages can’t always be prevented. The machine runs, but the machine stops, too.

The trend lines on the frontlines are data lines

While we may rely on connectivity in our daily lives as consumers, disaster responders have been much more hesitant to fully transition to connected services. It is precisely in the middle of a tornado, when the cell tower is down, that you realize a printed map might have been nice to have. Paper, pens, compasses — the old staples of survival flicks remain just as important in the field today as they were decades ago.

Yet, the power of software and connectivity to improve emergency response has forced a rethinking of field communications and how deeply technology is integrated on the ground. Data from the frontlines is extremely useful, and if it can be transmitted, dramatically improves the ability of operations planners to respond safely and efficiently.

Both AT&T and Verizon have made large investments in directly servicing the unique needs of the first responder community, with AT&T in particular gaining prominence with its FirstNet network, which it exclusively operates through a public-private partnership with the Department of Commerce’s First Responder Network Authority. The government offered a special spectrum license to the FirstNet authority in Band 14 in exchange for the buildout of a responder-exclusive network, a key recommendation of the 9/11 Commission, which found that first responders couldn’t communicate with each other on the day of those deadly terrorist attacks. Now, Porter of AT&T says that the company’s buildout is “90% complete” and is approaching 3 million square miles of coverage.

Why so much attention on first responders? The telcos are investing here because in many ways, the first responders are on the frontiers of technology. They need edge computing, AI/ML rapid decision-making, the bandwidth and latency of 5G (which we will get to in a bit), high reliability, and in general, are fairly profitable customers to boot. In other words, what first responders need today are what consumers in general are going to want tomorrow.

Cory Davis, director of public safety strategy and crisis response at Verizon, explained that “more than ever, first responders are relying on technology to go out there and save lives.” His counterpart, Nick Nilan, who leads product management for the public sector, said that “when we became Verizon, it was really about voice [and] what’s changed over the last five [years] is the importance of data.” He pointed to tools for situational awareness, mapping and more that are becoming standard in the field. Everything first responders do “comes back to the network — do you have the coverage where you need it, do you have the network access when something happens?”

The challenge for the telcos is that we all want access to that network when catastrophe strikes, which is precisely when network resources are most scarce. The first responder trying to communicate with their team on the ground or their operations center is inevitably competing with a citizen letting friends know they are safe — or perhaps just watching the latest episode of a TV show in their vehicle as they are fleeing the evacuation zone.

That competition is the argument for a completely segmented network like FirstNet, which has its own dedicated spectrum with devices that can only be used by first responders. “With remote learning, remote work and general congestion,” Porter said, telcos and other bandwidth providers were overwhelmed with consumer demand. “Thankfully we saw through FirstNet … clearing that 20 MHz of spectrum for first responders” helped keep the lines clear for high-priority communications.

FirstNet’s big emphasis is on its dedicated spectrum, but that’s just one component of a larger strategy to give first responders always-on and ready access to wireless services. AT&T and Verizon have made prioritization and preemption key operational components of their networks in recent years. Prioritization gives public safety users better access to the network, while preemption can include actively kicking off lower-priority consumers from the network to ensure first responders have immediate access.
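
The difference between the two policies is easiest to see in a toy admission-control loop. This is a schematic sketch, not how carrier networks actually implement it: prioritization decides who gets a scarce slot next, while preemption actively evicts a lower-priority session when a responder arrives.

```python
# Toy cell with capacity for three active sessions. Lower number = higher priority.
FIRST_RESPONDER, CONSUMER = 0, 1

active = []  # list of (priority, user) sessions

def admit(priority, user, capacity=3):
    if len(active) < capacity:
        active.append((priority, user))        # prioritization: free slot, admit
        return f"{user}: admitted"
    worst = max(active)                        # lowest-priority active session
    if priority < worst[0]:
        active.remove(worst)                   # preemption: evict to make room
        active.append((priority, user))
        return f"{user}: admitted, preempted {worst[1]}"
    return f"{user}: blocked"                  # no slot, no one to preempt

for pr, who in [(CONSUMER, "alice"), (CONSUMER, "bob"), (CONSUMER, "carol"),
                (FIRST_RESPONDER, "engine-41")]:
    print(admit(pr, who))  # engine-41 is admitted by evicting a consumer
```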

Nilan of Verizon said, “The network is built for everybody … but once we start thinking about who absolutely needs access to the network at a period of time, we prioritize our first responders.” Verizon has prioritization, preemption, and now virtual segmentation — “we separate their traffic from consumer traffic” so that first responders don’t have to compete if bandwidth is limited in the middle of a disaster. He noted that all three approaches have been enabled since 2018, and Verizon’s suite of bandwidth and software for first responders comes under the newly christened Verizon Frontline brand that launched in March.

With increased bandwidth reliability, first responders are connected in ways that even a decade ago would have been unfathomable. Tablets, sensors, connected devices and tools — equipment that would have been manual is now increasingly digital.

That opens up a wealth of possibilities now that the infrastructure is established. My interview subjects suggested applications as diverse as the decentralized coordination of response team movements through GPS and 5G; real-time updated maps that offer up-to-date risk analysis of how a disaster might progress; pathfinding for evacuees that’s updated as routes fluctuate; AI damage assessments even before the recovery process begins; and much, much more. Many of those possibilities, which in the past were only marketing-speak and technical promises, may finally be realized in the coming years.

Five, Gee

We’ve been hearing about 5G for years now, and even 6G every once in a while just to cause reporters heart attacks, but what does 5G even mean in the context of disaster response? After years of speculation, we are finally starting to get answers.

Naillon of T-Mobile noted that the biggest benefit of 5G is that it “allows us to have greater coverage” particularly given the low-band spectrum that the standard partially uses. That said, “As far as applications — we are not really there at that point from an emergency response perspective,” he said.

Meanwhile, Porter of AT&T said that “the beauty of 5G that we have seen there is less about the speed and more about the latency.” Consumers have often seen marketing around voluminous bandwidths, but in the first-responder world, latency and edge computing tend to be the most desirable features. For instance, devices can relay video to each other on the frontlines, without necessarily needing a backhaul to the main wireless network. On-board processing of image data could allow for rapid decision-making in environments where seconds can be vital to the success of a mission.

That flexibility is allowing for many new applications in disaster response, and “we are seeing some amazing use cases coming out of our 5G deployments [and] we have launched some of our pilots with the [Department of Defense],” Porter said. He offered an example of “robotic dogs to go and do bomb dismantling or inspecting and recovery.”

Verizon has made innovating on new applications a strategic goal, launching a 5G First Responders Lab dedicated to guiding a new generation of startups to build at this crossroads. Nilan of Verizon said that the incubator has had more than 20 companies across four different cohorts, working on everything from virtual reality training environments to AR applications that allow firefighters to “see through walls.” His colleague Davis said that “artificial intelligence is going to continue to get better and better and better.”

Blueforce is a company that went through the first cohort of the Lab. The company uses 5G to connect sensors and devices together to allow first responders to make the best decisions they can with the most up-to-date data. Michael Helfrich, founder and CEO, said that “because of these new networks … commanders are able to leave the vehicle and go into the field and get the same fidelity” of information that they normally would have to be in a command center to receive. He noted that in addition to classic user interfaces, the company is exploring other ways of presenting information to responders. “They don’t have to look at a screen anymore, and [we’re] exploring different cognitive models like audio, vibration and heads-up displays.”

5G will offer many new ways to improve emergency responses, but that doesn’t mean that our current 4G networks will just disappear. Davis said that many sensors in the field don’t need the kind of latency or bandwidth that 5G offers. “LTE is going to be around for many, many more years,” he said, pointing to the hardware and applications taking advantage of LTE-M standards for Internet of Things (IoT) devices as a key development for the future here.

Michael Martin of emergency response data platform RapidSOS said that “it does feel like there is renewed energy to solve real problems” in the disaster response market, which he dubbed the “Elon Musk effect.” And that effect definitely does exist when it comes to connectivity, where SpaceX’s satellite bandwidth project Starlink comes into play.

Satellite uplinks have historically had horrific latency and bandwidth constraints, making them difficult to use in disaster contexts. Furthermore, depending on the particular type of disaster, satellite uplinks can be astonishingly challenging to set up given the ground environment. Starlink promises to shatter all of those barriers: easier connections, fat pipes, low latencies and a global footprint that would be the envy of any first responder. Its network is still under active development, so it is difficult to foresee today precisely what its impact will be on the disaster response market. But it is an offering to watch closely in the years ahead, because if its promises pan out, it has the potential to completely upend the way we respond to disasters this century.

Yet, even if we discount Starlink, the change coming this decade in emergency response represents a complete revolution. The depth and resilience of connectivity is changing the equation for first responders from complete reliance on antiquated tools to an embrace of the future of digital computing. The machine is no longer stoppable.


Data was the new oil, until the oil caught fire

We’ve been hearing how “data is the new oil” for more than a decade now, and in certain sectors, it’s a maxim that has more than panned out. From marketing and logistics to finance and product, decision-making is now dominated by data at all levels of most big private orgs (and if it isn’t, I’d be getting a résumé put together, stat).

So it might be something of a surprise to learn that data, which could transform how we respond to the increasingly deadly disasters that regularly plague us, has been all but absent from much of emergency response this past decade. Far from finding a geyser of digital oil, disaster response agencies and private organizations alike have for years tried to swell the scope and scale of the data feeding into disaster response, with relatively meager results.

That’s starting to change though, mostly thanks to the internet of things (IoT), and frontline crisis managers today increasingly have the data they need to make better decisions across the resilience, response and recovery cycle. The best (and worst) is yet to come: with drones taking flight, simulated visualizations and even artificial intelligence-induced disasters, what we’re seeing today on the frontlines is only the beginning of what could be a revolution in disaster response in the 2020s.

The long-awaited disaster data deluge has finally arrived

Emergency response is a fight against the fog of war and the dreadful ticking of the clock. In the midst of a wildfire or hurricane, everything can change in a matter of seconds — even milliseconds if you aren’t paying attention. Safe roads ferrying evacuees can suddenly become impassable infernos, evacuation teams can reposition and find themselves spread far too thin, and unforeseen conditions can rapidly metastasize to cover the entire operating environment. An operations center that once had perfect information can quickly find it has no ground truth at all.

Unfortunately, even getting raw data on what’s happening before and during a disaster can be extraordinarily difficult. When we look at the data revolution in business, part of the early success stems from the fact that companies were always heavily reliant on data to handle their activities. Digitalization was and is the key word: moving from paper to computers in order to transform latent raw data into a form that was machine-readable and therefore analyzable. In business, the last ten years were basically an upgrade from version one to version two.

In emergency management, however, many agencies are stuck without a version at all. Take a flood — where is the water and where is it going? Until recently, there was no comprehensive data on where waters rose from and where they sloshed to. When it came to wildfires, there were no administrative datasets on where trees were located and how prone each was to fire. Even human infrastructure like power lines and cell towers often had little interface with the digital world. They stood there, and if you couldn’t see them, they couldn’t see you.

Flood modeling is on the cutting edge of disaster planning and response. Image Credits: CHANDAN KHANNA/AFP via Getty Images

Models, simulations, predictions, analysis: all of these are useless without raw data, and in the disaster response realm, there was no detailed data to be found.

After years of promising an Internet of Things (IoT) revolution, things are finally internet-izing, with IoT sensors increasingly larding up the American and global landscape. Temperature, atmospheric pressure, water level, humidity, pollution, power and other sensors have been widely deployed, emitting constant streams of data back into data warehouses ready for analysis.

Take wildfires in the American West. It wasn’t all that long ago that the U.S. federal government and state firefighting agencies had no knowledge of where a blaze was taking place. Firefighting has been “100 years of tradition unimpeded by progress,” as Tom Harbour, head of fire response for a decade at the U.S. Forest Service and now chief fire officer at Cornea, put it.

And he’s right. After all, firefighting is a visceral activity — responders can see the fires, even feel the burning heat radiating against their flesh. Data wasn’t useful, particularly in the West, where there are millions of acres of land and large swaths are sparsely populated. Massive conflagrations could be detected by satellites, but smoldering fires in the brush would be entirely invisible to the geospatial authorities. There’s smoke over California — what exactly is a firefighter on the ground supposed to do with such valuable information?

Today, after a decade of speculative promise, IoT sensors are starting to clear a huge part of this fog. Aaron Clark-Ginsberg, a social scientist at RAND Corporation who researches community resilience, said that air quality sensors have become ubiquitous since they are “very cheap [and] pretty easy to use” and can offer very fine-grained understandings of pollution — a key signal, for instance, of wildfires. He pointed to the company PurpleAir, which, in addition to making sensors, produces a popular consumer map of air quality, as indicative of the technology’s potential these days.

Maps are the critical intersection for data in disasters. Geographic information systems (GIS) form the basis for most planning and response teams, and no company has a larger footprint in the sector than privately held Esri. Ryan Lanclos, who leads public safety solutions at the company, pointed to the huge expansion of water sensors as radically changing responses to certain disasters. “Flood sensors are always pulsing,” he said, and with a “national water model coming out of the federal government,” researchers can now predict through GIS analysis how a flood will affect different communities with a precision unheard of previously.

Digital maps and GIS systems are increasingly vital for disaster planning and response, but paper still remains quite ubiquitous. Image Credits: Paul Kitagaki Jr.-Pool/Getty Images

Cory Davis, the director of public safety strategy and crisis response at Verizon (which, through our parent company Verizon Media, is TechCrunch’s ultimate owner), said that all of these sensors have transformed how crews work to maintain infrastructure as well. “Think like a utility that is able to put a sensor on a power line — now they have sensors and get out there quicker, resolve it, and get the power back up.”

He noted one major development that has transformed sensors in this space over the last few years: battery life. Thanks to continuous improvements in ultra-low-power wireless chips, as well as better batteries and energy management systems, sensors can last a very long time in the wilderness without the need for maintenance. “Now we have devices that have ten-year battery lives,” he said. That’s critical, because it can be impossible to connect these sensors to the power grid in frontier areas.

The same line of thinking holds true at T-Mobile as well. When it comes to preventative planning, Jay Naillon, senior director of national technology service operations strategy at the telco, said that “the type of data that is becoming more and more valuable for us is the storm surge data — it can make it easier to know we have the right assets in place.” That data comes from flood sensors that can offer real-time warning signals to planners across the country.

Telecom interest — and commercial interest in general — has been critical to accelerating the adoption of sensors and other data streams around disasters. While governments may be the logical end user of flood or wildfire data, they aren’t the only ones interested in this visibility. “A lot of consumers of that information are in the private sector,” said Jonathan Sury, project director at the National Center for Disaster Preparedness at the Earth Institute at Columbia University. “These new types of risks, like climate change, are going to affect their bottom lines,” and he pointed to bond ratings, insurance underwriting and other areas where commercial interest in sensor data has been profound.

Sensors may not literally be ubiquitous, but they have offered a window into conditions that emergency managers never had visibility into before.

Finally, there are the extensive datasets around mobile usage that have become ubiquitous throughout much of the world. Facebook’s Data for Good project, for instance, provides data layers around connectivity — are users connecting from one place and then later connecting from a different location, indicating displacement? That sort of data, from the company and from telcos themselves, can help emergency planners scout out how populations are shifting in real time.
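
A simplified sketch of that displacement signal (illustrative only; Data for Good’s actual methodology is aggregated and privacy-protected) compares where each anonymized user most often connected before and after an event:

```python
import pandas as pd

# Synthetic, pre-aggregated connection logs: one row per user per period,
# recording the modal cell/grid tile each user connected from.
logs = pd.DataFrame({
    "user":   ["u1", "u1", "u2", "u2", "u3", "u3"],
    "period": ["before", "after"] * 3,
    "tile":   ["A", "A", "B", "F", "B", "G"],
})

wide = logs.pivot(index="user", columns="period", values="tile")
wide["displaced"] = wide["before"] != wide["after"]
print(wide["displaced"].mean())  # share of users displaced: ~0.67
```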

Data, data, on the wall — how many AIs can they call?

Rivulets of data have now turned into floods of information, but just like floodwaters rising in cities across the world, the data deluge now needs a response all its own. In business, the surfeit of big data has been wrangled with an IT stack from data warehouses all the way to business intelligence tools.

If only data for disasters could be processed so easily. Data relevant for disasters is held by dozens of different organizations spanning the private, public, and non-profit sectors, leading to huge interoperability problems. Even when the data can be harmonized, there are large challenges in summarizing the findings down to an actual decision a frontline responder can use in their work — making AI a tough sale still today, particularly outside of planning. As Davis of Verizon put it, “now that they have this plethora of data, a lot of cities and federal agencies are struggling with how to use it.”

Unfortunately, standardization is a challenge at all scales. Globally, countries mostly lack interoperability, although standards are improving over time. Amir Elichai, the founder and CEO of 911 call-handling platform Carbyne, said that “from a technology standpoint and a standards standpoint, there is a big difference between countries,” noting that protocols from one country often have to be completely rewritten to serve a different market.

Tom Cotter, director of emergency response and preparedness at health care disaster response organization Project HOPE, said that even setting up communications between responders can be challenging in an international environment. “Some countries allow certain platforms but not others, and it is constantly changing,” he said. “I basically have every single technology communication platform you can possibly have in one place.”

One senior federal emergency management official acknowledged that data portability has become increasingly key in procurement contracts for technology, with the government recognizing the need to buy commercially-available software rather than custom-designed software. That message has been picked up by companies like Esri, with Lanclos stating that “part of our core mission is to be open and … create data and to share that openly to the public or securely through open standards.”

For all its downsides though, the lack of interoperability can be ironically helpful for innovation. Elichai said that the “lack of standards is an advantage — you are not buying into a legacy standard,” and in some contexts where standards are lacking, quality protocols can be built with the assumption of a modern data workflow.

Even with interoperability though, the next challenge becomes data sanitation — and disaster data is dirty as … well, something. While sensor streams can be verified and cross-checked with other datasets, in recent years there has been a heavy increase in the quantity of citizen-submitted information that has to be carefully vetted before it is disseminated to first responders or the public.

With citizens having more access to smartphones than ever, emergency planners have to sanitize uploaded data in order to verify it and make it useful. Image Credits: TONY KARUMBA/AFP via Getty Images

Bailey Farren, CEO and co-founder of disaster communications platform Perimeter, said that “sometimes citizens have the most accurate and real-time information, before first responders show up — we want citizens to share that with … government officials.” The challenge is how to filter the quality goods from the unhelpful or malicious. Raj Kamachee, the CIO of Team Rubicon, a non-profit that assembles teams of volunteer military veterans to respond to natural disasters, said that verification is critical, and it’s a key element of the infrastructure he has built at the organization since joining in 2017. “We’ve gotten more people using it so more feedback [and] more data [is] coming through the pipes,” he said. “So creating a self-service, a very collaborative approach.”

With quality and quantity, the AI models should come, right? Well, yes and no.

Sury of Columbia wants to cool down at least some of the hype around AI. “The big caveat with all of these machine learning and big data applications is that they are not a panacea — they are able to process a lot of disparate information, [but] they’re certainly not going to tell us exactly what to do,” he said. “First responders are already processing a lot of information,” and they don’t necessarily need more guidance.

Instead, AI in disasters is increasingly focused on planning and resilience. Sury pointed to OneConcern, a resiliency planning platform, as one example of how data and AI can be combined in the disaster planning process. He also pointed to the CDC’s Social Vulnerability Index and risk tools from FEMA that integrate different data signals into scalar values that emergency planners can use to optimize their contingency plans.
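
The “scalar value” pattern those tools share is straightforward: normalize each signal to a common scale, then combine them with weights. A minimal sketch follows, with invented signals and weights rather than the CDC’s or FEMA’s actual formulas:

```python
import numpy as np

# Hypothetical per-census-tract signals (rows = tracts).
signals = np.array([
    # poverty rate, % elderly, flood exposure score
    [0.12, 0.18, 0.40],
    [0.31, 0.25, 0.85],
    [0.05, 0.10, 0.10],
])
weights = np.array([0.4, 0.2, 0.4])  # illustrative planner-chosen weights

# Min-max normalize each signal, then take a weighted sum: one score per tract.
norm = (signals - signals.min(axis=0)) / (signals.max(axis=0) - signals.min(axis=0))
vulnerability = norm @ weights
print(vulnerability.round(2))  # [0.37 1.   0.  ]
```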

Yet, almost everyone I talked to was much more hesitant about the power of AI. As I discussed a bit in part one of this series regarding the disaster sales cycle, data tools have to be real-time and perfect every time given the lives that are on the line. Kamachee of Team Rubicon noted that when choosing tools, he avoids whiz-bang features and instead looks at the pure utility of individual vendors. “We go high tech, but we prepare for low tech,” he said, emphasizing that in disaster response, everything must be agile and adaptable to changing circumstances.

Elichai of Carbyne saw this pattern in his sales. There’s a “sensitivity in our market and the reluctance from time to time to adopt” new technologies, he said, but he acknowledged that “there is no doubt that AI at a certain point will provide benefits.”

Naillon of T-Mobile had similar views from the operator perspective, saying that “I can’t say that we really leverage AI very much” in the company’s disaster planning. Instead of AI as brain, the telecom company simply uses data and forecast modeling to optimally position equipment — no fancy GANs required.

Outside of planning, AI has helped in post-disaster recovery, and specifically around damage assessments. After a crisis transpires, assessments of infrastructure and private property have to be made in order for insurance claims to be filed and for a community to move forward. Art delaCruz, COO and president of Team Rubicon, noted that technology and a flourish of AI has helped significantly around damage assessments. Since his organization often helps rebuild communities in the course of its work, triaging damage is a critical element of its effective response strategy.

There’s a brighter future, other than that brightness from the sun that is going to burn us to a crisp, right?

So AI today is helping a bit with resilience planning and disaster recovery and not so much during emergency response itself, but there is certainly more to come across the entire cycle. Indeed, there is a lot of excitement about the future of drones, which are increasingly being used in the field, but there are concerns long term about whether AI and data will ultimately cause more problems than they solve.

Drones would seem to have an obvious value for disaster response, and indeed, they have been used by teams to get additional aerial footage and context where direct access by responders is limited. Kamachee of Team Rubicon noted that on a mission in the Bahamas, response teams used drones to detect survivors, since major roads were blocked. The drones snapped images that were processed using AI, helping the team identify those survivors for evacuation. He described drones and their potential as “sexy; very, very cool.”

Aerial views from drones can give disaster response teams much better real-time information, particularly in areas where on-the-ground access is limited. Image Credits: Mario Tama/Getty Images

Cotter of Project HOPE similarly noted that faster data processing translates to better responses. “Ultimately speed is what saves lives in these disasters,” he said. We’re “also able to manage more responses remotely [and] don’t have to send as many people downrange,” giving response teams more leverage in resource-constrained environments.

“I see more emergency management agencies using drone technology — search and rescue, aerial photography,” Davis of Verizon said, arguing that operators often have a mentality of “send a machine into a situation first.” He continued, arguing, “artificial intelligence is going to continue to get better and better and better [and] enable our first responders to respond more effectively, but also more efficiently and safer.”

With data flooding in from sensors and drones, and processed and verified better than ever, disaster response can improve, perhaps even keeping pace with Mother Nature’s increasingly deadly whims. Yet there is one caveat: will the AI algorithms themselves cause new problems in the future?

Clark-Ginsberg of RAND, perhaps supplying that typical RANDian alternatives analysis, said that these solutions can also create problems of their own: “technological risks leading to disaster and the world of technology facilitating disaster.” These systems can break, they can make mistakes, and more ominously, they can be sabotaged to increase chaos and damage.

Bob Kerrey, a co-chair of the 9/11 Commission, former senator and governor of Nebraska, and currently the board chairman of Risk & Return, a disaster response VC fund and philanthropy I profiled recently, pointed to cybersecurity as increasingly a wild card in many responses. “There wasn’t a concept called zero days — let alone a market for zero days — in 2004 [when the 9/11 Commission was doing its work], and now there is.” With the 9/11 terrorist attacks, “they had to come here, they had to hijack planes … now you don’t need to hijack planes to damage the United States,” noting that hackers “can be sitting with a bunch of other guys in Moscow, in Tehran, in China, or even your mother’s basement.”

Data is a revolution in the making for disaster response, but it may well cause a whole second-order set of problems that didn’t exist before. What giveth also taketh away. The oil gushes, but then the well suddenly runs dry — or simply catches fire.

