The human-focused startups of the hellfire

Disasters may not always be man-made, but they are always responded to by humans. There’s a whole panoply of skills and professions required today to respond to even the tiniest emergency, and that doesn’t even include the needs during pre-disaster planning and post-disaster recovery. It’s not a very remunerative industry for most and the mental health effects from stress can linger for decades, but the mission at the core of this work — to help people in the time of their greatest need — is what continues to attract many to partake in this never-ending battle anyway.

In the last three parts of this series on the future of technology and disaster response, I’ve focused on, well, technology, and specifically the sales cycle for new products, the sudden data deluge now that Internet of Things (IoT) is in full force, and the connectivity that allows that data to radiate all around. What we haven’t looked at enough so far is the human element: the people who actually respond to disasters as well as what challenges they face and how technology can help them.

So in this fourth and final part of the series, we’ll look at four areas where humans and technology intersect within disaster response and what future opportunities lie in this market: training and development, mental health, crowdsourced responses to disasters, and our doomsday future of hyper-complex emergencies.

Training in a hellfire

Most fields have linear approaches to training. To become a software engineer, students learn some computer science theory, add in some programming practice, and voilà (note: your mileage may vary). To become a medical doctor, aspiring physicians take an undergraduate curriculum teeming with biology and chemistry, head to medical school for two grueling years of core anatomy and other classes, and then switch into clinical rotations, a residency, and maybe fellowships.

But how do you train someone to respond to emergencies?

From 911 call takers to EMTs and paramedics to emergency planning officials and the on-the-ground responders operating in the center of the storm, as it were, there is wide variation in the skills required to do these jobs well. What’s necessary aren’t just specific hard skills like using call dispatch software or knowing how to upload video from a disaster site, but also critically important softer skills: communicating precisely, keeping one’s sangfroid, staying agile, and balancing improvisation with consistency. The chaos element also can’t be overstated: every disaster is different, and these skills must be viscerally recombined and exercised under extreme pressure with frequently sparse information.

A whole range of what might be dubbed “edtech” products could serve these needs, and not just exclusively for emergency management.

Communications, for instance, isn’t just about team communications, but also communicating with many different constituencies. Aaron Clark-Ginsberg, a social scientist at RAND Corporation, said that “a lot of these skills are social skills — being able to work with different groups of people in culturally and socially appropriate ways.” He notes that the field of emergency management has heightened attention to these issues in recent years, and “the skillset we need is to work with those community structures” that already exist where a disaster strikes.

As we’ve seen in the tech industry the last few years, cross-cultural communication skills remain scarce. One can always learn this just through repeated experiences, but could we train people to develop empathy and understanding through software? Can we develop better and richer scenarios to train emergency responders — and all of us, really — on how to communicate effectively in widely diverging conditions? That’s a huge opportunity for a startup to tackle.

Emergency management is now a well-developed career path. “The history of the field is very fascinating, [it’s] been increasingly professionalized, with all these certifications,” Clark-Ginsberg said. That professionalization “standardizes emergency response so that you know what you are getting since they have all these certs, and you know what they know and what they don’t.” Certifications can attest to competence in individual skills, but they don’t amount to a holistic assessment, and that gap offers opportunities for new startups to create better assessments.

Like many of us, responders get used to doing the same thing over and over again, and that can make training for new skills even more challenging. Michael Martin of emergency data management platform RapidSOS describes how 911 call takers build deep muscle memory in their existing tools, “so switching to a new system is very high-risk.” No matter how bad existing software interfaces are, changing them will very likely slow every single response down while increasing the risk of errors. That’s why the company offers “25,000 hours a year for training, support, integration.” There remains a huge and relatively fragmented market for training staff as well as transitioning them from one software stack to another.

Outside these somewhat narrow niches, there is a need for a massive renaissance in training in this whole area. My colleague Natasha Mascarenhas recently wrote an EC-1 on Duolingo, an app designed to gamify and entrance students interested in learning second languages. It’s a compelling product, and there is no comparative training system for engaging the full gamut of first responders.

Art delaCruz, COO and president of Team Rubicon, a non-profit which assembles teams of volunteer military veterans to respond to natural disasters, said that it’s an issue his organization is spending more time thinking about. “Part of resilience is education, and the ability to access information, and that is a gap that we continue to close on,” he said. “How do you present information that’s more simple than [a learning management system]?” He described the need for “knowledge bombs like flash cards” to regularly provide responders with new knowledge while testing existing ideas.

There’s also a need to scale up best practices rapidly across the world. Tom Cotter, director of emergency response and preparedness at Project Hope, a non-profit which empowers local healthcare workers in disaster-stricken and impoverished areas, said that in the context of COVID-19, “a lot of what was going to be needed [early on] was training — there were huge information gaps at the clinical level, how to communicate it at a community level.” The organization developed a curriculum with Brown University’s Watson Institute in the form of interactive PowerPoints that were ultimately used to train 100,000 healthcare workers on the new virus, according to Cotter.

When I look at the spectrum of edtech products existing today, one of the key peculiarities is just how narrowly each is focused. There are apps for language learning and for learning math and developing literacy. There are flash card apps like Anki that are popular among medical students, and more interactive approaches like Labster for science experiments and Sketchy for learning anatomy.
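Flash card apps like Anki rest on spaced repetition: the interval before a card is shown again grows with each successful recall. As a toy illustration only — a simplified take on the classic SM-2 scheduling algorithm, not Anki’s actual implementation — the core of such a scheduler fits in a few lines:

```python
from dataclasses import dataclass

@dataclass
class Card:
    interval: int = 1      # days until the next review
    ease: float = 2.5      # growth multiplier, adjusted by performance
    repetitions: int = 0   # consecutive successful recalls

def review(card: Card, quality: int) -> Card:
    """Update a card after a review; quality runs from 0 (forgot) to 5 (perfect)."""
    if quality < 3:
        # Failed recall: restart the schedule from a one-day interval
        card.repetitions = 0
        card.interval = 1
    else:
        card.repetitions += 1
        if card.repetitions == 1:
            card.interval = 1
        elif card.repetitions == 2:
            card.interval = 6
        else:
            card.interval = round(card.interval * card.ease)
        # Nudge ease up or down depending on how hard the recall felt
        card.ease = max(1.3, card.ease + 0.1 - (5 - quality) * (0.08 + (5 - quality) * 0.02))
    return card

card = Card()
for q in (5, 5, 4):
    card = review(card, q)
print(card.interval)  # 16 — the gap widens with each successful review
```

The same scheduling idea could push delaCruz’s “knowledge bombs” to responders at the moments they are most at risk of forgetting.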

Yet, for all the talk of boot camps in Silicon Valley, there is no edtech company that tries to completely transform a student in the way that a bona fide boot camp does. No startup has set out to holistically develop its students, adding in hard skills while also advancing the ability to handle stress, the improvisation needed to confront rapidly changing environments, and the skills needed to communicate with empathy.

Maybe that can’t be done with software. Maybe. Or perhaps, no founder has just had the ambition so far to go for broke — to really revolutionize how we think about training the next generation of emergency management professionals and everyone else in private industry who needs to handle stress or think on their feet just as much as frontline workers.

That’s the direction in which Bryce Stirton, president and co-founder of public-safety company Responder Corp, has been thinking. “Another area I am personally a fan of is the training space around VR,” he said. “It’s very difficult to synthesize these stressful environments,” in areas like firefighting, but new technologies have “the ability to pump the heart that you need to experience in training.” He concludes that “the VR world, it can have a large impact.”

Healing after disaster

When it comes to trauma, few fields face quite the challenge that emergency response does. It’s work that almost by definition forces its personnel to confront some of the most harrowing scenes imaginable. Death and destruction are a given, but what’s not always accounted for is the lack of agency first responders have in some of these contexts — the family that can’t be saved in time, so a 911 call taker has to offer final solace, or the paramedics who arrive on site without the right equipment.

Post-traumatic stress is perhaps the most well-known and common mental health condition facing first responders, although it is hardly the only one. How to ameliorate and potentially even cure these conditions represents a burgeoning area of investment and growth for a number of startups and investors.

Risk & Return, for instance, is a venture firm heavily focused on companies working on mental health as well as human performance more generally. In my profile of the firm a few weeks ago, managing director Jeff Eggers said that “We love that type of technology since it has that dual purpose: going to serve the first responder on the ground, but the community is also going to benefit.”

Two companies from its portfolio are useful to explore here as examples of different pathways in this category. The first is Alto Neuroscience, a stealthy startup founded by Amit Etkin, a multidisciplinary neuroscientist and psychiatrist at Stanford, to create new clinical treatments for post-traumatic stress and other conditions based on brainwave data. Given its therapeutic focus, it’s probably years before testing and regulatory approvals come through, but this sort of research is on the cutting edge of innovation here.

The second company is NeuroFlow, which is a software startup using apps to guide patients to better mental health outcomes. Through persistent polling, testing, and collaboration with practitioners, the company’s tools allow for more active monitoring of mental health — looking for emerging symptoms or relapses in even the most complicated cases. NeuroFlow is more on the clinical side, but there are obviously a wealth of wellness startups that have percolated in recent years as well like Headspace and Calm.

Outside of therapeutics and software though, there are entirely new frontiers around mental health in areas like psychedelics. That was one of the trends I called out as a top five area for investment in the 2020s earlier this year, and I stand by that. We’ve also covered a startup called Osmind, a clinical platform with a psychedelic focus for managing patients.

Risk & Return itself hasn’t made an investment in psychedelics yet, but Bob Kerrey, the firm’s board chairman and the former co-chair of the 9/11 Commission as well as former governor and senator of Nebraska, said that “it’s difficult to do this if you are the government, but easier to do this in the private sector.”

Similar to edtech, mental health startups might get their start in the first responder community, but they are hardly limited to this population. Post-traumatic stress and other mental health conditions affect wide swaths of the world’s population, and solutions that work in one community can often translate more broadly to others. It’s a massive, massive market, and one that could potentially transform the lives of millions of people for the better.

Before moving on, there’s one other area of interest here, and that is creating impactful communities for healing. First responders and military veterans experience a mission and camaraderie in their service that they often lack once they are in new jobs or on convalescence. DelaCruz of Team Rubicon says that one of the goals of bringing veterans to help in disaster regions is that the veterans themselves “reconnect with identity and community — we have these incredible assets in these men and women who have served.” It’s not enough to just find a single treatment per patient — we oftentimes need to zoom out to the wider population to see how mental health ripples out.

Helping people find purpose may not be the easiest challenge to solve as a startup, but it’s certainly a major challenge for many, and an area fermenting with new approaches now that the social networking wave has reached its nadir.

Crowdsourcing disaster response

Decentralization has been all the rage in tech in recent years — just mention the word blockchain in a TechCrunch article to get at least 50 PR emails about the latest NFT for a toilet stain. While there is obviously a lot of noise, one area where substance may pan out well is in disaster response.

If the COVID-19 pandemic showed anything, it was the power of the internet to aggregate as well as verify data, build dashboards, and deliver highly effective visualizations of complex information for professionals and laypeople alike. Those products were developed by people all around the world, often from the comfort of their own homes, and they demonstrate how crowds can quickly marshal serious labor to help respond to crises as they crop up.
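Many of those dashboards rested on simple, well-understood transformations. To make that concrete, here is a toy example (not any particular dashboard’s code) of the seven-day rolling average that became a staple of COVID-19 charts, smoothing out noisy daily counts and weekend reporting dips:

```python
def rolling_average(daily_counts, window=7):
    """Trailing rolling average over up to `window` days, rounded to one decimal.

    The canonical smoothing used on case-count dashboards: early entries
    average over however many days exist so far.
    """
    result = []
    for i in range(len(daily_counts)):
        w = daily_counts[max(0, i - window + 1): i + 1]
        result.append(round(sum(w) / len(w), 1))
    return result

cases = [10, 12, 8, 30, 25, 20, 22, 40, 35]
print(rolling_average(cases))
# [10.0, 11.0, 10.0, 15.0, 17.0, 17.5, 18.1, 22.4, 25.7]
```

The spike on day four barely registers in the smoothed series, which is exactly why dashboards lean on this transform for trend-spotting.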

Jonathan Sury, project director at the National Center for Disaster Preparedness at the Earth Institute at Columbia University, said that “COVID has really blown so much of what we think about out of the water.” With so many ways to collaborate online right now, “that’s what I would say is very exciting … and also practical and empowering.”

Clark-Ginsberg of RAND calls it the “next frontier of disaster management.” He argues that “if you can use technology to broaden the number of people who can participate in disaster management and respond to disasters,” then we might be reaching an entirely new paradigm for what effective disaster response will look like. “Formal structures [for professional frontline workers] have strengthened and that has saved lives and resources, but our ability to engage with everyday responders is still something to work on.”

Many of the tools that underpin these crowdsourced efforts don’t even focus on disasters. Sury pointed to Tableau and data visualization platform Flourish as examples of the kinds of tools that remote, lay first responders are using. There are now quite robust tools for tabular data, but we’re still relatively early in the development of tools for handling mapping data — obviously critical in the crisis context. Unfolded.ai, which I profiled earlier this year, is working on building scalable geospatial analytics in the browser. A lot more can be done here.
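One reason mapping data is harder than tabular data is that even basic questions — say, which crowd-sourced reports fall within a given radius of an incident — require geodesic math rather than a simple column filter. A minimal sketch of that primitive, with hypothetical sample reports:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometers."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))  # 6371 km = mean Earth radius

def within_radius(reports, center, radius_km):
    """Filter (name, lat, lon) reports to those within radius_km of center."""
    clat, clon = center
    return [name for name, lat, lon in reports
            if haversine_km(clat, clon, lat, lon) <= radius_km]

# Hypothetical reports: one near central Houston, one in Austin
reports = [("flooded road", 29.76, -95.36), ("shelter open", 30.27, -97.74)]
print(within_radius(reports, center=(29.75, -95.37), radius_km=25))
# ['flooded road']
```

Real geospatial platforms index millions of such points with spatial data structures, but the distance primitive above is where browser-based analytics like Unfolded.ai’s ultimately bottom out.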

Oftentimes there are ways to coordinate the coordinators. Develop for Good, which I looked at late last year, is a non-profit designed to connect enterprising computer science students to software and data projects at non-profits and agencies that needed help during the pandemic. Sometimes these coordinators are non-profit orgs, and sometimes, just very active Twitter accounts. There’s a lot more experimentation possible on how to coordinate efforts in a decentralized way while still engaging with professional first responders and the public sector.

Speaking of decentralization, it’s even possible that blockchain could play a role in disaster and crisis response. Many of these opportunities rest on using blockchain for evidence collection or for identity. For example, earlier this week Leigh Cuen took a careful look at an at-home sexual assault evidence collection kit from Leda Health that uses the blockchain to establish a clear time for when a sample was collected.
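The underlying idea is a tamper-evident log: each entry’s hash commits to the previous entry, so neither the contents nor the ordering of timestamps can be quietly rewritten after the fact. A minimal sketch of the concept — illustrative only, not Leda Health’s actual system:

```python
import hashlib
import json
import time

def append_entry(chain, record, timestamp=None):
    """Append a record whose hash commits to the previous entry."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    entry = {
        "record": record,
        "timestamp": timestamp if timestamp is not None else time.time(),
        "prev_hash": prev_hash,
    }
    # Deterministic serialization so verification recomputes the same bytes
    payload = json.dumps(entry, sort_keys=True).encode()
    entry["hash"] = hashlib.sha256(payload).hexdigest()
    chain.append(entry)
    return entry

def verify(chain):
    """Recompute every hash; any edit to content, time, or order fails."""
    for i, entry in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        payload = json.dumps(
            {k: entry[k] for k in ("record", "timestamp", "prev_hash")},
            sort_keys=True,
        ).encode()
        if entry["prev_hash"] != expected_prev:
            return False
        if entry["hash"] != hashlib.sha256(payload).hexdigest():
            return False
    return True

chain = []
append_entry(chain, "sample collected", timestamp=1620000000)
append_entry(chain, "sample sealed", timestamp=1620000300)
print(verify(chain))           # True
chain[0]["record"] = "edited"  # any tampering breaks verification
print(verify(chain))           # False
```

Anchoring the latest hash on a public blockchain is what upgrades this from “trust my database” to independently checkable evidence of when a sample was collected.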

There is a lot more potential to harness the power of crowdsourcing and decentralization, and many of these projects have applications far outside disaster management itself. These tools not only solve real problems — they provide real community to people who may not be related to the disaster itself, but are enthusiastic to do their part to help others.

The black swans of black swans

In terms of startups, the three markets I identified — better training, better mental health, and better crowdsourcing collaboration tools, particularly around data — collectively represent a very compelling set of markets that will not only be valuable for founders, but can rapidly improve lives.

In his book Normal Accidents, Charles Perrow describes how the increasing complexity and tight coupling of our modern technical systems all but guarantee that disasters will occur. Add in a warming world as well as the intensity, frequency, and just plain unusualness of the disasters arriving each year, and we are increasingly seeing entirely novel forms of emergencies we have never responded to before. Take, most recently, the ultra-frigid conditions in Texas that sapped power from its grid, leading to blackouts lasting hours and, in some parts of the state, days.

Clark-Ginsberg said, “We are seeing these risks emerge that aren’t just typical wildfires — where we have a response structure that we can easily setup and manage the hazard, [we’re] very good at managing these typical disasters. There are more of these atypical disasters cropping up, and we have a very hard time setting up structures for this — the pandemic is a great example of that.”

He describes these challenges as “trans-boundary risk management,” disasters that cross bureaucratic lines, professions, societies, and means of action. “It takes a certain agility and the ability to move quickly and the ability to work in ways outside typical bureaucratic structures, and that is just challenging full stop,” he said.

The Future of Technology and Disaster Response

Even as we begin to build better point solutions to the individual problems that disasters and their responses present, we can’t neglect the more systemic challenges that these emergencies are bringing to the fore. We have to start thinking about bringing humans together faster and in more novel ways to be the most effective, while coupling them flexibly and with agility to the best tools that meet their needs in the moment. That’s probably not literally “a startup,” but more a way of thinking about what it means to construct a disaster response fresh given the information available.

Amanda Levin, a policy analyst at the Natural Resources Defense Council, said that “even if we mitigate, there are huge pressures and huge impacts today from a warming world … even if we stop emissions today, [they] will still persist.” As one of my interviewees in government service who asked to go unnamed noted about disaster response, “You always are coming up short somewhere.” The problems are only getting harder, and we humans need much better tools to match the man-made trials we created for ourselves. That’s the challenge — and opportunity — for a tough century ahead.

When the Earth is gone, at least the internet will still be working

The internet is now our nervous system. We are constantly streaming and buying and watching and liking, our brains locked into the global information matrix as one universal and coruscating emanation of thought and emotion.

What happens when the machine stops though?

It’s a question that E.M. Forster was intensely focused on more than a century ago in a short story called, rightly enough, “The Machine Stops,” about a human civilization connected entirely through machines that one day just turn off.

Those fears of downtime are not just science fiction anymore. An outage no longer just means missing a must-watch TikTok clip. Hospitals, law enforcement, the government, every corporation — the entire spectrum of human institutions that constitute civilization now deeply relies on connectivity to function.

So when it comes to disaster response, the world has dramatically changed. In decades past, the singular focus could be roughly summarized as rescue and mitigation — save who you can while trying to limit the scale of destruction. Today though, the highest priority is by necessity internet access, not just for citizens, but increasingly for the on-the-ground first responders who need bandwidth to protect themselves, keep abreast of their mission objectives, and have real-time ground truth on where dangers lurk and where help is needed.

While the sales cycles might be arduous as we learned in part one and the data trickles have finally turned to streams in part two, the reality is that none of that matters if there isn’t connectivity to begin with. So in part three of this series on the future of technology and disaster response, we’re going to analyze the changing nature of bandwidth and connectivity and how they intersect with emergencies, taking a look at how telcos are creating resilience in their networks while defending against climate change, how first responders are integrating connectivity into their operations, and finally, exploring how new technologies like 5G and satellite internet will affect these critical activities.

Wireless resilience as the world burns

Climate change is inducing more intense weather patterns all around the world, creating second- and third-order effects for industries that rely on environmental stability for operations. Few industries have to adapt as dynamically to this changing context as telecom companies, whose wired and wireless infrastructure is regularly buffeted by severe storms. The resiliency of these networks isn’t just needed for consumers — it’s absolutely necessary for the very responders trying to mitigate disasters and get the network back up in the first place.

Unsurprisingly, no issue looms larger for telcos than access to power — no juice, no bars. So all three of America’s major telcos — Verizon (which owns TechCrunch’s parent company Verizon Media, although not for much longer), AT&T and T-Mobile — have had to dramatically scale up their resiliency efforts in recent years to compensate both for the demand for wireless and the growing damage wrought by weather.

Jay Naillon, senior director of national technology service operations strategy at T-Mobile, said that the company has made resilience a key part of its network buildout in recent years, with investments in generators at cell towers that can be relied upon when the grid cannot. In “areas that have been hit by hurricanes or places that have fragile grids … that is where we have invested most of our fixed assets,” he said.

Like all three telcos, T-Mobile pre-deploys equipment in anticipation of disruptions. So when a hurricane begins to swirl in the Atlantic Ocean, the company will strategically fly in portable generators and mobile cell towers ahead of potential outages. “We look at storm forecasts for the year,” Naillon explained, and do “lots of preventative planning.” They also work with emergency managers and “run through various drills with them and respond and collaborate effectively with them” to determine which parts of the network are most at risk for damage in an emergency. Last year, the company partnered with StormGeo to accurately predict weather events.

Predictive AI for disasters is also a critical need for AT&T. Jason Porter, who leads public sector and the company’s FirstNet first-responder network, said that AT&T teamed up with Argonne National Laboratory to create a climate-change analysis tool to evaluate the siting of its cell towers and how they will weather the next 30 years of “floods, hurricanes, droughts and wildfires.” “We redesigned our buildout … based on what our algorithms told us would come,” he said, and the company has been elevating vulnerable cell towers four to eight feet high on “stilts” to improve their resiliency to at least some weather events. That “gave ourselves some additional buffer.”

AT&T has also had to manage the growing complexity of creating reliability amid the chaos of a climate-change-induced world. In recent years, “we quickly realized that many of our deployments were due to weather-related events,” and the company has been “very focused on expanding our generator coverage over the past few years,” Porter said. It’s also been very focused on building out its portable infrastructure. “We essentially deploy entire data centers on trucks so that we can stand up essentially a central office,” he said, emphasizing that the company’s national disaster recovery team responded to thousands of events last year.

Particularly on its FirstNet service, AT&T has pioneered two new technologies to try to get bandwidth to disaster-hit regions faster. First, it has invested in drones to offer wireless services from the sky. After Hurricane Laura hit Louisiana last year with record-setting winds, our “cell towers were twisted up like recycled aluminum cans … so we needed to deploy a sustainable solution,” Porter described. So the company deployed what it dubs the FirstNet One — a “dirigible” that “can cover twice the cell coverage range of a cell tower on a truck, and it can stay up for literally weeks, refuel in less than an hour and go back up — so long-term, sustainable coverage,” he said.

AT&T’s FirstNet One dirigible to offer internet access from the air for first responders. Image Credits: AT&T/FirstNet

Secondly, the company has been building out what it calls FirstNet MegaRange — a set of high-powered wireless equipment that it announced earlier this year that can deploy signals from miles away, say from a ship moored off a coast, to deliver reliable connectivity to first responders in the hardest-hit disaster zones.

As the internet has absorbed more of daily life, the norms for network resilience have become ever more exacting. Small outages can disrupt not just a first responder, but a child taking virtual classes and a doctor conducting remote surgery. From fixed and portable generators to rapid-deployment mobile cell towers and dirigibles, telcos are investing major resources to keep their networks running continuously.

Yet, these initiatives are ultimately costs borne by telcos increasingly confronting a world burning up. Across conversations with all three telcos and others in the disaster response space, there was a general sense that utilities increasingly have to insulate themselves in a climate-changed world. For instance, cell towers need their own generators because — as we saw with Texas earlier this year — even the power grid itself can’t be guaranteed to be there. Critical applications need to have offline capabilities, since internet outages can’t always be prevented. The machine runs, but the machine stops, too.

The trend lines on the frontlines are data lines

While we may rely on connectivity in our daily lives as consumers, disaster responders have been much more hesitant to fully transition to connected services. It is precisely when you’re in the middle of a tornado and the cell tower is down that you realize a printed map might have been nice to have. Paper, pens, compasses — the old staples of survival flicks remain just as important in the field today as they were decades ago.

Yet, the power of software and connectivity to improve emergency response has forced a rethinking of field communications and how deeply technology is integrated on the ground. Data from the frontlines is extremely useful, and if it can be transmitted, dramatically improves the ability of operations planners to respond safely and efficiently.

Both AT&T and Verizon have made large investments in directly servicing the unique needs of the first responder community, with AT&T in particular gaining prominence with its FirstNet network, which it exclusively operates through a public-private partnership with the Department of Commerce’s First Responder Network Authority. The government offered a special spectrum license to the FirstNet authority in Band 14 in exchange for the buildout of a responder-exclusive network, a key recommendation of the 9/11 Commission, which found that first responders couldn’t communicate with each other on the day of those deadly terrorist attacks. Now, Porter of AT&T says that the company’s buildout is “90% complete” and is approaching 3 million square miles of coverage.

Why so much attention on first responders? The telcos are investing here because in many ways, the first responders are on the frontiers of technology. They need edge computing, AI/ML rapid decision-making, the bandwidth and latency of 5G (which we will get to in a bit), high reliability, and in general, are fairly profitable customers to boot. In other words, what first responders need today are what consumers in general are going to want tomorrow.

Cory Davis, director of public safety strategy and crisis response at Verizon, explained that “more than ever, first responders are relying on technology to go out there and save lives.” His counterpart, Nick Nilan, who leads product management for the public sector, said that “when we became Verizon, it was really about voice [and] what’s changed over the last five [years] is the importance of data.” He brings attention to tools for situational awareness, mapping, and more that are becoming standard in the field. Everything first responders do “comes back to the network — do you have the coverage where you need it, do you have the network access when something happens?”

The challenge for the telcos is that we all want access to that network when catastrophe strikes, which is precisely when network resources are most scarce. The first responder trying to communicate with their team on the ground or their operations center is inevitably competing with a citizen letting friends know they are safe — or perhaps just watching the latest episode of a TV show in their vehicle as they are fleeing the evacuation zone.

That competition is the argument for a completely segmented network like FirstNet, which has its own dedicated spectrum with devices that can only be used by first responders. “With remote learning, remote work and general congestion,” Porter said, telcos and other bandwidth providers were overwhelmed with consumer demand. “Thankfully we saw through FirstNet … clearing that 20 MHz of spectrum for first responders” helped keep the lines clear for high-priority communications.

FirstNet’s big emphasis is on its dedicated spectrum, but that’s just one component of a larger strategy to give first responders always-on and ready access to wireless services. AT&T and Verizon have made prioritization and preemption key operational components of their networks in recent years. Prioritization gives public safety users better access to the network, while preemption goes further, actively kicking lower-priority consumers off the network to ensure first responders have immediate access.

Nilan of Verizon said, “The network is built for everybody … but once we start thinking about who absolutely needs access to the network at a period of time, we prioritize our first responders.” Verizon has prioritization, preemption, and now virtual segmentation — “we separate their traffic from consumer traffic” so that first responders don’t have to compete if bandwidth is limited in the middle of a disaster. He noted that all three approaches have been enabled since 2018, and Verizon’s suite of bandwidth and software for first responders comes under the newly christened Verizon Frontline brand that launched in March.
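Conceptually, prioritization and preemption amount to admission control on a capacity-limited cell: when the cell is full, a newcomer gets in only by outranking, and evicting, the least important active user. Here is a deliberately simplified toy model of that logic — not how any carrier actually implements it:

```python
import heapq

class CellScheduler:
    """Toy model of priority admission and preemption on a capacity-limited cell.

    Lower priority number = more important (0 = first responder).
    The heap keeps the *least* important active user on top via negation.
    """
    def __init__(self, capacity):
        self.capacity = capacity
        self.active = []  # entries are (-priority, user)

    def request(self, user, priority):
        if len(self.active) < self.capacity:
            heapq.heappush(self.active, (-priority, user))
            return "admitted"
        # Cell full: preempt the least important active user only if
        # the newcomer outranks them; otherwise reject the request.
        worst_priority = -self.active[0][0]
        if priority < worst_priority:
            _, evicted = heapq.heapreplace(self.active, (-priority, user))
            return f"admitted, preempted {evicted}"
        return "rejected"

cell = CellScheduler(capacity=2)
print(cell.request("commuter", priority=5))   # admitted
print(cell.request("streamer", priority=5))   # admitted
print(cell.request("paramedic", priority=0))  # admitted, preempted commuter
print(cell.request("tourist", priority=5))    # rejected
```

Real networks layer this over dedicated spectrum, virtual segmentation, and quality-of-service classes, but the essential asymmetry — the paramedic gets in, the streamer gets bumped — is captured above.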

With increased bandwidth reliability, first responders are increasingly connected in ways that even a decade ago would have been unfathomable. Tablets, sensors, connected devices and tools — equipment that would have been manual are now increasingly digital.

That opens up a wealth of possibilities now that the infrastructure is established. My interview subjects suggested applications as diverse as the decentralized coordination of response team movements through GPS and 5G; real-time updated maps that offer up-to-date risk analysis of how a disaster might progress; pathfinding for evacuees that’s updated as routes fluctuate; AI damage assessments even before the recovery process begins; and much, much more. In fact, many of those possibilities, which have only ever been marketing-speak and technical promises in the past, may finally be realized in the coming years.

Five, Gee

We’ve been hearing about 5G for years now, and even 6G every once in a while just to cause reporters heart attacks, but what does 5G even mean in the context of disaster response? After years of speculation, we are finally starting to get answers.

Naillon of T-Mobile noted that the biggest benefit of 5G is that it “allows us to have greater coverage” particularly given the low-band spectrum that the standard partially uses. That said, “As far as applications — we are not really there at that point from an emergency response perspective,” he said.

Meanwhile, Porter of AT&T said that “the beauty of 5G that we have seen there is less about the speed and more about the latency.” Consumers have often seen marketing around voluminous bandwidths, but in the first-responder world, latency and edge computing tend to be the most desirable features. For instance, devices can relay video to each other on the frontlines, without necessarily needing a backhaul to the main wireless network. On-board processing of image data could allow for rapid decision-making in environments where seconds can be vital to the success of a mission.
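To make the latency point concrete, here is a toy decision rule for where to run image processing; the millisecond figures are invented for the example, not measured network latencies:

```python
# Process frames on-device when a round trip through the core network
# would blow the mission's latency budget. All numbers are illustrative.
EDGE_MS = 15       # assumed on-board inference time
BACKHAUL_MS = 120  # assumed round trip to the core network plus inference

def where_to_process(deadline_ms):
    # Prefer the core (more compute available) unless the deadline
    # rules out the backhaul round trip.
    return "edge" if deadline_ms < BACKHAUL_MS else "core"

decision = where_to_process(50)  # a time-critical detection task
```

The bandwidth of the link barely enters into it: a frame that must be acted on in 50 ms cannot wait on a 120 ms round trip, no matter how fat the pipe.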

That flexibility is allowing for many new applications in disaster response, and “we are seeing some amazing use cases coming out of our 5G deployments [and] we have launched some of our pilots with the [Department of Defense],” Porter said. He offered an example of “robotic dogs to go and do bomb dismantling or inspecting and recovery.”

Verizon has made innovating on new applications a strategic goal, launching a 5G First Responders Lab dedicated to guiding a new generation of startups to build at this crossroads. Nilan of Verizon said that the incubator has had more than 20 companies across four different cohorts, working on everything from virtual reality training environments to AR applications that allow firefighters to “see through walls.” His colleague Davis said that “artificial intelligence is going to continue to get better and better and better.”

Blueforce is a company that went through the first cohort of the Lab. The company uses 5G to connect sensors and devices together to allow first responders to make the best decisions they can with the most up-to-date data. Michael Helfrich, founder and CEO, said that “because of these new networks … commanders are able to leave the vehicle and go into the field and get the same fidelity” of information that they normally would have to be in a command center to receive. He noted that in addition to classic user interfaces, the company is exploring other ways of presenting information to responders. “They don’t have to look at a screen anymore, and [we’re] exploring different cognitive models like audio, vibration and heads-up displays.”

5G will offer many new ways to improve emergency responses, but that doesn’t mean that our current 4G networks will just disappear. Davis said that many sensors in the field don’t need the kind of latency or bandwidth that 5G offers. “LTE is going to be around for many, many more years,” he said, pointing to the hardware and applications taking advantage of LTE-M standards for Internet of Things (IoT) devices as a key development for the future here.

Michael Martin of emergency response data platform RapidSOS said that “it does feel like there is renewed energy to solve real problems” in the disaster response market, an energy he dubbed the “Elon Musk effect.” And that effect definitely does exist when it comes to connectivity, where SpaceX’s satellite bandwidth project Starlink comes into play.

Satellite uplinks have historically had horrific latency and bandwidth constraints, making them difficult to use in disaster contexts. Furthermore, depending on the particular type of disaster, satellite uplinks can be astonishingly challenging to set up given the ground environment. Starlink promises to shatter all of those barriers — easier connections, fat pipes, low latencies and a global footprint that would be the envy of any first responder globally. Its network is still under active development, so it is difficult to foresee today precisely what its impact will be on the disaster response market, but it’s an offering to watch closely in the years ahead, because it has the potential to completely upend the way we respond to disasters this century if its promises pan out.

Yet, even if we discount Starlink, the change coming this decade in emergency response represents a complete revolution. The depth and resilience of connectivity is changing the equation for first responders from complete reliance on antiquated tools to an embrace of the future of digital computing. The machine is no longer stoppable.


Startup employees should pay attention to Biden’s capital gains tax plans

The Biden Administration has reportedly proposed significant changes to the capital gains tax, aiming to target the wealthiest Americans to help fund his historic aid programs.

If the current proposal goes into effect, it will have an impact on startup employees who aren’t (yet) wealthy. And it’s unlikely the Biden Administration has considered the consequences, because many of these employees aren’t yet in the highest tax bracket. But startup employees need to pay close attention to these changes when planning what to do with their stock options.

We don’t yet know what will end up in a passed bill, which may look very different from the originally proposed plan. This shouldn’t cause alarm for employees or cause them to avoid exercising options, but it is something they should be thinking about when planning their equity strategy.

As always, employees should work with their advisers to plan accordingly and get ahead of any changes.

How changes in capital gains tax impact startup stock options

Historically, long-term capital gains, or gains on assets held for over a year, have enjoyed preferential tax rates in comparison to short-term capital gains, which are assets held for less than a year. In Biden’s original proposal, he suggests raising the long-term capital gains rate to the highest ordinary income tax rate on income over $1 million.

If Biden’s changes are enacted, it means that there would no longer be preferential tax rates for those who make over $1 million on the sale of their shares post-IPO or as part of an acquisition. Many employees “go long” with their equity, selling a year after exercising to benefit from the long-term capital gains rate. Under this change, they may be limited in the amount of upside they can convert to preferentially taxed capital gains, depending on their income levels and when they sell.
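A simplified, hypothetical back-of-the-envelope shows the stakes. Assume a flat 20% long-term rate today, a 39.6% top ordinary rate under the proposal, and that the $1 million threshold applies to the gain alone; all of these are assumptions, and the real rules (NIIT, state taxes, how other income counts toward the threshold) remain open questions:

```python
# Hypothetical illustration only: flat rates, threshold applied to the
# gain alone, and no NIIT, state tax or bracket detail. All of these
# are assumptions, not the actual proposal's mechanics.
LTCG_RATE = 0.20        # assumed current top long-term capital gains rate
TOP_ORDINARY = 0.396    # assumed proposed top ordinary income rate
THRESHOLD = 1_000_000

def proposed_ltcg_tax(gain, other_income=0):
    # Gain below the $1M threshold keeps the preferential rate; the
    # remainder is taxed like ordinary income under the proposal.
    preferential = max(0, min(gain, THRESHOLD - other_income))
    ordinary = gain - preferential
    return preferential * LTCG_RATE + ordinary * TOP_ORDINARY

gain = 1_500_000
current_tax = gain * LTCG_RATE          # ~$300,000 today
proposed_tax = proposed_ltcg_tax(gain)  # ~$398,000 under the proposal
```

On these assumptions, a $1.5 million long-term gain goes from roughly $300,000 of tax to roughly $398,000, while a gain kept under the threshold is untouched.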

As with any tax legislation, the devil is in the details, many of which are still to be determined. These are the questions employees should be asking if the legislation moves forward:

  • Is the first $1 million in capital gains still taxed at preferential rates or do I factor in other sources of income to determine the $1 million threshold?
  • How can I plan around the sale of my shares to stay under the $1 million threshold?
  • Is there any impact on qualified small business stock (QSBS)?

Clarity on these questions and details of the plan will provide critical information for employees looking to exercise if Biden’s tax plan advances in Congress.

Capital gains tax rules have always been political

Many presidents have expressed interest in changing capital gains laws. President Obama, for example, wanted to raise the capital gains tax. President Trump campaigned on capital gains rules as well, suggesting that carried interest, which is made possible by preferential capital gains treatment, be eliminated.

The questions now are: Will Biden be successful in addressing capital gains tax rules? And will the Democrats risk backlash or potential downsides driven by an increased capital gains tax? Many experts suggest that the final legislation, if passed, will result in a capital gains tax increase, but one much smaller than Biden’s original proposal. Some are suggesting Congress will settle on no more than 30% as the highest capital gains rate for those who earn more than $1 million.

Only time will tell, but the suggested tax plan may create a significant, if unintended, burden on startup employees more than anyone else.

Planning around your equity

There’s still a lot of uncertainty around what new tax legislation may look like or if it will happen at all. At this point, startup employees may not necessarily need to act on these potential changes, but they should be taking it into account when planning what to do with their equity and, more specifically, when they are planning to exercise.

Either way, employees should still strongly consider exercising their stock options (it’s a key benefit of working at a startup, after all). Taxes are just one consideration. For example, many companies have exercise deadlines after employees leave a company.

Even if capital gains tax rates change, exercising early may still have its benefits, as many employees may still be able to create a plan to sell up to a certain number of shares at preferential rates every year.

While Biden’s proposed plan is focused on changing the federal tax rates, state income tax considerations remain. Startup employees have been moving away from high-tax states such as California and New York in favor of no-income-tax states such as Texas and Florida. Those who are planning a move may have a big incentive to exercise their options to limit California and New York’s reach on the shares.

It’s important that employees understand the advantages and disadvantages of exercising today versus waiting until after an IPO. When it comes to employee equity, the worst decision is always not having a plan of action.

80% of the 22 million comments on net neutrality rollback were fake, investigation finds

Of the 22 million comments submitted to the FCC regarding 2017’s controversial rollback of net neutrality, some 18 million were fake, an investigation by the New York Attorney General’s office has found. The broadband industry funded the fraudulent creation of about 8.5 million of those, while a 19-year-old college student submitted 7.7 million, and the remainder came from unknown but spurious sources.

The damning report, issued today, is the result of years of work; the attorney general’s office set up a tip line early on so people could report fraudulent comments, and no doubt received plenty, as people were already independently finding their own names, those of dead relatives, and other obviously fake submissions in the record.

It turns out that a huge number of these comments were paid for by a consortium of broadband companies called Broadband for America, which laid out about $4.2 million for the purpose. They contracted with several “lead generator” companies, the kind of shady operations that offer you free trials of “male enhancement pills” or the like if you fill out a form — in this case, asking the person to write an anti-net-neutrality comment.

As if that wasn’t bad enough, the lead generation companies didn’t even bother plying their shady trade in what passes for an honest way; instead they fabricated the lists and comments with years-old data and in one case with identities stolen in a major data breach. The practice was near universal:

In all, six lead generators funded by the broadband industry engaged in fraud. As a result, nearly every comment and message the broadband industry submitted to the FCC and Congress was fake, signed using the names and addresses of millions of individuals without their knowledge or consent.

The broadband companies are off the hook on a technicality, since they were careful to firewall themselves from the practices of those they were contracting with, even though the record shows it was plain that the information being collected and used was fraudulent. But because the actions were, ostensibly, independently taken by the enterprising lead generators, the buck stops there.

Notably, these scams were also involved in more than a hundred other advocacy campaigns, including submitting over a million fake comments for an EPA proceeding and millions of other letters and digital comments.

The wholesale undermining of the processes of government earned fines of $3.7M, $550K, and $150K for Fluent Inc, React2Media, and Opt-Intelligence respectively. There are also “comprehensive reforms” imposed on them, though it may be best not to expect much from those.

Internet rights advocacy organization Fight for the Future issued a king-size “I told you so” noting that they had flagged this process at the time and helped bring it to the attention of both government officials and ordinary folks.

Another 7.7 million fake comments were submitted by a single person, a California college student who simply combined a fake name generation site with a disposable email service to produce plausible identities. The student automated an individual comment submission process, and somehow the FCC’s systems didn’t flag it. Another unknown person used similar means to submit another 1.6 million fake comments.

Acting FCC Chairwoman Jessica Rosenworcel said in a statement that “Today’s report demonstrates how the record informing the FCC’s net neutrality repeal was flooded with fraud. This was troubling at the time because even then the widespread problems with the record were apparent. We have to learn from these lessons and improve because the public deserves an open and fair opportunity to tell Washington what they think about the policies that affect their lives.”

Indeed at the time Rosenworcel suggested delaying the vote, joining many in the country who felt the scale of the shenanigans warranted further investigation — but then-Chairman Ajit Pai brushed aside their concerns, one of many decisions that have considerably tarnished his legacy.

Altogether it’s a pretty sad situation, and the broadband companies and their lobbyists get off without so much as a slap on the wrist. The NY AG report has a variety of recommendations, some of which no doubt have already been implemented or suggested as the FCC’s comment debacle became clear, but the bad guys definitely won this time.

China expresses concern over its absence in India’s 5G trials

China expressed concern on Wednesday over India’s move to not grant any Chinese firm permission to participate in 5G trials in the world’s second largest internet market as the two neighboring nations struggle to navigate business ties amid their geo-political tensions.

India’s Department of Telecommunications earlier this week approved over a dozen firms’ applications to conduct a six-month trial to test the use and application of 5G technology in the country.

Among those that have received approval are international giants such as Ericsson, Nokia, and Samsung, which will collaborate with Indian telecom operators Jio Platforms, Airtel, Vodafone Idea, and MTNL for the trial.

Huawei, ZTE and other Chinese companies, which have been operating in India for several years, haven’t received approval from the Indian government to participate in the upcoming trial. The Indian ministry said earlier this week that it granted permission to those firms that had been picked by the telecom operators.

Wang Xiaojian, the spokesperson of Chinese Embassy in India, said in a statement on Wednesday that the nation expresses “concern and regret that Chinese telecommunications companies have not been permitted to conduct 5G trials with Indian Telecom Service Providers in India.”

“Relevant Chinese companies have been operating in India for years, providing mass job opportunities and making contribution to India’s infrastructure construction in telecommunications. To exclude Chinese telecommunications companies from the trials will not only harm their legitimate rights and interests, but also hinder the improvement of the Indian business environment, which is not conducive to the innovation and development of related Indian industries,” added Wang.

Last year, Airtel (India’s second-largest telecom operator) had said that it was open to collaborating with global technology firms, including those from China, for components. “Huawei, over the last 10 or 12 years, has become extremely good with their products to a point where I can safely today say their products at least in 3G, 4G that we have experienced is significantly superior to Ericsson and Nokia without a doubt. And I use all three of them,” Sunil Mittal, the founder of Airtel, said at a conference last year.

In the same panel, then U.S. commerce secretary Wilbur Ross had urged India and other allies of the U.S. to avoid Huawei.

The geo-political tension between India and China escalated last year with skirmishes at the shared border. India, which early last year amended a rule to make it difficult for Chinese firms to invest in Indian companies, has since banned over 200 apps including TikTok, UC Browser and PUBG Mobile that have ties with China over national security concerns.

India’s move earlier this week follows similar decisions taken by the U.S., U.K. and Australia, all of which have expressed concerns about Huawei and ZTE and their ties with the Chinese government.

“The Chinese side hopes that India could do more to enhance mutual trust and cooperation between the two countries, and provide an open, fair, just, and non-discriminatory investment and business environment for market entities from all countries, including China, to operate and invest in India,” wrote Wang.

Last year, China had expressed “serious concerns” and “firmly opposed” India’s charges that Chinese apps posed national security concerns. The Chinese Embassy had alleged that by banning apps with links to China, New Delhi was engaging in “discriminatory practices” that “violated WTO rules.”

Data was the new oil, until the oil caught fire

We’ve been hearing how “data is the new oil” for more than a decade now, and in certain sectors, it’s a maxim that has more than panned out. From marketing and logistics to finance and product, decision-making is now dominated by data at all levels of most big private orgs (and if it isn’t, I’d be getting a résumé put together, stat).

So it might be something of a surprise to learn that data, which could transform how we respond to the increasingly deadly disasters that regularly plague us, has been all but absent from much of emergency response this past decade. Far from tapping a geyser of digital oil, disaster response agencies and private organizations alike have for years tried to swell the scope and scale of the data being fed into disaster response, with relatively meager results.

That’s starting to change, though, mostly thanks to the internet of things (IoT), and frontline crisis managers today increasingly have the data they need to make better decisions across the resilience, response, and recovery cycle. The best is yet to come: with drones overhead, simulated visualizations, and artificial intelligence-assisted predictions, what we’re seeing today on the frontlines is only the beginning of what could be a revolution in disaster response in the 2020s.

The long-awaited disaster data deluge has finally arrived

Emergency response is a fight against the fog of war and the dreadful ticking of the clock. In the midst of a wildfire or hurricane, everything can change in a matter of seconds — even milliseconds if you aren’t paying attention. Safe roads ferrying evacuees can suddenly become impassable infernos, evacuation teams can reposition and find themselves spread far too thin, and unforeseen conditions can rapidly metastasize to cover the entire operating environment. An operations center that once had perfect information can quickly find it has no ground truth at all.

Unfortunately, even getting raw data on what’s happening before and during a disaster can be extraordinarily difficult. When we look at the data revolution in business, part of the early success stems from the fact that companies were always heavily reliant on data to handle their activities. Digitalization was and is the key word: moving from paper to computers in order to transform latent raw data into a form that was machine-readable and therefore analyzable. In business, the last ten years were basically an upgrade from version one to version two.

In emergency management, however, many agencies are stuck without a version at all. Take a flood — where is the water and where is it going? Up until recently, there was no comprehensive data on where waters rose from and where they sloshed to. When it came to wildfires, there were no administrative datasets on where every tree in the world was located and how prone each is to fire. Even human infrastructure like power lines and cell towers often had little interface with the digital world. They stood there, and if you couldn’t see them, they couldn’t see you.

Flood modeling is on the cutting edge of disaster planning and response. Image Credits: CHANDAN KHANNA/AFP via Getty Images

Models, simulations, predictions, analysis: all of these are useless without raw data, and in the disaster response realm, there was no detailed data to be found.

After years of promising an Internet of Things (IoT) revolution, things are finally internet-izing, with IoT sensors increasingly larding up the American and world landscape. Temperature, atmospheric pressure, water levels, humidity, pollution, power, and other sensors have been widely deployed, emitting constant streams of data back into data warehouses ready for analysis.

Take wildfires in the American West. It wasn’t all that long ago that the U.S. federal government and state firefighting agencies had no knowledge of where a blaze was taking place. Firefighting has been “100 years of tradition unimpeded by progress,” as Tom Harbour, head of fire response for a decade at the U.S. Forest Service and now chief fire officer at Cornea, put it.

And he’s right. After all, firefighting is a visceral activity — responders can see the fires, even feel the burning heat echoing off of their flesh. Data wasn’t useful, particularly in the West where there are millions of acres of land and large swaths are sparsely populated. Massive conflagrations could be detected by satellites, but smoldering fires in the brush would be entirely invisible to the geospatial authorities. There’s smoke over California — exactly what is a firefighter on the ground supposed to do with such valuable information?

Today, after a decade of speculative promise, IoT sensors are starting to clear a huge part of this fog. Aaron Clark-Ginsberg, a social scientist at RAND Corporation who researches community resilience, said that air quality sensors have become ubiquitous since they are “very cheap [and] pretty easy to use” and can offer very fine-grained understandings of pollution — a key signal, for instance, of wildfires. He pointed to the company Purple Air, which, in addition to making sensors, also produces a popular consumer map of air quality, as indicative of the technology’s potential these days.

Maps are the critical intersection for data in disasters. Geospatial information systems (GIS) form the basis for most planning and response teams, and no company has a larger footprint in the sector than privately held Esri. Ryan Lanclos, who leads public safety solutions at the company, pointed to the huge expansion of water sensors as radically changing responses to certain disasters. “Flood sensors are always pulsing,” he said, and with a “national water model coming out of the federal government,” researchers can now predict through GIS analysis how a flood will affect different communities with a precision unheard of previously.
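The underlying computation is simple to sketch: join sensor readings to nearby communities and flag exceedances. The sensors, coordinates and thresholds below are invented, and real GIS analysis uses hydrological models and census data rather than a fixed radius:

```python
from math import hypot

# Toy version of the idea: flag communities near water sensors reading
# above flood stage. All data here is made up for illustration.
FLOOD_STAGE_M = 3.0
ALERT_RADIUS_KM = 5.0

sensors = [  # (x_km, y_km, water_level_m)
    (0.0, 0.0, 3.4),
    (12.0, 4.0, 1.1),
]
communities = {"Riverside": (2.0, 1.0), "Hilltop": (14.0, 4.0)}

def at_risk(communities, sensors):
    # Keep only sensors above flood stage, then flag any community
    # within the alert radius of one of them.
    flooded = [(x, y) for x, y, lvl in sensors if lvl >= FLOOD_STAGE_M]
    return sorted(
        name for name, (cx, cy) in communities.items()
        if any(hypot(cx - x, cy - y) <= ALERT_RADIUS_KM for x, y in flooded)
    )

alerts = at_risk(communities, sensors)  # ["Riverside"]
```

However the model is refined, the shape of the computation stays the same: continuously pulsing readings are joined to places, and exceedances become alerts.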

Digital maps and GIS systems are increasingly vital for disaster planning and response, but paper still remains quite ubiquitous. Image Credits: Paul Kitagaki Jr.-Pool/Getty Images

Cory Davis, the director of public safety strategy and crisis response at Verizon (which, through our parent company Verizon Media, is TechCrunch’s ultimate owner), said that all of these sensors have transformed how crews work to maintain infrastructure as well. “Think like a utility that is able to put a sensor on a power line — now they have sensors and get out there quicker, resolve it, and get the power back up.”

He noted one major development that has transformed sensors in this space the last few years: battery life. Thanks to continuous improvements in ultra-low-power wireless chips as well as better batteries and energy management systems, sensors can last a really long time in the wilderness without the need for maintenance. “Now we have devices that have ten-year battery lives,” he said. That’s critical, because it can be impossible to connect these sensors to the power grid in frontier areas.

The same line of thinking holds true at T-Mobile as well. When it comes to preventative planning, Jay Naillon, senior director of national technology service operations strategy at the telco, said that “the type of data that is becoming more and more valuable for us is the storm surge data — it can make it easier to know we have the right assets in place.” That data comes from flood sensors that can offer real-time warning signals to planners across the country.

Telecom interest — and commercial interest in general — has been critical to accelerating the adoption of sensors and other data streams around disasters. While governments may be the logical end user of flood or wildfire data, they aren’t the only ones interested in this visibility. “A lot of consumers of that information are in the private sector,” said Jonathan Sury, project director at the National Center for Disaster Preparedness at the Earth Institute at Columbia University. “These new types of risks, like climate change, are going to affect their bottom lines,” and he pointed to bond ratings, insurance underwriting and other areas where commercial interest in sensor data has been profound.

Sensors may not literally be ubiquitous, but they have offered a window into conditions that emergency managers never had visibility into before.

Finally, there are the extensive datasets around mobile usage that have become available throughout much of the world. Facebook’s Data for Good project, for instance, provides data layers around connectivity — are users connecting from one place and then later connecting from a different location, indicating displacement? That sort of data from the company and telcos themselves can help emergency planners scout out how populations are shifting in real time.
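That displacement signal boils down to comparing where users connect across time windows. A minimal sketch with invented, pre-anonymized records (real programs like Data for Good aggregate heavily before anything is shared):

```python
from collections import Counter

# Invented, already-anonymized connection records: user -> region
# connected from, before and after a hypothetical event.
before = {"u1": "coastal", "u2": "coastal", "u3": "inland"}
after  = {"u1": "inland",  "u2": "coastal", "u3": "inland"}

def displacement_flows(before, after):
    # Count origin -> destination moves for users seen in both windows;
    # users connecting from the same region are treated as not displaced.
    return Counter(
        (before[u], after[u])
        for u in before.keys() & after.keys()
        if before[u] != after[u]
    )

flows = displacement_flows(before, after)  # {("coastal", "inland"): 1}
```

Aggregated over millions of users, flows like these let planners see evacuations and returns as they happen rather than weeks later.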

Data, data, on the wall — how many AIs can they call?

Rivulets of data have now turned into floods of information, but just like floodwaters rising in cities across the world, the data deluge now needs a response all its own. In business, the surfeit of big data has been wrangled with an IT stack from data warehouses all the way to business intelligence tools.

If only data for disasters could be processed so easily. Data relevant to disasters is held by dozens of different organizations spanning the private, public, and non-profit sectors, leading to huge interoperability problems. Even when the data can be harmonized, there are large challenges in summarizing the findings down to an actual decision a frontline responder can use in their work — making AI a tough sell still today, particularly outside of planning. As Davis of Verizon put it, “now that they have this plethora of data, a lot of cities and federal agencies are struggling with how to use it.”

Unfortunately, standardization is a challenge at all scales. Globally, countries mostly lack interoperability, although standards are improving over time. Amir Elichai, the founder and CEO of 911 call-handling platform Carbyne, said that “from a technology standpoint and a standards standpoint, there is a big difference between countries,” noting that protocols from one country often have to be completely rewritten to serve a different market.

Tom Cotter, director of emergency response and preparedness at health care disaster response organization Project HOPE, said that even setting up communications between responders can be challenging in an international environment. “Some countries allow certain platforms but not others, and it is constantly changing,” he said. “I basically have every single technology communication platform you can possibly have in one place.”

One senior federal emergency management official acknowledged that data portability has become increasingly key in procurement contracts for technology, with the government recognizing the need to buy commercially-available software rather than custom-designed software. That message has been picked up by companies like Esri, with Lanclos stating that “part of our core mission is to be open and … create data and to share that openly to the public or securely through open standards.”

For all its downsides though, the lack of interoperability can be ironically helpful for innovation. Elichai said that the “lack of standards is an advantage — you are not buying into a legacy standard,” and in some contexts where standards are lacking, quality protocols can be built with the assumption of a modern data workflow.

Even with interoperability though, the next challenge becomes data sanitation — and disaster data is dirty as … well, something. While sensor streams can be verified and cross-checked with other datasets, in recent years there has been a heavy increase in the quantity of citizen-submitted information that has to be carefully vetted before it is disseminated to first responders or the public.

With citizens having more access to smartphones than ever, emergency planners have to sanitize uploaded data in order to verify it and make it useful. Image Credits: TONY KARUMBA/AFP via Getty Images

Bailey Farren, CEO and co-founder of disaster communications platform Perimeter, said that “sometimes citizens have the most accurate and real-time information, before first responders show up — we want citizens to share that with …government officials.” The challenge is how to filter the quality goods from the unhelpful or malicious. Raj Kamachee, the CIO of Team Rubicon, a non-profit which assembles teams of volunteer military veterans to respond to natural disasters, said that verification is critical, and it’s a key element of the infrastructure he has built at the organization since joining in 2017. “We’ve gotten more people using it so more feedback [and] more data [is] coming through the pipes,” he said. “So creating a self-service, a very collaborative approach.”

With quality and quantity, the AI models should come, right? Well, yes and no.

Sury of Columbia wants to cool down at least some of the hype around AI. “The big caveat with all of these machine learning and big data applications is that they are not a panacea — they are able to process a lot of disparate information, [but] they’re certainly not going to tell us exactly what to do,” he said. “First responders are already processing a lot of information,” and they don’t necessarily need more guidance.

Instead, AI in disasters is increasingly focused on planning and resilience. Sury pointed to OneConcern, a resiliency planning platform, as one example of how data and AI can be combined in the disaster planning process. He also pointed to the CDC’s Social Vulnerability Index and risk tools from FEMA that integrate different data signals into scalar values that emergency planners use to optimize their contingency plans.

Yet, almost everyone I talked to was much more hesitant about the power of AI. As I discussed a bit in part one of this series regarding the disaster sales cycle, data tools have to be real-time and perfect every time given the lives that are on the line. Kamachee of Team Rubicon noted that when choosing tools, he avoids whiz-bang and instead looks at the pure utility of individual vendors. “We go high tech, but we prepare for low tech,” he said, emphasizing that in disaster response, everything must be agile and adaptable to changing circumstances.

Elichai of Carbyne saw this pattern in his sales. There’s a “sensitivity in our market and the reluctance from time to time to adopt” new technologies, he said, but he acknowledged that “there is no doubt that AI at a certain point will provide benefits.”

Naillon of T-Mobile had similar views from the operator perspective, saying that “I can’t say that we really leverage AI very much” in the company’s disaster planning. Instead of AI as brain, the telecom company simply uses data and forecast modeling to optimally position equipment — no fancy GANs required.

Outside of planning, AI has helped in post-disaster recovery, and specifically around damage assessments. After a crisis transpires, assessments of infrastructure and private property have to be made in order for insurance claims to be filed and for a community to move forward. Art delaCruz, COO and president of Team Rubicon, noted that technology and a flourish of AI has helped significantly around damage assessments. Since his organization often helps rebuild communities in the course of its work, triaging damage is a critical element of its effective response strategy.

There’s a brighter future, other than that brightness from the sun that is going to burn us to a crisp, right?

So AI today is helping a bit with resilience planning and disaster recovery and not so much during emergency response itself, but there is certainly more to come across the entire cycle. Indeed, there is a lot of excitement about the future of drones, which are increasingly being used in the field, but there are concerns long term about whether AI and data will ultimately cause more problems than they solve.

Drones would seem to have an obvious value for disaster response, and indeed, they have been used by teams to get additional aerial footage and context where direct access by responders is limited. Kamachee of Team Rubicon noted that on a mission in the Bahamas, response teams used drones to detect survivors, since major roads were blocked. The drones snapped images that were processed using AI, helping the team identify those survivors for evacuation. He described drones and their potential as “sexy; very, very cool.”

Aerial views from drones can give disaster response teams much better real-time information, particularly in areas where on-the-ground access is limited. Image Credits: Mario Tama/Getty Images

Cotter of Project HOPE similarly noted that faster data processing translates to better responses. “Ultimately speed is what saves lives in these disasters,” he said. We’re “also able to manage more responses remotely [and] don’t have to send as many people downrange,” giving response teams more leverage in resource-constrained environments.

“I see more emergency management agencies using drone technology — search and rescue, aerial photography,” Davis of Verizon said, arguing that operators often have a mentality of “send a machine into a situation first.” He continued, arguing, “artificial intelligence is going to continue to get better and better and better [and] enable our first responders to respond more effectively, but also more efficiently and safer.”

With data flooding in from sensors and drones, and processed and verified better than ever, disaster response can improve, perhaps even keeping pace with Mother Nature’s increasingly deadly whims. Yet there is one caveat: will the AI algorithms themselves cause new problems in the future?

Clark-Ginsburg of RAND, perhaps supplying that typical RANDian alternatives analysis, said that these solutions can also create problems themselves, “technological risks leading to disaster and the world of technology facilitating disaster.” These systems can break, they can make mistakes, and more ominously — they can be sabotaged to increase chaos and damage.

Bob Kerrey, a co-chair of the 9/11 Commission, former senator and governor of Nebraska, and currently the board chairman of Risk & Return, a disaster response VC fund and philanthropy I profiled recently, pointed to cybersecurity as increasingly a wild card in many responses. “There wasn’t a concept called zero days — let alone a market for zero days — in 2004 [when the 9/11 Commission was doing its work], and now there is.” With the 9/11 terrorist attacks, “they had to come here, they had to hijack planes … now you don’t need to hijack planes to damage the United States,” noting that hackers “can be sitting with a bunch of other guys in Moscow, in Tehran, in China, or even your mother’s basement.”

Data is a revolution in the making for disaster response, but it may well cause a whole second-order set of problems that didn’t exist before. What technology giveth, it may also taketh away. The oil gushes, but then the well suddenly runs dry – or simply catches fire.




Is Washington prepared for a geopolitical ‘tech race’?

When Secretary of State Antony Blinken and National Security Advisor Jake Sullivan sat down with Chinese officials in Anchorage, Alaska for the first high-level bilateral summit of the new administration, it was not a typical diplomatic meeting. Instead of a polite but restrained diplomatic exchange, the two sides traded pointed barbs for almost two hours. “There is growing consensus that the era of engagement with China has come to an unceremonious close,” wrote Sullivan and Kurt Campbell, the Administration’s Asia czar also in attendance, back in 2019. How apt that they were present for that moment’s arrival.

A little more than one hundred days into the Biden Administration, there is no shortage of views on how it should handle this new era of Sino-American relations. From a blue-ribbon panel assembled by former Google Chairman Eric Schmidt to a Politico essay from an anonymous former Trump Administration official that consciously echoes (in both its name and its author’s anonymity) George Kennan’s famous “Long Telegram” laying out the theory of Cold War containment, to countless think tank reports, it seems everyone is having their say.

What is largely uncontroversial though is that technology is at the center of U.S.-China relations, and any competition with China will be won or lost in the digital and cyber spheres. “Part of the goal of the Alaska meeting was to convince the Chinese that the Biden administration is determined to compete with Beijing across the board to offer competitive technology,” wrote David Sanger in the New York Times shortly afterward.

But what, exactly, does a tech-centered China strategy look like? And what would it take for one to succeed?

Tech has brought Republicans and Democrats uneasily together

One encouraging sign is that China has emerged as one of the few issues on which even Democrats agree that President Trump had some valid points. “Trump really was the spark that reframed the entire debate around U.S.-China relations in DC,” says Jordan Schneider, a China analyst at the Rhodium Group and the host of the ChinaTalk podcast and newsletter.

While many in the foreign policy community favored some degree of cooperation with China before the Trump presidency, now competition – if not outright rivalry – is widely assumed. “Democrats, even those who served in the Obama Administration, have become much more hawkish,” says Erik Brattberg of the Carnegie Endowment for International Peace. Trump has caused “the Overton Window on China [to become] a lot narrower than it was before,” adds Schneider.

The U.S. delegation led by Secretary of State Antony Blinken faces its Chinese counterparts at the opening session of U.S.-China talks at the Captain Cook Hotel in Anchorage, Alaska on March 18, 2021. Image Credits: FREDERIC J. BROWN/POOL/AFP via Getty Images

As the U.S.-China rivalry has evolved, it has become more and more centered around competing philosophies on the use of technology. “At their core, democracies are open systems that believe in the free flow of information, whereas for autocrats, information is something to be weaponized and stifled in the service of the regime,” says Lindsay Gorman, Fellow for Emerging Technologies at the German Marshall Fund. “So it’s not too surprising that technology, so much of which is about how we store and process and leverage information, has become such a focus of the U.S.-China relationship and of the [broader] democratic-autocratic competition around the world.”

Tech touches everything now – and the stakes could not be higher. “Tech and the business models around tech are really ‘embedded ideology,’’ says Tyson Barker of the German Council on Foreign Relations. “So what tech is and how it is used is a form of governance.”

What does that mean in practice? When Chinese firms expand around the world, Barker tells me, they bring their norms with them. So when Huawei builds a 5G network in Latin America, or Alipay is adopted for digital payments in Central Europe, or Xiaomi takes more market share in Southeast Asia, they are helping digitize those economies on Chinese terms using Chinese norms (as opposed to American ones). The implication is clear: whoever defines the future of technology will determine the rest of the twenty-first century.

That shifting balance has focused minds in Washington. “I think there is a strong bipartisan consensus that technology is at the core of U.S.-China competition,” says Brattberg. But, adds Gorman, “there’s less agreement on what the prescription should be.” While the Democratic experts now ascendant in Washington agree with Trump’s diagnosis of the China challenge, they believe in a vastly different approach from their Trump Administration predecessors.

Out, for instance, are restrictions on Chinese firms just for being Chinese. “That was one of the problems with Trump,” says Walter Kerr, a former U.S. diplomat who publishes the China Journal Review. “Trump cast broad strokes, targeting firms whether it was merited or not. Sticking it to the Chinese is not a good policy.”

Instead the focus is on inward investment – and outward cooperation.

Foreign policy is domestic policy

Democrats are first shoring up America domestically – in short, be strong at home to be strong abroad. “There’s no longer a bright line between foreign and domestic policy,” President Biden said in his first major foreign policy speech. “Every action we take in our conduct abroad, we must take with American working families in mind. Advancing a foreign policy for the middle class demands urgent focus on our domestic economic renewal.”

This is a particular passion of Jake Sullivan, Biden’s national security advisor, who immersed himself in domestic policy while he was Hillary Clinton’s chief policy aide during her 2016 presidential campaign. “We’ve reached a point where foreign policy is domestic policy, and domestic policy is foreign policy,” he told NPR during the transition.

Jake Sullivan, White House national security adviser, speaks during a news conference Image Credits: Jim Lo Scalzo/EPA/Bloomberg via Getty Images

This is increasingly important for technology, as concern grows that America is lagging behind on research and development. “We’re realizing that we’ve underinvested in the government grants and research and development projects that American companies [need] to become highly innovative in fields like quantum computing, AI, biotechnology, etc,” says Kerr.

“Rebuilding” or “sustaining” America’s “technological leadership” is a major theme of the Longer Telegram and is the very operating premise of the report of the China Strategy Group assembled by Eric Schmidt, former executive chairman of Alphabet, Google’s parent company, and the first chair of the Department of Defense’s Innovation Advisory Board. Those priorities have only become more important during the pandemic. It’s a question of “how do we orient the research system to fill in the industrial gaps that have been made very clear by the COVID crisis?” says Schneider of Rhodium.

While it hasn’t gone so far as to adopt a national industrial strategy, the Administration’s most ambitious officials are looking to prod along tech research in critical sectors. To that end, the National Security Council, which Sullivan runs, is reshaping itself around technology issues; Biden appointed the first deputy national security advisor focusing on technology issues as well as a high-profile senior director for technology. Their goal: to harness the same energy that drove the development of Silicon Valley during the Cold War into out-competing China.

That said, the ingredients to American (and Western) innovation aren’t exactly a secret: investment in education, research, and talent. “The West still has [most of] the universities, R&D and leading companies,” says Brattberg. “There’s still a lot of competitiveness and leverage.” Unsurprisingly, investing to retain that edge is a key theme of Biden’s $2 trillion infrastructure plan, which includes funds for basic research, supply chain support, broadband connectivity, and support for the semiconductor industry.

As almost anyone in Silicon Valley will tell you, a functioning and welcoming immigration system is a crucial ingredient, too. “The U.S. is at its best when it welcomes talent from around the world and gives people the tools to succeed and thrive here,” says Gorman. Whether the Biden Administration can strike a deal with Senate Republicans on comprehensive immigration reform – or even funding basic research – remains an open question, though. And even if it can succeed, American ingenuity is no longer sufficient on its own.

Team America

Whether it’s for talent or partnerships, the U.S.-China tech competition will be won overseas. Allies are “the most salient and straightforward way Biden can bring leverage to the table compared to Trump,” says Schneider.

Biden, Blinken, and other senior administration officials have loudly and repeatedly pronounced their preferences to work with democratic partners on international challenges, particularly in the Indo-Pacific region. It is no accident that Blinken and Sullivan’s meeting in Anchorage was preceded by a trip to Japan and South Korea, two of America’s closest allies in the region, and that Japanese Prime Minister Yoshihide Suga was the first foreign leader to visit Biden at the White House. “If you add the U.S. to the EU, Australia, Taiwan, and South Korea, you tilt the balance of economic heft and technological prowess back toward us,” he adds.

U.S. President Joe Biden and Prime Minister Yoshihide Suga of Japan hold a news conference in the Rose Garden of the White House on April 16, 2021. Image Credits: Doug Mills-Pool/Getty Images

The ground for Blinken and company is increasingly fertile. Chinese diplomats have been aggressive, if not downright condescending, to countries they perceive have slighted China. In one recent example, the Chinese embassy in Dublin sent a series of tweets targeting an Irish-British journalist couple who had been forced to relocate to Taiwan as a result of a harassment campaign over their critical coverage of China’s Uyghur policy in Xinjiang. This so-called ‘wolf warrior’ diplomacy (a reference to a jingoist action film) is prompting a backlash, and helping convince many policy elites in countries that had hoped to sit out a U.S.-China conflict that perhaps Washington’s China skeptics have a point.

This perhaps explains the proliferating alpha-numeric soup of coalitions and alliances being floated to secure a free and democratic internet for the future. There’s the D10, a secure supply chain network floated by British Prime Minister Boris Johnson, which adds Australia, India, and South Korea to the existing G7 countries (U.S., U.K., Canada, France, Italy, Germany, and Japan). Schmidt’s report calls for a T-12 (the D10 minus Italy plus Finland, Sweden, and Israel). Others look to expand existing technology-related groupings like the Five Eyes signals intelligence alliance of the U.S., U.K., Australia, Canada, and New Zealand, or harness burgeoning non-technical ones like the Quad. Gorman points to the significance of the news that the Quad itself – Australia, India, Japan, and the US – announced the creation of a working group on emerging technology at its first-ever (virtual) leaders summit in March.

Meanwhile, Senator Mark Warner, a Democrat from Virginia, has proposed a technology partnership to be run out of the State Department to coordinate with allies – including a $5 billion fund for research – with the explicit purpose of countering China.

International tech standards are increasingly not set by the West

Even if it can shepherd its allies, the U.S. still faces stiff international headwinds. The Trump Administration’s decision to withdraw from the Trans-Pacific Partnership, a trade deal negotiated by the Obama Administration with ten other Pacific Rim countries with the intent of setting trade standards in the Asia-Pacific, was taken as a sign that perhaps the U.S. pivot to Asia was less ambitious than advertised. The pact, rebranded as the Comprehensive and Progressive Agreement for Trans-Pacific Partnership (CPTPP), has continued without the U.S. – and now even China has expressed interest in joining.

Trump’s disdain for working within multilateral forums has also meant that Washington has essentially ceded the field of global technical standard-setting. Beijing has taken advantage, aggressively working the UN system so that Chinese officials now lead four of the 15 specialized UN agencies, including the two most focused on regulating technology: the International Telecommunication Union (ITU), which helps set global technical standards, and the World Intellectual Property Organization (WIPO), which is responsible for protecting intellectual property rights around the world.

China is also backing Russian efforts to rewrite internet governance. With Chinese support, Russia won a UN General Assembly vote in 2019 to start drafting a new cybercrime treaty. Their goal is to replace the U.S.-backed 2001 Budapest Convention on Cybercrime, which was created by democracies through the Council of Europe, with a treaty that one critic said would include provisions “likely to provide cover to authoritarian governments to persecute their political opponents.” Russia and China also unsuccessfully tried to use the (now Chinese-led) ITU to replace the Internet Corporation for Assigned Names and Numbers (ICANN), a private body of experts that governs internet domain names.

These efforts are all part of China Standards 2035, an explicit plan to internationalize standards to Chinese preferences in areas like 5G and the Internet of Things (IoT). As Emily de La Bruyère and Nathan Picarsic wrote on TechCrunch last year, “Beijing has spent the past two decades establishing influential footholds in multilateral bodies and targeted industrial areas. Now, it is using those footholds to set their rules – with them, to define the infrastructure of the future world.”

Hawks, doves, and U.S. divisions

Even within the new consensus on China, there are fissures on how to handle China itself.

On the hawkish side, the Schmidt Report concedes that “some degree of technological bifurcation is in U.S. interests.” But calibrating just how much is a difficult question. “It’s already a reality,” says Barker of the German Council on Foreign Relations. “The question is: how deep does the split have to be?”

Few argue for complete decoupling, Brattberg, the Carnegie scholar who has written extensively on tech diplomacy, says. After all, many are loath to concede completely separate ‘free’ and ‘authoritarian’ internets. There are other implications as well: a “bipolar, bifurcated internet … would have some very serious adverse implications in terms of cost [and] a slowdown in innovation,” one former UK intelligence official told me last year.

The key is to pinpoint which specific technologies are essential to produce domestically. “To the extent we [decouple from China], we have to do it in a smart way,” says Gorman. “There’s a risk of going too far and hurting potential innovation in the U.S. So the debate going forward is going to be: How do you address true national security vulnerabilities without emulating an authoritarian approach that might say ‘just ban everything from a certain country.’”

And even if we can form a consensus at home, America’s allies are no less divided, as I wrote last year with regard to Huawei. While the debate over the Chinese company’s role in 5G has evolved, with both France and the U.K. (in a reversal) moving to phase out its kit, the debate over what role China should play economically and technologically in Europe is still very much alive.

The U.K. government is clear-eyed; in its Integrated Review of foreign and defense policy published in March, it acknowledged that China’s “growing international assertiveness … will pose an increasing risk to UK interests” and set an explicit goal for itself to be a third “science and tech superpower.” France, meanwhile, laid out an Indo-Pacific strategy backing the principle of a free and open Pacific, an explicit challenge to Chinese preferences.

But many are still equivocal. As Singapore’s Prime Minister Lee Hsien Loong wrote last year in Foreign Affairs, “Asian countries do not want to be forced to choose between the two.” Berlin made clear in its Indo-Pacific strategy last year that it too was unwilling to make an outright choice. New Zealand, conscious of its important trade with China, is reluctant to expand the use of Five Eyes beyond intelligence sharing. Meanwhile, Italy endorsed China’s infrastructure-focused Belt and Road Initiative in 2019 and called the country a “strategic partner” last year. And the European Union moved forward on a trade deal with China late last year despite very public lobbying against it from the United States.

A world of tradeoffs

The challenge for the Biden Administration will be to assemble practical coalitions without asking allies and partners to make impossible choices. They will succeed if they can reframe the question. “In Europe, they don’t like ‘decoupling’ but they do like ‘diversification’,” says Brattberg. They also don’t like the idea of joining a U.S.-led alliance. Instead, he says, Washington should frame cooperation as “coalitions among like-minded democratic partners.”

For that to work, the U.S. will have to work out the bilateral issues it has with its allies first. “We need to be much more savvy on engaging directly with the EU on resolving issues like data transfers, digital taxation, and data privacy,” he said. “Digital sovereignty shouldn’t come at the expense of partnership with like-minded partners.”

Chancellor of Germany Angela Merkel gives a speech during the press conference at the end of the meeting with Chinese Premier Li Keqiang (not pictured) at The Great Hall Of The People on September 06, 2019 in Beijing, China. Image Credits: Andrea Verdelli-Pool/Getty Images

Nimbleness will be key – multiple experts told me it will be far better to create ad hoc coalitions on particular issues than to create a single fixed democratic tech alliance. This would have the benefit of keeping groupings tight without excluding countries with key expertise in particular areas (think Sweden and 5G or Taiwan and semiconductors). Washington should also take a collegial approach, recognizing and respecting that its allies will not always be in lock-step on every aspect of relations with China. In other words, the U.S. shouldn’t let the perfect be the enemy of the good, as agreement most of the time on most issues is probably sufficient to create the momentum Washington needs.

The U.S. can still compete globally and widen the circle of like-minded countries, Gorman, the scholar at GMF, tells me, but it has to invest in them if they are going to build out their tech sectors in a way that is aligned with democratic values and standards. “It’s really about providing an attractive counteroffer,” she said.

Even if the United States retains its technological edge for the near future, Americans should start adjusting to a future where Silicon Valley’s dominance is no longer inevitable. Chinese technologists are pulling ahead in areas like 5G while Chinese firms are competing on price (mobile phones) and increasingly on quality (e-commerce) and innovation (see: TikTok). China also exerts enormous clout through its control of supply chains and rare earth metals as well as its vast customer base.

Perhaps China’s greatest leverage point is its looming presence over Taiwan. As long as Taiwan remains one of the leading manufacturers of semiconductors (chip giant TSMC manufactures 90% of the world’s most advanced chips), the world’s technology industry will be vulnerable to the precarity of cross-Strait relations.

Will technology become just another chip in the geopolitical game the U.S. and China are playing, then? The Biden Administration is more prepared than its predecessor to weigh the tradeoffs, Barker of the German Council on Foreign Relations tells me. But it’s unclear how Washington, so early in this administration, will prioritize technology issues if faced with the prospects of Chinese cooperation on other priorities.

After all, at any given moment, the U.S. (and its allies) must weigh a host of priorities vis-à-vis China. And for all of the downsides to its bellicosity, the Trump Administration’s fixation on a handful of issues gave it leverage: it was willing to ignore Uyghurs and other human rights abuses in order to get a trade deal (even if it was deeply flawed).

The Biden Administration, on the other hand, has not yet articulated any priorities at all. If the rhetoric from Washington can be believed, the White House thinks it can make progress on climate, Taiwan, trade, human rights, and any number of other areas, all at once. This on its own creates a vulnerability. As historian Niall Ferguson reminded us in a recent Bloomberg column, then National Security Advisor Henry Kissinger was outmaneuvered when he went to China in 1971 with a multi-issue agenda and China singularly focused on Taiwan.

Beijing’s diplomats, despite their wolf-warrior missteps, are still savvy negotiators. If they are allowed to do so, they will once again try to play different parts of the Administration against each other, conditioning progress on climate, for example, on a softening over geopolitics, as the Brookings scholar Thomas Wright has warned. In that light, it simply strains credulity that an ‘all of the above’ approach will work, especially when Biden’s wish list keeps the issues Trump cared about, like trade, 5G, and Taiwan, and adds those he ignored, like human rights, democracy, and climate change.

This is where America’s alliances may prove to be Biden’s hidden ace. If Biden can forge a common-enough front with a wide-enough spectrum of allies, the U.S. will be better able to withstand Chinese pressure to trade progress on one issue against another. Instead, forcing China to negotiate with the U.S. and its allies on an issue-by-issue basis may put Washington in a better position to succeed.

Of all the issues in America’s China portfolio, though, the tech race provides one extra advantage: for all the talk of industrial strategy, alliances, and diplomatic maneuvers, Washington is not the only or even primary actor involved. The Biden Administration can help set the rules, invest in basic research, and defend American interests abroad, but American innovation depends on its innovators – and there are still bountiful numbers of them tinkering away.

The most disastrous sales cycle in the world

Startups constantly talk about being mission-oriented, but it’s hard to take most of those messages seriously when the mission is optimizing cash flow for tax efficiency. However, a new generation of startups is emerging that are taking on some of the largest global challenges and bringing the same entrepreneurial grit, operational excellence, and technical brilliance to bear on actual missions — ones that may well save thousands of lives.

ClimateTech has been a huge beneficiary of this trend in general, but one small specialty has caught my eye: disaster response. It’s a category for software services that’s percolated for years with startups here and there, but now a new crop of founders is taking on the challenges of this space with renewed urgency and vigor.

As the elevator pitch would have it, disaster response is hitting hockey stick growth. 2020 was a brutal year, and in more ways than just the global COVID-19 pandemic. The year also saw a record number of hurricanes, among the worst wildfire seasons in the Western United States, and several megastorms across the world. Climate change, urbanization, population growth, and poor response practices have combined to create some of the most dangerous conditions humanity has ever collectively faced.

I wanted to get a sense of what the disaster response market has in store this decade, so over the past few weeks, I have interviewed more than 30 startup founders, investors, government officials, utility execs and more to understand this new landscape and what’s changed. In this four-part series on the future of technology and disaster response, to be published this weekend and next, we’ll look at the sales cycle in this market, how data is finally starting to flow into disaster response, how utilities and particularly telcos are dealing with internet access issues, and how communities are redefining disaster management going forward.

Before we get into all the tech developments in disaster response and resilience though, it’s important to ask a basic question: if you build it, will they come? The resounding answer from founders, investors, and government procurement officials was simple: no.

In fact, in all my conversations for this series, the hell of the emergency management sales cycle came up repeatedly, with more than one individual describing it as possibly the toughest sale that any company could make in the entire world. That view might be surprising in a market that easily runs into the tens of billions of dollars if the budgets for procurement are aggregated across local, state, federal, and international governments. Yet, as we will see, the unique dynamics of this market make almost any traditional sales approach useless.

That pessimism doesn’t mean sales are impossible, though, and a new crop of startups is piercing the barriers to entry in this market. We’ll look at the sales and product strategies that startups are increasingly relying on today to break through.

The sale from hell

Few will be surprised that government sales are hard. Generations of govtech startup founders have learned that slow sales cycles, byzantine procurement processes, cumbersome verification and security requirements, and a general lassitude among contract officers make for a tough battlefield to close on revenue. Many government agencies now have programs to specifically onboard startups, having discovered just how hard it is for new innovations to run through their gauntlet.

Emergency management sales share all the same problems as other govtech startups, but then they deal with about a half dozen more problems that take the sales cycle from exhausting to infernal.

The first and most painful is the dramatic seasonality of sales in the emergency space. Many agencies that operate on seasonal disasters — think hurricanes, wildfires, winter storms, and more — go through an “action” period where they respond to these disasters, and then transition into a “planning” period where they assess their performance, determine what changes are needed for next season, and consider what tools might be added or removed to increase the effectiveness of their responders.

Take Cornea and Perimeter, two startups in the wildfire response space that I profiled recently. Both of the teams described how they needed to think in terms of fire seasons when it came to product iteration and sales. “We took two fire seasons to beta test our technology … to solve the right problem the right way,” Bailey Farren, CEO and co-founder of Perimeter, said. “We actually changed our focus on beta testing during the [2019 California] Kincade Fire.”

In this way, disaster tech could be compared to edtech, where school technology purchases are often synchronized with the academic calendar. Miss the June-through-August window in the U.S. education system, and a startup is looking at another year before it gets another chance at the classroom.

Edtech may be a tough sale given that narrow three-month needle to thread, but disaster response is getting more difficult every year. Climate change is exacerbating the length, severity, and damage of all types of disasters, which means responding agencies that might once have had six months or more out of season to plan are now sometimes working all year long just to respond to emergencies. That leaves little time to think about what new solutions an agency needs to purchase.

Worse, unlike the standardized academic calendar, disasters are also becoming less predictable. Flood and wildfire seasons, for instance, used to be relatively concentrated in certain periods of the year; now, such emergencies can emerge practically year-round. That means procurement processes can both start and freeze at a moment’s notice as an agency turns back to its mission.

Seasonality doesn’t just apply to the sales cycle, though — it also applies to these agencies’ budgets. While a disaster is unfolding, it dominates the minds of citizens and politicians, but then we forget all about it until the next catastrophe. Unlike the annual consistency of other government tech spending, disaster tech funding often comes in waves.

One senior federal emergency management official, who asked not to be named since he wasn’t authorized to speak publicly, explained that consistent budgets and the ability to spend them quickly is quite limited during “blue sky days” (i.e. periods without a disaster), and agencies like his have to rely on piecing together supplementary disaster funds when Congress or state legislatures authorize additional financing. The best agencies have technological roadmaps on hand so that when extra funding comes in, they can use it immediately to realize their plans, but not all agencies have the technical planning resources to be that prepared.

Amir Elichai, the CEO and co-founder of Carbyne, a cloud-native platform for call handling in 911 centers, said that this wave of interest crested yet again with the COVID-19 pandemic last year, triggering huge increases in attention and funding around emergency response capabilities. “COVID put a mirror in front of government faces and showed them that ‘we’re not ready’,” he said.

Perhaps unsurprisingly, next-generation 911 services (typically dubbed NG911), which the industry and first responders have advocated for years, are looking at a major financing boost. President Biden’s proposed infrastructure bill would add $15 billion to upgrade 911 capabilities in the United States — funding that has been requested for much of the last decade. Just last year, a $12 billion variant of that bill failed in the Senate after passing the U.S. House of Representatives.

Sales are all about offering customers proverbial painkillers rather than vitamins, and one would expect disaster response agencies looking to upgrade their systems to be very much on the painkiller side. After all, the fear and crisis surrounding these agencies and their work would seem to bring visceral attention to their needs.

Yet, that fear actually has the opposite effect in many cases, driving attention away from systematic technology upgrades in favor of immediate acute solutions. One govtech VC, who asked not to be named to speak candidly about the procurement process his companies go through, said that “we don’t want to paint the picture that the world is a scary and dangerous place.” Instead, “the trick is to be … focused on the safety side rather than the danger.” Safety is a much more prevalent and consistent need than sporadically responding to emergencies.

When a wave of funding finally gets approved, though, agencies often have to scramble to figure out what to prioritize now that the appropriated manna has finally dropped from legislative heaven. Even when startups provide the right solutions, scrying which problems will get funded in a particular cycle requires acute attention to every customer.

Josh Mendelsohn, the managing partner at startup studio and venture fund Hangar, said that “the customers have no shortage of needs that they are happy to talk about … the hardest part is how you narrow the funnel — what are the problems that are most meritorious?” That merit can, unfortunately, evolve very rapidly as mission requirements change.

Let’s say all the stars line up though — the agencies have time to buy, they have a need, and a startup has the solution that they want. The final challenge that’s probably the toughest to overcome is simply the lack of trust that new startups have with agencies.

In talking to emergency response officials over the past few weeks, reliability unsurprisingly came up again and again. Responding to disasters is mission-critical work, and nothing can break in the field or in the operations center. Frontline responders still use paper and pen in lieu of tablets or mobile phones because they know paper will work every single time and never run out of battery. The “move fast and break things” ethos of Silicon Valley is fundamentally incompatible with this market.

Seasonality, on-and-off funding, lack of attention, procurement scrambling, and acute reliability requirements combine to make emergency management sales among the hardest possible for a startup. That doesn’t even get into all the typical govtech challenges like integrating with legacy systems, the massive fragmentation of thousands of emergency response agencies littered across the United States and globally, and the fact that in many agencies, people aren’t that interested in change in the first place. As one individual in the space described how governments approach emergency technology, “a lot of departments are looking at it as maybe I can hit retirement before I have to deal with it.”

The strategies for breaking out of limbo

So the sales cycle is hell. Why, then, are VCs dropping money in the sector? After all, we’ve seen emergency response data platform RapidSOS raise $85 million just a few months ago, about the same time Carbyne raised $25 million. There are quite a few more startups at the earliest phases that have raised pre-seed and seed investment as well.

The key argument that nearly everyone in this sector agreed on is that founders (and their investors) have to throw away their private-sector sales playbooks and rebuild their approach from the bottom up to sell specifically to these agencies. That means devising entirely different strategies and tactics to secure revenue.

The first and most important approach is, in some respects, to not start with a company at all, but rather to start learning what people in this field actually do. As the sales cycle perhaps indicates, disaster response is unlike any other work. The chaos, the rapidly changing environment, and the multi-disciplinary, cross-agency work that has to take place for a response to be effective have few parallels in professional office work. Empathy is key here: the responder who uses paper might have nearly lost their life in the field when a device failed. A 911 center operator may have listened to someone perish in real time while scrambling to find the right information in a software database.

In short, it’s all about customer discovery and development. That’s not so different from the enterprise world, but patience radiated out of many of my conversations with industry participants. It just takes more time — sometimes multiple seasons — to figure out precisely what to build and how to sell it effectively. If an enterprise SaaS product can iterate to product-market fit in six months, it might take two to three years in the government sector to reach an equivalent point.

Michael Martin of RapidSOS said, “There is no shortcut to doing customer discovery work in public service.” He noted that “I do think there is a real challenge between the arrogance of the Silicon Valley tech community and the reality of these challenges” in public safety — a gap that has to be closed if a startup wants to find success. Meanwhile, Bryce Stirton, president and co-founder of public-safety company Responder Corp, said that “the end user is the best way to look at all the challenges … what are all the boxes the end user has to check to use a new technology?”

Mendelsohn of Hangar said that founders need to answer some tough questions in that process. “Ultimately, what are your entry points,” he asked. “Cornea has had to go through that customer discovery process … it all feels necessary, but what are the right things that require the least amount of behavior change to have impact immediately?”

Indeed, that process is appreciated on the other side as well. The federal emergency management official said, “everyone has a solution, but no one asked me about my problem.” Getting the product right and having it match the unique work that takes place in this market is key.

Let’s say you have a great product though — how do you get it through the perilous challenges of the procurement process? Here, answers differed widely, and they offer multiple strategies on how to approach the problem.

Martin of RapidSOS said that “government does not have a good model for procuring new services to solve problems.” So, the company chose to make its services free for government. “In three years, we went from no agencies using our stuff to all agencies using our stuff, and that was based on not making it a procurement problem,” he said. The company’s business model is based on having paid corporate partners who want to integrate their data into 911 centers for safety purposes.

That’s a similar model used by MD Ally, which received a $3.5 million seed check from General Catalyst this past week. The company adds telehealth referral services into 911 dispatch systems, and CEO and founder Shanel Fields emphasized that she saw an opportunity to create a revenue engine from the physician and mental health provider side of her market while avoiding government procurement.

Outside of what might be dubbed “Robinhood for government” (aka, just offering a service for free), another approach is to link up with more well-known and trusted brand names to offer a product that has the innovation of a startup but the reliability of an established player. Stirton of Responder said “we learned in [this market] that it takes more than just capital to get companies started in this space.” What he found worked was building private-sector partnerships to bring a joint offering to governments. For instance, he noted cloud providers Amazon Web Services and Verizon have good reputations with governments and can get startups over procurement hurdles (TechCrunch is owned by Verizon Media, which is owned by Verizon).

Elichai of Carbyne said that much of his company’s sales are done through integration partners, referencing CentralSquare as one example. For 911 services, “the U.S. market is obviously the most fragmented,” so partners allow the company to avoid selling to thousands of different agencies. “We are usually not selling direct to governments,” he said.

Partners can also help deal with the problem of localism in emergency procurement: many government agencies don’t know precisely what to buy, so they simply buy software that is offered by companies in their own backyard. Partners can offer a local presence while also allowing a startup to have a nimble national footprint.

Another angle on partners is building out a roster of experienced but retired government executives who can lend a startup credibility through their presence and networks. Even more than in the enterprise, government officials, particularly in emergency management, have to work with and trust one another given the closely coupled work they perform. Hearing a positive recommendation from a close contact down the street can readily change the tenor of a sales conversation.

Finally, as much as emergency management software is geared for governments, private sector companies increasingly have to consider much of the same tooling to protect their operations. Many companies have distributed workforces, field teams, and physical assets they need to protect, and often have to respond to disasters in much the same way that governments do. For some startups, it’s possible to bootstrap in the private sector early on while continuing to assiduously develop public sector relationships.

In short, a long-term customer development program coupled with quality partnerships and joint offerings while not forgetting the private sector offers the best path for startups to break through into these agencies.

The good news is that the hard work can be rewarded. Not only do serious dollars flow through these agencies, but the agencies themselves know they need better technology. Tom Harbour, chief fire officer at Cornea and formerly national director of fire management at the U.S. Forest Service, notes that “these are billions of dollars we spend … and we know we can be more efficient.” Government doesn’t always make it easy to create efficiency, but founders willing to go the distance can build impactful, profitable, and mission-driven companies.

Biden’s labor secretary thinks many gig workers should be reclassified as employees

Biden Labor Secretary Marty Walsh charged into the white-hot issue of the gig economy Thursday, asserting that many people working without benefits in the gig economy should instead be classified as employees.

In an interview with Reuters, Walsh said that the Department of Labor is “looking at” the gig economy, hinting that worker reclassification could be a priority in the Biden administration.

“… In a lot of cases gig workers should be classified as employees,” Walsh said. “In some cases they are treated respectfully and in some cases they are not and I think it has to be consistent across the board.”

Walsh also said that the labor department would be talking to companies that benefit from gig workers to ensure that non-employees at those companies have the same benefits that an “average employee” in the U.S. would have.

“These companies are making profits and revenue and I’m not [going to] begrudge anyone for that because that’s what we are about in America… but we also want to make sure that success trickles down to the worker,” Walsh said.

Walsh’s comments aren’t backed by federal action — yet, anyway — but they still made major waves among tech companies that rely on non-employee labor. Uber and Lyft stock dipped on the news Thursday, along with DoorDash.

In the interview, Walsh also touched on pandemic-related concerns about gig workers who lack unemployment insurance and health care through their employers. The federal government has picked up the slack during the pandemic with two major bills granting gig workers some benefits, but otherwise they’re largely without a safety net.

Reforming labor laws has been a tenet of Biden’s platform for some time and the president has been very vocal about bolstering worker protections and supporting organized labor. One section of then President-elect Biden’s transition site was devoted to expanding worker protections, calling the misclassification of employees as contract workers an “epidemic.”

Biden echoed his previous support for labor unions during a joint address to Congress Wednesday night, touting the Protecting the Right to Organize Act — legislation that would protect workers looking to form or join unions. That bill would also expand federal whistleblower protections.

“The middle class built this country,” Biden said. “And unions build the middle class.”

Biden proposes ARPA-H, a health research agency to ‘end cancer’ modeled after DARPA

In a joint address to Congress last night, President Biden updated the nation on vaccination efforts and outlined his administration’s ambitious goals.

Biden’s first 100 days have been characterized by sweeping legislative packages that could lift millions of Americans out of poverty and slow the clock on the climate crisis, but during his first joint address to Congress, the president highlighted another smaller plan that’s no less ambitious: to “end cancer as we know it.”

“I can think of no more worthy investment,” Biden said Wednesday night. “I know of nothing that is more bipartisan…. It’s within our power to do it.”

The comments weren’t out of the blue. Earlier this month, the White House released a budget request for $6.5 billion to launch a new government agency for breakthrough health research. The proposed health agency would be called ARPA-H and would live within the NIH. The initial focus would be on cancer, diabetes and Alzheimer’s but the agency would also pursue other “transformational innovation” that could remake health research.

The $6.5 billion investment is a piece of the full $51 billion NIH budget. But some critics believe that ARPA-H should sit under the Department of Health and Human Services rather than being nested under NIH. 

ARPA-H would be modeled after the Defense Advanced Research Projects Agency (DARPA), which develops moonshot-like tech for defense applications. DARPA’s goals often sound more like science fiction than science, but the agency contributed to or created a number of now ubiquitous technologies, including a predecessor to GPS and most famously ARPANET, the computer network that grew into the modern internet.

Unlike more conservative, incremental research teams, DARPA aggressively pursues major scientific advances in a way that has more in common with Silicon Valley than with other government agencies. Biden believes that applying the DARPA model to cutting-edge health research would keep the U.S. from lagging in biotech.

“China and other countries are closing in fast,” Biden said during the address. “We have to develop and dominate the products and technologies of the future: advanced batteries, biotechnology, computer chips, and clean energy.”