Dear Sophie: How does the International Entrepreneur Parole program work?

Here’s another edition of “Dear Sophie,” the advice column that answers immigration-related questions about working at technology companies.

“Your questions are vital to the spread of knowledge that allows people all over the world to rise above borders and pursue their dreams,” says Sophie Alcorn, a Silicon Valley immigration attorney. “Whether you’re in people ops, a founder or seeking a job in Silicon Valley, I would love to answer your questions in my next column.”

Extra Crunch members receive access to weekly “Dear Sophie” columns; use promo code ALCORN to purchase a one- or two-year subscription for 50% off.


Dear Sophie,

I’m the founder of an early-stage, two-year-old fintech startup. We really want to move to San Francisco to be near our lead investor.

I heard International Entrepreneur Parole is back. What is it, and how can I apply?

— Joyous in Johannesburg

Dear Joyous,

Today for the first time, international startup founders can breathe a sigh of relief because there is new hope for immigration! This hope comes in the form of a little-known pathway to live and work legally in the United States. This pathway is now possible because, effective today, the U.S. Department of Homeland Security (DHS) withdrew the proposed rule to remove the International Entrepreneur Parole program. This development is FANTASTIC for startup founders everywhere!

DHS believes that “qualified entrepreneurs who would substantially benefit the United States by growing new businesses and creating jobs for U.S. workers” should be able to benefit from “all viable” immigration options. The National Venture Capital Association is “thrilled” at the news, and so am I!

Image Credits: Joanna Buniak / Sophie Alcorn

International Entrepreneur Parole (IEP) allows founders to request a 30-month immigration status, with the possibility of a 30-month extension as well. Spouses of those with IEP can qualify for work permits. There’s no limit to the variety of fields in which startups can qualify — we’ve had interest from founders in everything from autonomous drone delivery to AI for law enforcement; anticancer drug discovery to satellites.

To qualify, you need to show that:

  • Your startup is less than five years old.

The human-focused startups of the hellfire

Disasters may not always be man-made, but they are always responded to by humans. There’s a whole panoply of skills and professions required today to respond to even the tiniest emergency, and that doesn’t even include the needs during pre-disaster planning and post-disaster recovery. It’s not a very remunerative industry for most and the mental health effects from stress can linger for decades, but the mission at the core of this work — to help people in the time of their greatest need — is what continues to attract many to partake in this never-ending battle anyway.

In the last three parts of this series on the future of technology and disaster response, I’ve focused on, well, technology, and specifically the sales cycle for new products, the sudden data deluge now that the Internet of Things (IoT) is in full force, and the connectivity that allows that data to radiate all around. What we haven’t looked at enough so far is the human element: the people who actually respond to disasters as well as what challenges they face and how technology can help them.

So in this fourth and final part of the series, we’ll look at four areas where humans and technology intersect within disaster response and what future opportunities lie in this market: training and development, mental health, crowdsourced responses to disasters, and our doomsday future of hyper-complex emergencies.

Training in a hellfire

Most fields have linear approaches to training. To become a software engineer, students learn some computer science theory, add in some programming practice, and voilà (note: your mileage may vary). To become a medical doctor, aspiring physicians take an undergraduate curriculum teeming with biology and chemistry, head to medical school for two deadening years of core anatomy and other classes and then switch into clinical rotations, a residency, and maybe fellowships.

But how do you train someone to respond to emergencies?

From 911 call takers to EMTs and paramedics to emergency planning officials and the on-the-ground responders operating in the center of the storm, as it were, the skills required to do these jobs well vary enormously. What’s necessary aren’t just specific hard skills like using call dispatch software or knowing how to upload video from a disaster site, but also critically important softer skills: precise communication, sangfroid, agility, and the ability to balance improvisation with consistency. The chaos element also can’t be overstated: every disaster is different, and these skills must be viscerally recombined and exercised under extreme pressure with frequently sparse information.

A whole range of what might be dubbed “edtech” products could serve these needs, and not just exclusively for emergency management.

Communications, for instance, isn’t just about team communications, but also communicating with many different constituencies. Aaron Clark-Ginsberg, a social scientist at RAND Corporation, said that “a lot of these skills are social skills — being able to work with different groups of people in culturally and socially appropriate ways.” He notes that the field of emergency management has heightened attention to these issues in recent years, and “the skillset we need is to work with those community structures” that already exist where a disaster strikes.

As we’ve seen in the tech industry the last few years, cross-cultural communication skills remain scarce. One can always learn this just through repeated experiences, but could we train people to develop empathy and understanding through software? Can we develop better and richer scenarios to train emergency responders — and all of us, really — on how to communicate effectively in widely diverging conditions? That’s a huge opportunity for a startup to tackle.

Emergency management is now a well-developed career path. “The history of the field is very fascinating, [it’s] been increasingly professionalized, with all these certifications,” Clark-Ginsberg said. That professionalization “standardizes emergency response so that you know what you are getting since they have all these certs, and you know what they know and what they don’t.” Certifications can indicate competence in individual skills, but they don’t add up to a holistic assessment, and that gap offers opportunities for new startups to create better assessments.

Like many of us, responders get used to doing the same thing over and over again, and that can make training for new skills even more challenging. Michael Martin of emergency data management platform RapidSOS describes how 911 call takers come to rely on muscle memory, “so switching to a new system is very high-risk.” No matter how bad existing software interfaces are, changing them will very likely slow every single response down while increasing the risk of errors. That’s why the company offers “25,000 hours a year for training, support, integration.” There remains a huge and relatively fragmented market for training staff as well as transitioning them from one software stack to another.

Outside these somewhat narrow niches, there is a need for a massive renaissance in training in this whole area. My colleague Natasha Mascarenhas recently wrote an EC-1 on Duolingo, an app designed to gamify and entrance students interested in learning second languages. It’s a compelling product, and there is no comparable training system for engaging the full gamut of first responders.

Art delaCruz, COO and president of Team Rubicon, a non-profit which assembles teams of volunteer military veterans to respond to natural disasters, said that it’s an issue his organization is spending more time thinking about. “Part of resilience is education, and the ability to access information, and that is a gap that we continue to close on,” he said. “How do you present information that’s more simple than [a learning management system]?” He described the need for “knowledge bombs like flash cards” to regularly provide responders with new knowledge while testing existing ideas.

There’s also a need to scale up best practices rapidly across the world. Tom Cotter, director of emergency response and preparedness at Project Hope, a non-profit which empowers local healthcare workers in disaster-stricken and impoverished areas, said that in the context of COVID-19, “a lot of what was going to be needed [early on] was training — there were huge information gaps at the clinical level, how to communicate it at a community level.” The organization developed a curriculum with Brown University’s Watson Institute in the form of interactive PowerPoints that were ultimately used to train 100,000 healthcare workers on the new virus, according to Cotter.

When I look at the spectrum of edtech products existing today, one of the key peculiarities is just how narrowly each seems to focus. There are apps for language learning and for learning math and developing literacy. There are flash card apps like Anki that are popular among medical students, and more interactive approaches like Labster for science experiments and Sketchy for learning anatomy.

Yet, for all the talk of boot camps in Silicon Valley, there is no edtech company that tries to completely transform a student in the way that a bona fide boot camp does. No startup wants to holistically develop their students, adding in hard skills while also advancing the ability to handle stress, the improvisation needed to confront rapidly-changing environments, and the skills needed to communicate with empathy.

Maybe that can’t be done with software. Maybe. Or perhaps no founder has yet had the ambition to go for broke — to really revolutionize how we think about training the next generation of emergency management professionals and everyone else in private industry who needs to handle stress or think on their feet just as much as frontline workers.

That’s the direction in which Bryce Stirton, president and co-founder of public-safety company Responder Corp, has been thinking. “Another area I am personally a fan of is the training space around VR,” he said. “It’s very difficult to synthesize these stressful environments,” in areas like firefighting, but new technologies have “the ability to pump the heart that you need to experience in training.” He concludes that “the VR world, it can have a large impact.”

Healing after disaster

When it comes to trauma, few fields face quite the challenge that emergency response does. It’s work that almost by definition forces its personnel to confront some of the most harrowing scenes imaginable. Death and destruction are a given, but what’s not always accounted for is the lack of agency in some of these contexts for first responders — the family that can’t be saved in time so a 911 call taker has to offer final solace, or the paramedics who don’t have the right equipment even as they are showing up on site.

Post-traumatic stress is perhaps the most well-known and common mental health condition facing first responders, although it is hardly the only one. How to ameliorate and potentially even cure these conditions represents a burgeoning area of investment and growth for a number of startups and investors.

Risk & Return, for instance, is a venture firm heavily focused on companies working on mental health as well as human performance more generally. In my profile of the firm a few weeks ago, managing director Jeff Eggers said that “We love that type of technology since it has that dual purpose: going to serve the first responder on the ground, but the community is also going to benefit.”

Two companies from its portfolio are useful to explore here as examples of different pathways in this category. The first is Alto Neuroscience, a stealthy startup founded by Amit Etkin, a multidisciplinary neuroscientist and psychiatrist at Stanford, to create new clinical treatments for post-traumatic stress and other conditions based on brainwave data. Given its therapeutic focus, it’s probably years before testing and regulatory approvals come through, but this sort of research is on the cutting edge of innovation here.

The second company is NeuroFlow, which is a software startup using apps to guide patients to better mental health outcomes. Through persistent polling, testing, and collaboration with practitioners, the company’s tools allow for more active monitoring of mental health — looking for emerging symptoms or relapses in even the most complicated cases. NeuroFlow is more on the clinical side, but there are obviously a wealth of wellness startups that have percolated in recent years as well like Headspace and Calm.

Outside of therapeutics and software though, there are entirely new frontiers around mental health in areas like psychedelics. That was one of the trends I called out as a top five area for investment in the 2020s earlier this year, and I stand by that. We’ve also covered a startup called Osmind which is a clinical platform for managing patients with a psychedelic focus.

Risk & Return itself hasn’t made an investment in psychedelics yet, but Bob Kerrey, the firm’s board chairman and the former co-chair of the 9/11 Commission as well as former governor and senator of Nebraska, said that “it’s difficult to do this if you are the government, but easier to do this in the private sector.”

Similar to edtech, mental health startups might get their start in the first responder community, but they are hardly limited to this population. Post-traumatic stress and other mental health conditions affect wide swaths of the world’s population, and solutions that work in one community can often translate more broadly to others. It’s a massive, massive market, and one that could potentially transform the lives of millions of people for the better.

Before moving on, there’s one other area of interest here, and that is creating impactful communities for healing. First responders and military veterans experience a mission and camaraderie in their service that they often lack once they are in new jobs or on convalescence. DelaCruz of Team Rubicon says that one of the goals of bringing veterans to help in disaster regions is that the veterans themselves “reconnect with identity and community — we have these incredible assets in these men and women who have served.” It’s not enough to just find a single treatment per patient — we oftentimes need to zoom out to the wider population to see how mental health ripples out.

Helping people find purpose may not be the easiest challenge to solve as a startup, but it’s certainly a major challenge for many, and an area fermenting with new approaches now that the social networking wave has reached its nadir.

Crowdsourcing disaster response

Decentralization has been all the rage in tech in recent years — just mention the word blockchain in a TechCrunch article to get at least 50 PR emails about the latest NFT for a toilet stain. While there is obviously a lot of noise, one area where substance may pan out well is in disaster response.

If the COVID-19 pandemic showed anything, it was the power of the internet to aggregate as well as verify data, build dashboards, and deliver highly-effective visualizations of complex information for professionals and laypeople alike. Those products were developed by people all around the world often from the comfort of their own homes, and they demonstrate how crowds can quickly draft serious labor to help respond to crises as they crop up.

Jonathan Sury, project director at the National Center for Disaster Preparedness at the Earth Institute at Columbia University, said that “COVID has really blown so much of what we think about out of the water.” With so many ways to collaborate online right now, “that’s what I would say is very exciting … and also practical and empowering.”

Clark-Ginsberg of RAND calls it the “next frontier of disaster management.” He argues that “if you can use technology to broaden the number of people who can participate in disaster management and respond to disasters,” then we might be reaching an entirely new paradigm for what effective disaster response will look like. “Formal structures [for professional frontline workers] have strengthened and that has saved lives and resources, but our ability to engage with everyday responders is still something to work on.”

Many of the tools that underpin these crowdsourced efforts don’t even focus on disasters. Sury pointed to Tableau and data visualization platform Flourish as examples of the kinds of tools that remote, lay first responders are using. There are now quite robust tools for tabular data, but we’re still relatively early in the development of tools for handling mapping data — obviously critical in the crisis context. Unfolded.ai, which I profiled earlier this year, is working on building scalable geospatial analytics in the browser. A lot more can be done here.

Oftentimes there are ways to coordinate the coordinators. Develop for Good, which I looked at late last year, is a non-profit designed to connect enterprising computer science students to software and data projects at non-profits and agencies that needed help during the pandemic. Sometimes these coordinators are non-profit orgs, and sometimes, just very active Twitter accounts. There’s a lot more experimentation possible on how to coordinate efforts in a decentralized way while still engaging with professional first responders and the public sector.

Speaking of decentralization, it’s even possible that blockchain could play a role in disaster and crisis response. Many of these opportunities rest on using blockchain for evidence collection or for identity. For example, earlier this week Leigh Cuen took a careful look at an at-home sexual assault evidence collection kit from Leda Health that uses the blockchain to establish a clear time for when a sample was collected.

There is a lot more potential to harness the power of crowdsourcing and decentralization, and many of these projects have applications far outside disaster management itself. These tools not only solve real problems — they provide real community to people who may not be related to the disaster itself, but are enthusiastic to do their part to help others.

The black swans of black swans

In terms of startups, the three markets I identified — better training, better mental health, and better crowdsourcing collaboration tools, particularly around data — collectively represent a very compelling set of markets that will not only be valuable for founders, but can rapidly improve lives.

In his book Normal Accidents, Charles Perrow talks about how an increasing level of complexity and tight coupling in our modern technical systems all but guarantees that disasters will occur. Add in a warming world as well as the intensity, frequency, and just plain unusualness of disasters arriving each year, and we are increasingly seeing entirely novel forms of emergencies we have never responded to before. Take, most recently, the ultra-frigid conditions in Texas that sapped power from its grid, leading to blackouts that lasted hours across the state and days in some parts of it.

Clark-Ginsberg said, “We are seeing these risks emerge that aren’t just typical wildfires — where we have a response structure that we can easily setup and manage the hazard, [we’re] very good at managing these typical disasters. There are more of these atypical disasters cropping up, and we have a very hard time setting up structures for this — the pandemic is a great example of that.”

He describes these challenges as “trans-boundary risk management,” disasters that cross bureaucratic lines, professions, societies, and means of action. “It takes a certain agility and the ability to move quickly and the ability to work in ways outside typical bureaucratic structures, and that is just challenging full stop,” he said.


Even as we begin to have better point solutions to the individual problems that disasters and their responses require, we can’t neglect the more systemic challenges that these emergencies are bringing to the fore. We have to start thinking about bringing humans together faster and in more novel ways to be the most effective, while coupling them flexibly and with agility to the best tools that meet their needs in the moment. That’s probably not literally “a startup,” but more a way of thinking about what it means to construct a disaster response fresh given the information available.

Amanda Levin, a policy analyst at the Natural Resources Defense Council, said that “even if we mitigate, there are huge pressures and huge impacts today from a warming world … even if we stop emissions today, [they] will still persist.” As one of my interviewees in government service who asked to go unnamed noted about disaster response, “You always are coming up short somewhere.” The problems are only getting harder, and we humans need much better tools to match the trials we have created for ourselves. That’s the challenge — and opportunity — for a tough century ahead.

When the Earth is gone, at least the internet will still be working

The internet is now our nervous system. We are constantly streaming and buying and watching and liking, our brains locked into the global information matrix as one universal and coruscating emanation of thought and emotion.

What happens when the machine stops though?

It’s a question that E.M. Forster was intensely focused on more than a century ago in a short story called, rightly enough, “The Machine Stops,” about a human civilization connected entirely through machines that one day just turn off.

Those fears of downtime are not just science fiction anymore. Outages aren’t just missing a must-watch TikTok clip. Hospitals, law enforcement, the government, every corporation — the entire spectrum of human institutions that constitute civilization now deeply rely on connectivity to function.

So when it comes to disaster response, the world has dramatically changed. In decades past, the singular focus could be roughly summarized as rescue and mitigation — save who you can while trying to limit the scale of destruction. Today though, the highest priority is by necessity internet access, not just for citizens, but increasingly for the on-the-ground first responders who need bandwidth to protect themselves, keep abreast of their mission objectives, and have real-time ground truth on where dangers lurk and where help is needed.

While the sales cycles might be arduous as we learned in part one and the data trickles have finally turned to streams in part two, the reality is that none of that matters if there isn’t connectivity to begin with. So in part three of this series on the future of technology and disaster response, we’re going to analyze the changing nature of bandwidth and connectivity and how they intersect with emergencies, taking a look at how telcos are creating resilience in their networks while defending against climate change, how first responders are integrating connectivity into their operations, and finally, exploring how new technologies like 5G and satellite internet will affect these critical activities.

Wireless resilience as the world burns

Climate change is inducing more intense weather patterns all around the world, creating second- and third-order effects for industries that rely on environmental stability for operations. Few industries have to be as dynamic to the changing context as telecom companies, whose wired and wireless infrastructure is regularly buffeted by severe storms. Resiliency of these networks isn’t just needed for consumers — it’s absolutely necessary for the very responders trying to mitigate disasters and get the network back up in the first place.

Unsurprisingly, no issue looms larger for telcos than access to power — no juice, no bars. So all three of America’s major telcos — Verizon (which owns TechCrunch’s parent company Verizon Media, although not for much longer), AT&T and T-Mobile — have had to dramatically scale up their resiliency efforts in recent years to compensate both for the demand for wireless and the growing damage wrought by weather.

Jay Naillon, senior director of national technology service operations strategy at T-Mobile, said that the company has made resilience a key part of its network buildout in recent years, with investments in generators at cell towers that can be relied upon when the grid cannot. In “areas that have been hit by hurricanes or places that have fragile grids … that is where we have invested most of our fixed assets,” he said.

Like all three telcos, T-Mobile pre-deploys equipment in anticipation of disruptions. So when a hurricane begins to swirl in the Atlantic Ocean, the company will strategically fly in portable generators and mobile cell towers in anticipation of potential outages. “We look at storm forecasts for the year,” Naillon explained, and do “lots of preventative planning.” They also work with emergency managers and “run through various drills with them and respond and collaborate effectively with them” to determine which parts of the network are most at risk for damage in an emergency. Last year, the company partnered with StormGeo to accurately predict weather events.

Predictive AI for disasters is also a critical need for AT&T. Jason Porter, who leads public sector and the company’s FirstNet first-responder network, said that AT&T teamed up with Argonne National Laboratory to create a climate-change analysis tool to evaluate the siting of its cell towers and how they will weather the next 30 years of “floods, hurricanes, droughts and wildfires.” “We redesigned our buildout … based on what our algorithms told us would come,” he said, and the company has been elevating vulnerable cell towers four to eight feet high on “stilts” to improve their resiliency to at least some weather events. That “gave ourselves some additional buffer.”

AT&T has also had to manage the growing complexity of creating reliability with the chaos of a climate-change-induced world. In recent years, “we quickly realized that many of our deployments were due to weather-related events,” and the company has been “very focused on expanding our generator coverage over the past few years,” Porter said. It’s also been very focused on building out its portable infrastructure. “We essentially deploy entire data centers on trucks so that we can stand up essentially a central office,” he said, emphasizing that the company’s national disaster recovery team responded to thousands of events last year.

Particularly on its FirstNet service, AT&T has pioneered two new technologies to try to get bandwidth to disaster-hit regions faster. First, it has invested in drones to offer wireless services from the sky. After Hurricane Laura hit Louisiana last year with record-setting winds, our “cell towers were twisted up like recycled aluminum cans … so we needed to deploy a sustainable solution,” Porter described. So the company deployed what it dubs the FirstNet One — a “dirigible” that “can cover twice the cell coverage range of a cell tower on a truck, and it can stay up for literally weeks, refuel in less than an hour and go back up — so long-term, sustainable coverage,” he said.

AT&T’s FirstNet One dirigible to offer internet access from the air for first responders. Image Credits: AT&T/FirstNet

Secondly, the company has been building out what it calls FirstNet MegaRange — a set of high-powered wireless equipment that it announced earlier this year that can deploy signals from miles away, say from a ship moored off a coast, to deliver reliable connectivity to first responders in the hardest-hit disaster zones.

As the internet has absorbed more of daily life, the norms for network resilience have become ever more exacting. Small outages can disrupt not just a first responder, but a child taking virtual classes and a doctor conducting remote surgery. From fixed and portable generators to rapid-deployment mobile cell towers and dirigibles, telcos are investing major resources to keep their networks running continuously.

Yet, these initiatives are ultimately costs borne by telcos increasingly confronting a world burning up. Across conversations with all three telcos and others in the disaster response space, there was a general sense that utilities increasingly have to insulate themselves in a climate-changed world. For instance, cell towers need their own generators because — as we saw with Texas earlier this year — even the power grid itself can’t be guaranteed to be there. Critical applications need to have offline capabilities, since internet outages can’t always be prevented. The machine runs, but the machine stops, too.

The trend lines on the frontlines are data lines

While we may rely on connectivity in our daily lives as consumers, disaster responders have been much more hesitant to fully transition to connected services. It is precisely in the middle of a tornado, when the cell tower is down, that you realize a printed map might have been nice to have. Paper, pens, compasses — the old staples of survival flicks remain just as important in the field today as they were decades ago.

Yet, the power of software and connectivity to improve emergency response has forced a rethinking of field communications and how deeply technology is integrated on the ground. Data from the frontlines is extremely useful, and if it can be transmitted, dramatically improves the ability of operations planners to respond safely and efficiently.

Both AT&T and Verizon have made large investments in directly servicing the unique needs of the first responder community, with AT&T in particular gaining prominence with its FirstNet network, which it exclusively operates through a public-private partnership with the Department of Commerce’s First Responder Network Authority. The government offered a special spectrum license to the FirstNet authority in Band 14 in exchange for the buildout of a responder-exclusive network, a key recommendation of the 9/11 Commission, which found that first responders couldn’t communicate with each other on the day of those deadly terrorist attacks. Now, Porter of AT&T says that the company’s buildout is “90% complete” and is approaching 3 million square miles of coverage.

Why so much attention on first responders? The telcos are investing here because, in many ways, first responders are on the frontiers of technology. They need edge computing, rapid AI/ML decision-making, the bandwidth and latency of 5G (which we will get to in a bit) and high reliability. In general, they are also fairly profitable customers to boot. In other words, what first responders need today is what consumers in general are going to want tomorrow.

Cory Davis, director of public safety strategy and crisis response at Verizon, explained that “more than ever, first responders are relying on technology to go out there and save lives.” His counterpart, Nick Nilan, who leads product management for the public sector, said that “when we became Verizon, it was really about voice [and] what’s changed over the last five [years] is the importance of data.” He brings attention to tools for situational awareness, mapping, and more that are becoming standard in the field. Everything first responders do “comes back to the network — do you have the coverage where you need it, do you have the network access when something happens?”

The challenge for the telcos is that we all want access to that network when catastrophe strikes, which is precisely when network resources are most scarce. The first responder trying to communicate with their team on the ground or their operations center is inevitably competing with a citizen letting friends know they are safe — or perhaps just watching the latest episode of a TV show in their vehicle as they are fleeing the evacuation zone.

That competition is the argument for a completely segmented network like FirstNet, which has its own dedicated spectrum with devices that can only be used by first responders. “With remote learning, remote work and general congestion,” Porter said, telcos and other bandwidth providers were overwhelmed with consumer demand. “Thankfully we saw through FirstNet … clearing that 20 MHz of spectrum for first responders” helped keep the lines clear for high-priority communications.

FirstNet’s big emphasis is on its dedicated spectrum, but that’s just one component of a larger strategy to give first responders always-on and ready access to wireless services. AT&T and Verizon have made prioritization and preemption key operational components of their networks in recent years. Prioritization gives public safety users better access to the network, while preemption can include actively kicking off lower-priority consumers from the network to ensure first responders have immediate access.

Nilan of Verizon said, “The network is built for everybody … but once we start thinking about who absolutely needs access to the network at a period of time, we prioritize our first responders.” Verizon has prioritization, preemption, and now virtual segmentation — “we separate their traffic from consumer traffic” so that first responders don’t have to compete if bandwidth is limited in the middle of a disaster. He noted that all three approaches have been enabled since 2018, and Verizon’s suite of bandwidth and software for first responders comes under the newly christened Verizon Frontline brand that launched in March.
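To make the prioritization-versus-preemption distinction concrete, here is a minimal, purely illustrative Python sketch of how a scheduler might admit traffic under congestion. The class names, tiers and capacity figure are my own assumptions for demonstration, not how any carrier’s network actually implements these features.

    from dataclasses import dataclass, field

    @dataclass
    class Session:
        user_id: str
        tier: int  # 0 = first responder, 1 = general consumer (assumed tiers)

    @dataclass
    class CellSite:
        capacity: int                      # max concurrent sessions (toy number)
        active: list = field(default_factory=list)

        def admit(self, session: Session) -> bool:
            # Plain admission: if there is room, let anyone on.
            if len(self.active) < self.capacity:
                self.active.append(session)
                return True
            # Prioritization alone would stop here and queue the request.
            # Preemption goes further: drop a lower-priority session to
            # make room for a first responder.
            if session.tier == 0:
                for i, s in enumerate(self.active):
                    if s.tier > session.tier:
                        self.active.pop(i)           # kick a consumer session
                        self.active.append(session)  # seat the responder
                        return True
            return False

    # Usage: a full cell still admits a responder by preempting a consumer.
    site = CellSite(capacity=2)
    site.admit(Session("consumer-1", tier=1))
    site.admit(Session("consumer-2", tier=1))
    print(site.admit(Session("medic-7", tier=0)))  # True, one consumer preempted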

With increased bandwidth reliability, first responders are increasingly connected in ways that even a decade ago would have been unfathomable. Tablets, sensors, connected devices and tools — equipment that would once have been manual is now increasingly digital.

That opens up a wealth of possibilities now that the infrastructure is established. My interview subjects suggested applications as diverse as the decentralized coordination of response team movements through GPS and 5G; real-time updated maps that offer up-to-date risk analysis of how a disaster might progress; pathfinding for evacuees that’s updated as routes fluctuate; AI damage assessments even before the recovery process begins; and much, much more. In fact, many of those imaginative possibilities, which in the past have only ever been marketing-speak and technical promises, will finally be realized in the coming years.

Five, Gee

We’ve been hearing about 5G for years now, and even 6G every once in a while just to cause reporters heart attacks, but what does 5G even mean in the context of disaster response? After years of speculation, we are finally starting to get answers.

Naillon of T-Mobile noted that the biggest benefit of 5G is that it “allows us to have greater coverage” particularly given the low-band spectrum that the standard partially uses. That said, “As far as applications — we are not really there at that point from an emergency response perspective,” he said.

Meanwhile, Porter of AT&T said that “the beauty of 5G that we have seen there is less about the speed and more about the latency.” Consumers have often seen marketing around voluminous bandwidths, but in the first-responder world, latency and edge computing tend to be the most desirable features. For instance, devices can relay video to each other on the frontlines, without necessarily needing a backhaul to the main wireless network. On-board processing of image data could allow for rapid decision-making in environments where seconds can be vital to the success of a mission.
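As a rough illustration of the edge-computing pattern described above, here is a small, hypothetical Python sketch in which frames are scored on the device and only compact alerts cross the network, rather than a constant raw-video backhaul. The detector, threshold and payload format are stand-ins, not a real first-responder system.

    import json

    def detect_hazard(frame: bytes) -> float:
        """Stand-in for an on-device model; returns a hazard score in [0, 1]."""
        # A real deployment would run an accelerated vision model here.
        return 0.92 if b"smoke" in frame else 0.05

    def process_on_edge(frames, send_alert, threshold=0.8):
        """Score frames locally; transmit only small alert payloads."""
        for i, frame in enumerate(frames):
            score = detect_hazard(frame)
            if score >= threshold:
                # Kilobytes of metadata instead of megabytes of video.
                send_alert(json.dumps({"frame": i, "score": score}))

    # Usage with a fake camera feed and a print-based "uplink".
    feed = [b"clear", b"clear", b"smoke plume ahead"]
    process_on_edge(feed, send_alert=print)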

That flexibility is allowing for many new applications in disaster response, and “we are seeing some amazing use cases coming out of our 5G deployments [and] we have launched some of our pilots with the [Department of Defense],” Porter said. He offered an example of “robotic dogs to go and do bomb dismantling or inspecting and recovery.”

Verizon has made innovating on new applications a strategic goal, launching a 5G First Responders Lab dedicated to guiding a new generation of startups to build at this crossroads. Nilan of Verizon said that the incubator has had more than 20 companies across four different cohorts, working on everything from virtual reality training environments to AR applications that allow firefighters to “see through walls.” His colleague Davis said that “artificial intelligence is going to continue to get better and better and better.”

Blueforce is a company that went through the first cohort of the Lab. The company uses 5G to connect sensors and devices together to allow first responders to make the best decisions they can with the most up-to-date data. Michael Helfrich, founder and CEO, said that “because of these new networks … commanders are able to leave the vehicle and go into the field and get the same fidelity” of information that they normally would have to be in a command center to receive. He noted that in addition to classic user interfaces, the company is exploring other ways of presenting information to responders. “They don’t have to look at a screen anymore, and [we’re] exploring different cognitive models like audio, vibration and heads-up displays.”

5G will offer many new ways to improve emergency responses, but that doesn’t mean that our current 4G networks will just disappear. Davis said that many sensors in the field don’t need the kind of latency or bandwidth that 5G offers. “LTE is going to be around for many, many more years,” he said, pointing to the hardware and applications taking advantage of LTE-M standards for Internet of Things (IoT) devices as a key development for the future here.

Michael Martin of emergency response data platform RapidSOS said that “it does feel like there is renewed energy to solve real problems,” in the disaster response market, which he dubbed the “Elon Musk effect.” And that effect definitely does exist when it comes to connectivity, where SpaceX’s satellite bandwidth project Starlink comes into play.

Satellite uplinks have historically had horrific latency and bandwidth constraints, making them difficult to use in disaster contexts. Furthermore, depending on the particular type of disaster, satellite uplinks can be astonishingly challenging to set up given the ground environment. Starlink promises to shatter all of those barriers — easier connections, fat pipes, low latencies and a global footprint that would be the envy of any first responder. Its network is still under active development, so it is difficult to foresee today precisely what its impact will be on the disaster response market, but it’s an offering to watch closely in the years ahead, because it has the potential to completely upend the way we respond to disasters this century if its promises pan out.

Yet, even if we discount Starlink, the change coming this decade in emergency response represents a complete revolution. The depth and resilience of connectivity is changing the equation for first responders from complete reliance on antiquated tools to an embrace of the future of digital computing. The machine is no longer stoppable.




Facebook’s hand-picked ‘oversight’ panel upholds Trump ban — for now

Facebook’s content decision review body, a quasi-external panel that’s been likened to a ‘Supreme Court of Facebook’ (though it isn’t staffed by sitting judges, can’t be truly independent of the tech giant which funds it, has no legal legitimacy or democratic accountability, and goes by the much duller official title ‘Oversight Board’, aka the FOB), has just made the biggest call of its short life…

Facebook’s hand-picked ‘oversight’ panel has voted against reinstating former U.S. president Donald Trump’s Facebook account.

However, it has sought to row the company back from an ‘indefinite’ ban, finding fault with its decision to impose an indefinite restriction rather than issue a more standard penalty (such as a penalty strike or permanent account closure).

In a press release announcing its decision the board writes:

Given the seriousness of the violations and the ongoing risk of violence, Facebook was justified in suspending Mr. Trump’s accounts on January 6 and extending that suspension on January 7.

However, it was not appropriate for Facebook to impose an ‘indefinite’ suspension.

It is not permissible for Facebook to keep a user off the platform for an undefined period, with no criteria for when or whether the account will be restored.

The board wants Facebook to revisit its decision on Trump’s account within six months and “decide the appropriate penalty”. So it appears to have succeeded in… kicking the can down the road.

The FOB is due to hold a press conference to discuss its decision shortly so stay tuned for updates.

This story is developing… refresh for updates…

It’s certainly been a very quiet five months on mainstream social media since Trump had his social media ALL CAPS megaphone unceremoniously shut down in the wake of his supporters’ violent storming of the Capitol.

For more on the background to Trump’s deplatforming do make time for this excellent explainer by TechCrunch’s Taylor Hatmaker. But the short version is that Trump finally appeared to have torched the last of his social media rule-breaking chances after he succeeded in fomenting an actual insurrection on U.S. soil on January 6. Doing so with the help of the massive, mainstream social media platforms whose community standards don’t, as a rule, give a thumbs up to violent insurrection…

Data was the new oil, until the oil caught fire

We’ve been hearing how “data is the new oil” for more than a decade now, and in certain sectors, it’s a maxim that has more than panned out. From marketing and logistics to finance and product, decision-making is now dominated by data at all levels of most big private orgs (and if it isn’t, I’d be getting a résumé put together, stat).

So it might be something of a surprise to learn that data, which could transform how we respond to the increasingly deadly disasters that regularly plague us, has been all but absent from much of emergency response this past decade. Far from being a geyser of digital oil, disaster response agencies and private organizations alike have for years tried to swell the scope and scale of the data being fed into disaster response, with relatively meager results.

That’s starting to change though, mostly thanks to the internet of things (IoT), and frontline crisis managers today increasingly have the data they need to make better decisions across the resilience, response, and recovery cycle. The best is yet to come: with drones flying up, simulated visualizations and even artificial intelligence-induced disasters on the horizon, what we’re seeing today on the frontlines is only the beginning of what could be a revolution in disaster response in the 2020s.

The long-awaited disaster data deluge has finally arrived

Emergency response is a fight against the fog of war and the dreadful ticking of the clock. In the midst of a wildfire or hurricane, everything can change in a matter of seconds — even milliseconds if you aren’t paying attention. Safe roads ferrying evacuees can suddenly become impassable infernos, evacuation teams can reposition and find themselves spread far too thin, and unforeseen conditions can rapidly metastasize to cover the entire operating environment. An operations center that once had perfect information can quickly find it has no ground truth at all.

Unfortunately, even getting raw data on what’s happening before and during a disaster can be extraordinarily difficult. When we look at the data revolution in business, part of the early success stems from the fact that companies were always heavily reliant on data to handle their activities. Digitalization was and is the key word: moving from paper to computers in order to transform latent raw data into a form that was machine-readable and therefore analyzable. In business, the last ten years were basically about upgrading from version one to version two.

In emergency management however, many agencies are stuck without a version at all. Take a flood — where is the water and where is it going? Up until recently, there was no comprehensive data on where waters rose from and where they sloshed to. When it came to wildfires, there were no administrative datasets on where every tree in the world was located and how prone each is to fire. Even human infrastructure like power lines and cell towers often had little interface with the digital world. They stood there, and if you couldn’t see them, they couldn’t see you.

Flood modeling is on the cutting edge of disaster planning and response. Image Credits: CHANDAN KHANNA/AFP via Getty Images

Models, simulations, predictions, analysis: all of these are useless without raw data, and in the disaster response realm, there was no detailed data to be found.

After years of promising an Internet of Things (IoT) revolution, things are finally internet-izing, with IoT sensors increasingly larding up the American and world landscape. Temperature, atmospheric pressure, water levels, humidity, pollution, power, and other sensors have been widely deployed, emitting constant streams of data back into data warehouses ready for analysis.

Take wildfires in the American West. It wasn’t all that long ago that the U.S. federal government and state firefighting agencies had no knowledge of where a blaze was taking place. Firefighting has been “100 years of tradition unimpeded by progress,” as Tom Harbour, head of fire response for a decade at the U.S. Forest Service and now chief fire officer at Cornea, put it.

And he’s right. After all, firefighting is a visceral activity — responders can see the fires, even feel the burning heat echoing off of their flesh. Data wasn’t useful, particularly in the West where there are millions of acres of land and large swaths are sparsely populated. Massive conflagrations could be detected by satellites, but smoldering fires in the brush would be entirely invisible to the geospatial authorities. There’s smoke over California — exactly what is a firefighter on the ground supposed to do with such valuable information?

Today after a decade of speculative promise, IoT sensors are starting to clear a huge part of this fog. Aaron Clark-Ginsberg, a social scientist at RAND Corporation who researches community resilience, said that air quality sensors have become ubiquitous since they are “very cheap [and] pretty easy to use” and can offer very fine-grained understandings of pollution — a key signal, for instance, of wildfires. He pointed to the company Purple Air, which in addition to making sensors, also produces a popular consumer map of air quality, as indicative of the potential these days for technology.
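For a sense of how such a pollution signal might be consumed downstream, here is a tiny, illustrative Python sketch that flags a sustained spike in PM2.5 readings as a possible smoke event. The threshold and window are arbitrary assumptions for demonstration, not PurpleAir’s methodology.

    from collections import deque

    def smoke_alerts(readings, threshold=100.0, window=3):
        """Yield indices where PM2.5 stays above a threshold for `window` samples.

        `readings` is a sequence of PM2.5 values in µg/m³ (illustrative units).
        """
        recent = deque(maxlen=window)
        for i, value in enumerate(readings):
            recent.append(value)
            if len(recent) == window and min(recent) > threshold:
                yield i  # sustained spike: a candidate wildfire-smoke signal

    # Usage with made-up sensor values.
    samples = [12, 15, 30, 140, 180, 220, 90, 40]
    print(list(smoke_alerts(samples)))  # [5]: third consecutive high reading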

Maps are the critical intersection for data in disasters. Geospatial information systems (GIS) form the basis for most planning and response teams, and no company has a larger footprint in the sector than privately-held Esri. Ryan Lanclos, who leads public safety solutions at the company, pointed to the huge expansion of water sensors as radically changing responses to certain disasters. “Flood sensors are always pulsing,” he said, and with a “national water model coming out of the federal government,” researchers can now predict through GIS analysis how a flood will affect different communities with a precision unheard of previously.

Digital maps and GIS systems are increasingly vital for disaster planning and response, but paper still remains quite ubiquitous. Image Credits: Paul Kitagaki Jr.-Pool/Getty Images

Cory Davis, the director of public safety strategy and crisis response at Verizon (which, through our parent company Verizon Media, is TechCrunch’s ultimate owner), said that all of these sensors have transformed how crews work to maintain infrastructure as well. “Think like a utility that is able to put a sensor on a power line — now they have sensors and get out there quicker, resolve it, and get the power back up.”

He noted one major development that has transformed sensors in this space the last few years: battery life. Thanks to continuous improvements in ultra-low-power wireless chips as well as better batteries and energy management systems, sensors can last a really long time in the wilderness without the need for maintenance. “Now we have devices that have ten-year battery lives,” he said. That’s critical, because it can be impossible to connect these sensors to the power grid in frontier areas.

The same line of thinking holds true at T-Mobile as well. When it comes to preventative planning, Jay Naillon, senior director of national technology service operations strategy at the telco, said that “the type of data that is becoming more and more valuable for us is the storm surge data — it can make it easier to know we have the right assets in place.” That data comes from flood sensors that can offer real-time warning signals to planners across the country.

Telecom interest — and commercial interest in general — has been critical to accelerating the adoption of sensors and other data streams around disasters. While governments may be the logical end user of flood or wildfire data, they aren’t the only ones interested in this visibility. “A lot of consumers of that information are in the private sector,” said Jonathan Sury, project director at the National Center for Disaster Preparedness at the Earth Institute at Columbia University. “These new types of risks, like climate change, are going to affect their bottom lines,” and he pointed to bond ratings, insurance underwriting and other areas where commercial interest in sensor data has been profound.

Sensors may not literally be ubiquitous, but they have offered a window into conditions that emergency managers have never had visibility into before.

Finally, there are the extensive datasets around mobile usage that have become ubiquitous throughout much of the world. Facebook’s Data for Good project, for instance, provides data layers around connectivity — are users connecting from one place and then later connecting from a different location, indicating displacement? That sort of data from the company and telcos themselves can help emergency planners scout out how populations are shifting in real-time.
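A toy Python sketch of that kind of displacement signal might look like the following. The record format, dates and city names are invented for illustration, and real programs such as Data for Good apply aggregation and privacy protections that this simple example ignores.

    from collections import Counter

    # (user_id, timestamp, location) connection records, invented for illustration.
    records = [
        ("u1", "2021-08-01", "Lake Charles"), ("u1", "2021-08-30", "Houston"),
        ("u2", "2021-08-02", "Lake Charles"), ("u2", "2021-08-29", "Lake Charles"),
        ("u3", "2021-08-03", "Lake Charles"), ("u3", "2021-08-28", "Baton Rouge"),
    ]

    def displacement_counts(records, cutoff):
        """Count users whose last-seen location after `cutoff` differs from before."""
        before, after = {}, {}
        for user, ts, loc in sorted(records, key=lambda r: r[1]):
            (before if ts < cutoff else after)[user] = loc
        moved = Counter()
        for user, origin in before.items():
            dest = after.get(user)
            if dest and dest != origin:
                moved[(origin, dest)] += 1
        return moved

    # Usage: two of three users connect from a new city after the storm date.
    print(displacement_counts(records, cutoff="2021-08-27"))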

Data, data, on the wall — how many AIs can they call?

Rivulets of data have now turned into floods of information, but just like floodwaters rising in cities across the world, the data deluge now needs a response all its own. In business, the surfeit of big data has been wrangled with an IT stack from data warehouses all the way to business intelligence tools.

If only data for disasters could be processed so easily. Data relevant for disasters is held by dozens of different organizations spanning the private, public, and non-profit sectors, leading to huge interoperability problems. Even when the data can be harmonized, there are large challenges in summarizing the findings down to an actual decision a frontline responder can use in their work — making AI a tough sale still today, particularly outside of planning. As Davis of Verizon put it, “now that they have this plethora of data, a lot of cities and federal agencies are struggling with how to use it.”

Unfortunately, standardization is a challenge at all scales. Globally, countries mostly lack interoperability, although standards are improving over time. Amir Elichai, the founder and CEO of 911 call-handling platform Carbyne, said that “from a technology standpoint and a standards standpoint, there is a big difference between countries,” noting that protocols from one country often have to be completely rewritten to serve a different market.

Tom Cotter, director of emergency response and preparedness at health care disaster response organization Project HOPE, said that even setting up communications between responders can be challenging in an international environment. “Some countries allow certain platforms but not others, and it is constantly changing,” he said. “I basically have every single technology communication platform you can possibly have in one place.”

One senior federal emergency management official acknowledged that data portability has become increasingly key in procurement contracts for technology, with the government recognizing the need to buy commercially-available software rather than custom-designed software. That message has been picked up by companies like Esri, with Lanclos stating that “part of our core mission is to be open and … create data and to share that openly to the public or securely through open standards.”

For all its downsides though, the lack of interoperability can be ironically helpful for innovation. Elichai said that the “lack of standards is an advantage — you are not buying into a legacy standard,” and in some contexts where standards are lacking, quality protocols can be built with the assumption of a modern data workflow.

Even with interoperability though, the next challenge becomes data sanitation — and disaster data is dirty as … well, something. While sensor streams can be verified and cross-checked with other datasets, in recent years there has been a heavy increase in the quantity of citizen-submitted information that has to be carefully vetted before it is disseminated to first responders or the public.

With citizens having more access to smartphones than ever, emergency planners have to sanitize uploaded data in order to verify it and make it useful. Image Credits: TONY KARUMBA/AFP via Getty Images

Bailey Farren, CEO and co-founder of disaster communications platform Perimeter, said that “sometimes citizens have the most accurate and real-time information, before first responders show up — we want citizens to share that with …government officials.” The challenge is how to filter the quality goods from the unhelpful or malicious. Raj Kamachee, the CIO of Team Rubicon, a non-profit which assembles teams of volunteer military veterans to respond to natural disasters, said that verification is critical, and it’s a key element of the infrastructure he has built at the organization since joining in 2017. “We’ve gotten more people using it so more feedback [and] more data [is] coming through the pipes,” he said. “So creating a self-service, a very collaborative approach.”

With quality and quantity, the AI models should come, right? Well, yes and no.

Sury of Columbia wants to cool down at least some of the hype around AI. “The big caveat with all of these machine learning and big data applications is that they are not a panacea — they are able to process a lot of disparate information, [but] they’re certainly not going to tell us exactly what to do,” he said. “First responders are already processing a lot of information,” and they don’t necessarily need more guidance.

Instead, AI in disasters is increasingly focused on planning and resilience. Sury pointed to OneConcern, a resiliency planning platform, as one example of how data and AI can be combined in the disaster planning process. He also pointed to the CDC’s Social Vulnerability Index and risk tools from FEMA that integrate different data signals into scalar values that emergency planners use to optimize their contingency plans.
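As a worked illustration of folding different signals into a single scalar, here is a minimal Python sketch of a weighted composite index. The indicator names, weights and normalization are assumptions for demonstration only, not the CDC’s or FEMA’s actual methodology.

    def minmax(values):
        """Scale a list of raw values to the 0-1 range."""
        lo, hi = min(values), max(values)
        return [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]

    def composite_index(counties, indicators, weights):
        """Combine normalized indicators into one score per county."""
        normalized = {name: minmax([c[name] for c in counties]) for name in indicators}
        scores = []
        for i, county in enumerate(counties):
            score = sum(weights[name] * normalized[name][i] for name in indicators)
            scores.append((county["name"], round(score, 3)))
        return sorted(scores, key=lambda s: s[1], reverse=True)

    # Invented inputs: poverty rate (%), share of homes in a floodplain (%),
    # and hospital beds per 10,000 residents (more beds reduce vulnerability,
    # so that indicator gets a negative weight).
    counties = [
        {"name": "Alpha", "poverty": 22, "floodplain": 35, "beds": 12},
        {"name": "Beta",  "poverty": 9,  "floodplain": 10, "beds": 30},
        {"name": "Gamma", "poverty": 15, "floodplain": 50, "beds": 18},
    ]
    weights = {"poverty": 0.4, "floodplain": 0.4, "beds": -0.2}
    print(composite_index(counties, ["poverty", "floodplain", "beds"], weights))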

Yet, almost everyone I talked to was much more hesitant about the power of AI. As I discussed a bit in part one of this series regarding the disaster sales cycle, data tools have to be real-time and perfect every time given the lives that are on the line. Kamachee of Team Rubicon noted that when choosing tools, he avoids whiz-bang features and instead looks at the pure utility of individual vendors. “We go high tech, but we prepare for low tech,” he said, emphasizing that in disaster response, everything must be agile and adaptable to changing circumstances.

Elichai of Carbyne saw this pattern in his sales. There’s a “sensitivity in our market and the reluctance from time to time to adopt” new technologies, he said, but he acknowledged that “there is no doubt that AI at a certain point will provide benefits.”

Naillon of T-Mobile had similar views from the operator perspective, saying that “I can’t say that we really leverage AI very much” in the company’s disaster planning. Instead of AI as brain, the telecom company simply uses data and forecast modeling to optimally position equipment — no fancy GANs required.

Outside of planning, AI has helped in post-disaster recovery, and specifically around damage assessments. After a crisis transpires, assessments of infrastructure and private property have to be made in order for insurance claims to be filed and for a community to move forward. Art delaCruz, COO and president of Team Rubicon, noted that technology and a flourish of AI has helped significantly around damage assessments. Since his organization often helps rebuild communities in the course of its work, triaging damage is a critical element of its effective response strategy.

There’s a brighter future, other than that brightness from the sun that is going to burn us to a crisp, right?

So AI today is helping a bit with resilience planning and disaster recovery and not so much during emergency response itself, but there is certainly more to come across the entire cycle. Indeed, there is a lot of excitement about the future of drones, which are increasingly being used in the field, but there are long-term concerns about whether AI and data will ultimately cause more problems than they solve.

Drones would seem to have an obvious value for disaster response, and indeed, they have been used by teams to get additional aerial footage and context where direct access by responders is limited. Kamachee of Team Rubicon noted that on a mission in the Bahamas, response teams used drones to detect survivors, since major roads were blocked. The drones snapped images that were processed using AI, helping the team identify those survivors for evacuation. He described drones and their potential as “sexy; very, very cool.”

Aerial views from drones can give disaster response teams much better real-time information, particularly in areas where on-the-ground access is limited. Image Credits: Mario Tama/Getty Images
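As a rough illustration of how that kind of aerial triage can work, the sketch below runs an off-the-shelf object detector over a single drone frame and keeps only high-confidence “person” detections. It is a stand-in for the general technique, not the pipeline used in the Bahamas; real deployments would rely on models trained on overhead imagery and would join each detection to the frame’s GPS metadata. It assumes a recent torchvision (0.13+) and a hypothetical image file name.

```python
# Minimal sketch of flagging possible survivors in a drone frame with an
# off-the-shelf detector. This is a stand-in, not Team Rubicon's pipeline;
# production systems would use models trained on overhead imagery.
import torch
import torchvision
from torchvision.transforms.functional import to_tensor
from PIL import Image

PERSON_CLASS_ID = 1  # "person" in the COCO label set used by this model

model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()

def flag_people(image_path: str, min_confidence: float = 0.6) -> list[tuple[float, list[float]]]:
    """Return (confidence, [x1, y1, x2, y2]) for likely people in one frame."""
    frame = to_tensor(Image.open(image_path).convert("RGB"))
    with torch.no_grad():
        output = model([frame])[0]
    hits = []
    for label, score, box in zip(output["labels"], output["scores"], output["boxes"]):
        if label.item() == PERSON_CLASS_ID and score.item() >= min_confidence:
            hits.append((score.item(), box.tolist()))
    return hits

# Hypothetical usage: each detection would then be joined with the frame's GPS
# metadata so teams can prioritize search and evacuation routes.
detections = flag_people("drone_frame_0042.jpg")
print(f"{len(detections)} possible people found")
```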

Cotter of Project HOPE similarly noted that faster data processing translates to better responses. “Ultimately speed is what saves lives in these disasters,” he said. We’re “also able to manage more responses remotely [and] don’t have to send as many people downrange,” giving response teams more leverage in resource-constrained environments.

“I see more emergency management agencies using drone technology — search and rescue, aerial photography,” Davis of Verizon said, arguing that operators often have a mentality of “send a machine into a situation first.” He continued, arguing, “artificial intelligence is going to continue to get better and better and better [and] enable our first responders to respond more effectively, but also more efficiently and safer.”

With data flooding in from sensors and drones, and processed and verified better than ever, disaster response can improve, perhaps even fast enough to keep pace with Mother Nature’s increasingly deadly whims. Yet, there is one caveat: will the AI algorithms themselves cause new problems in the future?

Clark-Ginsburg of RAND, perhaps supplying that typical RANDian alternatives analysis, said that these solutions can also create problems themselves, “technological risks leading to disaster and the world of technology facilitating disaster.” These systems can break, they can make mistakes, and more ominously — they can be sabotaged to increase chaos and damage.

Bob Kerrey, a co-chair of the 9/11 Commission, former senator and governor of Nebraska, and currently the board chairman of Risk & Return, a disaster response VC fund and philanthropy I profiled recently, pointed to cybersecurity as increasingly a wild card in many responses. “There wasn’t a concept called zero days — let alone a market for zero days — in 2004 [when the 9/11 Commission was doing its work], and now there is.” With the 9/11 terrorist attacks, “they had to come here, they had to hijack planes … now you don’t need to hijack planes to damage the United States,” noting that hackers “can be sitting with a bunch of other guys in Moscow, in Tehran, in China, or even your mother’s basement.”

Data is a revolution in the making for disaster response, but it may well cause a whole second-order set of problems that didn’t exist before. What technology giveth, it also taketh away. The oil gushes, but then the well suddenly runs dry – or simply catches fire.




Is Washington prepared for a geopolitical ‘tech race’?

When Secretary of State Antony Blinken and National Security Advisor Jake Sullivan sat down with Chinese officials in Anchorage, Alaska for the first high-level bilateral summit of the new administration, it was not a typical diplomatic meeting. Instead of a polite but restrained diplomatic exchange, the two sides traded pointed barbs for almost two hours. “There is growing consensus that the era of engagement with China has come to an unceremonious close,” wrote Sullivan and Kurt Campbell, the Administration’s Asia czar who was also in attendance, back in 2019. How apt that they were present for that moment’s arrival.

A little more than one hundred days into the Biden Administration, there is no shortage of views on how it should handle this new era of Sino-American relations. From a blue-ribbon panel assembled by former Google Chairman Eric Schmidt to a Politico essay from an anonymous former Trump Administration official that consciously echoes (in both its name and its author’s anonymity) George Kennan’s famous “Long Telegram” laying out the theory of Cold War containment, to countless think tank reports, it seems everyone is having their say.

What is largely uncontroversial though is that technology is at the center of U.S.-China relations, and any competition with China will be won or lost in the digital and cyber spheres. “Part of the goal of the Alaska meeting was to convince the Chinese that the Biden administration is determined to compete with Beijing across the board to offer competitive technology,” wrote David Sanger in the New York Times shortly afterward.

But what, exactly, does a tech-centered China strategy look like? And what would it take for one to succeed?

Tech has brought Republicans and Democrats uneasily together

One encouraging sign is that China has emerged as one of the few issues on which even Democrats agree that President Trump had some valid points. “Trump really was the spark that reframed the entire debate around U.S.-China relations in DC,” says Jordan Schneider, a China analyst at the Rhodium Group and the host of the ChinaTalk podcast and newsletter.

While many in the foreign policy community favored some degree of cooperation with China before the Trump presidency, now competition – if not outright rivalry – is widely assumed. “Democrats, even those who served in the Obama Administration, have become much more hawkish,” says Erik Brattberg of the Carnegie Endowment for International Peace. Trump has caused “the Overton Window on China [to become] a lot narrower than it was before,” adds Schneider.

The U.S. delegation, led by Secretary of State Antony Blinken, faces its Chinese counterparts at the opening session of U.S.-China talks at the Captain Cook Hotel in Anchorage, Alaska on March 18, 2021. Image Credits: FREDERIC J. BROWN/POOL/AFP via Getty Images

As the U.S.-China rivalry has evolved, it has become more and more centered around competing philosophies on the use of technology. “At their core, democracies are open systems that believe in the free flow of information, whereas for autocrats, information is something to be weaponized and stifled in the service of the regime,” says Lindsay Gorman, Fellow for Emerging Technologies at the German Marshall Fund. “So it’s not too surprising that technology, so much of which is about how we store and process and leverage information, has become such a focus of the U.S.-China relationship and of the [broader] democratic-autocratic competition around the world.”

Tech touches everything now – and the stakes could not be higher. “Tech and the business models around tech are really ‘embedded ideology,’” says Tyson Barker of the German Council on Foreign Relations. “So what tech is and how it is used is a form of governance.”

What does that mean in practice? When Chinese firms expand around the world, Barker tells me, they bring their norms with them. So when Huawei builds a 5G network in Latin America, or Alipay is adopted for digital payments in Central Europe, or Xiaomi takes more market share in Southeast Asia, they are helping digitize those economies on Chinese terms using Chinese norms (as opposed to American ones). The implication is clear: whoever defines the future of technology will determine the rest of the twenty-first century.

That shifting balance has focused minds in Washington. “I think there is a strong bipartisan consensus that technology is at the core of U.S.-China competition,” says Brattberg. But, adds Gorman, “there’s less agreement on what the prescription should be.” While the Democratic experts now ascendant in Washington agree with Trump’s diagnosis of the China challenge, they believe in a vastly different approach from their Trump Administration predecessors.

Out, for instance, are restrictions on Chinese firms just for being Chinese. “That was one of the problems with Trump,” says Walter Kerr, a former U.S. diplomat who publishes the China Journal Review. “Trump cast broad strokes, targeting firms whether it was merited or not. Sticking it to the Chinese is not a good policy.”

Instead the focus is on inward investment – and outward cooperation.

Foreign policy is domestic policy

Democrats are first shoring up America domestically – in short, be strong at home to be strong abroad. “There’s no longer a bright line between foreign and domestic policy,” President Biden said in his first major foreign policy speech. “Every action we take in our conduct abroad, we must take with American working families in mind. Advancing a foreign policy for the middle class demands urgent focus on our domestic economic renewal.”

This is a particular passion of Jake Sullivan, Biden’s national security advisor, who immersed himself in domestic policy while he was Hillary Clinton’s chief policy aide during her 2016 presidential campaign. “We’ve reached a point where foreign policy is domestic policy, and domestic policy is foreign policy,” he told NPR during the transition.

Jake Sullivan, White House national security adviser, speaks during a news conference Image Credits: Jim Lo Scalzo/EPA/Bloomberg via Getty Images

This is increasingly important for technology, as concern grows that America is lagging behind on research and development. “We’re realizing that we’ve underinvested in the government grants and research and development projects that American companies [need] to become highly innovative in fields like quantum computing, AI, biotechnology, etc,” says Kerr.

“Rebuilding” or “sustaining” America’s “technological leadership” is a major theme of the Longer Telegram and is the very operating premise of the report of the China Strategy Group assembled by Eric Schmidt, former executive chairman of Alphabet, Google’s parent company, and the first chair of the Department of Defense’s Innovation Advisory Board. Those priorities have only become more important during the pandemic. It’s a question of “how do we orient the research system to fill in the industrial gaps that have been made very clear by the COVID crisis?” says Schneider of Rhodium.

While it hasn’t gone so far as to adopt a national industrial strategy, the Administration’s most ambitious officials are looking to prod along tech research in critical sectors. To that end, the National Security Council, which Sullivan runs, is reshaping itself around technology issues; Biden appointed the first deputy national security advisor focusing on technology issues as well as a high-profile senior director for technology. Their goal: to harness the same energy that drove the development of Silicon Valley during the Cold War into out-competing China.

That said, the ingredients to American (and Western) innovation aren’t exactly a secret: investment in education, research, and talent. “The West still has [most of] the universities, R&D and leading companies,” says Brattberg. “There’s still a lot of competitiveness and leverage.” Unsurprisingly, investing to retain that edge is a key theme of Biden’s $2 trillion infrastructure plan, which includes funds for basic research, supply chain support, broadband connectivity, and support for the semiconductor industry.

As almost anyone in Silicon Valley will tell you, a functioning and welcoming immigration system is a crucial ingredient, too. “The U.S. is at its best when it welcomes talent from around the world and gives people the tools to succeed and thrive here,” says Gorman. Whether the Biden Administration can strike a deal with Senate Republicans on comprehensive immigration reform – or even funding basic research – remains an open question, though. And even if it can succeed, American ingenuity is no longer sufficient on its own.

Team America

Whether it’s for talent or partnerships, the U.S.-China tech competition will be won overseas. Allies are “the most salient and straightforward way Biden can bring leverage to the table compared to Trump,” says Schneider.

Biden, Blinken, and other senior administration officials have loudly and repeatedly pronounced their preferences to work with democratic partners on international challenges, particularly in the Indo-Pacific region. It is no accident that Blinken and Sullivan’s meeting in Anchorage was preceded by a trip to Japan and South Korea, two of America’s closest allies in the region, and that Japanese Prime Minister Yoshihide Suga was the first foreign leader to visit Biden at the White House. “If you add the U.S. to the EU, Australia, Taiwan, and South Korea, you tilt the balance of economic heft and technological prowess back toward us,” he adds.

U.S. President Joe Biden and Prime Minister Yoshihide Suga of Japan hold a news conference in the Rose Garden of the White House on April 16, 2021. Image Credits: Doug Mills-Pool/Getty Images

The ground for Blinken and company is increasingly fertile. Chinese diplomats have been aggressive, if not downright condescending, toward countries they perceive as having slighted China. In one recent example, the Chinese embassy in Dublin sent a series of tweets targeting an Irish-British journalist couple who had been forced to relocate to Taiwan as a result of a harassment campaign over their critical coverage of China’s Uyghur policy in Xinjiang. This so-called ‘wolf warrior’ diplomacy (a reference to a jingoist action film) is prompting a backlash, and helping convince many policy elites in countries that had hoped to sit out a U.S.-China conflict that perhaps Washington’s China skeptics have a point.

This perhaps explains the proliferating alpha-numeric soup of coalitions and alliances being floated to secure a free and democratic internet for the future. There’s the D10, a secure supply chain network floated by British Prime Minister Boris Johnson, which adds Australia, India, and South Korea to the existing G7 countries (U.S., U.K., Canada, France, Italy, Germany, and Japan). Schmidt’s report calls for a T-12 (the D10 minus Italy plus Finland, Sweden, and Israel). Others look to expand existing technology-related groupings like the Five Eyes signals intelligence alliance of the U.S., U.K., Australia, Canada, and New Zealand, or harness burgeoning non-technical ones like the Quad. Gorman points to the significance of the news that the Quad itself – Australia, India, Japan, and the US – announced the creation of a working group on emerging technology at its first-ever (virtual) leaders summit in March.

Meanwhile, Senator Mark Warner, a Democrat from Virginia, has proposed a technology partnership to be run out of the State Department to coordinate with allies – including a $5 billion fund for research – with the explicit purpose of countering China.

International tech standards are increasingly not set by the West

Even if it can shepherd its allies, the U.S. still faces stiff international headwinds. The Trump Administration’s decision to withdraw from the Trans-Pacific Partnership, a trade deal negotiated by the Obama Administration with ten other Pacific Rim countries with the intent of setting trade standards in the Asia-Pacific, was taken as a sign that perhaps the U.S. pivot to Asia was less ambitious than advertised. The pact, rebranded as the Comprehensive and Progressive Agreement for Trans-Pacific Partnership (CPTPP), has continued without the U.S. – and now even China has expressed interest in joining.

Trump’s disdain for working within multilateral forums has also meant that Washington has essentially ceded the field of global technical standard-setting. Beijing has taken advantage, aggressively working the UN system so that Chinese officials now lead four of the 15 specialized UN agencies, including the two most focused on regulating technology: the International Telecommunication Union (ITU), which helps set global technical standards, and the World Intellectual Property Organization (WIPO), which is responsible for protecting intellectual property rights around the world.

China is also backing Russian efforts to rewrite internet governance. With Chinese support, Russia won a UN General Assembly vote in 2019 to start drafting a new cybercrime treaty. Their goal is to replace the U.S.-backed 2001 Budapest Convention on Cybercrime, which was created by democracies through the Council of Europe, with a treaty that one critic said would include provisions “likely to provide cover to authoritarian governments to persecute their political opponents.” Russia and China also unsuccessfully tried to use the (now Chinese-led) ITU to replace the Internet Corporation for Assigned Names and Numbers (ICANN), a private body of experts that governs internet domain names.

These efforts are all part of China Standards 2035, an explicit plan to internationalize standards to Chinese preferences in areas like 5G and the Internet of Things (IoT). As Emily de La Bruyère and Nathan Picarsic wrote on TechCrunch last year, “Beijing has spent the past two decades establishing influential footholds in multilateral bodies and targeted industrial areas. Now, it is using those footholds to set their rules – with them, to define the infrastructure of the future world.”

Hawks, doves, and U.S. divisions

Even within the new consensus on China, there are fissures on how to handle China itself.

On the hawkish side, the Schmidt Report concedes that “some degree of technological bifurcation is in U.S. interests.” But calibrating just how much is a difficult question. “It’s already a reality,” says Barker of the German Council on Foreign Relations. “The question is: how deep does the split have to be?”

Few argue for complete decoupling, says Brattberg, the Carnegie scholar who has written extensively on tech diplomacy. After all, many are loath to concede completely separate ‘free’ and ‘authoritarian’ internets. There are other implications as well: a “bipolar, bifurcated internet … would have some very serious adverse implications in terms of cost [and] a slowdown in innovation,” one former UK intelligence official told me last year.

The key is to pinpoint which specific technologies are essential to produce domestically. “To the extent we [decouple from China], we have to do it in a smart way,” says Gorman. “There’s a risk of going too far and hurting potential innovation in the U.S. So the debate going forward is going to be: How do you address true national security vulnerabilities without emulating an authoritarian approach that might say ‘just ban everything from a certain country.’”

And even if we can form a consensus at home, America’s allies are no less divided, as I wrote last year with regard to Huawei. While the debate over the Chinese company’s role in 5G has evolved, with both France and the U.K. (in a reversal) moving to phase out its kit, the debate over what role China should play economically and technologically in Europe is still very much alive.

The U.K. government is clear-eyed; in its Integrated Review of foreign and defense policy published in March, it acknowledged that China’s “growing international assertiveness … will pose an increasing risk to UK interests” and set an explicit goal for itself to be a third “science and tech superpower.” France, meanwhile, laid out an Indo-Pacific strategy backing the principle of a free and open Pacific, an explicit challenge to Chinese preferences.

But many are still equivocal. As Singapore’s Prime Minister Lee Hsien Loong wrote last year in Foreign Affairs, “Asian countries do not want to be forced to choose between the two.” Berlin made clear in its Indo-Pacific strategy last year that it was also reticent to make an outright choice. New Zealand, conscious of its important trade with China, is reluctant to expand the use of Five Eyes beyond intelligence sharing. Meanwhile, Italy endorsed China’s infrastructure-focused Belt and Road Initiative in 2019 and called the country a “strategic partner” last year. And the European Union moved forward on a trade deal with China late last year despite very public lobbying against it from the United States.

A world of tradeoffs

The challenge for the Biden Administration will be to assemble practical coalitions without asking allies and partners to make impossible choices. They will succeed if they can reframe the question. “In Europe, they don’t like ‘decoupling’ but they do like ‘diversification’,” says Brattberg. They also don’t like the idea of joining a U.S.-led alliance. Instead, he says, Washington should frame cooperation as “coalitions among like-minded democratic partners.”

For that to work, the U.S. will have to work out the bilateral issues it has with its allies first. “We need to be much more savvy on engaging directly with the EU on resolving issues like data transfers, digital taxation, and data privacy,” he said. “Digital sovereignty shouldn’t come at the expense of partnership with like-minded partners.”

Chancellor of Germany Angela Merkel gives a speech during the press conference at the end of her meeting with Chinese Premier Li Keqiang (not pictured) at The Great Hall Of The People on September 06, 2019 in Beijing, China. Image Credits: Andrea Verdelli-Pool/Getty Images

Nimbleness will be key – multiple experts told me it will be far better to create ad hoc coalitions on particular issues than to create a single fixed democratic tech alliance. This would have the benefit of keeping groupings tight without excluding countries with key expertise in particular areas (think Sweden and 5G or Taiwan and semiconductors). Washington should also take a collegiate approach, recognizing and respecting that its allies will not always be in lock-step on every aspect of relations with China. In other words, the U.S. shouldn’t let the perfect be the enemy of the good, as agreement most of the time on most issues is probably sufficient to create the momentum Washington needs.

The U.S. can still compete globally and widen the circle of like-minded countries, Gorman, the scholar at GMF, tells me, but it has to invest in them if they are going to build out their tech sectors in a way that is aligned with democratic values and standards. “It’s really about providing an attractive counteroffer,” she said.

Even if the United States retains its technological edge for the near future, Americans should start adjusting to a future where Silicon Valley’s dominance is no longer inevitable. Chinese technologists are pulling ahead in areas like 5G while Chinese firms are competing on price (mobile phones) and increasingly on quality (e-commerce) and innovation (see: TikTok). China also exerts enormous clout through its control of supply chains and rare earth metals as well as its vast customer base.

Perhaps China’s greatest leverage point is its looming presence over Taiwan. As long as Taiwan remains one of the leading manufacturers of semiconductors (chip giant TSMC manufactures 90% of the world’s most advanced chips), the world’s technology industry will be vulnerable to the precarity of cross-Strait relations.

Will technology become just another chip in the geopolitical game the U.S. and China are playing, then? The Biden Administration is more prepared than its predecessor to weigh the tradeoffs, Barker of the German Council on Foreign Relations tells me. But it’s unclear how Washington, so early in this administration, will prioritize technology issues if faced with the prospect of Chinese cooperation on other priorities.

After all, at any given moment, the U.S. (and its allies) must weigh a host of priorities vis-à-vis China. And for all of the downsides to its bellicosity, the Trump Administration’s fixation on a handful of issues gave it leverage: it was willing to ignore Uyghurs and other human rights abuses in order to get a trade deal (even if it was deeply flawed).

The Biden Administration, on the other hand, has not yet articulated any priorities at all. If the rhetoric from Washington can be believed, the White House thinks it can make progress on climate, Taiwan, trade, human rights, and any number of other areas, all at once. This on its own creates a vulnerability. As historian Niall Ferguson reminded us in a recent Bloomberg column, then National Security Advisor Henry Kissinger was outmaneuvered when he went to China in 1971 with a multi-issue agenda and China singularly focused on Taiwan.

Beijing’s diplomats, despite their wolf-warrior missteps, are still savvy negotiators. If they are allowed to do so, they will once again try to play different parts of the Administration against each other, conditioning progress on climate, for example, on a softening over geopolitics, as the Brookings scholar Thomas Wright has warned. In that light, it simply strains credulity that an ‘all of the above’ approach will work, especially when Biden’s wish list keeps the issues Trump cared about, like trade, 5G, and Taiwan, and adds those he ignored, like human rights, democracy, and climate change.

This is where America’s alliances may prove to be Biden’s hidden ace. If Biden can forge a common-enough front with a wide-enough spectrum of allies, the U.S. will be better able to withstand Chinese pressure to trade progress on one issue against another. Instead, forcing China to negotiate with the U.S. and its allies on an issue-by-issue basis may put Washington in a better position to succeed.

Of all the issues in America’s China portfolio, though, the tech race provides one extra advantage: for all the talk of industrial strategy, alliances, and diplomatic maneuvers, Washington is not the only or even primary actor involved. The Biden Administration can help set the rules, invest in basic research, and defend American interests abroad, but American innovation depends on its innovators – and there are still bountiful numbers of them tinkering away.

The most disastrous sales cycle in the world

Startups constantly talk about being mission-oriented, but it’s hard to take most of those messages seriously when the mission is optimizing cash flow for tax efficiency. However, a new generation of startups is emerging that are taking on some of the largest global challenges and bringing the same entrepreneurial grit, operational excellence, and technical brilliance to bear on actual missions — ones that may well save thousands of lives.

ClimateTech has been a huge beneficiary of this trend in general, but one small specialty has caught my eye: disaster response. It’s a category for software services that’s percolated for years with startups here and there, but now a new crop of founders is taking on the challenges of this space with renewed urgency and vigor.

As the elevator pitch would have it, disaster response is hitting hockey stick growth. 2020 was a brutal year, and in more ways than just the global COVID-19 pandemic. The year also saw a record number of hurricanes, among the worst wildfire seasons in the Western United States, and several megastorms all across the world. Climate change, urbanization, population growth, and poor response practices have combined to create some of the most dangerous conditions humanity has ever collectively faced.

I wanted to get a sense of what the disaster response market has in store this decade, so over the past few weeks, I have interviewed more than 30 startup founders, investors, government officials, utility execs and more to understand this new landscape and what’s changed. In this four-part series on the future of technology and disaster response, to be published this weekend and next, we’ll look at the sales cycle in this market, how data is finally starting to flow into disaster response, how utilities and particularly telcos are dealing with internet access issues, and how communities are redefining disaster management going forward.

Before we get into all the tech developments in disaster response and resilience though, it’s important to ask a basic question: if you build it, will they come? The resounding answer from founders, investors, and government procurement officials was simple: no.

In fact, in all my conversations for this series, the hell of the emergency management sales cycle came up repeatedly, with more than one individual describing it as possibly the toughest sale that any company could make in the entire world. That view might be surprising in a market that easily runs into the tens of billions of dollars if the budgets for procurement are aggregated across local, state, federal, and international governments. Yet, as we will see, the unique dynamics of this market make almost any traditional sales approach useless.

Despite that pessimism though, sales are not impossible, and a new crop of startups is piercing the barriers to entry in this market. We’ll look at the sales and product strategies that startups are increasingly relying on today to break through.

The sale from hell

Few will be surprised that government sales are hard. Generations of govtech startup founders have learned that slow sales cycles, byzantine procurement processes, cumbersome verification and security requirements, and a general lassitude among contract officers make for a tough battlefield on which to close revenue. Many government agencies now have programs specifically to onboard startups, having discovered just how hard it is for new innovations to run through their gauntlet.

Emergency management sales share all the same problems as other govtech sales, but then layer on about a half dozen more that take the sales cycle from exhausting to infernal.

The first and most painful is the dramatic seasonality of the sales in the emergency space. Many agencies that operate on seasonal disasters — think hurricanes, wildfires, winter storms, and more — often go through an “action” period where they respond to these disasters, and then transition into a “planning” period where they assess their performance, determine what changes are needed for next season, and consider what tools might be added or removed to increase the effectiveness of their responders.

Take Cornea and Perimeter, two startups in the wildfire response space that I profiled recently. Both of the teams described how they needed to think in terms of fire seasons when it came to product iteration and sales. “We took two fire seasons to beta test our technology … to solve the right problem the right way,” Bailey Farren, CEO and co-founder of Perimeter, said. “We actually changed our focus on beta testing during the [2019 California] Kincaid fire.”

In this way, disaster tech could be compared to edtech, where school technology purchases are often synchronized with the academic calendar. Miss the June through August window in the U.S. education system, and a startup is looking at another year before it will get another chance at the classroom.

Edtech might once have been the tougher sale, given that three-month needle to thread, but disaster response is getting more difficult every year. Climate change is exacerbating the length, severity, and damage caused by all types of disasters, which means that responding agencies that might have had six months or more out-of-season to plan in the past are sometimes working all year long just to respond to emergencies. That gives little time to think about what new solutions an agency needs to purchase.

Worse, unlike the standardized academic calendar, disasters are much less predictable these days as well. Flood and wildfire seasons, for instance, used to be relatively concentrated in certain periods of the year. Now, such emergencies can emerge practically year-round. That means that procurement processes can both start and freeze on a moment’s notice as an agency has to respond to its mission.

Seasonality doesn’t just apply to the sales cycle though — it also applies to the budgets of these agencies. While they are unfolding, disasters dominate the minds of citizens and politicians, but then we forget all about them until the next catastrophe. Unlike the annual consistency of other government tech spending, disaster tech funding often comes in waves.

One senior federal emergency management official, who asked not to be named since he wasn’t authorized to speak publicly, explained that consistent budgets and the ability to spend them quickly is quite limited during “blue sky days” (i.e. periods without a disaster), and agencies like his have to rely on piecing together supplementary disaster funds when Congress or state legislatures authorize additional financing. The best agencies have technological roadmaps on hand so that when extra funding comes in, they can use it immediately to realize their plans, but not all agencies have the technical planning resources to be that prepared.

Amir Elichai, the CEO and co-founder of Carbyne, a cloud-native platform for call handling in 911 centers, said that this wave of interest crested yet again with the COVID-19 pandemic last year, triggering huge increases in attention and funding around emergency response capabilities. “COVID put a mirror in front of government faces and showed them that ‘we’re not ready’,” he said.

Perhaps unsurprisingly, next-generation 911 services (typically dubbed NG911), which have been advocated for years by the industry and first responders, are looking at a major financing boost. President Biden’s proposed infrastructure bill would add $15 billion to upgrade 911 capabilities in the United States — funding that has been requested for much of the last decade. Just last year, a $12 billion variant of that bill failed in the Senate after passing the U.S. House of Representatives.

Sales are all about providing proverbial painkillers versus vitamins to customers, and one would expect that disaster response agencies looking to upgrade their systems would be very much on the painkiller side. After all, the fear and crisis surrounding these agencies and their work would seem to bring visceral attention to their needs.

Yet, that fear actually has the opposite effect in many cases, driving attention away from systematic technology upgrades in favor of immediate acute solutions. One govtech VC, who asked not to be named to speak candidly about the procurement process his companies go through, said that “we don’t want to paint the picture that the world is a scary and dangerous place.” Instead, “the trick is to be … focused on the safety side rather than the danger.” Safety is a much more prevalent and consistent need than sporadically responding to emergencies.

When a wave of funding finally gets approved though, agencies often have to scramble to figure out what to prioritize now that the appropriated manna has dropped from legislative heaven. Even when startups provide the right solutions, scrying which problems are going to get funded in a particular cycle requires acute attention to every customer.

Josh Mendelsohn, the managing partner at startup studio and venture fund Hangar, said that “the customers have no shortage of needs that they are happy to talk about … the hardest part is how you narrow the funnel — what are the problems that are most meritorious?” That merit can, unfortunately, evolve very rapidly as mission requirements change.

Let’s say all the stars line up though — the agencies have time to buy, they have a need, and a startup has the solution that they want. The final challenge that’s probably the toughest to overcome is simply the lack of trust that new startups have with agencies.

In my conversations with emergency response officials over the past few weeks, reliability unsurprisingly came up again and again. Responding to disasters is mission-critical work, and nothing can break in the field or in the operations center. Frontline responders still use paper and pens in lieu of tablets or mobile phones since they know that paper is going to work every single time and not run out of battery juice. The “move fast and break things” ethos of Silicon Valley is fundamentally incompatible with this market.

Seasonality, on-and-off funding, lack of attention, procurement scrambling, and acute reliability requirements combine to make emergency management sales among the hardest possible for a startup. That doesn’t even get into all the typical govtech challenges like integrating with legacy systems, the massive fragmentation of thousands of emergency response agencies littered across the United States and globally, and the fact that in many agencies, people aren’t that interested in change in the first place. As one individual in the space described how governments approach emergency technology, “a lot of departments are looking at it as maybe I can hit retirement before I have to deal with it.”

The strategies for breaking out of limbo

So the sales cycle is hell. Why, then, are VCs dropping money in the sector? After all, we’ve seen emergency response data platform RapidSOS raise $85 million just a few months ago, about the same time Carbyne raised $25 million. There are quite a few more startups at the earliest phases that have raised pre-seed and seed investment as well.

The key argument that nearly everyone in this sector agreed on is that founders (and their investors) have to throw away their private-sector sales playbooks and rebuild their approach from the bottom up to sell specifically to these agencies. That means devising entirely different strategies and tactics to secure revenue performance.

The first and most important approach is, in some respects, to not even start with a company at all, but rather to start learning what people in this field actually do. As the sales cycle perhaps indicates, disaster response is unlike any other work. The chaos, the rapidly changing environment, the multi-disciplinary teams and cross-agency work that has to take place for a response to be effective have few parallels to professional office work. Empathy is key here: the responder that uses paper might have nearly lost their life in the field when their device failed. A 911 center operator may have listened to someone perish in real-time as they scrambled to find the right information from a software database.

In short, it’s all about customer discovery and development. That’s not so different from the enterprise world, but patience radiated out of many of my conversations with industry participants. It just takes more time — sometimes multiple seasons — to figure out precisely what to build and how to sell it effectively. If an enterprise SaaS product can iterate to market-fit in six months, it might take two to three years in the government sector to reach an equivalent point.

Michael Martin of RapidSOS said, “There is no shortcut to doing customer discovery work in public service.” He noted that “I do think there is a real challenge between the arrogance of the Silicon Valley tech community and the reality of these challenges” in public safety, a gap that has to be closed if a startup wants to find success. Meanwhile, Bryce Stirton, president and co-founder of public-safety company Responder Corp, said that “The end user is the best way to look at all the challenges … what are all the boxes the end user has to check to use a new technology?”

Mendelsohn of Hangar said that founders need to answer some tough questions in that process. “Ultimately, what are your entry points,” he asked. “Cornea has had to go through that customer discovery process … it all feels necessary, but what are the right things that require the least amount of behavior change to have impact immediately?”

Indeed, that process is appreciated on the other side as well. The federal emergency management official said, “everyone has a solution, but no one asked me about my problem.” Getting the product right and having it match the unique work that takes place in this market is key.

Let’s say you have a great product though — how do you get it through the perilous challenges of the procurement process? Here, answers differed widely, and they offer multiple strategies on how to approach the problem.

Martin of RapidSOS said that “government does not have a good model for procuring new services to solve problems.” So, the company chose to make its services free for government. “In three years, we went from no agencies using our stuff to all agencies using our stuff, and that was based on not making it a procurement problem,” he said. The company’s business model is based on having paid corporate partners who want to integrate their data into 911 centers for safety purposes.

That’s a similar model used by MD Ally, which received a $3.5 million seed check from General Catalyst this past week. The company adds telehealth referral services into 911 dispatch systems, and CEO and founder Shanel Fields emphasized that she saw an opportunity to create a revenue engine from the physician and mental health provider side of her market while avoiding government procurement.

Outside of what might be dubbed “Robinhood for government” (aka, just offering a service for free), another approach is to link up with more well-known and trusted brand names to offer a product that has the innovation of a startup but the reliability of an established player. Stirton of Responder said “we learned in [this market] that it takes more than just capital to get companies started in this space.” What he found worked was building private-sector partnerships to bring a joint offering to governments. For instance, he noted cloud providers Amazon Web Services and Verizon have good reputations with governments and can get startups over procurement hurdles (TechCrunch is owned by Verizon Media, which is owned by Verizon).

Elichai of Carbyne notes that much of his company’s sales happen through integration partners, referencing CenterSquare as one example. For 911 services, “The U.S. market is obviously the most fragmented” and so partners allow the company to avoid selling to thousands of different agencies. “We are usually not selling direct to governments,” he said.

Partners can also help deal with the problem of localism in emergency procurement: many government agencies don’t know precisely what to buy, so they simply buy software that is offered by companies in their own backyard. Partners can offer a local presence while also allowing a startup to have a nimble national footprint.

Another angle on partners is building out a roster of experienced but retired government executives who can give credibility to a startup through their presence and networks. Even more than in enterprise, government officials, particularly in emergency management, have to work and trust one another given the closely-coupled work that they perform. Hearing a positive recommendation from a close contact down the street can readily change the tenor of a sales conversation.

Finally, as much as emergency management software is geared for governments, private sector companies increasingly have to consider much of the same tooling to protect their operations. Many companies have distributed workforces, field teams, and physical assets they need to protect, and often have to respond to disasters in much the same way that governments do. For some startups, it’s possible to bootstrap in the private sector early on while continuing to assiduously develop public sector relationships.

In short, a long-term customer development program coupled with quality partnerships and joint offerings while not forgetting the private sector offers the best path for startups to break through into these agencies.

The good news is that the hard work can be rewarded. Not only are there serious dollars that flow through these agencies, but the agencies themselves know that they need better technology. Tom Harbour, who is chief fire officer at Cornea and formerly national director of fire management at the U.S. Forest Service, notes that “These are billions of dollars we spend … and we know we can be more efficient.” Government doesn’t always make it easy to create efficiency, but for the founders willing to go the distance, they can build impactful, profitable, and mission-driven companies.

Daily Crunch: Europe charges Apple with antitrust breach

Apple faces an antitrust complaint in Europe, TikTok has a new CEO and YouTube TV disappears from Roku. This is your Daily Crunch for April 30, 2021.

Also, this is my last day at TechCrunch, and therefore my last day writing The Daily Crunch. It’s been a blast rounding up the news for all of you, and thank you to everyone who took the time to tell me they enjoyed the newsletter.

On Monday, TechCrunch will be debuting a more collaborative approach to The Daily Crunch — stay tuned!

The big story: Europe charges Apple with antitrust breach

The European Commission has filed preliminary charges against Apple, focusing on complaints by Spotify that Apple’s App Store policies — particularly its requirements around in-app purchase — are anti-competitive.

“The Commission takes issue with the mandatory use of Apple’s own in-app purchase mechanism imposed on music streaming app developers to distribute their apps via Apple’s App Store,” it wrote. “The Commission is also concerned that Apple applies certain restrictions on app developers preventing them from informing iPhone and iPad users of alternative, cheaper purchasing possibilities.”

Apple has 12 weeks to respond to the charges.

The tech giants

ByteDance CFO assumes role as new TikTok CEO — Eight months after former TikTok CEO Kevin Mayer quit in the midst of a full-court press from the Trump administration, TikTok finally has a new permanent leader.

Roku removes YouTube TV from its channel store following failed negotiations — Earlier this week, Roku warned customers that the YouTube TV app may be removed from its streaming media players and TVs, and it alleged that Google was leveraging its monopoly power during contract negotiations to ask for unfair terms.

Computer vision inches toward ‘common sense’ with Facebook’s latest research — One development Facebook has pursued in particular is what’s called “semi-supervised learning.”

Startups, funding and venture capital

Developer-focused video platform Mux achieves unicorn status with $105M funding — “I think video’s eating software, the same way software was eating the world 10 years ago.”

As concerns rise over forest carbon offsets, Pachama’s verified offset marketplace gets $15M — The startup is building a marketplace for forest carbon credits that it says is more transparent and verifiable thanks to its use of satellite imagery and machine learning technologies.

Heirlume raises $1.38M to remove the barriers of trademark registration for small businesses — Heirlume’s machine-powered trademark registration platform turns the process into a self-serve affair that won’t break the budget.

Advice and analysis from Extra Crunch

Optimism reigns at consumer trading services as fintech VC spikes and Robinhood IPO looms — But services that help consumers trade might need to retool their models over time to ensure long-term income.

Amid the IPO gold rush, how should we value fintech startups? — If there has ever been a golden age for fintech, it surely must be now.

The health data transparency movement is birthing a new generation of startups — Twin struggles seem to be taking place: a push for more transparency on provider and payer data, and another for strict privacy protection for personal patient data.

(Extra Crunch is our membership program, which helps founders and startup teams get ahead. You can sign up here.)

Everything else

Cloud infrastructure market keeps rolling in Q1 with almost $40B in revenue — That’s up $2 billion from last quarter and up 37% over the same period last year.

The second shot is kicking in — A new episode of the Webby-nominated Equity podcast.

Pitch your startup to seasoned tech leaders, and a live audience, on Extra Crunch Live — We’re bringing the pitch-off format to Extra Crunch Live.

The Daily Crunch is TechCrunch’s roundup of our biggest and most important stories. If you’d like to get this delivered to your inbox every day at around 3pm Pacific, you can subscribe here.

Europe charges Apple with antitrust breach, citing Spotify App Store complaint

The European Commission has announced that it has issued formal antitrust charges against Apple, saying today that its preliminary view is that Apple’s App Store rules distort competition in the market for music streaming services by raising the costs of competing music streaming app developers.

The Commission began investigating competition concerns related to the iOS App Store (and also Apple Pay) last summer.

“The Commission takes issue with the mandatory use of Apple’s own in-app purchase mechanism imposed on music streaming app developers to distribute their apps via Apple’s App Store,” it wrote today. “The Commission is also concerned that Apple applies certain restrictions on app developers preventing them from informing iPhone and iPad users of alternative, cheaper purchasing possibilities.”

The statement of objections focuses on two rules that Apple imposes in its agreements with music streaming app developers: Namely the mandatory requirement to use its proprietary in-app purchase system (IAP) to distribute paid digital content (with the Commission noting that it charges a 30% commission fee on all such subscriptions bought via IAP); and ‘anti-steering provisions’ which limit the ability of developers to inform users of alternative purchasing options.

“The Commission’s investigation showed that most streaming providers passed this fee [Apple’s 30% cut] on to end users by raising prices,” it wrote, adding: “While Apple allows users to use music subscriptions purchased elsewhere, its rules prevent developers from informing users about such purchasing possibilities, which are usually cheaper. The Commission is concerned that users of Apple devices pay significantly higher prices for their music subscription services or they are prevented from buying certain subscriptions directly in their apps.”
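The arithmetic behind that pass-through is straightforward: to keep the same net revenue per subscriber under a 30% commission, a developer has to gross the price up by dividing by 0.7. A back-of-the-envelope sketch follows; the $9.99 figure is illustrative, not any particular service’s pricing.

```python
# Back-of-the-envelope only: how a 30% in-app purchase commission translates
# into a higher sticker price if a developer wants the same net revenue.
# The $9.99 base price is illustrative, not any particular service's economics.
commission = 0.30
web_price = 9.99

net_needed = web_price                        # revenue kept when selling on its own site
in_app_price = net_needed / (1 - commission)  # gross price needed inside the app

print(f"In-app price to net ${net_needed:.2f}: ${in_app_price:.2f}")
# -> roughly $14.27, which is why some services have historically charged a few
#    dollars more for subscriptions bought through iOS.
```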

Commenting in a statement, EVP and competition chief Margrethe Vestager, added: “App stores play a central role in today’s digital economy. We can now do our shopping, access news, music or movies via apps instead of visiting websites. Our preliminary finding is that Apple is a gatekeeper to users of iPhones and iPads via the App Store. With Apple Music, Apple also competes with music streaming providers. By setting strict rules on the App store that disadvantage competing music streaming services, Apple deprives users of cheaper music streaming choices and distorts competition. This is done by charging high commission fees on each transaction in the App store for rivals and by forbidding them from informing their customers of alternative subscription options.”

Apple sent us this statement in response:

“Spotify has become the largest music subscription service in the world, and we’re proud for the role we played in that. Spotify does not pay Apple any commission on over 99% of their subscribers, and only pays a 15% commission on those remaining subscribers that they acquired through the App Store. At the core of this case is Spotify’s demand they should be able to advertise alternative deals on their iOS app, a practice that no store in the world allows. Once again, they want all the benefits of the App Store but don’t think they should have to pay anything for that. The Commission’s argument on Spotify’s behalf is the opposite of fair competition.”

Spotify’s founder, Daniel Ek, has also responded to the news of the Commission’s charges against Apple with a jubilant tweet — writing: “Today is a big day. Fairness is the key to competition… we are one step closer to creating a level playing field, which is so important for the entire ecosystem of European developers.”

Vestager is due to hold a press conference shortly — so stay tuned for updates.


A number of complaints against Apple’s practices have been lodged with the EU’s competition division in recent years — including by music streaming service Spotify; video games maker Epic Games; and messaging platform Telegram, to name a few of the complainants who have gone public (and been among the most vocal).

The main objection is over the (up to 30%) cut Apple takes on sales made through third parties’ apps — which critics rail against as an ‘Apple tax’ — as well as how it can mandate that developers do not inform users how to circumvent its in-app payment infrastructure, i.e. by signing up for subscriptions via their own website instead of through the App Store. Other complaints include that Apple does not allow third party app stores on iOS.

Apple, meanwhile, has argued that its App Store does not constitute a monopoly. iOS’ global market share of mobile devices is a little over 10% vs Google’s rival Android OS — which is running on the lion’s share of the world’s mobile hardware. But monopoly status depends on how a market is defined by regulators (and if you’re looking at the market for iOS apps then Apple has no competitors).

The iPhone maker also likes to point out that the vast majority of third party apps pay it no commission (as they don’t monetize via in-app payments). It argues, too, that restrictions on native apps are necessary to protect iOS users from threats to their security and privacy.

Last summer the European Commission said its App Store probe was focused on Apple’s mandatory requirement that app developers use its proprietary in-app purchase system, as well as restrictions applied on the ability of developers to inform iPhone and iPad users of alternative cheaper purchasing possibilities outside of apps.

It also said it was investigating Apple Pay: Looking at the T&Cs and other conditions Apple imposes for integrating its payment solution into others’ apps and websites on iPhones and iPads, and also on limitations it imposes on others’ access to the NFC (contactless payment) functionality on iPhones for payments in stores.

The EU’s antitrust regulator also said then that it was probing allegations of “refusals of access” to Apple Pay.

In March this year the UK also joined the Apple App Store antitrust investigation fray — announcing a formal investigation into whether it has a dominant position and if it imposes unfair or anti-competitive terms on developers using its app store.

US lawmakers have, meanwhile, also been dialling up attention on app stores, plural — and on competition in digital markets more generally — calling in both Apple and Google for questioning over how they operate their respective mobile app marketplaces in recent years.

Last month, for example, the two tech giants’ representatives were pressed on whether their app stores share data with their product development teams — with lawmakers digging into complaints against Apple especially that Cupertino frequently copies others’ apps, ‘sherlocking’ their businesses by releasing native copycats (as the practice has been nicknamed).

Back in July 2020 the House Antitrust Subcommittee took testimony from Apple CEO Tim Cook himself — and went on, in a hefty report on competition in digital markets, to accuse Apple of leveraging its control of iOS and the App Store to “create and enforce barriers to competition and discriminate against and exclude rivals while preferencing its own offerings”.

“Apple also uses its power to exploit app developers through misappropriation of competitively sensitive information and to charge app developers supra-competitive prices within the App Store,” the report went on. “Apple has maintained its dominance due to the presence of network effects, high barriers to entry, and high switching costs in the mobile operating system market.”

The report did not single Apple out, also blasting Google owner Alphabet, Amazon and Facebook for abusing their market power, and the Justice Department went on to file suit against Google later in the same month the report was published. So, over in the U.S., the stage is being set for further action against big tech, although what, if any, federal case Apple could face remains to be seen.

At the same time, a number of state-level regulatory efforts are brewing around big tech and antitrust, including a push in Arizona to relieve developers of Apple’s and Google’s hefty cut of app store sales.

An antitrust bill introduced by Republican Senator Josh Hawley earlier this month takes aim at acquisitions, proposing an outright block on big tech’s ability to carry out mergers and acquisitions. Although that bill looks unlikely to succeed, a flurry of antitrust reform bills is set to be introduced as U.S. lawmakers on both sides of the aisle grapple with how to cut big tech down to a competition-friendly size.

In Europe, lawmakers are already drawing up draft laws with the same overarching goal.

In the EU, the Commission recently proposed an ex ante regime to prevent big tech from abusing its market power. The Digital Markets Act is set to impose conditions on intermediating platforms that are considered ‘gatekeepers’ to others’ market access.

Over in the UK, which now sits outside the bloc, the government is also drafting new laws in response to tech giants’ market power. It has said it intends to create a ‘pro-competition’ regime that will apply to platforms with so-called ‘strategic market status’, but instead of a set list of requirements it wants to tailor specific measures to each platform.

Daily Crunch: Biden’s labor secretary says gig workers should be reclassified

The Biden administration hints at gig economy changes, Blue Origin will be taking passengers and we interview Jim Belushi about weed. This is your Daily Crunch for April 29, 2021.

The big story: Biden’s labor secretary says gig workers should be reclassified

The Biden administration’s labor secretary, Marty Walsh, recently said in an interview with Reuters that he’s “looking at” the gig economy.

“In a lot of cases gig workers should be classified as employees,” Walsh said. “In some cases they are treated respectfully and in some cases they are not and I think it has to be consistent across the board.”

For now, this is just talk, but it suggests that new regulations and gig work reclassification could be a priority for the new administration.

The tech giants

Google Pay update adds grocery offers, transit expansions and spending insights — Through partnerships with Safeway and Target, Google Pay users will now be able to browse their store’s weekly circulars showcasing the latest deals.

Zynga and Rollic acquire the hyper-casual game studio behind High Heels — The company said High Heels (or, if you insist, High Heels!) has been downloaded more than 60 million times since it launched in January.

IBM is acquiring cloud app and network management firm Turbonomic for up to $2B — Turbonomic provides tools to manage application performance, along with Kubernetes and network performance.

Startups, funding and venture capital

Blue Origin will start selling tickets for New Shepard space tourism flights on May 5 — The “when and how much” are the two burning questions that remain around the Jeff Bezos-backed space company’s first commercial passenger flights.

TravelPerk raises $160M in equity and debt after a year of derailed business trips — TravelPerk lets users compare, book and invoice trains, cars, flights, hotels and apartments from a range of providers.

MoviePass co-founder’s PreShow Interactive raises $3M to expand into gaming — The startup will give PC and console gamers a new way to earn in-game currency in exchange for watching ads.

Advice and analysis from Extra Crunch

Healthcare is the next wave of data liberation — David Jegen and Carl Byers of F-Prime Capital argue that the winners of the healthcare data transformation will look different than they did with financial data.

Fintech startups set VC records as the 2021 fundraising market continues to impress — New data indicate Q1 2021 was the biggest fintech VC quarter ever.

How to fundraise over Zoom more effectively — A year ago, many of us probably thought that virtual fundraising would be impossible.

(Extra Crunch is our membership program, which helps founders and startup teams get ahead. You can sign up here.)

Everything else

Jim Belushi is chasing the magic in cannabis — We interviewed Belushi about his new greenhouses, supplied in part by GrowGeneration.

U.S. video game spending increased 30% in Q1 — Content was up 25% for the quarter, accessories jumped 42% and hardware went up 82%, according to NPD.

Sequoia’s Shaun Maguire and Vise’s Samir Vasavada will talk success in fintech on Extra Crunch Live — Join us on May 19 to discuss what brought the pair together, key tips for fundraising and how to be successful in the fintech space.

The Daily Crunch is TechCrunch’s roundup of our biggest and most important stories. If you’d like to get this delivered to your inbox every day at around 3pm Pacific, you can subscribe here.