Dissecting value systems and exclusion in ‘big tech’, with Jessica Powell (Part Two)

In this second of my two-part conversation on the ethics of technology with Jessica Powell, the former head of PR at Google turned author of the wonderful satirical novel, The Big Disruption: A Totally Fictional But Essentially True Silicon Valley Story, we discuss the meaning of “genius” in the tech world; why Silicon Valley multi-millionaires vote for socialists; and whether it is possible to use the master’s code to destroy his app.

But first, an excerpt that might initially seem absurd. Considering how many tech companies are currently working on, or at least talking about, space travel, though, it’s hard to be sure whom Powell is even satirizing here:

“Slow down,” Niels said. “Are you joking with me?”

“We have been working on the project for a year,” Gregor said. “Fifty engineers working in secret in Building 1. We’re building a colony on the moon.”

“You mean you have a spaceship and everything? How are you dealing with gravity? Wait, never mind, don’t answer that. What I mean is, since when did Anahata get into the business of humankind?”

“Anahata has always only ever been about humankind. Everything we do is done for —”

“Yeah, yeah, I know, everything we do is to improve humankind. But I mean, a society, Gregor. There are no synergies with our current business. How do you know how to construct a society?”

“Actually, a society is a lot like software. You build it on solid principles, then you iterate. Then you solutionize, and you iterate again.”

“What makes you think you can solve what centuries of wise men have failed to do?”

“Because we have something they don’t have,” Gregor said. He pushed his chair closer, and Niels couldn’t help but lean forward. The broken wooden spindle leaned with him, pushing into his back. But he did not move to swat it away; his eyes were locked on Gregor, their faces almost touching.

“Algorithms,” Gregor whispered.

“You have got to be kidding me,” Niels snorted. “These are humans we’re talking about, not robots. You can’t predict and control human behavior with algorithms.”

“That is an emotional reaction to what is a very logical project. And, yes, an algorithm could have predicted that you would respond that way. Even irrational behavior is rational when seen as a larger grouping of patterns. And as you can imagine, this project is built on patterns of success. Project Y, we call it. It will save Anahata — and, as a result, humankind.”

Greg E.: Let’s talk about the ideas in the book. Your Google-like company, Anahata, is symbolized by a squid that grows to the size of a bus, which is of course a great symbol for what companies like Google have become.

You explore what drives that kind of expansion, and in large part it’s a personal drive. One could say it’s a sex drive, but what I took from it was more a drive to be noticed, needed, recognized, that gets out of proportion.

I was wondering to what extent you felt that the obsession of these male engineer characters with women and women’s approval was less about an actual sex drive and more about what that approval would mean to them? They’re finally attractive enough, they’re finally likable enough. How do you feel about that?

Jessica P.: Oh, yeah. There’s definitely a status element to it. So much of what drives this book is ego. I think that’s absolutely the case.

Greg E.: At one point, the dichotomy between having a guiding philosophy and just being into your own ego takes the form of a character you call The Fixer, who is very Zen; the CEO and top executives go to him essentially saying, “please fix our problem.”