May 2, 2022

Passing the Tech Ethics "Hot Potato" with Stephanie Hare

Today we wrestle with ethics and technology. Stephanie Hare's wonderful new book, Technology Is Not Neutral, gives us a much needed framework for thinking about how the technologies we interact with every day affect our moral lives.


Transcript

Richard Kramer: Welcome to Bubble Trouble, conversations between the economist and author Will Page and myself, independent analyst Richard Kramer, where we lay out some inconvenient truths about how financial markets really work. Today, we wrestle with ethics and technology. Stephanie Hare's wonderful new book, Technology Is Not Neutral, gives us a much needed framework for thinking about how the technologies we interact with every day affect our moral lives. More in a moment.

Will Page: So welcome to the podcast, Stephanie. We always like to begin by tossin' the microphone to you to introduce yourself, your current work, and where the audience can find you, and also that accent, so in that order, take it away. [laughs]

Stephanie Hare: Hello, my name is Stephanie Hare, and I'm an independent researcher and broadcaster and now author of a new book on technology ethics. You can most easily find me on Twitter. My handle is @hare_brain, that's H-A-R-E underscore brain, and my accent is a combination of just outside Chicago via Paris, long-time London, so my vowels are a mess.

Will Page: [laughs]. Okay. So the book is called Technology Is Not Neutral: A Short Guide to Technology Ethics, and a title with the words technology and ethics isn't exactly clickbait, so rather than talk to us, talk to our audience for a second. How do we get this message across without crossin' over? Imagine you landed on Mars and you worked out there's technology, the state, and society; how do you explain the word ethics to these three different groups?

Stephanie Hare: I think technology ethics is totally clickbait [laughs] which just shows you I need to get out more if I-

Will Page: Yeah. [laughs]

Stephanie Hare: ... I've been locked in my house for years already.

Will Page: Preach to the unconverted.

Stephanie Hare: Yeah. No, but I think right now, I mean, we're currently living-

Will Page: Mm.

Stephanie Hare: ... I think in a very ethical moment-

Will Page: Yes.

Stephanie Hare: ... where with the Russian invasion of Ukraine, you're having countries and-

Will Page: Mm.

Stephanie Hare: ... companies all having to take sides, take a call, make decisions. It's not good enough to just say, "It's complicated." You have to decide, are you staying in or are you pulling your people out, or what are you doing with your assets and infrastructure? What's your social media policy on allowing violence or political expression, censorship? How are you trying to support people in Russia who perhaps disagree with this invasion?

Will Page: Mm-hmm [affirmative].

Stephanie Hare: So you don't wanna punish the Russian people, you wanna punish Putin. Those sound like political questions, but politics is the expression of power. Ethics, which is about our values, is what we feel and think is good or bad, what is the right way to be, the wrong way to be. And of course there are as many opinions about that as there are people on the planet, including all those who came before us, whose words we get to consult and cite and often misquote on the internet. So, I think this is going to be a massive moment, and we're seeing it for investors as well with the rise of ESG investing, so environmental, social, and governance.

Will Page: 100%.

Stephanie Hare: We're seeing it with younger people who are really wanting the values of their lives to also be shared and aligned with those of their employers. So they want their employers to take a stand and they want to know what their employers think. And of course, that is really messy, particularly if you're a multinational and you're involved in all sorts of markets with sometimes very different value sets-

Will Page: Hmm.

Stephanie Hare: ... and you're like, "I'm just here to sell a soft drink." [laughs]

Will Page: Yeah.

Stephanie Hare: "Why do I-"

Will Page: So-

Stephanie Hare: "... need to take a position on that?" Where's the demand for you to take a position coming from?

Richard Kramer: Did we have a dress rehearsal of that in the past few years with the Me Too and Black Lives Matter movements where, again, every company was forced to make statements and to pledge their ethics in public? Now, many of those companies, as we know, didn't live up to those ethics and it's certainly not confined to technology, but that was kind of a dress rehearsal for what we're seeing now in the Russia-Ukraine situation, do you think?

Stephanie Hare: Yeah, so certainly I think particularly in the United States, those two examples were really, uh, salient. But I'm also thinking even back a little bit earlier to the Arab uprisings. So, you saw a lot of technology companies, but also telecommunications companies. I remember Google took a really big stand on what was happening in Egypt, in Cairo and Tahrir Square, and France Telecom got really criticized for not taking a similar stand or leaving or pulling out. And of course it was easier for Google to pull out than it was for France Telecom, which had physical infrastructure and staff in much greater numbers in the country, whereas for Google, Egypt was a smaller part of its market, with fewer-

Richard Kramer: Mm.

Stephanie Hare: ... fewer people, et cetera. So it's that whole thing of it all just depends. It could be whether or not you have a supply chain issue because you're a fast-moving consumer goods company and you're looking to make sure there's no slave labor or child labor in your supply chain, or if you make super conducting chips, or semiconducting chips rather, you're looking much more at the rare-earth minerals issue and the environment, because you have to get those rare earths either from the Democratic Republic of Congo or China. And China obviously is a massive issue, not just because of the repression of the Uyghur Muslims, a million of whom are currently locked up-

Richard Kramer: Mm-hmm [affirmative].

Stephanie Hare: ... in a camp, but it's also the U.S.-China tech cold war. So even if you weren't a tech company or you weren't American, you can still get caught up in these things, because the Department of Commerce might start issuing sanctions or making it impossible for people to do business, and you're suddenly a small company in Belgium having to take a side between these two giants.

So it all just depends so much on your positionality. Who are you in the world? Who are you accountable to? Like, who are your stakeholders, that concept of stakeholder capitalism versus shareholder capitalism? And the model of it has, I think, changed a bit, but it's also the political risk factor. People famously say they don't wanna do politics, and tech companies for ages have said, "We're not political. We just want scale. We wanna serve as many people as possible," but it's that whole thing of you might not wanna do politics or ethics, but they're gonna do you.

Will Page: [laugh]

Stephanie Hare: If you don't have, if you haven't thought through your position on that, you're gonna get caught out in the moment. It might be better to do that thinking in a calmer time.

Will Page: I love it. It reminds me of the comedy Silicon Valley, where all these tech companies stand up on stage and say, "We wanna make the world a better place." Without considering ethics, how do you get from A to B to F? I love it and I love the book, and the way I'm describing it to friends, if word of mouth counts for anything, is just like, where do you draw the line? Which immediately makes me think, should you draw a line? Which makes me think, if you should, who should draw the line? You or the state? And it's philosophical. But the word I love about this book is the word framework.

Now data science is definitely a hype word at the moment. There are 8,000 books on Amazon with the words "big data" in the title, 8,000. But when I look at data science in companies, the biggest challenge is where do you house it? Do you have the data science department in the basement and occasionally listen to what they have to say, or do you integrate it within the firm so it's influencing every decision?

So we'll get to the ethics part of this conversation in this podcast, but I want to do it in reverse and ask, where do you think ethics should function within the firm? Who should it be reporting to? Who should it have underneath itself? I would love to hear from an organizational perspective where you believe, if they're gonna take your framework seriously, where should this framework be housed?

Stephanie Hare: So I would answer that in two ways, which is gonna be top down first and then bottom up. Top down is that it's a CEO and C-suite, the leadership and board, issue-

Will Page: Mm-hmm [affirmative].

Stephanie Hare: ... because that, to quote Harry S. Truman, "The buck stops here." It's Apple CEO, Tim Cook, who is making the decision about what's happening with Apple in Russia. And yes, he's chatting with different people about it, but like his name is what's on the statements. It's his face. He's gonna be held accountable when the media has questions or when shareholders have questions or employees have questions.

Will Page: Mm-hmm [affirmative].

Stephanie Hare: So ultimately it's your CEO supported by the C-suite and board, all of whom have to know. I think it's the, why do you exist? [laughs] Why are you here, what are you contributing to the world, and what's the cost of that? All of that, that 360 degree ethical assessment of yourself, or even a metaphysical assessment [laughs] of yourself, the reality of you, why do you exist? I mean, how do we know all of those things? But then there's also going to be, and I hinted at this at the end of the book, with this, you know, suggestion of, do we need a Hippocratic oath for tech? This is something that-

Will Page: Mm.

Stephanie Hare: ... you can't just be like, "Oh, it's a C-suite issue," or, "We've got somebody in compliance. We've got a lawyer who does that. So the rest of us are kind of off the hook and can say it's not in my job description." It's like, "No, it's in everybody's job description." I once worked in a company where I was working with someone brilliant in cybersecurity, and he said the culture of health and safety on oil rigs was really interesting and relevant for cybersecurity.

Richard Kramer: Wow.

Stephanie Hare: And I was intrigued, and he said, "The idea is that when you are on a rig, every single person there has a health and safety responsibility." So if they see something that could cause an issue, because everybody's lives are just de facto at risk on an oil rig at any time, it's not a question of hierarchy, of like, "Oh, that's my boss's job," or, "That's this person with this title's job," or, "I'm only three weeks here, maybe I don't know." You call it out.

So I think there's that as well, that the lowliest [laughs] [inaudible 00:09:30] intern or new joiner, uh, or newest person there might actually be the person who spots a risk. Oftentimes when you're new, you see things that other people have just become inured to. Anyhow, so it's, I think it's holistic. It's 360. It could be from the top and it has to be from the top, but there's also an argument that it's bottom up and all around.

Will Page: So a sign of success with the book could be, even in two or three years' time, we look at some large tech companies and ethics is part of their induction program. You know, I made economics part of the induction program at Spotify. If you could instill ethics so that as soon as you join this company, this is ingrained into you, that would be a sign of success.

Richard Kramer: Well-

Stephanie Hare: I would die happy if that change were to happen. Like quite honestly, that never occurred in any company that I worked in, which I now find shocking. It would be amazing.

Will Page: That was a sign of the times.

Stephanie Hare: That would be a complete win.

Will Page: That's a sign of the times.

Richard Kramer: But let me, I'm gonna, I'm gonna push back a little bit on some of this because we're so far away from this ethical stance from companies being more than, in some cases, window dressing or risk management. So I'm on record of sa-, as saying many years ago now that if the CIA and NSA were really clever, they would've invented Facebook and Google because, long before Shoshana Zuboff wrote her book, I was talking about them putting 2 billion people under surveillance every day. And certainly in terms of the United States' position in the world, it behooves them greatly to have these assets, these projections of soft power that put so many billions of people under surveillance that they can tap into. Indeed, they're the most successful export industries that have ever been created in the history of man. They're the most profitable companies. And I'd say right now, they've risen to the stage of being almost nation states.

So how do you square the ethics of the society, of the polity at large, which kind of wants these Facebooks and Googles to do the job that the CIA and NSA are tasked to do in terms of keeping tabs on people, with the fact that what some of these companies do can be highly intrusive and, in many people's view, certainly looking at the controversies over Instagram and how it affects people's behavior and mental health, really has quite detrimental consequences and is quite unethical?

Stephanie Hare: Mm.

Richard Kramer: How do you square those, the political dimension of what we want these companies to do and the individual dimension of what we are really disturbed by them doing?

Stephanie Hare: Mm. What a meaty question. I will try to answer that. So I guess on the one hand is like, why do social media companies exist? Are they there to be proxies for [laughs] intelligence services? Or are they there because they have figured out how to weaponize narcissism and our social animal existence, so that I would be appalled if you stationed a CIA officer in my flat, but apparently I'm fine tweeting out a picture of my holiday or my lunch, for example? Why is that? Even though I know it can be tagged and traced and mapped out, and they will know all the other people who liked my picture of my lunch and therefore can make guesses about that over time. Fine. And why am I participating in social media at all even though I know, I mean, I've done the research, I read the books, I'm totally aware of it, and yet I still tweet and use LinkedIn.

Richard Kramer: Mm-hmm [affirmative].

Stephanie Hare: I don't use Facebook, um, and Instagram. So that's interesting as well is like somebody said with like con artists, like the reason a con works a lot of the time, not always, but a lot of times you're being conned, you participate in the con. And I think about that a lot where I'm like, "Oh my god." We've all been told, like Shoshana Zuboff wrote her wonderful book and the media's done a great job and we've had congressional hearings. [laughs] We all know if we are choosing to be on Facebook or on Instagram or Twitter or LinkedIn or whatever, we know. We know what we're doing. Uh, TikTok as well. And if we let our kids go on it, we also are aware that nobody's like a victim in this now.

So that's, I think that actually becomes more interesting than if we had this conversation say in 2015, right, before we knew a lot of that stuff and it was genuinely shocking when a lot of those revelations came out for many people. But now, now what is our ethical stance to them? I think now it's more, we've decided that we like them clearly, right? Because [laughs] yes, some people deleted their Facebook or whatever, but like clearly lots of people haven't. We've decided for the most part that we like these companies and we can tell that by their profits and their market share.

So what we're doing now, I think, is saying, "We want you, we want to be able to share these pictures and have these connections, and what we want is end-to-end encryption. And what we want is for you to take a stand on protecting our kids. And we want you to not use facial recognition, Facebook." [laugh] I'm speaking directly to Facebook on that one. And if they don't, then they get their feet held to the fire by all sorts of activists in the media, and eventually they may change a bit or not. And I think they often only change when people howl enough.

So they've actually done a really good job in some ways of shifting the responsibility for what ethical calls to make onto all of us, right? And that's like a very clever thing that Silicon Valley has done, I think, in response to those, those Snowden revelations onwards was to go, "We're just a small group of people in a small part of the United States and we don't think that we should be making those decisions. Society should be making those decisions."

Richard Kramer: Wow.

Stephanie Hare: So they put it, it's like-

Richard Kramer: Right.

Stephanie Hare: ... pass the hot potato, only [laughs] it's an ethics potato.

Richard Kramer: Yeah. Well, Mark Zuckerberg said, "Whoa," to the government. "We need you to tell us what the regulation-"

Stephanie Hare: Right.

Richard Kramer: "... is going to be," and thus far, government has proved completely incapable of drawing up any regulations. A-a-and, and that's really no different than many other sectors if we've had many other Bubble Trouble podcasts about regulatory capture theory and the fact that the industries that are seeking to be regulated are mostly writing the regulations themselves.

But I wanna sort of step back. This notion of technology is not neutral because, again, when I was a graduate student many years ago, there was a lot of debate about autonomous technology. And the, the great example that was put out then was nuclear power.

Stephanie Hare: [laughs]

Richard Kramer: Now, obviously you can take uranium and turn it into weapons of mass destruction, which are terrible. And you can also use it potentially as a source of tremendously effective, renewable energy. So that was the classic example of technology being autonomous: it's how it's used. And isn't the ethical element of it really about power and who holds power, and ultimately who gets to exercise that power, with, again, the work of Shoshana Zuboff and others being the idea that these large tech platforms can exercise their power by nudging you towards certain outcomes or towards certain commercial ends for themselves, or exercise their power by creating really addictive technology, as Tristan Harris would say, aiming a supercomputer at your brain. And we're really all simply unable to resist the power of the latest dopamine hit that's served up by Instagram or YouTube or any of the other algorithmic platforms. Is this ultimately a question of whether ethics matter to those in power?

Stephanie Hare: Yeah. So I see a really clear connection and relationship between ethics and power. And that's why in the book, I wanted to sort of put myself in a position of sympathy, as though I were sitting next to the reader going, "Have you studied philosophy? Because if you haven't, there's other branches." [laugh]

Richard Kramer: [laughs]

Stephanie Hare: "It's not just ethics." And if you know them and like there are countries I've, I have a lot of, of French people in my life and I'm jealous of them for many reasons and one of them [laughs] the biggest ones, it's not just their food and fashion, but they study philosophy really hardcore in high school. It's like part of their culture. It's, it's part of being French, meant to get into university. It doesn't matter what degree you're studying. You have to pass these philosophy exams.

Richard Kramer: Wow.

Stephanie Hare: And they just seem to have this [laughs] like this faculty, because, and I, I describe it in the book as like a Swiss Army knife. It's like philosophy is like a Swiss Army knife and when you unfold it, it's got these six component tools, which are the six main branches of philosophy of which ethics is just one. So, if you wanna talk about technology ethics, it behooves you to at least be aware of the other five, if not to draw on them actively as you get more adept, so that you can understand how power, these very important questions of power relate to ethics, 'cause otherwise ethics could feel like you're, you're Socrates in the agora in your toga [laughs] be like, "What is a good life? What is technology? What is good technology?" And I wanted to make it super practical, um, and applied. And so you're right, the, the connection-

Richard Kramer: Mm.

Stephanie Hare: ... between power, who holds it, who doesn't have it. So when we design a tool, who is using the tool and on whom is the tool being used? What choice do they have? So we talk again about consent: when you click on cookies, it's not really consent. You have to click on it or you don't get access to the product or service you want. Can kids meaningfully consent to anything? Probably not. [laughs]

Richard Kramer: Yeah.

Stephanie Hare: Like I remember being a kid, say in the world. So particularly when you're online-

Richard Kramer: Yeah.

Stephanie Hare: ... please spare me. These companies know that, right? So then they'll put it on the parents, like, "Oh, but yeah, so we're gonna tell the parents they have to police their children the whole time online." So it gets really messy. But that's why I wanted to embed ethics into that wider discussion of philosophy as a whole, so that people could start to practice and play with it, and for each branch, I give like a little description and then how it applies to technology, with real world examples, so that if you needed to like win a debate with somebody [laughs]-

Will Page: I love that.

Stephanie Hare: ... at work, you could at least have something to cite on-

Will Page: Yeah.

Stephanie Hare: ... or then riff off of it for yourself.

Will Page: That point you make there gives me goosebumps, because what I've always felt is that the enemy of academia is echo chambers. Like, why should economics have a monopoly on alleviating poverty, as was the case for many years, decades, centuries? There's so many other disciplines that could contribute to that topic. And what you're trying to do is throw it wide open. And to your point about philosophy, whenever I'm back home in Edinburgh, I just stare at that statue of David Hume and just, he taught the world how to think and how we think. He went right back to the basics, and I always like to say an economist who hasn't studied philosophy is like playing golf with one hand. It's a real disadvantage. You just can't-

Stephanie Hare: Yeah.

Will Page: ... question your reasoning. So, that's really important. This feels like a two-parter podcast already. I wanna thank you for part one. And my big takeaway, there is a term, tolerated trespass. You went from Snowden to the present and how we have a kind of tolerated trespass in the use and abuse of ethics. Who's tolerating and who says it's a trespass? We come back in part two, where the dour, pessimistic Scottish economist wants to ask a question about the cost of all of this as well, but we'll be back in a moment.

Will here with a special request for you, our valued Bubble Trouble listener. First off, thank you for listening. Every time we put out an episode, we are so excited to see you enjoyin' them. Now we'd like to ask you to help grow the show. We would like to ask you to tell just one person about Bubble Trouble who you think would enjoy the show. Perhaps you write a tweet or send a text or an email or post about this episode. It doesn't matter. It just all helps. Just tell one person and your work is done. Myself and Richard, we would be so grateful. Thank you.

Richard Kramer: Welcome back to part two of this episode of Bubble Trouble. We're here with Stephanie Hare talking about her book, Technology Is Not Neutral, and I'm going to allow Will to step in with the first question. Will?

Will Page: So, ethics is an attractive lunchtime topic and there's no such thing as a free lunch. And what the dour, pessimistic Scottish economist has to do is ask, "Who's gonna pay for this ethics? It's gonna come at a cost." And a recent spat involving Spotify, my former employer, Joe Rogan, and the artist Neil Young kind of brought the ethics debate to life. Podcasts, you know, is this podcast creating liability for the platforms that are producing them? Discuss.

So in a quote in The Economist, I gave the following observation: software firms' valuations have long been driven by the notion that there's no marginal cost. Content moderation might be their first. And I wanna get into this because I'm gonna give you an example of an anonymous tech company which is involved in streaming user-generated content. They have 4,000 employees, 4,000 employees. You can imagine tech companies, typical structure. Have a guess how many of those 4,000 employees work in content moderation?

Stephanie Hare: Mm, 10%.

Will Page: 897. So pretty much one in four employees produce no value to this tech company. They're just dealing with a deadweight cost of moderation. And this for me is fascinating. I'll build this out even further and I'm gonna toss you the microphone and just riff on this one. But when Facebook disclosed how many people they employed in content moderation, interestingly, the stock price fell. I think that could be the next book for you: why did the stock price fall when you showed that you took content moderation seriously? So there's gotta be a cost to this. I think it's quite a significant cost, and it could even be tech's first ever marginal cost, which creates all sorts of hangovers for those utopian dreams. So talk to me about who's gonna pay for ethics.

Stephanie Hare: I guess, as you were saying that, I found myself, my brain sort of splitting into two and I was running with what you were saying of the what's the cost of ethics and who pays for it. And then I was opening up a second window of what's the cost of not doing ethics?

Will Page: Nice. The counter- [crosstalk 00:22:56]

Stephanie Hare: And who pays for that, right? It's the sort of, it's a constant 'cause it's, it's always a choice, right? It's always a choice.

Will Page: Right. So it's not, what did you achieve? It's what did you prevent?

Stephanie Hare: Yeah.

Will Page: Count the zeros, not the ones.

Stephanie Hare: Just bear with me 'cause I'm going to attempt to make an analogy. [laughs]

Will Page: [laughs]

Stephanie Hare: But, being, growing up in the United States and then moving outside of the United States, I was always very aware of the cost of the U.S. defense budget and having-

Will Page: Mm-hmm [affirmative].

Stephanie Hare: ... troops around the world and America being the world's policeman, whether or not that was something that the United States chose or felt it had to do. Discuss. And then living here in Europe, there was always this question of, you know, did we need to have as many troops as we had stationed in Germany or whatever. And a lot of that was a hangover from the Second World War, and I trained as an historian. And so when I was debating this with very nerdy friends many years ago, they all thought that we should pull the U.S. troops out because there was peace. The end of history had occurred and Germany was reunited and the USSR had collapsed and everything was fine and we were trading and everyone was just making money.

And my view, I guess, was probably more the historian's view, that the United States has twice had to come over to this continent to sort things out. And I also remember in high school in the nineties watching the breakup of Yugoslavia and, like, terrible mass graves and crimes that everybody seems to have forgotten about, but I watched that. It was a really, um, formative moment for me. So learning about the importance of the U.S. troops in Europe, not just for peacekeeping in Europe, but as a base for the Sahel or on the way to Iraq or Afghanistan later, after 9/11, you start to realize that there's a cost to having U.S. troops in Europe and there's a cost to not having them. And we're seeing that right now-

Will Page: Mm.

Stephanie Hare: ... today, right? And the cost of NATO versus not having NATO or there are certain countries that haven't been making their NATO contributions as much as they could be and we've seen a couple of changes on that just in the past two weeks. Like Germany has made a totally radical-

Will Page: Radical.

Stephanie Hare: ... decision to increase de- defense spending. So what is the cost of Germany not-

Will Page: We're rushing decades of history.

Stephanie Hare: ... investing in defense versus what's the cost if it does, right? They're getting like a real live example of that. So taking that analogy back to tech and ethics within companies, I think we have to ask ourselves, the reason Facebook's stock price is gonna fall when you announce how many people are working on content moderation is 'cause you can quantify that, and you can look at it and go, "Oh, it's quantifiable." Does the market even have a way of quantifying Facebook's ethical screw-ups? Now, it did get fined quite recently, a few [laughs] years ago, and then its stock price went up [laughs], you know, fined by the FTC. And it was like the biggest fine that had been issued and it didn't matter. So another part of me is a little bit like, "I don't really know if we want the market being the supreme arbiter of truth, or of ethical guidance." I mean, the market's decoupled from all sorts of realities or things that we might not like.

Will Page: Mm-hmm [affirmative].

Stephanie Hare: So that, you know, there's that as well, not to offend anyone who's an investor or trader, but they have different priorities than ethics, right?

Will Page: So, so-

Stephanie Hare: Ethics might be part of it, but-

Will Page: ... just come back on that point there about what's the kind of counterfactual of not paying for the cost of ethics, as it were. There's something here which I think resonates with your remarks so nicely. I give the example of two people who work in a PR and comms team in a firm, and one person's job is to get good news out the gate. "Quarterly earnings are great, so give me good headlines." Another person's is to prevent bad news from happening, and how do you appraise them with KPIs and metrics?

Well, the metrics of good news is really easy. Like, "I got this number of headlines this quarter Wall Street Journal, FT, page one. Give me my promotion, give me my bonus, give me my head count." The person who spends all day preventing bad news, preventing the proverbial S-H-I-T from hitting the fan has nothing to show for it.

And just going back to the organizational psychology of how does this framework go into action, that raises questions, doesn't it? Like, "I prevented all these bad things from happening by paying for ethics." "Show me what did [laughs] show me what didn't happen." It's, it's gonna get tricky to persuade if it has to go up to the CEO to really embed this in companies. It just strikes me just listening to you, like it's gonna be like pushing water uphill in the early stages, I'm sure.

Stephanie Hare: It reminds me a little bit of the long-held lament of the chief information security officers of any company, who are like, you know, "Us doing well is that we haven't had a data breach-"

Will Page: Yeah.

Stephanie Hare: "... accidental or hostile. We haven't had a cybersecurity event and our success is measured by what did not happen rather than-"

Will Page: Yeah.

Stephanie Hare: "... what did." And that's, again, like those pictures. I think I even used this analogy [laughs] in the book. I was quoting Taiwan's digital minister, Audrey Tang.

Will Page: Mm-hmm [affirmative].

Stephanie Hare: She uses a Taoist quote about value, and it's, like, the analogy is of a pot. So if you imagine a clay pot-

Will Page: Oh.

Stephanie Hare: ... the clay creates a space inside, so you could fill the pot with dirt, with water, with, uh, water and goldfish if you wanted, but whatever you want to put in your pot, M&Ms. And so she says the value of the pot is where the pot's not, which I know-

Will Page: Mm-hmm [affirmative].

Stephanie Hare: ... sounds trippy, but it's [laughs], I just, I-I think very visually-

Will Page: Oh.

Richard Kramer: Yeah.

Stephanie Hare: ... so I was like, "Oh." [laughs]

Will Page: But-

Stephanie Hare: By-

Will Page: ... I- I, I alway- [laughs] I always think, you make me think of the Joseph Conrad novella Typhoon. I don't know if you've read that one, but it's a famous story of [inaudible 00:28:24], of a sea-hardy captain who's told there's trouble ahead and to take the long route around the Pacific to get to his destination to avoid the typhoon. He says, "I can't do that." "Why?" "Because when I get to the destination, I'll have to fill out the log book and explain why I was late." "And? You avoided a typhoon." "But my boss will say to me, 'How bad was it?' And I'll say, 'I don't know. I avoided it.'" And that's where the problem starts, like this factoring in-

Stephanie Hare: Mm.

Will Page: ... and calculating the value of the counterfactual is deep. It's really deep.

Stephanie Hare: Well, but there's a way of doing it. So luckily, we all share this planet with eight billion other people and different organizations and countries, which allows us to see: we might have the really good cybersecurity practice, so we don't have a data breach or get hacked, but unfortunately our friend over there didn't invest or didn't prioritize it, and we can quantify what happens when [inaudible 00:29:14] hits them-

Will Page: Mm.

Stephanie Hare: ... right? [laughs] Or these people didn't look at their supply chain and boom. They had a major issue when it was discovered that children-

Will Page: Mm-hmm [affirmative].

Stephanie Hare: ... were involved or slave labor or whatever, and you can quantify the fallout from that. So there are ways, I think, to do it. It's more that it requires work. [laughs] People are lazy. We're all exhausted. We have [inaudible 00:29:36] retired. But I mean, that would be the fun challenge of this, is like, "Absolutely. Let's find a way to quantify it." How do you make ethics easy? How do you demonstrate the value? And it isn't just, you know, what didn't happen. It's also, are you attracting and retaining the best people? Are you actually making products and services that are new or that people want, in a way that's great?

Will Page: Mm-hmm [affirmative].

Stephanie Hare: Um, lots of people are wanting to invest in things that make them feel good about their role in solving some of these problems, you know, checking that your pension is a green-oriented pension [laughs]. So, you know, that stuff has started, or changing diets so that we are eating in a way that's better for the planet. I think there are ways of doing it. The question is how.

Richard Kramer: So, before we move on to the last section of, of this, our podcast that we always ask our guests to, to reflect on, I wanna ask you about one of the big examples in your book, and that is, you cover facial recognition, but you also cover the development of these contact tracing apps in the wake of the pandemic. And I guess the big picture question is why are governments so bad at developing technology?

Because in this case you spent £22 billion in the UK, which is effectively equivalent to an entire year's capital expenditure budget for Google, a company running a global business of over $200 billion, close to $250 billion. You spent that much money in the UK, dumped into a bunch of consultancies run by a woman who has no technical competence whatsoever, and it was a disaster. So what is it about power and values that we're seeing, for example, right now in the UK government that led them to do such a dreadful job, and I don't think that's a controversial statement in the slightest, at managing technology through this pandemic or ping-demic-

Stephanie Hare: Mm.

Richard Kramer: ... as it was for many people for a very long period of time.

Stephanie Hare: Yeah. So on the one hand, the United Kingdom's experience with digital health tools during the pandemic was a shit show. On the other hand-

Richard Kramer: [laughs]

Stephanie Hare: ... what we learned from that, and it's an expensive lesson, and what we're left with, is quite interesting. So I have here on my little phone the NHS app, which, when I took my phone home, I was back in the U.S. for the holidays for Christmas. I hadn't seen my family in two years. And so I was giving them a little, like, demo of how the app works and they thought it was fantastic, which actually made me kind of look at it again, 'cause I had been really critical while writing the book. And then I thought, "Actually, there's something quite interesting here."

This app, if you have a smartphone, so you're not part of the digitally excluded, you're someone who has a smartphone, you can access this app and maybe you're younger and you wanna do all your stuff on, on apps, which a lot of my younger friends do here, it's fantastic. You can check your medication records, you can book appointments, you've got your, um, COVID vaccine passport that you can use to travel internationally and you can upload it to your iPhone wallet if you want and I believe it works for Google Android as well, so hurrah. That is cool. I mean-

Richard Kramer: Mm-hmm [affirmative].

Stephanie Hare: ... there's definitely a, a benefit to that-

Richard Kramer: Hmm. Hmm.

Stephanie Hare: ... because I had to get vaccinated in the U.S. and I was given that little like cardboard piece of paper from the CDC [laughs].

Richard Kramer: Yeah.

Stephanie Hare: Like somebody wrote in handwriting. I don't know if that ever gets entered into a database. I, it doesn't talk with my UK vaccine rec- records. It's a mess.

Will Page: Yeah.

Stephanie Hare: So on the one hand, the UK was trying. Yes, the minister in question, our health secretary, Matt Hancock, has a-a sketchy data protection track record and his own app and yes, he hired, uh, or appointed rather Dido Harding whose cybersecurity track record was not ideal I think we can all-

Will Page: [laughs]

Stephanie Hare: ... say for this task. That said, it's easy to say it now when we're here two years in, we've all been vaccinated and boosted, and I mean, it's pretty much back to normal. We have to remember what it was like in March 2020, and I do. I have that sort of viscerally burned into my brain forever. It was terrifying. We didn't know if we were ever gonna have a vaccine, right? We didn't know what we were gonna be able to do, and I think that was a great moment to live through as a researcher to go, "I'm watching a country literally throw the kitchen sink at this problem. How do you do crisis response, particularly an international crisis like a pandemic, in a way that is fiscally responsible, where you're able to deliver value for money?" That's part of why I wrote the chapter, why I have a really beautiful data vis right at the end where I track all the mitigations that we tried here in the UK, particularly in England, which is where most of us live, both population-wise and geography-wise. I don't want to offend the other three nations. I just-

Will Page: Mm-hmm [affirmative].

Stephanie Hare: ... I wanted to focus on England 'cause I was living it. I looked at all the digital tools we did, but also the analog tools and I-I mapped them along with the data curves of cases, hospitalizations and deaths, because what I wanted was to be able to demonstrate not a causal relationship, that's much harder, but certainly a correlation-

Richard Kramer: Mm-hmm [affirmative].

Stephanie Hare: ... between what we were doing and what was happening to the actual rate of transmission and the hardcore effects that we all need to know about, which are hospitalizations and deaths. And my takeaway from it is we probably could have parked all of the digital initiatives.

Will Page: Wow.

Richard Kramer: Yeah.

Stephanie Hare: It would've been much better for us to have used that money to support people, to stay home and isolate when they got a test result that was positive or they were exposed to someone.

Richard Kramer: Absolutely.

Stephanie Hare: But that's a cultural issue here in the UK. The Tory government doesn't like to give people money [laughs], unless it's their own friends of course. Uh, but they didn't, they didn't wanna do that, so-

Richard Kramer: Well, th-that is precisely what they did. They gave £22 billion via Dido Harding to a bunch of consultants that ended up coming up with something that could have been written by a couple of co-op students in a university computer science program, and it ended up being industrialized quite well, but by the good folks at the NHS who, like the people who run gov.uk, actually kind of know what they're doing. They're just not the PwC-type consultants that are used to draining the government coffers. Let's move on from this because we can get very exercised about it. And I just like-

Will Page: [laughs]

Richard Kramer: ... using [laugh], I like, I like to use the analogy that the UK spent as much money as Google did running a global business of $250 billion and processing a trillion queries a day for search, on one specific contact tracing app-

Will Page: Yeah.

Richard Kramer: ... that was a failure.

Will Page: Our, our listeners can't see it, but there is steam coming out of Richard and Stephanie's ears right now.

Richard Kramer: Yeah, there we go. Yeah.

Stephanie Hare: [laughs]

Richard Kramer: So what we-

Stephanie Hare: I'm just stoic right now.

Richard Kramer: Yeah. What we try to do, what we try to do at the end is, uh, do a bit of smoke signaling. And in that sense, we ask our guests to give us a couple of smoke signals: from a technology and ethics point of view, what are the couple of weasel words or phrases you hear, the kind of things that make you go, "Uh-oh, that's not going to end well," that show technology companies, large and small, or nation states, large and small, behaving in a way where technology and ethics for them are not bedfellows?

Stephanie Hare: I think the constant emphasis on guidelines, principles, definitions. You could waste, oh, not waste, you could spend, and people do spend, a lot of time on that, and it never gets translated into action. So anything that can't be measured and benchmarked or in some way held up for scrutiny always makes me a bit nervous. If it involves the marketing department in any way, I'm nervous-

Richard Kramer: [laughs]

Stephanie Hare: ... no offense to marketing department people. I just, 'cause I get worried that it's a PR communications branding exercise whereas what I want to see is like, how are you training your data scientists?

Will Page: Yeah.

Stephanie Hare: How are you training your developers? How are you training your board? Again, the money people-

Will Page: Interesting.

Stephanie Hare: ... [laughs] the operational people, the product owners and managers, like, where does this sit in your organization? So if it's just committees and guidelines and principles, I feel like that was the phase of a few years ago. And if they're still in that phase and they haven't moved on to show how they're living it, how they're walking their walk, that kind of makes me nervous.

Will Page: And w-with your permission, I'll ask our producer to get the Bill Hicks sketch, the famous one, which is, "If anybody here works in marketing, advertising, kill yourself." [laughs]

Stephanie Hare: [laughs]

Will Page: But he just-

Stephanie Hare: I feel really bad.

Will Page: There's, there's ethics there, but yeah, he comes at it-

Stephanie Hare: [laughs]

Will Page: ... the exact same way. How can I make money out of ethics? Ethics, that's a good dollar. [laughs] We can sell that dollar. Yeah, it goes down-

Stephanie Hare: Mm. Yeah, quite.

Will Page: ... it goes down the wrong cul-de-sac if it ends up in the marketing department. Agreed.

Richard Kramer: I-is there another smoke signal that you can point to that you hear politicians when they're talking about how important it is to get their hands around regulating technology, or you hear, uh, civil society groups talk about the evils of technology and you go, "Look, this is just not going anywhere"?

Stephanie Hare: Well, weirdly it's when they actually use the word ethics or ethically [laughs]-

Richard Kramer: [laughs]

Stephanie Hare: ... because they just apply it as an adjective to what they're doing without explaining it, and I'm like, "Sorry. What do you mean by that? Like, I want to see the receipts. I want to see the accounts. [laughs] I want to know-"

Will Page: Wow.

Stephanie Hare: "... how this works." So I like weirdly, again, that word ethics makes me, it's not that it's bad. It's just that it sets my antenna up and I'm like, "Okay, there's whitewashing, there's ethics washing, there's green washing and there's sports washing." There's just so much washing going on that I think, yeah, any, just anytime I see it, if somebody's claiming how ethical they are, I'm like, "That's interesting. Tell me more."

Richard Kramer: [laughs] Yeah.

Stephanie Hare: "... You have my attention." Myself included, by the way [laughs].

Richard Kramer: Yeah.

Stephanie Hare: It's, it's very easy, isn't it? And that's kind of why I played in the book with having a little devil on the front cover and an angel on the back: we all have that in us.

Will Page: I wanna thank Stephanie here for her time, uh, her voice and the book. I think the book's gonna be a huge success. I mean, just a couple of observations to wrap up. I always like to talk about the difference between the and a. When Simon Schama, around the time of the millennium, produced A History of Britain and it made history-

Stephanie Hare: Mm-hmm [affirmative].

Will Page: ... popular, like I hope your book makes ethics popular, he always said, "It's a history. I'm not claiming it's the history of Britain. You can write a history of Britain too. This is just a history," and you offer a framework. You don't push it out there as, "This is the framework," but it's really accessible, and I think it's the moment, it's the era. I really hope this one flies. I really hope this one gets into the induction programs in tech.

And secondly, just a little footnote here to your achievements. I just wanna say that my own book was influenced by a fantastic woman, a long-time mentor of mine called Diane Coyle. Ironically, when I flew to Chicago, I boarded the plane and realized I was sitting next to Diane Coyle. I texted my mom to say, "I wanna be a whole lot more intelligent when I get off this plane than when I got on it." By the time the plane landed in Chicago, she had convinced me to start writing my book. So again, I'm just really pleased to see that her fingerprints are all over your work as well. But Stephanie, thank you so-

Stephanie Hare: Yes.

Will Page: ... much for coming on the podcast and to stress to every HR department, shove this book into your induction program and make sure the next batch of engineers and data scientists you hire are reading it from cover to cover. Thank you so much.

Stephanie Hare: And give it to your CEO. [laughs]

Will Page: Absolutely. [laughs]

Stephanie Hare: Thank you so much. Thank you, Richard.

Richard Kramer: If you're new to Bubble Trouble, we hope you'll follow the show wherever you listen to podcasts. Bubble Trouble is produced by Eric [Newsom 00:40:51], Jesse Baker and Julia Nat at Magnificent Noise. You can learn more at BubbleTroublePodcast.com. Will Page and I will see you next time.