Noah Raford Can Help You Prepare for a Not-So-Nice Future

We spoke with a futurist to understand the difference between predicting what's coming down the pike and being ready for it emotionally.

On this week’s episode of Have a Nice Future, Gideon Lichfield and Lauren Goode are joined by someone whose full-time job was to predict the future. Noah Raford spent nearly 15 years working as the chief futurist for the United Arab Emirates, where he advised the government on how to prepare for all sorts of futuristic challenges, from pandemics to global warming. His advice? Get comfortable with discomfort.  

Show Notes

Check out our coverage on climate change, including some ideas on how to talk to your kids about it. Don’t miss our stories on AI and ChatGPT, especially Lauren’s review of ChatGPT, Bing Chat, and Bard.

Lauren Goode is @LaurenGoode. Gideon Lichfield is @glichfield. Bling the main hotline at @WIRED.

How to Listen

You can always listen to this week's podcast through the audio player on this page, but if you want to subscribe for free to get every episode, here's how:

If you're on an iPhone or iPad, just tap this link, or open the app called Podcasts and search for Have a Nice Future. If you use Android, you can find us in the Google Podcasts app just by tapping here. You can also download an app like Overcast or Pocket Casts, and search for Have a Nice Future. We’re on Spotify too.

Transcript

Note: This is an automated transcript, which may contain errors.

Lauren Goode: Hi, I'm Lauren Goode.

Gideon Lichfield: And I'm Gideon Lichfield, and this is Have a Nice Future, a show about how fast everything is changing.

Archival audioclip (Back to the Future): I want to help you.

Gideon Lichfield: Each week, we talk to someone with a big audacious idea about the future and ask, "Is this really the future we want?"

Archival audioclip (Back to the Future): Doc!

Gideon Lichfield: This week, our guest is Noah Raford, who for nearly 15 years served as an in-house futurist for the government of the UAE. 

Archival audioclip (Back to the Future): I'm from the future.

Lauren Goode: All right, I'm gonna ask the question that everyone's wondering about: What is a futurist? Is this a real job?

Gideon Lichfield: Well, I mean, I think some people imagine it's just, you know, a guy who sits around making predictions about the future, and there are probably some people who do just that. But Noah calls himself an applied futurist, by which he means that he studies trends—technological, economic, demographic, political, you name it. And then he works within institutions like the government to help them take those trends into account in their decisionmaking and their policies. So how should they think about the impact of AI, for instance?

Noah Raford (audioclip): I was having a conversation with a Ghanaian friend recently, and the potential for AI and all the tools which represent that kinda larger space has for basically dysfunctional failed states and second-order semi-functional states, which is most of the world in some argument, is absolutely huge.

Lauren Goode: All right, so he sounds pretty positive on AI, but we don't actually know how that's going to shake out yet. I'm wondering if you two talked about things he has predicted accurately in the past, some calls he actually got right.

Gideon Lichfield: I mean, he made a pretty good call on Covid.

Noah Raford (audioclip): Called that early. I stopped going into work two months before it became a commonly accepted thing. When I had to go into work, I was wearing masks and everyone was yelling at me like, you're freaking people out. Why are you doing this?

Gideon Lichfield: And of course we talked about climate change.

Noah Raford (audioclip): We still labor under the belief that we can stop climate change, and that's just not true.

Lauren Goode: Huh. That's pretty jarring to hear. What are we supposed to do with that kind of information?

Gideon Lichfield: Well, I think there was this undercurrent to the conversation with Noah, which was that being a futurist is not just about predicting the future or even about working with governments or other institutions to capitalize on it, but it's about being ready for it emotionally. Noah's kind of like a future therapist.

Archival audioclip (2001: A Space Odyssey): I can see you are really upset about this. You ought to sit down calmly, take a stress pill, and think things over.

Lauren Goode: Well, therapy typically leaves me feeling a little drained and like maybe I need three days to process whatever was discussed. So I look forward to hearing your conversation with Noah and then maybe you and I can cry together afterward. 

Gideon Lichfield: Yeah, I mean, we may cry, but I think you'll feel somewhat uplifted because even though he says some really alarming things, uh, I think Noah takes the premise of the show and kind of flips it on its head. For him, the question isn't “Is this the future we want?” It's more, “How do we get ready for the future that's coming, both technically and emotionally?” That conversation with Noah Raford is after the break.

[Break]

Gideon Lichfield: Noah, thanks for joining me on Have a Nice Future. Are you having a nice future? 

Noah Raford: I'm having a great future.

Gideon Lichfield: Well, I would kind of expect a futurist to be having a great future. But this future that’s coming—it's climate change, it's AI, it's floundering democracies, and as you readily admit, it's pretty scary. So how do we prepare for that? 

Noah Raford: Yeah, there's a cognitive fallacy which I like to use called the futurological materialism trap, and it comes from an old book on Buddhism called Spiritual Materialism, the idea that if you just meditate enough, you're gonna achieve enlightenment and that enlightenment is an escape from suffering. And if you dig deeply into Buddhism and spiritual traditions, it's not the case. You can't … Enlightenment is not the alleviation of suffering, nor is analysis the alleviation of uncertainty about the future. The futurological materialism fallacy states that there is a certain amount of data which you can analyze and that if you just think about the right thing, if you pay attention to the right thing, you won't get surprised or you'll find the advantage. And that's an illusion, that's a trap, because there is no right thing. Everything is falling apart at once. Everything is changing simultaneously. And there's no amount of data you can analyze, there's no amount of trends which you can research or time which you can spend to actually discover what's gonna really happen in the future. You just end up stressing yourself out and missing the real changes which are happening around you 'cause you don't know what to prioritize and you spend all your time in this analysis paralysis.

Gideon Lichfield: I see. That's a lovely parallel. I like that a lot. And I think it's kind of reassuring …

Noah Raford: It is.

Gideon Lichfield: … to understand that we can spend time thinking and we don't have to find the answer. That doesn't mean that we've done a bad job.

Noah Raford: That's precisely right. And then that opens to the question of, well, what can you do? If it's not a question of data collection and of analysis, then what is the right response to have here? What is a healthy emotional response to the terror of the future and the excitement of the opportunities presented to us? And really that gets down to something which has nothing to do with analysis or even thinking. That's much more of an emotional response to the actions which we're able to take in our lives, be they large or small. And a funny application of this is when it comes to climate change. Objectively, it is materially irrelevant if you recycle or not. Objectively, it is absolutely irrelevant to the course of this planet if you use plastic straws or not. It makes zero difference to the outcome of climate change. However, subjectively, that might be an action which you can take consciously, which makes you feel better about being aware of this issue and trying to gain some power over that huge terrifying issue in your life through small actions. And that's important. Because the key thing here when you face this terror of the future is for us to be able to translate that terror into excitement and opportunity is we have to feel some sense of agency, and the only way to feel agency is through action, small, medium, and large.

Gideon Lichfield: So taking action is more important than whether it's the right action—the perfect action to take. The most important thing is to have agency and do something. What are one or two things that you think are important, pretty likely future trends that most people are not paying enough attention to—maybe one that's scary and one that's positive? 

Noah Raford: Well, my go-to answer for this historically has been climate change and AI. Thankfully, we're becoming more and more aware of the reality of climate change, but we still labor under the belief that we can stop climate change and that's just not true. We cannot. We cannot. The damage has been done. We might be able to lessen the consequences of it, but ultimately, our entire lives and certainly our children's entire lives are going to be spent dealing with the consequences of climate change. And that's going to produce dramatic, extraordinary historic shifts in all areas of our lives, which even those of us who do this on a day-to-day basis have a hard time really emotionally accepting. We're likely to see a billion-plus climate refugees in our lifetime, which is going to drive state failure, border clashes, xenophobia, racism, currency collapse, the collapse of electricity grids, food stresses, more pandemics, that's a terrifying thing to really live with. About once or twice a year, I have a conversation with my children that we're pretty lucky, but it's a pretty high likelihood that we're gonna be refugees at some point in our life.

Gideon Lichfield: That's quite a lot to set them up for.

Noah Raford: It's quite a lot. And what's been interesting about this in terms of both parenting but how you or all of us deal with these sorts of scary, big changes over time, is when you first have that conversation, it's terrifying and you ignore it, or you deny it. And the second time, you accept a little bit more, and the third time, it's scary but you're able to sit with that fear a little bit more, and the fourth and the fifth and the sixth time … So by having this conversation every six months, over time for years and years and years, you get to this point that is normalizing an emotional experience of dramatic change and it shifts from being a terrifying life-ending horror show into something which is just like, "Okay, yeah, if it happens, what will we do? Where would we go? How would you adapt? What might be exciting about this? Who would you wanna spend time with? Where would … " These sorts of things. It becomes more proactive 'cause you're gradually taking away the terror and the emotional pain of our comfortable lives being ripped away from us and therefore preparing for a more exciting future.

Gideon Lichfield: What’s something that’s coming up that you think we should be excited by instead of terrified by? 

Noah Raford: Right. So that's the terrifying question. And again, my traditional go-to answer here has been AI, which was something which very few people outside of the field were paying attention to until about, I don't know, two months ago when GPT-3 came out.

Gideon Lichfield: Okay, so now everybody knows about GPT and its variants, but you're also seeing people warning about use cases, like someone can just use it to cook up any number of recipes for toxins and how to distribute them, all sorts of possibilities for mischief that have just now become much easier. So what makes you confident that it's net positive rather than negative? 

Noah Raford: The thing that excites me about that is that, unlike most other emerging technologies to date, the degree of public discourse around the risks is exceedingly high. Like, my father, who's quite intelligent but pretty elderly at this point and not quite engaged in these issues, is forwarding me articles about OpenAI's discussion of the risks of this for misinformation and for using this for toxic chemicals and these types of things. So the degree of public awareness of this and the degree of conversation in the public realm about the risks and opportunities is really quite extraordinary. It's quite extraordinarily sophisticated. And it's just the tip of the iceberg here. We're going to see all sorts of regulatory experimentation around this soon as people struggle to deal with the copyright implications and the cross-border or jurisdictional issues of these things, and the issue of IP. If you upload a document that is under NDA, or medical data, to one of these services, then who owns that? Where does that go? Have you breached HIPAA terms? All this stuff is really materially present right now and is being debated in every industry right now at a very advanced level. And that gives me hope.

Gideon Lichfield: There was this survey of 700-something AI experts that has been making the rounds, which asked them, basically, what was the likelihood of AI leading to human extinction or serious incapacitation of the human race as a civilization—words to that effect—and 50 percent of these experts said they thought there was a more than 10 percent chance. Isn't that a pretty scary number? 

Noah Raford: I think it is pretty scary, but also I think that's a pretty honest assessment of the existential risks which we face, not just with AI, but really in terms of climate, in terms of nuclear exchange, in terms of additional pandemics, in terms of the total collapse of the financial system. I think that part of being future savvy in today's world is looking clear-eyed at the real terrible potentials of these technologies and not just turning your back on them like a bad parent who doesn't wanna have anything to do with the world their children are living in, but actually really engaging in these questions. Now, is 10 percent too high? Does that mean we shouldn't try to explore the implications and benefits of this? I would argue, no. 

Gideon Lichfield: Does it mean though we should slow down the deployment of it, maybe? 

Noah Raford: Is that even possible? 

Gideon Lichfield: That's a fair question, but it certainly makes the case that I think that you're making, which is that we should be talking about this a hell of a lot more.

Noah Raford: Absolutely. That's my fundamental point when it comes to AI. The thing that is encouraging about that is that unlike greenhouse gases or cigarettes, or the dangers of vaping, or the dangers of automobile reliance, or obesity, or diabetes, these chronic conditions, unlike so many of these other large social issues, there is a pretty robust conversation going on around AI because it has exploded into everyone's lives in such a public way that people can experience it on a day-to-day, personal basis. And I think that's at least the beginning of an exciting conversation around this. And that coupled with the extraordinary potential which this offers is fundamentally exciting, because one way or another, we're going to have a forcing moment in the next decade or so between climate, economic failure, state collapse, large-scale populist movements and even potentially revolution, and the absolute destruction of much of the labor force by AI that is gonna force us to fundamentally reevaluate what this means for us as a society. What does economy mean? What does society mean? What is the world that we actually wanna build here? And that conversation is gonna probably be violent. It's probably gonna be uncomfortable. It's gonna have different forms in different countries. But it's going to become the genesis of what the next couple of versions or iterations of human society look like through a decade or two of experimentation and discord and strife. And that's freaking exciting because the world is falling apart and we need to start building a new world.

Gideon Lichfield: I can't wait. And I'm also thinking of the privilege that people like you and I have to be the ones who can lead and convene and take part in that conversation instead of just being washed back and forth by the consequences of it.

Noah Raford: Absolutely, but one of the things that … Obviously, we're extraordinarily privileged to be able to think about these things and talk about these things, but one of the things that encourages me—and I still live in Dubai, and Dubai is the crossroads of human civilization—there are more expats who weren't born in the UAE living in Dubai than any other city in the world. And you have these conversations in coffee shops, at the market, in your office with so many different people, and people from all walks of life are engaging with this. And I'll tell you, the people who I've found who are most excited by this are people coming from developing contexts. This is South Asia, this is East Asia, this is North Africa. Why? Because these are the people who have historically been screwed out of the benefits of the 20th-century globalized project. They've been the ones on the receiving end of the extractive mechanism. And now suddenly, at least for the moment, there's the glimmer of hope, the possibility that that system is falling apart, cracks in the world order are emerging, and we have at our fingertips these extraordinarily generative tools to try to build new businesses and build new companies and societies and new ways of doing things. And people are excited. Talk to an 18-year-old—

Gideon Lichfield: That's so interesting that they are seeing the opportunity and the excitement in this. That kind of gives me hope as well.

Noah Raford: Absolutely. Talk to a 14-year-old Indian kid about this in Bangalore, they are losing their mind with the potential for this. It's incredible. Talk to … I was having a conversation with a Ghanaian friend recently, and the potential for AI and all the tools which represent that kinda larger space has for basically dysfunctional failed states and second-order semi-functional states, which is most of the world in some argument, is absolutely huge. And so really, there shouldn't be a sort of sense of privilege guilt around talking about this stuff 'cause we're not the only ones talking about it. And in fact, we're probably the ones who are most modestly using this. There's a lot of weird experiments going on in the shadows that are gonna define what tomorrow looks like.

Gideon Lichfield: So Dubai has been home for you for …

Noah Raford: That's 13, 14 years.

Gideon Lichfield: Right. And you spent a lot of that period working as the chief futurist for the government of Dubai. And Dubai is a place that makes some people in the West kind of uncomfortable. It's not a democratic government. It's kind of not an egalitarian place, very advanced in things like surveillance tech. Why did you spend so long there and what did you learn from it? What is … Why is Dubai so important to you? 

Noah Raford: It's such a good question, 'cause Dubai is one of those things which means so many different things to so many different people, depending on the eyes through which you look at it. And I think that it's quite typical, certainly, if you've never been there, if you just read about it in the press, particularly if you grew up in the West, Western Europe, UK, France, the United States, that you get a particularly filtered version of this. When I first moved there, working in the prime minister's office, the thing that was just so profoundly shocking and surprising and invigorating to me was that it was, like I say, it was globalization with the wool pulled off your eyes. It was this huge transit hub of ideas, of people, of attitudes, of beliefs, of goods, of services from literally almost every country in the world living together, working together, trading together, without the pretense and the intermediation that we normally experience each other through. Most of us don't spend a lot of time in India or China, or North Africa, or Russia, or Ukraine, or even other states in America. And so in that sense, it gives you this extraordinary vantage point to see how … Literally, I'll give you an example, one of the things which I gained a tiny degree of notoriety for in my job was like I predicted Covid early. And it's not because I'm really smart or I had some vast surveillance apparatus around me or something of that nature. It's because I live in Dubai and there's two-thirds of the world's population within an eight-hour flight there, and almost all economic activity in the region comes through Dubai. I have lots of friends in Singapore. I have lots of friends in China. I have lots of friends in India. And I'm in touch with them on a relatively frequent basis. 
So when things started unfolding in Wuhan and in China with Covid at the end of December, beginning of January, it didn't take a genius to figure out how many flights a day there are between China and the UAE, and to think that this is something which could really break.

Gideon Lichfield: Right, and that's something that you could just see from where you sat, basically.

Noah Raford: Precisely. Precisely. And so I think that's what's so extraordinary about Dubai: it really is a city of the future in the sense that because it is the epicenter of all of these different threads of human civilization and of economy and politics and ideology and belief and culture, everything is kind of there.

Gideon Lichfield: Do you think … Would you argue that Dubai is actually doing a better job of developing this technology in a way that is attentive to ethics and to social needs? 

Noah Raford: Yeah, absolutely, I absolutely would. I think there's a dozen examples. We could take cryptocurrency and blockchain as an example. Early on, we started working in that space in 2014. We created an industry and civil society group called the Dubai Global Blockchain Council in 2014 that had all the big banks, all the big regulators, all the big tech companies, a bunch of the most interesting startups at the time, as well as a bunch of academics and entrepreneurs in it. And what did we do? We first just started talking about it. "What do we mean by this? What is this thing?" This was a big collaborative civil society effort, and we did a bunch of public events, started having conversations in the media about it, building a dialogue around this in the public sphere. And then after kind of coming up with some understanding of what the risks and the opportunities were, each of the people in the organization and the industry association started to do experiments and prototypes, proof of concepts. "What does it mean for a bank? What does it mean for a logistics company? What does it mean for a car company? What does it mean for education certification and degrees, and for the health care authority?" And by doing these little experiments, we were able to do several things at once, which was first remove the emotional terror from the novelty of this new way of doing things, which is the biggest barrier to change, and second, provide strong coalitions of people who supported it in the industry, who had skin in the game and who were willing to test it out, that led to not only the legalization of crypto in 2016, but also a very ambitious strategy to implement cryptocurrency technologies in all of the public sector applications. And I think that's a … I think what … The US is still arguing over how to regulate this stuff.

Gideon Lichfield: What's been your best call and your worst call about the future? 

Noah Raford: I think Covid was a great call. Called that early. I stopped going to work two months before it became a commonly accepted thing. When I had to go into work, I was wearing masks and everyone was yelling at me like, "You're freaking people out. Why are you doing this? [chuckle] Stop this. Our staff is getting scared. We don't have an official statement out yet." But I said, "No, this is a big deal, this is coming." And in fact, my Covid escape story was like a scene from the movie Children of Men. I escaped Dubai two days before the border closed. I arrived in the UK to pick up my kids and family, managed to convince them in a 48-hour period that this was a really big deal. We rented a car, drove through the Channel to France two hours before the borders closed, found a place to stay two hours before the national lockdown in France occurred, and spent four months on an organic pig farm in Provence, and it was awesome.

Gideon Lichfield: That sounds like a good call.

Noah Raford: It was awesome. So that was definitely probably my shining moment as an applied futurist there.

Gideon Lichfield: And as a father.

Noah Raford: And as a father, yeah. So that was the moment where all those years of conversation of like, "We might be refugees, guys. Things might change in really surprising and traumatic ways really fast." All that stuff paid off.

Gideon Lichfield: And what was your worst call? 

Noah Raford: What was my worst call? I'd have to say self-driving cars.

Gideon Lichfield: What did you say about them? 

Noah Raford: Well, I was big on … I was big on autonomous transport quite early, part of my background as an urban planner. And I took one of the first delegations of all senior Dubai government officials to San Francisco back in 2014 or so, 2014, '15. We visited Google X before it was Waymo, and a lot of these self-driving car companies then. And it looked to me that the development curve of AI and computer vision and autonomy was gonna yield pretty dramatic changes in the mobility space over the next five, 10 years. And when you're talking about road-building programs, that means you have to start changing the way you're doing things now. So, we went really big on self-driving cars and autonomous transport and developed a big autonomous transport strategy, and just the industry hasn't really gotten there.

Gideon Lichfield: What keeps you up at night? 

Noah Raford: Well, we certainly have not seen the last pandemic, that's for sure. I was reading a study the other day by Development Institute. They were estimating something like a 25 percent chance of another corona-scale pandemic within the next 10 years and a 50 percent chance within the next 20 years. That's just one of those things that is so far beyond our control that we have really not prepared for, that would really, really ruin a lot of people's days. So I'm really worried about that. I'm still worried about uncontrolled nuclear releases. I'm really worried … Actually, speaking about the pandemic thing, I'm really worried about a lot of the evidence I'm seeing coming out of the Siberian tundra that's thawing. There's all sorts of unbelievably nasty viruses from 15,000 years ago, like new strains of anthrax and stuff, which have been lying dormant in the Siberian tundra, which are just gonna get released over the next decade or two. And we have—

Gideon Lichfield: You started talking about the tundra and I thought you're gonna say methane.

Noah Raford: I wish. I wish methane was … Methane's a long, slow problem for us. But we are likely to face some emerging infectious disease or virus coming out of melted tundra in Siberia sooner rather than later, and we have no idea what that looks like. And all the biologists which I've been reading around the space are pretty freaking terrified about it.

Gideon Lichfield: Okay. And then finally, what gives you cause for optimism? 

Noah Raford: I mean, it's exciting, right? It's exciting. I just started my own company, an AI company, which is super exciting to be able to have really the power of billions of dollars of computational data analysis in your hands for pennies, and to be able to build niche services around that, build really valuable companies around little niches that serve a specific audience really, really well. It's so exciting. There's so many life-changing amounts of money and life-changing businesses that are going to be made in the next two, three years around AI by itself, that it should give people hope. It should give people some immediate day-to-day sense of encouragement and enthusiasm that we can actually do something that is definitely gonna change our lives and our family's lives, if not "save the world." But as a whole, so many people are experimenting in these spaces, be it from biotech to new financial services, to new applications of AI, that it's just deeply exciting.

Gideon Lichfield: Noah, you are one of the people that I know that most lives in the future. And I'm gonna sign off by telling you to have a nice future because I'm pretty sure you're gonna have one.

Noah Raford: Let's have a nice future together.

Gideon Lichfield: Thank you, Noah.

Noah Raford: Thanks, Gideon.

[Break]

Gideon Lichfield: So, how did that make you feel, Lauren? 

Lauren Goode: Well, let me just say, I would love to be a fly on the wall during one of Noah Raford's dinner conversations with his kids.

Gideon Lichfield: Right, the ones where he says, "You're gonna be refugees."

Lauren Goode: Yes, every six months or so, just call for the family meeting and say, "Kids, we're preparing for the worst." I can't tell whether that instills unnecessary fear in future generations, or whether that's actually the smart thing to do. It seems like it's a pretty smart thing to do.

Gideon Lichfield: Seems like it's a smart thing to do to me. I mean, we're afraid of... We can't imagine conversations like that because we're precisely afraid of that possibility. Can you imagine becoming a refugee? But people have grown up in all sorts of different times of history with all kinds of different expectations about what the future is, or even the concept of the future. So, I feel like if you get people used to the fact that it's going to look or could look a certain way, then maybe it's a lot less traumatic when it finally happens.

Lauren Goode: Right. And it made me think also based on what he said about pandemics, that maybe we should be having those conversations once every three to six months among ourselves to say, "How are we going to handle it if or when the next pandemic happens?"

Gideon Lichfield: Right. It's almost like being preppers in the US, people who are just expecting the worst and are ready for it to happen. It's easy to laugh at them, but then there is something about that feeling of being … Feeling ready, feeling secure, that even if the worst happens, you can handle it. I think that might be psychologically helpful to a lot of people, especially in the world we're going into.

Lauren Goode: The second thing that came to mind was, I wasn't sure what to make of his approach to thinking about climate change and I wanted to hear your thoughts on this. Because on the one hand, he's saying we should do small things in order to feel a sense of agency, but on the other hand, he basically says that what we do does not matter. We could recycle all the stuff we're supposed to recycle, we could bike to work as I've started doing and it just doesn't matter. That was a little bit depressing to hear because this recent IPCC report makes it clear just how dire things are and that we really need swift, wide-scale reduction in the use of fossil fuels if we're going to avoid reaching catastrophic global temperatures. And yet, Noah is saying it doesn't matter. Everything we do doesn't matter.

Gideon Lichfield: I mean, I think …

Lauren Goode: How did you square this? 

Gideon Lichfield: I think … The way I took it was he was saying, yes, individual actions like biking to work one day or recycling this bottle are not going to make a difference, but I think if we are doing these things, taking responsibility for the way our actions might impact the climate, then two things happen: One, you get a sense of agency that I think is just good for your mental well-being, and two, you become part of this fabric of people who are taking these actions, thinking about these things and making the question of climate more central in their lives just by virtue of action. And I think it influences the decisions that we make around which politicians we support, or which campaigns we support, or which companies we buy from. And those decisions ultimately influence the people whose decisions do matter because those are the big policy and commercial decisions that ultimately change the course of what fuels we use or what energy gets expended.

Lauren Goode: Here's the other thing that you guys talked about that really stood out to me: AI. And I thought he really helped put this in perspective for me, because we as journalists are a little bit alarmed right now by the release of generative AI, and I think we should be, but Noah also brought up a fair point about how there may be a 14-year-old kid in another part of the world, and he mentions Bangalore specifically, who is seeing generative AI as a tool or as an opportunity, and that some of the technological developments of the past 20 years may have left large portions of the population behind and this is a chance for people to actually sort of get on … on even ground. At the same time, I'm not entirely convinced that generative AI will be any different from those advancements because we're already seeing how it's being released within a sort of capitalistic structure and how we, people, are being used as inputs for the training data. What did you make of that? 

Gideon Lichfield: Of course, there are going to be the 14-year-olds in Bangalore who will be able to have opportunities they wouldn't have had otherwise because of generative AI. For me, there's always this quote that sticks in my mind from Zeynep Tufekci, who's a sociologist of technology, from a piece that she wrote that I commissioned: "Power always learns and powerful tools always fall into its hands." And the point that she's making is that the structural political and economic power arrangements that exist tend to reassert themselves. They are slower to catch up to new technologies, but they ultimately do catch up. So I am skeptical that these tools are gonna fundamentally create a more level playing field. They will give opportunities to some.

Lauren Goode: Say a little bit more about that.

Gideon Lichfield: I mean that we will see the kid from Bangalore become the leader of a multibillion-dollar company, and in fact, that's already happened with some of the top companies in the US. But those power structures that they come into are existing power structures. So is India going to overtake the US because of generative AI? No. Is India gonna overtake the US for other reasons? Possibly. But I don't think technology is the driving force here. It's demographic questions, it's economic questions, it's geopolitical questions that are driving who ultimately ends up on top.

Lauren Goode: Yeah, and especially since these generative AI tools are coming from these capitalistic enterprises. They have roots in academic research, but right now it's kind of a race among these companies, these tech companies, to just put things out into the world and best each other.

Gideon Lichfield: Yeah.

Lauren Goode: That doesn't seem to bode super well for us as humans. We're all humans, but that doesn't seem to bode super well for us as the average tech consumers.

Gideon Lichfield: One of the things that for me was interesting was how he talked about the UAE as this transit hub for people all over the world. I think a lot about the future of government and democracy, and one of the things that's clear is that the notion that I grew up with, that liberal democracy is just gonna take over the entire world, that's not going to happen. And what Noah was hinting at for me was a future in which there are many different kinds of government systems around the world, and it's not necessarily obvious that the democracies are the ones that are best at serving their citizens or at keeping up with the pace of technological development and how to regulate it. I think he described how he felt like Dubai is much more on top of the pace of technological change and how to keep it doing the best things for its society. So I feel like that's an uncomfortable position for us in the western world because we grew up with this notion that our political system is necessarily going to yield the best results, the most equal results for everybody. And Noah got me thinking a lot about just how true or untrue that is.

Lauren Goode: How hopeful did you feel walking away from your conversation with him? 

Gideon Lichfield: I felt more hopeful simply because I saw from him that there is a way to think about the future that doesn't require you to feel like you have to be able to predict the future. It's more about a mindset of preparedness for it. Now, sure, I think that also comes with a certain amount of privilege that someone like Noah has, or someone like you or I have, that we're in a good position to be able to capitalize on wherever the future is going and be prepared for it. That isn't true for everyone. But at least it gave me some reassurance that you don't have to know where things are going in order to survive them.

Lauren Goode: So it's not like he told you we're going to have a nice future, but he told you we can at least be prepared for a not-so-nice future? 

Gideon Lichfield: Umm, yeah.

Lauren Goode: That's our show for today. Thanks for listening. Have a Nice Future is hosted by me, Lauren Goode.

Gideon Lichfield: And me, Gideon Lichfield. 

Lauren Goode: If you like the show, you can leave us a review wherever you get your podcasts, and follow us to hear more episodes. 

Gideon Lichfield: Have a Nice Future is a production of Condé Nast Entertainment. Danielle Hewitt and Lena Richards from Prologue Projects produce the show.

Lauren Goode: See you back here next Wednesday.