Physicist Jim Al-Khalili (of Iraqi descent) explains how to get out from under the cloud of misinformation. “Every now and again, I go off and I will make a TV documentary. I enjoy communicating complex ideas as well as discovering them for myself.” Al-Khalili teaches physics and conducts research at the University of Surrey.
Brian Gallagher in Nautilus: Jim Al-Khalili has an enviable gig. The Iraqi-British scientist gets to ponder some of the deepest questions—What is time? How do nature’s forces work?—while living the life of a TV and radio personality. Al-Khalili hosts The Life Scientific, a show on BBC Radio 4 featuring his interviews with scientists on the impact of their research and what inspires and motivates them. He’s also presented documentaries and authored popular science books, including a novel, Sunfall, about the crisis that unfolds when, in 2041, Earth’s magnetic field starts to fail. His latest book, The Joy of Science, is his response to a different crisis.
“The Joy of Science was motivated by this sense that a lot of us have, that public discourse is becoming increasingly polarized,” Al-Khalili tells Nautilus. “There seems to be a rise in irrational, anti-scientific thinking, and conspiracy theories. And there’s no room for debate, particularly amplified by the internet and social media.” His message is that we should all be thinking more critically. “If we could export some of the ideas of science, when science is done well, into everyday life, I think we would all be happier, more empowered.”
Al-Khalili tells me that doling out advice is quite the departure for him. But after a long career in physics and science communication, he says with a laugh, “I’ve reached that stage where I arrogantly think I can impart wisdom to the world.” In our interview, Al-Khalili discusses, among other things, the unprecedented level of cognitive dissonance nowadays, what’s wrong with Occam’s razor, and whether ideological thinking conflicts with a scientific mindset. He also defends “scientific realism,” and walks me through a puzzle about light that Einstein dreamt up as a teenager.
What drove you to write a book about living by the scientific method now?
We are bombarded by information all the time, and your average person really doesn’t know who or what to trust. But we can learn to know who and what to trust. We can employ some of the ways that we do science—examining biases, the importance of uncertainty, being prepared to change your mind in the light of new evidence. Those sorts of things go against human nature because we want to be right about our opinions. We don’t like to be told we are wrong. But that’s not the way we do things in science.
Do you have a memorable example of a scientist admitting they made a mistake?
I have a lovely story. A few years ago, I made a documentary for the BBC called Gravity and Me. We’d finished filming, I was due to go into the studio to do the voiceover, and it was due to be aired on British TV a few weeks later, when we discovered that I’d made a mistake. I was trying to explain the idea of clocks running at different rates in Earth’s gravity. Because time runs slower, not just when you travel close to the speed of light, but also when you are in a strong gravitational field. We went back to the BBC and said, “Look, hold the transmission. We’ve made a mistake.” And they said, “Fine. We’ll do that. Reshoot all the stuff that you got wrong, put in the correct stuff, and no one will be the wiser.”
And I said, “Actually, this is a really good opportunity to explain how science works and that we do make mistakes and that it’s okay to make mistakes. How about if I make it part of the documentary? I say, ‘Unfortunately, at this point I realized I’d got it wrong and in fact, it’s such and such.’” And the people at the BBC, the commissioning editors, were quite nervous about this. They said, “Oh, Jim, we are concerned about your reputation as a professor of physics if you admit your mistake publicly like that.” I said, “Well, clearly you don’t understand how science works. It’s not something to be ashamed of to admit you are wrong.”
And I stuck to my guns, and we absolutely made sure that was part of the documentary. I was getting emails from people afterward, saying, “Oh, Jim, you’re so brave to admit your mistake.” I said, “No. It’s great. I mean, that’s how we learn, that’s how we do science. There’s nothing wrong with that.”
Science is carried out and funded by humans with various biases and motives. But would you say it still is a uniquely trustworthy enterprise?
This is not an easy issue. Science of course is very broad. In my area of research in theoretical physics, to a large extent it is value-free. The equations of quantum mechanics that I might come up with or write down will be exactly the same, whether they’re discovered by physicists in China or Russia. There’s a universality about the laws of physics that transcend cultures and political ideologies. But of course there are lots of areas of science, particularly in the social sciences, dealing with the complexity of human behavior, where it’s difficult to avoid value judgements and biases. And that’s just the way scientists have to behave, to try and remove biases, or examine their own biases.
It’s even more difficult for the wider public, who are not trained in science, to know who to trust and what to trust. You see something on YouTube or you read an article online—how do you know (a) whether it’s good science and it’s based on firm evidence and data, and (b) whether whoever is getting that idea across has their own vested interests? Many scientists work for corporations and industry, in the pay of people who do have other vested interests, so it is difficult.
My message is that you shouldn’t take a lot of these ideas at face value. We have to invest some effort into digging in to find out whether something comes from a reputable source or not. To some extent, we may have to rely on technology to help us do that filtering. But even that comes with its dangers. Who’s creating the AI that’s telling you what is fake news and what is good news? As a society, we have to have this discussion because we need to know how to discriminate among all the information that we are being bombarded with every day.
How confident are you that AI can be relied on to show us trustworthy information?
Well, I’m quite nervous about how well we can utilize AI. But we are going to have to use AI to help us filter the trustworthy information from the misinformation and disinformation. But the problem is, who creates that AI algorithm? If it’s Google or Facebook that is filtering what we receive, and they say, “Look, we’ve removed all this other stuff because that’s misinformation.” Well, who says? Is that AI built by someone with an ideological stance? We’re going to have to figure out ways of making sure that AI is completely neutral on this matter. Maybe it’s providing us with a forum where we can debate things a bit more rationally and civilly than we are at the moment. There’s too much information out there for us as a society to develop our own rational skills to decide for ourselves. We’re going to have to make use of technology, but we have to be very careful about how we implement it.
How helpful is Occam’s razor—the idea of favoring simple explanations—in deciding where to place our trust?
William of Occam was this medieval monk who actually lived very near to my university, University of Surrey in England, and the razor that’s named after him is simply that if you have lots of different explanations, chances are the simplest one is the right one. That served us well in science, but there are dangerous pitfalls because things aren’t always as simple as we’d like them to be. And when you apply that in everyday life, it’s even more problematic because we are living in a world now where we want the simplest explanation.
“Don’t blind me with details. This is what I believe, this simple idea. And this is what I’m going to go with.” Very often, issues that we have to deal with in everyday life are more complicated. Not everything can be reduced to a meme or a tweet. And yet we see the problems we have today, with the polarization of ideologies, particularly on social media, where each side is so absolute and certain in their position, and they don’t want to acknowledge that actually an issue is more complex, more complicated, more nuanced.
How would you revise Occam’s razor?
Maybe, “It’s not the simplest explanation that is the right one, but the most useful explanation.” It could be that sometimes—and certainly in science if we want to describe a concept—it is more complicated than we’d like it to be, and we have to acknowledge that and bite that bullet.
In the book, you mention a thought experiment that Einstein, as a teenager, came up with to get a handle on the unintuitive behavior of light. He wondered: If you were traveling at the speed of light, holding a mirror in front of your face, would you see your reflection? How do you answer that?
The issue is if you are flying at the speed of light and the mirror is in front of you, to see your face reflected in the mirror, light has to bounce off your face, onto the mirror, and then back into your eyes again. But if you are traveling at the speed of light, how can light ever overtake you, reach the mirror and come back again?
The answer is, Yes, we will always see our reflection because Einstein’s theory of relativity tells us that all motion is relative. I’m traveling at the speed of light—according to what reference? There will always be a frame of reference in which I can say, I’m not moving at all. And this is Einstein’s great breakthrough in 1905, his special theory of relativity, which says that the speed of light is absolute. It doesn’t matter how fast you’re moving, you will always see light traveling at that same speed, the maximum speed in our universe.
And so me flying, holding a mirror in front of my face, will be no different to me standing still holding a mirror in front of my face. I always see my reflection. Relativity theory forces us to rethink the notions of distances and time intervals. The example I always give to my students is this: I shine a torch out into the sky, so the light from the torch is traveling at the speed of light away from me, standing here on Earth. Then you, Brian, jump in a rocket and fly off at, say, 99 percent of the speed of light, trying to catch that light beam, traveling parallel to it. I would see the light beam overtaking you slowly, at 1 percent of the speed of light, because you are going almost as fast as it, and that makes logical sense. But for you in the rocket, you see that same light beam going past you at the same speed that I see it leaving my torch. So something has to give, and what gives is our notion of the flow of time.
I would see your time as running much more slowly than mine. Your seconds are ticking by slowly. That’s why you see the light beam going past you so quickly: your time is running slower. In one of your seconds, the light beam has shot far past you, while I see it creeping past. So the notions of distance and time change. And that’s where relativity theory becomes counterintuitive and fun to teach.
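Al-Khalili’s thought experiment can be made quantitative. The following is a brief sketch of the standard special-relativity arithmetic behind it (the numbers are illustrative, not from the interview): the relativistic velocity-addition rule shows why the rocket still measures the beam at c, and the Lorentz factor at 99 percent of light speed says how much slower the rocket’s clock appears to tick from Earth.

```latex
% Speed of the light beam (u = c) as measured from the rocket moving at v:
% relativistic velocity addition, valid for any v < c.
\[
  w \;=\; \frac{u - v}{1 - uv/c^2}
    \;=\; \frac{c - v}{1 - v/c}
    \;=\; c
\]
% Time dilation at v = 0.99c:
\[
  \gamma \;=\; \frac{1}{\sqrt{1 - v^2/c^2}}
         \;=\; \frac{1}{\sqrt{1 - 0.99^2}}
         \;\approx\; 7.1
\]
% So from Earth, roughly 7.1 seconds elapse for every one second
% that ticks by on the rocket.
```

Both results follow directly from the postulate Al-Khalili states above: the speed of light is the same in every frame of reference.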
Do you think we can know reality, the world “out there,” as it truly is, or is it more complicated than that?
This is an age-old question and it particularly came to the fore a century ago with the development of quantum mechanics: the most counterintuitive idea in science, the theory of the subatomic world. Famously there were long-running debates between the leading physicists of the time, Einstein versus the Danish physicist Niels Bohr. Einstein was a realist. He believed there’s a real world out there and it’s science’s job to get as close as we can to that truth. In The Joy of Science, I lay my cards on the table. I would side with Einstein on that one. We may never reach it, but the world is the way it is. We can’t make up our own narrative. We can’t decide on our own reality. But Niels Bohr, the father of quantum mechanics—the guy was a genius—would argue that the job of science isn’t to find out how the world is, because we can never find out how it is. The job of science is to see what we can say based on what we see, our perception, of how the world is. We can never say how the world really is.
Do you feel like that’s a cop out?
Yes. We should say there’s a real world out there, and it’s our job to try and find ways of breaking out from the models that we create in our minds—the reality that we construct in our minds—that we hope reflects what the real world is like. I don’t see any reason why we should absolve ourselves from that responsibility.
Why do you say that cognitive dissonance is far more serious in our modern culture and times than it has ever been?
Cognitive dissonance, the idea that we hold a view and are then confronted with something that goes completely against it, is something that happens to us on a daily basis. Pre-internet, we tended to read the newspaper or get our news from a source that we felt aligned with our worldview. To a large extent, we still do that now, but what has changed is that the internet and social media and YouTube have amplified the problem, because we are now exposed to opposing views in a very real way, far more than we ever were before. Confirmation bias, liking to hear what you already believe, was much easier to indulge in the past. Life was simpler.
Today we are confronted with having to deal with information coming from across the whole spectrum, for any particular issue, whether it’s political, ideological, or religious. And we adopt a defense mechanism against that, which is to reject the views that we don’t like, that we don’t agree with. And my argument is, Hang on. Don’t be so hasty in rejecting it, however uncomfortable it makes you feel. Learn that there’s no shame in changing your mind in the light of new information.
The term ideology comes up quite a bit in your book. Would you say people should generally avoid making ideological commitments if they want to think about things more scientifically?
I don’t think so. Ideology can mean anything. Some people even refer to science as an ideology. There are certain beliefs in science, whether you believe in the many-worlds interpretation of quantum mechanics or not, that become almost like an ideology. But no, this is part of human nature, that we have a worldview. We have a political view. We have a moral compass. We believe something is right and something is wrong. This changes, of course. What was acceptable a hundred years ago clearly isn’t acceptable now and vice versa. So, holding ideological views is absolutely part of the human condition. It’s just that we should try a bit harder to examine and question why we hold those ideological views and not be so certain, so absolute about them.
Why should we question our motives for believing what we think is true?
It’s the way we do things in science. We constantly test our own ideas, because we know that if we are wrong about something, other scientists will eventually discover it. Of course, some scientists will stick to their guns no matter what, but those ideas don’t survive very long. Just because you want something to be true or correct doesn’t make it so. I think it’s a nice lesson that wider society could adopt. Being able to admit you are wrong, to change your mind, is a strength in science, unlike in politics, where it’s regarded as a weakness, right? Politicians don’t like to admit mistakes or that they’re wrong. Wouldn’t it be refreshing if they could say, “Oh, actually. No, you’ve got a good point there. I’ve changed my mind. I now think this.”
Has your joy of science changed at all as you’ve gotten older and learned more?
Probably, it has increased rather than diminished. I don’t feel there’s going to come a time where I say, “Okay, I’m done with science. I want to go and play golf or travel around the world.” I want to be able to do that, of course, but I don’t think my love for science will diminish at all. I don’t plan to retire, much to my wife’s annoyance.
Brian Gallagher is an associate editor at Nautilus. Follow him on Twitter @bsgallagher.