Steven Pinker, Award-winning Author and Experimental Psychologist, Harvard University
In sponsoring the Daniel Pearl Memorial Lecture Series, the Burkle Center for International Relations celebrates the memory of Daniel Pearl as a prominent journalist who dedicated his life to bringing joy and understanding to the world. Past presenters have included David Remnick of The New Yorker, Leon Wieseltier of The New Republic, Christopher Hitchens, CNN's Anderson Cooper, David Brooks and Thomas Friedman of The New York Times, ABC's Ted Koppel, CBS's Jeff Greenfield, Daniel Schorr of NPR, CNN's Larry King, former Secretary of State Dr. Condoleezza Rice, CNN's Christiane Amanpour, Ambassador Samantha Power, Pulitzer Prize-winning journalist Bret Stephens, CNN anchor Fareed Zakaria, author and journalist Bob Woodward, CNN's Jake Tapper, French philosopher Bernard-Henri Lévy, and most recently Garry Kasparov, former world chess champion and Russian pro-democracy leader.
Kal Raustiala
Okay, good afternoon, everyone. Welcome. I'm Kal Raustiala. I direct the UCLA Burkle Center, and I teach here at the law school. It is a great honor to have you all here for the Daniel Pearl Memorial Lecture, which we do every year here at UCLA. Can everyone hear me okay? Is this working? I think, as you all know, over two decades ago Daniel Pearl was brutally murdered in Pakistan while working as a reporter for The Wall Street Journal. In the years since, we have commemorated Daniel's life and his work as a journalist, as a truth seeker, as someone who represented many important things to all of us here at UCLA. And so we've held this lecture in his honor and to honor his legacy. For this year's Daniel Pearl Memorial Lecture, our guest is the prominent author, thinker, and speaker Steven Pinker. I'll introduce Professor Pinker properly in a moment, but first let me lay out how the event is going to work. We're going to begin, as we often do, as soon as I'm done, with some remarks from Judea Pearl, Daniel's father, who is also a Professor Emeritus of Computer Science here at UCLA. Immediately following Judea's remarks, Professor Pinker will come up, give his talk, and then he and I will sit in those chairs and have a brief conversation. At the end of that, we will open it up to questions from all of you; we should have time for a few. We have these very unusual microphones in the front that look like boxes, and we're going to hand those around. When you ask a question, I know they're a little odd, but of course wait to be called on. And please keep your questions short and to the point. This is a crowded house, and we have an overflow room with more people in it, so we want to keep the questions brief. So that's the run of the show. Let me now introduce Professor Pinker, and then I'm going to welcome Judea up to the stage.
Steven Pinker is the Johnstone Family Professor of Psychology at Harvard University. He is Canadian by birth, raised in Montreal, but has spent, I think it's fair to say, much of his working life here in the United States. He is a professor at Harvard, as I mentioned, but he has also taught elsewhere: MIT, Stanford, and, if I read your bio correctly, a year at UC Santa Barbara. He is a member of the National Academy of Sciences. And in addition to his voluminous and, I think, justly celebrated scholarly work in a number of fields of cognitive psychology, theories of mind, and other issues, he is best known to most of us as one of the rare academics who write effectively and widely for a general audience. He was named by Time magazine as one of the 100 most influential people in the world, and he has twice been a finalist for the Pulitzer Prize. His books include The Language Instinct, The Blank Slate, The Better Angels of Our Nature, and most recently Rationality: What It Is, Why It Seems Scarce, Why It Matters, which will inform his remarks today. So again, please join me in welcoming first to the stage Judea Pearl, Daniel's father and a professor here at UCLA. Thank you.
Judea Pearl
Thank you, Kal, for this kind introduction. And thank you all for being here. I see friends, colleagues, students, supporters of the foundation, supporters of the Burkle Center, and the community, the UCLA community and the Los Angeles metropolitan area community, for whom this annual event has become a landmark of common values and common aspirations. This year, before I introduce the speaker, I would like to start with an apology, or a confession, for a long-lasting, lingering, embarrassing neglect. I am talking about the fact that out of the 21 speakers we have had in this series over the years, 15 were journalists and media personalities, five were diplomats, and none was a scientist. While it is true that Danny was a journalist through and through, and it seems fitting to commemorate his life and work by inviting his colleagues to reflect on the essence of their profession and its social role, we must also recognize the parallel between journalist and scientist. Both professions are driven by an obsessive dedication to an elusive commodity called Truth, t-r-u-t-h. And both devote their time to the pursuit of truth, believing for some reason that this will enable society to make more rational decisions for the betterment of mankind. With this motto in mind, we are honored and fortunate to have Professor Steven Pinker as our speaker today. Professor Pinker is not merely a microscope-bound scientist; he has significantly contributed to clarifying the method by which truth is uncovered, approximated, and communicated. His book Rationality, the topic of today's discussion, delves deeply into the secrets of this methodology, its benefits, and why it is so often absent.
In my own work, I have found Pinker's book on rationality so compelling that I have changed the title of my forthcoming book to read Coexistence, Rationality, and Other Fighting Words. The book is, I believe, itself evidence that understanding rationality is indispensable in our current era of fake news and conspiracy theories, where so many intelligent people prefer information that reinforces their preconceived worldview over facts that challenge it. Just to draw a contrast, this is precisely what the scientific method tells us not to do. Both scientists and journalists become famous by stepping out of their comfort zone and paying attention to facts that overturn received theories. As we gather here on the grounds of this great university, we should find the discussion of rationality especially pertinent to our beloved UCLA campus, which has been besieged, disrupted, and violated by irrational forces since October 7, and which is still struggling to recover from this predicament, with great pain and anguish. Thank you, Steven, for honoring our son Danny with your lecture today.
Steven Pinker

Thank you very much. Thanks to all of you for coming. It's a bittersweet honor to commemorate Daniel Pearl in the annual lecture held in his memory. Daniel Pearl was a martyr for free speech, for intellectual freedom. Judea Pearl, as you all know, is perhaps the world's deepest analyst of the concept of causality, an inherent component of rationality. So I'm going to try to draw some connections between the two themes and speak today about human rationality and also about academic freedom. Let me start with rationality, which poses a profound puzzle. On the one hand, our species can lay claim to stupendous feats of rationality. We've figured out how to walk on the moon and take a portrait of our home planet. We have estimated the age of the universe and plumbed the faraway regions of the cosmos. We've penetrated the secrets of life and of mind. We've fought back against the Horsemen of the Apocalypse, reducing the human toll from scourges that have immiserated our species for as long as we've existed: the toll from war, from famine, from extreme poverty, from child mortality. At the same time, a majority of Americans between the ages of 18 and 24 think that astrology is very or sort of scientific. Large proportions believe in conspiracy theories, such as that the COVID-19 vaccines are actually microchips that Bill Gates is trying to inject into us to surveil our lives, or that the American deep state houses a cabal of cannibalistic, Satan-worshipping pedophiles. Many people swallow fake news, such as "Joe Biden calls Trump supporters dregs of society" or "Yoko Ono had an affair with Hillary Clinton in the '70s." And people fall for various forms of paranormal woo-woo: possession by the devil, extrasensory perception, ghosts and spirits, witches, spiritual energy in mountains, trees, and crystals. So the puzzle is: if people can be rational, why does humanity seem to be losing its mind?
And this was the tension that I took up in my book Rationality. Now, there is a standard answer to the nature of rationality and irrationality coming from my own field, cognitive psychology, and the neighboring field of behavioral economics, which is that rationality itself may be defined by normative models. These are benchmarks for how we ought to reason. In my book Rationality, and in a course that I teach at Harvard, I try to convey what I think are the seven cognitive tools, or normative models, that every educated person ought to master. They include logic; probability; Bayesian reasoning, that is, calibrating degree of credence in an idea according to the strength of the evidence; the theory of rational choice, or expected utility; statistical decision theory, also known as signal detection theory, that is, how to make a decision when it depends on noisy inputs and the different ways of being wrong have different costs and benefits; game theory, how a rational agent chooses a course of action when the outcome depends on the choices of other rational agents; and the analysis of correlation and causation, which, as I mentioned, and as I assume you all know, owes tremendously to the foundational work of Judea Pearl. In his honor, I will reproduce a correlation-and-causation joke that I used to explain the concept in my book Rationality. It's about a couple experiencing sexual frustration in a traditional Jewish community. It is written in Jewish law that a man is responsible for his wife's sexual pleasure. This is true. So they approach the rabbi, they tell him of their problem, and the rabbi strokes his beard and says, well, here's an idea you can try: hire a handsome, young, athletic man to wave a towel over you the next time you make love, and the fantasies will help the missus achieve satisfaction. Completely kosher. So they try it, and the earth doesn't move; nothing happens.
So they go back to the rabbi, and the rabbi strokes his beard again and says, well, this time let's try a slight variation: this time, the young man will make love to your wife, and you wave the towel. So they try it, and sure enough, she has an ecstatic, screaming orgasm, and the husband turns to him and says, schmuck, now that's the way you wave a towel. That's a joke about correlation and causation. And in fact, our host, Professor Pearl, can explain mathematically the fallacy in the husband's reasoning. Okay, well, according to the standard approach (again, this is what the majority of cognitive psychologists and behavioral economists believe), humans don't naturally reason with these normative models but fall back on heuristics, biases, and primitive intuitions, resulting in widespread fallacies. Many of you are familiar with this work, often associated with the late Daniel Kahneman, author of the best-seller Thinking, Fast and Slow, and his late collaborator, Amos Tversky. So the solution is greater promotion of numeracy and scientific literacy: better science and math education. Let me give you some examples of classic human fallacies. These are phenomena that have been demonstrated in the lab and in countless classrooms over the course of 60 years, highly replicable. Here's a classic example of logical reasoning: consider a deck of cards in which every card has a number on one side and a letter on the other. Here is a possible rule: if a card has a D on one side, it has a three on the other. Which of the following cards do you have to turn over to test whether that rule holds of this deck? You've got a D, an F, a three, and a seven. This has been done hundreds of times since the 1960s, and the most popular answers are that you have to turn over the D card, or the D and the three cards.
I haven't asked for a show of hands, but I suspect that's what occurred to many of you in this room, because it is very replicable. The correct answer is that you have to turn over the D card and the seven card. How come? Everyone knows you've got to turn over the D card, because if you turned it over and there wasn't a three, that would falsify the rule. Everyone knows you don't have to turn over the F card; the rule doesn't say anything about it one way or the other. A lot of people think you have to turn over the three card, but when you think about it, that's irrelevant. The rule says if D then three, not if three then D. Thinking that you have to turn over the three card is an example of the fallacy of affirming the consequent, that is, going from "if p then q" to "if q then p," which is a fallacy. But when you think about it, you really do have to turn over the seven card, because if you turned it over and there was a D on the other side, that would falsify the rule. Yet it doesn't occur to most people. A common explanation, really more of a redescription, is that this is a case of confirmation bias: people are pretty good at looking for examples that conform to a generalization; they're not so good at looking for potentially disconfirming examples. Let me give you another classic example, this one from Bayesian inference. The probability that a woman has breast cancer is 1%. If a woman does have breast cancer, the probability that she tests positive is 90%; that's the sensitivity of the test. If she does not have cancer, the probability that she nevertheless tests positive is 9%; that's the false-positive rate. A woman tests positive: what is the chance that she actually has the disease? Just think about it intuitively, in terms of the ballpark probability you would assign. Well, in samples of physicians, the average answer is a probability of about 80 to 90%. The correct answer is 9%. That's right.
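(As an editorial aside for readers following along: the 9% figure comes straight out of Bayes' rule. A minimal sketch, with function and variable names of our own choosing rather than anything from the lecture:)

```python
# Bayes' rule on the screening numbers from the talk:
# prior P(cancer) = 1%, sensitivity = 90%, false-positive rate = 9%.
def posterior(prior, sensitivity, false_positive_rate):
    """P(disease | positive test) via Bayes' rule."""
    true_positives = prior * sensitivity                    # P(cancer and positive)
    false_positives = (1 - prior) * false_positive_rate     # P(no cancer and positive)
    return true_positives / (true_positives + false_positives)

p = posterior(prior=0.01, sensitivity=0.90, false_positive_rate=0.09)
print(f"P(cancer | positive) = {p:.1%}")  # 9.2%, roughly the 9% cited in the talk
```

The intuition the formula captures: with a 1% base rate, the 9% false-positive rate applied to the healthy 99% generates far more positives than the sick 1% does, so most positives are false.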
When you go to a doctor and you test positive, chances are the doctor will overestimate the chance that you have the disease by a factor of about 10. A common explanation is that this shows people tend to neglect base rates, in this case the fact that only 1% of people in the population have the disease in the first place, which means that most of the positives are going to be false positives. They think only about representative stereotypes, that is, that a person with cancer will probably test positive, but they fail to discount that by how rare the disease is in the population, namely the base rate.
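(Returning to the earlier card-selection task: the correct choice of cards can also be verified by brute force over the possible hidden faces. A small sketch, with names of our own choosing:)

```python
# Wason selection task: rule "if a card has D on one side, it has a 3 on the other".
# A card must be turned over iff some possible hidden face could falsify the rule.
LETTERS = ["D", "F"]
NUMBERS = ["3", "7"]

def falsifies(letter, number):
    """A (letter, number) card breaks the rule iff it shows D without a 3."""
    return letter == "D" and number != "3"

def must_turn(visible_face):
    """Could any hidden face make this card a counterexample to the rule?"""
    if visible_face in LETTERS:
        return any(falsifies(visible_face, n) for n in NUMBERS)
    return any(falsifies(l, visible_face) for l in LETTERS)

print([card for card in ["D", "F", "3", "7"] if must_turn(card)])  # ['D', '7']
```

Only D (hidden 7 would falsify) and 7 (hidden D would falsify) can yield counterexamples; F and 3 cannot, no matter what is on the back.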
I'll give you one other set of examples of common fallacies in human reasoning. This set consists of intuitions that people naturally have about how the world works, intuitions that served our ancestors well in a pre-scientific environment but which have often been debunked by our best science. I'll give you three of those intuitions. Many people are vulnerable to dualistic thinking, that is, to thinking that every human being has a body and has a mind. It's a natural way to think as we interact with one another: we don't treat each other as hunks of meat or robots; we assume that the person we're dealing with has a mind with beliefs and desires and intentions just like ours, even though we can't experience them directly. Well, once you think that there can be bodies plus minds, it's a short step to imagining that there can be minds without bodies. And so you get beliefs in spirits and souls and ghosts and an afterlife and reincarnation and ESP. We're also prone to essentialist thinking: the idea that living things contain an invisible essence that gives them their form and powers. You can't see it directly, but it's what organizes the living, working tissues of organic beings. From there it's a short step to concluding that disease must be caused by some kind of adulteration of one's essence by a foreign contaminant or pollutant. And so you get resistance to vaccines as old as vaccines themselves, because when you think about it intuitively, what is a vaccine? You're taking the very substance that makes you sick and injecting it deep into your tissues. It's not surprising that many people find that an icky prospect. It also makes people receptive to homeopathy and herbal remedies, which seem to transfuse the healthful essence of a living thing into the body.
And it leads to the widespread rediscovery, in culture after culture, of quack cures like purging and vomiting and enemas and bloodletting and fasting, and the vague sense that to treat disease you've got to get rid of toxins, whatever they are. We're also prone to teleological intuitions. We know that our own plans, our own artifacts and tools and creations, were designed with a purpose. From there, it's a short step to assuming that the whole world was designed for a purpose, and so to being receptive to creationism and astrology and synchronicity and the vague, Oprah-esque sense that everything happens for a reason. So, can human irrationality be explained by these heuristics, fallacies, and primitive intuitions, the subject matter of a lot of cognitive psychology? And as a corollary, can we cure it by promoting logic, numeracy, and scientific literacy? This suggestion has found its way into cartoons, such as an SMBC cartoon whose caption is "This is why people should learn statistics." The woman says, "I will not fly in a plane, they aren't safe from terrorists. Hold on, I'll text you an article about it." Or a headline from the satirical newspaper The Onion: "CDC announces plan to send every U.S. household pamphlet on probabilistic thinking." This is a comforting idea for those of us who teach science and statistics and cognitive psychology. But I have come to realize that the answer to the question, is this the way to cure scientific illiteracy, is unfortunately no. Let me give you a couple of illustrations as to why not. Here's an example from logic. The question is: is this syllogism valid? Which is to say, does the conclusion follow logically from the premises? The premises are: if college admissions are fair, then affirmative action laws are no longer necessary. College admissions are not fair. Therefore, affirmative action laws are necessary. Does the conclusion follow from the premises? The answer is no.
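(That verdict can be checked mechanically. Letting p stand for "college admissions are fair" and q for "affirmative action laws are no longer necessary," we enumerate all truth assignments and hunt for a case where both premises hold but the conclusion fails. A minimal sketch:)

```python
from itertools import product

def implies(a, b):
    """Material conditional: 'if a then b' is false only when a is true and b is false."""
    return (not a) or b

# Argument form: premises "if p then q" and "not p"; conclusion "not q".
# The argument is valid iff no assignment makes both premises true
# while the conclusion is false (i.e., while q is true).
counterexamples = [
    (p, q)
    for p, q in product([True, False], repeat=2)
    if implies(p, q) and (not p) and q
]
print(counterexamples)  # [(False, True)]: both premises hold, yet the conclusion fails
```

The single counterexample (p false, q true) is exactly the possibility the fallacy overlooks: admissions could be unfair and affirmative action laws could nonetheless be unnecessary for some other reason.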
This is an example of the fallacy of denying the antecedent, that is, going from "if p then q" to "if not p then not q," which is a logical fallacy. When people are given this problem, a majority of liberals commit the fallacy, but a majority of conservatives do not. Now if you ask conservatives for the explanation, they'll say, well, it's just what we've told you all along: liberals are irrational. Not so fast. How about this syllogism? If less severe punishments deter people from committing crime, then capital punishment should not be used. Less severe punishments do not deter people from committing crime. Therefore, capital punishment should be used. Well, now a majority of conservatives commit the fallacy, and a majority of liberals do not. And of course, what is happening is the same thing on both sides: people start out with the conclusion they want to believe is true, because it is a kind of signature or calling card of their own political coalition, and they will judge as valid any train of reasoning that leads to a conclusion they wanted to be true in the first place. Let me turn now to some tests of climate literacy, that is, the science behind climate change, and ask whether each of these statements is true or false. Climate scientists believe that if the North Pole ice cap melted as a result of global warming, sea levels would rise: true or false? What gas do most scientists believe causes temperatures in the atmosphere to rise: carbon dioxide, hydrogen, helium, or radon? Climate scientists believe that human-caused global warming will increase the risk of skin cancer in human beings. One of the causes of global warming is a hole in the ozone layer. And one of the ways to mitigate climate change is to clean up toxic waste dumps. Now, it turns out that these statements are all false.
If global warming resulted in the melting of the ice caps on Greenland and Antarctica, that would cause sea levels to rise, but the North Pole ice cap is floating on water to begin with, and its melting would no more raise sea levels than the melting of an ice cube in your Coke would cause the cup to overflow. Likewise, the ozone hole was a serious environmental problem, but it is completely independent of climate change. Toxic waste dumps have nothing to do with it. When you give this test to a sample of people, the believers in anthropogenic climate change don't score any better than the deniers. Many of the believers have a vague sense of green and natural versus pollution and contamination, and they're not particularly astute as to the actual science behind climate change. So what does lead to science denial? The answer is actually pretty simple: politics, political orientation. Why is the earth getting warmer? If you ask people whether it is because of human activity or natural patterns, you find that the more conservative a person is, the more climate denial. It has nothing to do with scientific literacy or understanding of atmospheric chemistry; it's just politics: the farther to the right, the more climate denial. Similarly, when it comes to denial of the proposition that humans evolved from other organisms, the degree of belief in that proposition is completely uncorrelated with tests of scientific literacy, like which is bigger, an atom or an electron, and with probes of how well people understand Darwin's theory of natural selection; those are completely uncorrelated with belief as to whether humans evolved from other primates. What predicts belief in evolution is just religiosity: the more religious, the more you deny evolution.
These are findings that come from the legal scholar Dan Kahan, who calls it expressive rationality: opinions can be signals of loyalty to a social group or coalition. It turns out that despite our laments about rampant scientific illiteracy, most scientific findings are accepted. In contrast to, say, the consensus a few hundred years ago, very few people in the mainstream of American society today believe in werewolves, unicorns, animal sacrifice, bloodletting, the divine right of kings, miasma as the explanation of disease, or omens in eclipses and comets. Pick your favorite kooky politician; chances are they will not endorse any of these propositions. There are only a few politicized, emotionally charged topics that become kinds of talismans or litmus tests of political loyalty, and that's where you find the science denial. So how do we achieve feats of rationality, despite the cognitive biases that cognitive psychologists have documented and, on top of them all, the political biases? The general answer, and this was hinted at by Professor Pearl, is that there is a kind of social rationality: there are institutions that by their very design promote rationality, and that's how, as a species, we've managed to accomplish anything rational. Because really, few of us can justify our beliefs, including the true beliefs. I've been vaccinated against COVID six times. If you ask me how the vaccines work, I'll say something something mRNA, something something immune system, something something antibodies. Basically, I trust the people in the white coats who say that they're safe and effective. So we depend, all of us, on the expertise of scientists, historians, journalists, government record keepers, and mainstream authors. And the trust is often well placed, because in a well-designed, rationality-promoting institution, one person can notice and make up for another's biases.
Such institutions can make us collectively more rational than any of us is individually. We are all very poor at spotting fallacies and biases in ourselves; we're much, much better at spotting fallacies and biases in the other guy. And so if you have a community where people can point out each other's fallacies, can express hypotheses, keep the ones that seem to withstand criticism and attack, and not repeat the errors of the past, then the species can inch its way toward greater rationality. Let me give you a concrete example. Remember the card selection task, if D then three? As I mentioned, when people are given this problem to solve, about one in ten gets it right. But if you put them in groups of three or four, now seven out of ten get it right. All it takes is for one person to spot the correct answer and explain it to everyone else, and the rest of the group virtually always agrees. What do I mean by rationality-promoting institutions? In fact, Professor Pearl mentioned two of them. Start with science, with its demand for empirical testing and peer review; then democratic government, with its checks and balances; journalism, with its demand for editing and fact-checking; the judicial system, with its adversarial proceedings; even Wikipedia, probably the most successful social medium, where the editors have to sign on to five pillars of objectivity, neutrality, and sourcing. Compare it to social media that are a kind of free-for-all, like TikTok and Twitter and Instagram, whose record is not as impressive. And, in theory, academia, with its commitment to freedom of inquiry and open debate. I say "in theory," and that's the question I will turn to now. What this suggests, by the way, as a general guideline for promoting rationality, is to safeguard the credibility and objectivity of rationality-promoting institutions.
Experts should be prepared to show their work, that is, not to issue edicts from on high as if they were oracles, but to explain the basis of their recommendations. Fallibility should be acknowledged, since we all start out ignorant about everything, and fallibility is part of being rational. And perhaps most important, gratuitous politicization should be avoided. This is an advisory that our scientific institutions have been rather flamboyant in flouting of late, where our major scientific magazines and organizations have been falling over themselves to proclaim how aligned they are with the political left. They should not be surprised if the center and the right blow them off. In other words, if we want people to accept the scientific consensus on climate change, vaccines, public health measures, and science in general, these should not be branded as left-wing causes. The final topic I'll turn to is the question of academic freedom, which is objectively under threat. The Foundation for Individual Rights and Expression, which tracks assaults on academic freedom, found that between 2014 and 2022 there were 877 attempts to punish scholars for constitutionally protected speech, 114 incidents of censorship, and 156 firings, 44 of them of tenured professors, more than during the McCarthy era. Here's a graph showing how the problem rather suddenly got much, much worse, leading to the puzzle: why should academia, of all institutions, punish the expression of opinion? Isn't that what academia is for? Part of the answer comes from the political orientation of university professors, which has become increasingly left-wing. This graph only goes to 2014, ten years ago, and if we had comparable data today, there is no question that the leftward shift, represented by the top line, would be even more extreme. And it isn't because of trends in the general population.
The same questions show pretty much a flat line in the population as a whole. I'll present some recent data from an institution with which I'm familiar, Harvard, where according to the most recent poll, 37% of my colleagues identify themselves as very liberal, 45% as liberal, 16% as moderate, and one and a half percent as conservative, and even that figure is going to go down, because he's going to retire. So what's the big deal about academic freedom? It's not just to stanch the bleeding of confidence in universities, though that is a problem. Trust in universities has been sinking faster than in any other American institution, and that's saying a lot, considering how much trust has eroded in the media, government, and so on. Fewer than half of Americans believe higher education has a positive effect on the country. Nor should academic freedom be defended as a perquisite of professors, as if professors simply have the right to say whatever they want, because professors are as much a part of the problem as of the solution; a lot of these cancellations and censorship incidents and firings were at the behest of professors. It is rather to safeguard rationality itself. No one is infallible or omniscient; that should be obvious. And we've seen evidence of logical and statistical fallacies, primitive intuitions, the my-side bias (namely, believing a proposition if it is a signature cause of your own coalition), and expressive rationality (opinions as signs of loyalty to a coalition). Intellectual progress is driven by what Karl Popper called conjecture and refutation: some people propose ideas, others probe whether they're sound. In the long run, if the institution is well designed, people keep the ideas that have withstood criticism and attempts at falsification and
try not to repeat their mistakes. Any institution that disables this cycle by making certain hypotheses unsayable is doomed to error. We have many historical examples, such as the embrace of geocentrism, of creationism, of Lysenko's Lamarckism by the Soviet Union. Any institution that prevents certain opinions from being voiced is bound to provide erroneous guidance on vital issues, again, since no one is infallible, no one is omniscient. And even when the academic consensus is almost certainly correct, people won't accept it. I have encountered this a number of times: I will talk about the fact that the vast majority of scientists agree that human activity has been warming the planet, and the response is, well, why should I trust the consensus if it comes from a clique that allows no dissent? Consensus just means that anyone who disagrees will be canceled or driven out. And it is a legitimate criticism: if we want to safeguard the credibility of the scientific consensus, we have to reassure people that it comes from a community where criticisms can be voiced and can be refuted. So it is in the service of promoting academic freedom, as a necessary means to the end of rationality, that I and a number of colleagues formed the Council on Academic Freedom at Harvard, a faculty-led organization devoted to upholding the ideals of free inquiry, intellectual diversity, and civil discourse, again not as a perquisite of professors, but as at least one small part of the effort to promote rationality. Thank you. Thank you for having me.
Kal Raustiala
Okay, can everyone hear us okay? All right. So thank you very much for that talk. I'm going to admit that normally for this lecture we've had people from journalism, from politics, from foreign affairs, areas that I know something about; I really know nothing about rationality. So I'm going to ask some naive questions and then I'm going to open it up to everyone else. The first question I have is this: I have read a few of your books, and I think you have a justly earned reputation as someone who's generally optimistic and sees increasing progress in a number of areas. So I want to start by asking you about the areas you covered at the beginning. You gave a lot of examples, funny ones, about irrationality and biases, some of which are probably eternal, but some of which seem to be more prominent today. And then you gave some others, like werewolves, from which we've moved away. So I just want to know, on balance, do you think the problem of irrationality as a societal problem is getting better? And if so, why?
Speaker 1
Yeah, so I think there's a lot of rationality inequality, which may be increasing. At the high end, we see rationality being applied to problems that formerly were just matters of intuition and hunches and conventional wisdom, what the engineers call HiPPO, the highest paid person's opinion. But more and more is being brought under the wing of rational analysis. In crime fighting, for example, we have data-driven policing, where daily data are gathered on where the hotspots for crime are, and resources are concentrated there, probably one of the reasons for the great American crime decline in the 1990s. We have evidence-based medicine, which should be a redundancy, but isn't. We have Moneyball in sports, we have poll aggregation like 538.com, we have feedback-informed therapy. So rational approaches are being applied to more domains. We have, of course, the recent advances in artificial intelligence. But at the same time, it's never been easier to propagate nonsense. And there are autocratic and authoritarian populist movements that deliberately try to, as Steve Bannon put it, flood the zone with shit. That is, spread so many propositions without any claim to being true that people no longer know what to believe, and therefore are more receptive to what an autocrat would dictate. And of course, we see that in Russia and other autocracies as well. So given these extremes, it's hard to know how to give a report card to the world as a whole.
Kal Raustiala
What about just American society? You gave a number of examples, including pictures of January 6 and so forth, politically laden examples which fit into your talk. But it does seem like there's more of that, or more extreme examples, from either side perhaps, though I don't actually think it's really balanced; I think there's more from one political side. But either way.
Speaker 1
Yeah, I think you can quantify that there is more from the right than from the left, although it does come from both sides. And there's the polarization itself, which we've all read about, particularly negative polarization. Given that one of the main sources of irrationality is not ignorance, but rather adoption of dogmas that are signature issues for your side, then as people become more polarized, you expect that more opinions will be held on irrational grounds, namely: that's what my side believes, and I'm sticking with it. And I do agree that while there is more fake news from the right than from the left, the left has not been holding up its end of the rationality bargain, when you have so many cancellations and censorship and partisan political branding of scientific and cultural institutions. So there are contributions from the left as well.
Kal Raustiala
And do you see that polarization, I mean, I agree with all of that, do you think that that's causally linked in the way that you discuss? In other words, is polarization driving the adoption of these ideas in that tribal mode that you just described? Or is there some kind of more complex causality in which they feed back on each other?
Speaker 1
Yeah, I think there is. Is it more complex than what you just said? Yes; it's always more complex. But I think polarization is driving a lot of it, even though we don't really understand the reasons for the polarization. I mean, the conventional answer for why we're polarized is social media, which may be part of the answer, but I don't think it's the entire answer.
Kal Raustiala
What role do you think it plays?
Steven Pinker
I think it makes it worse. What is interesting, as someone who has a reputation for optimism, mainly because I plot data that present a more positive picture than what you get from the news. Because the news is a non-random sample of the worst things happening on Earth at any given time. So if you base your understanding of the world only on headlines, you're being fed, with increasing efficiency, a non-random sample of every disaster and calamity. And positive developments are often things that don't happen: a part of the world being at peace is not a story. Forty years of peace in Vietnam? When I was a student, this would have been an unthinkable utopia. It's true, but it's not headlined. Or they're things that build up by a few percentage points a year and compound, so no one ever notices them, but they can transform the world, such as the decline in extreme poverty. As Max Roser put it, the papers could have run the headline "137,000 people escaped from extreme poverty yesterday" every day for the last 30 years, but they never had that headline. And the result is that a billion people escaped from extreme poverty and no one knows about it. So in just discussing facts like that, I am sometimes called an optimist, although I have a rather dark view of human nature; that was the topic of my book The Blank Slate, and I believe there is such a thing as human nature. And one of the things that social media has unveiled is what happens if you just let people say their thing. Contrary to expectations that I may even have been willing to harbor, that you get a kind of new enlightenment if people just share ideas, it turns out that unless you actually build in safeguards, like peer review, editing, and fact checking, people don't really have much of a commitment to objectivity and truth in their own intuitions. That's just not the reason that people believe most things outside their immediate experience.
And I didn't get to talk about that this afternoon, but it's a major theme of my book Rationality: outside immediate, everyday experience, people, I think, are quite rational. There you have to have an appreciation of causality and objectivity and truth; otherwise you won't keep food in the fridge and gas in the car and the kids clothed and fed and off to school on time, because reality is merciless. It won't put up with your fallacies and biases. And so, whether through evolution or learning, we're not bad at navigating our physical environment. But a lot of the opinions that we think of as irrational just don't affect people day to day. What was the cause of a particular war? What's the cause of fortune and misfortune? Where did the human species arise from? How old is the universe? It doesn't really matter to most people in their day-to-day lives. And until very recently, until the scientific revolution and the data revolution, and the revolution in record keeping in journalism, no one could know. Any belief was as good as any other belief; there was no way of dating the universe until quite recently. And our minds are adapted to a world without science, in which there is no way to find objective answers to these big cosmic and historical questions. And so people believe things because they are morally uplifting, because they're entertaining, because they're a good story, because they're good things for the kids to believe, because they're edifying, whether they're true or false. I think people don't have an intuition that truth is even a possible or desirable criterion there. And so unfettered expression will naturally lead to a lot of nonsense being circulated. We probably should have been able to predict that: just letting everyone have an opinion, by itself, is not going to push toward objective truth. You need something like the guidelines of Wikipedia.
So I think that helps polarization, because you can have different groups with their mutually incompatible beliefs. "Helps" meaning it promotes it, makes it increase, not that it makes things better. Although I think it's not the only reason. There are other sociological phenomena, such as residential segregation by level of education: people with college degrees live near people with college degrees and see them and talk to them, and people without, conversely, live with their own kind. And the mixed neighborhoods and institutions, like religious organizations, service clubs, bowling leagues, and compulsory armed service, which used to mix people from different educational backgrounds, have weakened. And so you get echo chambers that can be self-reinforcing. So I think that's another reason for polarization.
Kal Raustiala
I want to come back to social media, but just on the point you made about people not needing to know things. I'm kind of a lapsed political scientist, but in political science we talk about rational ignorance in a lot of settings. So I'm curious: you give examples about climate literacy, and it would seem like that would be a case where many of us don't really need to know the science, since we're not scientists. So we might be rationally ignorant about it, as long as there are gatekeepers who do provide that peer review, or whatever it is that you think is the proper process. Is that a fair summation?
Steven Pinker
No, I think that's right, and I would take it just one step further from the point of view of a psychologist. It's not just that there are only so many hours in the day; it's hard enough just to keep your job and get your kids ready for school, so who has the time to do a deep dive? That's absolutely true. And of course, from that literature we also know that voting is not really a way of furthering your interests, because your vote is astronomically unlikely to decide an election. It's a puzzle that people vote at all, and the common answer is that it's self-expression. What I would add as a psychologist is that it isn't just that people say, well, who has the time to find out, I'm going to let the experts decide, I'm going to be happily ignorant. Rather, people do adopt beliefs; they just don't think that you only ought to believe things for which there are reasons, and that if there are no reasons, you should be agnostic. That is psychologically unreal. People are perfectly happy to have beliefs in areas that they know nothing about, especially if the beliefs seem to have some kind of moral value, that is, they promote the right values, they make your side look good and the other side look stupid or evil. Or even if they're just fun to think, they're entertaining, they're worth spreading; people listen to you if you spin a conspiracy theory, that's kind of fun. And the idea that a lot of people in journalism and academia have, that no, you can't just believe something because it's fun to believe, or because it makes your side feel good, or because it's a good story, or because it's morally uplifting, you should only believe things that are true: that is a deeply weird belief. I think most of us, on good days, have that belief.
But in that, we are unlike the typical member of our species.
Kal Raustiala
So I'm just going to ask a couple more, and then we'll open it up. Just on social media, to go back to that, it seems like there are two things. In your Enlightenment book and some of your other work, you talk about the printing press, and others have pointed out that one of the differences in terms of the spread of ideas with these prior technologies is that there was always gatekeeping in various ways: publishers made decisions, editors made decisions. Social media is by its nature unmediated, so individuals can circulate things true or false. That's one big distinction. The other distinction is that it's also been weaponized in a way that we really haven't seen before, though that's not entirely new; we see versions of that with past technologies. So I'm curious, when you think about the pernicious role of social media, how far do you go with that? And what do you think we should do? For example, just to take a live issue, the proposed TikTok ban or TikTok divestiture: is that an appropriate response? Or, based on what you've learned, how would you approach that problem?
Speaker 1
So I'm not an expert in the history of media, but my understanding is that, although there was some gatekeeping by the scarcity of former media technologies, that is, you had to have a printing press to print stuff, still, with every new medium there was actually an awful lot of nonsense propagated. The printing press, of course, came together with the Bible, which itself is filled with fake news and conspiracy theories; what are miracles but fake news? But also the Malleus Maleficarum, the guide to witch hunting, which fueled the massive witch hunts of the 16th century. And when mass production of newspapers on the rotary press began and newsprint was cheap, my understanding is that until respectable news outlets got their act together and formed associations and standards, most news was fake news. You could open the paper and there were stories of sea monsters and alien visitors and babies born with two heads, and miracles, and all kinds of nonsense. Maybe an expert in journalism can correct me, but I think it wasn't really until the 1910s and 1920s that newspapers formed associations and set standards of editing and fact checking, and it started to matter whether a newspaper had a reputation for probity, accuracy, and fact checking. That didn't make them infallible, of course, but it meant that you wouldn't be reading about Martian civilizations and sea monsters.
Kal Raustiala
I remember the Weekly World News, which had that sort of thing well into the '80s or '90s. Exactly right.
Speaker 1
So the thing is that until, I think, shortly after the turn of the 20th century, all newspapers were like the Weekly World News.
Kal Raustiala
Yeah, okay. So, I want to open up to questions.
Steven Pinker
I'll just mention one other example: book publishing. In the early days, a lot of books were utterly plagiarized, and the facts in them were fabricated; you'd have no idea how often a famous author would be put on the title page of some nonsense that he never wrote. So it was kind of a wild west. It may be a general law that any new medium, until there are respected gatekeepers, will initially result in a flood of fabrications.
Kal Raustiala
Okay, so we'd like to start with some questions from students just to kick it off, so students in the room, I'll start off in the back. Now we have just to remind everyone, we have these box microphones, so just wait for this weird microphone to come to you. You could just. Yeah, perfect.
Audience Member
Hello, Mr. Pinker. Thanks so much for being here. I know we all really appreciate your lecture. I learned about you in my high school psychology class, in language learning, so it's quite exciting to see you here in person. My question was regarding big groups of people maybe not finding the objective truth. Recently, in a stats class, we learned that if you have, say, 100 people guess the weight of a cow, the average of those guesses is usually extremely close to the actual weight of the cow, even though the individual guesses themselves may be way off. So in that case, it seems like aggregating lots of different ideas might yield a right answer. How do you square that with your earlier statement that it might not be the truth every single time? Thanks so much.
Steven Pinker
Yeah, that's a great example, because it's the phenomenon called the wisdom of crowds, where you aggregate estimates. My understanding is that certain conditions have to be in place for crowds to be wise, as opposed to being carried away by groupthink or the madness of crowds. Among them is that the estimates have to be made independently. So in the weight-of-the-cow example, everyone writes their guess on a slip of paper, and then somebody on the stage averages them. If you have a discussion going, then the most charismatic or influential person, or the guy with the lowest voice, or the most effective mansplainer, could sweep others to his estimate, no matter how bogus. And I think there are some other conditions as well. So it is a real phenomenon, and ideally that's the kind of statistical power that you want to tap in groups. But when people do communicate, I think you need additional guardrails: demands that people defend their beliefs, openness to dissent and contrary beliefs, not privileging authority but privileging coherence and evidence, and other kinds of guidelines.
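The independence condition Pinker describes can be seen in a quick simulation. This is only an illustrative sketch: the cow's weight, the noise level, and the size of the "charismatic influencer" bias are all invented numbers. Averaging cancels independent errors, but a shared bias survives averaging untouched.

```python
import random
import statistics

random.seed(42)

TRUE_WEIGHT = 1200  # hypothetical true weight of the cow, in pounds

# Wise crowd: 100 independent guesses, unbiased but very noisy.
independent_guesses = [random.gauss(TRUE_WEIGHT, 300) for _ in range(100)]
crowd_estimate = statistics.mean(independent_guesses)

# Groupthink: everyone is swayed by one charismatic overestimator,
# adding a *shared* +400 lb error that averaging cannot cancel.
groupthink_guesses = [g + 400 for g in independent_guesses]
groupthink_estimate = statistics.mean(groupthink_guesses)

print(f"independent crowd: {crowd_estimate:.0f} lb")
print(f"after groupthink:  {groupthink_estimate:.0f} lb")
```

The independent average lands within a few dozen pounds of the truth (the error of a mean of n unbiased guesses shrinks like 1/sqrt(n)), while the groupthink average is off by roughly the full shared bias, which is Pinker's point about why the estimates must be made on independent slips of paper.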
Kal Raustiala
Thank you so much. We've got another student question coming, but can I just follow up on that? We had a pretty momentous verdict handed down today. Juries don't follow that independence standard; they deliberate. We're in the law school. So what's your view on whether that system is a good one? Should we in fact have independent assessments by jurors that are then aggregated, rather than the deliberation that we associate with juries?
Steven Pinker
Probably, in the sense, I'm not enough of an expert to give a definitive answer, but I do know there has been research on juries since the 70s, some of it involving the people in the jury pool who don't actually serve, the alternates. Researchers will hold a mock trial, experimentally vary what happens, and then look at the jury deliberations. My understanding is that a jury, especially with the unanimity rule, is actually not an optimal decision-making body. We all saw 12 Angry Men, where one principled holdout can turn around the other eleven; that apparently never happens. But there are coalitions, there's browbeating, there's influence, there's charisma, and so on. So it's really good not to be a defendant in a criminal trial, even if you're innocent. And there are almost surely better designs. Certainly in the recent book by Daniel Kahneman, Cass Sunstein, and Olivier Sibony called Noise, they note that there are systematic pathologies in certain kinds of committees and groups, particularly when they don't give independent assessments: the noise gets consolidated or amplified, whereas the wisdom-of-crowds situation is one in which there's a mechanism by which the noise cancels rather than amplifies.
Kal Raustiala
Great. Okay, so other students with questions. Yes, you with your hand up, dark shirt. Perfect, keep your hand up, the microphone is coming. I think the microphone can even be tossed.
Audience Member
Oh, hello. Hi, Professor Pinker, thank you for your talk. I was learning about you in my textbooks, so it's an honor to speak to you. I was wondering, how would you think about the research suggesting that the spread of fake news or misinformation relates to inattention to the truthfulness of the information?
Steven Pinker
Could you say that again?
Audience Member
Oh, yes. So, people's spread of misinformation on their social media has to do with their inattention to the truthfulness of the content itself: they don't really filter the message as much as they perhaps normatively should. And I was wondering what your thought is on this, given that you were saying that rationality-promoting institutions are important. If people don't even pay attention to the truthfulness of information, how is that going to mediate this effect?
Steven Pinker
Yeah, it's true that, apparently, there is research by Gordon Pennycook and others showing that some small nudges can tamp down the temptation to spread false news, which, because of exponential growth, can go viral. And so even if you were to tamp down just the slight tendency to repeat it, that could stop the virality, in theory. Even a warning notice, like, "Are you sure this is true?" or "Do you have reason to believe this is true?": if people are just reminded of that, they're less likely to pass on fake news. So that is a ray of hope, that there might be at least some nudges. In the absence of something with a little more teeth, like a Wikipedia community where there is some kind of informal policing, it's probably not enough, but it probably would help matters if the social media platforms gave people these reminders. And I noticed that X already has a policy where, if you, I guess you don't say retweet anymore, repost something that you haven't clicked open to read, it will nag you: are you sure you want to repost this? You haven't opened it to read it yet. I was sort of surprised to see that; it's probably not a bad thing. And again, I don't think it would be enough, but it wouldn't hurt.
Kal Raustiala
I'm going to ask a follow-up from the overflow room; it's sort of related. The question is, and I'm not going to endorse the premise, you can attack the premise if you want: why are young adults so trusting, underlined, of information they read on social media? Why do they have so much confidence in news from unreliable sources?
Steven Pinker
Yeah. So the part that is unquestionably true is that younger people get more of their news from social media, certainly compared to news printed on dead trees, which is really an old person's medium. Even cable news: I read this morning that the median age of a CNN or MSNBC or Fox News viewer is something like 70. So for younger cohorts, it's overwhelmingly social media. The question is, what are they consuming on social media? If it's just links to articles in respectable sources, that's okay. The second part of the premise is that they buy it, but the fact that that's where young people get the news doesn't necessarily mean that they accept everything they read. Still, it is worrisome to the extent that, on average, things in a social media feed are going to be less vetted and less fact-checked than specific sources that cultivate a reputation for accuracy.
Kal Raustiala
I mean, isn't it even worse than that, though, in that so many of the algorithms amplify engagement, and engagement is often either hating on something or, as you say, clicking and forwarding and so forth? So there's actually a built-in inducement to spread those kinds of things all around.
Steven Pinker
Yeah, I know, that is a real concern.
Kal Raustiala
Okay, so we're going to open it up to the general audience. I'm going to give Chancellor Carnesale our first question.
Audience Member
We started with the second characteristic, which is perhaps the most troubling: statistics [unintelligible]. Even those of us trained in statistics find that our intuition often fails, and we go back and analyze the problems and it's a surprise. People don't like statistics; they don't like it when the weatherman says there's a 30% chance. They don't know what it means, right? They want it black and white. Now here's the problem: as scientists, we're really in some ways trying to say "based on our current understanding." As soon as you go there, your opinion might be wrong. So creation, the afterlife: can you prove that they're false? The answer is, you can't prove that they're impossible. So many of these beliefs are not going to change, because we don't have perfect knowledge to prove that they're impossible.
Steven Pinker
Yeah. So it's
Kal Raustiala
Can everyone hear that? I realize he didn't get the microphone? Yeah.
Steven Pinker
Okay.
Kal Raustiala
Try to try to paraphrase.
Steven Pinker
Correct me if I say it wrong. So there is an intuitive resistance to statistical thinking; people are liable to the black-and-white fallacy. And this is particularly pernicious in the case of accepting scientific conclusions, because scientific conclusions are necessarily stated in degrees of credence, and people mistakenly think that scientific facts are proven to be correct. They'll even say, "Where's your proof?" And proof is actually something you get in math and logic, but not in science. So there are two leaps that people have to take. One of them is just statistical thinking in general. And I didn't have time, or it wasn't appropriate in today's lecture, to talk about some of the ways that people's statistical intuitions can be improved or honed, such as converting estimates of single events, like "what is the probability that that woman has cancer," which is actually a strange concept anyway, single-event probability. People do have trouble wrapping their minds around it: either she does have cancer or she doesn't, so what does it mean to say there's a point three chance that she has it? If you reframe these notorious Bayesian problems in frequencies, that is, instead of saying there's a 1% chance that a woman has breast cancer, you say: imagine 1,000 women; 10 of them have breast cancer; of those 10, nine will test positive; of the 990 that don't, 89 will test positive. Then people start to avoid some of these errors in Bayesian reasoning, because they can visualize quantities and subdivide them, and not have to think in terms of the somewhat mystical concept of the probability of one case. Unfortunately, applying probabilistic thinking to science requires a second, Bayesian leap. In science, people can think, well, it either is the case or isn't the case; what does it mean to say it's point seven likely to be true?
It's going to be a heavier lift to get people to import what statistical intuitions we do have, which tend to be about relative frequencies, to the case of credence in a single proposition. I think one possible route to expanding people's intuitions in the case of, say, a Bayesian approach to science would be to tap whatever intuitions allow people to gamble. Because when you understand odds, when you put money on something, you are betting on a single event, and you've got to have some degree of credence in which way it'll turn out: the election will go one way or another, the Celtics will either win or not win the NBA championship. To the extent that people can understand that, at least if they're going to put money on it, that's a way of getting to the idea that scientific hypotheses, like sports outcomes, are single propositions to which you can attach a number between zero and one. And in fact, there is a part of the rationality community that advocates betting on outcomes in prediction markets, which go even beyond the wisdom of crowds of simply aggregating opinions, because people bet on their opinions. Apparently that gives rise to even more accurate forecasts. If people could start to think of how much they would bet that so-and-so will win or lose, that might pull their intuitions in the right direction.
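Pinker's frequency reframing can be checked with a few lines of arithmetic. The counts below (1% base rate, 9 of 10 true positives, 89 of 990 false positives) are the numbers he gives in the answer above; the sketch shows that counting cases and applying Bayes' rule to single-event probabilities give the same answer.

```python
# Frequency framing: imagine 1,000 women, as in Pinker's example.
with_cancer = 10                 # 1% base rate: 10 of 1,000 have breast cancer
true_positives = 9               # 9 of those 10 test positive
false_positives = 89             # 89 of the 990 without cancer test positive

# Of everyone who tests positive, what fraction actually has cancer?
posterior_freq = true_positives / (true_positives + false_positives)

# The same computation via Bayes' rule on single-event probabilities.
p_cancer = with_cancer / 1000
p_pos_given_cancer = true_positives / with_cancer
p_pos_given_healthy = false_positives / 990
posterior_bayes = (p_pos_given_cancer * p_cancer) / (
    p_pos_given_cancer * p_cancer + p_pos_given_healthy * (1 - p_cancer)
)

print(f"frequency framing: {posterior_freq:.3f}")
print(f"Bayes' rule:       {posterior_bayes:.3f}")
```

Both give about 0.092: only around 9% of positive tests indicate actual cancer, far below most people's intuitive guess, which is exactly the base-rate error that the "imagine 1,000 women" framing guards against.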
Kal Raustiala
Okay, a lot of hands and we don't have a ton of time, so here's what I'm going to do. I was going to take two or three questions at a time, but no, we can take them one at a time. Okay, go ahead. Just wait for the microphone. Yes. We're just at the end of our time slot. Yes, please.
Audience Member
Hi, I'm looking at Daniel Pearl's photo above you and his words, "I am a Jew." And I want to ask you about antisemitism, and the irrational beliefs on the right and the left and all over the world.
Steven Pinker
Yeah, so what's the deal with antisemitism, in two minutes? What is the correct answer? [unintelligible] There are a number of very deep aspects to that question. Why has antisemitism been so perennial over a course of literally millennia? Why does there seem to have been an uptick in the last short period of time, maybe even a year? And what can be done about it? I don't have definitive answers to any of the three. As opposed to any kind of mystical explanation, like "all the world hates the Jews, and they always will, and it's just one of the unfathomable mysteries of the human condition," which is kind of unsatisfying, there is a theory from the economist and historian of ideas Thomas Sowell that antisemitism is part of a phenomenon of hatred of middleman minorities, owing in part to economic illiteracy. Jews for many centuries have specialized in economic middleman niches, such as retailing and money lending. Even though economics tells us that any prosperous society has to rely on money moving around, and on middlemen making things widely available across fungible markets, intuitively we tend to think of value as stuff coming into existence. So farmers create wealth and craftsmen create wealth and miners create wealth; we don't think of financiers or merchants as creating wealth. There's a tendency to backslide and to think of them as parasites and exploiters and thieves. And so Jews get expelled, the economy tanks, but still people don't make the connection.
And so there's hatred, which would explain why at least some of the phenomena of antisemitism can also be seen in the hatred, sometimes genocidal hatred, of other middleman minorities, such as the Armenians in the Ottoman Empire, the Asians, particularly subcontinental Indians, in Africa, and the overseas Chinese in Southeast Asia, in Vietnam and the Philippines and so on. A common denominator is hatred of middleman minorities. In the case of the Jews, there are additional things: the myth of deicide, and the fact that living apart with insular customs tends to make any minority community a conspicuous target. So those are some of the answers. Why recently? Well, there's a sad tendency, since any kind of ethnic hatred, but particularly antisemitism, comes from a kind of bifurcation into tribes, us versus them. Paradoxically, among highly educated university students, there has been the ideology that the world can be divided into oppressors and oppressed, a kind of version of Marxism that left behind the bourgeoisie and the proletariat and grafted that dichotomy onto races, sexes, and ethnic groups. And somehow, economically and historically bizarrely, Jews ended up on the side of the oppressors. So you have an ideology that divides the world into good and evil, a rather toxic mindset, where Jews ended up on the wrong side. That's aided by the fact that the situation in the Middle East involves some complexity. Appreciating the morality of a just war, and differentiating between violence in self-defense versus violence as raw aggression, that kind of moral philosophy 101, what are the criteria for a just war, what is legitimate self-defense, when may force be applied, has, I think, receded in university educations in favor of the oppressor-oppressed dichotomy. So that partly explains why it's so prevalent on university campuses. Anyway, there are many more reasons; I'm just throwing out some ideas.
Kal Raustiala
I know there's so many questions, but we're over time. Please join me in thanking Steven Pinker.
Transcribed by https://otter.ai