Latest news with #ChangingMyMind
Yahoo
21-03-2025
- Politics
- Yahoo
The Danger of a Too-Open Mind
This is an edition of the Books Briefing, our editors' weekly guide to the best in books. Sign up for it here.

At a moment when just asking questions can feel synonymous with bad-faith arguments or conspiratorial thinking, one of the hardest things to hold on to might be an open mind. As Kieran Setiya wrote this week in The Atlantic on the subject of Julian Barnes's new book, Changing My Mind, 'If a functioning democracy is one in which people share a common pool of information and disagree in moderate, conciliatory ways, there are grounds for pessimism about its prospects.' But what should the civic-minded citizen do with that pessimism? Knowing about our tendency toward rationalization and confirmation bias, alongside the prevalence of misinformation, how do we know when, or whether, to change our minds?

First, here are four new stories from The Atlantic's Books section:
- What Shakespeare got right about PTSD
- The life of the mind can only get you so far
- The last great Yiddish novel
- 'Coalescence,' a poem by Cameron Allan

Another article published this week presents a possible test case. The Yale law professor Justin Driver examines a new book, Integrated—and, more broadly, a surge of skepticism over the effects of Brown v. Board of Education, the landmark 1954 Supreme Court decision that ordered the racial integration of American public schools. The book's author, Noliwe Rooks, was 'firmly in the traditional pro-Brown camp' as recently as five years ago, Driver writes. But America's failure to accommodate Black children in predominantly white schools, combined with the continuing lack of resources in largely Black schools, led Rooks to conclude in her book that Brown was in fact 'an attack on the pillars of Black life': that integration, as carried out, has failed many Black children, while undermining the old system of strong Black schools.

Should this case of intellectual flexibility be celebrated? It certainly makes for a lively debate. Driver calls Rooks's 'disenchantment' with the ruling 'entirely understandable,' but he sticks to his own belief that Brown has done more good than harm, and he makes a case for it. For example, Rooks portrays Washington, D.C.'s prestigious all-Black Dunbar High School as a hub of the community, staffed by proud and dedicated educators. Driver complicates the history of those 'glory days' by quoting its most prominent graduates: 'Much as they valued having talented, caring teachers, these men understood racial segregation intimately, and they detested it.' And he notes that, beyond changing education, 'Brown fomented a broad-gauge racial revolution throughout American public life.' He demonstrates that we can absorb new information—in this case, evidence of the many shortcomings of American school integration—without forgetting the lessons of the past.

Barnes makes a similar case in Changing My Mind, a book that is, in fact, mostly about why the novelist hasn't altered his opinions and ultimately doubts that trying to is worth it. To adopt new beliefs, he writes, we would have 'to forget what we believed before, or at least forget with what passion and certainty we believed it.' Setiya chides Barnes for his view that, given our hardwired biases, we might want to give up on being swayed at all. But he concludes that such stubbornness is 'not all bad.' Perhaps keeping an open mind is overrated—at least if it means 'coming to accept the unacceptable,' as Setiya puts it. And how should a person determine what's unacceptable?
'When we fear that our environment will degrade,' Setiya writes, 'we can record our fundamental values and beliefs so as not to forsake them later.' Once we know what our principles are, we can more easily weigh new information against our existing convictions. Without them, it would be easier to change our minds—but impossible to know when we're right.

It's Hard to Change Your Mind. A New Book Asks If You Should Even Try.
By Kieran Setiya
The novelist Julian Barnes doubts that we can ever really overcome our fixed beliefs. He should keep an open mind.

What to Read
Witness, by Whittaker Chambers
This 1952 memoir is still thrust in the hands of budding young conservatives, as a means of inculcating them into the movement. Published during an annus mirabilis for conservative treatises, just as the American right was beginning to emerge in its modern incarnation, Witness is draped in apocalyptic rhetoric about the battle for the future of mankind—a style that helped establish the Manichaean mentality of postwar conservatism. But the book is more than an example of an outlook: It tells a series of epic stories. Chambers narrates his time as an underground Communist activist in the '30s, a fascinating tale of subterfuge. An even larger stretch of the book is devoted to one of the great spectacles in modern American politics, the Alger Hiss affair. In 1948, after defecting from his sect, Chambers delivered devastating testimony before the House Un-American Activities Committee accusing Hiss, a former State Department official and a paragon of the liberal establishment, of being a Soviet spy. History vindicates Chambers's version of events, and his propulsive storytelling withstands the test of time. — Franklin Foer
From our list: Six political memoirs worth reading

Out Next Week
📚 Free: My Search for Meaning, by Amanda Knox
📚 Sister Europe, by Nell Zink
📚 Twist, by Colum McCann

Your Weekend Read
What Impossibly Wealthy Women Do for Love and Fulfillment
By Sophie Gilbert
Watching the show, I found myself stuck on one question: Whom is this for? Is there an underserved niche of Santa Barbara moms with their own pristine vegetable gardens who have previously been too intimidated to attempt baking focaccia? And yet, as With Love, Meghan went on, it started to hit a few of the classic pleasure points. A beautiful woman with a wardrobe of stealth-wealth beige separates and floral dresses? Check. A fixation, both nutritional and aesthetic, on how best to feed one's family, down to fruit platters arranged like rainbows and jars of chia seeds and hemp hearts to sneak into pancakes? Check. A strange aside where she details what it meant for her to take her husband's name? Ding ding ding: We're in tradwife territory now. This is absurd, of course. Meghan isn't a tradwife; if anything, she's a girlboss, a savvy, mediagenic entrepreneur with a new podcast dedicated to businesswomen and a nascent retail brand. So why does she seem to be trying so hard to rebrand as one, offering up this wistful performance of femininity and old-fashioned domestic arts that feels staged—and pretty familiar?

Article originally published at The Atlantic


New York Times
18-03-2025
- Entertainment
- New York Times
Book Review: 'Changing My Mind,' by Julian Barnes
CHANGING MY MIND, by Julian Barnes

In an essay from his collection 'The Dyer's Hand,' W.H. Auden describes his personal Eden: an 'absolute monarchy, elected for life by lot,' a place without automobiles, airplanes, newspapers, movies, radio or television, whose economy depends on lead mining, coal mining, chemical factories and sheep farming and whose public statues are 'confined to famous defunct chefs.' In 'Changing My Mind,' a slender new book-length essay that has the misfortune to share a title with a 2009 collection by Zadie Smith, Julian Barnes, the novelist and all-around man of letters, envisions a far less idiosyncratic utopia, which he calls, tongue-in-cheekily, B.B.R. (Barnes's Benign Republic). I'd gladly live in B.B.R. — its attractions include separation of church and state, nuclear disarmament, and restoration of arts and humanities courses at schools and universities — while I wouldn't last a day in Auden's zany Ruritania. But which is more fun to read about?

'Changing My Mind' can't make up its mind about whether it's a single piece or, as it appears to be, a loosely connected series of ruminations on the topics of 'Memories,' 'Words,' 'Politics,' 'Books' and 'Age and Time.' The back cover of the handsome Notting Hill Editions paperback calls it 'an engaging and erudite essay,' but, in fact, the copyright page tells us that 'versions of these essays were first broadcast on BBC Radio 3' … in 2016. The book's origins may account for otherwise baffling concluding lines, in which Barnes, now 79, confronts mortality. (As he did, more affectingly, in his 2013 memoir 'Levels of Life.') 'Who knows, perhaps a friendly radio producer with a microphone will come along to my bedside and ask the right questions. If so, I'll be able to let you know.'

Barnes begins the book by pointing out what an odd expression 'I changed my mind' is: 'Where is this 'I' that is changing this 'mind,' like some rider controlling a horse with their knees?' he asks. 'This 'I' we feel so confident about isn't something beyond and separate from the mind' that 'you might as well say 'My mind changed me.''
Yahoo
17-03-2025
- Politics
- Yahoo
Why It's Hard to Change Your Mind
Julian Barnes opens Changing My Mind, his brisk new book about our unruly intellects, with a quote famously attributed to the economist John Maynard Keynes: 'When the facts change, I change my mind.' It's a fitting start for an essay on our obliviousness to truth, because Keynes didn't say that—or not exactly that. The economist Paul Samuelson almost said it in 1970 (replacing 'facts' with 'events') and in 1978 almost said it again (this time, 'information'), attributing it to Keynes. His suggestion stuck, flattering our sense of plausibility—it's the sort of thing Keynes would have said—and now finds itself repeated in a work of nonfiction. Our fallibility is very much on display.

Not that Barnes would deny that he makes mistakes. The wry premise of his book is that he's changed his mind about how we change our minds, evolving from a Keynesian faith in fact and reason to a framing inspired by the Dadaist Francis Picabia's aphorism 'Our heads are round so that our thoughts can change direction.' (In this case, the citation is accurate.) Barnes concludes that our beliefs are changed less by argument or evidence than by emotion: 'I think, on the whole, I have become a Picabian rather than a Keynesian.'

Barnes is an esteemed British novelist, not a social scientist—one of the things he hasn't changed his mind about is 'the belief that literature is the best system we have of understanding the world'—but his shift in perspective resonates with a host of troubling results in social psychology. Research in recent decades shows that we are prone to 'confirmation bias,' systematically interpreting new information in ways that favor our existing views and cherry-picking reasons to uphold them. We engage in 'motivated reasoning,' believing what we wish were true despite the evidence. And we are subject to 'polarization': As we divide into like-minded groups, we become more homogeneous and more extreme in our beliefs. If a functioning democracy is one in which people share a common pool of information and disagree in moderate, conciliatory ways, there are grounds for pessimism about its prospects.

For Barnes, this is not news: 'When I look back at the innumerable conversations I've had with friends and colleagues about political matters over the decades,' he laments, 'I can't remember a single, clear instance, when a single, clear argument has made me change my mind—or when I have changed someone else's mind.' Where Barnes has changed his mind—about the nature of memory, or policing others' language, or the novelists Georges Simenon and E. M. Forster—he attributes the shift to quirks of experience or feeling, not rational thought.

Both Barnes and the social scientists pose urgent, practical questions. What should we do about the seeming inefficacy of argument in politics? How can people persuade opponents on issues such as immigration, abortion, or trans rights in cases where their interpretation of evidence seems biased? Like the Russian trolls who spread divisive rhetoric on social media, these questions threaten one's faith in what the political analyst Anand Giridharadas has called 'the basic activity of democratic life—the changing of minds.' The situation isn't hopeless; in his recent book, The Persuaders, Giridharadas portrays activists and educators who have defied the odds. But there is a risk of self-fulfilling prophecy: If democratic discourse comes to seem futile, it will atrophy.

Urgent as it may be, this fear is not what animates Barnes in Changing My Mind. His subject is not moving other minds, but rather changing our own. It's easy and convenient to forget that confirmation bias, motivated reasoning, and group polarization are not problems unique to those who disagree with us. We all interpret evidence with prejudice, engage in self-deception, and lapse into groupthink. And though political persuasion is a topic for social scientists, the puzzle of what I should do when I'm afraid that I'm being irrational or unreliable is a philosophical question I must inevitably ask, and answer, for myself.

That's why it feels right for Barnes to approach his topic through autobiography, in the first person. This genre goes back to Descartes' Meditations: epistemology as memoir. And like Descartes before him, Barnes confronts the specter of self-doubt. 'If Maynard Keynes changed his mind when the facts changed,' he admits, 'I find that facts and events tend to confirm me in what I already believe.' You might think that this confession of confirmation bias would shake his confidence, but that's not what happens to Barnes, or to many of us. Learning about our biases doesn't necessarily make them go away.

In a chapter on his political convictions, Barnes is cheerfully dogmatic. 'When asked my view on some public matter nowadays,' he quips, 'I tend to reply, 'Well, in Barnes's Benign Republic …'' He goes on to list some of BBR's key policies:

For a start … public ownership of all forms of mass transport, and all forms of power supply—gas, electric, nuclear, wind, solar … Absolute separation of Church and State … Full restoration of all arts and humanities courses at schools and universities … and, more widely, an end to a purely utilitarian view of education.

This all sounds good to me, but it's announced without a hint of argument. Given Barnes's doubts about the power of persuasion, that makes sense. If no one is convinced by arguments, anyway, offering them would be a waste of time. Barnes does admit one exception: 'Occasionally, there might be an area where you admit to knowing little, and are a vessel waiting to be filled.' But, he adds, 'such moments are rare.' The discovery that reasoning is less effective than we hoped, instead of being a source of intellectual humility, may lead us to opt out of rational debate.

Barnes doesn't overtly make this case—again, why would he? But it's implicit in his book and it's not obviously wrong. When we ask what we should think in light of the social science of how we think, we run into philosophical trouble. I can't coherently believe that I am basically irrational or unreliable, because that belief would undermine itself: another conviction I can't trust. More narrowly, I can't separate what I think about, say, climate change from the apparent evidence. It's paradoxical to doubt that climate change is real while thinking that the evidence for climate change is strong, or to think, I don't believe that climate change is real, although it is. My beliefs are my perspective on the world; I cannot step outside of them to change them 'like some rider controlling a horse with their knees,' as Barnes puts it, 'or the driver of a tank guiding its progress.'

So what am I to do? One consolation, of sorts, is that my plight—and yours—predates the findings of social science. Philosophers like Descartes long ago confronted the perplexities of the subject trapped within their own perspective. The limits of reasoning are evident from the moment we begin to do it. Every argument we make contains premises an opponent can dispute: They can always persist in their dissent, so long as they reject, time and again, some basic assumption we take for granted. This doesn't mean that our beliefs are unjustified. Failure to convert the skeptic—or the committed conspiracy theorist—need not undermine our current convictions. Nor does recent social science prove that we're inherently irrational. In conditions of uncertainty, it's perfectly reasonable to put more faith in evidence that fits what we take to be true than in unfamiliar arguments against it. Confirmation bias may lead to deadlock and polarization, but it is better than hopelessly starting from scratch every time we are contradicted.

None of this guarantees that we'll get the facts right. In Meditations, Descartes imagines that the course of his experience is the work of an evil demon who deceives him into thinking the external world is real. Nowadays, we might think of brains in vats or virtual-reality machines from movies like The Matrix. What's striking about these thought experiments is that their imagined subjects are rational even though everything they think they know is wrong. Rationality is inherently fallible. What social science reveals is that we are more fallible than we thought.

But this doesn't mean that changing our mind is a fool's errand. New information might be less likely to lead us to the truth than we would like to believe—but that doesn't mean it has no value at all. More evidence is still better than less. And we can take concrete steps to maximize its value by mitigating bias. Studies suggest, for instance, that playing devil's advocate improves our reliability. Barnes notwithstanding, novel arguments can move our mind in the right direction.

As Descartes' demon shows, our environment determines how far being rational correlates with being right. At the evil-demon limit, not at all: We are trapped in the bubble of our own experience. Closer to home, we inhabit epistemic bubbles that impede our access to information. But our environment is something we can change. Sometimes it's good to have an open mind and to consider new perspectives. At other times, it's not: We know we're right and the risk of losing faith is not worth taking. We can't ensure that evidence points us to the truth, but we can protect ourselves from falling into error. As Barnes points out, memory is 'a key factor in changing our mind: we need to forget what we believed before, or at least forget with what passion and certainty we believed it.' When we fear that our environment will degrade, that we'll be subject to misinformation or groupthink, we can record our fundamental values and beliefs so as not to forsake them later.

Seen in this light, Barnes's somewhat sheepish admission that he has never really changed his mind about politics seems, if not entirely admirable, then not all bad. Where the greater risk is that we'll come to accept the unacceptable, it's just as well to be dogmatic.

Article originally published at The Atlantic

