Latest news with #Zmigrod


Time of India
30-05-2025
- Science
- Time of India
Do our brains shape our political views?
Fascinating research on the neuroscience of ideology. Ideological extremism is usually explained by social, economic and demographic factors. There's not enough research on how individual thinking patterns can make some people more likely to support violence in defence of their beliefs or group. That's what Leor Zmigrod dives into in The Ideological Brain: The Radical Science of Flexible Thinking. Does ideology shape our brains, or do our brains shape ideological leanings? Is there a neuroscience of free will? Could extreme worldviews be rooted in cognition and biology? It all depends on how sharp the brief is. What's being studied? Political identities, radicalisation or religion? Is the research on brain areas that do the decision-making, or regions for emotional processing? Is the focus on brain structure, or on brain function? Is there a 'how' in the mix, finding a mechanism, or is it simply a search for an effect? The book asks as many questions as it answers as it threads together neuro-research, politics and philosophy, and also shines a light on the scope of future research. Zmigrod and political neuroscientists like her are asking how deeply ideological systems can penetrate into our brains. 'How far into the mind and body indoctrination really goes.' Experiments included mapping brain activity, and the regions where it was happening, while participants were exposed to political videos, news and the like. Obedient actions evoked brain patterns different from free choices. A 2011 study that compared the sizes of liberal and conservative brains found that more conservative people had a larger right amygdala than political liberals. The amygdalae are twin brain structures that store the emotional 'feel' of negative emotions such as threat, fear and disgust, and the information we internalise about social hierarchies. A brain part's size is linked to its processing capacity, but the degree to which anatomy responds to or depends on function is still under study. Enter the chicken-and-egg puzzle. Do individuals lean towards more conservative ideologies because they have larger amygdalae, or does being immersed in 'system-justifying', status-quo, conservative ideologies lead to structural brain changes? But, aha, size doesn't matter, says the book. Two sets of scientists found that liberal participants had a larger anterior cingulate cortex (ACC) – a central, sausage-shaped region in charge of emotional processing and cognitive control – but couldn't replicate the results in later tests. Zmigrod argues it's not the size but the function that matters. When it comes to ideological thinking, the ACC is 'haughtily aware of its own importance'. The queen is the prefrontal cortex (PFC), which deals with complex decision-making and high-flying mental computations. The greater the damage or injury to the PFC, the more conservative the person. Those with a damaged PFC would identify extreme statements as moderate; those with an intact PFC would spot the extreme for what it is. So, to be progressive, all you need is an undamaged PFC? 'Not so fast,' says Zmigrod. The PFC is like a transport hub, making sense of all the information zigging in and zagging out to and from all parts of the brain – it's never a standalone. And then there's dopamine. The most rigid individuals have specific genes that affect how and where dopamine is distributed through the brain – less in the PFC, more in the parts that control instinct. These pathways can be traced to discover the neurochemistry of ideology. Summing up: the book says the greater the uncertainty, the more susceptible the brain is to dogmatism. 
Most leaders are creating ever-new uncertainties. What's that doing to people?
Yahoo
12-05-2025
- Politics
- Yahoo
Is your brain your political destiny?
You often hear about 'ideology' these days. Even if that word isn't mentioned, it's very much what's being discussed. When President Donald Trump denounces the left, he's talking about gender ideology or critical race theory or DEI. When the left denounces Trump, they talk about fascism. Wherever you look, ideology is being used to explain or dismiss or justify policies. Buried in much of this discourse is an unstated assumption that the real ideologues are on the other side. Often, to call someone 'ideological' is to imply that they're fanatical or dogmatic. But is that the best way to think about ideology? Do we really know what we're talking about when we use the term? And is it possible that we're all ideological, whether we know it or not? Leor Zmigrod is a cognitive neuroscientist and the author of The Ideological Brain. Her book makes the case that our political beliefs aren't just beliefs. They're also neurological signatures, written into our neurons and reflexes, and over time those signatures change our brains. Zmigrod's point isn't that 'brain is destiny,' but she is saying that our biology and our beliefs are interconnected in important ways. I invited Zmigrod onto The Gray Area to talk about the biological roots of belief and whether something as complicated as ideology is reducible to the brain in this way. As always, there's much more in the full podcast, so listen and follow The Gray Area on Apple Podcasts, Spotify, Pandora, or wherever you find podcasts. New episodes drop every Monday. This interview has been edited for length and clarity. What is ideology? How are you defining it? I think ideology has two components. One is a very fixed doctrine, a set of descriptions about the world that's very absolutist, that's very black and white, and that is very resistant to evidence. An ideology will always have a certain kind of causal narrative about the world that describes what the world is like and also how we should act within that world. It gives prescriptions for how we should act, how we should think, how we should interact with other people. But that's not the end of the story. To think ideologically is both to have this fixed doctrine and also to have a very fixed identity that influences how you judge everyone. And that fixed identity stems from the fact that every ideology, every doctrine, will have believers and nonbelievers. So when you think ideologically, you're really embracing those rigid identity categories and deciding to exclusively affiliate with people who believe in your ideology and reject anyone who doesn't. The degree of ideological extremity can be mapped onto how hostile you are to anyone with differing beliefs, whether you're willing to potentially harm people in the name of your ideology. You write, 'Not all stories are ideologies and not all forms of collective storytelling are rigid and oppressive.' How do you tell the difference? How do you, for instance, distinguish an ideology from a religion? Is there room for a distinction like that in your framework? What I think about often is the difference between ideology and culture. Because culture can encompass eccentricities; it can encompass deviation, different kinds of traditions or patterns from the past, but it's not about legislating what one can do or one can't do. The moment we detect an ideology is the moment when you have very rigid prescriptions about what is permissible and what is not permissible. 
And when you stop being able to tolerate any deviation, that's when you've moved from culture, which can encompass a lot of deviation and reinterpretations, to ideology. How do you test for cognitive flexibility versus rigidity? In order to test someone's cognitive rigidity or their flexibility, one of the most important things is not just to ask them, because people are terrible at knowing whether they're rigid or flexible. The most rigid thinkers will tell you they're fabulously flexible, and the most flexible thinkers will not know it. So that's why we need to use these unconscious assessments, these cognitive tests and games that tap into your natural capacity to be adaptable or to resist change. One test to do this is called the Wisconsin Card Sorting Test, which is a card-sorting game where people are presented with a deck of cards that they need to sort. And initially, they don't know what rule governs the game, so they try and figure it out. And quickly, they'll realize that they should match the cards in their deck according to their color. So they'll start putting a blue card with a blue card, a red card with a red card, and they'll get affirmation that they're doing it right. They start enacting this rule, adopting it, applying it again and again and again. And after a while, unbeknownst to them, the rule of the game changes and suddenly this color rule doesn't work anymore. That's the moment of change that I'm most interested in because some people will notice that change and they will adapt. They will then go looking for a different rule, and they'll quickly figure out that they should actually sort the cards according to the shape of the objects on the card and they'll follow this new rule. Those are very cognitively flexible individuals. But there are other people who will notice that change and they will hate it. They will resist that change. They will try to say that it never happened, and they'll try to apply the old rule, despite getting negative feedback. And those people who really resist the change are the most cognitively rigid people. They don't like change. They don't adapt their behavior when the evidence suggests that they should. So if someone struggles to switch gears in a card-sorting game, that says something about their comfort with change and ambiguity in general. And someone who struggles with change and ambiguity in a card game will probably also have an aversion to something like pluralism in politics because their brain processes that as chaotic. Is that a fair summary of the argument? Yeah, broadly. People who resist that change, who resist uncertainty, who like things to stay the same: when the rules change, they really don't like it. Often that translates into the most cognitively rigid people, people who don't like pluralism, who don't like debate. But that can really coexist on both sides of the political spectrum. When we're talking about diversity, that can be a more politicized concept, and you can still find very rigid thinkers being very militant about certain ideas that we might say are progressive. So it's quite nuanced. It's easy to understand why being extremely rigid would be a bad thing. But is it possible to be too flexible? If you're just totally unmoored and permanently wide open and incapable of settling on anything, that seems bad in a different way, no? What you're talking about is a kind of immense persuadability, but that's not exactly flexibility. 
There is a distinction there because being flexible is about updating your beliefs in light of credible evidence, not necessarily adopting a belief just because some authority says so. It's about seeing the evidence and responding to it. Focusing on rigidity does make a lot of sense, but is there a chance you risk pathologizing conviction? How do you draw the line between principled thinking and dogmatic thinking? It's not about pathologizing conviction, but it is about questioning what it means to believe in an idea without being willing to change your mind on it. And I think that there is a very fine line between what we call principles and what we call dogmas. This gets particularly thorny in the moral domain. No one wants to be dogmatic, but it's also hard to imagine any kind of moral clarity without something like a fixed commitment to certain principles or values. And what often happens is if we don't like someone's values, we'll call them extremists or dogmatic. But if we like their values, we call them principled. Yeah, and that's why I think that a psychological approach to what it means to think ideologically helps us escape from that kind of slippery relativism. Because then it's not just about, Oh, where is someone relative to us on certain issues on the political spectrum? It's about thinking, Well, what does it mean to resist evidence? There is a delicate path there where you can find a way to have a moral compass — maybe not the same absolutist moral clarity that ideologies try to convince you exists, but you can have a morality without having really dogmatic ideologies. How much of our rigid thinking is just about our fear of uncertainty? Ideologies are our brains' way of solving the problem of uncertainty in the world because our brains are these incredible predictive organs. They're trying to understand the world, looking for shortcuts wherever possible because it's very complicated and very computationally expensive to figure out everything that's happening in the world. Ideologies kind of hand that to you on a silver platter and they say, Here are all the rules for life. Here are all the rules for social interaction. Here's a description of all the causal mechanisms for how the world works. There you go. And you don't need to do that hard labor of figuring it out all on your own. That's why ideologies can be incredibly tempting and seductive for our predictive brains that are trying to resolve uncertainty, that are trying to resolve ambiguities, that are just trying to understand the world in a coherent way. It's a coping mechanism. In the book, you argue that every worldview can be practiced extremely and dogmatically. I read that, and I just wondered if it leaves room for making normative judgments about different ideologies. Do you think every ideology is equally susceptible to extremist practices? I sometimes get strong opposition from people saying, Well, my ideology is about love. It's about generosity or about looking after others. The idea is that these positive ideologies should be immune from dogmatic and authoritarian ways of thinking. But this research isn't about comparing ideologies as these big entities represented by many people. I'm asking if there are people within all these ideologies who are extremely rigid. And we do see that every ideology can be taken on militantly. 
Not every ideology is equally violent or equally quick to impose rules on others, but every ideology that has this very strong utopian vision of what life and the world should be, or a very dystopian fear of where the world is going, all of those have a capacity to become extreme. How do you think about causality here? Are some people just biologically prone to dogmatic thinking, or do they get possessed by ideologies that reshape their brain over time? This is a fascinating question, and I think that causality goes both ways. I think there's evidence that there are preexisting predispositions that propel some people to join ideological groups. And that when there is a trigger, they will be the first to run to the front of the line in support of the ideological cause. But at the same time, as you become more extreme, more dogmatic, you are changed. The way you think about the world, the way you think about yourself, changes. You become more ritualistic, more narrow, more rigid in every realm of life. So yes, ideology also changes you. Listen to the rest of the conversation and be sure to follow The Gray Area on Apple Podcasts, Spotify, Pandora, or wherever you listen to podcasts.
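The card-sorting task Zmigrod describes has a simple underlying logic, and it can be made concrete with a small simulation. The sketch below is illustrative only: it is not Zmigrod's experimental code, the 'flexible' and 'rigid' strategies are deliberately simplified assumptions, and counting errors after the hidden rule change is just one crude proxy for cognitive rigidity.

import random

DIMENSIONS = ["color", "shape", "number"]  # attributes a card could be sorted by

def make_card():
    # Draw a random card with a value on each dimension.
    return {"color": random.choice(["red", "blue", "green"]),
            "shape": random.choice(["circle", "square", "star"]),
            "number": random.choice([1, 2, 3])}

def run_session(strategy, n_trials=60, switch_at=30):
    # Simulate one card-sorting session in which the hidden rule changes partway through.
    rule = "color"        # the hidden rule the participant is supposed to discover
    belief = "color"      # the rule the simulated participant actually applies
    errors_after_switch = 0
    for trial in range(n_trials):
        if trial == switch_at:
            rule = "shape"                       # unannounced rule change, as in the task
        card = make_card()
        feedback = (card[belief] == card[rule])  # the only feedback is right or wrong
        if trial >= switch_at and not feedback:
            errors_after_switch += 1
        # A 'flexible' agent abandons its rule as soon as it stops working;
        # a 'rigid' agent keeps applying the old rule despite negative feedback.
        if not feedback and strategy == "flexible":
            belief = random.choice([d for d in DIMENSIONS if d != belief])
    return errors_after_switch

random.seed(1)
print("flexible agent, errors after the rule change:", run_session("flexible"))
print("rigid agent, errors after the rule change:", run_session("rigid"))

Run as written, the flexible agent makes only a handful of errors before settling on the new rule, while the rigid agent keeps sorting by colour and errs on virtually every post-switch trial; that gap is, roughly, what the real test is designed to expose.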


New European
29-04-2025
- Politics
- New European
Everyday Philosophy: Inside the minds of extremists
On Wednesday, Salman Rushdie's attacker Hadi Matar was to have been sentenced for attempted murder, but that has been adjourned to May 16. Hashem Abedi, who helped plan and prepare the 2017 Manchester Arena bombing in which 22 people were killed and 264 wounded, attacked three prison officers using hot oil and improvised knives at HMP Frankland a few weeks ago. Two officers are in hospital with severe stab wounds. Abedi, already serving a minimum of 55 years, is unlikely ever to be released. What motivated Abedi's vicious assaults in prison is unclear, but his earlier murders were, in the words of the sentencing judge: 'to advance the cause of Islamism; a matter distinct from and abhorrent to the vast majority of those who follow the Islamic faith.' Tabloids describe Matar, Abedi and other ideologically driven terrorists as monsters, manifestations of pure evil, or as zombie-like products of indoctrination, but we urgently need a better account than that. In her recently published The Ideological Brain: The Radical Science of Flexible Thinking, Leor Zmigrod attempts one. Her focus is the causes of cognitive rigidity and extremism. She describes how immersion in an ideology can reshape an individual's brain. As a political neuroscientist, she draws on psychological, social, and neuroscientific research. Rigid ideological thinkers commit to narratives about the world that explain and confirm their approach. These deliver clear rules about how they should behave. Counter-evidence is invisible to them. Typically, they surround themselves with a community of like-minded believers who reinforce their prejudices and give them a sense of belonging and self-worth. They don't adapt their views in the light of evidence because they already know the truth: it's confirmed everywhere. As Zmigrod puts it: 'Ideologies provide easy solutions to our queries, scripts that we can follow, groups to which we belong. Guiding our thoughts and actions, ideologies are the shortcuts to our desire to understand the world and be understood back.' From within an ideology, everything is neatly ordered, everything is as you expect it to be. Hannah Arendt observed that ideological thinking 'proceeds with a consistency that exists nowhere else in reality'. Once you've settled on some unrevisable axioms, the consequences follow with a terrible logic. Not all rigid thinkers end up as terrorists, but a small percentage do, with devastating consequences. People with a highly rigid mindset don't just treat evidence differently, they also perceive and remember differently. Zmigrod cites experiments in which children were told a story. When asked to recall what happened, those identified by other tests as at the rigid end of the spectrum often invented details, such as undesirable traits for characters from different ethnic groups. Their memories of the story made it consistent with their biases. In contrast, the more 'liberal'-minded children recalled events with greater accuracy. What causes such rigidity? Unravelling the direction of cause and effect is a complex chicken-and-egg task. Zmigrod has discovered that the brains of the most cognitively rigid have lower concentrations of dopamine in their prefrontal cortex (crudely, the brain's decision-making centre), and more in areas that control rapid instincts. This is something that is influenced by genes. However, Zmigrod cautions, there's no 'dogmatic gene'. 
The interactions between different genes are complex, and the expression of particular genes is not predetermined: it depends on multiple factors, including, of course, experience and exposure to ideologies. Not everyone is equally at risk: some individuals are especially susceptible. 'There are gradations in vulnerability and recognising this continuity is crucial,' Zmigrod explains. Vulnerable people's immersion in the world of an ideology changes their brain structures and reinforces and rewards dangerous patterns of thought. That doesn't absolve them of responsibility, but their brains chemically reinforce what an ideology is telling them. Most of us share some patterns of thought with rigid thinkers in some spheres. We are all somewhere on that spectrum, but our position is not fixed. At times of stress, for example, people become more dogmatic in their thinking; when less stressed they become more flexible. Once seduced by an ideology, however, the force of the downward spiral with its patterns of reinforcement can be hard to resist, and for some all but impossible without help. Ideologies, Zmigrod acknowledges, are 'terrifyingly alluring'. Our only hope lies in fostering critical and flexible thinking in…


New York Times
08-04-2025
- Politics
- New York Times
Ideology May Not Be What You Think but How You're Wired
So sharp are partisan divisions these days that it can seem as if people are experiencing entirely different realities. Maybe they actually are, according to Leor Zmigrod, a neuroscientist and political psychologist at Cambridge University. In a new book, 'The Ideological Brain: The Radical Science of Flexible Thinking,' Dr. Zmigrod explores the emerging evidence that brain physiology and biology help explain not just why people are prone to ideology but how they perceive and share information. This conversation has been edited for clarity and brevity. What is ideology? It's a narrative about how the world works and how it should work. This potentially could be the social world or the natural world. But it's not just a story: It has really rigid prescriptions for how we should think, how we should act, how we should interact with other people. An ideology condemns any deviation from its prescribed rules. You write that rigid thinking can be tempting. Why is that? Ideologies satisfy the need to try to understand the world, to explain it. And they satisfy our need for connection, for community, for just a sense that we belong to something. There's also a resource question. Exploring the world is really cognitively expensive, and just exploiting known patterns and rules can seem to be the most efficient strategy. Also, many people argue — and many ideologies will try to tell you — that adhering to rules is the only good way to live and to live morally. I actually come at it from a different perspective: Ideologies numb our direct experience of the world. They narrow our capacity to adapt to the world, to understand evidence, to distinguish between credible evidence and not credible evidence. Ideologies are rarely, if ever, good. In the book, you describe research showing that ideological thinkers can be less reliable narrators. Can you explain? Remarkably, we can observe this effect in children. In the 1940s, Else Frenkel-Brunswik, a psychologist at the University of California, Berkeley, interviewed hundreds of children and tested their levels of prejudice and authoritarianism, like whether they championed conformity and obedience or play and imagination. When children were told a story about new pupils at a fictional school and asked to recount the story later, there were significant differences in what the most prejudiced children remembered, as opposed to the most liberal children. Liberal children tended to recall more accurately the ratio of desirable and undesirable traits in the characters of the story; their memories possessed greater fidelity to the story as it was originally told. In contrast, children who scored highly on prejudice strayed from the story; they highlighted or invented undesirable traits for the characters from ethnic minority backgrounds. So, the memories of the most ideologically minded children incorporated fictions that confirmed their pre-existing biases. At the same time, there was also a tendency to occasionally parrot single phrases and details, rigidly mimicking the storyteller. Are people who are prone to ideology taking in less information? Are they processing it differently? The people most prone to ideological thinking tend to resist change or nuance of any kind. We can test this with visual and linguistic puzzles. For instance, in one test, we ask them to sort playing cards by various rules, like suit or color. But suddenly they apply the rule and it doesn't work. That's because, unbeknownst to them, we changed the rule. 
The people who tend to resist ideological thinking are adaptable, and so when there's evidence the rules have changed, they change their behavior. Ideological thinkers, when they encounter the change, resist it. They try to apply the old rule even though it doesn't work anymore. In one study you conducted, you found that ideologues and nonideologues appear to have fundamental differences in their brains' reward circuitry. Can you describe your findings? In my experiments I've found that the most rigid thinkers have genetic dispositions related to how dopamine is distributed in their brains. Rigid thinkers tend to have lower levels of dopamine in their prefrontal cortex and higher levels of dopamine in their striatum, a key subcortical structure in our reward system that controls our rapid instincts. So our psychological vulnerabilities to rigid ideologies may be grounded in biological differences. In fact, we find that people with different ideologies have differences in the physical structure and function of their brains. This is especially pronounced in brain networks responsible for reward, emotion processing, and monitoring when we make errors. For instance, the size of our amygdala — the almond-shaped structure that governs the processing of emotions, especially negatively tinged emotions such as fear, anger, disgust, danger and threat — is linked to whether we hold more conservative ideologies that justify traditions and the status quo. What do you make of this? Some scientists have interpreted these findings as reflecting a natural affinity between the function of the amygdala and the function of conservative ideologies. Both revolve around vigilant reactions to threats and the fear of being overpowered. But why is the amygdala larger in conservatives? Do people with a larger amygdala gravitate toward more conservative ideologies because their amygdala is already structured in a way that is more receptive to the negative emotions that conservatism elicits? Or can immersion in a certain ideology alter our emotional biochemistry in a way that leads to structural brain changes? The ambiguity around these results reflects a chicken-and-egg problem: Do our brains determine our politics, or can ideologies change our brains? If we're wired a certain way, can we change? You have agency to choose how passionately you adopt these ideologies or what you reject or what you don't. I think we all can shift in terms of our flexibility. It's obviously harder for people who have genetic or biological vulnerabilities toward rigid thinking, but that doesn't mean that it's predetermined or impossible to change.
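As a rough illustration of the behaviour described in this interview, the sketch below scores a participant's responses from a rule-switch task of this kind by counting 'perseverative' choices, trials after the unannounced change on which the old rule is still applied. The scoring scheme and the example data are hypothetical, invented for illustration; they are not taken from Zmigrod's studies.

from typing import List

def perseveration_score(choices: List[str], old_rule: str, switch_trial: int) -> float:
    # Fraction of post-switch trials on which the participant still sorted by the old rule.
    # choices: the dimension sorted by on each trial (e.g. "color" or "shape")
    # old_rule: the rule that was correct before the unannounced switch
    # switch_trial: index of the first trial governed by the new rule
    post_switch = choices[switch_trial:]
    if not post_switch:
        return 0.0
    perseverative = sum(1 for c in post_switch if c == old_rule)
    return perseverative / len(post_switch)

# Hypothetical response sequences: the rule silently changes from "color" to "shape" at trial 5.
adaptive = ["color"] * 5 + ["color", "shape", "shape", "shape", "shape"]
rigid = ["color"] * 5 + ["color"] * 5

print(perseveration_score(adaptive, "color", switch_trial=5))  # 0.2 - adapts after one wrong trial
print(perseveration_score(rigid, "color", switch_trial=5))     # 1.0 - keeps applying the old rule

A higher score simply means the responder kept applying a rule that no longer worked; on Zmigrod's account, that tendency, not any particular political position, is what marks out the most ideologically rigid thinkers.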