Latest news with #Picabian
Yahoo
17-03-2025
- Politics
- Yahoo
Why It's Hard to Change Your Mind
Julian Barnes opens Changing My Mind, his brisk new book about our unruly intellects, with a quote famously attributed to the economist John Maynard Keynes: 'When the facts change, I change my mind.' It's a fitting start for an essay on our obliviousness to truth, because Keynes didn't say that—or not exactly that. The economist Paul Samuelson almost said it in 1970 (replacing 'facts' with 'events') and in 1978 almost said it again (this time, 'information'), attributing it to Keynes. His suggestion stuck, flattering our sense of plausibility—it's the sort of thing Keynes would have said—and now finds itself repeated in a work of nonfiction. Our fallibility is very much on display.

Not that Barnes would deny that he makes mistakes. The wry premise of his book is that he's changed his mind about how we change our minds, evolving from a Keynesian faith in fact and reason to a framing inspired by the Dadaist Francis Picabia's aphorism 'Our heads are round so that our thoughts can change direction.' (In this case, the citation is accurate.) Barnes concludes that our beliefs are changed less by argument or evidence than by emotion: 'I think, on the whole, I have become a Picabian rather than a Keynesian.'

Barnes is an esteemed British novelist, not a social scientist—one of the things he hasn't changed his mind about is 'the belief that literature is the best system we have of understanding the world'—but his shift in perspective resonates with a host of troubling results in social psychology. Research in recent decades shows that we are prone to 'confirmation bias,' systematically interpreting new information in ways that favor our existing views and cherry-picking reasons to uphold them. We engage in 'motivated reasoning,' believing what we wish were true despite the evidence. And we are subject to 'polarization': As we divide into like-minded groups, we become more homogeneous and more extreme in our beliefs. If a functioning democracy is one in which people share a common pool of information and disagree in moderate, conciliatory ways, there are grounds for pessimism about its prospects.

For Barnes, this is not news: 'When I look back at the innumerable conversations I've had with friends and colleagues about political matters over the decades,' he laments, 'I can't remember a single, clear instance, when a single, clear argument has made me change my mind—or when I have changed someone else's mind.' Where Barnes has changed his mind—about the nature of memory, or policing others' language, or the novelists Georges Simenon and E. M. Forster—he attributes the shift to quirks of experience or feeling, not rational thought.

Both Barnes and the social scientists pose urgent, practical questions. What should we do about the seeming inefficacy of argument in politics? How can people persuade opponents on issues such as immigration, abortion, or trans rights in cases where their interpretation of evidence seems biased? Like the Russian trolls who spread divisive rhetoric on social media, these questions threaten one's faith in what the political analyst Anand Giridharadas has called 'the basic activity of democratic life—the changing of minds.' The situation isn't hopeless; in his recent book, The Persuaders, Giridharadas portrays activists and educators who have defied the odds. But there is a risk of self-fulfilling prophecy: If democratic discourse comes to seem futile, it will atrophy.
Urgent as it may be, this fear is not what animates Barnes in Changing My Mind. His subject is not moving other minds, but rather changing our own. It's easy and convenient to forget that confirmation bias, motivated reasoning, and group polarization are not problems unique to those who disagree with us. We all interpret evidence with prejudice, engage in self-deception, and lapse into groupthink. And though political persuasion is a topic for social scientists, the puzzle of what I should do when I'm afraid that I'm being irrational or unreliable is a philosophical question I must inevitably ask, and answer, for myself.

That's why it feels right for Barnes to approach his topic through autobiography, in the first person. This genre goes back to Descartes' Meditations: epistemology as memoir. And like Descartes before him, Barnes confronts the specter of self-doubt. 'If Maynard Keynes changed his mind when the facts changed,' he admits, 'I find that facts and events tend to confirm me in what I already believe.' You might think that this confession of confirmation bias would shake his confidence, but that's not what happens to Barnes, or to many of us. Learning about our biases doesn't necessarily make them go away.

In a chapter on his political convictions, Barnes is cheerfully dogmatic. 'When asked my view on some public matter nowadays,' he quips, 'I tend to reply, 'Well, in Barnes's Benign Republic …'' He goes on to list some of BBR's key policies:

For a start … public ownership of all forms of mass transport, and all forms of power supply—gas, electric, nuclear, wind, solar … Absolute separation of Church and State … Full restoration of all arts and humanities courses at schools and universities … and, more widely, an end to a purely utilitarian view of education.

This all sounds good to me, but it's announced without a hint of argument. Given Barnes's doubts about the power of persuasion, that makes sense. If no one is convinced by arguments anyway, offering them would be a waste of time. Barnes does admit one exception: 'Occasionally, there might be an area where you admit to knowing little, and are a vessel waiting to be filled.' But, he adds, 'such moments are rare.' The discovery that reasoning is less effective than we hoped, instead of being a source of intellectual humility, may lead us to opt out of rational debate.

Barnes doesn't overtly make this case—again, why would he? But it's implicit in his book and it's not obviously wrong. When we ask what we should think in light of the social science of how we think, we run into philosophical trouble. I can't coherently believe that I am basically irrational or unreliable, because that belief would undermine itself: another conviction I can't trust. More narrowly, I can't separate what I think about, say, climate change from the apparent evidence. It's paradoxical to doubt that climate change is real while thinking that the evidence for climate change is strong, or to think, I don't believe that climate change is real, although it is. My beliefs are my perspective on the world; I cannot step outside of them to change them 'like some rider controlling a horse with their knees,' as Barnes puts it, 'or the driver of a tank guiding its progress.' So what am I to do? One consolation, of sorts, is that my plight—and yours—predates the findings of social science.
Philosophers like Descartes long ago confronted the perplexities of the subject trapped within their own perspective. The limits of reasoning are evident from the moment we begin to do it. Every argument we make contains premises an opponent can dispute: They can always persist in their dissent, so long as they reject, time and again, some basic assumption we take for granted. This doesn't mean that our beliefs are unjustified. Failure to convert the skeptic—or the committed conspiracy theorist—need not undermine our current convictions. Nor does recent social science prove that we're inherently irrational. In conditions of uncertainty, it's perfectly reasonable to put more faith in evidence that fits what we take to be true than in unfamiliar arguments against it. Confirmation bias may lead to deadlock and polarization, but it is better than hopelessly starting from scratch every time we are contradicted.

None of this guarantees that we'll get the facts right. In Meditations, Descartes imagines that the course of his experience is the work of an evil demon who deceives him into thinking the external world is real. Nowadays, we might think of brains in vats or virtual-reality machines from movies like The Matrix. What's striking about these thought experiments is that their imagined subjects are rational even though everything they think they know is wrong. Rationality is inherently fallible.

What social science reveals is that we are more fallible than we thought. But this doesn't mean that changing our mind is a fool's errand. New information might be less likely to lead us to the truth than we would like to believe—but that doesn't mean it has no value at all. More evidence is still better than less. And we can take concrete steps to maximize its value by mitigating bias. Studies suggest, for instance, that playing devil's advocate improves our reliability. Barnes notwithstanding, novel arguments can move our mind in the right direction.

As Descartes' demon shows, our environment determines how far being rational correlates with being right. At the evil-demon limit, not at all: We are trapped in the bubble of our own experience. Closer to home, we inhabit epistemic bubbles that impede our access to information. But our environment is something we can change. Sometimes it's good to have an open mind and to consider new perspectives. At other times, it's not: We know we're right and the risk of losing faith is not worth taking. We can't ensure that evidence points us to the truth, but we can protect ourselves from falling into error. As Barnes points out, memory is 'a key factor in changing our mind: we need to forget what we believed before, or at least forget with what passion and certainty we believed it.' When we fear that our environment will degrade, that we'll be subject to misinformation or groupthink, we can record our fundamental values and beliefs so as not to forsake them later.

Seen in this light, Barnes's somewhat sheepish admission that he has never really changed his mind about politics seems, if not entirely admirable, then not all bad. Where the greater risk is that we'll come to accept the unacceptable, it's just as well to be dogmatic.

Article originally published at The Atlantic


The Guardian
16-03-2025
- General
- The Guardian
'We remember as true things that never even happened': Julian Barnes on memory and changing his mind
It sounds a simple business. 'I changed my mind.' Subject, verb, object – a clear, clean action, without correcting or diminishing adjectives or adverbs. 'No, I'm not doing that – I changed my mind' is usually an irrefutable statement. It implies the presence of strong arguments which can be provided if necessary. The economist John Maynard Keynes, charged with inconsistency, famously replied, 'When the facts change, I change my mind.' So, he – and we – are happily and confidently in charge of this whole operation. The world may sadly incline to inconsistency, but not us.

And yet the phrase covers a great variety of mental activities, some seemingly rational and logical, others elemental and instinctive. There may be a simmering-away beneath the level of consciousness until the bursting realisation comes that, yes, you have changed your mind completely on this subject, that person, this theory, that worldview. The dadaist Francis Picabia once put it like this: 'Our heads are round so that our thoughts can change direction.' And I think this feels as close to a true accounting of our mental processes as does Maynard Keynes's statement.

When I was growing up, adults of my parents' generation used to say, 'Changing her mind is a woman's privilege.' This was, according to your male point of view, either a charming or an infuriating characteristic. It was regarded as something essentially female, or feminine, sometimes mere whimsicality, sometimes deeply emotional and intuitively intelligent – again, intuition was back then a female speciality – and related to the very nature of the woman in question. So perhaps you could say men were Keynesian, and women Picabian.

You rarely hear that phrase about a woman's privilege nowadays, and to many it doubtless sounds merely sexist and patronising. On the other hand, if you approach the matter from a philosophical or neuroscientific point of view, it looks a little different.

'I changed my mind.' Subject, verb, object, a simple transaction under our control. But where is this 'I' that is changing this 'mind', like some rider controlling a horse with their knees, or the driver of a tank guiding its progress? Certainly not very visible to the eye of the philosopher or brain scientist. This 'I' we feel so confident about isn't something beyond and separate from the mind, controlling it, but rather something inside the mind, and arising from it. In the words of one neuroscientist, 'there is no self-stuff' locatable within the brain. Far from being a horse rider or tank commander, we are at the wheel of a driverless car of the near future. To the outside observer, there is a car, and a steering wheel, with someone sitting in front of it. And this is true – except that on this particular model the driver cannot switch from automatic to manual, because manual does not exist.

So if things are this way round – if it's the brain, the mind, that gives birth to what we think of as 'I' – then the phrase 'I changed my mind' doesn't make much sense. You might as well say, 'My mind changed me.' And if we see things this way round, then changing one's mind is something we don't necessarily understand ourselves. In which case, it's not just a woman's privilege, but a human privilege. Though perhaps 'privilege' isn't quite the right word – better to say, characteristic, or oddity. Sometimes in my life, I've been a logical Keynesian about the whole business, sometimes a dadaist Picabian. But generally, in either case, I've been confident that I was right to change my mind.
This is another characteristic of the process. We never think, Oh, I've changed my mind and have now adopted a weaker or less plausible view than the one I held before, or a sillier or more sentimental view. We always believe that changing our mind is an improvement, bringing a greater truthfulness, or a greater sense of realism, to our dealings with the world and other people. It puts an end to vacillation, uncertainty, weak-mindedness. It seems to make us stronger and more mature; we have put away yet another childish thing. Well, we would think that, wouldn't we?

I remember the story of an Oxford undergraduate of literary aspirations visiting Garsington Manor in the 1920s, where the artistic hostess Lady Ottoline Morrell presided. She asked him, 'Do you prefer spring or autumn, young man?' He replied spring. Her riposte was that when he got older he would probably prefer autumn.

In the late 1970s I interviewed the novelist William Gerhardie, who was almost exactly half a century older than me. I was young and callow; he was extremely aged, indeed bed-ridden. He asked me if I believed in the afterlife. I said that I didn't. 'Well, you might when you get to my age,' he replied with a chuckle. I admired him for the remark, while not believing that I would ever change my mind to that degree.

But we all expect, indeed approve of, some changes over the years. We change our minds about many things, from matters of mere taste – the colours we prefer, the clothes we wear – to aesthetic matters – the music, the books we like – to adherence to social groups – the football team or political party we support – to the highest verities – the person we love, the god we revere, the significance or insignificance of our place in the seemingly empty or mysteriously full universe. We make these decisions – or these decisions make us – constantly, though they are often camouflaged by the momentousness of the acts that provoke them. Love, parenthood, the death of those close to us: such matters reorient our lives, and often make us change our minds. Is it merely that the facts have changed? No, it's more that areas of fact and feeling hitherto unknown to us have suddenly become clear, that the emotional landscape has altered. And in a great swirl of emotion, our minds change. So I think, on the whole, I have become a Picabian rather than a Keynesian.

Consider the question of memory. This is often a key factor in changing our mind: we need to forget what we believed before, or at least forget with what passion and certainty we believed it, because we now believe something different that we know to be truer and deeper. Memory, or the weakening or lack of it, helps endorse our new position; it is part of the process. And beyond this, there's the wider question of how our understanding of memory changes. Mine certainly has over my lifetime.

When I was an unreflecting boy, I assumed that memory operated like a left-luggage office. An event in our lives happens, we make some swift, subconscious judgment on the importance of that event, and if it is important enough, we store it in our memory. Later, when we need to recall it, we take the left-luggage ticket along to a department of our brain, which releases the memory back to us – and there it is, as fresh and uncreased as the moment it happened. But we know it's not like that really. We know that memory degrades. We have come to understand that every time we take that memory out of the locker and expose it to view, we make some tiny alteration to it.
And so the stories we tell most often about our lives are likely to be the least reliable, because we will have subtly amended them in every retelling down the years. Sometimes it doesn't take years at all. I have an old friend, a considerable raconteur, who once, in my presence, in the course of a single day, told the same anecdote to three different audiences with three different punchlines. At the third hearing, after the laughter had subsided, I murmured, perhaps a little unkindly, 'Wrong ending, Thomas.' He looked at me in disbelief (at my manners); I looked at him in disbelief (at his not being able to stick to a reliable narrative).

There is also such a thing as a memory transplant. My wife and I were great friends of the painter Howard Hodgkin, and travelled with him and his partner to many places. In 1989, we were in Taranto in southern Italy, when Howard spotted a black towel in an old-fashioned haberdasher's window. We went in, Howard asked to see it, and the assistant produced from a drawer a black towel. No, Howard explained, it wasn't quite the same black as the one in the window. The assistant, unflustered, produced another one, and then another one, each of which Howard rejected as not being as black as the one in the window. After he had turned down seven or eight, I was thinking (as one might), for God's sake, it's only a towel, you only need it to dry your face. Then Howard asked the assistant to get the one out of the window, and we all saw at once that it was indeed very, very slightly blacker than all the others. A sale was concluded, and a lesson about the precision of an artist's eye learned.

I described this incident in an essay about Howard, and doubtless told it orally a few times as well. Many years later, after Howard's death, I was at dinner in painterly circles when a woman, addressing her husband, said, 'Do you remember when we went into that shop with Howard for a black towel…' Before she could finish, I reminded her firmly that this was my story, which her expression clearly acknowledged. And I don't believe she was doing it knowingly: she somehow remembered it as happening to her and her husband. It was an artless borrowing – or a piece of mental cannibalism, if you prefer.

It's salutary to discover, from time to time, how other people's memories are often quite different from our own – not just of events, but of what we ourselves were like back then. A few years ago, I had an exchange of correspondence about one of my books with someone whom I'd been at school with, but had not kept up with and had no memory of. The exchange turned into a sharp disagreement, at which point he clearly decided he might as well tell me what he thought of me – or, more accurately, tell me what he remembered now of what he had thought of me back when we were at school together. 'I remember you,' he wrote, 'as a noisy and irritating presence in the Sixth Form corridor.' This came as a great surprise to me, and I had to laugh, if a little ruefully. My own memory insisted – and still does – that I was a shy, self-conscious and well-behaved boy, though inwardly rebellious. But I couldn't deny this fellow pupil's reminiscence; and so, belatedly, I factored it in, and changed my mind about what I must have been like – or, at least, how I might have appeared to others – 50 and more years ago.

Gradually, I have come to change my mind about the very nature of memory itself.
For a long time I stuck pretty much with the left-luggage-department theory, presuming that some people's memories were better because their brain's storage conditions were better, or they had shaped and lacquered their memories better before depositing them in the first place.

Some years ago, I was writing a book that was mainly about death, but also a family memoir. I have one brother – three years older, a philosopher by profession – and emailed him explaining what I was up to. I asked some preliminary questions about our parents – how he judged them as parents, what they had taught us, what he thought their own relationship was like. I added that he himself would inevitably feature in my book. He replied with an initial declaration that astonished me. 'By the way,' he wrote, 'I don't mind what you say about me, and if your memory conflicts with mine, go with yours, as it is probably better.'

I thought this was not just extremely generous of him, but also very interesting. Though he was only three years older than me, he was assuming the superiority of my memory. I guessed that this could be because he was a philosopher, living in a world of higher and more theoretical ideas; whereas I was a novelist, professionally up to my neck in the scruffy, everyday details of life. But it was more than this. As he explained to me, he had come to distrust memory as a guide to the past. By itself, unsubstantiated, uncorroborated memory was in his view no better than an act of the imagination. (James Joyce put it the other way round, 'Imagination is memory' – which is much more dubious.)

My brother gave an example. In 1976 he had gone to a philosophical conference on Stoic logic held at Chantilly, north of Paris, organised by Jacques Brunschwig, whom he had never met before. He took a train from Boulogne, and clearly remembered missing his stop, and having to take a taxi back up the line and arriving late as a consequence. He and Brunschwig became close friends, and 30 years later they were having dinner in Paris and reminiscing about how they first met. Brunschwig remembered how he had waited on the platform at Chantilly and immediately recognised my brother as soon as he stepped down from the train. They stared at one another in disbelief (and perhaps had to apply some Stoic logic to their quandary).

That book came out 17 years ago. And in the meantime, I have come round to my brother's point of view. I now agree that memory, a single person's memory, uncorroborated and unsubstantiated by other evidence, is a feeble guide to the past. I think, more strongly than I used to, that we constantly reinvent our lives, retelling them – usually – to our own advantage. I believe that the operation of memory is closer to an act of the imagination than it is to the clean and reliably detailed recuperation of an event in our past. I think that sometimes we remember as true things that never even happened in the first place; that we may grossly embellish an original incident out of all recognition; that we may cannibalise someone else's memory, and change not just the endings of the stories of our lives, but also their middles and beginnings. I think that memory, over time, changes, and, indeed, changes our mind.

That's what I believe at the moment, anyway. Though in a few years, perhaps I will have changed my mind about it all over again.

This is an edited extract from Changing My Mind by Julian Barnes, published by Notting Hill Editions on 18 March (£8.99).