Latest news with #ShoshanaZuboff


Hindustan Times
14-05-2025
- Entertainment
Review: The Dream Hotel by Laila Lalami
Big Brother was just the beginning. When surveillance shakes hands with capitalism, as Shoshana Zuboff points out in her 2018 book on the subject, the system claims human experience 'as free raw material for translation into behavioural data'. This data is then used for everything from control to profit. Sara, the protagonist of The Dream Hotel, Laila Lalami's new novel set in the near future, experiences the consequences of such a system firsthand. Entire generations, she acknowledges, have been watched and recorded 'from the womb to the grave'. The result: they accept 'corporate ownership of their personal data to be a fact of life, as natural as leaves growing on trees'.

The Dream Hotel is set in a United States where pervasive surveillance is the norm. Citizens are constantly monitored through facial recognition, biometrics, public cameras, social media, and other forms of digital scrutiny. The latest among these is the so-called DreamSaver, an implant to help you sleep better, but which also records your dreams. Armed with these sources, the government's crime-prediction algorithm assigns risk scores to individuals. Those who exceed a certain threshold are deemed potential future threats and held in 'retention centres' until their scores drop to acceptable levels.

The novel's premise, then, echoes the one found in Philip K Dick's 1956 science fiction novella Minority Report, filtered through what could well be an episode of Charlie Brooker's Black Mirror. It also brings to mind Ismail Kadare's 1981 novel The Palace of Dreams, in which the functionaries of an autocratic state transcribe the dreams of compliant subjects in the belief that 'all that is murky and harmful, or that will become so in a few years or centuries, makes its first appearance in men's dreams'. In this future scenario, some things have stayed the same.
As Sara thinks, 'It was impossible to be on social media these days without encountering trolls, bots, cyborgs, scammers, sock puppets, reply guys, or conspiracy theorists -- people who were best avoided, ignored, or blocked.' No surprises there.

A 38-year-old museum archivist and historian of post-colonial Africa, Sara is flagged by the all-seeing algorithm upon returning to Los Angeles from an overseas museum trip. Her mind races through the possibilities: had she run a red light, neglected to pay for a parking violation, or left a grocery store without scanning all her items? Had her phone pinged near a political protest or some kind of public disturbance? All of these would have been 'recorded on smartphones, documented in screenshots, or watched from hidden security cameras, then stored in online databases'. She is transported to a nearby town and held in the precincts of a public school that has been converted into a holding centre. Unhelpful officials and bureaucratic errors crush her hopes of a quick release, and she remains stuck in detention indefinitely.

The novel swirls around Sara's suffocating present: a period of time filled with her dreams, memories of the past with her husband and their twins, and relationships with fellow detainees and supervisors. She and the other inmates are given a series of monotonous tasks and have to deal with issues of privacy, food, and hygiene. In an overt nod to fictional parallels, Sara also checks out books by Octavia Butler, Kafka, and Borges from the centre's library.

The narrative also includes news articles, audio transcripts, medical reports, emails, and other documents, which are set apart from the main text rather than integrated into it. These insertions not only shed light on the mechanics of the story's futuristic world but also reveal how bureaucratic glitches impact the lives of the incarcerated. The Dream Hotel doesn't progress in the manner of most dystopian fiction.
Gone are the usual tropes of swelling rebellions or dramatic clashes with authority; instead, the conflict stays confined to Sara's immediate circumstances. Much of the novel's middle section unfolds quietly – perhaps too quietly – dwelling on the holding centre's monotonous rhythms and the women trapped within them.

Lalami's observations about corporate control over this surveillance society are sharp and spot-on. When Sara learns of plans to use the DreamSaver for more than just surveillance, she realises: 'The only way to increase profit continually is to extract more from the same resources.' At another point, she rues 'the parasitic logic' which has made its way so deeply into the collective mind that 'to defy lucre is to mark oneself as a radical, or a criminal, or a lunatic'. In this world of algorithmic policing, Sara has to remind herself that a crime isn't the same as a moral transgression. 'It's only that the line of legality has moved, and now I'm on the wrong side of it.' We blame the algorithm for our predicament, she thinks, 'but the algorithm was written by people, not machines'. Freedom, then, with all its complications and risks, 'can only be written in the company of others'.

Lalami's measured approach makes The Dream Hotel a critique that lingers on the mundane horrors of compliance and the erosion of autonomy.

Sanjay Sipahimalani is a Mumbai-based writer and reviewer.


Forbes
30-04-2025
- Business
Addiction Capitalism Is A Global Social Sustainability Crisis For Tech
We are living in an age of addiction capitalism. A strong term, yet it appears to be apt. The revenue of the world's largest social media, gaming, and smartphone companies, sometimes less affectionately known as 'Big Tech', benefits from the creation and exploitation of design features that generate compulsive user behavior. This has profound implications for social sustainability.

What began as a revolution to connect us has brought countless advantages: social media allows us to stay in touch with family and friends, find jobs, and learn about topics we care about. It has also quietly morphed into something with an insidious downside: a system in which maximizing screen time has become synonymous with maximizing profit, with no equivalent measure of social cost. As of 2023, the majority of the world's population owns a smartphone. Addiction capitalism can now be considered a global social sustainability issue.

Addiction capitalism refers to an economic model in which companies design technology products and digital platforms to be as habit-forming as possible. The logic is simple: the more attention or 'engagement' a user gives to a platform or device, the more data the user generates and the stronger the business case for the platform to advertisers and investors.

This kind of thinking is not new: the terms 'surveillance capitalism' and 'limbic capitalism' emerged alongside the meteoric rise of Big Tech. Shoshana Zuboff, who coined the idea in 2014, defines surveillance capitalism as 'the unilateral claiming of private human experience as free raw material for translation into behavioral data'. David T. Courtwright's 2019 book uses the term 'limbic capitalism' to describe a range of products designed to generate habitual consumption connected to the limbic system, the brain's 'pleasure center'.
Addiction capitalism complements these terms and is more specific, spotlighting the economic incentives that lead companies to design technological and digital offerings optimized for compulsive use.

Smartphones are now a central tool of this model. A recent study found that participants in the UK interact with their smartphones every five minutes during waking hours. Many participants in the study were surprised by this finding, convinced of its truth only upon viewing video footage of their behavior. This finding matches a US survey that concluded the average American checks their phone 205 times a day. The same study reports that 80% of Americans check their phones within ten minutes of waking up, almost 50% do not remember the last time they went 24 hours without their phone, and over 40% self-describe as phone 'addicted'. Such statistics are far from unique to the US and UK. A study of 18–26-year-olds in South Africa found that more than 80% of those surveyed self-reported as 'addicted' to their smartphones, checking their devices dozens of times per waking hour. Brazil, France, Norway and Turkey all make the global top ten of countries with the highest share of time spent on smartphones: addiction capitalism is a global phenomenon.

Social sustainability refers to the impact of corporate activity on people. Seen through this lens, the relevance of addiction capitalism to sustainability becomes clear. In his 2024 book The Anxious Generation: How the Great Rewiring of Childhood is Causing an Epidemic of Mental Illness, social psychologist Jonathan Haidt presents a sobering picture: around 2012, a steep rise in anxiety, depression, and self-harm among adolescents began, particularly among girls. This timeline aligns closely with the widespread adoption of smartphones and social media. Haidt points to an 'experience gap' in which young people's time spent online appears to crowd out real-world social interaction, healthy risk-taking, and play.
Research into the exact mechanisms behind the outcomes Haidt describes continues apace. In the meantime, multiple studies connect high social media use to increased loneliness, sleep problems, and attention difficulties. This holds for adults and young people alike, though the impact on children's and teenagers' neurological and social development is particularly significant.

Addiction capitalism doesn't just market to youth – it appears to mold them. Notifications, likes, infinite scrolling, autoplay, and similar features of the user experience are no accident: they are designed to command attention. Algorithms optimize for time spent on platforms, which means there is a strong incentive to show users content that generates strong emotions and more 'engagement'. As a recent study by University College London concludes, social media algorithms 'amplify extreme content', including content that is violent or demeaning to specific groups.

Exposure to addiction capitalism, while global, also skews towards those who are already socially or economically vulnerable. In US families that make less than $50,000 per year, for example, children aged eight and under spend double the amount of time on screens each day compared to children in families with household incomes of $100,000 and up. This pattern appears to be repeated across the globe, raising serious questions about the role of addiction capitalism in perpetuating inequality and social exclusion.

Herein lies the challenge: what does social sustainability mean for a business whose financial success depends on keeping users, including vulnerable users, engaged in compulsive behavior? Addiction is a strong term, and some will contest its use here. Technological innovation inevitably brings change and, indeed, sometimes unfounded moral panic. Addiction capitalism is, nevertheless, the outcome of well-researched, highly strategic choices made by firms.
As smartphones, gaming, and social media proliferate globally, there is a need for an equally global conversation about what social sustainability means for technology companies. Three insights investors, founders, and executives can use to prepare for leadership on this topic:
- Sustainability is not only about carbon and climate commitments. It is about the impact of your business model on people. Any business that drives widespread harm to social systems and mental health, especially among the next generation, is unsustainable.
- Autoplay, infinite scroll, and specific algorithms drive time spent on platforms and devices. These are not neutral features. They are the outcome of deliberate design decisions, and they affect human behavior. Social sustainability means transparency on design, and designing ethically, not just profitably.
- In the future, the impact of technology could be viewed through the same lens we apply to tobacco and alcohol: as a public health issue. If addiction capitalism undermines mental health while amplifying vulnerability and exclusion, scrutiny of the industry will only continue to increase.

Addiction capitalism is not accidental. It is a consequence of corporate choices. The core of social sustainability in technology may be that monetizing attention at any cost is not a sustainable business model. We are early in the journey of understanding what digital wellbeing means across the globe. This is a social sustainability issue that will continue to rise on the agenda.


Los Angeles Times
27-02-2025
Forget thought crime. People are incarcerated for dream crime in this near-future novel
It's overwhelming to think of how carefully tracked we are by private interests at this point in time: what we buy, what we watch, what we search online, what we want to know about other people — and who we know and how well. Shoshana Zuboff's 'The Age of Surveillance Capitalism' describes the perfect storm of extractive profit-seeking and privacy erosion that drives so much of contemporary life. When it comes to today's corporations, she explains, our lives are the product, and the power that's accrued to surveillance capitalism abrogates our basic rights in ways that we have not yet figured out how to fight through collaborative action. Our ability to mobilize, she suggests, 'will define a key battleground upon which the fight for a human future unfolds.'

You can feel the influence of these concerns in Laila Lalami's powerful, richly conceived fifth novel about pre-crime, 'The Dream Hotel' — out March 4. Set in the near future, the book's corporatized reality is slightly more twisted than ours but entirely plausible, a place where private greed has resulted in a disturbing bureaucracy with no true due process.

As the novel opens, Moroccan American mother and archivist Sara Hussein is in Madison, a 120-bed 'retention' center near Los Angeles, run by a private company, where, in the interests of crime prevention, people whose dreams have marked them as high-risk for committing crimes are kept under steady, intrusive observation. According to the powers that be, Sara is being held because she dreamed of killing her husband. And while she refuses to believe this means something bigger, she also worries about all the holes in her knowledge; throughout the novel, Lalami plays out, to stunning effect, the shiftiness and uncertainty of reality when dreams are given more predictive weight than facts. Sara has been inside so long — at the start of the book, 281 days — that communication from her husband has slowed, and she fears that he has started to believe she is guilty.
When a new woman is admitted to the facility, her naive assumptions about how the system works — the result of ignorance that seems at first to mirror our own — counter Sara's experience-driven awareness of its problems.

After having twins, and struggling to get enough sleep, Sara had agreed to surgery that outfitted her with a neuroprosthetic — the private company's promise was that you could feel rested after shorter periods of sleep. But under the principles of surveillance capitalism, its reach has since expanded into people's private, inner lives and become a basis for what amounts to incarceration, though it's not labeled as such. 'Once dreams became a commodity, a new market opened — and markets are designed to grow. Sales must be increased, initiatives developed, channels broadened.' We'll later discover that, in line with surveillance-capitalist impulses, the company is not only watching but also cultivating product placement in dreams.

Here, rendering this edge-of-nightmare world, Lalami skates along at the height of her powers as a writer of intelligent, complex characters. By training, Sara is a historian of postcolonial Africa, and her career has been spent as a digital archivist at the Getty Museum. She maps what she knows of archives onto the operation of algorithms, understanding that the latter work according to search terms provided by a human with limited knowledge, and that, therefore, their method for seeking out pre-crime is profoundly fallible.

The book kicks off with Lalami's clever marketing language for the dream surveillance device: 'You're a good person; if you were in a position to stop disaster, you probably would.' By flattering people's sense of themselves as good, as wanting to stop crimes against women and children — not so different from the curtailment of civil liberties after 9/11, when the risks of terrorism were treated on balance as drastically more significant than preserving individual freedoms — the device has become normalized.
What makes use of the device so insidious is not simply the monitoring, of course, but that trivial actions, and even non-actions, mere thoughts, lead inexorably to nightmarish scenarios. The retention center has procedures that purportedly adhere to due process, but as in Franz Kafka's 'The Trial' or Vladimir Sorokin's 'The Queue,' where bureaucracy stands in the way of getting anywhere, every time it seems like Sara's time in the facility is about to be over, something trivial occurs to push her hearing date back, or to otherwise deny her release.

Unlike those atmospheric novels in which the central authority in the bureaucracy remains inaccessible, Lalami not only renders Sara relatable through mentions of mundane things like hiking with her husband or caring for babies but also builds the perspectives of some of the villains of the piece with nuance.

It's not only the claustrophobia of an enclosed space with strangers or control-seeking authorities but time itself that creates the feeling of dread. Lalami writes, 'Each day resembles the one that came before it, the monotony adding to the women's apprehension and leading them to make decisions that damage their cases.'

The novel takes a fascinating turn, one that calls up Zuboff's insight that we haven't yet developed forms of collaborative action to counter surveillance capitalism, when Sara realizes that she and other retained people do have a tool to fight back, namely the work they do while incarcerated. It's a clever progressive pivot that tamps down the dystopian vibes that support the original premise of the book. At one point, Sara looks at a mural and notices that the laborers depicted are watched by a painted foreman, 'and later by the artist in his studio, and later yet by her, the process transforming them from people into objects.'
But, even in its awareness that subjectivity is stripped away when people are treated as data points, the novel refuses a grim understanding of how people might become damaged in their behavior toward one another while under surveillance (changes to behavior seen in East Berlin, North Korea, the Xinjiang Uygur Autonomous Region, and other places in the world that have fallen to totalitarianism). Rather, as with her other novels, there's a softhearted universalism to Lalami's treatment of surveillance capitalism. Hers is one in which humans retain the ability to trust one another enough to forge working solidarities and authentic collaborations.

Although it relies on a speculative technology for its plot, 'The Dream Hotel' is astounding, elegantly constructed, character-driven fiction. Lalami's realistic approach to Sara and others, inflected with leftist politics and history, elides any sharp division we might imagine between where we've been and what we face ahead. 'Maybe past and present aren't all that different,' Sara thinks at a critical moment. 'The strange thing — the amazing thing, really — is that we've managed to find workarounds to surveillance.' Within the latter part of the novel, it's not the stuff of tragedy or alarm about the human condition we encounter, but surprising, unadulterated hope.

Felicelli is a novelist and critic who served on the board of the National Book Critics Circle from 2021-24.


Yahoo
05-02-2025
- Business
Google Quietly Walks Back Promise Not To Use AI for Weapons or Harm
As whispers of AI hype filled the air in 2018, it seemed almost inevitable that we would soon be facing a whole new world, full of near-human robots and cybernetic dogs. But with that came a host of questions: how would it all change our jobs, how might we protect ourselves from an AI takeover, and more broadly, how could AI be designed for good instead of evil?

Facing those questions and an uncertain future, Google affirmed its commitment to ethical tech development in a statement on its AI principles, including commitments not to use its AI in ways "likely to cause overall harm," like in weapons or surveillance tech. Fast forward seven years, and those commitments have been quietly scrubbed from Google's AI principles page.

The move has drawn a host of criticism over the change's ominous undertones. "Having that removed is erasing the work that so many people in the ethical AI space and the activist space as well had done at Google," former head of Google's ethical AI team Margaret Mitchell told Bloomberg, which broke the story. "More problematically it means Google will probably now work on deploying technology directly that can kill people."

Google isn't the first AI company to retract its commitment not to make killbots. Last summer, OpenAI likewise deleted its pledge not to use AI for "military and warfare," as reported by The Intercept at the time. Though it hasn't announced any Terminator factories — yet — Google said in a statement yesterday that "companies, governments, and organizations... should work together to create AI that protects people, promotes global growth, and supports national security." Read: we can do whatever we want. Deal with it.

And while the company's news is troubling, it's drawing on a long history of dubious profiteering. After all, Google was the first major tech company to recognize the value of surveillance through data.
"Google is to surveillance capitalism what the Ford Motor Company and General Motors were to mass-production-based managerial capitalism," wrote acclaimed tech critic Shoshana Zuboff in 2019. "In our time Google became the pioneer, discoverer, elaborator, experimenter, lead practitioner, role model, and diffusion hub of surveillance capitalism."

As far back as the early 2000s, Google has been exploring the value of personal browsing data — a leering asset sometimes known as "digital exhaust" — which it realized contained predictive information about individual users as they traveled across the web. Soon, pressured by the dot-com collapse and the need to generate revenue, Google leaned into that tech as it built the dominant tracking and advertising apparatus of our time. The revelation that user data could translate into cold hard cash spun off into a host of data-driven products like hyper-targeted ads, predictive algorithms, personal assistants, and smart homes, all of which propelled Google into the market giant it is today.

Now, the past feels like prelude. As tech companies like Google dump untold billions into developing AI, the race is on to generate revenue for impatient investors. It's no wonder that unscrupulous AI profit models are now on the table — after all, they're the supposed new backbone of the company.