
Latest news with #PhilipK.Dick

'Absolutely phenomenal' series with all-star cast leaving Amazon Prime soon

Daily Mirror

25-05-2025

  • Entertainment

Amazon Prime Video is home to a number of hit series and movies, as well as some hidden gems, but one show set to leave the streaming service is Philip K. Dick's Electric Dreams.

Sci-fi fans on Amazon Prime Video have less than a month left to devour the star-studded anthology series. The show, which first aired on Channel 4 in 2017 before making its way to the streaming giant, draws inspiration from ten of the author's short stories.

Its debut episode took cues from his tale The Hood Maker, plunging viewers into an alternate 1970s London rife with mind-reading citizens, societal unrest, and escalating distrust that culminates in riots. Other episodes dive into themes like interstellar travel and the lives of synthetic beings endowed with human-like intelligence and emotions.

The cast list reads like a who's who of television royalty, featuring Richard Madden of Game of Thrones fame, Breaking Bad's Bryan Cranston, Steve Buscemi, Anna Paquin, Terrence Howard, and Holliday Grainger. Cranston not only graced the screen but also put on the producer's hat for the series, working alongside writers such as Jack Thorne, Tony Grisoni, and Dee Rees.

Upon its release, the series was met with critical acclaim, drawing parallels to the cult favourite Black Mirror and earning a solid 72% on Rotten Tomatoes.

One person praised: 'Just recently discovered this series. Too bad each episode was a standalone. Several of these episodes would have made a good complete series on their own.'

Another said: 'This show is absolutely phenomenal! Every episode makes you think and scares you that it could be our future. They're so well filmed, scripted and acted. I want more!!!!! So sad to see it was only one season. Bring us more! Shocked not all 5 star reviews! Each episode is a different short story. They feel like mini movies that are so detailed and intense that when they are over you want more. They really put your "what if" mind to use.'

'Every episode I've seen of this show is just incredible. It makes you think. Sometimes about humanity and sometimes about sci-fi tropes and conventions and how we should be breaking those more often. Electric Dreams is immersive and beautifully made,' a third said.

Back in 2017, executive producer Ronald D. Moore expressed his enthusiasm for the project, saying: 'Philip K. Dick's stories have inspired a lot of us, especially those of us who have loved science fiction since childhood. From Blade Runner, Minority Report, and The Man in the High Castle, there's so much material.'

Moore added: 'So when the opportunity came to work on it, I did jump at it because I knew this is a rare opportunity to do something with one of the master's works. I wasn't familiar with the short stories at all, but I instantly realised we could definitely do a series like this, and the more we talked about the nature of the project – that each show would be individualised as an anthology but have a diversity of viewpoint and give artists an opportunity to bring their own vision – the more exciting the project got.'

Philip K. Dick's Electric Dreams is available to watch on Amazon Prime Video.

AI is everywhere, but especially on TV

Boston Globe

09-05-2025

  • Entertainment

The portrayals can be violently and appropriately ambivalent.

Which brings us back to 'Murderbot.' Played by Alexander Skarsgård, Murderbot has hacked itself so that it is no longer required to follow human orders. But damned if it doesn't keep caring about their well-being anyway. It also finds itself acting and thinking a little less machine-like. 'I felt like a balloon floating above myself, filled with agony,' it muses after taking a beating from a less benign and more advanced Security Unit. That's deep, Murderbot.

Such notes of humanity are what give AI stories their moral and philosophical tension, especially when the idea of mortality enters the picture. The replicants of 'Blade Runner' (1982, based on Philip K. Dick's novel 'Do Androids Dream of Electric Sheep?') are dangerous, but only because they've learned that they have an expiration date and have decided they're not going out like that. Same with HAL 9000, the rogue mainframe from '2001: A Space Odyssey' (1968) that goes homicidal only when, through an inspired bit of lip-reading, it realizes its days are numbered.

The current AI panic/promise has elicited any number of reflexive jokes about Skynet, the AI system responsible for the apocalypse in the 'Terminator' franchise. But even those mean machines gave us the here-to-help Terminator of 'Terminator 2: Judgment Day' (1991), played by the face of terror from the first movie, Arnold Schwarzenegger.

This sympathy for the AI devil is a defining trait of the current AI TV spate. Yes, the technology can be scary, and you should trust it at your own risk. But it's the humans you should really be wary of: the people pulling the strings at sprawling corporations, or showing a willingness, even eagerness, to shed some blood in pursuit of profits. AI might be the instrument of such conspiracies, but it doesn't create them.

In the crushingly comical season 7 premiere of 'Black Mirror,' a woman with brain cancer (Rashida Jones) gets a new lease on life through a cutting-edge AI procedure. But the service comes with expensive coverage tiers, and if she doesn't splurge with money she and her husband (Chris O'Dowd) don't really have, she unconsciously spouts ad copy at very awkward moments. AI isn't the real culprit here. Greed is.

'Black Mirror,' for all of its much-discussed technological twists, is ultimately about the same thing as 'Murderbot': what it means to be human. To want to binge TV shows when your annoying clients need you. To wonder what life requires of you, and how much juice you can squeeze out of it before it's your time to shuffle off this mortal coil. Looking at its clients, with their insecurities and wobbly romances and petty grievances, Murderbot can only shake its cybernetic head. And yet, despite its better judgment, it wants to be a part of it all. Of course it does.

AI might provide shortcuts to creativity, but 'Murderbot' still springs from the human imagination (series creators Chris and Paul Weitz, working from novellas by Martha Wells) and explores human desires — along with the desire to be human. These are subjects AI isn't ready to tackle on its own. At least not yet.

In suggesting that humanity, for all its flaws, is kind of cool, the series might also be making a plug for good old-fashioned storytelling — the kind you can't get from, say, ChatGPT. Murderbot comes to realize that it, like TV, really needs that human touch.

The real-life risks of predictive policing—and what one city is doing differently

Fast Company

08-05-2025

  • Entertainment

The 2002 sci-fi thriller Minority Report depicts a dystopian future where a specialized police unit is tasked with arresting people for crimes they have not yet committed. Directed by Steven Spielberg and based on a short story by Philip K. Dick, the drama revolves around 'PreCrime'—a system informed by a trio of psychics, or 'precogs,' who anticipate future homicides, allowing police officers to intervene and prevent would-be assailants from claiming their targets' lives.

The film probes at hefty ethical questions: How can someone be guilty of a crime they haven't yet committed? And what happens when the system gets it wrong?

While there is no such thing as an all-seeing 'precog,' key components of the future that Minority Report envisions have become reality even faster than its creators imagined. For more than a decade, police departments across the globe have been using data-driven systems geared toward predicting when and where crimes might occur and who might commit them. Far from an abstract or futuristic conceit, predictive policing is a reality, and market analysts are predicting a boom for the technology.

Given the challenges in using predictive machine learning effectively and fairly, predictive policing raises significant ethical concerns. Absent technological fixes on the horizon, there is an approach to addressing these concerns: treat government use of the technology as a matter of democratic accountability.

Troubling history

Predictive policing relies on artificial intelligence and data analytics to anticipate potential criminal activity before it happens. It can involve analyzing large datasets drawn from crime reports, arrest records, and social or geographic information to identify patterns and forecast where crimes might occur or who may be involved.

Law enforcement agencies have used data analytics to track broad trends for many decades. Today's powerful AI technologies, however, take in vast amounts of surveillance and crime report data to provide much finer-grained analysis.
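To make that pattern-to-forecast step concrete, here is a minimal sketch in Python of the simplest place-based approach: tally past incident reports per map grid cell and flag the historically busiest cells. Everything in it (the grid cells, the incidents, the cutoff) is invented for illustration; real systems layer time decay, covariates, and machine-learned models on top of this idea. The sketch also makes the fairness worry visible, since the 'forecast' can only echo whatever the historical reports already contain.

    from collections import Counter

    # Toy place-based "hot spot" forecast: rank map grid cells by how many
    # past incidents were reported there. Purely illustrative; all data below
    # is invented, and deployed systems are far more elaborate.
    incident_reports = [
        # (grid_cell, offense) pairs from hypothetical crime reports
        ("cell_12", "burglary"),
        ("cell_07", "assault"),
        ("cell_12", "theft"),
        ("cell_03", "theft"),
        ("cell_12", "burglary"),
        ("cell_07", "burglary"),
    ]

    counts = Counter(cell for cell, _ in incident_reports)

    # The "forecast" is just the top-k historically busiest cells. Note the
    # feedback loop: the cells most heavily reported on in the past are the
    # ones flagged for more attention, and thus more reports, in the future.
    hot_spots = counts.most_common(2)
    print(hot_spots)  # [('cell_12', 3), ('cell_07', 2)]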

What Fiction Can Predict, and What It Can't

Atlantic

25-04-2025

  • Politics

It happened to an electrical engineer from New Hampshire, a medical researcher at Harvard University, and an aging auntie from Seattle—all of them permanent residents. They were each returning home to the United States from an ordinary trip abroad when they were pulled aside by immigration agents, subjected to a lengthy interrogation, and then taken into custody and transferred to a detention facility miles away from home. Now they face an enormous, crushing bureaucracy that uses minor or long-forgotten infractions to keep them under indefinite detention.

This type of encounter is not new, but it is headline news in 2025. It also happens to be how my dystopian novel, The Dream Hotel, opens. Set in a future of total technological surveillance, the book follows an American archivist who is detained at Los Angeles International Airport because an algorithm has used her dreams and behavior to predict that she will commit a crime. One review called it a 'Trump-Era Update' on Philip K. Dick's The Minority Report. Another credited its 'eerie sense of prescience.'

When I was on tour for the book last month, someone asked if I'd known that the twice-impeached president and convicted felon would return to power. I hadn't. I started working on The Dream Hotel in 2014, during Barack Obama's administration, and wrote the bulk of it during Joe Biden's term in office. I had no idea Donald Trump would run for president in 2016, and after he lost in 2020, I didn't expect he'd be reelected. I was thinking instead about the ever-more-invasive forms of data collection that Big Tech had unleashed. I wondered if, one day, one of their devices might target the subconscious. The novel takes U.S. systems of surveillance and incarceration that have been deployed at the southern border or on foreign soil—and applies them to Americans.

In writing about this potential future, I found inspiration in history. Surveillance has always been a part of the human experience, because it's one of the mechanisms that enables power to be exercised and enforced in society. 'No creature is hidden from His sight,' the Bible says, 'but all are naked and exposed to the eyes of Him to whom we must give account.' The Quran warns, 'God is all-knowing.'

Omniscience is not confined to the realm of religious belief. Authoritarian systems share in the idea that, even if you're hidden behind the walls of your own home, someone might find out that you said the wrong thing or read the wrong books or met with the wrong people, and punish you for your transgressions. During the Cold War, East Germany's government employed a sprawling network of informants whom it equipped with state-of-the-art technology in order to spy on the population. On a visit to Berlin's Stasi Museum in 2023, I was struck by the range of everyday objects that could be used to conceal miniature cameras—a checkered tie, a jacket button, a watering can. The secret police even endeavored to create an archive of scents, by inducing suspects to touch yellow cloths and saving these in hermetic glass bottles. The Communist Party used this elaborate surveillance system to consolidate its power and crush political dissent for 40 years.

The United States has a long history of surveillance as well. The FBI famously spied on civil-rights activists, Black Panthers, feminists, Vietnam War protesters, and other leftist groups through programs such as COINTELPRO, which used wiretapping and mail interception to keep tabs on people it considered 'subversive.' This gave the Bureau access to information it could then use to disrupt their activities or sow division among them. The program cast a wide net. Martin Luther King Jr., Malcolm X, and Angela Davis were surveilled, as were Bobby Seale, Tom Hayden, and Jane Fonda. In addition to mechanical data collection, the agency also relied on information collected by informants and undercover officers.

For all its power to harm, though, surveillance can also take forms that almost everyone would agree are benign, or even beneficial. For example, medical doctors have a range of tools at their disposal to track patients' heart rates, brain waves, or blood-glucose levels. The Federal Aviation Administration routinely conducts random drug and alcohol testing of its pilots and crews to ensure that they can fly safely. We watch young children when they play on the monkey bars, and keep a close eye on elders when they grow too frail or incapacitated to care for themselves.

Big Tech's insidious hold on our lives comes from the fact that it combines both ends of this surveillance spectrum. Our devices deliver services that are highly protective (receiving a text alert each time a financial transaction affects a bank account, for example) as well as potentially abusive (making our political speech or our geographic movements available to, say, a police officer or an immigration agent). Technology companies are careful to present the equation as balanced, with convenience and connection on one side and collection of granular information on the other, so it is much harder for users to simply stop using their devices.

In the early years of the internet, many people thought their data would be used only for targeted advertising. By 2014, when I began working on my novel, the unholy alliance between Big Tech and the government was becoming apparent. Edward Snowden had revealed the existence of PRISM, a mass-surveillance program that the National Security Agency operated in partnership with tech companies such as Apple, Facebook, Microsoft, and Google. PRISM was authorized under the PATRIOT Act, and although officials maintained that its targets were foreigners, the communications of Americans were routinely collected as well.

A friend of mine, an avowed liberal, shrugged it off; he had nothing to hide, he said, and he trusted that then-President Obama would do the right thing. But even if you conceded that Obama could be trusted with the data—which I didn't—what would happen if this surveillance apparatus were run by someone else? The Snowden disclosures led to a monthslong national debate about privacy, but that eventually died down, and the program continued to operate. Still, its potential for abuse stayed with me.

I grew up in Morocco in the 1970s and '80s, a period of state repression, kidnappings, and disappearances that came to be known as the Years of Lead, so I knew well what could happen when a government set its sights on an individual it found suspect or troublesome. A popular joke at the time went something like this: The CIA, the FBI, and the Moroccan police enter into a friendly contest. The Secretary of the United Nations releases a rabbit into the woods and asks them to catch it. The FBI places informants in the forest and, when it can't find the rabbit, concludes that it was never there. The CIA hits the forest with heavy artillery, then announces that the rabbit is dead. The Moroccan police go in and bring out a fox with two black eyes. 'Okay, okay,' the fox says. 'I am a rabbit.'

Growing up under state control made me hypersensitive, decades later, to the dangers of technological surveillance. Tech companies have access to an ever-growing and highly detailed archive of our lives: our texts and emails, our pictures, our habits and movements, our cultural tastes and political opinions. In The Dream Hotel, I wanted to explore a world where privacy as we know it has ceased to exist, and Big Tech's alliance with the government has led to indefinite detention for pre-crime.

Since the novel came out, friends have been sending me stories in the news. Like a Guardian report about how the U.K. government commissioned the development of a homicide-prediction algorithm. Or a CNN piece about how the State Department considers the 'expected beliefs, statements, or associations' of Mahmoud Khalil, the Columbia graduate and green-card holder currently being held in a Louisiana detention center, to be sufficient reason for his deportation. Or a Rolling Stone article about how the Trump administration might pursue denaturalizing American citizens and sending them to El Salvador.

Then there is the New York Times story about how Elon Musk is leading efforts to create a giant government database that merges information from all existing federal records. Under this scheme, the personal, legal, financial, housing, educational, and employment information of every American would be centralized. (In my novel, this is called the OmniCloud.)

I thought I was writing about a time 20 or 30 years into the future. I didn't foresee that in 2025, an unelected billionaire would have his underlings enter federal agencies over staffers' objections and—according to an official whistleblower report—just copy the private data of millions of citizens. Nor did I imagine that the acting director of ICE would bluntly state his vision of a deportation force that operates 'like [Amazon] Prime, but with human beings.'

But the point of a speculative novel isn't to see what a writer got right or wrong about the future. A speculative novel isn't even about the future, exactly, but about an alternative world in which our anxieties about the present moment are on full display. What if we faced a society-altering epidemic? (The Plague, Blindness.) What if the planet warmed? (Parable of the Sower.) What if we could clone ourselves? (Never Let Me Go.) What if some words and ideas were forbidden? (The Memory Police.) What if the government outlawed books? (Fahrenheit 451.)

We don't put firefighters in charge of burning books—at least not yet—but Ray Bradbury gave us language to speak about the freedom to read and showed us how to notice threats to it. My hope is that readers will open themselves to the emotional experience of The Dream Hotel. And yes, maybe they will also think about the data they so easily and so frequently relinquish.
