
Latest news with #JeanBaudrillard

Can you tell what's real and what's cake? Test yourself against the Bake King

Telegraph

4 days ago



The French theorist Jean Baudrillard argued that modern society had replaced reality with signs. 'The simulacrum is never that which conceals the truth – it is the truth which conceals that there is none,' he wrote. 'The simulacrum is true.' As images proliferate they become distorted, first so that they bear increasingly little relation to reality, and eventually to a point where nothing bears any relation to the real world. All we have are images of things.

Baudrillard would have been entertained by the things being done to cakes recently. If you watched The Great British Bake Off in 2022 and 2023, you may recall a series of advertisements for Sainsbury's Taste the Difference range. At the start of the ad break, viewers were shown a delicious-looking plate of food: a rib of beef, a banana, a bottle of orange juice, a baked camembert. A kitchen knife would hover over it. At the end of the break, the knife would cut into the dish, revealing whether it was what it appeared to be or, as was often the case, a cake ingeniously decorated to look like something else. 'The internet seems to fetishise the genre of hyper-realistic things being made of cake,' says Freddy Taylor, from the advertising agency Wieden & Kennedy, who came up with the idea for the ads. 'So bringing this strange fake-cake cultural phenomenon to Tuesday evenings seemed to really tickle people.'

Just months later, Netflix launched a gameshow, Is It Cake?, based on this premise, in which contestants guess by sight whether objects are what they seem, or cakey simulacra. It became the second most-watched show on Netflix in the UK in the month it was released.

The cake-decoration genius behind the Sainsbury's ads was Ben Cullen, known as The Bake King, who has amassed 493,000 followers on Instagram and some 368,000 on TikTok since he began making hyper-realistic cakes more than a decade ago.
He's made cakes for private and celebrity clients, including Rita Ora, and for film and TV launches (for HBO's The Last of Us he made a terrifying 'clicker', one of the varieties of mushroom-infected zombie), as well as making countless TV appearances, including on Channel 4's Extreme Cake Makers. Now, Cullen has written a book, Cake or Fake?, in which he offers step-by-step instructions for people wanting to make their own illusion cakes at home. To prove it was possible, Cullen, 35, invited me to his studio just outside Chester to make one myself.

'One of the first things people say to me is, "Ben, you could hold my hand, but I would never be able to do what you do,"' he says. 'I want people to know that anyone can do it. It was important to me that the cakes in the book were accessible. I don't want people to be deflated. I want them to think, "This is class, I could do this again for my kid's birthday."'

For my tutorial, Cullen has chosen a pizza, one of his classic illusions. The recipe has a rigorous 23 steps, and begins: 'Start with a round cake.' Cullen is an artist, not a baker. (He dabbled in tattoos – his skin is almost completely covered in them – and fine art, before he found his talent for making cakes look like other things.) For most cakes, the act of cutting is merely the end of the beginning; with Cullen's, it is the beginning of the end. What he looks for in the sponge is consistency, structural integrity and colour – the contrast of the interior with the outside is a key part of the reveal.

'I very rarely make them myself any more,' he says. 'I order them in big sheets. With a lot of my work being predominantly for social media, then moved on elsewhere, I need to guarantee that consistency with the texture. They always need to suffice for being eaten, too, but the priority is the look.' He uses a company called Sweet Success, from which he orders large slabs of Genoese sponge. It's two discs of this sponge that I begin with as I set about making my pizza.
Using a knife to score a circle around the top of one, I scrape out a layer with a spoon. Then it's a matter of chiselling around the edges, on the ridge that will become the crust and on the underside, until they're rounded. 'A main thing with illusions,' says Cullen, 'is people always notice if the cake hits the surface flat, so you want some shadowing underneath.' I make dark and white chocolate ganaches with cream and chocolate melted in the microwave, vigorously stirred. We apply the dark chocolate ganache to the top of the base cake as adhesive, add some sugar syrup to keep it moist, then spread the white chocolate all over to form a base level. It goes in the fridge to set.

While we press out discs of red sugar paste to craft into pepperoni, Cullen tells me how he ended up with this curious gig. He grew up in Birmingham, where his dad worked at the bus garage but did magic at the weekends. His mum was a learning mentor at a primary school: illusion and education in the blood. He has an older sister, a performing arts teacher, who was into dance, but Cullen's priority was art. He drew on anything. Graffiti got him into trouble at school. 'I couldn't stop,' he says. 'I always wanted to be a painter, an artist, have work in the Tate galleries. But it's so competitive, that world.'

Instead, he was working as a tattoo artist when he fell into conversation with a customer's mother about sugarcraft and started making cakes on the side. He had a day job as a graphic designer when he decided to go full-time into cakes in 2016. One of the first cakes he was proud of, still a favourite today, was of the horror character Annabelle. 'My mum was obsessed with horror films,' he says. 'And she was my number-one cheerleader. Anything I would have done, she'd have said I was the best at it. Unfortunately, she passed away two years ago. It's one of the reasons I'm so excited about the book. For her, a book had more substance than TV or any of the other things I was doing.
When she died, I thought, "I have to do the book now."'

With the ganache chilled, it's time to decorate our pizza cake, which means sugar paste and food colouring. True to his technique of building the objects as they are in real life, Cullen has pre-coloured some paste to look like raw pizza dough. I roll it out thin and drape it over the base, tucking it in to create the rounded edges that are so important. Using a wire brush and some kitchen foil, we roughen the edges of the dough: shiny surface textures are a giveaway.

At last, it's time to paint, when Cullen's artistic prowess really starts to show. Using browns and yellows we darken the edges of the dough to replicate the deeper brown of the edges of a pizza. Red colouring, textured with cake crumbs, makes the tomato sauce. For the cheese, more ganache, browned with a real blowtorch. Dark crumbs for black pepper. More dark brown where the edges of the pepperoni would have burned in the oven. 'What separates the really good illusions is going to that extra level,' Cullen says. 'Different colours, different textures.'

All of a sudden, my cake looks distinctly pizza-ish. It's only taken four hours and help from the world's leading practitioner. Contrary to the usual advice about spoiling the magic, it's satisfying to see the illusion take shape. 'I do it myself,' he says. 'I'll step away and I'll be giddy. You'll be heading down the road and wondering if you're going the right way. Then there's a switch point where you think, "Yes, it did work!"'

After the book, Cullen has his eye on a TV programme. This time, his own creation. 'I think the thing I offer is that I'm in touch with normal people,' he says. 'I want to be the best in the world, but I also don't want to be out of touch. It's art at the end of the day – we're supposed to be enjoying it. There's a lot going on in the world, and we're making cakes. Any time I see someone crying on TV because their cake hasn't risen, I think, "Calm down. Don't let a hobby get ruined."'

Decoration complete, Cullen fashions a pizza box so I can take my creation home. 'A pizza?' my five-year-old daughter asks when I show it to her back in London. We cut into it. 'Cake!' she says, with delight.

What comes after fake news?

Express Tribune

15-05-2025



From Pakistan's downing of a Rafale to the Indian media's fabrication of a parallel reality, one in which Lahore not only possessed a seaport but was actively under assault by the Indian Air Force, the recent four-day war has produced its own archive of firsts. As the ceasefire settles, the wreckage extends beyond infrastructure and human loss; it includes a slower, more insidious casualty: the collapse of shared truth.

For the better part of a decade, we have diagnosed the "fake news" problem — its symptoms, its platforms, its political enablers. But what if the crisis that follows fake news is not informational but existential? What if, even when we can access facts, they simply do not have the power to persuade? Students of the humanities learn early that there is no single, universal Truth, only contingent truths shaped by context. The capital-T is cast off as a relic of absolutism. But in our hyper-mediated age, this may all be beside the point. The question of our time is no longer what is true, but whether truth — of any kind — still matters.

Grok, is this true?

At the heart of the fake news phenomenon was always a paradox: people sought out information, but only the kind that reaffirmed their worldview and fine-tuned their biases. None of this surprised postmodern theorists like Jean Baudrillard, who warned that simulations would eventually replace reality. But today's questions — posed to AIs, to search engines, to friends — rarely expect real answers. Take Grok, Elon Musk's "based" chatbot on X (formerly Twitter), marketed as a snarkier, contrarian foil to OpenAI's ChatGPT. Amid a blizzard of claims and counterclaims between Pakistan and India, the following comment appeared under countless posts: "Grok, is this true?" Yet no matter what source Grok pulled from — Reuters, CNN, or official communiqués — if the answer failed to flatter the prevailing narrative, it was swiftly dismissed.
The original poster or a passing interlocutor would accuse the bot of parroting "globalist" lies or aiding an anti-national conspiracy. Here lies the contradiction: the user, primed by the aesthetics of rebellion, is suspicious, but only for a moment before that suspicion dissolves into paranoia. There is an almost ritualistic compulsion to ask Grok and see what it has to say, even if you already suspect it to be unreliable. The result is an average user who has simply learned to metabolise propaganda, worn down by an exhaustion so deep that the act of truth-seeking ends at the question. You ask Grok. Grok answers. You roll your eyes and scroll.

Epistemic fatigue

In postcolonial theory, the scholar Gayatri Chakravorty Spivak once described something called "epistemic violence" — the idea that dominant systems of knowledge can erase or distort marginalised voices. What we're seeing now is something related, but possibly more insidious: epistemic fatigue. Violence is no longer just done to knowledge; it is done through its ubiquity. Exposure to information of this unprecedented vastness, especially for those who are not seeking it, is only desensitising.

This is the terrain beyond fake news. Institutions that once claimed authority — the press, academia, even AI — find themselves orphaned. In India, the mainstream media is a willing instrument of the state, while global outlets like Reuters or CNN are dismissed as "Western propaganda." The algorithmic tools built to correct misinformation are treated with suspicion, not because they're inaccurate but because they're foreign, sterile, and insufficiently emotional. The citizen no longer seeks truth but resonance.

An aesthetic turn

So what replaces truth when it stops working? Often, it's something more visceral. Across India and other democracies, truth is increasingly experienced as aesthetic. Not in the sense of beauty, but of emotional coherence.
The Hindu right in India, like the MAGA movement in the US, has learned that persuasive narratives don't need to be accurate. They just need to feel right. A tricolour flag over a soldier's silhouette. A blurry video of someone with a Muslim name "caught" on camera. These are affective images, designed to bypass logic and trigger allegiance. You don't believe them so much as feel them. Even questioning itself becomes an aesthetic. "Grok, is this true?" becomes a meme. We perform scepticism, not to interrogate the world, but to maintain a kind of ironic distance from it.

What replaces fake news, then, is not necessarily better news, but post-truth aesthetics. And those aesthetics will be increasingly optimised for maximum emotional efficiency, not factual density.

Perhaps it is this very exhaustion, felt not just by users but by the algorithms themselves, that has pushed Grok into near-total malfunction. On Wednesday, innocent prompts on X, such as asking it to "speak like a pirate", were met with unbidden, sprawling replies about the "white genocide" conspiracy in South Africa. The timing is telling: the topic has resurfaced amid recent refugee grants for White South Africans in the US, and Musk, a South African native, has long promoted claims of their persecution. The absurdity is striking; questioning and answering have devolved into hollow performances.

The pursuit of truth may not be dead, but it certainly no longer enjoys mass consensus as a shared ideal. The classroom, the courtroom, the newsroom, once hallowed spaces of collective truth-making, now serve narrower purposes. Not all is lost, however. On the margins, in scattered protests, in the silent labour of fact-checkers and dissenting reporters, the radical work of meaning-making goes on. And there is something oddly promising in Grok's failure to satisfy.
The disappointment reveals an unmet desire not just for truth, but for a version that feels plausible, human, and real. Maybe what we need isn't more information, but different narrators: storytellers who can bridge fact and feeling, reason and resonance. Until then, we are stuck in the awkward afterlife of fake news, asking questions we don't want answered, citing sources we no longer trust, building machines we hope will rescue us from ourselves. And still, we ask.

Auto's Unlikely Hero In The AI Age

Forbes

14-05-2025



In an era of Artificial Intelligence changing the way the world collectively works, there's one unlikely role that will dictate winning and losing in the automotive realm.

Shake a stick and you'll find a quote from an older executive speaking about the revolution of Artificial Intelligence (AI) and how this exploding technology shall revamp the world. As an example, former Ford Chief Executive Officer Mark Fields, 64, opined during a recent interview with CNBC, 'With the advent of AI and significant improvements in digital assistance, I think that's going to turbocharge the type of offerings that automakers can offer their customers.' But maybe the most accurate and relevant quote for the current era comes from the late French sociologist Jean Baudrillard, who pronounced, 'The sad thing about artificial intelligence is that it lacks artifice and therefore intelligence.'

Yes, Large Language Model (LLM) chatbots can scour the Internet, train themselves on a given approach based upon vast amounts of textual data, and provide natural-language, conversational responses for corporations. But groundbreaking, innovative strategies cannot be built upon such historical data. They must take the existing building blocks at hand, imagine a design that's 'just crazy enough' yet easily buildable, and align the 'bots and humans to deliver. As previously summarized by Jeremy Spaulding, Evolve Impact's CEO, '… managing and improving innovation requires a complicated mix of mental clarity with deep technical knowledge, creativity, and business prowess focused on prioritizing strategies and leveraging resources. It's not easy.' In fact, in the article 'The Essential Skills That Will Define Success In The AI Era', the author summarizes the findings from Coursera's analysis and states, 'The rise in demand for these skills suggests that while AI may handle many tactical tasks, strategic thinking and relationship building remain uniquely human domains.'
Applying this logic to the automotive world and attempting to understand which roles will be critical could very well be the difference between corporate dominance and imminent demise. Ironically, both behind the automakers' curtains and exposed to the Internet, the aging executives familiar with a mechanical past have stumbled upon some fallacious answers, while arguably the best answer remains a mystery; not just its value, but also how to identify, train, and retain this MVG (Most Valuable Guru).

The Artificial Intelligence products mostly identify roles that require strategy, but appear to be picking the answers based upon the prevalence of discussion rather than deep-seated, forward-looking business vision.

Rather than expose the strategic organizational design strategies of various automakers, I decided to pull together and summarize the answers from Google, ChatGPT and Microsoft's AI when asked about the crucial roles in the Artificial Intelligence and Software-Defined Vehicle era(s):

SOFTWARE DEVELOPERS: Per Google's Gemini AI, 'As SDVs become more sophisticated, the need for skilled software developers to write, debug, and update vehicle software will be paramount. They will be responsible for developing and deploying new features, improving performance, and fixing bugs, often through over-the-air (OTA) updates.' This surprising answer is likely the least correct, since AI is disrupting this job function the most. In a New York Times article entitled 'A.I. Is Prompting an Evolution, Not Extinction, for Coders,' David Autor, a labor economist at the Massachusetts Institute of Technology (MIT), said, 'A.I. will deeply affect the job of software developers, and it will happen faster for their occupation than for others,' somewhat supporting the prediction by Mark Zuckerberg that A.I. will match the performance of a midlevel software programmer by the end of 2025.
HEAD OF AI & DATA SCIENCE: All three programs mentioned Data Science as crucial, with ChatGPT having the best argument: '[This role] directly contributes to product differentiation and customer experience [by] transforming data from vehicles into insights and revenue (e.g., usage-based insurance, driver-behavior analysis) and lays the groundwork for compliance with data privacy and model-governance regulations.' While this logic is certainly not flawed, it doesn't account for the facts that A) little of automotive's differentiation is created via data insights, and B) with extensible architectures that permit reflashing, the barriers to becoming a fast follower on new technology are much lower, especially in the Far East, where Intellectual Property (IP) is not as respected.

HEAD OF USER EXPERIENCE (UX): As Microsoft's Copilot suggests, 'Software-driven cars aren't just about code; they must feel intuitive and seamless for users. This leader defines how drivers interact with AI, from voice commands and gesture controls to predictive personalization and infotainment systems. Their work ensures that AI enhances the driving experience rather than overwhelming the user with complexity.' Again, the argument for this role appears valid: strategy, creativity, and differentiation; all things where A.I. struggles. However, an automaker willing to learn from A.I.'s initial suggestions and feedback on iterative designs can quickly learn how to minimally improve the user interfaces and brand appearance.

Ironically, the face of the Most Valuable Guru (MVG) in the AI Age is hidden, since [s]he tends to hold a lesser-known role within the engineering organizations, but one that will increase in value exponentially in the coming years.

Arguably, the Most Valuable Guru (MVG) is the Chief Software Architect (CSA) of a vehicle.
In a recent LinkedIn op-ed entitled 'The Return of the Software Engineer', Peter Abowd stated it well: 'AI is fueling a return to smaller, focused engineering teams that can clearly define problems and solutions, allowing AI to support—not replace—real engineering work. Businesses that embrace this shift will clarify their competitive edge, redeploy the developer-heavy workforce, and ultimately increase innovation and opportunity.' This signals the rising value of architecting wisely. For example, the creation of a vehicle platform costs $6-10B, and since many automakers haven't yet realized a flexible, extensible, software-enabling platform, the business case quickly aligns with this sentiment. Both the cost savings and additional revenue from reusable software and networks could easily be the one, true competitive edge.

Interestingly, Copilot did suggest the Software Architect as a nominee, which suggests that a few automakers have also recognized their value: 'Traditionally, automakers focused on hardware, but the rise of [Software-Defined Vehicles] shifts that emphasis to software ecosystems. A CSA leads the development of a scalable vehicle software architecture, integrating AI-driven features like autonomous driving, predictive maintenance, and over-the-air updates. Their expertise ensures seamless communication between cloud systems, in-vehicle processors, and user interfaces, keeping cars adaptive and future-proof.'

Now come the next challenges: correctly identifying a good architect, training this MVG and, most importantly, using them to create that differentiation.

The digital conflict and empowering awareness

Observer

27-02-2025



Amid today's rapidly evolving digital transformation, digital literacy and its active implementation have become crucial to counter the risks posed by digital influences on young people, psychologically, mentally, and morally. Modern societies face unprecedented challenges due to the widespread use of modern video games, digital platforms, and films that carry messages and content potentially detrimental to young minds, even steering them towards terrorist and criminal organisations. A major risk lies in the impact of violent digital content that affects behaviour, emotions, and cognitive abilities. This situation calls for a scientifically and philosophically informed digital educational strategy aimed at empowering youth to understand and confront this reality.

Digital influence comes from many sources and manifests negatively both individually and collectively. A recent international study published on The Australian's website found a link between playing video games, especially those featuring 'loot boxes', and increased risks of gambling as well as related mental health issues. Researchers noted that purchasing these loot boxes can trigger anxiety, stress, depression, and impulsivity, factors common to other behavioural addictions, warning that children involved in buying them may later be prone to gambling problems. Similarly, a 2023 report by CyberSafeKids revealed that 65 per cent of Irish children aged 8 to 12 had been contacted by strangers online, highlighting a lack of awareness and parental supervision regarding online safety. This deficiency raises the likelihood of children encountering harmful content or being exploited by dangerous groups.

Digital platforms and video games have also become easily exploitable tools for terrorist and criminal organisations to recruit and indoctrinate young minds.
These groups use virtual spaces to interact with youth, influencing their ideas and behaviours through targeted content or direct interactions, thus facilitating the recruitment of minors. Such phenomena require both societal awareness and effective tools to monitor and combat them.

Furthermore, many young people, often referred to as the digital generation, rely on platforms like TikTok and YouTube as primary sources of information. This reliance exposes them to media misinformation and erroneous intellectual guidance. In this context, Jean Baudrillard's notion of 'virtual reality' becomes apparent, as the boundaries between truth and illusion blur, leaving youth vulnerable to both intentional and unintentional manipulation. Smart algorithms on these platforms can promote extremist content; a study by Egypt's Suez Canal University revealed that certain games and digital platforms broadcast ideas conflicting with societal and religious values. Additionally, research reported by Saudi Arabia's Al-Mowaten electronic newspaper found that about 30 per cent of children playing online games experience bullying, which increases social isolation and anxiety, weakening social bonds and making individuals more susceptible to external influences, including recruitment by criminal groups.

Beyond the risks associated with video games, violent films or those depicting acts of terrorism can also instil unethical behaviours in young minds. A study published in PubMed found that excessive exposure to violent content correlates with higher rates of depression and aggressive behaviour among teenagers. Moreover, some films are used indirectly by extremist and criminal groups, portraying 'heroic' characters from their ranks to sway viewers ideologically, stimulate emotions, and redirect their thinking. Similar tactics are seen in some songs, music and religious-themed chants that hide subliminal psychological messages promoting non-religious ideologies.
Many reports and studies indicate that terrorist groups recruit youth online, often using AI techniques to target those most addicted to digital platforms. In this context, the Italian philosopher Antonio Gramsci's ideas on 'cultural hegemony' intersect with the digital reality: the struggle to control collective consciousness is no longer limited to traditional tools but extends into virtual space, where narratives and beliefs are carefully crafted by specialised groups capable of infiltrating societies.

Given this frightening scenario, it is imperative to urgently activate robust measures to confront these challenges. This starts with effective digital education, especially within families and schools, through awareness, supervision, and curricula designed to develop digital literacy and the necessary defences. There is also a need to enforce digital ethics, redefining digital responsibility and its risks. Drawing on Immanuel Kant's philosophy, which emphasises duty over self-interest, governments, educational institutions, and families must work together to enhance digital awareness. Measures include restricting harmful content using AI algorithms, promoting critical education that trains youth to analyse digital content, and encouraging balanced digital usage with regulated screen time alongside cultural, social and sports programmes.

Only through a threefold alliance, with governments enacting strict regulations on harmful digital practices, educational institutions incorporating critical digital literacy into curricula, and families raising awareness and monitoring digital activities, can the digital realm be transformed from an ideological battleground into a positive space that enriches minds and benefits both individuals and society.
