Latest news with #MartinHeidegger
Yahoo
09-07-2025
- Science
- Yahoo
AI and art collide in this engineering course that puts human creativity first
Uncommon Courses is an occasional series from The Conversation U.S. highlighting unconventional approaches to teaching.
Art and Generative AI
I see many students viewing artificial intelligence as humanlike simply because it can write essays, do complex math or answer questions. AI can mimic human behavior but lacks meaningful engagement with the world. This disconnect inspired the course, which was shaped by the ideas of the 20th-century German philosopher Martin Heidegger. His work highlights how we are deeply connected to and present in the world. We find meaning through action, care and relationships, and human creativity and mastery come from this intuitive connection with the world. Modern AI, by contrast, simulates intelligence by processing symbols and patterns without understanding or care.
In this course, we reject the illusion that machines fully master everything and put student expression first. In doing so, we value uncertainty, mistakes and imperfection as essential to the creative process.
This vision expands beyond the classroom. In the 2025-26 academic year, the course will include a new community-based learning collaboration with Atlanta's art communities. Local artists will co-teach with me to integrate artistic practice and AI.
The course builds on my 2018 class, Art and Geometry, which I co-taught with local artists. That class explored Picasso's cubism, which depicted reality as fractured from multiple perspectives; it also looked at Einstein's relativity, the idea that time and space are not absolute and distinct but part of the same fabric.
We begin by exploring the first mathematical model of a neuron, the perceptron. Then we study the Hopfield network, which mimics how the brain can recall an entire song from just a few notes by filling in the rest. Next, we look at Hinton's Boltzmann machine, a generative model that can also imagine and create new, similar songs. Finally, we study today's deep neural networks and transformers, AI models that mimic how the brain learns to recognize images, speech or text. Transformers are especially well suited for understanding sentences and conversations, and they power technologies such as ChatGPT.
In addition to AI, we integrate artistic practice into the coursework. This approach broadens students' perspectives on science and engineering through the lens of an artist. The first offering of the course, in spring 2025, was co-taught with Mark Leibert, an artist and professor of the practice at Georgia Tech. His expertise is in art, AI and digital technologies. He taught students the fundamentals of various artistic media, including charcoal drawing and oil painting. Students used these principles to create art with AI ethically and creatively. They critically examined the sources of training data and ensured that their work respects authorship and originality.
Students also learn to record brain activity using electroencephalography, or EEG, headsets. Through AI models, they then learn to transform these neural signals into music, images and storytelling. This work inspired performances where dancers improvised in response to AI-generated music.
AI entered our lives so rapidly that many people don't fully grasp how it works, why it works, when it fails or what its mission is. In creating this course, the aim is to empower students by filling that gap. Whether they are new to AI or not, the goal is to make its inner algorithms clear, approachable and honest. We focus on what these tools actually do and how they can go wrong.
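The article stays at the level of description, but the pattern-completion idea behind the Hopfield network mentioned above is easy to see in code. The following is a minimal, hypothetical NumPy sketch written for this digest, not material from the course: it stores a few binary patterns with a Hebbian rule and then recovers a full pattern from a corrupted cue, the "few notes" completing the remembered song.

```python
import numpy as np

# Toy Hopfield network: store a few binary (+1/-1) patterns, then recover
# a full pattern from a partially corrupted cue (associative memory).

rng = np.random.default_rng(0)
n_units = 64                                       # length of each pattern
patterns = rng.choice([-1, 1], size=(3, n_units))  # three stored "memories"

# Hebbian weights: sum of outer products, with self-connections removed
weights = sum(np.outer(p, p) for p in patterns) / n_units
np.fill_diagonal(weights, 0)

def recall(cue, steps=20):
    """Repeatedly update all units until the state settles near a stored pattern."""
    state = cue.copy()
    for _ in range(steps):
        state = np.where(weights @ state >= 0, 1, -1)
    return state

# Corrupt 20% of the first memory and let the network fill in the rest
cue = patterns[0].copy()
flipped = rng.choice(n_units, size=n_units // 5, replace=False)
cue[flipped] *= -1

recovered = recall(cue)
print("fraction of units recovered correctly:", (recovered == patterns[0]).mean())
```

Storing only a handful of patterns keeps recall reliable; overloading such a network is one simple way to watch it "misremember," in the spirit of the flawed AI states the course deliberately provokes below.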
We place students and their creativity first. We reject the illusion of a perfect machine and instead provoke the AI algorithm to become confused and hallucinate, that is, to generate inaccurate or nonsensical responses. To do so, we deliberately use a small dataset, reduce the model size or limit training. It is in these flawed states of AI that students step in as conscious co-creators. The students are the missing algorithm that takes back control of the creative process. Their creations do not obey the AI but reimagine it by the human hand. The artwork is rescued from automation: students learn to recognize AI's limitations and harness its failures to reclaim creative authorship. The artwork isn't generated by AI; it is reimagined by the students.
Students also learn that chatbot queries have an environmental cost because large AI models use a lot of power. They avoid unnecessary iterations when designing prompts or using AI, which helps reduce carbon emissions.
The course prepares students to think like artists. Through abstraction and imagination, they gain the confidence to tackle the engineering challenges of the 21st century. These include protecting the environment, building resilient cities and improving health. Students also realize that while AI has vast engineering and scientific applications, ethical implementation is crucial. Understanding the type and quality of the training data that AI uses is essential; without it, AI systems risk producing biased or flawed predictions.
This article is republished from The Conversation, a nonprofit, independent news organization bringing you facts and trustworthy analysis to help you make sense of our complex world. It was written by: Francesco Fedele, Georgia Institute of Technology.
Read more:
AI literacy: What it is, what it isn't, who needs it and why it's hard to define
AI is advancing even faster than sci-fi visionaries like Neal Stephenson imagined
AI-powered assistive technologies are changing how we experience and imagine public space
Francesco Fedele does not work for, consult, own shares in or receive funding from any company or organization that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.


Atlantic
10-06-2025
- General
- Atlantic
A Computer Wrote My Mother's Obituary
The funeral director said 'AI' as if it were a normal element of memorial services, like caskets or flowers. Of all places, I had not expected artificial intelligence to follow me into the small, windowless room of the mortuary. But here it was, ready to assist me in the task of making sense of death. It was already Wednesday, and I'd just learned that I had to write an obituary for my mother by Thursday afternoon if I wanted it to run in Sunday's paper. AI could help me do this. The software would compose the notice for me.
As a professional writer, my first thought was that this would be unnecessary, at best. At worst, it would be an outrage. The philosopher Martin Heidegger held that someone's death is a thing that is truly their own. Now I should ask a computer to announce my mother's, by way of a statistical model? 'Did you say AI?' I asked the funeral director, thinking I must have been dissociating. But yes, she did.
As we talked some more, my skepticism faded. The obituary is a specialized form. When a person of note dies, many newspapers will run a piece that was commissioned and produced years in advance: a profile of the deceased. But when a normal person dies—and this applies to most of us—the obituary is something else: not a standard piece of journalistic writing, but a formal notice, composed in brief, that also serves to celebrate the person's life. I had no experience in producing anything like the latter. The option to use AI was welcome news.
After all, there were lots of other things to do. The obituary was one of dozens of details I would have to address on short notice. A family in grief must choose a disposition method for their loved one, and perhaps arrange a viewing. They must plan for services, choose floral arrangements or other accessories, select proper clothing for the deceased, and process a large amount of paperwork. Amid these and other tasks, I found that I was grateful for the possibility of any help at all, even from a computer that cannot know a mother's love or mourn her passing.
The funeral director told me I would be given access to this AI tool in the funeral-planning online account that she had already created for me. I still had a few misgivings. Would I be sullying Mom's memory by doing this? I glanced over at an advertisement for another high-tech service—one that could make lab-grown diamonds from my mother's ashes or her hair. Having an AI write her obituary seemed pretty tame in comparison. 'Show me how to do it,' I said.
Actually getting a computer to do the work proved unexpectedly difficult. Over the next 24 hours, the funeral director and I exchanged the kind of emails you might swap with office tech support while trying to connect to the shared printer. I was able to log in to the funeral portal (the funeral portal!) and click into the obituary section, but no AI option appeared. The funeral director sent over a screenshot of her display. 'It may look slightly different on your end,' she wrote. I sent a screenshot back: 'That interface is not visible to me.' Web-browser compatibility was discussed, then dismissed. The back-and-forth made me realize that Mom's memorial would be no more sullied by AI than it was by the very fact of using this software—a kind of Workday app for death and burial.
In the end, the software failed us. My funeral director couldn't figure out how to give me access to the AI obituary writer, so I had to write one myself, using my brain and fingertips. I did what AI is best at: copying a formula.
I opened up my dad's obituary, which Mom had written a couple of years earlier, and mirrored its format and structure. Dates and locations of birth and death, surviving family, professional life, interests. I was the computer now, entering data into a pre-provided template.
When I finally did get the chance to try the AI obituary writer a few weeks later—after reaching out to Passare, the company behind it—I found its output more creative than mine, and somehow more personal.
Like everything else, the funeral-services industry is now operated by cloud-based software-as-a-service companies. Passare is among them, and offers back-office software for funeral-home management along with family-facing funeral-planning tools. Josh McQueen, the company's vice president of marketing and product, explained why my earlier attempt to use the obituary-writing tool had failed: The funeral home must have had that feature set for staff-only access, which some businesses prefer. Then he gave me access to a mock funeral for the fictional departed John Smith so I could finally give it a go.
I couldn't change John Smith's name, but I pretended I was writing the obituary for my mother instead. Using simple web forms, I put in her education and employment information, some life events that corresponded to her 'passions' and 'achievements,' and a few facts about relevant family members who had survived her or preceded her in death. These had to be entered one by one, choosing the type of relation from a drop-down and then checking a box to indicate whether the person in question was deceased. I felt like I was cataloging livestock.
From there, Passare's software, which is built on top of ChatGPT technology, generated an obituary. And you know what—it was pretty good. Most of all, it was done, and with minimal effort from me. Here's an excerpt, with John Smith's name and pronouns swapped out for my mother's, and a couple of other very small alterations to smooth out the language:
Sheila earned her bachelor's degree and dedicated her career to managing her late husband David's psychology private practice for decades. She was not only devoted to his work but also a dedicated caregiver for Dave in his later years. Throughout her life, Sheila nurtured his passions, which included playing music—especially the piano—and a deep appreciation for Native American art. She found joy in teaching skiing to children and sharing the vibrant personalities of her many pet birds.
The AI obituary can also be tuned by length and tone—formal, casual, poetic, celebratory. (The poetic version added flourishes such as 'she found joy in the gentle keys of her piano, filling her home with music that echoed her spirit.') Because an obituary is already a schematic form of writing, the AI's results were not just satisfactory but excellent, even. And, of course, once the draft was done, I could adjust it as I wished.
'When we first started testing this, ChatGPT would just make up stories,' McQueen told me. It might assert that someone named Billy was often called Skippy, for example, and then concoct an anecdote to explain the fake nickname. This tendency of large language models, sometimes called hallucination, is caused by the technology's complex statistical underpinnings. But Passare found this problem relatively easy to tame by adjusting the prompts it fed to ChatGPT behind the scenes. He said he hasn't heard complaints about the service from any families who have used it.
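The article does not show Passare's actual prompts or pipeline. As a purely hypothetical sketch of the general grounding technique it describes, constraining a chat model to only the facts it is given so it doesn't invent nicknames or anecdotes, something like the following could be used with the OpenAI Python client. Every instruction, field, and model name here is an assumption for illustration, not Passare's implementation; the facts echo the excerpt quoted above.

```python
# Hypothetical sketch only; requires the `openai` package and an OPENAI_API_KEY.
from openai import OpenAI

client = OpenAI()

# Structured facts supplied by the family (mirroring the article's excerpt)
facts = {
    "name": "Sheila",
    "education": "bachelor's degree",
    "career": "managed her late husband David's psychology private practice",
    "interests": ["piano", "Native American art",
                  "teaching skiing to children", "her many pet birds"],
}

# The "adjusted prompt": forbid invention, allow omission
system_prompt = (
    "You write short, formal obituaries. Use ONLY the facts provided by the user. "
    "Do not invent names, nicknames, dates, anecdotes, or relationships. "
    "If a detail is missing, omit it rather than guessing."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model choice
    messages=[
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": f"Facts about the deceased: {facts}. "
                                    "Write a 120-word obituary in a celebratory tone."},
    ],
)

print(response.choices[0].message.content)
```

Tone and length variants like the ones the article mentions would simply swap the wording of the final user instruction.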
Obituaries do seem well suited for an AI's help. They're short and easy to review for accuracy. They're supposed to convey real human emotion and character, but in a format that is buttoned-up and professional, for a public audience rather than a private one. Like cover letters or wedding toasts, they represent an important and uncommon form of writing that in many cases must be done by someone who isn't used to writing, yet who will care enough to polish up the finished product. An AI tool can make that effort easier and better.
And for me, at least, the tool's inhumanity was also, in its way, a boon. My experience with the elder-care and death industries—assisted living, hospice, funeral homes—had already done a fair amount to alienate me from the token empathy of human beings. As Mom declined and I navigated her care and then her death, industry professionals were always offering me emotional support. They shared kind words in quiet rooms that sometimes had flowers on a table and refreshments. They truly wanted to help, but they were strangers, and I didn't need their intimacy. I was only seeking guidance on logistics: How does all this work? What am I supposed to do? What choices must I make?
A person should not pretend to be a friend, and a computer should not pretend to be a person. In the narrow context of my mom's obituary, the AI provided me with middle ground. It neither feigned connection nor replaced my human agency. It only helped—and it did so at a time when a little help was all I really wanted.


Observer
09-04-2025
- Politics
- Observer
A philosophical analysis of global policies
Our modern world is enduring a succession of crises reshaping human life on political, social, economic, and philosophical levels. These crises prompt a reevaluation of current systems, demanding a restructuring capable of confronting an increasingly complex and uncertain future. The present era, characterised as the "Age of Consecutive Crises", encompasses health disasters like Covid-19, geopolitical conflicts, environmental threats, and technological disruptions. Martin Heidegger's concept of existential anxiety reflects this crisis-laden reality, where collective anxiety drives urgent responses to issues that have transcended theoretical debates to demand immediate solutions.
Philosophy plays a critical role in addressing the fundamental questions raised by these crises: What is the nature of human existence amidst instability? What are the ethical limits and collective responsibilities when crises arise from human actions themselves? The intensifying crises reveal structural deficiencies within the global system, particularly its ethical framework and strategic planning capabilities.
Philosophically, the unknown is not mere emptiness but rather the result of unresolved issues, such as the absence of justice, economic inequality, and the disproportionate balance of power worldwide. When one dominant force imposes its oppressive weight over other nations, the future becomes an unclear projection of injustice. Major powers often engage in cross-border projects through massive investments or security systems that disregard the will of other peoples or enforce their agendas through warfare and coercion. Such actions produce an artificial unknown, one crafted by human ambition and political greed. Unless preventive policies and effective international governance are established, this trend will continue, driving humanity further into uncertainty and turmoil.
Historically, philosophy has provided humanity with guidance during critical junctures. Greek philosophers, particularly Socrates, confronted political and moral crises by relentlessly questioning established concepts of justice and virtue. Such questioning exposed flaws in social and political systems, encouraging self-awareness and intellectual growth.
In contemporary times, philosophical frameworks such as those proposed by Arab philosopher Taha Abdurrahman offer new ethical paradigms capable of addressing global chaos. Abdurrahman's vision emphasises the need for an Islamic moral philosophy that restructures concepts of politics, freedom, and economic justice to align with current developments. His approach highlights the significance of practical philosophy, which merges theory with application, as a necessary tool for addressing modern challenges. John Rawls's 'A Theory of Justice' provides another philosophical perspective rooted in fairness and equitable resource distribution. Such theories, both ancient and modern, advocate for the creation of fair political systems aligned with the evolving realities of the digital age and shifting power dynamics.
Technology, particularly artificial intelligence (AI), has emerged as a dual-edged force. While technological advancements have accelerated progress in various fields, including healthcare, they have also exposed significant disparities in global justice.
For instance, the rapid development of Covid-19 vaccines demonstrated both technological prowess and ethical shortcomings, as poorer nations struggled to secure sufficient doses.
The growing influence of algorithms and AI in political and military decision-making raises further concerns. Reports indicate that algorithms increasingly manipulate public opinion through media campaigns and electoral processes, exploiting vast data sets to influence social and psychological dynamics. Additionally, AI is weaponised in warfare through espionage, surveillance, and targeted killings. Michel Foucault's questions about power and control remain relevant: 'Who controls whom? Is technology a new form of power, or merely a tool in the hands of traditional forces?' The increasing dominance of digital systems demands ethical reflection and regulatory frameworks to prevent misuse and safeguard fundamental human rights.
The crises humanity faces are not isolated phenomena but interconnected challenges resulting from unresolved structural deficiencies and unethical governance. Whether technological, geopolitical, or environmental, these crises reflect a broader moral failure to address essential questions about justice, fairness, and human dignity. Ultimately, the unknown is a human construct resulting from neglect, greed, and an unwillingness to address foundational issues. Philosophy remains essential as a tool for questioning, understanding, and restructuring human systems to foster a more just and ethical world. The future requires more than technological advancements; it demands moral clarity, philosophical depth, and a commitment to justice that transcends political and economic interests.


Washington Post
31-03-2025
- Entertainment
- Washington Post
This Pulitzer winner's sentences are beautiful. Her thinking can be confused.
It is the fate of many writers to excel at what they most despise and fail at what they most acclaim. The critic Susan Sontag fancied herself a novelist despite the evident mediocrity of her stiffly experimental attempts; the German philosopher Martin Heidegger inflicted reams of maudlin poetry on a readership already dazed by the density of his demanding prose. And although she has spent years cultivating a single-minded emphasis on political virtue, the Pulitzer Prize-winning critic Andrea Long Chu is something of an aesthete.