South Korea votes for new president after martial law chaos


UPI | 2 days ago

A voter enters the voting booth at a polling location in the Seoul neighborhood Mullae on Tuesday. Photo by Thomas Maresca/UPI
SEOUL, June 3 (UPI) -- South Koreans headed to the polls in record numbers on Tuesday to vote for a new president exactly six months after a botched martial law decree by impeached President Yoon Suk Yeol unleashed political turmoil and deepened divisions in the country.
As of 6 p.m., some 33.8 million voters had cast ballots, or 76.1% of South Korea's 44.4 million eligible voters, according to the National Election Commission. The figure was the highest recorded at that point on election day since South Korea began holding direct presidential elections in 1987.
Interest in the race has remained sky-high since a snap election was called after Yoon's removal from office in April. Most polls have shown liberal Democratic Party candidate Lee Jae-myung holding a commanding lead over the conservative People Power Party's Kim Moon-soo, with a Gallup Korea survey last week giving Lee an edge of 49% to 35%.
Polls close at 8 p.m., and a winner is expected to be announced around midnight, though vote counting will continue into early Wednesday morning, the NEC said. The winner will be inaugurated within hours, forgoing the typical two-month transition period because of Yoon's removal.
The new president will face a host of challenges, including an economic downturn and tariff negotiations with U.S. President Donald Trump, who last week announced plans to double tariffs on steel and aluminum to 50%. The presidential vacuum has made it difficult for South Korea, an export-driven country, to craft a trade package ahead of the July deadline for Trump's 90-day pause on so-called "reciprocal" tariffs.
Geopolitical concerns, including an increasingly dangerous North Korea, and a looming demographic crisis caused by the world's lowest birth rate are also on voters' minds this election season.
For many, however, the top issue was safeguarding South Korean democracy in the wake of Yoon's shocking Dec. 3 martial law attempt, which was overturned within hours by Democratic Party lawmakers including Lee Jae-myung.
"Yoon Suk Yeol tried to destroy the democratic system in Korea," Lee Kyung-jae, 61, said after voting in the Sadang district of Seoul. "I was angry about that, so I selected Lee Jae-myung."
Lee said he was a student activist during the 1980s democracy movement against the military dictatorship of President Chun Doo-hwan and called Yoon's martial law attempt a shocking flashback to that time.
"We thought our democratic system was secure," Lee said. "Most Koreans were so angry that Yoon tried to destroy it."
During his final campaign rally on Monday night, Lee said the election was a chance to "shake off the dark night of insurrection and welcome a new morning of hope."
Sean King, senior vice president and East Asia expert at New York-based consulting firm Park Strategies, told UPI that "general public disgust" at Yoon's botched martial law attempt made the race "clearly Lee Jae-myung's to lose."
On the foreign policy front, King said that Lee would likely take a less hawkish stance toward North Korea than Yoon did, a shift that could dovetail with a potential nuclear summit between Trump and North Korean leader Kim Jong Un.
A Lee win "suits Donald Trump just fine, as Lee Jae-myung is sure to look favorably on Trump's desired reengagement of North Korea's Kim Jong Un," King said.
Other voters on Tuesday ranked defense and security issues as top concerns.
"I am concerned about the future of this Peninsula," Eric Park, 26, said outside of a polling station. He did not want to reveal who he voted for, but said he was looking for a conservative position on national defense and a strong military alliance with the United States.
"I want to see a stronger position," Park said, citing China and North Korea as looming threats. "I just think that's the only option to save the country and future generations."
Kim Moon-soo has vowed to be a "security president who eliminates the fear of North Korea's nuclear weapons," and has signaled a hard-line stance similar to the approach taken by Yoon Suk Yeol.
He has called for strengthening extended deterrence capabilities under the U.S.-South Korea military alliance, including the possible redeployment of U.S. tactical nuclear weapons on the Korean Peninsula.
While the race is effectively a two-way contest, Lee Jun-seok of the minor conservative Reform Party consistently polled at around 10% in the run-up to the election. Lee, 40, has found support among younger men in their 20s and 30s while courting controversy with remarks widely considered misogynistic.
Choi In-woo, a 23-year-old business student, said he voted for Lee because he felt the other candidates were not engaged with issues facing younger voters, such as the need for pension reform in a rapidly aging society.
"[Lee Jae-myung and Kim Moon-soo] are not thinking of how to make further developments of the economy," he said. "They just want to keep arguing about the past."
Both Lee and Kim made economic growth their top campaign promise, with a focus on heavy government investment in the artificial intelligence industry. The two candidates have also agreed on the need for constitutional reform, with both proposing a transition to a two-term, four-year presidency to replace the current single, five-year term.

Related Articles

Choirs gather for special events throughout district

Yahoo | 12 minutes ago


AS PART of the City of Culture year, choirs from around the Bradford district, Bradford Voices, Ben Rhydding Community Choir, Bridging Borders and Bradford Friendship Choir, are pleased to welcome the 41st Street Choirs Festival. The weekend will be shared with Windrush Generations' Carnival of Culture and includes a full programme of street singing, concerts and workshops.

Members of the public will be entertained free by the festival's visiting choirs from across the UK, who will assemble in City Park on Saturday, June 14. They will be welcomed to the event and the city by James Mason, chief executive of West and North Yorkshire Chamber of Commerce. From 10.30am until 11am the choirs will sing together in what they call a Mass Sing, an uplifting experience of more than 50 choirs and around 1,100 singers, singing together unaccompanied in four-part harmony. They will sing songs of peace, unity and solidarity. Before this, the audience will be warmed up by Bradford-based band the Peace Artistes. After the Mass Sing the choirs will perform in 15 locations around the city centre for shoppers and passers-by, who can stop, listen, reflect and relax. In the evening the choirs will showcase their songs at St George's Hall.

Bradford is one of the first places in the UK to be recognised as a City of Sanctuary and has always welcomed people seeking a safe place of refuge. It is also the only city in the UK with a Peace Museum, now housed in Salts Mill, Saltaire, so it is a fitting location.

First held in Sheffield in 1984 as the National Street Band Festival, the Street Choirs Festival brought together musicians who played in the signature marches and protests of a politically turbulent decade. The intention of the festival is to put music into protest to make it more creative, joyful and thought-provoking. The festival has expanded to welcome community choirs who sing together for the love of singing. It includes women's choirs, asylum seeker choirs, anarchist choirs, socialist choirs, LGBTQ choirs, and choirs singing to raise awareness of human rights, social justice, environmental justice, climate justice and other campaigns. It has been hosted by community choirs across the UK, from Edinburgh to Brighton and Aberystwyth to Whitby, and in 2026 it will be held in Dumfries and Galloway. Bradford has hosted the festival twice before, in 1999 and 2005.

More than 1,200 eco-friendly handmade recycled bags, made by the many sewers at Bingley-based Morsbags and screen printed by local firm Fingerprints, have been produced for the festival. Morsbags is linked to the Plastic-Free Bingley action group and has a team of dedicated volunteers who make shopping bags from donated pieces of fabric, which are then given to a host of town shops to pass on; the aim is to encourage people to reduce their use of plastic bags and to encourage recycling and reusing by handing out the bags. For more information visit Streetchoirsbradford@

Is OpenAI Building an Empire or a Religion? - The Assignment with Audie Cornish - Podcast on CNN Audio

CNN | 15 minutes ago


Audie Cornish 00:00:00 Elon Musk has told this story a couple of times, but it's good context.
Elon Musk clip 00:00:04 Larry Page and I used to be very close friends and I would stay at his house and I'd talk to Larry into the late hours of the night about AI safety.
Audie Cornish 00:00:13 It's the story of what motivated him to invest in AI. So many, many years ago, Larry Page of Google had just scooped up DeepMind, a cutting-edge AI company co-founded by Mustafa Suleyman, who I have interviewed on this podcast. It's a good conversation. I'm gonna stick it in our show notes. But here's how Musk told the story at the New York Times DealBook Summit last year.
Elon Musk clip 00:00:37 And it became apparent to me that Larry did not care about AI safety. I think perhaps the thing that gave it away was when he called me a speciesist for being pro-humanity. As in, you know, like a racist, but for species. So I'm like, wait a second, what side are you on, Larry?
Audie Cornish 00:01:00 Google had the talent and the computing power and what seemed like infinite amounts of money to spend on both. Musk considered AI a double-edged sword, and he worried people were not worried enough about the sharper edge. He was hosting dinners talking about this. He was doing the university circuit talking about this. He met multiple times with President Obama to talk about it. Now, at the same time, Sam Altman, top executive at the famous Y Combinator, that's a tech company incubator. Well, he was also looking to take a big swing with a company focused on artificial intelligence.
Karen Hao 00:01:37 Altman is a very strategic person. He plays the long game. And one of the things he's very, very good at is getting talent and getting capital towards a specific objective. And so early on, he thought, who are the people that I need to recruit to this project to turn this into a legitimate lab? The first person that he identified was Elon Musk.
Audie Cornish 00:02:01 Now the company they co-founded, OpenAI, launched in December of 2015 as a non-profit committed to open collaboration, making its patents and research publicly available. It didn't stay that way.
Sky News clip 00:02:18 I think I just spotted behind this man here, Sam Altman, who is the boss of OpenAI, a tech powerhouse himself in the United States, one of many tech leaders who has come here to Saudi Arabia as part of the American delegation to meet the Crown Prince. There he is, Sam Altman, meeting Donald Trump and the Crown Prince.
Audie Cornish 00:02:39 OpenAI is now a hybrid company, meaning it's a for-profit and a non-profit entity. They call it capped profit. However they slice it, two things are true. They stopped sharing their open source code and they are monetizing the technology.
Sky News clip 00:02:55 But interesting, of course, that he's not the only highly, highly wealthy tech bro, shall we say, in this line-up. Going before him was the world's richest man, Elon Musk. And these images are remarkable. And these opportunities for these tech billionaires and multimillionaires are hugely important.
Audie Cornish 00:03:16 OpenAI stands at the center of the AI revolution, and the questions raised by its co-founders, now rivals, remain. Who should control AI? What are the hidden costs of the AI revolution? And as these companies become the new empires, what power do we, ordinary people, have to shape the future? I'm Audie Cornish, and this is The Assignment.
Audie Cornish 00:03:44 The story of OpenAI, the company, is not the story of artificial intelligence, but of how that science became a product that you're seeing everywhere, from customer service bots to your company HR department. Tech journalist Karen Hao has spent years chronicling how the story of AI shifted from big hopes to serve humanity to a scramble for power, profit, and influence.
Karen Hao 00:04:09 My background is in tech. I was a mechanical engineer for undergrad. I went to work in Silicon Valley after graduating from MIT and...
Audie Cornish 00:04:19 And you're like, I don't want to make any money. I think I'm going to start doing journalism. Is that how it went? Because that's how it sounds like.
Karen Hao 00:04:27 Basically.
Audie Cornish 00:04:28 Behind the scenes, she saw something familiar. Companies promising the world, then bending to the pressure of growth and scale. In her new book, Empire of AI, Hao writes about how the industry's founding ideals gave way to secrecy and rivalry, a breakneck arms race and the rise of a quasi-religious movement. The person at the center: Sam Altman.
Karen Hao 00:04:54 He was the president of Y Combinator, which is one of the most acclaimed startup accelerators in Silicon Valley. It's launched many famous startups. And he had this idea that he wanted to take big swings around hard technology. So he was investing more in quantum, in self-driving cars, in nuclear fusion, and...
Audie Cornish 00:05:18 And we should say, at that time, Silicon Valley's sort of churning out things that we can be describing as the Uber of X, the Airbnb of Y. Yeah, exactly. They were sort of iterations on, I'm not going to say novelty, but social media. They were not taking big swings, so to speak. And so here he is, this guy who has, like, all the connections in the world. Because literally, that's what Y Combinator is. It is just connection and networking and connecting the money to the ideas.
Karen Hao 00:05:49 Yes, it was a very, very dense network. Exactly.
Audie Cornish 00:05:50 He's the guy at the heart of that. And he decides to take a big swing.
Karen Hao 00:05:55 Early on, he thought, who are the people that I need to recruit to this project to turn this into a legitimate lab? The first person that he identified was Elon Musk. And Musk at the time was talking very publicly, very often about his fears around AI. And so Altman, to recruit Musk, he starts saying to Musk, I am like-minded in your views about AI. I'm also worried about AI being very powerful and going wrong, and it seems to me that if it stays within Google it would go wrong, but the best way to counteract it is to create an AI lab of our own.
Audie Cornish 00:06:39 For the good guys. Google is somehow evil and it's interesting because we were talking to, I think, the founder of DeepMind, Mustafa Suleyman, he's now at Microsoft and like he started his companies because he wanted to be the good guy, right?
Karen Hao 00:06:54 Yeah, yeah.
Audie Cornish 00:06:55 Everyone's the hero in the tech story of how they make things.
Karen Hao 00:06:59 Yes.
Audie Cornish 00:06:59 And in this case, yeah, same thing.
Karen Hao 00:07:01 Yeah, it's a very, very common theme in AI. Also, just a very common theme in everyone's human experience. Everyone is the main character of their story. Everyone is the one trying to do it better. But you did hit upon something that is, I think, really key to understanding the AI world and how AI is being developed today.
There's a lot of ideological clash that happens where everyone fashions themselves as morally superior to the other, and therefore they have to be the one that is the CEO of a new company, and they're going to do it better than the previous one.
Audie Cornish 00:07:39 And over time, I feel like OpenAI becomes like a cautionary tale, in a way, of like how those ideologies can come to a head. And people may not remember this, but a while back, the company went through a transition where it moved from being kind of non-profit oriented with a non-profit board to that board essentially rebelling and saying, Sam Altman, actually, you're not such a good guy. In how you treat us, but also, are you taking all the sort of safety mechanisms seriously that you could? This blows up and becomes an international news story, even though none of us really know why or how. We just know that there are some people saying that AI is super, super bad and that he's not, you know, heeding the guardrails, and him being like, it's fine. And these people, they're just... Who knows what, there's, I mean, who knows what could happen? And I see an echo of this conversation over and over again between the people you talk about, the boomers and the doomers.
Karen Hao 00:08:45 Yes.
Audie Cornish 00:08:46 Who are those two factions and how did they surface in OpenAI?
Karen Hao 00:08:50 Yeah. So one, going back to this idea that there is an ideological clash that really shapes these technologies. One thing that's happened in the last few years within the AI world and within the Silicon Valley world is there are really what can only be described as quasi-religious movements that have been born.
Audie Cornish 00:09:10 Did you say quasi-religious?
Karen Hao 00:09:12 Religious, yes. Quasi-religious movements that have been born, and the reason I say this is because there are a lot of people within this world who believe in what I call the artificial general intelligence religion. This is a religion where you think that it is possible to recreate human intelligence. This is something that, there isn't actually scientific consensus on this. So even the people who talk about this, they themselves talk about it as a belief. It's just, you either have the belief or you don't. And if you do have the belief, then the second tenet is you believe it's going to create civilizational transformation. And there are two factions within this religion. There are the boomers, who think that civilizational transformation will be hugely positive. AGI will bring us to utopia. And the other faction believes that transformation will be hugely devastating. AGI could potentially kill all of humanity.
Audie Cornish 00:10:16 Just to be clear, you are not, this is not hyperbole. Like, I have seen this very public language. It comes out sometimes in groups of meetings of AI scientists, where you're right. It's either heaven or hell.
Karen Hao 00:10:32 Exactly, and I couch it by saying quasi-religious, but you could actually even argue that it is just full-blown religious, because there's no evidence for either of these things. It is just a deep-seated belief, a spiritual belief. They do use spiritual language. Sometimes they even talk about AGI as them recreating digital gods and demons. This is the explicit language that they use. And the reason why I say the boomers and the doomers are factions in the same religion is because they both then conclude the same thing, which is, oh, we are the good guys.
We are the ones that have to be in control of developing this technology. And so there is an inherent anti-democratic conclusion that they arrive at, which is: we should not make this a democratic process. We should not be getting lots of input from various people. We should not be opening up this technology to lots of people. We need to keep a tight fist, clamp down on this technology and be secretive about it. In the early days of OpenAI, when it was founded in late 2015, there were very few people that believed in the AGI religion. To even speak about the idea that artificial general intelligence was possible meant you were not a serious scientist, because this is not based in science. And so the people that were drawn to the explicit premise that OpenAI set out, which was to be bold and ambitious and claim, yes, we are in fact trying to achieve AGI, it only attracted the people who already believed that, and it just so happened that it attracted both of the factions, it attracted the boomers and the doomers. So throughout the history of OpenAI, there have always been these two factions constantly in tension with one another and fighting over how do we actually develop this technology? How should we approach deploying the technology? And one of the things that I concluded through my reporting was it was this tension that really led to the acceleration, the massive acceleration of AI development.
Audie Cornish 00:12:49 I see. So it feels like, if we're thinking like, jeez, where'd this come from? Why is AI everywhere? It's not in our heads. There really was an explosion of productivity because it became a kind of arms race in the industry.
Karen Hao 00:13:02 Yes, and the specific moment was when OpenAI decided, you know, ChatGPT was based on a model called GPT-3.5. GPT-3, which was the previous generation, that was a giant step change from GPT-2 in terms of sheer scale of the model. And OpenAI made a very explicit decision at the time where they thought, we have these AI models in the world today. We have never tried just blowing it up by multiple orders of magnitude. Like multiple orders of magnitude more data, multiple orders of magnitude more computer chips for training these models. And so whereas GPT-2, I think, was originally trained on maybe a few dozen chips, they decided to train GPT-3 on 10,000 chips, an entire supercomputer, one of the largest supercomputers that had ever been built at the time. And it was that jump that kicked off within the industry the very first race. That was the opening shot that then led a lot of other companies to start swarming around this concept that OpenAI hit on, which is scale at all costs. And after ChatGPT came out, then the race was really raised to the next level.
Audie Cornish 00:14:28 Coming up in the Church of AI, what role does Sam Altman play?
Audie Cornish 00:14:34 Somebody said that if you dropped Sam Altman on an island of cannibals and came back in five years, he'd be king.
Karen Hao 00:14:42 Yes, that was Paul Graham.
Audie Cornish 00:14:43 I can never unhear that. His mentor said.
Karen Hao 00:14:47 As a compliment.
Audie Cornish 00:14:47 As a compliment?
Audie Cornish 00:14:51 Stay with us.
Audie Cornish 00:14:55 I know you're saying quasi-religion, but if we're taking the metaphor all the way, who is Sam Altman in this world then, right? Like, is he a pope? Is he a cult leader? Is he, you know what I mean?
What is the, where does he start to fall on the spectrum between boomer and doomer, first of all, since we know he started out cautious? And then second of all, yeah, is it a charismatic leader situation? Like, what are we looking at when we see him in the public space really selling us on his vision?
Karen Hao 00:15:33 Can I read you a quote from the opening of my book?
Audie Cornish 00:15:36 Yes, oh my gosh, please do. I think I know. I think I know the one you're going to read, actually.
Karen Hao 00:15:40 So I start my book with two quotes side by side. And the one from Sam Altman goes like this: "Successful people create companies. More successful people create countries. The most successful people create religions." And this is in quotations, Sam Altman is quoting this: "I heard this from Qi Lu, I'm not sure what the source is. It got me thinking though, the most successful founders do not set out to create companies. They're on a mission to create something closer to a religion. And at some point, it turns out that forming a company is the easiest way to do so." And so the thing about this religion that's so interesting is they do not pray to a higher power. They are the ones that believe they're creating a higher power. And Altman, I would say, is sort of like Paul Atreides' mom in Dune.
Audie Cornish 00:16:36 I love this reference. Keep it going. Yes.
Karen Hao 00:16:40 She was the one that created the myth, that created a religion around Paul Atreides, right? And when people encountered that myth, they didn't understand that it was a creation, so they just believed it. I think that's who Altman is. This is just based on, after doing a lot of reporting and understanding who he is, this is my own conclusion. This is not like I saw some document where he was talking about these things. I think he understood very, very early on in his career, as evidenced by this quote, that to mobilize people, to mobilize extraordinary resources, you have to create a religious fervor around a quest. And he figured out how to create that by evoking this idea of: we are going to create this Artificial General Intelligence. And to your question of is he a boomer or a doomer? No one really knows. And this was something that was quite interesting when interviewing people for the book, is regardless of how long they had worked with Altman, how closely they had worked with Altman, no one could really say what he believes. And if they were doomers themselves, they thought that maybe Altman was more of a doomer...
Audie Cornish 00:18:03 Meaning that he cared, that he was concerned, but also the implication was, would care about safety and care about these things they're worried about.
Karen Hao 00:18:12 Yes. And if they were boomers, they believed that Altman was certainly in their camp and most likely a boomer. And what I realized when I asked people, I would always ask people, what did Sam tell you in this conversation about what he believed and what the company was doing? I realized he always said different things to different people depending on what they wanted to hear. And so ultimately, I think he will choose whether to embody a more boomer ethos or a more doomer ethos based on what is convenient and what he needs to continue mobilizing, not just his employees, but also mobilizing the public, mobilizing regulators, policymakers, to move in a direction that is in the best interest of OpenAI.
Audie Cornish 00:19:02 The quote I thought you were gonna read was the one that, was it Paul Graham who said this? Somebody said that if you dropped Sam Altman on an island of cannibals and came back in five years, he'd be king.
Karen Hao 00:19:16 Yes, that was Paul Graham. Yeah. His mentor.
Audie Cornish 00:19:18 I can never unhear that. His mentor said this?
Karen Hao 00:19:21 His mentor said this.
Audie Cornish 00:19:23 As a compliment?
Karen Hao 00:19:26 As a compliment.
Audie Cornish 00:19:28 OK, so Karen, I want to move on to something else, which is the way you just described Altman is actually similar to the way people have described Donald Trump. And I'm bringing this up because we are seeing Altman travel in those circles now.
Karen Hao 00:19:44 Mm-hmm.
Audie Cornish 00:19:44 In part because of the president's embrace of technology. So just for example, Trump does this three-country tour of Saudi Arabia, UAE, Qatar, brings 50 CEOs, and I'm watching from my desk on screen and I see Sam Altman there shaking hands. The AI czar is there, David Sacks, he calls it this, like, game changer in the global AI race. Which the administration thinks, like, look, if the US can cement the position before anyone else, they don't have to worry about China, et cetera. But it was just so wild seeing Altman in the court of Saudi Arabia helping make this deal happen, and as we learned in the background reporting, upsetting Elon Musk in the process, who felt that OpenAI was getting more attention than he was. It felt like I was watching a moment where world powers were divvying up something. Yeah. Only those world powers were tech companies.
Karen Hao 00:20:52 Yeah. So, I mean, the reason why I call my book Empire of AI is a nod to this argument that I make in the book that these companies need to be thought of as new forms of empire. And the reason is because empires of old and empires of AI share all the same features. First, they lay claim to resources that are not their own, but they redesign the rules to suggest that it was always their own. So they're scraping the internet saying, this was free for the taking, but people did not give their informed consent to the idea that just because you post on social media, you're suddenly going to be fodder for training models that could potentially restrict your economic opportunity. The empires also exploit labor all around the world, and with these AI companies, that not only refers to the fact that they contract a lot of workers around the world who then work in extremely poor conditions to do data cleaning, data annotation, and then content moderation for these companies, but also the fact that these organizations are ultimately building labor-automating technologies. OpenAI's definition of AGI is highly autonomous systems that outperform humans in most economically valuable work. So not only is it exploiting labor on the way in, the product is also exploiting labor.
Audie Cornish 00:22:20 Ah, Karen, okay, that's a lot to unpack, but there are a couple things in there I wanna jump in and ask about. You mentioned the labor part of it. We've been talking about AI scientists and venture capitalists and all the people at the top level, and then I'm reading your book and I'm learning it's data workers in Kenya who are, I don't know what, stripping disturbing content out of the responses. Like, how does this even work? Tell me about the humans in AI.
Karen Hao 00:22:51 There was a part in OpenAI's history, as we talked about, when they started shifting from being more research oriented to realizing they needed some kind of commercialization. And that meant placing their text generation tool, that can spew anything, into the hands of millions of users. There have been some really infamous scandals in which companies have done this and then it has been a spectacular failure, because then the chatbot suddenly starts saying racist, abusive, toxic things in the hands of those users. And so OpenAI was like, we can't have that happen. We need to build a content moderation filter that we wrap around all of our models so that when a user says something, if the model does start to generate something, the filter blocks it. It never reaches the user. And so they contracted workers in Kenya to build this filter. And what these workers did was they had to, day in and day out, read reams of the worst text on the internet, as well as AI-generated text where OpenAI prompted models to imagine the worst texts on the internet. And then those workers had to very carefully annotate it into a detailed taxonomy: is this violent content or is this sexual content? Is this extremely graphic violent content? Is this sexual abuse content? And does that involve children? For these workers, we see exactly what happened with the social media era and content moderators in social media. They were left deeply traumatized, and it wasn't just the individuals that were left traumatized. It was their communities, their families, people that depended on them, that lost a person that they depended on. And these models would not be possible. They would not have the success as products that they do without this labor, and it is the most taxing, harmful labor, and they are paid two dollars an hour. The only justification is an ideological one. It is a belief that underpins all empires, that there are superior groups of people that have some nature-given or God-given right to rule over and subjugate inferior people.
Audie Cornish 00:25:11 And this is before we get to the workers in Colombia, right, who are labeling images for AI training, or, and I'm not gonna go down the rabbit hole here, the environmental impact that is required by these data centers. We've talked about that in other contexts and suggest our listeners actually check out Terms of Service, which is our podcast on this stuff, and they've gone into it. But it gets to your point that, like, I think we see such a cleaned-up version of ChatGPT, you know, to, whatever, help you do kind of like nonsense tasks, that we don't have a real sense of the cost, the human cost.
Karen Hao 00:25:50 It's not just the human cost, it becomes a country cost, where you literally are being dispossessed of the natural resources and the human resources to develop your country further. I mean, that is colonialism, and that is the legacy that we still live with.
Audie Cornish 00:26:07 We see something similar with a couple countries that have tried to say, like, look, we're going to get in on the AI game, and it's cost them.
Karen Hao 00:26:14 Led them to repeat the same thing. So I spoke with activists in Chile and Uruguay. They were aggressively trying to fight data center expansion within their countries, because the governments thought, we really want the foreign direct investment. We want to welcome these companies in to build these data centers.
But then the communities that actually have to host the data centers, and these data centers have to use fresh water to cool the overheating of the computers. Both of those countries, when those data centers came in, were experiencing historic droughts. And ultimately, it was in that moment that Google said, okay, we are going to come in and build a data center that now uses the fresh drinking water that you do not get to have.
Audie Cornish 00:27:00 There's a recent poll that found that more than 75% of Americans basically want to slow down AI development, to make sure it's done safely or ethically, when they're sort of presented with that option. Do you think there is a version of critical public pressure that could have an effect on this conversation?
Karen Hao 00:27:27 Absolutely.
Audie Cornish 00:27:27 Are you sure? Because when I see the tech people with Trump, you know, getting sovereign wealth money, I'm like, the jig is up.
Karen Hao 00:27:37 Is it going to be incredibly hard? Absolutely. But you know, one of the features of empires is that they are made to feel inevitable, but historically every single empire has fallen. So it really is possible. It's just going to take an extraordinary amount of work. And I like to think about the full AI supply chain. You know, these technologies, there are all of these ingredients that the companies need to build these models. And there are all of the spaces in which they then have to have access to deploy those models. So the ingredients include the data, they include the land, energy and water for hosting and powering the supercomputers, they include labor, they include talent, they include all of the intellectual property that writers, artists and creators have produced over time. And the spaces in which they need to deploy are schools, businesses, the healthcare industry, all of these other types of industries. But one of the things that Silicon Valley has done incredibly well in the last decade is to convince everyone that their resources are in fact Silicon Valley's resources, but we need to remember...
Audie Cornish 00:28:53 What do you mean by that?
Karen Hao 00:28:53 You know, I have friends that will say, there's no data privacy anymore, whatever, they have all of my data anyway, so I'll just continue giving them my data. That's your resource. You have ownership over that. We have ownership over our data, our land, our energy, our water. We have ownership over our schools, our healthcare system. These are actually collectively owned or individually owned resources and spaces. And these companies actually need, they need access to it. And we are the ones that grant them access to it.
Audie Cornish 00:29:28 But we're also told, like, this is gonna cure cancer. This is gonna help us get to Mars. This is going to, like, the AI is going, you know, AI is gonna open dot dot dot. Open doors to this or that. And the Industrial Revolution had its benefits. You know what I mean? Like, it had its downsides, but it had its benefits. And I guess the reason why I'm wrestling with this is because, yeah, like, I just had this long conversation with Mustafa Suleyman and he's very, like, this could really help people, you know? I think like any tool. And then I talk to you and you're just like, power to the people, fight these colonizers. And then I Google somewhere else and they're like, well, Skynet's coming, so I don't know what to tell y'all. You know what I mean? Like, you're going to be attacked by drones. Find a cave.
I hear such conflicting information about how I feel about this industry, and I don't know if it's about the technology itself or about the people involved in it.
Karen Hao 00:30:23 Here's what I'll say. When Mustafa Suleyman says it could help people, could is the operative word. And when people say it could be Skynet, could is still the operative word. And what I try to do with my book is say, this is what is literally happening right now. It's not a could scenario, it is the reality. And this reality should be the best evidence that we have in understanding how AI is impacting people now, and how it will continue to impact people in the future, because it demonstrates to us the logic of how this technology is being built and how it's gonna work its way through, ultimately, the fault lines of our society. And right now, it is not looking good. And what I argue is that we need and we can actually turn the ship around, but it cannot be, "it's too late, there's no role for us, we should just wait for these overlords to hopefully be nice." You know, there are artists and writers that are now suing these companies saying, we don't like the fact that you just trained on our intellectual property. You don't get to do that. No, like, this is something that we need to aggressively, collectively shape by taking ownership of that data, taking ownership of that land. And I think everyone wants to be in control of it. It's just most people don't know how. And I hope that through reading the book people will start to figure out how.
Audie Cornish 00:31:57 Journalist Karen Hao. Her new book is called "Empire of AI: Dreams and Nightmares in Sam Altman's OpenAI."
Audie Cornish 00:32:08 The Assignment is a production of CNN Audio, and this episode was produced by an actual person, Lori Galarreta. Our senior producer is Matt Martinez, Dan Dzula is our technical director, and Steve Lickteig is executive producer of CNN Audio. We had support from Dan Bloom, Haley Thomas, Alex Manassari, Robert Mathers, Jon Dionora, Leni Steinhardt, Jamus Anderus, Nichole Pesaru, and Lisa Namerow. As always, thank you so much for listening. We know you can spend your time in a lot of other places. Please hit that subscribe button, share with a friend, and we'll talk next week.

Outdoor drinking area approved in downtown Neenah

Yahoo | 18 minutes ago


NEENAH, Wis. (WFRV) – Community members can now enjoy alcohol in outdoor areas in certain parts of downtown Neenah. In a 6-3 vote, Neenah alders created a designated outdoor refreshment area ('DORA') for several parts of the city's downtown. The 'DORA' will be in effect from noon until 10 p.m. Wednesday through Saturday and takes effect immediately.

'I think we've got responsible citizens, responsible council members,' said alder Scott Weber. 'Responsible staff in the city and we're going to do the right thing.'

The 'DORA' that alders passed is essentially a trial period. It runs through the end of March, at which point city officials will assess how things went and adjust accordingly if needed. Alders tell Local 5 News they can terminate the 'DORA' at any time.

Neenah has implemented temporary 'DORAs' in the past for special events, and alders said the results have been positive. 'I think we've got a proven track record downtown with positives,' said alder Mark Ellis. 'We've tested this program and had great success with it.'

A local business that Local 5 News spoke with said it sees more customers during the temporary 'DORAs.' 'Exponential people coming into the city, enjoying their time in the city and this downtown area is so beautiful,' said Matt Gloede, owner of the Santé Wine Bar & Bistro in downtown Neenah. 'I think people being able to enjoy a refreshment on the street is a huge benefit I think.'

Not everybody was on board with creating the 'DORA.' In addition to the three alders who voted against it, several people raised concerns during the public comment portion of the meeting. 'The pervasiveness of 'DORA' sounds like we are asking for trouble, kind of what the gentleman spoke about with the effects of alcohol,' said Neenah resident Jennifer McGuire. Residents said they were worried it would lead to more cases of drunk driving and unruly behavior downtown, and that it would turn Neenah into a drinking destination.

Neenah resident Michael Sturn told Local 5 News he has been sober for about four years and will avoid downtown now that people can drink alcohol outside. He said he wishes the city would invest its time and energy into other things. 'A couple of things that I feel like we should be investing in more is the farmers market and bringing back festivals,' he said. 'Looking into those things instead of encouraging drinking would be more beneficial to the community.'

Alder Cari Lendrum made a motion to amend the terms of the 'DORA' to have it start later in the day and only include the summer months. The council voted down her motion.

Copyright 2025 Nexstar Media, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.
