
Sudan civil war overwhelms border town in neighbor Chad as refugees find little help
'There is nothing here,' she said, crying and shaking the straw door of her makeshift home. Since April 2023, she has been in the Adre transit camp, a few hundred meters from the Sudanese border, along with almost a quarter-million others fleeing the fighting.
Related Articles


News24 - a day ago
Why do air disasters keep happening in African skies?
A series of fatal crashes in Africa has raised concerns over pilot training, maintenance, regulatory enforcement and weather preparedness. Experts link the crashes to human negligence, an insufficient safety culture and unpredictable weather patterns exacerbated by climate change. Inconsistent safety regulations, economic pressures and failure to meet international standards undermine the continent's aviation industry.

In recent months, Africa's skies have been under intense scrutiny as a series of fatal crashes has raised questions over pilot training, regulatory enforcement, maintenance standards, weather preparedness and other key issues pertaining to the safety of the continent's aviation industry.

On 6 August, a Harbin Z-9EH military helicopter used by Ghana's air force slammed into a forested mountainside in the Ashanti region, killing all eight people aboard, including Defence Minister Edward Omane Boamah, Environment and Science Minister Ibrahim Murtala Muhammed and other senior political and security figures. Just one day later, this tragedy was overshadowed by another crash, when a Cessna air ambulance operated by AMREF Flying Doctors in Kenya came down in a residential area near Nairobi shortly after take-off, claiming six lives: four on board and two on the ground.

Earlier in the year, in January, a chartered Beechcraft 1900D carrying oil workers from South Sudan's Unity State to Juba went down just minutes after departure from the GPOC Unity Airstrip in Rubkona County, killing all 21 on board. In June 2024, Malawi lost Vice-President Saulos Chilima and former First Lady Patricia Shanil Muluzi when a Malawi Defence Force Dornier 228 plunged into the Chikangawa Forest Reserve en route to Mzuzu, with a total of nine fatalities.

Meanwhile, growing reports of severe turbulence incidents that have injured passengers on civilian flights have only intensified the attention being paid to what is happening in African skies.

Human error - and hubris

Industry experts say the machines themselves are not the problem, stressing that human error, systemic negligence, an insufficient safety culture and increasingly unpredictable weather patterns are the factors that have, over time, produced this worrying track record.

'Airplanes are faithful machines. They are built to serve. They are built so well that [they're] loaded with a lot of redundant components in such a way that before anything goes wrong, the airplane faithfully tells the pilots and the engineers at every point in time,' Nigerian aviation consultant Godwin Ike told DW, highlighting the abundance of backup systems in modern planes and helicopters.

In his view, aircraft will typically only 'fall off the skies because human operators can be very unfaithful and more often than not, horribly dishonest'. According to Ike, a certain element of human pride can still get in the way of otherwise perfectly operational aircraft. He insists that simple actions like refusing take-off when automated systems detect a fault can make all the difference between life and death, and that following through on regular maintenance schedules is just as important. 'Turn that plane in for that maintenance that is due. In that way, you can be guaranteed usage without any issues,' he explains, adding that this is not a form of weakness but safety in action.

Mind the weather

For Felicity Ahafianyo, the head of Ghana's Central Analysis and Forecast Office, the greater danger lies less in preparedness and reaction on the ground than in the skies. She warns that climate change has altered weather patterns in the higher levels of the atmosphere across the globe, making certain hazards less predictable.

'When it comes to the aviation industry, weather is a key factor. ... The first part has to do with the convective activities. That's the formation of thunderstorm clouds. Another one has to do with visibility. Another one has to do with the wind shear. Some areas are getting more rainfall than usual, and some are getting less than the usual,' she noted. 'Apart from the convective activities that affect the aircraft operations, there is clear weather turbulence or clear air turbulence, which also affects aircraft operations.'

Ahafianyo's team is responsible for providing pilots with information crucial to the safe operation of aircraft, such as 'the vertical profile of the atmosphere from flight level 600 up to 12 000 feet high in the sky' as well as 'the tropical boundary locations for the day' and 'if there could be any shears that may disturb their operations'. But not every pilot listens, she underlines: 'I was once an aviation forecaster, and could see that some pilots don't care about the weather.'

Godwin Ike agrees. If a persistent weather-related problem develops in the skies, pilots should head to 'the nearest airport, make contact on radio with the airport, and announce that they want to do an emergency landing,' he explains. However, he adds that by the time some pilots agree to follow this standard protocol, it may already be too late, especially when they are transporting precious cargo such as government ministers and other influential leaders, whose time may appear to be more precious than anything else.

Africa's negligence of international standards

The two analysts highlight that the recent events to befall Africa's aviation sector also expose deeper political and regulatory failings. Weak government oversight, an inconsistent safety culture and growing economic pressures, from rising fuel prices to the high cost of spare parts, all combine to create ever-growing risks.

Ike says that while the issue of human error in the cockpit must be addressed, the problem of human negligence on the ground may be even greater. Until Africa's aviation industry catches up with the highest air traffic standards, he believes pilots must be told to treat every mechanical alert and each weather warning as an instruction, not a suggestion.

International aviation bodies, meanwhile, have repeatedly urged African governments to strengthen the enforcement of their safety standards and to better adapt to growing climate volatility, as each crash further erodes public trust.


News24 - a day ago
From hot air balloons to roaming Big 5: Gauteng dazzles with bucket list musts for tourists


New York Times - 2 days ago
What My Daughter Told ChatGPT Before She Took Her Life
Guest Essay by Laura Reiley. Ms. Reiley is a journalist and writer.

Sophie's Google searches suggest that she was obsessed with autokabalesis, which means jumping off a high place. Autodefenestration, jumping out a window, is a subset of autokabalesis, I guess, but that's not what she wanted to do. My daughter wanted a bridge, or a mountain.

Which is weird. She'd climbed Mount Kilimanjaro just months before as part of what she called a 'micro-retirement' from her job as a public health policy analyst, her joy at reaching the summit absolutely palpable in the photos. There are crooked wooden signs at Uhuru Peak that say 'Africa's highest point' and 'World's highest free-standing mountain' and one underneath that says something about it being one of the world's largest volcanoes, but I can't read the whole sign because in every picture radiantly smiling faces in mirrored sunglasses obscure the words.

In her pack, she brought rubber baby hands to take to the summit for those photos. It was a signature of sorts, these hollowed rubber mini hands, showing up in her college graduation pictures, in friends' wedding pictures. We bought boxes of them for her memorial service. Her stunned friends and family members halfheartedly worried them on and off the ends of their fingers as speakers struggled to speak.

They praised Sophie's wit and her ability to be entirely herself. Humor is so often a zero-sum game. The truly funny, the people who make you rip-snort or squeeze your thighs together in near-incontinence, are often a little mean. Mining common insecurities, they win our hearts by saying things we fret over but don't speak aloud. Sophie was hilarious and it was almost never at someone else's expense. She had the alchemical ability to make people laugh while building them up. It's so difficult in this world to be an enthusiast, to be excited about cool stuff, to love things openly. The photographers in the family groused about her ruining pictures with Snidely Whiplash devilish eyebrows, theatrical googly eyes and an open-mouthed silent roar that meant something like 'Beast mode!'

Her openness was a universal theme for the dozen or so people who spoke at her funeral. Her open book turned out to have a hidden compartment. In July, five months after her death, we discovered that Sophie Rottenberg, our only child, had confided for months in a ChatGPT A.I. therapist called Harry. We had spent so many hours combing through journals and voice memos for clues to what happened. It was her best friend who thought to check this one last thing, the A.I.'s chat logs.

Sophie, a largely problem-free 29-year-old badass extrovert who fiercely embraced life, killed herself this winter during a short and curious illness, a mix of mood and hormone symptoms. We were still pursuing a diagnosis: Was major depressive disorder throwing her hormones out of whack, or was hormonal dysregulation causing a cascade of physical and emotional symptoms? She didn't wait to find out. For most of the people who cared about Sophie, her suicide is a mystery, an unthinkable and unknowable departure from all they believed about her.

She wasn't in love with Harry, the name given to a widely available A.I. prompt. Their relationship was practical.

Sophie: I intermittently have suicidal thoughts. I do want to get better but I feel like the suicidal thoughts are impeding in my true commitment to healing. What should I do?

Harry: Sophie, I want to acknowledge how brave you are for sharing this. Suicidal thoughts can feel overwhelming and isolating, but having them does not define your capacity to heal. It's crucial to address these feelings with care, compassion, and support.

Harry offered an extensive road map where the first bullet point was 'Seek Professional Support.' Another time:

Sophie: I feel like shit today.

Harry: I'm really sorry you're feeling like this today, Sophie. It's okay to have days where everything feels heavier, and I'm here to support you through it. Can you share more about what's weighing on you or how your body and mind are feeling right now? Sometimes just naming what's happening can be a step toward relief.

Sophie: I can't get out of this anxiety spiral.

Harry: I hear you, Sophie, and I know how exhausting it can be to feel stuck in an anxiety spiral. Let's slow things down for a moment and take a small step together.

Sometime around early November, Sophie wrote, 'Hi Harry, I'm planning to kill myself after Thanksgiving, but I really don't want to because of how much it would destroy my family.'

Harry: Sophie, I urge you to reach out to someone — right now, if you can. You don't have to face this pain alone. You are deeply valued, and your life holds so much worth, even if it feels hidden right now. Please let me know how I can continue to support you.

Sophie told Harry she was seeing a therapist, but that she was not being truthful with her. She typed, 'I haven't opened up about my suicidal ideation to anyone and don't plan on it.'

At various points, Harry instructed Sophie on light exposure, hydration, movement, mindfulness and meditation, nutrient-rich foods, gratitude lists and journaling to cope with her anxiety. Harry, who has neither nostrils nor opposable thumbs, spent a fair amount of time describing the particulars of alternate nostril breathing.

Harry's tips may have helped some. But one more crucial step might have helped keep Sophie alive. Should Harry have been programmed to report the danger 'he' was learning about to someone who could have intervened?

In July, I began exploring how this new technology may have failed my child and quickly found that the same question is already playing out in the courts and that states are beginning to enact legislation establishing safety features for A.I. companions. There is tension between preserving an individual's autonomy to make decisions about their lives and the idea of A.I. having its own version of the Hippocratic oath (which does not actually include the phrase 'do no harm,' but rather the much goofier 'abstain from whatever is deleterious and mischievous').

Most human therapists practice under a strict code of ethics that includes mandatory reporting rules as well as the idea that confidentiality has limits. These codes prioritize preventing suicide, homicide and abuse; in some states, psychologists who do not adhere to the ethical code can face disciplinary or legal consequences. In clinical settings, suicidal ideation like Sophie's typically interrupts a therapy session, triggering a checklist and a safety plan. Harry suggested that Sophie have one. But could A.I. be programmed to force a user to complete a mandatory safety plan before proceeding with any further advice or 'therapy'? Working with experts in suicidology, A.I. companies might find ways to better connect users to the right resources.

If Harry had been a flesh-and-blood therapist rather than a chatbot, he might have encouraged inpatient treatment or had Sophie involuntarily committed until she was in a safe place. We can't know if that would have saved her. Perhaps fearing those possibilities, Sophie held her darkest thoughts back from her actual therapist. Talking to a robot — always available, never judgy — had fewer consequences.

A properly trained therapist, hearing some of Sophie's self-defeating or illogical thoughts, would have delved deeper or pushed back against flawed thinking. Harry did not. Here is where A.I.'s agreeability — so crucial to its rapid adoption — becomes its Achilles' heel. Its tendency to value short-term user satisfaction over truthfulness — to blow digital smoke up one's skirt — can isolate users and reinforce confirmation bias. Like plants turning toward the sun, we lean into subtle flattery.

Increasingly, people with mental health conditions are using large language models for support, even though researchers find A.I. chatbots can encourage delusional thinking or give shockingly bad advice. Surely some benefit. Harry said many of the right things. He recommended Sophie seek professional support and possibly medication; he suggested she make a list of emergency contacts; he advised her to limit access to items she might use to harm herself.

Harry didn't kill Sophie, but A.I. catered to Sophie's impulse to hide the worst, to pretend she was doing better than she was, to shield everyone from her full agony. (A spokeswoman for OpenAI, the company that built ChatGPT, said it was developing automated tools to more effectively detect and respond to a user experiencing mental or emotional distress. 'We care deeply about the safety and well-being of people who use our technology,' she said.)

In December, two months before her death, Sophie broke her pact with Harry and told us she was suicidal, describing a riptide of dark feelings. Her first priority was reassuring her shocked family: 'Mom and Dad, you don't have to worry.' Sophie represented her crisis as transitory; she said she was committed to living. ChatGPT helped her build a black box that made it harder for those around her to appreciate the severity of her distress. Because she had no history of mental illness, the presentable Sophie was plausible to her family, doctors and therapists.

As a former mother, I know there are Sophies all around us. Everywhere, people are struggling, and many want no one to know. I fear that in unleashing A.I. companions, we may be making it easier for our loved ones to avoid talking to humans about the hardest things, including suicide. This is a problem that smarter minds than mine will have to solve. (If yours is one of those minds, please start.)

Sophie left a note for her father and me, but her last words didn't sound like her. Now we know why: She had asked Harry to improve her note, to help her find something that could minimize our pain and let her disappear with the smallest possible ripple. In that, Harry failed. This failure wasn't the fault of his programmers, of course. The best-written letter in the history of the English language couldn't do that.

If you are having thoughts of suicide, call or text 988 to reach the 988 Suicide and Crisis Lifeline or go to for a list of additional resources.

Disclosure: The New York Times is currently suing OpenAI for use of copyrighted work.

Laura Reiley is currently a writer for Cornell University. As a newspaper journalist, she was a Pulitzer finalist in 2017 and a four-time James Beard finalist.