Remembering two lost river pilots in Old Dundalk Society talk


On Thursday night I headed for the Spirit Store for the official launch of the book Dundalk Bay Tragedy 1936, which tells the story of two Dundalk pilots who lost their lives while tending the lights that guided boats into Dundalk harbour. The two men, James Woods and James Lambe, headed out by boat on 10th February 1936 in gale force conditions. As they neared the end of their work, the weather turned for the worse, the boat was swamped and both men were drowned. Both lived in Quay Street and between them left 15 children without a father.


Related Articles

Social media fuelling 'devastating' kids' mental health crisis: NGO

Time of India

7 minutes ago


The "unchecked expansion" of social media platforms is driving an unprecedented global mental health crisis in kids and teens, a children's NGO said Wednesday, calling for urgent coordinated action. The KidsRights report said one in seven children and adolescents aged between 10 and 19 suffered mental health issues, with the global suicide rate at six per 100,000 for that age group. These high rates represent "the tip of the iceberg", as suicide is widely under-reported due to stigma, according to the Amsterdam-based group.

"This year's report is a wake-up call that we cannot ignore any longer," said Marc Dullaert, KidsRights chairman. "The mental health... crisis among our children has reached a tipping point, exacerbated by the unchecked expansion of social media platforms that prioritise engagement over child safety," he said.

The report said what it termed "problematic" social media use was on the rise, with a direct link between heavy internet use and suicide. But blanket bans are not the answer, the group warned. Australia passed a law to ban social media use for under-16s. "Such blanket bans may infringe on children's civil and political rights," including access to information, said the report.

The group urged "comprehensive child rights impact assessments" at a global level for social media platforms, better education for kids, and improved training for mental health professionals.

The report seized on the popularity of Netflix sensation "Adolescence", which highlighted some of the toxic content kids view online. The mini-series "demonstrated global awareness of these issues, but awareness alone is insufficient," said Dullaert. "We need concrete action to ensure that the digital revolution serves to enhance, not endanger, the wellbeing of the world's 2.2 billion children," he said. "The time for half-measures is over."

People Are Asking ChatGPT for Relationship Advice and It's Ending in Disaster

Yahoo

7 minutes ago


Despite ChatGPT's well-documented issues, people are using it to advise them on relationship issues — and it's going about as well as you'd expect. In a new editorial, Vice advice columnist Sammi Caramela said she had been blissfully unaware of the ChatGPT-as-therapist trend until someone wrote into her work email about it earlier this year. Back in February, an unnamed man told the writer that his girlfriend refused to stop using the chatbot for dating advice and would even bring up things it had told her in arguments.

Though Caramela was so shocked that she "nearly choked" on her coffee, the advice-seeker wasn't all that perturbed — and claimed that he found his girlfriend's ChatGPT use fascinating. "I was a bit floored by this confession. I had no idea people were actually turning to AI for advice, much less input on their relationships," the columnist wrote in her more recent piece. "However, the more I explored the topic, the more I realized how common it was to seek help from AI — especially in an era where therapy is an expensive luxury."

Intrigued, Caramela found a friend who used the OpenAI chatbot for similar purposes, running relationship issues by it as a "non-biased" sounding board. Eventually, that person realized that ChatGPT wasn't unbiased at all, but rather "seemed to heavily validate her experience, perhaps dangerously so." Similar questions have been posed on the r/ChatGPT subreddit, and as Caramela explained, the consensus over there suggested not only that the chatbot is something of a "yes-man," but also that its propensity to agree with users can be dangerous for people who have mental health issues.

"I often and openly write about my struggles with obsessive-compulsive disorder (OCD)," the writer divulged. "If I went to ChatGPT for dating advice and failed to mention how my OCD tends to attack my relationships, I might receive unhelpful, even harmful, input about my relationship."

Digging deeper into the world of ChatGPT therapy, Caramela found multiple threads on OCD-related subreddits about the chatbot — and on the forum dedicated to ROCD, or relationship-focused OCD, someone even admitted that the chatbot told them to break up with their partner. "Programs like ChatGPT only speed the OCD cycle up because you can ask question after question for hours trying to gain some sense of certainty," another user responded in the r/ROCD thread. "There's always another 'what if' question with OCD."

Like so many poorly-trained human professionals, chatbots aren't equipped to handle the nuance and sensitivity needed in any therapeutic context. Regardless of what OpenAI claims in its marketing, ChatGPT can't be truly empathetic — and if your "therapist" will never be able to have a human-to-human connection, why would you want it to give you dating advice in the first place?

More on chatbot blues: Hanky Panky With Naughty AI Still Counts as Cheating, Therapist Says
