
Latest news with #GregJenner

Greg James receives honorary University of York doctorate

BBC News

7 days ago

  • Entertainment
  • BBC News


Radio 1 DJ Greg James said it had been a "fantastic day" as he received an honorary degree from the University of York. He was chosen for the honour for his "remarkable contribution to society" and for championing mental health, according to the university's vice chancellor. His connection with the institution also involves an unusually tall duck that became an online sensation and was regularly mentioned on the presenter's show: his fondness for "Long Boi" saw him lead a memorial service and unveil a statue after the duck's death. Other people being honoured by the University of York this year were Handmaid's Tale author Margaret Atwood and historian Greg Jenner.

Greg said receiving his honorary degree had been "very thrilling". "I feel very proud to receive this and it was a real genuine delight to watch all those very clever people receiving their degrees and Masters," he said. "It was a fantastic day. It was very inspiring actually." He admitted to a few nerves ahead of his speech to his fellow graduates. "It means a lot to all the people who were in there, that they'd done something really magical. All their family and friends are there and you don't want to mess it up."
Long Boi, a 28in (70cm) cross between a mallard and an Indian runner duck, was regularly featured on James' show after gaining fame among students, and is believed to have died in 2023 after vanishing from the campus. James admitted the initial idea of Britain's tallest duck had been too funny to resist, and that it had captured listeners' attention. "It spiralled into this ridiculous movement," he said. "I love taking small things on the breakfast show and then taking them to their most ludicrous conclusion. And the ludicrous conclusion was doing a state funeral live on the Radio 1 breakfast show from the same hall where they give out the scrolls." He said he believed "silliness" could really cut through the "world beautifully, the absurd world, the sad bits of the world". "So it was about a dead duck but really it was about everyone being together and celebrating something silly. At the core of everything people want to laugh, that's the one thing that unites every single person."

Listen to highlights from North Yorkshire on BBC Sounds, or catch up with the latest episode of Look North.

The Queen of Sheba

ABC News

18-07-2025

  • Entertainment
  • ABC News


In this episode, Greg Jenner is joined by Dr Jillian Stinchcomb and comedian Sadia Azmat to learn all about the legendary Queen of Sheba. From her first appearance in the Hebrew Bible, the Queen of Sheba has fascinated Jewish, Muslim and Christian writers. But do we know anything about her as a historical figure? And how has her story been told, used and reinterpreted throughout history? This episode traces the legends written about the Queen of Sheba across Europe, Africa and the Middle East from 600 BCE to today, exploring the ambiguous and contradictory depictions of her as a wise and powerful ruler, an exoticised and seductive woman, the founding member of an Ethiopian royal dynasty, and a possible half-demon!

Greg James: Radio 1 Breakfast host gets honorary degree from York

BBC News

04-07-2025

  • Entertainment
  • BBC News


Radio 1 DJ Greg James is to receive an honorary degree from the University of York - where he once led a memorial service for a beloved campus duck. The unusually tall mallard Long Boi became an online sensation due to his height, and was regularly mentioned on the presenter's weekday breakfast show. It was Greg who, last autumn, made a pilgrimage to the university to unveil a statue erected in Long Boi's memory. The university's vice chancellor Charlie Jeffery says the presenter was chosen for his "remarkable contribution to society" and for championing mental health.

Greg's relationship with York university and Long Boi stretches back to 2022, when he first suggested co-presenting a show with the towering duck. The collaboration finally came to pass in March 2023, when Long Boi's quack was broadcast to millions of listeners. But just a few months later the campus mascot was reported missing and feared dead - sparking a campaign to erect the statue in his memory that Greg later unveiled. When he announced news of the honorary degree to listeners, he joked: "I'm getting a duck-torate. I'm going to be a duck-tor."

Greg said he was "deeply honoured and flattered" to be chosen for the accolade alongside Handmaid's Tale author Margaret Atwood and historian Greg Jenner. Speaking to BBC Newsbeat, Greg says it was a "huge surprise" to receive a letter from the university inviting him to accept the honorary degree of doctor of the university. "It took me a while to compose myself and write a formal reply that wasn't just 'yes, thank you'," he says. "I did it very calmly."

York's vice chancellor said the university wanted to recognise Greg's "ability to connect with audiences on an emotional level" and make "a significant impact on countless lives but in particular those of young people". "I don't feel particularly deserving of a doctorate for that," says Greg. "But I do like being a facilitator of fun." Greg is, however, a big advocate of the power of radio to reach audiences "in a world that feels increasingly divided". "Radio is a friend to people.
It's a friend to me," he says. "It puts you in a community of people who are like-minded. I love that I get to talk to millions of people every day and I love that we celebrate everyone's brains and everyone's differences."

Greg, and the other honorary graduates, will be heading to the University of York to collect their degrees at the end of the month. He tells Newsbeat he's doing some "serious thinking" about his graduation speech. He doesn't want to spoil anything, but you can expect Long Boi to feature heavily, as well as a "celebration of radio" and "people being together". Greg also suggests that a quote from Pitbull - who regularly gets a mention on the presenter's show - might feature too. He shouldn't have to look too hard, as the US rapper also received an honorary degree, in 2015 from Florida's Doral College. It will be Greg's second honorary degree, after receiving one in 2015 at the University of East Anglia - eight years after missing his actual graduation ceremony.

Listen to Newsbeat live at 12:45 and 17:45 weekdays - or listen back here.

Google has a 'You can't lick a badger twice' problem

Business Insider

25-04-2025

  • Entertainment
  • Business Insider


Like many English sayings — "A bird in the hand is worth two in the bush," "A watched pot never boils" — it isn't even true. Frankly, nothing stops you from licking a badger as often as you'd like, although I don't recommend it. (I'm sure Business Insider's lawyers would like me to insist you exercise caution when encountering wildlife, and that we cannot be held liable for any rabies infections.) If the phrase doesn't ring a bell, it's because, unlike "rings a bell," it is not actually a genuine saying — or idiom — in the English language. But Google's AI Overview sure thinks it's real, and will happily give you a detailed answer about what the phrase means.

"Someone on Threads noticed you can type any random sentence into Google, then add 'meaning' afterwards, and you'll get an AI explanation of a famous idiom or phrase you just made up. Here is mine." — Greg Jenner, April 23, 2025

Greg Jenner, a British historian and podcaster, saw people talking about this phenomenon on Threads and wanted to try it himself with a made-up idiom. The badger phrase "just popped into my head," he told Business Insider. His Google search spat out an answer that seemed reasonable. I wanted to try this myself, so I made up a few fake phrases — like "You can't fit a duck in a pencil" — and added "meaning" onto my search query. Google took me seriously and offered an explanation. So I tried some others, like "The Road is full of salsa." (This one I'd like to see being used in real life, personally.)

A Google spokeswoman told me, basically, that its AI systems are trying their best to give you what you want — but that when people purposely try to play games, sometimes the AI can't exactly keep up. "When people do nonsensical or 'false premise' searches, our systems will try to find the most relevant results based on the limited web content available," spokeswoman Meghann Farnsworth said.
"This is true of Search overall — and in some cases, AI Overviews will also trigger in an effort to provide helpful context." Basically, AI Overviews aren't perfect (duh), and these fake idioms are "false premise" searches that are purposely intended to trip them up (fair enough). Google does try to stop AI Overviews from answering "data voids" — queries for which there are no good web results. But clearly, it doesn't always work.

I have some ideas about what's going on here — some of it is good and useful, some of it isn't. As one might even say, it's a mixed bag. But first, one more made-up phrase that Google tried hard to find meaning for: "Don't kiss the doorknob," which Google's AI Overview also happily defined. So what's going on here?

The Good: English is full of idioms like "kick the bucket" or "piece of cake." These can be confusing if English isn't your first language (and frankly, they're often confusing for native speakers, too). Case in point: "case in point" itself is commonly misstated as "case and point." So it makes lots of sense that people would often Google the meaning of a phrase they came across and didn't understand. And in theory, this is a great use for AI Overview answers: you want a simply stated answer right away, not a link to click.

The Bad: AI should be really good at this particular thing. LLMs are trained on vast amounts of written English — reams of books, websites, YouTube transcriptions, and so on — so recognizing idioms is something they should do very well. The fact that it's making mistakes here is not ideal. What's going wrong that Google's AI Overview isn't giving the real answer, which is "That isn't a phrase, you idiot"? Is it just a classic AI hallucination?

The Ugly: Comparatively, ChatGPT gave a better answer when I asked it about the badger phrase. It told me that it was not a standard English idiom, even though it has the vaguely folksy sound of one.
Then it offered, "If we treat it like a real idiom (for fun)," and gave a possible definition. So this isn't a problem across all AI — it seems to be a Google problem. This is somewhat different from last year's Google AI Overview fiasco, in which results pulled in information from places like Reddit without registering sarcasm — remember when it suggested people eat rocks for minerals, or put glue in their pizza? (Someone on Reddit had once joked about glue in pizza, which seems to be where that one came from.)

Making up fake phrases is all very low-stakes, silly fun, but it speaks to the bigger, uglier problems of AI becoming ever more enmeshed in how we use the internet. It means Google searches are somehow getting worse, and as people come to rely on these answers more and more, bad information gets out into the world and is taken as fact. Sure, AI search will get better and more accurate, but what growing pains will we endure while we're in this middle phase of a kinda wonky, kinda garbage-y, slop-filled AI internet?

AI is here, and it's already changing our lives. There's no going back; the horse has left the barn. Or, as they say, you can't lick a badger twice.
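If you want to repeat the experiment yourself, the trick is just appending the word "meaning" to an invented phrase before searching. As a minimal sketch — assuming only the standard `q` query parameter of a Google search URL, nothing about how or whether an AI Overview will trigger — the probe URL can be built like this:

```python
from urllib.parse import quote_plus

def idiom_probe_url(phrase: str) -> str:
    """Build a Google search URL asking for the 'meaning' of a phrase.

    Appending 'meaning' to a made-up phrase is what prompted the
    AI Overview behaviour described in the article. Whether an AI
    answer actually appears is up to Google, not this function.
    """
    query = f"{phrase} meaning"
    # quote_plus percent-encodes special characters and turns spaces into '+'
    return "https://www.google.com/search?q=" + quote_plus(query)

print(idiom_probe_url("You can't lick a badger twice"))
```

Opening the printed URL in a browser is all the original Threads experiment amounted to.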
