
Latest news with #predators

The predator that's set to clip the wings of Britain's growing green parakeet population

Daily Mail

5 hours ago

  • General

Britain's burgeoning parakeet population could soon have a new nemesis. There are tens of thousands of the green birds in the UK, according to estimates, but experts claim that they will soon come under threat from goshawks. The predators feed on pigeons, grey squirrels and rats, and conservationists say they will soon be targeting parakeets. Hazel Jackson, of the UK Centre for Ecology and Hydrology, said goshawks could possibly prey on parakeets, just as peregrine falcons do now. She said: 'We do know that peregrines in London are predating on parakeets, and so in theory, and given the growing quantity of parakeets, there is a chance goshawks might take advantage of them as another source of food.'

Sightings of goshawks – which have a 5 ft wingspan and long talons – have increased on the edges of London in recent months, and experts say they will soon make their way to other British cities. The birds were virtually wiped out by the Victorians, but escaped goshawks owned by falconers began to re-establish numbers from the beginning of the 1970s. Goshawks are now living in cities across Europe, including Berlin, Amsterdam, Riga and Moscow. In the UK, there have been sightings in Sussex, Kent and Surrey, as well as on marshes on the edges of the River Thames in London.

Conor Mark Jameson, author of Looking for the Goshawk, told The Mail on Sunday he expected them to appear in parks around cities including London, Bath, Southampton, Glasgow and Edinburgh within the next few years. 'In my lifetime this bird has come back from extinction in the UK and now goshawks could become a feature of British cities in a matter of a few years,' he added. 'It's recovering in some of the Home Counties – Sussex in particular, Surrey and Kent – and reports in and over London appear to be increasing.

'Slowly but surely they will work their way into suburbia. There is superabundant food in the form of pigeons of all kinds, ring-necked parakeets, corvids, grey squirrels and even rats, and stout trees in which to nest in large gardens, hospital grounds, cemeteries and recreation areas.'

Andy Evans, of the RSPB, said that 'with continued protections' goshawks may be as common a sight in London as they are in other cities across Europe.

Zoo's request for donations of pets to be fed to predatory animals draws PETA blowback

Fox News

2 days ago

  • General

A zoo in Europe is inviting the public to turn unwanted pets into prey, calling for donations of fluffy rabbits and family guinea pigs to be euthanized and fed to its predator residents. The Aalborg Zoo in Denmark announced the program in a Facebook post, sparking public uproar, including the ire of the People for the Ethical Treatment of Animals (PETA).

"If you have an animal that, for various reasons, needs to be rehomed, you are welcome to donate it to us. The animals are humanely euthanized by trained staff and then used as feed," the zoo wrote in the social media post. "In this way, nothing goes to waste — and we ensure natural behavior, proper nutrition, and well-being for our predators."

The zoo said private individuals and businesses can donate chickens, rabbits and guinea pigs, which organizers said make up an important part of predators' diets. Specifically, the zoo said the Eurasian lynx requires "whole" prey animals that resemble what it would naturally hunt in the wild. "In zoos, we have a responsibility to replicate the animals' natural food chain — for the sake of both animal welfare and professional integrity," employees wrote in the post. The zoo is also accepting horses, noting owners "can join us all the way if you wish."

PETA weighed in on the controversial initiative, saying there is nothing "natural" about the donation request. "It's not 'natural behaviour' for predators from Asia, who roam and hunt for their meals, to be fed companion animals who originated in South America," PETA wrote in an online news release. "Companion animals are never needed to 'imitate the natural food chain of the [zoo's] animals' (as an Aalborg Zoo spokesperson described the program)." PETA added if the Aalborg Zoo "truly cares about animals," it should focus on protecting species in their natural habitats.

The zoo later turned off the comments on its Facebook post detailing the program, citing "significant international interest." "We understand that the post evokes emotions and interest, but hateful and malicious rhetoric is unnecessary — and we encourage keeping a respectful tone," leadership wrote. "We're happy to elaborate and answer questions via private messages or email." The Aalborg Zoo and PETA did not immediately respond to Fox News Digital's request for comment.

Roblox rolls out open-source AI system to protect kids from predators in chats

CTV News

3 days ago


Roblox, the online gaming platform wildly popular with children and teenagers, is rolling out an open-source version of an artificial intelligence system it says can help preemptively detect predatory language in game chats.

The move comes as the company faces lawsuits and criticism accusing it of not doing enough to protect children from predators. For instance, a lawsuit filed last month in Iowa alleges that a 13-year-old girl was introduced to an adult predator on Roblox, then kidnapped and trafficked across multiple states and raped. The suit, filed in Iowa District Court in Polk County, claims that Roblox's design features make children who use it 'easy prey for pedophiles.'

Roblox says it strives to make its systems as safe as possible by default but notes that 'no system is perfect, and one of the biggest challenges in the industry is to detect critical harms like potential child endangerment.' The AI system, called Sentinel, helps detect early signs of possible child endangerment, such as sexually exploitative language. Roblox says the system has led the company to submit 1,200 reports of potential attempts at child exploitation to the National Center for Missing and Exploited Children in the first half of 2025. The company is now in the process of open-sourcing it so other platforms can use it too.

Preemptively detecting possible dangers to kids can be tricky for AI systems — and humans, too — because conversations can seem innocuous at first. Questions like 'how old are you?' or 'where are you from?' wouldn't necessarily raise red flags on their own, but when put in context over the course of a longer conversation, they can take on a different meaning.

Roblox, which has more than 111 million monthly users, doesn't allow users to share videos or images in chats and tries to block any personal information such as phone numbers, though — as with most moderation rules — people constantly find ways to get around such safeguards. It also doesn't allow kids under 13 to chat with other users outside of games unless they have explicit parental permission — and unlike many other platforms, it does not encrypt private chat conversations, so it can monitor and moderate them.

'We've had filters in place all along, but those filters tend to focus on what is said in a single line of text or within just a few lines of text. And that's really good for doing things like blocking profanity and blocking different types of abusive language and things like that,' said Matt Kaufman, chief safety officer at Roblox. 'But when you're thinking about things related to child endangerment or grooming, the types of behaviors you're looking at manifest over a very long period of time.'

Sentinel captures one-minute snapshots of chats across Roblox — about 6 billion messages per day — and analyzes them for potential harms. To do this, Roblox says it developed two indexes: one made up of benign messages and the other of chats that were determined to contain child endangerment violations. Roblox says this lets the system recognize harmful patterns that go beyond simply flagging certain words or phrases, taking the entire conversation into context.

'That index gets better as we detect more bad actors, we just continuously update that index. Then we have another sample of what does a normal, regular user do?' said Naren Koneru, vice president of engineering for trust and safety at Roblox. As users are chatting, the system keeps score — are they closer to the positive cluster or the negative cluster?

'It doesn't happen on one message because you just send one message, but it happens because of all of your days' interactions are leading towards one of these two,' Koneru said. 'Then we say, okay, maybe this user is somebody who we need to take a much closer look at, and then we go pull all of their other conversations, other friends, and the games that they played, and all of those things.' Humans review risky interactions and flag them to law enforcement accordingly.

Barbara Ortutay, The Associated Press
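The two-index, cluster-proximity approach described above can be illustrated with a short sketch. The code below is a hypothetical reconstruction based only on this article's description: the embedding stand-in, the centroid indexes, the running per-user score, and the review threshold are all assumptions for illustration, and none of the names correspond to Roblox's actual Sentinel implementation.

```python
# Hypothetical sketch of a two-index chat-scoring pipeline, loosely modelled on
# the Sentinel description above. The embedding stand-in, centroids, threshold,
# and all names are illustrative assumptions, not Roblox's implementation.
from collections import defaultdict
from dataclasses import dataclass, field

import numpy as np


def embed(text: str) -> np.ndarray:
    """Stand-in for a real sentence-embedding model; returns a unit vector."""
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    vec = rng.standard_normal(64)
    return vec / np.linalg.norm(vec)


@dataclass
class TwoIndexScorer:
    benign_centroid: np.ndarray       # mean embedding of known-benign chat samples
    violation_centroid: np.ndarray    # mean embedding of confirmed violation samples
    user_scores: defaultdict = field(default_factory=lambda: defaultdict(float))

    def score_snapshot(self, user_id: str, messages: list[str]) -> float:
        """Score one one-minute snapshot and fold it into the user's running total."""
        if not messages:
            return self.user_scores[user_id]
        snapshot = np.mean([embed(m) for m in messages], axis=0)
        # Positive delta means the snapshot sits closer to the violation index
        # than to the benign index.
        delta = float(snapshot @ self.violation_centroid) - float(snapshot @ self.benign_centroid)
        # Accumulate over time: one borderline snapshot should not flag a user,
        # but a pattern of snapshots drifting toward the violation index should.
        self.user_scores[user_id] += delta
        return self.user_scores[user_id]

    def needs_human_review(self, user_id: str, threshold: float = 3.0) -> bool:
        """Escalate to human reviewers only once the running score crosses a threshold."""
        return self.user_scores[user_id] >= threshold


# Example flow: build the two indexes from labelled samples, then score snapshots.
benign_index = np.mean([embed(m) for m in ["nice build!", "want to trade pets?"]], axis=0)
violation_index = np.mean([embed(m) for m in ["how old are you? don't tell your parents"]], axis=0)
scorer = TwoIndexScorer(benign_centroid=benign_index, violation_centroid=violation_index)
scorer.score_snapshot("user_123", ["how old are you?", "where are you from?"])
if scorer.needs_human_review("user_123"):
    print("queue user_123 and their wider history for human review")
```

The design point mirrored here is that escalation depends on an accumulated, per-user trajectory across many snapshots rather than on any single message, which is how a question like 'how old are you?' can stay innocuous in isolation yet still contribute to a flag in context.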

