Tijuana River pollution impacts air quality in San Diego: study

Yahoo | 5 days ago

SAN DIEGO (FOX 5/KUSI) — Local researchers with the University of California San Diego and Scripps Institution of Oceanography examined how pollutants in wastewater travel along the San Diego coastline.
Researchers took samples of water and air from the U.S.–Mexico border to the Scripps Pier in La Jolla and found a mixture of illicit drugs, chemicals from tires and personal care products in the air. The study found that pollution from the Tijuana River is affecting air quality.
'This is a new route of inhalation exposure, people who are exposed to it are breathing it in and that's hard to control. We can control people not going into the beach, but now how do we control in the air,' said Jonathan Slade, an assistant professor of chemistry at UC San Diego.
Imperial Beach and Border Field State Park are seeing the most impact from the airborne chemicals, Slade said. The research shows the chemicals were also found in the air in La Jolla, but at lower levels than in the South Bay.
The samples were collected in 2020, but researchers say little has changed in how sewage released from the river is processed.
'The Tijuana River is a very dynamic environment with implications for public health,' lead author Adam Cooper said. 'Ours is one of the most comprehensive studies to date investigating water-to-air transfer of these pollutants.'
Copyright 2025 Nexstar Media, Inc. All rights reserved. This material may not be published, broadcast, rewritten, or redistributed.

Related Articles

Modesto playground for disabled children still pending 3 years later

CBS News | 19 minutes ago

MODESTO — A long-awaited inclusive playground in Modesto remains incomplete more than three years after its groundbreaking, despite millions of dollars already invested.

Known as The Awesome Spot, the playground at Beyer Community Park was designed to accommodate children of all abilities, including those with disabilities. Planning began nearly a decade ago, with a groundbreaking held in 2021.

Project designer Chad Kennedy said the playground was initially estimated to cost around $2.5 million when first proposed in 2016. Since then, the project has already surpassed $3.5 million in spending, with $1.5 million contributed by the City of Modesto.

"We build it with what money we have," Kennedy said. "And then we keep the wheels rolling and try to find more money to keep it going."

The city has signed an agreement to take over maintenance once the playground is completed. But Kennedy says an additional $3.5 million to $5 million is still needed to finish construction.

Residents living near the site say they've watched years of sporadic work with little communication. "I'm all for helping kids with disabilities," said neighbor Kenneth Morris. "But the planning seemed to be a little lacking."

The Awesome Spot team is currently transitioning to a new nonprofit partner in hopes of bringing in more national support. Donations can be made at the project's official website.

Apple's most underrated app could change soon, and you're going to love it

Digital Trends | 34 minutes ago

Apple's Shortcuts app is a power user's dream. I think it's one of the most underrated features you can find on an iPhone, and even on Macs. In case you haven't used it yet, it allows you to perform a multi-step task in one go, or even trigger certain actions automatically.

One of my favorite shortcuts instantly generates a QR code for a Wi-Fi network instead of making me read out a complex password. I've got another one that automatically deletes screenshots after 30 days. There are a few in my library that trigger Do Not Disturb mode for a certain time slot, turn any webpage into a PDF, snap Mac windows into place, and activate my smart devices when I reach home.

All that sounds convenient, but creating those shortcuts isn't a cakewalk. The UI flow and action presets can overwhelm even tech-savvy users when it comes to creating their own automations. Apple may have a user-friendly solution, thanks to AI, and you just might get it this year.

Apple has the foundation ready

According to Bloomberg, Apple is preparing an upgraded version of the Shortcuts app that will put AI into the mix. 'The new version will let consumers create those actions using Apple Intelligence models,' says the report. The AI models could be Apple's own, which means they would be better suited for integration with system tools and apps than a third-party AI model.

Take, for example, the Siri-ChatGPT integration. OpenAI's chatbot can handle a wide range of tasks that Siri can't accomplish, but ChatGPT isn't able to interact with other apps and system tools on your iPhone. That means it can't assist you with making cross-app shortcuts either.

At WWDC 2025, Apple is rumored to reveal its own AI models and open them to app developers as well. The idea is to let developers natively integrate AI-driven features in their apps without having to worry about security concerns. Microsoft is already using in-house AI models for a wide range of Copilot experiences on Windows PCs, and the company also offers its Phi family of open AI models to developers for building app experiences.

Apple just needs to follow in Microsoft's footsteps. With developers adopting Apple's AI foundations and the company extending them to the Shortcuts app, creating multi-step workflows would become much easier. How so? Well, just look at Gemini on Android phones.

Shortcuts needs an AI makeover

Imagine simply describing a workflow to Siri and having it turned into a shortcut. That's broadly what AI tools are already capable of, but instead of creating a rule for the future, they just execute the task at hand immediately. With AI in Shortcuts, it could go something like this: 'Hey Siri, create a shortcut that automatically replies to all messages I get on weekends to say I'm unavailable and asks people to reach me again on Monday. Trigger the action when I say the words I'm out.' With the natural language processing of modern AI models, that's feasible.

Look no further than how Gemini works on Android devices, especially those with on-device Gemini Nano processing. With a voice command, Gemini can dip into your workspace data and get work done across Gmail, Docs, and other connected apps. It can even handle workflows across third-party apps such as WhatsApp and Spotify. The list keeps growing, and as capabilities like Project Mariner and Astra roll out through Gemini Live, new possibilities will open up. With a revamped Shortcuts app, Apple just needs to get the voice processing right and convert the prompts into actionable commands.

Apple's partner, OpenAI, already offers a feature called Operator that can autonomously handle tasks on the web. Creating a chain of commands across mobile apps running locally should be easier and less risky than browsing websites. With ChatGPT's language chops already baked into the heart of Apple Intelligence, I wouldn't be surprised if the next-gen Shortcuts app exploits it to the fullest.

Oh hey, here's a sample

Speaking of ChatGPT and its integration with iOS, there's already an open-source project that gives a rough idea of how voice commands can turn into actions on an iPhone. Rounak Jain, an iOS engineer at OpenAI, has created an AI agent that transforms audio prompts into actions on an iPhone.

🚨🤖 Today, I'm launching an AI agent that gets things done across iPhone apps. It's powered by OpenAI GPT 4.1 and is open source. Try it out! — Rounak Jain (@r0unak) June 1, 2025

Jain says the demo is built atop OpenAI's GPT-4.1 model, and it can get work done across multiple apps with a single voice command. For example, it can control the flashlight from Control Center, take and send a picture to one of the user's contacts, or text travel details and book a cab.

Jain's demo is a clear sign that integrating an AI model at the system level, or having it perform tasks across apps, is feasible. A similar pipeline could be used to turn those voice commands into shortcuts instead of executing them immediately.

I'm just hoping that when Apple brings AI to Shortcuts and lets users create their own routines with natural language commands, it offers a flow where users have the flexibility to modify them at will. I believe the best approach would be to show users the chain of commands and let them make adjustments before the prompt is turned into a shortcut.
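To make that idea concrete, here is a minimal sketch, assuming the voice prompt has already been transcribed to text and using the publicly available OpenAI Python SDK. This is not Apple's or OpenAI's actual Shortcuts integration; the plan_shortcut helper and its JSON format are hypothetical. The point is simply that a GPT-4.1-class model can return a structured, reviewable plan rather than execute anything, matching the review-before-save flow described above.

```python
# Hypothetical sketch: turn an already-transcribed voice request into a
# reviewable shortcut plan via the OpenAI Python SDK. Not Apple's API.
import json

from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

SYSTEM_PROMPT = (
    "Turn the user's request into a JSON object with two keys: "
    "'trigger' (a short phrase describing when the automation runs) and "
    "'steps' (an ordered list of plain-language actions). Respond with JSON only."
)

def plan_shortcut(spoken_request: str) -> dict:
    """Return a structured plan for the user to review, not to execute."""
    response = client.chat.completions.create(
        model="gpt-4.1",
        response_format={"type": "json_object"},  # keeps the reply parseable
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": spoken_request},
        ],
    )
    return json.loads(response.choices[0].message.content)

if __name__ == "__main__":
    plan = plan_shortcut(
        "Reply to all messages I get on weekends to say I'm unavailable "
        "and ask people to reach me again on Monday. Trigger it when I say "
        "the words 'I'm out'."
    )
    # Show the chain of commands before anything is saved or run,
    # mirroring the review step suggested above.
    print(json.dumps(plan, indent=2))
```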
