Eligible families to receive $120 per child in food aid as schools close for summer

Yahoo · 11-06-2025
What the Ohio Department of Job and Family Services calls 'Sun Bucks' will be given to eligible Ohio children throughout the month of June.
Ohio Department of Job and Family Services (ODJFS) Director Matt Damschroder announced that the Summer Electronic Benefit Transfer (EBT) Program for Children will provide assistance to families this summer, according to a media release.
Last year, ODJFS gave $144 million in food assistance to over 1.2 million Ohio children, according to the release.
The federal program assists eligible families with school-aged children while schools are closed for the summer.
The Ohio Department of Education and Workforce administers this program as a partner of ODJFS.
'Whether it's during the school year or the summer months, Ohio is focused on student wellness to ensure children are nourished and ready to learn, grow, and achieve,' Ohio Department of Education and Workforce Director Stephen Dackin said.
Eligible families will receive $120 for each school-age child over the next month, according to the release.
'Sun Bucks helps to reduce the lack of access to nutritious food for children over the summer and promotes the importance of nutrition and healthy meals,' Dackin said.
Families enrolled in the Supplemental Nutrition Assistance Program (SNAP) or cash assistance (Ohio Works First), as well as income-eligible families receiving Medicaid benefits, will automatically receive the Summer EBT benefits, according to the release.
Families who receive SNAP benefits can expect the money to be loaded onto their Ohio Direction Card.
Those who received 2024 summer benefits will have 2025 benefits loaded onto their existing card, while new recipients will receive a benefit card by mail, according to the release.
The money can be used to buy food at grocery stores, farmers markets, and other authorized retailers.
Families not automatically eligible can learn more about the application process at https://sebt.ohio.gov/

Related Articles

Budget office says GOP's ‘big beautiful bill' will make rich richer, poor poorer
The Hill · 2 days ago

The Republicans' 'big, beautiful bill' will make the poorest Americans even poorer, while padding the wallets of the highest earners the most, according to a new analysis released Monday by Congress's budget arm.

The assessment, conducted by the Congressional Budget Office at the request of top Democrats, found that the top 10 percent of earners in the country will see an average boost of $13,600 per year over the next decade as a direct result of provisions in the law, while the bottom 10 percent will see an average annual decrease of $1,200.

The report challenges the arguments made by President Trump and other Republicans that the massive domestic policy package would benefit workers at all levels of wealth and income. And it's given fuel to the attacks from Democrats that the legislation was, all along, designed to help the wealthiest people at the expense of the working poor.

'They just confirmed Trump is enriching his billionaire friends at the expense of American families,' Rep. Brendan Boyle (D-Pa.), the senior Democrat on the House Budget Committee, posted Monday on X after the CBO report was released. 'It is the largest transfer of wealth from working Americans to the ultra-rich in history.'

Enacted last month, the 'big, beautiful bill' was a compilation of virtually all of the major domestic policy items Trump had promised on his way to a presidential victory in November. It features an extension of the sweeping tax cuts Republicans had adopted in 2017, during Trump's first term, which were slated to expire at the end of the year, and provides a big boost in spending for border security, the military and domestic energy production.

A portion of those new federal costs was offset by steep cuts in federal programs, including Medicaid and the Supplemental Nutrition Assistance Program (SNAP), previously known as food stamps, which benefit lower-income people. The law also puts new limits on ObamaCare subsidies and adopts new caps on federal student loans, which also affect lower-income people disproportionately.

The CBO's analysis aims to gauge the cumulative effect of the various components of the law, as applied to households at differing income levels. Most workers will benefit from the law to some degree, largely due to the extension of the 2017 tax cuts, CBO found. High earners benefit the most — $13,600 for the top 10 percent, $3,200 for the next 10 percent below them — because they make the most money and tend not to receive benefits from the federal programs set to be cut. The 20 percent of workers in the middle of the income spectrum will also see a bump: between $800 and $1,200 per year over the next decade, CBO estimated.

The lowest earners, however, will see a reduction in overall resources under the new law, largely because the cuts in federal programs like Medicaid and SNAP will eclipse any benefits, including the tax cuts, elsewhere in the bill. That negative trend is expected to hit those in the bottom 20 percent of earners, CBO said, resulting in a $1,200 reduction for the lowest 10 percent of incomes, and a $400 reduction for the 10 percent directly above them.

Republicans have dismissed the CBO's projections in the past, arguing that they fail to take into account the broad economic boost provided by the tax cuts — a 'dynamic' benefit the Republicans say reaches people of all income levels.

Two More States Look To Ban Junk Food From SNAP Benefits
Newsweek · 2 days ago

South Carolina and Tennessee are looking to join a throng of states that have placed restrictions on what can be bought using Supplemental Nutrition Assistance Program (SNAP) benefits.

Why It Matters

Since the Trump administration began in January, 12 U.S. states have moved to ban unhealthy foods and drinks from being bought using SNAP benefits. Such rule changes, known as waivers, need to be approved by the U.S. Department of Agriculture (USDA), which oversees the program.

Supporters of restricting SNAP purchases argue that removing unhealthy foods from the program will lead to better health outcomes, and the push to limit what can be bought has been spearheaded by the Make America Healthy Again (MAHA) movement. However, critics contend it dictates how low-income Americans eat and ignores broader issues around access to affordable, nutritious food.

If the waiver requests are approved in Tennessee and South Carolina, it could impact nearly 1.3 million SNAP recipients across both states.

Tennessee

On August 8, Governor Bill Lee announced he would seek a waiver from the USDA to eliminate sugary foods and drinks from being bought using SNAP benefits. The waiver would exclude items listing sugar, corn syrup, high-fructose corn syrup, or a similar caloric alternative as the primary (first) ingredient, as well as carbonated sweetened beverages in which carbonated water and sugar, high-fructose corn syrup, or a similar caloric alternative are the first two ingredients.

"Tennessee is leading the nation in creating innovative solutions to enhance quality of life, and I'm proud to continue our legacy of responsible fiscal stewardship while also delivering nutritious food choices for hard-working families," he said in a press release. "I'm grateful to the Trump Administration for its leadership to Make America Healthy Again, and thank our grocery retailers, convenience stores, food producers, and beverage manufacturers for working to ensure that healthier choices reach every community across our state."

However, Lee also plans to expand benefit rules in other areas. Plans include allowing SNAP recipients to purchase hot prepared chicken, "including rotisserie and non-fried, non-breaded items like grilled chicken tenders – offering convenient, healthy meal solutions."

South Carolina

South Carolina Governor Henry McMaster is also looking to limit SNAP purchases, although it is currently unclear exactly which foods will be restricted.

"America is getting healthy, and South Carolina will do her part," McMaster posted on X on August 6. "In the next few days, I will issue an executive order directing the Department of Social Services to place common-sense limits on purchases made using SNAP benefits, formerly known as 'food stamps.'"

Like Tennessee, this will need to be done via a waiver request to the USDA. Newsweek has contacted McMaster's office via email for an update.

SNAP Restrictions Across The U.S.

So far this year, 12 states have had waivers approved that limit what SNAP users can buy: Arkansas, Colorado, Florida, Idaho, Indiana, Iowa, Louisiana, Nebraska, Oklahoma, Texas, Utah, and West Virginia. Beginning in 2026, these waivers, each with its own rules, will prohibit certain foods from being purchased with electronic benefit transfer (EBT) cards, which are reloaded monthly for use at participating grocery stores nationwide.

The states' decisions have been welcomed by USDA Secretary Brooke Rollins. "It is incredible to see so many states take action at this critical moment in our nation's history and do something to begin to address chronic health problems," Rollins said in a press release issued on August 4. "President Trump has changed the status quo, and the entire cabinet is taking action to Make America Healthy Again. At USDA, we play a key role in supporting Americans who fall on hard times, and that commitment does not change. Rather, these state waivers promote healthier options for families in need."

What Happens Next

Each of the currently approved waivers will go into effect at various points in 2026, meaning there will be no immediate changes for SNAP beneficiaries in the affected states for now.

Smart Glasses Revolution: Inside the biggest tech trend of the next 10 years
Tom's Guide · 2 days ago

Ever since I sprinted across Las Vegas in 2017 to pick up a pair of Snapchat Spectacles from a vending machine, smart glasses have changed drastically over the last eight years. From glorified camera glasses and a wearable external monitor, all the way to an AI-infused pair of specs, we've been through it all to make it to this very moment – and the moment we're in is an interesting one.

That's because we can all see what we want our smart glasses to be, but only in something significantly bigger: VR headsets. Currently, these are very different devices, running along two parallel trajectories of development. But after speaking to Snap, Qualcomm and more, it's clear that the race is on to find the middle ground between these two — to be first to a truly AI-infused augmented spatial future of wearables.

With significant developments tackling the key challenges, this 10-year race could very well produce the device that kills the smartphone and becomes the next big thing. Every big company you know is in the running, with Meta's Project Orion prototype charging into the lead, Android XR and Snap's new consumer specs catching up, and even Apple 'hell-bent' on making its own glasses. Let's take a look at where we are now, why smart glasses are indeed the next big thing, and what it will take to get there.

If you take a look at the best smart glasses you can buy right now, you've got two categories: AI and AR specs.

AI glasses like the Ray-Ban Meta Smart Glasses bring the power of multi-modality to something that is super wearable. And you can see the real benefits they bring — from asking quick questions like your standard smart assistant to detailed prompts about the world around you. In fact, sales of Ray-Ban Meta glasses so far this year have more than tripled compared to the same time last year, which is more than 200% growth. That's according to EssilorLuxottica, which owns smart glasses brands like Ray-Ban and Oakley.

For me, they really come into their own when I'm travelling. Putting ingredients on a counter and asking for a recipe of what to cook is always a massive help; live translation is a huge move to bridge the gaps of understanding; and asking for more information on historic locations gives you new context like a tour guide.

Then you've got AR glasses — essentially a portable external monitor that has been shrunk down into a pair of glasses. With the micro-OLED display tech projecting into prisms in front of the lenses, you can get a 100+ inch display wherever you go. That is huge for long-distance travel. Something like the Xreal One Pro specs really comes in clutch for reducing the neck strain of looking down at my laptop or Steam Deck. Those prisms don't make them great for walking around with, but they are the best realization of a screen in your glasses right now.

And the ever-increasing capability to simulate an ultra-wide display or use depth-of-field tracking tech (known as 6DoF) to anchor something in place is a signifier of far greater capabilities going forward. I mean, just take a look at the spatial widgets announced in visionOS 26 — with 6DoF, that is possible with glasses.
It's clear that while Apple Vision Pro opened the door to spatial computing, a whole lot of software, from Cupertino's AR play to SnapOS and even Meta's OS in the Quest 3, offers a preview of what you will get in glasses. Or, if you wanted to go even more 'tin foil hat conspiracy' with me, I'd argue that the new Liquid Glass design motif of Apple's software is subtly training us to get used to smart glasses. That transparency does make things a little harder to read, but users will adapt — just in time for new specs.

But the end goal is far greater than that. The mission for the future is to bring both AR and AI together, as the possibilities are huge. Removing the smartphone from the equation to ensure someone is present in the moment is the pinnacle of the digital detox movement that is starting to happen — smart glasses that bring both AI and AR to the table are key to this.

'I am somewhat worried my kids think I look like this,' said Scott Myers, VP of hardware engineering at Snap Inc., holding up a smartphone to his face and talking about how phones have become distraction devices. 'Specs are the next generation of computing, and they're a powerful, wearable computer in a lightweight glasses form factor. And because they naturally integrate digital experiences with the physical world and enable me to look up at the world, I'll stop pulling out my phone so much, or maybe I don't need to take my tablet with me on trips anymore.'

Imagine that same recipe situation as above, but with an image-based guide supplementing it, too. Or that same moment of discovering historical monuments, but having map pins identify every single one to visit. While all these companies have their own ideas of what the dream smart glasses are, all agree that there are fundamental challenges to be solved here.

Displays need to get better

Right now, you've got a pick of two ways to do this: a glass prism that an OLED picture is projected into (commonly called a 'bird bath' and seen in the Viture Luma Pros), or a particular section of the glasses lens etched to refract light from the arm (known as a 'waveguide'). Bird baths have the better, wider picture quality, but the glasses have to be slightly bigger to house them — looking like the spy glasses you get at the Scholastic book fair. Meanwhile, the waveguide is certainly a lot more subtle, but being the size of a miniature postage stamp on one lens does lead to the display being way smaller and worse in quality.

But companies like Lumus are quietly working on this in the background, and working with a lot of big names in the industry. The secret sauce is reflective waveguides. 'With the geometric waveguide lenses we're making, you can get a far wider field of view while not compromising on the picture quality or brightness needed to see it in daylight,' said David Goldman, VP of marketing and communications at Lumus. 'Not only that, but with the liquid crystal display potential, you can actually improve a person's vision too.'

The challenge is to get the best of both worlds here — ditching the bird baths to provide full clarity of the world around you like a regular pair of glasses, while still offering that same level of screen quality for both full immersion and augmenting your surroundings.

Break the reliance on other devices

This comes down to one thing: getting a chip powerful enough for everything to run entirely on the glasses, without any need to connect to another device.
At the moment, we're either limited to AR glasses with a chip that tricks your laptop into thinking you have a 32:9 ultrawide monitor on your face (typing this on my ultrawide Xreal Ones right now on a plane), or a fast but limited chip that keeps latency sort of low between making an AI request through your specs and the phone doing the heavy lifting (looking at you, Ray-Ban Metas).

Looking to the mid-term future, the answer seems to be a puck, like what you see in Meta's Project Orion – a dedicated device to fuel the experience. Other companies agree: you see this in Xreal's Project Aura, and Qualcomm believes this concept is on a spectrum.

'Some operators would love a glass that is connected directly to 5G, and we will work on that. Others want sports glasses for going on a run, and others will just want a general assistant,' said Said Bakadir, VP of product management at Qualcomm. 'I think we're gonna see that spectrum evolving from something that is minimum possible in the glass to getting rid of other devices.'

However, if smart glasses are truly going to take off, there can't be any pucks or separate devices. We need it all to work entirely on the glasses for this to be the same truly disruptive iPhone-esque moment for consumer tech.

Developers, developers, developers!

Speaking of the iPhone, you may not know this given how much of a global icon it is now, but the real breakthrough for Apple's mini slab didn't really arrive until the App Store, one year later. Opening up a platform for developers to create their own experiences gives people an ever-growing list of reasons to buy your device, and AR glasses need that moment.

So far, there hasn't really been a shared app marketplace for AR glasses like the App Store. But two things may flip this entirely on its head: Android XR bringing the Google Play Store to specs, and Snap's new consumer glasses channeling the work of devs who have created hundreds of thousands of lenses over the past few years.

'We're really here to build this with the community because it's an entirely new paradigm,' said Snap's Myers. 'There's a lot of things that will take time for people to understand and figure out. It's not just going to be, "oh, here you go, developers — come build for this!" That's not going to work in my opinion. It's a community-led discussion and I couldn't be happier with that.'

A constant stream of new apps needs to become as synonymous with the smart glasses of the future as the App Store is with the iPhone.

All-day stamina guaranteed

Batteries are not ready for prime time in smart glasses — the longevity of lithium-ion cells is always heavily compromised by the limited capacity that comes with ensuring the glasses are not too heavy on someone's face. The end result is making sure you're careful with the number of interactions you make with your Ray-Ban Meta shades at the moment. Fortunately, Meta is on the right track to improving this, with the Oakley Meta HSTN glasses effectively doubling the longevity. That being said, there's still a way to go.

What's the answer? Nobody is quite sure yet, but it seems to start with the direction smartphones are heading in: silicon-carbon. This next-generation battery tech is able to pack more power within the same space, meaning it could be a starting point to move forward. The other thing the industry has learned, just as Meta did with the Ray-Bans, is that battery life is all about calculating and optimizing the software usage down to every microwatt.
"I worked on smartphones for a very long time, said Myers. 'While the battery capacity has grown pretty consistently, it's really the way people are using the software that has gotten much better. We see the same trajectory for Snap OS.' If Ray-Ban Meta smart glasses prove one thing, it's that when it comes to AI devices, glasses are the best realization of that vision — better than Rabbit R1, better than the Humane AI Pin. But even more than that, we've seen multi-modal AI unlock some truly useful features in a pair of smart glasses. Because at the end of the day, you want your glasses to do more than tell you you're looking at a tree. 'AI will be the core intelligence layer. It will understand context, proactively assist, personalize the interface in real time. Wearables will evolve from tools into true companions — adaptive, discreet, and intuitive.' 'XR, for me, is the best interface to interacting with the digital world. What happened in the digital world is being transformed with AI. So it just happens that this AI requires multi-modality.' said Qualcomm's Bakadir Whether I'm exploring the world and want extra facts about a landmark, or I'm stuck on things to eat and want some assistance on what to make from the things in my fridge, having AI directly on your face is the most natural form factor. 'AI will be the core intelligence layer. It will understand context, proactively assist, personalize the interface in real time. Wearables will evolve from tools into true companions — adaptive, discreet, and intuitive.' said David Jiang, CEO of Viture. We've made small steps towards that with Snapdragon AR1+ Gen 1 — allowing you to run a 1-billion parameter AI model entirely locally. That is significant for the future of smart glasses, but it's only one step forward. Now the next step is moving into agentic AI and personalization — using data to train your own device around you for more proactive, more agentic assistance that can help before you even think you were going to look for help. Remember when the Apple Watch came out? The real reason for it existing didn't come until a few years in. When those sensors came into their own, it became the go-to health tracker that it is now. I feel that the moment is coming for smart glasses. The use cases are currently limited, but the moment we start sticking sensors on them, not only would you be able to track physical health, you could even track emotional health, too. 'We believe that understanding emotions is a force multiplier for AI, in terms of it being effective for you in the context of wearing glasses all day. If you want AI to be really effective for you, it's critical that it understands how you're feeling in real-time." said Streen Strand, Emteq CEO. And why wouldn't you? In a February survey by Sentio University, 96% of AI users reach out for some therapeutic advice. Sensor tech is looking like a key focal point of the future of smart glasses — fueling not just eye-tracking and hand gestures, but pairing with AI for more personalization. 'We believe that understanding emotions is a force multiplier for AI, in terms of it being effective for you in the context of wearing glasses all day. If you want AI to be really effective for you, it's critical that it understands how you're feeling in real-time." We've done this dance before. Remember Google Glass? There's a reason why the phrase 'glassholes' exists, and it's because of the social stigma that came with wearing this advanced piece of tech directly on your face. 
Every new tech category goes through a settling-in process as it disrupts common social cues, moving from seeming impolite to just being the way things are. But with display tech in smart glasses, I feel that hump of social acceptance is going to take a bit more time to clear.

A great example is the Halliday glasses, which beam a 3.5-inch projected display into your eye from the top rim of the specs. All you have to do is look up at it, which on paper is seriously impressive. However, during my time talking to people wearing them at CES 2025, the number of perceived eye-rolls I got as they looked up at the screen certainly made me feel like an inconvenience!

And then, more broadly, with the display tech of tomorrow, you'll never really know whether someone is actually looking at you. At least with current bird bath panels making for slightly larger specs, you're giving off a big enough 'do not disturb' signal. But when they disappear and the transition to waveguides happens, it will take time for society to acclimatize.

'We all lose our time to these black rectangles called smartphones, so I see waveguides on smart glasses as a great thing to just glance at when notifications roll in without taking my phone out. But my wife is always on edge about whether I am actually paying attention to her,' said Lumus' Goldman.

Then, of course, there are the privacy concerns of wearing an always-on device on your face. How do you give permission to be seen by these glasses? What does that look like? We saw these become big issues with Google Glass in the early 2010s, and with a personalized AI assistant that needs to be always running to understand you, the worries will be significant and warranted.

'It's not like smartphones in that it's passive AI. There needs to be an AI actively listening to you that memorizes your routines, your conversations, everything about your day to deliver that efficient lifestyle,' said Carter Hou, CEO and co-founder of Halliday.

I know there are significant technical challenges on the road between where we are now and 2035, but more than anything, the cultural one is going to be the bigger mountain to climb. We've already gotten over the 'wearing glasses even though you don't need to' hurdle (look at hipsters wearing spectacles with no lenses, for example) — and surely it'll be a matter of time before the technology itself becomes a social norm, rather than prompting people to ask, 'is that a camera in your glasses?'

There is a grand vision for 2035, but the future of smart glasses is a lot closer than you think. I initially thought that the race to XR was only just beginning to heat up, but in reality, it's already at fever pitch. With rumored next-gen Ray-Ban Meta smart glasses, the impending launch of Snap Specs in 2026, and let's not forget Apple being 'hell-bent on creating an industry-leading product before Meta can,' we're on the precipice of seeing the next step forward in this space.

But what makes this category so fascinating to me is that no one company has all the answers. Every dreamer in this area has one piece of the puzzle, and I do believe that in ten years' time, these will all come together to become that next category-defining product — that smartphone moment for wearable technology.
So buckle up, because it's going to be a helluva ride over the next decade.
