My smartphone was giving me migraines, so I made these 5 changes

Megan Ellis / Android Authority
Around 2022, I was diagnosed with chronic migraines. Unlike normal headaches, migraines come with a variety of neurological symptoms on top of the pain, such as nausea, light and sound sensitivity, and difficulty concentrating.
As I took the time to figure out what my migraine triggers were, a few culprits stood out, including the time spent on my smartphone. The bright screen would often trigger migraines or worsen my existing symptoms — so I had to make a few changes to reduce the impact it had.
Most of these changes also help reduce eye strain, so you can try them even if screen time doesn't give you migraines or headaches. Since changing these settings and features, I've noticed my eyes don't twitch after long periods of screen use. But the biggest effect was on my migraine symptoms: screen use triggers fewer of my migraines, and I can still use my phone during mild ones.
What do you do to reduce migraines and eye strain from looking at your smartphone?
• I set my phone to dark mode.
• I use a warm filter on my screen.
• I use a browser that forces dark mode.
• I reduce the brightness of my screen.
• Other (let us know in the comments!).
1. Enabling dark mode as the default
Megan Ellis / Android Authority
While there may be reasons you might want to avoid dark mode, the feature is essential for me. I used to use dark mode only at night when I was preparing for bed, but I've now made it the default on all of my devices.
In many ways, dark mode has become an accessibility feature for me. Without it, I couldn't use my device without risking a new migraine or making an existing one worse.
On my Oppo Reno 10 Pro+, I can also set a dark mode style: Enhanced, Medium, or Gentle. Since Enhanced is the darkest of the three, with a black background, I opted for that style.
Switching my phone to dark mode means most of my apps follow suit. For the few that didn't switch automatically, I toggled dark mode manually in each app's settings.
When I set up my S23 Ultra, dark mode was one of the first settings I enabled on my Samsung device. I also apply this mode, as well as other settings, to any device I review.
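If you've ever wondered why a few apps ignore the system toggle, it's because apps built on Android's AppCompat library decide for themselves whether to track the system setting. Here's a minimal Kotlin sketch of how an app forces dark mode regardless of that setting (the class name DarkFirstApp is just an illustration, and it assumes the androidx.appcompat dependency):

```kotlin
import android.app.Application
import androidx.appcompat.app.AppCompatDelegate

// Always render this app in dark mode, regardless of the system
// toggle. MODE_NIGHT_FOLLOW_SYSTEM would track the system setting
// instead, which is why well-behaved apps switch automatically.
class DarkFirstApp : Application() {
    override fun onCreate() {
        super.onCreate()
        AppCompatDelegate.setDefaultNightMode(AppCompatDelegate.MODE_NIGHT_YES)
    }
}
```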
2. Switching to Eye Comfort mode
Megan Ellis / Android Authority
Eye Comfort, also known as Eye Protection or Night Light (depending on the exact OS you're using), is a blue light filter available on Android smartphones. I used to use a blue light filter app on my phone, but since Android introduced the ability to customize the filter setting, I can use the feature directly through my smartphone software.
I also made sure to set Eye Comfort as the default rather than limiting it to certain hours, and I always adjust the color temperature to be as warm as possible, which keeps me comfortable when I'm experiencing light sensitivity.
Since my eyes adjust to this filter, it is not as distracting as it might seem. However, the drawback is that it can affect the color accuracy of images you're viewing. But I prefer this caveat over the alternative of having my phone cause headaches and eye strain.
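The blue light filter apps I used before this feature existed generally work by drawing a translucent, warm-tinted overlay across the whole screen. Here's a rough Kotlin sketch of that technique (the function name addWarmOverlay and the tint values are illustrative, and the app would first need the user to grant the "display over other apps" permission):

```kotlin
import android.content.Context
import android.graphics.Color
import android.graphics.PixelFormat
import android.view.View
import android.view.WindowManager

// Draw a warm, touch-transparent tint over the entire screen,
// the way third-party blue light filter apps do. Requires the
// SYSTEM_ALERT_WINDOW permission granted by the user.
fun addWarmOverlay(context: Context) {
    val overlay = View(context).apply {
        // Low-alpha orange; raising the alpha makes the filter stronger.
        setBackgroundColor(Color.argb(60, 255, 140, 0))
    }
    val params = WindowManager.LayoutParams(
        WindowManager.LayoutParams.MATCH_PARENT,
        WindowManager.LayoutParams.MATCH_PARENT,
        WindowManager.LayoutParams.TYPE_APPLICATION_OVERLAY,
        WindowManager.LayoutParams.FLAG_NOT_FOCUSABLE or
            WindowManager.LayoutParams.FLAG_NOT_TOUCHABLE,
        PixelFormat.TRANSLUCENT
    )
    val windowManager =
        context.getSystemService(Context.WINDOW_SERVICE) as WindowManager
    windowManager.addView(overlay, params)
}
```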
3. Swapping out Chrome for Brave
Megan Ellis / Android Authority
I've been meaning to move away from Chrome for a while now, since I want to be less reliant on Google apps. But being unable to use my dark mode extension in Chrome's Android app was the final push I needed to stop using it as my default mobile browser.
Instead, I've switched to Brave, which allows me to force dark mode on web pages that use a light theme by default. While many websites support dark mode, there are still a few that only have a light theme. I noticed this the most when reading news or looking up recipes.
The option to force dark mode lives in Brave's Appearance settings, where you can enable night mode. So far, I haven't run into any issues getting sites to deliver a black background and white text, which has made the change to a different browser worthwhile.
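Apps that embed web content can apply the same trick. If you're curious what this looks like under the hood, here's a minimal sketch using Android's WebView and the androidx.webkit library (the function name enableForceDark is illustrative, and on recent Android versions this API has been superseded by "algorithmic darkening"):

```kotlin
import android.webkit.WebView
import androidx.webkit.WebSettingsCompat
import androidx.webkit.WebViewFeature

// Ask the WebView renderer to darken pages that only ship a light
// theme, roughly what a browser's "force dark" or night mode does.
fun enableForceDark(webView: WebView) {
    if (WebViewFeature.isFeatureSupported(WebViewFeature.FORCE_DARK)) {
        WebSettingsCompat.setForceDark(
            webView.settings,
            WebSettingsCompat.FORCE_DARK_ON
        )
    }
}
```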
4. Manually setting brightness
Rushil Agrawal / Android Authority
While Adaptive Brightness on Android is useful, my light sensitivity often meant that the brightness my phone chose was simply too bright. This was especially a problem at night, when I want my screen at 0% brightness while I'm in bed.
The main drawback of this is a slight loss of convenience. When I go outside, my phone screen doesn't automatically adjust, and I have to use muscle memory to turn the brightness back up. But the main benefit is that my phone doesn't automatically turn the brightness back up once I've turned it down.
Sometimes I'd be using my phone when I started to feel the familiar pain at the back of my eyes and realize my screen was too bright, but by then it was already too late: the migraine cascade had started. So, manually adjusting the brightness has been worth the convenience trade-off.
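Incidentally, apps can dim their own window below the system setting, which is how some reading and bedtime apps stay comfortable at night. A minimal sketch (the function name setWindowBrightness is just an illustration):

```kotlin
import android.app.Activity

// Override the system brightness for this app's window only.
// 0.0f is as dim as the panel allows, 1.0f is full brightness;
// BRIGHTNESS_OVERRIDE_NONE would restore the user's setting.
fun setWindowBrightness(activity: Activity, level: Float) {
    val params = activity.window.attributes
    params.screenBrightness = level.coerceIn(0f, 1f)
    activity.window.attributes = params
}
```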
5. Setting my phone to its maximum refresh rate
Megan Ellis / Android Authority
While most of my changes help with eye strain in general, this one is more specific to migraines. Motion sensitivity is a common migraine symptom, and I find that certain kinds of motion on my smartphone's screen can make my nausea worse.
That's why I always ensure that my display refresh rate is set to my phone's maximum (120Hz). This higher refresh rate reduces stuttering on my screen while scrolling, which in turn helps me not feel as sick from the jittery motion.
Of course, increasing my screen's refresh rate doesn't do anything when it comes to watching videos that trigger my motion sensitivity — like videos recorded with a shaky camera. But at least I can easily scroll through my phone's settings and Reddit threads without causing issues (as long as I don't scroll too fast).
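For what it's worth, the system display setting is what matters here, but individual apps can also hint at the refresh rate they'd like. A minimal sketch of that hint (preferHighRefreshRate is an illustrative name, and 120f only takes effect if the panel and the user's display settings allow it):

```kotlin
import android.app.Activity

// Hint that this window prefers a high refresh rate. The system
// only switches modes if the hardware supports it and the user's
// display settings permit it; this is a request, not a guarantee.
fun preferHighRefreshRate(activity: Activity) {
    val params = activity.window.attributes
    params.preferredRefreshRate = 120f
    activity.window.attributes = params
}
```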
Making these changes not only reduced my eye strain and light sensitivity but also made my phone much less of a trigger for my migraines. When a migraine does hit, these changes allow me to still use my device when I'm stuck in bed and need to catch up on messages.
If you have a similar experience, or you get eye strain from screen use, I'd suggest trying out different ways to reduce the impact of screen time, both on your smartphone and your computer.
