Desert Oasis Healthcare Celebrates 45 Years of Serving the Coachella Valley and High Desert
PALM SPRINGS, CA / ACCESS Newswire / July 7, 2025 / Desert Oasis Healthcare (DOHC) is proud to celebrate 45 years of service, innovation, and commitment to the health and wellness of the Coachella Valley and High Desert.
What began as one of the first medical groups in the region in 1981 has grown into a multidisciplinary network of compassionate providers and care teams serving more than 60,000 members. DOHC has continuously evolved to meet the changing needs of the community, integrating cutting-edge tools such as artificial intelligence into everyday practice, while staying true to its mission: delivering high-quality, patient-centered care with a personal touch.
'Our vision has always been to deliver excellent care with compassion. For 45 years, we've stayed true to that mission while embracing new tools and smarter ways to support our patients,' said Dr. Marc Hoffing, Medical Director of DOHC.
Throughout 2025, DOHC is marking this milestone with a series of community initiatives, including increased charitable support for local nonprofits, health and wellness education campaigns, and a signature celebration event on July 1, 2026, the date of the organization's actual 45th anniversary. Additionally, keep an eye out for snapshots of the last 45 years shared online throughout the year, as we look back at our work with pride. We also continue to tell the stories that matter most through original content like The Pulse, a health-focused TV program created to inspire and inform viewers across the region.
As DOHC looks ahead, our leadership remains focused on expanding access to care, embracing innovation, and staying deeply connected to the community we serve. We hope you'll celebrate this milestone with us, because every step of this journey has been made possible by the trust and support of the community we proudly serve.
About Desert Oasis Healthcare
Formed in 1981 as one of the first medical groups in the desert communities of Southern California, Desert Oasis Healthcare (DOHC) continues to advance with changes in the healthcare market. DOHC provides primary and immediate care, home health, palliative care, clinical research studies and other services to more than 60,000 members/patients living in the greater Coachella Valley and the Morongo Basin of Riverside and San Bernardino counties. The multidisciplinary and comprehensive care programs of DOHC are committed to educating individuals on preventive health care in their daily lives, reflected in the DOHC motto, 'Your Health. Your Life. Our Passion.' For more information, visit www.mydohc.com.
Contact:
Rob Banchich
Director of Marketing
Desert Oasis Healthcare
[email protected]
###
SOURCE: Desert Oasis Healthcare