FDA and OpenAI to collaborate on AI drug evaluation


Al Bawaba, 08-05-2025

Published May 8th, 2025 - 09:29 GMT
ALBAWABA - The Food and Drug Administration (FDA), the US agency responsible for protecting public health by ensuring the safety, efficacy, and security of drugs, biological products, medical devices, the food supply, cosmetics, and radiation-emitting products, has announced plans to collaborate with OpenAI, the American artificial intelligence research organization, to evaluate drugs and medications using AI.
The FDA plans to collaborate with OpenAI with the aim of accelerating the drug development and evaluation process.
The project, cderGPT, is an artificial intelligence (AI) tool developed by the Center for Drug Evaluation and Research (CDER), which regulates prescription and over-the-counter drugs in the United States. Notably, associates from Elon Musk's Department of Government Efficiency (DOGE) were also reportedly involved in the discussions.
As AI becomes increasingly integrated into all aspects of daily life, the FDA and OpenAI aim to accelerate drug development and evaluation both in the United States and globally, with the potential to make the process faster and more efficient.
However, experts have raised concerns about the reliability and oversight of AI models, particularly in high-stakes areas such as medicine.
© 2000 - 2025 Al Bawaba (www.albawaba.com)


Related Articles

Musk claims that Trump is named in the Epstein files

Al Bawaba, 10 hours ago

Published June 6th, 2025 - 02:12 GMT

India, June 6 -- Elon Musk made a bombshell claim on Thursday, alleging that President Donald Trump has not made the Jeffrey Epstein files public because he is named in them. "Time to drop the really big bomb: @realDonaldTrump is in the Epstein files. That is the real reason they have not been made public. Have a nice day, DJT!" Musk said on X, the platform formerly known as Twitter. "Mark this post for the future. The truth will come out," the Tesla CEO added. Published by HT Digital Content Services with permission from Hindustan Times. Copyright HT Media Ltd.

Visa Cash App Racing Bulls (VCARB) Formula One™ Team Accelerates Racing Car Design with Neural Concept's Engineering AI

Al Bawaba, 21 hours ago

Visa Cash App Racing Bulls Formula One™ Team has deployed Neural Concept, the world's leading AI platform for engineering design, to accelerate the team's car design and optimize aerodynamic performance through AI-powered, data-driven engineering workflows that enable faster design iteration and better-informed decisions.

Neural Concept's proprietary Engineering AI platform complements traditional Computational Fluid Dynamics (CFD) with high-speed predictive simulations. Engineers can use digital twins to evaluate thousands of design variants across complex 'multi-physics' environments that mimic real-world track conditions such as wind and temperature. This enables VCARB to explore more designs, unlocking new performance gains within every design cycle.

Laurent Mekies, Team Principal, Visa Cash App Racing Bulls, said: 'In Formula One, every millisecond counts and innovation at the design stage can be the difference between leading the pack or falling behind. By integrating Neural Concept's cutting-edge Engineering AI into our aerodynamic development, we're unlocking new levels of speed and precision in our design process. This partnership allows us to explore more design variants, ultimately giving us a competitive edge where it matters most.'

Pierre Baqué, CEO and co-founder of Neural Concept, said: 'Formula One is the ultimate proving ground for Engineering Intelligence—where engineering decisions are pushed to their limits and every performance gain counts. At Neural Concept, our mission is to revolutionize engineering with deep learning and unlock a new symbiotic collaboration between human expertise and AI's analytic speed and power. This partnership with Visa Cash App Racing Bulls demonstrates how AI-driven design workflows can turn weeks of iteration into days, helping teams move faster, explore further, and stay ahead in the most competitive engineering environment on the planet.'
Neural Concept's platform is trusted by over 70 Original Equipment Manufacturers (OEMs) and Tier 1 engineering teams around the world, including Bosch, General Motors, Airbus, and OPmobility, and integrates seamlessly with partner engineering solution ecosystems including NVIDIA and Siemens.
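The surrogate-modelling idea described above, using a fast learned predictor in place of full CFD runs so that thousands of design variants can be ranked cheaply, can be sketched in a few lines. This is an illustrative toy, not Neural Concept's actual platform: the analytic "surrogate" function, the design parameters, and their ranges are all invented for the example.

```python
# Illustrative sketch: ranking many design variants with a cheap surrogate
# instead of expensive CFD runs. The "surrogate" here is a toy analytic
# function standing in for a trained neural-network predictor.

import random

def surrogate_downforce(wing_angle: float, ride_height: float) -> float:
    """Toy stand-in for a learned aero predictor (hypothetical).

    Peaks near a 12-degree wing angle and a 30 mm ride height."""
    return -((wing_angle - 12.0) ** 2) - 0.5 * ((ride_height - 30.0) ** 2)

def explore_designs(n_variants: int, seed: int = 0):
    """Score n_variants random designs and return the best (score, angle, height)."""
    rng = random.Random(seed)
    best = None
    for _ in range(n_variants):
        angle = rng.uniform(0.0, 25.0)    # hypothetical parameter range
        height = rng.uniform(10.0, 60.0)  # hypothetical parameter range
        score = surrogate_downforce(angle, height)
        if best is None or score > best[0]:
            best = (score, angle, height)
    return best

score, angle, height = explore_designs(10_000)
print(f"best variant: angle={angle:.1f} deg, ride height={height:.1f} mm")
```

Because each surrogate evaluation is nearly free compared with a CFD run, the loop can afford to sweep the whole design space; in practice only the top-ranked candidates would then be verified with full-fidelity simulation.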

Blind Spots and Wishful Thinking: Why Data Resilience Needs a Reality Check

Al Bawaba, 21 hours ago

For years, many organizations have been guilty of putting data resilience on the back burner. Over time, however, the rising tide of threat levels, regulations, and best practices has lifted all boats. Resilience is now firmly on the agenda.

Time for a rethink

Awareness is only half the battle; preparedness is another matter. Now that industry benchmarks have improved so that organizations have a better idea of what to look for, they are waking up to an uncomfortable fact: they aren't as prepared as they ought to be. The Veeam report on data resilience among larger enterprises, produced in collaboration with McKinsey, found that key aspects of cyber resilience, even old-hat fundamentals like 'People and Processes', were regularly self-reported as significantly lacking.

How did we get here? And how can organizations shore up these shortcomings? For C-suite decision-makers, resilience perhaps isn't the most exciting or obviously business-critical concern. Historically, it was often lumped in with general cybersecurity and assumed to be already in place. Unfortunately, like most contingencies, the true value of data resilience can't be appreciated until things go wrong. Aside from the CISO, chief execs would often treat backup and recovery processes like an airbag: forget it's there at all until you're involved in an incident, and then suddenly you're thanking your lucky stars you had it.

With law enforcement cracking down on some of the most prominent groups, including the likes of BlackCat and LockBit, there might have been an assumption that cyberattacks as a whole are trending down. But the reality couldn't be further from the truth. In the last year alone, 69% of companies faced an attack at one point or another, yet 74% still fell short of data resilience best practices. The threat is only evolving, with smaller groups and so-called 'lone wolf' attackers stepping into the gap. And with a new subsection of attackers comes a new set of methods, with faster data exfiltration attacks on the rise.

The writing's on the wall

The same Veeam report, in collaboration with McKinsey, revealed that 74% of participating enterprises lacked the maturity needed to recover quickly and confidently from a disruption. While cyber resilience gaps are often a case of 'not realising before it was too late', in this case many of these deficiencies were self-reported. But if organizations are aware, why haven't they plugged these gaps?

For some, it could be down to the simple fact that they've only just realized. The recent wave of EU-focused regulations, notably NIS2 and DORA, has spotlighted the issue by requiring organizations to up their resilience across the board. In the build-up to compliance deadlines over the last year, organizations had to critically assess their full data resilience, many for the first time, revealing a number of previously unknown blind spots.

But no matter how they discovered their gaps, organizations did not fall behind overnight. For many, it happened incrementally, with data resilience standards failing to keep up as new technologies and applications were adopted. With most organizations implementing AI at will to stay ahead of the competition and optimize business processes, the impact on their data profiles has gone largely unnoticed. The sheer amount of data needed and generated by these applications has resulted in sprawling data profiles that fall far outside existing data resilience measures.

Combine this with an underdeveloped understanding of modern data resilience, and you've got a recipe for disaster. It's often a case of 'you don't know what you don't know'. As a result, many organizations have been benchmarking themselves against the wrong yardsticks. Take your standard tabletop exercise: sure, it's better than nothing, but data resilience can't be measured on paper. In theory, their processes might work, but in reality it's a whole other story.

A step in the right direction

So, what's next? Rather than waiting for an incident to come along and put them to the test, organizations need to get comfortable with being uncomfortable. That means proactively uncovering and addressing gaps, however uneasy it might make them.

The first step for any organization with below-par data resilience should be to gather a clear picture of its data profile: what you have, where it's stored, and why you do or don't need it. With this, you can reduce at least some of your data sprawl by filtering out any obsolete, redundant, or trivial data to focus on securing the data you actually need. Then, get to work securing it.

But the work doesn't stop there. Once you've got your shiny new data resilience measures in place, it's time to stress test them. And not just once. Data resilience measures need to be consistently and comprehensively tested to push them to their very limits, much like the real thing: cyber-attackers won't just stop when your systems start to creak a little, and they won't wait for the perfect moment. Run through scenarios where key stakeholders are on annual leave, or where security teams are occupied with something else entirely, to expose all of the potential gaps in your measures. It might seem excessive, but otherwise the first you'll hear about these vulnerabilities will be during or following a real attack.

It's a significant piece of work to undertake, but data resilience is worth every penny. According to the Veeam report, in collaboration with McKinsey, companies with advanced data resilience capabilities have 10% higher annual revenue growth than those lagging behind. That's not to say that improved data resilience will magically boost these figures for you, but bringing up your data resilience standards will inevitably have a knock-on effect on processes across the board.
At the very least, you can be sure that cyberthreats will only grow more complex, and that data footprints won't be getting smaller any time soon. It's an issue that every organization will have to face, so jump in the deep end now before you get pushed beyond your limits by a cyber-attack.
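The "filter out obsolete, redundant, or trivial data" step described above can be sketched as a simple pass over a data inventory. Everything here is a hypothetical example, not taken from the Veeam report: the record fields, the three-year obsolescence threshold, and the classification rules are all assumptions made for illustration.

```python
# Illustrative sketch: classifying a data inventory so that only data
# worth protecting feeds into the resilience plan. All fields and
# thresholds are hypothetical examples.

from datetime import date

OBSOLETE_AFTER_DAYS = 365 * 3  # assumption: untouched for ~3 years

def classify(record: dict, today: date) -> str:
    """Return 'keep', 'redundant', 'obsolete', or 'trivial'."""
    if record.get("duplicate_of"):
        return "redundant"
    if (today - record["last_accessed"]).days > OBSOLETE_AFTER_DAYS:
        return "obsolete"
    if not record.get("business_critical") and record["size_mb"] < 1:
        return "trivial"
    return "keep"

inventory = [
    {"name": "orders_db", "last_accessed": date(2025, 6, 1),
     "size_mb": 5000, "business_critical": True, "duplicate_of": None},
    {"name": "old_logs", "last_accessed": date(2020, 1, 1),
     "size_mb": 200, "business_critical": False, "duplicate_of": None},
    {"name": "orders_db_copy", "last_accessed": date(2025, 6, 1),
     "size_mb": 5000, "business_critical": False,
     "duplicate_of": "orders_db"},
]

# Only data classified as 'keep' needs full resilience measures.
to_protect = [r["name"] for r in inventory
              if classify(r, date(2025, 6, 6)) == "keep"]
print(to_protect)
```

In a real environment the inventory would come from discovery tooling rather than a hand-written list, but the principle is the same: shrink the protected surface before investing in backup and recovery for it.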
