
Microsoft's AI model Aurora can now predict air quality with high speed and precision
Aurora has been developed by Microsoft Research to forecast a range of weather-related phenomena, such as hurricanes and typhoons, with greater precision and speed than traditional meteorological methods, the company said in a blog post earlier this week. It has also published a research paper on Aurora in the science journal Nature.
Microsoft further said that Aurora's source code and model weights are now publicly available. A specialised version of the model that produces hourly forecasts, including for clouds, has been integrated into the MSN Weather app.
The Windows maker has claimed that Aurora is one of the top-performing AI models in the field of weather forecasting. 'What sets Aurora apart is that it is originally trained as a foundation model and can then be specialized through finetuning to go beyond what is considered traditional weather forecasting, such as air pollution prediction,' Microsoft said.
'Because the model first learns from a large and diverse set of data, it can be fine-tuned with smaller amounts of air quality data,' it added.
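The pretrain-then-finetune pattern Microsoft describes can be pictured with a minimal sketch: a shared "backbone" is fitted on a large dataset, then only a small task-specific "head" is fitted on far less data. Everything below (the linear model, the synthetic data, the air-quality framing) is an illustrative assumption, not Aurora's actual code or architecture.

```python
import numpy as np

rng = np.random.default_rng(0)

# "Pretraining": fit backbone weights on a large synthetic dataset.
X_big = rng.normal(size=(10_000, 8))      # many samples, many variables
w_true = rng.normal(size=8)
y_big = X_big @ w_true
backbone, *_ = np.linalg.lstsq(X_big, y_big, rcond=None)

# "Fine-tuning": learn only a scalar head (scale + bias) for a new task,
# say an air-quality index, using a much smaller dataset.
X_small = rng.normal(size=(50, 8))        # little task-specific data
y_small = 2.0 * (X_small @ w_true) + 1.0  # new task = rescaled signal
features = X_small @ backbone             # frozen backbone features
A = np.column_stack([features, np.ones(len(features))])
(scale, bias), *_ = np.linalg.lstsq(A, y_small, rcond=None)
# The tiny head recovers the new task because the backbone already
# captured the shared structure during "pretraining".
```

The point of the sketch is the data asymmetry: 10,000 samples go into the backbone, but only 50 are needed to adapt it, mirroring Microsoft's claim that Aurora "can be fine-tuned with smaller amounts of air quality data".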
Aurora has been trained on over a million hours of data captured by satellites, radar and weather stations as well as past weather simulations and forecasts, the company said. The AI model can be fine-tuned using additional data to provide forecasts about specific weather events.
Its underlying encoder architecture helps to translate massive amounts of data drawn from multiple sources into a standard format that the AI model uses to make predictions.
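One way to picture such an encoder is a set of small adapters that map each heterogeneous source onto one common grid, so the model sees a single uniform tensor. The sources, shapes, and nearest-neighbour-style resampling below are illustrative assumptions, not Aurora's actual design.

```python
import numpy as np

COMMON_SHAPE = (4, 4)  # common grid: 4x4 lat/lon cells

def encode_satellite(grid: np.ndarray) -> np.ndarray:
    """Upsample a coarse satellite grid onto the common grid."""
    reps = (COMMON_SHAPE[0] // grid.shape[0], COMMON_SHAPE[1] // grid.shape[1])
    return np.kron(grid, np.ones(reps))

def encode_stations(readings: list[tuple[int, int, float]]) -> np.ndarray:
    """Scatter sparse station readings (row, col, value) onto the common grid."""
    out = np.full(COMMON_SHAPE, np.nan)  # NaN where no station reports
    for row, col, value in readings:
        out[row, col] = value
    return out

sat = encode_satellite(np.array([[10.0, 20.0], [30.0, 40.0]]))
stn = encode_stations([(0, 0, 11.5), (3, 3, 39.0)])
stacked = np.stack([sat, stn])  # one uniform tensor: (source, lat, lon)
```

Once every source shares the same shape, the downstream model can consume them together without per-source special cases, which is the essence of translating "data drawn from multiple sources into a standard format".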
'We're not putting in strict rules about how we think variables should interact with each other. We're just giving a large deep-learning model the option to learn whatever is most useful. This is the power of deep learning in these kinds of simulation problems,' Megan Stanley, a senior researcher with Microsoft Research, said.
Microsoft claimed that its Aurora AI model accurately predicted the landfall of Typhoon Doksuri in the Philippines four days in advance, outperforming some expert predictions. The model also successfully predicted a sandstorm in Iraq two years ago. It beat the US National Hurricane Center by providing accurate five-day forecasts of tropical cyclone paths in 2022 and 2023, according to the company.
Aurora, which draws compute power from graphics processing units (GPUs), produces weather forecasts in seconds, compared with the hours traditional weather systems running on supercomputers require.
While the initial cost involved in training Aurora was high, Microsoft said its operational expenses are lower than traditional weather forecast systems.
AI weather models like Aurora are not entirely new. Over the past few years, Google DeepMind has released several AI models designed for weather forecasting such as WeatherNext.