
Anyone Can Now Access the Largest Space Map Ever Created: Here's How
If you've ever wanted a better look at deep space, you now have your chance. A treasure trove of data collected by the James Webb Space Telescope is now available for public consumption. Included with the data is an interactive map that contains nearly 800,000 galaxies and various filters so you can view them in different ways.
The data was made available starting on June 5 by the Cosmic Evolution Survey out of Caltech. It consists of survey data that mapped out 0.54 square degrees of the sky. For reference, if you take a look at this month's strawberry moon, the amount of sky that was mapped is equivalent to about three full moons side by side. The total size of all the map data is about 1.5 TB.
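If you want to sanity-check that full-moon comparison yourself, the arithmetic is quick. The sketch below assumes a full-moon apparent diameter of roughly half a degree, a figure not given in the article.

```python
# Back-of-the-envelope check of the "about three full moons" comparison.
# Assumption: the full moon spans roughly 0.5 degrees on the sky.
import math

survey_area_sq_deg = 0.54                      # area mapped by the survey (from the article)
moon_diameter_deg = 0.5                        # approximate apparent diameter of the full moon
moon_area_sq_deg = math.pi * (moon_diameter_deg / 2) ** 2

print(f"One full moon covers ~{moon_area_sq_deg:.2f} square degrees")
print(f"The survey spans ~{survey_area_sq_deg / moon_area_sq_deg:.1f} full moons")
# -> about 0.20 square degrees per moon, so the survey is close to three full moons of sky
```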
Read more: Coolest Space Photos of 2025 Will Fill You With Cosmic Wonder
The map includes galaxies and stars whose light has traveled for as long as 13.5 billion years to reach us, which makes it the best look into the early universe that science has offered so far. NASA estimates that the universe is 13.8 billion years old, so those galaxies and stars are positively ancient. In all, the objects in the data cover about 98% of the known universe's history.
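That 98% figure follows directly from the two ages quoted above, as a quick check shows:

```python
# Quick check of the "about 98% of cosmic history" claim using the article's numbers.
lookback_time_byr = 13.5      # oldest light in the map, in billions of years
age_of_universe_byr = 13.8    # NASA's estimate for the age of the universe

print(f"Fraction of cosmic history covered: {lookback_time_byr / age_of_universe_byr:.1%}")
# -> roughly 97.8%, which rounds to the 98% quoted above
```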
In terms of cosmic history covered, that makes it bigger than the largest map of the Milky Way ever created, which still covers only a small slice of our own galaxy's layout. The Milky Way map remains larger in terms of raw data, however, at over 500 TB in size.
Researchers mapped the full 0.54 square degrees of the sky with JWST's Near Infrared Camera (NIRCam) and 0.2 square degrees with the Mid-Infrared Instrument (MIRI). Arguably the coolest part of the info dump is the interactive map, which loads in a web browser and lets users pan around and see everything that was included.
Clicking on a point of interest shows you data about the star or galaxy you selected.
Using the interactive map viewer
Curious individuals can check out the project's map viewer. To use the map, simply follow the link and click the "check it out!" button. Once the map loads, you'll see a square-ish image that contains over 700,000 galaxies and other objects.
Once you load the map, the best way to explore it is with the layers and filters in the top-right corner. The first box contains the available views, including NIRCam RGB and several others. The second box breaks the image up into tiles; these tiles reflect how the image was mapped, so you can see which James Webb instrument captured each segment of the map.
For educational purposes, the third box is the one to use. Its options outline the objects of interest. Clicking on one brings up a catalog ID (which lets you search for that object again later), along with the raw images taken of it and additional data points like light wavelength.
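If you later grab the catalog itself (see the data section below), those IDs can also be used to pull up the same object programmatically. Here's a minimal sketch using the astropy library; the filename and column name are placeholders, not the catalog's actual schema.

```python
# Hypothetical example: look up a map-viewer object in the downloaded catalog.
# "cosmos2025_catalog.fits" and the "ID" column are assumptions -- check the
# COSMOS2025 documentation for the real file and column names.
from astropy.table import Table

catalog = Table.read("cosmos2025_catalog.fits")  # may need hdu=1 for some files
print(catalog.colnames)                          # inspect the columns that actually exist

target_id = 123456                               # an ID noted from the map viewer
match = catalog[catalog["ID"] == target_id]      # boolean mask selects the matching row
print(match)
```

Printing the column names first is the safest move, since it shows exactly which fields the released catalog provides before you start filtering.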
The tools in the top left handle search and configuration, like increasing the brightness and changing the hue to make some objects easier to view. Beyond that, your best bet is to just look around and find all the cool stuff. At any point, you can reset all of the settings by refreshing the browser window.
How to access the data
The data is accessible in a couple of different ways. The COSMOS2025 project published three research papers on the data it collected. The first is a catalog of everything that was observed; the other two focus on the near-infrared and mid-infrared imaging used to obtain the data.
You can obtain the data by filling out this form. Once that's done, you'll be able to download it either as a single, large archive or as individual tiles if you prefer. More advanced instructions are available from the COSMOS2025 project website if you need them, and more data is available here if you want to download it.
You may notice that the majority of the map data is in the FITS file format. That format is standard in astronomy but unusual elsewhere, and Photoshop or similar apps won't be able to handle it very well. Fortunately, NASA has a trustworthy list of FITS image viewers that you can choose from.
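If you'd rather poke at the files in Python than install a dedicated viewer, the astropy library can also read FITS directly. Here's a minimal sketch, assuming astropy and matplotlib are installed and using a placeholder filename for whichever tile you downloaded:

```python
# Open a downloaded FITS tile and display it. The filename is a placeholder.
import numpy as np
import matplotlib.pyplot as plt
from astropy.io import fits

with fits.open("cosmos_web_tile.fits") as hdul:
    hdul.info()                              # list the HDUs (header/data units) in the file
    data = hdul[0].data                      # image data is often in the primary HDU...
    if data is None:
        data = hdul[1].data                  # ...but sometimes lives in the first extension
    data = np.array(data, dtype=float)       # copy into memory before the file closes

# Astronomical images have a huge dynamic range, so clip to percentiles before plotting.
vmin, vmax = np.nanpercentile(data, [5, 99])
plt.imshow(data, cmap="gray", vmin=vmin, vmax=vmax, origin="lower")
plt.colorbar(label="pixel value")
plt.show()
```

The percentile clipping is the important part: without it, a handful of bright stars dominates the scale and everything else renders as black.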
The data will also be used to help answer questions about the early universe, and having it freely available to researchers the world over can only speed that work along.
Related Articles


Forbes, 30 minutes ago
CETI Looks Into The Complexities Of Whale Sounds With AI
What can we learn from the whales? It's something that researchers at the CETI project (not to be confused with the SETI Institute) are working on in order to help drive awareness around language models that exist right here in our own world. In a recent TED talk, CETI's Pratyusha Sharma talks about the communication of sperm whales, and how humans can use it to learn more about other species and ourselves. Sharma is a graduate student at CSAIL and works with advisors like our own Daniela Rus to advance this kind of discovery.

As a starter, she gave the example of aliens speaking to humans verbally, or through a script (and again, CETI's work should be distinguished from space research). 'Communication is a key characteristic of intelligence,' Sharma explained. 'Being able to create an infinite set of messages by sequencing together finite sets of sounds is what has distinguished human beings from other species.' However, she said, CETI research indicates that we may not be alone on the earth in developing these kinds of systems. In figuring this out, she suggested, we can gain insights into other species and understand our own language better as well.

Millions of life forms on earth, she said, share some form of language. 'They have their own physical and mental constraints, and are involved in their own unique ecosystems and societies,' she said. 'However, we know very little about their communications.' So how do you decipher them? In further explaining what goes on at CETI, she listed different stakeholders with credentials in areas like linguistics, biology, cryptography and AI. (Here's some more background on the project.) Most of the research, she said, is taking place in the Dominican Republic, in the Caribbean.

Explaining how the large brains of sperm whales have evolved over 16 million years, she described activity that shows advanced thinking: 'The members of the family coordinate their dives, engage in extended periods of socialization, and even take turns babysitting each other's young ones,' she said. 'While coordinating in complete darkness, they exchange long sequences of sounds with one another.' The question, she noted, is this: what are they saying?

Researchers at CETI have identified 21 types of 'codas,' or call systems with a certain complexity. 'One of the key differentiators between human language and all animal communications is that beautiful property called duality of patterning,' Sharma said. 'It's how a base set of individually meaningless elements sequence together to give rise to words, that in turn are sequenced together to give rise to an infinite space with complex meaning.'

She outlined some of the principles through which CETI is building this species knowledge. 'Getting to the point of understanding the communications of sperm whales will require us to understand what features of their (vocalizations) they control,' she said. Presenting a set of 'coda visualizations,' Sharma noted that these simple communications correspond to complex behavior. '(This) presented a fundamental mystery to researchers in the field,' she said. She showed how the CETI work magnifies the structure of a coda: 'Even though the clicks might not have sounded like music initially, when we plot them like this, they start to look like music,' she said, presenting a combinatorial coda system. 'They have different tempos and even different rhythms.' This, she added, reveals a lot about the minds of these creatures.

'The resulting set of individual sounds (in the coda) can represent 10 times more meanings than what was previously believed, showing that sperm whales can be much more expressive than previously thought,' she said. 'These systems are rare in nature, but not uniquely human. ... These results open up the possibility that sperm whales' communication might provide our first example of this phenomenon in another species. ... This will allow us to use more powerful machine learning techniques to analyze the data, and perhaps get us closer to understanding the meanings of their sounds, and maybe (we can) even communicate back.'

The research, she added, continues: 'Hopefully the algorithms and approaches we developed in the course of this project empower us to better understand the other species that we share this planet with,' she said. This type of research has a lot of potential. Let's see what it turns up as we continue through the age of AI.


Medscape, an hour ago
Pediatric HS Linked to Obesity, Acne, Other Comorbidities
A meta-analysis of 19 studies found that pediatric patients with hidradenitis suppurativa (HS) show an increased rate of medical and psychiatric comorbidities, including obesity.

METHODOLOGY: Researchers conducted a systematic review and meta-analysis of 19 observational studies (14 US studies), which included 17,267 pediatric patients with HS (76.7% girls; mean age, 12-17 years) and 8,259,944 pediatric patients without HS. The primary outcome was the prevalence of comorbidities in pediatric patients with HS. The main categories included metabolic, endocrinologic, inflammatory, psychiatric, dermatologic, and genetic comorbidities.

TAKEAWAY: In the meta-analysis, the most prevalent condition in patients with HS was acne vulgaris (43%), followed by obesity (37%), anxiety (18%), and hirsutism (14%). Obesity showed a moderate-certainty association with HS in children, with prevalence ratios ranging up to 2.48, odds ratios ranging from 1.27 to 2.68, and hazard ratios up to 1.52 (P < .001). Researchers also found a probable association between depression and HS (moderate certainty), with all studies reporting a higher incidence among patients with HS. An association with diabetes was reported in three studies (low certainty).

IN PRACTICE: 'Given the significant risk of chronic comorbidities and negative sequelae in pediatric HS, our findings highlight a need for comprehensive comorbidity screening clinical guidelines in this population and emphasize the involvement of multidisciplinary teams to achieve this,' the study authors wrote.

SOURCE: The study was led by Samiha T. Mohsen, MSc, of the University of Toronto, and was published online on June 11 in JAMA Dermatology.

LIMITATIONS: Several of the included studies were graded as low quality, and most studies did not compare the risks of comorbidities between the two groups. Most of the studies were from the US, which could limit generalizability. Significant heterogeneity was reported across the studies.

DISCLOSURES: The funding source was not disclosed. Three authors reported receiving grants, personal fees, and honoraria from multiple pharmaceutical companies, including AbbVie, Novartis, UCB, Incyte, Celltrion, Leo Pharma, Pfizer, Sanofi, and the Pediatric Dermatology Research Alliance. Other authors reported no conflicts of interest.


CBS News, an hour ago
UC Davis breakthrough lets ALS patient speak using only his thoughts
Allowing people with disabilities to talk just by thinking about a word: that's what UC Davis researchers hope to accomplish with new cutting-edge technology. It could be a breakthrough for people with ALS and other nonverbal conditions. One UC Davis Health patient has been diagnosed with ALS, a neurological disease that makes it impossible to speak out loud. Scientists have now directly wired his brain into a computer, allowing him to speak through it using only his thoughts.

"It has been very exciting to see the system work," said Maitreyee Wairagkar, a UC Davis neuroprosthetics lab project scientist. The technology involves surgically implanting small electrodes. Artificial intelligence can then translate the neural activity into words. UC Davis researchers say it took the patient, who's not being publicly named, very little time to learn the technology. "Within 30 minutes, he was able to use this system to speak with a restricted vocabulary," Wairagkar said.

It takes just milliseconds for brain waves to be interpreted by the computer, making it possible to hold a real-time conversation. "[The patient] has said that the voice that is synthesized with the system sounds like his own voice and that makes him happy," Wairagkar said. And it's not just words. The technology can even be used to sing. "These are just very simple melodies that we designed to see whether the system can capture his intention to change the pitch," Wairagkar said.

Previously, ALS patients would use muscle or eye movements to type on a computer and generate a synthesized voice. That's how physicist Stephen Hawking, who also had ALS, was able to slowly speak. This new technology is faster but has only been used on one patient so far. Now, there's hope that these microchip implants could one day help other people with spinal cord and brain stem injuries. "There are millions of people around the world who live with speech disabilities," Wairagkar said. The UC Davis study was just published in the journal Nature, and researchers are looking for other volunteers to participate in the program.