'Completely new and totally unexpected finding': Iron deficiency in pregnancy can cause 'male' mice to develop female organs
Iron deficiency during pregnancy can cause a male mouse embryo to develop female features, a new study reveals.
Low iron disrupts the activation of a key gene that spurs the development of male sex organs. This causes embryos with XY chromosomes — the combination typically seen in males — to develop female sex organs instead.
"This is a completely new and totally unexpected finding," study co-author Peter Koopman, a professor emeritus of developmental biology at the University of Queensland in Australia, told Live Science. "It's never been shown before that iron can flip such an important developmental switch."
Earlier research established that the SRY gene on the Y chromosome is the "master switch" for turning on the development of male organs in mammals. An enzyme called JMJD1A plays an important role in flipping this master switch, and it requires iron to function properly. However, the connection between iron levels and sex determination was not fully understood.
Now, in a study published June 4 in the journal Nature, researchers report that iron is essential for the development of testes in XY mice. The results show that maternal iron deficiency disrupts the activity of JMJD1A, which lowers SRY expression and drives the development of ovaries in XY mouse embryos.
However, it's too early to say whether this finding in mice might translate to human pregnancy and sex development, Tony Gamble, an associate professor of biological sciences at Marquette University in Milwaukee who wasn't involved in the study, told Live Science.
Related: Is there really a difference between male and female brains? Emerging science is revealing the answer.
In the study, the researchers manipulated iron levels in pregnant mice using pharmaceutical treatments and low-iron diets. When the pregnant mice became iron deficient, six of 39 XY embryos developed ovaries instead of testes. Investigating further, the team found that genetics appear to influence which embryos are sensitive to this effect.
To confirm this mechanism, the team also grew embryonic gonads — structures that develop into testes or ovaries in the womb — in lab dishes so they could directly observe the impact of iron depletion. These lab analyses showed that reducing the iron in cells to 40% of normal levels led to a large buildup of histones on the SRY gene. Histones are proteins that bind DNA and help control which genes are switched on; this buildup almost completely blocked the SRY gene's expression.
Normally, the JMJD1A enzyme rids the SRY gene of histones, allowing it to turn on. The researchers hypothesize that when iron levels drop, the enzyme's activity is compromised, so suppressive histones build up on the SRY gene.
These results suggest that "some important developmental traits that were previously thought to be purely genetically controlled can also be seriously impacted by nutrition and metabolic factors," Koopman said. And "if iron can have such an impact on sex development, then maybe other organ systems may also critically depend on iron or other dietary factors in a similar way," he added.
Because the research was conducted solely in mice, the question of whether iron may have similar effects in humans is still open. Although sex determination follows a broadly similar blueprint across mammals, there are some important differences between mice and humans, Gamble said.
RELATED STORIES
—Scientists made mice with Y chromosomes female by deleting just 6 tiny molecules
—One in 500 men may carry an extra sex chromosome (most without knowing it)
—These bacteria trigger a sex change in wasps — scientists finally know how
For example, while both species rely on the same genes to drive the development of testes, the consequences of mutations in these genes differ between the two species. Their similarities to humans make mice important models for studying development and disease, Gamble said, "but the differences urge caution in simply assuming processes are acting identically across both species."
Testing the new finding in humans won't be easy, since many of the experiments possible in mice can't ethically be done in humans, Koopman said. "So, the way forward will have to involve doing biochemical, cell culture and gene expression experiments to build a body of indirect evidence that what holds true in mice is also the case in humans," he said.

Related Articles
Yahoo
17 hours ago
Does the color purple really exist?
The world is awash with the color purple — lavender flowers, amethyst gemstones, plums, eggplants and purple emperor butterflies. But if you look closely at the visible-light portion of the electromagnetic spectrum, you'll notice that purple (which is different from the bluish hues of violet and indigo) is absent. That's because purple may be made up by our brains; it exists only because of how the brain processes color.

So does that mean purple doesn't really exist? Not necessarily. The answer lies within the mind-boggling way that our brains perceive and combine different wavelengths of visible light.

"I would actually say that none of color actually exists," said Zab Johnson, an executive director and senior fellow at the Wharton Neuroscience Initiative at the University of Pennsylvania. "It's all the process of our neural machinery, and that's sort of both the beauty and the complexity of it all at the same time."

All color begins with light. When radiation from the sun hits Earth, a range of wavelengths is present. There are long wavelengths, like infrared rays and radio waves, and shorter, high-energy wavelengths, like X-rays and ultraviolet rays, which are damaging to our bodies, Johnson told Live Science. Toward the middle of the electromagnetic spectrum lies visible light — the light we can see — which represents only about 0.0035% of the electromagnetic spectrum. This is what we perceive as the colors of the rainbow. On one end of the spectrum are longer wavelengths, which we perceive as red; on the other are shorter wavelengths, which we perceive as indigo and violet.

Our perception of color involves specialized receptors at the back of our eyeballs, called cones, that detect visible light.
Human eyes have three types of cones — long-wave, mid-wave and short-wave — each sensitive to particular wavelengths. Long-wavelength cones take in information about reddish light, mid-wavelength cones specialize in green, and short-wavelength cones detect blue.

Related: What color is the universe?

When light hits our eyeballs, these three receptors take in information about the light's wavelengths and send electrical signals to the brain. The brain then averages that information to deduce what it's seeing. "Our machinery is sort of doing this complex sort of calculation of these three different ratios all the time," which forms our perception of color, Johnson said. For example, if long-wavelength and mid-wavelength cones are triggered, the brain infers that we're seeing orange or yellow. If mid-wavelength and short-wavelength cones are activated, the brain concludes that we're seeing teal.

So what about purple? When short-wavelength (blue) and long-wavelength (red) cones are stimulated, your brain "makes something that's actually not out there in the world," Johnson said. Red and blue sit on opposite ends of the visible spectrum: When the brain encounters these wavelengths together, it effectively bends the linear visible spectrum into a circle, bringing red and blue together to make purple and magenta, even though that's not what the light is really doing.

As a result, purple and magenta are known as "nonspectral" colors, because they don't exist as a single wavelength of electromagnetic radiation. Nonspectral colors like purple are made of two wavelengths of light. In contrast, spectral colors — red, orange, yellow, green, blue, indigo and violet — are made of just one wavelength.

RELATED MYSTERIES
—Why is the sky blue?
—Why do we see colors that aren't there?
—What would colors look like on other planets?
Regardless of its physical existence, purple has captivated people for millennia, noted Narayan Khandekar, director of the Straus Center for Conservation and Technical Studies at Harvard Art Museums. For example, the ancient Phoenicians ground up sea snails to make a dye known as Tyrian purple, which was reserved for royal or ceremonial robes. Today, purple is still often associated with wealth, power and even magic. "So that connection still exists, even though there are other versions of purple available now," he told Live Science.

So, whether manufactured in our minds or made from ground-up shellfish, purple is unique and deserves a closer look. "It doesn't really exist in nature. And so when you can create it, it has this extra value," Johnson said. "Now purple is even more special."
Yahoo
17 hours ago
Scientists Intrigued by Conical Skull Found in Ancient Burial Ground
Archaeologists in Iran have discovered an ancient cone-shaped skull believed to have belonged to a teenage girl — and there are signs of tragedy in her bones. As Live Science reports, the skull, which was found in a prehistoric burial ground known as Chega Sofla without its corresponding skeleton, shows signs not only of intentional modification but also of possibly fatal blunt force trauma.

Dated to roughly 6,200 years old, the strange cone shape of the skull appears to be the product of a practice archaeologists today call artificial cranial modification, a process similar to foot-binding in which the soft skulls of children are bandaged to deliberately reshape them. Found across cultures and millennia, this type of body modification has been undertaken for various reasons, including to denote social status or adhere to beauty standards, as evidenced by it being seen more often in girls than boys. Though it's still occasionally practiced today, the practice, sometimes referred to as "skull elongation," was far more common in prehistoric times. The girl with the conical skull in this study, for instance, is believed to have lived in the fifth millennium BCE.

Aside from the cone-shaped cranium of the young woman, who was believed to be younger than 20, archaeologists Mahdi Alirezazadeh and Hamed Vahdati Nasab of Tarbiat Modares University in Tehran — authors of a study about the discovery recently published in the International Journal of Osteoarchaeology — found a long, unhealed fracture on the back of the skull that likely killed her.

"We know this woman experienced the fracture in the final moments of her life," Alirezazadeh told Live Science, "but we don't have any direct evidence to say that someone intentionally struck her."
Though it's unclear whether the ancient teen was intentionally killed or died by accident, the researchers believe that the modified shape of her skull likely made it weaker and more susceptible to trauma than a conventional cranium. Alirezazadeh pointed out that an unmodified, fractured skull was found alongside the conical one in the portion of Chega Sofla where the team was working, and noted that whatever killed the girl "was so severe that it would have fractured a normal, unmodified skull as well."
Yahoo
19 hours ago
Gaming Consoles Tied To Emotional Issues In Children, Study Finds
A massive new study found that increased screen time is linked to emotional challenges in children. Researchers examined data from nearly 300,000 children across 117 long-term studies and found a two-way link between the time kids spend on screens, like TVs and tablets, and their emotional and social health.

Kids who spent more time glued to devices were found to be more aggressive and anxious and to have lower self-esteem later in life. However, the reverse was also true: the researchers say that children already dealing with social challenges were more likely to turn to screen time.

Gaming consoles were found to have the strongest link to emotional and social problems, but the study did not identify a particular risk among more violent video games. "Some studies broke the games down by whether they were violent, but our interpretation was that most parents knew to limit the amount of violent content to give kids under 10," Dr. Michael Noetel, one of the study's authors and an associate professor of psychology at The University of Queensland, Australia, told ABC News. "Instead, [the] key finding was that gaming in general — regardless of the specific type — showed much stronger links to emotional problems than other screen activities like watching TV or using educational apps."

The researchers focused on children aged 10 and under, which allowed them to track the kids over time and establish potential cause-and-effect relationships.

According to the American Academy of Pediatrics, children between the ages of two and five should be limited to no more than 60 minutes of recreational screen time per day on weekdays, and three hours per day on weekends. According to ABC News chief medical correspondent Dr. Tara Narula, more than four out of 10 kids ages eight to 12 use screens for more than four hours per day.

The negative impact of screens does not appear to end in childhood, either.
Last year, The Dallas Express reported on a poll that found 40% of teens aged 13 to 17 reported that smartphones made it harder for them to acquire good social skills.