Mysterious skull found decades ago could belong to legendary king, archaeologists say


Yahoo · 01-05-2025

An unidentified skull uncovered at a Hungarian basilica decades ago may belong to the legendary 15th-century king Matthias Corvinus, a new study suggests.
While further DNA tests are needed to confirm the identity of the remains, researchers from the King Stephen Museum in Hungary say the skull bears resemblance to that of Corvinus's illegitimate son.
The skull was first found during excavations in the southern aisle of the Basilica of the Virgin Mary in 2002 and has since been labelled simply 'I/10'.
Now, archaeologists suspect it belongs to Matthias Corvinus, the legendary king of Hungary and Croatia, who ruled from 1458 to 1490.
Known by the epithet 'The Just', King Corvinus was renowned for reforming the justice system and for favouring talented individuals chosen for their abilities rather than their social status.
His kingdom was also one of the first outside Italy to embrace the Renaissance, ushering in a new age of art and science.
King Corvinus's royal library – Bibliotheca Corviniana – was one of the largest collections of books in Europe at the time.
He is also known for establishing one of the earliest professional standing armies in medieval Europe that played a key role in driving out the invading Ottomans.
The Basilica of the Virgin Mary, where the skull was found, is known to have been the coronation and burial site for medieval Hungarian kings. However, it was destroyed in the 19th century after centuries of damage, including at the hands of the Ottomans.
The latest study comes after researchers reconstructed skulls found at the basilica.
They found that the one labelled I/10 particularly bore resemblance to that of Matthias' illegitimate son János Corvinus, whose remains were previously found in Croatia.
Given the skull's burial location at the basilica where other kings were also buried, researchers suspect the most likely candidate is King Matthias himself.
Several historical written sources also point to King Corvinus's funeral taking place at the cathedral in 1490.
'The king's body, covered in purple, was lifted into a coffin with a sword, sceptre, crown, orb, golden spurs, and an uncovered face, in a carefully chosen space of the open vestibule of the basilica,' Antonio Bonfini, Matthias' court historian and poet, wrote.
However, Hungarian officials urge caution, stating that 'scientific verification of the hypothesis is ongoing'.
'We will be able to make a final statement after the verification,' the Institute of Hungarian Research said, according to Archaeology Mag.


Related Articles

15 of the Most Important Inventions of All Time According to AI

TIME Magazine · a day ago

This article is published by a partner of TIME. Inventions are the building blocks of civilization, shaping everything from our daily routines to the entire global economy. From the first stone tools created by our ancestors to the cutting-edge technologies driving innovation today, human creativity has consistently led to breakthroughs that improve lives and change the course of history. Some inventions have dramatically altered entire industries, while others have redefined our interaction with the world around us, pushing the boundaries of what's possible.

The following list highlights 15 of the most important inventions of all time, selected for their far-reaching impact and transformative influence on society. These inventions have laid the foundation for modern life, from medicine and communication to transportation and energy. With the research assistance of AI, we will explore how these breakthroughs have shaped the world and continue to do so today. Here are the 15 inventions that stand out as some of the most influential throughout history.

Breakthrough Inventions

1. The Wheel (3500 BC)

The wheel is one of the oldest and most important inventions, dating back to around 3500 BC. Originally used for pottery, it soon found application in transportation, revolutionizing human mobility. The ability to transport goods and people over long distances led to the growth of trade and connected far-flung cultures. The wheel also became a critical component in machinery, laying the foundation for the Industrial Revolution and modern engineering.

Additional Insights:
• Engineering Impact: It has been used in countless machines and continues to be foundational in modern engineering.
• Transportation: The wheel made it possible to move goods and people quickly and efficiently, boosting trade.
• Symbol of Innovation: It's often seen as a symbol of human ingenuity and problem-solving.
• Global Adoption: Its utility spread across the world, influencing various civilizations and industries.

2. The Printing Press (1440)

Johannes Gutenberg's invention of the printing press in 1440 dramatically changed the dissemination of information. By making it possible to mass-produce books, pamphlets, and newspapers, it allowed knowledge to be distributed widely and inexpensively. This invention played a crucial role in the Renaissance, Reformation, and the spread of scientific ideas. The printing press was key to the democratization of knowledge and laid the foundation for modern education and the information age.

Additional Insights:
• Knowledge Distribution: Books became affordable, allowing information to reach a much wider audience.
• Cultural Shift: The press helped spread ideas that challenged religious and political norms, such as during the Reformation.
• Scientific Advancements: It enabled the rapid dissemination of new scientific discoveries, propelling the Scientific Revolution.
• Global Impact: Its influence extended across Europe and beyond, shaping societies worldwide.
• Enduring Legacy: Paved the way for the modern information society.

3. The Atomic Bomb (1945)

Developed during World War II, the atomic bomb was a revolutionary and highly controversial invention. Its creation led to the first use of nuclear weapons in warfare, resulting in the bombings of Hiroshima and Nagasaki in 1945. While its destructive power reshaped global politics and military strategy, it also sparked the nuclear arms race and significant shifts in geopolitical relations during the Cold War. The atomic bomb's legacy continues to influence the field of international relations and nuclear disarmament.

Additional Insights:
• World War II Impact: It helped bring the war to a close but raised ethical questions about the use of such destructive weapons.
• Nuclear Arms Race: Initiated the Cold War competition between the United States and the Soviet Union.
• Geopolitical Shifts: Nuclear weapons fundamentally altered military strategies and global diplomacy.
• Nuclear Energy: The underlying technology contributed to the development of nuclear energy for peaceful purposes.
• Moral Dilemmas: Sparked global debates about the morality and necessity of nuclear weapons in warfare.

4. Electricity (18th–19th Centuries)

Electricity powers nearly all modern technology, from lighting in homes to methods of communication, transportation, and industry. Discovered and refined throughout the 18th and 19th centuries, it paved the way for countless innovations, including the telegraph, electric motors, and telecommunications. Its widespread adoption during the Industrial Revolution allowed factories to run more efficiently, transforming economies and improving quality of life. Today, electricity remains essential in almost every aspect of daily life.

Additional Insights:
• Early Discoveries: Benjamin Franklin, Michael Faraday, and Thomas Edison made key contributions.
• Industrial Impact: Powered the growth of various manufacturing sectors and mass transportation.
• Technological Advancements: Enabled telegraphy, radio, television, and eventually the computer and internet.
• Global Accessibility: Powers homes and businesses worldwide, essential for modern living.
• Sustainability Challenges: The quest for renewable energy sources remains central to the future of electricity generation.

5. The Telephone (1876)

Invented by Alexander Graham Bell in 1876, the telephone allowed for instant communication over long distances, revolutionizing how people connect. Before the telephone, communication was limited to written letters or telegrams, which could take days to deliver. The invention enabled real-time conversations and opened up new possibilities for business, government, and personal connections. Today, the telephone has evolved into the smartphone, which plays a central role in daily life.
Additional Insights:
• First Words: Bell's first successful telephone call was 'Mr. Watson, come here, I want to see you.'
• Global Connectivity: Made distant communication fast and efficient, fostering a new era of interconnectedness.
• Business Revolution: Allowed organizations to operate more effectively and make decisions quickly.
• Modern Evolution: The landline phone transformed into mobile phones and smartphones.
• Telecommunications Industry: Gave rise to a vast global industry dedicated to communication technologies.

6. Penicillin (1928)

Penicillin, discovered by Alexander Fleming in 1928, was the first antibiotic and revolutionized medicine. It allowed doctors to treat previously fatal bacterial infections, drastically reducing mortality rates and ushering in the era of modern antibiotics. This breakthrough saved millions of lives and paved the way for other lifesaving antibiotics. Today, penicillin remains a cornerstone of medical treatments, although challenges such as antibiotic resistance have emerged.

Additional Insights:
• Life-Saving Discovery: Dramatically reduced deaths from infections once considered fatal.
• Medical Impact: Initiated the antibiotic era, radically changing the treatment of infectious diseases.
• Global Health: Its use is widespread, reaching patients worldwide.
• Antibiotic Resistance: Overuse has led to resistant strains of bacteria, a growing global concern.
• Ongoing Research: Scientists continue to develop new antibiotics to combat evolving pathogens.

7. The Airplane (1903)

The invention of the airplane by the Wright brothers in 1903 forever changed human travel. For the first time, flight over long distances became possible, shrinking the world and enabling global commerce and tourism. Airplanes transformed industries like international business and leisure travel, making transit faster and more accessible. Today, air travel is one of the most essential forms of long-distance transportation.
Additional Insights:
• First Flight: The Wright brothers' initial flight lasted just 12 seconds, ushering in the aviation age.
• Global Connectivity: Made cross-continental and intercontinental travel feasible in mere hours.
• Economic Impact: Aviation is vital for modern commerce, connecting businesses and services worldwide.
• Technological Advances: Fuel-efficient planes and improved safety measures continue evolving.
• Future Developments: Electric and autonomous flying vehicles may soon redefine air travel again.

8. The Computer (1940s)

The invention of the computer has had a profound impact on almost every aspect of human life. Early computers in the 1940s were large machines used primarily for military and scientific calculations. With the advent of personal computers in the 1970s and 1980s, computing power became accessible to the masses, driving the digital revolution. Computers are now central to business, education, communication, and entertainment, and they continue to evolve through innovations in artificial intelligence, big data, and cloud computing.

Additional Insights:
• Early Models: The first computers occupied entire rooms and performed complex calculations.
• Personal Computing: The rise of home and office computers democratized access to technology.
• Global Connectivity: Computers power the internet, enabling global communication and collaboration.
• Technological Growth: Laptops, smartphones, and tablets have integrated computing into daily life.
• AI and Beyond: Modern computers support advanced technologies like machine learning.

9. Email (1970s)

Email transformed communication by enabling people to send and receive messages instantly across long distances. Developed in the early 1970s, it quickly replaced traditional mail and telegrams for many uses, offering a faster and more efficient medium. Email is indispensable in personal and professional contexts, allowing real-time communication and easy document sharing.
Despite the rise of social media and instant messaging, email remains one of the most widely used communication tools today.

Additional Insights:
• Business Efficiency: Streamlined workplace communication, reducing the need for physical memos and meetings.
• Global Reach: Made it possible to communicate instantly with anyone anywhere in the world.
• Security Considerations: Phishing attacks and spam are modern challenges in email usage.
• Email Evolution: Integration with calendars, task managers, and file-sharing solutions is commonplace.

10. Television (1930s)

Television fundamentally altered how information and entertainment are consumed. By the 1930s, it had become a popular medium for delivering news, shows, and educational content. TV shaped global culture and opinion, serving as a powerful platform for political discourse, advertising, and mass communication. The medium continues to evolve with the rise of streaming services and on-demand viewing, offering audiences a wealth of content anytime, anywhere.

Additional Insights:
• Broadcasting: Revolutionized mass communication by reaching large audiences at once.
• Cultural Impact: Influenced everything from music to politics on a global scale.
• Technological Innovation: Shift to digital and high-definition improved quality and accessibility.
• Global Connectivity: Networks and streaming platforms bring international events to viewers worldwide.
• Future of TV: On-demand and interactive features are redefining the viewing experience.

11. The Refrigerator (1834)

The refrigerator changed how people preserved and stored food, improving public health by preventing spoilage and reducing foodborne illnesses. Before refrigeration, methods like salting and drying were common but inefficient. By enabling long-term storage, the refrigerator revolutionized the food industry, making mass production and distribution of perishable goods possible. It's now a kitchen staple worldwide.
Additional Insights:
• Food Safety: Maintains safe temperatures to prevent bacterial growth.
• Energy Efficiency: Modern designs focus on reducing electricity consumption.
• Environmental Impact: Early models used harmful chemicals; newer units use more eco-friendly refrigerants.
• Global Distribution: Essential for international trade in perishable items.
• Smart Technology: Some modern refrigerators come with connectivity features for better inventory management.

12. The Light Bulb (1879)

Invented by Thomas Edison in 1879, the practical light bulb changed how people lived by providing a reliable source of artificial light. It extended productive hours beyond daylight, boosted nighttime safety, and influenced the layout of modern cities. The light bulb also led to the establishment of electrical grids powering homes and businesses. Contemporary designs like LEDs have made lighting more energy-efficient and environmentally friendly.

Additional Insights:
• Early Developments: Edison and other inventors like Joseph Swan made key breakthroughs.
• Cultural Impact: Allowed activities to continue past sunset and enhanced public safety.
• Energy Efficiency: LEDs and CFLs are reducing global energy usage.
• Worldwide Adoption: Became a universal standard in households and commercial spaces.
• Environmental Impact: Ongoing push for sustainable lighting to cut electricity consumption.

13. The Automobile (1885)

Invented by Karl Benz in 1885, the automobile revolutionized transportation, allowing personal mobility on an unprecedented scale. People could travel long distances rapidly, reshaping urban design and fueling suburban growth. The global adoption of cars propelled trade and commerce, but also led to environmental concerns tied to fossil fuel consumption. Innovations like electric and hybrid vehicles continue to shape the industry's future.

Additional Insights:
• Assembly Line: Henry Ford's production methods made cars affordable for the masses.
• Economic Growth: The auto industry is a massive global employer and economic driver.
• Environmental Challenges: Emissions drive the push toward electric and alternative-fuel vehicles.
• Suburbanization: Cars enabled the rise of suburbs and changed city infrastructures.
• Future Innovations: Autonomous vehicles promise another revolution in transportation.

14. The Radio (1890s)

Radio transformed communication by transmitting sound over long distances through electromagnetic waves. First demonstrated in the 1890s by innovators like Guglielmo Marconi and Nikola Tesla, it became wildly popular in the early 20th century for news, music, and entertainment. Radio gave rise to shared cultural experiences and played an influential role in shaping public opinion, especially during significant historical events.

Additional Insights:
• First Broadcast: Marconi's successful transmission in 1901 was a landmark in wireless communication.
• Cultural Influence: Radio dramas, music, and news broadcasts became staples of daily life.
• Global Reach: Served as a vital communication method for people in remote areas.
• Evolution of Radio: FM radio and digital broadcasting expanded the medium's range and quality.
• Media Convergence: Online streaming and podcasts continue to adapt radio for the digital era.

15. The Camera (Early 19th Century)

The invention of the camera revolutionized how we capture memories, document events, and record history. Joseph Nicéphore Niépce took the first permanent photograph in 1826, and camera technology has evolved continuously since then—from bulky film cameras to compact digital devices. Photography has greatly impacted art, journalism, and personal expression, enabling people to preserve and share moments in real time.

Additional Insights:
• First Photograph: Taken by Niépce in 1826, marking the birth of modern photography.
• Cultural Impact: Influenced visual art, media, and public perception.
• Technological Progress: Digital cameras and smartphone integration have made photography ubiquitous.
• Social Media: Photos are at the heart of social platforms, fostering global visual storytelling.
• Historical Documentation: Cameras have captured landmark events, shaping our collective memory.

Conclusion on Key Inventions

These 15 inventions have fundamentally shaped human civilization, influencing everything from how we communicate and travel to how we work and live. Each marks a leap forward in human ingenuity that addressed critical needs and created opportunities for continued progress. Their impact reverberates in modern industries, improving the quality of life for countless people around the world.

As we look to the future, these foundational innovations serve as a springboard for even more groundbreaking developments. The spirit of creativity and the relentless drive to overcome global challenges will fuel progress, bringing about new inventions that will once again transform the world in ways we can only imagine.

About the Authors

Richard D. Harroch is a Senior Advisor to CEOs, management teams, and Boards of Directors. He is an expert on M&A, venture capital, startups, and business contracts. He was the Managing Director and Global Head of M&A at VantagePoint Capital Partners, a venture capital fund in the San Francisco area. His focus is on internet, digital media, AI and technology companies. He was the founder of several Internet companies. His articles have appeared online in Forbes, Fortune, MSN, Yahoo, and Fox Business. Richard is the author of several books on startups and entrepreneurship, as well as the co-author of Poker for Dummies and a Wall Street Journal-bestselling book on small business. He is the co-author of a 1,500-page book published by Bloomberg on mergers and acquisitions of privately held companies. He was also a corporate and M&A partner at the international law firm of Orrick, Herrington & Sutcliffe.
He has been involved in over 200 M&A transactions and 250 startup financings. He can be reached through LinkedIn.

Dominique Harroch has served as a Chief of Staff or Operations Leader for multiple companies, leveraging her extensive experience in operations management, strategic planning, and team leadership to drive organizational success. With a background that spans over two decades in operations leadership, event planning at her own start-up, and marketing at various financial and retail companies, Dominique is known for her ability to optimize processes, manage complex projects, and lead high-performing teams. She holds a BA in English and Psychology from U.C. Berkeley and an MBA from the University of San Francisco. She can be reached via LinkedIn.

A High IQ Makes You an Outsider, Not a Genius

Yahoo · 4 days ago

Who has the highest IQ in history? One answer would be: a 10-year-old girl from Missouri. In 1956, according to lore, she took a version of the Stanford-Binet IQ test and recorded a mental age of 22 years and 10 months, equivalent to an IQ north of 220. (The minimum score needed to get into Mensa is 132 or 148, depending on the test, and the average IQ in the general population is 100.) Her result lay unnoticed for decades, until it turned up in The Guinness Book of World Records, which lauded her as having the highest childhood score ever. Her name, appropriately enough, was Marilyn vos Savant. And she was, by the most common yardstick, a genius.

I've been thinking about which people attract the genius label for the past few years, because it's so clearly a political judgment. You can tell what a culture values by who it labels a genius—and also what it is prepared to tolerate. The Renaissance had its great artists. The Romantics lionized androgynous, tubercular poets. Today we are in thrall to tech innovators and brilliant jerks in Silicon Valley.

Vos Savant hasn't made any scientific breakthroughs or created a masterpiece. She graduated 178th in her high-school class of 613, according to a 1989 profile in New York magazine. She married at 16, had two children by 19, became a stay-at-home mother, and was divorced in her 20s. She tried to study philosophy at Washington University in St. Louis, but did not graduate. She married again and was divorced again at 35. She became a puzzle enthusiast, joined a high-IQ society, and occasionally wrote an essay or a satirical piece under a pen name for a newspaper. Mostly, she devoted herself to raising her boys.

That all changed in 1985, when The Guinness Book of World Records published her childhood IQ score. How its authors obtained the record is murky: An acquaintance once told the Financial Times that he'd urged her to submit her result as a way of making her famous.
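The "IQ north of 220" figure follows from the ratio formula used in early Stanford-Binet scoring: IQ equals mental age divided by chronological age, times 100. A minimal sketch of the arithmetic, taking the chronological age as exactly 10 years per the article (the precise testing age is not stated, so this is an assumption):

```python
# Ratio IQ, as used in early Stanford-Binet scoring:
#   IQ = (mental age / chronological age) * 100
# Chronological age of exactly 10 years is an assumption taken from the
# article's "10-year-old girl"; her exact age at testing is not given.
mental_age = 22 + 10 / 12          # 22 years, 10 months, expressed in years
chronological_age = 10.0
ratio_iq = mental_age / chronological_age * 100
print(round(ratio_iq, 1))          # about 228.3, comfortably "north of 220"
```

A slightly older chronological age at testing would pull the result down toward 220, which is why accounts of the exact score vary.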
[Read: How smart people actually talk about themselves]

Thanks to all the publicity, vos Savant met her third husband, Robert Jarvik, who had developed a pioneering model of an artificial heart. Jarvik had his own story of being overlooked: Before ultimately enrolling in medical school at the University of Utah, he had been rejected by 15 other institutions. He tracked down vos Savant after seeing her on the cover of an airline magazine, and she agreed to a date after finding a picture of him taken by Annie Leibovitz. They quickly became an item, and eventually took up residence in New York.

At their 1987 wedding, the rings were made of gold and pyrolytic carbon, a material used in Jarvik's artificial heart. The science-fiction writer Isaac Asimov gave away the bride. A news report has them telling their guests that they were relieved to meet each other, because they found most people difficult to talk to—the implication being that mere mortals were not on their wavelength. The honeymoon would be spent in Paris, they revealed; vos Savant would write a screenplay for a futuristic satire, and Jarvik would continue researching his 'grand unification theory' of physics.

Yet despite their superior brains, vos Savant's screenplay was never made into a film, and Jarvik—who, according to a New York profile of the couple, thought the Big Bang theory was 'wrong' and the theory of relativity was 'probably wrong'—did not revolutionize physics.

What did happen, though, is that on the back of her anointment in Guinness, vos Savant built a career as a professional genius. She wrote books such as the Omni I.Q. Quiz Contest and Brain Building in Just 12 Weeks. Billing her as 'the smartest person in the world,' Parade magazine gave her an advice column, where she answered readers' queries and published puzzles. (She didn't respond to my attempts to contact her through the magazine.)
Her specialty was logic problems—which showcase the particular type of mental ability most readily identified by IQ tests. In one column, she provided a solution for an apparently insoluble conundrum, the Monty Hall problem. Angry readers wrote in to correct her, but she stood firm.

Vos Savant's life perfectly illustrates how genius can be a self-fulfilling prophecy. She was a housewife raising her children in total obscurity, until she was labeled a genius. And then she became one. She embodied what I call the 'genius myth,' the idea that humanity contains a special sort of person, what Samuel Johnson's dictionary defined in 1755 as 'a man endowed with superiour faculties.'

Seeing yourself as such can be poisonous: Think of the public intellectuals who embarrass themselves by straying far from their area of expertise. Think of the smart people who twist logic in impressive ways to convince themselves of crankish ideas. Think of, say, a man who has had great success in business, who decides that means he must be equally good at cutting government bureaucracy. One of the cruelest things about the genius myth is that its sufferers cannot understand their failures: I'm so clever. I can't possibly have screwed this up. I prefer to talk about moments of genius: beautiful paintings, heartbreaking novels, inspired military or political decisions, scientific breakthroughs, technological marvels.

Nowhere are the downsides of the genius myth more obvious than in ultrahigh-IQ societies. I don't mean Mensa, which began in England after the Second World War; it asks only that members are drawn from the top 2 percent of the population. Even more rarified are groups such as the Mega Society, which was limited to people with 'one-in-a-million' intelligence. Vos Savant made the cut. The funny thing about ultrahigh-IQ groups is that they quarrel and schism with a frequency otherwise reserved for doomsday cults and fringe political movements.
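The Monty Hall answer vos Savant defended (switch doors, because switching wins two-thirds of the time) is easy to check empirically. This simulation sketch is mine, not drawn from her column:

```python
import random

def monty_hall(trials: int = 100_000, switch: bool = True) -> float:
    """Simulate the Monty Hall game and return the empirical win rate."""
    wins = 0
    for _ in range(trials):
        car = random.randrange(3)            # door hiding the car
        pick = random.randrange(3)           # contestant's first choice
        # Host opens a door that is neither the contestant's pick nor the car.
        opened = next(d for d in range(3) if d != pick and d != car)
        if switch:
            # Switch to the one remaining closed door.
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += pick == car
    return wins / trials

print(monty_hall(switch=True))    # ~0.667: switching wins about 2/3 of the time
print(monty_hall(switch=False))   # ~0.333: staying wins about 1/3 of the time
```

The intuition the angry readers missed: the first pick is right only 1/3 of the time, and switching wins exactly when the first pick was wrong.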
An exhaustive online history of the high-IQ movement, compiled by the blogger Darryl Miyaguchi in the 1990s, recounts the story of the Cincinnatus Society, which admitted only those with an IQ higher than 99.9 percent of the population. It usurped a previous group with the same criteria, called the Triple Nine Society, which was itself a breakaway faction from another group, the International Society for Philosophical Enquiry.

From the start, Mega was riven by infighting. In the 1990s, it merged with another society and announced that members would have to retake the entry test. This prompted something close to a civil war, and by 2003, the various factions in the high-IQ movement were so splintered that a dispute over who could use the group's name ended up in court.

The loser in that case, Christopher Langan, has a Facebook group where he outlines his 'Cognitive Theoretical Model of the Universe,' as well as his belief that George W. Bush staged the 9/11 attacks to stop people from learning about Langan's cognitive-theoretical model of the universe. In another post, he wrote that humanity was failing because 'rich libtards' were 'pandering like two-dollar whores to the degenerate tastes, preferences, and delusions of the genetic underclass, the future of humanity be damned.' Is Langan smart? Yes. Is he insightful about humanity, or at least fun to be around? Perhaps not.

Another onetime member of Mega was Keith Raniere, whose local paper, the Albany Times Union, claimed in 1988 that his self-administered test proved his intellect was 'one in 10 million.' In 2020, he was sentenced to 120 years in prison over the abuse he perpetrated as the leader of a cult called NXIVM. This operated according to a 'master and slave' hierarchy in which no one ranked higher than Raniere, who was known as 'Vanguard.' Some of NXIVM's disciples were branded with Raniere's initials. (Prosecutors also branded the group a pyramid scheme.)
As the cult collapsed, many of Raniere's early claims to genius came under new scrutiny. Had he really learned to read the word homogenized off a milk carton at age 2, and understood quantum physics by 4, as a news reporter had suggested in 1988—and was he also an avid juggler who needed only 'two to four hours of sleep'? People began to wonder, and then noticed something potentially important: The Mega test was not supervised, could be taken at home, and had no time limit. Draw your own conclusions.

Today, because of their infighting and their members' lack of worldly success, high-IQ groups have become kind of a joke. But their history helps illuminate why intelligence alone does not necessarily yield sublime works. In the 1980s, when some of these groups' members were asked to propose a term for the intangible quality that distinguished them from everyone else, none chose genius, according to a contemporaneous account by Grady Towers, a stalwart of the high-IQ community. 'When asked what it should be called, they produced a number of suggestions, sometimes esoteric, sometimes witty, and often remarkably vulgar,' Towers wrote in 1987. 'But one term was suggested independently again and again. Many thought that the most appropriate term for people like themselves was Outsider.'

[Read: The decline and fall of Elon Musk]

Towers believed that those with unusually high intelligence fell into three groups: the well-adjusted middle class, who were able to use their talents; those living marginal lives, working in manual or low-paid jobs and reading textbooks by night; and finally the dropouts, whose families had had no idea how to support their brilliant children, and might have gone so far as to treat them as a 'performing animal, or even an experiment.' The first group did not get involved with high-IQ societies, Towers thought, because their intellectual and social lives were already full.
'It's the exceptionally gifted adult who feels stifled that stands most in need of a high IQ society,' he wrote, adding that 'none of these groups is willing to acknowledge or come to terms with the fact that much of their membership belong to the psychological walking wounded.' The predominance of the lonely, frustrated, and socially awkward in ultrahigh-IQ societies was enough, he wrote, 'to explain the constant schisms that develop, the frequent vendettas, and the mediocre level of their publications. But those are not immutable facts; they can be changed. And the first step in doing so is to see ourselves as we are.'

Grady Towers was murdered on March 20, 2000, while investigating a break-in at the park in Arizona where he worked as a security guard. He was 55.

In 1990, The Guinness Book of World Records retired the highest-IQ category, conceding that no definitive ranking was possible, given the limitations of and the variation among the available tests. This new mood of caution means that vos Savant's Guinness record will remain untouched. If, that is, it was a record at all—critics have been arguing about the validity of her result for decades.

Why does the superlative matter? Because vos Savant couldn't and wouldn't have become a 'genius' without the label being pinned on her first. Attention was paid, and then more attention followed, because if people were looking, then there must have been something worth looking at, surely. That should make us wonder if the same process happens in reverse. Do children who struggle at school get the message that they aren't 'academic,' and lose interest and enthusiasm?

By thinking about IQ, I was venturing into one of the most bitter battles in 20th-century social science. In the decades following the development of standardized tests, the 'IQ wars' pitted two factions against each other: the environmentalists and the hereditarians.
The first believed that IQ was entirely or largely influenced by surroundings—childhood nutrition, schooling, and so on—and the second argued that IQ was largely determined by genes. In America, these became synonymous with two extreme positions: hard-left advocacy for pure blank-slatism and far-right belief in racial hierarchy. The hereditarians were tainted by the fact that so many of them dabbled in the murky waters of race and IQ—extrapolating beyond the observed differences in average IQ scores across various countries to the suggestion that white people are innately and immutably smarter than Black people. One example would be the Nobel Prize–winning engineer William Shockley, who followed what now seems a very modern trajectory: years of real achievements, including his involvement in the invention of the transistor, followed by a second career of provocative statements and complaints about what we would now call 'cancellation.' Shockley's views on white racial superiority were coupled with his advocacy for eugenics. In a 1980 interview with Playboy, he argued that people with 'defective' genes should be paid not to reproduce. As he put it: '$30,000 put into a trust for a 70 IQ-moron, who might otherwise produce 20 children, might make the plan very profitable to the taxpayer.' But the environmentalists went too far in their claims too. Most geneticists now acknowledge that IQ is partially heritable, even though progressive activists attack almost anyone who says so out loud. When the geneticist Kathryn Paige Harden began to advance the arguments she would later turn into her 2021 book, The Genetic Lottery—which argued for social equality but conceded that genes influence educational attainment—The New Yorker reported that she was subjected to 'parades of arguments and counterarguments, leaked personal e-mails, and levels of sustained podcasting that were, by anyone's standards, extreme.' 
Fascinated by the dangerous allure of IQ—its promise to provide a definitive ranking of human intellectual worth—I decided to sit for an IQ test myself. At the exam site, I was one of two dozen adults, plus a couple of children. One was reading a book called Why the West Rules—For Now, which didn't assuage my worries about the political overtones of this debate. The question of what exactly IQ tests measure—and how accurately they can deliver judgment—is one that's wrapped around inflammatory questions about group identity, as well as a lively policy debate about the best system of schooling. It is no accident that so many IQ researchers have ended up endorsing scientific racism or sexism. If humans can be reduced to a number, and some numbers are higher than others, it is not a long walk to decide that some humans are 'better' than others too. In 2018, Christopher Langan wrote an obituary for Koko, a celebrated gorilla that he said could sign 1,000 words and therefore had an IQ between 75 and 95. 'Koko's elevated level of thought would have been all but incomprehensible to nearly half the population of Somalia (average IQ 68),' Langan wrote on Facebook, citing dubious research about that African country. 'Obviously, this raises a question: Why is Western civilization not admitting gorillas? They too are from Africa, and probably have a group mean IQ at least equal to that of Somalia.' Langan was featured in Malcolm Gladwell's book Outliers, which attributed his lack of academic success to his chaotic, violent upbringing and the reluctance of educational authorities to extend him the same sort of grace and understanding a middle-class child might receive. But Langan has found other answers for why he did not fulfill the glorious destiny written in his genes. He blames affirmative action and a society controlled by 'globalists' and 'banksters.' Inevitably, he has a Substack. As for me, I took two IQ tests that day. 
The first was a test designed in 1949 to be 'culture fair,' meaning that there were no language- or logic-based questions, only shape rotation. What became immediately apparent is that the test selects heavily for speed. The strict time limits mean you simply don't have time to luxuriate over questions, turning them over in your head. Now, you could argue that quickly grasping concepts is exactly what intelligence is. But you'd also have to admit that some of history's greatest breakthroughs came from years of careful observation and rumination. That first test convinced me that whatever an IQ test is measuring, it can't be genius—that label we are so keen to bestow on people with singular achievements. It doesn't measure showing up day after day. It doesn't measure the ego necessary to insist that you're right and everyone else is wrong. And it doesn't measure the ability to market yourself as the spirit of the age. [Read: A reality check for tech oligarchs] The second test was more recent, having been updated in 1993, and leaned heavily into verbal reasoning. What I noticed here, first, was how arguable some of these questions were. Is idle a synonym for inactive or a synonym for lazy? Both, surely—it can be used as a pure descriptor, as in 'an idle engine,' or to convey a value judgment, as in 'the idle rich.' My desire to argue with the test maker only increased in the analogies section, where the example given was: 'Trousers are to boy as skirt is to … ?' The supervisor read this out with some embarrassment, assuring us that the language was 'traditional.' Things got worse. The logic puzzles in the final section included one about an explorer who might have been eaten by either lions or 'savages.' Another question asked me to work out what my surname would be, based on clues about family relationships, and clearly rested on the assumption that women all took their husband's name, and so would their children. 
Full of feminist zeal, I prissily ticked the box labeled 'It is not possible to know what my surname is' and resigned myself to losing points. What were my results? Sorry—I'm not saying; we already know I'm not a genius, but I'm not an outsider either, so they don't matter. My time researching Langan, Raniere, and the others convinced me that IQ testing has narrow scientific uses, but it is a false god. Vos Savant, who is now 78, made a career of being the smartest person alive, because she had a number to prove it. Once she was hailed as a genius, vos Savant was one. Nothing about her changed, but her life did. As big a brain as Stephen Hawking had little time for this kind of thinking. In a 2004 Q&A with The New York Times Magazine, the physicist was asked what his IQ was. 'I have no idea,' he replied. 'People who boast about their IQ are losers.' This article was adapted from The Genius Myth: A Curious History of a Dangerous Idea, which will be published in the United States on June 17. Article originally published at The Atlantic

A High IQ Makes You an Outsider, Not a Genius

The Atlantic
Who has the highest IQ in history? One answer would be: a 10-year-old girl from Missouri. In 1956, according to lore, she took a version of the Stanford-Binet IQ test and recorded a mental age of 22 years and 10 months, equivalent to an IQ north of 220. (The minimum score needed to get into Mensa is 132 or 148, depending on the test, and the average IQ in the general population is 100.) Her result lay unnoticed for decades, until it turned up in The Guinness Book of World Records, which lauded her as having the highest childhood score ever. Her name, appropriately enough, was Marilyn vos Savant. And she was, by the most common yardstick, a genius. I've been thinking about which people attract the genius label for the past few years, because it's so clearly a political judgment. You can tell what a culture values by who it labels a genius—and also what it is prepared to tolerate. The Renaissance had its great artists. The Romantics lionized androgynous, tubercular poets. Today we are in thrall to tech innovators and brilliant jerks in Silicon Valley. Vos Savant hasn't made any scientific breakthroughs or created a masterpiece. She graduated 178th in her high-school class of 613, according to a 1989 profile in New York magazine. She married at 16, had two children by 19, became a stay-at-home mother, and was divorced in her 20s. She tried to study philosophy at Washington University in St. Louis, but did not graduate. She married again and was divorced again at 35. She became a puzzle enthusiast, joined a high-IQ society, and occasionally wrote an essay or a satirical piece under a pen name for a newspaper. Mostly, she devoted herself to raising her boys. That all changed in 1985, when The Guinness Book of World Records published her childhood IQ score. How its authors obtained the record is murky: An acquaintance once told the Financial Times that he'd urged her to submit her result as a way of making her famous. 
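The "north of 220" figure follows from how early Stanford-Binet scoring worked: a ratio IQ is simply mental age divided by chronological age, times 100. A quick sketch of the arithmetic (assuming, since the article gives only "10-year-old," a chronological age of exactly 10 years):

```python
# Ratio IQ as used in early Stanford-Binet scoring:
# IQ = (mental age / chronological age) * 100
mental_age_months = 22 * 12 + 10    # 22 years, 10 months = 274 months
chronological_age_months = 10 * 12  # assumed: exactly 10 years old

ratio_iq = mental_age_months / chronological_age_months * 100
print(round(ratio_iq, 1))  # 228.3
```

Modern tests abandoned this ratio method for "deviation IQ," which scores against the distribution of same-age test-takers, one reason such extreme childhood scores are no longer directly comparable to adult results.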
Thanks to all the publicity, vos Savant met her third husband, Robert Jarvik, who had developed a pioneering model of an artificial heart. Jarvik had his own story of being overlooked: Before ultimately enrolling in medical school at the University of Utah, he had been rejected by 15 other institutions. He tracked down vos Savant after seeing her on the cover of an airline magazine, and she agreed to a date after finding a picture of him taken by Annie Leibovitz. They quickly became an item, and eventually took up residence in New York. At their 1987 wedding, the rings were made of gold and pyrolytic carbon, a material used in Jarvik's artificial heart. The science-fiction writer Isaac Asimov gave away the bride. A news report has them telling their guests that they were relieved to meet each other, because they found most people difficult to talk to—the implication being that mere mortals were not on their wavelength. The honeymoon would be spent in Paris, they revealed; vos Savant would write a screenplay for a futuristic satire, and Jarvik would continue researching his 'grand unification theory' of physics. Yet despite their superior brains, vos Savant's screenplay was never made into a film, and Jarvik—who, according to a New York profile of the couple, thought the Big Bang theory was 'wrong' and the theory of relativity was 'probably wrong'—did not revolutionize physics. What did happen, though, is that on the back of her anointment in Guinness, vos Savant built a career as a professional genius. She wrote books such as the Omni I.Q. Quiz Contest and Brain Building in Just 12 Weeks. Billing her as 'the smartest person in the world,' Parade magazine gave her an advice column, where she answered readers' queries and published puzzles. (She didn't respond to my attempts to contact her through the magazine.) Her specialty was logic problems—which showcase the particular type of mental ability most readily identified by IQ tests. 
In one column, she provided a solution for an apparently insoluble conundrum, the Monty Hall problem. Angry readers wrote in to correct her, but she stood firm. Vos Savant's life perfectly illustrates how genius can be a self-fulfilling prophecy. She was a housewife raising her children in total obscurity, until she was labeled a genius. And then she became one. She embodied what I call the 'genius myth,' the idea that humanity contains a special sort of person, what Samuel Johnson's dictionary defined in 1755 as 'a man endowed with superiour faculties.' Seeing yourself as such can be poisonous: Think of the public intellectuals who embarrass themselves by straying far from their area of expertise. Think of the smart people who twist logic in impressive ways to convince themselves of crankish ideas. Think of, say, a man who has had great success in business, who decides that means he must be equally good at cutting government bureaucracy. One of the cruelest things about the genius myth is that its sufferers cannot understand their failures: I'm so clever. I can't possibly have screwed this up. I prefer to talk about moments of genius: beautiful paintings, heartbreaking novels, inspired military or political decisions, scientific breakthroughs, technological marvels. Nowhere are the downsides of the genius myth more obvious than in ultrahigh-IQ societies. I don't mean Mensa, which began in England after the Second World War; it asks only that members are drawn from the top 2 percent of the population. Even more rarified are groups such as the Mega Society, which was limited to people with 'one-in-a-million' intelligence. Vos Savant made the cut. The funny thing about ultrahigh-IQ groups is that they quarrel and schism with a frequency otherwise reserved for doomsday cults and fringe political movements. 
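Vos Savant's contested answer, that the contestant should switch doors and will then win two-thirds of the time, is easy to verify empirically. A minimal simulation of the standard setup (assuming, as the classic statement does, that the host always opens a losing door the player did not pick):

```python
import random

def play(switch: bool) -> bool:
    """One round of the standard Monty Hall game; returns True on a win."""
    doors = [0, 1, 2]
    car = random.choice(doors)
    pick = random.choice(doors)
    # The host opens a door that hides a goat and isn't the player's pick.
    opened = random.choice([d for d in doors if d != pick and d != car])
    if switch:
        # Switch to the one remaining unopened door.
        pick = next(d for d in doors if d != pick and d != opened)
    return pick == car

trials = 100_000
wins_stay = sum(play(switch=False) for _ in range(trials)) / trials
wins_switch = sum(play(switch=True) for _ in range(trials)) / trials
print(f"stay ≈ {wins_stay:.3f}, switch ≈ {wins_switch:.3f}")  # ≈ 0.333 vs ≈ 0.667
```

The switching strategy wins whenever the initial pick was wrong, which happens two times out of three, exactly the answer her angry correspondents disputed.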
An exhaustive online history of the high-IQ movement, compiled by the blogger Darryl Miyaguchi in the 1990s, recounts the story of the Cincinnatus Society, which admitted only those with an IQ higher than 99.9 percent of the population. It usurped a previous group with the same criteria, called the Triple Nine Society, which was itself a breakaway faction from another group, the International Society for Philosophical Enquiry. From the start, Mega was riven by infighting. In the 1990s, it merged with another society and announced that members would have to retake the entry test. This prompted something close to a civil war, and by 2003, the various factions in the high-IQ movement were so splintered that a dispute over who could use the group's name ended up in court. The loser in that case, Christopher Langan, has a Facebook group where he outlines his 'Cognitive Theoretical Model of the Universe,' as well as his belief that George W. Bush staged the 9/11 attacks to stop people from learning about Langan's cognitive-theoretical model of the universe. In another post, he wrote that humanity was failing because 'rich libtards' were 'pandering like two-dollar whores to the degenerate tastes, preferences, and delusions of the genetic underclass, the future of humanity be damned.' Is Langan smart? Yes. Is he insightful about humanity, or at least fun to be around? Perhaps not. Another onetime member of Mega was Keith Raniere, whose local paper, the Albany Times Union, claimed in 1988 that his self-administered test proved his intellect was 'one in 10 million.' In 2020, he was sentenced to 120 years in prison over the abuse he perpetrated as the leader of a cult called NXIVM. This operated according to a 'master and slave' hierarchy in which no one ranked higher than Raniere, who was known as 'Vanguard.' Some of NXIVM's disciples were branded with Raniere's initials. (Prosecutors also branded the group a pyramid scheme.) 
As the cult collapsed, many of Raniere's early claims to genius came under new scrutiny. Had he really learned to read the word homogenized off a milk carton at age 2, and understood quantum physics by 4, as a news reporter had suggested in 1988—and was he also an avid juggler who needed only 'two to four hours of sleep'? People began to wonder, and then noticed something potentially important: The Mega test was not supervised, could be taken at home, and had no time limit. Draw your own conclusions. Today, because of their infighting and their members' lack of worldly success, high-IQ groups have become kind of a joke. But their history helps illuminate why intelligence alone does not necessarily yield sublime works. In the 1980s, when some of these groups' members were asked to propose a term for the intangible quality that distinguished them from everyone else, none chose genius, according to a contemporaneous account by Grady Towers, a stalwart of the high-IQ community. 'When asked what it should be called, they produced a number of suggestions, sometimes esoteric, sometimes witty, and often remarkably vulgar,' Towers wrote in 1987. 'But one term was suggested independently again and again. Many thought that the most appropriate term for people like themselves was Outsider.' Towers believed that those with unusually high intelligence fell into three groups: the well-adjusted middle class, who were able to use their talents; those living marginal lives, working in manual or low-paid jobs and reading textbooks by night; and finally the dropouts, whose families had had no idea how to support their brilliant children, and might have gone so far as to treat them as a 'performing animal, or even an experiment.' The first group did not get involved with high-IQ societies, Towers thought, because their intellectual and social lives were already full. 
'It's the exceptionally gifted adult who feels stifled that stands most in need of a high IQ society,' he wrote, adding that 'none of these groups is willing to acknowledge or come to terms with the fact that much of their membership belong to the psychological walking wounded.' The predominance of the lonely, frustrated, and socially awkward in ultrahigh-IQ societies was enough, he wrote, 'to explain the constant schisms that develop, the frequent vendettas, and the mediocre level of their publications. But those are not immutable facts; they can be changed. And the first step in doing so is to see ourselves as we are.' Grady Towers was murdered on March 20, 2000, while investigating a break-in at the park in Arizona where he worked as a security guard. He was 55. In 1990, The Guinness Book of World Records retired the highest-IQ category, conceding that no definitive ranking was possible, given the limitations of and the variation among the available tests. This new mood of caution means that vos Savant's Guinness record will remain untouched. If, that is, it was a record at all—critics have been arguing about the validity of her result for decades. Why does the superlative matter? Because vos Savant couldn't and wouldn't have become a 'genius' without the label being pinned on her first. Attention was paid, and then more attention followed, because if people were looking, then there must have been something worth looking at, surely. That should make us wonder if the same process happens in reverse. Do children who struggle at school get the message that they aren't 'academic,' and lose interest and enthusiasm? By thinking about IQ, I was venturing into one of the most bitter battles in 20th-century social science. In the decades following the development of standardized tests, the 'IQ wars' pitted two factions against each other: the environmentalists and the hereditarians.
The first believed that IQ was entirely or largely influenced by surroundings—childhood nutrition, schooling, and so on—and the second argued that IQ was largely determined by genes. In America, these became synonymous with two extreme positions: hard-left advocacy for pure blank-slatism and far-right belief in racial hierarchy. The hereditarians were tainted by the fact that so many of them dabbled in the murky waters of race and IQ—extrapolating beyond the observed differences in average IQ scores across various countries to the suggestion that white people are innately and immutably smarter than Black people. One example would be the Nobel Prize–winning engineer William Shockley, who followed what now seems a very modern trajectory: years of real achievements, including his involvement in the invention of the transistor, followed by a second career of provocative statements and complaints about what we would now call 'cancellation.' Shockley's views on white racial superiority were coupled with his advocacy for eugenics. In a 1980 interview with Playboy, he argued that people with 'defective' genes should be paid not to reproduce. As he put it: '$30,000 put into a trust for a 70 IQ-moron, who might otherwise produce 20 children, might make the plan very profitable to the taxpayer.' But the environmentalists went too far in their claims too. Most geneticists now acknowledge that IQ is partially heritable, even though progressive activists attack almost anyone who says so out loud. When the geneticist Kathryn Paige Harden began to advance the arguments she would later turn into her 2021 book, The Genetic Lottery—which argued for social equality but conceded that genes influence educational attainment—The New Yorker reported that she was subjected to 'parades of arguments and counterarguments, leaked personal e-mails, and levels of sustained podcasting that were, by anyone's standards, extreme.'
Fascinated by the dangerous allure of IQ—its promise to provide a definitive ranking of human intellectual worth—I decided to sit for an IQ test myself. At the exam site, I was one of two dozen adults, plus a couple of children. One was reading a book called Why the West Rules—For Now, which didn't assuage my worries about the political overtones of this debate. The question of what exactly IQ tests measure—and how accurately they can deliver judgment—is one that's wrapped around inflammatory questions about group identity, as well as a lively policy debate about the best system of schooling. It is no accident that so many IQ researchers have ended up endorsing scientific racism or sexism. If humans can be reduced to a number, and some numbers are higher than others, it is not a long walk to decide that some humans are 'better' than others too. In 2018, Christopher Langan wrote an obituary for Koko, a celebrated gorilla that he said could sign 1,000 words and therefore had an IQ between 75 and 95. 'Koko's elevated level of thought would have been all but incomprehensible to nearly half the population of Somalia (average IQ 68),' Langan wrote on Facebook, citing dubious research about that African country. 'Obviously, this raises a question: Why is Western civilization not admitting gorillas? They too are from Africa, and probably have a group mean IQ at least equal to that of Somalia.' Langan was featured in Malcolm Gladwell's book Outliers, which attributed his lack of academic success to his chaotic, violent upbringing and the reluctance of educational authorities to extend him the same sort of grace and understanding a middle-class child might receive. But Langan has found other answers for why he did not fulfill the glorious destiny written in his genes. He blames affirmative action and a society controlled by 'globalists' and 'banksters.' Inevitably, he has a Substack. As for me, I took two IQ tests that day. 
The first was a test designed in 1949 to be 'culture fair,' meaning that there were no language- or logic-based questions, only shape rotation. What became immediately apparent is that the test selects heavily for speed. The strict time limits mean you simply don't have time to luxuriate over questions, turning them over in your head. Now, you could argue that quickly grasping concepts is exactly what intelligence is. But you'd also have to admit that some of history's greatest breakthroughs came from years of careful observation and rumination. That first test convinced me that whatever an IQ test is measuring, it can't be genius—that label we are so keen to bestow on people with singular achievements. It doesn't measure showing up day after day. It doesn't measure the ego necessary to insist that you're right and everyone else is wrong. And it doesn't measure the ability to market yourself as the spirit of the age. The second test was more recent, having been updated in 1993, and leaned heavily into verbal reasoning. What I noticed here, first, was how arguable some of these questions were. Is idle a synonym for inactive or a synonym for lazy? Both, surely—it can be used as a pure descriptor, as in 'an idle engine,' or to convey a value judgment, as in 'the idle rich.' My desire to argue with the test maker only increased in the analogies section, where the example given was: 'Trousers are to boy as skirt is to … ?' The supervisor read this out with some embarrassment, assuring us that the language was 'traditional.' Things got worse. The logic puzzles in the final section included one about an explorer who might have been eaten by either lions or 'savages.' Another question asked me to work out what my surname would be, based on clues about family relationships, and clearly rested on the assumption that women all took their husband's name, and so would their children. 
Full of feminist zeal, I prissily ticked the box labeled 'It is not possible to know what my surname is' and resigned myself to losing points. What were my results? Sorry—I'm not saying; we already know I'm not a genius, but I'm not an outsider either, so they don't matter. My time researching Langan, Raniere, and the others convinced me that IQ testing has narrow scientific uses, but it is a false god. Vos Savant, who is now 78, made a career of being the smartest person alive, because she had a number to prove it. Once she was hailed as a genius, vos Savant was one. Nothing about her changed, but her life did. As big a brain as Stephen Hawking had little time for this kind of thinking. In a 2004 Q&A with The New York Times Magazine, the physicist was asked what his IQ was. 'I have no idea,' he replied. 'People who boast about their IQ are losers.'

This article was adapted from The Genius Myth: A Curious History of a Dangerous Idea, which will be published in the United States on June 17.
