Former journalist uncovers ancient human activity at Gault Site in Central Texas

CBS News | April 26, 2025

If you've never heard of Gault, lean in.
"So, leave it to a Texan to go to Antarctica and come home with a story about Texas," said Dallas' Olive Talley with a laugh.
It was 2017. The former journalist was enjoying a lecture on the peopling of the Americas. Her ears perked up when she heard Texas.
"And he told us about this place in central Texas that has helped rewrite history," said Talley. "And all of us Texans turned to each other and said, 'Have you ever heard of this? No.' And I thought, as a journalist, 'Why not? Why the heck had I never heard of this place?'"
Talley's natural curiosity kicked in and she reached out to Mike Collins, the University of Texas at Austin archaeologist responsible for discovering the site. So, yes, she began to dig. Sorry, I couldn't resist, and Talley played along.
"Yes, I do love that pun," she said with a chuckle. "Yes, I dug into Gault and realized that we have this special gem, this hidden treasure of an archaeological site here in our backyard in central Texas. And most people have never heard of it."
Talley's five-year journey to change that became an award-winning independent film called "The Stones Are Speaking."
"Experts thought the first people came into the Americas around 13,000 or 13,500 years ago," said Talley. "But how can that be if, in fact, there's this little spot in what is now modern-day Texas where there's evidence of people living here and camping here 20,000 years ago?"
The first humans in the Americas were long thought to be the Clovis culture. The evidence Collins and his team discovered put people in central Texas thousands of years earlier than experts once believed. But securing the artifacts that proved it was at times like shoveling quicksand.
"The first time he walked onto the site, it looked like a World War I battlefield with all these big holes gouged out," said Talley. "But he looked beyond that, and he saw a landscape. And he's looking through the eyes of a geologist and an archaeologist, and he's saying, 'There's a constant supply of water. There's a big supply of chert, which people make their stone tools out of... and there were food sources for people and animals. You had everything here to sustain life in prehistoric times.'"
And it almost remained hidden. The site had been on the radar of both expert and amateur collectors for nearly a century. Generations of landowners had allowed visitors to dig for arrowheads for a fee. But treasures far more precious were hidden in the bedrock below, just waiting for the right mix of curiosity, commitment and expertise. UT archaeologist Dr. Mike Collins, said Talley, was all of these and more.
"And as I got to know Mike Collins and as I learned about all the thousands of volunteers who he rallied, who felt the same way about him that I was feeling, I just thought, 'There's a wonderful, untold story here about the perseverance and the passion that it took to develop this place, protect this place, and reveal all these secrets.'"
Although it would take years for Talley to tell Collins' story, it took much longer for him to prove that there was a story that needed to be told. He first began exploring the site in 1991. But at one point, citing ethical concerns, the university pulled its support and the dig was shut down. Later, after those issues were resolved, the landowners stopped cooperating and the site was closed to scientists again. But Collins, said Talley, was as stubborn as the rocks being dug out of the ground around him.
"After overcoming many odds, he purchased the property with his own money, mortgaging a lot of personal property that he and his wife had, and they bought the place to save it because the landowners were going to sell it," said Talley.
Collins not only purchased what is now known as the Gault Site, he then promptly gave it away, turning it over to local nonprofits to ensure that its treasures would be protected for generations to come.
"It could have become a development," said Talley. "There could have been houses there. It's surrounded by 20 different rock quarries! So I mean, who knows what would have happened to the property?"
Talley's moviemaking mission was to allow the stones to speak. The celebration of its success, though, is not how the story ends.
"So I feel such pressure to tell his story..." Talley paused, fighting for composure, and her eyes filled with tears. "Mike Collins is one of the most passionate, selfless, inspiring people I've ever met in my life. I guess what I find so tragic about this story is that this brilliant mind is being eroded by this horrible disease called Alzheimer's, and I feel compelled and I felt compelled throughout the whole project to do this story and get it done, because I knew that at some point Mike could not tell his own story. And so I'm telling his story on his behalf."
The history-altering site that Collins saved is now open for monthly tours and future scientists, all drawn to hear what the stones have to say.
And why should we listen?
"It's history," said Talley. "It's our history. It's human history, and it's really fascinating. If you'll just stop and think and let the stones speak to you."
"The Stones Are Speaking" will air as part of the USA Film Festival on Sunday, April 27 at 4 p.m. at the Angelika Theater in Dallas.


Related Articles

New book details 'troubling history' of eugenics in Texas

Yahoo | 3 days ago

AUSTIN (KXAN) — Eugenics, or the pseudoscience of human breeding, reached the height of its notoriety in the early 1900s but never truly disappeared, according to a new book that examines the history and influence of the debunked movement in Texas. 'The Purifying Knife: The Troubling History of Eugenics in Texas,' written by Michael Phillips and Betsy Friauf, was published this week. Co-author and former history professor Dr. Michael Phillips spoke with KXAN about the book on Wednesday.

'We had mixed feelings doing this book, because this comes in a time when there's so much dangerous rejection of science in terms of vaccines, in terms of climate change and other issues,' Phillips said.

Phillips, who earned his doctorate in 2002 at the University of Texas at Austin, has focused his work on the history of racism in Texas. His first book, 'White Metropolis: Race, Ethnicity, and Religion in Dallas, 1841–2001,' was published in 2006 and built on his UT Austin thesis. His and Friauf's research for the new book began in 2014.

'Although there were a lot of victims on the way to this science becoming marginalized, the scientific method ultimately did work, and mainstream science did reject it,' he said. 'Eugenics was accepted all across the political spectrum from the very conservative to what were called progressives then, who were the forerunners of liberals today, and it was just accepted scientific wisdom.'

Before science moved past eugenics, 36 states passed laws enacting some of the movement's ideas, including the forced sterilization of people deemed 'unfit'; at least 60,000 people were victims of these laws. Texas was one of 12 states that didn't pass eugenics laws, Phillips said.

'Cotton growers in Texas and the big landowners were very much in favor of immigration, because they wanted to exploit Mexican workers as underpaid labor in their fields,' he said. 'Eugenicists were very anti-immigration. So [Texas] had a powerful economic interest that was afraid that if eugenics became law, that immigration from Mexico would stop and that would drive up the cost of their labor.'

Fundamentalist Protestantism, which had become a force in Texas politics in the 1920s, was also opposed to eugenicist ideas derived from Darwin's theory of evolution.

Phillips said he sees the emerging pro-natalist movement as a home for discredited eugenics ideas; a natalist conference at UT Austin in March featured speakers who described themselves as eugenicists, he said. He also criticized the environmental movement of the 1960s and '70s for accommodating eugenicists.

'I think natalism is easier to sell than outright explicit eugenics. I think a lot of times, modern eugenicists describe themselves as pro-family,' Phillips said. 'But in the 1960s and 1970s … there was a real panic about the world becoming overpopulated. And they really pushed for birth control policy, but they always focused on Africa, Asia and Latin America. It was always places where people of color lived that they wanted to control population.'

He warned that people should look critically at anyone who claims the existence of biological differences between racial groups or who believes IQ should determine whether a person should be allowed to reproduce.

'There's an assumption that somehow, 'smarter,' whatever that means, is better. And I don't think that necessarily bears up in history,' he said. 'People who had ethics, emotional intelligence, a sense of the need for community, may not have scored well on IQ tests, but they function better in and help contribute to a better society.'

Optimizing AI apps in a million-token world

Fast Company | 3 days ago

The context size problem in large language models is nearly solved. In recent months, models like GPT-4.1, LLaMA 4, and DeepSeek V3 have reached context windows ranging from hundreds of thousands to millions of tokens. We're entering a phase where entire documents, threads, and histories can fit into a single prompt. It marks real progress, but it also brings new questions about how we structure, pass, and prioritize information.

WHAT IS CONTEXT SIZE (AND WHY WAS IT A CHALLENGE)?

Context size defines how much text a model can process in one go. It is measured in tokens, which are small chunks of text, like words or parts of words. For years it shaped the way we worked with LLMs: splitting documents, engineering recursive prompts, summarizing inputs, anything to avoid truncation. Now, models like LLaMA 4 Scout can handle up to 10 million tokens, and DeepSeek V3 and GPT-4.1 go beyond 100K and 1M tokens, respectively. With those capabilities, many of those older workarounds can be rethought or even removed.

FROM BOTTLENECK TO CAPABILITY

This progress unlocks new interaction patterns. We're seeing applications that can reason and navigate across entire contracts, full Slack threads, or complex research papers. These use cases were out of reach not long ago.

However, just because models can read more does not mean they automatically make better use of that data. The paper 'Why Does the Effective Context Length of LLMs Fall Short?' examines this gap. It shows that LLMs often attend to only part of the input, especially the more recent or emphasized sections, even when the prompt is long. Another study, 'Explaining Context Length Scaling and Bounds for Language Models,' explores why increasing the window size does not always lead to better reasoning. Both pieces suggest that the problem has shifted from managing how much context a model can take to guiding how it uses that context effectively.

Think of it this way: Just because you can read every book ever written about World War I doesn't mean you truly understand it. You might scan thousands of pages but still fail to retain the key facts, connect the events, or explain the causes and consequences with clarity. What we pass to the model, how we organize it, and how we guide its attention are now central to performance. These are the new levers of optimization.

CONTEXT WINDOW ≠ TRAINING TOKENS

A model's ability to accept a large context does not guarantee that it has been trained to handle it well. Some models were exposed only to shorter sequences during training, so even if they accept 1M tokens, they may not make meaningful use of all that input. This gap affects reliability: a model might slow down, hallucinate, or misinterpret input if overwhelmed with too much or poorly organized data. Developers need to verify whether the model was fine-tuned for long contexts or simply adapted to accept them.

WHAT CHANGES FOR ENGINEERS

With these new capabilities, developers can move past earlier limitations. Manual chunking, token trimming, and aggressive summarization become less critical. But this does not remove the need for data prioritization. Prompt compression, token pruning, and retrieval pipelines remain relevant, and techniques like prompt caching help reuse portions of prompts to save costs (a sketch of these ideas follows at the end of this section). Mixture-of-experts (MoE) models, like those used in LLaMA 4 and DeepSeek V3, optimize compute by activating only relevant components. Engineers also need to track what parts of a prompt the model actually uses.
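As promised above, here is a minimal sketch of budget-aware prompt assembly. It is illustrative only: the Chunk type, the relevance scores, and the token counts are assumed to come from your own retriever and tokenizer, and the layout (stable system text first, per-request question last) follows the general cache-friendly pattern the article alludes to, not any specific vendor API.

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    text: str
    score: float   # relevance score, assumed to come from your retriever
    tokens: int    # token count, assumed to come from your tokenizer

def build_prompt(system: str, chunks: list[Chunk], question: str,
                 budget: int) -> str:
    """Fill the context window by relevance, not by arrival order."""
    picked, used = [], 0
    for chunk in sorted(chunks, key=lambda c: c.score, reverse=True):
        if used + chunk.tokens > budget:
            continue  # token pruning: drop chunks that would overflow
        picked.append(chunk)
        used += chunk.tokens
    # Cache-friendly layout: the stable prefix comes first so cached
    # prompt segments can be reused; the volatile question goes last.
    context = "\n\n".join(c.text for c in picked)
    return f"{system}\n\n{context}\n\nQuestion: {question}"
```

Even with a million-token window, a budget like this keeps the most relevant material in the prompt. Assembly is the easy half, though; the harder half is verifying what the model does with the result.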
Output quality alone does not guarantee effective context usage. Monitoring token relevance, attention distribution, and consistency over long prompts are new challenges that go beyond latency and throughput.

IT IS ALSO A PRODUCT AND UX ISSUE

For end users, the shift to larger contexts introduces more freedom, and more ways to misuse the system. Many users drop long threads, reports, or chat logs into a prompt and expect perfect answers. They often do not realize that more data can sometimes cloud the model's reasoning.

Product design must help users focus. Interfaces should clarify what is helpful to include and what is not. This might mean offering previews of token usage, suggestions to refine inputs, or warnings when the prompt is too broad (a sketch of such a preview appears after the conclusion below). Prompt design is no longer just a backend task, but rather part of the user journey.

THE ROAD AHEAD: STRUCTURE OVER SIZE

Larger context windows open important doors. We can now build systems that follow extended narratives, compare multiple documents, or process timelines that were previously out of reach. But clarity still matters more than capacity. Models need structure to interpret, not just volume to consume.

This changes how we design systems, how we shape user input, and how we evaluate performance. The goal is not to give the model everything. It is to give it the right things, in the right order, with the right signals. That is the foundation of the next phase of progress in AI systems.
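Returning to the product point above, a token-usage preview is cheap to build. Here is a minimal sketch, assuming OpenAI's open-source tiktoken tokenizer for counting; the window size and warning thresholds are illustrative assumptions, not figures from the article.

```python
import tiktoken

def preview_usage(prompt: str, window: int = 1_000_000) -> str:
    """Return a human-readable token-usage preview for a prompt."""
    encoder = tiktoken.get_encoding("cl100k_base")  # a common tokenizer
    n_tokens = len(encoder.encode(prompt))
    share = 100 * n_tokens / window
    if share > 80:
        hint = "consider trimming or splitting this input"
    elif share > 50:
        hint = "large prompt; check that every section is needed"
    else:
        hint = "within a comfortable range"
    return f"{n_tokens:,} tokens ({share:.1f}% of the window): {hint}"

print(preview_usage("Paste a long report or chat log here..."))
```

Surfacing a number like this before a request is sent nudges users toward the structured, focused inputs the article argues for.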

Giant plume of Saharan dust to hit US. What does it mean for tropical storm development?

Yahoo | 4 days ago

A plume of dust from the Sahara Desert is expected to sweep across the Gulf Coast this week, bringing hazy skies and reduced air quality to millions from Texas to Florida. While the heaviest concentrations are forecast for the southeastern U.S., the eastern half of Texas, particularly the stretch from Houston to Dallas, will still see dust-laden skies beginning this weekend.

Fortunately for Texans, the dust isn't expected to be as dense as in other states, though it may still pose discomfort for those with respiratory issues. It's also fortunate for residents of Florida and other Gulf Coast states, where the National Hurricane Center is monitoring a tropical system that has the potential to develop, as the dust could suppress and weaken the system that's right off the southeastern United States coastline.

What is Saharan dust? This refers to the dust carried by winds across the Atlantic from the Sahara Desert. Thunderstorms in the Sahara region stir up dust and push it into the atmosphere. The dust is then transported across the Atlantic and deposited in the Caribbean, Central America, and South America. While it primarily affects Puerto Rico, it can also reach states such as Florida and Texas. According to the CDC, Saharan dust transported to the United States is a normal occurrence, especially from late June to mid-August.

While Saharan dust can lead to hazy skies and poor air quality across parts of the Gulf Coast, it also plays a significant role in shaping tropical weather patterns, often for the better, at least in terms of storm suppression. Alex DaSilva, hurricane expert for AccuWeather, previously told USA TODAY that Saharan dust can act as a natural deterrent to tropical storm development.

"It basically can choke off these systems because, again, they want plenty of moisture, and when you're taking the moisture away, it makes it harder for thunderstorms to develop," DaSilva previously told USA TODAY.

In addition to drying out the atmosphere, the dust also increases wind shear and atmospheric stability, two other key factors that can hinder storm formation.

Although Saharan dust was crossing the Atlantic during the intensification of Hurricane Beryl, the storm was able to strengthen because it developed ahead of the densest plume of dust. This placed it in a pocket of relatively moist, unstable air, the kind of environment tropical systems need to grow.

A light layer of Saharan dust is expected to settle over South Texas on Monday, drifting north into the Dallas-Fort Worth area by Tuesday. Conditions should clear briefly on Wednesday before a thicker wave moves in Thursday and Friday. Another round is possible over the weekend or early next week.

This article originally appeared on Austin American-Statesman: What is Saharan dust? How it may impact tropical system near Florida
