Latest news with #InstituteforEducationSciences
Yahoo
24-03-2025
- Business
- Yahoo
Scholar Douglas Harris Debuts New 'Wikipedia' of K–12 Research
The actions of the Trump administration over the last few months could make it vastly more difficult to understand what's actually happening in schools. Already, the president's team has announced the cancellation of dozens of contracts through the Institute of Education Sciences, the Department of Education's research arm. Over 1,300 of the department's employees, amounting to roughly half of its workforce, have been terminated, casting doubt on whether key functions like national testing initiatives can carry on without interruption. And the future of dedicated learning hubs, including one credited with triggering a breakthrough in reading instruction, is in serious doubt.

The wave of cuts and firings was the unspoken agenda item at the 50th annual convening of the Association for Education Finance and Policy, one of the most prominent professional organizations for education researchers. In mid-March, amid three days of panels and paper sessions touching on every conceivable topic in K–12 schooling, hundreds of academic economists, education activists, graduate students, and district staffers exchanged concerns about the future of public insight into schools.

Ironically, those worries emerged just as AEFP unveiled a critical new tool: its Live Handbook of education policy research, gathering and distilling the findings of thousands of studies. Its 50 chapters address a bevy of questions ranging from preschool to higher education, including the makeup of local school boards, the performance of charters and school vouchers, teacher preparation programs, the effects of education spending, and more. The Association hopes the extensive and growing site, an update of previous printed versions, can provide educators and lawmakers alike with something akin to a Wikipedia for research.
Leading the effort is Tulane University economist Douglas Harris, a veteran researcher who also heads the Education Research Alliance for New Orleans and the National Center for Research on Education Access and Choice. An expert on charter schooling, Harris has been one of America's most productive scholars studying how district, state, and national policies shape what kids learn — and addressing some of the most contested questions in the field, including whether school choice actually improves the delivery of education.

In a conversation with The 74's Kevin Mahnken at the conference in Washington, Harris talked about the origins of the Live Handbook project, how its creators intend to ward off ideological bias, and why IES and other federal research efforts are irreplaceable supports in the U.S. education infrastructure. 'In some sense,' he said, 'the Live Handbook is a monument to IES, at a time when IES is being knocked over.'

What's the purpose of this project?

The idea is to make research more useful and make researchers more useful. One of our purposes is to just get research summarized and discussed in a way that's actually accessible to a broad audience, and another is to connect researchers to policymakers and journalists. If you're looking for an expert on an issue, you can find their names in these articles, click on them to get their information, and just email that person. We're hoping users create these networks of expertise and connect them with people who need that expertise.

I have to say, it sounds like you're trying to put education journalists out of work.

[Laughs] No! Part of what we're doing is offering journalists something they can easily cite. One of the exercises I told people to do when first developing this idea was to just see what they got after searching the internet for a summary of research on their favorite topic. The results were not very heartening.
So I think this will be very useful for journalists, who will be able to find something much more easily and link to it in their stories. And hopefully, when somebody is searching for a topic, the handbook will come up as the top result, which will make it build further. The more reach it has, the more people will want to write for it, and the more existing authors will want to update it — which is another important part of the document. It's not just static; it'll be updated every year, and all the authors will be expected to continue working on it. If they decide they don't want to do that, they're going to hand it back to us, and we can turn over the authorship to somebody else.

That kind of arrangement is actually unusual from the standpoint of intellectual property. We weren't sure how that was going to work at the beginning, or if it was even legal to do it that way. But it turns out that, as long as everyone is clear about it, you can write the agreement that way. Part of the motivation for this project was to marry the traditional handbook with Wikipedia, but with Wikipedia, there's no issue with authorship.

I didn't realize you took inspiration from Wikipedia.

Well, there had been some talk of doing another handbook, which we'd been doing just about every decade. They were all about 700 pages long, with 30 or 40 chapters written both for and by academics. We'd mostly use them as syllabi and readings for education policy classes, but the only real audience was other AEFP members. The other inspiration emerged from our website at the Education Research Alliance for New Orleans. We had something like 40 studies on our website, and the page with key conclusions was like an integrated summary. All this evidence was just for the relatively narrow topic of school reform in New Orleans. The question we were asking ourselves, and which you should always ask if you're writing something, was who our intended audience was.
What we realized was that we didn't have to choose between researchers and policymakers. The beginning of each entry looks like a policy brief, and non-researchers will probably stop when they get to the last key finding. But if you want more detail, you just click on that finding, which takes you to the longer discussion that would have been included in a printed version of the handbook. If you want even more than that, you can click on the endnotes, which will hyperlink you to the underlying studies themselves. So you're serving everybody: At the top, policymakers are your main audience, but by the time you get to the bottom, the researchers and experts in the field can dig in.

Up to this point, would you say that education research has been effectively communicated to the public, and that it has informed how politicians create policy and oversee schools?

Uh, hard no. [Laughs] We have not done a good job with those things. There have certainly been moves toward that. There's something called the Research Practice Partnership movement, which is supposed to develop genuine partnerships between the research and policy worlds, and it's great. But they're really hard to create and sustain, and they tend to be very localized. We wanted to do something that had broader reach.

We've also got the What Works Clearinghouse, which is federally funded. If you look at those releases, though, they never realized their potential. They were too slow, they were written by committee, not very readable. All of this was aiming in the right direction, but not hitting the target well. There was clearly a hole there that we're now trying to fill.

As you mentioned, the What Works Clearinghouse is a federal resource that's only a few decades old — although, given reports that its funding has been cut, it may not get much older.
Is the need for a live handbook related to the fact that the social science around education outcomes doesn't go back very far?

The federal government certainly led the way in moving toward evidence-based policy. And everything I just mentioned emerged out of that orientation, which was mandated by law around the time of No Child Left Behind and is still in effect today. There is also a natural demand, in the sense that people want to do the right thing. They want to make their K–12 schools and colleges better, and they want advice. But advice is usually pretty ad hoc; it depends on who's in your network, who's got a friend in a school nearby, and what they're hearing. There will always be a place for that, but having actual evidence at the root of those conversations has a lot of potential to improve things.

Can you think of an area of research where the evidence has managed to break out of academic discourse and influence the public?

Research doesn't drive most conversations about policy and practice, but it can have influence at the margins and create new ideas. The science of reading is the example that comes to mind immediately. Russ Whitehurst, who became the first director of IES, is a psychologist, and he was the one who really emphasized the reading research. Almost all the underlying evidence for that is IES-funded research, which is noteworthy under the current circumstances. Another example would be class-size reduction. There was a lot of interest in that for a while, until it became clear that, while it works pretty well, it's also very expensive. The school funding debates, and whether money matters in student learning, would be another case. I think people can get their arms around class sizes and budgets being important issues in schooling.
But even for someone like me, who has experience consulting research, it can be very difficult to weigh the evidence that various experts marshal on questions like teacher evaluation or early childhood education.

That's the hole we want to fill, right there. We want you to do that Google search and come to what we're doing because we have answers to those questions. We've got 50 chapters in this first round, and the plan is to update each of those next year while adding another 25. Part of it depends on funding, and growth is very time-intensive. We have to do all the things that a publisher does, putting it on the website and creating PDF versions and all that. But the idea is to grow the handbook so that it becomes comprehensive, both in terms of covering every part of education — early childhood through higher education — and also trying to cover all the key policy areas.

How comprehensive is comprehensive? There are really old but foundational studies, like the 1966 Coleman Report on segregation and achievement gaps, which were conducted when methods were much more crude. Are you trying to include that kind of evidence?

What we're doing is an awful lot without also trying to write the history of education research. The most recent research is obviously more relevant because context matters. The world is changing around us, and that affects education. So we want more recent studies. Another important thing to remember is that recent studies tend to be methodologically better — again, partly because of IES and the principles and demands that IES has placed upon us. We're telling authors to focus on the most recent and best studies, because those are going to be more useful to the field.

In some areas, like school finance, the debate among leading researchers still burns very hot. I'm sure it's difficult to arrive at anything like a consensus, so are you just trying to represent the state of play?

It's a challenge.
From the beginning, we didn't expect that we would release these chapters and people would say, 'Oh, you're exactly right!' Sometimes they'll say, 'Wait a second, I don't agree with that.' I wrote the charter school section, and we showed it to a group of policymakers and practitioners who are advising us. We brought them into the design process to make it more useful to them, and when I was doing a show-and-tell a couple of weeks ago, somebody said, 'I'm not sure I agree about your point on charter schools!' We knew that was going to happen.

So we've advised the authors to present different sides of the debate — the major positions that are pretty widely known — and, if a key finding falls clearly on one side, then at least address what the other side of the argument is. We're being really clear about the evidence base, and I can't just say, 'Such-and-such is true of charter schools because I feel like it.' It has to be because one group of charter studies is stronger than another group of studies, or there's something really unusual about the context of some studies that makes them less convincing. Basically, there has to be a reason that addresses the different sides of arguments.

What structures have you put in place to prevent a kind of ideological drift?

Our editorial board helps with that. It's a very wide-ranging group where you have some people who are seen as more on the left, and some who are more on the right. There are people from different disciplines. We've encouraged authors to include studies from outside their disciplines; quantitative people should include qualitative work, and vice versa.
The board is there to enforce that and make sure we're getting the right range of perspectives, so that when there are disagreements, someone can say, 'Hold on a second, you're missing something.' We're not going to be perfect in the first round, but the process is set up to be updated and receive feedback. If you're reading a piece and have a question, or you want to debate some point, you can click the feedback link and explain it. We'll send all that information to authors, and once a year, they'll be expected to go through those comments. If we see significant issues with a piece, we'll push them to update it.

And I imagine the various writers and editors can weigh in on other entries as well.

Here's the way we handle these discussions: I wrote the charter schools chapter, which was edited by [University of Arkansas professor] Patrick Wolf. He's the one pressing me on the evidence there, but I'm the editor for his section on school vouchers. We view those topics a little bit differently, but that's a way we enforce objectivity. We recognize that everybody's subject to that sort of bias.

That's an interesting pairing. From my perspective, the public debate around charter schools — which has been extremely contentious in the past — has become somewhat quiescent, while the voucher issue has just roared into prominence over the last few years.

There are a lot of studies in play on vouchers, so Pat will probably have to update his chapter next year, and every year after that. Much of the research in that area is old and based on the city-based voucher programs in places like Milwaukee or Washington. Then you had the four states where we could study statewide voucher programs, which are probably the most relevant to the current discussion. And we'll also be including three or four national studies that we've got going at the REACH Center, which I lead.
Part of the problem with the way the new voucher programs are set up is that, in a sense, they're designed not to be studied. There's no state testing requirement, so we don't have test-based outcomes, and you're confined in what you can study. Still, there will be a lot of interest in that topic.

What do you make of the cuts to federally supported research that have been announced over the last month?

It's a very big deal. If you look at the endnotes for this handbook, probably half of them have a basis in IES. Either the studies themselves were funded by IES, or they're using IES-funded data sets, or they're written by researchers who were in the IES pre-doc or post-doc programs. There was a whole set of training programs that were designed to develop the next generation of scholars. So in some sense, the Live Handbook is a monument to IES, at a time when IES is being knocked over.

It's very worrisome, and I think we're at a really uncertain time. I don't think IES is going to go away. But we don't really know what direction it's headed in or how long it will take to get there, given that they just fired essentially everybody. The best-case scenario is that they hire a new director who's allied with the administration and who has sympathy and a desire to build it back up. There's no question that there are ways the institute could be made better, but there are also a lot of things you'd want to keep about the old structure. It's good to have decisions made by people who are researchers and know the field. It's good to have policymakers involved in decisions about what gets funded, which has been true for a long time. Should the research process be faster? Sure, we could find ways to do that. So it's possible that IES comes out better at the end of this. But will it? It's a huge question right now. There's obviously a lot of concern at a conference like this, where people have seen IES as the root of so much of the work we're doing.
Virtually every researcher I've spoken to has said something similar. People will generally concede that improvements can be made, but where the process calls for a scalpel, DOGE is using a dump truck.

I think that's right. In the amount of time they had, they couldn't have possibly learned which grants or contracts should be kept. If you're trying to do it based on reason, there's no way to do it in a matter of weeks. It's been very arbitrary, just searching for keywords and things like that. It's no way to fix anything; it's a way to knock things down.

Something people don't realize is how long it took to build IES to begin with, and to gain support for it. It started, I believe, back in 2001, and it took a long time to build up the staff and the expertise. Especially in terms of data collection, it's just underestimated how much expertise goes into what IES does. All these contracts with Mathematica and AIR depend on those organizations' very significant internal capacity in areas like getting schools and students to respond to surveys. It doesn't just happen. There's so much expertise that goes into those tasks, which you've now destroyed. Even if they succeed in making things better in other ways, that's going to make it much harder to build back the things they should want to keep. It'll be like a wave pushing against them.

Mark Schneider, the IES director under both Presidents Trump and Biden, told me that the original intention was for the institute to grow much more substantially than it has, until it more closely resembled a $40 billion agency like the NIH. Even though that hasn't happened, it has punched above its weight in expanding the knowledge base about schools.

Oh, absolutely. When you think about what a good organization of any kind spends on R&D, it's a much greater proportion than the IES budget relative to total education spending.
IES has about a $1 billion budget, and the United States spends something like $700 billion per year on education. So that's less than 0.2 percent. It's less than any standard you could come up with. It's always been underfunded, and they use those resources well. Collecting data, for example, creates so many positive spillover effects, because once you've collected it, anyone can use it. The pre-doc and post-doc programs are really important for producing people who can work at school districts and state agencies, which need professionals who are really trained in research. That may go away too.

One of the things that doesn't get enough attention is that the federal government was very involved in creating the state longitudinal data systems, which have played an enormous role in just about every area of policy research. The federal government gives money to the states to create those systems and make them available, and they allow us to link schools and programs to student outcomes. Without that, you've got nothing.

You mentioned that IES was instrumental in generating research on the science of reading, too.

That's probably the best example of the organizational influence. It's not so much about the data they were collecting, but it was related to the projects they were funding. It's also a good example of something Republicans support. They're all about the science of reading, but what's happening is that they're basically undercutting the next science of reading. There might be some research studies that don't seem very useful, but in a way, that's the point of research. You don't know what's useful until you actually do it. We don't know what the next science of reading is going to be.
Hopefully, the Live Handbook will help find it, but we'd find it faster with IES underpinning the research that will get us there.
Yahoo
06-03-2025
- Politics
- Yahoo
Opinion: An Open Letter to Linda McMahon
Dear Madam Secretary,

Congratulations and welcome to a place we once knew well. You face any number of tough challenges on behalf of American students, parents, educators and taxpayers, as well as the administration you serve, but your 'Department's Final Mission' speech shows that you're well prepared to meet them. We particularly admire your commitment to making American education 'the greatest in the world.' But how will we — and you, and our fellow Americans — know how rapidly we're getting there?

By now, you're probably aware that the single most important activity of the department you lead is the National Assessment of Educational Progress, known to some as NAEP and to many as the Nation's Report Card. That's the primary gauge by which we know how American education is doing, both nationally and in the states to which you rightly seek to restore its control.

Almost four decades ago — during Ronald Reagan's second term — it was our job to modernize that key barometer of student achievement. Five years after A Nation at Risk told Americans that their education system was far from the world's greatest, state leaders — governors especially — craved better data on the performance of their students and schools. And they were right. At the time, they had no sure way of monitoring that performance. That was one of our challenges, back in the day.

Advised by a blue-ribbon study group led by outgoing Tennessee governor (and future U.S. senator) Lamar Alexander, and with congressional cooperation spearheaded by the late Ted Kennedy, in 1988 we proposed what became a bipartisan transformation of an occasional government-sponsored test into a regular and systematic appraisal of student achievement in core academic subjects, administered by the National Center for Education Statistics (part of your Institute of Education Sciences) and overseen by an independent group of state and local leaders, plus educators and the general public. (One of your responsibilities is appointing several terrific people each year to terms on the 26-member National Assessment Governing Board.)

That 1988 overhaul made three big changes:

- Creation of that independent board to ensure the data's integrity, accuracy and utility;
- Inauguration of state-level reporting of student achievement in grades 4, 8 and 12, i.e. at the ends of elementary, middle and high school; and
- Authorization for the board to set standards — known as achievement levels — by which to know whether that achievement is satisfactory.

Much else was happening in U.S. education at the time: School choice was gaining traction. States were setting their own academic standards and administering their own assessments. Graduation requirements were rising as the economy modernized and its human capital needs increased. As these and other reforms gathered speed, NAEP became the country's most trusted barometer of what was (and wasn't) working.

You alluded to NAEP data during your confirmation hearing. President Donald Trump deploys it when referencing the shortcomings of U.S. schools. For example, his Jan. 29 executive order on school choice began this way: 'According to this year's National Assessment of Educational Progress (NAEP), 70% of 8th graders were below proficient in reading, and 72% were below proficient in math.'
Everybody relies on NAEP data, and its governing board's standards have become the criteria by which states gauge whether their own standards are rigorous enough. Just the other day, Gov. Glenn Youngkin's board of education used them to benchmark Virginia's tougher expectations for students and schools. Reading and math were, and remain, at the heart of NAEP, but today it also tests civics, U.S. history, science and other core subjects — exactly as listed in your speech.

But NAEP is not perfect. It needs another careful modernization. It should make far better use of technology, including artificial intelligence. It should be nimbler and more efficient. The procedures by which its contractors are engaged need overhauling. (The Education Department's whole procurement process needs that, too — faster, more competitive, more efficient, less expensive!)

Yet NAEP also needs to do more. Today, for instance, it gives state leaders their results only in grades 4 and 8, not at the end of high school. It doesn't test civics and history nearly often enough, and never in 12th grade, even though most systematic study of those subjects occurs in high school. (It probably tests fourth- and eighth-grade reading and math too often — the result of a different federal law.)

Doing more shouldn't cost any more. Within NAEP's current budget — approaching $200 million, a drop in the department's murky fiscal ocean — much more data should be gettable by making new contracts tighter and technology smarter, squeezing more analysis from NAEP's vast trove and having staffers put shoulders to the wheel. (Former IES director Mark Schneider has pointed the way.) But making this happen will take strong executive leadership, an agile, hardworking governing board and your own oversight.
You may decide it's time for another blue-ribbon group to take a close look at NAEP and recommend how to modernize it again without losing its vital ability to monitor changes over time in student achievement.

Yes, this is all sort of wonky. NAEP results get used all the time, but it's far down in the bureaucracy and doesn't make much noise. Nobody in Congress (as far as we know) pays it much attention. Yet it remains — we believe — the single most important activity of your department. Which, frankly, is why it needs your watchful attention!

We wish you well in your new role. Please let us know if we can help in any way.

Sincerely,

William J. Bennett, U.S. Secretary of Education (1985-88)
Chester E. Finn Jr., Assistant Secretary for Research & Improvement and Counselor to the Secretary (1985-88)
Yahoo
21-02-2025
- Politics
- Yahoo
Mend, Don't End, the Institute of Education Sciences
Last week, DOGE's 'shock and awe' campaign came to education. The chaotic canceling of grants and contracts for various research activities at the Institute of Education Sciences (IES), a little-known yet important agency rarely at the center of public debate, was unprecedented. It showed that the Trump administration is becoming adept at using the tools of government against the federal bureaucracy. Many voters cheer these efforts, frustrated with a system they see as prioritizing elite interests over their problems. The IES chaos energized Trump supporters and horrified the education research community. But few addressed the most important question: What now?

Like many government activities, the value of education research isn't always immediately obvious. But just because something is obscure, that doesn't mean it's irrelevant. In fact, a strong case can be made that the nation underinvests in education research. IES's budget of $793 million is a fraction of the more than $900 billion spent annually by federal, state and local governments on just K-12 public schools. That's a staggeringly lower percentage for R&D than most industries — certainly less than what Elon Musk's companies spend.

Federal investment in education research focuses on closing the gap between the aspirations of public schools and real-world outcomes. Daniel Patrick Moynihan, chief architect of the first modern federal education research agency, envisioned it as a way to develop 'the art and science of education' to achieve true equality of opportunity. An essential mission — but the U.S. is failing to deliver on it. 'Shoddy work on trivial topics,' research warped by political priorities, and bloated bureaucracies draining limited resources: that's not Elon Musk and DOGE talking, that's Chester E. Finn Jr., a key architect of federal education research turned critic, pleading for reform in 2000.

Just two weeks ago, the Nation's Report Card, produced by IES, showed the largest achievement gaps between the lowest- and highest-achieving students ever recorded. A decade of decline, coupled with disastrous pandemic responses, set achievement for struggling students back to 1990s levels. International assessments reveal the U.S. as a global outlier, with a growing share of adults assessed at the lowest levels of literacy.

This is not inevitable. For decades, America made steady gains in educational achievement. States are recovering from the pandemic in differentiated ways. Overall, however, achievement stagnated in the years leading up to COVID, and the nation has clearly failed to recover from the pandemic learning loss, despite significant federal spending on schools. This makes government investments in education research instrumental to understanding America's slow, halting progress toward making good on the promise of public education, and the cliff it's gone off the past few years.

The 'science of reading' movement illustrates the power of research and the shortcomings of the existing federal approach. Journalist Emily Hanford's reporting on reading instruction did more to change classroom practices than the entire What Works Clearinghouse — a federally funded, bureaucratic mechanism for reviewing evidence. IES's mission, to 'provide national leadership in expanding fundamental knowledge and understanding of education from early childhood through postsecondary study … to provide parents, educators, students, researchers, policymakers and the general public with reliable information about … the condition and progress of education in the United States' remains essential. Yet IES is not meeting these goals.

The answer is not to jettison the federal role in education research. On the contrary, the nation needs more of it, and better.
The lack of outrage from people working in schools about the DOGE cuts is a silence worth listening to. Here are five ideas for a more strategic, agile, relevant and impact-driven IES:

Developing effective strategies is not enough — the real challenge is getting educators to use them at scale in a decentralized system where states, districts and schools operate independently. Testing and innovation must have buy-in from those in the field so they are more strongly linked to adoption. Political pressures, bureaucratic inertia and rigid regulations often prevent research-backed solutions from taking hold. IES should prioritize research that not only evaluates effectiveness, but also identifies the policy, governance and systemic barriers that block effective implementation. The agency prioritizes rigorous experimental studies, which is good, but other methods are also needed to answer questions about implementation. And this work must be better disseminated and applied, not just passed around among researchers.

Every year, school districts spend billions on curriculum, technology and instructional interventions, often with little regard for evidence. IES should evolve beyond the passive and hard-to-interpret What Works Clearinghouse and become an active information and standards-setting body. That could mean:

- Continuing, even expanding, essential data that inform parents and policymakers, like the National Assessment of Educational Progress (NAEP) and the Common Core of Data, the Department of Education's primary database on K-12 schools and districts.
- Issuing A-F ratings for educational interventions, modeled after the U.S. Preventive Services Task Force in health care.
- Convening expert panels, like the National Reading Panel, to resolve key education debates and provide clear, evidence-based guidance.
- Tracking successes and failures, publishing reports on which states and districts effectively use research-based strategies.
For too long, education research has avoided politically sensitive but critical questions. IES should lead on issues such as:

- Why is early reading proficiency still tied so strongly to family income?
- How does the teacher pay structure discourage ambitious, high-achieving individuals from entering or staying in the profession?
- What outdated regulations and funding mechanisms are stifling school innovation?

IES must be willing to confront uncomfortable truths, and to ensure its research drives real policy action.

The Common Core of Data, along with other information IES collects, represents some of the most used evidence in education research. Yet there are also glaring holes in what IES collects and, therefore, in what researchers can explore. Very little is known systematically, for instance, about what teacher candidates learn when they are preparing to teach. Nor is there good, comprehensive national information about how much teachers earn, or even what their compensation is based on. IES can collect some of this data itself, but it must ask hard questions about whether this and other collections should be done in-house. Over time, IES has held onto functions that nonprofit organizations like RAND and the Advanced Education Research & Development Fund have proven they can do as well, or better; it can take years for IES to publish results, while others can do it in months. A reformed IES should focus on what it does best: funding and evaluating research, operating nimbly, and maintaining quality and independence, while supporting capacity elsewhere in the field for large-scale data collection and reporting, fast-turnaround field surveys and DARPA-like R&D investments.

The Department of Defense's DARPA has pioneered breakthrough innovations in the military by funding high-risk, high-reward research with clear objectives and short timelines. IES could replicate this strategy by funding one or more bold new initiatives to conduct ambitious, time-bound research.
This would bring together top scientists, technologists and educators for five-year terms to work on pioneering transformative solutions, such as AI-driven personalized learning, early literacy breakthroughs and reimagined teacher preparation. Notably, DARPA is not a new governmental function; it is a mechanism for harnessing fieldwide capacity in the private and university sectors to solve problems.

Rather than spreading resources across countless disconnected projects, IES should focus on the most urgent educational challenges. A National Education Challenge Panel should be convened every five years to identify critical research priorities tied to a broader federal policy strategy. Immediate areas of focus could include:

- "Eliminate the early literacy gap by 2035."
- "Ensure every eighth grader can master algebra."
- "Ensure every high school graduate is truly college- or career-ready by 2030."
- "Revolutionize the teaching profession to attract a cross-section of top college graduates."

Instead of fragmented efforts, this would focus the entire education research ecosystem on delivering real, transformative change.

Trump identifies as a dealmaker. The ideas here could be the beginning of a new deal for education research, one that produces timely and usable evidence. We recognize that reforming IES in these ways will be controversial, requiring hard decisions about what research should prioritize and how the federal government should support it. But the status quo, or the outright abandonment of federal education research, would be worse, leaving progress to a fragmented, underfunded patchwork of individual researchers and often ideological interest groups.

Even if you don't like how DOGE and the Trump administration are approaching their work (and we don't), it is past time to substantially mend the federal role in education research. Especially now, if you don't want to see that role end.
Disclosure: The authors have all received funding from, or worked on projects funded by, IES and have worked or currently work with RAND. Andy Rotherham sits on The 74's board of directors.