Former Air Force Commander at Wright-Patterson Faces Court-Martial

Yahoo | 15-04-2025

A court-martial is underway this week for a former commander at Wright-Patterson Air Force Base near Dayton, Ohio, who is accused of adultery and fraternization.
Col. Christopher Meeker, the former commander of the 88th Air Base Wing, was removed from his leadership position in late December 2023. In December 2024, it was announced that he had been charged with three violations of the Uniform Code of Military Justice.
The charges include "one charge and one specification under Article 90, Willfully Disobeying Superior Commissioned Officer; and another charge and two specifications under Article 134, Extramarital Sexual Conduct and Fraternization," Wright-Patterson announced at the time in a press release.
Read Next: Massachusetts Guard Revokes Shaving Waivers
WDTN, Dayton's NBC affiliate, reported Tuesday afternoon that Meeker entered a guilty plea to all charges.
Base spokespeople at Wright-Patterson did not immediately return a Tuesday afternoon request for comment from Military.com asking about Meeker's plea or whether he faces a bench or jury trial.
Derek Kaufman, a spokesman for Air Force Materiel Command, told Military.com on Tuesday morning that the trial is docketed for two days, but that could change.
The trial was originally scheduled for this summer at Scott Air Force Base in Illinois. However, it was announced in December that Lt. Gen. Donna Shipton, the Air Force Life Cycle Management Center commander at Wright-Patterson, could "be a material witness in the case," and that transferring the case to a different base was "in the interest of justice."
Kaufman told Military.com in a statement that, for the "convenience of trial participants," the court-martial would take place at Wright-Patterson. Maj. Gen. Charles D. Bolton, commander of the 18th Air Force at Scott Air Force Base, remains the convening authority.
Such moves aren't uncommon in the military justice system. Eric Carpenter, a former military lawyer who is now a law professor at Florida International University, told Military.com in an interview Tuesday that choosing a convening authority or judge from another base eliminates potential conflicts of interest, while holding the trial locally spares witnesses the cost of traveling to a different base.
"That can happen when you have a convening authority that might otherwise be conflicted," Carpenter said. "They find somebody neutral, and then that convening authority can refer the case."
WHIO-TV, Dayton's CBS affiliate, reported last year that Meeker was being represented by a defense lawyer with Joint Base Langley-Eustis Area Defense Counsel. Spokespeople for the counsel group did not immediately return a request for comment asking whether it was still representing Meeker.
Carpenter told Military.com that it's "pretty unusual" for any allegations of consensual affairs to go to trial in the military justice system, saying they're typically handled with administrative punishment.
Meeker's trial this week follows court-martial proceedings against other high-profile Air Force officers in recent years.
Maj. Gen. Phillip Stewart, the former commander of the 19th Air Force, faced a court-martial last year after a subordinate accused him of sexual assault and other misconduct. He was ultimately found not guilty of sexual assault, but he pleaded guilty to and was found guilty of lesser charges.
In 2022, Military.com reported on the court-martial proceedings for former Maj. Gen. William Cooley, previously the commander of the Air Force Research Laboratory, who was convicted in a bench trial that year of forcibly kissing his sister-in-law and was the first Air Force general ever to face a military trial.
Related: Former Air Force Commander at Wright-Patterson Charged with Adultery, Faces Court-Martial


Related Articles

UK judge warns of risk to justice after lawyers cited fake AI-generated cases in court

Yahoo | 2 hours ago

LONDON (AP) — Lawyers have cited fake cases generated by artificial intelligence in court proceedings in England, a judge has said — warning that attorneys could be prosecuted if they don't check the accuracy of their research.

High Court justice Victoria Sharp said the misuse of AI has 'serious implications for the administration of justice and public confidence in the justice system.'

In the latest example of how judicial systems around the world are grappling with how to handle the increasing presence of artificial intelligence in court, Sharp and fellow judge Jeremy Johnson chastised lawyers in two recent cases in a ruling on Friday. They were asked to rule after lower court judges raised concerns about 'suspected use by lawyers of generative artificial intelligence tools to produce written legal arguments or witness statements which are not then checked,' leading to false information being put before the court.

In a ruling written by Sharp, the judges said that in a 90 million pound ($120 million) lawsuit over an alleged breach of a financing agreement involving the Qatar National Bank, a lawyer cited 18 cases that did not exist. The client in the case, Hamad Al-Haroun, apologized for unintentionally misleading the court with false information produced by publicly available AI tools, and said he was responsible, rather than his solicitor Abid Hussain. But Sharp said it was 'extraordinary that the lawyer was relying on the client for the accuracy of their legal research, rather than the other way around.'

In the other incident, a lawyer cited five fake cases in a tenant's housing claim against the London Borough of Haringey. Barrister Sarah Forey denied using AI, but Sharp said she had 'not provided to the court a coherent explanation for what happened.'

The judges referred the lawyers in both cases to their professional regulators, but did not take more serious action. Sharp said providing false material as if it were genuine could be considered contempt of court or, in the 'most egregious cases,' perverting the course of justice, which carries a maximum sentence of life in prison.

She said in the judgment that AI is a 'powerful technology' and a 'useful tool' for the law. 'Artificial intelligence is a tool that carries with it risks as well as opportunities,' the judge said. 'Its use must take place therefore with an appropriate degree of oversight, and within a regulatory framework that ensures compliance with well-established professional and ethical standards if public confidence in the administration of justice is to be maintained.'

Jill Lawless, The Associated Press

