Latest news with #LeahWilson


Reuters
May 7, 2025
- Business
After California bar exam mess in February, July's test will cost millions more, official says
May 7 (Reuters) - California's botched February bar exam will cost the State Bar of California almost $6 million more than expected in July, with millions in added expenses in the years to come, officials told state lawmakers on Wednesday. The State Bar's decision to switch to its own bar exam, given both remotely and in person, was expected to save the state bar up to $3.8 million a year. But the problem-plagued February exam and its fallout are now a financial drain on the already cash-strapped state bar. February examinees faced unprecedented technical and logistical problems on the test, and the state is returning to its traditional in-person exam in July.

The state bar expects to lose about $3 million in revenue because it is waiving the July testing fees for those who failed or withdrew from the February exam, executive director Leah Wilson told the state's Senate Judiciary Committee during a hearing on a bill that would mandate a state audit of the February test. Wilson said on Friday that she is stepping down from her state bar post in July, citing the faulty rollout of the new exam.

The bar will spend an additional $2 million to secure large testing sites for the July exam after the California Supreme Court ordered that test to be given only in person. And returning to the Multistate Bar Exam — the 200 multiple-choice questions developed by the National Conference of Bar Examiners — will cost the bar an additional $620,000 in July, Wilson said. The high court on Friday ordered the bar to return to the MBE for the July test after legal academics and test takers questioned the quality and development process of the multiple-choice questions that appeared on the February exam.
Additionally, the state bar must pay test prep company Kaplan Exam Services $6.1 million more before it can exit the five-year contract it entered into last year to provide multiple-choice questions for the bar exam, even if it does not use Kaplan-produced questions, Wilson told the committee. The state bar revealed two weeks ago that, due to time constraints, a subset of the February multiple-choice questions were written by a separate contractor using ChatGPT.

On Monday, the state bar sued testing platform Meazure Learning for unspecified damages, claiming the company failed to live up to its promises that its systems could handle thousands of bar examinees. Meazure said on Tuesday that the state bar was trying to "shift the blame for its flawed development process for the February exam." The company declined to comment further on Wednesday.

State bar officials previously had said there would be added costs following the February bar's failure but had given significantly lower estimates. The hearing also included testimony from four February bar takers, as well as several legal academics who said the development of the exam was rushed and that bar exam experts were excluded from that process. 'How do we make sure we never come back to this place?' said California's Senate Judiciary Committee Chair Tom Umberg, who sponsored the audit bill. 'It's important that we dig deep into this issue.'


Reuters
May 6, 2025
- Business
California Bar says it has sued vendor over exam meltdown
May 5 (Reuters) - The State Bar of California said on Monday it has sued exam vendor Meazure Learning following the disastrous rollout of its February bar exam, accusing the vendor of failing to live up to its promises that its systems could handle thousands of bar examinees. The state bar, represented by partners from Hueston Hennigan, said it is seeking an unspecified amount of damages from Meazure. The state bar signed a $4.1 million contract with the company in September 2024 to administer the exam. A spokesperson for Meazure did not immediately respond to a request for comment. Reuters could not independently verify a lawsuit was filed.

California's February exam was a hybrid, two-day remote and in-person test that did not use any components of the national bar exam, which the state has used for decades. Some test takers were unable to log into the bar exam at all, while many experienced delays, lax exam security, distracting proctors, and a copy-and-paste function that didn't work. The state bar alleged that Meazure disabled its own spell-check feature because it froze the platform. "Test takers reported that copy and paste, highlighting, and annotation functions did not work. Even basic typing exhibited significant lags," according to the lawsuit the state bar said it filed in Los Angeles County Superior Court.

State Bar Executive Director Leah Wilson on Friday said she was stepping down from that post in July, citing the botched rollout of the new bar exam. Meazure is already facing two proposed federal class actions from two people who took the February test. Both lawsuits are pending in Oakland, California, federal court. Meazure has not answered the allegations in those lawsuits. California was the first state to break away from the national bar exam developed by the National Conference of Bar Examiners, as part of an effort to cut costs.
The California Supreme Court on Monday ordered the state bar to use the Multistate Bar Exam for the upcoming July test. Meazure, based in Birmingham, Alabama, bills itself as the "largest and most experienced remote proctoring operation in the market" with more than 1,500 test centers in 115 countries. Meazure was formed through the 2020 merger of testing companies ProctorU and Yardstick.


Reuters
May 5, 2025
- Business
California scraps new bar exam for July, adjusts scores on botched February test
May 5 (Reuters) - California will not administer its newly developed bar exam in July, after the state's Supreme Court on Friday ordered a return to the previous test following a disastrous rollout in February. The California Supreme Court directed the State Bar of California to use the Multistate Bar Exam — the 200-question multiple-choice portion of the exam the state had used prior to the February test — for the upcoming July test. The court said that it 'remains concerned over the processes used to draft' the multiple-choice questions that appeared on California's February exam, and it also cited in its decision the 'previously undisclosed' use of artificial intelligence in drafting some of California's February questions.

In the same order, the court approved several scoring adjustments requested by the state bar that are intended to address some of the various problems February examinees encountered on the attorney licensing test. The state bar on Friday told examinees that their results, which were originally scheduled to be released on Friday, would be pushed back to Monday as it worked to adjust scores based on the court's order. A state bar spokesperson declined further comment on Friday about the court's decision.

California has the second-largest number of annual bar exam takers, behind New York. About 8,000 people typically sit for its July exam. The court-ordered return of the MBE, which is developed by the National Conference of Bar Examiners, is the latest blow to California's efforts to break away from the national bar exam in a bid to cut costs. The February exam was administered both remotely and in person and did not use any components of the national bar exam that the state has used for decades. That change was expected to save as much as $3.8 million annually by eliminating the need to rent out large event spaces, but examinees faced unprecedented technical and logistical problems.
The California Supreme Court in March ordered the July exam to be given in person at testing centers, meaning that the upcoming test will have the same format and test components as before the development of California's own exam. The state bar now projects that addressing the problems from February's exam will cost at least $2.3 million more than anticipated for July. State Bar Executive Director Leah Wilson on Friday said she will step down from that post in July, citing the botched rollout of the new bar exam.

The court's order sets the raw passing score for the attorney licensing exam at 534 — lower than the 560 score recommended by its standardized testing expert who looked at February's results. Raw pass scores can fluctuate each year and are converted according to a standardized scale. The order also directs the state bar to 'impute' scores for test takers who weren't able to complete significant portions of the two-day exam.


Los Angeles Times
April 24, 2025
- Business
California Supreme Court demands State Bar answer questions on AI exam controversy
The California Supreme Court urged the State Bar of California on Thursday to explain how and why it used artificial intelligence to develop multiple-choice questions for its botched February bar exams. California's highest court, which is responsible for overseeing the State Bar, disclosed Tuesday that its justices were not informed before the exam that the State Bar had allowed its independent psychometrician to use AI to develop some questions. The court demanded that the State Bar explain how it used AI to develop questions — and what actions it took to ensure the reliability of the questions.

The demand comes as the State Bar petitions the court to adjust test scores for hundreds of prospective California lawyers who complained of multiple technical problems and irregularities during the February exams. The controversy is about more than the State Bar's use of artificial intelligence per se. It's about how the State Bar used AI to develop questions — and how rigorous its vetting process was — for a high-stakes exam that determines whether hundreds of aspiring attorneys can practice law in California. It also raises questions about how transparent State Bar officials were as they sought to ditch the National Conference of Bar Examiners' Multistate Bar Examination — a system used by most states — and roll out a new hybrid model of in-person and remote testing in an effort to cut costs.

In a statement Thursday, the court said it was seeking answers as to 'how and why AI was used to draft, revise, or otherwise develop certain multiple-choice questions, efforts taken to ensure the reliability of the AI-assisted multiple-choice questions before they were administered, the reliability of the AI-assisted multiple-choice questions, whether any multiple-choice questions were removed from scoring because they were determined to be unreliable, and the reliability of the remaining multiple-choice questions used for scoring.'
Last year, the Supreme Court approved the State Bar's plan to forge an $8.25 million, five-year deal with Kaplan to create 200 test questions for a new exam. The State Bar also hired a separate company, Meazure Learning, to administer the exam. It was not until this week — nearly two months after the exam — that the State Bar revealed in a news release that it had deviated from its plan to use Kaplan Exam Services to write all the multiple-choice questions. In a presentation, the State Bar revealed that 100 of the 171 scored multiple-choice questions were made by Kaplan and 48 were drawn from a first-year law students' exam. A smaller subset of 23 scored questions were made by ACS Ventures, the State Bar's psychometrician, and developed with artificial intelligence. 'We have confidence in the validity of the [multiple-choice questions] to accurately and fairly assess the legal competence of test-takers,' Leah Wilson, the State Bar's executive director, said in a statement.

Alex Chan, chair of the Committee of Bar Examiners, which exercises oversight over the California Bar Examination, initially played down the controversy, telling The Times that only a small subset of questions used AI — and not necessarily to create the questions. Chan also noted that the California Supreme Court urged the State Bar in October to review 'the availability of any new technologies, such as artificial intelligence, that might innovate and improve upon the reliability and cost-effectiveness of such testing.' 'The court has given its guidance to consider the use of AI, and that's exactly what we're going to do,' Chan said. But on Thursday Chan revealed to The Times that State Bar officials had not told the Committee of Bar Examiners ahead of the exams that it planned to use AI. 'The Committee was never informed about the use of AI before the exam took place, so it could not have considered, much less endorsed, its use,' Chan said.
Katie Moran, an associate professor at the University of San Francisco School of Law who specializes in bar exam preparation, said this raised a series of questions. 'Who at the State Bar directed ACS Ventures, a psychometric company with no background in writing bar exam questions, to author multiple-choice questions that would appear on the bar exam?' she said on LinkedIn. 'What guidelines, if any, did the State Bar provide?'

Mary Basick, assistant dean of academic skills at UC Irvine Law School, said it was a big deal that the changes in how the State Bar drafted its questions were not approved by the Committee of Bar Examiners or the California Supreme Court. 'What they approved was a multiple-choice exam with Kaplan-drafted questions,' she said. 'Kaplan is a bar prep company, so of course, has knowledge about the legal concepts being tested, the bar exam itself, how the questions should be structured. So the thinking was that it wouldn't be a big change.' Any major change that could impact how test-takers prepare for the exam, she noted, requires a two-year notice under California's Business and Professions Code. 'Typically, these types of questions take years to develop to make sure they're valid and reliable and there's multiple steps of review,' Basick said. 'There was simply not enough time to do that.'

Basick and other professors have also raised concerns that hiring a non-legally trained psychometrician to develop questions with AI, as well as determine whether the questions are valid and reliable, represents a conflict of interest. The State Bar has disputed that idea: 'The process to validate questions and test for reliability is not a subjective one, and the statistical parameters used by the psychometrician remain the same regardless of the source of the question,' it said in a statement.
On Tuesday, the State Bar told The Times that all questions were reviewed by content validation panels and subject matter experts ahead of the exam for factors including legal accuracy, minimum competence and potential bias. The State Bar has yet to answer questions about why it deviated from its plan for Kaplan to draft all of the exam's multiple-choice questions. It has also not elaborated on how ACS Ventures used AI to develop its questions.