A divinity school steeped in history tests the boundaries of artificial intelligence

CBC · July 24, 2025
So astonishing was the resemblance, in both voice and appearance, that college president Anna Robbins's own family couldn't tell the difference between her and the AI-generated on-screen avatar that delivered lectures to six graduate students.
It was an illuminating and, undoubtedly to some in the broader university world, unnerving experiment in education, one that took place last fall at Acadia Divinity College, a small school steeped in Baptist history in rural Nova Scotia.
A course whose syllabus was generated by AI, whose lectures were scripted and conducted online by AI, and where students were graded by AI for real marks. A course with an especially germane topic: the ethics of artificial intelligence in Christian ministry.
"What blew our minds was realizing that I can speak 80 languages," Robbins said in a recent interview at the school in Wolfville, N.S.
At first blush, the college of about 200 mostly graduate students would seem a curious place for such a plunge into the world of artificial intelligence. But it reflects not just a changing education landscape but also an energetic discussion in Christian circles about the technology.
Like so many other facets of online life, AI has been inserted into religion. Christian chatbots answer theological questions, and apps help priests write sermons (or simply write the sermons for them). A church in Switzerland installed an AI Jesus avatar on a screen in a confessional booth.
There's also plenty of worry, such as how artificial intelligence will be used in war or for selfish gain. Pope Leo XIV has called AI an "exceptional product of human genius," but warned it could harm humanity's "openness to truth and beauty" and "ability to grasp and process reality."
The premise at Acadia Divinity was this: only by testing the limits of something that has so much promise, and likewise generates so much uneasiness, can you begin to understand its potential, and its pitfalls, and figure out what to do about it all.
Future pastors at the college learn to counsel parishioners using an AI program that mimics real people with real problems, and students have online chats with historical Christian figures.
The entirely AI-generated course last fall was simply an experiment, according to Robbins. It's not about replacing professors, she said, but examining ways that AI can help.
An AI program was fed reams of information about the school, including its history and teaching style. The six students who took part were volunteers and their tuition for the course was covered by the college.
Rev. John Campbell, the college's director of technology for education, and Jodi Porter, the school's director of education for ministry innovation, gave a keynote address in December at an Atlantic universities teaching conference about using AI in the classroom.
"One professor, of course, really didn't like the idea of an AI marking the assignments," Campbell said.
"Well, you know, a first-year English professor put up her hand and said, 'I have 300 students and I would love to have some sort of tool to help give some sort of personalized feedback to these students.'"
Joel Murphy, a "futurist" at Acadia Divinity who researches trends, said he believes AI will have a greater impact than the internet, and the implications for faith are profound, with people creating a "self-curated spirituality."
There are benefits to self-curation, he said. But the danger is that so much is left to a person's own whims, with AI tuned to give us what we want rather than push back or question us.
"I think it's going to create isolation, further isolation," he said. "At the centre, I think, of most faith movements is community, belonging, relationship — that can be lost in this self-curated experience."
Robbins said she shares those concerns. But she believes the church has a unique place in "what has become a very artificial world," a hub for people when they finally step away from their phones and their "existential questions come crashing in."
The work at Acadia Divinity is also a matter of preparing pastors for a new world. For instance, how to talk about grief with a parishioner who is frantically uploading every video they have of a terminally ill loved one so they can converse with an AI avatar of that person after they die.
"This is not science fiction, this is happening now," Campbell said. "That's always the dangerous side, and so some of what we're doing is to help people understand what's there and to prepare them to be able to function and minister in the midst of that."
But from the point of view of education, Acadia Divinity professors see some clear advantages.
Glen Berry, the chair of pastoral psychology, deploys an AI program so students can practise counselling skills in life-like conversations. In his view, it beats pairing up students and getting one to act the part of a troubled parishioner seeking help.
There are numerous scenarios: a grieving widower, a medical student with obsessive-compulsive disorder, a burnt-out pastor, or people who are quick to anger, take offence, or sound worried or upset. At the end, the program produces a transcript Berry can review.
Melody Maxwell, an associate professor of Christian history, last term used an AI chatbot that allows students to ask questions of historical Christian figures. It helps build "historical empathy," an understanding of the feelings and motivations of people in the context of their time periods.
With a couple of clicks of a computer mouse, Robbins's AI-generated avatar can switch between 80 different languages, some quite convincingly.
"We're concerned about equipping the church globally, not for our own strengthening, but for the strengthening of the church worldwide," Robbins said.
"It would be amazing if we could offer theological education to the 90 per cent of pastors in the developing world, for example, who have no access to theological education. Suddenly there's an opportunity to serve."
As for the AI-generated course last fall, the reviews were mixed. The students agreed the "learning outcomes" were met, and they liked the near-instantaneous feedback.


