Rise of AI risks undermining HSC fairness
Lucy Carroll reports in today's Sun-Herald that the number of students caught cheating in the HSC has doubled in the past five years, a trend some in the sector attribute to the growing use of generative AI by teenagers in their assessments.
Given the pervasiveness of artificial intelligence and its increasing sophistication, the figure is likely an underestimate. Indeed, Australian Tutoring Association president Mohan Dhall told Carroll that malpractice involving AI was likely going 'vastly undetected'.
Over the past two years, the university sector has grappled with how to manage the use of AI in assessments.
After initially reacting with outright bans, the institutions – increasingly reliant on online learning as a cost-saving teaching model – have changed their tune, allowing AI to be used in at least some assessments.
At present, the University of Sydney is phasing in a policy that allows students to use AI in some assessments – a radical reversal of its previous ban on the technology. In the coming semester, students will be able to use AI in all take-home assessments, and course co-ordinators will not be able to ban its use.
At the University of NSW, teachers set a level of acceptable AI use for each assessment. The university signed an Australian-first deal last year with OpenAI, the maker of ChatGPT, to roll out a special version of the technology on campus.
The NSW Education Standards Authority (NESA) believes it is the responsibility of individual schools and school sectors to manage policies for the use of AI in their establishments. But, in the case of the high-pressure, statewide HSC, this approach surely cannot hold.
If the scenes outside selective school test centres last month, when some computers malfunctioned, are anything to go by, any sense of unfairness across the system will not be tolerated: too much rides on the marks received by students in the NSW school system, be it a place at a top-performing selective school or admission into a dream university course.
As Carroll reports today, a paper published last month by Catholic Schools NSW said HSC take-home assessments should decrease in importance for a student's overall grade until 'the AI threat to assessment integrity can be satisfactorily contained'.
