Dark side of young people's phone habit revealed


West Australian, 15-05-2025
For many young Australians, tracking a partner's every move is a normal sign of love and affection, but there is a darker side to the growing trend.
Online following and monitoring have become common among family and friends, but research published by the eSafety Commissioner on Thursday suggests this behaviour might be spilling over into romantic relationships.
Ashton Wood, chief executive of DV Safe Phone, said the trend was concerning and could have harmful consequences.
Mr Wood leads the organisation that provides free mobile phones to domestic violence victims across Australia.
"In domestic violence, we see lots around technology-facilitated abuse," he told AAP.
"It becomes a method of control and before the victim realises it, their partner is watching everything."
Mr Wood said it was important to have a safe phone - one that was not tracked or monitored.
"It's really critical if someone's in danger to have access to a device that their partner doesn't know about, that can be used without fear of being tracked or monitored," he said.
The eSafety Commissioner's research found 18.6 per cent of people aged 18 to 24 expected to track their partner whenever they wanted.
The study surveyed 2000 Australians aged 18 to 75, asking whether they agreed with certain harmful expectations and attitudes linked to tech-based coercive control in intimate relationships.
Tracking a partner can take many forms, including using Apple's Find My app or third-party apps such as Life360, which is popular among parents.
Maneesha Prakash from the Youth Advocacy Centre works with young people and delivers community legal education programs in schools.
The domestic and family violence lawyer said it had become widely normalised for people to track friends, partners and loved ones through social media.
"Most apps have the ability to share locations," Ms Prakash told AAP.
"(Young) people don't blink twice. They think it's normal. They think it's part of somebody caring about them.
"That leads to them getting into quite toxic relationships and all the flow-on effects."
Tracking a partner can be a form of tech-based coercive control, a pattern of abusive behaviour used to control someone within a relationship.
"A lot of young people find it really confronting when you talk to them about coercive control and how it's not normal behaviour to be constantly monitored," Ms Prakash said.
"We are seeing quite a lot of DV behaviours stemming from coercive control that comes with locating someone."
Ms Prakash said there were significant gaps in knowledge that left young people at a disadvantage.
"It's important to keep having conversations around consent and coercive control in schools and at home," she said.
1800 RESPECT (1800 737 732)
Lifeline 13 11 14
