Misleading ATO statement could have put trio in jail for a decade


A misleading witness statement tendered to court by an ATO officer that could have sent three innocent Australians to jail has been uncovered.
The document was discovered by businessman Jae Jang through Freedom of Information laws and will now form part of an independent investigation by the Tax Ombudsman into a decade-long case first exposed by A Current Affair and published by this masthead.
ATO officer Anthony Rains was the lead investigator in the criminal prosecution of Jang and two of his employees, Gold Coast-based Debbie and Bill Ingleton.
The trio were charged in late 2017 with conspiracy to defraud the Australian Taxation Office, which carries a maximum sentence of 10 years' jail.
Jang was arrested just days before Christmas that year, and could have spent three weeks in jail had his extradition to Queensland succeeded.
After 2½ years under strict bail conditions, the charges were dropped, with prosecutors declaring they had 'no evidence to offer'.
It can now be revealed that a witness statement, tendered by another ATO officer, appears to have had a crucial line added to it by Rains.
'Anthony Rains is the criminal investigator taking witness statements, he should be independent,' Jang said.
'However, in this case, it's clearly shown that he has actually written that for the witness, which, in my view, is totally wrong.'

Related Articles

Why you should never remove chalk marks from your parked car

9 News • 8 hours ago

A criminal lawyer is warning Australians about the potential consequences of removing chalk marks from their tyres before a parking inspection is complete. A recent TikTok video depicting a person wiping chalk marks off multiple car tyres has gone viral and sparked online conversation. But Avinash Singh, from Astor Legal, has warned of the potential consequences of the act.

"Removing chalk off a tyre could be seen as attempting to pervert the course of justice. This is because the removal of chalk would hinder a parking ranger from carrying out their duties and prevent them from issuing a fine," he said.

While Singh criticised the video, the comment section was flooded with people showing support for the stunt.

In NSW, section 319 of the Crimes Act 1900 makes it an offence to do any act, or make any omission, intending in any way to pervert the course of justice. South Australia is the only state with a law that specifically addresses the practice, which has become widespread there: section 174AB of the Road Traffic Act 1961 makes it an offence to remove a parking inspector's chalk from a vehicle, carrying a maximum fine of $750.

"If a driver finds that a fine has been issued, they can contest the fine and ask for evidence that they were timed correctly. This is usually in the form of timestamped photos that a parking ranger has taken," Singh said.

Pig farm horror 'on another level'

Perth Now • 8 hours ago

In a disturbing development in an ongoing story involving the alleged mistreatment of hundreds of pigs at a farm in South Australia, the RSPCA has ordered more than a dozen animals be euthanised.

The Andgar Piggery, near the small regional town of Dublin, SA, has become the centre of widespread controversy since a dossier of photos, videos and documents revealed the shocking conditions endured by pigs at the farm. Pigs were observed in various states of decay when activists broke in and began recording.

Released in June by the Farm Transparency Project (FTP), a Melbourne-based activist group, the huge catalogue of photos depicts animals living, and dying, in squalor. Footage shows animals consuming the remains of their dead littermates as other exhausted creatures wade through thick muck, which FTP chief executive Chris Delforce said was 'up to their stomachs, at least, if not higher'.

'Just seeing the pigs wading through their own filth … I've been investigating piggeries for 13 years or so now, and it's always a pretty horrific experience … but this place in particular, I think, was kind of on another level,' he said.

RSPCA South Australia released a statement on Tuesday saying its investigation into the Andgar piggery was 'progressing'. 'RSPCA inspectorate officers accompanied by PIRSA veterinary staff have conducted two raids of the piggery and 14 pigs have been euthanised,' a spokesman said. Some of the more distressing pictures featured an animal with a severe, necrotic wound about 10cm wide and deep enough to hold a pile of dirt.

'The RSPCA has issued 21 animal welfare notices instructing the owners and manager to take immediate action regarding conditions and maintenance. They must maintain compliance and the inspectorate is monitoring the operation with spot inspections.'

The RSPCA said the farm's owners had been formally interviewed as part of a 'large and highly complex' investigation, and it is 'now preparing a comprehensive brief of evidence with a view to instigate court proceedings'. 'RSPCA South Australia is empowered to investigate animal cruelty and enforce animal welfare legislation in our state. In addition to issuing animal welfare notices, we can also lay criminal charges,' the spokesman said. 'We acknowledge the distress and concern these images have caused and we want to assure the community that we take any allegation of animal cruelty extremely seriously.'

Despite these comments, Mr Delforce claimed the RSPCA was approached by a whistleblower well before the FTP infiltrated the property, and that no action was taken in the first instance. Protesters turned up to the piggery in their dozens on Saturday to condemn the conditions and call for change. In screenshots shared to Facebook on Thursday, FTP publicised segments of the anonymous whistleblower's claims that their partner, somebody who regularly attended the pig farm, 'would come home traumatised by some of the cruelty and lack of maintenance and care of animals'.

Mr Delforce said the RSPCA was alerted 'a month before' activists arrived at the farm and accused it of allowing 'unchecked, unmonitored, unaddressed' cruelty to proliferate. 'It seems the RSPCA is not adequately resourced or funded or motivated to go and inspect these places on their own,' he said. 'They are the authority that has been legally assigned to investigate and prosecute cruelty issues in animal farms, and if they're not doing it, nobody else is doing it.'

One of Andgar's co-owners spoke to NewsWire earlier this month, saying the piggery was struggling because it 'went from four workers to one' and 'no one wants to work'. 'Of course the piggery's never been like that. For all the years we've run pigs, they've never been like that. It's just all of a sudden, you've got no workers,' he said.

Mr Delforce believes the state and federal governments have failed to provide 'any support for farmers who want to get out of this industry'. 'I think he should have made the decision to shut down … it's not an excuse to have pigs drowning in their own waste just because you can't get employees, so stop breeding them,' he said.

The South Australian government declined to comment on the ongoing RSPCA investigation. In South Australia, those found guilty of animal cruelty offences can be fined up to $250,000 and/or receive a maximum 10 years in jail.

'Why do we need that?': Push to ban AI nudity apps

The Advertiser • 9 hours ago

Parents are being warned their kids may be exploited online for child abuse material, amid a push to criminalise the use of apps that "nudify" pictures. Possessing nudify apps, digital platforms that allow users to insert a person's photos and use generative artificial intelligence to sexualise them, would become a criminal offence and carry up to 15 years in jail under proposed laws.

"Why do we need that in an Australian community?" the International Centre for Missing and Exploited Children's Dannielle Kelly told reporters in Canberra on Monday.

One in four children has experienced sexual abuse, according to the Australian Child Maltreatment Study. Independent MP Kate Chaney, who introduced the proposed laws, said the federal government needed to respond more nimbly to ensure it wasn't outpaced by technological developments, such as AI being used to exploit children with little consequence. The proposed laws contain small carve-outs for law enforcement and researchers.

"This is just the start, but it's something that the government could do right now," Ms Chaney said after introducing her private member's bill on Monday.

The legislation follows a roundtable on AI-facilitated child exploitation, which called for urgent action. Child safety advocates and law enforcement representatives at the roundtable called for AI literacy for young people, the use of new technology to detect child exploitation material, legal restrictions on downloading such apps and better resourcing for police to tackle the issue. There was a consensus that AI was being weaponised to harm children, from creating deepfakes, which digitally manipulate images and video to superimpose someone's face or voice, to generating child abuse material, creating the potential for exploitation, blackmail and bullying.

MP Zali Steggall, who seconded Ms Chaney's bill, branded it every parent's worst nightmare. "When a criminal is downloading this technology to then create this material, that's going to have a lifelong impact on children and is really damaging," the independent MP said. "We need these guardrails with urgency, we need the government to show it can act quickly.

"My concern is, amidst the paralysis of a broad review of AI, we have these very clear areas of harm that go unaddressed for months at a time ... this is a very clear area of harm identified that can be dealt with very quickly."

International Justice Mission Australia chief executive David Braga called for the government to legislate a digital duty of care, requiring platforms to actively take steps to prevent harm. "Now is the time for the Australian government to strengthen the Online Safety Act to require companies ... to detect and disrupt child sexual abuse material in all its forms on their platforms," he said.

Attorney-General Michelle Rowland said keeping vulnerable Australians safe was the government's priority, and it would consider the legislation. "Keeping young people safe from emerging harms is above politics and the government will give appropriate consideration to the private member's bill," she said in a statement to AAP.

Lifeline 13 11 14
Kids Helpline 1800 55 1800 (for people aged 5 to 25)
1800 RESPECT (1800 737 732)
National Sexual Abuse and Redress Support Service 1800 211 028
