Latest news with #Leduc


Focus Malaysia
20-07-2025
- Focus Malaysia
AI and ethics can help stop online harassment
Letter to Editor

From school halls to digital spaces, moral and civic education teaches us to be helpful, considerate, and kind members of society. Yet, despite these teachings, various forms of harm continue to plague both physical and online worlds. One such issue is online harassment—also commonly referred to as cyberbullying.

Online harassment has become a distressingly common experience for many internet users. It involves acts of aggression, intimidation, or abuse carried out across digital platforms. According to researchers like Leduc and colleagues in Computers in Human Behavior, it can take many forms—disinformation, name-calling, threats, sexual harassment, and public humiliation.

This digital abuse can affect people from all walks of life, although certain demographic factors such as ethnicity, age, and gender may influence how likely someone is to experience it. Pew Research Center reports by Monica Anderson in 2018 and more recent updates by Atske in 2024 highlight how widespread and persistent the issue is, particularly among teens. Similarly, a Malaysia-based study published in BMJ Open by Samsudin and colleagues in 2023 found that young adults experiencing cyberbullying often also report psychological distress and strained family dynamics. In Malaysia, researchers Kee, Anwar, and Vranjes pointed out in 2024 that online harassment is a risk factor for suicidal thoughts among youth.

Often, the abuse stems from prejudice—negative stereotypes based on religion, ethnicity, gender, or even personal interests can quickly snowball into digital attacks. Victims may receive a barrage of cruel messages, mockery, or hate comments targeting their identity. Cultural norms can also fuel the problem: when mocking or humiliating others is treated as entertainment, especially in online communities, abusers feel emboldened. The anonymity of the internet offers a protective mask, freeing people to say what they would never say face-to-face. Combined with the misuse of free speech, this creates a digital culture that tolerates—even encourages—harmful behaviour.

The effects of online harassment are not limited to bruised egos. Victims often face serious mental health challenges. Studies by Dr Cheryl Nixon in 2014 reveal how victims may suffer from depression, anxiety, disrupted sleep patterns, appetite loss, and even suicidal ideation. These psychological effects can lead to social withdrawal, strained relationships, and a deep sense of helplessness. Embarrassment, fear, and self-blame are common emotional responses. Many victims, especially teens and young adults, avoid telling friends or family about their experiences, which only deepens their isolation.

A landmark case in Canada, R. v. Elliott in 2016, highlighted the legal implications of online abuse. The case was connected to Rehtaeh Parsons, a 17-year-old girl who took her life after a photo of her sexual assault was widely shared online, followed by relentless digital harassment. Although initial investigations failed to yield justice, public outcry prompted a renewed effort that led to charges under Canada's Cyberbullying Prevention Act—also known as Bill C-13. This tragic case led to legislative reform: Nova Scotia passed 'Rehtaeh's Law,' the first of its kind in Canada, which broadened the legal definition of cyberbullying and gave law enforcement new tools to act.
Writing in Crime, Media, Culture, researcher Alice Dodge in 2023 emphasised how the case shifted public perception of cyberbullying—from a social issue to a serious crime requiring legal intervention.

Can ethics and AI offer solutions?

As technology evolves, so do our opportunities to address online harassment in smarter ways. Media ethics plays a key role here. Researchers like Milosevic and colleagues in 2022, writing in the International Journal of Bullying Prevention, argue that media platforms must uphold ethical standards that prioritise harm reduction. This includes creating clear content guidelines, efficient reporting mechanisms, and psychological support systems for those affected. Media outlets should portray victims with dignity and avoid sensationalising abuse, while ensuring perpetrators are held accountable.

Technology, particularly artificial intelligence, could also help stem the tide. AI-powered moderation tools, if designed ethically, can assist in identifying abusive content and preventing its spread. But these systems must prioritise fairness, transparency, and accountability. Many current algorithms are geared toward boosting engagement—even if that means promoting provocative or harmful content. Instead, platforms need to redesign algorithms to avoid amplifying negativity. As highlighted by Zubiaga in the International Review of Information Ethics in 2021, tech companies must also be transparent about how moderation decisions are made and offer clear ways for users to report abuse.

Ultimately, it's not just up to lawmakers, media companies, or AI developers. All internet users share the responsibility to create a culture of empathy, respect, and mutual accountability. By standing against online harassment, speaking up for victims, and supporting efforts for ethical technology, we can help make digital spaces safer for everyone. — July 20, 2025

The authors are from the Department of Science and Technology Studies, Faculty of Science, Universiti Malaya. The views expressed are solely those of the authors and do not necessarily reflect those of Focus Malaysia.

Malay Mail
20-07-2025
- Malay Mail
AI and ethics can help stop online harassment — Lim Jo Yi, He Xiaoyan and Mohd Istajib Mokhtar


CBC
26-06-2025
- Business
- CBC
Edmonton climate policies drive up city building costs, report shows
The City of Edmonton is amending a key climate policy after a report showed that building facilities like fire halls and recreation centres in Edmonton costs more under its current standards than it would if the city followed a basic design.

Last November, city councillors asked administration for a cost-benefit analysis to see how much city policies were adding to the price of building capital projects, concerned that it generally costs less to build in neighbouring jurisdictions like Leduc. The city commissioned a third party, S2 Architecture, to compare two theoretical fire station models: one designed with the City of Edmonton's bylaws and policies and one designed to meet only the minimum code requirements.

The findings show that building a fire hall under the city's current standards would cost just over $21 million, 58 per cent more than the $13 million estimated to build a station with a basic design. The case study factored in four city policies: the climate resilience policy, the fire rescue service delivery policy, the City of Edmonton facility construction standard and the Edmonton Design Committee process.

Council's new infrastructure committee discussed the report's findings at a meeting Wednesday. "Direct construction costs are increased by the application of city requirements," Pascale Ladouceur, the city's branch manager of infrastructure planning and design, told councillors. "The biggest cost driver is the climate resilience policy."

The committee heard from several speakers, including Lindsay Butterfield with BILD Edmonton Metro, a real estate industry association, who asked councillors to review the policies. "Look at all the options and make trade-offs where they're necessary because we should be looking to minimize costs as well for the entire city's benefit," Butterfield said. But climate advocates, including Jim Sandercock with the Alberta Ecotrust Foundation, urged councillors to follow the current climate policy. "It's going to be really expensive in the future to retrofit buildings that were built to minimum code."

Mayor Amarjeet Sohi introduced a motion directing administration to amend the climate policy and explore options for reducing costs while still meeting the zero-emissions goal. The committee agreed to the motion, and administration is scheduled to present the proposed amendments next spring. "Absolutely, we cannot lose the intent of these policies," Sohi said. "They are there for a good reason, whether they're there for climate resiliency, whether they're there for the safety and protection and creating the right conditions for our front-line folks."

'Valid question'

Ladouceur said the findings in the report are a springboard for reviewing the current rules. "I think it's a valid question for councillors to understand: Have we made decisions in the past, administration and council together, that impacts the cost of our infrastructure?" Ladouceur said in an interview Tuesday.

The climate resilience policy requires the design to be emissions-neutral. The co-chair of the city's energy transition and climate resilience committee, Jacob Komar, argues that the report's findings are inflated because the consultants used higher standards in the case study than what is actually needed to create an energy-efficient building. "The walls are probably to an insulation level that is double what is needed for a net-zero building," Komar said in an interview with CBC News Tuesday.

Komar, an engineer who works on net-zero emissions projects, said there's a diminishing return on insulation: the more you add, the less you get for it. "So the walls, the roof, the windows, the doors — there's over $2 million of extra cost that they've added."

Ward sipiwiyiniwak Coun. Sarah Hamilton said the case study is an opportunity to review and possibly revise policies, not a firm path for council to take. "The government has a role in terms of furthering climate resilience. We have a role in furthering design excellence. We have a role in furthering, I think, even our own construction standards," Hamilton said. "We've heard over the decades that Edmontonians don't want something disposable. They want to be proud of the buildings that we're building with their money."


CTV News
25-06-2025
- CTV News
5 teens face 19 charges after Leduc break-in
Five teens between the ages of 13 and 15 face more than a dozen charges after breaking into a home in Leduc on Monday night. RCMP say they arrested the teens after receiving a call from a neighbour who saw them enter a home in the area of 48 Street and 46 Avenue around 8:20 p.m. All five of the teens face charges of break and enter to a residence, mischief over $5,000 and being unlawfully in a dwelling. The remaining charges stem from breaches of conditions, including a condition that the teens not interact with one another. Police say the youths were held for judicial interim release hearings and were released with multiple conditions.


CTV News
21-06-2025
- Health
- CTV News
AHS warns of several measles exposures in Edmonton, Leduc over the past week
Alberta Health Services on Saturday alerted the public about several measles exposures in Edmonton and Leduc. A person who has been confirmed to have measles was in the following locations while infectious. Others who were in the same locations may have been exposed and should monitor for symptoms and review their immunization record.

Edmonton
- Remedium Medical Clinic at Meadowlark Health and Shopping Centre: 11:26 a.m. to 3 p.m. June 19
- Walmart at Meadowlark Health and Shopping Centre: 1 p.m. to 3:30 p.m. on June 17, 18 and 19
- Misericordia Community Hospital emergency department: 5:21 p.m. June 17 to 3:53 a.m. June 18
- Edmonton Transit Service Route 4 and 54 buses: 2:30 p.m. to 7:20 p.m. June 17

Leduc
- Leduc Community Hospital emergency department: 10:39 p.m. June 17 to 3:10 a.m. June 18
- Leduc Community Hospital diagnostic imaging: 12:30 a.m. to 2:35 a.m. June 18

Measles is extremely contagious and spreads easily through the air. Symptoms include a fever, cough, runny nose, red eyes, and a rash that appears three to seven days after the fever starts, usually beginning behind the ears or on the face and spreading down the body. Anybody with symptoms should stay at home and call Alberta's measles hotline at 1-844-944-3434 before visiting any healthcare facility or provider, including a family physician clinic or pharmacy, AHS says. Measles can cause ear infections, pneumonia, brain inflammation, premature delivery and sometimes death.