
The digital conflict and empowering awareness
Digital influence comes from many sources and manifests negatively at both the individual and collective level. A recent international study reported by The Australian found a link between playing video games, especially those featuring 'loot boxes', and increased risks of gambling as well as related mental health issues. Researchers noted that purchasing these loot boxes can trigger anxiety, stress, depression and impulsivity, factors common to other behavioural addictions, and warned that children who buy them may later be prone to gambling problems. Similarly, a 2023 report by CyberSafeKids revealed that 65 per cent of Irish children aged 8 to 12 had been contacted online by strangers, highlighting a lack of awareness and parental supervision regarding online safety. This deficiency raises the likelihood of children encountering harmful content or being exploited by dangerous groups.
Digital platforms and video games have also become easily exploitable tools for terrorist and criminal organisations to recruit and indoctrinate young minds. These groups use virtual spaces to interact with youth, influencing their ideas and behaviours through targeted content or direct interactions, thus facilitating the recruitment of minors. Such phenomena require both societal awareness and effective tools to monitor and combat them.
Furthermore, many young people, often referred to as the digital generation, rely on platforms like TikTok and YouTube as primary sources of information. This reliance exposes them to misinformation and erroneous intellectual guidance. In this context, Jean Baudrillard's notion of hyperreality becomes apparent: as the boundaries between truth and illusion blur, youth are left vulnerable to both intentional and unintentional manipulation.
The recommendation algorithms on these platforms can promote extremist content; a study by Egypt's Suez Canal University found that certain games and digital platforms broadcast ideas that conflict with societal and religious values. Additionally, research reported by Saudi Arabia's Al-Mowaten electronic newspaper found that about 30 per cent of children playing online games experience bullying, which increases social isolation and anxiety, weakening social bonds and making individuals more susceptible to external influences, including recruitment by criminal groups.
Beyond the risks associated with video games, violent films and those depicting acts of terrorism can also instil unethical behaviours in young minds. A study indexed on PubMed found that excessive exposure to violent content correlates with higher rates of depression and aggressive behaviour among teenagers. Moreover, some films are used indirectly by extremist and criminal groups, portraying 'heroic' characters from their ranks to sway viewers ideologically, stimulate emotions and redirect their thinking. Similar tactics appear in some songs, music and religious-themed chants that carry subliminal psychological messages promoting non-religious ideologies.
Many reports and studies indicate that terrorist groups recruit youth online, often using AI techniques to target those most addicted to digital platforms. In this context, the Italian philosopher Antonio Gramsci's ideas on 'cultural hegemony' intersect with the digital reality: the struggle to control collective consciousness is no longer limited to traditional tools but extends into virtual space, where narratives and beliefs are carefully crafted by specialised groups capable of infiltrating societies.
Given this alarming picture, it is imperative to put robust measures in place to confront these challenges. This starts with effective digital education, especially within families and schools, through awareness, supervision and curricula designed to develop digital literacy and the necessary defences. There is also a need to enforce digital ethics, redefining digital responsibility and its risks. Drawing on Immanuel Kant's philosophy, which emphasises duty over self-interest, governments, educational institutions and families must work together to enhance digital awareness. Measures include restricting harmful content using AI algorithms, promoting critical education that trains youth to analyse digital content, and encouraging balanced digital usage with regulated screen time alongside cultural, social and sports programmes.
Only through a threefold alliance, in which governments enact strict regulations on harmful digital practices, educational institutions incorporate critical digital literacy into their curricula, and families raise awareness and monitor digital activities, can the digital realm be transformed from an ideological battleground into a positive space that enriches minds and benefits both individuals and society.

Related Articles


Observer, 04-07-2025
BIG TECH MODERATORS UNITE TO FIGHT TRAUMA
Content moderators from the Philippines to Türkiye are uniting to push for greater mental health support to help them cope with the psychological effects of exposure to a rising tide of disturbing images online. The people tasked with removing harmful content for tech giants such as Meta Platforms and TikTok report a range of noxious health effects, from loss of appetite to anxiety and suicidal thoughts. "Before I would sleep seven hours," said one Filipino content moderator, who asked to remain anonymous to avoid problems with their employer. "Now I only sleep around four hours." Workers are gagged by non-disclosure agreements with the tech platforms or the companies that do the outsourced work, meaning they cannot discuss exact details of the content they are seeing. But moderators gave as examples videos of people being burned alive by Islamic State, babies dying in Gaza and gruesome pictures from the Air India crash in June. Social media companies, which often outsource content moderation to third parties, are facing increasing pressure to address the emotional toll of the work. Meta, which owns Facebook, WhatsApp and Instagram, has already been hit with workers' rights lawsuits in Kenya and Ghana; in 2020 the firm paid a $52 million settlement to American content moderators suffering long-term mental health issues. The Global Trade Union Alliance of Content Moderators was launched in Nairobi in April to establish worker protections for what its members call 'a 21st-century hazardous job', comparable to the work of emergency responders. Their first demand is for tech companies to adopt mental health protocols, such as exposure limits and trauma training, throughout their supply chains. "They say we're the ones protecting the internet, keeping kids safe online," the Filipino worker said. "But we are not protected enough."
SCROLLING TRAUMA
Globally, tens of thousands of content moderators spend up to 10 hours a day scrolling through social media posts to remove harmful content, and the mental toll is well documented. "I've had bad dreams because of the graphic content and I'm smoking more, losing focus," said Berfin Sirin Tunc, a content moderator for TikTok in Türkiye employed via the Canadian-based tech company Telus, which also does work for Meta. In a video call, she said that the first time she saw graphic content as part of her job she had to leave the room and go home. While some employers do provide psychological support, some workers say it is just for show, with advice to count numbers or do breathing exercises. Therapy is limited to group sessions or a recommendation to switch off for a set number of 'wellness break' minutes. But taking them is another matter. "If you don't go back to the computer, your team leader will ask where you are and (say) that the queue of videos is growing," said Tunc. "Bosses see us just as machines." In emailed statements, Telus and Meta said the well-being of their employees is a top priority and that employees should have access to 24/7 healthcare support.
RISING PRESSURE
Moderators have seen an uptick in violent videos. A report by Meta for the first quarter of 2025 showed a rise in the sharing of violent content on Facebook after the company changed its content moderation policies in a commitment to 'free expression'. However, Telus said in its emailed response that internal estimates show distressing material represents less than 5 per cent of the total content reviewed. Adding to the pressure on moderators is the fear of losing their jobs as companies shift towards AI-powered moderation. Meta, which over the years invested billions and hired thousands of content moderators globally to police extreme content, scrapped its US fact-checking programme in January, following the election of Donald Trump.
In April, 2,000 Barcelona-based workers were sent home after Meta severed a contract with Telus. A Meta spokesperson said the company has moved the services that were being performed from Barcelona to other locations. "I'm waiting for Telus to fire me," said Tunc, "because they fired my friends from our union." Fifteen workers in Türkiye are suing the company after being dismissed, they say, for organising a union and attending protests this year. A spokesperson for Telus said in an emailed response that the company "respects the rights of workers to organise". Telus said a May report by Türkiye's Ministry of Labour found that the contract terminations were based on performance and that it could not be concluded they were union-related. The Labour Ministry did not immediately respond to a request for comment.
PROTECTION PROTOCOLS
Moderators in low-income countries say that low wages, productivity pressure and inadequate mental health support can be remedied if companies sign up to the Global Alliance's eight protocols. These include limiting exposure time, setting realistic quotas and providing 24/7 counselling, as well as living wages, mental health training and the right to join a union. Telus said it was already in compliance with the demands, and Meta said it conducts audits to check that companies are providing the required on-site support. "Bad things are happening in the world. Someone has to do this job and protect social media," said Tunc. "With better conditions, we can do this better. If you feel like a human, you can work like a human." — Thomson Reuters Foundation
JOANNA GILL
The writer is Europe correspondent for Thomson Reuters Foundation


Observer, 28-06-2025
Matcha: the Japanese tea taking over the world
Matcha is the new drink of choice at hip cafes worldwide, but Japanese producers are struggling to keep up with soaring demand for the powdered green tea. Here's what you need to know about the drink beloved of weekend treat-seekers and "wellness" influencers:
- What is matcha? -
The word matcha means "ground tea" in Japanese and comes in the form of a vivid green powder that is whisked with hot water and can be added to milk to make a matcha latte. Green tea was introduced to Japan from China in the early ninth century, and was first used for medicinal purposes. Matcha came much later, in 16th-century Kyoto, as part of the tea ceremony tradition developed by tea master Sen no Rikyu. Today, there are different grades of matcha quality, from "ceremonial" to "culinary" types used in baking.
- How is it produced? -
Matcha is made from leaves called "tencha", which are grown in the shade in the final weeks before harvest to concentrate the flavour, colour and nutrients. This "requires the construction of a complex structure with poles and a roof to filter the light", explained Masahiro Okutomi, a tea producer in Sayama, northwest of Tokyo. Tencha leaves, rich in chlorophyll and in L-theanine, a compound known for its relaxing effects, are hand-picked and deveined, then steamed, dried and ground between two stone mills to produce an ultra-fine powder. It can take up to an hour to produce just 40 grams (1.4 ounces) of matcha, making the powder on average twice as expensive to produce as standard green tea leaves.
- What are its benefits? -
Many drink matcha for its rich, grass-like taste, but others are drawn to the drink's nutritional properties.
It is rich in antioxidants, and can aid concentration because of its caffeine content: one cup contains on average 48 milligrams, slightly less than a drip coffee but nearly twice as much as a standardly brewed green tea. "Matcha is often seen as being good for your health," said Shigehito Nishikida, manager of Tokyo tea shop Jugetsudo. "But people are also attracted to the Japanese culture around tea: the ritual, the time taken, the aesthetics," he said.
- Why is it so popular? -
Japan produced 4,176 tonnes of matcha in 2023, a huge increase from the 1,430 tonnes in 2012. More than half of the powder is exported, according to the agriculture ministry, mostly to the United States, Southeast Asia, Europe, Australia and the Middle East. Millions of videos on TikTok, Instagram and YouTube demonstrate how to make photogenic matcha drinks or choose a traditional "chasen" bamboo whisk. "I feel like Gen Z really drove this enthusiasm for matcha, and they heavily relied on social media to do so," Stevie Youssef, a 31-year-old marketing professional, told AFP at a matcha bar in Los Angeles. Matcha can also be used in cooking, extending its appeal beyond tea lovers. "Some customers simply enjoy drinking it, others like preparing it themselves. And of course, many buy it as a gift; Japanese matcha is always appreciated," said Jugetsudo's Nishikida. —AFP


Observer, 19-06-2025
Grief in filters: The digital mask of emotion
Gen Z is known for introducing many new concepts; some are worth keeping, others not so much. Over the past few years, Gen Z has started to treat sadness as an aesthetic, something to play around with instead of confronting it as a real emotion. Whether it's losing someone or going through a traumatic event, each emotion carries weight, and turning it into a trend makes those feelings harder to understand.

Instead of dealing with emotions directly, many young people turn to various coping mechanisms. Some are healthy, but others raise concerns. From ironic memes to oversharing on social media, these habits have become common ways to process pain. Studies suggest that around 45 per cent of youth rely on harmful coping strategies. One of the most common is binge-watching series, which often leads to severe procrastination and distraction from studies or activities they used to enjoy. Another is oversharing with strangers or online friends when they feel no one else understands, which can lead to leaks of personal information or even emotional harm. Gen Z also uses dark humour to mask pain, often without realising that others going through the same thing might not find it funny. The most serious and damaging of these habits is emotional numbing: believing that suppressing emotions will stop the pain. But this only leads to endless scrolling, gaming and surface-level interactions.

On the other hand, a portion of Gen Z is turning to healthier methods. Meditation helps calm the mind, journaling allows for emotional release, and reading gives a chance to relate to characters and better understand one's own feelings.

Social media platforms like Instagram and TikTok often glorify these unhealthy habits. Reels, posts and trauma-dump stories can make sadness look beautiful, and when pain becomes a trend, it becomes hard to tell who genuinely needs help and who is following the aesthetic. These habits can seriously affect mental health.
Many experience anxiety, stress or even depression without realising what is really causing it. To change this, we need to shift towards better strategies: opening up to someone we trust, doing creative or active things like art or sports, and seeing therapy as a healthy, not shameful, option. In conclusion, it's time to stop pretending everything is fine or turning pain into a joke. Sadness is real, and everyone experiences it. The difference lies in how we cope, and it's up to us to turn aesthetic into awareness.