Latest news with #digitaladulthood


Malay Mail
2 days ago
- Politics
- Malay Mail
EU debates stricter social media rules to protect children from harmful content
LUXEMBOURG, June 7 — From dangerous diet tips to disinformation, cyberbullying to hate speech, the glut of online content harmful to children grows every day. But several European countries have had enough and agree the EU should do more to prevent minors' access to social media.

The European Union already has some of the world's most stringent digital rules to rein in Big Tech, with multiple probes ongoing into how platforms protect children — or fail to do so.

Backed by France and Spain, Greece spearheaded a proposal for how the EU should limit children's use of online platforms as a rising body of evidence shows the negative effects of social media on children's mental and physical health. They discussed the plan Friday with EU counterparts in Luxembourg to push the idea of setting an age of digital adulthood across the 27-country bloc, meaning children would not be able to access social media without parental consent.

France, Greece and Denmark believe there should be a ban on social media for under-15s, while Spain has suggested a ban for under-16s. Australia has passed a ban on social media for under-16s that takes effect later this year, while New Zealand and Norway are considering a similar prohibition.

After the day's talks in Luxembourg, it appeared there was no real appetite at this stage for an EU-wide ban on children under a specific age. But Danish Digital Minister Caroline Stage Olsen indicated there would be no let-up. 'It's going to be something we're pushing for,' she said.

Top EU digital official Henna Virkkunen admitted specific age limits would be 'challenging' for multiple reasons, including cultural differences in member states and how it would work in practice. But the European Commission, the EU's digital watchdog, still intends to launch an age-verification app next month, insisting it can be done without disclosing personal details.

'Very big step'

The EU last month published non-binding draft guidelines for platforms to protect minors, to be finalised once a public consultation ends this month, including setting children's accounts to private by default and making it easier to block and mute users.

French Digital Minister Clara Chappaz said it would be 'a very big step' if the EU made platforms check the real age of their users, as theoretically required under current regulation. The worry is that children as young as seven or eight can easily create an account on social media platforms despite a minimum age of 13, by giving a false date of birth.

'If we all agree as Europeans to say this needs to stop, there needs to be a proper age verification scheme, then it means that children below 13 won't be able to access the platform,' Chappaz said.

France has led the way in cracking down on platforms, passing a 2023 law requiring them to obtain parental consent for users under the age of 15. But the measure has not received the EU green light it needs to come into force. France also gradually introduced requirements this year for all adult websites to have users confirm their age to prevent children accessing porn — with three major platforms going dark this week in anger over the move.

TikTok, also under pressure from the French government, on Sunday banned the '#SkinnyTok' hashtag, part of a trend promoting extreme thinness on the platform.
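The Commission has not detailed how its age-verification app would work, but the principle it describes — confirming an age threshold without disclosing personal details — resembles token-based attestation: a trusted verifier checks a birthdate once and hands the platform only a signed yes/no claim. The Python sketch below is a minimal illustration of that idea under assumed names, not the Commission's design; the shared-secret HMAC and the 15-year threshold are placeholders (a real scheme would more likely rely on asymmetric signatures or zero-knowledge proofs).

```python
# Illustrative sketch only: an age-attestation flow in which a platform learns
# a yes/no "over the threshold" claim, never the user's birthdate or identity.
import hmac, hashlib, json
from datetime import date

VERIFIER_KEY = b"demo-shared-secret"  # stand-in for a real signing key / PKI

def issue_attestation(birthdate: date, threshold: int, today: date) -> dict:
    """Run by a trusted verifier: checks the birthdate locally and emits a
    signed claim that reveals only whether the user meets the age threshold."""
    age = today.year - birthdate.year - (
        (today.month, today.day) < (birthdate.month, birthdate.day)
    )
    claim = {"over_threshold": age >= threshold, "threshold": threshold}
    payload = json.dumps(claim, sort_keys=True).encode()
    tag = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest()
    return {"claim": claim, "tag": tag}  # the birthdate never leaves the verifier

def platform_accepts(attestation: dict, required_threshold: int) -> bool:
    """Run by the platform: verifies the tag and the claim, nothing more."""
    payload = json.dumps(attestation["claim"], sort_keys=True).encode()
    expected = hmac.new(VERIFIER_KEY, payload, hashlib.sha256).hexdigest()
    return (
        hmac.compare_digest(expected, attestation["tag"])
        and attestation["claim"]["threshold"] >= required_threshold
        and attestation["claim"]["over_threshold"]
    )

if __name__ == "__main__":
    att = issue_attestation(date(2012, 3, 14), threshold=15, today=date(2025, 6, 6))
    print(platform_accepts(att, required_threshold=15))  # False: user is under 15
```

The point of the split is the one the article raises: a platform relying on a self-declared date of birth can be bypassed by typing a false one, whereas here the platform never sees a birthdate at all and can only act on the verifier's signed claim.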
In-built age verification

France, Greece and Spain expressed concern about the algorithmic design of digital platforms increasing children's exposure to addictive and harmful content — with the risk of worsening anxiety, depression and self-esteem issues.

Their proposal — also supported by Cyprus and Slovenia — blames excessive screen time at a young age for hindering the development of minors' critical and relationship skills. They demand 'an EU-wide application that supports parental control mechanisms, allows for proper age verification and limits the use of certain applications by minors'. The goal would be for devices such as smartphones to have in-built age verification.

The EU is clamping down in other ways as well. It is currently investigating Meta's Facebook and Instagram, and TikTok, under its mammoth content moderation law, the Digital Services Act (DSA), fearing the platforms are failing to do enough to prevent children accessing harmful content. And last week, it launched an investigation into four pornographic platforms over suspicions they are failing to stop children accessing adult content. — AFP
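The call for smartphones with in-built age verification and an EU-wide application that "limits the use of certain applications by minors" implies some form of operating-system-level gate before a minor's profile can open a restricted app. The sketch below is a hypothetical illustration of such a gate, assuming the device already holds a verified age and a record of parental consent; none of the names, fields or thresholds come from the proposal itself.

```python
# Hypothetical sketch of a device-level gate of the kind the proposal implies:
# the OS holds a verified age and a parental-consent record, and apps flagged
# as age-restricted launch only when the policy allows it.
from dataclasses import dataclass, field

@dataclass
class DeviceProfile:
    verified_age: int                              # set once by an age-verification step
    parental_consent: set[str] = field(default_factory=set)  # app IDs a parent approved

@dataclass
class AppPolicy:
    app_id: str
    minimum_age: int                   # e.g. 15, as in France's 2023 law
    consent_can_override: bool = True  # whether parental consent unlocks it below that age

def may_launch(profile: DeviceProfile, policy: AppPolicy) -> bool:
    """Return True if this profile may open the app under the device policy."""
    if profile.verified_age >= policy.minimum_age:
        return True
    if policy.consent_can_override and policy.app_id in profile.parental_consent:
        return True
    return False

if __name__ == "__main__":
    child = DeviceProfile(verified_age=13, parental_consent={"video_app"})
    print(may_launch(child, AppPolicy("social_app", minimum_age=15)))  # False
    print(may_launch(child, AppPolicy("video_app", minimum_age=15)))   # True, via consent
```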

News.com.au
3 days ago
- Politics
- News.com.au
New Europe push to curb children's social media use
From dangerous diet tips to disinformation, cyberbullying to hate speech, the glut of online content harmful to children grows every day. But several European countries have had enough and now want to limit minors' access to social media.

The European Union already has some of the world's most stringent digital rules to rein in Big Tech, with multiple probes ongoing into how platforms protect children -- or not. There are now demands for the EU to go further as a rising body of evidence shows the negative effects of social media on children's mental and physical health.

Backed by France and Spain, Greece has spearheaded a proposal for how the EU should limit children's use of online platforms as fears mount over their addictive nature. They will present the plan on Friday to EU counterparts in Luxembourg "so that Europe can take the appropriate action as soon as possible", Greek Digital Minister Dimitris Papastergiou said.

The proposal includes setting an age of digital adulthood across the 27-country EU, meaning children will not be able to access social media without parental consent. Since the proposal was published last month, other countries have expressed support, including Cyprus and Denmark -- which takes over the rotating EU presidency in July. Danish officials say the issue will be a priority during their six-month presidency.

France has led the way in cracking down on platforms, passing a 2023 law requiring them to obtain parental consent for users under the age of 15. But the measure has not received the EU green light it needs to come into force. France also gradually introduced requirements this year for all adult websites to have users confirm their age to prevent children accessing porn -- with three major platforms going dark this week in anger over the move.

Also under pressure from the French government, TikTok on Sunday banned the "#SkinnyTok" hashtag, part of a trend promoting extreme thinness on the platform.

Real age verification

Greece says its aim is to protect children from the risks of excessive internet use. The proposal does not say at what age digital adulthood should begin, but Papastergiou said platforms should know users' real ages "so as not to serve inappropriate content to minors".

France, Greece and Spain expressed concern about the algorithmic design of digital platforms increasing children's exposure to addictive and harmful content -- with the risk of worsening anxiety, depression and self-esteem issues. The proposal also blames excessive screen time at a young age for hindering the development of minors' critical and relationship skills.

They demand "an EU-wide application that supports parental control mechanisms, allows for proper age verification and limits the use of certain applications by minors". The goal would be for devices such as smartphones to have in-built age verification.

The European Commission, the EU's digital watchdog, wants to launch an age-verification app next month, insisting it can be done without disclosing personal details. The EU last month published draft guidelines for platforms to protect minors, to be finalised once a public consultation ends this month, including setting children's accounts to private by default and making it easier to block and mute users. Those guidelines are non-binding, but the bloc is clamping down in other ways.

EU investigations

It is currently investigating Meta's Facebook and Instagram, and TikTok, under its mammoth content moderation law, the Digital Services Act (DSA), fearing the platforms are failing to do enough to prevent children accessing harmful content. In the Meta probe, the EU fears the platform's age-verification tools may not be effective. And last week, it launched an investigation into four pornographic platforms over suspicions they are failing to stop children accessing adult content.

Separately, the EU has been in long-running negotiations on a law to combat child sexual abuse material, but the proposal has been mired in uncertainty, with worries from some countries that it would allow authorities to access encrypted communications. The legal proposal has pitted proponents of privacy against those working to protect children -- and despite repeated attempts, it has failed to get EU states' approval. -- AFP
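The draft guidelines described above boil down to choosing safer defaults whenever an account is verified as belonging to a minor: private by default, and blocking and muting kept easy to reach. The snippet below sketches what that could look like in platform code; the setting names and the 18-year cut-off are invented for the example and are not drawn from the EU's text or any platform's implementation.

```python
# Illustrative only: applying protective sign-up defaults for accounts verified
# as belonging to minors, in the spirit of the EU's draft guidelines.
from dataclasses import dataclass

@dataclass
class AccountSettings:
    private_profile: bool
    discoverable_in_search: bool
    direct_messages_from: str           # "everyone", "contacts" or "nobody"
    quick_block_and_mute_shortcut: bool

def default_settings(verified_age: int, adult_age: int = 18) -> AccountSettings:
    """Pick sign-up defaults from the verified age; minors get the protective set."""
    if verified_age < adult_age:
        return AccountSettings(
            private_profile=True,
            discoverable_in_search=False,
            direct_messages_from="contacts",
            quick_block_and_mute_shortcut=True,
        )
    return AccountSettings(
        private_profile=False,
        discoverable_in_search=True,
        direct_messages_from="everyone",
        quick_block_and_mute_shortcut=True,
    )

if __name__ == "__main__":
    print(default_settings(verified_age=14))  # protective defaults for a minor
```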