EU Warns Big Tech to Strengthen Cyberbullying Protections for Children

European policymakers are sharpening their focus on how social networks and other online platforms handle harassment of children, and they are signaling that current efforts are not enough. As part of a broader push to protect minors online, the EU is tying tougher expectations for Big Tech to the Digital Services Act and a new cyberbullying action plan that reaches into product design, content moderation, and cross-border enforcement. The emerging framework could redefine what it means for the largest platforms to provide a “safe” service for young users across the bloc.

At the heart of the shift is a simple premise: if one in six children is being targeted online, then the status quo is failing. The EU executive is now moving from guidance and voluntary codes toward binding duties, backed by the threat of significant penalties for companies that do not adapt their systems to detect, limit, and respond to cyberbullying.

The scale of cyberbullying and why the EU is escalating

Policymakers are not acting in a vacuum. One in six young people report having been the victim of cyberbullying, according to a comprehensive poll conducted across many European nations, a figure that underscores how deeply harassment has seeped into daily digital life for minors and how sharply it has risen in recent years. Within the European Union specifically, one in six children aged 11 to 15 report being victims of cyberbullying, a pattern that aligns with the broader poll and reinforces the sense that online abuse is now a structural public health issue rather than a fringe problem.

Against that backdrop, the EU has begun to frame cyberbullying as a systemic risk that must be addressed through regulation of Big Tech rather than left solely to parents or schools. In short, the EU expects Big Tech to improve its cyberbullying measures under the Digital Services Act, with obligations that touch content moderation workflows, user reporting tools, and access to support for victims. These rules are not abstract: they are designed to reach the one in six children who are already experiencing harm, and they sit alongside broader requirements in the Digital Services Act that oblige online platforms to do more to counter illegal and harmful content.

How the Digital Services Act tightens the screws on Big Tech

The Digital Services Act is the legal backbone of the EU’s new expectations, and it already requires online platforms to do more to counter illegal and harmful content, including harassment of minors. The act contains guidelines on how platforms should deal with cyberbullying, from faster removal of abusive posts to clearer pathways for users to report incidents and get support. The EU executive has signaled that these baseline duties may need to be interpreted more aggressively when it comes to cyberbullying, especially for the largest services that shape how teenagers communicate.

In Brussels, the European Commission has gone further, publicly warning in a Feb. 10 briefing reported by Reuters that online platforms may have to do more to fight cyberbullying, and hinting that enforcement of the Digital Services Act will increasingly focus on how companies handle abuse of minors. The same coverage notes that Big Tech is being told to step up its efforts, reinforcing that the message is aimed squarely at the largest players in the social media and messaging ecosystem.

The new EU-wide action plan and its reporting app for minors

Alongside the legal obligations in the Digital Services Act, the European Commission is rolling out a dedicated cyberbullying action plan that puts a reporting tool for minors at its core. The Commission will support the rollout of an online safety app across member states, enabling children and young people to quickly flag abusive content and connect to their national support services, a move that aims to standardize access to help regardless of where a child lives in the bloc. The app is intended not only to protect minors but also to hold bullies accountable and empower victims with a tool to fight back, a dual purpose the Commission has framed as central to EU action on cyberbullying.

The plan also seeks a more coordinated EU-wide approach, urging member states to develop comprehensive national strategies and adopt a common understanding of what measures are necessary to protect children online. The European Commission is set to present its new action plan against cyberbullying on Tuesday, and EU lawmakers have stressed that fighting cyberbullying is one of the Commission’s priorities, with particular attention to children who have been victims or who admit to having participated in it.

Platforms, “addictive” design and tougher detection duties

For platforms, the most immediate operational change will likely come from tougher detection obligations that sit alongside the new action plan. Specialist legal-risk analysis has already flagged that platforms will face tougher detection duties under the EU cyberbullying plan, with regulators expected to scrutinize how automated systems and human moderators identify patterns of harassment and intervene before abuse escalates. Platforms will also likely need to adapt their risk assessments and transparency reporting, reinforcing that detection is no longer a “nice to have” but a regulated duty.

Regulators are also looking beyond content to the mechanics of how apps keep children engaged. Addictive design features are under scrutiny as part of the EU-wide action plan, with officials warning that endless scroll, streaks, and other engagement tricks can trap minors in environments where bullying thrives and make it harder for them to disengage. The proposal also comes as the EU pushes on multiple fronts to shield children from harmful content and from the use of deepfakes for bullying, underscoring that design choices and emerging technologies are now part of the regulatory conversation.

National bans, parental control debates and what comes next

The EU-level push is unfolding alongside a patchwork of national experiments that could further reshape how minors use social media. Across Europe, at least 15 governments are planning social media restrictions or outright bans for kids and teens, a trend that illustrates how political patience with self-regulation has worn thin and how far some capitals are willing to go. These national measures sit uneasily with the EU’s preference for harmonized rules, but they also increase pressure on Big Tech to adapt quickly rather than wait for years of litigation to clarify its duties.
