OPP Meeting Summary: EP IMCO Committee - Safety of users on big platforms (27 March 2023)

A summary of the exchange of views is available.

Chair Anna Cavazzini (Greens/EFA, Germany)

  • she decided to hold the exchange of views due to the political developments around "one particular platform" and the discussions about banning it from work phones, or even altogether, as was the case in the United States;
  • she noted that there were also new features of this platform that could have serious consequences for minors, as they perpetuated beauty standards.

Mr Prabhat Agarwal, Head of Unit for Digital Services (Programme Office & Societal aspects) (CNECT.F.2) at DG CNECT, European Commission

  • he highlighted three preliminary points;
  • first, he acknowledged that the work of the European Parliament (EP) had improved the risk assessment under the Digital Services Act (DSA) compared to the Commission's proposal, particularly in relation to the mental health of minors, as well as synthetic images and content;
    • these inclusions had anticipated some of the risks that they were identifying at the time;
  • second, the co-legislators had decided to anticipate the application timeline in the last trilogue and give the Commission less time than Member States (MS) to prepare for implementation;
    • this was logical since the legal powers of the Commission did not require additional legal specifications besides some secondary pieces of legislation;
    • this was decided upon due to the common concern about the importance of regulating and providing regulatory oversight as quickly as possible;
  • third, given the scale of the challenge posed by online platforms, the Commission had to rely on expertise networks and cooperation with MS;
    • he noted that they were in the process of establishing the expertise through recruitment and cooperation;
    • the Commission was also establishing cooperation mechanisms with MS;
    • he noted that Article 64 DSA was designed to support the establishment and development of expertise;
    • since the scale of the issues was so broad, the Commission had to rely on this network even though it was the lead regulator;
  • regarding the legislative preparation, he noted that six secondary acts had either been transmitted to the co-legislator for scrutiny or were under preparation;
  • first, the delegated act on the methodology for calculating the supervisory fee, which provided details on the methods for the determination of the estimated costs that the Commission would need to incur, the calculation of the individual fees as well as the necessary procedures to levy the fee;
    • it was very important for the Commission to be adequately resourced in carrying out its complex supervisory task;
    • he stressed the importance of the fee given that the Commission would supervise some of the most technologically sophisticated companies in the world, some of which had a market power that was bigger than the GDP of some countries;
  • second, the implementing act on procedures for Very Large Online Platforms (VLOPs) and Very Large Search Engines (VLOSEs) under Article 83;
    • he noted that the feedback period had recently ended;
    • the adoption was planned for the end of May 2023;
    • it would be voted in Comitology;
    • this act was necessary to provide platforms with the necessary procedural rights;
    • he recalled that some of the powers of the Commission were modelled on Regulation 1/2003 on Antitrust procedures;
  • third, the delegated act on conducting independent audits, whose adoption was planned for the end of June 2023, with a public feedback period probably in May;
    • the act set out the procedures and methodology and provided reporting templates that should underpin the independent audits;
    • the audits were a very important element to control the validity of the risk assessment;
  • fourth, the DSA provided for a mandatory delegated act on the access of data for researchers under Article 40, which should detail the purposes for which the data could be used, the conditions for sharing, and the relevant objective indicators, as well as some procedural elements, including a potential advisory mechanism;
    • the Commission planned to run a Call for Evidence to understand how the practicalities could be implemented uniformly across the EU;
    • the understanding of mental health risks, such as the impact of beauty filters, would require both regulatory supervision and experts' access to data;
    • he recalled that Article 40 required the existence of the Board, so the act could not be adopted before its establishment. Since this was scheduled for February 2024, the Commission would adopt the delegated act in March 2024;
  • fifth, they were preparing another implementing regulation on transparency reporting obligations under Articles 24 and 15 DSA;
    • the Commission would probably consult some more experts and adopt the rules towards the end of 2023;
  • sixth, the information-sharing system that would allow communication with MS, in order to bring cases to the Commission and provide feedback, would be put in place in 2024 once the Board was fully established;
  • he noted that the Commission was putting internal procedures in place and they were in dialogue with companies as part of the designation process and in order to assist them to comply with the DSA;
  • during these dialogues, companies had to approach the Commission with their plans and their readiness status. After this, they went through a sequence of issues, which was largely the same for all VLOPs and VLOSEs, to identify the areas in which further work was needed;
    • these discussions included on-site visits;
  • the companies that had self-declared themselves above the threshold were engaging constructively;
    • however, the Commission did not agree with them on every element, and not all had the same level of quality and preparation;
  • he noted that the Commission was taking the protection of minors and mental health very seriously;
  • they were establishing networks with relevant experts to gather the evidence;
  • although he could not comment on any specific company or incident, he assured MEPs that they were taking particular account of these elements, which had been reinforced by the co-legislator during the negotiations.

Karen Melchior (RE, Denmark)

  • she called for more openness in the discussions of the Working Group on the implementation of the DSA;
  • it was not "as sexy as a US Congress hearing" but the EU had a much better foundation to regulate platforms than the US where they needed "theatrical Congress hearings";
  • she called for combining the work of the Working Group and the preparatory work for the implementation with political calls for more accountability of the platforms.

Chair Anna Cavazzini (Greens/EFA, Germany)

  • she noted that TikTok had refused an invitation to join the debate.

Andreas Schwab (EPP, Germany)

  • the EP had been quite successful in steering the issue without public hearings;
  • he agreed that more public discussions could be helpful; however, what counted was their content;
  • he had two questions for the Commission;
    • first, he asked how the Commission considered the identified privacy, freedom of expression and security concerns posed by some platforms in the context of the EU-US negotiations on data transfers. Moreover, since the DSA required platforms to protect EU users against foreign interference, he asked whether it would be difficult to include this in the agreement;
    • second, he noted that most children and young people used social media every day, where there was plenty of non-age-appropriate content, and that issues such as cyber-mobbing were on the rise. He asked whether the Commission wanted to come up with specific protection tools for minors during the next legislature.

Christel Schaldemose (S&D, Denmark)

  • she welcomed MEP Schwab's question and suggested that maybe they should start thinking about what the EP would want the Commission to do in the next term;
  • although she noted that the DSA would be a very strong tool also for protecting minors in general, since for instance platforms would have to look at public health, she acknowledged that perhaps they had not done enough;
  • she noted that they were lucky to have the Commission to ensure the implementation of the DSA;
    • she commended their speed;
  • when they reached the trilogue agreement, they knew that it did not give much time to the Commission, but it was important to ensure that the implementation happened fast;
  • she brought up the risk assessment and the risks for minors. Although it was not yet possible to give any details, as it was still confidential, she asked whether there would be a time when the Commission could be more open about it.

Mr Prabhat Agarwal, Head of Unit for Digital Services (Programme Office & Societal aspects) (CNECT.F.2) at DG CNECT, European Commission

  • regarding cooperation with the US, he noted that there was already an open dialogue on data protection and related issues led by DG JUST;
  • Working Group 5 (WG5) of the Trade and Technology Council (TTC) was the forum in which they discussed DMA and DSA-related issues;
    • he noted that some of the issues in the DSA were also a priority for the US administration and they were included in the State of the Union Speech by President Biden, including the protection of minors and the prohibition of targeted advertising;
    • the dialogue with the US was constructive;
    • they were currently working on further deliverables, but it was early to give an update on the particular content of the discussions;
  • he could not know what the next Commission would propose, but he noted that the European Media Freedom Act and the Audiovisual Media Services Directive already contained a set of rules for the protection of minors, including age verification to access certain sites. The DSA also included a list of tools, and the mitigation measures included age verification and empowering kids online;
    • they should expect platforms to put in place a broad range of these tools as mitigation measures;
  • he also noted the ongoing pilot project under the Better Internet for Kids Strategy (BIK);
    • the responsible department was working on the inclusion of age verification tools and other elements to protect minors online;
  • once the Board was in place these codes of conduct could become DSA codes of conduct under Article 45;
  • there were a number of public disclosure elements which were intended to be presented and discussed before the Parliament, so the Commission would be happy to answer any questions on these occasions;
  • although the individual enforcement actions against specific companies were subject to confidentiality requirements, he stressed that the Commission would take its reporting requirements towards the Parliament very seriously and respect its public accountability obligations.

Kim Van Sparrentak (Greens/EFA, Netherlands)

  • she regretted that representatives from TikTok or Twitter had not joined the exchange of views;
    • she stated that "if they were not doing anything wrong she did not understand why they should not be there to be held accountable";
  • given the prohibition of TikTok from Commission officials' phones and the recommendation to EP staff to uninstall it, TikTok users were worried about their privacy; however, many people were also worried about Facebook, Instagram and Twitter;
  • she asked for more information about why the Commission had decided to ban TikTok and why it was so much worse than Meta or Twitter. She also wanted to know what the Commission would recommend to European TikTok users if the privacy concerns were so high;
  • she noted that TikTok was successful in dragging users into more extreme content, and both adults and kids were exposed to videos glorifying self-harm, eating disorders, and conspiracy theories;
    • she asked whether the Commission was looking specifically into the "addictive design" of these platforms.

Karen Melchior (RE, Denmark)

  • there was some interest regarding Article 21 DSA on out-of-court dispute settlement;
  • she highlighted the importance of issuing criteria on which bodies could be nominated under this provision and she asked about the timeline for this.

Mr Prabhat Agarwal, Head of Unit for Digital Services (Programme Office & Societal aspects) (CNECT.F.2) at DG CNECT, European Commission

  • he could not comment on the decision to remove TikTok from corporate devices;
    • he stressed that it was taken by the Corporate Board, which was different from the DSA enforcement work;
  • privacy risks were included in the risk assessment under the DSA, on top of the GDPR rules that applied nonetheless, and were among the risks that the Commission would scrutinise once the platforms sent their risk assessments;
  • he noted that addictive design had been on the co-legislator's mind when drafting Articles 34 and 35, and issues such as the adjustment of algorithms and recommender systems came at the top of the list of mitigating measures;
  • legislative history showed that the legislator had in mind the practices referenced by MEP Van Sparrentak regarding the analysis of risks to mental and physical health;
  • the preparatory work required establishing expertise and legal standards that would allow the Commission to scrutinise what platforms were doing and find objective facts and evidence;
    • the Commission aimed at “being led where the evidence led them”;
    • he stressed that they had to be objective and that they would be in contact with experts;
  • the work on algorithms and recommender systems would be done jointly with the new European Centre for Algorithmic Transparency;
  • he highlighted that these cases were complex and it was too simplistic to put the full responsibility on the algorithm since user behaviour and peer norming also played a part;
  • a part of his team was dedicated to analysing these questions in detail and building up the necessary expertise to scrutinise what platforms were telling them;
    • the Commission "would not be naive";
  • he noted that the establishment of the necessary criteria for the out-of-court dispute settlement was a work in progress;
    • he could not provide a precise timeline;
    • the Commission had to work together with MS and he noted that they were in close contact to ensure that they were getting ready. This was very important, particularly for the application of Article 21.

Chair Anna Cavazzini (Greens/EFA, Germany)

  • they could try to have another exchange of views with the presence of platform representatives.

The simultaneous interpretation of debates provided by the EU institutions serves only to facilitate communication amongst the participants in the meeting. It does not constitute an authentic record of proceedings. One Policy Place uses these translations, so this text is only a guide and should not be relied on as an official account of the meeting. Only the original speech or the revised written translation of that speech is authentic.