Providers of very large online platforms and of very large online search engines shall put in place reasonable, proportionate and effective mitigation measures, tailored to the specific systemic risks identified pursuant to Article 34, with particular consideration to the impacts of such measures on fundamental rights. Such measures may include, where applicable:
(a) adapting the design, features or functioning of their services, including their online interfaces;
(b) adapting their terms and conditions and their enforcement;
(c) adapting content moderation processes, including the speed and quality of processing notices related to specific types of illegal content and, where appropriate, the expeditious removal of, or the disabling of access to, the content notified, in particular in respect of illegal hate speech or cyber violence, as well as adapting any relevant decision-making processes and dedicated resources for content moderation;
(d) testing and adapting their algorithmic systems, including their recommender systems;
(e) adapting their advertising systems and adopting targeted measures aimed at limiting or adjusting the presentation of advertisements in association with the service they provide;
(f) reinforcing the internal processes, resources, testing, documentation, or supervision of any of their activities in particular as regards detection of systemic risk;
(g) initiating or adjusting cooperation with trusted flaggers in accordance with Article 22 and the implementation of the decisions of out-of-court dispute settlement bodies pursuant to Article 21;
(h) initiating or adjusting cooperation with other providers of online platforms or of online search engines through the codes of conduct and the crisis protocols referred to in Articles 45 and 48 respectively;
(i) taking awareness-raising measures and adapting their online interface in order to give recipients of the service more information;
(j) taking targeted measures to protect the rights of the child, including age verification and parental control tools, tools aimed at helping minors signal abuse or obtain support, as appropriate;
(k) ensuring that an item of information, whether it constitutes a generated or manipulated image, audio or video that appreciably resembles existing persons, objects, places or other entities or events and falsely appears to a person to be authentic or truthful is distinguishable through prominent markings when presented on their online interfaces, and, in addition, providing an easy to use functionality which enables recipients of the service to indicate such information.