DSA Glossary

Definitions of key terms from the Digital Services Act

42 terms defined


A

Action

An action is the concrete step a hosting provider takes on notified content, such as removing it or blocking access, after deciding the report is justified.

Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place easily accessible and user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned (‘notice’), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content (‘action’). Such mechanisms should be clearly identifiable, located close to the information in question and at least as easy to find and use as notification mechanisms for content that violates the terms and conditions of the hosting service provider. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice in order to ensure the effective operation of notice and action mechanisms. The notification mechanism should allow, but not require, the identification of the individual or the entity submitting a notice. For some types of items of information notified, the identity of the individual or the entity submitting a notice might be necessary to determine whether the information in question constitutes illegal content, as alleged. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in so far as they qualify as hosting services covered by this Regulation.

Recital 50

Active recipient of an online platform

An active recipient of an online platform is someone who either posts to the platform or is shown content hosted there.

‘active recipient of an online platform’ means a recipient of the service that has engaged with an online platform by either requesting the online platform to host information or being exposed to information hosted by the online platform and disseminated through its online interface;

Article 3 - Paragraph 1

Active recipient of an online search engine

An active recipient of an online search engine is any user who submits a query and sees the results.

‘active recipient of an online search engine’ means a recipient of the service that has submitted a query to an online search engine and been exposed to information indexed and presented on its online interface;

Article 3 - Paragraph 1

C

Caching service

A caching service temporarily stores data as it is transmitted so requests can be fulfilled faster and more efficiently.

a ‘caching’ service, consisting of the transmission in a communication network of information provided by a recipient of the service, involving the automatic, intermediate and temporary storage of that information, performed for the sole purpose of making more efficient the information's onward transmission to other recipients upon their request;

Article 3 - Paragraph 1

Commercial communication

Commercial communication is any form of communication designed to promote the goods, services, or image of a business or professional, such as advertising or marketing material.

‘commercial communication’ means ‘commercial communication’ as defined in Article 2, point (f), of Directive 2000/31/EC;

Article 3 - Paragraph 1

Competent authorities

Competent authorities are the national regulators that Member States formally appoint to supervise intermediary services and enforce the DSA.

Member States shall designate one or more competent authorities to be responsible for the supervision of providers of intermediary services and enforcement of this Regulation (‘competent authorities’).

Article 49 - Paragraph 1

Compliance function

A compliance function is an independent internal team within a VLOP/VLOSE responsible for monitoring compliance with the DSA. It must be independent from operational teams, staffed by qualified compliance officers, and have adequate authority, resources, and direct access to senior management to effectively oversee regulatory compliance.

Providers of very large online platforms or of very large online search engines shall establish a compliance function, which is independent from their operational functions and composed of one or more compliance officers, including the head of the compliance function. That compliance function shall have sufficient authority, stature and resources, as well as access to the management body of the provider of the very large online platform or of the very large online search engine to monitor the compliance of that provider with this Regulation.

Article 41 - Paragraph 1

Consumer

A consumer is an individual using a service for personal reasons outside their trade, business, craft, or profession.

‘consumer’ means any natural person who is acting for purposes which are outside his or her trade, business, craft, or profession;

Article 3 - Paragraph 1

Content moderation

Content moderation covers the processes—manual or automated—that platforms use to detect, evaluate, and act on illegal or policy-breaking material.

‘content moderation’ means the activities, whether automated or not, undertaken by providers of intermediary services, that are aimed, in particular, at detecting, identifying and addressing illegal content or information incompatible with their terms and conditions, provided by recipients of the service, including measures taken that affect the availability, visibility, and accessibility of that illegal content or that information, such as demotion, demonetisation, disabling of access to, or removal thereof, or that affect the ability of the recipients of the service to provide that information, such as the termination or suspension of a recipient’s account;

Article 3 - Paragraph 1

Crisis

Under the crisis response mechanism, a crisis is an extraordinary situation that poses a serious EU-wide risk to public security or public health, activating special mitigation duties for very large platforms.

For the purpose of this Article, a crisis shall be deemed to have occurred where extraordinary circumstances lead to a serious threat to public security or public health in the Union or in significant parts of it.

Article 36 - Paragraph 2

D

Dark patterns

Dark patterns are manipulative interface tricks—such as confusing wording or hidden options—that push users toward choices they might not otherwise make freely or knowingly.

Dark patterns on online interfaces of online platforms are practices that materially distort or impair, either on purpose or in effect, the ability of recipients of the service to make autonomous and informed choices or decisions. Those practices can be used to persuade the recipients of the service to engage in unwanted behaviours or into undesired decisions which have negative consequences for them. Providers of online platforms should therefore be prohibited from deceiving or nudging recipients of the service and from distorting or impairing the autonomy, decision-making, or choice of the recipients of the service via the structure, design or functionalities of an online interface or a part thereof. This should include, but not be limited to, exploitative design choices to direct the recipient to actions that benefit the provider of online platforms, but which may not be in the recipients’ interests, presenting choices in a non-neutral manner, such as giving more prominence to certain choices through visual, auditory, or other components, when asking the recipient of the service for a decision. It should also include repeatedly requesting a recipient of the service to make a choice where such a choice has already been made, making the procedure of cancelling a service significantly more cumbersome than signing up to it, or making certain choices more difficult or time-consuming than others, making it unreasonably difficult to discontinue purchases or to sign out from a given online platform allowing consumers to conclude distance contracts with traders, and deceiving the recipients of the service by nudging them into decisions on transactions, or by default settings that are very difficult to change, and so unreasonably bias the decision making of the recipient of the service, in a way that distorts and impairs their autonomy, decision-making and choice. However, rules preventing dark patterns should not be understood as preventing providers to interact directly with recipients of the service and to offer new or additional services to them. Legitimate practices, for example in advertising, that are in compliance with Union law should not in themselves be regarded as constituting dark patterns. Those rules on dark patterns should be interpreted as covering prohibited practices falling within the scope of this Regulation to the extent that those practices are not already covered under Directive 2005/29/EC or Regulation (EU) 2016/679.

Recital 67

Digital Services Committee

The Digital Services Committee is the EU comitology group that helps the Commission adopt implementing acts under the DSA.

The Commission shall be assisted by a committee (‘the Digital Services Committee’). That Committee shall be a Committee within the meaning of Regulation (EU) No 182/2011.

Article 88 - Paragraph 1

Digital Services Coordinator of destination

The Digital Services Coordinator of destination is the regulator in a Member State where the service is actually offered.

‘Digital Services Coordinator of destination’ means the Digital Services Coordinator of a Member State where the intermediary service is provided;

Article 3 - Paragraph 1

Digital Services Coordinator of establishment

The Digital Services Coordinator of establishment is the regulator in the Member State where a provider is based or represented.

‘Digital Services Coordinator of establishment’ means the Digital Services Coordinator of the Member State where the main establishment of a provider of an intermediary service is located or its legal representative resides or is established;

Article 3 - Paragraph 1

Dissemination to the public

Dissemination to the public means making information accessible to a potentially unlimited audience at a user’s request, not just to a closed group.

‘dissemination to the public’ means making information available, at the request of the recipient of the service who provided the information, to a potentially unlimited number of third parties;

Article 3 - Paragraph 1

Distance contract

A distance contract is a consumer agreement concluded without face-to-face contact, using channels like websites, apps, or email.

‘distance contract’ means ‘distance contract’ as defined in Article 2, point (7), of Directive 2011/83/EU;

Article 3 - Paragraph 1

E

European Board for Digital Services

The European Board for Digital Services is the EU-wide advisory network of Digital Services Coordinators that helps align supervision, share guidance, and support enforcement of the DSA.

An independent advisory group of Digital Services Coordinators on the supervision of providers of intermediary services named ‘European Board for Digital Services’ (the ‘Board’) is established.

Article 61 - Paragraph 1

H

Hosting service

A hosting service stores information at a user’s request, such as cloud storage, web hosting, or social media publishing.

a ‘hosting’ service, consisting of the storage of information provided by, and at the request of, a recipient of the service;

Article 3 - Paragraph 1

I

Illegal content

Illegal content covers any information or activity that breaks EU or aligned national law, from unlawful goods to prohibited speech.

‘illegal content’ means any information that, in itself or in relation to an activity, including the sale of products or the provision of services, is not in compliance with Union law or the law of any Member State which is in compliance with Union law, irrespective of the precise subject matter or nature of that law;

Article 3 - Paragraph 1

Information society service

An information society service is any digital service provided for pay, at a distance, online, and on demand—like cloud hosting, messaging apps, or marketplaces.

‘information society service’ means a ‘service’ as defined in Article 1(1), point (b), of Directive (EU) 2015/1535;

Article 3 - Paragraph 1

Intermediary service

An intermediary service is the broad category of digital services that simply transmit, cache, or host information on behalf of others.

‘intermediary service’ means one of the following information society services:

Article 3 - Paragraph 1

M

Mere conduit

A mere conduit service just passes information or provides network access without altering the content, like standard internet access or telecom routing.

a ‘mere conduit’ service, consisting of the transmission in a communication network of information provided by a recipient of the service, or the provision of access to a communication network;

Article 3 - Paragraph 1

N

Negative opinion

A negative opinion signals the auditor has found the very large platform or search engine is not complying with required DSA obligations or promised measures.

The audit report should be substantiated, in order to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the providers of the very large online platform and of the very large online search engine to comply with their obligations under this Regulation. The audit report should be transmitted to the Digital Services Coordinator of establishment, the Commission and the Board following the receipt of the audit report. Providers should also transmit upon completion without undue delay each of the reports on the risk assessment and the mitigation measures, as well as the audit implementation report of the provider of the very large online platform or of the very large online search engine showing how they have addressed the audit’s recommendations. The audit report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A ‘positive opinion’ should be given where all evidence shows that the provider of the very large online platform or of the very large online search engine complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A ‘positive opinion’ should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A ‘negative opinion’ should be given where the auditor considers that the provider of the very large online platform or of the very large online search engine does not comply with this Regulation or the commitments undertaken. Where the audit opinion could not reach a conclusion for specific elements that fall within the scope of the audit, an explanation of reasons for the failure to reach such a conclusion should be included in the audit opinion. Where applicable, the report should include a description of specific elements that could not be audited, and an explanation of why these could not be audited.

Recital 93

Notice

A notice is the formal alert a person sends to a hosting provider pointing to specific content they believe is illegal so that the provider can review it and decide on appropriate action.

Providers of hosting services play a particularly important role in tackling illegal content online, as they store information provided by and at the request of the recipients of the service and typically give other recipients access thereto, sometimes on a large scale. It is important that all providers of hosting services, regardless of their size, put in place easily accessible and user-friendly notice and action mechanisms that facilitate the notification of specific items of information that the notifying party considers to be illegal content to the provider of hosting services concerned (‘notice’), pursuant to which that provider can decide whether or not it agrees with that assessment and wishes to remove or disable access to that content (‘action’). Such mechanisms should be clearly identifiable, located close to the information in question and at least as easy to find and use as notification mechanisms for content that violates the terms and conditions of the hosting service provider. Provided the requirements on notices are met, it should be possible for individuals or entities to notify multiple specific items of allegedly illegal content through a single notice in order to ensure the effective operation of notice and action mechanisms. The notification mechanism should allow, but not require, the identification of the individual or the entity submitting a notice. For some types of items of information notified, the identity of the individual or the entity submitting a notice might be necessary to determine whether the information in question constitutes illegal content, as alleged. The obligation to put in place notice and action mechanisms should apply, for instance, to file storage and sharing services, web hosting services, advertising servers and paste bins, in so far as they qualify as hosting services covered by this Regulation.

Recital 50

O

Offer services in the Union

Offering services in the Union means making an intermediary service meaningfully available to people in at least one EU Member State, such as by targeting them or already serving many of them.

‘to offer services in the Union’ means enabling natural or legal persons in one or more Member States to use the services of a provider of intermediary services that has a substantial connection to the Union;

Article 3 - Paragraph 1

Online interface

An online interface is the software surface—such as a website or app—through which users interact with the service.

‘online interface’ means any software, including a website or a part thereof, and applications, including mobile applications;

Article 3 - Paragraph 1

Online platform

An online platform is a hosting service that also makes user-provided information available to the public, like social networks or marketplaces.

‘online platform’ means a hosting service that, at the request of a recipient of the service, stores and disseminates information to the public, unless that activity is a minor and purely ancillary feature of another service or a minor functionality of the principal service and, for objective and technical reasons, cannot be used without that other service, and the integration of the feature or functionality into the other service is not a means to circumvent the applicability of this Regulation;

Article 3 - Paragraph 1

Online search engine

An online search engine lets users query the web (or a large subset) and returns results in response to keywords, voice queries, or other inputs.

‘online search engine’ means an intermediary service that allows users to input queries in order to perform searches of, in principle, all websites, or all websites in a particular language, on the basis of a query on any subject in the form of a keyword, voice request, phrase or other input, and returns results in any format in which information related to the requested content can be found;

Article 3 - Paragraph 1

P

Persons with disabilities

Persons with disabilities refers to individuals covered by the EU accessibility definition, whose needs must be considered when applying the DSA.

‘persons with disabilities’ means ‘persons with disabilities’ as referred to in Article 3, point (1), of Directive (EU) 2019/882 of the European Parliament and of the Council;

Article 3 - Paragraph 1

Positive opinion

A positive opinion is an auditor's clean bill of health showing a very large platform or search engine is meeting its DSA duties and any extra commitments, including risk assessments and mitigation.

The audit report should be substantiated, in order to give a meaningful account of the activities undertaken and the conclusions reached. It should help inform, and where appropriate suggest improvements to the measures taken by the providers of the very large online platform and of the very large online search engine to comply with their obligations under this Regulation. The audit report should be transmitted to the Digital Services Coordinator of establishment, the Commission and the Board following the receipt of the audit report. Providers should also transmit upon completion without undue delay each of the reports on the risk assessment and the mitigation measures, as well as the audit implementation report of the provider of the very large online platform or of the very large online search engine showing how they have addressed the audit’s recommendations. The audit report should include an audit opinion based on the conclusions drawn from the audit evidence obtained. A ‘positive opinion’ should be given where all evidence shows that the provider of the very large online platform or of the very large online search engine complies with the obligations laid down by this Regulation or, where applicable, any commitments it has undertaken pursuant to a code of conduct or crisis protocol, in particular by identifying, evaluating and mitigating the systemic risks posed by its system and services. A ‘positive opinion’ should be accompanied by comments where the auditor wishes to include remarks that do not have a substantial effect on the outcome of the audit. A ‘negative opinion’ should be given where the auditor considers that the provider of the very large online platform or of the very large online search engine does not comply with this Regulation or the commitments undertaken. Where the audit opinion could not reach a conclusion for specific elements that fall within the scope of the audit, an explanation of reasons for the failure to reach such a conclusion should be included in the audit opinion. Where applicable, the report should include a description of specific elements that could not be audited, and an explanation of why these could not be audited.

Recital 93

R

Recipient of the service

The recipient of the service is the person or organisation using an intermediary service, whether to seek information or to share it.

‘recipient of the service’ means any natural or legal person who uses an intermediary service, in particular for the purposes of seeking information or making it accessible;

Article 3 - Paragraph 1

Recommender system

A recommender system is the automated logic a platform uses to highlight, rank, or suggest content to users.

‘recommender system’ means a fully or partially automated system used by an online platform to suggest in its online interface specific information to recipients of the service or prioritise that information, including as a result of a search initiated by the recipient of the service or otherwise determining the relative order or prominence of information displayed;

Article 3 - Paragraph 1

S

Shadow banning

Shadow banning occurs when a platform quietly reduces the visibility of a user or their posts without telling them, so they keep posting but others cannot easily see the content.

Restriction of visibility may consist in demotion in ranking or in recommender systems, as well as in limiting accessibility by one or more recipients of the service or blocking the user from an online community without the user being aware (‘shadow banning’). The monetisation via advertising revenue of information provided by the recipient of the service can be restricted by suspending or terminating the monetary payment or revenue associated to that information. The obligation to provide a statement of reasons should however not apply with respect to deceptive high-volume commercial content disseminated through intentional manipulation of the service, in particular inauthentic use of the service such as the use of bots or fake accounts or other deceptive uses of the service. Irrespective of other possibilities to challenge the decision of the provider of hosting services, the recipient of the service should always have a right to effective remedy before a court in accordance with the national law.

Recital 55

Substantial connection to the Union

A substantial connection exists when a provider is established in the EU or clearly targets or serves a significant number of EU users.

‘substantial connection to the Union’ means a connection of a provider of intermediary services with the Union resulting either from its establishment in the Union or from specific factual criteria, such as: — a significant number of recipients of the service in one or more Member States in relation to its or their population; or — the targeting of activities towards one or more Member States;

Article 3 - Paragraph 1

T

Terms and conditions

Terms and conditions are the contractual rules that govern the relationship between a service provider and its users.

‘terms and conditions’ means all clauses, irrespective of their name or form, which govern the contractual relationship between the provider of intermediary services and the recipients of the service;

Article 3 - Paragraph 1

Trader

A trader is any natural or legal person acting for professional purposes, including through agents, when using an intermediary service.

‘trader’ means any natural person, or any legal person irrespective of whether it is privately or publicly owned, who is acting, including through any person acting in his or her name or on his or her behalf, for purposes relating to his or her trade, business, craft or profession;

Article 3 - Paragraph 1

Trusted flagger

A trusted flagger is an entity accredited by a Member State’s Digital Services Coordinator because it meets strict expertise, independence, and diligence requirements, so its notices receive fast-track handling.

The status of ‘trusted flagger’ under this Regulation shall be awarded, upon application by any entity, by the Digital Services Coordinator of the Member State in which the applicant is established, to an applicant that has demonstrated that it meets all of the following conditions:

Article 22 - Paragraph 2

Turnover

Turnover is the total revenue generated by a business, as defined under EU competition law.

‘turnover’ means the amount derived by an undertaking within the meaning of Article 5(1) of Council Regulation (EC) No 139/2004.

Article 3 - Paragraph 1

V

Very Large Online Platform (VLOP)

Very Large Online Platforms (VLOPs) are online platforms with an average of 45 million or more monthly active recipients in the EU (a threshold that can be adjusted over time) that are subject to enhanced obligations under the DSA.

Platforms are explicitly designated by the European Commission as VLOPs when they reach this threshold, which then triggers additional compliance requirements including risk assessments, transparency obligations and oversight.

The designation process and obligations are parallel to those for VLOSEs, with both categories defined together.

This Section shall apply to online platforms and online search engines which have a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, and which are designated as very large online platforms or very large online search engines pursuant to paragraph 4.

Article 33 - Paragraph 1

Very Large Online Search Engine (VLOSE)

Very Large Online Search Engines (VLOSEs) are online search engines with an average of 45 million or more monthly active recipients in the EU (a threshold that can be adjusted over time) that are subject to enhanced obligations under the DSA.

Search engines are explicitly designated by the European Commission as VLOSEs when they reach this threshold, which then triggers additional compliance requirements including risk assessments, transparency obligations and oversight.

The designation process and obligations are parallel to those for VLOPs, with both categories defined together.

This Section shall apply to online platforms and online search engines which have a number of average monthly active recipients of the service in the Union equal to or higher than 45 million, and which are designated as very large online platforms or very large online search engines pursuant to paragraph 4.

Article 33 - Paragraph 1

Vetted researcher

A vetted researcher is an academic or similar expert whom a Digital Services Coordinator officially recognises as meeting strict independence, transparency, and data-protection safeguards, unlocking deeper platform data access for systemic-risk studies.

Upon a duly substantiated application from researchers, the Digital Services Coordinator of establishment shall grant such researchers the status of ‘vetted researchers’ for the specific research referred to in the application and issue a reasoned request for data access to a provider of very large online platform or of very large online search engine pursuant to paragraph 4, where the researchers demonstrate that they meet all of the following conditions:

Article 40 - Paragraph 8