What every marketer needs to know about the DSA

The Digital Services Act, or DSA, is a new European Union regulation designed to protect the fundamental rights of internet users. 

It regulates the activities of: 

  • Intermediary services: internet service providers, browsers, DNS/VPN services, caching or web hosting

  • Online platforms: Amazon, eBay, AliExpress, and social networks, including chat applications

  • Search engines: Google, Bing

  • App stores: Google Play, App Store

  • Online platforms for travel and accommodation: Airbnb, Booking.com 

Marketing is also affected by the DSA: the regulation brings many restrictions on promotion and advertising. To avoid wasting time and money on non-compliant advertisements, it’s a good idea to know your rights and responsibilities. 

Let’s take a look at what is changing for the giants of the online space, but also for marketers.



No more personalised content

The DSA requires online platforms to guarantee greater transparency and give consumers more control over what they see. Very large online platforms must therefore let users turn off personalised content for their account, and users should also have access to information about why particular content is being shown to them. These rules stem from strengthened privacy protection, and because they cover advertisements on social networks, they can create targeting problems for marketers. 


Combating toxic content

In the past, harmful content was often overlooked by online platforms. Now, thanks to the DSA, platforms are being pushed to actively filter such content. One of their obligations is to give users a simple system for reporting illegal content. The platform must take note of these reports, and if it decides not to remove the reported content, it must give a reason.

For marketers, however, this doesn’t change much about content moderation on social media: a significant part of the fight against harmful content on their own profiles remains their responsibility. Companies like elv.ai have long supported media outlets, businesses and institutions in the fight against inappropriate content in the discussions under their posts. 

It is important for brands to engage in content moderation and in reporting harmful posts. Doing so protects their followers, their employees and society as a whole.


Prohibition of dark patterns

Dark patterns can be defined as unethical tactics used to manipulate consumer behaviour. Have you noticed that “Accept all cookies” is often highlighted in green next to a grey “Reject all cookies”? This, too, is a dark pattern: it tells our subconscious that the highlighted, coloured button is surely the right choice, making us more likely to click it.

The DSA strongly opposes dark patterns by banning them altogether. In this way, it protects users from making uninformed decisions and gives them more freedom to manage their own online space. 

Marketplaces and content moderation
The DSA also applies to you if you manage an online marketplace – in which case you are required to verify the identity of every seller on your portal to prevent the sale of illegal goods.

Moderate your content, including the discussions under your posts. If you have any doubts about content moderation, one of our previous articles may help dispel them. 

If we want to benefit socially from this regulation rather than merely suffer under it, content moderation is the right step towards a healthier and more respectful internet and online community.

At elv.ai, we can help you with content moderation on your marketplace, social media profiles, and web forums in any language. Be sure to reach out to us anytime – we’d be happy to discuss your needs and options with you.

After all, it’s good that the EU is taking a responsible approach to consumer rights. A new era for marketers is beginning: it will be full of challenges, but it also opens more doors to creativity.