Published in May 2019 by Data Guidance (with a 30-day exclusivity period)
Although major legislative frameworks such as the General Data Protection Regulation (Regulation (EU) 2016/679) (‘GDPR’) and Directive 2002/58/EC on Privacy and Electronic Communications (‘ePrivacy Directive’) may represent the European Commission’s (‘the Commission’) general approach to regulating personal data, the Commission has recognised the multi-faceted and complex reach of data use, particularly with regard to business practice. This is evident in the Commission’s publication of various proposals for legislative reform, reports and discussions on uses of personal data in areas such as consumer protection, competition, ethics and artificial intelligence (‘AI’). Geraldine Proust, Director for Legal Affairs at the Federation of European Direct and Interactive Marketing (‘FEDMA’), discusses the intricacies of this multi-pronged approach from the Commission.
Have you noticed how data protection is about more than the GDPR and the ePrivacy Directive? There now exists a deeper understanding of data protection law and how it interacts with ethics. Indeed, European legislators, in their efforts to protect consumers, have developed increasingly intricate laws in this context. Acknowledging this is essential if the industry is to play a key role in increasing consumer trust and keeping the EU economy competitive. Other legal frameworks, such as consumer and competition law, are evolving towards upholding data protection and privacy. Major complaints and political circumstances are also driving debate around profiling, ethics and the ‘manipulation model.’ We believe that going back to the basics of the GDPR and multi-stakeholder dialogue is the way forward.
Legal intricacies of the GDPR and consumer law
The Council of the European Union adopted the Directive on Certain Aspects Concerning Contracts for the Supply of Digital Content and Digital Services (Directive (EU) 2019/770) (‘the Digital Content Directive’) on 15 April 2019. The Digital Content Directive aims to provide remedies to consumers in cases where the digital content or service (e.g. a social media account) does not comply with the relevant contract. Interestingly, these contractual rights will apply equally to consumers who provide personal data for such content or services and to ‘paying’ consumers. Personal data is protected by a fundamental right; it cannot be used as counter-performance or traded against digital content or services1. Yet, the provision of personal data triggers the benefit of contractual rights for consumers to have remedies in the case of faulty content or services. This is tricky, as there is now tension between the law and the data-driven economy, and between consumer protection and the contractual freedom of enterprises. The European Data Protection Board’s draft Guidelines 2/2019 on the Processing of Personal Data Under Article 6(1)(b) [of the] GDPR in the Context of the Provision of Online Services to Data Subjects currently reinforce this delicate situation2. Indeed, the personal data necessary for the performance of a contract or service is narrowly interpreted, and the processing of any data not necessary for the performance of a contract should rely on another legal basis, such as legitimate interest or consent. However, despite the risk-based approach of the GDPR, reflected in Article 7(4), the extent of ‘free’ consent is still debated, notably in the context of the ePrivacy Directive proposal (the so-called ‘cookie-wall’ discussion). An equilibrium needs to be found, and referral questions to the European courts are foreseeable.
The Digital Content Directive also sets out a number of exclusions from its scope.
However, contract law remains the competence of Member States. The Digital Content Directive is without prejudice to the GDPR and, in case of conflict, the GDPR takes precedence (e.g. the processing of data upon termination of the contract, and portability)3. At the same time, the Digital Content Directive takes a broader view, including when the trader must refrain from using consumer-generated digital content (see Recital 65 of the Digital Content Directive) and what digital content consumers have a right to retrieve (in addition to personal data).
Even though the GDPR takes precedence, delicate situations are likely to arise. Indeed, facts leading to a lack of compliance with the requirements of the GDPR, including core principles such as data minimisation and data protection by design and by default, may, depending on the circumstances of the case, be considered to constitute a lack of conformity of the digital content or digital service with the subjective or objective requirements for conformity provided for under Recital 48 of the Digital Content Directive. Monitoring the implementation of the Digital Content Directive over the next two years is advisable, both because of its links with the GDPR and because the notion of a contract relies on different national definitions in the various Member States.
The Commission’s New Deal for Consumers4 is composed of two proposals: one on the better enforcement and modernisation of EU consumer protection rules, and the other on representative actions for the protection of the collective interests of consumers5. On the first proposal, the European Parliament and the Council reached a provisional agreement on 21 March 2019. The European Parliament adopted this provisional agreement in its plenary on 17 April 2019, but at the time of publication, approval from the EU Council of Ministers is still required.
The first proposal provides for the extension of the Directive on consumer rights, amending Council Directive 93/13/EEC and Directive 1999/44/EC of the European Parliament and of the Council and repealing Council Directive 85/577/EEC and Directive 97/7/EC of the European Parliament and of the Council (Directive 2011/83/EU)6 (‘the Consumer Rights Directive’) to contracts for digital services under which the consumer provides personal data to the trader without paying a price7. Thanks to the Consumer Rights Directive, consumers have a right to withdraw from a contract for digital content or services within 14 days. This applies in a manner consistent with the Digital Content Directive, although Recital 35 of the Digital Content Directive provides that Member States are free to extend the application of the rules of the Consumer Rights Directive to situations excluded from the scope of the Digital Content Directive. Hence the need to monitor the implementation of the Digital Content Directive. The first proposal also adds a new information requirement to the Consumer Rights Directive, requiring traders to inform the consumer when the price is personalised on the basis of automated decision-making8, stating that ‘[t]his information requirement should not apply to techniques such as “dynamic” or “real-time” pricing that involves changing the price in a highly flexible and quick manner in response to market demands when it does not involve personalisation based on automated decision making. This information requirement is without prejudice to [the GDPR], which provides, inter alia, for the right of the individual not to be subjected to automated individual decision-making, including profiling.’
Regarding representative actions, the second proposal aims to establish a minimum level of harmonisation for representative actions (collective actions, whether for redress9 or injunction), including under the GDPR, thereby limiting the choice provided to Member States by the GDPR10. At the time of publication, the second proposal is still in the decision-making process.
Examples of further intricacies
There is a concern that data is being used by some companies to exploit or exclude competitors from the opportunities of AI and other data-based innovation. The report, Competition Law and Data11, jointly produced by the German Federal Cartel Office (‘Bundeskartellamt’) and the French Competition Authority (Autorité de la concurrence), states: ‘Two aspects, of particular relevance when looking at data’s contribution to market power, can be identified: the scarcity of data or ease of replicability, on the one hand; whether the scale or scope of data collection matters, on the other.’ According to the European Commissioner for Competition, Margrethe Vestager, what is key is that the Commission is already looking very closely at whether companies are using their control of data to harm competition12. Due to shared concerns around the implications for competition when a handful of companies control data that others need to compete in the market, special advisors to the Commission put forward ideas in a report on what digitisation means for consumers, and how competition law and enforcers should respond to the challenges of the new digital era13. In particular, the report, Competition Policy for the Digital Era14, sets out a number of proposals in this regard.
A key development in this context is a decision taken by the Bundeskartellamt15 against Facebook. The Bundeskartellamt considered that Facebook had abused its dominant position in the German market for social networks, because ‘[a]ll data collected on the Facebook website, by Facebook-owned services such as e.g. WhatsApp and Instagram and on third party websites can be combined and assigned to the Facebook user account.’ Although there is no financial harm to the consumer, consumers are considered to suffer from a loss of control over their personal data and from such data being shared.
The European Data Protection Supervisor (‘EDPS’) has raised concerns around individuals being at risk of being lost and defined only by data and algorithms. Consequently, the EDPS proposed the establishment of a Digital Clearinghouse to bring together competition, consumer and data protection agencies to discuss how best to enforce current legislation. All regulators in the digital space, whether based in the EU or around the world, are invited to take part in the discussions. In December 2018, authorities debated the ‘deceptive framing of a free offer as unfair practice, the opportunity to adopt structural remedies able to provoke a change in the business models, asymmetric regulation of access to data and its impact on competitive dynamics, essential facility theory applied to the specificities of data resources and misuse of the data protection framework to [obstruct] investigations by national authorities including competition agencies16.’ The authorities agreed ‘to continue discussions on developing a methodology to assess the real costs [in cases] where the monetary cost of services is zero or below marginal cost17.’
‘Surveillance or manipulation model,’ profiling, and ethics
The EDPS published an Opinion on online manipulation and personal data18 (‘the Opinion’), which voices concern ‘with the way personal information is used in order to micro-target individuals and groups with specific content, the fundamental rights and values at stake, and relevant laws for mitigating the threats.’ Some authorities, such as the UK Information Commissioner’s Office (‘ICO’), are particularly active in this area too19. With the European Parliament elections approaching, the Commission published an electoral package in September 2018, including a Code of Practice on Disinformation20. Moreover, the Commission has a High-Level Expert Group on AI, which published its Ethics Guidelines for Trustworthy AI21 in April 2019.
At the CPDP 2019 conference22, academics discussed the right to fair or reasonable data inference in the context of profiling. Activists called on consumers to use their data subject access rights to request access to inferred data. In some cases, data subject access rights are leading to major complaints, such as those made by None Of Your Business (‘NOYB’). Most of these cases challenge the legal basis (whether consent or legitimate interest) for processing data for profiling purposes. The recent decision by the French data protection authority (‘CNIL’) regarding Google23, in response to one of the NOYB complaints, focused on the GDPR’s principles of minimisation and transparency, and its risk-based approach. One could remain optimistic, as a better perception by the consumer of the value exchange will reinforce the protection of individuals and consumer choice24. Across all countries, control, trust and transparency form the foundation of a healthy data economy. FEDMA notes that 88% of consumers cite transparency as the key to trusting organisations25; improving transparency and control for people will therefore put companies in a much stronger position to engage them within the data economy. This strikes at one of the core principles of the GDPR: ‘accountability.’
Back to the basics – the GDPR’s risk-based approach and accountability
The GDPR provides that data processing must be fair and that controllers are accountable. A risk-based approach allows for flexibility and for market diversity too, among both paid and free services. Processing is fair when the controller asks itself ‘what if this was my data, would I consider the processing fair? Would I reasonably expect this processing to take place?’ Additionally, effective transparency drives consumer trust. A risk-based approach would acknowledge consent as one legal basis, whilst legitimate interest combined with effective transparency can also benefit the consumer26. The possibility for a controller to process personal data for its legitimate interest must be the result of a legitimate interest assessment. This assessment must go through three key stages: identifying the legitimate interest pursued (the purpose test), establishing that the processing is necessary to achieve that purpose (the necessity test), and weighing that interest against the interests, rights and freedoms of the individual (the balancing test).
The balancing test must always be conducted fairly, taking into account the nature of the interests (e.g. the reasonable expectations of the individual), the impact of the processing (e.g. the positive and negative impacts on the individual) and any safeguards which are or could be put in place (e.g. data minimisation, de-identification, technical and organisational measures)27. When carrying out such assessments, it should be noted that profiling for advertising purposes, unlike profiling for political purposes, does not have a legal effect on an individual28. It is important to remember that ‘nothing is wrong with data and digital29.’ Our industry aims at a constructive dialogue with the European institutions and authorities. FEDMA, the only European association with a Code of Conduct approved by data protection authorities at the time of publication, is updating its Code of Conduct on the processing of personal data for direct marketing purposes.
To conclude, I encourage you to follow the new European Parliament and the Commission, and their upcoming programmes to legislate on the digital economy. The Directive on Certain Legal Aspects of Information Society Services, in Particular Electronic Commerce, in the Internal Market (Directive 2000/31/EC) (‘the E-commerce Directive’) is very likely to be reviewed.
Geraldine Proust, Director for Legal Affairs, FEDMA