Tuesday Coffee Seminar - Dark Patterns and the GDPR

The seminar is open for everyone, and there is no registration. Coffee and tea will be offered for attendants. See you there!

Dark Patterns and the GDPR - Limits to exploiting users' bounded rationality

Providers of digital services such as Facebook, Google, and Microsoft often ask their users for consent. The ability to consent or object to the processing of personal data is widely regarded as the foremost means of protecting privacy. However, behavioral studies find that people do not act rationally when giving consent. Instead, they possess only "bounded rationality", relying on biases and heuristics - such as Loss Aversion, Confirmation Bias, and Recommendation Bias. Providers of digital services are well aware of these limits to rational thinking. They exploit them by nudging people to consent through so-called Dark Patterns (or "abusive design").

Contrary to these findings, data protection and consumer law still build upon the idea of the rational user. The General Data Protection Regulation (GDPR), for example, places consent at the center of its justifications for data processing, creating, in effect, a notice-and-consent regime. However, the GDPR also establishes privacy by design (PbD) and privacy by default (PbDefault) as new principles. These principles seem to take up behavioral findings, for instance by prohibiting opt-out defaults.

In this talk, Quirin explores whether this goes far enough, and how PbD and PbDefault could prevent providers from exploiting users' bounded rationality. What could and should be done to protect users more?

Lecturer: Quirin Weinzierl

Ph.D. candidate at Speyer University (Germany). Visiting Ph.D. researcher at NRCCL/SERI. LL.M. from Yale University.

Organizer

Senter for rettsinformatikk (NRCCL)
Published Sep. 23, 2019, 15:04 - Last modified Sep. 23, 2019, 15:25