Online services such as websites, social media, mobile and IoT apps provide user interfaces that purport to let users control their personal data. While Data Protection and Consumer law set high-level principles applicable to such interfaces, online service providers still have a large design space in which to test various interfaces on their users. This situation gave rise to manipulative tactics in UX/UI commonly known as dark patterns, or deceptive design. Today, policy makers and regulators worldwide are concerned. First, dark patterns are difficult to define -- the line between nudging techniques that may be perceived as acceptable marketing strategies and intentional deception causing detriment to users is often blurred. Second, it is unclear what constitutes acceptable evidence for regulators to demonstrate the presence of dark patterns and sustain their legal proceedings. This panel aims to discuss unified definitions of dark patterns and to analyze which evidence can be legally relevant for regulators seeking to protect data subjects from such manipulative practices.
• How do existing and upcoming Data Protection and Consumer laws regulate the presence of dark patterns?
• Are there unified and acceptable definitions of dark patterns for policy makers and regulators?
• What kinds of evidence of dark patterns have so far been acceptable to policy makers and regulators?
• Which empirical research insights can help gather evidence of dark patterns?