CPDP2017 will stage more than 60 panels and workshops with a stimulating mix of academics, practitioners, regulators and advocates, as well as multiple side events such as open debates, PechaKucha performances and artistic interventions. Please note that this is a preliminary version of the programme which is still in its early stages. Accordingly, some panels may change or be rescheduled.
Wednesday 25 January 2017 • LA CAVE
academic •• policy ••• business •
organised by CPDP
Chair Patrick Penninckx, Council of Europe (INT)
Moderator Gus Hosein, Privacy International (UK)
Panel Glyn Moody, Ars Technica (UK), Catherine Van De Heyning, University of Antwerp (BE), Pat Walshe, Privacy Matters (UK)
The crypto wars continue to simmer and deserve renewed pressure and attention. Yet governments are not standing still while end-to-end cryptography is deployed by just a few companies. Rather, they are exploring and using new powers, notably hacking. This includes orders to companies to assist with hacking, such as the technical capability notices in the new UK legislation, and as we saw arise in a different form with the Apple v FBI fight and the NSO Group story in Bahrain and Mexico. What are, and what should be, the boundaries of these new powers?
10.00 - Coffee break
academic •• policy •• business ••
organised by TYPES project
Chair Townsend Feehan, IAB Europe (BE)
Moderator Nikolaos Laoutaris, Telefónica and Data Transparency Lab (ES)
Panel John W. Byers, Boston University (US), Christopher W. Clifton, Purdue University (US), Will DeVries, Google (US), Pedro Martin Jurado, Spanish Ministry of Industry (ES), Claire Levallois-Barth, Telecom ParisTech (FR)
The lack of awareness among citizens regarding the management of personal information, and their increasing concern about privacy and data protection, pose a serious risk to the sustainable economic growth of online services. In 2015, a Eurobarometer study revealed that 63% of EU citizens do not trust online businesses (search engines, online social networks, e-mail services), and that more than half of citizens neither like providing personal information in return for free services nor appreciate the use of their personal data for targeted advertising. These numbers implicitly demand more transparency over the use of personal data in online services. The recently adopted General Data Protection Regulation (GDPR) aims at easing the current situation by enhancing users’ control over their personal data. In light of the concerns this situation raises for the online advertising sector, the panel will address the following questions:
academic •• policy ••• business •
organised by Future of Privacy Forum
Moderator Kelsey Finch, FPF (US)
De-identification—the process of modifying personal data to ensure that data subjects are no longer identifiable—is one of the primary measures that organizations use to protect privacy. Over the past few years, however, computer scientists and mathematicians have demonstrated that de-identification is not foolproof. At the same time, organizations around the world necessarily continue to rely on a wide range of technical, administrative and legal measures to reduce data identifiability. The GDPR recognizes the concept of pseudonymization, albeit with limited legal implications compared to its stricter relative, anonymization. This session draws on The Brussels Privacy Symposium, which has generated scholarship on technical, policy, economic and ethical aspects of the de-identification debate.
policy •••• business ••
organised by CPDP
Chair Stewart Dresner, Privacy Laws & Business (UK)
Moderator Fanny Hidvegi, Access Now (INT)
What are the implications for privacy of the growing electoral strength of populist politicians and parties within Europe and among Europe's trading partners and neighbours? What are the most important privacy-related changes we can expect from populist national leaders and governments that will affect policies, businesses, and individuals? What sectors will be most affected? How will populist politics and politicians project their power internationally in ways that impact privacy? What sorts of privacy-related domestic and international conflicts can we expect? How are businesses coping? What new academic work is being carried out, or is called for, in this area? This session will provide a forum for the panelists and the audience to share and compare our ideas, predictions, and responses to an issue that is already on many of our minds.
15.45 - Coffee break
academic • policy ••• business ••
organised by BSA | The Software Alliance
Chair Bruno Gencarelli, DG JUST (EU)
Moderator Guido Lobrano, Business Europe (EU)
This panel will focus on the recent legal challenges to data transfers from Europe to the rest of the world: from the Schrems II case on the use of Standard Contractual Clauses to the recent formal complaints brought by Digital Rights Ireland and La Quadrature du Net seeking to annul the European Commission's Privacy Shield implementing decision, and what these could entail for global data transfer mechanisms. After a brief explanation of the various challenges and the transfer tools put into question, we will focus on the implications these challenges may have if they were to succeed. This panel will allow for a timely and very topical discussion of a series of ongoing legal developments that may have a profound impact on the future of Europe and its economy.
academic ••• policy •• business •
organised by the Institute for Information Law (IViR), University of Amsterdam (NL)
Chair Katja de Vries, VUB-LSTS (BE)
Moderator Frederik Zuiderveen Borgesius, IViR, University of Amsterdam (NL)
Lenders use profiling to estimate a consumer’s creditworthiness. Lenders have legitimate reasons to adapt interest rates for certain consumers, or to refuse to lend to them. Increasing amounts of information (‘big data’) are becoming available for profiling: one UK online lender uses up to 8,000 data points to assess, automatically, a consumer’s creditworthiness. However, automatically deciding whether a consumer is granted credit raises problems. First, profiling-based decisions are often incorrect for a particular consumer. Second, profiling is opaque: consumers may not know why they are denied services or why they have to pay a higher interest rate. Third, profiling can discriminate unintentionally, for instance when an algorithm learns from data reflecting biased human decisions.
18.30 - Cocktail sponsored by EPIC in Le Village