Abstract
The EU’s General Data Protection Regulation (GDPR) is poised to present major challenges in bridging the gap between law and technology. This paper reports on a workshop on the deployment, content, and design of the GDPR that brought together academics, practitioners, civil-society actors, and regulators from the EU and the US. The discussions aimed to advance current knowledge on interpreting abstract legal terms in the context of applied technologies and on best practices that follow the state of the art. Five themes were discussed: the state of the art, consent, de-identification, transparency, and development and deployment practices. Four transversal conflicts were identified, and research recommendations were outlined to reconcile them.
Notes
- 1. Cf. also the US National Privacy Research Strategy, https://www.nitrd.gov/cybersecurity/nationalprivacyresearchstrategy.aspx.
- 2.
- 3.
- 4.
- 5. E.g. 30 years of EPIC’s work: https://www.epic.org/algorithmic-transparency/.
Acknowledgments
This work was partially funded by the European Union’s Horizon 2020 programme under grant agreements no. 740829 and no. 778615, the Luxembourg National Research Fund project PETIT under grant agreement no. 10486741, the National Science Foundation under grant agreements CNS-1330596 and SBE-1513957, DARPA and the Air Force Research Laboratory under the Brandeis privacy initiative, grant agreement no. FA8750-15-2-0277, the Research Foundation Flanders, the Research Fund KU Leuven, and the KUL-PRiSE research project.
The views and conclusions contained herein are those of the authors and should not be interpreted as necessarily representing the official positions, policies or endorsements, either expressed or implied, of the institutions they are affiliated with, the EDPS, the National Science Foundation, DARPA, the Air Force Research Laboratory or the US Government.
We thank Ian Oliver, Jef Ausloos, and the APF reviewers for their valuable input and comments, and all workshop participants for their contributions and stimulating discussions.
Cite this paper
Schiffner, S. et al. (2018). Towards a Roadmap for Privacy Technologies and the General Data Protection Regulation: A Transatlantic Initiative. In: Medina, M., Mitrakas, A., Rannenberg, K., Schweighofer, E., Tsouroulas, N. (eds) Privacy Technologies and Policy. APF 2018. Lecture Notes in Computer Science, vol. 11079. Springer, Cham. https://doi.org/10.1007/978-3-030-02547-2_2