
NO.032 Privacy by Transparency for Data-Centric Services

Shonan Village Center

August 6-8, 2013 (Check-in: August 5, 2013)

Organizers

  • Prof. Dr. Isao Echizen
    • National Institute of Informatics, Japan
  • Prof. Dr. Günter Müller
    • University of Freiburg, Germany
  • Prof. Dr. Ryoichi Sasaki
    • Tokyo Denki University, Japan
  • Prof. Dr. A Min Tjoa
    • Vienna University of Technology, Austria

Overview

The objective of the seminar is to close an existing “expressivity” gap between privacy requirements and policy compliance for data-centric services by means of transparency mechanisms. The evolution of privacy and security mechanisms occurs in distinguishable steps following the progress of technology. Ongoing IT initiatives clearly show that transparency is the next essential factor and will play a prominent role in this evolution of privacy.

“Access control” is the most widely used metaphor for modeling security. As a consequence, authentication becomes the sole basis for protecting private data, with the advantage of relatively easy control. In the early 1990s, data minimization at authentication time was the means to protect privacy (Müller, Rannenberg). Data minimization generated a set of successful mechanisms, the best-known examples being digital signatures, public key infrastructures, and identity management (Wohlgemuth, Müller). In the most extreme case, anonymization (Chaum) omits data for authentication entirely, while “secure multiparty computation” (SMC), proposed in 1983 by Dolev and Yao, can be used to minimize personal data down to an agreed limit.

The reason for the disappointing acceptance of access control mechanisms for privacy in modern data-centric services is characterized by Acquisti’s “privacy paradox”, which explains the gap between users’ awareness and their actual actions with regard to privacy enforcement. The reasons range from cumbersome, hard-to-use mechanisms to a lack of trust, but in essence it has been shown that not the access to data but its usage is the real concern (Müller). In 2004, Park and Sandhu’s usage control model addressed this gap. The related mechanisms encompass privacy policy languages such as P3P (Wenning, Schunter) and its Freiburg variant ExPDT (Sackmann, Kähmer). Sticky policies (Karjoth), secure logging (Accorsi), and data provenance (Haas, Wohlgemuth) are early examples of Transparency Enhancing Technologies (TETs). With the advent of the innovative business opportunities of “Big Data”, it has become obvious that transparent usage control is the most promising approach if privacy is to be protected.
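To illustrate the shift from access control to usage control, the following Python sketch attaches a sticky policy to a data item and evaluates every use of the data against it, not only the initial access. This is a minimal sketch under assumed, hypothetical field names (allowed_purposes, max_uses), not the UCON formalization of Park and Sandhu:

    # Sticky-policy usage control: the policy travels with the data,
    # and each *use* of the data is authorized individually.
    from dataclasses import dataclass

    @dataclass
    class StickyPolicy:
        allowed_purposes: set   # hypothetical: purposes the user signaled
        max_uses: int           # hypothetical obligation: bounded usage
        uses: int = 0           # mutable attribute, updated on each use

    @dataclass
    class DataItem:
        value: str
        policy: StickyPolicy    # the policy is attached to the data itself

    def use(item: DataItem, purpose: str) -> str:
        """Authorize a single usage of the data for the given purpose."""
        p = item.policy
        if purpose not in p.allowed_purposes:
            raise PermissionError(f"purpose '{purpose}' not covered by policy")
        if p.uses >= p.max_uses:
            raise PermissionError("usage quota exhausted")
        p.uses += 1             # attribute update: the core idea of usage control
        return item.value

    record = DataItem("alice@example.org",
                      StickyPolicy(allowed_purposes={"billing"}, max_uses=2))
    print(use(record, "billing"))   # permitted
    # use(record, "marketing")      # would raise PermissionError

Unlike a one-time access check, the decision here depends on mutable state (the usage counter), which is precisely what classic authentication-based access control cannot express.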

Transparency mechanisms strike a better balance between the interests of users and industry by offering “signaling” and “screening” functions. While “signaling” means specifying the privacy rules or policies under which services are to be conducted, “screening” encompasses all mechanisms for checking the enforcement of the signaled rules. The range of topics involved in realizing transparency is shown in the figure below, which distinguishes between user interfaces (UI) and mechanisms (Mechs). The dashboard is taken as an instrument to visualize adherence to privacy policies, while monitoring and auditing indicate the available alternatives for observing or proving compliance.

[Figure: Transparency topics]
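To make the signaling/screening distinction concrete, the following Python sketch treats the signaled policy as simplified, P3P-like statements and screening as a check of recorded data uses against them. All field names here are assumptions for illustration, not part of P3P or any other standard:

    # "Signaling": the service publishes machine-readable privacy rules,
    # in the spirit of (but much simpler than) P3P statements.
    signaled_policy = {
        "email":    {"purposes": {"billing"},  "retention_days": 30},
        "location": {"purposes": {"delivery"}, "retention_days": 1},
    }

    # "Screening": an observer checks the service's recorded data uses
    # against the signaled rules and reports every violation.
    usage_log = [
        {"data": "email",    "purpose": "billing",   "retained_days": 14},
        {"data": "email",    "purpose": "marketing", "retained_days": 14},
        {"data": "location", "purpose": "delivery",  "retained_days": 7},
    ]

    def screen(policy, log):
        violations = []
        for entry in log:
            rule = policy.get(entry["data"])
            if rule is None:
                violations.append((entry, "no policy signaled for this data"))
            elif entry["purpose"] not in rule["purposes"]:
                violations.append((entry, "purpose not signaled"))
            elif entry["retained_days"] > rule["retention_days"]:
                violations.append((entry, "retention limit exceeded"))
        return violations

    for entry, reason in screen(signaled_policy, usage_log):
        print(f"violation: {entry} -> {reason}")
    # flags the marketing use of email and the over-retained location data

The "expressivity gap" mentioned above shows up directly in such a sketch: anything the policy vocabulary cannot state (here, only purposes and retention) cannot be screened for.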

The necessary monitoring can take place before or after execution of services. If it is performed afterwards it is called “auditing” using forensic methods. Dashboards are the user interface to control privacy. As a necessity future appliance such as cars, household devices, or other cyber-physical systems will have to employ dashboards for monitoring their status. Dashboards, as offered in todays online social networks (OSN) experience a significant lack of trust by users, pointing a lack of transparency. At MIT, Brynjolfsson convincingly showed that data centric services have a positive impact on economic welfare, but maybe jeopardized by a lack of trust in privacy mechanisms. Other initiatives, like “Smarter Planet” of IBM, “Internet of Things” by the European Union, and “Autonomous Computing” in the USA, are technologies where transparency mechanisms are envisioned to assure privacy.
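To give one concrete flavor of “auditing” with forensic methods: secure-logging schemes in the spirit of Accorsi [1] chain log entries cryptographically so that an auditor can later verify that the usage record was not altered after the fact. The following is a minimal hash-chain sketch in Python, illustrating only the tamper-evidence idea rather than the architecture of [1]:

    import hashlib, json

    def append(log, event):
        """Append an event, chaining it to the digest of the previous entry."""
        prev = log[-1]["digest"] if log else "0" * 64
        entry = {"event": event, "prev": prev}
        entry["digest"] = hashlib.sha256(
            (prev + json.dumps(event, sort_keys=True)).encode()).hexdigest()
        log.append(entry)

    def audit(log):
        """Ex-post check: recompute the chain; any tampering breaks it."""
        prev = "0" * 64
        for i, entry in enumerate(log):
            expected = hashlib.sha256(
                (prev + json.dumps(entry["event"], sort_keys=True)).encode()
            ).hexdigest()
            if entry["prev"] != prev or entry["digest"] != expected:
                return f"log tampered at entry {i}"
            prev = entry["digest"]
        return "log verified"

    log = []
    append(log, {"data": "email", "purpose": "billing"})
    append(log, {"data": "email", "purpose": "support"})
    print(audit(log))                         # -> log verified
    log[0]["event"]["purpose"] = "marketing"  # retroactive manipulation...
    print(audit(log))                         # -> log tampered at entry 0

A deployable scheme would additionally use keyed MACs with evolving keys, since a plain hash chain can be recomputed by whoever tampers with the log; the sketch only shows why ex-post auditing needs cryptographic support at logging time.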

The goal of this NII Shonan Meeting is, on the one hand, to set thematic milestones for the technical implementation of transparency and, on the other, to trace ways in which technical progress, users, and industry could profit from transparency. Specifically, at the technical level, this NII Shonan Meeting (a) determines the current expressivity gaps between privacy requirements and policy specification languages, (b) compares existing mechanisms for testing adherence to privacy policies, and (c) identifies ways in which monitoring and audit could be combined into a “continuous auditing”. At the deployment level, this NII Shonan Meeting (a) lists tangible business models for transparency mechanisms, (b) sketches guidelines on how to carry transparency over to different scenarios, e.g., smart grids, e-home, and cyber-physical systems, and (c) categorizes the requirements for a privacy-sensitive and user-friendly design of dashboards. While the European organizers have a strong background in (a) Privacy Enhancing Technologies (PET) and (b) usage control, the Japanese organizers have internationally recognized competence in (c) digital rights, (d) critical infrastructures, and (e) public security, and the participants to be invited comprise experts contributing to the technical objectives as well as to the deployment issues.

Appendix: Key References

[1] R. Accorsi: A secure log architecture to support remote auditing. Mathematical and Computer Modelling, Elsevier, doi:10.1016/j.mcm.2012.06.35, 2012.

[2] D. Basin, M. Harvan, F. Klaedtke, E. Zalinescu: MONPOLY: Monitoring Usage-Control Policies. RV, 360-364, 2011.

[3] D. Chaum: Untraceable Electronic Mail, Return Addresses, and Digital Pseudonyms. Communications of the ACM 24(2), ACM, 84-88, 1981.

[4] D. Dolev and A. C. Yao: On the Security of Public Key Protocols. IEEE Transactions on Information Theory 29(2), IEEE Press, 198-208, 1983.

[5] S. Haas, S. Wohlgemuth, I. Echizen, N. Sonehara, and G. Müller: Aspects of Privacy for Electronic Health Records. Int. Journal of Medical Informatics, Special Issue: Security in Health Information Systems 80(2), Elsevier, e26-e31, 2011.

[6] L. Kagal and H. Abelson: Access Control is an Inadequate Framework for Privacy Protection. W3C Workshop on Privacy for Advanced Web APIs, 2010. Available at http://www.w3.org/2010/api-privacy-ws/papers/privacy-ws-23.pdf

[7] G. Karjoth, M. Schunter, and M. Waidner: Platform for Enterprise Privacy Practices: Privacy-enabled Management of Customer Data. 2nd Workshop on Privacy Enhancing Technologies. LNCS 2482, Springer, 69-84, 2003.

[8] G. Müller and K. Rannenberg (eds.): Multilateral Security in Communications - Technology, Infrastructure, Economy. Addison-Wesley, 1999.

[9] J. Park and R. Sandhu: The UCONABC Usage Control Model. ACM Transactions on Information and System Security 7(1), ACM, 128-174, 2004.

[10] S. Sackmann and M. Kähmer: ExPDT: A Policy-based Approach for Automating Compliance. Wirtschaftsinformatik 50(5), Gabler, 366-374, 2008.

[11] A. Schröpfer, F. Kerschbaum, and G. Müller: L1 - An Intermediate Language for Mixed-Protocol Secure Computation, COMPSAC ’11 Proceedings of the 2011 IEEE 35th Annual Computer Software and Applications Conference, IEEE Press, 298-307, 2011.

[12] N. Sonehara, I. Echizen, S. Wohlgemuth, G. Müller, and A. Tjoa (eds.): Proceedings of the International Workshop on Information Systems for Social Innovations (ISSI) 2009, http://www.nii.ac.jp/issi, National Center for Sciences, 2009.

[13] K. Takaragi, R. Sasaki, and S. Singai: A Probability Bounds Estimation Method in Markov Reliability Analysis. IEEE Transactions on Reliability 3(35), IEEE Press, 257-261, 1985.

[14] R. Wenning and M. Schunter (eds.): The Platform for Privacy Preferences 1.1 (P3P1.1) Specification. W3C Working Group Note, 13 November 2006. Available at http://www.w3.org/TR/P3P11/.


Report

No-032.pdf