I’ve published a new article in the journal Datenschutz Nachrichten. In this article, I argue that privacy and data protection theory and law should abandon their fixation on „personal information“. Based on a historical analysis of this fixation, I illustrate its shortcomings. Finally, I propose to focus instead on the „person-related decision“ as a new basic building block for privacy and data protection theory and law.

Local download (in German): PERSONAL DATA NOT FOUND: Personenbezogene Entscheidungen als überfällige Neuausrichtung im Datenschutz

Finally, my article on „Transparency and Computability vs. Loss of Autonomy and of Control: The Industrialization of Societal Information Processing and Its Consequences“ has been published today in an online, open access and peer-reviewed journal called „Mediale Kontrolle unter Beobachtung“. The article is based on an extended abstract I submitted last year to a workshop on »Privacy and Quantifiability« in Passau and the presentation I gave at this workshop in February 2015.

While the article is in German (and also bears a German title: „Transparenz und Berechenbarkeit vs. Autonomie- und Kontrollverlust: Die Industrialisierung der gesellschaftlichen Informationsverarbeitung und ihre Folgen“), there is an English abstract:

The far-reaching digitization of all areas of life is blatantly obvious, and its individual and societal consequences are the subject of a broad public and scientific debate. The concept of privacy, with its traditional notion of a categorical separation between ‘private’ and ‘public’, is the wrong starting point to describe, analyze or explain these consequences. Instead, a thorough analysis must take as its starting point the processes of modern societal and especially organized information processing, the organizations and institutions they produce, and the power relationship between organizations and the datafied individuals, groups and institutions.

After the first industrialization (of physical or manual labor), a second industrialization is now taking place: that of ‘intellectual work’, i.e. of societal information processing. It undermines the old mechanisms of distribution and control of power in society and threatens the bourgeois society’s promise of liberty by structurally rescinding individual and societal areas of autonomy. This very development and its consequences are what Datenschutz is addressing. Its function is the maintenance of contingency for the structurally and informationally weak under the terms of the industrialization of societal information processing and against the superior normalization power of organizations.

Download article from „Mediale Kontrolle unter Beobachtung“ or download a local copy.

On the journal’s website you can also comment on or discuss this article.

Last week, I gave a short talk at the workshop “Surveillant Antiquities and Modern Transparencies: Exercising and Resisting Surveillance Then and Now” at the Excellence Cluster TOPOI in Berlin, which was organized by Tudor Sala and Seda Gürses.

For the workshop I submitted the following abstract:

Modern, organized social information processing – the new surveillance (Gary T. Marx) – is certainly different from older forms of surveillance: within the last three or four decades it has been steadily industrialized. The industrialization of information processing is the process of socialization of ‘mental functions’: they are taken out of the context of the individual and put into formalized, machine-processable procedures (Wilhelm Steinmüller); it’s a transformation of a subjective into an objective process (Andreas Boes et al. with reference to Karl Marx).

Against the backdrop of this industrialization process, the presentation will challenge two assumptions commonly held in privacy research and Surveillance Studies alike: first, that one can ignore the fundamental difference between a community and a society in analyzing surveillance, and second, that social interaction theory is an adequate means for the analysis of modern, industrialized surveillance.

The community–society (Gemeinschaft–Gesellschaft) dichotomy is a very basic sociological concept (Ferdinand Tönnies, Max Weber). Social collectives that are held together by emotional ties, personal social interactions, and implicit rules are called communities, whereas societies are held together by more rational decisions, indirect interactions, impersonal roles, and mechanisms such as contracts and explicit legal rules (Christoph Lutz and Pepe Strathoff). Additionally, due to its increasing complexity, modern society has developed from a segmentary to a stratified-hierarchical and then to a functionally differentiated society (Talcott Parsons, Niklas Luhmann). In contrast, many privacy and surveillance researchers seem to share a strong commitment to identity politics, seeking an abandonment of modern society and a return to community-style social collectives. The question is, however, whether this is an adequate starting point for analyzing modern surveillance.

Most privacy and surveillance theories also share a common theoretical foundation: Erving Goffman’s sociological role theory. According to Goffman, social interaction is “all the interaction which occurs throughout any one occasion when a given set of individuals are in one another’s continuous presence.” Addressing the relationship between organization and individual with this theory is highly problematic, especially if the organization has industrialized its own information processing and decision making – there simply is no interpersonal relationship between a user (or better: usee) and Google’s computers. It is therefore highly implausible that focusing on persons as possible attackers is adequate for understanding modern industrialized information processing.

In the presentation, I talked about three issues: (1) the relationship between the sociological concepts of community and society, (2) the relationship between privacy and surveillance theories based on interpersonal relationships and modern, organized social information processing practices, and (3) the relationship between the division of labor and the concept of informed consent in privacy theories and laws.

The conclusions of my presentation relate especially to the question whether and how we can compare “ancient and modern systems of surveillance”. With respect to the first issue mentioned above, I concluded that we can compare communities with communities, but we should not compare communities with societies. The conclusion for the second issue is very similar: we can compare sensitivities towards or actions against surveillance practices and surveillance systems, but we should not compare pre-industrialized and industrialized surveillance systems and practices. My third conclusion is more related to practical considerations for governing the privacy, surveillance and data protection problem in our society: we can – and we should – demand the education of the public to enhance its understanding of modern information processing, but we certainly should not create a (legal, technical, economic, social) protection system that works only if the data subjects understand modern surveillance practices and the risks they pose for individuals, groups, organizations and society.

I wrote a new short article which I submitted to DANA – Datenschutznachrichten, a quarterly journal on data protection issues published by the Deutsche Vereinigung für Datenschutz. The article will be published in the 3rd or 4th issue of 2015.

Against the backdrop of the history of the purpose limitation principle in privacy and data protection discourse and law, I examine and reevaluate this principle as an artifact of a specific operationalization of privacy and data protection in the law. The term artifact on the one hand refers to something human-made; on the other hand it denotes an – often disturbing – phenomenon that occurs as a result of something like the choice of the method of measurement in social research or the algorithm used in lossy image compression. Here both readings should be merged: I will show that the purpose limitation principle is a secondary product of previous operationalization decisions as well as explicitly human-made.

The historical operationalization decisions I survey include the informed consent principle first formulated by Ruebhausen and Brim in 1965, the „phase orientation“ of German data protection law developed by Steinmüller and his colleagues in 1971 and the specific design of controllability of information processing and decision making examined by Hoffmann in 1991.

I also show that the purpose limitation principle is – contrary to popular opinion – not outdated. From the very beginning, this principle has been consciously created as a normative, but counterfactual response of the law to modern data and information processing capabilities, which are fundamentally purposeless.

Local download (in German): Zweckbindung revisited

[Update, 27 August 2015] The article has been published (in German): Zweckbindung revisited

The whole DANA issue has been published as Open Access: DANA 3/2015

I wrote a new short article which I submitted to FIfF Kommunikation, a quarterly journal on computers, informatics and society published by the Forum InformatikerInnen für Frieden und gesellschaftliche Verantwortung. The article will be published in the 2nd issue of 2015, to be released in June or July.

Based on a thorough reading of draft versions of the German Bundesdatenschutzgesetz, committee records from the German Bundestag, and earlier literature on „legal cybernetics“, the „automation of public administration“ and „automation-friendly laws“, I pointed out how a legal provision originally demanding „Datenschutz by Design“ was transformed into a pure IT security provision in a very short time at the end of the 1970s. Surprisingly, this transformative move was initiated by German data protection commissioners – federal as well as state commissioners – in cooperation with „interessierten Kreisen der Wirtschaft“ („interested industry stakeholders“).

In my view, it comes as no surprise then that most research which is sold today as „Datenschutz by Design“ or „Privacy by Design“ is actually nothing more than „Security by Design“.

Local download (in German): Das Scheitern von Datenschutz by Design – Eine kurze Geschichte des Versagens

[Update, 5 July 2015] The article has been published and can be found here: http://www.fiff.de/publikationen/fiff-kommunikation/fk-2015/fk-2015-2/fk-2-15-s41.pdf (Yeah, it’s Open Access!)

I just submitted an extended abstract to the workshop on »Privacy and Quantifiability«, which I mentioned some time ago.

The limit of 20,000 characters was rather harsh, even though I exceeded it. Worse still, the organizers asked me to include references to other privacy theories to clarify the relationship between Datenschutz and these privacy theories. As a result, I did not have enough space for the text I really wanted to write, and in the end I’m not really happy with it. I still think, however, that I have made clear that modern, organized information processing can be analyzed as being industrialized, together with its consequences for society, organizations, groups and individuals alike.

I’m confident that I’ll have the chance to elaborate on this in the future.

Local download (in German): Transparenz und Berechenbarkeit – Die Industrialisierung der gesellschaftlichen Informationsverarbeitung und ihre Folgen (Extended Abstract)

[Update April 6, 2016]: The article was finally published: http://privtec.de/blog?p=135

For a festschrift for Rosemarie Will, a professor of Public Law, State and Legal Theory at Humboldt-Universität zu Berlin, I wrote a short article analyzing a previously ignored paradigm shift in the debate about data protection which took place in the 1970s.

While the privacy debate takes the object of protection – privacy, Privatheit, Privatsphäre, digital intimacy – as its starting point, the first generation of data protection scholars in Germany realized at the beginning of the 1970s that the analytical perspective needed to be changed fundamentally for a thorough analysis of the data protection problem. The starting point for the analysis should not be the outdated notion of categorically separate spheres, but the specific practices of modern organized information processing and their properties.

This is still true today. We need a sound theory of the information society, or at least a well-founded theory of modern social information processing in social relationships that are characterized by structural power imbalances: between individuals and groups on the one hand and organizations on the other, between small and large organizations, between local and central government or supranational entities, between the parliament and the judiciary on the one hand and the public administration on the other. Only then can we identify problems for which there is no place in a world of categorically separated spheres, problems that cannot even be addressed there, such as the »Modellifizierung« (Wilhelm Steinmüller) or the tendencies towards the industrialization of information processing.

Jörg Pohle (2014). Die kategoriale Trennung zwischen »öffentlich« und »privat« ist durch die Digitalisierung aller Lebensbereiche überholt – Über einen bislang ignorierten Paradigmenwechsel in der Datenschutzdebatte. In: Thomas Fritsche, Michael Kuhn, Sven Lüders, Michael Plöse, Julian Zado (Eds.). »Worüber reden wir eigentlich?« Festgabe für Rosemarie Will.

Local download (in German): Die kategoriale Trennung zwischen »öffentlich« und »privat« ist durch die Digitalisierung aller Lebensbereiche überholt – Über einen bislang ignorierten Paradigmenwechsel in der Datenschutzdebatte

I just submitted a short abstract to a workshop on »Privacy and Quantifiability«, which is supposed to happen in February 2015. In this abstract, I argue that the concept of privacy is the wrong starting point for any understanding of the individual and societal consequences of modern information processing. Instead, key for any understanding is the insight that the far-reaching digitization of all aspects of life has thrown the categorical distinction between »public« and »private« into the dustbin of history. The starting point for an analysis should therefore be the very process of modern, organized information processing, the specific kind of organization it produces, and the relationship between the organization and the datafied individual, group or institution.

In my paper, I will try to present an analysis of the data protection problem, particularly based on previous work of Wilhelm Steinmüller and Martin Rost, which is appropriate to the state of the art of societal information processing in organizations.

Local download (in German): Transparenz und Berechenbarkeit: Die Industrialisierung der gesellschaftlichen Informationsverarbeitung und ihre Folgen (Abstract)

[Update April 6, 2016]: The article was finally published: http://privtec.de/blog?p=135

[Update] 23 November 2014: Unfortunately, we did not receive enough abstracts that adequately addressed the – admittedly very demanding – topic of the workshop. Today, the organizers have therefore decided to take the necessary action and to cancel the workshop. We’re very sorry for any inconvenience caused.[/Update]

Workshop site: http://fundationes.de/datenschutz2.html

Date: 27 March 2015
Location: Berlin, Germany

Digitization is seizing all areas of life. Why, then, is the categorical separation between „public“ and „private“ still retained? Bourgeois society is a society, not a community. Sociologically, such a finding makes a significant difference. So why is the vast majority of privacy and data protection theories based on under-complex assumptions that presuppose a community-relatedness and a community-boundedness of the individual? Many passages, to cite examples, use the pronoun „we“, an identifier for a community. Who exactly is meant by this consensus-sheltering „we“? One of the central features of modern social information processing is that it is increasingly industrialized. Why, then, are individual sensitivities and needs still chosen as the starting point for a problem analysis? And why is the focus primarily on persons as potential attackers? Why should different demands be made on public and private data processors if their organizational and technical practices of information processing are now largely the same? Is this distinction between „public“ and „private“ perhaps only an artifact of legal thinking?

The number of works describing and explaining privacy, Privatsphäre, Privatheit, surveillance or data protection is tremendous; their quality is often scientifically questionable. Assumptions are often not disclosed, historically outmoded, or based on a widespread misunderstanding of the information-technological and sociological foundations. The actor constellations of the underlying theories overlap only marginally, if at all, with the actor constellations observed in the phenomenon area. The participating actors are assumed to have properties such as rationality, knowledge or practices of information processing that do not necessarily coincide with what can be and has been observed. The same applies to the objectives pursued by the various stakeholders with their information processing. Finally, the question of how actors and social structures, how individuals and social systems come together is not even recognized as a problem. In short, the current discussion on privacy, Privatsphäre, Privatheit, surveillance and data protection lacks sound theoretical foundations in the context of a global society in which organizations act largely in an industrialized manner, even in the field of information processing. We do not believe that a theoretical engagement with these issues has become obsolete, quite the opposite.

In the workshop, we want to formulate quality requirements for a well-founded theory of data protection for the 21st century. What demands are to be made on such a theory and its genesis from a scientific – disciplinary as well as interdisciplinary – point of view? Which phenomenon areas shall be described and explained by the theory, and which shall not? Which actor constellations and which power relations shall be described and explained by the theory, and which shall not? What assumptions about the environment (society, organization, interaction, technology, processes) are permissible, and which are not?

Please submit your abstracts by 2 November 2014.

The workshop will be held in German and English. You should therefore be able to speak either German or English and to understand both. Abstracts and papers may be submitted in either German or English.

It’s the second workshop in our series „Fundationes“. The first one was on the history and theory of data protection.

In February 2013, we organized a little workshop – the first in our series „Fundationes“ – on the history and theory of data protection. A few months before Edward Snowden’s revelations about the NSA’s massive Internet surveillance, 16 scholars and practitioners from different disciplines met in Berlin for a very prolific discussion. More than a year later, we finally published the workshop proceedings, encompassing reviewed academic papers, introductory presentations, and a transcript of the workshop discussions.

Jörg Pohle, Andrea Knaut (Eds.) (2014). Fundationes I: Geschichte und Theorie des Datenschutzes. Münster: Monsenstein und Vannerdat.

The proceedings are also available as an EPUB under CC BY 4.0 International (in German/English): Fundationes I – Geschichte und Theorie des Datenschutzes.

Wilhelm Steinmüller (1934–2013) was a pioneer of the scientific study of the social effects and consequences of information technology. As a legal scholar, he first looked at the relationships between law and electronic data processing and coined the term „Legal Informatics“. Starting from the possibilities and conditions of the use of computers in the legal domain, he turned more and more to the impact of information technology itself. With this new perspective, he laid the foundation for the German data protection legislation. His analytical perspective subsequently expanded to a systems-theoretical consideration of computer science and society as a whole.

When the news of Wilhelm Steinmüller’s death on February 1, 2013 broke, former employees, colleagues and friends met on 24 June 2013 at the European Academy Berlin for a commemoration ceremony. We agreed that Steinmüller’s life and work could be appreciated best by placing the various ideas he initiated in a contemporary context and by identifying future lines of development.

For this festschrift, to be presented at a symposium in memory of Wilhelm Steinmüller on May 22, 2014, I wrote a short paper analyzing the 1971 expertise written on behalf of the German Federal Ministry of the Interior, „Grundfragen des Datenschutzes“ („Fundamental Questions of Data Protection“). This expertise framed the data protection problem as a threat to freedom and liberty and as a structural attack by data processors on the decision space of individuals, groups, and other less powerful entities that are datafied, computed, simulated, and predicted. The German data protection legislation adopted the regulatory goals envisioned in the expertise as well as the legal architecture presented by Steinmüller, Lutterbeck and Mallmann.

In the paper, I analyzed the assumptions underlying Steinmüller’s analysis of the data protection problem and uncovered hidden assumptions as to the instrumental character of the computer and the rationality of the data processor. In addition, I reflected on the concept of information used by the authors and found it to be still up-to-date, especially in contrast to newer concepts of information presented in scientific literature for the analysis of organizational information processing in the information society. Finally, I analyzed the process-oriented model of information processing that was mirrored by the process-oriented architecture of the German data protection law, predating a very similar approach by Daniel Solove by pretty much 35 years.

Jörg Pohle (2014). Die immer noch aktuellen Grundfragen des Datenschutzes. In: Hansjürgen Garstka, Wolfgang Coy (Eds.). Wovon – für wen – wozu. Systemdenken wider die Diktatur der Daten. Wilhelm Steinmüller zum Gedächtnis. Berlin: Helmholtz-Zentrum für Kulturtechnik, Humboldt-Universität zu Berlin. pp. 45-58.

Local download (in German): Die immer noch aktuellen Grundfragen des Datenschutzes

In preparation for a workshop we organized last year, I wrote a short paper in German explaining that, and how, German data protection law is based on the assumption that all organized processing of personally identifiable information is causality-based. As correlation-based methods are now booming, real-world information processing increasingly clashes with legal requirements due to the law’s implicit assumptions.

Jörg Pohle (2014). Kausalitäten, Korrelationen und Datenschutzrecht. In: Jörg Pohle, Andrea Knaut (Eds.). Fundationes I: Geschichte und Theorie des Datenschutzes. Münster: Monsenstein und Vannerdat. pp. 85-105.

Abstract: In recent years, correlation-based information processing methods have been booming. From the perspective of data protection, however, they are particularly problematic because they do not correspond to the implicit assumptions about processing operations and their design that have, since the 1970s, determined the specific form of the phase orientation in German data protection law and the understanding of the concept of necessity. In addition to strengthening the principles of data avoidance and data minimization as a replacement for the toothless concept of necessity, the legal requirements would above all need to be reformulated using data protection goals (Schutzziele). Moreover, if correlation-based methods are used, the requirements on the data subject’s consent, on the disclosure and use of the generated data, and on data security need to be much higher than in the past.

Abstract (in German): Korrelationsbasierte Verfahren haben in den letzten Jahren einen starken Aufschwung genommen. Sie sind aber aus der Sicht des Datenschutzes besonders problematisch, da sie nicht den impliziten Annahmen über Verfahren und Verfahrensgestaltung entsprechen, die seit den 70er Jahren die spezifische Ausprägung der Phasenorientierung und die Ausgestaltung des Erforderlichkeitsbegriffes bestimmten. Neben einer Stärkung des Prinzips der Datenvermeidung und Datensparsamkeit als Ersatz für den zahnlos gewordenen Erforderlichkeitsbegriff in diesem Bereich bedarf es vor allem einer Reformulierung der rechtlichen Anforderungen unter Verwendung von Datenschutz-Schutzzielen. Außerdem sind besondere Anforderungen an die Einwilligung der Betroffenen, an die Weitergabe und die Nutzung der dabei erzeugten Daten sowie an die Datensicherheit zu stellen, wenn korrelationsbasierte Verfahren zum Einsatz kommen.

Local download (in German): Kausalitäten, Korrelationen und Datenschutzrecht

Almost two years ago, I published a short paper about a systems theory approach to data protection:

Jörg Pohle (2012). Social Networks, Functional Differentiation of Society, and Data Protection. In: arXiv, arXiv:1206.3027v1.

Abstract: Most scholars, politicians, and activists follow individualistic theories of privacy and data protection. In contrast, some of the pioneers of data protection legislation in Germany, like Adalbert Podlech, Paul J. Müller, and Ulrich Dammann, used a systems theory approach. Following Niklas Luhmann, the aim of data protection is (1) maintaining the functional differentiation of society against the threats posed by the possibilities of modern information processing, and (2) countering undue information power held by organized social players. It therefore comes as no surprise that the first data protection law in the German state of Hesse contained rules to protect the individual as well as the balance of power between the legislative and the executive body of the state. Social networks like Facebook or Google+ do not endanger their users only by exposing them to other users or the public. They constitute, first and foremost, a threat to society as a whole by collecting information about individuals, groups, and organizations from different social systems and combining it in a centralized data bank. They transgress the boundaries between social systems that act as a shield against total visibility and transparency of the individual and protect the freedom and the autonomy of the people. Without enforced structural limitations on the organizational use of the collected data by the social network itself or the company behind it, social networks pose the worst totalitarian peril for western societies since the fall of the Soviet Union.

Local download: Social Networks, Functional Differentiation of Society, and Data Protection

The modern history of privacy and data protection has spawned thousands of books and tens of thousands of scholarly articles. Most of them are either useless or simple rewritings of things that should already be known but often are not.

For a good start on the topic of data protection, I recommend reading the following scholarly works:

  1. Bloustein, Edward J. (Dec. 1964). “Privacy as an Aspect of Human Dignity: An Answer to Dean Prosser”. In: New York University Law Review 39, pp. 962–1007.
  2. Westin, Alan F. (1966). “Science, Privacy, and Freedom: Issues and Proposals for the 1970’s. Part I—The Current Impact of Surveillance on Privacy”. In: Columbia Law Review 66.6, pp. 1003–1050.
  3. Westin, Alan F. (1966). “Science, Privacy, and Freedom: Issues and Proposals for the 1970’s. Part II—Balancing the Conflicting Demands of Privacy, Disclosure, and Surveillance”. In: Columbia Law Review 66.7, pp. 1205–1253.
  4. Stone, M. G. and Malcolm Warner (July 1969). “Politics, Privacy, and Computers”. In: The Political Quarterly 40.3, pp. 256–267.
  5. Kamlah, Ruprecht B. (1970). “Datenüberwachung und Bundesverfassungsgericht”. In: Die Öffentliche Verwaltung 23.11, pp. 361–364.
  6. Podlech, Adalbert (1970). “Verfassungsrechtliche Probleme öffentlicher Datenbanken”. In: Die Öffentliche Verwaltung 23.13–14, pp. 473–475.
  7. Mallmann, Christoph (Nov. 1971). “Das Problem der Privatsphäre innerhalb des Datenschutzes”. In: Datenschutz – Datensicherung. Ed. by Jochen Schneider. Beiträge zur integrierten Datenverarbeitung in der öffentlichen Verwaltung Heft 5. München: Siemens Aktiengesellschaft. Chap. 3, pp. 19–26.
  8. Steinmüller, Wilhelm (1971). “Rechtspolitische Bemerkungen zum geplanten staatlichen Informationssystem”. In: Rechtsphilosophie und Rechtspraxis. Referate auf der Tagung der Deutschen Sektion der Internationalen Vereinigung für Rechts- und Sozialphilosophie e.V. in Freiburg i. Br. am 7. Oktober 1970. Ed. by Thomas Würtenberger. Frankfurt am Main: Vittorio Klostermann, pp. 81–87.
  9. Steinmüller, Wilhelm et al. (1971). Grundfragen des Datenschutzes. Expertise on behalf of the German Ministry of the Interior, BT-Drs. VI/3826, Appendix 1.
  10. Podlech, Adalbert (1973). Datenschutz im Bereich der öffentlichen Verwaltung. Datenverarbeitung im Recht (DVR) Beiheft 1. Berlin: J. Schweitzer Verlag.
  11. Schmidt, Walter (1974). “Die bedrohte Entscheidungsfreiheit”. In: Juristenzeitung 28.8, pp. 241–250.
  12. Müller, Paul J. (1975). “Funktionen des Datenschutzes aus soziologischer Sicht”. In: Datenverarbeitung im Recht 4, pp. 107–118.
  13. Podlech, Adalbert  (1975). “Verfassung und Datenschutz”. In: Erfassungsschutz. Der Bürger in der Datenbank: zwischen Planung und Manipulation. Ed. by Helmut Krauch. Stuttgart: Deutsche Verlags-Anstalt, pp. 72–77.
  14. Schwan, Eggert (1975). “Datenschutz, Vorbehalt des Gesetzes und Freiheitsgrundrechte”. In: Verwaltungsarchiv 66, pp. 120–150.
  15. Mallmann, Christoph  (1976). Datenschutz in Verwaltungs-Informationssystemen. Vol. 2. Rechtstheorie und Informationsrecht. München, Wien: R. Oldenbourg Verlag.
  16. Podlech, Adalbert  (1976). “Aufgaben und Problematik des Datenschutzes”. In: Datenverarbeitung im Recht 5, pp. 23–39.
  17. Podlech, Adalbert (1976). “Die Trennung von politischer, technischer und fachlicher Verantwortung in EDV-unterstützten Informationssystemen”. In: Informationsrecht und Rechtspolitik. Ed. by Wilhelm Steinmüller. Rechtstheorie und Informationsrecht 1. München, Wien: Oldenbourg Verlag, pp. 207–216.
  18. Steinmüller, Wilhelm, Leonhard Ermer, and Wolfgang Schimmel (1978). Datenschutz bei riskanten Systemen. Vol. 13. Informatik-Fachberichte. Berlin, Heidelberg, New York: Springer.
  19. Podlech, Adalbert  (1979). “Das Recht auf Privatheit”. In: Grundrechte als Fundament der Demokratie. Ed. by Joachim Perels. Frankfurt am Main: Suhrkamp Verlag, pp. 50–68.
  20. Schneider, Jochen (Apr. 1979). “Zum Verhältnis von Datenschutz und Datensicherung”. In: Datenschutz und Datensicherung 3.2, pp. 98–102.
  21. Steinmüller, Wilhelm  (1979). “Der aufhaltsame Aufstieg des Geheimbereichs”. In: Kursbuch 56, pp. 169–198.
  22. Rule, James et al. (1980). The Politics of Privacy. New York: Elsevier.
  23. Podlech, Adalbert (1982). “Individualdatenschutz – Systemdatenschutz”. In: Beiträge zum Sozialrecht – Festgabe für Grüner. Ed. by Klaus Brückner and Gerhard Dalichau. Percha: Verlag R. S. Schulz, pp. 451–462.
  24. Aly, Götz and Karl Heinz Roth (1984). Die restlose Erfassung. Volkszählen, Identifizieren, Aussondern im Nationalsozialismus. Berlin: Rotbuch Verlag Berlin.
  25. Bischoff, Stefan and Benedikt Burkard (1984). “Zum Verhältnis von Datenschutz und Organisation”. In: Koordination von Informationen. Ed. by Rainer Kuhlen. Vol. 81. Informatik-Fachberichte. Gesellschaft für Informatik. Berlin, Heidelberg, New York: Springer, pp. 195–204.
  26. Podlech, Adalbert (1988). “Unter welchen Bedingungen sind neue Informationssysteme gesellschaftlich akzeptabel?” In: Verdatet und vernetzt. Sozialökologische Handlungsspielräume der Informationsgesellschaft. Ed. by Wilhelm Steinmüller. Frankfurt am Main: Fischer Taschenbuch Verlag, pp. 118–126.
  27. Bräutigam, Lothar, Heinzpeter Höller, and Renate Scholz (1990). Datenschutz als Anforderung an die Systemgestaltung. Vol. 12. Sozialverträgliche Technikgestaltung. Opladen: Westdeutscher Verlag.
  28. Hoffmann, Bernhard (1991). Zweckbindung als Kernpunkt eines prozeduralen Datenschutzansatzes. Baden-Baden: Nomos Verlagsgesellschaft.
  29. Seelos, Hans-Jürgen (1991). Informationssysteme und Datenschutz im Krankenhaus. DuD-Fachbeiträge 14. Braunschweig, Wiesbaden: Vieweg.
  30. Gandy Jr., Oscar H. (1993). The Panoptic Sort. Boulder, San Francisco, Oxford: Westview Press.
  31. Rost, Martin (2004). “Verkettbarkeit als Grundbegriff des Datenschutzes? Identitätsmanagement soziologisch betrachtet”. In: Innovativer Datenschutz 1992 – 2004. Wünsche, Wege, Wirklichkeit. Für Helmut Bäumler. Ed. by Johann Bizer et al. Kiel: Unabhängiges Landeszentrum für Datenschutz Schleswig-Holstein, pp. 315–334.
  32. Steinmüller, Wilhelm (2007). “Das informationelle Selbstbestimmungsrecht – Wie es entstand und was man daraus lernen kann”. In: Recht der Datenverarbeitung, pp. 158–161.
  33. Lewinski, Kai von (2009). “Geschichte des Datenschutzrechts von 1600 bis 1977”. In: Freiheit – Sicherheit – Öffentlichkeit. Ed. by Felix Arndt. 48. Assistententagung Öffentliches Recht. Nomos Verlagsgesellschaft, pp. 196–220.
  34. Rost, Martin and Andreas Pfitzmann (2009). “Datenschutz-Schutzziele – revisited”. In: Datenschutz und Datensicherheit 6, pp. 353–358.
  35. Ohm, Paul (2010). “Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization”. In: UCLA Law Review 57, pp. 1701–1777.
  36. Rost, Martin (2012). “Standardisierte Datenschutzmodellierung”. In: Datenschutz und Datensicherheit 36.6, pp. 433–438.
  37. Rost, Martin (2013). “Zur Soziologie des Datenschutzes”. In: Datenschutz und Datensicherheit 37.2, pp. 85–91.