Privacy Law and Policy Reporter
Roger Clarke
PLPR has a well established focus on the legal and policy aspects of privacy. However, technologies that affect privacy have exploded in number, complexity and power, and these have received only incidental attention in PLPR. Technologies can be used to invade privacy, and to enhance it. The more effective they become, the more likely it is that law and policy can reveal only part of the reality of privacy. In order for PLPR to more directly address technologies affecting privacy, we have invited Roger Clarke to contribute a series of occasional articles on the tensions between privacy invasive technologies (PITs) and privacy enhancing technologies (PETs). This article launches the series with an overview and some explanations of terminology — General Editor.
This series on technologies affecting privacy will use some terms that may not yet be in everyone’s lexicon, so this article starts by declaring and defining them. I treat them as no more than working definitions, and reserve the right to vary them over time. On the other hand, I’ve been using them fairly consistently for some years now; I will draw attention to drift in the meaning as it occurs. The origins of the terms are explained in the resource pages accompanying this series (reference at end).
The ‘PITs’ usefully describes the many technologies that intrude into privacy. Among the host of examples are data-trail generation through the denial of anonymity, data-trail intensification (such as identified phones, stored-value cards, and intelligent transportation systems), data warehousing and data mining, stored biometrics, and imposed biometrics.
PETs are tools, standards and protocols that set out to reverse the trend towards privacy invasion by directly assisting in the protection of the privacy interest. They are of the following broad kinds:
- countermeasures against particular PITs;
- 'savage' PETs, which provide untraceable anonymity; and
- 'gentle' PETs, which provide pseudonymity, that is, protection for identity that can be breached only when particular conditions are fulfilled.
Many technologies have a negative impact on privacy. Some of them do so as a byproduct of the technology’s primary functions. Others are designed specifically as surveillance tools. Some technologies can be highly invasive to privacy, depending on the manner in which they are applied, such as chip-cards (used in health, in stored value schemes, and as identification tokens), and public key infrastructure (PKI). I include all of these categories within the scope of the term ‘the PITs’, because all are harmful, and all need to be subjected to serious study and either controlled or banned outright.
During the second half of the 20th century, people’s preoccupations were shaped by Orwell’s anti-utopian novel 1984 and the Cold War. Discussion tended to focus on government techniques such as front end verification, data matching, profiling, cross-system enforcement and multi-purpose identification schemes. Video surveillance, despite its apparent shortcomings, has assumed epidemic proportions, and technology that allows visual pattern matching and recognition has increased its threat to privacy. In the internet context, agencies of the State have acquired enhanced powers to undertake telephone, email and web use surveillance.
In recent years, there has been a change of emphasis in privacy concerns. Consumer marketing organisations have outstripped the public sector invaders by exploiting the potential to collect and analyse personal data. Consumer profiles are no longer based only on the individuals’ dealings with a single organisation, because their data is shared by multiple merchants — and the privacy amendments of December 2000 won’t change that, because they actually legitimise it.
Telephone communications have been used to gather data, through call centre technologies and calling number display (CND, also known as caller ID, and calling line identification or CLI). Internet communications have been intruded upon by spam, cookies and single pixel gifs. Commercial transactions that have long been anonymous increasingly require customer identification, due to businesses’ refusal to accept cash and failure to implement electronic equivalents.
Some tools have been applied by both governments and the corporate sector. Requirements for identification and authentication have been imposed on people in an increasing array of situations. Even highly intrusive biometric tests have been used — not only on people under close care and in gaols, but also on those merely visiting people in gaol, and on employees of companies that judge the security of their premises to be more important than the privacy of their employees.
Data warehousing and data mining technologies have been developed in order to exploit data that has been expropriated from multiple sources. Means have been devised to locate and track not just goods, but also vehicles and, increasingly, even people. Intelligent transportation systems include several such contentious applications. One example is the unheralded and unauthorised use of the NSW Roads and Traffic Authority’s Safe-T-Cam system on cars, even though it was designed expressly for the monitoring of trucks. Another is the denial of anonymous use of major public thoroughfares such as the Melbourne CityLink.
In the workplace, employees and contractors are being subjected to dramatically increased privacy invasions. Video surveillance, email and web behaviour surveillance, and person location and tracking have been joined by requirements for biometrics and by the testing of employees for consumption of banned substances.
This series will inevitably concern itself primarily with information privacy. It will involve considerable focus on the internet. The resource page (reference below) that accompanies this series provides access to an introductory paper on the internet. Attention will also be paid to other telecommunications infrastructure, such as mobile telephony, cable and satellites. The scope of the series extends beyond information privacy, however, to embrace privacy of the person and privacy of personal behaviour.
One antidote to privacy invasive technologies is the development and deployment of additional technologies that undermine the PITs.
This has been especially necessary in the computing and telecommunications arena. Security tools can be applied to protect personal data on servers and clients. The risks of data being intercepted by unauthorised parties can be addressed by the application of cryptography to channel protection. Particular forms of cryptography can, at least in principle, be applied to support the authentication of sender and receiver, and to deny the parties the ability to repudiate transactions that they have conducted. In practice, substantial infrastructure is necessary, and the techniques deployed to date are themselves highly privacy invasive. Specific countermeasures have also been devised for particular techniques such as spam, cookies and single pixel gifs (sometimes called ‘web bugs’).
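To make the channel protection point concrete, the sketch below wraps a TCP connection in TLS using Python's standard-library ssl module, so that the traffic cannot be read by parties along the path. It is a minimal illustration of the technique, not a description of any particular deployed system; the host name is a placeholder.

```python
# A minimal sketch of channel protection: wrapping a TCP connection in TLS.
# "example.com" is an illustrative placeholder, not drawn from the article.
import socket
import ssl

context = ssl.create_default_context()  # verifies server certificates by default

with socket.create_connection(("example.com", 443)) as raw_sock:
    # Encrypt the channel so intermediaries cannot read the traffic in transit
    with context.wrap_socket(raw_sock, server_hostname="example.com") as tls_sock:
        print(tls_sock.version())  # e.g. 'TLSv1.3'
```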
A number of pseudo protections have also been created, such as privacy policy statements, trademarks, and what I call ‘metabrands’ like TRUSTe and WebTrust, which pretend to assure appropriate behaviour by website operators. The next article in this series will revisit a particular web protocol that appeared to be a PIT countermeasure, but whose final design and implementation have fallen so far short of expectations that it has become just another pseudo protection.
The explosion in privacy invasive technologies, together with the challenges involved in devising and disseminating specific countermeasures, has encouraged the investment of substantial effort in the development of a generic countermeasure.
Many services that provide anonymity, denying governments and corporations the ability to associate data with an identified individual, have been prototyped, and some have been launched. On the internet, a common means of achieving the effect is a succession of intermediary operated services. Each intermediary knows the identities of the intermediaries immediately adjacent to it in the chain, but has too little information to identify those further back or further forward. Even if it wants to, it cannot track the communication back to the originator or forward to the ultimate recipient. Examples of these savage PETs include anonymous remailers and web surfing services, and David Chaum’s payer anonymous eCash, from his company DigiCash.
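The chaining technique can be illustrated with a short sketch of layered encryption, in the spirit of mix networks: the sender wraps the message once per intermediary, and each intermediary can strip only its own layer, learning nothing beyond its neighbours. The relay names and three-hop chain are invented for the example, and the third-party Python 'cryptography' package is assumed.

```python
# Layered ("onion") encryption sketch: each relay can peel exactly one layer.
from cryptography.fernet import Fernet

hops = ["relay-a", "relay-b", "relay-c"]            # entry -> exit (illustrative)
keys = {hop: Fernet.generate_key() for hop in hops}

# Sender wraps the payload in reverse order, so the first relay peels first.
payload = b"message for the ultimate recipient"
for hop in reversed(hops):
    payload = Fernet(keys[hop]).encrypt(payload)

# Each relay in turn strips only its own layer.
for hop in hops:
    payload = Fernet(keys[hop]).decrypt(payload)

assert payload == b"message for the ultimate recipient"
```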
Some of the more sophisticated tools enable a non-traceable identifier to be used over an extended period: not only for the whole of a single session or conversation, but even over a long succession of episodes. A number of such ‘persistent nyms’ may be acquired by a single person, who can use them to sustain independent personae, for example for the different roles that they play. The design effectively precludes the personae from being linked with one another, or with the underlying person.
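As a hedged illustration of persistent nyms, the sketch below gives one person several independently generated signing keys, one per persona. Because the keys share no common material, verifiers can recognise each persona over time, yet cannot link the personae to each other or to the person. The persona labels are invented for the example, and the 'cryptography' package is assumed.

```python
# One person, several unlinkable personae, each with its own signing key.
from cryptography.hazmat.primitives.asymmetric import ed25519

personae = {
    role: ed25519.Ed25519PrivateKey.generate()
    for role in ("patient", "political-commentator", "consumer")  # illustrative
}

# Each persona signs under its own key; verifiers see a stable per-persona
# identity, but nothing that correlates the personae across roles.
message = b"statement made under one role only"
signature = personae["political-commentator"].sign(message)
public_key = personae["political-commentator"].public_key()
public_key.verify(signature, message)  # raises InvalidSignature on mismatch
```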
Denial of identity causes serious concern to law enforcement agencies, because it undermines accountability; that is, most people are likely to perform in a less responsible manner if they are able to escape the consequences of their actions. Regulatory measures can only be effective if entities can be made to take legal responsibility for negative consequences of their actions — and that is only possible if they can be found. I use the term ‘savage PETs’ for anonymity tools precisely because of this impact. On the other hand, the deterrent effect that arises from the possibility of retribution is only one form of incentive encouraging reasonable social behaviour.
Undoubtedly, untraceable electronic anonymity will be used by people with criminal intent. On the other hand, the electronic world creates new threats, and hence the level of anonymity available there actually needs to be higher than that in the real world. Moreover, in some jurisdictions there appear to be legal and even constitutional rights to anonymity in some contexts, such as political speech.
Anonymous schemes serve the needs of both individuals and organisations. Examples of communications that organisations like to protect include accesses to patents databases (which could disclose product development strategies); message flows vulnerable to traffic analysis (which could make the nature of an organisation’s business apparent); whistleblowing; ephemeral internal communications (which might otherwise become subject to subpoena); headhunter communications with employees of other organisations; and communications with overseas employees who need protection against local incursions into privacy.
Anonymity might be thought to set the balance sufficiently far in favour of individual freedom that cheats will prosper, and law and order will be too difficult to sustain. Is a ‘middle way’ feasible?
Very substantial protections could be provided for individuals’ identities, but those protections could be breachable when particular conditions are fulfilled. This is the concept of ‘pseudonymity’, and I refer to technologies that implement it as ‘gentle PETs’.
Fundamental to pseudonymity services are that:
- the protections afforded to individuals’ identities are very substantial; and
- those protections can be overridden only when particular, pre-specified conditions are fulfilled, and only under legal, organisational and technical controls.
The challenge confronting developers of gentle PETs is that the legal, organisational and technical protections need to be trustworthy. If the power to override them is in the hands of a person or organisation that flouts the conditions, then pseudonymity’s value as a privacy protection collapses. Unfortunately, governments throughout history have shown themselves to be untrustworthy when their interests are too seriously threatened. Corporations are dedicated to shareholder value alone, and will only comply with the conditions when they are subject to sufficiently powerful preventative mechanisms and sanctions.
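One conceivable way to keep the override power out of any single party's hands is to split the key that links a nym to an identity, so that re-identification requires every custodian to cooperate. The sketch below uses a simple n-of-n XOR split; a real design would more likely use threshold (k-of-n) secret sharing, and the custodians named in the comments are purely illustrative.

```python
# n-of-n split of a nym-to-identity linking key: no single custodian can
# re-identify anyone alone; reconstruction needs every share.
import secrets
from functools import reduce

def xor_bytes(a: bytes, b: bytes) -> bytes:
    return bytes(x ^ y for x, y in zip(a, b))

def split_key(key: bytes, parties: int) -> list[bytes]:
    """Split 'key' into shares; ALL shares are required to rebuild it."""
    shares = [secrets.token_bytes(len(key)) for _ in range(parties - 1)]
    shares.append(reduce(xor_bytes, shares, key))  # final share completes the XOR
    return shares

def reconstruct(shares: list[bytes]) -> bytes:
    return reduce(xor_bytes, shares)

linking_key = secrets.token_bytes(32)        # key that maps nym -> identity
shares = split_key(linking_key, parties=3)   # e.g. court, regulator, operator
assert reconstruct(shares) == linking_key    # breach requires all three parties
```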
Time will tell whether gentle PETs can be devised that distribute power among multiple parties, and thereby justify trust. Unless and until they are designed, proven and deployed, it appears that PIT countermeasures and savage PETs will line up against the PITs and engage in both guerilla warfare and direct conflict.
This series will analyse the ebb and flow of those battles. It will subject individual PITs and PETs to close examination. In doing so, the articles will investigate the scope for technology to play a determinative role in the survival of privacy as a human value, despite the ravages it is subject to.
Burkert H ‘Privacy-enhancing technologies: typology, critique, vision’ in Agre P E and Rotenberg M (eds) Technology and Privacy: The New Landscape MIT Press 1997.
Clarke R ‘Human identification in information systems: management challenges and public policy issues’ Information Technology & People 7(4) December 1994 6-37; available at <http://www.anu.edu.au/people/Roger.Clarke/DV/HumanID.html> .
Clarke R ‘Identified, anonymous and pseudonymous transactions: the spectrum of choice’ User Identification & Privacy Protection Conference Stockholm, June 1999; available at <http://www.anu.edu.au/people/Roger.Clarke/DV/UIPP99.html> .
Clarke R ‘The origins of “PIT” and “PET”’; available at <http://www.anu.edu.au/people/Roger.Clarke/DV/PITsPETsRes.html#Orig> .
EPIC EPIC online guide to practical privacy tools 1996; available at <http://www.epic.org/privacy/tools.html> .
Froomkin A M ‘Anonymity and its enmities’ 1995 Journal of Online Law; available at <http://www.wm.edu/law/publications/jol/95_96/froomkin.html> .
Information and Privacy Commissioner, Ontario, Canada and Registratiekamer, The Netherlands Privacy-enhancing technologies: the path to anonymity 2 vols August 1995; available at <http://www.ipc.on.ca/english/pubpres/sum_pap/papers/anon-e.htm> .
This series is supplemented by a resource page that will be maintained on an ongoing basis. It is to be found at <http://www.anu.edu.au/people/Roger.Clarke/DV/PITsPETsRes.html> .
PLPR readers are encouraged to contribute sources and suggestions for enhancement to <Roger.Clarke@xamax.com.au> and to bookmark the page for their own use and for communication to others.
Roger Clarke, Principal, Xamax Consultancy Pty Ltd.