
Law, Technology and Humans



Onitiu, Daria --- "Incorporating 'Fashion Identity' Into the Right to Privacy" [2022] LawTechHum 6; (2022) 4(1) Law, Technology and Humans 102


Incorporating ‘Fashion Identity’ Into the Right to Privacy

Daria Onitiu

Edinburgh Law School, United Kingdom

Abstract

Keywords: Privacy; autonomy; profiling technologies; fashion identity.

I. Introduction

The right to privacy is not flawless. At the centre of the discourse on the meaning and value of privacy is the view that it is often a ‘sweeping concept’, which makes it very difficult to reach a conclusive answer regarding its exact meaning.[1] Key to the right to privacy is the delimitation of its parameters with regard to both the constraints on and the conditions for the exercise of this fundamental freedom.

This paper, drawing on findings in fashion studies, intends to disentangle the conceptual muddle in legal scholarship surrounding the right to privacy. The theory of ‘fashion’ is a field produced by diverse scholarship carving out various ‘identities’ expressed through the dimension of ‘clothing’ in social interaction.[2] A triangular framework for ‘fashion’ and ‘identity’, suggesting the embodiment of human behaviour in a social context, sets the scene for viewing the right to privacy, in its outward and inward forms, as a construction for identity and a protective space in which to explore it.

The paper’s objective is to revise the understanding of the right to privacy using a definition of ‘fashion identity’ as managing and perceiving appearance. As suggested by Agre, privacy is ‘the freedom from unreasonable constraints on the construction of one’s own identity’.[3] This conception significantly contributes to the understanding of ‘identity’ as a process that is safeguarded by a relational concept of privacy, which protects an individual’s autonomy and the maintenance of selfhood.[4] Nevertheless, the question remains of what the exact nature of the right to privacy is in maintaining an objective and a subjective sense of self.

The paper investigates the parameters and conditions of privacy for identity-building, focusing on Article 8 of the European Convention on Human Rights (ECHR).[5] First, I submit that the ‘reasonable expectation of privacy’ test in Article 8(1) offers an incomplete picture of the role of privacy to develop with evolving norms that define individual perception.[6] Second, I contend that case law by the European Court of Human Rights (ECtHR) on personal autonomy and data protection[7] underscores the importance of an individual’s unconscious associations with identity, which I define as the process of self-relationality of identity-building.

Understanding the right to privacy as being connected to identity stimulates a series of uncertainties regarding its functions in self-realisation in the digital age. Profiling technologies not only affect individual control over the flow of personal information; algorithms also shape the contours of an individual’s agency and choice.[8] Thus, a structural account of privacy does not offer reliable guidance for defining the potential of profiling technologies to create a ‘new’ reality of self-relation, which attaches direct meanings to an individual’s values and attitudes. We need a view of privacy that incorporates both the way individual perceptions are formed and the understanding of the notion of self-relationality, as it includes a person’s unconscious associations with fashion.

‘Fashion identity’ can offer a starting point for elaborating on the value of the right to privacy in the digital age because it signifies more than controlling impressions and self-representation in an environment. I suggest that the nature of the right to privacy is to hold together our separate selves in the face of objective and subjective constraints on identity formation. I conclude with guidance on clarifying the concept of the right to privacy, emphasising the relevance of individual perception and self-relationality regarding objective and subjective constraints on identity formation.

II. A Triangular Framework on ‘Fashion Identity’

Before we address the question of the nature of the right to privacy, it is important to clarify the meaning of ‘fashion identity’ to define the methodological framework. A triangular framework of ‘fashion identity’ sets the scene for investigating the extent to which individual perception and the process of inference of knowledge of self are relevant to the interpretation of the right to privacy. There is an inevitable connection between ‘fashion’ and ‘identity’ based on the study of the individual, who is a situated object within the roles of ‘dress’, with regard to the management and perception of appearance.[9] We can classify the process of identity construction in fashion identity within a framework of the management and perception of appearance.

1. The Connection Between Fashion and Identity

The first way we can describe the meaning of fashion identity is based on an individual’s process of self-representation. By way of illustration, imagine an individual getting up in the morning and putting on clothes and accessories. Every individual ‘gets dressed’ in some way, be it through wearing garments, a hairstyle or body modifications such as tattooing.[10] This process of getting dressed is a way ‘to adorn the body’.[11] An individual’s management of appearance is performative in that we use ‘fashion’ as a tool to reveal and disguise aspects of identity.[12] Accordingly, ‘fashion’ is an aspect of the ‘material self’ of identity, being the management of appearance, as well as impression management.

Further, imagine how the individual who gets up in the morning might get ready for work or a meeting with friends. A particular occasion or certain professions may require certain forms of bodily representation to ‘[look] appropriate for a particular setting’.[13] For example, an individual who undermines social codes or cultural conventions through appearance is argued to appear inappropriate, risking ‘social exclusion or ridicule’.[14] In contrast, we may use our appearance as a source of identification, such as wearing an ethnic dress or choosing the ‘look’ to join a group of football fans.[15] What this shows is that ‘fashion’ illustrates the dynamic of the ‘social selves’ of identity, which may include the external constraints on the performative role of ‘dress’, as well as an individual’s striving for differentiation and belonging within a social environment.[16]

Perhaps the most evident constraint on individual self-representation in fashion is an individual’s perception of fashion. The role of dress in a social setting influences not only exterior behaviour but also perceptions of oneself.[17] Kaiser uses the fictitious example of someone visiting a clothing store, observing an item and imagining how it would fit, how it would look and whether it would make them attractive.[18] This process, entailing the management and perception of appearance, results from evaluating how others perceive one’s appearance and from ‘drawing inferences on how people look’.[19] The way we attribute meaning to this dialectic tendency in the management and perception of appearance rests on so-called ‘perceiver variables’, our conscious and unconscious associations of ‘fashion’ with reference to the self and others.[20] For example, ‘perceiver variables’ can illustrate the contextual cues clothes may give about an individual’s behaviour, such as connecting a ‘suit’ with an individual’s reliability and/or occupation.[21] Further, ‘fashion’ can shape our own emotions and attitudes.[22] Accordingly, when the fashion psychologist Karen asks ‘how are you wearing your stress?’, the inquiry highlights how outfit choices can support mental health during the COVID-19 pandemic.[23] We thus see how the social selves can produce ‘fashion narratives’ on the role of appearance in a social context, which have an impact on an individual’s perception of identity. Every representation an individual makes of his or her appearance is therefore interpreted with a view to the intimate self of identity, such as the way an individual’s beliefs, current emotions or desires are projected through the meaning of fashion narratives.

Based on these considerations, we can establish a definition to elaborate on this interrelationship between fashion and identity:

Fashion, as a form of social behaviour, entails the various identities of self—the material self, the social selves and intimate self—for the management and perception of appearance.

I will show that the notion of fashion identity is a relevant factor in shaping the meaning of the right to privacy in the big data age. I submit that fashion identity contributes to an individual’s management of their own behaviour, the formation of values including perception, and the formation of attitudes and preferences.

III. Defining Individual Perception

The first aspect of my analysis intends to identify how the notion of individual perception of fashion identity connects with the right to privacy and whether it adds value to the constraints of identity-building. In doing so, I focus on Agre’s definition of the right to privacy as connected to identity construction, whereby the first part of the definition focuses on the parameters including ‘unreasonable constraints on the construction of one’s own identity’.[24] The question that arises is how the view of privacy as the control of access to the self[25] fits with the proliferation of surveillance practices capturing aspects of identity in the big data sphere.[26]

1. Privacy and the Unreasonable Constraints on Identity-Building

The parameters of the right to privacy have developed most in an operational sense. Warren and Brandeis’s idea of separation and seclusion views privacy as a condition that enables a private life outside the observation and influence of others.[27] Others have more explicitly argued for the connection of privacy to an individual’s self-concept. This understanding of privacy is further discussed by Altman,[28] for whom privacy is the ‘selective control of access to the self’.[29] A crucial element of Altman’s theory is that privacy is a dialectic process, which is based on the interplay of people.[30] The right to privacy is a multifaceted concept, gaining meaning in a social context that affects individual behaviour.[31]

Altman’s analysis focuses on the way an individual or a group of individuals experiences the ‘states of privacy’ including the physical or behavioural barriers set by themselves.[32] Thus, social interaction is based on an individual’s reasonable perception of privacy.[33] For example, an individual assumes that their family home is a physical barrier against eavesdropping, and that strangers are not recording them.[34] Some social practices develop into normative practices that are codified in law.[35] When these norms are not respected, a situation is created where an individual’s privacy is violated.

The question, of course, arises around how we interpret the parameters of privacy considering that individuals increasingly act in public or semi-public spaces.[36] As indicated by Roessler, ‘privacy in public then means, for example, not listening in on private conversations between friends on the street or in a cafe’.[37] However, the dichotomy between private and public information is not workable in the online sphere without considering the way individuals’ expectations of privacy are formed.[38] Indeed, privacy as a protective shell free from unwarranted scrutiny entails the individual’s standing as an autonomous subject who establishes the connection between the self and the environment, an important element of the notion of appearance management in fashion identity.

2. Privacy and Appearance Management

There is a clear connection between the dynamic nature of privacy and the meaning of fashion identity. Privacy overlaps with the expression of the social selves and connects with appearance management in the material self of fashion identity. The development of the material self in fashion identity entails a degree of intimacy, which allows an individual to explore the nuances of appearance, including interactions with people. In addition, we can argue that privacy protection, including notions of intimacy, solitude, reserve and anonymity,[39] should enable the development of contingent features regarding the management of appearance and identity.[40] These contingent features are the act of self-representation with a particular style or look and the search for differentiation and belonging within a particular social context. Thus, privacy seems to provide the secure space wherein individuals can act within the material self and think within the social selves of their fashion identity.

However, how can an individual maintain that his or her expectation of privacy is free from the scrutiny of others? Surveillance and tracking practices expose an individual to the identification and observation of behaviour in unprecedented ways.[41] From surveillance cameras to smart assistants, surveillance and profiling technologies challenge the spatial context of privacy discourse.[42] The rise of networked environments certainly challenges our expectations of boundary management within the ‘nonintimate’ spheres.[43] These practices can shape our sense of freedom, undermining an individual’s ‘necessity of relief’ from outside scrutiny.[44] Further, Brincker even questions ‘if it is reasonable to have an expectation of privacy at all’ in an information society.[45]

While the performative function of fashion identity allows an individual to flexibly form aspects regarding his or her management of appearance within social contexts, a normative approach to privacy requires some sort of ‘observable activity’.[46] Indeed, fashion identity can flesh out some key considerations on how a new dimension of privacy discourse could include the dialectic tendencies of appearance management. I will focus on the ‘reasonable expectation of privacy’ notion in Article 8(1) of the ECHR to elaborate on this argument.[47]

3. Article 8(1) of the ECHR: Individual Perception and Privacy

Can an individual’s reasonable expectation of privacy extend to ‘newly extended audience[s]’, such as the new opportunities of communication and perception within algorithmic landscapes?[48] Indeed, Brownsword underlines that one has to look at ‘prevailing custom and practice’ to judge whether an individual’s expectation of privacy is reasonable.[49] That is, we often assume that prevailing expectations change whenever there is a conflict between individual interests and the pursuit of common values pertaining to society as a whole.[50] Expectations are formed based on the social interactions that can raise or lower the benchmark regarding an individual’s reasonable expectation of privacy.[51]

In Bloomberg LP v ZXC, the Supreme Court of the United Kingdom examined the scope of an individual’s privacy in the context of the misuse of private information, considering the relevance of the ‘reasonable expectation of privacy’ test in Article 8 of the ECHR.[52] In particular, the court carefully investigated whether the claimant, who had not been charged with an offence, had a ‘reasonable expectation of privacy’ regarding published information relating to a criminal investigation.[53] The court held that, as a ‘legitimate starting point’, an individual under criminal investigation has a reasonable expectation of privacy regarding ‘certain categories of information, such as the information in this case’.[54] The court’s reasoning is closely aligned with the ECtHR’s construction of harm, which can be ‘irremediable and profound’, such as harm caused by information that damages an individual’s reputation, as well as multiple aspects of the person’s physical and social identity.[55]

Nevertheless, I argue that a problem with the ‘reasonable expectation of privacy’ test concerning Article 8(1) is that it focuses on the objectively identified norms establishing a privacy interest. In Benedik v Slovenia, the court, analysing whether the applicant, as an internet user, had a reasonable expectation that his otherwise public online activity would remain anonymous, focused on the dynamic IP address, which could not be traced to a specific individual without the internet service provider’s verification upon specific request.[56] The court analysed the applicant’s degree of online anonymity in light of the measures taken to obtain identifiable information, focusing on access to content data rather than on the insights that access to personal information can generate into an individual’s perception of privacy.[57] Accordingly, the ‘reasonable expectation of privacy’ test does not contain any substantive guidance on the perception of privacy, but rather a structural account of existing objectively determined privacy interests.

This approach is problematic because it undermines the value of privacy as the individual’s control of self-presentation. The ECtHR treats the right to privacy as an inanimate object that conveys information on the social norms and ‘barriers’ defining individual perception. A claimant will obtain a finding of a reasonable expectation of privacy once a practice becomes a widespread intrusive measure, such as the systematic collection of personal data or the indiscriminate monitoring of individual actions.[58] Thus, the ‘reasonable expectation of privacy’ is a factual test assessing a privacy interest through the circumstances of the case rather than the claimant’s (subjective) experience of the process of appearance management. Indeed, in Benedik v Slovenia the question was not whether the applicant had a ‘reasonable expectation of privacy’ when surfing the web, but whether they had an interest in privacy protection based on the dynamic IP address.[59] This approach underscores the importance of ‘fashion narratives’ in a normative account of social attitudes: it is a test that asks what information or measure is considered private, leaving out the individual’s ability to control the desired access regarding the contours of appearance management.

There is something fundamentally wrong with assuming that privacy is the access to data pertaining to unspecified relationships in the online sphere, or, indeed, the context through which social relationships are evaluated.[60] We tend to focus on the social processes shaping individual expectations rather than the internal processes permeating an individual’s sense of privacy and autonomy. However, algorithmic processes not only negate the possibility of managing our appearance but also affect the way our perceptions are formed.

In other words, defining the value of privacy in the big data age makes clear why ‘fashion narratives’ matter to the scope of the ‘reasonable expectation of privacy’ test. In particular, what is known as ‘big data analytics’—the indiscriminate collection and unprecedented analysis of individual behaviour—has become a system of statistical observation regulating and governing human behaviour.[61] However, it is not only the aggregation of data sets that leads to the erosion of an individual’s privacy in the public sphere,[62] but also the extent to which we can maintain our perception of the self within networked environments, which fundamentally alters common norms on privacy expectations.

The expansion of the use of algorithmic systems, from targeted advertising to predictive policing, does not simply necessitate measures used to address the old privacy problem, such as an objective standard capturing a structural account of an invasion into one’s privacy sphere. It also requires a classification of anticipated harm, given that almost any individual action leaves digital traces.[63] It is not only the CCTV camera installed on a public street or the voice-user interface in our living room that underlines the chilling effects of the loss of solitude and intimacy. It is also the algorithms, such as advanced computer vision methods using AI or natural language processing techniques, that diminish an individual’s autonomy in establishing the parameters of self-presentation. Our communicative structures, the ability to disclose and withhold aspects of our identity, are shaped by the ‘mere belief that one is being observed’.[64] These considerations indicate that our knowledge and beliefs are shaped by the algorithms defining an external constraint on the right to privacy.

Following these considerations, the ‘reasonable expectation of privacy’ test needs to emphasise individual perception, considering one’s associations in the management of appearance, which inform the barriers to privacy. Hughes has partly addressed this point, suggesting that the ‘reasonable expectation of privacy’ test needs to consider the individual’s knowledge when determining whether the applicant’s expectation was reasonable.[65] However, the notion of fashion identity in appearance management can also illuminate the external constraints that invasions of privacy can impose on the sense of self. ‘Fashion identity’ can clarify an individual’s subjective sense of privacy based on the analysis of fashion narratives to interpret individual behaviour. Accordingly, an analysis of fashion narratives can define the barriers regarding a privacy interest, balancing an individual’s negotiation of the social selves of fashion identity, as shown in Section V.

IV. Defining Self-Relationality

Having examined the parameters for establishing interferences with privacy regarding the ‘reasonable expectation of privacy’ test, the next task is to examine the conditions for identity-building in light of the right to privacy. The second element of Agre’s definition of the right to privacy is the conditions for identity construction, which establish the relational character of privacy, as well as the individual’s practice and exercise of that freedom within the self and a social context.[66] Nevertheless, we need an understanding of privacy that sheds new light on the meaning of ‘autonomy’ and ‘authenticity’, and that considers the conscious and unconscious associations defining a person’s interrelationship with ‘fashion’. An individual’s association with beliefs, attitudes and emotions can be defined as a form of relationality established by the individual regarding the meaning of ‘fashion’ (i.e., an individual’s self-relationality).

1. Privacy and the Conditions for Identity-Building

An important aspect regarding the conditions for identity-building is the view that privacy, in terms of dynamic boundary negotiations, is an inseparable aspect of individual autonomy.[67] It allows an individual to freely frame their appearance and to explore further potential senses of self that have not yet been examined.[68]

Accordingly, viewing privacy as a process of boundary control enhances an individual’s ability to exercise the conditions for privacy embodied in social contexts—a common interpretation that is relevant to the notion of ‘identity’ and ‘self’ with regard to profiling technologies.[69] Profiling technologies, relying on the analysis and matching of data to classify and infer individual behaviour for decision-making, influence an individual’s sense of self, an ‘evolving presence’ that is shaped and defined by algorithms.[70] Accordingly, authors tend to emphasise the role of privacy as a tool for an individual’s self-realisation, which is rooted in the protection of personal autonomy and authenticity in social interactions.[71]

For example, Hildebrandt refers to Ricoeur’s distinction between idem-identity and ipse-identity when investigating the effect of profiling technologies on individual privacy and autonomy.[72] To clarify, idem-identity illustrates the process of recognition and sameness of identity that is shaped by the feedback of others, whereas ipse-identity is the process of classification and selfhood, the establishment of an individual’s sense of self as an embodied experience.[73] Following this analysis, profiling technologies influence the experiences of idem-identity and ipse-identity in that the inferences generated by the identification of idem-identity affect the sense of ipse-identity.[74] As argued by Hildebrandt, ‘profiling may indeed lead to me being presented with certain pre-chosen aspects of that world in the form of a limited range of options’.[75] Defining the extent to which evolving information technologies, as well as profiling technologies, have an impact on the external management of identity formation is always a question about how these technologies influence the conditions through which the perception of self is formed, such as agency and choice.[76] Accordingly, in terms of identity formation, privacy maintains a protective framework for self-realisation.

However, our notion of ‘fashion’ adds to the facets of privacy and autonomy based on the role of the social selves and intimate self of identity. In this respect, the distinction between the social selves and intimate self is relevant when discussing the enablers of identity-building regarding an individual’s privacy. ‘Fashion identity’ notes that an individual’s inference of knowledge of self develops with social interaction, connecting the right to privacy with autonomy.[77] Nevertheless, the intimate self further adds to our understanding concerning the conditions of identity-building in that self-knowledge can illustrate an associative process detached from social life based on the individual’s formation of beliefs and attitudes. When we discuss privacy, we tend to assume that our autonomy is shaped within social interactions detached from an individual’s own associative process of fashion identity.

We see this disjointed vision of privacy as a ‘boundary exchange’, rather than boundary creation, in the ECtHR’s interpretation of Article 8(1). An analysis of Article 8(1) in terms of personal development and personal data protection reveals a structural account of privacy, suggesting that an individual’s autonomy is socially embedded within the parameters that define the external constraints on identity formation. In turn, this conception of privacy has a significant effect on the way we understand the influence of profiling technologies on the sense of self.

2. Article 8(1) of the ECHR and Autonomy

The ECtHR has continuously taken an expansive approach to interpreting privacy, without outlining an exhaustive list of its meanings.[78] Two notable areas underline an extension of the right to privacy based on the notion of personal autonomy, which includes aspects of an individual’s personal development as well as developments in the area of data protection.[79]

Focusing on the notion of personal autonomy, Article 8(1) of the ECHR provides for the protection of multiple aspects of an individual’s personality, such as physical or social identity, physical and psychological integrity, as well as a person’s ability to develop relationships.[80] In this respect, the ECtHR accepts that privacy, as well as the notion of personal autonomy, covers sexual orientation,[81] gender identification,[82] the right to discover one’s origins,[83] religious and philosophical convictions,[84] the right to a name in identity documents,[85] the right to personal choice including desired appearance,[86] as well as the right to an ethnic or a group identity.[87]

This shows that privacy focuses on the conditions for expressing and exploring aspects of identity, making evident its resemblance to ‘fashion identity’. The right to privacy, like ‘fashion identity’, views the sense of self as an embodied experience, which includes the way an individual’s management of appearance is shaped by the perceptions of others. For example, the ECtHR stipulated that an individual’s choice of appearance, such as a haircut or a beard when attending university, relates to their personality within the scope of Article 8 of the ECHR.[88] As a result, the right to privacy affords the individual control over the parameters of appearance management and perception of (fashion) identity, which includes the communication of values, as well as the inference of knowledge of self.

Another rationale of Article 8 of the ECHR and personal autonomy is an individual’s informational self-determination in the area of data protection. In Satakunnan Markkinapörssi Oy and Satamedia Oy v Finland, the court explicitly recognised the right of informational self-determination, which allows individuals to ‘rely on their right to privacy as regards data which, albeit neutral, are collected, processed and disseminated collectively and in such a form or manner that their Article 8 rights may be engaged’.[89] The notion of personal autonomy ensures the positive obligations of the state regarding the protection of an individual’s privacy, which may entail appropriate rules ensuring that there is an independent supervisory body in secret surveillance cases, specific safeguards regarding sensitive data, as well as guidance that personal data should not be used in ways that go beyond what is normally foreseeable.[90] In this respect, privacy reflects the predominance of external factors in ‘fashion identity’, which strengthen or challenge an individual’s control in defining the contours of appearance management, such as the reading and assessment of personal attributes.

Nevertheless, if we take the view that privacy is a mere product of symbolic interactionism, then we must accept that perception is a matter of reading the other, which is disembodied from the unconscious elements of self. Article 8 of the ECHR provides a structural account of the right to privacy, which is based on the notion of personal autonomy strengthening an individual’s control of the expression and exploration of identity. This view of privacy does not protect the conditions of personal autonomy, which are affected by profiling technologies. For example, Zarsky, focusing on the impact of profiling technologies on the sense of self, argues that the interaction with these technologies causes an ‘autonomy trap’, whereby conscious decisions are affected by the information asymmetries inherent in algorithmic systems.[91] In addition, I would argue that profiling technologies shape both an individual’s external and internal worlds, including bodily experience and unconscious forms of thought. Therefore, we need to move beyond an understanding of privacy as a means of control to preserve a person’s individuality.

3. Self-Relationality and Privacy

Several authors highlight how profiling technologies have an impact on the conditions for exercising the right to privacy, influencing an individual’s agency, constraining their choice to a range of options, and affecting their ability to develop a sense of identity.[92] In this respect, Ricoeur’s theoretical outlook on selfhood in ipse-identity is helpful for seeing an individual’s identity as a process of identification with certain values, beliefs and aspirations.[93] Indeed, our expectations are shaped by those who profile us, and it is correct to assume that algorithms create knowledge that could be used to shape individual preferences beyond the awareness of the person being ‘profiled’.[94] Following this reasoning, it is argued that the right to privacy, which seeks to establish an individual’s control and maintain aspects of their identity in a social context, is implicated by the ‘pre-emptions’ made by algorithms that create a ‘new normativity’ regarding how individuals establish their sense of self in relation to others.[95]

Let us elaborate on why this understanding of privacy does not fully encompass the constitutive elements of an individual’s identity. Fashion identity illuminates that an important aspect of individual perception is an individual’s association with fashion narratives with reference to the self. That is, an individual’s hair colour, geographical location or race are all attributes that only gain meaning if there is an established relationality for self-evaluation.[96] Much of our privacy discourse focuses on what aspects of identity are replicated within social processes, rather than how conclusions about my identity disturb my own identity discourse with reference to the self.

However, the view that individual perception is shaped by the assessment of algorithms does not capture the potential of those algorithms to create a ‘new’ reality of self-relation. Profiling technologies actualise new forms concerning the relevance of identity in the big data sphere, such as creating links and patterns in data between an individual’s browsing behaviour and their current mood, or a person’s unique physical specifications and clothing style.[97] Therefore, profiling technologies have an impact on an individual’s sense of self (i.e., authenticity and selfhood) based on the translation of their appearance into hidden meanings, rather than the assumptions generated by the algorithms about individuals.

More concretely then, profiling technologies not only disturb a person’s relationship to their own values, beliefs and desires, but also create the basis through which they make their associations and transform a ‘thing’ into reality, such as the value of a ‘dress’ as an expression of their femininity or the meaning of a ‘suit’ as an assertion of their social status. In this sense, the individual’s process of association between the self and the ‘perceived self’, such as the identification of a targeted advert for a ‘feminine dress’ with my body image or my current mood, is not something that is fully exhausted by the algorithmic manipulation of inferred desires, but an oversimplification of the individual’s unconscious purpose.

There is a shortcoming in the right to privacy, which does not clarify the value of protecting the individual’s unconscious associations that define perception as a semblance of individual qualities. The ‘self’ as embodied within the social context is a conjunction of associations within the individual mind. The right to privacy currently suggests that an individual’s management and perception of appearance is shaped by virtue of external stimuli, which are the realities and demands of a social context. In other words, privacy as a regulator between the self and external stimuli focuses on the conscious acts of representation for identity construction. Further, the relationship between the self and the sociocultural environment is framed as an act of pure human automatism because the value of human behaviour is a pure reproduction of a social act and feedback from others. It does not elaborate on behaviour as a sequence of steps, which implies both the conscious reasoning self that establishes social values and the source from which impressions and feelings originate. The current relational understanding of privacy suggests that pre-reflective choices are predetermined by those conscious associations that make up a belief system about the self and the environment. However, any relationship and association regarding an individual’s management and perception of appearance contain a certain degree of independence that is not simply exhausted by the readings of ‘others’. It is the individual who constantly gives the notion of appearance and perception a renewed meaning.

Once the process of exploration of self becomes a task of statistical observation and classification of individual behaviour, we have a concept of human agency and choice that deliberately ignores a person’s underlying motivations and self-evaluation. It follows that an individual’s notion of self-representation and personal identification derives from statistical correlations and shared group characteristics.[98] In other words, we live in a world where there is an artificial information structure against which freedom is assessed. Following these considerations, the current interpretation of privacy suggests that dynamic boundaries are negotiated within the inherent constraints of identity formation. If we accept that the right to privacy only deals with the tangible repercussions for the self and objectively determined values regarding the parameters of interferences in the notion of identity-building, then we fail to understand the process through which notions of personal autonomy are negotiated in light of technological developments. Once privacy is equated with control regarding the space for maintaining personal autonomy, individuality is negated.

Accordingly, we need to focus on an individual’s self-relationality within privacy discourse. In this respect, we need an understanding of privacy that does not focus on its social conception regarding personal autonomy and authenticity, but which incorporates a person’s individuality. Our perspective should be premised on why certain values contribute to an individual’s identity construction, instead of what contributes to intersubjective friction. An individual may possess the qualities of being ‘hard-working’ and ‘focused’ at work, which is evidenced in their work ethic as well as their casual clothing. However, that same individual may be perceived as ‘outgoing’ and ‘fun’ in their circle of close friends and, thus, different qualities dominate, reflected in their communication skills and modern clothing. Profiling technologies, by contrast, do not focus on the individual’s appearance management and perception, but rather on how attributes such as ‘work ethic’ and ‘modern clothing’ shape the data about the individual or individuals sharing similar characteristics. Accordingly, we need an understanding of privacy that recognises the extent to which an individual’s perception shapes the interpretation of his or her attributes, such as the relationality of the ‘painting brush’ to the art student or the ‘black suit’ to the barrister. I call this process of identity-building an individual’s self-relationality.

V. Privacy Considering Fashion Identity

The final part of this discussion establishes a basis for future discourse on privacy regarding the impact of technology on individual behaviour. Fashion identity is a valuable tool for clarifying the nature of the right to privacy, which is a construct that holds together our separate selves in the face of objective and subjective constraints on identity formation. Using this definition, we can expand the facets of the right to privacy to incorporate the notion of individual perception and self-relationality into a legal landscape.

1. Privacy and Fashion Identity

Figure 1 illustrates the overlap between privacy and fashion identity and how we can broaden the perspective of the former.


Figure 1. On privacy considering fashion identity

I identified that there is a connection between fashion identity and privacy based on the communication of values, in that privacy overlaps with the expression of the social selves, focusing on an individual’s appearance management. The right to privacy, just like fashion identity, addresses the context where personal values are formed, in that we consider the evaluative judgements, including external stimuli, regarding an individual’s appearance management. However, we need to expand on the notion of privacy, identity and autonomy as a means to shape the performative role of fashion, including an individual’s ability to manage his or her own behaviour.

In this respect, we need to elaborate on how fashion identity can clarify an individual’s privacy, autonomy and identity as an embodied experience. Privacy recognises the continuity of negotiated relationships, supporting the idea that fashion identity builds on the opportunities and constraints given within social codes. Nevertheless, fashion identity further clarifies that social processes illustrate an individual’s dialectic tendency to weigh up between fashion narratives and personal values. Accordingly, fashion identity can contribute to our understanding of how values are formed, based on the notion of individual perception.

Finally, while privacy rightly recognises that the self is constructed based on the feedback of others, it is fashion identity that offers an elaborated view of the individual’s inference of knowledge of self. Fashion identity, as a form of social behaviour, suggests that human perception is both rational and emotional. Perception is formed based on the interaction between the social selves and the intimate self, which concern the formation of values as well as attitudes. That said, the construction of fashion identity in relation to the environment illustrates the tension between conformity and individuality, as well as the goals that shape an individual’s unconscious aspirations in defining his or her identity. Thus, there seems to be a gap in how privacy relates to the intimate self of fashion identity. With the intimate self, identity construction builds on further levels of thought that are based on the formation of values, emotions and attitudes rather than pure symbolic interactionism. We attach meaning to our intimate self as autonomous subjects based on our self-relationality.

2. Defining Individual Perception and Self-Relationality in Privacy

The following definition intends to illustrate guidance to clarify the scope of the right to privacy:

Privacy is a construct that holds together our separate selves in the face of objective and subjective constraints on identity formation.

How should privacy safeguard the separate selves of identity? It is important to highlight that the separate selves are a perspective that illustrates the detachment of perception and self-relationality influenced by profiling technologies. The notion of privacy should extend to the multiplicity of identities to include external and internal worlds and broaden the outlook on external stimuli that have an impact on individual perception and self-relationality.

The concept of privacy posited by this paper allows us to incorporate ‘fashion identity’ with regard to the objective constraints on privacy in terms of identity construction. The notion of ‘fashion identity’ in appearance management can contribute to the external constraints that invasions of privacy can impose on the sense of self, being relevant to the clarification of the ‘reasonable expectation of privacy’ with regard to Article 8(1) of the ECHR. In doing so, ‘fashion identity’ can sustain the social dialogue between dominant values and personal expression of style to define the expectations of privacy in the digital age. The dialectic tendency in the social selves of ‘fashion identity’ could illustrate the parameters for addressing social constraints on a fundamental level, rather than limiting the interpretation of values to a particular form of control. In other words, ‘fashion identity’ may offer the means to analyse the way new communication patterns on the expression of identity arise, such as the emergence of new algorithmic infrastructures that shape and predict aspects of an individual’s personality.

Moreover, ‘fashion identity’ could expand the notion of privacy and autonomy that relates to certain characteristics, acknowledging the generation of attitudes for the inference of knowledge about the self. This premise regarding the internal constraints on identity-building should shift the focus from (physical) self-representation to an individual’s experience in specific relations. Our understanding of the right to privacy should not simply respond to the demands of the social context but translate the notion of individual perception and self-relationality into its invisible usefulness, such as the relevance of personal desires within the process of management and perception of appearance.

Thus, we can agree that interference with the right to privacy applies not only where algorithms constrain the management of appearance, but also where individual perception is used to attribute relationality. While it could be argued that inaccurate profiling is trivial in that an individual can still maintain agency and choice in choosing the ‘right’ clothing, the engagement with profiling technologies undermines an individual’s self-relationality through filtered content. As a result, the interplay between the intimate self and other aspects of ‘fashion identity’ can contribute to the scope of privacy to maintain one’s self-relationality, as it investigates the extent to which fashion narratives relate to an individual’s perception (i.e., attitudes towards gender and the role of femininity/masculinity in appearance management).

VI. Conclusion

This paper intended to fulfil two important functions. First, it focused on the understanding of the right to privacy in terms of identity construction, offering a fresh outlook on the right to privacy that draws on the meaning of identity in fashion studies. Second, it defined two key values that shape the meaning of the right to privacy in the big data age: an individual’s perception and the self-relationality of fashion identity. The first notion focuses on an individual’s ambivalence between the social and personal aspects of fashion, whereas the latter is the individual’s process of making sense of fashion.

Fashion identity is not a mere conceptual abstraction of the self and the environment but uncovers the essence of a ‘profiled identity’. Fashion identity signifies more than controlling impressions and self-representation in an environment. It is not only about the visual stimuli that awaken such impressions but also the conditions an individual imposes on self-perception and appearance management. In this respect, the study of ‘fashion identity’ includes the various nuances of how norms have an impact on managing behaviour in the material self; how fashion narratives in the social selves can illustrate personal preferences, as well as a tendency towards specific social norms; and how personal preferences and attitudes in the intimate self of fashion identity illustrate the generation of knowledge about the self. Conversely, privacy does not concern the unconscious motivations of the self, but rather the conditions in which an individual interacts with the environment. The notion of individual perception and self-relationality is not an end but a starting point, which may inform our expectations about the value of autonomy and identity in the age of big data.

Bibliography

Aaker, Jennifer L. “The Malleable Self: The Role of Self-Expression in Persuasion.” Journal of Marketing Research 36, no 1 (1999): 45–57. https://doi.org/10.1177/002224379903600104

Adam, Hajo and Adam G. Galinsky. “Enclothed Cognition.” Journal of Experimental Social Psychology 48, no 4 (2012): 918–925. https://doi.org/10.1016/j.jesp.2012.02.008

Agre, Philip E. “Introduction.” In Technology and Privacy: The New Landscape, edited by Philip E. Agre and Marc Rotenberg, 1–28. Cambridge: MIT Press, 1997.

Altman, Irwin. The Environment and Social Behaviour: Privacy, Personal Space, Territory, Crowding. Monterey: Brooks/Cole, 1975.

Attorney-General’s Department. Privacy Act Review: Discussion Paper (Attorney-General’s Department, October 2021).

Australian Law Reform Commission. Serious Invasions of Privacy in the Digital Era, ALRC Report 123 (Australian Law Reform Commission, 2014).

Bennett, Colin J. “In Defence of Privacy: The Concept and the Regime.” Surveillance & Society 8, no 4 (2011): 485–516. https://doi.org/10.24908/ss.v8i4.4184

Blumer, Herbert. Symbolic Interactionism: Perspective and Method. Englewood Cliffs: Prentice-Hall, 1969.

Brincker, Maria. “Privacy in Public and the Contextual Conditions of Agency.” In Privacy in Public Space: Conceptual and Regulatory Challenges, edited by Tjerk Timan, Bryce C. Newell and Bert-Jaap Koops, 64–90. Cheltenham: Edward Elgar, 2017.

Brownsword, Roger. “Friends, Romans, Countrymen: Is There a Universal Right to Identity?” Law, Innovation and Technology 1, no 2 (2009): 223–249. https://doi.org/10.1080/17579961.2009.11428372

———. Law, Technology and Society: Reimagining the Regulatory Environment. Abingdon and New York: Routledge, 2019.

Bygrave, Lee A. “Automated Profiling: Minding the Machine: Article 15 of the EC Data Protection Directive and Automated Profiling.” Computer Law & Security Report 17, no 1 (2001): 17–24. https://doi.org/10.1016/S0267-3649(01)00104-2

Calo, M. Ryan. “The Boundaries of Privacy Harm.” Indiana Law Journal 86, no 3 (2011): 1131–1162.

Corner, Frances. Why Fashion Matters. London: Thames & Hudson, 2014.

De Armond, Elizabeth. “Tactful Inattention: Erving Goffman, Privacy in the Digital Age, and the Virtue of Averting One’s Eyes.” St. John’s Law Review 92, no 2 (2018): 283–325.

De Vries, Katja. “Identity, Profiling Algorithms and a World of Ambient Intelligence.” Ethics and Information Technology 12, no 1 (2010): 71–85. https://doi.org/10.1007/s10676-009-9215-9

Degli Esposti, Sara. “When Big Data Meets Dataveillance: The Hidden Side of Analytics.” Surveillance & Society 12, no 2 (2014): 209–225. https://doi.org/10.24908/ss.v12i2.5113

Doyle, Tony and Judy Veranas. “Public Anonymity and the Connected World.” Ethics and Information Technology 16, no 3 (2014): 207–218. https://doi.org/10.1007/s10676-014-9346-5

Durham, Deborah. “The Lady in the Logo: Tribal Dress and Western Culture in a Southern African Community.” In Dress and Ethnicity: Change Across Space and Time, edited by Joanne B Eicher, 183–194. Oxford: Berg, 1995.

Eco, Umberto. “Lumbar Thought.” In Fashion Theory: A Reader, edited by Malcolm Barnard, 315–317. London: Routledge, 2007.

Entwistle, Joanne. “Introduction.” In The Handbook of Fashion Studies, edited by Sandy Black, Amy de la Haye, Joanne Entwistle, Agnès Rocamora, Regina A. Root and Helen Thomas, 97–102. London and New York: Bloomsbury, 2013.

———. The Fashioned Body: Fashion, Dress, and Modern Social Theory. 2nd ed. Malden: Polity Press, 2015.

Finkelstein, Joanne. The Fashioned Self. Cambridge: Polity Press, 1991.

Gavison, Ruth. “Privacy and the Limits of Law.” Yale Law Journal 89, no 3 (1980): 421–471. https://doi.org/10.2307/795891

Goffman, Erving. The Presentation of Self in Everyday Life. London: Penguin, 1990.

Grafanaki, Sofia. “Autonomy Challenges in the Age of Big Data.” Fordham Intellectual Property, Media & Entertainment Law Journal 27, no 4 (2017): 803–868.

Hildebrandt, Mireille. “Profiling and the Identity of the European Citizen.” In Profiling the European Citizen: Cross-Disciplinary Perspectives, edited by Mireille Hildebrandt and Serge Gutwirth, 303–343. Dordrecht: Springer, 2008.

———. Smart Technologies and the End(s) of Law: Novel Entanglements of Law and Technology. Cheltenham: Edward Elgar, 2015.

Hildebrandt, Mireille, Serge Gutwirth and Paul De Hert. D7.4: Implications of Profiling Practices on Democracy and the Rule of Law (Future of Identity in the Information Society, 5 September 2005).

Hughes, Kristy. “A Behavioural Understanding of Privacy and Its Implications for Privacy Law.” Modern Law Review 75, no 5 (2012): 806–836. https://doi.org/10.1111/j.1468-2230.2012.00925.x

Kaiser, Susan B. The Social Psychology of Clothing: Symbolic Appearances in Context. New York: Macmillan, 1990.

Kinchen, Rosie. “What to Wear in the Lockdown, by the World’s First Fashion Psychologist.” The Times, 29 March 2020. https://www.thetimes.co.uk/article/what-to-wear-in-the-lockdown-by-the-worlds-first-fashion-psychologist-sh733nklg

Koops, Bert-Jaap. “Privacy Spaces.” West Virginia Law Review 121, no 2 (2018): 611–666.

Laceulle, Hanne. Aging and Self-Realization: Cultural Narratives about Later Life. Bielefeld: Transcript Verlag, 2018.

Levy, K. E. C. “Relational Big Data.” Stanford Law Review Online 66 (2013): 73–79.

Lurie, Alison. The Language of Clothes. London: William Heinemann, 1981.

Lynch, Annette and Mitchell D. Strauss. Changing Fashion: A Critical Introduction to Trend Analysis and Meaning. Oxford: Berg, 2010.

Margulis, Stephen T. “On the Status and Contribution of Westin’s and Altman’s Theories of Privacy.” Journal of Social Issues 59, no 2 (2003): 411–429. https://doi.org/10.1111/1540-4560.00071

Markus, Hazel and Paula Nurius. “Possible Selves.” American Psychologist 41, no 9 (1986): 954–969. https://doi.org/10.1037/0003-066X.41.9.954

Matzner, Tobias and Carsten Ochs. “Privacy.” Internet Policy Review 8, no 4 (2019): 1–14. https://doi.org/10.14763/2019.4.1427

Nissenbaum, Helen. “Privacy as Contextual Integrity.” Washington Law Review 79, no 1 (2004): 119–158.

———. “Toward an Approach to Privacy in Public: Challenges of Information Technology.” Ethics & Behaviour 7, no 3 (1997): 207–219. https://doi.org/10.1207/s15327019eb0703_3

Parent, William A. “Recent Work on the Concept of Privacy.” American Philosophical Quarterly 20, no 4 (1983): 341–355.

Purshouse, Joe. “The Reasonable Expectation of Privacy and the Criminal Suspect.” Modern Law Review 79, no 5 (2016): 871–884. https://doi.org/10.1111/1468-2230.12218

Regan, Priscilla M. Legislating Privacy: Technology, Social Values and Public Policy. Chapel Hill: University of North Carolina Press, 1995.

Reidenberg, Joel R. “Privacy in Public.” University of Miami Law Review 69, no 1 (2014): 141–160.

Roach, Mary Ellen and Joanne Bubolz Eicher. “Introduction to the Study of Dress, Adornment, and the Social Order.” In Dress, Adornment, and the Social Order, edited by Mary Ellen Roach and Joanne Bubolz Eicher, 1–4. New York: Wiley, 1965.

Roessler, Beate. “Privacy and/in the Public Sphere.” Yearbook for Eastern and Western Philosophy 2016, no 1 (2016): 243–256. https://doi.org/10.1515/yewph-2016-0021

Roosendaal, Arnold. Digital Personae and Profiles in Law: Protecting Individuals’ Rights in Online Contexts. Oisterwijk: Wolf Legal Publishers, 2013.

Rouvroy, Antoinette. “Privacy, Data Protection, and the Unprecedented Challenges of Ambient Intelligence.” Studies in Ethics, Law and Technology 2, no 1 (2008): 1–52. https://doi.org/10.2202/1941-6008.1001

Schneier, Bruce. Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World. New York: W. W. Norton, 2015.

Seubert, Sandra and Carlos Becker. “Verdächtige Alltäglichkeit: Sozialkritische Reflexionen zum Begriff des Privaten.” Figurationen 19, no 1 (2018): 105–120. https://doi.org/10.7788/figurationen-2018-190111

Shoemaker, David W. “Self-Exposure and Exposure of the Self: Informational Privacy and the Presentation of Identity.” Ethics and Information Technology 12, no 1 (2010): 3–15. https://doi.org/10.1007/s10676-009-9186-x

Simmel, Georg. On Individuality and Social Forms. Chicago and London: University of Chicago Press, 1971.

Solove, Daniel J. “Conceptualizing Privacy.” California Law Review 90, no 4 (2002): 1087–1155. https://doi.org/10.2307/3481326

Steeves, Valerie. “Reclaiming the Social Value of Privacy.” In Lessons from the Identity Trail: Anonymity, Privacy and Identity in a Networked Society, edited by Ian Kerr, Valerie Steeves and Carole Lucock, 191–212. New York: Oxford University Press, 2009.

Susser, Daniel, Beate Roessler and Helen Nissenbaum. “Online Manipulation: Hidden Influences in a Digital World.” Georgetown Law Technology Review 4 (2019): 1–45.

Thomas, Helen. “Introduction.” In The Handbook of Fashion Studies, edited by Sandy Black, Amy de la Haye, Joanne Entwistle, Agnès Rocamora, Regina A. Root and Helen Thomas, 17–22. London and New York: Bloomsbury, 2013.

Tseëlon, Efrat. “Erving Goffman: Social Science as an Art of Cultural Observation.” In Thinking Through Fashion, edited by Agnès Rocamora and Anneke Smelik, 149–164. London: I. B. Tauris, 2016.

Van der Sloot, Bart. “Privacy as Human Flourishing: Could a Shift Towards Virtue Ethics Strengthen Privacy Protection in the Age of Big Data?” Journal of Intellectual Property, Information Technology and Electronic Commerce Law 5, no 3 (2014): 230–244.

Veblen, Thorstein. The Theory of Leisure Class. London: Penguin, 1994.

Wachter, Sandra and Brent Mittelstadt. “A Right to Reasonable Inferences: Re-Thinking Data Protection Law in the Age of Big Data and AI.” Columbia Business Law Review 2019, no 2 (2019): 494–620. https://doi.org/10.7916/cblr.v2019i2.3424

Warren, Samuel D. and Louis D. Brandeis. “The Right to Privacy.” Harvard Law Review 4, no 5 (1890): 193–220. https://doi.org/10.2307/1321160

Wells, Samuel R. New Physiognomy, or, Signs of Character. New York: Fowler and Wells, 1867.

Westin, Alan F. Privacy and Freedom. London: The Bodley Head, 1967.

Yeung, Karen. “‘Hypernudge’: Big Data as a Mode of Regulation by Design.” Information, Communication & Society 20, no 1 (2017): 118–136. https://doi.org/10.1080/1369118X.2016.1186713

Zarsky, Tal Z. “‘Mine Your Own Business!’: Making the Case for the Implications of the Data Mining of Personal Information in the Forum of Public Opinion.” Yale Journal of Law and Technology 5, no 1 (2003): 1–55.

Zuiderveen Borgesius, Frederik J. “Improving Privacy Protection in the Area of Behavioural Targeting.” PhD diss., University of Amsterdam, 2014.

Primary Materials

A.P., Garçon and Nicot v France App Nos 79885/12, 52471/13 and 52596/13 (ECHR, 6 April 2017).

Aurel Popa v Romania App No 4233/09 (ECHR, 18 June 2013).

Bărbulescu v Romania App No 61496/08 (ECHR, 5 September 2017).

Beizaras and Levickas v Lithuania App No 41288/15 (ECHR, 14 May 2020).

Benedik v Slovenia App No 62357/14 (ECHR, 24 July 2018).

Bloomberg LP v ZXC [2022] UKSC 5; [2022] 2 WLR 424.

Burghartz v Switzerland [1994] ECHR 2; (1994) 18 EHRR 101.

Convention for the Protection of Human Rights and Fundamental Freedoms, opened for signature 4 November 1950, 213 UNTS 221 (entered into force 3 September 1953).

Folgerø and Others v Norway App No 15472/02 (ECHR, 29 June 2007).

Gaskin v United Kingdom [1989] ECHR 13; (1990) 12 EHRR 36.

Mikulic v Croatia (2002) 2 WLUK 216.

Niemietz v Germany [1992] ECHR 80; (1993) 16 EHRR 97.

Peck v United Kingdom [2003] ECHR 44647/98; (2003) 36 EHRR 41.

P. G. and J. H. v United Kingdom (2008) 46 EHRR 51.

Pretty v United Kingdom [2002] ECHR 427; (2002) 35 EHRR 1.

Satakunnan Markkinapörssi Oy and Satamedia Oy v Finland (2018) 66 EHRR 8.

Sousa Goucha v Portugal App No 70434/12 (ECHR, 22 March 2016).

Tasev v North Macedonia App No 9825/13 (ECHR, 16 August 2019).

X and Y v The Netherlands [1985] ECHR 8978/80; (1986) 8 EHRR 235.


[1] Solove, “Conceptualizing Privacy,” 1088.

[2] Entwistle, “Introduction,” 97; Thomas, “Introduction,” 17.

[3] Agre, “Introduction,” 7. Other scholars discussing Agre’s concept of privacy include Zuiderveen Borgesius, “Behavioural Targeting,” 92–95; Hildebrandt, Smart Technologies, 80.

[4] Hildebrandt, Smart Technologies, 80.

[5] Convention for the Protection of Human Rights and Fundamental Freedoms, opened for signature 4 November 1950, 213 UNTS 221 (entered into force 3 September 1953), art. 8.

[6] See P. G. and J. H. v United Kingdom (2008) 46 EHRR 51, para. 57.

[7] See Satakunnan Markkinapörssi Oy and Satamedia Oy v Finland (2018) 66 EHRR 8; Bărbulescu v Romania App No 61496/08 (ECHR, 5 September 2017).

[8] Zarsky, “Mine Your Own Business,” 35.

[9] Corner, Why Fashion Matters, 7; Entwistle, “Introduction,” 97; Lurie, The Language of Clothes, 5.

[10] Lynch, Changing Fashion, 13.

[11] Roach, “Adornment and the Social Order,” 1.

[12] Entwistle, The Fashioned Body, 16; Finkelstein, The Fashioned Self, 128; Lurie, The Language of Clothes, 5; Goffman, Presentation of Self, 32–40; Tseëlon, “Erving Goffman,” 154.

[13] Entwistle, The Fashioned Body, 16.

[14] Entwistle, The Fashioned Body, 7; Simmel, On Individuality and Social Forms, 131.

[15] See, for example, Durham, “The Lady in the Logo,” 183–194.

[16] Aaker, “The Malleable Self,” 46; Veblen, The Theory of the Leisure Class, 35–67.

[17] Eco, “Lumbar Thought,” 316.

[18] Kaiser, The Social Psychology of Clothing, 8.

[19] Kaiser, The Social Psychology of Clothing, 7.

[20] Kaiser, The Social Psychology of Clothing, 271–272, 288; Blumer, Symbolic Interactionism, 79.

[21] Kaiser, The Social Psychology of Clothing, 288.

[22] Adam, “Enclothed Cognition,” 918.

[23] Kinchen, “Lockdown,” para. 2.

[24] Agre, “Introduction,” 7.

[25] Shoemaker, “Self-Exposure,” 3–4.

[26] See also Steeves, “Social Value of Privacy,” 192.

[27] Warren, “The Right to Privacy,” 193; Parent, “Concept of Privacy,” 342; Gavison, “Limits of Law,” 428.

[28] Altman, The Environment and Social Behaviour, 18; Roessler, The Value of Privacy, 73.

[29] Altman, The Environment and Social Behaviour, 24.

[30] Margulis, “Westin and Altman,” 418–421.

[31] Hughes, “Behavioural Understanding,” 806.

[32] Altman, The Environment and Social Behaviour, 32–42.

[33] Altman, The Environment and Social Behaviour, 27.

[34] Hughes, “Behavioural Understanding,” 812.

[35] Hughes, “Behavioural Understanding,” 813.

[36] Doyle, “Public Anonymity,” 207.

[37] Roessler, “Public Sphere,” 243.

[38] Nissenbaum, “Privacy as Contextual Integrity,” 119.

[39] Westin, Privacy and Freedom, 31–34, 38.

[40] Convention for the Protection of Human Rights and Fundamental Freedoms, opened for signature 4 November 1950, 213 UNTS 221 (entered into force 3 September 1953), art. 8.

[41] Brincker, “Contextual Conditions of Agency,” 64.

[42] Koops, “Privacy Spaces,” 619.

[43] Nissenbaum, “Privacy in Public,” 208; Reidenberg, “Privacy in Public,” 146.

[44] Koops, “Privacy Spaces,” 620; Westin, Privacy and Freedom, 35.

[45] Brincker, “Contextual Conditions of Agency,” 69.

[46] Reidenberg, “Privacy in Public,” 141.

[47] On the significance of the reasonable expectation of privacy test in the ECtHR’s case law, see Purshouse, “Reasonable Expectation,” 871.

[48] Matzner, “Privacy,” 5.

[49] Brownsword, Law, Technology and Society, 327.

[50] The right to privacy is argued to be a ‘public good’ as well as a ‘collective interest’, protecting an individual’s self-realisation; Regan, Legislating Privacy, 213; see also Bennett, “In Defence of Privacy,” 487.

[51] Brownsword, Law, Technology and Society, 327.

[52] Bloomberg LP v ZXC [2022] UKSC 5; [2022] 2 WLR 424, 443.

[53] Bloomberg LP v ZXC [2022] UKSC 5; [2022] 2 WLR 424, 429.

[54] Bloomberg LP v ZXC [2022] UKSC 5; [2022] 2 WLR 424, 437, 443, 462; see also the Australian Attorney-General’s Department discussion paper, in which one proposal moves towards a definition of privacy in tort, relying on the flexibility of the reasonable expectation of privacy test; Attorney-General’s Department, Privacy Act Review: Discussion Paper, 26.1; Australian Law Reform Commission, Privacy in the Digital Era, 91.

[55] Bloomberg LP v ZXC [2022] UKSC 5; [2022] 2 WLR 424, 443.

[56] Benedik v Slovenia App No 62357/14 (ECHR, 24 July 2018), paras 101, 105–117.

[57] See also Benedik v Slovenia App No 62357/14 (ECHR, 24 July 2018) (Concurring Opinion of Judge Yudkivska, joined by Judge Bošnjak).

[58] P. G. and J. H. v United Kingdom (2008) 46 EHRR 51, para. 54.

[59] Bărbulescu v Romania App No 61496/08 (ECHR, 5 September 2017), paras 121–122.

[60] Seubert, “Verdächtige Alltäglichkeit,” 106.

[61] Degli Esposti, “When Big Data Meets Dataveillance,” 209; Schneier, Data and Goliath, 45, 109–110.

[62] Koops, “Privacy Spaces,” 651.

[63] Calo, “The Boundaries of Privacy Harm,” 1131, 1145.

[64] Calo, “The Boundaries of Privacy Harm,” 1146.

[65] Hughes, “Behavioural Understanding,” 816, 824.

[66] Agre, “Introduction,” 7.

[67] Goffman, Presentation of Self, 166, 203; De Armond, “Tactful Inattention,” 286.

[68] On the exploration of potential aspects of self, see Markus, “Possible Selves,” 954.

[69] Zuiderveen Borgesius, “Behavioural Targeting,” 92.

[70] Rouvroy, “Ambient Intelligence,” 36–37.

[71] Hildebrandt, Smart Technologies, 81; Roosendaal, Digital Personae, 25.

[72] Hildebrandt, D7.4, 70–71.

[73] Hildebrandt, “European Citizen,” 314.

[74] De Vries, “Ambient Intelligence,” 79.

[75] Hildebrandt, D7.4, 42.

[76] In this respect, see Susser, “Online Manipulation,” 38.

[77] See also Wells, New Physiognomy, 6.

[78] Pretty v United Kingdom [2002] ECHR 427; (2002) 35 EHRR 1, para. 61; Niemietz v Germany [1992] ECHR 80; (1993) 16 EHRR 97, para. 29.

[79] See also Van der Sloot, “Privacy as Human Flourishing,” 230.

[80] Mikulic v Croatia (2002) 2 WLUK 216, para. 53; X and Y v The Netherlands [1985] ECHR 8978/80; (1986) 8 EHRR 235, para. 22.

[81] Sousa Goucha v Portugal App No 70434/12 (ECHR, 22 March 2016), para. 27; Beizaras and Levickas v Lithuania App No 41288/15 (ECHR, 14 May 2020), para. 109.

[82] A.P., Garçon and Nicot v France App Nos 79885/12, 52471/13 and 52596/13 (ECHR, 6 April 2017), paras 95–96.

[83] Gaskin v United Kingdom [1989] ECHR 13; (1990) 12 EHRR 36, paras 39, 49.

[84] Folgerø and Others v Norway App No 15472/02 (ECHR, 29 June 2007), para. 98.

[85] Burghartz v Switzerland [1994] ECHR 2; (1994) 18 EHRR 101, para. 24.

[86] See, for example, Aurel Popa v Romania App No 4233/09 (ECHR, 18 June 2013), paras 30–32.

[87] Tasev v North Macedonia App No 9825/13 (ECHR, 16 August 2019), paras 32–33.

[88] Aurel Popa v Romania App No 4233/09 (ECHR, 18 June 2013), para. 32.

[89] Satakunnan Markkinapörssi Oy and Satamedia Oy v Finland (2018) 66 EHRR 8, para. 137.

[90] This is known as the ‘purpose limitation’ principle; Peck v United Kingdom [2003] ECHR 44647/98; (2003) 36 EHRR 41; P. G. and J. H. v United Kingdom (2008) 46 EHRR 51.

[91] Zarsky, “Mine Your Own Business,” 35.

[92] Zuiderveen Borgesius, “Behavioural Targeting,” 92; Grafanaki, “Autonomy Challenges,” 810–813; Yeung, “Hypernudge,” 118.

[93] Laceulle, Aging and Self-Realization, 155.

[94] Hildebrandt, D7.4, 38; Brownsword, “Friends, Romans, Countrymen,” 224.

[95] Hildebrandt, D7.4, 38; Levy, “Relational Big Data,” 77.

[96] Kaiser, The Social Psychology of Clothing, 289–290.

[97] Wachter, “Reasonable Inferences,” 507–508.

[98] Bygrave, “Automated Profiling,” 17.

