
Privacy v. Posner – In the Age of Big Data Analytics

“Google knows you more than your mom” – a phrase that lingers long after it is first heard. The rise of the internet in the past decade has had implications that were unfathomable when it dawned upon us. The sheer dependence on, and the vesting of the human mind in, an electronic device is a scary fascination. The idea that ‘my mind’ no longer resides within the grey cells is bewildering.

In 2017, in K. S. Puttaswamy v. Union of India,1 a nine-judge bench of the Honourable Supreme Court unanimously upheld the Right to Privacy as a Fundamental Right. The judgment highlighted privacy intrusion by non-state actors and the need for state intervention through comprehensive legislation, which has become the need of the hour; the Data Protection Bill and the Srikrishna Committee are consequences of the same.

To draw an analogy, privacy is akin to a zorbing ball. A Fundamental Right to Privacy ensures a sphere of unrestrained activity: activity within socio-political boundaries that seeks no interference. This zorbing ball, however, has perforations, which impose restrictions and ensure that certain activities cannot remain unrestrained. In the age of Big Data Analytics, human activity takes place within a transparent ball; digital footprints are perpetually visible in their entirety, to state and non-state actors alike. This research paper primarily concerns itself with non-state actors, who hold in their possession enormous data that is difficult to bring under the umbrella of regulation. While state actors are similarly placed, it is relatively easier to carve exceptions to privacy and attribute liability to them in case of breach of duty.

The absence of sound data protection laws and the lack of informational privacy have called into question the immense trust that people of the internet place in non-state actors. This paradigm is contrasted against the state, which is expected to act in furtherance of the interests of its citizens and yet is questioned (rightfully) about its activities in doing the same.

Big Data, Big Problems

Today’s infinitely complex and labyrinthine data ecosystem is beyond the comprehension of most ordinary users.2 Habituated as we are to the frequent decisions that Artificial Intelligence makes on an individual’s behalf, it is not surprising that those decisions are not too different from the ones an individual would make for himself.3 YouTube suggestions, Netflix match percentages and traffic alerts are minuscule examples of the manner in which big data plays a significant role in most decisions we take.

With the availability of big data, predictive analytics uses algorithms to recognize data patterns and predict future outcomes. Predictive analytics encompasses data mining, predictive modeling, machine learning, and forecasting.4

Commercial Interests v. Individual Interests – In a Trade-off

Changing Paradigm

The Srikrishna Committee report broadly describes three approaches to data protection, each a reflection of the relationship that a state has with its citizens.

The US follows a laissez-faire approach: it does not have an overarching data protection framework, proceeding on the view that liberty is freedom from state control. The EU, on the other hand, is at the vanguard of global data protection norms, having recently enacted the GDPR, a comprehensive framework that covers all kinds of processing of personal data while illustrating the rights and obligations that protect the privacy of Europeans in all its facets.

The EU framework is founded on the need for the State to act as a facilitator in upholding the individual’s dignity, and therefore has stringent laws in place. China, on the other hand, has articulated its own approach, oriented towards averting national security risks, with strict controls on cross-border sharing of personal data. The State frames its law placing collective interests over individual interests, and State privileges over the individual.

None of these paradigms represents the citizen-state relationship that exists in India, which rests on two planks. First, the state is a facilitator of human progress and is directed by the DPSP to serve the common good. Second, the state is prone to excess and is therefore checked by a vertical and horizontal separation of powers, as well as by Fundamental Rights that can be enforced against it.

It is often perceived that economic growth and data protection are antithetical to each other. The Srikrishna Committee Report, however, wishes to reconcile these two parallel lines that are presumed never to meet, by making the individual the “data principal” and the institution in whose confidence such data resides the “data fiduciary”.

Is Consent Opaque?

Understanding consent forms is a herculean task, for they are written for lawyers, by lawyers, in the tiniest of print and with incomprehensible clauses.

The problem with consent is twofold. First, consent forms are manufactured and designed to ensure that the non-state actor receives consent. Second, users do not understand the implications of the consent that they grant with utmost convenience.

Why is consent necessary?

The Srikrishna Committee illustrated two advantages. First, it respects user autonomy. Second, it provides a clear basis for the entity to whom consent is given to disclaim liability regarding matters to which such consent pertains.5 There is, however, a vast difference between what consent ought to be and what it currently is.

The privacy paradox can be described as the phenomenon where an individual expresses strong privacy concerns but behaves in a way that contradicts those concerns. This flows from the economic concept that rational individuals are willing to give up information about themselves when they see benefits arising out of such a transaction.6 If privacy were viewed as a commodity or a product, consent is a trade-off between privacy and efficiency. Customized ads and predictive word suggestions on G-Board are apt examples of this contention.

If the above view is taken to its conclusion, the existing consent forms and nudging should be acceptable. Yet if privacy is viewed as a product, “consent” can suffer from two defects: a manufacturer’s liability, arising from the incomprehensibility of consent forms, and a designer’s liability, arising from the practice of ‘nudging the user’.

Nudges are considered to follow the soft or libertarian paternalism approach, where the user is not forbidden any option but is merely given a push that alters behavior in a predictable way.7 The acceptance or rejection of cookie policies on websites is a common example; such prompts can make it intuitive for users to change settings and secure their data.

Nudging can be a privacy-enhancing tool or a privacy-compromising tool. Visual designs often render the “Accept” button for terms and conditions in bold blue, positively influencing the user to accept, while “Reject” is usually in muted colors to which the eye diverts less attention.

Manufacturers do not exhibit accountability when the use of nudges is not directed at the well-being of its users. Visual notices must fulfill their primary purpose of meaningful communication.8

When the product is free, you are the product that is being sold

This is the aphorism that accompanies the economic concept that “there is no such thing as a free lunch”. Google collects information to provide better services to all its users: from figuring out basic things like which language the user speaks, to more complex things like which ads are the most useful, or which people matter most to the user online.

Google collects information about activity undertaken while using its services. This activity information may include:

  1. Terms searched for,

  2. Videos watched,

  3. Views and interactions with content and ads,

  4. Voice and audio information when audio features are used,

  5. Purchase activity,

  6. People with whom users communicate or share content,

  7. Activity on third-party sites and apps that use their services,

  8. Chrome browsing history users have synced with their own Google Account.

Google uses various technologies to collect and store information, including cookies, pixel tags, local storage, such as browser web storage or application data caches, databases, and server logs.9

This illustrates that digital footprints are left behind like a trail, one that enables the “data fiduciary” to serve what its users want best, at the cost of privacy.

Privacy v. Posner

Having understood privacy through the lens of its supporters, it is important not to overlook Richard Posner, who offers one of its sharpest critiques. Posner, in The Economics of Justice (published in 1981), argued that privacy is protected in ways that are economically inefficient.10

Posner makes two distinct claims about privacy while defending the National Security Agency (“NSA”) and its profiling activities. First, he contends that machines cannot by themselves invade privacy; only other humans can. Implicit in this claim is a concession: had the NSA employed human beings to undertake the exact same activity that is currently undertaken by machines, there would be an invasion of privacy, for the collection and processing of private communications by an intelligence officer would involve the element of human sentiment.

Hence the importance of Posner’s second claim, which is that the NSA’s vacuuming up of personal data through electronic means can safeguard privacy by reducing the amount of human review.11

If this argument is extended to Big Data Analytics, which is similar in nature to the task undertaken by the NSA, there essentially cannot be a breach of privacy. This argument undoubtedly cannot be sustained in its entirety, but nor can it be disregarded.


Understanding the nature of privacy in the context of a digital world is all the more necessary in the twenty-first century. Having begun this article with the comparison of privacy to a zorbing ball, it must be reiterated that the comparison is profound, given the nature of the digital footprints that are inevitably left behind. These digital footprints can be assembled to reconstruct the individuals we are, and one cannot push back at this juncture.

Scott McNealy, the chairman of Sun Microsystems, said in 1999: “You already have zero privacy anyway, get over it.” Similar sentiments were expressed by Mark Zuckerberg, who stated that privacy is no longer the “social norm”.12 At this juncture of trading privacy for efficiency, one cannot undo the comfort acquired from the better services that the ‘Internet of Things’ has to offer. It is trite that the law has to adapt and amend itself to new technologies. It would be impossible to look at privacy merely through the eyes of Richard Posner.

With Big Data Analytics, all of human activity is perpetually visible, and yet there is no interference with the pursuit of that activity. Within this ball, is one still left alone? And if not, the question remains: to be left alone from what?

The human species is inevitably moving forward with greater dependence on technology. This cannot change, and privacy at every step will only be a compromise. The zorbing ball will have to succumb to multiple perforations, and the exceptions to privacy will loom large enough to consume the Right itself.

It is important for the law to keep a check and for the state to act in furtherance of the common good, as illustrated by the goals in the Srikrishna Committee report, through which India has taken a major step forward and is carving a niche in the Big Data world.


  1. (2017) 10 S.C.C. 1.

  2. Amber Sinha and Scott Mason, A Critique of Consent in Information Privacy.

  3. Navin J. Anthony, Leak of Faith, The Week, April 8, 2018.

  4. Rohan George, Predictive Policing: What It Is, How It Works and Its Legal Implications, available at <>, last visited 14 Sept 2018.

  5. Srikrishna Committee Report, page 34.

  6. Paul A. Pavlou, State of the Information Privacy Literature: Where Are We Now and Where Should We Go?, MIS Quarterly, Vol. 35, No. 4 (December 2011), pp. 977-988.

  7. Privacy Nudges for Social Media: An Exploratory Facebook Study, available at <>.

  8. Saumyaa Nayudu, Use of Visuals and Nudges in Privacy Notices, available at <>, last visited 15 Sept 2018.

  9. Google’s Privacy Policy.

  10. K. S. Puttaswamy v. Union of India, (2017) 10 S.C.C. 1, para 140.

  11. David E. Pozen, Privacy-Privacy Tradeoffs, The University of Chicago Law Review, Vol. 83, No. 1 (Winter 2016), pp. 221-247, at page 240.

  12. Robert Gellman and Pam Dixon, Online Privacy, page 15.

By Buddhi Nishita Gauri, III Year Student, Christ.

