
User:Ceidlitz1510/Biggest lie on the internet

From Wikipedia, the free encyclopedia

"The Biggest Lie on the Internet" is a colloquialism used to describe the tendency of Internet users to accept terms and conditions in order to use a website without actually reading them.[1] It can be seen as an extension of the “TL;DR” abbreviation since most terms of service agreements (TOS) are long and contain extraneous information. Websites such as Terms of Service; Didn't Read, (presentation featured in the video) exist to help users decipher the lengthy and complex Terms of Service that are featured on popular websites such as Google and GitHub.

Definition


"The biggest lie on the internet" is a term that refers to the tendency of Internet users to click on a confirmation button that states "I have read, understood, and agreed to the Terms and Conditions" to access a social network site, software, or service without actually having read the terms and conditions of that site.

The titular lie of this phrase is "I have read, understood, and agree to abide by the Terms and Conditions of this website".[2] The "size of the lie" this phrase refers to is the number of people who tell "the lie" rather than the number of people who are told or believe "the lie".

TL;DR


For the independent British online newspaper, see TL;DR News.

Originating on a Usenet newsgroup for video games in 2002,[3] tl;dr had by 2014 become standard internet jargon for "Too Long; Didn't Read" (or, more colloquially, "Too Lazy; Didn't Read") in response to long posts.[4] The tag usually signals a concern that the time invested in reading the post would not be worth the utility gained from it.[5] On social media sites such as Reddit, community members use the tag on long, detailed posts while including a summary at the bottom.[6]

The phrase came to be applied to the terms of service of many major websites, since these were often long and full of small print.[7]

"tl;dr philosophy" and big data


During the first decade of the 21st century, as the world's capacity to store information digitally overtook its capacity to store information in analog form,[8] two paradigms emerged in tandem. The first was 'big data', in which the sheer size and diversity of data available in nearly all fields of research (including commercial ones) is leveraged to achieve statistical analyses beyond what any individual datum could support. This paradigm was expounded upon in the following decade since, as danah boyd and Kate Crawford pointed out in 2011,[9] the amount of data did not really matter until predictive analytics, user behavior analytics, and other advanced data analytics models were available to analyze it.

The second was the "tl;dr philosophy" (the idea that anything long is not worth the time to read, because someone else will summarize it if it is interesting), which became pervasive enough for "tl;dr" to enter Internet slang between 2002 and 2011. As "the biggest lie" grew in size, the consent it supplied gave the owners of social media websites a financial impetus to develop analytics capable of monetizing the data their users provided over the same period. The personal data collected through online consent during this time became the largest data set in recorded history, handily dwarfing, for instance, the files compiled by the Stasi and currently stored by the Stasi Records Agency.

The "tl;dr philosophy" of the 2000's that gave rise to "the biggest lie on the internet" is directly responsible for the quantities of personal data provided to big data firms for analytics.

History


Prior to Hyppönen's Definition


Prior to Hyppönen's use of the phrase in reference to his social experiment, "thebiggestlie.com" was a website launched in 2010 with the intention of "exposing the biggest lie on the internet: 'yes, I have read and agree to the terms and conditions'".[10][11] Technology bloggers and journalists such as Klint Finley used the phrase in articles published in TechCrunch about the issues surrounding the terms of service of major websites.[12]

This was also a period in which terms and conditions became increasingly long and complex, as software companies and social media sites cultivated the growth of "the biggest lie" in order to grow their user bases and data sets.

Hyppönen's Definition


In 2014, Mikko Hyppönen was cited as saying, "The biggest lie on the internet is 'I have read and agree to the terms and conditions'".[2] He used his company, F-Secure, to prove the point, setting up a free WiFi hotspot in central London whose terms of use included a Herod clause (requiring individuals to agree to give up their first-born child in exchange for use of the WiFi).[13] Six people accepted the terms via clickwrap.[14]

Academic Definition


Privacy and law scholars such as Fred Cate have been writing about "The Failure of Fair Information Practice Principles", with reference to how large companies handle data privacy issues via complex terms and conditions, since 2006.[15] The phrase was defined academically by privacy scholars Jonathan Obar and Anne Oeldorf-Hirsch in their 2018 paper as a way to describe how individuals interact with the privacy policies of social networking sites.[16] Because of the clickwrap on the login pages of most social media sites, Obar and Oeldorf-Hirsch argue, the "notice" portion of the notice and choice privacy framework[17] is often ignored by users when creating or logging into social networking accounts.

The Notice and Choice Framework

The "notice and choice" framework that allowed for the deduction of the academic definition comes from the first two sections of the International Safe Harbor Privacy Principles.[18]. This framework was updated to the EU-US Privacy Shield in 2016, but academics argue that the "notice and choice" framework has roots far older than the international safe harbour laws of 2000.[17]

Paul M. Schwartz and Daniel J. Solove wrote a 2009 paper for the Berkeley Media Studies Group outlining the importance of the notice and choice framework.[19] There, the authors traced the framework to the Fair Information Practices developed in a 1973 report by the United States Department of Health, Education, and Welfare.[19] Those practices formed the core of what later became the FTC fair information practice principles. The authors noted problems with the framework such as a "lack of enforcement", a "lack of substantive restrictions", and a "lack of real notice". On the "lack of real notice", they observed that "Privacy policies are long, cumbersome, and hard to read. Moreover, most people do not read privacy policies", and that policies are "often vague and unclear because they are drafted with the companies' self-interest in mind."[19] In practice, the terms of service of a website will often state that merely viewing a webpage constitutes agreement to its privacy policies, so as to ensure ease of access for users. This leads to the final problem identified in Schwartz and Solove's piece, the "lack of real choice":[19] by choosing to access a website, the user automatically enters into an agreement with the entity offering the website as a service.

Ann Cavoukian, Information and Privacy Commissioner of Ontario

The "notice and choice legal framework" was noted by Ontario Information and Privacy Commissioner Ann Cavoukian in her 2011 report on data privacy as an antiquated model from the early 1980s still functioning as the primary method for privacy decisions made by the United States Chamber of Commerce at the time despite the recognition that the "legalese" used in terms and conditions are typically "unintelligible" and "overwhelming to the average consumer".[20]

In 2015, the New York University School of Law's study on the "effectiveness of the notice and choice framework" revealed poor notice standards,[21] specifically citing the issue that

Policies are long, complex, ambiguous, incomplete, and often silent on important subject matter. Moreover, the substance of the rights that are explicitly reserved are sometimes fairly concerning. Policies that claim compliance with various regulatory benchmarks inevitably omit or contradict features of those benchmarks.

— Florencia Marotta-Wurgler, Does "Notice and Choice" Disclosure Regulation Work? An Empirical Study of Privacy Policies, NYU Law (April 2015)

The mismatches between users' understanding of a privacy policy and what the organization that produced it actually means represent the failure of the "notice" portion of the "notice and choice" framework.[22] Multiple attempts have been made to "push past" notice and choice towards a legal framework with a more contemporary understanding of online privacy and big data ethics.[23]

Social Importance


The development of this phrase reflects Internet users' prioritization of efficiency over privacy, attitudes believed to have been accelerated by the introduction of the clickwrap feature to social networking sites.

The size of this lie is related to the "politics of resignation" that sprang from the "tl;dr philosophy". Corporations were able to use this philosophy as strategy: by increasing privileges for those who provided more personal data, social media corporations capitalized on the "politics of resignation" people adopted when posting on social media after telling "the biggest lie". This is a typical capitalist strategy that relies on public neoliberal sentiment.[24] The size of this "lie" therefore represents the size of neoliberal sentiment on the Internet.[25]

Neoliberal Sentiment and Big Data Ethics


Main article: Big data ethics

Some of the principles of big data ethics are drawn into question when individuals choose to give personal information to large corporations in the name of convenience. Key questions regarding data ownership become essential to understanding new jobs in the information sector such as data broker, data steward, and data custodian, given that all of these roles handle copious amounts of individuals' personal data. Furthermore, many professionals working in "big data" analysis are highly paid by marketing firms for data mining operations that reveal more about their client base (see Examples of data mining).

Clickwrap agreements define consent using a non-written express consent model. This is often confused with informed consent, which is one reason the "notice and choice" privacy framework is ill-prepared to deal with big data ethics.

In the data-ethics model of consent, the individual must be informed (and remain informed) about every piece of data they have provided to a "big data entity" and should be continuously updated on any new uses. While an individual should be informed of each new use, they can give continuous consent regarding a topic (e.g. "marketing information"), with the understanding that consent can be revoked at any time (including by withdrawing from the service).
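The per-topic, revocable consent model described above can be sketched as a small ledger. The class, method, and topic names below are illustrative assumptions for this sketch, not any real consent API:

```python
from datetime import datetime, timezone

class ConsentLedger:
    """Toy model of continuous, per-topic consent that can be revoked at any time."""

    def __init__(self):
        # topic -> timestamp of the active grant, or None once revoked
        self._grants = {}

    def grant(self, topic: str) -> None:
        """Record continuous consent for one topic (e.g. 'marketing')."""
        self._grants[topic] = datetime.now(timezone.utc)

    def revoke(self, topic: str) -> None:
        """Withdraw consent; further uses of the topic are no longer permitted."""
        self._grants[topic] = None

    def is_permitted(self, topic: str) -> bool:
        """A use is permitted only while an unrevoked grant exists for its topic."""
        return self._grants.get(topic) is not None

ledger = ConsentLedger()
ledger.grant("marketing")
assert ledger.is_permitted("marketing")      # consent is active
ledger.revoke("marketing")
assert not ledger.is_permitted("marketing")  # revocation takes immediate effect
```

The key property the sketch captures is that permission is checked at use time rather than fixed at sign-up, unlike a one-shot clickwrap.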

It is not possible to retract past consent for data transactions. This means that any information a user provides to a webpage (subject to the webpage's terms of service) belongs to the entity which owns that webpage. Hence, the "biggest lie on the internet" represents an aspect of social media's "tl;dr philosophy" that led to the existence of big data.

The Corporate Cultivation of Digital Resignation


In 2019, privacy scholars Nora A. Draper and Joseph Turow wrote an academic paper about "the condition produced when people desire to control the information digital entities have about them but feel unable to do so",[26] calling this concept digital resignation. In their article, the authors make efforts to show a link between feelings of digital resignation and the user-surveillance mechanisms of large corporations. They point out that these mechanisms are created and used by large corporations to better market their products, while promising their patrons more conveniences for the personal data they supply. This tradeoff is the same as is seen on social media websites.

As a key method of user-surveillance, having long and intricate terms of service to both cover potential lawsuits and encourage people to tell "the biggest lie" is a routine practice in corporate governance that cultivates digital resignation. Since people need to agree to terms and conditions in order to use a service, it is likely that they will feel the futility of their choice, and therefore develop feelings of digital resignation.

The corporate cultivation of digital resignation is in the best financial interest of "big data entities" since it is this resignation that provides them with the large data sets upon which they can perform advanced analytics, which they can then convert to information usable for marketing purposes.

"The biggest lie" as Faustian bargain


See also: Faustian bargain

A Faustian bargain has an idiomatic definition as "a deal in which one focuses on present gain without considering the long-term consequences".

Agreeing to the terms and conditions of a service in order to access it, without reading those terms or considering their long-term consequences, is essentially agreeing to a Faustian bargain.[27] Anyone who tells the "biggest lie" is therefore bound by a Faustian contract regarding any information they provide to the website.

Most Americans who use the Internet understand this concept, but choose to abide by the terms for the conveniences the service provides.[28] Multiple sources refer to this as "the faustian bargain of user data".[29][30][31]

Theory of "diminishing returns on privacy tradeoff"


See also: Internet Privacy

In 2009, the Progress & Freedom Foundation filed the following conclusion with the Federal Trade Commission regarding the use of information gained through big data collection and analytics:[32]

If misguided government regulation chokes off the Internet’s growth or evolution by starving content and service providers of much-needed advertising revenue, we would be killing the goose that laid the golden eggs. (...) The dangers of regulation to the health of the Internet are real, but the ease with which government could disrupt the economic motor of the Internet (advertising) is not widely understood—and therein lies the true danger in this debate.

— Privacy Trade-Offs: How Further Regulation Could Diminish Consumer Choice, Raise Prices, Quash Digital Innovation & Curtail Free Speech, The Progress & Freedom Foundation (2009)

The above statement argues that the use of personal data provided to social media (i.e. through a clickwrap agreement) is fair game for advertising until that use reaches a point of diminishing returns for advertisers and the markets that support them. In other words, personal information provided to social media is free for advertising and marketing use, because such use is an inherent risk of the freedom of speech and expression in a public forum. The statement also implicitly ties this freedom to the value it creates by invoking The Goose That Laid the Golden Eggs as metaphor.

This builds toward the understanding of a tradeoff between privacy and utility. While some researchers downplayed this tradeoff by pointing to the data anonymization processes used by those who collect big data,[33] they did not predict the rapid expansion of data re-identification techniques, which by 2011 had become a threat to public health records.[34]

De-identification practices consist of procedures in which direct identifiers are first removed, or transformed into "subgroups" (e.g. people from a certain geographic area), to create pseudonymous data; indirect identifiers are then removed or transformed to create de-identified data sets.[35] Fully anonymous data sets remove all identifiers; techniques such as differential privacy instead add statistical noise so that no individual record can be singled out, though analytics will likely continue to improve in ways that test even these protections.
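The two-step procedure described above can be illustrated in a short sketch. The field names and the generalisation rules (ZIP-prefix truncation, decade bucketing) are hypothetical examples for this sketch, not drawn from any particular standard:

```python
# Direct identifiers are dropped outright; indirect identifiers are
# coarsened into subgroups rather than removed entirely.
DIRECT_IDENTIFIERS = {"name", "email", "ssn"}

def pseudonymize(record: dict) -> dict:
    """Step 1: drop direct identifiers, leaving the record otherwise intact."""
    return {k: v for k, v in record.items() if k not in DIRECT_IDENTIFIERS}

def deidentify(record: dict) -> dict:
    """Step 2: also generalise indirect identifiers into coarse subgroups."""
    out = pseudonymize(record)
    if "zip_code" in out:
        out["zip_code"] = out["zip_code"][:3] + "**"        # keep only the area prefix
    if "birth_year" in out:
        out["birth_year"] = (out["birth_year"] // 10) * 10  # bucket into decades
    return out

record = {"name": "Alice", "email": "a@example.com",
          "zip_code": "90210", "birth_year": 1987, "diagnosis": "flu"}
print(deidentify(record))
# {'zip_code': '902**', 'birth_year': 1980, 'diagnosis': 'flu'}
```

Re-identification attacks of the kind surveyed in the health-data literature work precisely by cross-referencing the surviving coarse attributes (here, ZIP prefix and birth decade) with outside data sets.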

The demonstrable financial value of user data is theoretically limitless,[36] with corporate safeguards as a theoretical "last line of defence" for user privacy. This gives social media companies a financial impetus to participate in surveillance capitalism. The result of this participation is the invasion of user privacy and the acceleration of economic benefit, as discussed by Shoshana Zuboff in her 2019 book The Age of Surveillance Capitalism. Another example of the results of social media websites choosing to participate in surveillance capitalism is the Facebook–Cambridge Analytica data scandal. While this scandal and the "user surveillance/invasion issue" are often cited as signs of diminishing returns on privacy tradeoff, there is little financial loss associated with either of these tradeoffs.

ToS;DR


Main article: Terms of Service; Didn't Read

ToSDR Ratings

In response to the crisis created by "the biggest lie", Hugo Roy, Michael de Jong, and Jan-Christoph Borchardt created a community project that analyzes and grades the terms of service of major websites on a scale from A (highest) to E (lowest).[12] Metrics that inform these classifications include user copyright as it relates to uploaded content, site agreements with third parties, opportunities to curate user content, and the use of personal information. Site security, including whether or not the website saves uploaded content deleted by the user, is also an important metric for this classification. Similar services are offered by sites such as tldrlegal.com.[37]
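As an illustration only, a grading scheme of this shape can be modelled as weighted clause scores mapped onto letter cutoffs. The point weights and thresholds below are invented for this sketch and are not ToS;DR's actual methodology:

```python
def grade(points: int) -> str:
    """Map a summed clause score to a letter grade, A (best) to E (worst)."""
    thresholds = [(100, "A"), (50, "B"), (0, "C"), (-50, "D")]
    for cutoff, letter in thresholds:
        if points >= cutoff:
            return letter
    return "E"

# Each analysed clause contributes positive or negative points
# (clause wordings and weights are invented for this example).
clauses = {"users keep copyright on uploaded content": 40,
           "data shared with third parties": -60,
           "deleted content is really deleted": 30}
print(grade(sum(clauses.values())))  # sum = 10 -> "C"
```

The design point is that a single letter grade summarizes many individually weighted clauses, which is what makes such ratings legible to users who would never read the full terms.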

In Popular Culture

South Park


Main article: HumancentiPad

In 2011, the fifteenth-season premiere of South Park was premised on one of the characters (Kyle) facing a comeuppance for telling "the biggest lie" when agreeing to the iTunes Store terms of service. The overtly direct metaphor of a "big data entity feeding people what they already ate because of an agreement they didn't read" was seen as an apt analogy for the algorithm the iTunes Store used to suggest new music to its users.

Futurama


Main article: Overclockwise

A 2011 episode of Futurama featured a character being arrested for overclocking his robot, an act prohibited by the robot's terms of service agreement. The character had told "the biggest lie" upon first using his robot to bend a drinking straw when he was thirsty. In a telling exchange, the robot's owners (a pair of clones) state "I clicked without reading!" / "And I slightly modified a thing that I own!" / "We're monsters!", thus highlighting the ways in which users can unwittingly violate the terms of service for a device they own once they have told "the biggest lie".

Alternate Uses


The phrase has also been used in reference to net neutrality: Larry Downes argued in Forbes that "inflammatory rhetoric" gave way to exaggerations, which eventually became lies about the revisions the FCC made to net neutrality policy between 2010 and 2014.[38]

Mashable publishes a series known as "Lies the Internet Told Me", featuring thought pieces on prominent "Internet lies".[39]


Reference List

  1. ^ Natasha Lomas, Romain Dillet (August 21, 2015). "Terms And Conditions Are The Biggest Lie Of Our Industry". TechCrunch. Retrieved April 15, 2020.
  2. ^ a b Alex Hern (June 15, 2015). "I read all the small print on the internet and it made me want to die". The Guardian. Retrieved April 15, 2020.
  3. ^ Dictionary.com. "TL;DR". Acronyms Dictionary. Retrieved April 15, 2020.
  4. ^ Andrew Heinzman (August 2, 2016). "What Does "TLDR" Mean, and How Do You Use It?". How-To Geek. Retrieved April 15, 2020.
  5. ^ Tom Chatfield (2016). Netymology: From Apps to Zombies: A Linguistic Celebration of the Digital World. Quercus. p. 124. ISBN 978-1-62365-165-7.
  6. ^ Soonmme (2008-07-14). "UrbanDictionary, definition #7". UrbanDictionary.com. Retrieved 2014-08-18.
  7. ^ USA Today (TNS) (February 5, 2020). "What to know before clicking 'I agree" on terms of service agreement or privacy policy". The Jordan Times. Retrieved April 15, 2020.
  8. ^ Hilbert, Martin; López, Priscilla (April 1, 2011). "The World's Technological Capacity to Store, Communicate, and Compute Information". Science. 332 (6025): 60–65. Bibcode:2011Sci...332...60H. doi:10.1126/science.1200970. PMID 21310967. S2CID 206531385. Retrieved April 15, 2020.
  9. ^ boyd, danah; Crawford, Kate (21 September 2011). "Six Provocations for Big Data". Social Science Research Network: A Decade in Internet Time: Symposium on the Dynamics of the Internet and Society. doi:10.2139/ssrn.1926431. S2CID 148610111.
  10. ^ "CommonTerms: Towards better terms and conditions". commonterms.org. Retrieved April 15, 2020.
  11. ^ "biggestlie.com". Web Archive. March 31, 2018. Archived from the original on 2018-03-31. Retrieved April 15, 2020.
  12. ^ a b Klint Finley (August 13, 2012). "Putting An End To The Biggest Lie On The Internet". TechCrunch. Retrieved April 15, 2020.
  13. ^ Naina Bajekal (September 29, 2014). "Londoners Unwittingly Exchange First Born Children For Free WiFi". Time. Retrieved April 15, 2020.
  14. ^ Tom Fox-Brewster (September 29, 2014). "Londoners give up eldest children in public Wi-Fi security horror show". The Guardian. Retrieved April 15, 2020.
  15. ^ Cate, Fred (January 28, 2013) [2006]. "Chapter 13: The Failure of Fair Information Practice Principles" (PDF). In Winn, Jane K. (ed.). Consumer Protection in the Age of the 'Information Economy'. Ashgate Publishing Ltd. pp. 343–380. S2CID 154191331.
  16. ^ Obar, Jonathan; Oeldorf-Hirsch, Anne (July 3, 2018). "The biggest lie on the Internet: ignoring the privacy policies and terms of service of social networking services". Information, Communication & Society. 23 (1): 128–147. doi:10.1080/1369118X.2018.1486870. S2CID 214655004. Retrieved April 15, 2020.
  17. ^ a b Reidenberg, Joel R.; Russell, N. Cameron; Callen, Alexander; Qasir, Sophia; Norton, Thomas (July 26, 2015). "Privacy Harms and the Effectiveness of the Notice and Choice Framework" (PDF). SSRN I/S: A Journal of Law and Policy for the Information Society. Fordham Center on Law and Information Policy. Retrieved April 15, 2020.
  18. ^ European Court of Justice 2000/520/EC: Commission Decision of 26 July 2000 pursuant to Directive 95/46/EC of the European Parliament and of the Council on the adequacy of the protection provided by the safe harbour privacy principles and related frequently asked questions issued by the US Department of Commerce (notified under document number C(2000) 2441) (Text with EEA relevance.) 25 August 2000, retrieved 30 October 2015
  19. ^ a b c d e Schwartz, Paul M.; Solove, Daniel (June 30, 2009). "Notice and Choice: Implications for Digital Marketing to Youth" (PDF). The Second NPLAN/BMSG Meeting on Digital Media and Marketing to Children sponsored by The Robert Wood Johnson Foundation. Berkeley Media Studies Group. Retrieved April 15, 2020.
  20. ^ Ann Cavoukian (August 2011). "Privacy by Design in Law, Policy and Practice: A White Paper for Regulators, Decision-makers and Policy-makers" (PDF). privacybydesign.ca. Information and Privacy Commissioner, 2 Bloor Street E, Suite 1400, Toronto, Ontario. p. 8. Retrieved April 15, 2020.
  21. ^ Florencia Marotta-Wurgler (April 2015). "Does "Notice and Choice" Disclosure Regulation Work? An Empirical Study of Privacy Policies" (PDF). University of Michigan Law. New York University Law School. p. 30. Retrieved April 15, 2020.
  22. ^ Reidenberg, Joel R.; Breaux, Travis; Cranor, Lorrie Faith; French, Brian; Grannis, Amanda; Graves, James T.; Liu, Fei; McDonald, Aleecia; Norton, Thomas B.; Ramanath, Rohan (2015). "Disagreeable Privacy Policies: Mismatches Between Meaning and users' understanding". Berkeley Technology Law Journal. 30 (1). Berkeley Tech. L.J.: 39–88. Retrieved April 15, 2020.
  23. ^ Sloan, Robert; Warner, Richard (April 12, 2013). "Beyond Notice and Choice: Privacy, Norms, and Consent". Chicago-Kent College of Law Legal Studies Research Paper Series. 2013 (16). doi:10.2139/ssrn.2239099. Retrieved April 15, 2020.
  24. ^ Benson, Peter; Kirsch, Stuart (August 2010). "Capitalism and the Politics of Resignation". Current Anthropology. 51 (4). The University of Chicago Press: 459–486. doi:10.1086/653091. JSTOR 10.1086/653091. Retrieved April 15, 2020.
  25. ^ Nick Whitaker (November 12, 2019). "Constructing a Neoliberal Approach to Privacy". Exponents. Retrieved April 15, 2020.
  26. ^ Draper, Nora A; Turow, Joseph (March 8, 2019). "The Corporate Cultivation of Digital Resignation". New Media & Society. 21 (8): 1824–1839. doi:10.1177/1461444819833331. S2CID 150502017. Retrieved April 15, 2020.
  27. ^ Diomidis Spinellis (July 5, 2013). "The Faustian Bargain of Online Services". Harvard Business Review. Retrieved April 15, 2020.
  28. ^ Mark Sullivan (November 15, 2019). "Americans finally understand big tech's faustian bargain but that doesn't mean they'll quit". FastCompany.com. Retrieved April 15, 2020.
  29. ^ Kathleen Hall (May 16, 2012). "Public has struck 'Faustian bargain' in disclosure of data". ComputerWeekly.com. Retrieved April 15, 2020.
  30. ^ Carlos Solari (December 16, 2018). "The Faustian Bargain And Its Shameless Effect On Privacy". The Threat Report. Retrieved April 15, 2020.
  31. ^ Neef, Dale (December 2015). Digital Exhaust: What Everyone Should Know About Big Data, Digitization and Digitally Driven Data. Pearson Education LTD. pp. 40–42. ISBN 9780133838343. Retrieved April 15, 2020.
  32. ^ Szoka, Berin (December 7, 2009). "Privacy Trade-Offs: How Further Regulation Could Diminish Consumer Choice, Raise Prices, Quash Digital Innovation & Curtail Free Speech" (PDF). FTC.gov. The Progress & Freedom Foundation. Retrieved April 15, 2020.
  33. ^ Li, Tiancheng; Li, Ninghui (June 2009). "On the tradeoff between privacy and utility in data publishing". KDD '09: Proceedings of the 15th ACM SIGKDD International Conference on Knowledge Discovery and Data Mining: 518–526. doi:10.1145/1557019.1557079. ISBN 9781605584959. S2CID 4262408. Retrieved April 15, 2020.
  34. ^ El Emam, Khaled; Jonker, Elizabeth; Arbuckle, Luk; Malin, Bradley (December 2, 2011). "A Systematic Review of Re-Identification Attacks on Health Data". PLOS ONE. 6 (12): e28071. Bibcode:2011PLoSO...628071E. doi:10.1371/journal.pone.0028071. PMC 3229505. PMID 22164229.
  35. ^ "A Visual Guide to Data De-Identification" (PDF). Future of Privacy Forum. EY. April 2016. Retrieved April 15, 2020.
  36. ^ Davcik, Nebojsa S.; da Silva, Rui Vinhas; Hair, Joseph F. (March 2015). "Towards a Unified Theory of Brand Equity: Conceptualizations, Taxonomy and Avenues for Future Research". Journal of Product & Brand Management. 24 (1): 3–17. doi:10.1108/JPBM-06-2014-0639. hdl:10071/8511. Retrieved April 15, 2020.
  37. ^ Wong, Kevin (2017). "Software Licenses in Plain English". TL;DR Legal. Retrieved April 15, 2020.
  38. ^ Downes, Larry (July 17, 2014). "The Biggest Net Neutrality Lie of All". Forbes. Retrieved April 15, 2020.
  39. ^ Chris Taylor (June 28, 2019). "Lies the Internet Told Me". Mashable. Retrieved April 15, 2020.