The Black Box Society
The Black Box Society: The Secret Algorithms That Control Money and Information is a 2016 academic book authored by law professor Frank Pasquale that interrogates the use of opaque algorithms—referred to as black boxes—that increasingly control decision-making in the realms of search, finance, and reputation.[1]
Pasquale uses the term "black box" as a metaphor with dual meanings: a black box can describe both a recording device (such as a data-monitoring system) and a system whose inner workings are secret or unknown.[2] The 319-page academic book, published by Harvard University Press, contains six chapters. Chapter one introduces the challenge of investigating technologies whose inner workings are overwhelmingly complex and largely unknown.[3] Chapter two examines citizens’ digital reputation and the automated decision-making that can perpetuate systemic disadvantage for some while advantaging others.[4] Chapter three exposes the hidden mechanisms of profit-driven search engines through a series of disputes over bias and abuse of power in Silicon Valley.[5] Chapter four investigates the deeply problematic, unethical use of automated decision-making in the finance industry.[6] Chapter five examines the need to open black boxes,[7] while chapter six stresses the emergent threat that black boxes pose to democratic societies and capitalist economies, as well as the need for an informed, autonomous citizenry.[8]
The Black Box Society has been reviewed in several academic journals by experts in the field, who largely praise the book for its originality and timeliness, as well as its vital contributions to law, technology, and social science.[9][10][11][12][13] However, the book has received some critical feedback on its conception of transparency as a solution to black boxes—raising questions surrounding privacy protection and ethics—as well as its operationalization of the term "black box."[14][15][16][17]
Definition of "black box"
In academic discourse, the usage of the term “black box” dates back to at least 1963, with Mario Bunge's work on a general black box theory in the philosophy of science.[18]
The term “black box,” as used throughout The Black Box Society by author and law professor Frank Pasquale, is a dual metaphor for a recording device, such as a data-monitoring system, and for a system whose inner workings are secret or unknown.[19] According to Pasquale, in a black box society, individuals encounter these dual meanings every day: as people are pervasively tracked by private firms and government, they come to lack a meaningful understanding of where their data travels, how their data is used, and the potential consequences of data usage and disclosure by these organizations.[20]
Contents
Central thesis
Published by Harvard University Press in 2016, The Black Box Society has six chapters, totalling 319 pages. In his review of the book for Business Ethics Quarterly, law professor Alan Rubel identifies Pasquale’s central thesis: the algorithms which control and monitor individual reputation, information seeking, and data retrieval in the search, reputation, and finance sectors embody "some combination of too powerful, not subtle enough, and too obscure."[21]
Chapter one: The Need to Know
Chapter one of The Black Box Society introduces the epistemological problem that emerges from the increasing adoption of enigmatic technologies: “We cannot understand, or even investigate, a subject about which nothing is known.”[22] The author contends that American law has grown silent as questions of personal privacy increasingly arise, instead prioritizing corporate secrecy.[23] This incongruity of secrecy and complexity in the information economy is the focus of The Black Box Society.[24] Pasquale declares that a black box society “closely resembles a one-way mirror.”[25] In other words, processes of Big Data collection, usage, and disclosure by private and public organizations loom invisibly. Pasquale argues that this is especially true in the areas of reputation, search, and finance—increasingly expressed through algorithms—thereby compromising individual freedoms and market fairness.[26]
The author's discussion of the power of secrecy is informed by the work of legal scholar Jack M. Balkin, who argues that one of the most important forms of power entails the privilege to interrogate others without being interrogated oneself.[27] Pasquale repurposes this constitutional insight to argue that Internet companies resistant to regulation collect ever greater quantities of sensitive personal data on their users, with no way for those same users to exercise meaningful consent over their "digital dossiers."[28]
According to Pasquale, while companies assemble digital dossiers about the intimate lives of private individuals, those same individuals' choices are increasingly shaped from the outside, as Internet and finance businesses become decision-makers about how people can or should live their lives by way of credit scoring, predictive analytics, and recommendation engines.[29]
Chapter two: Digital Reputation in an Era of Runaway Data
In Digital Reputation in an Era of Runaway Data, chapter two of The Black Box Society, Pasquale declares that private data is a significant source of profit for others, often at the expense of the person to whom the data belongs.[30]
According to the author, data brokers use data mining to analyze private and public records in order to draw inferences about people, largely unrestrained by government laws; it is only leaks, investigations, and complaints that crack open the black boxes of reputation analysis.[31] The author labels the exchange of sensitive personal data between commercial and government organizations as a form of “dataveillance.”
Pasquale explains the pervasive process of digital scoring, wherein “many of our daily activities are processed as ‘signals’ for rewards or penalties, benefits or burdens” by algorithms.[32] Pasquale’s main concern here is that black-boxed techniques which originated in credit reporting are now being applied in other, relatively unregulated sectors, including medicine, law enforcement, marketing, and employment.[33] The author deems this “runaway data” not only “creepy” but also a source of substantial costs.[34]
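The scoring dynamic described here can be made concrete with a brief sketch. The following Python example is purely illustrative and does not come from the book: the signal names, weights, and score range are hypothetical stand-ins for the proprietary inputs that real credit-scoring and reputation systems keep hidden.

```python
# A toy "digital scoring" model of the kind Pasquale describes: everyday
# activities become weighted signals for rewards or penalties. All signal
# names and weights below are hypothetical; real scoring formulas are
# trade secrets and far more complex.

SIGNAL_WEIGHTS = {
    "on_time_payment": +25,      # rewarded signal
    "late_payment": -40,         # penalized signal
    "address_change": -5,        # ordinary activity silently treated as risk
    "payday_loan_inquiry": -30,  # penalized without the person's knowledge
}

def score(activities: list[str], base: int = 500) -> int:
    """Sum weighted signals onto a base score, clamped to a 300-850 range."""
    total = base + sum(SIGNAL_WEIGHTS.get(activity, 0) for activity in activities)
    return max(300, min(850, total))

# The scored individual never sees the weights that produce this number.
print(score(["on_time_payment", "address_change", "payday_loan_inquiry"]))  # 490
```

The opacity in such a system lies not in the arithmetic, which is trivial, but in the secrecy of the signal list and its weights—precisely what the book's "black box" metaphor targets.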
Pasquale highlights the emergent racial bias in automated decision-making, alongside other forms of identity-based prejudice and discrimination. These decisions are “subtle manipulations that are nearly impossible to discern for ordinary citizens not privy to the internal computer code.”[35] Reputation-ranking algorithmic systems are programmed by human beings, who cannot easily avoid embedding their implicit biases and values into the software they code.[36]
The author cites a 2013 research study on algorithmic discrimination by computer scientist Latanya Sweeney, former director of the Data Privacy Lab at Harvard University and later chief technologist at the Federal Trade Commission, who found that online advertisements suggestive of arrest records appeared more frequently in searches for names commonly associated with African Americans.[37]
Chapter two concludes with a call to action in regard to justice system reform: “It is time to reclaim our right to the presumption of innocence, and to the security of the light.”[38]
Chapter three: The Hidden Logics of Search
Chapter three of The Black Box Society, The Hidden Logics of Search, exposes the new masters of the Internet: search engines that influence, reshape, and answer billions of queries every day.[39] Pasquale asserts that search engines have a dark side: opacity. Platforms make decisions daily that impact personal, commercial, and democratic matters by way of automated personalization.[40] The author thus asks how far the public can trust the people who make these black boxed systems that possess the social, cultural, and political power to include, exclude, and rank people in the digital economy.[41] Furthermore, Pasquale emphasizes that market forces are inadequate in protecting optimal levels of privacy, writing, “In an era where Big Data is the key to maximizing profit, every business has an incentive to be nosy.”[42]
The author highlights how these algorithmic systems, though socially constructed as neutral and objective, regularly make value-laden decisions imbued with bias that come to shape people’s perception of the world.[43] This chapter examines four areas wherein the behaviour of major search engines and companies raises concerns over trust—transparency, competition, compensation, and control—using case studies of disputes over bias and abuse of power in Silicon Valley.[44]
One case study interrogated by Pasquale is the deflection of concerns by microblogging platform X, then named Twitter, over its automated decision-making regarding trending hashtags.[45] On X, hashtags allow users to nominate certain terms as “trending”—or interesting and relevant—and tweets containing trending hashtags are recommended and broadcast to all users.[46] According to Pasquale, in late September 2011, the Occupy Wall Street protest movement began gaining media traction, yet despite the hashtags #OWS and #occupy being used in numerous tweets, they did not appear on Twitter’s official Trending Topics list. Supporters and activists accused Twitter of algorithmic suppression, condemning what they saw as the censoring of a politically controversial movement. Twitter subsequently released a statement explaining that Trending Topics depends on tweet velocity, not popularity, and the controversy subsided, sparking discussions about media literacy.
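The distinction Twitter drew between velocity and popularity can be illustrated with a minimal sketch. The Python below is a hypothetical toy, not Twitter's actual Trending Topics code, and the hourly counts are invented: it shows how a hashtag with steadily high volume can score lower on a rate-of-change measure than a smaller but rapidly accelerating one.

```python
# Toy contrast between raw popularity (total volume) and "velocity"
# (rate of change). Hourly tweet counts are invented for illustration;
# this is not Twitter's actual Trending Topics algorithm.

hourly_counts = {
    "#ows":     [900, 920, 910, 930],  # high volume, flat growth
    "#newmeme": [5, 40, 200, 800],     # low volume, sharp acceleration
}

def velocity(counts: list[int]) -> float:
    """Average hour-over-hour growth ratio: a crude proxy for tweet velocity."""
    ratios = [later / earlier for earlier, later in zip(counts, counts[1:]) if earlier > 0]
    return sum(ratios) / len(ratios)

for tag, counts in hourly_counts.items():
    print(f"{tag}: total={sum(counts)}, velocity={velocity(counts):.2f}")
# "#ows" totals 3660 tweets but grows at only ~1.01x per hour, while
# "#newmeme" totals 1045 yet grows at ~5.67x per hour; a velocity-based
# detector surfaces only the latter, however widely the former is used.
```

On such a measure, a hashtag can be widely and consistently used yet never register as "trending," which is consistent with Twitter's stated explanation of the #occupy dispute.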
Chapter four: Finance’s Algorithms: The Emperor’s New Codes
In chapter four of The Black Box Society, Finance’s Algorithms: The Emperor’s New Codes, Pasquale investigates the increasing computerization and consequent lack of transparency in the finance sector over the past several decades.[47] The author contends that the worldwide 2007–2008 financial crisis exposed the hidden practices of large banks: bad data, bad apparatuses, and devious corporate structures.[48]
According to Pasquale, secret algorithms are “obscured by a triple layer of technical complexity, secrecy, and ‘economic espionage’ laws that can land would-be whistle-blowers in prison,” thereby preventing citizens from truly knowing how major financial firms collect, use, and disclose their personal data.[49] Citizens also cannot know when complex, trade-secret mathematical calculations, including rating formulas, are drawing from biased, incomplete, or inaccurate data.[50] This obscurity causes instability and conflict, according to the author.[51]
This chapter explores two sources of obfuscation and opacity resulting from the black boxing of the finance industry: illegality and algorithmic complexity.[52] Problematically, black box finance opens endless possibilities “for self-serving or reckless behavior” among insiders in the finance sector.[53] Pasquale emphasizes that dissent from whistle-blowers in financial corporations is often silenced, demonstrated using a case study of the Countrywide Financial political loan scandal.[54] In this scandal, business executives and U.S. politicians allegedly received favourable mortgage rates.[55]
Pasquale ultimately asserts that contemporary finance depends upon the use of mechanisms that obscure important information from borrowers, lenders, clients, regulators, and the general public.[56]
Chapter five: Watching (and Improving) the Watchers
In the fifth chapter of The Black Box Society, Watching (and Improving) the Watchers, Pasquale argues that societies need to watch and improve the major business firms that control the flows of information and media as well as individuals' financial destinies.[57]
This chapter calls for transparency, with the author declaring that solutions must build upon successes in opening black boxes in distinct corporate sectors, and subsequently apply these successful methods to the entire black box society—thereby illuminating Big Data-driven decision-making from credit scoring to digital advertising and employment.[58] Pasquale insists that “[d]ata is the fuel of the information economy, and the more data a company already has, the better it can monetize it,” warning of the price individuals will pay for this monetization.[59] Hence, the first step toward de-obscuring the reputation, search, and finance sectors is to learn more about their corporate practices, which need to be reformed to level commercial playing fields and ensure integrity in business transactions.[60]
The author contends that opening black boxes raises key issues surrounding the who, when, what, and why of disclosure—viewing depth, scope, and timing as a spectrum between preserving secrecy and providing transparency.[61]
According to Pasquale, notice policy merely perpetuates a fiction of privacy.[62] The author therefore proclaims that citizens, in digital and physical spaces, deserve the right to inspect and correct their data in reputation, search, and finance. At the same time, societies must ensure that independent citizens, who do not themselves operate surveillance apparatuses, have some role to play in processing the overwhelming quantity of information that even a data oversight program would produce.[63]
Chapter six: Toward an Intelligible Society
Pasquale begins the sixth and final chapter of The Black Box Society, Toward an Intelligible Society, by referencing Cory Doctorow’s story Scroogled.[64] In this story, Doctorow imagines Google intertwined with the United States Department of Homeland Security, creating a politically powerful institution of information control.[65] Speaking to America’s current reality, Pasquale writes, “Doctorow’s story confronts us with a stark question: Do we permit Google to assert trade secrecy to the point that we can’t even tell when a scenario like that has come to pass?”[66]
The author argues that black boxes embody the paradox of the “information society,” wherein data has become a vast, valuable resource, yet these resources are available only to the watchers.[67] Pasquale claims that a black box society creates a rule of reputational scores and bets, separate and unequal economies, invisible powers, and wasteful and unfair competitions.[68] This chapter consequently unveils “the lure of the black box,” “surfac[ing] its seductions—and their limits.”[69]
Pasquale acknowledges the efficiency afforded by black boxes but proposes reforms that would make ordinary transactions slower by delegating responsibility from algorithms back to human beings.[70] The author argues that “only humans can perform the critical function of making sure that, as our social relations become ever more automated, domination and discrimination aren’t built invisibly into their code.”[71] Pasquale states that massive Internet and finance companies “present a formidable threat” to necessary values of fairness, integrity, and privacy, with this threat increasingly intertwined with government powers and far too frequently obscured by the secrecy of black boxes.[72]
The author emphasizes how, in capitalist democracies, algorithms fulfill the longstanding human desire to predict the future, “tempered with a modern twist of statistical sobriety.”[73] Pasquale concludes that despite their wonder, black boxes have created a threat to democratic society and the economy—insisting that only a knowledgeable citizenry can combat this danger.[74]
Academic reception
Positive reviews
In one academic book review in Church, Communication and Culture, business professor Kevin de Souza writes that Frank Pasquale’s book is a “thought-provoking” result of ten years of research—a contribution to the theory and practice of social justice covering aspects of social science, law, and technology.[75] Similarly, in one academic book review in Business Ethics Quarterly, law professor Alan Rubel asserts that Pasquale’s greatest accomplishment in the book is “connecting several forms of obscurity (actual secrecy, legal secrecy, and deliberate obfuscation) to broader social phenomena.”[76]
Moreover, in the European Data Protection Law Review, business professor Alessandro Spina’s academic book review praises The Black Box Society as “an important, intelligent and timely contribution” to legal doctrine. Spina argues that the book serves as a crucial reference point for informing public debates over data protection and privacy, especially where new legislation has emerged, such as the European Union's General Data Protection Regulation (GDPR).[77]
Critical reviews
Rubel states that the extensive scope of The Black Box Society is impressive; however, Pasquale’s emphasis on the black box metaphor raises two critical questions.[78] The first asks exactly what work the black box metaphor is doing, and whether black boxes in the reputation, search, and finance sectors share commonalities other than obscurity. The second addresses the moral salience of black boxes and the varying importance of transparency from this perspective.[79] Nevertheless, Rubel writes that The Black Box Society, as a complex and compelling book, should encourage scholars of political philosophy and business ethics to seek answers to these looming questions.[80]
In an academic book review for the European Journal of Risk Regulation, law professor Sofia Ranchordás notes that although The Black Box Society “sheds new light on the problem of asymmetric data collection and makes a very important contribution to the literature,” readers with an interest in regulation “might miss some specific normative suggestions.”[81] Ranchordás adds that the asymmetry of knowledge and transparency defined in chapter one “appears to become unfortunately diluted in the general narrative at the end of the book.”[82] Likewise, law professor Anupam Chander, in another academic book review in the Michigan Law Review, argues that despite Pasquale’s careful and significant contribution to a line of scholarship interrogating the role of automated algorithms in our lives, The Black Box Society “lends itself to a misdiagnosis of the discrimination problem likely to lie in algorithmic decisionmaking.”[83] Chander consequently emphasizes how even a transparent algorithm can still lead to discriminatory outcomes.[84] Chander nonetheless appreciates Pasquale’s critical recognition that algorithmic systems are neither neutral nor objective technologies.[85]
References
- ^ Pasquale, Frank (2016). The Black Box Society: The Secret Algorithms That Control Money and Information. Cambridge, Massachusetts: Harvard University Press. pp. 1–319. ISBN 9780674970847.
- ^ Pasquale 2016, p. 3
- ^ Pasquale 2016, pp. 1–18
- ^ Pasquale 2016, pp. 19–58
- ^ Pasquale 2016, pp. 59–100
- ^ Pasquale 2016, pp. 101–139
- ^ Pasquale 2016, pp. 140–188
- ^ Pasquale 2016, pp. 189–220
- ^ Chander, Anupam (2017). "The Racist Algorithm?". Michigan Law Review. 115 (6): 1023–1045. doi:10.36644/mlr.115.6.racist.
- ^ Ranchordás, Sofia (2016). "The Black Box Society: The Secret Algorithms That Control Money and Information by Frank Pasquale. Cambridge, MA: Harvard University Press, 2015, 320 pp., € 31.50". European Journal of Risk Regulation. 7 (2): 460–462. doi:10.1017/S1867299X00005894.
- ^ Rubel, Alan (2016). "The Black Box Society: The Secret Algorithms that Control Money and Information, by Frank Pasquale. Cambridge: Harvard University Press, 2015. 320 pp. ISBN 978-0674368279". Business Ethics Quarterly. 26 (4): 568–571. doi:10.1017/beq.2016.50.
- ^ de Souza, Kevin (2020). "Opening up Digital Players' Black Boxes: The Black Box Society: The Secret Algorithms That Control Money and Information, by Frank Pasquale". Church, Communication and Culture. 5 (1): 136–139. doi:10.1080/23753234.2020.1713012.
- ^ Spina, Alessandro (2015). "The Black Box Society. The Secret Algorithms That Control Money and Information by Frank Pasquale. Harvard University Press, 2015, 311 pp., € 31,50 hardcover". European Data Protection Law Review. 1 (1): 87–90.
- ^ Chander 2017
- ^ Ranchordás 2016
- ^ Rubel 2016
- ^ Spina 2015
- ^ Bunge, Mario (1963). "A General Black Box Theory". Philosophy of Science. 30 (4): 346–358. doi:10.1086/287954.
- ^ Pasquale 2016, p. 3
- ^ Pasquale 2016, p. 3
- ^ Rubel 2016, p. 568
- ^ Pasquale 2016, p. 1
- ^ Pasquale 2016, p. 3
- ^ Pasquale 2016, p. 3
- ^ Pasquale 2016, p. 11
- ^ Pasquale 2016, p. 5
- ^ Balkin, Jack (2008). "The Constitution in the National Surveillance State". Minnesota Law Review. 93: 1–25.
- ^ Pasquale 2016, p. 4
- ^ Pasquale 2016, pp. 4–5
- ^ Pasquale 2016, p. 19
- ^ Pasquale 2016, p. 21
- ^ Pasquale 2016, p. 21
- ^ Rubel 2016
- ^ Pasquale 2016, p. 26
- ^ Chander 2017, p. 1026
- ^ Pasquale 2016, p. 38
- ^ Sweeney, Latanya (2013). "Discrimination in Online Ad Delivery: Google Ads, Black Names and White Names, Racial Discrimination, and Click Advertising". Communications of the ACM. 56 (5): 44–54. arXiv:1301.6822. doi:10.1145/2460276.2460278.
- ^ Pasquale 2016, p. 57
- ^ Pasquale 2016, pp. 59–60
- ^ Pasquale 2016, p. 60
- ^ Pasquale 2016, p. 81
- ^ Pasquale 2016, p. 81
- ^ Pasquale 2016, p. 61
- ^ Pasquale 2016, p. 61
- ^ Pasquale 2016, p. 75
- ^ Pasquale 2016, p. 76
- ^ Pasquale 2016, p. 102
- ^ Pasquale 2016, p. 120
- ^ Pasquale 2016, p. 103
- ^ Pasquale 2016, p. 103
- ^ Pasquale 2016, p. 103
- ^ Pasquale 2016, p. 103
- ^ Pasquale 2016, p. 103
- ^ Pasquale 2016, p. 111
- ^ Pasquale 2016, p. 111
- ^ Pasquale 2016, p. 319
- ^ Pasquale 2016, p. 140
- ^ Pasquale 2016, p. 140
- ^ Pasquale 2016, p. 141
- ^ Pasquale 2016, p. 141
- ^ Pasquale 2016, p. 142
- ^ Pasquale 2016, p. 143
- ^ Pasquale 2016, p. 157
- ^ Doctorow, Cory (2014). Scroogled. Milan, Italy: Zoom Feltrinelli. p. 24. ISBN 9788858853689.
- ^ Doctorow 2014
- ^ Pasquale 2016, p. 189
- ^ Pasquale 2016, p. 191
- ^ Pasquale 2016, pp. 191–194
- ^ Pasquale 2016, p. 195
- ^ Pasquale 2016, p. 213
- ^ Pasquale 2016, p. 213
- ^ Pasquale 2016, p. 215
- ^ Pasquale 2016, p. 216
- ^ Pasquale 2016, p. 218
- ^ de Souza 2020
- ^ Rubel 2016, p. 568
- ^ Spina 2015, p. 87
- ^ Rubel 2016, p. 570
- ^ Rubel 2016, p. 560
- ^ Rubel 2016, p. 571
- ^ Ranchordás 2016, p. 462
- ^ Ranchordás 2016, p. 462
- ^ Chander 2017, p. 1024
- ^ Chander 2017, p. 1024
- ^ Chander 2017, p. 1025