9/6/2019 Precision and Uncertainty in a World of Data: Panel Stream, 2019 Meeting of the Society for Social Studies of Science



During the 2019 meetings of the Society for Social Studies of Science, the Johns Hopkins-based faculty of the Mellon Foundation-supported Sawyer Seminar, in collaboration with faculty from the University of Edinburgh Center for Biomedicine, Self, and Society, assembled a three-panel stream of researchers from around the world to bring the varied disciplines of STS to bear on the variegated nature of uncertainty produced within a data-saturated environment. Our aim was to examine how past and present invocations of big data, which hold out the promise of precision and certainty, also proliferate uncertainties within many domains of practice: from medicine to marketing, criminal law to news media, and across almost all scientific fields. We sought new lines of inquiry into the challenges posed to scientific inquiry and social institutions by the consolidation of computational analysis, machine learning, and the generation of big data. Papers were asked to cohere around the dialectic of certainty and uncertainty produced by big data and algorithms in practice, and consolidated around three major themes: data citizens, data systems, and data bodies.


The first panel brought together papers thinking at the intersection of data-laden citizenship and other novel subjectivities. Alessandro Angelini, of the Department of Anthropology of Johns Hopkins University, presented “Twitches and Seizures: A Political Interruption of Big Data in Smart-City Rio de Janeiro,” which discussed a “smart city” initiative commissioned by Rio’s city government and brought to life at the end of 2010. While the smart city’s remote cameras register partial views of city life in Rio, the operators in its control room strive to use the data generated to present solutions for civic and urban problems. The smart city’s operators consider the city to be always in flux—from traffic conditions to the movement of rainwater. Their technophilic work is predicated on the assumption that urban problems can be discretely reduced to data sets and that those data have intrinsic value to be extracted through algorithmic operations. But smart city infrastructures, Angelini suggests, are equipped with the ability to sense certain kinds of interruptions and not others, opening up opportunities for users to “outsmart” the smart city. Angelini proposes an ethnographic approach to smart cities that plays into neither triumphal nor critical narratives, but rather attends to forms of resistance, sabotage, and tricksterdom.

Jennifer Gabrys, from the Department of Sociology at Cambridge University, presented a paper on “Data Citizens and the Right to Data,” which followed nicely on the questions of how data use transforms urban citizenship. Gabrys called attention to how our thinking about data and citizenship often stays at the level of the “right to data,” or the right to own one’s data. What’s overlooked is a broader framework for making claims to citizenship within the context of data—or an interest in how rights could be articulated through data. Citizen science projects, Gabrys suggested, generate citizens attached to the idea that data can facilitate democratic life, but many such projects fail to move beyond observations to claims—observing, for instance, that gentrification-driven construction causes pollution, but falling short of making an appeal to a right to housing. Gabrys discussed the use of “dustboxes” in a project of citizen-generated air-quality data points.

Sonja Erikainen, from the School of Sociology and Social Policy of the University of Edinburgh, presented a paper co-authored with her colleague Sarah Chan, titled “Data, Diagnosis, and Decision-making: Navigating Uncertainty Through Moral Discourse and Practice.” Their paper surveys the field of precision medicine in terms of how it leverages big data to produce individualized, stratified solutions. In the rise of direct-to-consumer health and the proliferation of digital fitness and wellness data, Erikainen identified a shift in medicine from the treatment of disease to health management—management predicated on individuals’ responsibilization and self-knowledge. Deriving actionable insights into individual health from big data is, Erikainen proposed, increasingly delegated to algorithms, which brings with it a black-box problem and decreased accountability. Erikainen proposed that what is at stake at this moment are the very meanings of biomedical knowledge production and of health itself.

Aaron Plasek, from the Department of History of Columbia University, concluded the panel with a paper on “Making Machines that Make Us: How Machine Learning Shaped Human Capacity and Limited Social Possibility,” which offered a close look at the histories of the methods that facilitate how we experience data. Opening with remarks on the definitional problems surrounding the many traditions of machine learning and artificial intelligence, Plasek asked, “Why and how did it become thinkable to use machine learning to arbitrate social problems?” Through a discussion of his archival research, Plasek traced this question back to the 1950s—a moment that saw the emergence of different kinds of AI. Plasek pointed out that what is now considered canonical (human-imitative) AI constituted only a tiny part of a vaster range of machine learning systems, like intelligence-augmentation AI. Tracing how the traditions that valued building expert systems gradually lost out to those focused on building neural networks, Plasek suggested, can help us uncover shifts in the social values and goals of machine learning—shifts necessary to define the question of data in terms of rights and citizenship.


The papers on this panel dwelled explicitly on kinds of uncertainty systemic to, and inherent in, scientific and technological practices. Robert Soden, from the Department of Computer Science at the University of Colorado Boulder, reported from his fieldwork on the science and politics of flood risk in Colorado, starting with the 2013 floods in the region—a major rainfall event that resulted in several deaths and hundreds of millions of dollars in damages. Through a discussion of the maps plotting flood risk, Soden observed that uncertainty is not a problematic or unappealing aspect of technoscience, but a generative and systemic feature. The engineers Soden observed were motivated by earlier government-produced maps’ attempted closure of how flood hazard was to be understood; they sought to create more “precise” maps and to stir publicly engaged debate around the plotting of risk and the flood management bureaucracy.

Canay Özden-Schilling, from the Department of Anthropology at Johns Hopkins University, also dwelled on the systemic nature of uncertainty in processes of data proliferation—with a focus on economic processes. She discussed the case of market intelligence firms operating in electricity markets. These firms strive to represent the electric grid in granular data form to serve data-hungry market actors, with the promise of boosting prediction power and, by extension, profits. In this new moment of economic exchange, Özden-Schilling argued, the focus of market actors seeking a competitive edge has shifted away from observing one another for their next moves, toward capturing commodities and their surrounding infrastructures in electronic representation. This effort, she argued, continues even though whether extra profit is in fact achieved is not easily verifiable.

Anne Dippel, from the University of Jena, reported from her longstanding fieldwork with the physicists working at the Large Hadron Collider at CERN. Her analysis of uncertainty, as it takes shape in the data practices of high energy physics, was instantiated in examples like a year’s delay of operations for a single loose screw. Dippel suggested that these ruptures and procedural uncertainties, unpredicted as they are, provide a predictable rhythm for scientists’ quest to derive scientific certainty. Dippel further showed that the management of systemic uncertainty in high energy physics is unevenly distributed within the community of CERN, especially across gender lines. The accomplishments of “capital M guys” can be chalked up to Mankind, when it is, in fact, the “behind the screen” work of various female physicists that internalizes uncertainty and smooths out operations.

David Demortain, of the French National Institute for Agricultural Research, was also intrigued by the dialectic of certainty and uncertainty. His talk took this issue up in the context of the regulation of chemicals—an EPA-led project to determine toxicity based on biological impact datasets, as opposed to conventional methods like animal testing. Demortain argued that the unlikely regulatory innovation this program has shown hinged on the fact that decisions were now increasingly mediated by the systematic public auditing of models and the disclosure of their inherent uncertainty.

Discussant Steve Sturdy, from the University of Edinburgh Center for Biomedicine, Self, and Society, pointed out that the panel as a whole had surveyed models and representations in one form or another, all of which had uncertainty written into them—uncertainties in the lived world created uncertainties in models, which then became further implicated in lived-world uncertainties. The question the papers asked most urgently, he argued, concerned the uneven distribution of responsibility for dealing with systemic uncertainty in models and lived experience alike.

Jeremy Greene presenting his paper, “The Computer in the Clinic.”


Jeremy Greene, from the Department of the History of Medicine at Johns Hopkins, presented a historical survey of an earlier moment of big data in precision medicine (drawing in part on a recent paper co-authored with Andrew Lea) and the new forms of uncertainty generated across several modalities of electronic media in computational medicine over the past 60 years. Greene walked the audience through the modes of electronic engagement that were instrumental in creating an electronic patient from the 1950s through the 1970s—from collating electronic records at massive scale, to generating electronic diagnoses more precisely than an individual doctor could, to, ideally, preventing disease electronically. Greene pointed out that early experimenters in the field, like Kaiser Permanente, designed new clinic spaces that the patient could move through within the span of two hours, generating a full health record without ever seeing a computer. Some 65–75 million Americans’ health records were digitized at this time. And yet in each of these modes, promises of precision medicine generated new forms of living with uncertainty: in the access to medical information, in the structure of the electronic medical record, in the use of algorithms for diagnosis, treatment, and prevention. Then, as is the case now, the project of precision medicine drew fear and criticism as much as it inspired hope.

Anna Jabloner, from the Columbia University Center for Research on Ethical, Legal, and Social Implications of Psychiatric, Neurologic, & Behavioral Genetics, continued to develop themes of uncertainty in precision medicine with her survey of the “precision psychiatry” identified in mental health circles. The promise of precision psychiatry, like that of the broader range of precision medicine, is tapping into big data to generate tailored solutions for individual patients. It is an articulation of the computational turn in psychiatry, Jabloner argued, as well as the latest incarnation of genomic medicine, the use of which remains controversial in clinical medicine. Fundamentally, precision psychiatry develops against an intense background of discriminatory uses of genetic medicine in criminology, where data collection targets vulnerable populations. Precision psychiatry is constituted as a clinical project, but it could equally be considered a legal or criminological one, to the extent that it taps into behavior-prediction database apparatuses. Her talk called attention to the reductionist agenda of “preventive” psychiatry and the racist, classist, and sexist implications of testing individuals for future behavioral problems.

Ashveen Peerbaye, of the CERMES3 interdisciplinary social science collective at the French National Institute of Health and Medical Research, discussed the introduction of precision medicine into tumor genetics in France and argued that the promises of precision come at a price. Based on observations of tumor genetics platforms, Peerbaye argued that the clinic now relies increasingly on costly, highly specialized equipment, collaboration among various professionals across organizations, and data-work that often leads to data friction. Precision medicine engenders epistemic uncertainty in the form of a rupture between a good model of diagnostics and good clinical practice, as well as ontological uncertainty as to the very definition of a tumor. Peerbaye concluded by noting that certainty in the context of clinical utility can come from less precise, supposedly outdated forms of knowledge.

Steve Sturdy rounded out the panel and the conference with a discussion of the conflicting roles of precision and uncertainty in the governance of intellectual property in the biotech sphere, using a comparative case of gene sequencing in the 1990s and early 2000s, and DNA diagnostics in the early 21st century. Sturdy’s talk hinged on the definition of ‘utility’ and the complex gamesmanship in patent disputes over how ‘broad’ or ‘narrow’ any given intellectual property claim can be made—especially for objects like DNA sequences, which are seemingly parts of the natural world. Sturdy demonstrated that the dialectic of precision and uncertainty in the world of genomic data has been a constitutive element in the monetization of the biotech industry from its origins to the present day.
