This workshop attended to the design and deployment of sensors that populate the world around us, charting the territory from quantified environments to quantified selves. It asked what kinds of assumptions were built into them at the design level and, once they came alive, how they remade the social fabric around them.
Event organizers Michael Degani and Alessandro Angelini kicked off the event with commentary on the lay of the land. Notoriously, sensors are caught up in surveillance projects, on a spectrum from militarism to capitalism, camouflaged into everyday activities—from “stingray” surveillance that extracts information from individual phones by mimicking nearby cell towers to environmental sensing technologies like the smartphone that detects a user talking about Disney and recommends a Goofy video. Sensors push what we know about the difference between sensing and making sense as they learn to distinguish a wink from a twitch of the eye. While engendering or intensifying inequalities, sensors can also help expose existing patterns of inequality and other social problems (as in the Baltimore Open Air Project, which visualizes environmental racism).
Natasha Schüll (New York University) told us about her research on consumer-grade wearable electronics, laden with sensors that fit into our pockets or adorn our bodies. These wearables are associated with the “quantified self” movement, which often promises some kind of augmentation of human capacity. Schüll’s ethnographic research with these technologies’ designers, however, suggests that the underlying agenda is less one of optimizing or enhancing human experience than of helping users merely “cope” in settings that challenge our flourishing. The Hapifork helps you calibrate how long to chew each bite before you swallow; glasses that measure your attention to the subject at hand give you a little signal to snap you back into focus. These devices assume a certain data fatigue on the part of users; instead of explaining statistical correlations, they give small nudges to help you acquire better habits. The resulting self is perhaps more responsible for its wellbeing than before, but that responsibility is also largely outsourced to devices the self comes to depend on.
Biostatistician Ciprian Crainiceanu’s (Johns Hopkins University) presentation on the same panel also focused on wearable technology, though with an agenda of assessing its efficacy, with a healthy dose of skepticism and an eye for its several failures. We watched a movie visualizing a day during which Crainiceanu wore an accelerometer. Assessing the movie visually, we acted the way a computer would—trying to detect patterns across graphs of, for instance, movement, age, and body-mass index. One could, as Crainiceanu pointed out, easily generate terabytes of such data, but it is up to the statistician to decide how far to zoom into the data to extract meaningful correlations, how those correlations relate to individual wellbeing, and how much, critically, to trust certain data products, like calorie counts. During the panel discussion, the questions around how to connect data to everyday experience became more pronounced. Yulia Frumer asked: with so many sensors that revolutionize daily life and become indispensable, like diabetes tracking technologies, what does one do when they fail? Where does our haptic literacy come from? Do we risk a pathologization of everyday life by surrounding ourselves with sensors that constantly assign scores to our otherwise healthy everyday experiences?
And as a number of participants pointed out, the blurriness of the line between data for one’s self and data for surveillance fast becomes problematic. Crainiceanu relayed that a certain vigilance is needed from users to avoid privacy violations, while Schüll brought up the flip side—people who design closed data systems to track and protect themselves. We also discussed how these sensors change what a person needs to sense to be human. Schüll and the participants agreed that the way wearables sense our experience and the way we do are not coterminous—that there is a certain disjuncture between the temporalities of these processes, a point to which we would return later in the event.
Our second panel was kicked off by historian Etienne Benson (University of Pennsylvania), who presented on the post-1950 quantitative revolution in fluvial geomorphology (the study of how water bodies shape the Earth), which was part of a larger quantitative turn in geography and the environmental sciences. Benson’s vivid presentation on a team led by geomorphologist and hydrologist Luna Leopold brought into the event’s focus the subjectivities of the individual researchers initiating novel forms of sensing. Leopold’s team was instrumental in the field’s shift from describing forms (morphology) to explaining land-forming processes (dynamics). The new agenda of “hydraulic geometry” depended on the collection of massive amounts of local data to explain how geological phenomena were affected by river flows. Aided by evocative pictures from Leopold’s fieldwork, Benson showed that fieldwork was revived at the time as a quantitative practice, with Leopold and his team literally immersed in it, knee-deep in Baldwin Creek, Wyoming, counting pebbles and measuring flows. These men imagined themselves in the mold of nineteenth-century expeditionists, refusing office jobs in big cities and, in the process, transforming into the sensors of their own data collection. Benson suggested that we shouldn’t assume data collection projects are solely a function of infrastructure; they are also a function of people with the motivation to go out and immerse themselves in the field. He finished by asking: “How do people go out into the field once they have all of this data?”
The last speaker, David Bowen, a studio artist from the University of Minnesota, capped off the event with a visually engrossing presentation—a chronological overview of his data-laden studio art. Much of Bowen’s artwork uses sensors planted in particular locations to create physical shapes elsewhere in real time—for instance, data from sensors on Lake Superior being used to recreate its waves in a gallery space. Other works turn data into some other artistic form. The “cloud piano,” for instance, relies on a camera pointed at the sky and running custom software, in which dense clouds press the keys of a virtual piano hard (blue-sky days produced less exciting results). Or a fly colony adjacent to a keyboard generates tweets as the flies hit the keys. Bowen’s presentation wrapped things up in a way that pushed the audience to consider the instrumental and the poetic juxtaposed. Bowen considers the artistic process a collaboration “between natural forms, the mechanisms, and myself” and the data involved “aesthetic data.”
The discussion of the second panel delved further into the relationship between sensing and information. Benson noted that sensing is not only a transduction of information patterns; it is a deeply embodied and affective process that cannot be captured in such transduction. Contrary to what the cyberneticians might have thought, the vehicle of information matters and shapes the resulting outcome we call information—in the process giving rise to instantiations of human consciousness in different forms. We returned to the question of temporality from the first panel as participants noted Bowen’s interest in real-time data capture and asked how it might differ from recorded time. While the first panel had highlighted temporalities missed, the second looked at what other temporalities there could be.