“Data Sciences and Society” Reading Group
Discussion notes by Naveeda Khan:
On May 6, the Sawyer-sponsored Data Sciences and Society reading group met to discuss select writings of Juliet Floyd, professor of logic and mathematics at Boston University, and Matthew Jones, professor of history of mathematics and surveillance at Columbia University. While earlier we had met with individual scholars, this was the first time the reading group brought together two scholars to discuss their work jointly, to see how areas of overlap and divergence might open further lines of inquiry. Each presented a succinct introduction to their areas of interest, with Canay Ozden-Schilling, the Sawyer postdoctoral fellow, offering some general comments, followed by a collective discussion. Two separate groups then discussed the individual papers. This summary provides the main highlights of the discussions.
In her introduction to the paper we read, titled “Lebensformen: Living Logic,” Professor Floyd informed us that while Wittgenstein, the philosopher of logic and language, was clearly informed by his training in mathematics, it is less well known that he was also influenced by computer science, specifically through the work of Alan Turing. That Wittgenstein influenced Turing was not in doubt, as the latter made incompleteness central to his work on computer science, taking it to be central to the science of the social as explicated by Wittgenstein. In other words, he understood the human perspective to be pervasive in the crafting and deployment of any computer-based algorithm. Turing also claimed that there was no dichotomy between facts and values: science needed to know the limits of its knowledge, and philosophy needed to know its facts. What Wittgenstein got from Turing was that it was not the actual interface between humans and machines that mattered but rather human-to-human interactions in the presence of machines. There could be no general or universal reply to the question of what thinking is; rather, we had to attend to the specificities of each instance of it.
In Professor Jones’ introduction to his paper titled “How We Became Instrumentalists (Again): Data Positivism since World War II,” he reminded us that we do not yet have a proper account, or accounts, of how we have come to be captive to big data in every aspect of our lives. For this reason he has deemed the 1990s the Paleolithic age of data and considers himself to be working to elicit that pre-history without presuming any set teleology. He locates the big data revolution at the conjunction of three historical trends, relating to statistics, artificial intelligence, and databases. No one of these alone could have brought about the data revolution: statistics was incapable of dealing with the large number of data points integral to big data; AI aimed too high, without adequate attention to whether it had the knowledge base for its claims; and databases were the most pragmatic of the three trends, attentive largely to the practical problem of storage. Jones is interested in the genealogies of each and in how they produced ecologies that engendered the explosion of big data.
Canay had specific interventions for each of the two authors. Based on her reading of their individual papers, she noted that Floyd’s intervention suggested how ethics and algorithms were entwined wherever algorithms existed, primarily because of the inseparability of the human perspective from algorithms. Algorithms could most productively be taken as an instance of humans trying to create a rule, with all that this implies. Thus, rather than debating whether algorithms are bias-free, we are more productively placed if we ask where the bias lies: in the sameness of, or the difference between, human acts of rule creation and calculation and those of algorithms? Do we need to rethink proxies, or can we not operate without them? In her comments for Jones, Ozden-Schilling pointed out that Jones’ project could be aptly characterized as showing how prediction takes over from interpretation. She wanted to know more about the use value of prediction for commerce and business, and whether academia had been similarly overtaken by prediction. She wondered how we, as analysts, might make clearer (even to ourselves) the continued place of interpretation within prediction.
The discussion that ensued was wide-ranging. A persistent question, directed largely at Floyd, was about reality. Floyd said that some bias is necessary to grasp reality but that it does not compromise objectivity. Uncertainty was a persistent aspect of all reality. But in our present, with disinformation rife and the paranoid style in politics ascendant, uncertainty had a more ambiguous status, its value as good or bad less clear. This led Veena Das to ask whether the structure of reality has changed, including the “furniture of the world” and “new intentional objects.” What would reality be without humans? Another question, directed at Jones, concerned the importance of taking into consideration histories of statistics arising from other contexts, notably India, where statistics was not just about impersonal population management but about the management of food resources to avoid famines. How did histories from elsewhere stand to change the story of big data as largely centered in the industrialized West? There was a general sense in the audience, reiterated by Floyd, that we are in the middle of a big experiment with evolutionary aspects to it, and that we ought to be attentive to this aspect of the present.