Fellow

Jessica Feldman

2023-2024
Home institution
American University of Paris
Country of origin (home institution)
France
Discipline(s)
Information and communication sciences
Theme(s)
Artificial Intelligence; Democracy, Citizenship, Governance; Digital Society
Fellowship dates
Biography

Professor Feldman joined the Global Communications faculty at AUP in 2018. Before that, she was a Postdoctoral Fellow at the Digital Civil Society Lab at Stanford University, after earning a Ph.D. in Media, Culture, and Communication from New York University in 2017. Her dissertation examined how advances in the surveillance of cell phone data, decentralized mobile networks, and vocal affective monitoring software are changing the ways in which listening exerts power and frames social and political possibilities. This research was funded in part by a grant from the National Science Foundation to support interdisciplinary research on privacy and democracy across the social sciences and engineering. She is also an artist whose work has been exhibited and performed internationally. She received an MFA from Bard in 2007 and taught media and sound art at Temple University and The New School from 2009 to 2012. She often collaborates with designers and combines theory and practice in both her teaching and her research.

Feldman’s current book project, Radical Protocols: Designing Democratic Digital Tools in Social Movements, is a study of the ways in which democratic values are (or are not) inscribed in the design of emerging networked communication technologies. The book is the result of ethnographic fieldwork with democratic social movements, especially the “movements of the squares,” during which she studied these movements’ communications practices and the alternative digital tools they designed to serve their political values. This fieldwork is combined with a “values-in-design” analysis of new decentralized communication, consensus, and trust models, such as mesh networks, blockchain, and algorithmic governance applications, which claim to embody democratic values. The book asserts the promise that peer-to-peer tools hold for democratic practice at a moment when representative democracy is in decay, while pointing out concerns about the ways in which illegitimate power and control could be inscribed into these communication tools at lower layers.

Research Project
Collective Trust? Comparative Commons-Centric Design for AI and Algorithmic Trust

Trust has emerged as a central concern in contemporary societies due in part to the influential, yet opaque, role played by algorithmic decision-making. This heightened significance of trust and trustworthiness is evident in official discourses and policies surrounding Artificial Intelligence (AI), not only in Western societies but also globally. There is a growing agenda to foster trust in AI among the general public, as emphasized by state-funded research in the United States and calls from the European Commission for trustworthy algorithms from a societal perspective. These demands give rise to more fundamental questions regarding the nature of trust and the role of digital tools: What does it mean for trust to be built and experienced collectively in democratic societies? How can and should the role and potential of algorithmic tools be conceptualized in shaping, and potentially reconditioning, trust?

This project’s thesis is that “collective trust” is ontologically different from private trust (individual-to-individual) and is essential to democratic practice. New and re-emerging experiments with direct democracy – often developing in tandem with global networked communications and intelligent computing – require practices of decision-making, sharing of authority, and security/access control that can help to conceptualize collective trust in design, and that can perhaps be further supported or developed through algorithmic tools. At the same time, horizontal community groups and social movements are often stymied in their democratic practices at the layer of digital security, where power bottlenecks around the holders of the passwords or the managers of the social media accounts, and struggles ensue. Further challenges arise as these groups scale up across distances, work remotely, or automate decisions, requiring digital tools for a form of consensus-minded decision-making that generally does not rely on binary logic. As projects in collective self-governance continue to advance, globalize, and digitize, these problems will only become more acute and a greater threat to progressive democratic practice. While research on “trustable AI” proliferates from a marketing perspective, and algorithmic governance research considers how AI can simplify or inform existing bureaucratic processes, this research project seeks to understand the co-formation of collective trust, democratic practice, and algorithmic decision-making.

Jessica Feldman’s research project bridges theory and practice, combining comparative philosophy, ethnographic work, and values-in-design analysis, to contribute to her work on collective-based access control design, a related artistic project, and a forthcoming book on trust and algorithms.

Research Interests:

Collective trust and democracy; algorithmic governance; values-in-design; ethnography of democratic practice