
Research

My interdisciplinary research draws from political theory and interpretive methods to address the challenges and opportunities surrounding emerging technologies, particularly in the context of quantified self-tracking and climate change. 

Peer-Reviewed Publications

Billesbach, G., 'The Politics of Predictive Technology in the Intergovernmental Panel on Climate Change' (20 Mar. 2025).

Oxford Intersections: AI in Society

DOI: 10.1093/9780198945215.003.0020.

This article draws from Hannah Arendt’s political theory and interpretive methods to address underlying tensions concerning technology and politics in the Intergovernmental Panel on Climate Change (IPCC). It analyzes IPCC leadership, reports, and original interviews with climate scientists. These sources illustrate the prominence of predictive algorithms and expert rule in the IPCC's work, as well as ongoing efforts to integrate regional differences and non-quantitative perspectives. The climate scientists interviewed frequently acknowledge tensions between technocratic and humanistic approaches, often articulating distinctly humanistic or poetic sensibilities in tandem with their rigorous deployment of predictive technology. They demonstrate how we can resist purely technocratic approaches to climate governance while utilizing state-of-the-art technologies.

Works in Progress

Invocations of Freedom in the Age of Predictive Algorithms

Predictive algorithms promise enhanced freedom by way of increasingly accurate forecasts of human behavior. Critics frequently invoke traditional theories of negative, positive, and republican liberty in alerting us to the dangers of predictive technologies, but these approaches inadequately capture the full impacts of algorithmic prediction, including its effects on our willingness and ability to exercise freedom with others. Drawing from Charles Taylor’s concept of self-interpretive freedom and Hannah Arendt’s notion of freedom as political action, I argue that predictive algorithms can both undermine and support our capacities for reflection and interpretation, and alter our disposition toward spontaneous political engagement. Arendt’s understanding of freedom as inherently pluralistic and unpredictable highlights the risks posed by algorithmic governance that prioritizes certainty over democratic deliberation and openness. Taylor’s emphasis on the temporal, dialogical, and embodied nature of self-interpretation reveals the dangers of representing human experience primarily through quantitative measures.

Self-Interpretive Freedom and the Politics of Quantified Self Technologies

The Quantified Self (QS) movement promises to enhance personal freedom by using technology to track and optimize human physiology and activity. This article examines how we might best understand and respond to the challenges posed by QS technologies. Adopting an ethnographic methodology and employing Charles Taylor’s framework of self-interpretive freedom, it analyzes publicly available primary evidence. QS practitioners need neither blindly surrender to self-tracking technology nor lose their humanity when deploying it. Rather, their technologically enabled activities can enrich self-interpretive freedom. As digital human twins and AI-driven self-tracking tools evolve, their development and use are best guided by this principle: that technology should serve, rather than supplant, the temporal, dialogical, and embodied dimensions of human agency.

AI and the Exercise of Freedom: The Politics of Personal & Planetary Prediction

Algorithms and artificial intelligence (AI) promise unprecedented predictive power, which many associate with greater efficiency, objectivity, and control. They also imperil many cherished values, such as justice, equality, and freedom. This book investigates how the predictive power of algorithms and AI threatens freedom. It neither counsels the wholesale rejection of algorithms and AI nor overlooks their potential benefits. Instead, it (1) assesses how negative, positive, and republican conceptions of liberty have been invoked in prevailing discussions of algorithms and AI; (2) advances interpretations of Hannah Arendt and Charles Taylor on science, technology, and the exercise of freedom to clarify the political impact of algorithmic prediction; and (3) applies these theoretical insights in two empirical case studies that employ predictive tools at planetary and personal scales—the Intergovernmental Panel on Climate Change (IPCC) and the Quantified Self (QS) movement.


Traditional approaches to freedom (the negative, positive, and republican conceptions of liberty) help us understand the consequences of algorithms when they malfunction or are used to facilitate political interference, inequalities, and unaccountable decision-making. However, they inadequately address the full impact of predictive technologies in our political and personal lives. Arendt’s theory of political action and Taylor’s concept of self-interpretation provide crucial supplements, revealing the political and personal impacts of predictive technologies and the prospects for addressing them. An investigation of the IPCC and QS movement demonstrates that predictive technologies are neither inherently oppressive nor inevitably liberating. Members of this institution and movement illustrate how predictive technologies can fruitfully coexist with democratic and reflective practices. In this manner, algorithms and AI can be guided by wisdom. And human freedom, with all of its uncertainties, can endure.
