What the FAccT? Evidence of bias. Now what?
In part four of our FAccT deep dive, Alix joins Marta Ziosi and Dasha Pruss to discuss their paper “Evidence of What, for Whom? The Socially Contested Role of Algorithmic Bias in a Predictive Policing Tool”.
In their paper, they discuss how an erosion of public trust can lead to 'any idea will do' decision-making, which often leans on technology such as predictive policing systems. One such tool is ShotSpotter, a piece of audio surveillance tech designed to detect gunfire. It is a contentious system that has been sold both as a tool for police to surveil civilians and as a tool for civilians to keep tabs on police. Can it really be both?
Marta Ziosi is a Postdoctoral Researcher at the Oxford Martin AI Governance Initiative, where her research focuses on standards for frontier AI. She has worked for institutions such as DG CNECT at the European Commission, the Berkman Klein Center for Internet & Society at Harvard University, the Montreal International Center of Expertise in Artificial Intelligence (CEIMIA) and The Future Society. Previously, Marta was a PhD student and researcher on Algorithmic Bias and AI Policy at the Oxford Internet Institute. She is also the founder of AI for People, a non-profit organisation whose mission is to put technology at the service of people. Marta holds a BSc in Mathematics and Philosophy from University College Maastricht, as well as an MSc in Philosophy and Public Policy and an executive degree in Chinese Language and Culture for Business from the London School of Economics.
Dasha Pruss is a 2023-2024 fellow at the Berkman Klein Center for Internet & Society and an Embedded EthiCS postdoctoral fellow at Harvard University. In fall 2024 she will be an assistant professor of philosophy and computer science at George Mason University. She received her PhD in History & Philosophy of Science from the University of Pittsburgh in May 2023 and holds a BSc in Computer Science from the University of Utah. She has also co-organized with Against Carceral Tech, an activist group working to ban facial recognition and predictive policing in Pittsburgh.
This episode is hosted by Alix Dunn. Our guests are Marta Ziosi and Dasha Pruss.

Further Reading

Evidence of What, for Whom? The Socially Contested Role of Algorithmic Bias in a Predictive Policing Tool
Refusing and Reusing Data by Catherine D'Ignazio