
Episode 2: Conversation with Simon Egbert and Matthias Leese on Criminal Futures: Predictive Policing and Everyday Police Work

1:24:16
 
Content provided by the Algorithmic Governance Research Network. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by the Algorithmic Governance Research Network or its podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process outlined here https://ru.player.fm/legal.

Joining me today are Simon Egbert, Postdoctoral Fellow at Bielefeld University working on the ERC research project The Future of Prediction, and Matthias Leese, Senior Researcher at the Center for Security Studies (CSS) in Zürich, to discuss their recent book Criminal Futures: Predictive Policing and Everyday Police Work, published in 2021 with Routledge. The book is available to download open access here.

Today we discuss predictive policing and the ways in which it is transforming police work. Police departments across the globe are embracing algorithmic techniques to support decision-making through risk assessments and predictions based on big data and real-time analytics, utilizing tools such as facial recognition. Silicon Valley’s ‘technological solutionism’, to use Evgeny Morozov’s concept, has been making its way into law enforcement agencies worldwide, promising to smoothly, efficiently and effortlessly anticipate, predict, and control (future) criminal behaviour and deviance. But predictive policing has met with resistance from civil society and academics alike.

Even though data-driven predictions and algorithmic risk assessments are sold by tech developers as ‘neutral’ and ‘objective’ forms of ‘evidence’ and ‘intelligence’ – because technological – as something ‘solid’ and ‘hard’ in ‘liquid times’, critical social scientists tend to know better. What counts as data and how it is collected, what is included and what is excluded – all of this reflects historical, representational, cultural, gender, and other inequalities and biases. Prejudices about the criminality of certain groups can be built into crime data, reinforcing those prejudices rather than dispelling them. We increasingly read about systems trained on biased and ‘dirty’ data, about ‘rogue algorithms’ and ‘algorithmic injustice’, and about violations of human rights and civil liberties. As Cathy O’Neil put it, algorithms can create ‘a pernicious feedback loop’, where ‘policing itself spawns new data, which justifies more policing’ (O’Neil 2016: 87); the short simulation sketch below makes this loop concrete. Last year, acting on these insights, the city of Santa Cruz in California, one of the earliest adopters of predictive policing, became the first US city to ban the use of predictive technologies in policing. Calls for ethical, transparent and explainable AI are emerging both from within computer science, law and the social sciences, and from policymakers and civil society.

It is clear that neither the development nor the adoption of these technologies happens in a cultural, political or economic vacuum. In many countries, for instance, police forces are experiencing budget cuts and increasing pressure to outsource certain tasks to private actors, often accompanied by organizational reform. Demands on response time, results, performance, and efficiency are increasing while resources may be shrinking, structurally creating a market for a wide range of optimization tools for police work.

Simon Egbert and Matthias Leese have studied predictive policing, the datafication of security, and the transformation of police work ethnographically in Germany and Switzerland. In this podcast, we discuss in detail the reality behind the sleek commercials for predictive policing software tools that promise to forecast crime and control futures. Are we headed towards a dystopian society of total surveillance, social sorting, and control, or a utopia of a perfectly optimized police force? What futures lie ahead for predictive policing, and what will the police force of the future look like?
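For listeners curious about the mechanics, O’Neil’s feedback loop is easy to reproduce in a few lines of code. The Python sketch below is a deliberately simplified, hypothetical toy model (it is not the software Egbert and Leese studied, and the districts, rates, and patrol counts are invented for illustration): two districts with identical underlying crime rates, where patrols are dispatched each day to whichever district the historical record makes look ‘hottest’, and every patrol generates new records.

    # Toy sketch of a runaway feedback loop in predictive patrol allocation.
    # Hypothetical illustration only: both districts have the SAME true crime
    # rate, but district A starts with more *recorded* crime.
    import random

    random.seed(0)

    TRUE_RATE = 0.1                # identical underlying crime rate everywhere
    recorded = {"A": 60, "B": 40}  # biased starting data: A merely *looks* worse
    PATROLS = 50                   # patrols dispatched per day

    for day in range(365):
        # 'Predictive' allocation: follow the data to yesterday's hotspot.
        hotspot = max(recorded, key=recorded.get)
        # Patrols observe and record incidents at the true (equal) rate,
        # but only where they are actually sent.
        new_records = sum(random.random() < TRUE_RATE for _ in range(PATROLS))
        recorded[hotspot] += new_records

    total = sum(recorded.values())
    for district, count in recorded.items():
        print(f"district {district}: {count / total:.0%} of recorded crime")

After a year of ‘following the data’, virtually all recorded crime sits in district A even though the underlying rates never differed: policing itself has spawned the data that justifies more policing.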

Text © Tereza Østbø Kuldova, 2021

Produced with the financial support of The Research Council of Norway under project no. 313626 – Algorithmic Governance and Cultures of Policing: Comparative Perspectives from Norway, India, Brazil, Russia, and South Africa (AGOPOL).


