CFP: Algorithmic Bias, Phallicism & Counter-Insurgency: Understanding the Racialized Male Target
Submission deadline: April 15, 2026
Hosted by the Clay-Gilmore Institute for Philosophy, Technology, and Counterinsurgency (CG-IPTC) in collaboration with the Algorithmic Bias Project (Canada) and the Centre for Ethics, University of Toronto
Workshop (in-person & online): Summer 2026
Conference (University of Toronto): Winter 2027
Edited Anthology (same title): 2027–2028
Overview
The Clay-Gilmore Institute for Philosophy, Technology, and Counterinsurgency (CG-IPTC), in collaboration with the Centre for Ethics at the University of Toronto, invites abstract submissions for an interdisciplinary workshop and subsequent international conference addressing how racism is being reproduced through AI and how AI technologies can be located within the long history of slavery and colonization.
This initiative situates algorithms, data infrastructures, and AI-enabled systems of surveillance within longer genealogies of colonial militarism, genocide, racial capitalism, and counterinsurgency doctrine. We are especially interested in work that theorizes and historicizes the racialized male body as a primary site of technological targeting, focusing on how Black and racialized men have been repeatedly constructed as objects of risk, control, expendability, and elimination across colonial, military, and data-driven regimes. This project develops what we call the technologization of counterinsurgency: the translation of racialized fear, militarized governance, and tactical logics into algorithmic systems of prediction, classification, and surveillance.
We seek 10–12 contributors whose work will form the basis of:
• a Summer 2026 in-person workshop
• a Winter 2027 conference at the University of Toronto
• an edited anthology under the same title
Core Themes
Submissions should engage one or more of the following themes:
Algorithmic Targeting and the Racialization of Risk
Phallicism, Gendercide, and the Political Construction of the “Dangerous Male”
Counterinsurgency Logics in Contemporary AI Systems
Genocide Studies and Slow Violence: From Camps to Code
Necro-Being, Social Death, and Digital Ontologies of the Racialized Male
Militarized Data and the War Origins of Artificial Intelligence
Mapping the Racialized Body: Computer Vision and the Politics of Recognition
Statistical Objects and the Colonial Invention of Populations
Philosophy of Technology and the Myth of Neutral Systems
Predictive Policing, “Pre-Crime,” and Temporal Violence
Resistance, Refusal, and Counter-Surveillance Practices
Art, Visualization, and the Algorithmic Imagination
We particularly encourage work that:
• Connects AI technologies to colonial, genocidal, and military histories
• Engages Black Male Studies, Africana philosophy, Black Power thought, and especially Phallicism
• Analyzes facial recognition, predictive policing, gang databases, drone warfare, biometric surveillance, risk modeling, or “pattern-of-life” technologies
• Employs philosophical, historical, ethnographic, legal, technical, artistic, or data-driven methods
Submissions may be traditional academic papers or include creative, visual, or experimental components.
Submission Guidelines
Please submit the following materials by April 15, 2026:
• Abstract (300–500 words)
• Short bio (max 150 words)
• Institutional affiliation (if any)
• Contact information
Send submissions to: [email protected]
Subject line: Algorithmic Bias/CG-IPTC CFP – [Your Last Name]
We strongly encourage submissions from:
• Early-career researchers
• Black, Indigenous, and racialized scholars
• Scholars from the Global South
• Independent researchers and artists
• Community, abolitionist, and activist practitioners
Limited travel support may be available for selected participants.
Timeline
Abstract deadline: April 15, 2026
Decisions announced: May 15, 2026
Workshop (selected participants): July 2026
Full paper drafts due: December 2026
Conference: March 2027 (University of Toronto)
Final revised papers due: May 2027
Edited anthology publication: 2027–2028
About the Hosts
The Clay-Gilmore Institute for Philosophy, Technology, and Counterinsurgency (CG-IPTC) is an independent research institute dedicated to examining the connections among artificial intelligence, liberal humanism, racialization, and militarized state power. Through philosophical inquiry, historical analysis, and data-driven research, it investigates the technological infrastructures that govern life, death, and social control. Website: https://www.cg-iptc.org
The Centre for Ethics, University of Toronto is a leading interdisciplinary research centre engaged in critical inquiry into emerging technologies, governance, and public life. It supports innovative scholarship at the intersection of ethics, science, and society.
This collaboration responds to the urgent need to interrogate the role of AI in the reproduction of racialized violence, population control, and the management of life and death in the contemporary world. Website: https://algorithmicbias.ca