
In the image (left to right): (moderator) Nadine Ridder, (speakers) Masuma Shahid, Radical Data (Jo Kroese & Rayén Jara Mitrovich), Nico Voskamp, Megan Thomas
For the final special episode of Designing Cities for All, titled "AI & Identity: Challenges for the Marginalised Communities," I had the opportunity to expand on my ongoing research around body, identity, and AI. I invited a diverse group of speakers from different points of view and practices to address justice, human rights, and liberation perspectives amid the widespread implementation of digital technologies. Together, we interrogated the disproportionate impact of AI systems on marginalized groups. Even as the deployment of algorithmic models accelerates, the inclusivity of these structures has not improved at the same pace. In fact, they exacerbate discrimination, going as far as models built to infer a person's sexual orientation from their facial features. This makes the trajectory of these technologies all the more urgent, as they are becoming tools to govern societies.

This program aimed to uncover the intersections between AI, body, and identity, and to discuss the extent to which identification systems can be developed, or where they fail. As gender and sexual identity are crucial parts of one's identity, we explored whether data structures could ever reach the complexity needed to accurately 'decode' this multiplicity. While bringing forward cases in which artificial intelligence discriminates against certain groups, the speakers also highlighted affirmative and alternative ways of being within algorithmic structures. My goal was for the discussion to raise awareness of non-discriminatory, just, and accountable systems of data. The speakers shared how their practices move advocacy beyond (always much-needed) critical discussions around AI and toward concrete ways of dismantling the injustices it generates.
Biometric data (our physical, physiological, or behavioral characteristics) is collected by surveillance systems, leaving no part of our bodies unanalyzed. This information is automatically sorted into rigid classifications with no room for 'error' or divergence. In this translation from data to meaning, queer individuals are systematically excluded. Many incidents show how misidentification has detrimental effects on the everyday lives of LGBTQ+ communities.

