Coded Colour

Classroom Project
M.Des 2019
Photography Design

This work looked at the algorithmic bias of artificial intelligence-based image-generating software. Automated systems are not inherently neutral: they reflect the priorities, preferences and prejudices of those who have the power to mould artificial intelligence. The outcome was a grid of 540 portraits investigating the nature of these systems.

We are entering a world where machine-learning algorithms are used every day, influencing interactions, opinions and thoughts, and thus shaping society. This gradual change in how we interact with the world raises certain questions –
  • Who designs these algorithms, and how are they deployed?
  • Who decides what level of accuracy is acceptable?
  • Who decides which applications of the technology are ethical?
  • Are these systems inclusive? 

To delve into identity politics, I started by exploring various StyleGAN systems to generate a “self-portrait” using existing datasets. I found that one such system categorizes a face into six different races; the user can adjust sliders for these inputs to make a face look more like a certain race. This prompted me to dive deeper into the process and identify what the system understands when I input racial characteristics. Through this exercise, I came to question the nature of the algorithm itself, and I shifted my approach to look beyond self-identity towards the larger politics of identity in the world of algorithms.
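The slider interaction described above typically works by moving a face's latent vector along a learned attribute direction; stronger slider values push the generated face further towards that attribute. The sketch below is a minimal illustration of that idea, not the actual system used in the project: the function name `apply_attribute_slider`, the 512-dimensional latent, and the random "attribute" axis are all hypothetical stand-ins.

```python
import numpy as np

def apply_attribute_slider(z, direction, strength):
    """Shift a latent vector along a learned attribute direction.

    z: latent vector (e.g. 512-dim, as in StyleGAN)
    direction: vector encoding an attribute the model has learned
    strength: slider value; sign controls which way the attribute moves
    """
    unit = direction / np.linalg.norm(direction)
    return z + strength * unit

# Toy demonstration with a random latent and a random "attribute" axis
rng = np.random.default_rng(0)
z = rng.standard_normal(512)
axis = rng.standard_normal(512)

edited = apply_attribute_slider(z, axis, strength=3.0)
# The edit moves the latent exactly `strength` units along the axis
moved = float(np.dot(edited - z, axis / np.linalg.norm(axis)))
print(round(moved, 3))  # → 3.0
```

In a real system the attribute directions are fitted from labelled data, which is precisely where the biases of the training set and its labellers enter the model.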

This project serves as an entry point for raising questions about systems that are becoming essential to our daily lives.

I see this project taking more nuanced approaches and investigating the various social implications of artificial intelligence. Since my future practice as a designer/photographer/artist will happen in a world governed by AI, this project made me more sensitive towards inclusivity and taught me that asking small questions can help me look at socio-cultural politics from a broader perspective.


“Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification” (Joy Buolamwini and Timnit Gebru).

Conversations with my brother about neural-network-based systems