
Biased Technology: Behind Shirley

Technology is not neutral. It is, instead, a reflection of our cultural and political beliefs, including our prejudices, biases and racism. In this series, we try to get to the root of the matter: as we search for solutions, we highlight concrete examples that show how these biases end up built into technology.

In the 1950s, Kodak, the major US photography company, introduced its so-called ‘Shirley card’: a tool intended to make it easier to calibrate colours in photography. The card shows a portrait of a white woman – Shirley Page, a former Kodak employee – surrounded by coloured blocks (pink, yellow, green, blue, red, black, grey and white). It worked as follows: the photographer would photograph the physical Shirley card in one of the thirty-six frames of the film roll, and the printer could then calibrate the lighting and colours against this reference card. If Shirley’s skin came out a clear white, the photo was considered well lit.

Shirley’s light skin served as the ‘neutral’ standard. But the system had one massive flaw: it could not properly render darker tones such as brown, which led to faults when photographing people with darker skin. It shows us that photography is not just a system for recording light, but a technology shaped by subjective decisions. By using a white woman as the norm, an unconscious racial bias was built into the technology of photography. Seventy years on, the default towards lighter skin in technology is still present in facial recognition software.

[Image: the Kodak Shirley card]

Artist Ibiye Camp’s installation ‘Behind Shirley’ explores this racial bias in film processing chemistry and in facial recognition software. The installation is part of the exhibition Digital Shadows, on view until February 26th at OBA Oosterdok. Below, a conversation with Camp about the work.

Behind Shirley  

‘Behind Shirley’ (2020) deconstructs and rethinks the colonial narratives in the development of facial recognition systems. The installation explores how darker skin was not taken into account in film processing chemistry and is now overlooked by facial recognition software. It comprises a video-essay, two 3D-printed busts and an augmented reality application.

[Image: ‘Behind Shirley’ by Ibiye Camp, Digital Shadows exhibition]

Camp’s initial idea for this artwork came from working with photogrammetry: the art and science of extracting 3D information from photographs. You feed pictures into the software, which stitches the imagery together into a single 3D model.
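
To make that stitching concrete: below is a minimal sketch (not Camp’s actual pipeline) of photogrammetry’s first step, finding matching feature points between two overlapping photos, using OpenCV in Python. The filenames are placeholders; a full pipeline would go on to triangulate these matches into 3D points.

    # Sketch of photogrammetry's first step: finding shared feature points
    # between two overlapping photos. Later stages triangulate matched
    # points into a 3D model. Filenames are illustrative placeholders.
    import cv2

    img_a = cv2.imread("market_view_1.jpg", cv2.IMREAD_GRAYSCALE)
    img_b = cv2.imread("market_view_2.jpg", cv2.IMREAD_GRAYSCALE)

    orb = cv2.ORB_create(nfeatures=2000)
    kp_a, desc_a = orb.detectAndCompute(img_a, None)
    kp_b, desc_b = orb.detectAndCompute(img_b, None)

    # Cross-checked brute-force matching keeps only mutual best matches.
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(desc_a, desc_b), key=lambda m: m.distance)
    print(f"{len(matches)} candidate point correspondences")

    # Low-contrast regions yield few keypoints, which is one reason a scan
    # can come out 'voided', as Camp found in the Lagos market.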

When scanning the busy market in Lagos (Nigeria) with her phone, Camp noticed how, because of the high-contrast lighting and the crowds of moving people, the scans came out voided and incomplete. She found that the space was not easily recognisable for Google (via Maps). Besides street scenes, she also wanted to portray the people in the market.

[Still from ‘Behind Shirley’ by Ibiye Camp, 2020]

Glitches showed her how black people’s faces were often not recognised as faces. The software was unable to separate black people from the brown walls behind them, so the two sometimes morphed into each other. This misidentification, especially of black women, made Camp delve deeper into how this software is designed and, more importantly, for whom. Earlier research had already shown that facial recognition technology is subject to biases stemming from the data sets it is given and the conditions in which its algorithms are created.
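
The kind of silent failure Camp observed can be reproduced with off-the-shelf tools. The hedged sketch below runs OpenCV’s stock Haar-cascade face detector; because a detector only finds what resembles its training data, a low-contrast face against a dark background can legitimately return zero detections. The image filename is a placeholder.

    # Running OpenCV's stock Haar-cascade face detector. A detector only
    # finds what resembles its training data; faces with little contrast
    # against the background may simply return no detections at all.
    import cv2

    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )
    frame = cv2.imread("market_portrait.jpg")  # placeholder filename
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    gray = cv2.equalizeHist(gray)  # boost local contrast before detecting

    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    print(f"faces found: {len(faces)}")  # zero is a real, silent outcome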

A people’s problem  

Misidentifying and failing to recognise black people brought Camp to Joy Buolamwini’s work. ‘Poet of code’ Buolamwini became a great inspiration during Camp’s research into biases in technology. Her research on bias in artificial intelligence (literally) illuminated how the history of misidentification and the preference for light skin, which started in colour photography in the 1950s, is still at work in today’s facial recognition software. This software is developed and tested mostly by white men in North America, drawing on deficient data pools: the collected data does not represent the whole of society, which leads to inaccurate facial recognition technology.
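
Buolamwini made such disparities visible by reporting error rates per demographic subgroup rather than one overall score. A minimal sketch of such a disaggregated audit, with purely illustrative records, might look like this:

    # Disaggregated accuracy audit in the spirit of Buolamwini's work:
    # report error rates per subgroup instead of a single overall score.
    from collections import defaultdict

    # Each record: (subgroup label, was the prediction correct?)
    # These records are illustrative, not real benchmark data.
    results = [
        ("lighter-skinned male", True), ("lighter-skinned male", True),
        ("darker-skinned female", False), ("darker-skinned female", True),
    ]

    tally = defaultdict(lambda: [0, 0])  # subgroup -> [errors, total]
    for group, correct in results:
        tally[group][0] += 0 if correct else 1
        tally[group][1] += 1

    for group, (errors, total) in tally.items():
        print(f"{group}: error rate {errors / total:.0%} (n={total})")

A single overall accuracy number would hide exactly the disparity this breakdown exposes.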

‘Scanning software is mostly tested in Western, rather pristine and technology-friendly places’ – Ibiye Camp  

It became clear that the Shirley card formed the basis of how people in photography recognised skin tones and matched colour, and it had a huge impact on how society saw people. ‘Standardised ideas of colour and skin tone have now been translated into AI,’ Camp says. In her video-essay, which explores present-day tensions and biases against dark skin, glitches also invited Camp to think differently.

‘There is a positive and also a terrible negative of being misidentified’ – Ibiye Camp  

Camp argues that if people of colour and/or queer people are in a way ‘unreadable’ to technology, maybe we can think of a new type of technology. She continues: ‘My work is in that sense meant to be speculative and questions what a potential future could look like. I particularly enjoy augmented reality as a way of designing and re-imagining the future.’ As the AI voice in her video-essay explains: it is not really an AI problem that people are being misidentified; it is more of a people’s problem. Bias comes from humans, not from AI.

What can we do?  

‘Be critical about the types of software you use,’ Camp answers. ‘I, myself, am using Spark AR, which is part of Meta,’ she says, ‘and I probably shouldn’t.’ Camp notices how many people in the digital community are currently writing new code and sharing it: ‘It is important to share conversations and code, so we can start to dilute the mainstream data pools that are gathered in North America.’ More and more evidence points towards the need to diversify not only the collected data sets but also the people who create, test and deploy technologies such as facial recognition.

Visit the exhibition Digital Shadows at OBA Oosterdok and join one of the guided tours! On view until February 26th 2023.

Or come to the Digital Shadows Performance Evening on Friday February 24th at 19:30, where artist Dani Ploeger will give a lecture-performance around his short sci-fi film The Cults.

Read more about Ploeger's artwork in the Dutch article in De Volkskrant: ‘The Cults’ is een verrukkelijke mengelmoes van scifi-droom en cultuurkritiek (‘The Cults’ is a delightful mishmash of sci-fi dream and cultural critique).