Datasorting - credit Caroline Sinders / Open Design Lab

Interview: Artist Caroline Sinders on Feminist Data

A freer and more just technological future: this is key to the work of machine-learning-design researcher and artist Caroline Sinders. She examines the impact of technology on society, artificial intelligence and politics. Waag spoke to Sinders about thoughtful and intentional data collection, social justice and community participation in technology.

Hi Caroline! Last September, during Waag’s Transformer Summit, we had the pleasure of speaking with you and attending your workshop on your project, Feminist Data Set. Could you tell us briefly what it is about?

'Feminist Data Set is one of my art projects. In it, I'm using intersectional feminism as an investigatory framework to look at machine learning. As an artist, I'm really interested in speculative design and critical design: trying to build something related to an ideology rooted in social justice. That is why I take a look at every step in the 'pipeline' of machine learning, the process that leads to making an algorithm. I try to build and rebuild the machine learning process, imbued with intersectional feminism.

I'm interested in what machine learning looks like. What are the processes and the procedures? What are the protocols it entails? Data collection is an important part of machine learning. That's where the first step comes in, with Feminist Data Set. In the project, I organise workshops where we talk about intersectional feminism and data collection. Then, we try to slowly gather data. The goal is the opposite of traditional data collection, where you more or less vacuum up as much data as possible. It's really about trying to be thoughtful and intentional. I like to think of it as slow data, like farm-to-table data. How do you slowly and intentionally gather data? This way, a data set actually becomes an archive.

I'm very interested in reframing data as a precious material, because in a way, all data come from people. Even data that are really abstract or technical: at some point, a person was involved in determining why those data were generated or collected. So even if it’s just two computers analysing each other for performance, at a certain point, a human made the decision that that should be analysed.'

Higher Resolutions - credit Caroline Sinders

What does the term intersectional feminism mean to you?

'To me, intersectional feminism means recognising that a person has many different parts to their identity. So if you're a woman, are you a cis woman? Where are you from? How old are you? What is your skin tone? Are you disabled? What is your race? All of those aspects are parts of your identity and change your relationship with society. So you're not just, for example, a cis woman or a cis man. Your lived experience and aspects of your identity contribute so much more to how you are treated by society.

Intersectional feminism is recognising that we all come to the table with different kinds of privilege, or lack of privilege. It’s not enough to present feminism as one-size-fits-all. We have to acknowledge and understand that, for example, black trans women are treated differently than white cis women, or that a disabled person is treated differently than a non-disabled person. Feminism and politics need to reflect that kind of diversity.'

Datasorting - credit Caroline Sinders

What do you think of Code for Children’s Rights, a project developed by Waag and commissioned by the Dutch Ministry of the Interior and Kingdom Relations?

'I think it's really important to think about how children are treated and viewed, especially in relationship to technology. How can we involve children and teenagers more in policy, particularly when this policy affects them? That's something I'm really passionate about. I also think it's important that, when we think about or analyse technology, we don't, as adults, place our own fears onto children, but that we also understand that there are so many issues with technology and technology platforms that can exacerbate some of the problems that teens and children may be facing.

When you're doing work that is about a group of people, it is important to actually involve those people in the research process and in the policy process. Children are no different. It's important that they have a say in policy and technology, and also that when we're crafting policy or building technology, we are building with them and not for them. By that, I mean really understanding what they need and want. What are their hopes and dreams with technology? There should be different kinds of social networks for children, but those actually have to be designed differently than those for adults. All kinds of different protections are needed. Then, how do you also design it in a way that reflects the needs and wants of children? It could be completely different from how we could ever imagine a social network.

Also, how do we think about harm reduction and harm prevention or mitigation in a different way than we do for adults? There are many reported issues of how adolescents, particularly teenage girls, have lower self-esteem in relation to Instagram and Facebook. How do we think about those harms in a very real way? We have to recognise that children and teenagers are using technology and the internet, and it's impossible to keep them off of it. How do we design safe spaces for them to gather and congregate, that also reflect what they want?'

Stop that feature - credit Caroline Sinders

Do you see a relation between Code for Children’s Rights and Feminist Data Set? 

'I think they're related in the sense of my practice. As an artist and designer, I care a lot about social justice and technology. Community participation in technology is of the utmost importance. Feminist Data Set works by translating my research into workshops. These workshops are key in terms of getting feedback and situating people's thoughts, desires and input around what technology, machine learning and algorithms can be. If we're creating any kind of social justice in technology for people, we have to have input and buy-in from the communities we're researching.

This input can be gathered either via direct participation or co-design, or via specific feedback in which the voices of the communities are centred. Their concerns are then reflected in the design of the technology. A lot of my work centres around that idea. It's mostly the ethos of the project and the way you as a designer go about doing the project. That's how Feminist Data Set and Code for Children’s Rights are related.'

What projects are you currently working on?

'I'm working on a few different things. I'm writing a book about responsible design and machine learning. I'm also teaching right now: I’m a lecturer at the London College of Communication in the data visualisation programme. In addition, I'm working on a project with the Art Institute of Chicago on oil, gas and energy infrastructure. And in Louisiana, where I'm from, Jamie Allen and I are starting a small project that thinks about metaphors and carbon capture. For example, when people say things like ‘this is equal to twenty long-haul flights’, what does that really mean?

This year I’m hoping to organise workshops together with Waag on the subject of TikTok, to help people understand the privacy and security issues around these types of opaque systems. What are exercises you can do that require almost no coding? By doing these types of experiments, we can see what the algorithms do, and intervene.'

Would you like to stay updated on Caroline's work? Follow her on Instagram and Twitter, or go to carolinesinders.com for more info.


Artsformation has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement no. 870726.