Creating Responsible AI Tools with Artists at PublicSpaces 2025
Waag Futurelab

Generative AI tools such as chatbots and image and voice generators create ethically fraught situations: copyright violations, deepfakes, and more. Artists and creatives are questioning the ethics and originality of works generated by AI platforms. If an AI’s ability to make art comes from learning from millions of human artists, who should get credit, control, or compensation for what the AI produces? This leads to a broader question: what forms of AI do we truly need? On 13 June 2025, HAMLET’s Waag team organised a panel at the PublicSpaces Conference in Amsterdam, titled 'Creating Responsible AI Tools with Artists', where these questions were discussed.

During the panel, Joumana Mourad (artistic director of IJAD Dance Company), James Patton (independent narrative game designer), Flavia Dzodan (lector in algorithmic cultures), and Sabine Roeser (professor of ethics of emerging technologies) discussed art as an ethical practice in the creation of AI tools.

AI and dance

Joumana Mourad spoke about the OOTFest25 (Open Online Theatre Festival), where dancers and choreographers are using AI to enhance somaesthetic creativity, working with the body, sensations, and emotions. For Mourad, AI is not a replacement but a tool shaped by the creator’s imagination. Artists need to understand AI to work critically and productively with it.

A Marxist perspective

James Patton approached the discussion from a Marxist perspective, situating AI in a long trajectory of technologies that have displaced labour and creativity, such as factory machines, cameras, and recording devices. He reminded us that protection of creative labour has historically come through collective action and unionisation: royalties, for example, arose from union organising in response to the invention of recording technology. Patton argued that the current development of AI in creative fields demands similar collective safeguards to protect creators against capitalist enclosure.

AI as infrastructural regime

Flavia Dzodan explained her research on affective logistics and the infrastructure of algorithmic technologies. Affective logistics describes the contemporary infrastructural regime through which emotion is operationalised, routed, and optimised. This can take many forms, from user-interface designs that produce dependency to biometric surveillance systems that claim to determine whether a refugee is lying. Dzodan argued that most mainstream discussions of AI flatten all algorithmic work into the same register, which pressures artists to participate in big tech platforms’ extractive and exploitative practices. She also cautioned that algorithms and AI systems that capture people’s data and use it to predict the probability of criminality are highly problematic and harmful.

Biases, emotions and empathy

Sabine Roeser emphasised that AI systems often reflect and amplify prejudices that cause harm in societies. A case in point is the Dutch childcare benefits scandal (Toeslagenaffaire), which was linked to racist biases programmed into the AI system. Roeser argued that emotions are vital for ethical judgment, and that art uniquely promotes imagination and empathy, enabling humans to envision alternative futures.

Together the speakers emphasised the importance of artistic critical intervention into how AI systems are designed, built, and used. We need to ask the Hamletian question: AI or not AI? If so, for what, for whom, how, and at what cost?

Watch the recording


Funded by the European Union under grant agreement number 101178362.