The 'Future Emerging Art and Technology' (FEAT) project has resulted in the publication of an illustrated catalogue that documents the processes and outcomes of unique, in-depth collaborations between artists and scientists exploring synthetic materials, nuclear time measurement, quantum physics and quantum computing, gene regulation, high-performance computing, and underwater swarm-robotic cultures.
These cutting-edge collaborations grew out of FEAT, a support initiative funded by the EU's FET Open programme (FET: Future and Emerging Technologies). FEAT informs and advises the European Commission on best-practice methodologies for the arts to engage meaningfully with techno-scientific research and developments in emerging technologies, considering complementary methods, critical reflection, wider public engagement, and potentially enhanced uptake of future technologies.
The FEAT project invited new and recently started FET-supported research projects to be paired with artists selected by a jury through an open call for participation. After a two-day gathering in which researchers and artists met and learned about each other's work, the artists chose to work with the projects that most inspired them. Six pairs were selected for a fully funded nine-month collaboration, working together embedded in laboratory settings, studios, and workshops. All the invited FET projects accepted the FEAT invitation, and no fewer than 267 high-profile artists responded to the open call.
The results of FEAT are diverse in how they address the original research question of the roles art can take in collaboration with the techno-sciences: they range from communicating new scientific research from an aesthetic perspective to posing new research questions and critiquing emerging techno-scientific developments and their ethical implications.
The catalogue is published under a Creative Commons license (BY-NC-SA).
This project has received funding from the European Union’s Horizon 2020 research and innovation programme under grant agreement no. 686527.