Jake Elwes (/ˈɛlwɪs/) is a British media artist, hacker, radical faerie, neuroqueer, and researcher. Their practice explores artificial intelligence (AI), queer theory and technical biases.[1] They are known for using AI to create art in mediums such as video, performance and installation.[2] Their work on queering technology addresses issues caused by the normative biases of artificial intelligence.[3][1]
Elwes was born in London to British contemporary artist and painter Luke Elwes (grandson of painter Simon Elwes RA, from a landed gentry family, and of Army officer James Hennessy) and Anneke, daughter of Hans Dumoulin, of Farnham, Surrey.[4][5]
They studied at the Slade School of Fine Art from 2013 to 2017, where they began using computer code as a medium.[2]
In 2016 they attended the School of Machines, Making & Make-Believe in Berlin with artist and educator Gene Kogan.[2]
Elwes was introduced to drag by Dr Joe Parslow,[6] who holds a PhD in drag performance; drag has since become instrumental to Elwes' work.[1]
Installations projecting conversations between neural networks
Elwes has created works based on conversations between two neural networks, including Auto-Encoded Buddha (2016), Closed Loop (2017) and A.I. Interpreting ‘Against Interpretation’ (Sontag 1966) (2023). In Auto-Encoded Buddha, a computer struggles with the notion of Buddha's philosophy; the work is Elwes' tribute to Nam June Paik's TV Buddha (1974).[2] A.I. Interpreting ‘Against Interpretation’ (Sontag 1966) challenges the idea of ‘prompting’ in linguistic image generators. Elwes programmed a system in which an image-generating diffusion model interprets Susan Sontag's essay Against Interpretation and passes the results to an image-labelling algorithm that translates them back into language.[19]
Knowing that facial recognition technology statistically struggles to recognize black women and transgender people, Elwes set out to "Queer the Dataset" using an open-sourced generative adversarial network (GAN). Elwes added 1,000 photos of drag kings and queens to the 70,000 faces of the Flickr-Faces-HQ (FFHQ) dataset on which the GAN was trained, then generated new simulacra faces, known as deepfakes.[1]
Zizi & Me is a performance and video installation that pairs drag queen 'Me The Drag Queen' with her deepfake AI clone in a joint performance.[20]
The Zizi Show is a deepfake drag act created with artificial intelligence (AI). It has been presented both live and as an interactive online artwork, and explores queer culture alongside the philosophy and ethics of AI algorithms.[21]
In their video work CUSP (2019), Elwes places marsh birds generated using artificial intelligence into a tidal landscape. These digitally generated, constantly shifting birds are recorded in dialogue with native birds, and the work is accompanied by a soundscape of artificially generated birdsong.[22]