New Art City
Virtual Art Space

The catalog view is an alternative 2D representation of our 3D virtual art space. This page is friendly to assistive technologies and omits the decorative elements used in the 3D gallery.


surfacecollider

World: Six Minutes Re:search
Artist: James Irwin
Opening date: January 1st, 2021
[Main image for surfacecollider]

Statement:

This New Art City project with Six Minutes Past Nine builds on research undertaken over the last four years at the Contemporary Art Research Centre, Kingston School of Art, where James Irwin is pursuing a PhD on the point at which code becomes image (www.surfacecollider.net). The research bypasses photographic or lens-based image-making techniques to speculate on the kinds of images machines make. The project foregrounds a visual language channelled through the collaborative process of working, as a human subject, with computer software and hardware systems. The resulting images are often weird and alien, distanced from what the eye is familiar with seeing (outside of the psychedelic). These images offer little for us to cling to; the representational, which reaffirms ideas of self relative to the world we are familiar with, is left alone.

As we interact - through the screen, keyboard and mouse - with the code behind the work to bring the imagery to life, abstract patterns, colours and shapes are summoned to the rectangular frame of the computer display. The images move and flicker in ways similar to the cyclical rhythms of other lifeforms. James is interested in locating a vitalism within code-based imagery, one that alludes to or confirms how subjectivities extend beyond the boundaries of human bodies to inhabit machinic entities. This distributed subjectivity is consistent with - or perhaps speculates on - how modern life appears so fractured and anxiety-inducing. The project uses us (our human bodies) as a channel for synthetic imagery, and in doing so suggests a recombinant cyborg form. The human body is remodelled to include our computational appendages within its edges.

This repositioning of the subjective (authorial) voice folds back on itself and resurfaces through the text-based outputs of the project. These writings sketch out a contextual framework for the work using a fine-tuned custom version of the large language model GPT-3 to generate text. They are written in real time, at the point at which the user interacts with the webpages that house the imagery. As collaborative writings, in ways that mirror how the images are made, they recombine the singular actions of the user or viewer with those of the artist and artificial intelligence to come into being. They redistribute the single voice of the writer across multiple bodies to build a continually evolving archive of data-based writing that acknowledges the networked nature of more-than-human subjectivity.
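The statement does not publish the project's actual pipeline, but the general shape of such real-time collaborative generation can be sketched: the viewer's recorded actions are recombined with the artist's seed text into a prompt, which would then be sent to the fine-tuned model. Everything below - the seed text, the prompt format, the function names - is an illustrative assumption, not the project's own code.

```python
# Illustrative sketch only: seed text, prompt format and function names
# are assumptions, not taken from the surfacecollider project.
ARTIST_SEED = "Abstract patterns flicker at the edge of the frame."

def build_prompt(interactions, seed=ARTIST_SEED):
    """Recombine the viewer's recorded actions with the artist's seed
    text into a single prompt for a fine-tuned language model."""
    events = "; ".join(interactions)
    return f"{seed}\nViewer actions: {events}\nContinue:"

def generate(interactions):
    """Assemble the prompt; the model call itself is left as a stub."""
    prompt = build_prompt(interactions)
    # A completion request to the fine-tuned GPT-3 model would go here,
    # server-side; the response text would join the evolving archive.
    return prompt
```

Each interaction thus produces a slightly different prompt, so the generated archive differs for every viewer, mirroring how the images themselves are co-produced.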

The project has also been used to experiment with sound composition across 3D digital space. By sampling soundtracks ripped from YouTube ASMR videos and running them through digital processes in Ableton Live, various ticks and whispers originating from human voices and actions were modified to sound like machines. The individual layers of the Ableton track were then exported individually and positioned at different coordinates within the NAC world. These soundtracks are activated, and layer up in various combinations, as the user moves through the space. The repeating rhythms and patterns of sounds - made by machines, but from human beginnings - reinforce the visual elements of the work to produce a hyper-synthetic digitality.
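New Art City handles the positional audio itself, but the underlying idea - stems placed at coordinates, fading in and out with the listener's distance - can be sketched in a few lines. The layer names, positions and linear attenuation model below are illustrative assumptions, not details of the actual installation.

```python
import math

# Hypothetical stems placed in the 3D world; names and coordinates
# are illustrative, not taken from the actual project.
SOUND_LAYERS = {
    "whisper_loop": (0.0, 0.0, 0.0),
    "tick_pattern": (10.0, 0.0, 5.0),
    "machine_hum": (-5.0, 2.0, 8.0),
}

AUDIBLE_RADIUS = 12.0  # beyond this distance a layer falls silent

def layer_gain(listener, source, radius=AUDIBLE_RADIUS):
    """Linear distance attenuation: 1.0 at the source, 0.0 at the radius."""
    d = math.dist(listener, source)
    return max(0.0, 1.0 - d / radius)

def active_mix(listener):
    """Gains of all layers audible from the listener's current position."""
    gains = {name: layer_gain(listener, pos) for name, pos in SOUND_LAYERS.items()}
    return {name: g for name, g in gains.items() if g > 0.0}
```

As the visitor moves through the space, `active_mix` returns a different set of overlapping gains, so the stems layer up in shifting combinations, much as the statement describes.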