This exclusive work is part of the "Truncation" series by Liam Power. What do neural networks look like? How does a computer learn? A neural network trained on a dataset spanning the cumulative history of artworks from antiquity to modernism steps through an interpolated series of numbers. These numbers, by chance, correlate to an artwork: a hyperreal simulation of the original. In this work, we get a glimpse inside the neural processing unit of a computer. Tensors shift and flow. A generative network splits attention into hemispheres. Artificial paint shifts in smooth interpolated waves, truncation incrementing in the latent space. Simulated neurons fire infinite variations on a machine dream. A simulated soundscape traverses the network in a feedback loop. Includes an HQ .mp4 for display in online/offline realities.
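For the curious, here is a minimal sketch of what "truncation incrementing in the latent space" can mean in practice, assuming a generic GAN-style generator; the names below (generator, dim, psi range) are illustrative assumptions, not the artist's actual pipeline:

    import numpy as np

    # Minimal sketch: walk between two latent vectors while the truncation
    # value psi increments each frame. All specifics here are assumptions
    # for illustration, not the artwork's real code.
    rng = np.random.default_rng(42)
    dim = 512                                  # assumed latent dimensionality
    z_start = rng.standard_normal(dim)
    z_end = rng.standard_normal(dim)
    z_mean = np.zeros(dim)                     # latent mean; real models estimate it

    steps = 120
    for i in range(steps):
        t = i / (steps - 1)
        z = (1 - t) * z_start + t * z_end      # smooth interpolation between latents
        psi = 0.3 + 0.7 * t                    # truncation increments over the walk
        z_trunc = z_mean + psi * (z - z_mean)  # pull the sample toward the mean
        # frame = generator(z_trunc)           # hypothetical: render one frame

Lower psi values pull samples toward the average image, giving safer, more typical outputs; as psi increments toward 1, the walk drifts back out toward the stranger edges of the learned distribution.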
Liam Power (b. 1991) is a multidisciplinary artist and creative coder based in Melbourne, Australia. He works extensively with video, sound, and data. His work primarily questions and critiques the boundaries of systems and their function. These systems could be computing technologies, perceptual and c...