Truncation #1: Daubing the Latent Space

Editions available 2 of 2
Published: 2021
Number of Artist Proofs: 1
Resolution: 1920 × 1080 px
File Format: mp4
File Size: 125.8 MB
Duration: 1 min 40 sec

This exclusive work is part of the "Truncation" series by Liam Power. What do neural networks look like? How does a computer learn? Trained on a dataset spanning the cumulative history of artworks from antiquity to modernism, a neural network steps through an interpolated series of numbers. By chance, these numbers correlate to an artwork, a hyperreal simulation of the original. In this work, we get a glimpse inside the neural processing unit of a computer. Tensors shift and flow. A generative network splits attention into hemispheres. Artificial paint shifts in smooth interpolated waves, truncation incrementing in the latent space. Simulated neurons fire infinite variations on a machine dream. A simulated soundscape traverses the network in a feedback loop. Includes an HQ .mp4 for inclusion in online/offline realities.
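The "truncation incrementing in the latent space" described above corresponds to two standard generative-network techniques: interpolating a path between latent vectors and the truncation trick, which pulls samples toward the latent mean to trade diversity for fidelity. A minimal NumPy sketch of both, assuming a hypothetical 512-dimensional latent space whose rows a generator would decode into video frames (the artist's actual model and parameters are not specified):

```python
import numpy as np

def truncate(z, psi=0.7, z_mean=None):
    # Truncation trick: scale a latent vector toward the mean.
    # psi=1.0 leaves z unchanged; smaller psi truncates harder.
    if z_mean is None:
        z_mean = np.zeros_like(z)
    return z_mean + psi * (z - z_mean)

def interpolate(z0, z1, steps=16):
    # Linear interpolation through latent space; each row is a
    # point a generator could decode into one frame of video.
    ts = np.linspace(0.0, 1.0, steps)
    return np.stack([(1.0 - t) * z0 + t * z1 for t in ts])

rng = np.random.default_rng(0)
z0 = rng.standard_normal(512)
z1 = rng.standard_normal(512)
path = interpolate(truncate(z0), truncate(z1), steps=16)
print(path.shape)  # (16, 512)
```

Smoother "wave-like" transitions, as seen in the work, are often achieved by interpolating on a sphere (slerp) or easing the step schedule rather than stepping linearly.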

Liam Power

7 Artworks

Liam Power (b. 1991) is a multidisciplinary artist and creative coder based in Melbourne, Australia. He works mainly with video, sound and data. His work primarily questions and critiques the boundaries of systems and their function. These systems could be computing technologies, perceptual and c...


More artworks by Liam Power


Liam Power

Synthetic: 004 -- Cybernetic Landscape (2020)

Editions 2 of 2


Art. Original. Blockchain. 

© elementum.art - v0.3.58
* Ether prices marked with an asterisk are estimates only, as they are subject to exchange rate fluctuations. The final Ether price is determined upon checkout.