Lucid Dreams 


In the summer of 2019 I began experimenting with image and video generation using a machine learning library called Lucid.  The library exists to help visualize and explain how image-classification models trained on ImageNet are able to function.  It does this by looking at specific “activations,” the internal patterns a model uses to identify objects in photos.  I find the imagery so compelling because it draws a connection between the way we as humans find patterns in imagery and how machine learning models are able to find patterns.
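
To make that concrete, here is a minimal sketch of the kind of feature visualization Lucid enables, based on its published API.  The model choice and the layer/channel index are illustrative, not the exact activations behind the imagery in this post.

```python
# A minimal feature-visualization sketch using Lucid's published API.
# The layer/channel ("mixed4a_pre_relu", channel 476) is illustrative.
import lucid.modelzoo.vision_models as models
import lucid.optvis.render as render

# Load a pre-trained ImageNet classifier (InceptionV1 / GoogLeNet)
model = models.InceptionV1()
model.load_graphdef()

# Optimize an input image to maximally excite one channel of a
# mid-level layer -- the "activation" that yields the pattern imagery
images = render.render_vis(model, "mixed4a_pre_relu:476")
```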

I started using the library and writing my own custom code to take advantage of the abstract and often psychedelic imagery it creates.  Once the imagery has been generated, I often import it into multimedia programs like TouchDesigner and link further transformations to sound, as seen in the following video.



I have also set up functionality to dynamically distort the imagery based on incoming instruments.  In the following video, parameters are being modulated by the different notes coming from a guitar.
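
Inside TouchDesigner, that mapping comes down to a small Python callback.  The sketch below assumes a MIDI In CHOP feeding a CHOP Execute DAT; the operator names ('transform1') and the note-to-parameter scaling are placeholders for my actual network, not a definitive patch.

```python
# TouchDesigner CHOP Execute DAT callback (a sketch -- operator names
# are placeholders for the actual network).
# Channels arrive from a MIDI In CHOP with names like 'ch1n64'
# (channel 1, note 64); val carries the note's velocity.
def onValueChange(channel, sampleIndex, val, prev):
    note = int(channel.name.split('n')[-1])  # extract MIDI note number
    # Scale the guitar's rough range (E2 ~ note 40 to E6 ~ note 88)
    # into a 0-1 modulation amount
    amount = max(0.0, min(1.0, (note - 40) / 48.0))
    # Drive a Transform TOP's rotation from whichever note is played
    op('transform1').par.rotate = amount * 360
    return
```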





The following are GIFs showing the process of “interpolating” between two different pattern “activations.”  These GIFs are typically what I feed into TouchDesigner in order to generate the videos.
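
One way to produce an interpolation like this, assuming Lucid's objectives API, is to blend two channel objectives and render one frame per blend weight.  The layers and channel indices below are illustrative, and the GIF-writing step is omitted.

```python
# A sketch of interpolating between two "activations": blend two channel
# objectives and render a frame per blend weight.  Layer/channel choices
# are illustrative, not the exact ones behind the GIFs above.
import numpy as np
import lucid.modelzoo.vision_models as models
import lucid.optvis.objectives as objectives
import lucid.optvis.render as render

model = models.InceptionV1()
model.load_graphdef()

frames = []
for alpha in np.linspace(0.0, 1.0, num=16):
    # Lucid objectives support scalar weighting and addition, so a
    # linear blend of two channels is a single expression
    obj = (1 - alpha) * objectives.channel("mixed4a_pre_relu", 476) \
          + alpha * objectives.channel("mixed4b_pre_relu", 360)
    frames.append(render.render_vis(model, obj)[-1])

# 'frames' can then be written out as a GIF and pulled into TouchDesigner.
```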





I have also spent a lot of time focused solely on getting high-fidelity imagery out of these techniques, which has yielded a variety of results.