AIs, deepfakes, vox avatars

This stuff is the stuff of dreams and a subset of contemporary reality. I’m interested, but I tend to keep my distance from engaging too heavily with new tech until there’s a solid reason to; life is already full of enough tech to occupy my free time. But my next project, CORM, does seem to necessitate that I begin a deeper study of what is out there.

My initial dig is only a cursory study so far, but I’ve been tracking Holly Herndon’s work, and I’m curious about groups like https://modulate.ai/ and https://lyrebird.ai. And of course Google has its own products, like Google Cloud’s AI and Machine Learning services: https://cloud.google.com/products/ai/. Deepfakes are their own thing, the most famous being Barack Obama’s announcement of the danger of deepfakes. There are deepfake Trump heads spewing garbage all over the place, and deepfake porn.

Conceptually, the intent of CORM, which stands for Consequence of Recursive Memory, is to employ real-time training of a system through conversation to convey an example of how recursion is used in memory. The word Consequence hints at what I’m getting at, which is to be critical of how this plays out in human memory. Of course this assumes a lot, primarily that recursion in human memory exists. I have this idea of hinting at how humans, because of this storage mechanism, reference themselves in their own conception of themselves and of outside/exterior persons and objects. Past, present, and future are all relative, and inextricably wound up with the self.
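
To make the idea concrete for myself, here’s a toy sketch in Python. It is not CORM, and it isn’t real-time training of anything; all the names in it (RecursiveMemory, remember, recall) are made up for the example. It just shows the shape of the thought: every new memory is stored as a reference to the memory before it, so any act of recall has to pass back through everything the self has laid down since.

```python
# Toy illustration of "recursive memory": each stored memory is built from
# the new input plus a reference to the memory that preceded it, so
# recalling anything means unwinding a self-referential chain.
# These names are hypothetical, invented for this sketch only.

from dataclasses import dataclass
from typing import Optional


@dataclass
class Memory:
    """One remembered moment, which refers back to the memory before it."""
    content: str
    previous: Optional["Memory"] = None


class RecursiveMemory:
    def __init__(self) -> None:
        self.latest: Optional[Memory] = None

    def remember(self, utterance: str) -> None:
        # The new memory wraps the old one: the past is stored *through*
        # the present act of remembering, not alongside it.
        self.latest = Memory(content=utterance, previous=self.latest)

    def recall(self) -> list[str]:
        # Recalling walks the chain backwards; every recollection is
        # mediated by every memory laid down since.
        trace, node = [], self.latest
        while node is not None:
            trace.append(node.content)
            node = node.previous
        return trace


if __name__ == "__main__":
    mind = RecursiveMemory()
    for line in ["I saw the sea.",
                 "I remembered seeing the sea.",
                 "I remembered remembering."]:
        mind.remember(line)
    print(mind.recall())  # most recent first, each entry nested in the last
```

Obviously a linked list is a crude stand-in for whatever memory actually is, but it captures the consequence I care about: nothing is recalled directly, only through the structure of everything remembered after it.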

In the end, though, this is just play, NOT a valid philosophical argument. It just gives me a reason to move through this tech and experiment.
