This person was generated by a computer using StyleGAN2. His image looked so unremarkable I felt it necessary to use Neural Filters and Facewarp in Photoshop to push him back into the Uncanny Valley.
Neither of these people is real. They are both images created by StyleGAN2. I took those images and brought them to life using First Order Motion Model. I then used Wav2Lip to get them to speak a bunch of words that had been written by a custom-trained GPT-2 and passed through a text-to-speech program. The lady has a far more realistic voice, thanks to Tacotron2 and WaveGlow; the man was voiced manually using the built-in Mac TTS function.
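If you want a sense of how the chain fits together, it can be sketched as a handful of command-line calls strung end to end. This is a hedged sketch, not my exact setup: the repo layouts, checkpoint filenames, and paths below are assumptions, though the CLIs themselves (First Order Motion Model's `demo.py`, Wav2Lip's `inference.py`, and the macOS `say` command) come from the public repos and the OS.

```python
# Hedged sketch of the talking-head chain: animate a StyleGAN2 still,
# synthesize speech, then lip-sync. Paths and checkpoint names here are
# assumptions; the CLI flags follow the public repos.

def build_pipeline(face_png, driving_mp4, script_txt, out_dir="out"):
    """Return the shell command for each stage, in order."""
    animated = f"{out_dir}/animated.mp4"
    speech = f"{out_dir}/speech.aiff"
    return [
        # 1. Drive the still face with a video of yourself (the "meat-puppet").
        ["python", "first-order-model/demo.py",
         "--config", "config/vox-256.yaml",
         "--checkpoint", "vox-cpk.pth.tar",
         "--source_image", face_png,
         "--driving_video", driving_mp4,
         "--result_video", animated],
        # 2. Read the GPT-2 script aloud with the built-in macOS TTS.
        ["say", "-f", script_txt, "-o", speech],
        # 3. Re-sync the mouth to the new audio with Wav2Lip.
        ["python", "Wav2Lip/inference.py",
         "--checkpoint_path", "Wav2Lip/checkpoints/wav2lip_gan.pth",
         "--face", animated,
         "--audio", speech],
    ]

# To actually run it:
#   for cmd in build_pipeline("face.png", "me.mp4", "script.txt"):
#       subprocess.run(cmd, check=True)
```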
This lady looks strange and is a bit glitchy. I think it might be because I used my face to inhabit her image as my “meat-puppet”. And before you call Clarice Starling on me, don’t forget: this is not a person.
Here’s what she looked like before. Hang on. She’s not even a she, is she? I mean, there’s probably a classifier somewhere that says she has the visual characteristics of a female.
These are just hacks, done by me, a hack, with no skill, no idea and no technology. I’m so excited about where this stuff could go from the point of view of storytelling, gaming and all kinds of things. And yes, it’s a bit terrifying too.
Freaky Fish ‘n’ Chips
I trained a StyleGAN on a not-very-large and not-very-well-cleaned dataset of images of fish and chips (and mushy peas for a bit of colour). After a few hours of training it was spitting out odd but fascinating greasy images.
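For the curious, a run like this can be kicked off with NVIDIA's public stylegan2-ada-pytorch repo. The sketch below is a guess at the shape of it, not my exact commands: the folder names and training budget are assumptions, though `dataset_tool.py` and `train.py` are the repo's own tools, and fine-tuning from a pretrained network (`--resume`) is what makes a few hours of training on a small dataset plausible at all.

```python
# Rough sketch of fine-tuning StyleGAN2 on a small image folder using
# NVIDIA's stylegan2-ada-pytorch CLI. Folder names and the kimg budget
# are assumptions; transfer learning from a pretrained model (--resume)
# beats training from scratch on a small, messy dataset.

def build_training_commands(image_dir, workdir="runs"):
    """Return the dataset-prep and training commands, in order."""
    dataset = f"{workdir}/fish_and_chips.zip"
    return [
        # 1. Pack the image folder into the repo's zip dataset format.
        ["python", "stylegan2-ada-pytorch/dataset_tool.py",
         "--source", image_dir, "--dest", dataset],
        # 2. Fine-tune from a pretrained network rather than from scratch.
        ["python", "stylegan2-ada-pytorch/train.py",
         "--outdir", workdir, "--data", dataset,
         "--gpus", "1", "--kimg", "500", "--resume", "ffhq256"],
    ]
```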
Ministers of Ceremonies
When I first discovered Wav2Lip I wondered: how could I overthrow the ruling Tory party? I ran out of ideas, then thought it might be their downfall if I used them as meat-puppets to deliver UK Garage classics and other bits of pop culture.