… for Visual Effects
Tip #1309: Deep Fakes – A Creative Perspective
Deep fake face swapping is both boon and bane – depending upon how it is used.
This article, Deep Fakes – A Creative Perspective, from VFXVoice, looks at the impact of emerging face-swapping technology on visual effects. This is a summary.
New ‘deep fake’ face-swapping videos seem to go viral on the internet with increasing regularity. Perhaps that’s because artists, and the machine learning algorithms they use for deep fakes, keep getting better at making them. Many have pondered, therefore, whether deep fakes will soon make their mark – or are already doing so – in traditional filmed entertainment.
The deep fakes you tend to see in online videos, where a speaking person’s face is typically replaced with that of another (often famous) person, rely on deep learning algorithms and training data. This data is usually video footage or photographs of the other person used to craft a convincing model for face swapping.
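The article doesn’t describe a specific implementation, but the face-swapping approach it refers to is commonly built around a shared encoder with one decoder per identity: both people’s footage trains a common face representation, and swapping means decoding person A’s pose and expression with person B’s decoder. Below is a minimal, purely illustrative sketch of that data flow, with plain NumPy linear maps standing in for trained deep networks; all dimensions, names, and weights are assumptions for illustration, not a working deep fake.

```python
import numpy as np

# Illustrative sketch of the classic deep fake layout: one shared
# encoder, one decoder per identity. Real systems use deep
# convolutional networks trained on many photos/video frames of each
# person; random linear maps stand in for those trained weights here.

rng = np.random.default_rng(0)

FACE_DIM = 64 * 64    # flattened grayscale face crop (assumed size)
LATENT_DIM = 128      # shared, identity-agnostic latent space (assumed)

# Stand-ins for trained network weights.
shared_encoder = rng.standard_normal((LATENT_DIM, FACE_DIM)) * 0.01
decoder_a = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.01
decoder_b = rng.standard_normal((FACE_DIM, LATENT_DIM)) * 0.01

def encode(face):
    """Map a face crop into the shared latent space (pose/expression)."""
    return shared_encoder @ face

def decode(latent, decoder):
    """Render the latent code in one identity's likeness."""
    return decoder @ latent

# A frame of person A (random pixels standing in for a real crop).
frame_a = rng.standard_normal(FACE_DIM)

latent = encode(frame_a)             # capture A's pose/expression
swapped = decode(latent, decoder_b)  # ...rendered as person B
```

The design point the sketch captures is why training data of *both* people matters: the encoder must learn features common to every face it sees, while each decoder specializes in reproducing one person’s likeness.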
Some of the most popular deep fakes made by Ctrl Shift Face feature actors doing impressions of other actors, with their faces swapped. Those, in particular, have an extra level of appeal since the voice is already part-way there. “I make videos that I want to see and make me laugh,” Ctrl Shift Face says of choosing deep fake video projects. “Some ideas fail because the technology is not there yet, but most often they fail because of YouTube policies and copyrights.”
“As far as using deep fakes in VFX goes, they’re still very much in their infancy, but they do offer creatives an exciting new range of storytelling possibilities,” outlines Framestore Executive Creative Director William Bartlett. “We used it for example on a very small part of our work on Pokémon: Detective Pikachu where the film required Bill Nighy’s character to appear younger in an ‘old’ news reel. It required some fixes, but deep fakes suited this need because of the nature of what would ultimately be presented on screen.”
The full article presents interviews, videos and more details regarding this process.
From my perspective, this article brings to mind the adage: “Just because we can, does not mean we should.” The problem is that the deep fake genie is out of the bottle – now what do we do with it? Deep fakes may solve a creative problem, but there still seems to be a major societal price to pay.
What happens when we can’t trust anything we see?