Opinion

'Face-swap' apps are being perverted to create homemade porn

How would you feel if someone superimposed a photo of your head onto a porn star's body? Deepfakes are a worrying phenomenon, writes Paula Andropoulos

01 February 2018 - 00:00 By Paula Andropoulos
There are a number of apps which allow you to superimpose one face over another in a photograph.
Image: 123RF/georgerudy

People in the U.S. have found a new way to titillate their vanity using the Google Arts & Culture app, which has – rather ingeniously – lured a new demographic of narcissists with a selfie feature.

The application uses facial recognition technology to find a painting that resembles your digital self-portrait, and the museum correlates are either comically unflattering or somehow touching, establishing an uncanny relationship between human lives past and present.

But this egocentric pastime represents a comparatively innocuous application of facial recognition technology, which is also being used for less savoury pursuits by some inventive voyeurs.

Deepfakes are a relatively new phenomenon. The Reddit-derived term refers to homemade – so to speak – pornographic videos, which use the same kind of neural network technology as the Google app to superimpose different faces onto porn stars' bodies.

Dreaming of a night with Taylor Swift? You need only download one of the apps that have evolved out of this trend, and apply yourself to fabricating her fall from grace.

Needless to say, this sordid sensation exposes some of the ethical pitfalls of making facial recognition technology widely available on a largely unregulated basis.

Mash-ups of bodies and faces are problematic when you consider the established notions of consent, autonomy and identity: if you superimpose the face of a 12-year-old girl onto the writhing body of a mature adult performer, does the final product constitute child porn?

It is worrying – and philosophically disorienting – to consider that the creator of one of these vulgar pastiches might be able to argue that, since the woman in the movie is simply an amalgamation of parts, the unwitting donors thereof can't contest their simulated participation.

Deepfakes lend themselves to a medley of malevolent impulses, including – but not limited to – a new brand of revenge porn, defamatory news, and the relentless reduction of marginalised bodies into the sum of their parts.

And, while the celebrities involved in these deepfakes presumably have the resources to mitigate this abuse, the same cannot be said for the porn performers, whose powers of consent – in an already tenuous industry – are effectively nullified.

