
Deepfake technology shows the girl of your dreams without clothes

In the latest example of artificial intelligence at work, algorithms can now ‘undress’ a woman, or put false utterances into a politician’s mouth. What’s at stake is nothing less than our perception of reality.

Young readers of comic books circa the 1970s will certainly remember the back-page ad section, where everything from Charles Atlas’ muscle-building program to live sea monkeys was shamelessly hawked to unsuspecting adolescents. The most popular item by far (I’m guessing) was the so-called ‘X-ray specs’ – a pair of horn-rimmed glasses designed on “scientific optical principles” that allowed the credulous wearer to see straight through clothes. How many kids blew their weekly allowance on that ploy is anyone’s guess, but today those same now-matured consumers have a chance to be fooled once again.

Recently, new software dubbed ‘DeepNude’, perhaps the distant country cousin of ‘Deep Throat’, enjoyed 15 minutes of fame for its machine-learning algorithm that magically ‘removes’ clothing. Light years ahead of its clunky comic-book predecessor, and sounding no less chauvinistic, the application uses “neural networks” to undress images of women with a mouse click, “making them look realistically nude,” as Motherboard duly reported.

The fun and games, however, came to an abrupt end last week when the creators of DeepNude announced in an apologetic tweet that “the world is not yet ready” for such advanced technology, which opens the door to an assortment of abuses, like ‘revenge porn’ attacks, not to mention the troubling objectification of women’s bodies. That said, I’m guessing the real reason DeepNude yanked its product had less to do with moral and ethical considerations than with the possibility of being hit with a massive lawsuit over privacy claims. But I digress.

Although it was refreshing to see DeepNude withdraw its disrobing services, one thing can be said with absolute certainty: we have not seen the end of it. Already it is being reported that altered versions of the app are being sold online, which should shock nobody. As history has proven on numerous occasions, once Pandora’s box is cracked open it is nearly impossible to return the escaped contents. So what we can now look forward to is a slew of images appearing online of women in various stages of undress, which should qualify as a form of cyber-bullying, at the very least. Many women will be forced to endure untold indignities as a result of this technology, especially in the early stages while the fad is still fresh, and it is not difficult to imagine some girls being driven to suicide by it. Yet that is just the tip of the iceberg as far as ‘deepfake’ technology goes.

As if undressing a woman with an application were not creepy enough, there is yet more technology that allows people to superimpose the head of one person over that of another. This brilliant app came to light earlier this year with deepfake productions of Hollywood stars ‘appearing’ in porn films. The end product was, according to Variety, “convincing enough to look like hardcore porn featuring Hollywood’s biggest stars.” Andreas Hronopoulos, the CEO of Naughty America, an adult entertainment company looking to cash in on deepfake productions, proudly told the magazine, “I can put people in your bedroom.” And yes, that’s supposed to be a good thing.

Predictably, however, this horny little app promises to have a shelf life about as long as that of the Internet’s bygone ‘ice bucket challenge’. People will eventually tire of the novelty of watching Brad Pitt, for example, fornicating with so-and-so’s mother down the road, and the world will turn to some other ephemeral trend for its cheap thrills.

So then what is the big deal? If deepfake videos are just a passing trend that will quickly lose its shock value, then where is the harm? I’m no computer expert, but as a journalist I can foresee this technology eventually having serious implications for the news industry.
