Deepfake pornographic videos banned by Twitter

Published 7 February 2018

Twitter is the latest platform to ban a new type of pornographic video that replaces the original actor’s face with that of another person.

The clips, known as deepfakes, typically use the features of female TV personalities, actors and pop singers.

Unlike some social networks, Twitter allows hardcore pornography on its platform.

But it said deepfakes broke its rules against intimate images that feature a subject without their consent.

The San Francisco-based company acted six hours after a Twitter account dedicated to publishing deepfake clips was publicised on a Reddit forum.

News site Motherboard was first to report the development.

“We will suspend any account we identify as the original poster of intimate media that has been produced or distributed without the subject’s consent,” Twitter told Motherboard.

It added that “deepfakes fall solidly” within the type of clips banned by its intimate media policy.

The development followed an announcement by Pornhub that it too would remove deepfake clips brought to its attention.

Until now, the adult site had been a popular source for the material, with some deepfake videos attracting tens of thousands of views.

Video-hosting service Gfycat and chat service Discord had already taken similar action.

Simple software

Deepfakes are made with artificial intelligence software that generates a version of one subject’s face closely matching the expressions of the person in the original video.

[Image: FakeApp]

To do this, the algorithm involved requires a selection of photos of the subject’s face taken from different angles.

In cases where the two people involved have similar body types, the results can be quite convincing.

The practice began last year, but became more common in January following the release of FakeApp – a tool that automates the process.

It requires only a single button click once the source material has been provided.

One Reddit group dedicated to sharing clips and comments about the process now has more than 91,000 subscribers.

Child abuse

Not all of the clips generated have been pornographic in nature – many feature spoofs of US President Donald Trump, and one user has specialised in placing his wife’s face in Hollywood film scenes.
