Deepfakes Fakeworlds


On deepfakes and alternative data: “The Future of Trust in Audiovisual Data”

Deepfake videos are produced by software that fabricates footage (statements, acts) of individual people; the tools are easily accessible and run on a regular laptop. Today, most of the examples on the web feature famous politicians and actors, because the machine-learning component of the software still needs a large dataset of previously recorded video to create fictional footage. But with a bit more technological progress and the ever-growing amount of everyday image recording, it will soon be possible to create such videos of any individual person.
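To make the training-data point concrete, here is a minimal, hypothetical sketch of the architecture classic face-swap tools are built around: an autoencoder with one shared encoder and one decoder per identity, trained on many face crops of each person. All layer sizes, names, and hyperparameters below are illustrative assumptions, not the code of any actual deepfake tool.

```python
# Sketch (PyTorch) of the shared-encoder / dual-decoder idea behind classic
# face-swap tools: one encoder learns a common face representation from both
# people's footage; swapping decoders at inference reconstructs person A's
# expression with person B's face. Shapes here are illustrative only.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 32, 4, stride=2, padding=1),   # 64x64 -> 32x32
            nn.ReLU(),
            nn.Conv2d(32, 64, 4, stride=2, padding=1),  # 32x32 -> 16x16
            nn.ReLU(),
            nn.Flatten(),
            nn.Linear(64 * 16 * 16, 256),               # shared latent face code
        )

    def forward(self, x):
        return self.net(x)

class Decoder(nn.Module):
    def __init__(self):
        super().__init__()
        self.fc = nn.Linear(256, 64 * 16 * 16)
        self.net = nn.Sequential(
            nn.ConvTranspose2d(64, 32, 4, stride=2, padding=1),  # 16 -> 32
            nn.ReLU(),
            nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1),   # 32 -> 64
            nn.Sigmoid(),
        )

    def forward(self, z):
        return self.net(self.fc(z).view(-1, 64, 16, 16))

encoder = Encoder()
decoder_a, decoder_b = Decoder(), Decoder()  # one decoder per identity

# Training (sketch): reconstruct each person's faces through the shared encoder.
# This is why a large dataset of each person is needed before a swap works.
opt = torch.optim.Adam(
    list(encoder.parameters())
    + list(decoder_a.parameters())
    + list(decoder_b.parameters()),
    lr=1e-4,
)
faces_a = torch.rand(8, 3, 64, 64)  # stand-ins for aligned face crops of person A
faces_b = torch.rand(8, 3, 64, 64)  # ...and of person B
loss = nn.functional.mse_loss(decoder_a(encoder(faces_a)), faces_a) \
     + nn.functional.mse_loss(decoder_b(encoder(faces_b)), faces_b)
opt.zero_grad(); loss.backward(); opt.step()

# The "swap": encode person A's frame, decode it with person B's decoder.
fake_b = decoder_b(encoder(faces_a))
```

The key design point is that the encoder is shared: it is forced to learn pose and expression common to both people, so either decoder can paint its own identity onto them.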


It all started only a year ago, in Q4 2017, in one of the most frequented corners of the digital world: porn websites. The faces of famous actresses were swapped onto other women’s naked bodies in porn scenes. This non-consensual faking of individuals and their actions has already sparked controversial debate around the world and led to the implementation of a new law in Australia. Deepfakes are a form of harassment unlike anything seen before and can violate individuals, their image, and their reputation. It’s basically the future of cyberbullying.


But individual bullying is not the only risk this technology poses: the biggest impact of developments like deepfakes lies in their easy accessibility to everyone. Deepfake videos are, in effect, a cheap weapon for trolls and other interested parties to target specific audiences on the web (or at their school), whether to alter their perception of a current political issue, opinion, or state of mind, or simply to harass them.
Technology development is, and has always been, an arms race between innovation and the mitigation of potential negative consequences. In response to deepfakes, we already see governments, academics, and organizations reacting with new verification strategies for video material. However, no workable solution exists yet. Moreover, deepfakes will add another layer to the prevalent #fakenews, ‘alternative data’, and global digital trust crisis. Our audiovisual data, whether produced by ourselves or consumed as third-party content, has become more vulnerable and easier to manipulate than ever before.
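One family of verification strategies being explored is cryptographic provenance: the producer of a recording signs it, and anyone can later check that the bytes are unchanged. The sketch below, using Python’s `cryptography` package, is only an illustration of that idea under assumed conditions (trusted keys, signing at capture time), not any deployed standard.

```python
# Minimal sketch of provenance-based verification: the producer signs the
# SHA-256 digest of a video with a private key; a viewer verifies the
# signature with the matching public key. Any tampering (e.g. a deepfake
# edit) changes the digest and makes verification fail. Key distribution
# and trusted capture hardware are out of scope here.
import hashlib
from cryptography.hazmat.primitives.asymmetric import ed25519
from cryptography.exceptions import InvalidSignature

def digest(video_bytes: bytes) -> bytes:
    return hashlib.sha256(video_bytes).digest()

# Producer side: sign the recording at capture/publish time.
private_key = ed25519.Ed25519PrivateKey.generate()
public_key = private_key.public_key()
original = b"...raw video bytes..."  # stand-in for a real file's contents
signature = private_key.sign(digest(original))

# Viewer side: verify the received video against the published signature.
def is_authentic(video_bytes: bytes, sig: bytes) -> bool:
    try:
        public_key.verify(sig, digest(video_bytes))
        return True
    except InvalidSignature:
        return False

print(is_authentic(original, signature))                     # True
print(is_authentic(b"...face-swapped bytes...", signature))  # False
```

Note the limitation this sketch shares with real proposals: it proves a video is unchanged since signing, not that the signed content was truthful in the first place.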

How many times have you shared a picture on a social media platform? You don’t need to get hacked: you already deliver the audiovisual data yourself.