I recently did a talk on deepfake machine learning which included a long intro about the dangers of deepfakes. If you don’t know what deepfakes are, just think of using Photoshop to swap people’s faces, except applied to movies instead of photos, and using AI instead of a mouse and keyboard. The presentation ended with a short video clip of Rutger Hauer’s “tears in rain” speech from Blade Runner, but with Hauer’s face replaced by Famke Janssen’s.
But back to the intro – besides being used to make frivolous videos that insert Nicolas Cage into movies he was never in (you can search for them on YouTube), the technology is also used to create fake celebrity pornography and, worst of all, what is known as “revenge porn”: malicious digital face swaps made to humiliate women.
Noelle Martin has, in her words, become the face of the movement against image-based abuse of women. From the age of 17, her images were taken, digitally altered, and distributed against her will on pornography websites. After years of this, she decided to reclaim her own narrative by speaking out publicly about the issue and raising awareness of it. She was immediately attacked on social media for bringing attention to the issue, and yet she persisted and eventually helped to criminalize image-based sexual abuse in New South Wales, Australia, with a provision specifically covering altered images.
Criminalization of these acts followed at the Commonwealth level in Australia. She is now working to raise global awareness of the issue – especially given that the web servers that publish non-consensual altered images can be anywhere in the world. She was also a finalist for the 2019 Young Australian of the Year award for her activism against revenge porn and for raising awareness of how modern image-alteration technology is being used to humiliate women.
I did a poor job of telling her story in my presentation this week. Beyond that, because of the nature of the wrong against her, there’s the open question of whether it is appropriate even to try to tell her story – after all, it is her story to tell and not mine.
Fortunately, Noelle has already established her own narrative loudly and forcefully. Please hear her story in her own words at TEDx Perth.
Once you’ve done that, please watch this Wall Street Journal story about deepfake technology in which she is featured.
When you’ve heard her story, please follow her Twitter account @NoelleMartin94 and help amplify her voice and raise awareness about the dark side of AI technology. As much as machine learning is in many ways wonderful and has the power to make our lives easier, it also has the ability to feed the worst impulses in us. Because ML shortens the distance between thought and act, as it is intended to do, it also easily erases the consciousness that is meant to mediate our actions: our very selves.
By speaking out, Ms. Martin took control of her own narrative. Please help her spread both the warning and the cure by amplifying her story to others.
Can Neural Network help spot DeepFake Photos?
https://technoidhub.com/machine-learning/can-neural-network-help-spot-deepfake-photos/17159/
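The linked article asks whether neural networks can spot deepfakes. As a loose illustration of the idea (not the article’s method, and far simpler than any real detector), a deepfake detector is at heart a binary image classifier: convolutional filters extract artifacts from a face crop, and a final logistic unit scores the image as real or fake. A minimal NumPy sketch of that pipeline, with random stand-in weights rather than learned ones, might look like:

```python
import numpy as np

def conv2d(image, kernel):
    """Valid-mode 2D cross-correlation: one feature map from one filter."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.empty((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def fake_score(image, kernel, weight, bias):
    """Toy detector: convolve, ReLU, global-average-pool, logistic output.

    Returns a probability-like score in (0, 1). In a real system the
    kernel, weight, and bias would be learned from labeled real/fake
    face images; here they are arbitrary stand-ins.
    """
    feat = np.maximum(conv2d(image, kernel), 0.0)          # ReLU activation
    pooled = feat.mean()                                   # global average pooling
    return 1.0 / (1.0 + np.exp(-(weight * pooled + bias)))  # sigmoid score

rng = np.random.default_rng(0)
img = rng.random((32, 32))            # stand-in for a grayscale face crop
kernel = rng.standard_normal((3, 3))  # stand-in for a learned filter
score = fake_score(img, kernel, weight=1.0, bias=0.0)
```

In practice, detectors of this kind stack many such convolutional layers and train them end to end on large datasets of real and manipulated faces; the sketch only shows the structural skeleton shared by those models.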