
Deepfakes: Tackling a disturbing internet trend

The disturbing internet trend targeting everyone.

When eighteen-year-old Noelle Martin landed in Sydney from Western Australia, she was ready to start a new chapter of her life: the first year of her dream law degree. She had no idea a quick Google search would change everything.

Martin was getting ready for her university adventure one night when a split-second decision to run a reverse image search on a photograph of herself led to a horrific discovery.

She watched in astonishment as a collection of graphic images, appearing to show her engaged in sexual acts, engulfed her screen.

The problem was, she’d never done porn in her life.

Martin, who told her story in a TED Talk in 2018, was the victim of a phenomenon called deepfakes, in which algorithms digitally manipulate images and videos.

Although digital alteration has long been used in Hollywood films, there has been a recent spike in the number of videos that swap the faces of celebrities onto the bodies of porn actresses.

Since a young programmer’s digitally manipulated pornographic videos of actress Gal Gadot went viral in 2017, fake porn videos of other actresses and actors have emerged online.

But experts warn that this is only the beginning.

Dr Asher Flynn, an associate professor in Criminology at Monash University, says that until recently, only people with the right technology and knowledge could create manipulated media to a high standard.

“Historically, it was more people who had the high graphic processing units and the technological skills to be able to create these types of images,” she told upstart.

“But what we’re problematically finding now is that anyone can get access to these tools.”

One such tool is an app developed this year, designed to replace the clothes worn by women or men in a photograph with female genitalia.

The tool is available to the public for just US$50.

Flynn believes that the commercialisation of abuse through apps like these makes anyone a target.

“I think that’s really problematic, because nowadays with social media there’s thousands of images of many of us online, like in the cloud and on our devices. So, it means that almost anyone is fair game to be faked in a sense,” she says.

According to a recent study by cybersecurity company Deeptrace, 96 percent of the deepfakes found online were pornographic videos created without consent.

The Australian legal system has made efforts to keep up with the issue through laws that vary across each state and territory. Since Martin’s story, amendments have been made to the Broadcasting Services Act 1992, the Criminal Code Act 1995 and the Enhancing Online Safety Act 2015, bringing deepfakes within the definition of intimate images under those acts.

The expanded powers given to the Office of the eSafety Commissioner through the amendments mean victims now have a reliable source of support if they find deepfaked media of themselves online.

Erica Chan, a technology and digital lawyer at Gilbert + Tobin, believes that despite these laws, Australia will find it challenging to deal with the new technology.

“I think all countries, including Australia, will find it difficult to tackle the rising issue of deepfakes. This is because the rising issue of deepfakes is a deeply social problem that requires attack from several angles,” she told upstart.

Another issue, Chan claims, is that the procedures of Australia’s legal system cannot match the speed at which information spreads online, making deepfakes hard to confront.

“The biggest challenge in focusing on the law is that it’s an after-the-fact solution. Our legal system moves slowly and requires going to court to enforce and is therefore at complete odds with how fast things move on social media,” she says.

But while Flynn acknowledges that deepfake abuse is an issue for women, she warns that it goes beyond gender.

“While we do need to be looking at this problem as a gendered paradigm, we also need to be looking at it through an intersectional lens,” she told upstart.

According to Flynn, this type of image-based abuse is also occurring among other vulnerable populations, including members of the LGBTQ community, culturally and linguistically diverse communities, Indigenous Australians and people with disabilities.

For some victims, deepfake tools have been used as an act of revenge, as investigative journalist and writer Rana Ayyub discovered.

Ayyub became a target of deepfake videos after being slammed by the public for her views on India’s legal system.

“My brother flew in from Mumbai to see me in Delhi, but I just couldn’t face anyone from my family. I felt so embarrassed. The entire country was watching a porn video that claimed to be me and I just couldn’t bring myself to do anything,” she wrote for The Huffington Post.

While Flynn agrees that deepfakes are likely to be used by perpetrators to cause further harm, such as in dating scams, she suggests a multimodal approach rather than one that focuses only on the law.

“We need to be looking to tech companies to have some type of corporate responsibility,” she says.

“We need more education, prevention and awareness of [deepfakes] and we need to make sure we’re focusing on the abusive actions of the perpetrator, not the victims.”

 


Writer: Ashleigh Matosevic is a first-year Bachelor of Media and Communications (Sport Journalism) student at La Trobe University. You can follow her on Twitter @ashleighmats_

Photo: Digital Art January 20, 2014 by Rishbh Sharma, available HERE and used under a Creative Commons Attribution licence. The image has not been modified.
