AI Race Could Lead To A Surge In Deepfake Pornography

With artificial intelligence image generation, artists can create artwork, virtual fitting rooms, and advertising campaigns. However, experts worry that the same technology enables nonconsensual deepfake pornography, which predominantly victimizes women.

Deepfakes are videos and images that have been digitally created or altered using artificial intelligence or machine learning. The technology began spreading across the internet a few years ago, when a Reddit user shared porn clips that superimposed the faces of female celebrities onto porn actors’ bodies.


Intensifying Deepfake Issues

Since then, deepfake creators have targeted journalists, online influencers, and other public figures with similar videos and images. Thousands of such videos exist across various websites, and some sites even let users generate the images themselves. Unfortunately, the technology can be used to act out nonconsensual sexual fantasies or to harm former partners.

Experts believe the problem has intensified as high-quality, visually convincing deepfakes have become simpler to produce. They also warn it could worsen with generative AI tools, which are trained on billions of images from the internet and generate new content from that data.

Adam Dodge, the founder of EndTAB, a group that educates on technology-enabled abuse, said the technology will continue to spread and develop until creating a deepfake is as easy as pressing a button. As long as it remains accessible, people will misuse it to harm others, primarily through deepfake pornography, fake nude images, and other forms of online sexual violence.


A Victim of Deepfake Tech

Noelle Martin, a 28-year-old from Perth, Australia, discovered deepfake pornography of herself ten years ago. Out of curiosity, she ran a Google search on an image of herself, only to find disturbing photos. Martin still has no idea who created the fake pictures, or the videos she later discovered depicting her in sexual activity. She believes someone took a photo from her social media page or elsewhere and edited it into pornography.

Horrified, Martin spent several years contacting different websites to have the images removed. Some sites did not respond, while others removed them only for the images to reappear later. According to Martin, “You cannot win. This will always be out there, forever ruining you.” Even as she spoke out, the problem only worsened, and some people blamed her for the situation because of how she dressed and the images she shared.

Featured image from tribuneindia.com
