A mother-of-one has revealed how she has become a victim of AI-deepfake porn – technology that’s advancing faster than laws designed to protect victims.
Alyssa Rosa, 29, said she learned of the pornographic images bearing her likeness after a woman contacted her on social media, saying she had found the artificially generated media on her boyfriend’s phone.
Recognizing the content was fake but bore the unmistakable likeness of a real person, the woman tracked down Rosa to tell her about the violation.
‘I was mad,’ Rosa, who lives in a small town in the southern part of the state with her son, told ABC-6 Action News.
‘That kind of content never existed of me before. And now it does. And it’s completely without my consent.’
Rosa went on to detail how she turned that anger into motivation to find the person responsible.
She soon learned the images were likely being created …