A little over a year ago, MG was leading the relatively normal life of a twentysomething in Scottsdale, Arizona. She worked as a personal assistant and supplemented her income by waiting tables on the weekends. Like most women her age, she had an Instagram account, where she’d occasionally post Stories and photos of herself getting matcha and hanging out by the pool with her friends, or going to Pilates.
“I never really cared to pop off and become popular on social media,” says MG (who is identified only by her initials in the lawsuit to protect her identity). “I just used it the way most people did when it first came out, to share their lives with the people closest to them.” She has a little more than 9,000 followers, a robust following but nowhere close to a massive platform.
Last summer, she received a DM from one of her followers. Did she know, the person asked her, that photos and videos of a woman who looked exactly like MG were circulating on Instagram? MG clicked the link and saw multiple Reels of what appeared to be her face superimposed onto a body that looked exactly like her own. The woman in the photo was scantily clad, with tattoos in the same places as MG.
MG was horrified. “If you didn’t know me well, you could very well think they were images of me,” she says. “It was kind of like this reality check that I don’t have any control over my own image.”
She was even more appalled when she discovered that the doctored nude and scantily clad photos of her were not only circulating on the Internet, as she outlined in a recently filed complaint; they were also being used to advertise AI ModelForge, a platform that teaches men how to generate their own AI influencers. In a series of online classes and tutorials, the platform’s operators allegedly taught subscribers to use a software called CreatorCore to train AI models on photos of unsuspecting young women, then post the resulting content on Instagram and TikTok.

