Older men are being scammed and fooled left and right by a deluge of AI-generated female influencers, according to the NY Post.

What appears to be a growing wave of glamorous influencers online isn't always what it seems. In some cases, these personalities are entirely artificial: carefully engineered digital figures designed to look, act, and interact like real people. One widely followed pro-MAGA persona, for example, was ultimately exposed as "nothing more than an algorithm run by a guy in India," revealing just how convincingly these accounts can mimic authenticity.

Despite that, audiences continue to engage—often deeply. Many followers, particularly older men, are “falling for them left, right and center.” Experts suggest this isn’t just about deception, but about a deeper emotional gap. Some describe the phenomenon as a “pandemic of loneliness,” even pointing to a broader “societal loss of humanity” as people increasingly form attachments to digital illusions instead of real relationships.

What’s striking is that these accounts don’t always hide the truth. Some openly identify as AI and still attract admiration. Take Ana Zelu, a fictional influencer who clearly labels herself an “ai-influencer,” yet maintains a highly curated feed filled with aspirational imagery—luxury travel, fashionable outfits, and picturesque city scenes. Her posts draw enthusiastic responses, with followers commenting things like “Number one is my favourite…May God bless you,” and “You are genuinely in a class of your own.” The awareness that she isn’t real doesn’t seem to diminish the appeal.

The Post writes that a similar pattern appears with Milla Sofia, another digital creation presented as a pop singer. Her content includes stylized videos and performances, and although her profile identifies her as virtual, fans respond as if she were a real celebrity. Comments such as "my sweet love," "Listening to the music of this woman I love," and "I love you" reflect genuine emotional investment.

Psychotherapist Jonathan Alpert explains why this happens: “people don’t actually need something to be real…they just need it to feel responsive.” When an account appears engaging, consistent, and attentive, “the brain starts to treat that interaction as meaningful.” In other words, emotional connection can form even without a real person on the other side.

Forensic psychologist Carole Lieberman ties this behavior to social isolation. Even when users suspect something isn't real, "it seems better than nothing," and many "convince ourselves that it is — or could be — a real person." The illusion becomes a kind of emotional substitute, one that feels easier, safer, and more accessible than real-world interaction.

She said it is a “very sad state of affairs” and “a societal loss of humanity.”

At the same time, the technology behind these personas is improving rapidly. AI-generated faces, voices, and videos have moved beyond the so-called "uncanny valley," making them increasingly indistinguishable from reality. As AI expert Hany Farid notes, while some accounts disclose their artificial nature, "the vast majority of content is not." This creates an environment where users are highly "vulnerable to being deceived," often without realizing it.

The result is a digital landscape where the boundary between real and fake is fading. These AI influencers may not exist in the physical world, but the emotions they evoke are real—and for many people, that emotional connection is enough.

Source: ZeroHedge News