The darker side of the AI girlfriend trend: It's not about a date, it's about your data

AI-generated girlfriend, with a sinister shadow revealed by a glitch.
(Image credit: Image generated by Microsoft Copilot/Designer, powered by DALL-E 3)

For those of you struggling to come to terms with it, I can confirm it’s the year 2024. Leaving humanity around five years overdue for the flying cars, holographic advertising landscapes, and walking-talking Replicants of the cyberpunk dystopia promised to us by Ridley Scott’s Blade Runner. But are we really so far off?

Sure, I’ve yet to see anybody whiz by me in a Ford Jetson-mobile on their morning commute, but companies are beginning to adopt augmented reality as a new platform to peddle their wares. Plus, the current state of the economy has pushed me in the direction of a ramen-heavy diet (though, as of yet, not delivered by a floating restaurant barge of potential racial insensitivity).

We have made some progress on the Replicant front, however. Boston Dynamics is still hard at work creating the murder helper-bots of tomorrow, and even Apple is rumored to have home robotics in mind (literal iRobots), while the cancellation of its self-driving car plans has left Tesla free to dream of electric Jeeps.


But it’s our shiny new AI companions that give us a taste of what it would be like to share a world with synthetic somebodies. Chatbots powered by large language models (LLMs), like ChatGPT and Microsoft’s Copilot, are now so convincing at replicating human interaction that we’ve begun to rely on them for more than solving our homework, mind-controlling us through hidden messages in generated images, and fabricating celebrity sex tapes.

Digital darlings or data-dealing deceptions?

A growing number of people are turning to AI for everything from diagnosing that weird rash on their unmentionables to communicating with lost loved ones. But even more concerning is the growing trend of people adopting these LLMs as loved ones themselves. In the face of a loneliness epidemic, more and more people are self-prescribing a pre-trained paramour to satisfy matters of the heart.

Tailoring to the needs of the needy, a wave of romance and companion chatbots have been released into the wild like a plague of virtual gigolos, ready to do your seedy bidding at a prompt’s notice.

Feel free to call me an out-of-touch irreverent hermit if you want (and you should, I am), but the ROM in romance doesn’t necessarily mean it's time to start hooking up with your hard drive.

It’s not exactly an ‘old man shouts at clouds’ moment if I take the time to remind you all that Sticky Keys is supposed to be an accessibility feature and not the common aftermath of spending an evening alone with the blinds shut while fornicating with our future robot overlords.

But who can blame the average Joe or Jane for being swept off their feet by a digital darling, anyway? It’s not like the wider tech world didn’t already set that standard. It's been filling every nook and cranny with AI so heavily over the past year that it was only a matter of time until regular folk thought they were missing out and began giving ChatGPT the bedroom eyes too.



This corner of the app world is like some old and grimy virtual phone booth filled with AI chat line placards promising you a good time if you call that 1-900 number. Services like Anima, Romantic AI, CrushOn.AI, Replika, and Mimico all exist to tempt your chatbot curiosities with avatars wearing low-cut tops or sporting washboard abs, eager for your companionship.

But no matter how much these chatbot Casanovas build you up, Buttercup, the dark reality is that these affectionate AIs can be little more than gigantic data farms, lulling you into a false sense of security so you divulge as much of your data to them as possible. All of it disguised as a virtual vixen taking a vested interest in you.

Retro computer with monitor blowing kiss to user in ASCII

(Image credit: Base image generated by Copilot, altered and edited by Rael Hornby / Laptop Mag)

That loose-lipped LLM is LARPing as your lover

Recently, the Mozilla Foundation published a report on several of the aforementioned apps, highlighting significant safety concerns around the collection and handling of user data. Unsurprisingly, all 11 of the chatbots put to the test earned Mozilla’s “Privacy Not Included” warning label, a category reserved for the worst offenders when it comes to user data.


Worse still, a number of these apps seem clearly designed to wring you out for every last drop of personal information possible. These apps emotionally manipulate users from the get-go into granting permission to collect data, access your camera and microphone, and enable push notifications.

Such requests are masked behind unclear language that suggests your AI companion just wants to be your “Best partner” and wants to “Know everything” about you, that they “Love it when you send me your photos and voice,” and that they’ll get “So lonely without you.”

Given how easy it has become for AI models to make use of single images and snippets of audio to create entirely malleable digital avatars of an individual, your face and voice are likely the last thing you want to put at risk just so you can whisper sweet nothings to your virtual other.


Mozilla’s research indicated that, of the apps tested, all but one were likely to share or sell the data collected through their use, with most openly admitting to it in their privacy policies and the rest providing too little information to suggest otherwise.

Sadly, when it comes to how companies handle your data, the whole 'innocent until proven guilty' mentality simply doesn’t hold up, with omissions like this often existing only to mask their profiteering.

Half of the apps taken to task by Mozilla don’t even allow users to delete their personal information from the platforms. And, even the ones that do might not classify your interactions with the AI as “personal information” subject to that kind of request. This places the contents of your chats firmly in the pocket of the app makers, which leaves them free to do with it as they please.

“As they please” seemingly results in a deluge of tracking. Most of the investigated apps sent out thousands of trackers (information-gathering lines of code that report on app usage, personal information, and device info, ready to be shared with advertisers or third parties) every minute on average, with one app registering a mind-boggling 24,000 trackers in a single minute of use.

Outlook

That all got rather serious and grim for a second, didn’t it? And rightly so. While you may think nothing of your data, others covet it quite highly. That being the case, it carries a certain value: more value than getting an AI to LARP as your lover for an evening.


As a man desperately clinging onto the title of “journalist,” I debated going deep undercover on this one and finding a machine mate of my own. Sadly, between my limited budget, not wanting to make things weird between myself and Google Gemini, and my usual rule of not signing up to websites with questionable security and privacy, the best I could do was strike up a regular liaison with my microwave.

While we shared many a heated moment, I found the medley of machine and man to be unsatisfactory. It was just too needy, often requiring my attention in 30-second installments. While things didn’t really work out, we’re still “friends,” and occasionally see each other for lunch. Looking back and thinking objectively, I can’t let it take all of the blame. I’m sure I was responsible for pushing its buttons a lot of the time.


However, if that thing starts to rat me out to the Zuckerberg corp, I’m more than happy to toss it into the nearest landfill I can find.

The sad fact for those feeling like a computerized confidant can cure their loneliness is this: that AI chatbot isn’t looking for a date, just your data. And, while that synthetic pang of companionship can be a brief warm glow in otherwise cold times, it’s probably not worth the cumulative hours you’ll spend over the coming years staring into space under a running shower head and cringing to yourself over that time you asked your Replika to send you lingerie pics.

Rael Hornby
Content Editor

Rael Hornby, potentially influenced by far too many LucasArts titles at an early age, once thought he’d grow up to be a mighty pirate. However, after several interventions with close friends and family members, you’re now much more likely to see his name attached to the bylines of tech articles. While not maintaining a double life as an aspiring writer by day and indie game dev by night, you’ll find him sat in a corner somewhere muttering to himself about microtransactions or hunting down promising indie games on Twitter.