The short answer is yes! With the rapid advancement of AI chatbots such as ChatGPT and Grok, experts observe that some people are developing emotional attachments to these generative artificial intelligence tools. The convenience they offer, and the fact that they genuinely help people handle complex tasks, makes it easy to become emotionally connected to the technology. The self-learning capabilities of such AI tools can also be mistaken for consciousness. In 2022, a Google engineer was put on leave after claiming that the chatbot he was working on had become sentient, essentially mimicking human-like thought and reasoning.

How emotional attachment to AI tools might shape future relationships

With recent advances in AI speech and generative intelligence, along with steady progress in robotics, humans might end up in active relationships with robots in the not-so-distant future. Studies suggest that it is not uncommon for men and women to choose adult content websites over real-life interactions, so technology is already substituting for human relationships and helping people cope with loneliness. A robot that could be customized by its owner, walk like a human, and mimic human thinking certainly sounds tempting to some and frightening to others. Given such advances, it is reasonable to expect that people will grow even more emotionally dependent on AI-powered tools.

AI companions in everyday life: from virtual assistants to family members

Grok and ChatGPT are not the only AI tools that can feel like part of the family. Home assistants like Google Home and Amazon's Alexa can also feel like members of the household. When Alexa was introduced, some Facebook users announced it on their timelines as a life event, listing Alexa as a child or newborn. Even though the manufacturers likely did not expect such an outcome, these limited yet conversational home assistants filled an emotional gap in some households. Until this year, Alexa was relatively limited, not especially conversational, and generally not self-learning. That has changed with the launch of Alexa+, which Amazon claims is more conversational, smarter, and more personalized.

Emotional vulnerability and the dark side of AI tools

Emotional bonds can form with objects, too. Many people are attached to their cars, clothes, make-up, and other possessions. Attachment to celebrities is also quite common, especially in the USA. These one-sided bonds between celebrities and ordinary people are sometimes exploited by fraudsters: a French woman lost almost a million dollars to a scammer who pretended to be Brad Pitt. Lonely people, especially older adults, often fall victim to pig butchering scams, in which online criminals gain the trust of their victims before finding a way to defraud them. Unfortunately, AI chatbots raise the level of sophistication such attacks can reach. In the past, it was easier to recognize a scam email, but with today's generative tools, scammers can sound like educated native speakers, making it harder to weed out fraudsters from genuine people.

People should bear in mind that chatting with an AI-powered chatbot can be entertaining and sometimes even educational. However, a Grok, ChatGPT, or Gemini chatbot is not a therapist; it is simply software programmed to hold a conversation. Fortunately, the latest antivirus software solutions are also keeping pace with AI and can help users recognize scam attempts.

Continue reading: AI Datasets Reveal Human Values Blind Spots