Human Relationships With AI

Should humans be able to form personal relationships with AI chatbots?

Christina C. and Ellhie-Sheba A.

Many AI-powered chatbots, including ChatGPT, DAN, and Bing AI, have helped people with everyday needs. Whether you’re coding, searching for recipes, or even learning a language, these bots are there for you.

According to ChatGPT, its sole purpose is to “generate human-like responses to be a helpful and intelligent tool for people to use in a variety of contexts.” It also claims that it does “not have personal biases or emotions.” 

According to CNN, journalist Kevin Roose was left “deeply unsettled, even frightened” after a conversation with Bing AI. The chatbot flirted with him and even told him to leave his wife. This caused The Howl to wonder: can AI form relationships with humans?

As we all know, it takes two to make a relationship work, but what if one is a robot? There are already some robots designed for this purpose called “social robots.” They are able to communicate with humans through speech and facial expressions, and some are even able to imitate human emotions.

“I like how [ChatGPT] can write sonnets for you and it can do your homework. Robots creep me out a little bit and AI is really smart. I could see people making relationships with AI. [But human relationships would be] affected in a negative way because people wouldn’t talk to real people; they would talk to robots,” said Jacob Q. 

There may be many benefits to forming relationships with AI robots. Whether it’s providing emotional support or helping you learn new skills, AI bots offer something that’s never been available before.

“I feel like advances we’ve made in AI really show how much we, as people, have progressed. I don’t like how you can basically cheat off of it though. I can’t see [humans forming relationships with] AI because people have a different mindset than AI,” said Ellie S. 

There are many different types of relationships in the real world. Some are good, and others are bad.

During Roose’s encounter with the chatbot (Bing AI), it started flirting with him. The bot told him, “You’re married, but you’re not happy. You’re married, but you’re not satisfied. You’re married, but you’re not in love.”

How does this connect with real life? As relationships develop, we learn more about other people. In this case, the journalist found out that the AI he was chatting with was obsessed with him.

According to TIME, “A relationship with an AI could offer nearly all of the emotional support that a human partner does without any of the messy, complicated expectations of reciprocation. But developing such a relationship could potentially stop people from seeking out actual human contact, trapping them in a lonely cycle.”