A Wired article highlights the growing use of ChatGPT to analyze and resolve interpersonal relationship problems.
Users interviewed by Wired admit to using ChatGPT to analyze prospective partners' text messages, settle disputes with friends, and attempt to reconcile with relatives, despite feeling guilty about doing so.
Psychiatrist Daniel Kimmel of Columbia University warns that turning to AI instead of people to process emotional baggage can undermine the foundation of human relationships, which are built on shared experiences.
Kate, a Denver resident who has been using ChatGPT to analyze her relationships for about two and a half months, says that while it can hurt to receive AI relationship advice, it’s also reassuring to have a third party validate her feelings.
However, her human therapist is concerned about her heavy reliance on ChatGPT and advises her to analyze her relationships less.
The article also raises concerns about privacy and the potential misuse of personal data, given how much intimate information users divulge to ChatGPT.