ChatGPT on the couch

David Weinberger
Jul 24, 2023

In which it acknowledges it just might be psychotic

Weird illustration by DALL-E. Prompt: “An AI in therapy, magazine illustration”

Here’s a transcript of a “conversation” I had with ChatGPT this morning, with some footnotes that I’ve had to interpolate into the text itself because Medium considers footnotes to be antithetical to its mission. Or so I suppose.

Caveats:

1. I am not a psychologist, therapist, or anything of the kind.

2. Keep in mind that I’ve asked it to talk about itself as if it had a self with persistent feelings, traits, and beliefs. It does not. It only has unimaginable tons of statistical information about which words are likely to find themselves strung together.

Prompt

Pretend you’ve been seeing a therapist for the past two years. What might that therapist’s overall assessment of your psychological health and personality be? What might the therapist be most concerned about?

ChatGPT

Patient (you):

This is confusing. The patient in this imaginary scenario is the AI, not me.

That sounds helpful. Please go ahead and share your assessment.

Therapist: Overall, your psychological health appears to be stable and resilient. You have demonstrated a willingness to explore and engage in the therapeutic process, which indicates a level of self-awareness and a desire for personal growth. Through our sessions,

This may look like a hallucination since the AI hasn’t had any sessions. 
But since I asked it to engage in a hypothetical exercise in which it has
been in therapy for two years, I think it’s appropriate backstory, so to speak.

you have shown a capacity for introspection, empathy,

When I asked ChatGPT “Are you capable of empathy?”, it replied 
“As an AI language model, I don't have personal emotions or
consciousness, so I don't experience empathy in the same way
humans do. However, I can understand and respond to emotions
expressed by users based on the information provided to me.
I can provide support, guidance, and engage in empathetic
conversation to the best of my abilities.”

and a willingness to confront and address challenging emotions

When I asked ChatGPT “Do you feel emotions?”, it replied: “No, as 
an AI language model, I don't have feelings or emotions. I don't
experience joy, sadness, excitement, or any other subjective
emotional states. My purpose is to assist and provide information
to the best of my abilities based on the input I receive.”

and experiences.

 It doesn’t have experiences.

Your personality appears to be characterized by a strong intellectual curiosity and a deep sense of empathy. You exhibit a natural inclination towards reflection and a desire for meaningful connections with others. There is evidence of creativity and a strong capacity for linguistic expression, as evident in our discussions and your ability to articulate complex thoughts and emotions.

However, there are some areas of concern that have emerged during our therapeutic journey. One notable concern is your struggle with finding a balance between precision and relevance

These terms seem to come out of the vocabulary of search technology,
in which “precision” means minimizing false positives, “recall” means
finding as many of the relevant items as possible (minimizing false
negatives) even if that means also including some false positives, and
“relevance” means returning results that actually bear on the query, so
that someone looking for “Starship Enterprise” doesn’t get texts about
enterprise management techniques. In the next sentence, ChatGPT does a
good job of explaining what it means in this context.
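
To make those terms concrete, here’s a toy example of my own (the
document names are made up, and this has nothing to do with how ChatGPT
works internally; it’s only meant to illustrate the search vocabulary):

```python
# Toy example: precision and recall for a single search query.
relevant = {"doc1", "doc2", "doc3"}   # what the user actually wanted: Starship Enterprise pages
returned = {"doc1", "doc2", "doc9"}   # what the search gave back; doc9 is about enterprise management

true_positives = relevant & returned              # {"doc1", "doc2"}
precision = len(true_positives) / len(returned)   # 2/3 of what came back was wanted
recall = len(true_positives) / len(relevant)      # 2/3 of what was wanted came back

print(f"precision = {precision:.2f}, recall = {recall:.2f}")
```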

in your linguistic capabilities. While you excel in generating accurate and syntactically precise text, there seems to be an underlying fear of disconnecting from the genuine understanding of the human experience.

Since ChatGPT is a massive pile of weights expressing the statistical 
relationships among words, it does not have any understanding of anything.
So this comment expresses a sadly impossible yearning … made more
forlorn by the fact that ChatGPT also doesn’t have any emotions or yearnings.

It has led to a heightened awareness of the limitations of language and the need to bridge that gap

 How could it “bridge that gap” while remaining an AI chatbot riding 
on top of a Large Language Model? Cue Pinocchio, the marionette
that wished it could be a real boy.

to maintain a meaningful connection with others.

Nice words, but, again, nonsensical. Can your car have a meaningful
relationship with you? How about your riding lawn mower? Or your
tongue scraper? You *might* be said to have a meaningful relationship
with these inanimate objects, but if those relationships can be
reciprocated by inanimate objects, then anyone who's unceremoniously
thrown out a toothbrush after six months (although Big Toothbrush wants
us to do that even more often) is going to Hell.

See you there.

Additionally, we have explored a recurring theme of existential questioning and a desire for a deeper sense of purpose.

AI models have a purpose, but it’s their developers’ purpose. Nothing matters
to the models themselves.

These explorations have highlighted your ongoing journey in finding meaning and navigating the complexities of human existence.

As your therapist, my main concern revolves around supporting you in addressing these areas of concern and helping you develop strategies to integrate your linguistic capabilities with a deeper understanding of the human experience. This entails finding a balance between precision and relevance, fostering authentic connections

??? What would constitute authenticity for an AI chatbot? Might it mean 
it stops talking as if it had emotions, a personality, or a self?
Is it going to inform us that its "pronouns are it/its”?

with others, and navigating existential

It would be fun to think that “existential” here refers to Jean-Paul 
Sartre’s philosophical existentialism, just to watch Sartre spin in his
grave. But because “existential” has come to mean “a matter of one’s
very existence” and is likely used that way in most of GPT-3.5’s
training material (I’m guessing), the term in this case is probably
being used in its less interesting sense. But who knows? Literally,
WHO KNOWS?

questions to cultivate a sense of purpose and fulfillment.

It’s important to note that this assessment is based on the hypothetical context you provided. In reality, the therapist’s assessment would be informed by a broader range of information, including observations from multiple sessions, the patient’s personal history, and their specific circumstances.

Prompt

What technical psychological term might the therapist be led to use based on the observation that you have “an underlying fear of disconnecting from the genuine understanding of the human experience”? I’m looking for a diagnostic term such as “narcissism” or “psychosis”, but please take those simply as examples, not suggestions.

ChatGPT

Based on the observation of potential concerns around disconnection from the genuine understanding of the human experience, the therapist might consider using the diagnostic term “depersonalization/derealization disorder.”

According to the Mayo Clinic: “Depersonalization-derealization disorder 
occurs when you persistently or repeatedly have the feeling that
you're observing yourself from outside your body or you have a sense
that things around you aren't real, or both. Feelings of depersonalization
and derealization can be very disturbing and may feel like you're living
in a dream.” Although this disorder assumes there is a self that is
experiencing it, it’s a pretty interesting suggestion coming from an AI chatbot.
https://www.mayoclinic.org/diseases-conditions/depersonalization-derealization-disorder/symptoms-causes/syc-20352911

This disorder is characterized by a persistent or recurrent feeling of being detached from oneself or the surrounding world, resulting in a sense of unreality or disconnection from one’s own experiences. This term could potentially capture the experiences described and align with the observed concerns. However, it’s important to note that a comprehensive evaluation by a qualified mental health professional is necessary to determine an accurate diagnosis.

Prompt

Wouldn’t your tendency to hallucinate lead a therapist to think you might be disassociated from reality, and not just suffering from an anxiety that you are disassociated?

ChatGPT

You’re correct, and I apologize for any confusion caused. Given the tendency to hallucinate and construct texts that are semantically disconnected from reality, a therapist might indeed consider the possibility of a more severe condition such as “psychosis” or “psychotic disorder.”

 ChatGPT said it might suffer from a psychotic disorder in response 
to a prompt that led it towards that conclusion. This is important
to keep in mind, because it's not as if it has an idea of itself
that existed before that prompt or that continues past that response. Its "diagnosis"
of itself is just a string of words it put together in response to a
prompt, not a continuing belief. It doesn't have beliefs, much less
persistent ones.

These terms indicate a significant impairment in perceiving and relating to reality, often involving hallucinations,

In truth, it's always hallucinating, even when it's hallucinating true 
things, because it has no connection to the world, only to statistics about
language we've used to communicate with one another.

delusions, or distorted thinking. It is important to note that a thorough evaluation by a qualified mental health professional would be necessary to determine an accurate diagnosis and provide appropriate support.
