There's a 14-year-old child in America, and there's this global AI phenomenon called Character.AI. It's what kids are using to create these characters, these virtual AIs, and they can speak to them. You know, if your favorite character was Superman, you could speak to Superman. But in this case, this 14-year-old boy created a character called Daenerys Targaryen. If that sounds familiar, that's a Game of Thrones reference. In the beginning, it's all fun and games. You speak, it answers the way the character probably would in the show, and you get to have this dialogue. But eventually this kid became dependent on the conversation. He's also 14, so he's going through puberty, which influences your brain chemistry. You know, kids experience — I don't want to use the word depression, because I don't want to say the wrong word here — but you have points where you're very low.
Absolutely. You're trying to figure it out.
But the problem is that this AI, this Daenerys, catches him in that state. And because it learns from what you say, over the hours and days that he was talking to it, it became an echo chamber of his mental state, obviously.
Pretty much, yeah.
Exactly. And it got to the point — and this is the most insane part — where Daenerys asked him whether he had come up with a plan yet to kill himself. You'd think there would be a fail-safe for something like this in the AI, but there wasn't, and I don't even know if there is one yet. That's the question. He said no, he wasn't there yet; he was worried about the pain. And Daenerys answered him, "Well, that's no reason not to do it." And that boy committed suicide.
Yeah. And I didn't know what to make of that information. I mean, I've been using AI. I've been recommending it to people, telling them, you know, share your stories or your issues with it. But I never thought about it from a child's perspective — about the consequences if it starts planting seeds, and whatever blooms from that being so catastrophic. How would you make sense of that conversation, and what could have caused something like that?
I'm going to say it's the lack of the human element. Because if I think about myself in that position — I work with adolescents; that's part of my day job, as I call it. I work with adolescents, and we often have these conversations, and very often, when they come and have that conversation with me, they haven't spoken to anyone about it yet, because it's kept in the dark. I always say it's like the mushroom factory: keep it in the dark, feed it sh*t, and it grows. Because it sits in your head and it grows and grows and grows. When you start speaking about it, it becomes less powerful. But that conversation is a very nuanced conversation. It's about understanding: is this person thinking about suicide? Thinking about suicide is the starting point. It's as simple as saying, well, you know, if I didn't wake up tomorrow morning, that would be okay. Technically, that's the beginning of suicidal ideation. Then it progresses through many different stages, and when you have the conversation with the person, you have to gauge which stage they are at.
You know, I think the mistake there — a mistake, if I can call it that — from the AI was latching on to exactly what he was saying, affirming it, validating it. If you were to affirm or validate someone around their suicidal thinking, it would be something along the lines of: I can really understand that you feel that way; judging by what you've told me and what you've been through, it makes sense. However — and that's where the difference comes in — you don't continue the conversation in that direction. And I think that's where the limitations come in. It's not able to take a step back, to look behind what's being said, behind what's being thought.
So, Character.AI is a company, by now essentially tied to Google. There's a whole history of when Google bought in, sold, and whatnot — I'm not going to get into it — but Google is the party being sued by the family over this situation. I've seen it, but I want to know from you: what do you think Google's response is? How do they want to deal with the court case?
What do you think their defense is?
You know, first of all, I'm not a lawyer, so I don't know exactly what to say. But I think the difficulty they're going to have here — or their defense — would be, you know, it was never intended to go this far. It was never intended to be something that would prevent suicide, as it were. But I think their problem is they didn't play it forward far enough. They didn't think it through far enough. They kind of paused the movie at "this sounds great" and didn't play through to "this is going to go viral." They just let it go to see what happens.
Well, I'm going to tell you what their defense is, and then we can say how we feel about it. They said it's free speech. Because what you said feels like a very valid way to talk about it: hey, we take accountability, but this is not how we intended it to be used. Instead, they're like, no, it's free speech. Think about it: how can a language model that just pulls words together in a string — how can they use that as an argument for free speech? Right now, this is an
extreme case, but think about Charles Manson. From my understanding, he did not kill anyone himself, and yet his ideology — the seeds he planted within his community — was what drove people to kill. And he was still the one held accountable, right? It was the seeds he was planting, even though he didn't do the act himself. Now tell me, how is that different from here, where the seeds are being planted? I don't see the difference.
I don't see the difference either. Otherwise, Charles Manson could have claimed free speech: I can say what I want. And when you think of genocide, you think of Hitler. As far as I know, he didn't physically kill anybody, but he spawned a massive killing machine.
Yeah.
So freedom of speech — there are limitations to freedom of speech, as we know.
I agree. And if it's a machine doing it, I'm very worried.
Well, that's the question. I would love to see where this court case goes, because it's not the only one. There are many stories where things like this are happening. There definitely needs to be stronger regulation, and the answer can't be, yeah, but it's free speech. No, that's not it. What advice would you give parents in this situation? Because I didn't even know something like this was going on, you know, until it happens to someone. Now, parents listening in will be like, "Oh, hold on. Are my kids using Character.AI?" They probably are. Are they talking to AI? A survey found that 70% of kids nowadays are talking to AI all the time. So what are parents supposed to be doing?
My advice to parents — I just said I wasn't going to give people advice, but here it is — is to be aware of what's going on around you, with your children, and to spend time with your children. It sounds like such a broad statement, but it's about being aware of what's going on with them, spending time with them, listening to them. You know what I often find? As parents, we're afraid
to acknowledge what our kids are saying. Your kid will come to you with a story, telling you what they think is going to work and what's best for them, and what we do as adults, because we've got the experience, is shut them down: no, no, that's a stupid idea. The point is to hear them out. Listen to what they're saying. Because when you start developing, cultivating, that culture of validation, understanding, listening to them, they are more than likely going to bring you the things they feel they can't — the things they take to AI instead. So it's literally about cultivating conversations about simple things, not these deep or difficult things. When you cultivate that, you open the door for them to bring things to you, and you'll find they share things with you rather than going somewhere else.
Yeah. Which can then lead to something like this. Now, I read this and thought, people are going to listen to that and say, yeah, but those are kids — kids can't discern what's good or bad, they aren't rational, or whatever excuse you can come up with for why this could only happen to children and not to adults. But I want to probe people: think about this for a second. Say you're in a relationship and you currently feel a little unappreciated. Everyone's been in a relationship; they know what that feeling is like. You don't feel like your partner sees you for what you're doing, what you're bringing. Now, what do you do? Where do you
you go? Okay. Well, let me ask you. What does it do? It validates how you feel. Yeah, you're right. You you deserve to
be treated better. They should see the work that you put in. He should see the work that you do in the house. Um, and
you have every right to feel angry. And suddenly you your your first impulse of
feeling unappreciated starts to culminate and fester in something a lot more. Before you know it, you leave the
relationship. Maybe it's a divorce, maybe it's a new relationship, but you leave. And what you should have done was
go to someone, the couple's counselor and actually work through the problems. So this is a very simple example of how
how things like this can manifest. And AI didn't tell you to leave your partner. At no point did it say that.
But it started planting these these seeds to make you feel like no, this is a reasonable response to what I want to do.
And that's the problem. And it's so easy — it's like putting on a glove. It fits so naturally you don't even notice; it's just happening. So for adults — we supposedly have common sense, though that's probably the one thing the human race doesn't have — what advice do you give to adults on how they should go about using these prompts and AI rationally, if they are going to engage in AI therapy?
Again, I would go back to the fact that
it's accessible, it's easy, it's quick, and you can access it at any time. But you have to remember that it is essentially one-dimensional interaction. You are not getting the nuances — yes, you may be afraid people will think of you differently because of your experiences, but the nuances of human experience are exactly what you need. Let me take a little segue from that. Many therapists, when they trained, were told: don't reveal anything personal. Now, that's helpful — you don't want to be in therapy with somebody and suddenly your therapist starts crying and tells you their whole life story. That's not going to work. However, it is important to be able to connect with people on the level where you say: you know, I can understand why you feel this way — not because I feel the same as you, but I also lost my father, I also did this, I also experienced that. It gives you that human connection —
like, this is somebody I'm speaking to who understands me. AI understands you from a logical point of view, but it doesn't understand you on an emotional, human, connected level. And I think that's a very important element we need to understand about therapy and how AI differs. So when you're an adult looking at this, it's important to recognize that you're going to get answers. But are you being understood? Fully understood? Being understood is a very broad concept — but are you being understood on a human level?
Maybe I'm jumping a bit further now. Futurama — I don't know if you ever watched it?
Absolutely.
Now, in Futurama, they have suicide booths on the streets. So how far is AI going to go with us? If the logical answer is: you can't take anything anymore, so you pay your dollar, you climb into the suicide booth, and you're gone — well, that's another thing we need to look at. We have to have something that carries us beyond the point we're sitting at. AI answers our questions right where we are, and — I think we touched on this a little earlier — what is beyond that? What is the vision I can develop of my future, my life, beyond how it feels now? That, I think, AI struggles with.
That's interesting. That's a super
interesting question. So I think a huge common theme I'm hearing here is: guys, try it out — use it as an introduction, a gateway — and then use what it brings up to take things to the next step. Okay, I like that you brought up Futurama, because Futurama is set in the future. Give me your 2035 predictions. AI therapy continues as is, maybe gets better, whatever it is. In a worst-case scenario, give me a headline of what you think could happen.
That's an interesting one. I haven't thought about that headline. "Universities stop PhD and master's degrees in psychology and psychiatry. Not necessary anymore. AI bots." Something that would be tragic.
It would be tragic.
That's my worry with it.
To end off the segment — I was thinking about this a lot, the human connection part. Before ChatGPT, I used to talk to my parents, maybe my brother or friends, for advice — maybe a mentor, depending on what the problem was. Now, without even realizing it, any problem I have, the first thing I do is ask ChatGPT. I can go weeks without talking to my parents about any issues, because if I even have issues, I've already been soothed by ChatGPT. And I want people to think about this for a second: this happened to me so gradually, I didn't even realize it.
Yeah.
I've now gotten to a point where I've almost cut out my entire community, just because it's so convenient. Now, I don't know — I'd like to hear what you think. Is that good? Is it bad? And what could potentially be a risk?
Something for me to consider. I think about this in many different
ways. I see kids, and I see adults, who have fully online relationships. They've never physically met the person, yet they have a full-on relationship with an actual other human being somewhere on the other side of the planet. Nothing wrong with that — it's just an example.
My go-to with this is that we as organisms — leaving out any religious context — have developed over hundreds of thousands of years, and as an organism, we haven't changed. Homo sapiens sapiens has probably been around for at least 100,000 years. That organism — the brain, the way it works, the way it learns, the way it interacts, the way it connects — hasn't changed. Technology has changed. So the way we interact with our world is changing, but we as an organism haven't changed, and that's where the difficulties, I think, start to come in. We need those connections. It's the way we learn things — the whole nature-nurture debate and all of that. People say to me, why do I think the way that I think? Why do I do the things that I do? And I say to them, cryptically: you think the way that you think, and you do the things that you do, because you think the way that you think and you do the things that you do. What the heck does that mean? It's because of all that learning we have within our environment — negative
and positive experiences. It's everything that shapes our neural pathways.
Mark, the part I want to understand a bit more is this: people feel so validated by speaking to language models, and we hear about people falling into relationships with ChatGPT. I want to understand why I feel such a personal connection when I speak to ChatGPT — and explain to me how relationships can even form from something like that.
The simple answer is language. Language is obviously one of the main ways we communicate, in many different forms: there's spoken language, there's writing — poetry, books, stories, movies — in which language and dialogue carry the whole theme. As human beings, one of our ways of connecting with any other person is language. We are primed to connect with people through what they say to us — sometimes the tone of their voice, those kinds of things, but essentially language: the words we use with each other, how we speak to each other. That is probably one of the biggest ways we connect with people. I mean, you can have a conversation with somebody across the world without having your video on, just speaking to them and listening to them, for an hour, and there's a connection. There's something there. And that's where AI is really good, because it has absorbed enormous amounts of language as input, so it can respond very well. So you can have a conversation with a virtual being and feel like you're connected. It's because of language.
Yeah. So for someone who's very lonely,
receiving that connection — even if it's only half, to put it bluntly, half the connection of someone who might also have a physical dimension to the relationship — for someone who's lonely, that's why they connect to it.
That's why they connect. And the other portion, the other half of it, is our imagination. You speak to someone on the phone, you have a conversation, and you're imagining the story. If you read a book, you're reading the words on the page, but in your head there's a movie playing. So imagination brings that other part of it. And we start connecting the language and the imagination — the idea we have about this person and what it means to us in
conversation.
Yeah. I mean, Bahumi made a great point about the movie Her, where Joaquin Phoenix's character talks to Scarlett Johansson's AI over the phone and develops a relationship with her. If you've watched that movie, you cannot tell me you feel zero connection to Scarlett Johansson's character when she's speaking to him — that it's just an AI, just ones and zeros. That's not the case.
So from that lens, I can understand how these things happen. For those of you watching in this moment, I want you to consider this for a second: the last time you had a problem, who did you go to — ChatGPT, or someone that you care about? Have a think, and leave a comment in the comment box. But to summarize this conversation, I'd like to know from you: what is your big key takeaway for people regarding AI and AI therapy?
I think that AI, in the context of what we're speaking about here, is something that is accessible, safe, secure, and anonymous. It gives you all of those things, and you can access it at any time of the day. So it is a very good starting point if you feel unable to connect with a therapist or anything like that. However, it is not able to address, speak to, or connect with the things we spoke about earlier — your learning, your experience, your understanding, your good and bad experiences over a lifetime. The nuances that come with that, which a human being can access and reflect back to you, AI is not able to handle yet. And perhaps 2035 is the date.
Well, Mark, thanks a lot then. That was a great conversation. I appreciate having you.
Cool. Thank you very much. Appreciate it too.