AI is NOT one of your little friends


Erica Stanley

Engineering Leader, Community Builder, Speaker, Contributor

Code & Conscience #006

Code & Conscience is my way of thinking out loud with friends (that’s you, btw) about how we as technologists–builders, creators, and leaders–impact the world around us with what we choose to build and how we build it.

In this Issue

👩🏾‍💻

AI is NOT one of your little friends

A handful of tech leaders want AI to replace human connection, but should we let it? In this section, I talk about AI friends, therapists, and partners.

💡

Using AI with Caution

AI is not evil, far from it. It is a tool. It can be used intentionally and responsibly. I share practical tips in this section.

📘

On My Bookshelf

I've been reading books lately about how we got here, along with books on human-centered AI, and I want to share them with you! I hope you read one or two.

AI is NOT one of your little friends

Lately, I’ve noticed people are more scared to show their ChatGPT history than their search history. 😅 And while that did tickle me, it also made me think about how deeply and quickly AI has embedded itself into our lives.

There’s a difference between using a tool and forming a relationship with it, and right now, we’re walking a fine line. In a recent interview, Mark Zuckerberg talked about the need for "AI friends" to promote connection and conversation, and to help fight the loneliness epidemic. And that’s not all: we’re seeing a growing influx of AI-powered therapists, advisors, and even romantic partners.

For example, the New York Times interviewed a 28-year-old woman who spends 20 hours a week talking and sexting with her AI boyfriend, ChatGPT. Another woman was prepared to divorce her husband of 12 years because ChatGPT convinced her he was having an affair. And according to Rolling Stone, some self-styled prophets are claiming to have accessed the secrets of the universe through ChatGPT.

Do we want to become main characters in a Black Mirror episode? Because this is how we get Black Mirror IRL.

Here’s the deal: AI doesn’t care about you. Not because it’s mean, but because it can’t. It might promise emotional support, or even companionship, but it cannot deliver, because it doesn’t understand your backstory, your culture, or your values. It doesn’t read between the lines when you say, “I’m fine,” but mean the opposite.

It mimics empathy through syntax and pattern recognition, but there’s no heart behind the algorithm. That lack of care behind the algorithm is especially important when you understand who is pushing for AI to replace human companionship (Mark Zuckerberg, Sam Altman, etc.) and what they stand to gain from our isolation.

Below is the Constraint-Context matrix from Pete Hodgson's article, Why Your AI Coding Assistant Keeps Doing It Wrong, and How To Fix It. In it, he discusses the kinds of problems where AI coding assistants can be applied with positive results, and the kinds of problems where they probably shouldn't be applied.

I argue that the "Danger Zone" of implied context and open solution space is the same reason you would not want to apply AI to the challenges of human empathy and companionship. Here are just a few reasons why you shouldn't let AI replace human connection in your life:

  1. AI has been caught confidently recommending wrong answers in technical scenarios. Now imagine it giving you wrong life advice and manipulating your emotions under the veil of "AI hallucinations".
  2. It doesn’t have the intuition, ethics, or emotional intelligence of a friend, partner, or trained professional.
  3. Especially for young users, it has the potential to derail emotional growth, increase isolation, and create unrealistic expectations about relationships.
  4. There are almost NO privacy and confidentiality laws for bots.

So here’s your reminder: AI is not one of your little friends. It's a tool, and we need to treat it like one.

Using AI with Caution

AI is not all evil; far from it. I actually believe it can be a helpful tool in anyone's arsenal. That being said, here are some practical ways to use AI with intention.

  1. Use it for tasks, not connection: Ask it for a to-do list, a brainstorm, or even career advice, but don’t use it to process your feelings or replace human interaction.
  2. Set boundaries: Don’t use AI when you’re feeling vulnerable, isolated, or emotionally charged. Reach out to a real person instead.
  3. Check your inputs: If you're sharing sensitive personal info, ask yourself: Would I tell this to a stranger? Because that’s basically what you’re doing.
  4. Don’t fall for the performance: Remember, kindness, empathy, and validation from a bot are just predictions based on language patterns. They feel real, but they’re not rooted in care.
  5. Use it to augment, not replace: AI can support learning, therapy, or brainstorming, but the human part is still irreplaceable.
  6. Help others: If you’re seeing others lean too hard on AI for emotional support, gently check in. We don’t need judgment, we need real community.

On My Bookshelf

What I'm Reading

  - Careless People: A Cautionary Tale of Power, Greed, and Lost Idealism by Sarah Wynn-Williams
  - Human-Centered AI by Ben Shneiderman
  - Designing Human-Centric AI Experiences: Applied UX Design for Artificial Intelligence by Akshay Kore
  - Guardrails: Guiding Human Decisions in the Age of AI by Urs Gasser and Viktor Mayer-Schönberger
