AI is NOT one of your little friends


Erica Stanley

Engineering Leader, Community Builder, Speaker, Contributor

Code & Conscience #006

Code & Conscience is my way of thinking out loud with friends (that’s you, btw) about how we as technologists–builders, creators, and leaders–impact the world around us with what we choose to build and how we build it.

In this Issue

👩🏾‍💻

AI is NOT one of your little friends

A handful of tech leaders want AI to replace human connection, but should we let it? In this section, I talk about AI friends, therapists, and partners.

💡

Using AI with Caution

AI is not evil; far from it. It's a tool, and it can be used intentionally and responsibly. I share practical tips in this section.

📘

On My Bookshelf

Lately I've been reading books about how we got here, along with books on human-centered AI, and I want to share them with you! I hope you read one or two.

AI is NOT one of your little friends

Lately, I’ve noticed people are more scared to show their ChatGPT history than their search history. 😅 And while that did tickle me, it also made me think about how deeply and quickly AI has embedded itself into our lives.

There’s a difference between using a tool and forming a relationship with it, and right now, we’re walking a fine line. In a recent interview, Mark Zuckerberg talked about the need for "AI friends" to promote connection and conversation and to help fight the loneliness epidemic. And that’s not all: we’re seeing a growing influx of AI-powered therapists, advisors, and even romantic partners.

For example, the New York Times interviewed a 28-year-old woman who spends 20 hours a week talking and sexting with her AI boyfriend, ChatGPT. Another woman was prepared to divorce her husband of 12 years because ChatGPT convinced her he was having an affair. And according to Rolling Stone, some self-styled prophets are claiming to have accessed the secrets of the universe through ChatGPT.

Do we want to become main characters in a Black Mirror episode? Because this is how we get Black Mirror IRL.

Here’s the deal: AI doesn’t care about you. Not because it’s mean, but because it can’t. It might promise emotional support, or even companionship, but it cannot deliver, because it doesn’t understand your backstory, your culture, or your values. It doesn’t read between the lines when you say "I’m fine" but mean the opposite.

It mimics empathy through syntax and pattern recognition, but there’s no heart behind the algorithm. That lack of care matters even more when you understand who is pushing for AI to replace human companionship (Mark Zuckerberg, Sam Altman, etc.) and what they stand to gain from our isolation.

Below is the Constraint-Context matrix from Pete Hodgson's article, Why Your AI Coding Assistant Keeps Doing It Wrong, and How To Fix It. In it, he maps out the kinds of problems where AI coding assistants can be applied with positive results, and the kinds where they probably shouldn't be applied at all.

I argue that the "Danger Zone" of implied context and open solution space is exactly why you wouldn't want to apply AI to the challenges of human empathy and companionship. Here are just a few reasons you shouldn't let AI replace human connection in your life:

  1. AI has been caught giving wrong answers in technical scenarios. Now imagine it giving you wrong life advice and manipulating your emotions under the veil of "AI hallucinations".
  2. It doesn’t have the intuition, ethics, or emotional intelligence of a friend, partner, or trained professional.
  3. Especially for young users, it has the potential to derail emotional growth, increase isolation, and create unrealistic expectations about relationships.
  4. There are almost NO privacy or confidentiality laws protecting what you share with bots.

So here’s your reminder: AI is not one of your little friends. It's a tool, and we need to treat it like one.

Using AI with Caution

AI is not all evil; far from it. I actually believe it can be a helpful tool for anyone to have in their arsenal. That being said, here are some practical ways to use AI with intention.

  1. Use it for tasks, not connection: Ask it for a to-do list, a brainstorm, or even career advice, but don’t use it to process your feelings or replace human interaction.
  2. Set boundaries: Don’t use AI when you’re feeling vulnerable, isolated, or emotionally charged. Reach out to a real person instead.
  3. Check your inputs: If you're sharing sensitive personal info, ask yourself: Would I tell this to a stranger? Because that’s basically what you’re doing.
  4. Don’t fall for the performance: Remember, kindness, empathy, and validation from a bot are just predictions based on language patterns. They feel real, but they’re not rooted in care.
  5. Use it to augment, not replace: AI can support learning, therapy, or brainstorming, but the human part is still irreplaceable.
  6. Help others: If you’re seeing others lean too hard on AI for emotional support, gently check in. We don’t need judgment, we need real community.

On My Bookshelf

What I'm Reading

  1. Careless People: A Cautionary Tale of Power, Greed, and Lost Idealism by Sarah Wynn-Williams
  2. Human-Centered AI by Ben Shneiderman
  3. Designing Human-Centric AI Experiences: Applied UX Design for Artificial Intelligence by Akshay Kore
  4. Guardrails: Guiding Human Decisions in the Age of AI by Urs Gasser and Viktor Mayer-Schönberger
