AI is NOT one of your little friends
Published 5 months ago • 4 min read
Erica Stanley
Engineering Leader, Community Builder, Speaker, Contributor
Code & Conscience is my way of thinking out loud with friends (that’s you, btw) about how we as technologists–builders, creators, and leaders–impact the world around us with what we choose to build and how we build it.
In this Issue
👩🏾💻
AI is NOT one of your little friends
A handful of tech leaders want AI to replace human connection, but should we let it? In this section, I talk about AI friends, therapists, and partners.
💡
Using AI with Caution
AI is not evil; far from it. It's a tool, and it can be used intentionally and responsibly. I share practical tips in this section.
📘
On My Bookshelf
Lately I've been reading books about how we got here, along with books on human-centered AI, and I want to share them with you! I hope you read one or two.
Credit: (An intentionally meta) Image generated using ChatGPT
AI is NOT one of your little friends
Lately, I’ve noticed people are more scared to show their ChatGPT history than their search history. 😅 And while that did tickle me, it also made me think about how deeply and quickly AI has embedded itself into our lives.
There’s a difference between using a tool and forming a relationship with it, and right now, we’re walking a fine line. In a recent interview, Mark Zuckerberg talked about the need for "AI friends" to promote connection and conversation and to help fight the loneliness epidemic. And that's not all: we're gradually seeing an influx of AI-powered therapists, advisors, and even romantic partners.
Do we want to become main characters in a Black Mirror episode? Because this is how we get Black Mirror IRL.
Here’s the deal: AI doesn’t care about you. Not because it’s mean, but because it can’t. It might promise emotional support, or even companionship, but it cannot deliver. That's because it doesn’t understand your backstory, your culture, or your values. It doesn’t read between the lines when you say, “I’m fine,” but mean the opposite.
It mimics empathy through syntax and pattern recognition, but there’s no heart behind the algorithm. That absence of genuine care matters even more when you understand who is pushing for AI to replace human companionship (Mark Zuckerberg, Sam Altman, etc.) and what they stand to gain from our isolation.
I would argue that the "Danger Zone" of implied context and open solution space is the same reason you wouldn't want to apply AI to the challenges of human empathy and companionship. Here are just a few reasons why you shouldn't let AI replace human connection in your life:
AI has been caught confidently recommending wrong answers in technical scenarios. Now imagine it giving you bad advice or manipulating your emotions under the cover of "AI hallucinations."
It doesn’t have the intuition, ethics, or emotional intelligence of a friend, partner, or trained professional.
Especially for young users, it has the potential to derail emotional growth, increase isolation, and create unrealistic expectations about relationships.
There are almost NO privacy or confidentiality laws covering what you share with a bot.
So here’s your reminder: AI is not one of your little friends. It's a tool, and we need to treat it like one.
Using AI with Caution

AI is not evil; far from it. I actually believe it can be a helpful tool for anyone to have in their arsenal. That being said, here are some practical ways to use AI with intention.
Use it for tasks, not connection: Ask it for a to-do list, a brainstorm, or even career advice, but don’t use it to process your feelings or replace human interaction.
Set boundaries: Don’t use AI when you’re feeling vulnerable, isolated, or emotionally charged. Reach out to a real person instead.
Check your inputs: If you're sharing sensitive personal info, ask yourself: Would I tell this to a stranger? Because that’s basically what you’re doing.
Don’t fall for the performance: Remember, kindness, empathy, and validation from a bot are just predictions based on language patterns. They feel real, but they’re not rooted in care.
Use it to augment, not replace: AI can support learning, therapy, or brainstorming, but the human part is still irreplaceable.
Help others: If you see others leaning too hard on AI for emotional support, gently check in. We don’t need judgment; we need real community.