Move Fast and Break...Society?


Code & Conscience #013

In this Issue

Tech moves very fast, especially with AI running the show! But should we trade some of that speed for caution? Is the "move fast and break things" philosophy still relevant, or is it time to rethink it?

Move Fast and Break...Society?

"Move fast and break things". We've all heard this tech philosophy that prioritizes rapid development and deployment over cautious, deliberate innovation. The idea was to get a product out quickly and fix issues later. While this approach helped companies like Meta (formerly Facebook) quickly dominate the market, its negative societal impacts have become too harmful and more widespread.

I'll be honest, this philosophy has helped shape the world we live in. Many products would never have made it to the limelight if we had waited for perfection before releasing to the public. However, a focus on speed without a counterbalance of responsibility can cause real damage. In fact, I dare say it is already causing real damage.

For example, Tea, a dating safety app that lets women run background checks on men and anonymously share "red flag" behaviour, was hacked in July, exposing thousands of members' images, posts, and comments. The app was allegedly vibe-coded and had a publicly accessible database. Any software engineer worth their salt would have known this was a disaster waiting to happen. But the ease of shipping products with AI, coupled with the "move fast and break things" philosophy, blinded the team to the risk. An app meant to protect women ended up putting them in danger.
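I don't know what Tea's stack actually looked like, so take this as a hypothetical sketch rather than a post-mortem. But a "publicly accessible database" is exactly the kind of failure a five-minute pre-launch check can catch: hit your own API and storage endpoints with no credentials and refuse to ship if they answer. The endpoints below are placeholders I made up for illustration, not anyone's real infrastructure.

```python
# Minimal pre-launch smoke test: confirm that backend endpoints reject
# unauthenticated requests. The URLs are hypothetical placeholders.
import sys
import requests

ENDPOINTS = [
    "https://api.example-app.com/v1/users",       # hypothetical user records API
    "https://storage.example-app.com/uploads/",   # hypothetical image bucket
]

def main() -> int:
    failures = []
    for url in ENDPOINTS:
        # Deliberately send no auth header; a locked-down backend should say no.
        resp = requests.get(url, timeout=10)
        if resp.status_code not in (401, 403):
            failures.append(f"{url} answered {resp.status_code} without credentials")
    for failure in failures:
        print("EXPOSED:", failure)
    return 1 if failures else 0

if __name__ == "__main__":
    sys.exit(main())
```

Wire a check like that into a release pipeline and it fails the build the moment something is left wide open. Moving fast doesn't have to mean skipping the basics.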

These aren't the only problems. We've also seen a domino effect of negative consequences, including:

  1. Misinformation and the erosion of truth
  2. Amplification of inequality
  3. Technical debt and poor quality innovations
  4. Workforce disruption
  5. Ethical dilemmas and loss of control

Why We Need to Move Away from this Philosophy

A similar phrase that reflects this philosophy is "If there is nothing wrong with your MVP, you’ve launched too late." But it's becoming increasingly clear that a reckless, high-speed approach to innovation is unsustainable and, frankly, dangerous. In today's world, where an adolescent can whip up an app in a few minutes with no safety or ethical measures, we must do better. Here’s why:

  1. It Fosters a Lack of Accountability: The absence of a clear line of accountability for negative outcomes is snowballing into a major problem in tech. When something goes wrong (a data breach, an algorithmic error, or a platform that spreads hate speech), the companies responsible often face no meaningful consequences. Products are unveiled fast, but regulation is slow and consequences are nonexistent.
  2. Innovation Is Moving Faster Than Our Ability to Regulate It: The pace of AI development (and technology in general) outstrips the speed of legislation. This creates a regulatory gap where powerful tools are deployed without clear rules or oversight. Companies are now taking it upon themselves to create their own ethical guidelines and internal governance frameworks while they wait for governments to catch up. This push to get ahead of regulation is a clear sign that the industry itself recognizes the danger of a completely unchecked, rapid-fire approach.
  3. It's Unsuitable for High-Stakes Systems: What may be a minor bug in a social media app can be catastrophic in a medical app. The "move fast and break things" ethos is completely inappropriate for technologies that have a direct impact on people's lives and livelihoods.
  4. It Deepens Social Inequality: AI is creating massive workforce displacement without giving people time to adapt. According to a Brookings Institution commentary, current worker retraining programs are struggling to keep pace, leaving many behind and increasing the risk of a widening societal divide. The focus on speed can end up prioritizing a small group of creators while leaving the rest of the world to deal with the consequences.
  5. It Puts Widely Used Tools at Risk: This mindset might be acceptable for an early-stage startup, but it is dangerous when applied to systems that millions, or even billions, of people depend on. The constant need to push out new features can lead to a state of perpetual instability.

I like the way Swati Tyagi put it, "true speed comes from building reliable systems, not from constantly fixing what’s broken." When a tool becomes a fundamental part of our lives, its stability, ethics, and trustworthiness matter more than speed alone.

Around the Web

▶️ The Past, Present, and Two Futures of Agentic AI Automation by Noble Ackerson

▶️ “Move fast and break things” wasn’t a roadmap. It was a warning—and we ignored it by Radical Candor

📖 AI makes Silicon Valley’s philosophy of ‘move fast and break things’ untenable by Constance De Saint Laurent and Vlad Glăveanu

▶️ Ethics first then advancement by MoGawdatOfficial

▶️ Technofeudalism by Curiosity Theory

Good news, everyone! I'm now partnering with Bookshop.org to bring you recommendations based on books I'm reading on this issue's topics. That means I earn a commission if you click through and make a purchase, and your purchases support local bookstores! Bookshop.org has raised over $40M for independent bookstores around the world!

Take a look at the new reads this week, all available on Bookshop.org

Technofeudalism: What Killed Capitalism by Yanis Varoufakis
You, the Machine by Cause0

Erica Stanley

Engineering Leader, Community Builder, Speaker, Contributor

Code & Conscience is my way of thinking out loud with friends (that’s you, btw) about how we as technologists–builders, creators, leaders–impact the world around us with what we choose to build and how we build it.

