Unpacking Tech's Role in Modern Warfare


Code & Conscience #012

In this Issue

I discuss the scary advent of Lethal Autonomous Weapon Systems (LAWS), essentially self-driving weapons, and how tech companies are "supporting" war.

Unpacking Tech's Role in Modern Warfare

War is never fun. It’s difficult, gut-wrenching, and life-threatening. It’s sad to see the number of wars currently happening in the world. But as a tech leader, it’s even sadder to see the role tech is playing in this chaos.


With Trump in the seat of power, big tech companies are landing multi-million-dollar defense contracts and rewriting their policies to allow “national security” applications. OpenAI, Google, Anthropic, xAI, and Scale AI have each landed contracts worth over $200 million to accelerate the development of artificial intelligence in the U.S. military.

For-profit tech companies should not be involved in conversations, actions, or the development of technology that could increase harm, especially war. Unfortunately, they are. Here are some ways your favorite tech companies and technologies are aiding and abetting war:

  1. Surveillance: Using Microsoft’s Azure cloud services, Israeli military intelligence reportedly built a surveillance system to store and scan millions of phone calls in Gaza and the West Bank. Calls of suspects and innocent civilians were intercepted daily, including calls for international aid.
  2. Artificial Intelligence: If you think AI is only used to create funny videos, you are very wrong. It is also being used for smarter cyberattacks like phishing, distributed denial-of-service (DDoS) attacks, and more. For example, Cloudflare's 2024 report shows how AI-assisted hackers could exploit vulnerabilities within 22 minutes of a proof of concept being published. The proliferation of AI will also swing public opinion and decision-making in times of war. And to add insult to injury, Grok won its $200 million contract after going "super-Nazi."
  3. Autonomous Systems: Have you ever heard of Lethal Autonomous Weapon Systems (LAWS)? These are weapons designed to make life-and-death decisions without humans in the loop. According to Autonomous Weapons Watch, 17 weapon systems can operate autonomously, with China, Germany, Israel, South Korea, Russia, Turkey, Ukraine, and the U.S. leading their development. The UN is pushing back on this technology, stating that humans must remain in control of these systems—especially when lives are on the line.

As much as I’d like to assume these problems only exist in my favorite sci-fi books, the current state of the world puts them right in front of me.

What Now?

Companies like Hugging Face and Mozilla have flatly refused to contribute toward military purposes, but other companies are riding the wave. Employees at Microsoft and Google have protested their companies’ participation in the war. But where did that land them? Back in the job market. So what now?

  1. We need strong legal frameworks that put accountability on people and companies, especially those behind these technologies.
  2. More international cooperation that centers on human rights and ethical guardrails is needed, both within and outside the UN.
  3. Policymakers need to look closely and have honest conversations about who benefits from this tech, who gets harmed, and who decides.

As modern warfare changes and global tensions rise, we have to make sure the development of technology is guided by transparency, human supervision, and respect for life.

Q&A

I’m curious: with the current job market and how difficult it’s gotten to speak against authority, what would you do if you discovered your company was “aiding” wars? Please send me your answer by replying to this email. I’m interested in hearing your thoughts!

Around the Web

📖 “We don’t want our music killing people”: Rock band quits Spotify over CEO Daniel Ek’s military investments by Sam Roche

📖 Cyberwarfare 2025 Report: How AI Is Reshaping Cyberattacks And Cybersecurity by Cybercrime Magazine

📖 Deadly Slop: Artificial intelligence on the battlefield by Sophia Goodfriend

Good news, everyone! I'm now partnering with Bookshop.org to bring you recommendations based on books I'm reading on this issue's topics. That means I earn a commission if you click through and make a purchase, and your purchases support local bookstores! Bookshop.org has raised over $40M for independent bookstores around the world!

I'm excited to support all of your favorite local bookstores with the same high-quality recommendations I've been sharing since day one with the Code & Conscience community! Take a look at the new reads this week, all available on Bookshop.org.

Book covers: "The Scientific Way of Warfare: Order and Chaos on the Battlefields of Modernity" by Antoine J. Bousquet and "The Coming Wave: AI, Power, and Our Future" by Mustafa Suleyman and Michael Bhaskar.

Erica Stanley

Engineering Leader, Community Builder, Speaker, Contributor

Code & Conscience is my way of thinking out loud with friends (that’s you, btw) about how we as technologists–builders, creators, leaders–impact the world around us with what we choose to build and how we build it.

