Code & Conscience #012
In this Issue
I discuss the scary advent of Lethal Autonomous Weapon Systems (LAWS), essentially self-driving weapons, and how tech companies are "supporting" war.
Unpacking Tech's Role in Modern Warfare
War is never fun. It’s difficult, gut-wrenching, and life-threatening. It’s sad to see the number of wars currently happening in the world. But as a tech leader, it’s even sadder to see the role tech is playing in this chaos.
With Trump in the seat of power, big tech companies are landing multi-million-dollar defense contracts and rewriting their policies to allow "national security" applications. OpenAI, Google, Anthropic, xAI, and Scale AI have each landed contracts worth up to $200 million to accelerate the development of artificial intelligence in the U.S. military.
For-profit tech companies should not be involved in conversations, actions, or the development of technology that could increase harm, least of all war. Unfortunately, they are. Here are some of the ways your favorite tech companies and technologies are aiding and abetting war:
- Surveillance: Using Microsoft’s Azure cloud services, Israeli military intelligence reportedly built a surveillance system to store and scan millions of phone calls in Gaza and the West Bank. Calls of suspects and innocent civilians were intercepted daily, including calls for international aid.
- Artificial Intelligence: If you think AI is only used to create funny videos, you are very wrong. It is also powering smarter cyberattacks, from phishing to distributed denial-of-service (DDoS) attacks and more. Cloudflare's 2024 report, for example, shows that AI-assisted hackers were capable of exploiting vulnerabilities within 22 minutes of a proof of concept being published. The proliferation of AI will also be used to swing public opinion and decision-making in times of war. And to add insult to injury, xAI's Grok won its $200 million contract shortly after going "super-Nazi."
- Autonomous Systems: Have you ever heard of Lethal Autonomous Weapon Systems (LAWS)? These are weapons designed to make life-and-death decisions without humans in the loop. According to Autonomous Weapons Watch, at least 17 weapon systems can already operate autonomously, with China, Germany, Israel, South Korea, Russia, Turkey, Ukraine, and the U.S. leading their development. The UN is pushing back on this technology, insisting that humans must remain in control of these systems, especially when lives are on the line.
As much as I’d like to believe these problems exist only in my favorite sci-fi books, the current state of the world puts them right in front of me.
What Now?
Companies like Hugging Face and Mozilla have flatly refused to build for military purposes, but other companies are riding the wave. Employees at Microsoft and Google have protested their companies’ participation in these wars. But where did that land them? Back in the job market. So what now?
- We need strong legal frameworks that put accountability on people and companies, especially those behind these technologies.
- More international cooperation that centers on human rights and ethical guardrails is needed, both within and outside the UN.
- Policymakers need to look closely and have honest conversations about who benefits from this tech, who gets harmed, and who gets to decide.
As modern warfare changes and global tensions rise, we have to make sure the development of technology is guided by transparency, human supervision, and respect for life.
Q&A
I’m curious: what would you do if you discovered your company was “aiding” wars? Given the current job market and how difficult it has become to speak out against authority, what would you do? Send me your answer by replying to this email. I’m interested in hearing your thoughts!
Around the Web
📖 “We don’t want our music killing people”: Rock band quits Spotify over CEO Daniel Ek’s military investments by Sam Roche
📖 Cyberwarfare 2025 Report: How AI Is Reshaping Cyberattacks And Cybersecurity by Cybercrime Magazine
📖 Deadly Slop: Artificial intelligence on the battlefield by Sophia Goodfriend