Sunday, December 07, 2025

ANS -- Security versus Accountability on the internet

This one is a bit technical, but it's important stuff for our understanding of the internet. The "Evan" Sara refers to in the introduction is her husband, a programmer famous in the industry.
--Kim


Sara Robinson
Evan always told his junior programmers that the most important question to keep in mind while designing anything -- whether it's code, or your life -- is: "What, exactly, do I want?"
This applies just as seriously to the security tools we choose. You can have privacy, or you can have accountability. But it's looking like it may be impossible to have both.
Genny Harrison
The world's most secure messaging app just became the Pentagon's biggest security failure, and the reason why should terrify anyone who's ever trusted technology to keep a secret. 
When Defense Secretary Pete Hegseth used Signal to share classified military strike plans, he didn't just violate protocol. He revealed a fundamental truth about digital security that Silicon Valley doesn't want to discuss: mathematical perfection and operational security are not the same thing, and confusing them can cost lives.
Signal is genuinely revolutionary technology. The protocol provides confidentiality, integrity, authentication, and forward secrecy through a sophisticated double ratchet algorithm that generates new encryption keys with every message. Imagine a diary that rewrites itself in a new language every time you close it, where even learning today's language won't help you read yesterday's entries. That's Signal's elegant solution to surveillance, the same protection that shields protesters in Hong Kong and journalists meeting whistleblowers in parking garages.
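To see why forward secrecy matters, here's a minimal Python sketch of the ratcheting idea. This is a toy hash chain, not Signal's actual double ratchet or key-derivation function: each message gets a one-time key, the chain state then advances, and the old state is discarded, so even a full compromise of today's keys can't unlock yesterday's messages.

```python
import hmac
import hashlib

def ratchet_step(chain_key: bytes) -> tuple[bytes, bytes]:
    """Derive a one-time message key, then advance the chain key.

    Toy KDF for illustration only: the point is that the old chain
    key is discarded, so leaking today's state reveals nothing about
    keys that were already used and deleted.
    """
    message_key = hmac.new(chain_key, b"message", hashlib.sha256).digest()
    next_chain = hmac.new(chain_key, b"chain", hashlib.sha256).digest()
    return message_key, next_chain

# Each message gets a fresh key; the previous chain key is overwritten.
ck = hashlib.sha256(b"shared secret from the initial handshake").digest()
for i in range(3):
    mk, ck = ratchet_step(ck)   # the old ck is unrecoverable after this line
    print(f"message {i}: key {mk.hex()[:16]}...")
```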
The app has become so synonymous with security that even senators use it, presuming that strong encryption equals strong security. This assumption just collided with military reality in spectacular fashion. The Pentagon's inspector general found that properly classified information from U.S. Central Command was shared via Signal, including granular details about F-18 fighter jets, MQ-9 Reaper drones, and Tomahawk cruise missiles bound for Yemen. The messages were specific enough to include exact strike timing, transforming operational security into operational transparency.
Here's what makes this story fascinating beyond the obvious security implications: Signal's encryption probably worked perfectly. No foreign intelligence service likely intercepted these messages. The mathematical protection held. Signal is even implementing quantum-resistant algorithms, preparing for threats from computers that don't yet exist. From a pure cryptography standpoint, the app performed flawlessly.
So how did the most secure app become a security disaster? The answer lies in understanding that the Pentagon doesn't just encrypt messages; it controls information ecosystems. Military classified networks like SIPRNet and JWICS run on physically separate infrastructure, completely isolated from the public internet. These aren't just secure channels; they're parallel digital universes with their own cables, routers, and access points. Every keystroke is logged. Every file transfer is tracked. If someone leaks classified information, investigators can reconstruct the entire chain of custody.
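The chain-of-custody idea is easy to see in code. Below is a hypothetical sketch, not the actual SIPRNet/JWICS design: an append-only log where every access event is hash-linked to the previous one, so investigators can replay who touched what, when, and any tampering breaks the chain.

```python
import hashlib
import json
import time

class AuditLog:
    """Append-only, hash-chained access log (illustrative sketch)."""

    def __init__(self) -> None:
        self.entries: list[dict] = []
        self._prev = "0" * 64

    def record(self, user: str, action: str, resource: str) -> None:
        entry = {"ts": time.time(), "user": user, "action": action,
                 "resource": resource, "prev": self._prev}
        # Each entry commits to the one before it, so the whole history
        # can be verified end to end during an investigation.
        entry["hash"] = hashlib.sha256(
            json.dumps(entry, sort_keys=True).encode()).hexdigest()
        self.entries.append(entry)
        self._prev = entry["hash"]

log = AuditLog()
log.record("analyst07", "read", "centcom/plan.doc")
log.record("analyst07", "forward", "centcom/plan.doc")
print(len(log.entries), "events; chain head:", log._prev[:16])
```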
Signal, by philosophical design, prevents exactly this kind of reconstruction. The app's disappearing messages feature, which activists rely on to avoid retrospective persecution, made the Pentagon investigation nearly impossible. Investigators received only limited Signal messages and had to rely on screenshots published by The Atlantic. The same architecture that protects democracy advocates from authoritarian regimes protected classified information from democratic oversight.
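To make that architectural point concrete, here's a toy model of a disappearing-message store, purely illustrative and not Signal's implementation: once the timer expires, the plaintext is deleted in place, and because there is no server archive, nothing remains for a later investigation to recover.

```python
import time

class DisappearingInbox:
    """Toy disappearing-messages store (hypothetical, not Signal's code)."""

    def __init__(self, ttl_seconds: float) -> None:
        self.ttl = ttl_seconds
        self._messages: dict[str, tuple[float, str]] = {}

    def receive(self, msg_id: str, text: str) -> None:
        self._messages[msg_id] = (time.monotonic() + self.ttl, text)

    def read_all(self) -> list[str]:
        now = time.monotonic()
        # Expired messages are purged, not archived: no copy survives
        # for investigators to reconstruct later.
        self._messages = {k: v for k, v in self._messages.items() if v[0] > now}
        return [text for _, text in self._messages.values()]

inbox = DisappearingInbox(ttl_seconds=0.1)
inbox.receive("m1", "meeting moved to 14:00")
time.sleep(0.2)
print(inbox.read_all())   # [] -- nothing left to subpoena
```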
The physical gymnastics required to use Signal in the Pentagon reveals the absurdity of the situation. Inside secure Pentagon offices where personal devices are prohibited, staff hardwired connections to enable Signal access without phones. They essentially built a bridge between two incompatible worlds, like running a garden hose from your kitchen sink to fill a swimming pool because the outdoor spigot seems too complicated.
This technical improvisation happened inside SCIFs, Sensitive Compartmented Information Facilities, rooms lined with copper mesh to prevent any electronic signals from escaping. These spaces are designed to be electromagnetic vaults. Using Signal inside one is like installing a screen door on a bank vault because you prefer the breeze.
The authentication problem alone should have been a dealbreaker. Signal verifies users through phone numbers, a system vulnerable to SIM swapping, where criminals hijack your number by convincing your carrier they're you. Military systems require formal security clearance, continuous background monitoring, and documented need-to-know. It's the difference between checking someone's ID at a bar versus the years-long process of getting a Top Secret clearance.
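The gap between the two trust models is stark once you write the checks down. A hypothetical sketch with invented names and levels, just to contrast the logic: consumer authentication reduces to "does this person control the phone number right now?", while the institutional model gates every access on clearance level, compartment, and current standing.

```python
from dataclasses import dataclass, field

def consumer_auth(sms_code_matches: bool) -> bool:
    """Phone-number identity: whoever controls the number is 'you'.

    A successful SIM swap makes this check pass for an attacker.
    """
    return sms_code_matches

LEVELS = ["CONFIDENTIAL", "SECRET", "TOP SECRET"]

@dataclass
class Clearance:
    level: str                                            # e.g. "TOP SECRET"
    compartments: set[str] = field(default_factory=set)   # need-to-know markings
    in_good_standing: bool = True                         # continuous monitoring

def institutional_auth(c: Clearance, doc_level: str, doc_compartment: str) -> bool:
    """Clearance model: level, compartment, and standing must all check out."""
    return (c.in_good_standing
            and LEVELS.index(c.level) >= LEVELS.index(doc_level)
            and doc_compartment in c.compartments)

analyst = Clearance("TOP SECRET", {"OPS-ALPHA"})
print(institutional_auth(analyst, "SECRET", "OPS-ALPHA"))  # True
print(institutional_auth(analyst, "SECRET", "OPS-BETA"))   # False: no need-to-know
```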
But the real revelation isn't about technical vulnerabilities. It's about incompatible philosophies encoded in software architecture. Signal was built by people who watched Edward Snowden's revelations and decided to create technology that even the NSA couldn't crack. Its entire architecture assumes that governments, including our own, are potential adversaries. Every design decision prioritizes individual privacy over institutional oversight.
Military classification systems emerged from the opposite worldview. They assume the institution is trustworthy but individuals might not be. Every design decision prioritizes accountability over privacy. The inspector general found that intercepted intelligence could have endangered servicemembers, not because Signal's encryption was weak, but because once information enters Signal's ecosystem, you lose all institutional control over it.
You can't revoke access to a Signal message already delivered. You can't verify that the recipient's phone hasn't been compromised. You can't even prove definitively who received what information when. These aren't bugs; they're features that protect whistleblowers and dissidents. But when you're coordinating military strikes, these features become catastrophic vulnerabilities.
The Pentagon's declaration of "total exoneration" based on declassification authority spectacularly misses the point. Legal authority to declassify information doesn't change Signal's architecture any more than having a driver's license makes your car fly. The app's mathematical properties remain constant regardless of who's using it or what authority they possess.
This incident exposes a broader challenge facing every organization trying to balance security with usability. Consumer technology has trained us to expect instant, frictionless communication. Signal delivers that with world-class encryption. But institutional security requires friction, those annoying authentication steps and access controls that everyone hates until something goes wrong.
The bitter irony is that Signal's creators and the Pentagon's security architects are both right within their domains. Privacy advocates need protection from surveillance. Military operations need accountability and control. These aren't competing preferences; they're fundamentally incompatible requirements that no amount of engineering can reconcile.
What happened here wasn't a failure of encryption or a breakdown in protocol. It was a collision between two different definitions of security, each valid in its context, disastrous when confused. The strongest lock in the world doesn't help if it's on the wrong door, and the most sophisticated encryption can't provide what it was specifically designed to prevent: institutional oversight and control.
The lesson isn't that Signal is bad or that Pentagon systems are outdated. It's that security isn't a feature you can download. It's an entire ecosystem of tools, protocols, and practices designed for specific threats and requirements. When military officials chose convenience over compliance, they didn't just break rules. They demonstrated why those rules exist, written in the language of compromised operations and endangered lives.
This is the conversation Silicon Valley and Washington need to have but won't: perfect privacy and perfect accountability cannot coexist in the same system. Every organization, from tech startups to the Pentagon, must choose which matters more for their specific needs. 
Choosing wrong doesn't just risk security breaches. As this incident proves, it risks transforming your strongest protection into your greatest vulnerability.

