Politics · Mar 26, 2025 · 7 min read · Analysis

The Chat That Talked Back


The security architecture worked flawlessly. Every message encrypted end-to-end. Every protocol honored. Every packet delivered precisely where it was supposed to go.

Including to the editor-in-chief of The Atlantic.

On March 15, Defense Secretary Pete Hegseth typed the exact times that American bombs would drop on Yemen into a Signal group chat. The encryption dutifully scrambled his words into mathematical noise, transmitted them across the internet, and unscrambled them on the other end — on the phones of the Vice President, the Secretary of State, the CIA Director, the Director of National Intelligence, the National Security Adviser, and Jeffrey Goldberg, a journalist who had been accidentally added to the chat by the man who is supposed to keep the nation's secrets safe.

The app did exactly what it was designed to do. The humans did what they always do.

The Architecture of the Breach

Here is what happened, in the clinical language this deserves.

On March 11, National Security Adviser Mike Waltz sent a Signal connection request to Goldberg. On March 13, Waltz added him to a group chat labeled "Houthi PC small group." The chat included Hegseth, Secretary of State Marco Rubio, CIA Director John Ratcliffe, Director of National Intelligence Tulsi Gabbard, and roughly a dozen other senior officials planning Operation Rough Rider — airstrikes against Houthi targets in Yemen.

Goldberg appeared in the chat under the handle "JG." Nobody asked who JG was. Nobody verified the participant list. Nobody seemed to notice they had added an unauthorized person to a channel discussing active military operations, because the channel was encrypted, and encrypted means safe.

On March 15, Hegseth shared specifics: the types of aircraft involved (F/A-18 Hornets), weapons packages, target sequences, and the exact timing of strikes — roughly two hours before bombs began falling. This is the kind of information that, if intercepted by an adversary, gets pilots killed. It was classified at the time Hegseth typed it, regardless of what anyone says afterward.

Goldberg watched the conversation unfold. He did not participate. Eventually he removed himself from the chat, which would have sent an automatic notification to every member. Nobody followed up on that either.

The National Security Council confirmed the chat was authentic. Hegseth told reporters in Honolulu that "Nobody was texting war plans, and that's all I have to say about that." The messages, now published in full by The Atlantic, say otherwise with uncomfortable specificity.

The Pattern Nobody Wants to Name

This is not unprecedented. It is not even unusual. It is the most recent iteration of a pattern so consistent it should have its own entry in the security textbook.

The pattern: the tool selected for protection becomes the mechanism of exposure.

Signal was chosen specifically because it is encrypted. It sits outside government communication systems, which means it sits outside government record-keeping, oversight, and access controls. That is the point. Officials chose it not despite its lack of institutional safeguards but because of it. They wanted the security of encryption without the accountability of official channels.

This is the same logic that has produced every major classified information breach in recent memory. The private email server chosen for convenience. The Discord server chosen for camaraderie. The personal notebook shared with someone trusted. In each case, the mechanism designed to protect becomes the mechanism that exposes, because the user confuses the tool's security properties with the security of the overall system.

David Petraeus shared classified material with his biographer through private communications he believed were secure because they were personal. The information didn't leak through a compromised government system — it leaked through the human decision to route classified information around the systems designed to protect it. Jack Teixeira posted classified documents on Discord, a platform whose encryption and access controls he trusted more than the institutional systems that would have flagged his access. The material wasn't intercepted by a foreign intelligence service. It was handed to a small community of online acquaintances because the platform felt safe.

In every case, the breach follows the same three-step sequence. Step one: Officials identify a legitimate security concern with existing channels. Step two: They route around the official system using a tool that addresses that specific concern. Step three: The new tool, lacking the safeguards they just bypassed, fails in a way the official system was designed to prevent.

This is not irony. It is engineering.

Encrypted Does Not Mean Secure

The fundamental confusion here is between encryption and security. They are not the same thing.

Encryption protects the channel. It ensures that the data moving between endpoints cannot be read by third parties — no surveillance agency, no hacker, no foreign intelligence service can intercept a Signal message in transit. This is what Signal does. It does it extremely well.

Security protects the system. It ensures that classified information reaches only authorized recipients through verified channels with appropriate access controls, audit trails, and institutional oversight. This is what SCIF-based communication systems do. They do it through layers of verification that are, by design, inconvenient.

Hegseth and his colleagues chose channel security over system security. They chose the mathematical guarantee that no one could read their messages in transit, and in doing so abandoned every mechanism that would have prevented an unauthorized person from being in the room in the first place.

On a classified government system, adding Jeffrey Goldberg to a group discussing active strike plans would require his security clearance to be verified, his need-to-know to be established, his identity to be confirmed through multiple authentication factors. The system would not have permitted the mistake because the system treats human error as a design parameter, not an aberration.

Signal treats human error as someone else's problem.
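The distinction can be sketched in a few lines of code. This is a toy model, not Signal's actual implementation or any real government system: the roster, the function names, and the clearance check are all illustrative assumptions. It shows only the structural difference the article describes — a channel verifies the pipe, a system verifies the people.

```python
# Toy model of channel security vs. system security.
# Nothing here reflects Signal's real design or any classified network;
# the roster and checks are purely illustrative.

CLEARED = {"Hegseth", "Rubio", "Ratcliffe", "Gabbard", "Waltz"}  # hypothetical roster


def channel_send(message: str, recipients: list[str]) -> dict:
    """An encrypted channel delivers to whoever is on the list.
    It secures the transmission; it never questions the recipients."""
    return {r: f"<encrypted for {r}> {message}" for r in recipients}


def system_send(message: str, recipients: list[str]) -> dict:
    """A system with access controls gates every recipient before delivery."""
    unauthorized = [r for r in recipients if r not in CLEARED]
    if unauthorized:
        raise PermissionError(f"not cleared for this channel: {unauthorized}")
    return {r: f"<encrypted for {r}> {message}" for r in recipients}


group = ["Hegseth", "Rubio", "JG"]  # "JG" was never vetted

# The channel happily encrypts and delivers to everyone on the list:
delivered = channel_send("strike details", group)
assert "JG" in delivered  # the encryption worked; the message still leaked

# The system refuses before a single byte moves:
try:
    system_send("strike details", group)
except PermissionError as err:
    print(err)
```

The failure mode in the first function is exactly the one the article describes: confidentiality in transit is guaranteed, and the wrong recipient receives the message anyway.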

This is not a criticism of Signal. Signal is a communications tool built for a specific threat model: protecting messages from interception. It works. The criticism belongs to officials who chose a tool optimized for one threat and assumed it protected them from all threats. They optimized for surveillance and created a blind spot for human error — which, historically, is how nearly every classified breach actually happens.

What the Encryption Cannot Protect

Every security system is a model of threats. It defines what it protects against and, by definition, what it does not. End-to-end encryption protects against interception. Access controls protect against unauthorized entry. Classification systems protect against information reaching the wrong clearance level. Each one addresses a specific vector.

Human judgment sits at the intersection of all these systems. It decides which system to use, which safeguards to accept, which inconveniences to tolerate. And when human judgment decides to optimize for convenience — when the meeting is too urgent for a SCIF, when the group is too senior for protocol, when the information is too important to wait for proper channels — every technical safeguard becomes irrelevant.

Humans route around those safeguards for understandable reasons — SCIFs are inconvenient, classified systems are slow, bureaucratic protocols interfere with rapid decision-making during active operations. The reasons are always sympathetic. The results are always the same.

Signal's encryption didn't fail on March 15. Nothing was intercepted. No foreign intelligence service cracked the code. The information reached exactly the people it was sent to — which is the problem, because one of those people was a journalist who never should have been in the room.

The encryption worked perfectly. It just didn't matter.

The Prediction That Writes Itself

Here is what will happen next, because here is what always happens next.

There will be investigations. The Pentagon Inspector General will examine whether regulations were violated. (They were.) Members of Congress will demand accountability. (They will not get it, or they will get it selectively, depending on which party controls which committee.) There will be calls to ban Signal from government devices, or to mandate the use of official channels, or to create new oversight mechanisms for encrypted communications.

None of this will address the underlying pattern, because the underlying pattern is not about Signal. It is not about this administration or the previous one or the one before that. It is about the permanent tension between security and convenience, between institutional safeguards and human impatience, between systems designed to prevent errors and the people who believe they are too important, too senior, or too pressed for time to use them.

The next breach will use a different app. It will involve different officials. The classification level may be higher or lower. The specific circumstances will be novel enough for everyone to call it unprecedented again.

The pattern will be identical.

Someone will choose a tool that solves one security problem while creating another. Someone will confuse the security of the channel with the security of the system. And the tool built to protect them will deliver the information exactly where it was told to — which will turn out to be exactly the wrong place.

The encryption will work perfectly. It always does.

Sources: The Atlantic, CNN, Wikipedia