The Pile of Shit They Approved Anyway
The Rubber Stamp With a Conscience Problem
A federal cybersecurity reviewer looked at Microsoft's cloud offering for government use and summarized it with admirable precision: "The package is a pile of shit."
Then they approved it anyway.
This is not a story about Microsoft's security. It's a story about the architecture of institutional capture — how the distance between knowing something is wrong and doing something about it gets filled with process, pressure, and the quiet calculus of career preservation.
The Process
The Federal Risk and Authorization Management Program — FedRAMP — exists for exactly one purpose: to evaluate whether cloud services are secure enough for government use. It's the gate. The single gate. If FedRAMP says no, the product doesn't touch federal data.
Microsoft's Government Community Cloud High — GCC High — entered FedRAMP review in April 2020. What followed was a masterclass in institutional erosion.
Over three years, FedRAMP staff spent 480 hours and conducted 18 technical deep-dive sessions trying to get Microsoft to explain how its own encryption works. Not exotic encryption. Basic data-in-transit encryption — how information gets protected moving between servers. FedRAMP called this a "fundamental" requirement. Microsoft couldn't adequately document it.
The architecture was, in the words of reviewers, "spaghetti pie." Data paths that should have been direct instead resembled a trip from Washington to New York by bus, ferry, and airplane — each leg a potential hijacking point. Legacy code layered on legacy code, creating complexity that even Microsoft's own security architects struggled to map.
In October 2023, the FedRAMP director halted the engagement and demanded a restart. This was the system working. A gatekeeper looked at a product, found it wanting, and said stop.
Then the system stopped working.
The Capture
Melinda Rogers, the Justice Department's Deputy CIO, had been pressing FedRAMP to approve GCC High. Her department was already using it. So were other agencies. The product was deployed government-wide before it was authorized — a fact that made rejection not just technically difficult but institutionally embarrassing.
Rogers later left the Justice Department and was hired by Microsoft in 2025. The revolving door didn't even bother spinning slowly.
The third-party firms tasked with independently evaluating Microsoft's security — Coalfire and Kratos — were hired and paid by Microsoft. The company being assessed was paying for its own assessment. When those firms quietly back-channeled concerns to FedRAMP, the structural irony was complete: the evaluators Microsoft was paying were privately warning the government that Microsoft's product had problems.
Kratos was placed on a "corrective action plan" for insufficient rigor. The correction was administered to the assessor for being too honest about what they found.
A new FedRAMP director arrived in summer 2024, restarted the review with a fresh team, and on December 26, 2024 — the day after Christmas, when institutional oversight takes a holiday of its own — granted authorization.
The final assessment acknowledged "lack of confidence in assessing the system's overall security posture" and noted "unknown unknowns." Tony Sager, former NSA scientist and senior vice president at the Center for Internet Security, assessed the situation with the clarity the process itself couldn't muster: "This is not security. This is security theater."
The Pattern
Strip the names. Forget Microsoft, FedRAMP, GCC High. Look at the structure.
A regulatory gate designed to protect the public. A regulated entity large enough to make rejection politically costly. A deployment that precedes authorization — creating facts on the ground that make the gate decorative. Evaluators paid by the entity being evaluated. Personnel flowing between regulator and regulated. A final approval issued despite documented, on-the-record concerns.
This is regulatory capture. Not the dramatic kind — no bags of cash, no explicit threats. The quiet kind. The kind where the system's own incentive structure makes the outcome inevitable before anyone consciously chooses it. Everyone involved can honestly say they followed the process.
They did follow the process. The process is the problem.
That Microsoft's China-based engineers were maintaining sensitive US government systems — a practice discovered by ProPublica, not by the Justice Department — is almost a footnote. The security gaps that FedRAMP documented and then authorized are secondary. The real story is an institution designed to say no that could not, structurally, say no. A gate that exists to be passed through.
The pile of shit got approved because the system is designed to approve piles of shit, provided they're large enough, deployed widely enough, and connected to enough institutional careers that rejection becomes the riskier option.
FedRAMP now operates with roughly 24 employees on a $10 million annual budget — its lowest staffing in a decade, thanks to DOGE cuts. The remaining staff are, by their director's own admission, "entirely focused" on delivering authorizations at record pace.
The gate isn't just captured. It's being dismantled. And the pile of shit is already inside.
Source: Ars Technica / ProPublica