Quantum Shadows and Cryptographic Fault Lines: Rethinking NSA’s Stance on Hybrid Post-Quantum Cryptography
As the internet braces for the quantum era, the NSA’s resistance to hybrid cryptography raises questions—technical, historical, and political.
Imagine a future where a quantum computer can break the encryption that protects your bank account, your medical records, and even national secrets. That future may be closer than we think—and the race to secure our digital infrastructure is already underway. But in this high-stakes transition, a curious fault line has emerged: the US National Security Agency (NSA) has explicitly rejected hybrid post-quantum cryptography (PQC) in its Commercial National Security Algorithm Suite 2.0 (CNSA 2.0), even as the broader cryptographic community embraces it. Why?
This article explores that question by examining and synthesizing four sources: the IETF’s draft on hybrid key exchange in TLS 1.3, an Akamai blog post on building a quantum-safe internet, and two blog posts by cryptographer Daniel J. Bernstein (one on the standardization of Classic McEliece, the other dissecting NSA/GCHQ arguments against hybrid cryptography).
What Is Hybrid Post-Quantum Cryptography?
Hybrid PQC combines a traditional public-key algorithm (like RSA or elliptic-curve cryptography) with a post-quantum algorithm (like ML-KEM or McEliece). The idea is simple but powerful: even if one algorithm is broken, whether by a quantum computer in the case of RSA, or by design or implementation flaws in a still-maturing scheme like ML-KEM, the other still protects the data. It’s a belt-and-suspenders approach that offers a safety net during the uncertain transition to quantum-resistant standards.
The IETF draft outlines how this can be implemented in TLS 1.3, the ubiquitous protocol that secures most internet traffic. (You’re using TLS every time you see the lock icon in a browser address bar.) The design is intentionally conservative, using a “concatenation” method to combine the shared secrets from both algorithms. This ensures that the final encryption key is secure as long as at least one of the underlying algorithms remains unbroken.
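To make the concatenation method concrete, here is a minimal Python sketch of the combiner, not the draft’s exact key schedule: the X25519 half uses the real `cryptography` package, while the ML-KEM half is a stub (`mlkem_shared_secret`) standing in for a real PQC library, and the HKDF label is invented for illustration.

```python
# Sketch of the IETF draft's "concatenation" combiner: run both key
# exchanges, join the two shared secrets, and derive the session key
# from the combined value.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

def mlkem_shared_secret() -> bytes:
    # Placeholder standing in for a real ML-KEM encapsulation;
    # a PQC library would return the encapsulated secret here.
    return os.urandom(32)

# Classical half: an ordinary X25519 exchange.
client_priv = X25519PrivateKey.generate()
server_priv = X25519PrivateKey.generate()
ecdh_secret = client_priv.exchange(server_priv.public_key())

# Post-quantum half (stubbed above).
pq_secret = mlkem_shared_secret()

# The combiner: simple concatenation, then a normal key derivation.
# The derived key stays secret as long as EITHER input secret does.
session_key = HKDF(
    algorithm=hashes.SHA256(),
    length=32,
    salt=None,
    info=b"hybrid-tls-sketch",  # illustrative label, not the TLS 1.3 one
).derive(ecdh_secret + pq_secret)
```

Because the two secrets are concatenated before key derivation, an attacker must recover both inputs to compute the session key; breaking only one of the component algorithms is not enough.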
Why the NSA Says No
In CNSA 2.0, the NSA explicitly prohibits hybrid key exchange. Their rationale? According to Bernstein’s analysis, the NSA and its UK counterpart GCHQ argue that hybrid schemes add complexity, increase the attack surface, and may not offer meaningful security improvements. But Bernstein isn’t convinced. He points out that these arguments are vague, unquantified, and inconsistent with the NSA’s own history of supporting complex cryptographic systems when doing so suits the agency’s interests.
This skepticism is echoed in the Akamai blog, which highlights the urgency of preparing for quantum threats and supports hybrid cryptography as a pragmatic interim solution. Akamai is already rolling out hybrid support in phases, starting with connections between its servers and customer origins.
A Historical Echo: The DES Key Size Controversy
The NSA’s stance on hybrid cryptography bears an eerie resemblance to its position in the 1970s on the Data Encryption Standard (DES). Back then, the NSA reportedly influenced the decision to limit DES to a 56-bit key—just strong enough to resist casual attacks, but weak enough for the NSA to break with sufficient resources. This dual-purpose design allowed the agency to maintain access to encrypted data while appearing to support public security.
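For a sense of scale, a 56-bit keyspace is small enough to enumerate with special-purpose hardware, as the EFF’s Deep Crack machine demonstrated in 1998. A back-of-the-envelope sketch (the search rate below is an assumption, roughly in Deep Crack’s range):

```python
# Back-of-the-envelope: how tractable is a 56-bit keyspace for a
# well-resourced attacker? The search rate is an assumption, roughly
# matching late-1990s special-purpose hardware.
keyspace = 2 ** 56                               # ~7.2e16 candidate DES keys
keys_per_second = 90_000_000_000                 # assumed search rate
avg_seconds = (keyspace / 2) / keys_per_second   # expected hit at half the space
print(f"{keyspace:.2e} keys; ~{avg_seconds / 86400:.1f} days on average")
```

Comfortably out of reach for casual attackers in the 1970s, but well within an agency’s budget: exactly the asymmetry described above.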
Could something similar be happening now? Bernstein suggests that rejecting hybrid cryptography might serve a similar dual purpose: discouraging robust, layered defenses that could interfere with the NSA’s own surveillance capabilities, while maintaining plausible deniability under the guise of technical conservatism.
Beyond Security: Political and Strategic Motives?
It’s worth considering that the NSA’s objections may not be purely technical. Hybrid cryptography complicates the cryptographic landscape, making it harder for intelligence agencies to predict or exploit weaknesses. It also empowers independent actors (companies, researchers, and foreign governments) to adopt stronger protections without waiting for official blessing.
Moreover, hybrid schemes could accelerate the adoption of post-quantum algorithms like Classic McEliece, which Bernstein champions for its conservative, time-tested design. If widely deployed in hybrid form, such algorithms could gain de facto standard status before the NSA is ready to endorse—or control—them.
What Happens If Hybrid TLS Becomes Standard?
If the IETF finalizes a hybrid TLS standard and major players like Akamai adopt it, CNSA 2.0 may find itself out of step with the real-world cryptographic ecosystem. This could force a re-evaluation of the NSA’s position, especially if hybrid deployments prove to be secure, efficient, and widely adopted.
In that case, CNSA 2.0 might need to evolve—perhaps by allowing hybrid schemes in limited contexts, or by endorsing specific combinations that meet both security and operational criteria. Alternatively, the NSA could double down, insisting on a clean break from classical algorithms and betting that the transition to pure PQC can happen quickly and safely.
Conclusion: A Call for Transparency and Debate
The debate over hybrid post-quantum cryptography is more than a technical squabble—it’s a window into the competing priorities of security, surveillance, and sovereignty in the digital age. As quantum computing looms on the horizon, the choices we make today will shape the security of tomorrow’s internet.
The NSA’s disapproval of hybrid cryptography deserves scrutiny, not just for what it says, but for what it might be trying to avoid saying. And as history has shown, when cryptographic decisions are made behind closed doors, the consequences can echo for decades.