- cross-posted to:
- [email protected]
- [email protected]
Relevant bit for those that don’t click through:
Daniel Bernstein at the University of Illinois Chicago says that the US National Institute of Standards and Technology (NIST) is deliberately obscuring the level of involvement the US National Security Agency (NSA) has in developing new encryption standards for “post-quantum cryptography” (PQC). He also believes that NIST has made errors – either accidental or deliberate – in calculations describing the security of the new standards. NIST denies the claims.
“NIST isn’t following procedures designed to stop NSA from weakening PQC,” says Bernstein. “People choosing cryptographic standards should be transparently and verifiably following clear public rules so that we don’t need to worry about their motivations. NIST promised transparency and then claimed it had shown all its work, but that claim simply isn’t true.”
Also, is this the same Daniel Bernstein from the ’95 ruling?
The export of cryptography from the United States was controlled as a munition starting from the Cold War until recategorization in 1996, with further relaxation in the late 1990s.[6] In 1995, Bernstein brought the court case Bernstein v. United States. The ruling in the case declared that software was protected speech under the First Amendment, which contributed to regulatory changes reducing controls on encryption.[7] Bernstein was originally represented by the Electronic Frontier Foundation.[8] He later represented himself.[9]
So: a highly reputable source with skin in the game. Thanks for the explanation.
Yeah, you can observe this with Let’s Encrypt failing to generate a certificate if you change the elliptic curve from an NSA-generated curve to a generic/known-safe one. Changing between different NSA curves is functionally fine. This forces all signed certificates to use curves that are known to have issues, deliberate or otherwise - i.e. backdoored.
You can’t use arbitrary curves with certificates, only those which are standardized, because the CA will not implement anything that isn’t unambiguously defined in a standard and supported by clients.
My point is that there is a documented list of supported curves for ECDSA, but attempting to use any other safe curve in that list results in a failure. I am not trying to use some arbitrary curve.
If your point is that no safe curve is allowed because the powers that be won’t permit one, TLS is doomed.
https://eff-certbot.readthedocs.io/en/latest/using.html#using-ecdsa-keys
The default is a curve widely believed to be unsafe, p256, with no functioning safe alternative.
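If anyone wants to poke at the key side of this themselves, here’s a minimal sketch using the Python `cryptography` package (the domain is a placeholder): CSRs on the NIST curves Let’s Encrypt accepts go through, while a key on Bernstein’s Ed25519 can be generated locally just fine but won’t be accepted by any publicly trusted CA today.

```python
from cryptography import x509
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import ec, ed25519
from cryptography.x509.oid import NameOID

subject = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "example.com")])

# The NIST curves Let's Encrypt accepts for ECDSA keys (P-256, P-384):
# key and CSR generation go through, and so does issuance.
for curve in (ec.SECP256R1(), ec.SECP384R1()):
    key = ec.generate_private_key(curve)
    csr = (
        x509.CertificateSigningRequestBuilder()
        .subject_name(subject)
        .sign(key, hashes.SHA256())
    )
    print(curve.name, "CSR valid:", csr.is_signature_valid)

# Ed25519, Bernstein's curve: generating the key and CSR locally is
# trivial, but submitting it to a public CA fails, because the
# CA/Browser Forum baseline profile doesn't allow the curve for
# publicly trusted certificates.
key_ed = ed25519.Ed25519PrivateKey.generate()
csr_ed = (
    x509.CertificateSigningRequestBuilder()
    .subject_name(subject)
    .sign(key_ed, None)  # Ed25519 hashes internally; algorithm must be None
)
print("ed25519 CSR valid:", csr_ed.is_signature_valid)
```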
That’s Bernstein’s own website (SafeCurves), if anyone was wondering, showing p256 as unsafe.
I run a cryptography forum and I know this stuff: the problem isn’t algorithmic weakness but complexity of implementation.
All major browsers and similar networking libraries now have safe implementations after experts have taken great care to handle the edge cases.
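To make “edge cases” concrete, here’s a small sketch in Python (constants straight from the P-256 spec, no library needed) of one validation step that real implementations have historically skipped:

```python
# NIST P-256 domain parameters (FIPS 186-4 / SEC 2); a = -3 mod p.
P = 0xffffffff00000001000000000000000000000000ffffffffffffffffffffffff
A = P - 3
B = 0x5ac635d8aa3a93e7b3ebbd55769886bc651d06b0cc53b0f63bce3c3e27d2604b

def is_valid_p256_point(x: int, y: int) -> bool:
    """Check y^2 = x^3 + ax + b (mod p) before using a peer's point.

    Forgetting this check is a classic implementation bug: a peer can
    send a point on a different, weaker curve, and doing ECDH with it
    leaks private-key material (an "invalid-curve" attack).
    """
    if not (0 <= x < P and 0 <= y < P):
        return False
    return (y * y - (x * x * x + A * x + B)) % P == 0

# The standard P-256 base point validates; a tampered point does not.
GX = 0x6b17d1f2e12c4247f8bce6e563a440f277037d812deb33a0f4a13945d898c296
GY = 0x4fe342e2fe1a7f9b8ee7eb4a7c0f9e162bce33576b315ececbb6406837bf51f5
assert is_valid_p256_point(GX, GY)
assert not is_valid_p256_point(GX, GY + 1)
```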
It’s not a fault with Let’s Encrypt. If they allowed nonstandard curves, almost nothing would be compatible with them - even the libraries which technically have the code for it, because anything not in the TLS spec is disabled.
https://cabforum.org/baseline-requirements-certificate-contents/
CAB is the CA/Browser Forum, the consortium of Certificate Authorities (TLS X.509 certificate issuers) and browser vendors.
With that said, Curve25519 is on its way into the standards.
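For the key-exchange half it already arrived: X25519 (RFC 7748) is a named group in TLS 1.3 and the one most clients offer first. A minimal sketch with the Python `cryptography` package:

```python
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

# Each side generates a Curve25519 key pair and shares its public key.
alice = X25519PrivateKey.generate()
bob = X25519PrivateKey.generate()

# Both ends derive the same 32-byte shared secret (RFC 7748 X25519),
# which TLS 1.3 then feeds into its key schedule.
assert alice.exchange(bob.public_key()) == bob.exchange(alice.public_key())
```

The part still working its way through the standards is Ed25519 in the publicly trusted certificates themselves, which is what the Baseline Requirements above would have to allow.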
TL;DR: there are no safe ECC curves in TLS yet?
The WRC deals with unsafe curves all the time. I think picking a couple of spots on some of their curves at high speed would be interesting. Samir has been known to break some of these…
That’s worrying if true. However, I couldn’t find a source. Even if true, Let’s Encrypt is probably the most secure option.
Thanks. I am extremely skeptical, and I might just reach out to Let’s Encrypt for clarification.