Lucky Green wrote:
> I would like to propose that we all take a step back and respond, rather
> than react. The Bernstein paper offers some very compelling sounding
> theories. While they have me and most everybody in the industry concerned,
> at this point several open questions that Bernstein himself raised in his
> paper remain.
The appropriate time for this new result to have an impact is in the
creation of the v3 protocol; the protocol will clearly be changing at
that time, and there's a very real question as to whether we want the keys to
be changing as well. This latest result seems to indicate we do.
> Having said this, Bernstein's discoveries certainly don't look
> good. Nor are they hugely surprising. All public key cryptography is wishful
> thinking. The scientific community does not know if breaking RSA is as hard
> of a problem as factoring. Nor has anybody submitted a proof that factoring
> is NP complete.
Factoring is almost certainly not NP-complete; its decision version is in
both NP and co-NP, which no NP-complete problem is believed to be.
> Not to mention that there is no proof that P != NP. We merely hope
> that the above assumptions hold true.
That said, there are problems we have more confidence in. I'm quite happy
trusting AES; factoring still leaves me feeling queasy.
> For all we know, some mathematician may come up with a proof tomorrow that
> will make all public key cryptography, not just a particular implementation
> of PKC, trivially breakable.
I have high confidence in signature schemes based on secure hashes (which
are, by the way, reasonably practical.) There's unfortunately no
equivalent for doing public key encryption without very weird primitives.
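To make the hash-based-signature point concrete, here's a toy Lamport-style one-time scheme in Python. Every name and size here is my own illustration (SHA-256, 256 preimage pairs), not any deployed system, and each keypair can safely sign only a single message:

```python
import hashlib
import os

def H(b):
    return hashlib.sha256(b).digest()

def keygen():
    # Secret key: 256 pairs of random preimages, one pair per digest bit.
    # Public key: the hashes of those preimages.
    sk = [(os.urandom(32), os.urandom(32)) for _ in range(256)]
    pk = [(H(a), H(b)) for a, b in sk]
    return sk, pk

def sign(sk, msg):
    # Reveal one preimage per bit of the message digest.
    digest = H(msg)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    return [sk[i][bit] for i, bit in enumerate(bits)]

def verify(pk, msg, sig):
    # Check each revealed preimage against the published hash for that bit.
    digest = H(msg)
    bits = [(digest[i // 8] >> (i % 8)) & 1 for i in range(256)]
    return all(H(s) == pk[i][bit] for i, (s, bit) in enumerate(zip(sig, bits)))
```

The only assumption is preimage resistance of the hash; no number theory is involved, which is why these constructions inspire more confidence than RSA or DH.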
> Bernstein may have shaved off a constant. Reducing the cryptanalytical
> workfactor by a constant cannot alter the premises on which PKC is based.
Actually, he changed the ratio, which is quite a bit worse than changing
the constant, but yes, the attacker is still at a fundamental disadvantage.
Further bad breaks are quite possible: if someone told me today that
they'd discovered an n*log(n) algorithm for doing row reduction modulo 2,
I wouldn't bat an eyelash. In fact, I'm a little surprised it hasn't
happened so far. That simply isn't a problem I have any confidence basing
the security of a real system on. The 'core' hard problem in factoring
should be viewed as the number of candidates you have to try in order to
find sufficiently many smooth numbers. Counting on any polynomial factor
on top of that number is mostly wishful thinking.
> Bernstein contends, without submitting proof, presumably due to space
> constraints, that his proposed attack will apply to discrete log-based
> public key algorithms, such as DH, as it does to RSA. Therefore, nothing
> would be gained, other than increased load on the servers, in moving to DH.
If backwards compatibility has to be lost, then there is no benefit to
sticking with RSA, and RSA has quite a few disadvantages relative to DH. Its
keys are a bit longer, it has a *lot* more gotchas when designing a protocol
based on it, its key setup is glacial... basically, it has no engineering
advantages. If you know of any, I'd like to hear them.
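For comparison, here is how little machinery a bare finite-field DH exchange needs: one random exponent and two modular exponentiations per side, with no prime generation at key-setup time. The modulus below is a toy-sized Mersenne prime chosen only so the sketch runs; it is NOT a real parameter set:

```python
import secrets

p = 2**127 - 1   # Mersenne prime; far too small for real security
g = 3            # illustrative generator choice

a = secrets.randbelow(p - 3) + 2    # Alice's secret exponent
b = secrets.randbelow(p - 3) + 2    # Bob's secret exponent
A = pow(g, a, p)                    # Alice sends A to Bob
B = pow(g, b, p)                    # Bob sends B to Alice

# Each side combines its own secret with the other's public value.
assert pow(B, a, p) == pow(A, b, p)
```

There is no padding scheme to get wrong and no small-exponent or common-modulus gotcha; the exchange is symmetric, which is part of why it's easier to build protocols on.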
> Increasing the size of RSA keys to 4096 bits would appear prudent at this
> time. The performance impact of the larger keys given typical Mixmaster
> usage patterns even on a slow machine will be negligible.
If my memory serves, doing a keygen on 4096 bits takes a couple minutes
even on a p133, although thankfully you only have to do that once.
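The cost is dominated by the prime search: keygen must find two random 2048-bit primes, and each candidate costs a batch of Miller-Rabin tests on huge numbers. A scaled-down sketch of the search (standard Miller-Rabin, sizes shrunk so it runs quickly):

```python
import random

def is_probable_prime(n, rounds=20):
    # Miller-Rabin probabilistic primality test.
    if n < 2:
        return False
    for p in (2, 3, 5, 7, 11, 13):
        if n % p == 0:
            return n == p
    d, s = n - 1, 0
    while d % 2 == 0:
        d //= 2
        s += 1
    for _ in range(rounds):
        a = random.randrange(2, n - 1)
        x = pow(a, d, n)
        if x in (1, n - 1):
            continue
        for _ in range(s - 1):
            x = pow(x, 2, n)
            if x == n - 1:
                break
        else:
            return False
    return True

def random_prime(bits):
    # Force the top bit (so the prime has full size) and the low bit (odd).
    # On average roughly bits * ln(2) / 2 odd candidates are tried.
    while True:
        n = random.getrandbits(bits) | (1 << (bits - 1)) | 1
        if is_probable_prime(n):
            return n
```

At 2048 bits per prime, each failed candidate still costs many modular exponentiations on 2048-bit numbers, which is where the minutes on a p133 go; DH key setup by contrast is just one exponentiation against fixed group parameters.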
> > > secure against such agencies. I therefore propose we expedite the
> > > development of the Mixmaster Protocol v3.
> > Agreed.
> Sure, but I would caution against re-inventing the wheel.
The first thing to do is look at the available modes for SSL, hopefully
some of them are still intact.
> As much as I was impressed by the numerous privacy-enhancing
> implementations that I saw at the recent excellent CodeCon (for which
> Rabbi was one of the organizers), many of the implementors appeared to
> have worked in isolation from the state-of-the-art of the scientific
> knowledge in the fields of cryptography and secure distributed
> computations. Though, to their credit, most had read Applied
> Cryptography and at least included crypto in their protocols.
Also to their credit, they generally didn't have all that much confidence
in their own cryptographic skills, probably because they're all working on
practical projects and have seen their work fail more than once :-)
> SHA-512 is the hash function of choice for AES-256. While it is true that
> the hash isn't exactly a textbook example of efficiency, it is plenty fast
> for use by Mixmaster.
> The batching issue is orthogonal to the forward secrecy issue. I don't think
> that Rabbi meant to imply otherwise, but I still would like to caution
> against mentally linking the two.
They can technically be done separately, but since they both cause
backwards incompatibility in the protocol it makes sense to roll out both
at once.