[OTR-dev] Fwd: Some DH groups found weak; is OTR vulnerable?

Jacob Appelbaum jacob at appelbaum.net
Fri May 22 10:55:16 EDT 2015


On 5/22/15, Holger Levsen <holger at layer-acht.org> wrote:
> Hi Ian,
>
> On Freitag, 22. Mai 2015, Ian Goldberg wrote:
>> No, there is no reason to believe that the 1536-bit DH group used by OTR
>> is vulnerable.
>
> is it really a single group for all?

Yes, I believe that is correct - we're all using the same DH group
(the 1536-bit MODP group from RFC 3526) as defined in libotr's dh.c:

static const char* DH1536_MODULUS_S = "0x"
    "FFFFFFFFFFFFFFFFC90FDAA22168C234C4C6628B80DC1CD1"
    "29024E088A67CC74020BBEA63B139B22514A08798E3404DD"
    "EF9519B3CD3A431B302B0A6DF25F14374FE1356D6D51C245"
    "E485B576625E7EC6F44C42E9A637ED6B0BFF5CB6F406B7ED"
    "EE386BFB5A899FA5AE9F24117C4B1FE649286651ECE45B3D"
    "C2007CB8A163BF0598DA48361C55D39A69163FA8FD24CF5F"
    "83655D23DCA3AD961C62F356208552BB9ED529077096966D"
    "670C354E4ABC9804F1746C08CA237327FFFFFFFFFFFFFFFF";
static const char *DH1536_GENERATOR_S = "0x02";
static const int DH1536_MOD_LEN_BITS = 1536;
static const int DH1536_MOD_LEN_BYTES = 192;

> how about (optionally) creating a new, client specific one, on installation? and how about using eg. 4096 bits?

Using a larger DH group is totally reasonable - namely a 2048-bit
group or a larger one. This is what the Logjam authors suggest. They
also suggest avoiding fixed primes that are shared by large numbers of
users. The main blocker to increasing the group size, as I understand
it, is that whenever this topic comes up for discussion it ends with
"let's just switch to ECC and do away with this problem entirely" -
and then the discussion stops and no change is made.

Generally speaking, when it comes to generating a prime p, we have
three major areas of concern:

  bad entropy (see the "Mining Your Ps and Qs" paper)
  a badly or maliciously chosen value (what is the best example here?
SSH had the -1? 0? 1? bug, right?)
  shared primes (precomputation shifts the economics in the attacker's
favor; see Logjam)

A larger shared prime that is standardized is the safest of the three
options: it is hard to know whether you have bad entropy, and it is
also hard to know whether you have chosen a value badly.
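
For what it's worth, here is a minimal sketch of the kind of sanity
check a client could run on any fixed modulus it is handed. It uses
GMP directly rather than libotr's actual libgcrypt-based code, and
checking that p is a safe prime (p and (p-1)/2 both prime) only
catches crude mistakes - it cannot detect a prime that was
deliberately constructed with hidden structure:

/* Sketch only - plain GMP, not libotr code.  Checks that a DH modulus
 * p is a "safe prime", i.e. both p and q = (p - 1) / 2 are (probable)
 * primes, so that g = 2 generates a large prime-order subgroup. */
#include <gmp.h>
#include <stdio.h>

static int looks_like_safe_prime(const char *p_hex)
{
    mpz_t p, q;
    int ok;

    /* p_hex: hex digits only, without a leading "0x" */
    if (mpz_init_set_str(p, p_hex, 16) != 0) {
        mpz_clear(p);
        return 0;
    }
    mpz_init(q);
    mpz_sub_ui(q, p, 1);
    mpz_fdiv_q_2exp(q, q, 1);              /* q = (p - 1) / 2 */

    /* 40 Miller-Rabin rounds; 0 means definitely composite */
    ok = mpz_probab_prime_p(p, 40) != 0 &&
         mpz_probab_prime_p(q, 40) != 0;

    mpz_clears(p, q, NULL);
    return ok;
}

int main(int argc, char **argv)
{
    if (argc != 2) {
        fprintf(stderr, "usage: %s <modulus-hex>\n", argv[0]);
        return 2;
    }
    puts(looks_like_safe_prime(argv[1]) ? "p looks like a safe prime"
                                        : "p is NOT a safe prime");
    return 0;
}

Build with something like "gcc check_p.c -lgmp" and pass the modulus
hex (without the leading "0x") as the only argument; the 1536-bit
modulus above passes, as expected for an RFC 3526 safe prime.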

>
> if I understand correctly, in a few years 1536 bits might not give
> sufficient protection anymore if attacked like this, and if there is
> a full take of the OTR communication then the "forward secrecy" is
> gone :/

There are two parts to this. First, the sieving and linear-algebra
steps of NFS can be done as a precomputation, and then only the
descent phase needs to be done to attack a specific conversation.
Second, the more people that share a group, the more valuable that
precomputation becomes. There is no doubt in my mind that the group
chosen by OTR is indeed a target, as we've seen ample evidence of
intercepts being thwarted by OTR (FISA intercepts, etc.).

If we assume that 1536-bit DH is safe for a few more years, we should
also assume that before that time arrives, this speedup is likely to
be in use.

The Logjam attack explains *how* the NSA is likely decrypting SSH,
IPsec and SSL/TLS traffic at scale. And if they weren't doing it
before, they're certainly going to implement it now!

I've heard that some folks are upset by this paper, as it does indeed
reveal one of their math tricks. It also reveals that huge chunks of
the deployed crypto on the net are just straight-up garbage. The paper
and the work are absolutely brilliant.

Personally, I think it would make sense to bump the OTR protocol
version and use a larger group. Later, I think it makes sense to make
a version of the OTR protocol that uses ECC. I hope we don't get stuck
in the usual trap of waiting for perfection.

My open question is: why do we trust any of the Oakley groups at all?
Do we really trust NIST? Do we really trust FIPS? If they were
underhanded - would we even understand how they might be maliciously
chosen?

How far down the rabbit hole do we want to go here? Do we want to pick
a totally unique set of groups for DH, where we trust that, say, Ian
won't choose badly? Should we just pick one new group? Or should we
standardize OTR groups: generate ~10000 groups and then randomly pick
one of the set? That would further slow down attacks against DH and
ensure that the precomputation phase would need to be repeated for
*every* group, rather than for the current handful, which numbers in
the hundreds at best.
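
To make that last option concrete, here is a rough sketch (again plain
GMP, not libotr's actual stack) of generating one such group as a
random safe prime p = 2q + 1, with generator selection left out.
"Generate ~10000 groups" is just this in a loop, and the hard parts -
a trustworthy entropy source, who runs the generation, and how the
result is audited and distributed - are exactly the concerns listed
earlier:

/* Sketch only: generate a random "safe prime" p = 2q + 1 with p and q
 * both (probable) primes, the usual shape for a DH modulus. */
#include <gmp.h>
#include <stdio.h>
#include <time.h>

static void generate_safe_prime(mpz_t p, unsigned long bits,
                                gmp_randstate_t rng)
{
    mpz_t q;
    mpz_init(q);

    for (;;) {
        /* random odd candidate q of exactly (bits - 1) bits */
        mpz_urandomb(q, rng, bits - 1);
        mpz_setbit(q, bits - 2);
        mpz_setbit(q, 0);

        if (!mpz_probab_prime_p(q, 25))
            continue;

        mpz_mul_2exp(p, q, 1);             /* p = 2q + 1 */
        mpz_add_ui(p, p, 1);

        if (mpz_probab_prime_p(p, 25))
            break;
    }
    mpz_clear(q);
}

int main(void)
{
    gmp_randstate_t rng;
    mpz_t p;

    gmp_randinit_default(rng);
    /* demo seeding only - a real generator needs a proper CSPRNG,
     * which is exactly the "bad entropy" concern above */
    gmp_randseed_ui(rng, (unsigned long)time(NULL));

    mpz_init(p);
    generate_safe_prime(p, 512, rng);  /* 512 bits so the demo finishes
                                          quickly; a real group would be
                                          2048+ bits */
    gmp_printf("p = %Zx\n", p);

    mpz_clear(p);
    gmp_randclear(rng);
    return 0;
}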

All the best,
Jacob
